
7 critical requirements for application retirement to the cloud

Finding a simplified and compliant application retirement framework

IT leaders executing a broad cloud infrastructure migration often face a considerable obstacle: obsolete or aging mainframe and server-resident applications that don’t fit any reasonable budgetary model for migration, leaving a massive local data center expense behind. Moving these applications into a hosted data center is too expensive, especially for applications that are now inactive and are operating only to house complex data for business continuity and compliance. Organizations struggling with these issues face very few options:

  • Do nothing – continue paying to maintain your mainframe and other applications and related systems
  • Migrate to a live application – mount an expensive data migration of business-complete data to a modern cloud application
  • Archive data – move business-complete data to a purpose-built cloud repository that provides both data accessibility and compliance

Given the ongoing expense and headaches of doing nothing, or the high cost of migrating data to a live application, compliant archiving is becoming the standout option. This is especially true because data from multiple applications can be combined into a single cloud infrastructure with centralized authentication, user access, data retention controls, and legal hold management. 

Consolidating data within an archive can often improve data access, usability, and organizational agility. Compared with antiquated legacy software, the right archive strategy can improve productivity while accommodating future changes in compliance, such as emerging privacy and data security requirements. Achieving savings through archiving is much easier when moving to cloud infrastructure, particularly with cumbersome, complex applications.

When archiving legacy data to the cloud for long-term retention and compliance, seven key factors should be considered:

  1. Accessibility — Once archived, how will users search and retrieve archived data and attachments they use in their daily work? Will the data still make sense once in the archive?
  2. Fidelity — Is the data maintained with fidelity to the original in terms of format, context, and compliance with the duty to preserve? This applies to both structured data and document attachments.
  3. Agility — Can the archive help IT cope with complex changes to business processes, compliance, and privacy requirements, and respond to legal holds?
  4. Security — Is the data protected according to a “zero-trust” assumption, providing multiple layers of role-based access and encryption?
  5. Privacy — Can sensitive data such as personal identifiers, credit information, and health history be protected separately from data needed in day-to-day operations?
  6. Data sovereignty — Can the archive manage data within the required jurisdiction without adding unnecessary management overhead, infrastructure and contract costs?
  7. Hosting options — Can the archive accommodate the preferred hosting model, whether managed services or public hyperscaler such as Amazon Web Services, Google Cloud, or Microsoft Azure?
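To make factors 1 and 3 concrete, the interaction between retention schedules and legal holds can be sketched in a few lines of Python. This is a hypothetical illustration only; the class and field names are invented for this example and are not part of any OpenText product API.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ArchivedRecord:
    """Hypothetical archived record with retention and legal-hold state."""
    record_id: str
    archived_on: date
    retention_years: int
    legal_holds: set = field(default_factory=set)

    def retention_expires(self) -> date:
        # The retention clock starts when the record enters the archive.
        return self.archived_on.replace(
            year=self.archived_on.year + self.retention_years
        )

    def is_disposable(self, today: date) -> bool:
        # A record may be disposed of only when retention has lapsed
        # AND no legal hold applies: holds always override retention.
        return today >= self.retention_expires() and not self.legal_holds

record = ArchivedRecord("inv-001", date(2015, 6, 1), retention_years=7)
print(record.is_disposable(date(2024, 1, 1)))  # True: retention lapsed, no holds

record.legal_holds.add("case-2023-42")
print(record.is_disposable(date(2024, 1, 1)))  # False: hold blocks disposal
```

The key design point this sketch captures is that disposal eligibility is a conjunction: a lapsed retention period is necessary but never sufficient while any legal hold remains attached to the record.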

Ultimately, a well-planned, cloud-based archive can give users simple, high-speed access to legacy data on a secure platform while radically reducing data hosting overhead and costs over the long term.

What benefits can IT derive from moving to a long-term archive strategy for their structured data and content?

  • Ease of deployment and updates — A proper cloud architecture allows the organization to focus on details that matter to its business, such as data fidelity and user access, rather than tedious deployment and upgrade details
  • Additional low-cost storage options — IT should have multiple options for storing their data, including file system native, cloud-native services such as Amazon S3 or Google Coldline, and encryption with customer-owned keys
  • Elastic scalability to support petabytes of data — An archive should be able to scale up and down as necessary to address ingest loads or data exports or to wind down to save on usage costs when data demands are reduced
  • Ease of search, retrieval and controls — Archive data should remain accessible through flexible and contextually relevant search, data viewing, and attachment viewing options without requiring significant custom development projects
  • Cloud commitments and marketplace transactions — Firms participating in consumption commitment programs can apply as much as 100% of their archive costs against their existing commitment through a managed services offering with OpenText. Marketplace programs are available with cloud vendors including Google Cloud, Amazon Web Services, and Microsoft Azure.

Solutions like OpenText™ InfoArchive address all aspects of archiving your most important legacy and business-complete data in the cloud through an innovative archiving offering. Some of the world’s largest companies entrust petabytes of high-value content and data to InfoArchive, taking advantage of managed services as well as low-maintenance, self-managed cloud archiving on public hyperscalers such as Amazon Web Services, Google Cloud, and Microsoft Azure.

Mike Safar

Mike Safar leads product marketing for OpenText information governance products and serves as a subject matter expert on information governance solutions and best practices. He has over 25 years of experience in product management and marketing of leading information governance products, spanning integrated document management, records systems, and most recently intelligent content analytics solutions. His past experience includes positions at Interwoven, Hewlett Packard Enterprise, and PC DOCS Group.
