Enterprise Content Management

How Does Content Add-Value in Your Organization?

It’s January, and what does that mean? Lists, lists, lists. It seems to me that as a society we’re rapidly becoming obsessed with boiling things down into a Top 10 list, and in January that seems to reach a fever pitch. Thanks, Letterman. Even my colleague Chris Walker has jumped on the bus with this list on SlideShare of “2013 IT anti-predictions”. Sure, I jest, but I have to admit, without my lists…I’d be lost. My to-do list, my goals list, my ‘don’t forget to take it’ list, my playlists for the gym, my list of people I need to see this month. Seriously, I think I have somewhere in the area of 10 apps on my iPad and BlackBerry dedicated to keeping my bits and pieces straight when I’m on the go, and I’d be lost without my Outlook Tasks. I never end my quest for the perfect app that does it all for me. But as I ended 2012, it became clearer than ever—there’s no way I will ever finish checking off the entries. I need to think about what’s really important for me in 2013 and prioritize!

That prioritization and goal setting are critical for all of us, more so than ever in our professional lives. I’ve read countless blog posts and articles over the past few weeks talking about the big-picture trends. Bob Evans calls out what he believes are the hottest issues for CIOs in this article from Forbes. These are trends and priorities I’ve heard echoed many times over in the discussions in and around OpenText. But when it comes to channeling my inner Kreskin, there’s one tidbit I picked out of my colleague Deborah Miller’s recent CMS Wire article that I’d like to delve into more. Deb pointed to recent IDC research that indicates: “The trend toward industry-specific solutions will be further driven by the increased participation of line of business (LoB) executives in IT investment decisions. In 2013, nearly 60% of new IT investments will directly involve LoB execs (with them as the decision maker in 25% of the investments).” Makes sense.
If there’s not a business case for using technology that either helps you make money, save money, or improve the life of someone you touch – be it a customer or a team member – then why embark on the effort? And when looking for those little golden nuggets of opportunity, few are better poised than those in the trenches to provide a practical view of the problems and the potential solutions. Whether you call it Kaizen, continuous improvement, Six Sigma, or something else, the philosophy of smaller incremental change is proven to deliver the potential for large-scale, impactful benefit.

What do you need to be successful? Ideally, you need to breed a culture that’s fluid, adaptable and insightful—and technology that can act as building blocks to meet that challenge. This is where I see the true advantage of supporting process transformation with an Enterprise Information Management (EIM) based approach.

Given that I spend my days focused on delivering compelling product experiences for our OpenText Content Server customers, a product that acts as the foundation of an effective EIM strategy, I got to thinking at a more pragmatic level. I would like to hear more from you, our Content Server customers, about your goals and projects for 2013. A few of the many questions I’m interested in hearing your thoughts on:

• What strategies are you focused on to increase the value from your content?
• What business processes have you content-enabled with Content Server, and what’s next?
• How do you prioritize projects? If you feel strongly about a project that hasn’t been deemed a priority, how do you build the business case?
• How mature is your information governance strategy? And do you dovetail process improvement initiatives when building on that strategy?

I think this could be a very enlightening discussion for many of us, myself included, so I really hope you’ll chime in.
Or, whether you’re already an OpenText Content Server customer or not, if you have a specific project you’d like to discuss, drop me a note at aclarke@opentext.com or on Twitter @alimclarke.

Read More

A Chilling Look at Unchecked Information Growth

“New information is like snow and legacy collections of data are like glaciers: each year, layers are added, some melt away, but the accumulation keeps growing.” So writes analyst Grego Kosinski in Stop Employees from Hoarding Electronic Documents, a new Contoural white paper sponsored by OpenText. It’s an apt analogy, one I quite like. The image of a glacier – immense and inert – conjures up what many IT pros are facing in the near future without a solid plan for shoveling today’s snow. Without a solid archiving strategy, flexibility in your business may soon prove positively glacial. Ok, that’s enough with the metaphor. 🙂 Seriously though, the massive volume of content and information inside your business is a problem that’s not going away. Compounding, exponential growth of data from email, business applications, documents, and files of every type presents a major challenge for enterprises that need to keep business information over the long term. An archiving strategy addresses an increasingly critical situation compounded by skyrocketing volumes of data, a worldwide tightening of regulatory and compliance requirements, a growing need for litigation preparedness, and the reality of budget constraints. The lifecycle of enterprise content and information starts with the business applications used most in your organization:

• Email: Management of email from on-premises solutions (Microsoft Exchange, Lotus Notes) and cloud email (Office 365 Exchange Online, Gmail).
• ERP and CRM suites: Business applications from firms like SAP require a reliable long-term archiving solution for millions of documents.
• Microsoft SharePoint: Managing SharePoint sites and mission-critical business information across your enterprise.
• File systems: More than just backup, organizations need a plan to ease the expensive storage burden on file systems and transition this content to archives or delete unnecessary content.
• Social/Mobile Content: New forms of collaboration in the enterprise, from social feeds to mobile communications, are only increasing the speed and time sensitivity of EIM.

Keeping a separate archive for each of these applications is proving too costly and complex. A centralized archiving strategy is a must.

Quantifying the Business Value of Archiving

Strategic archiving programs deliver long-term storage savings of between 20 and 40% compared to existing siloed archiving environments. These savings are delivered by a combination of rationalization of storage infrastructure, smarter storage decisions, and decommissioning of legacy systems.

Reducing Storage Costs: Indiscriminate choices about where to store content during various phases of its lifecycle are costly. The refrain “storage is cheap” doesn’t apply when the repository is bursting at the seams and is a disarray of un-indexed content and information. Long-term storage costs can be reduced with the following:

• Tiered storage: An enterprise-wide repository for long-term retention spans multiple storage devices that are cost-appropriate for the level of access required. A tiered approach to storing archived content supports the best storage mechanisms, including optical media, hard disk, cloud and tape. The retention-controlled archive can run either online or as near-line or offline storage. Only applications can access it – there are no capabilities for direct end-user access.
• Single-instance archiving: Especially in highly collaborative work environments, identical documents risk being stored several times. Single-instance archiving (SIA) ensures you keep the same document only once. Depending on the amount of expected redundancy of email attachments, SIA can reduce required storage space significantly.
• Compression: To save storage space, content should be compressed before it is written to the storage system.
Legacy Decommissioning: Multiple archiving repositories across legacy systems are a complex reality that can grow into a major problem for organizations. These systems are often maintained simply to reference historical information and become more expensive to manage over time as administrator skillsets focus on more current systems. A roadmap is required that leads to a single archiving strategy in one repository, with legacy decommissioning tools to migrate content or manage existing content in place. Security: Risks associated with unsecured archived information can be as costly as security breaches of more current enterprise information. Intellectual property and other sensitive organizational information in archives must be subject to corporate policies that protect all enterprise content. Timestamps can ensure that document components cannot be modified unnoticed after they have been archived, guaranteeing the authenticity of archived business documents. By encrypting the document data, critical data such as salary tables can be archived securely. Learn more about OpenText Enterprise Information Archiving.
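The single-instance archiving and compression ideas above can be sketched in a few lines. This is a minimal illustration of the general technique (content-hash deduplication plus compression before write), not a description of OpenText Archive Server's internals; the class and method names are invented for the example.

```python
import gzip
import hashlib


class ArchiveStore:
    """Toy content store illustrating single-instance archiving (SIA):
    identical documents are detected by content hash and stored once,
    and blobs are compressed before they are written."""

    def __init__(self):
        self._blobs = {}  # sha256 hex digest -> compressed bytes
        self._refs = {}   # document id -> digest of its content

    def archive(self, doc_id: str, content: bytes) -> bool:
        """Store a document. Returns True if new bytes were written,
        False if an identical copy was already archived (SIA hit)."""
        digest = hashlib.sha256(content).hexdigest()
        is_new = digest not in self._blobs
        if is_new:
            # Compress before writing to the storage layer.
            self._blobs[digest] = gzip.compress(content)
        self._refs[doc_id] = digest
        return is_new

    def retrieve(self, doc_id: str) -> bytes:
        """Decompress and return the document's original bytes."""
        return gzip.decompress(self._blobs[self._refs[doc_id]])
```

An email attachment forwarded to fifty recipients would produce fifty references in `_refs` but only one entry in `_blobs`, which is where the storage savings come from.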

Read More

OpenText Debuts Regulated Documents 10 at DIA EDM Europe

This is a guest post by Ethan Smith, Life Sciences Industry Strategist, OpenText. For the OpenText Life Sciences business, Europe is a critical market, and one where we have had great success. In late November, I was able to participate in and witness this first hand at the DIA European Electronic Document Management Conference in Munich. The conference had representation from Life Sciences companies across Central and Eastern Europe as well as a number of attendees from Asia. Presentations were made by regulators overseeing the submissions and approvals processes for the US FDA, EMA and the Chinese SFDA. Aside from the exciting agenda, this was a big event for OpenText as we debuted the newest release of our Regulated Documents offering. Reg Docs, as it’s commonly known, is now available for Content Server 10, our best-in-class ECM platform. The power of Reg Docs comes from the fact that it is a configuration-based solution, rather than a customized version of the underlying platform, enabling our Life Sciences customers to deploy Reg Docs alongside other solutions on the same instance of Content Server. We gave a number of demos of the solution and received very positive feedback from the attendees. It became clear as we looked around the hall of exhibitors that OpenText has taken a unique and powerful approach to the document management challenges facing this industry. Many of the other solutions are custom applications that may be built on strong technologies, but the heavy customizations have rendered them difficult to maintain, upgrade and support. OpenText Regulated Documents seems to be the only solution of its kind in this space, and we were very proud to show it off. To learn more, check out this whitepaper.

Read More

Impressions from ARMA 2012

It has taken me a couple of weeks to gather my thoughts from ARMA 2012, held in Chicago this year. As usual, it was a great conference; it provided lots of opportunity to meet with old friends and lots of opportunity to talk to existing and potential customers. It was also a lot of information and interaction crammed into a couple of days, hence the lag in pulling together my thoughts. First of all, this year, more than any other, I think I can say that Records Managers get it. For the last few years, quite honestly, not long after the FRCP changes in 2006, we have been telling anybody who would listen that Records Management has to expand beyond the traditional scope of “Official Records”. Records Management and Records Management practices now have to address much more than the Official Records if organizations want to address much of the risk and cost inherent to the ever-expanding volume of Electronically Stored Information (ESI). This idea has now become firmly entrenched in Information Governance, in no small part due to the work of some of the thought leaders in Information Governance like Barclay Blair and Mark Diamond. Certainly at ARMA, there was lots and lots of information and vendor positioning in Information Governance, and it was evident to me that Records Managers were embracing Information Governance. But (and there is always a but), I think there is still a big gap between embracing Information Governance and doing anything about it. It is that gap that I tried to address in my Solution Showcase presentation, Evolving RM to Information Governance to Protect Your Organization. The basic premise is that, in order to take advantage of the benefits of retention control, it is necessary to be able to classify content. And, in order to classify content, we must be able to capture it in some way for classification.
This might sound obvious and seem simple, but these two problems, capture and classification, are the problems I see most organizations struggling with on a daily basis. They know the benefits of retention control, in particular the ability to defensibly delete content, but they are also very worried about the impact classifying and capturing content will have on end users and the way people work. First of all, there is the problem of capture. We need some way to put the content being created under management. There are lots of options in this area. The OpenText strategy is to create integration points with the costliest and riskiest sources of content, and provide capabilities to capture and centrally archive that content. In particular, some of the costliest and potentially riskiest content can be found in Exchange, Notes, SharePoint, SAP and Oracle. Off-loading content from these sources can reduce IT costs and storage costs, but more importantly, it places content somewhere it is searchable, can be placed on litigation hold, and can be classified for retention and disposition. I also discussed what I believe to be one of the most disruptive technologies faced by ECM and RM professionals – file synching. File synching could potentially be one of the biggest risks, in terms of users loading and sharing content using consumer-based file synching services. Not only is there risk in terms of IP loss, but we also have to assume that users are circumventing our “official” ECM solutions. But file synching is also a massive opportunity for RM and ECM. If file synching is attached to your ECM solution, as with OpenText Tempo, it is also one of the best ways to encourage users to place content into the ECM system, not because we tell them to, but because they want to. File synching is all about making life easier for the end user.
The fact that we are capturing content for retention, disposition, security and litigation hold is completely transparent to the end user. I also took on one of the most contentious areas in Records Management and Information Governance these days – the battle between centralized management and manage-in-place. This is a very big topic, but I think the short-form answer is that most organizations are going to end up taking a hybrid approach. The areas where there is risk and cost in the lead application, like email, will probably continue to be centrally archived. However, for content that resides in systems with built-in business logic but no retention capabilities, a case can be made for connecting to, and indexing, this content to manage it in place. In the end, in-place vs. centralized will be another one of those balancing acts that we associate with Information Governance. This is a topic that will continue to be top of mind for many organizations as they try to bring more and more of their unstructured information under control. Capture is only the first step in getting content under control. It also has to be classified. Again, this is an area where RIM professionals are going to need to balance the needs of users and the needs of the organization. The presentation covers the pros and cons of the different methods for classification. There is no one single right answer, and most organizations will end up using most, if not all, of these methods based on the content that needs to be classified and the business process associated with that content. I took some additional time to go through our latest release, Auto-Classification, and the requirement for defensibility in auto-classification. The presentation finishes up with five uncomfortable questions to ask IT and Legal. I think asking, and answering, these questions will provide some of the impetus it will take to move from embracing Information Governance to actually doing it.

Read More

The Email Pyramid

One of the key issues in Records Management is how to manage email. Email is the dominant content type requested in FOIA and eDiscovery matters because it often contains the “smoking gun,” and because people tend to be unguarded in their actions while using this medium for communication. Therefore, email must be governed to reduce liability. Email also serves as a key business record that documents correspondence between parties, often as the only record for important decisions or acknowledgements. Email volume is growing exponentially and is a significant issue in most organizations. There are many types of email solutions, but the prevailing approach today is one that is a component of an Enterprise Information Management (EIM) suite such as OpenText Content Lifecycle Management. This EIM approach provides Information Governance for all content, regardless of media, managed by an integrated Records Management solution. The solution applies governance and retention rules to all content, from the time it is created or ingested until its final disposition. As email volume continues to grow, organizations face several issues:

• Optimization of the production email system
• Capture of some email as important business records
• Retention of email according to the organization’s retention policy
• Deletion of transitory or non-record email

An EIM system, using the latest integrated archiving and auto-classification tools, solves this dilemma in a practical way. This approach involves a concept called the Email Pyramid. I initially learned about an early form of this approach from Benjamin Wright, an attorney who specializes in this area, at an ARMA conference several years ago. I also would like to acknowledge the contribution of Jason Baron, Director of Litigation for NARA, who also promotes a similar approach to email for classification purposes. I have added some ideas to the concept, including a pragmatic approach for implementing email governance in the real world.
The Email Pyramid

The general Email Pyramid concept is that there are three components of email in an organization. As shown in Figure 1, there are Role Based Classification, Business Records, and the remaining vast volume of unclassified emails that most organizations simply do not have the resources or tools to classify. I will explain each component and how current EIM solutions address these requirements.

Figure 1: The Email Pyramid

Role Based Classification

There are a small number of key individuals in an organization such as the president, general counsel, chief financial officer, and so on, whose email should be retained for longer periods of time than other email. The rationale for this is that this email contains a historical record of key activities within the organization. It often contains background information on important events and is frequently used in litigation. For government, this email is retained permanently. For private industry, it may be retained based on a retention schedule that reflects its importance, and each organization will schedule these records appropriately. For classification, using tools within the EIM system, these emails are classified based on role. All emails for these individuals are retained and moved into the EIM system. Identification of the emails for archival and classification is a straightforward process using standard email rules.

Business Records

The middle tier, Business Records, includes those emails that are used to document business transactions. These may contain the only record of an approval for a change order, authorization to proceed for a contract, acknowledgement of a transaction, and so on. Many organizations have their users print these emails and file them in a project or contract folder. Often they remain in the email system, ungoverned and unclassified. These emails should be moved from the production email system to the EIM system.
Ideally, automated business processes accomplish this without manual intervention. All notifications, acknowledgements, and other correspondence that is generated or received by a business application should be configured to be stored as records within the EIM system, with appropriate metadata. These records are automatically classified based on Document Type, such as Contract, Change Order, FOIA Request, and so on. For processes that are not yet automated, the EIM system should be configured to allow users to drag and drop emails from the email system into the appropriate folder on the EIM system. It is important that the manual process be as easy as possible so that users will adopt the process without significant resistance. The result is an EIM solution that maintains a complete history of business transactions, including all supporting documents and emails. This approach supports eDiscovery, FOIA, compliance and audits, making all required information available using standard search tools.

Auto Classification

The remaining emails often number in the millions. There is no practical way to manually identify or classify this volume, and having users do it themselves is usually not feasible. Today, however, there are auto classification tools available to help accomplish this daunting task. These emails are archived, and the email archiving tools eliminate duplicate emails. This significantly reduces the number of emails being archived. Auto classification tools such as OpenText Auto-Classification automatically classify the remaining emails according to the RM retention schedule. A key component of this approach is to create a “Transitory” big bucket that represents those emails that are non-records. You assign 180 days or a similar rule to this category, and in this manner a very significant portion of the email volume, sometimes 60 to 70 percent, is destroyed in a legally defensible manner.
A recommended approach for this type of email classification is to use a big bucket retention schedule. The auto classification engine must be trained, using example emails, to recognize which classification an email belongs to. This task is much more difficult using a typical granular retention schedule. The result is an email archive that has deleted much of the duplicate and transitory content, and which contains remaining emails that are classified according to the official big bucket retention schedule. This approach is not 100% accurate. Accuracy of 60% or better is considered successful. A good classification engine, using ongoing training, should be able to attain 70% to 80%, or better, accuracy. This “good enough” approach, when compared to no governance at all, meets most legal requirements.

Summary

Email management is a complicated subject, and most organizations use an approach that is piecemeal at best. Best practice using current technology is to treat email like any other business record, saving what is important, discarding the chaff, and applying standard retention policy to what is retained.
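The "transitory big bucket" idea lends itself to a simple sketch. The schedule below is hypothetical (the category names and periods are invented for illustration, not an official retention schedule), but it shows how a broad-category schedule turns defensible deletion into a mechanical date check once content has been classified.

```python
from datetime import date, timedelta

# Illustrative "big bucket" retention schedule: a handful of broad
# categories instead of hundreds of granular record series. These
# categories and periods are hypothetical examples only.
RETENTION = {
    "Transitory": timedelta(days=180),             # non-record email
    "General Business": timedelta(days=365 * 3),
    "Contracts": timedelta(days=365 * 7),
    "Executive Correspondence": None,              # retained permanently (role-based tier)
}


def is_disposable(category: str, archived_on: date, today: date) -> bool:
    """True if the record's retention period has elapsed, so it can be
    destroyed under the schedule (legal holds are ignored in this sketch)."""
    period = RETENTION[category]
    if period is None:
        return False  # permanent records never age out
    return today - archived_on >= period
```

A classifier that routed the 60 to 70 percent of email that is non-record into "Transitory" would let that entire slice age out automatically after 180 days, which is where the bulk of the volume reduction comes from.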

Read More

The State of the Art in Records Management – Part 2

In my previous blog, The State of the Art in Records Management – Part 1, I discussed the new market drivers for records management:

• eDiscovery/Compliance
• Audit Readiness
• Big Bucket Retention Schedules
• The New Definition of a Record
• The Obama Managing Government Records Directive

In this blog, I will cover the vendor response to these new market drivers. As recently as five years ago, customers often had to build custom integrations between “best of breed” products to achieve an enterprise-class solution for ECM and RM. This approach is expensive, and once accomplished it is expensive and difficult to modify, expand or upgrade. It did work, however, and the resulting solutions often provided significant return on investment. The challenge with these systems is to adapt to rapidly changing technology and new market drivers. Over the past five to ten years there has been a frenzy of acquisitions by the major players, resulting in market consolidation. IBM, OpenText, EMC, Oracle, and HP have now consumed most of the independent vendors in the Enterprise Content Management space. For example, IBM purchased Tarian Software, FileNet and Datacap; EMC purchased Documentum and Captiva; HP purchased Autonomy and Tower Software; OpenText purchased Hummingbird, Metastorm, Captaris, Vignette, RedDot and Global 360; and Oracle purchased Stellent and FatWire (this is only a small sample). Over 100 formerly independent vendors are now consolidated into the top five. Many of those custom integrations have been affected by the market consolidation. Some of those best-of-breed products are now owned by different companies, and future compatibility with custom solutions may be in question. As a result of market consolidation, major vendors are now offering comprehensive ECM suites with a broad set of features and functions.
Customers can purchase those suite components they need, and they can add others as needed without (theoretically) having to develop significant custom integrations. Some of the key feature sets now available as components, or “modules”, of ECM suites include:

• Records Management as a component of a larger ECM solution
• Combination of Electronic and Physical Records Management, with DoD 5015.2
• Email Archiving and Email Management, with Records Management (Exchange, Lotus Notes, GroupWise and now Google)
• Integration with MS SharePoint and MS Office
• Business Process Management, Workflow, Business Process Analysis
• Auto Classification
• Litigation Holds, Litigation Support
• Enterprise Storage Architecture
• Document Management with Version Control
• Integration with Key Business Applications such as SAP and Oracle
• Basic and Advanced Search, Full Text Search
• Advanced Reporting and Business Analytics
• Digital Asset Management
• Web Content Management, Portals
• Social Media Management and Governance
• Mobile Extensions to phones and tablets
• Capture of paper, fax, and electronic input streams
• Electronic Forms, E-Signature
• Output Management
• On-Premise or Cloud Hosting

This suite approach is designed to allow customers to address the full complement of requirements, driven by the market drivers outlined in Part 1 of this blog, using a homogeneous set of modules. Users can implement point solutions and add new applications without having to re-architect the infrastructure. Customers can purchase the various components from a single vendor and thus manage contracts and support much more easily. Software upgrades (again, theoretically) can be managed in a much more harmonious manner because the components are all from the same code or solution base. This is all good news for organizations that are planning to implement enterprise solutions. Technology has kept up with the new market drivers, and vendors are offering solution suites that address user needs.
For Federal agencies that are making plans to move to a completely digital records environment by 2019, as the new NARA/OMB directive mandates, available solutions from companies such as OpenText are ready to meet the opportunity. Customers must challenge their vendors during the selection process to prove that all of these components do indeed work together as advertised. Having said all that, successful solutions are not based on technology alone. Clear requirements, budget, resources, change management, operational support, training and all of the intangibles are necessary to make deployments as successful as envisioned. In this new technology suite, Records Management becomes an integrated component of the overall solution. Ideally records classification is performed automatically upon indexing of content within a business process. The new Big Bucket Retention Schedule facilitates efficient use of automated technologies to classify records. Automated tools such as Auto Classification are used to classify large volumes of content such as email and file shares. Business transactions are automated using Business Process Management, and all related documents and email are automatically stored and classified. User classification of content, strictly for Records Management purposes, is kept to a minimum. Information Governance is applied when content is created, captured or ingested, making RM part of the DNA of business applications. Email, instant messages, social media, correspondence of all types, reports, and more are all managed by the solution, including physical files. Searches are media neutral, and when electronic content is located it can be retrieved if a user has the proper system privileges. If the content is a physical file, then the system should indicate the location of the file and offer check out and check in features. 
All business transactions, if the system is fully implemented, include a full audit trail of ancillary original documents that facilitate audits, FOIA requests, and discovery. Public or customer facing transactions provide better customer service. Elimination of paper from the organization, as mandated by the Obama directive for Federal agencies, results in very significant cost savings – for storage, staffing, user productivity, and transaction cycle times. Fully digitized records and business processes result in shorter cycle times and huge productivity and cost savings. So, as you can see, the new market drivers are being addressed by technology in a way that makes it quite reasonable to implement enterprise wide solutions that provide a rapid return on investment with a high rate of success, establishing an infrastructure that can evolve and grow over time.

Read More

Faster, Higher, Stronger: OT Archiving Test Results

Guest post by Paul Briggs, Senior Product Marketing Manager, OpenText. In the spirit of the upcoming London Olympic Games, we decided to title this post “Faster, Higher, Stronger” to represent OpenText’s mission to deliver industry-leading product performance. OpenText continually runs performance testing of our software integrating with key partner platforms. Earlier this year, OpenText and Microsoft conducted performance and scalability testing on the email monitoring and records management components of the OpenText ECM Suite running on Microsoft SQL Server 2012 data-management software. See the record-setting results here: OpenText on SQL Results. More recently, testing with IBM and HP benchmarked OpenText Archive Server’s performance on partner hardware running write/retrieval operations on SAP data and documents. The results are good news for customers running OpenText Extended ECM for SAP across a variety of server environments, including IBM and HP. Businesses running SAP rely on OpenText Archive Server, as part of the OpenText Enterprise Content Management (ECM) Suite, for efficiently storing and retrieving large volumes of fixed content. These solutions help reduce storage costs and improve system performance by archiving SAP application data and documents in one central repository. This also gives users quick and easy retrieval, sharing, forwarding and reuse of content in the correct business context, and according to strict compliance demands. IBM: Archive Server Testing on IBM Power7 Servers At the IBM Innovation Center in Ehningen, Germany, OpenText achieved top marks for Archiving and Lifecycle Management scaling on IBM Power7 servers. Testing scenarios focused on SAP document archiving and lifecycle management, showing that customers could save as much as 90% in long-term storage costs. That’s based on Archive Server’s very high write/retrieval performance on the IBM Power7 platform and support for inexpensive tape storage solutions.
Let’s look at those two areas in more detail.

Performance: Document Throughput

For archiving documents out of SAP to hard disk, OpenText Archive Server achieved a throughput of 1.8 million documents per hour, based on a 20 kb file size running over 1 Gbit/s network bandwidth. At 500 kilobytes, approximately 800,000 documents per hour were archived, capped by the network bandwidth limit.

What does that mean in real-world terms? SAP documents typically average no more than 250 kb, so customers can expect well over one million archived documents per hour with continuous encryption of data transmission. In a continuous 10-hour day, that equates to between 10 and 15 million SAP documents archived. The figure is much higher for serving read accesses to the Archive Server with disk-based storage systems.

These throughput levels mean that customers can deploy a single Archive Server to meet their needs with capacity to spare. In this time of squeezed IT budgets, this is the type of news customers love to hear. Not only does this performance level keep Archive Server requirements to a minimum, customers can also utilize existing tape infrastructure with IBM Tivoli Storage Manager (TSM) or IBM System Storage Archive Manager.

Efficient Tape Support: Unique to OpenText Archive Server

Archiving SAP documents on a secure and cost-effective near-line storage option like IBM TSM is a unique capability of OpenText Archive Server. The cost savings of tape storage are up to 90% compared to other storage options. OpenText Archive Server supports TSM with an intelligent algorithm that sorts document reloads according to their sequence on tape. This eliminates unnecessary tape seeks and tape changes, and minimizes wear and tear on the tape.

HP: Archive Server Testing on HP ProLiant DL580 G7 Servers

Working in conjunction with HP, OpenText conducted the testing in May 2012 at the HP Partner Technology Access Lab in Houston, Texas.
The test environment was run on an HP Converged Infrastructure, including HP ProLiant DL580 G7 servers. The test configuration was designed by OpenText, HP and SAP to prove the performance capabilities of a system designed from the ground up to:

• Provide a holistic, value-based and integrated approach to managing enterprise content
• Reduce risk by enabling better management of governance and compliance
• Lower both operational costs and the total cost of ownership (TCO) of an ECM solution
• Extend the value of the SAP® Business Suite to enable business transformation

OpenText surpassed the SAP certification requirements. All functional requirements of the ArchiveLink test were fulfilled, and the ArchiveLink load test showed performance well above the required KPIs. Additional tests independent of SAP showed even higher performance. The strong results underline the strength and scalability of a single Archive Server instance on HP ProLiant servers, and give OpenText and HP customers the confidence to deploy an SAP-certified, enterprise-ready solution. SAP customers can store ~10 million documents a day (at 10 hours/day of operation) on a single Archive Server.

High Volume Archiving: Industry Examples

What these performance benchmarks show is the ability of OpenText Archive Server to handle the massive volumes of data that are increasingly typical in many industries today. Tens of millions of archived documents may seem a tad astronomical, but it’s the reality for certain transaction-heavy industries. Here are a few examples:

Telco: Millions of monthly bills are met with customer inquiries in the days after issue. Call center employees fielding those calls require fast and easy retrieval of archived items to access data and solve issues for customers.

Banking: High performance when reading archived documents is also critical for end-user scenarios in banking. More than 400,000 simultaneous online banking sessions are common in Europe, each requiring unique authentication identifiers.

Pharma: In industries like pharmaceuticals, documents are comparatively very large, containing extensive documentation to support rigorous regulatory compliance and the volumes of testing done prior to product release.
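As a sanity check on the throughput figures above, here is a quick back-of-the-envelope calculation. It assumes an idealized 1 Gbit/s link with no protocol or encryption overhead, so it yields only a theoretical upper bound, not a benchmark result.

```python
# Upper bound on archiving throughput imposed purely by network bandwidth
# (idealized: no protocol or encryption overhead).

def docs_per_hour_network_cap(bandwidth_gbit_s: float, doc_size_kb: float) -> float:
    """Documents per hour that a link of the given bandwidth could carry."""
    bytes_per_second = bandwidth_gbit_s * 1e9 / 8   # Gbit/s -> bytes/s
    doc_size_bytes = doc_size_kb * 1e3
    return bytes_per_second / doc_size_bytes * 3600

# At 500 kb per document, the 1 Gbit/s wire caps throughput at 900,000
# docs/hour ideal -- consistent with the observed ~800,000/hour once
# real-world overhead is accounted for.
cap_500kb = docs_per_hour_network_cap(1.0, 500)

# At 20 kb per document the ideal cap is 22.5 million docs/hour, far above
# the measured 1.8 million/hour -- so at small sizes the bottleneck is the
# server, not the network.
cap_20kb = docs_per_hour_network_cap(1.0, 20)
```

This simple model explains why the benchmark reports the 500 kb case as "capped by the network bandwidth limit" while the 20 kb case is not.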

Read More

Why the “RMA” is No Longer Relevant

The Records Management Application (RMA) as defined in days past is now an anachronism. The traditional RMA was a back-end system that only managed records once they were declared. This “view from the basement” approach is fraught with issues such as lack of funding, lack of resources, and low organizational priority. I have been to dozens of seminars and conferences devoted to Records Management, and they consistently refer to the RMA as if it were a stand-alone system out of context from the mainline business of an organization.

In today’s world, driven by issues such as eDiscovery, audit readiness and compliance, this is no longer the case. The industry has transformed from Records Management to Information Governance. Content must be governed from the time it is created, captured or ingested. ALL content must be governed, not just declared records. Transitory records must be explicitly classified so that they can be destroyed according to policy. This new paradigm, called Content Lifecycle Management, builds Information Governance into all mainline business processes.

Enterprise Content Management suites now include Records Management as a subset of functionality, so that it is built into the DNA of business transactions in a way that is transparent to end users. For example, a workflow process supporting a contracts management effort automatically saves all documents into the ECM system, including metadata that is part of the workflow process. Saved documents are classified automatically by document type against the retention schedule, using this metadata. No individual has to “declare” a record; it is done as a function of the business process. The benefit to the organization is a full audit trail of each business transaction that can be used for legal discovery, litigation holds, audits, and more.
Transitory documents and email are destroyed according to published policy, so the organization benefits from lower storage and discovery costs. Potentially HUGE cost savings are realized through lower discovery costs, fewer sanctions and fines, and the avoidance of failed audits due to lack of documentation.

Auto Classification and eDiscovery tools are now included in some ECM suites, so organizations can deploy tightly integrated solutions that encompass the majority of Information Governance requirements. Business Process Management (BPM) tools orchestrate business processes, marrying business transactions with unstructured content and bridging information silos that span ERP systems such as SAP, custom database applications, email, collaboration tools, mobile and social media. BPM is a mainline technology, funded at the highest levels of an organization to improve productivity and lower transaction costs. Deploying BPM as part of an ECM suite that includes this complete audit trail of transactions and records is now the state of the art in our industry. RM is essential to this approach, but it is an integrated component, not a stand-alone system.

Yes, the traditional back-end RMA is DEAD.
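To make "classification as a function of the business process" concrete, here is a minimal sketch of metadata-driven record classification. The document types, retention periods, and `Document` structure are hypothetical illustrations for this post, not OpenText's actual schema or API.

```python
# Sketch: classify a record from workflow metadata alone -- no end-user
# "declaration" step. Types and retention periods are invented examples.
from dataclasses import dataclass
from datetime import date, timedelta

RETENTION_SCHEDULE = {            # document type -> retention in days
    "contract":       7 * 365,
    "invoice":        6 * 365,
    "correspondence": 2 * 365,
    "transitory":     120,        # non-records, defensibly destroyed
}

@dataclass
class Document:
    name: str
    doc_type: str                 # set by the workflow, not by the end user
    created: date

def classify(doc: Document) -> tuple[str, date]:
    """Return (record series, disposition date) from workflow metadata."""
    days = RETENTION_SCHEDULE.get(doc.doc_type, RETENTION_SCHEDULE["transitory"])
    return doc.doc_type, doc.created + timedelta(days=days)

series, dispose_on = classify(Document("MSA-2013-001.pdf", "contract", date(2013, 1, 15)))
```

Because the workflow supplies `doc_type`, classification and the disposition date fall out of the business process automatically, which is exactly what makes the resulting audit trail defensible.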

Read More

The State of the Art in Records Management – Part One

Over the past couple of years I have come to the realization that, with the many changes taking place in our industry, there is a new “State of the Art” in Records Management. This change has come about because of new market drivers combined with new vendor solutions in response to those requirements. Part One of this blog focuses on the new market drivers affecting the industry. Part Two will focus on how the industry has responded to these new drivers.

eDiscovery

The Federal Rules of Civil Procedure dictate that ALL available documents are discoverable in a court case. This includes electronic content, regardless of media. Email is the largest type of electronic content requested in eDiscovery. Instant messages, voice mail, copies, drafts, personal correspondence, and so on are all discoverable. Therefore, it does not matter whether your organization has declared “official” records; all content, declared or not, is discoverable.

Discovery costs are very high. The ramifications of missing documents in a litigation can be very high as well, leading to fines, sanctions, and negative judgments; in some cases even jail time for defendants. Therefore, it is very important that Information Governance is applied to all content so that it can be found easily, automatically classified to the correct retention rule, and discarded if it is not relevant. This approach lowers discovery and storage costs and eliminates or significantly reduces the missing-documents issue.

This approach, called Content Lifecycle Management, applies Information Governance to all content from the time it is created, captured, or ingested, all the way through final disposition. Governance is applied to email, social media, electronic documents and all their versions, file shares, SharePoint, and content created or captured by business applications such as SAP, case management applications, collaboration tools, and more.
This market driver has changed the foundation of records management, leading to a redefinition of a Record. In the past, a record was “declared” once its business process was completed, and the Records Retention Schedule defined how this static record was to be filed, stored, and retained. File Plans and Retention Schedules were based on a paper paradigm, focusing on how to organize boxes so that they could be accessioned, archived, and dispositioned effectively.

The new definition is “everything is a record.” Therefore, all content must be managed. This approach requires a new Big Bucket record series called “Transitory.” Using this approach, content considered a non-record, such as personal email, non-business correspondence, drafts, copies and so on, can be explicitly classified as Transitory. A retention rule is applied, such as 120 days, and this Transitory content can then be defensibly destroyed. This reduces your content and storage requirements significantly and lowers discovery costs.

In this new paradigm, Records Management (RM) is a subset of an Enterprise Content Management (ECM) system. The ECM solution enables RM to become part of the DNA of critical business processes such as Human Resources, Accounts Payable, Finance and Accounting, Case Management and more. These processes receive and create records that are captured and indexed as a function of the business process, so that RM classification is automatic. Building these mainline business processes to include RM produces a full audit trail for all business transactions. This supports the next key new business driver: Audit Readiness.

Audit Readiness

Audit Readiness is a term coined by some U.S. Department of Defense organizations to refer to the ability of an organization to respond quickly, accurately, and completely to an audit. Paper-based and non-automated approaches to this requirement can produce negative results and serious ramifications.
An automated process, on the other hand, allows organizations to relatively easily produce a full audit trail of their business transactions, reducing the time and effort required to respond to an audit. Because failed audits can result in costs running to millions of dollars depending on the context, this issue has come to the forefront as a market driver. Using the Content Lifecycle model, this complete audit trail of the information supporting mainline business processes is attainable. For example, using OpenText’s Extended ECM for SAP, all supporting documentation around key business processes is captured, classified and retained based on a set of business rules.

The Presidential Memo – Managing Government Records

President Obama issued a memorandum to all agencies on Nov. 28, 2011 requiring each agency to appoint a senior official to deliver a plan for moving the agency from paper-based records management to electronic records, specifically email, social media, cloud solutions, and so on. Each agency has submitted its plan to the National Archives and the Office of Management and Budget. This summer a directive will be issued to all agencies, incorporating a number of ideas from agencies and the public on how to move to electronic records. As is the case with many government directives, this is an unfunded mandate. However, moving from paper to digital recordkeeping and processes, as we know, has a very significant return on investment. Therefore, any solution deployed to meet this mandate, if deployed wisely, will pay for itself many times over.

Doing More with Less

One of the tag lines in the U.S. Federal Government these days is “doing more with less.” This refers to the dynamic of shrinking budgets and resources combined with increasing volumes of information to manage. It applies to the private sector as well as all levels of government in these challenging economic times.
Technology and automation are key to successfully meeting this challenge. Integrating RM and Information Governance into key business processes, archiving email, and adopting new tools such as Auto Classification all help organizations cope with these increasing volumes.

Big Bucket Retention Schedules

The new paradigm for RM includes a redefinition of Records Retention Schedules. Legacy schedules were developed around a paper-based paradigm and are typically very granular, with many arbitrary retention rules that are now unnecessary and undesirable. Many organizations have hundreds or even thousands of individual record series. In the past several years there has been an increasing trend toward Big Bucket record schedules that boil everything down to just a handful of “buckets.” In the context of an ECM/RM solution, the ECM system contains all the metadata needed to search for and find records, so the retention schedule only needs to determine how long to keep them. A streamlined Big Bucket schedule is easier to implement, is more intuitive, and makes compliance much more effective. The Government Accountability Office (GAO) is a leader in this trend, with a new schedule that contains three big buckets and a total of 27 sub-categories. Other agencies are following GAO’s lead, and similar moves are being made in the private sector. NARA has endorsed this approach, so it is becoming increasingly accepted.

To recap, the new market drivers that define the new “State of the Art”:

• eDiscovery and the Federal Rules of Civil Procedure
• Redefinition of a Record to Include Everything
• Audit Readiness
• Presidential Mandate to Move from Paper to Digital Records
• Do More with Less
• Big Bucket Retention Schedules

Stay tuned for Part Two of this blog: how the industry has responded to these new drivers.
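As a footnote on Big Bucket schedules: the collapse from many granular series to a handful of retention decisions can be sketched in a few lines. The series names, bucket names, and retention periods below are invented for illustration; they are not GAO's actual schedule.

```python
# Sketch: mapping granular legacy record series into a handful of "big
# buckets". All names and retention periods here are illustrative only.
LEGACY_TO_BUCKET = {
    "AP-01 Vendor Invoices":     "financial",
    "AP-02 Expense Reports":     "financial",
    "HR-14 Performance Reviews": "personnel",
    "HR-22 Benefits Enrollment": "personnel",
    "GC-03 Meeting Minutes":     "administrative",
}

# With the ECM system holding the findability metadata, the schedule only
# has to answer one question per bucket: how long do we keep it?
BUCKET_RETENTION_YEARS = {"financial": 7, "personnel": 30, "administrative": 3}

def retention_years(legacy_series: str) -> int:
    """Look up retention via the bucket, not the granular series."""
    return BUCKET_RETENTION_YEARS[LEGACY_TO_BUCKET[legacy_series]]
```

Five legacy series reduce to three retention decisions here; in practice hundreds or thousands of series collapse the same way, which is what makes the streamlined schedule easier to implement and comply with.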

Read More

Implementing an IG Program – Where to Begin?

I am not sure if this has anything to do with the latest heat wave in Ottawa, Canada – yes, we are experiencing summer-like weather in March. Trust me, I am not complaining, but spring is in the air and it seems to be making everyone wake up to a very important topic: where to begin with information governance? I have received a flurry of inquiries over the past week, all with a common theme – where should our organization start?

Something that is critical, and reiterated by many, is to ensure that the information governance program lies with the Chief Information Officer and Chief Legal Officer, while involving records management and the key stakeholders from different business units. It is a team approach, not something that can be successful if driven by only one team.

As opposed to writing an extensive blog post on all the key considerations, I have decided to talk about just one: start with the riskiest content. It is not always possible to conquer all requirements at once, so look at your riskiest content first. There is nothing wrong with a phased approach; proving success in small increments can be a good strategy.

One of the riskiest types of content is email. Why? Simply because no one ever gets rid of it, or if they do, it is done on a whim with no consideration of its content. This kind of activity can translate into enormous exposure to risk due to the likelihood of smoking guns and spoliation. So, can email be managed? Absolutely. I strongly encourage flexibility, specifically when it comes to classifying email – one approach will not fit all. Some users will want the ability to drag and drop; others will want to choose their classification from picklists or favourites, while still others will want the system to classify emails automatically. All methods are encouraged, as you want to make sure you provide a system the end user will actually use.
As an example, take a look – well, actually have a listen – to how NuStar Energy addressed its email management challenges. NuStar Energy L.P. is a publicly traded limited partnership based in San Antonio, with 8,417 miles of pipeline; 89 terminal and storage facilities that store and distribute crude oil, refined products and specialty liquids; and two asphalt refineries and a fuels refinery with a combined throughput capacity of 118,500 barrels per day. With operations in the United States, Canada, Mexico, the Netherlands, the United Kingdom and Turkey, the company uses OpenText Email Management for Microsoft Exchange to help reduce the cost and risk of mismanaged corporate email.

“Having all email in one location, being able to search in one place and put a legal hold in one location instead of potentially seven or eight, is huge for us on the legal end. It’s all managed by OpenText.” — Clint Wentworth, Records and Information Manager, NuStar Energy

There has been a lot of traction around the topic of auto classification. A very important element is to make sure the offering is defensible as well as transparent. Defensible, so that the organization can demonstrate to courts and auditors the processes in place to test, tweak and monitor the way information is being classified. Transparent, so there is no disruption to the way our businesses work.

Yes, there is a lot to consider when starting to implement an information governance program. A solid strategy, involvement from key stakeholders and addressing your riskiest content first is a sound way to begin!

Read More

Governance: It Doesn’t Matter What, It Really Matters Who

As I just got back from LegalTech NY, where I spent many hours speaking with General Counsels and IT Directors about their business requirements, I wanted to share and repost a blog that is relevant to those discussions. The following is a guest post by Dave Martin, originally posted on Dave Martin’s blog on SharePointProMagazine. Read the other posts in this series here.

Over the years, one of the many things I’ve been involved with is governance. To most, the word governance is synonymous with compliance, which is in turn synonymous with records management. After that, the focus becomes very specific. What I recommend people do when trying to understand how they should approach governance is to treat it as a strategy, and make sure that strategy involves and intertwines three things: people, process and technology. If this sounds familiar, it was an integral part of the first post I wrote in this series, on understanding SharePoint from a big-picture perspective. When it comes to governance specifically, there is a certain part of this triumvirate that stands out: the people. We often run headstrong into governance deployments without really understanding who needs to be involved before the code hits the servers and processes are under way.
The very first step organizations need to take is defining the small group who will steer the solution to and through implementation. Obviously IT pops up first as we define this working group, and they are unquestionably a very big part, as they will be responsible for the technology doing what it needs to do. Another group that should be considered a bit of a no-brainer is the group or department – or in many cases, the individual – responsible for records. This person may by title be the compliance officer, records manager, IT security or legal counsel; regardless, they are responsible for the information policy management of the organization. And last, but certainly not least, we must include someone, or some group, that represents the line-of-business worker, or end user.

Surprisingly, I have seen this last group consistently excluded from the planning process. Not because they are a problem or difficult to work with, but because the people who are actually going to use the solution are often an afterthought – or, as IT would consider them, the customer. DO NOT forget to include this group! At the end of the day they will literally make or break the deployment’s success, causing problems for the other groups at the table when they don’t understand the technology (frustrating IT) or don’t execute according to policy (putting the company at risk).

Once we have the right contributors at the table, we can start to define the governance strategy. When people are defining their governance strategy, I always recommend they ask themselves a few key questions to better understand what they want to do, who it will affect and what they need to do it. Once these questions have been answered, a plan can be more easily defined.
The first question is: do you understand your content? This is very important and can also be put as a statement: know your content! We have content broadly spread across our environment, not just in SharePoint. If we are planning to move large portions of that content into SharePoint – file share replacement is one of the top uses of SharePoint – think about what you are moving over. Is it relevant data? Is it data that must live under compliance? Is it duplicate data? Is it active data?

That last question is an important one in terms of SharePoint. SharePoint is an active content solution, and a relatively costly place to store content. If you are moving massive volumes of data into SharePoint, it simply does not make sense, from a cost perspective, to move old, inactive content there. This content should move directly into an archive that lives on a lower and cheaper tier of storage. Once again we must consider “the who” for a second here. Even though we are moving content out of SharePoint and into a more cost-effective, compliant place, we cannot forget that users should be able to access it or restore it (permissions pending) directly from SharePoint.

My next question is: what are your specific compliance requirements? This varies widely from company to company and industry to industry – every company has corporate policies specific to its internal requirements, and many companies have to adhere to industry regulations. SharePoint does a great job of managing the content in SharePoint as records, but does an even better job when supported by partners. As broad as SharePoint’s records capabilities are, when it comes to supporting industry regulations and government guidelines like the Department of Defense 5015.2 standard (DoD 5015.2), physical records, and records living outside of SharePoint’s native repositories, a third-party add-on solution is a requirement.
And for my last question, we go back to “the who” again: how will we govern the people? For most, information governance has to do with the information, but we must also be sure to govern the people if we are going to be successful. This question relates to how we enable people to leverage the core strengths of SharePoint, and that all starts with the creation of Sites and filling them with content. Organizations have to have a Site provisioning plan in place, or they put the organization as a whole at risk. Site sprawl is not a myth, it is a reality, but it doesn’t have to be feared. Attaching a lifecycle and policies to a Site at the point of creation ensures that Sites are connected to the data center and can be managed under the watchful eye of IT. Not only that, but we can then monitor those same Sites and move them to the appropriate tier of storage once they have become dormant or inactive. Site provisioning allows organizations to permit the creation of as many or as few Sites as required, all in a controlled fashion.

As you can see, understanding “the who” when defining your governance strategy for SharePoint is a pretty big deal. Not to downplay the value of process or technology, but to use an analogy: it is the person that drives the car down the right road, and it really helps when that person knows where they’re going. Just like a good governance plan for SharePoint, people who drive cars will get to their destination faster if they have good maps.

To find out more, join me on February 21st at noon EST for the webinar Extending SharePoint Across Your Information Infrastructure. You’ll learn key concepts required to turn SharePoint into a multifaceted, stable, and powerful IT tool set.
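The lifecycle-at-creation idea above can be sketched in a few lines. The `Site` fields, tier names, and 180-day dormancy threshold are illustrative assumptions for this post, not SharePoint's actual object model or any product API.

```python
# Sketch: attach a lifecycle to a Site at creation, then sweep dormant
# Sites to a cheaper storage tier. Threshold and tiers are assumed values.
from dataclasses import dataclass
from datetime import date, timedelta

DORMANCY_THRESHOLD = timedelta(days=180)   # assumed policy value

@dataclass
class Site:
    name: str
    owner: str
    created: date
    last_activity: date
    storage_tier: str = "active"           # lifecycle policy starts at creation

def sweep_dormant_sites(sites: list[Site], today: date) -> list[Site]:
    """Move Sites with no recent activity to an archive tier; return them."""
    moved = []
    for site in sites:
        if site.storage_tier == "active" and today - site.last_activity > DORMANCY_THRESHOLD:
            site.storage_tier = "archive"
            moved.append(site)
    return moved
```

A scheduled job running a sweep like this is one way to let users create as many Sites as they need while keeping every Site under IT's eye and on the right tier of storage.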

Read More