Enterprise Content Management

Customer Communications Management: The Rundown

Here at Actuate, we get asked this question a lot: What exactly is Customer Communications Management (CCM)? Consider organizations that handle vast amounts of information for millions of customers: cell phone bills, credit card statements or insurance policies. They need to ensure that the process that gets those statements to their customers runs smoothly, while managing every step in the data flow. The process also has to be flexible enough to respond to changing business, IT and customer requirements. For that to happen, organizations rely on the six steps of CCM:

1. Data Acquisition and Analytics. Raw data is captured, normalized and augmented to ensure accuracy and completeness, making it usable for transactional documents.
2. Document Composition. Data is formatted in a way that’s visually appealing and user friendly, adhering to corporate brand standards and accommodating personalized advertising with intelligent, targeted trans-promo offers.
3. Document Processing and Transformation. Business rules are defined that outline how content will be processed, how and when data will be acquired, which composition templates will be used to format the data, where the resulting documents will be delivered, which repository the content will be stored in, what format will be used to store it, and what index or metadata information should accompany that content.
4. Multi-Channel Delivery. Everything is formatted so that customers can receive their information in the way that works best for them, whether that means print distribution, online, smartphone or tablet access.
5. Electronic Archiving. Data is archived to satisfy regulatory requirements, provide high-speed search and retrieval for internal discovery and analysis, reduce internal helpdesk costs and improve information availability.
6. Portal Technology. CCM portals connect customer-facing applications with relevant content to increase information availability and reduce manual processing.
Portals deliver communications in traditional static formats, like PDF, for simple statement review, as well as in interactive HTML formats that provide insight through rich graphical visualizations and allow users to interact directly with the transactional content. Getting the CCM process right helps organizations use data effectively and efficiently. That’s where Actuate comes in. Learn more about Actuate’s Customer Communications Solution.
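The six steps above behave like a pipeline: each stage transforms a statement payload and hands it to the next. Here is a minimal sketch in Python; every stage function, field name and threshold is a hypothetical placeholder standing in for real CCM components, not an Actuate API.

```python
# Minimal sketch of the six-step CCM flow as a pipeline.
# All stage functions and field names are hypothetical placeholders.

def acquire(raw):
    """Step 1: capture and normalize raw data."""
    return {"customer": raw["customer"].strip().title(),
            "charges": [float(c) for c in raw["charges"]]}

def compose(data):
    """Step 2: format the data into a branded statement body."""
    total = sum(data["charges"])
    data["body"] = f"Statement for {data['customer']}: total ${total:.2f}"
    return data

def transform(data):
    """Step 3: apply business rules (here: flag large statements)."""
    data["priority"] = sum(data["charges"]) > 100
    return data

def deliver(data):
    """Step 4: choose an output channel per customer preference."""
    data["channel"] = "print" if data.get("prefers_paper") else "online"
    return data

def archive(data):
    """Step 5: record the document for search and compliance."""
    ARCHIVE.append(data["body"])
    return data

def publish_to_portal(data):
    """Step 6: expose the statement through a customer portal."""
    PORTAL[data["customer"]] = data["body"]
    return data

ARCHIVE, PORTAL = [], {}
PIPELINE = [acquire, compose, transform, deliver, archive, publish_to_portal]

def run_ccm(raw):
    data = raw
    for stage in PIPELINE:
        data = stage(data)
    return data

result = run_ccm({"customer": "  jane doe ", "charges": ["59.99", "45.50"]})
```

Each stage takes the output of the previous one, so adding or reordering steps is a one-line change to the pipeline list — which is essentially the flexibility to changing requirements that the post describes.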

Read More

Next Gen File Transfer Enhances Transmittal Management

Until now, transferring large files to points outside the corporate firewall has been an exercise in inconvenience, irritation, and risk. Believe me, as a technology professional working out of a home office, I can’t tell you the hours I’ve spent managing the movement of multi-GB files to colleagues and customers – either electronically through FTP and its ilk, or (shudder) physically through the shipping of storage media. The fact is, we’ve all had to make do with inadequate large-file transfer options over the past decade: Email is a no-go if your attachment is larger than 10MB or so. FTP, USB drives and DVDs are time-intensive, unreliable and present sizable security and compliance risks. Public file-sharing services? Don’t even go there, friend. Aside from the ever-present threat of hacking and uncontrolled distribution, I’ll bet you didn’t know that the service providers themselves generally reserve the right to access your data at any time, for a wide variety of reasons.

The Future of Large File Transfer is Already Here

The technology exists for a better way, though. I’ve seen it every time my son effortlessly downloads a multi-GB video game from an online retailer with a simple click. And I finally had the opportunity to experience the same convenience in a B2B setting with the recent introduction of the OpenText Managed File Transfer (MFT) solution. How easy is it to use? Transferring a 3.9GB file from head office to my remote laptop involved simply receiving an email informing me there was a file waiting, clicking on a link in the email so I could direct MFT where to put the file on my hard drive, and going back to work. That’s it, that’s all. The download finished seamlessly in a few minutes and I had no reason to worry about the transfer at all. Not only is MFT fast and user friendly, it will auto-resume if the connection hangs and it encrypts the data during transmission.
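Auto-resume of the kind described above is typically built on byte-range requests: the client records how many bytes it has already written and asks the source to continue from that offset. Here is a minimal sketch in Python of that mechanism; the in-memory "server" and failure simulation are illustrative assumptions, not OpenText MFT’s actual protocol.

```python
# Sketch of auto-resuming a transfer from a byte offset.
# The "server" is an in-memory byte string; a real client would send
# an HTTP Range header (e.g. "Range: bytes=8192-") instead.

REMOTE_FILE = bytes(range(256)) * 64  # 16 KB of sample payload

def read_range(offset, size):
    """Serve a chunk starting at `offset`, like a ranged HTTP response."""
    return REMOTE_FILE[offset:offset + size]

def download(local: bytearray, chunk_size=4096, fail_after=None):
    """Resume from len(local); optionally simulate a dropped connection."""
    chunks = 0
    while len(local) < len(REMOTE_FILE):
        if fail_after is not None and chunks >= fail_after:
            return False          # connection hung; caller may retry
        local.extend(read_range(len(local), chunk_size))
        chunks += 1
    return True

buf = bytearray()
first_try = download(buf, fail_after=2)   # "hangs" after 8 KB
second_try = download(buf)                # resumes where it left off
```

Because progress is measured by the bytes already on disk, a retry never re-downloads what arrived before the interruption — which is why a resumed multi-GB transfer can finish in minutes rather than starting over.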
The genius behind the solution – and the reason we have patents pending on it – can be explored here, but in plain user-speak, I saved a nice chunk of time, worked with a familiar email-based interface, and securely received a complete, non-corrupted file with an auditable trail. What more could you ask for? How about the fact that OpenText is now integrating the progressive capabilities of our new file transfer solution with our existing Transmittal Management application? It’s all part of our commitment to providing optimal value to our customers through a comprehensive, cross-pillar Enterprise Information Management (EIM) strategy. Thinking about this combination in action really got the wheels in my head turning!

A Step-Change in Seamless, Secure Transmittal Management

The engineering sector is my specialty. And I know from listening to Document Controllers across the industry that the process of efficiently managing project transmittal information around the world has become a major issue. In recent years, the scope and complexity of these transmittals have increased in lockstep with their financial and legal implications. Document Controllers now regularly spend stress-filled hours struggling with inefficient methods of transferring contracts, drawings, specifications, and other time-sensitive, mission-critical information. What’s more, these transmittals are often destined for remotely located engineering projects utilizing networks of varying quality and stability. The integration of OpenText Transmittal Management with OpenText Managed File Transfer will be all they need to adapt to these fluctuating environments while ensuring that essential files are delivered completely, securely, and in full compliance with corporate policies and industry regulations.
Granted, not everyone’s daily activities involve transmitting the blueprints for a hydro-electric dam to the jungles of Borneo, but if the new OpenText Managed File Transfer integration with OpenText Transmittal Management is designed to excel in some of the most demanding environments in cyberspace, think of the stability, security and efficiency it can add to your organization.

Read More

Five Common ECM Wins…and the Straightforward Cloud Solution

As cloud-based Enterprise Content Management (ECM) implementations become increasingly commonplace, I’m seeing a growing number of noteworthy client success stories. The kind that overcome substantial hurdles to literally transform a client’s content management infrastructure from non-existent to world class in one seamless, magnificent leap. The really intriguing element in the vast majority of them is witnessing the tangible, quantifiable benefits the cloud brings to information management. For me, it’s one thing to discuss the practice of cloud-based content management as a theoretical talking point; it’s quite another to see it in action, elevating content solutions from great to how-did-we-ever-operate-without-this through additional functionality, security and efficiency. The OpenText team collaborated on yet another example of this recently, designing and implementing a cloud-based ECM platform for a non-profit clinical research firm. The core challenges and goals of this organization are remarkably similar to those of most large enterprises, namely an extensive, remote workforce all generating and collaborating on vast amounts of mission-critical data that’s highly sensitive and stringently regulated. Sound familiar? What’s more, you may also recognize the environment that sparked this particular initiative: essential data stored in isolated network drives and filing cabinets, email serving as the principal means of collaboration, and security and governance tools struggling with oversight of highly mobile staff. (As a side note, did you just mentally give your organization one, two, or three checkmarks? That’s what I thought!) In short, this particular enterprise was at the mercy of the disjointed and exceedingly manual processes resulting from the lack of an integrated content management solution.
For participants, the frustration factor was setting in, with inefficiencies posing challenges to meeting deadlines and an overall lack of confidence around the information being used to fuel decision making. On top of this, every day was like rolling the dice, with potentially catastrophic implications for data security, financial stability, regulatory compliance, and reporting credibility. To be sure, these were considerable issues that permeated right to the core of the enterprise. Yet the solution was relatively, even surprisingly, straightforward: a tailored version of OpenText Content Server provided the centralized and standardized data management structure they needed to ensure optimal security, management, and compliance. What put it over the top (and into my personal business hit parade) was the astute decision to utilize an OpenText Cloud solution to host the application – ensuring easy, real-time access through a custom-designed web interface for far-flung contributors and collaborators, both internal and external to the organization. Voila: one undemanding yet multi-faceted suite of ECM tools solving five common issues:

1. A secure repository for storage and distribution of version-controlled documents
2. Increased adoption, efficiency, and collaboration via a user-friendly, web-based interface
3. Comprehensive auditing and reporting functions from one, centralized source
4. Full compliance with all applicable government and industry regulations
5. Simplified system management that exceeds security requirements thanks to OpenText Hosting Services

It’s a beautiful thing. And I invite you to read through their entire success story. For us here at OpenText, these content management successes are both rewarding and invigorating.
Not only do they fulfill our mission of contributing to our customers’ success – you can literally see the stress from years of legacy inefficiencies fall away as velocity, security and adoption skyrocket – they also serve as notice that we, as an organization, are on the right path. Cloud-based services, be they ECM or broader-based Enterprise Information Management (EIM) initiatives, are the future. There’s just too much to be gained with them. It’s time to start considering the benefits of cloud services. I know they continue to change my perception of what’s possible in the world of enterprise content management.

Read More

Putting the “Super” in QSuper

Submitted by Michelle Dufty on May 17, 2013

We’re Talking Smart Process Applications for Financial Services

We have all heard the story before: organizations try to become more customer-centric but can’t, because they are held back by ageing applications and siloed operations that leave them unable to keep up with the demands of the modern customer. I have been working a lot recently with customers in the Financial Services industry, where this issue is extremely prevalent. The good news is that there are alternatives to legacy systems that no longer fit the bill, and at OpenText we call them Smart Process Applications. Smart Process Apps combine the power of Business Process Management or Case Management with ECM, capture, analytics, collaboration, and customer communications management to allow organizations to be more collaborative and support the dynamic and demanding customer environment that exists today. Smart Process Apps break down the barriers of application silos while still referencing the legacy applications as the system of record. I am really pleased to announce a new case study we did with QSuper, one of Australia’s largest superannuation (aka retirement) funds, serving government employees, related entity workers, and their spouses. Like many financial services organizations, QSuper operates in a highly competitive and dynamic environment, where legacy applications with limited functionality and no central view of the customer could limit their ability to open new accounts, provide stellar customer service, and ensure consistent execution of operations. QSuper used to rely on a workflow system that was embedded within one part of their organization. The system had limited functionality, no disaster recovery capabilities, and experienced frequent downtime that led to disruptions in their business.
In addition to these issues, eight different systems were used by operations staff, with numerous repositories for customer information that included a mixture of paper and electronic documents. Like many other financial services organizations, QSuper realized that they needed to modernize their application infrastructure to provide a single view of their customer in order to become more efficient and improve the customer experience. Their new system, workQ, is a Smart Process Application that now handles 78 percent of customer administration processes and is used across QSuper, from knowledge workers processing claims to business operations and IT staff to mid- and senior-level management. They have also been able to decommission five of the eight former customer systems, which has significantly reduced business operations costs and improved employee responsiveness to customer inquiries. The advanced analytics and reporting in the new system have reduced the manual effort to create reports by 99% and provide a much clearer view of overall business performance. QSuper is just one example of numerous customers who are providing better customer service with a modern Smart Process Application infrastructure. To learn more about Smart Process Apps and other customer success stories, follow me here.

Read More

Driving simple, powerful, extensible and visual ECM

On March 6, Mark Barrenechea posted a blog about the recent OpenText Resonate KT acquisition. This blog post will explore how these new ECM additions can help customers of ECM and Extended ECM for SAP.

ECM Made Easy

One of the biggest challenges every organization faces as they implement enterprise software is user adoption. While the VP of Application Development and the Business Architect seek to bring the highest level of functionality they can to drive business processes and applications, they are faced with the issue of how well the system will be adopted by the people that use it. How much training will be required? Will the users be happy with how the systems fit into their daily work, or will they find a way to minimize use of it? Some of the most common things we see customers do in order to make the experience simple for casual users include:

• simplifying content views and choice lists to ensure users aren’t faced with long menus of options
• customizing the interface so that common information is clearly presented in a simple, uncluttered view
• driving business processes through forms and workflows
• providing simple, interactive dashboard views containing information that is relevant and in context
• providing reports, notifications and lists of critical information to support business operations

Using WebReports and ActiveView, customers are now able to provide these user-driven views and operations through simple customization of the Content Server interface and workflows. IT groups implementing ECM can easily simplify the interface for a user, change the view by their role and deliver views tailored to the device they are working from. In addition, they can create dashboards with report-driven content for the user and business situation, and create powerful workflow processes and forms to drive critical business operations in user-centered ways.
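Role- and device-based view delivery of the kind just described can be pictured as a lookup with fallbacks: the most specific (role, device) pairing wins, and anything unmatched gets a sensible default. A minimal sketch in Python — the role names, devices and template names are invented for illustration, not Content Server configuration:

```python
# Sketch: pick a view template by (role, device), falling back to defaults.
# Roles, devices and template names are hypothetical examples.

VIEWS = {
    ("manager", "desktop"): "dashboard_full.html",
    ("manager", "mobile"):  "dashboard_compact.html",
    ("clerk",   "desktop"): "worklist.html",
}

def select_view(role, device):
    """Most specific match wins: (role, device) > (role, desktop) > default."""
    return (VIEWS.get((role, device))
            or VIEWS.get((role, "desktop"))   # fall back to the role's desktop view
            or "generic_list.html")           # last-resort default for any user

chosen = select_view("manager", "mobile")
```

The point of the fallback chain is that IT only configures the views that differ; every other combination inherits a reasonable default instead of erroring out.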
Custom dashboards present real-time information in context.

Power with Simplicity

In every organization there are situations and users that demand both ease of use and the flexibility to work in different ways. Power users drive productivity through creative use of systems, customizing their operation to best meet their own personal needs. While these individuals can provide challenges to the application development group, they can also be the system’s strongest supporters and those that gain the most from working with it. For these situations, WebReports and ActiveView turn ECM into an adaptable environment that simplifies business processes by utilizing common interface paradigms such as tabbed and tree views of content, inline editing of metadata, and mouse-over previews of content. These tools also make it possible to work differently on different devices, bringing the power of ECM to a consumer-like mobile experience. This is easily managed by IT with role- and device-based views.

Tabbed views provide quick access to common sets of data.

A Picture is Worth a Thousand Words

Sometimes the best way to analyze content is to see it in action, to look at it in a visual way. Charts, graphs and KPI indicators are well recognized as ways to quickly provide a snapshot of information that can then be drilled into for more detail. With OpenText ECM it is now simple to embed visualizations of information – charts, graphs and KPIs – into views for all types of users. Understanding content and metrics has never been easier.

Key Performance Indicators (KPIs) provide a simple view of status against targets.

Delighting Developers

Some of the requests we hear most often come from IT groups and application developers.
These people work with all types of enterprise software, and they want easy, standard ways to extend their systems; cost-effective deployment and maintenance; and the ability to build applications that meet their industry and business needs while gaining user acceptance. Add to that the concept of making working with the system fun, and you have the winning combination we now provide with OpenText ECM. The WebReports engine uses a simple Tag and Template concept which should be familiar to any web-savvy developer. Its standard HTML templates provide the interface, while the tags enable insertion of data into the layout or actions to be performed on content. When put together with WebReports Workflow Extensions, developing an interface built around business processes is easily done, all without requiring any knowledge of OScript. This powerful engine allows for easy creation of custom deployments and development of business applications, and provides a level of insulation from core system changes that reduces the time and cost of upgrading.

Develop using standard HTML templates.

IT concerns in using any add-on to a core system include how well it is integrated, whether it keeps pace with core product changes and whether it adds to the size or complexity of the infrastructure. As a long-time OpenText partner, RKT has always focused exclusively on additions to Content Server and core ECM, developed to meet the latest Content Server security standards. All of these products are tightly integrated, optimized for performance, and do not require extra hardware. Releases of these modules have always closely followed core product releases, and now that the RKT team is part of OpenText these add-on modules will be released coincident with core product releases. WebReports has been a commonly used ECM and xECM extension for a number of years and supports over 500 customers in 20 countries.
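The Tag and Template concept described above is easy to picture: an HTML template carries the layout, and the engine replaces each placeholder tag with data from the current record. A minimal sketch in Python — the `[TAG name /]` syntax and the sample data are invented for illustration and are not the actual WebReports tag language:

```python
import re

# Sketch of a tag-and-template engine: HTML provides the layout, and
# tags like [TAG name /] are replaced with row data at render time.
# The [TAG ... /] syntax is hypothetical, not WebReports' own.

TEMPLATE = """<ul>
<li>Title: [TAG title /]</li>
<li>Owner: [TAG owner /]</li>
</ul>"""

TAG = re.compile(r"\[TAG\s+(\w+)\s*/\]")

def render(template, row):
    """Substitute each tag with the matching field, or blank if missing."""
    return TAG.sub(lambda m: str(row.get(m.group(1), "")), template)

html = render(TEMPLATE, {"title": "Q3 Budget", "owner": "Finance"})
```

Because the template is plain HTML and the tags are just text substitutions, a web developer can build or restyle the interface with ordinary web tooling, which is the insulation from the core system that the post describes.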
The driving force behind the usage of this module has often been requests from solution consultants, services teams and partners, because they find increased flexibility with a reduced amount of development, resulting in faster, less expensive deployment and maintenance of business-specific ECM implementations. The addition of WebReports Workflow Extensions and ActiveView expands this benefit further still. We invite you to learn more and see product demonstrations. For more information about the Resonate KT acquisition, visit our CEO blog or see the press release.

Read More

The Value of Information

There is a lot written about the value of information, and the conclusion of almost everyone is that you need information before you make your decision for it to be of any use. Can we act quickly based on historical information? What happens with the information that companies have gathered on their customers? Is it easy to access the information? Do companies use the information to build profiles and then connect with customers in a more efficient, personal way, either directly or through an intermediary? Most companies still have an “archive all” policy, where all records are archived. Whether that is in cabinets or electronically, the material is stored in an archive for a given period, exposing a compliance risk when being audited, not to mention the cost of storage. Companies that want to reduce that risk – especially enterprise businesses that create billions of records – may have applied Records Management in their Enterprise Content Management system and offload records that are no longer needed for compliance purposes through a defined set of rules. That is very internally driven, to reduce risk and cost, which is very good, I am sure we all agree. However, in general most records are stored in (several) legacy archive systems, whether on-premises, off-premises or, nowadays, in the cloud. Records that contain information. Information that can be of much value when it is used in a proper way. Today Enterprise Content Management technologies allow organizations to take full advantage of enterprise information and gain better business insight, create a positive business impact, increase process velocity, reduce risks related to information governance, and protect intellectual property from internal leaks and external threats. But it takes a good strategy to really take advantage of valuable (historical) information. I am sure that it must be difficult for organizations to take advantage of all that unstructured data.
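The “defined set of rules” for offloading records mentioned above is usually a retention schedule: each record class maps to a retention period, and records older than their period become candidates for disposition. A minimal sketch in Python, with made-up record classes and periods rather than any real regulatory schedule:

```python
from datetime import date

# Sketch of retention-schedule evaluation. The record classes and
# retention periods below are invented examples, not legal guidance.

RETENTION_YEARS = {
    "invoice":   7,   # e.g. keep financial records 7 years (example only)
    "contract": 10,
    "email":     2,
}

def disposition(record_class, created, today):
    """Return 'retain' or 'dispose' for a record of a given class and age."""
    years = RETENTION_YEARS.get(record_class)
    if years is None:
        return "retain"                       # unknown class: keep to be safe
    expiry = created.replace(year=created.year + years)
    return "dispose" if today >= expiry else "retain"

status = disposition("email", date(2010, 5, 1), date(2013, 6, 1))
```

Running the whole archive through a function like this is what turns an “archive all” policy into defensible disposition: records past their period are offloaded, everything else is retained, and the rule table itself documents why.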
The volume of unstructured data is huge and just keeps growing. But can we somehow start using the information we archive and unleash it? Information that is stored in many different repositories or business applications, where it is hard to access and use. For internal purposes, yes, of course, but a customer should also be able to access information that concerns them, old or new, on any device they prefer, and update information where needed and/or required. Then businesses should use the information to target their customers in a more effective, personal way, through all communication channels, including social media. All that valuable historical information must give them the possibility to build a profile of their customers, right? Yet I still see another challenge for CIOs and business managers: finding the right balance in unleashing the information in a secure environment and on every required device. How much information can be shared and used? Allowing business operations departments to use information to build profiles, to target and communicate with customers in a more personal way, while still remaining compliant with regulations that have been put in place over the years and those to follow. Not to mention their challenge in finding one vendor that can offer solutions in all areas, because that would be ideal. OpenText customer DDR Corp., a real estate investment trust, is one of those companies: their internal business can already view the history of previous lease contracts. They applied several OpenText solutions and continue to look for new ways of mobilizing enterprise content, like using Content Server for property managers, depositing more information in their Content Server and exposing it via web access. Another OpenText customer, Blokker, plans to store a profile of each shopkeeper and wholesale customer in the OpenText system, using it to track preferences on how they want to receive information.
Are these companies done? I don’t think so, but they see the value of information. The challenge is on: finding the perfect balance between reducing legal risk and IT cost and unleashing valuable information to do more business while remaining compliant. Yes, there is a lot to consider and a long way to go, but one thing I am sure about is that once an organization forms an information governance strategy that includes using valuable information, it can over time gain better business insight, take the right decisions and capitalize on opportunities to positively impact the business. I mean, all that historical information, plus the new information created on a daily basis, gives companies the perfect opportunity to build customer profiles based on the most important pieces of information. Then constantly refresh those profiles based on new information. And why not make information available to the customer in an environment that remains compliant and managed, so they can update and influence their profile? Don’t you think it will get companies closer to their customers, and vice versa? Don’t wait – start using that valuable information and make it part of your information governance strategy.

Read More

Information does matter

In recent years there have been many horror stories about the mismanagement of information, whether it pertains to personal, private or public data, in the form of lost laptops, discs, files and briefcases. How should information be managed? Could anything have been done to avoid the loss or minimise the risk of human error? Is there an easy answer? HM Government released a whitepaper entitled “Information Matters: Building Government’s Capability in Managing Knowledge and Information”, which highlights an extension of transformational government into data, information and knowledge management, where there is a need for best-practice policy supported by technology. It has been said before that this is the century, or age, of information. More information is being created every day, and this in turn means that more information is being stored every day too. Businesses, services, governments and other organisations all need this ‘lifeblood’ of information to be accessible, useable, safe and accountable. The government is committed to addressing specific aspects of information management and information security (BS 10012 and BS 27001). This is all very well, but information management on its own is not enough. Good information management needs to be aligned with good knowledge management. After all, what use is information if it is not used correctly? If you go to an ATM to withdraw money, you expect that the bank has used the information about you correctly, to ensure that you get your money from the correct account when you need it. But what if this information was not managed properly and you were abroad, needing to access your funds, and were unable to? This is a simple scenario, but think about how information is used when you renew your car tax online, at passport control, or to ensure you have the correct tax code. It is not just about having the information but using it effectively.
Recently, knowledge management and information management have been formally recognised as functions of government, in the same way that finance, IT and communications are. With more and more information being created, how long should you keep certain pieces of information before they lose their usefulness or become dangerous? Who decides what parameters are set for this? How does this impact data protection laws? These are just a few of the many important questions raised. Each organisation will have differing requirements on this matter. There are guidelines online to help organisations meet the regulations required by law, but you still need to manage this effectively. So what do governments and businesses need to do in order to deploy an effective information management and knowledge management strategy? The government, here in the UK, has set out guidelines highlighted in their Information Matters whitepaper and has organised a committee to help manage this. Many businesses have done the same, but some are not seeing the bigger picture yet. People are talking about big data and the age of information, but what are they doing about it? Many of their current systems and processes have been in place for many years, and a lot of the information is paper based. Technology is moving forward at an exponential rate, particularly with smartphones and tablet devices. Many business processes nowadays are handled electronically with little or no actual paperwork involved, but how is this information tracked and handled? Electronic document and records management software (EDRMS) appears to be the answer. Many vendors will offer this at a departmental level or, in some cases, at enterprise level. Having an EDRMS in place will help ensure that your business or government department meets the necessary legislation and has an effective information management strategy. However, there is a relatively new approach called Enterprise Information Management (EIM).
In effect, what EIM does is bring structure to the unstructured, unleashing the power of information for the organisation. With the growth of information, coupled with the myriad of different formats, only one organisation is standing up to be the leader in this field, with the goal of becoming recognised as the #1 EIM vendor. This organisation is already demonstrating leadership, according to analysts Gartner and Forrester, and is well on its way to being the leader in all five pillars of EIM, namely Enterprise Content Management (ECM), Business Process Management (BPM), Customer Experience Management (CEM), Information Exchange and Discovery. If I were a CIO of a major organisation or government department, I know full well what I would be doing. I would contact my local OpenText office and ask for guidance. By acting now, I would hope to avoid any mishaps or issues around information management, compliance and legislation within my organisation.

Read More

How Does Content Add-Value in Your Organization?

It’s January, and what does that mean? Lists, lists, lists. It seems to me that as a society we’re rapidly becoming obsessed with boiling things down into a Top 10 list, and in January that seems to reach a fever pitch. Thanks, Letterman. Even my colleague Chris Walker has jumped on the bus with this list on SlideShare of “2013 IT anti-predictions”. Sure, I jest, but I have to admit, without my lists I’d be lost. My to-do list, my goals list, my ‘don’t forget to take it’ list, my playlists for the gym, my list of people I need to see this month. Seriously, I think I have somewhere in the area of 10 apps on my iPad and BlackBerry dedicated to keeping my bits and pieces straight when I’m on the go, and I’d be lost without my Outlook Tasks. I never end my quest for the perfect app that does it all for me. But as I ended 2012, it became clearer than ever: there’s no way I will ever finish checking off the entries. I need to think about what’s really important for me in 2013 and prioritize! That prioritization and goal setting is critical for all of us, more so than ever in our professional lives. I’ve read countless blog posts and articles over the past few weeks talking about the big-picture trends. Bob Evans calls out what he believes are the hottest issues for CIOs in this article from Forbes – trends and priorities I’ve heard echoed many times over in the discussions in and around OpenText. But when it comes to channeling my inner Kreskin, there’s one tidbit I picked out of my colleague Deborah Miller’s recent CMSWire article that I’d like to delve into more. Deb pointed to recent IDC research that indicates: “The trend toward industry-specific solutions will be further driven by the increased participation of line of business (LoB) executives in IT investment decisions. In 2013, nearly 60% of new IT investments will directly involve LoB execs (with them as the decision maker in 25% of the investments).” Makes sense.
If there’s not a business case for using technology that either helps you make money, save money or improve the life of someone you touch – be it a customer or a team member – then why embark on the effort? And when looking for those little golden nuggets of opportunity, few are better poised than those in the trenches to provide a practical view of the problems and the potential solutions. Whether you call it Kaizen, continuous improvement, Six Sigma, or something else, the philosophy of smaller incremental change is proven to deliver the potential for large-scale, impactful benefit. What do you need to be successful? Ideally, you need to breed a culture that’s fluid, adaptable and insightful—and technology that can act as building blocks to meet that challenge. This is where I see the true advantage of supporting process transformation with an Enterprise Information Management (EIM) based approach. Given that I spend my days focused on delivering compelling product experiences for our OpenText Content Server customers, a product that acts as the foundation of an effective EIM strategy, I got to thinking at a more pragmatic level. I would like to hear more from you, our Content Server customers, about your goals and projects for 2013. A few of the many questions I’m interested in hearing your thoughts on: What strategies are you focused on to increase the value from your content? What business processes have you content-enabled with Content Server, and what’s next? How do you prioritize projects? If you feel strongly about a project that hasn’t been deemed a priority, how do you build the business case? How mature is your information governance strategy? And do you dovetail process improvement initiatives when building on that strategy? I think this could be a very enlightening discussion for many of us, myself included, so I really hope you’ll chime in. 
Or, whether you’re already an OpenText Content Server customer or not, if you have a specific project you’d like to discuss, drop me a note at aclarke@opentext.com or on Twitter @alimclarke.


A Chilling Look at Unchecked Information Growth

“New information is like snow and legacy collections of data are like glaciers: each year, layers are added, some melt away, but the accumulation keeps growing.” So writes analyst Grego Kosinski in Stop Employees from Hoarding Electronic Documents, a new Contoural white paper sponsored by OpenText. It’s an apt analogy, one I quite like. The image of a glacier – immense and inert – conjures up what many IT pros are facing in the near future without a solid plan for shoveling today’s snow. Without a solid archiving strategy, flexibility in your business may soon prove positively glacial. Ok, that’s enough with the metaphor. 🙂 Seriously though, the massive volume of content and information inside your business is a problem that’s not going away. Compounding, exponential growth of data from email, business applications, documents, and files of every type presents a major challenge for enterprises that need to keep business information over the long term. An archiving strategy addresses an increasingly critical situation compounded by skyrocketing volumes of data, a worldwide tightening of regulatory and compliance requirements, a growing need for litigation preparedness, and the reality of budget constraints. The lifecycle of enterprise content and information starts with the business applications used most in your organization: Email: management of email from on-premise solutions (Microsoft Exchange, Lotus Notes) and cloud email (Office 365 Exchange Online, Gmail). ERP and CRM suites: business applications from firms like SAP require a reliable long-term archiving solution for millions of documents. Microsoft SharePoint: managing SharePoint sites and mission-critical business information across your enterprise. File systems: more than just backup, organizations need a plan to ease the expensive storage burden on file systems and transition this content to archives or delete unnecessary content. 
Social/Mobile Content: new forms of collaboration in the enterprise, from social feeds to mobile communications, are only increasing the speed and time sensitivity of EIM. Keeping a separate archive for each of these applications is proving too costly and complex. A centralized archiving strategy is a must. Quantifying the Business Value of Archiving Strategic archiving programs deliver long-term storage savings of between 20 and 40%, compared to existing siloed archiving environments. These savings are delivered by a combination of rationalization of storage infrastructure, smarter storage decisions and decommissioning of legacy systems. Reducing Storage Costs: indiscriminate choices about where to store content during the various phases of its lifecycle are costly. The refrain “storage is cheap” doesn’t apply when the repository is bursting at the seams and is a disarray of un-indexed content and information. Long-term storage can be reduced with the following: Tiered storage: an enterprise-wide repository for long-term retention spans multiple storage devices that are cost-appropriate for the level of access required. A tiered approach to storing archived content supports the best storage mechanisms, including optical media, hard disk, cloud and tape. The retention-controlled archive can run either online or as near-line or offline storage. Only applications can access it – there are no capabilities for direct end-user access. Single-instance Archiving: especially in highly collaborative work environments, identical documents risk being stored several times. Single-instance archiving (SIA) ensures you keep the same document only once. Depending on the amount of expected redundancy of email attachments, SIA can reduce the required storage space significantly. Compression: to save storage space, content should be compressed before being written to the storage system. 
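The single-instance archiving idea above can be illustrated with a small sketch: store each payload under its content hash, so identical attachments occupy physical storage only once, however many mailboxes reference them. This is a toy model for illustration only, not OpenText's implementation; the `SingleInstanceArchive` class and its methods are hypothetical.

```python
import hashlib

class SingleInstanceArchive:
    """Toy model of single-instance archiving (SIA): identical payloads
    are stored once, keyed by content hash; each logical document only
    records a reference to the stored blob."""

    def __init__(self):
        self._blobs = {}      # sha256 hex digest -> payload bytes
        self._documents = {}  # document id -> digest

    def store(self, doc_id, payload):
        digest = hashlib.sha256(payload).hexdigest()
        # Only write the blob if this exact content has not been seen before.
        self._blobs.setdefault(digest, payload)
        self._documents[doc_id] = digest
        return digest

    def retrieve(self, doc_id):
        return self._blobs[self._documents[doc_id]]

    def physical_bytes(self):
        """Total bytes actually held on storage (deduplicated)."""
        return sum(len(b) for b in self._blobs.values())

archive = SingleInstanceArchive()
report = b"Q3 results attachment" * 1000
# The same attachment mailed to three recipients is archived three times
# logically, but stored only once physically.
for doc_id in ("msg-1", "msg-2", "msg-3"):
    archive.store(doc_id, report)
print(archive.physical_bytes())  # size of one copy, not three
```

The higher the redundancy (mass mailings, shared attachments), the bigger the saving, which matches the point about collaborative environments above.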
Legacy Decommissioning: multiple archiving repositories across legacy systems are a complex reality that can grow into a major problem for organizations. These systems are often maintained simply to reference historical information, and they become more expensive to manage over time as administrator skillsets focus on more current systems. A roadmap is required that leads to a single archiving strategy in one repository, with legacy decommissioning tools to migrate content or manage existing content in place. Security: risks associated with unsecured archived information can be as costly as security breaches of more current enterprise information. Intellectual property and other sensitive organizational information in archives must be subject to the corporate policies that protect all enterprise content. Timestamps can ensure that document components cannot be modified unnoticed after they have been archived, guaranteeing the authenticity of archived business documents. By encrypting the document data, critical data such as salary tables can be archived securely. Learn more about OpenText Enterprise Information Archiving.


OpenText Debuts Regulated Documents 10 at DIA EDM Europe

This is a guest post by Ethan Smith, Life Sciences Industry Strategist, OpenText. For OpenText's Life Sciences business, Europe is a critical market, and one where we have had great success. In late November, I was able to participate in and witness this first hand at the DIA European Electronic Document Management Conference in Munich. The conference had representation from Life Sciences companies across Central and Eastern Europe as well as a number of attendees from Asia. Presentations were made by regulators overseeing the submissions and approvals processes for the US FDA, the EMA and the Chinese SFDA. Aside from the exciting agenda, this was a big event for OpenText as we debuted the newest release of our Regulated Documents offering. Reg Docs, as it's commonly known, is now available for Content Server 10, our best-in-class ECM platform. The power of Reg Docs comes from the fact that it is a configuration-based solution, rather than a customized version of the underlying platform, enabling our Life Sciences customers to deploy Reg Docs alongside other solutions on the same instance of Content Server. We gave a number of demos of the solution and received very positive feedback from the attendees. It became clear as we looked around the hall of exhibitors that OpenText has taken a unique and powerful approach to the document management challenges facing this industry. Many of the other solutions are custom applications that may be built on strong technologies, but the heavy customizations have rendered them difficult to maintain, upgrade and support. OpenText Regulated Documents seems to be the only solution of its kind in this space, and we were very proud to show it off. To learn more, check out this white paper.


Impressions from ARMA 2012

It has taken me a couple of weeks to gather my thoughts from ARMA 2012, held in Chicago this year. As usual, it was a great conference; it provided lots of opportunity to meet with old friends and lots of opportunity to talk to existing and potential customers. It was also a lot of information and interaction crammed into a couple of days, hence the lag in pulling together my thoughts. First of all, this year, more than any other, I think I can say that Records Managers get it. For the last few years, quite honestly, not long after the FRCP changes in 2006, we have been telling anybody that would listen that Records Management has to expand beyond the traditional scope of “Official Records”. Records Management and Records Management practices now have to address much more than the Official Records if organizations want to address much of the risk and cost inherent in the ever-expanding volume of Electronically Stored Information (ESI). This idea has now become firmly entrenched in Information Governance, in no small part due to the work of some of the thought leaders in Information Governance like Barclay Blair and Mark Diamond. Certainly at ARMA, there was lots and lots of information and vendor positioning around Information Governance, and it was evident to me that Records Managers were embracing it. But (and there is always a but), I think there is still a big gap between embracing Information Governance and doing anything about it. It is that gap that I tried to address in my Solution Showcase presentation, Evolving RM to Information Governance to Protect Your Organization. The basic premise is that, in order to take advantage of the benefits of retention control, it is necessary to be able to classify content. And, in order to classify content, we must be able to capture it in some way for classification. 
This might sound obvious and seem simple, but these two problems, capture and classification, are the problems I see most organizations struggling with on a daily basis. They know the benefits of retention control, in particular the ability to defensibly delete content, but they are also very worried about the impact classifying and capturing content will have on end users and the way people work. First of all, there is the problem of capture. We need some way to put the content being created under management. There are lots of options in this area. The OpenText strategy is to create integration points with the costliest and riskiest sources of content, and provide capabilities to capture and centrally archive that content. In particular, some of the costliest and potentially riskiest content can be found in Exchange, Notes, SharePoint, SAP and Oracle. Off-loading content from these sources can reduce IT and storage costs but, more importantly, place content somewhere it is searchable, can be placed on litigation hold, and can be classified for retention and disposition. I also discussed what I believe to be one of the most disruptive technologies faced by ECM and RM professionals – File Synching. File Synching could potentially be one of the biggest risks, in terms of users loading and sharing content through consumer-based file-synching services. Not only is there risk in terms of IP loss, but we also have to assume that users are circumventing our “official” ECM solutions. But File Synching is also a massive opportunity for RM and ECM. If File Synching is attached to your ECM solution, as with OpenText Tempo, it is also one of the best ways to encourage users to place content into the ECM system – not because we tell them to, but because they want to. File Synching is all about making life easier for the end user. 
The fact that we are capturing content for retention, disposition, security and litigation hold is completely transparent to the end user. I also took on one of the most contentious areas in Records Management and Information Governance these days – the battle between centralized management and manage-in-place. This is a very big topic, but I think the short-form answer is that most organizations are going to end up taking a hybrid approach. Areas where there is risk and cost in the lead application, like email, will probably continue to be centrally archived. However, for content that resides in systems with built-in business logic but no retention capabilities, a case can be made for connecting to and indexing this content to manage it in place. In the end, in-place vs. centralized will be another one of those balancing acts that we associate with Information Governance. This is a topic that will continue to be top of mind for many organizations as they try to bring more and more of their unstructured information under control. Capture is only the first step in getting content under control. It also has to be classified. Again, this is an area where RIM professionals are going to need to balance the needs of users and the needs of the organization. The presentation covers the pros and cons of the different methods for classification. There is no single right answer, and most organizations will end up using most if not all of these methods, based on the content that needs to be classified and the business process associated with that content. I took some additional time to go through our latest release, Auto-Classification, and the requirement for defensibility in auto-classification. The presentation finishes up with five uncomfortable questions to ask IT and Legal. I think asking, and answering, these questions will provide some of the impetus it will take to move from embracing Information Governance to actually doing it.


The Email Pyramid

One of the key issues in Records Management is how to manage email. Email is the dominant content type requested in FOIA and eDiscovery because it often contains the “smoking gun,” and because people tend to be unguarded while using this medium for communication. Email must therefore be governed to reduce liability. Email also serves as a key business record, documenting correspondence between parties, often as the only record of important decisions or acknowledgements. Email volume is growing exponentially and is a significant issue in most organizations. There are many types of email solutions, but the prevailing approach today is one that is a component of an Enterprise Information Management (EIM) suite such as OpenText Content Lifecycle Management. The EIM approach provides Information Governance for all content, regardless of media, managed by an integrated Records Management solution. The solution applies governance and retention rules to all content, from the time it is created or ingested until its final disposition. As email volume continues to grow exponentially, organizations face several issues:

• Optimization of the production email system
• Capture of some email as important business records
• Retention of email according to the organization’s retention policy
• Deletion of transitory or non-record email

An EIM system, using the latest integrated archiving and auto-classification tools, solves this dilemma in a practical way. This approach involves a concept called the Email Pyramid. I initially learned about an early form of this approach from Benjamin Wright, an attorney who specializes in this area, at an ARMA conference several years ago. I would also like to acknowledge the contribution of Jason Baron, Director of Litigation for NARA, who promotes a similar approach to email classification. I have added some ideas to the concept, including a pragmatic approach for implementing email governance in the real world. 
The Email Pyramid

The general Email Pyramid concept is that there are three components of email in an organization. As shown in Figure 1, these are Role Based Classification, Business Records, and the remaining vast volume of unclassified emails that most organizations simply do not have the resources or tools to classify. I will explain each component and how current EIM solutions address these requirements. (Figure 1: The Email Pyramid)

Role Based Classification

There are a small number of key individuals in an organization, such as the president, general counsel and chief financial officer, whose email should be retained for longer periods of time than other email. The rationale is that this email contains a historical record of key activities within the organization. It often contains background information on important events and is frequently used in litigation. For government, this email is retained permanently. For private industry, it may be retained based on a retention schedule that reflects its importance, and each organization will schedule these records appropriately. For classification, using tools within the EIM system, these emails are classified based on role. All emails for these individuals are retained and moved into the EIM system. Identifying the emails for archival and classification is a straightforward process using standard email rules.

Business Records

The middle tier, Business Records, includes those emails that are used to document business transactions. These may contain the only record of an approval for a change order, authorization to proceed on a contract, acknowledgement of a transaction, and so on. Many organizations have their users print these emails and file them in a project or contract folder. Often they remain in the email system, ungoverned and unclassified. These emails should be moved from the production email system to the EIM system. 
Ideally, automated business processes accomplish this without manual intervention. All notifications, acknowledgements, and other correspondence generated or received by a business application should be configured to be stored as records within the EIM system, with appropriate metadata. These records are automatically classified based on Document Type, such as Contract, Change Order, FOIA Request, and so on. For processes that are not yet automated, the EIM system should be configured to allow users to drag and drop emails from the email system into the appropriate folder in the EIM system. It is important that the manual process be as easy as possible so that users will adopt it without significant resistance. The result is an EIM solution that maintains a complete history of business transactions, including all supporting documents and emails. This approach supports eDiscovery, FOIA, compliance and audits, making all required information available using standard search tools.

Auto Classification

The remaining emails often number in the multi-millions. There is no practical way to manually identify or classify this volume, and having users do it themselves is usually not feasible. Today, however, there are auto-classification tools available to help accomplish this daunting task. These emails are archived, and the email archiving tools eliminate duplicate emails, which significantly reduces the number of emails being archived. Auto-classification tools such as OpenText Auto Classification automatically classify the remaining emails according to the RM retention schedule. A key component of this approach is to create a “Transitory” big bucket that represents those emails that are non-records. You assign 180 days, or a similar rule, to this category, and in this manner a very significant portion of the email volume, sometimes 60 to 70 percent, is destroyed in a legally defensible manner. 
A recommended approach for this type of email classification is to use a big bucket retention schedule. The auto-classification engine must be trained, using example emails, to recognize which classification an email belongs to. This task is much more difficult with a typical granular retention schedule. The result is an email archive that has shed much of the duplicate and transitory content, and whose remaining emails are classified according to the official big bucket retention schedule. This approach is not 100% accurate. Accuracy of 60% or better is considered successful, and a good classification engine, with ongoing training, should be able to attain 70% to 80% accuracy or better. This “good enough” approach, when compared to no governance at all, meets most legal requirements.

Summary

Email management is a complicated subject, and most organizations use an approach that is piecemeal at best. Best practice using current technology is to treat email like any other business record: saving what is important, discarding the chaff, and applying standard retention policy to what is retained.
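The “Transitory” big bucket with a 180-day rule described above amounts to a simple disposition calculation: each email carries a bucket label, each bucket carries a retention period, and anything past its period is eligible for defensible deletion. A minimal sketch of that logic follows; the bucket names and periods are illustrative, not an actual retention schedule.

```python
from datetime import date, timedelta

# Illustrative big-bucket retention schedule.
# Periods are in days; None means retain permanently.
BIG_BUCKETS = {
    "TRANSITORY": 180,          # non-record email, short fuse
    "BUSINESS_RECORD": 7 * 365, # e.g. seven-year business retention
    "PERMANENT": None,          # role-based / historical records
}

def eligible_for_deletion(bucket, archived_on, today):
    """An item is disposable once its bucket's retention period has
    elapsed; permanent buckets are never disposed."""
    period = BIG_BUCKETS[bucket]
    if period is None:
        return False
    return today >= archived_on + timedelta(days=period)

today = date(2013, 1, 15)
# A transitory email archived in June 2012 has passed 180 days:
assert eligible_for_deletion("TRANSITORY", date(2012, 6, 1), today)
# One archived in December 2012 has not, and permanent records never do:
assert not eligible_for_deletion("TRANSITORY", date(2012, 12, 1), today)
assert not eligible_for_deletion("PERMANENT", date(2000, 1, 1), today)
```

With only a handful of buckets, the auto-classification engine has fewer, broader targets to learn, which is exactly why the big-bucket schedule trains more reliably than a granular one.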


The State of the Art in Records Management – Part 2

In my previous blog, The State of the Art in Records Management – Part 1, I discussed the new market drivers for records management:

• eDiscovery/Compliance
• Audit Readiness
• Big Bucket Retention Schedules
• The New Definition of a Record
• The Obama Managing Government Records Directive

In this blog, I will cover the vendor response to these new market drivers. As recently as five years ago, customers often had to build custom integrations between “best of breed” products to achieve an enterprise-class solution for ECM and RM. This approach is expensive, and once accomplished it is expensive and difficult to modify, expand or upgrade. It did work, however, and the resulting solutions often provided significant return on investment. The challenge with these systems is to adapt to rapidly changing technology and new market drivers. Over the past five to ten years there has been a frenzy of acquisitions by the major players, resulting in market consolidation. IBM, OpenText, EMC, Oracle, and HP have now consumed most of the independent vendors in the Enterprise Content Management space. For example, IBM purchased Tarian Software, FileNet and Datacap; EMC purchased Documentum and Captiva; HP purchased Autonomy and Tower Software; OpenText purchased Hummingbird, Metastorm, Captaris, Vignette, RedDot and Global 360; and Oracle purchased Stellent and FatWire (this is only a small sample). Over 100 formerly independent vendors are now consolidated into the top five. Many of those custom integrations have been affected by the market consolidation. Some of those best-of-breed products are now owned by different companies, and future compatibility with custom solutions may be in question. As a result of market consolidation, major vendors are now offering comprehensive ECM suites with a broad set of features and functions. 
Customers can purchase those suite components they need, and they can add others as needed without (theoretically) having to develop significant custom integrations. Some of the key feature sets now available as components, or “modules”, of ECM suites include:

• Records Management as a component of a larger ECM solution
• Combination of Electronic and Physical Records Management, with DoD 5015.2
• Email Archiving and Email Management, with Records Management (Exchange, Lotus Notes, GroupWise and now Google)
• Integration with MS SharePoint and MS Office
• Business Process Management, Workflow, Business Process Analysis
• Auto Classification
• Litigation Holds, Litigation Support
• Enterprise Storage Architecture
• Document Management with Version Control
• Integration with Key Business Applications such as SAP and Oracle
• Basic and Advanced Search, Full Text Search
• Advanced Reporting and Business Analytics
• Digital Asset Management
• Web Content Management, Portals
• Social Media Management and Governance
• Mobile Extensions to phones and tablets
• Capture of paper, fax, and electronic input streams
• Electronic Forms, E-Signature
• Output Management
• On Premise or Cloud Hosting

This suite approach is designed to allow customers to address the full complement of requirements, driven by the market drivers outlined in Part 1 of this blog, using a homogeneous set of modules. Users can implement point solutions and add new applications without having to re-architect the infrastructure. Customers can purchase the various components from a single vendor and thus manage contracts and support much more easily. Software upgrades (again, theoretically) can be managed in a much more harmonious manner because the components all come from the same code or solution base. This is all good news for organizations that are planning to implement enterprise solutions. Technology has kept up with the new market drivers, and vendors are offering solution suites that address user needs. 
For Federal agencies that are making plans to move to a completely digital records environment by 2019, as the new NARA/OMB directive mandates, available solutions from companies such as OpenText are ready to meet the opportunity. Customers must challenge their vendors during the selection process to prove that all of these components do indeed work together as advertised. Having said all that, successful solutions are not based on technology alone. Clear requirements, budget, resources, change management, operational support, training and all of the intangibles are necessary to make deployments as successful as envisioned. In this new technology suite, Records Management becomes an integrated component of the overall solution. Ideally records classification is performed automatically upon indexing of content within a business process. The new Big Bucket Retention Schedule facilitates efficient use of automated technologies to classify records. Automated tools such as Auto Classification are used to classify large volumes of content such as email and file shares. Business transactions are automated using Business Process Management, and all related documents and email are automatically stored and classified. User classification of content, strictly for Records Management purposes, is kept to a minimum. Information Governance is applied when content is created, captured or ingested, making RM part of the DNA of business applications. Email, instant messages, social media, correspondence of all types, reports, and more are all managed by the solution, including physical files. Searches are media neutral, and when electronic content is located it can be retrieved if a user has the proper system privileges. If the content is a physical file, then the system should indicate the location of the file and offer check out and check in features. 
All business transactions, if the system is fully implemented, include a full audit trail of ancillary original documents that facilitate audits, FOIA requests, and discovery. Public or customer facing transactions provide better customer service. Elimination of paper from the organization, as mandated by the Obama directive for Federal agencies, results in very significant cost savings – for storage, staffing, user productivity, and transaction cycle times. Fully digitized records and business processes result in shorter cycle times and huge productivity and cost savings. So, as you can see, the new market drivers are being addressed by technology in a way that makes it quite reasonable to implement enterprise wide solutions that provide a rapid return on investment with a high rate of success, establishing an infrastructure that can evolve and grow over time.


Faster, Higher, Stronger: OT Archiving Test Results

Guest post by Paul Briggs, Senior Product Marketing Manager, OpenText. In the spirit of the upcoming London Olympic Games, we decided to title this post “Faster, Higher, Stronger” to represent OpenText’s mission to deliver industry-leading product performance. OpenText continually runs performance tests of our software integrating with key partner platforms. Earlier this year, OpenText and Microsoft conducted performance and scalability testing on the email monitoring and records management components of the OpenText ECM Suite running on Microsoft SQL Server 2012 data-management software. See the record-setting results here: OpenText on SQL Results. More recently, testing with IBM and HP benchmarked OpenText Archive Server’s performance on partner hardware running write/retrieval operations on SAP data and documents. The results are good news for customers running OpenText Extended ECM for SAP across a variety of server environments, including IBM and HP. Businesses running SAP rely on OpenText Archive Server, part of the OpenText Enterprise Content Management (ECM) Suite, for efficiently storing and retrieving large volumes of fixed content. These solutions help reduce storage costs and improve system performance by archiving SAP application data and documents in one central repository. This also gives users quick and easy retrieval, sharing, forwarding and reuse of content in the correct business context, and according to strict compliance demands.

IBM: Archive Server Testing on IBM Power7 Servers

At the IBM Innovation Center in Ehningen, Germany, OpenText achieved top marks for Archiving and Lifecycle Management scaling on IBM Power7 servers. Testing scenarios focused on SAP document archiving and lifecycle management, showing that customers could save as much as 90% in long-term storage costs. That figure is based on Archive Server’s very high write/retrieval performance on the IBM Power7 platform and its support for inexpensive tape storage solutions. 
Let’s look at those two areas in more detail.

Performance: Document Throughput

For archiving documents out of SAP to hard disk, OpenText Archive Server achieved a throughput of 1.8 million documents per hour. That was based on a 20 KB file size running over 1 Gbit/s of network bandwidth. At 500 KB, approximately 800,000 documents per hour were archived, capped by the network bandwidth limit. What does that mean in real-world terms? SAP documents typically average no more than 250 KB, so customers can expect well over one million archived documents per hour with continuous encryption of data transmission. In a continuous 10-hour day, that equates to between 10 and 15 million SAP documents archived. The figure is much higher for serving read accesses to the Archive Server with disk-based storage systems. These dramatic throughput levels mean that customers can deploy a single Archive Server to meet their needs with capacity to spare. In this time of squeezed IT budgets, this is the type of news customers love to hear. Not only does this performance level keep Archive Server requirements to a minimum; customers can also utilize existing tape infrastructure with IBM Tivoli Storage Manager (TSM) or IBM System Storage Archive Manager.

Efficient Tape Support: Unique to OpenText Archive Server

Archiving SAP documents in a secure and cost-effective near-line storage option like IBM TSM is a unique capability of OpenText Archive Server. The cost savings of tape storage are up to 90% compared to other storage options. OpenText Archive Server supports TSM with an intelligent algorithm for document reload from tape, sorting reloads according to tape sequence. This eliminates unnecessary tape seeks and tape changes, and minimizes wear and tear on the tape.

HP: Archive Server Testing on HP ProLiant DL580 G7 Servers

Working in conjunction with HP, OpenText conducted the testing in May 2012 at the HP Partner Technology Access Lab in Houston, Texas. 
The test environment ran on an HP Converged Infrastructure, including HP ProLiant DL580 G7 servers. The test configuration was designed by OpenText, HP and SAP to prove the performance capabilities of a system designed from the ground up to:

• Provide a holistic, value-based and integrated approach to managing enterprise content
• Reduce risk by enabling better management of governance and compliance
• Lower both operational costs and the total cost of ownership (TCO) of an ECM solution
• Extend the value of the SAP® Business Suite to enable business transformation

OpenText surpassed the SAP certification requirements. All functional requirements of the ArchiveLink test were fulfilled, and the ArchiveLink load test showed performance well above the required KPIs. Additional tests independent of SAP showed even higher performance. The strong results underline the strength and scalability of a single Archive Server instance on HP ProLiant servers, and give OpenText and HP customers the confidence to deploy an SAP-certified, enterprise-ready solution. SAP customers can store roughly 10 million documents a day (at 10 hours/day of operation) on a single Archive Server.

High Volume Archiving: Industry Examples

What these performance benchmarks show is the ability of OpenText Archive Server to handle massive volumes of data, increasingly typical in many industries today. Tens of millions of archived documents may seem a tad astronomical, but it’s the reality for certain transaction-heavy industries. Here are a few examples:

Telco: Millions of monthly bills generate customer inquiries in the few days after issue. Call center employees fielding those inquiries require fast and easy retrieval of archived items to access data and solve issues for customers.

Banking: High performance when reading archived documents is also critical for end-user scenarios in banking.
More than 400,000 simultaneous online banking sessions are common in Europe, each requiring unique authentication identifiers.

Pharma: In industries like pharmaceuticals, documents are comparatively very large, containing extensive documentation to support rigorous regulatory compliance and volumes of testing prior to product release.

Read More

Why the “RMA” is No Longer Relevant

The Records Management Application (RMA) as defined in days past is now an anachronism. The traditional RMA was a back-end system that only managed records once they were declared. This “view from the basement” approach is fraught with issues such as lack of funding, lack of resources, and low organizational priority. I have been to dozens of seminars and conferences devoted to Records Management, and they consistently refer to the RMA as if it were a stand-alone system out of context from the mainline business of an organization. In today’s world, driven by issues such as eDiscovery, audit readiness and compliance, this is no longer the case.

The industry has transformed from Records Management to Information Governance. Content must be governed from the time it is created, captured or ingested. ALL content must be governed, not just declared records. Transitory records must be explicitly classified so that they can be destroyed according to policy. This new paradigm, called Content Lifecycle Management, builds Information Governance into all mainline business processes.

Enterprise Content Management suites now include Records Management as a subset of functionality so that it is built into the DNA of business transactions in a way that is transparent to end users. For example, a workflow process supporting a contracts management effort automatically saves all documents into the ECM system, including metadata that is part of the workflow process. Saved documents are classified automatically according to document type against the retention schedule, using this metadata. No individual has to “declare” a record; it is done as a function of the business process. The benefit to the organization is a full audit trail of each business transaction that can be used for legal discovery, litigation holds, audits, and more.
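The metadata-driven classification described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor’s actual API; the document types and retention periods are invented for the example:

```python
# Hypothetical sketch: classifying a saved document against a retention
# schedule using workflow metadata, so no user has to "declare" a record.
# The record series and retention periods below are invented for illustration.

RETENTION_SCHEDULE = {
    "contract":       {"series": "Legal Agreements",  "retention_years": 7},
    "invoice":        {"series": "Financial Records", "retention_years": 6},
    "correspondence": {"series": "Transitory",        "retention_years": 1},
}

def classify(metadata: dict) -> dict:
    """Return the retention rule for a document based on the `doc_type`
    captured by the workflow, falling back to Transitory for non-records."""
    rule = RETENTION_SCHEDULE.get(metadata.get("doc_type"),
                                  RETENTION_SCHEDULE["correspondence"])
    return {"document": metadata.get("title"), **rule}

# A contracts workflow saves a signed agreement; classification is automatic,
# driven entirely by metadata the workflow already captured.
print(classify({"title": "MSA-2012-044", "doc_type": "contract"}))
```

The point of the sketch is the absence of a “declare” step: classification happens as a side effect of the business process.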
Transitory documents and email are destroyed according to published policy, so the organization benefits from lower storage and discovery costs. Potentially HUGE cost savings are realized through lower discovery costs, lower sanctions and fines, and by avoiding failed audits due to lack of documentation. Auto Classification and eDiscovery tools are now included in some ECM suites, so organizations can deploy tightly integrated solutions that encompass the majority of Information Governance requirements.

Business Process Management (BPM) tools orchestrate business processes, marrying business transactions with unstructured content and bridging information silos that include ERP systems such as SAP, custom database applications, email, collaboration tools, mobile and social media. BPM is a mainline technology, funded at the highest levels of an organization, that improves productivity and lowers transaction costs. Deploying BPM as part of an ECM suite that includes this complete audit trail of transactions and records is now the state of the art in our industry. RM is essential to this approach, but it is an integrated component, not a stand-alone system. Yes, the traditional back-end RMA is DEAD.

Read More

The State of the Art in Records Management – Part One

Over the past couple of years I have come to the realization that, with the many changes taking place in our industry, there is a new “State of the Art” in Records Management. This change has come about because of new market drivers combined with new vendor solutions responding to those requirements. Part One of this blog focuses on the new market drivers affecting the industry. Part Two will focus on how the industry has responded to these new drivers.

eDiscovery

The Federal Rules of Civil Procedure dictate that ALL available documents are discoverable in a court case. This includes electronic content, regardless of media. Email is the largest type of electronic content requested in eDiscovery. Instant messages, voice mail, copies, drafts, personal correspondence, and so on are all discoverable. Therefore, it does not matter whether your organization has declared “official” records; all content, declared or not, is discoverable.

Discovery costs are very high, and the ramifications of missing documents in litigation can be higher still, leading to fines, sanctions, and negative judgments, and in some cases jail time for defendants. Therefore, it is very important that Information Governance is applied to all content so that it can be found easily, automatically classified to the correct retention rule, and discarded if it is not relevant. This approach lowers discovery and storage costs and eliminates, or significantly reduces, the missing-documents problem.

This approach, called Content Lifecycle Management, applies Information Governance to all content from the time it is created, captured, or ingested, all the way through final disposition. Governance is applied to email, social media, electronic documents and all versions, file shares, SharePoint, and content created or captured by business applications such as SAP, case management applications, collaboration tools, and more.
This market driver has changed the foundation of records management, leading to a redefinition of a Record. In the past, a record was “declared” once its business process was completed, and the Records Retention Schedule defined how this static record was to be filed, stored, and retained. File Plans and Retention Schedules were based on a paper paradigm, focusing on how to organize boxes so that they could be accessioned, archived, and dispositioned effectively. The new definition is “everything is a record.” Therefore, all content must be managed.

This approach requires a new Big Bucket Record Series called “Transitory.” Using this approach, content considered to be a non-record, such as personal email, non-business correspondence, drafts, copies and so on, can be explicitly classified as Transitory. A retention rule is applied, such as 120 days, and this Transitory content can be defensibly destroyed. This approach reduces your content and storage requirements significantly and lowers discovery costs.

In this new paradigm, Records Management (RM) is a subset of an Enterprise Content Management (ECM) system. The ECM solution enables RM to become part of the DNA of critical business processes such as Human Resources, Accounts Payable, Finance and Accounting, Case Management and more. These processes receive and create records that are captured and indexed as a function of the business process, so RM classification is automatic. Building these mainline business processes to include RM produces a full audit trail for all business transactions. This supports the next key new business driver, Audit Readiness.

Audit Readiness

Audit Readiness is a term coined by some U.S. Department of Defense organizations to refer to the ability of an organization to respond quickly, accurately, and completely to an audit. Paper-based and non-automated approaches to this requirement can produce negative results and serious ramifications.
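The Transitory big-bucket rule above lends itself to a small sketch. This is illustrative only; the 120-day period comes from the example in the text, and the function names are invented:

```python
from datetime import date, timedelta

# Sketch of the "Transitory" big-bucket rule described above: content
# explicitly classified as Transitory becomes eligible for defensible
# destruction once its retention period (120 days in the text's example)
# has elapsed. Other record series follow their own schedules.

TRANSITORY_RETENTION_DAYS = 120

def is_destroyable(classified_as: str, created: date, today: date) -> bool:
    """True when Transitory content is past its retention period."""
    if classified_as != "Transitory":
        return False  # governed by a different retention rule, not this one
    return today - created >= timedelta(days=TRANSITORY_RETENTION_DAYS)

print(is_destroyable("Transitory", date(2012, 1, 1), date(2012, 6, 1)))  # True
print(is_destroyable("Contract",   date(2012, 1, 1), date(2012, 6, 1)))  # False
```

The explicit classification is what makes the destruction defensible: the policy, not a user’s whim, determines what goes.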
An automated process, on the other hand, allows organizations to relatively easily produce a full audit trail of their business transactions, reducing the time and effort required to respond to an audit. Because failed audits can result in costs in the millions of dollars depending on the context, this issue has come to the forefront as a market driver. Using the Content Lifecycle model, this complete audit trail of information supporting mainline business processes is attainable. For example, using OpenText’s Extended ECM for SAP, all supporting documentation around key business processes is captured, classified and retained based on a set of business rules.

The Presidential Memo – Managing Government Records

President Obama issued a memorandum to all agencies on Nov. 28, 2011, requiring each agency to appoint a senior official to deliver a plan for moving the agency from paper-based records management to electronic records, specifically email, social media, cloud solutions, and so on. Each agency has submitted its plan to the National Archives and the Office of Management and Budget. This summer a directive will be issued to all agencies, incorporating a number of ideas from agencies and the public on how to move to electronic records. As is the case with many government directives, this is an unfunded mandate. However, moving from paper to digital recordkeeping and processes, as we know, has a very significant return on investment. Therefore any solution deployed to satisfy this mandate, if deployed wisely, will pay for itself many times over.

Doing More with Less

One of the tag lines in the U.S. Federal Government these days is “doing more with less.” This refers to the dynamic of shrinking budgets and resources combined with increasing volumes of information to manage. It applies to the private sector as well as all levels of government in these challenging economic times.
Technology and automation are the keys to successfully meeting this challenge. Tools such as integrating RM and Information Governance into key business processes, email archiving, and new capabilities such as Auto Classification help organizations cope with these increasing volumes.

Big Bucket Retention Schedules

The new paradigm for RM includes a redefinition of Records Retention Schedules. Legacy schedules were developed around a paper-based paradigm and are typically very granular, with many arbitrary retention rules that are now unnecessary and undesirable. Many organizations have hundreds or even thousands of individual record series. In the past several years there has been an increasing trend toward Big Bucket Record Schedules that boil everything down to just a handful of “buckets.” In the context of an ECM/RM solution, the ECM system contains all the metadata needed to search for and find records, so the retention schedule only needs to determine how long to keep them. A streamlined Big Bucket schedule is easier to implement, is more intuitive, and makes compliance much more effective. The Government Accountability Office (GAO) is a leader in this trend, with a new schedule that contains three big buckets and a total of 27 sub-categories. Other agencies are following GAO’s lead, and similar moves are being made in the private sector. NARA has endorsed this approach, so it is becoming increasingly accepted.

To recap, the new market drivers that define the “State of the Art”:

• eDiscovery and the Federal Rules of Civil Procedure
• Redefinition of a Record to Include Everything
• Audit Readiness
• Presidential Mandate to Move from Paper to Digital Records
• Do More with Less
• Big Bucket Retention Schedules

Stay tuned for Part Two of this blog, on how the industry has responded to these new drivers.

Read More

Implementing an IG Program – Where to Begin?

Implementing an Information Governance Program – Where to Begin? I am not sure if this has anything to do with the latest heat wave in Ottawa, Canada – yes, we are experiencing summer-like weather in March. Trust me, I am not complaining, but spring is in the air and it seems to be making everyone wake up to a very important topic: where to begin with information governance? I have received a flurry of inquiries over the past week, all with a common theme – where should our organization start?

Something that is critical, and reiterated by many, is to ensure that ownership of an information governance program lies with the Chief Information Officer and Chief Legal Officer, while involving records management and the key stakeholders from different business units. It is a team approach, not something that can succeed if driven by only one team. Rather than writing an extensive blog post on all the key considerations, I have decided to talk about just one: start with the riskiest content.

One important element is to look at your riskiest content first, as it’s not always possible to conquer all requirements at once. There is nothing wrong with a phased approach; proving success in small increments can be a good strategy. One of the riskiest content types is email. Why? Simply because no one ever gets rid of it, or if they do, it’s done on a whim with no consideration of its content. This kind of activity can translate into enormous exposure to risk due to the likelihood of smoking guns and spoliation.

So, can email be managed? Absolutely. I strongly encourage flexibility, specifically when it comes to classifying email – one approach will not fit all. Some users will want the ability to drag and drop; others will want to choose their classification from picklists or favourites, while others will want the system to classify emails automatically. All methods are encouraged, as you want to provide a system the end user will actually use.
As an example, take a look – well, actually, have a listen – to how NuStar Energy addressed its email management challenges. NuStar Energy L.P. is a publicly traded limited partnership based in San Antonio, with 8,417 miles of pipeline; 89 terminal and storage facilities that store and distribute crude oil, refined products and specialty liquids; and two asphalt refineries and a fuels refinery with a combined throughput capacity of 118,500 barrels per day. With operations in the United States, Canada, Mexico, the Netherlands, the United Kingdom and Turkey, the company uses OpenText Email Management for Microsoft Exchange to help reduce the cost and risk of mismanaged corporate email.

“Having all email in one location, being able to search in one place and put a legal hold in one location instead of potentially seven or eight, is huge for us on the legal end. It’s all managed by OpenText.” — Clint Wentworth, Records and Information Manager, NuStar Energy.

There has been a lot of traction around the topic of auto classification. A very important element is to make sure the offering is defensible as well as transparent. Defensible, so that the organization can demonstrate to the courts and auditors the processes in place to test, tweak and monitor the way information is being classified. Transparent, so there is no disruption to the way our businesses work.

Yes, there is a lot to consider when starting to implement an information governance program. A solid strategy, involvement from key stakeholders and addressing your riskiest content first is a sound way to begin!

Read More

Governance: It Doesn’t Matter What, It Really Matters Who

As I just got back from LegalTech NY, where I spent many hours speaking with General Counsels and IT Directors about their business requirements, I wanted to share and repost a blog that is relevant to those discussions. The following is a guest post by Dave Martin, originally posted on Dave Martin’s blog on SharePointProMagazine. Read the other posts in this series here.

Over the years, one of the many things I’ve been involved with is governance. To most, the word governance is synonymous with compliance, which is in turn synonymous with records management. After that, the focus becomes very specific. What I recommend people do when trying to understand how they should approach governance is to treat it as a strategy, and make sure that strategy involves and intertwines three things: people, process and technology. If this sounds familiar, it was an integral part of the first post I wrote in this series about understanding SharePoint from a big-picture perspective. When it comes to governance specifically, there is a certain part of this triumvirate that stands out: the people. We often run headstrong into governance deployments without really understanding who needs to be involved before the code hits the servers and processes are under way.
The very first step organizations need to take is defining the small group who will steer the solution to and through implementation. Obviously IT pops up first as we look to define this working group, and they are unquestionably a very big part, as they will be responsible for the technology doing what it needs to do. Another group that should be considered a bit of a no-brainer is the group, department, or in many cases the individual responsible for records. This person may by title be the compliance officer, records manager, IT security or legal counsel; regardless, they are responsible for the information policy management of the organization. And last, but certainly not least, we must include someone, or some group, that represents the line-of-business worker, or end user.

Surprisingly, I have seen this last group consistently excluded from the planning process. Not because they are a problem or difficult to work with, but because the people who are actually going to use the solution are often an afterthought, or, as IT would consider them, the customer. DO NOT forget to include this group! At the end of the day they will literally make or break the deployment’s success, causing problems for the other groups at the table when they don’t understand the technology (frustrating IT) or don’t execute according to policy (putting the company at risk).

Once we have the right contributors at the table, we can start to define the governance strategy. When people are defining their governance strategy, I always recommend they ask themselves a few key questions to better understand what they want to do, who it will affect and what they need to do it. Once these questions have been answered, a plan can be more easily defined.
The first question is: do you understand your content? This is very important and can also be made as a statement: know your content! We have content broadly spread across our environment, not just in SharePoint. If we are planning to move large portions of that content into SharePoint – file share replacement is one of the top uses of SharePoint – think about what you are moving over. Is this relevant data? Is this data that must live under compliance? Is this duplicate data? Is this active data?

That last question is an important one to consider in terms of SharePoint. SharePoint is an active content solution, and a relatively costly place to store content. If you are moving massive volumes of data into SharePoint, it simply does not make sense, from a cost perspective, to move old, inactive content into it. That content should move directly into an archive that lives on a lower and cheaper tier of storage. Once again we must consider “the who” for a second here. Even though we are moving content out of SharePoint and into a more cost-effective, compliant place, we cannot forget that users should be able to access it or restore it (permissions pending) directly from SharePoint.

My next question is: what are your specific compliance requirements? This varies widely from company to company and industry to industry – every company has corporate policies specific to its internal requirements, and many companies must also adhere to industry regulations. SharePoint does a great job of managing the content in SharePoint as records, but does an even better job when supported by partners. As broad as SharePoint’s records capabilities are, when it comes to supporting industry regulations and government guidelines like the Department of Defense 5015.2 standard (DoD 5015.2), physical records, and records living outside SharePoint’s native repositories, a third-party add-on solution is a requirement.
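The tiering decision in the migration questions above can be expressed as a simple rule. This is a sketch only; the 18-month inactivity threshold and the function names are illustrative assumptions, not a product feature:

```python
from datetime import date, timedelta

# Sketch of the storage-tier decision discussed above: duplicate data is
# discarded, old inactive content is routed to a cheaper archive tier,
# and only active content lands in SharePoint itself.
# The ~18-month inactivity threshold is an assumed policy for illustration.

INACTIVITY_THRESHOLD = timedelta(days=548)  # roughly 18 months (assumption)

def target_tier(last_accessed: date, today: date, is_duplicate: bool) -> str:
    """Decide where a file share item should go during a SharePoint migration."""
    if is_duplicate:
        return "discard"     # duplicates should not migrate at all
    if today - last_accessed > INACTIVITY_THRESHOLD:
        return "archive"     # cheaper storage, still retrievable via SharePoint
    return "sharepoint"      # active content belongs in the active tier

print(target_tier(date(2010, 1, 1), date(2012, 7, 1), False))  # archive
print(target_tier(date(2012, 5, 1), date(2012, 7, 1), False))  # sharepoint
```

Applying a rule like this during migration keeps SharePoint, the costly tier, reserved for content users actually touch.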
And for my last question, we go back to “the who” again: how will we govern the people? For most, information governance has to do with the information, but we must also be sure to govern the people if we are going to be successful. This question relates to how we enable people to leverage the core strengths of SharePoint, and that all starts with the creation of Sites and filling them with content. Organizations must have a Site provisioning plan in place, or they put the organization as a whole at risk. Site sprawl is not just a myth, it is a reality, but it doesn’t have to be feared. Attaching a lifecycle and policies to a Site at the point of creation ensures that Sites are connected to the data center and can be managed under the watchful eye of IT. Not only that, but we can then monitor those same Sites and move them to the appropriate tier of storage once they have become dormant or inactive. Site provisioning allows organizations to permit the creation of as many or as few Sites as required, all in a controlled fashion.

As you can see, understanding “the who” when defining your governance strategy for SharePoint is a pretty big deal. Not to downplay the value of process or technology, but to use an analogy: it is the person who drives the car down the right road, and it really helps when that person knows where they’re going. Just like a good governance plan for SharePoint, people who drive cars will get to their destination faster if they have good maps.

To find out more, join me on February 21st at noon EST, when I’ll be participating in the webinar Extending SharePoint Across Your Information Infrastructure. You’ll learn key concepts required to turn SharePoint into a multifaceted, stable, and powerful IT tool set.

Read More