Enterprise Content Management

Announcing OpenText Content Suite: Accelerating Time to Governance

The air was buzzing as I walked out of the keynote presentations at Enterprise World this morning, after OpenText CEO Mark J. Barrenechea unveiled the details of the much-anticipated OpenText product releases. After months of preparation by teams throughout OpenText, Mark shared details of five new Enterprise Information Management product suites, new innovations in our Tempo family of products, and AppWorks, our new common API and developer platform across all of the OpenText product suites.

On behalf of the entire enterprise content management team, it’s truly my pleasure to announce OpenText Content Suite, a comprehensive, integrated suite of ECM products that accelerates time to information governance. OpenText Content Suite, the foundation for OpenText’s Enterprise Information Management (EIM) offerings, is a comprehensive Enterprise Content Management platform with applications and add-ons designed to manage the flow of information from capture through archiving and disposition. OpenText Content Suite ensures agile information governance and reduces risk while allowing organizations to focus on using information to drive strategic growth and innovation. Capture more content; govern all your enterprise information, no matter the source!

Agility is the key to addressing the challenges of the explosive growth of business information in an increasingly complex and dynamic regulatory landscape. OpenText Content Suite supports an agile strategy by building information governance into the applications users work in every day, managing content throughout its lifecycle. Content becomes smarter as it travels through the organization via collaboration and processing, its value enhanced with rich metadata that allows for greater control of security, visibility, findability, mobility, and validity.
What’s New in OpenText Content Suite 10.5

The OpenText Content Suite is a comprehensive Enterprise Content Management platform with applications and add-ons to manage the flow of information from capture through archiving and disposition.

Content Suite 10.5 is Dependable

Simple to administer. Flexible to deploy. Bulletproof information governance you can depend on for the evolving needs of your enterprise. Information governance needs are evolving to include new content types and increasing demand for deployment and management of content in the cloud. Based on OpenText’s proven Archive Server platform, two new visionary products were announced: OpenText Email Archive for Google Apps and OpenText Cloud Archive. These offerings expand our footprint and extend information governance further into the cloud, supporting governance of popular Google and O365 cloud-based mail, file systems, and any CMIS content type.

Content Suite 10.5 is Extendable

Easier, faster, standardized, and documented development tools make rapid application creation and system integration infinitely easier. This release contains a number of features that make life simpler for developers, allowing them to work in standard development environments that do not require special programming skills. A common, documented, standard layer of development tools, including a new REST API based on AppWorks and a contemporary IDE based on Eclipse, ensures developers can simply and easily build applications that integrate with and extend the Content Suite platform. Ease of use, coupled with the ability to modify the experience completely through UI widgets and visual access tools such as ActiveView, further allows organizations to build information governance into business processes.

Content Suite 10.5 is Adaptable

Adapting the user experience to business strategy improves adoption, efficiency, and impact. Adoption is key to a successful ECM deployment.
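To make the REST integration idea above concrete, here is a purely hypothetical sketch of what client code calling such an API might look like. The host, path, payload shape, and token format are placeholders of my own invention, not the documented AppWorks REST API; the point is only that a documented REST layer lets developers work in any standard environment.

```python
import json
import urllib.request

# Hypothetical illustration only: the URL path, payload fields, and bearer-token
# auth below are placeholders, not the documented AppWorks REST API.
def build_create_document_request(base_url, token, parent_id, name):
    """Build (but do not send) a REST request that would add a document
    under a parent folder node in a content repository."""
    payload = json.dumps({"type": "document", "parent_id": parent_id, "name": name})
    return urllib.request.Request(
        url=f"{base_url}/api/v1/nodes",
        data=payload.encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",  # token format is an assumption
        },
        method="POST",
    )

req = build_create_document_request("https://example.invalid", "demo-token", 2000, "contract.pdf")
print(req.get_method())  # POST
```

Because the request is just built, not sent, the sketch runs without any server; a real integration would pass it to `urllib.request.urlopen` (or an HTTP library of choice) against the actual documented endpoints.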
Over 600 customer- and business-driven features have been added to the Content Suite. Key areas of focus include a more visual UI with document thumbnails and dashboards; ease of use and administration, with features like HTML5 drag and drop; and mobility, with ECM Everywhere and Tempo.

Want to learn more? This is the first in a series of posts discussing the latest innovations in Content Suite 10.5, so please watch for more news to come! For those of you on Oracle’s ERP system, you should also check out my post on OpenText Extended ECM for Oracle® E-Business Suite, another new ECM-focused product offering announced this week. This new product brings together OpenText’s industry-leading ECM with Oracle’s best-selling ERP system, E-Business Suite, to drive productivity and governance within critical ERP-driven business processes.

For more information on the OpenText Content Suite see www.opentext.com/content, and to find out more about Extended ECM for Oracle® visit www.opentext.com/oracle. To learn more about how OpenText can help support you with agile information governance, visit www.opentext.com/infogov. And if you want to follow the action as it unfolds this week, follow us on Twitter with the hashtags #OTEW2013, #OTinfogov, and #ECM!


Do you Learn by Experiencing, Memorizing or Retrieving Technical Content?


I still remember the phone number of my best friend in elementary school. I recall walking home from kindergarten, muttering David’s number to commit it to memory so that I could have my mom call his mom, because as a kindergartner, that’s how we communicated with friends (and because my six-year-old self apparently didn’t have the problem-solving skills to write the number down). I still remember my grandmother’s number as well. Those tiny pieces of knowledge were so valuable at the time that I memorized them permanently for safekeeping.

Today, all of us approach knowledge, learning, and memorization in a fundamentally different manner, thanks to several significant changes. I think not only about what learning differently means for our Documentum and Captiva training courses, but also for all of us as working professionals, parents, and community members. I thought I’d share a take on three trends in particular.

1. Memorize or Retrieve

The first shift has more to it than the surface-level discussions I’ve heard on the topic. We are all well aware that we have a powerful immediacy of knowledge, whether we learn information through Google searches, wikis, or discussion forums. The more interesting aspect of this immediacy, to me, is that once I know I can go and get information, I don’t feel as compelled to memorize it. As part of my daily tasks, I could memorize things like the order of the arguments in SQL Server’s DATEADD() function and the different order in MySQL’s DATE_ADD(). But since it’s essentially a 15-second Google search, I choose not to memorize this bit of data. The critically important thing to know is not the syntax, but rather that the function exists, what it does, and that the syntaxes vary. I have the knowledge, but have decided that I don’t need to retain the specific data needed to apply that knowledge, because it is simple to look up. In our Education world, this is why we have both reference material and developer forums.
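As a concrete illustration of that 15-second lookup, here are the two vendor syntaxes side by side, with the equivalent computed in plain Python for comparison:

```python
from datetime import date, timedelta

# Same operation, different argument order -- the detail worth retrieving
# rather than memorizing:
SQL_SERVER = "SELECT DATEADD(day, 7, '2013-11-01')"           # (unit, number, date)
MYSQL = "SELECT DATE_ADD('2013-11-01', INTERVAL 7 DAY)"       # (date, INTERVAL n unit)

# Both add seven days to the same date; here is the result in plain Python:
result = date(2013, 11, 1) + timedelta(days=7)
print(result)  # 2013-11-08
```

Knowing that both functions exist and that their argument orders differ is the durable knowledge; the exact signatures stay in the reference material.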
2. Becoming Your Own Curator

This brings up the second major shift I see affecting our learning: we are becoming curators of our own knowledge stores. Each of us personally balances which information is worth learning and which is only worth retrieving. Some people publish blogs just so they have a log of what they’ve done and can relocate it. This emphasizes the power of the cloud. I use Evernote as my own documentation system: my own private cloud. Meeting notes, favorite queries, frequently used documents, and ideas and sketches for past, present, and future projects are all captured in a single repository that can be searched and accessed across all of my devices.

Technical learners are even more judicious about what they know and choose to learn, and how they curate. For some pieces of knowledge, they use forums to ask colleagues or search for an answer. For knowledge they feel differentiates them or is useful to progressing their career, they often choose methodologies or learning paths that allow for instructor interaction, hands-on trial-and-error opportunities, and networking with others in a class. I think of it almost like museum curators: they know what to keep in the front of the gallery, what to put in the basement, and what to get on loan from other institutions when they need it. All of us are starting to curate and parse data to learn, and it’s likely something we will all continually get better at. Not only will we continue to improve at this skill, the tools available to us to curate will continue to evolve and improve, making the task easier and more seamless.

Going back to real-world Education scenarios, you may use Evernote to capture a key phrase or workaround an instructor shared. And you might choose Microsoft Outlook Contacts to safeguard the email addresses of interesting students you met in class. But you likely won’t try to memorize the core material, since you’ll have it in a reference document.
3. Breaking Time & Space Boundaries

The third change I’ve witnessed is the break in synchronicity. We already broke the need for students to be in the same place at the same time by delivering courses and training live online. Students are dispersed physically, but they experience the class at the same time online. The dream of e-learning is to break time as well, to make each person independent of both time and place. This reminds me of how content in popular culture has evolved: from individuals reading books, to groups of people watching movies in a theater together, to social media groups watching mini-trailers individually yet commenting collectively.

That break between time and place and how the knowledge is experienced changes the role of the curriculum developer, and greatly expands it. They need to anticipate what learners need, how they might curate it, and, back to the first trend, what those learners may retain or instead choose only to retrieve. This anticipatory curation by the curriculum developer becomes critical to our students. If our technical publications team writes the book, then our curriculum developers produce the movie version of that content. They choose where to point the camera and how to develop the scenes. They determine the areas they think are most important in that core body of knowledge. Furthermore, understanding the mechanisms learners use today, our training is weighted even more toward imparting knowledge and hands-on skills, rather than just providing information that can easily be looked up. Students leave our classes knowing how to use our products, how to find additional information, and, critically, the capabilities (and limitations) of the products.

Obviously we are already very close to overcoming more time and place barriers, thanks to tools like Skype, Google+ Hangouts, and other web conferencing tools like WebEx.
One remaining area to explore is video in the virtual classroom. Should we show the face of each student to the other students in a virtual class? Will that turn the tables on who the most skilled instructor is and who runs the class? Or have we reached the point where collaboration and interaction are as valuable a skill set as the trained-instructor model? We are considering all of these questions now, and would appreciate hearing your thoughts on what you’d like to see in future courses. In future posts, I’ll talk more about how trends in learning are affecting how we design and deliver educational services. For this post, I’d be very interested in your feedback, especially on whether adding video of our student participants fills you with fear or excitement. Please post comments below.


Invisible, Ubiquitous, ‘Intelligent’: Enterprise Content Management in 2018

Guest Blog By: John Fisk, SAP

In planning for an Enterprise World session about the future of ECM with Michael Cybala from OpenText, I spent some time thinking about what the market expectations will be in 2018, and how new technology might enable those possibilities. As prognostication is always more fun than working, I thought I would share a few of the ideas that emerged from the exercise. Some are mundane, and some are far out, but I’d be very interested in feedback on any level, so please comment freely! (Note: these thoughts are only a personal perspective, and do not reflect any future product commitments by either SAP or OpenText.)

Invisible ECM

Over the past seven years, SAP and OpenText have developed a product portfolio built on a simple value proposition: integrate content and ECM capabilities directly into SAP processes and applications, combining ‘seamless’ UIs, integrated data hierarchies, unified access controls, and many other features into a unified solution. This integration has resonated with every industry in every region, and we believe our joint solutions are the best in the world, but there is still a long way to go. Over the next five years, customers should expect the ECM platforms of today to become ever more deeply integrated with all major applications. The key point is that this integration should become deeper, more pervasive, and ever more invisible to end users. Most of the time, they should be blissfully unaware that they are creating, sharing, or accessing content managed by a central ECM system. In a perfect world, the right content is always available when they need it, in familiar applications and systems of engagement. By 2018, I expect an ever deeper set of native integrations, repository abstraction layers, APIs, and web services will enable applications to instantly and invisibly find the content and functionality needed in almost any situation.

Ubiquitous ECM

A related theme is the ubiquity of ECM.
ECM capabilities will be made available across an ever-expanding set of devices, interfaces, and UIs. In 2013 terms, that means end users can access this functionality via web, email, paper, text, mobile, tablet, desktop, or application-specific views. By 2018, one might imagine that this includes a slew of voice-driven interfaces, a wide range of social media tools, digital agents, or even robotic assistants. Who knows, maybe by then home deliveries of milk will be made by robotic ‘deliverymen’ and accepted by an authorized home security system which, after reviewing the entire digital record of the cow’s health, production information, and delivery records, will accept the food with a digital signature. Or surgically embedded retinal projectors will present media to end users directly on the inside of their eyeballs… Oh yes, I am getting ahead of myself. But let me opine that for almost any sci-fi scenario you can imagine concerning information access and presentation, ECM and other technologies will be critical enablers.

‘Intelligent’ ECM

And that brings us to ‘intelligence’. Note the quotation marks, please: we are clearly still a long way off from what anyone might call true intelligence. Here we just mean the ability to improve process efficiency through more advanced technology. Today, many leading ECM solutions offer little more than a crude method of attaching static content to a process. Almost all other steps to support content-rich processes (think accounting, procurement, contracts, HR, etc.) require a lot of mundane human effort. SAP and OpenText have advanced the ECM market by integrating ECM UIs, data structures, access controls, and more with SAP processes, embedding OCR/ICR capabilities, and combining workflows in order to ‘intelligently’ present content in the context of business process information.
But again, there is far more we can do, and by 2018 I believe we will be starting to deliver genuinely ‘intelligent’ content management that eliminates almost all mundane work from processes. I know what you are thinking: we have been hearing this promise for years, and most of the individual technology pieces have been available for years. But only now can we combine these more advanced capabilities with a system that offers an underlying integration between structured and unstructured information. In short, combining tools such as semantic analysis, rules-based decision engines, ICR, fuzzy search, and, most importantly, integration with related structured content (i.e., Big Data) may finally yield a practical system that can ‘understand’ the meaning and context of content, and ‘intelligently’ make basic decisions about it. It would be a revolution long in coming.

Evolving Underlying Technology

All of these changes depend on the continued evolution of enterprise information technology. At the deepest levels of the stack, SAP has embarked on a journey to reinvent the underlying infrastructure for enterprise information with the invention of our HANA in-memory database. Keeping all enterprise information in a single, real-time database has many benefits, such as IT simplification, speed, and the ability to keep ‘one version of the truth’ for both production and analytic data. These are all relevant to our vision of 2018, but one of the most intriguing possibilities HANA enables is fully automating transaction-analysis feedback loops. In the past, analysis was conducted on different systems and on a different cycle from process transactions.
But by integrating process information, unstructured content, and the ability to run analysis on all of that information, we have an environment where information can be captured, analyzed, and integrated with process workflows; transactions executed; further analysis conducted; and content generated and output, all in a matter of seconds.

So What?

Technology is great, but why does any of this matter? Well, consider one obvious example: the laborious process of applying for a home loan today, with reams of documentation going back and forth between the lender and the borrower in an elaborate and tedious due diligence process. In advanced banks, almost all relevant data about an individual’s credit and employment history is already available digitally in various backend systems, and all the forms can be created dynamically. There is no reason that by 2018 this process cannot be completed within seconds, far more accurately and efficiently than today. Or, a little further out, imagine how digital personal assistants might evolve if they were able to automatically interact with the vast amount of digital information surrounding an individual. Assistants might be able to automatically interpret email and spoken requests to schedule meetings, arrange doctor visits, handle insurance paperwork, develop simple legal agreements, and so on. But this would require systematic ways of managing data and unstructured content, analyzing it through various lenses, and then taking actions. Again, it may sound like sci-fi today, but the technology pieces are ready. The world just needs someone to put them all together and surprise us. What do you think?


The Generation Gap in Engineering


Visiting engineering companies, I often hear the same story. And when I recently met with one of my long-standing clients, I heard it again: “I’m having a hard time with the new generation of engineers. They constantly complain about the usability of our applications. They don’t want to be bothered with the intricate processes to access our engineering documentation, standard operating procedures, and compliance information, especially because it’s difficult for them to remember what buttons to click and what screen to use to perform their tasks. To make matters worse, they also have to go to our portal, which is a separate application, to perform searches, which return mostly irrelevant results.”

So what are they looking for? The answer is simple. These new engineers want a Facebook, Twitter, and Google experience all at once. They want to interact with information and processes at work in exactly the same way they do at home or in their personal lives. The duality between their business and personal experiences is simply not working, and it is making them less efficient. But it seems that simply providing a super-intuitive way of working is not enough. The applications need to be intelligent in the way they control and secure information, trigger complex processes, form intrinsic relationships, and generate compelling intelligence without burdening end users or forcing them to do it manually. The engineers want content to be automatically delivered to them based on who they are, where they are, and what they are working on. To sum it up, these applications need to be extremely flexible so they can immediately adapt to changing business needs.

I wholeheartedly agree with this view. After all, these engineers, architects, document controllers, and project managers are smart and highly educated users. Delivering design specifications or working through an operating procedure is already complex enough without making them work in unintuitive applications.
It is simply a recipe for disaster. Out-of-date applications aren’t unique to the engineering world; CAD drawings also seem like they are “locked in a dungeon.” When they were introduced, these applications were a significant advancement over manual processes, so engineers, especially experienced engineers, tolerate these tools, even though they are clunky compared to today’s sophisticated applications. Perhaps it’s time to re-evaluate them through the eyes of the next generation and gain a fresh perspective. In another blog, I’ll explain how this challenge can be solved using technology that exists today. Any guesses?


Mobility and Security Spark Innovation in File Sharing Today, but What Else Tomorrow?


History keenly documents how the industrial-age push for productivity eventually led to an innovative electric light bulb. That contained energy, in turn, vastly impacted how people worked, lived, traveled, and experienced our world. I don’t think it’s a far stretch today to look at the work and personal lifestyle pressures that are prompting innovation in our content management space. Two trends in particular are at the forefront of our new Syncplicity release today: enabling mobile file sharing and protecting the mobile, enabled users.

1. Enabling Mobile File Sharing

The power, storage capabilities, and screen styles of mobile devices have not only changed how people share content, but have catalyzed their motivation to share it. When sharing files with colleagues is intuitive and touch-enabled, the odds quickly improve that people will do it more often. As we’ve seen with the skyrocketing number and usage of consumer apps, offering a compelling, highly functional, and well-designed interface builds volumes of users. Looking at mobility through a content management lens, the average worker today owns about three to seven devices and constantly interacts with unstructured data, especially Office files. Mobility to them means intense travel schedules, a mix of personal and commercial use on the same device, and time pressures forcing the need for significant productivity. They have little time to respect process, policy, or seemingly “intrusive” security controls.

These dimensions of mobility have sparked innovation in our Syncplicity design principles. Make file sharing a simple, productive, and enhanced experience that makes users want to use it again and again. Use a cloud-delivery model to keep features comparable to, if not better than, innovations in consumer apps, to ensure users consistently use it over time. And keep adding capabilities to shave off user time, worry, or frustration.
But mobility cannot be viewed as a stand-alone pressure, or security risks will rise together with usage. Just as electricity eventually spawned the need for circuit breakers to prevent fires, a powerful file sharing solution needs to address security concerns as well.

2. Protecting the Mobile, Enabled Users

While the free flow of files may thrill and delight end users, it may also alarm and disturb those charged with protecting an organization. Files may contain sensitive information that provokes legal action, intellectual property, internal process instructions that can be exploited, and many other types of content that are truly harmful if they fall into the wrong hands. Together with innovation for mobile users, the latest Syncplicity release equally builds in several powerful security capabilities for IT professionals. Set policy across files. Graduate the levels of protection based on the user’s geography, profile, or other parameters. Choose whether data resides in the cloud, in your private network, or in a hybrid environment. Thanks to the intuitive design that attracts users, IT can also set security policy across a single solution that’s happily and voluntarily used by the majority of stakeholders. Finally, user convenience does not come at the expense of compliance, and security in fact enhances productivity rather than detracting from it.

What’s Next?

As I look at these two formerly opposing trends and see how we’ve woven innovation together to address them, I also think about what’s next. Now that users can safely share files around the world, will they create more of them? Now that we have the broadest and safest levels of reach for content, will global knowledge and productivity rates rise? Once users get a taste of “awesome” secure apps in their enterprise, will that pressure HR benefit tools, sales tools, and other enterprise apps to improve as well?
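The idea of graduating protection by geography, profile, or other parameters can be sketched in a few lines. This is a purely hypothetical illustration of parameter-driven policy selection; the rules, role names, and storage tiers below are my own placeholders, not Syncplicity’s actual policy model.

```python
# Hypothetical sketch: pick a storage location and protection level from
# user parameters. Rules and tier names are illustrative assumptions only.
def storage_policy(user_geo: str, user_role: str) -> dict:
    if user_geo in {"DE", "FR"}:          # e.g. an EU data-residency rule
        location = "private-network"
    elif user_role == "contractor":
        location = "cloud-restricted"
    else:
        location = "cloud"
    protection = "high" if user_role in {"legal", "finance"} else "standard"
    return {"location": location, "protection": protection}

print(storage_policy("DE", "engineer"))  # {'location': 'private-network', 'protection': 'standard'}
print(storage_policy("US", "finance"))   # {'location': 'cloud', 'protection': 'high'}
```

The design point is that users see one consistent sharing experience while IT varies the backing store and protection level behind the scenes.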


A Golden Solution for National Health IT


Having been born in the former Soviet Union, I was not fortunate enough to enjoy a high-quality public healthcare system. However, after the collapse of the Soviet Union, the countries that formed the CIS bloc and neighbouring European countries started modernizing their social services. While I was only able to observe the changes happening from 1990-2000 in Russia, I gained a more global view once I began my career in health IT in the early 2000s. With that said, I will be writing a series of blog posts sharing my views and experiences on the differences and commonalities in national healthcare systems across different emerging markets and the so-called “greenfield” countries, where real modernization only started a few years ago.

And so, right to the first thought. As most everyone knows, starting from scratch isn’t easy. However, in health IT you hardly find any country, or even a region within a country, that has a similar healthcare system legacy. Everyone is essentially starting from scratch, because no two regions have the exact same circumstances. For example, some countries, like Georgia, passed through a centrally driven privatisation, which made it challenging for national initiatives to have an impact at the local level. Poland has a mixture of private and public medical institutions, so private institutions can participate in the public healthcare system. Ukraine has no working insurance system at present and still relies on direct funding. Countries like Russia have created their own standards in health IT, but this is less common. These differences are amplified by scattered investments and a need for a visionary approach. Even though attempting to replicate development models from mature countries can be quite challenging, countries continue to look to one another for best practices. We recently saw a Russian delegation visit Singapore to get an idea of how quickly a national EMR can be built, and an Armenian delegation did the same in South Korea.
The profiles of the visited countries are very different from those of the visitors; however, any successful implementation is worth exploring. There are still many projects in the health IT space that have not yet reached national scale. In later posts I hope to deep-dive into a working system, but for starters I’d like to share a few factors that I have observed in many underfinanced healthcare systems:

1. Government needs a quick win with citizens and medical personnel

Depending on the country’s or region’s size, both in population and in scale, a cloud approach is often the simplest way to go. Most likely the underlying healthcare providers aren’t well automated or reliable in terms of sharing data. A lightweight centralized EMR, alongside the remaining paper-based processes, can be quickly provisioned from a government cloud to the underlying medical institutions. Citizens and medical doctors will benefit from an integrated and unified medical record, even with limited but centralized content. It is important to create a future-proof architecture at the very beginning so it will be possible to scale and develop systems further. The safest approach is to base it on emerging interoperability standards like those recommended by IHE.

2. Healthcare Ministry Approaches

The lack of funding for some public social services has prevented various ministries from driving complex modernization projects, and IT projects are no exception. Nations that scale health IT across a variety of stakeholders, system legacies, and influences require strong project management skills. This is especially important when the Ministry of Health doesn’t control the budget directly and an IT-related ministry or the Ministry of Finance is paying for and implementing systems. Closer collaboration between ministries is recommended. Different goals and objectives should be accommodated, and a balance between operational efficiency and quality of care should be found to achieve those goals.
In such circumstances, a national data-sharing platform or information exchange hub can satisfy many participants and can be chosen as a first step. In the following phases, the project could be split into financial, medical, infrastructure, and other streams under one national program.

3. Healthcare Standards Maturity

A lack of modernised legislation holds back any transformational project, especially those in the health IT realm. Standards are the foundation of any healthcare system, so nothing can change before the legislative base is in place. Some changes in legislation take months or even years; however, in some cases I have seen IT project preparations trigger the quicker introduction of supporting laws, such as electronic signatures, changes to medical record retention policies, and decisions on ICD-9 or ICD-10 and LOINC implementation in the country. By nature, these legislative and IT projects shouldn’t be united: they are not dependent on each other, but they obviously influence each other and support transformation for the sake of a global modernization program.

These considerations are not a recipe for success, but they are important elements to consider. When implementing a large-scale healthcare IT system, it is crucial to work with the guidance of a team of trusted companies who can work together and bring unified solutions to the table. In my next post I will share an example of a working consortium in one of the former Soviet Union countries that is successfully building a national electronic medical record system.
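The “integrated and unified medical record” idea from the first point above can be sketched very simply: institutions contribute partial records, and a central service merges them per patient. This is a hypothetical toy sketch; the field names and structure are my own illustrations, not any IHE profile or real EMR schema.

```python
from collections import defaultdict

# Hypothetical sketch: merge per-institution submissions into one unified
# record per patient ID. Field names are illustrative assumptions only.
def merge_records(submissions):
    unified = defaultdict(lambda: {"encounters": []})
    for sub in submissions:
        record = unified[sub["patient_id"]]
        record["patient_id"] = sub["patient_id"]
        record["encounters"].extend(sub["encounters"])
    # Keep each patient's history in chronological order.
    for record in unified.values():
        record["encounters"].sort(key=lambda e: e["date"])
    return dict(unified)

submissions = [
    {"patient_id": "P-1", "encounters": [{"date": "2013-03-02", "site": "Clinic A", "dx": "J06.9"}]},
    {"patient_id": "P-1", "encounters": [{"date": "2012-11-15", "site": "Hospital B", "dx": "I10"}]},
]
merged = merge_records(submissions)
print(merged["P-1"]["encounters"][0]["site"])  # Hospital B (earliest visit first)
```

In a real deployment the merge logic, identifiers, and message formats would come from the interoperability standards (such as the IHE profiles mentioned above), which is exactly why a standards-based, future-proof architecture matters from day one.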


Scaring Away Your Content Management Goblins


Common Disguises that Hold Back Your Organization

Halloween is in the air, and it’s a good time to face your fears and scare away any content management goblins holding back your organization. Are you worried you can’t trace your cash flow accurately, even if you’re a regulated financial institution? Are you sure five years’ worth of compliance documentation is saved “somewhere” and easy to retrieve? Finding some “treats” in your company’s content can be as simple as collecting candy. But you need to start by looking behind the seemingly intimidating masks and myths.

1) “I’m already backing up my data on an external hard drive.”

This disguise is the false belief that backing up information is the same as retrieving it, manipulating it, and gaining value from it. Just because you dress like a doctor on Halloween doesn’t make you a skilled heart surgeon. It’s one thing to relocate a file on a hard drive. It’s quite another to easily extract structured metadata such as “order date” or “trade number” from a document buried amidst hundreds of terabytes of stored information. Reusing stored data to protect or advance your business goals is the real “treat”; backup alone is only a “trick.” The growing trend towards BYOD (bring your own device) and BYOB (bring your own backup), which companies increasingly have to contend with from a security and manageability viewpoint, creates a number of key problems that chief risk and information officers can well do without. The deeper issue is that organizations have information on disparate file shares, resulting in fragmented silos of information (creating access issues), unsecured records stores, and unusability as a trusted records management system, since file shares lack basic content lifecycle and compliance capabilities.

2) “There’s too much information to control.”

The continually changing variety, volume, and velocity of data in your organization might feel overwhelming at first.
But it doesn’t mean an ostrich suit is your only option. Instead of burying your feathered head in data, take a step up and focus on your most pressing business problem. Which issue is costing you the most money (e.g., “I can’t reconcile cash flow,” or “Can we meet our imminent funding obligations without borrowing?”)? What pieces of information might help resolve this problem (e.g., a cleared funds position delivered to my traders every hour)? With frequent regulatory changes and rising compliance obligations, fortunately there are industry-leading solutions to help. Specifically, tools are available today that enable you to 1) develop the right information policies to meet your compliance obligations; 2) analyze, search, and discover only the relevant content that needs control; and 3) automatically enforce those information governance policies across your entire organization. Spend time – or hire someone to spend time – looking at exactly how your business operates, what the key challenges are, and how reusing information can work to your advantage. Not only will you find that your initial problem gets resolved, but additional goodies may also be generated along the way (e.g., faster audit times, happier employees, more multinational customers). 3) “All of my divisions, lines of business, or customers are different. There’s no way to standardize.” Perhaps the most menacing goblin of all, this fear creates inertia and cost far more damaging than facing any content management task. Do you really need multiple disconnected, disparate applications just because the banking system in France is different than in Switzerland? Or can you structure content, parameterize and personalize account schemas and information views, and share information in and out of newer, consolidated systems? For every legacy application you decommission, you eliminate the need to keep paying for license fees, specialized staffing, and never-ending integration work.
Not to mention you’ll stop employees from taping passwords right onto older, little-used machines…a security and compliance risk and a practice sure to come back and haunt you. Instead, take your legacy structured and unstructured data – even with its differences – and place it into a single, manageable retrieval repository in the most modern format. Your content will become accessible, reusable, and easy to analyze, retrieve, and manipulate – and change from being a dead storage headache to being a valuable asset to the business. In one example, pinpointing data discrepancies helped a large European bank drive to standardization. It was evident that simple and common data elements such as a ‘date’ field meant different things across their geographically dispersed bank branches. “10/11” in the US is October 11th. In the UK, it means November 10th. Identifying and unifying the format and syntax of data elements within all documents, such as dates and values, enabled previously onerous data transformation tasks to be automated. Nuances were kept while standardization was brought in, enabling true Straight Through Processing. In follow-on blogs I’ll address more specifics about content management in the financial services sector. Post your questions below and I’ll be happy to address them.
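The date-format ambiguity described above (“10/11” reading as October 11th in the US but November 10th in the UK) is the kind of data element that can be normalized automatically once format and syntax are identified. As a purely illustrative sketch, here is one way that normalization might look in Python; the region-to-format map and function names are invented for this example and are not part of any actual product.

```python
from datetime import date

# Hypothetical region-to-format map; a real system would read this
# from document metadata or branch configuration.
REGION_FORMATS = {
    "US": ("month", "day"),   # "10/11" -> October 11
    "UK": ("day", "month"),   # "10/11" -> November 10
}

def normalize_date(raw: str, year: int, region: str) -> date:
    """Convert an ambiguous 'a/b' date string into an unambiguous date."""
    first, second = (int(part) for part in raw.split("/"))
    if REGION_FORMATS[region] == ("month", "day"):
        month, day = first, second
    else:
        month, day = second, first
    return date(year, month, day)

print(normalize_date("10/11", 2013, "US"))  # 2013-10-11
print(normalize_date("10/11", 2013, "UK"))  # 2013-11-10
```

Once every branch’s dates pass through a step like this, downstream transformation tasks can be automated without losing the regional nuance, which is the essence of the standardization the bank achieved.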

Read More

ABA Convention 2013: Banks Bounce Back and Look for Ways to Improve Customer Experience

First published on ECM Trends from the Field. The 2013 American Bankers Association (ABA) Annual Convention in New Orleans has just drawn to a close. Banks are bouncing back! Actuate Content Services Group attended the ABA Convention to greet C-Level bank executives from all over the US. We also held a Power Breakfast Session to exchange ideas on how modern banks use analytics to improve customer experience. But first let’s talk about the Convention experience and, at the end, about the feedback we received from the Power Breakfast Session attendees. Most of the ABA Convention attendees were C-Level executives from small, community banks with assets of <$1B, and they were mostly from the Southeast given that the venue was New Orleans this year. We talked to many banks and received great insight into some of the things that are important to them. Dodd-Frank is in the midst of being fully implemented, with tighter lending guidelines for QM, or Qualified Mortgages, hitting in Jan 2014; however, the head of the CFPB stated that there would be a gradual easing-in of the guidelines with little to no litigation in the first few months. The bottom line for the small banks was that if they have a traditionally good track record making loans in their markets with acceptable loss rates, this new legislation should have little or no impact. The reactions of the CEOs ranged, however, from the activist “we should show up with thousands to march on Washington” to the more resigned “let’s not dwell on the regulation but focus on our business.” The attitudes of the CEOs matched almost exactly the geographies from which they hailed: the West Coast CEO focused on a vibrant, young workforce; the East Coast bank implemented new technology, but only to the extent that it would pay off with its specific customer profile.
The CEO from a deep south bank explained how his bank’s branches had been “slabbed” during Katrina, a verb meaning completely demolished save the cement slab upon which the building used to sit. But the employees were at work the next day at fold-out tables doing business, ultimately handing out up to $100M total in cash to bank customers needing money to evacuate immediately. I’m not sure what was more moving – the image of employees at those tables in front of their erstwhile branches, or the news that the bank experienced less than 1% loss from this cash handout despite the complete lack of ID validation. “How can you force someone to produce ID, when you can plainly see that they ‘swam’ out of their house this morning.” This, said the CEO, is proof positive that we are still a nation of trustworthy individuals. This particular bank has expanded rapidly since Katrina, capturing a double digit increase in market share right after the storm. How’s that for a positive customer experience? The best part of Actuate’s presentation to the ABA attendees during the Power Breakfast Session on 10/22 was not the presentation itself but the really good questions that followed. Doug Koppenhofer, Regional VP, delivered the presentation and answered the questions: Q: Is it possible to do Communication Management completely independent of Analytics? We answered that yes, one could definitely find low hanging fruit in the customer communication area especially in correspondence, without having to tie in analytics. We explained that many banks in particular are getting a better handle on the way they manage the correspondence with their clients to ensure that they leverage their brand, manage message consistency and even compliance. 
We mentioned the example of the bank we met at the Customer Experience Exchange that found countless examples of letters to customers stating that they “did not meet their standards.” We do believe that analytics helps you embed much more personalization and power into your messaging, and that most banks will at some point couple analytics to their communication hubs. Q: Can you elaborate on what impact privacy laws might have on using publicly available social media and other content as your source for customer messaging? My only answer to that was that we are not the compliance or banking law experts. In talking with a compliance vendor later, he saw no issue with this because, as individuals, we opt in to disclosing all manner of personal information to Google, Facebook, etc. Bottom line: if the banks decide not to take advantage of this publicly available information, somebody else may do so instead. Q: Any ideas on how to know the identity of clicks to specific ads you would embed? That’s a darn good question, and I wonder if there is any way the e-Ad could somehow capture session information. The question was posed by an SVP of Marketing at a 20-branch bank that is doing very well embedding ads in its real estate channel to drive sales to its lending channel. The bank is getting a lot of good clicks, but how could it ever tell who it needs to follow up with? More to follow, as we could not come up with a complete answer to this very good question. This Q & A session was first published on the ECM Trends from the Field Blog by Doug Koppenhofer. You can find the Power Breakfast Session presentation here.

Read More

How Life Science Organizations can Leverage Technology to Achieve Compliance With GMP Standards


Life sciences organizations face more pressure than ever to achieve operating efficiencies, minimize risk and achieve cost savings. Information is the lifeblood of an organization, and failing to manage the exploding data and documents typically scattered throughout the enterprise can sabotage an organization’s efforts to achieve its operating objectives. Unfortunately, many organizations find themselves in “reaction” mode, often employing ad hoc information approaches that can be implemented fairly quickly with familiar technology. All of this leads to disparate and unreliable solutions that can also render an organization vulnerable when it comes to managing compliance with GMP standards. Gaining Control with Enterprise Content Management Another approach is to tackle the data explosion head on by viewing technology not just as part of overhead, but as a key strategic asset. The question becomes “What is the best approach for moving forward?” One option is to implement an Enterprise Content Management (ECM) solution. At its core, ECM comprises integrated strategies, methods, and technologies that consistently create, store, distribute, discover, archive and manage content across the enterprise. A well-designed ECM solution supports compliance with GMP standards through automated management of standard operating procedures and process control documents. The technology is important, but it is just the beginning. To achieve the full benefits of ECM, consistent end user adoption is critical. So, before you select and implement a solution, consider the following: Consistency and Accountability: Select a system designed to provide a consistent user experience with vendor accountability throughout the process. With the OpenText™ Documentum for Life Sciences solution suite, the application layer and repository are built and supported by the same vendor.
This approach promotes accountability, dependable development standards and simplified training – all of which support increased efficiency and effectiveness for workers sharing data. Ease of Use: If a system is hard to operate, users won’t adopt it. For compliance objectives to be reached, the system has to be used, and used correctly. For optimum results, implement a system that provides an intuitive interface and can be configured to deliver personalized views and functionality that meet the types of requirements found in life sciences organizations. Configuration vs. Code: Implement a system that provides a truly configurable user experience. For example, the Documentum for Life Sciences solution has removed the need for custom coding, simplifying the user experience and enhancing the productivity of knowledge workers by allowing their experience to be easily tailored to their specific requirements. The configuration approach expedites project implementations and reduces total cost of ownership while enabling life sciences organizations to build and deliver content-centric solutions that meet the needs of the business. With careful planning, life science organizations can achieve compliance with GMP standards through a proactive implementation of ECM that leverages technology and fosters consistent use for effective content management throughout the enterprise.
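To make the “configuration vs. code” idea concrete, here is a toy sketch in Python: the user experience is described as data rather than custom code, so tailoring a view means editing configuration, not writing new logic. All role names, field names, and the configuration shape are invented for illustration and do not reflect the actual Documentum for Life Sciences configuration model.

```python
# Hypothetical configuration: which fields each role sees, and whether
# the approve action is offered. Changing the experience means editing
# this data, not writing custom code.
VIEW_CONFIG = {
    "qa_reviewer": {"fields": ["doc_id", "status", "effective_date"], "can_approve": True},
    "author":      {"fields": ["doc_id", "title", "status"],          "can_approve": False},
}

def render_view(role: str, document: dict) -> dict:
    """Build a role-specific view of a document from configuration alone."""
    cfg = VIEW_CONFIG[role]
    view = {field: document.get(field) for field in cfg["fields"]}
    view["approve_button"] = cfg["can_approve"]
    return view

doc = {"doc_id": "SOP-001", "title": "Cleaning SOP", "status": "Draft",
       "effective_date": None}
print(render_view("author", doc))
```

The design point is that adding a new role or changing a view touches only `VIEW_CONFIG`, which is the kind of change a validated GMP environment can absorb far more cheaply than a code change.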

Read More

Improving Transmittal Management: Three Things Document Controllers Want to Know

transmittal management

One of the most important topics that my Energy and Engineering customers want to talk about is transmittal management. The flow of this project documentation is like the bloodstream in the human body: it is essential for survival. They want to improve, automate and accelerate the process. It may seem trivial, because a transmittal is nothing more than a package of important documents – with important information about project details and status – that one company sends to another. However, the real challenge is volume. During large capital projects, companies send and receive hundreds of transmittals daily. If the process is not reliable and auditable, projects can stall, confusion can mount, and in some cases companies can be fined for non-compliance. There are three areas of improvement for transmittal management that I always discuss with Document Controllers in the context of Electronic Document Management Systems (EDMS). Automatic Synchronization Typically, Document Controllers exchange hundreds of transmittals daily using an FTP server or email attachments. It is possible to partly automate the process with FTP client software; however, it remains intensive manual work. Document Controllers have dreamt of an automatic way to coordinate transmittal contents. Now synchronization is a reality, with easy, secure file sharing that allows for seamless, automatic synchronization of transmittals. Here’s how it works: the transmittal is saved in a specific folder on a local computer, where Syncplicity immediately and automatically uploads it to the cloud, from which it is automatically downloaded to recipients’ local folders. Documentum Capital Projects is integrated with Syncplicity, so the transmittal process is controlled and automated. Document Controllers do not waste time manually distributing transmittals. Easy Loading Since transmittals can include hundreds of drawings, it is important to be able to load large volumes of documents automatically.
But what does a Document Controller do if there are errors in a transmittal? Many of my customers have validated a best practice to handle this. Using an automatically generated Excel spreadsheet that lists the documents and metadata (such as number and revision number) for each transmittal, Documentum Capital Projects validates the metadata and reports errors. If there are no errors, the transmittal is automatically loaded. If there’s an error, the system updates the Excel spreadsheet, adds a comment or explanation next to the error, and makes it ready to send back to the contractor. Sounds easy, right? Fast Distribution Once all transmittals are uploaded, work doesn’t end for the Document Controller. The next step is to distribute them to the engineering teams for review. Document Controllers love this feature in Documentum Capital Projects because it allows them to define standard distribution matrices once and reuse them throughout the project. After completing the review process, Document Controllers must generate outgoing transmittals. With Documentum Capital Projects, this process is also automated: just select the documents and the distribution matrix to have the transmittal – including the coversheet – automatically created. Finally, the complete transmittal package can be automatically uploaded to Syncplicity for distribution. So, as you can see, OpenText knows how to automate and optimize transmittal flow. Documentum Capital Projects is designed with industry best practices and covers the entire transmittal process. Many of my customers have reduced their transmittal cycle by 50% or more by eliminating many manual steps. Visit the Documentum Energy website to learn more about Documentum Capital Projects as well as our other solutions for Energy and Engineering.
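The validate-and-annotate loop described above can be sketched roughly as follows. This is an illustrative stand-in only: the field names and validation rules are assumptions, and CSV is used in place of Excel to keep the example self-contained; it does not reflect the actual Documentum Capital Projects implementation.

```python
import csv
import io
import re

# Assumed document-number convention for this sketch, e.g. "AB-0001".
DOC_NUMBER = re.compile(r"^[A-Z]{2}-\d{4}$")

def validate_manifest(text):
    """Validate each manifest row; annotate errors in a 'comment' column."""
    rows = list(csv.DictReader(io.StringIO(text)))
    for row in rows:
        errors = []
        if not DOC_NUMBER.match(row["doc_number"]):
            errors.append("bad document number")
        if not row["revision"].isdigit():
            errors.append("revision must be numeric")
        # The annotated column is what would be sent back to the contractor.
        row["comment"] = "; ".join(errors)
    return rows

manifest = "doc_number,revision\nAB-0001,2\nbadnum,A\n"
for row in validate_manifest(manifest):
    print(row["doc_number"], row["comment"] or "ok")
```

A clean manifest loads straight through; any row with a comment goes back to the contractor for correction, which is exactly the round-trip the Document Controllers describe.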

Read More

Migrating-in-Place for ECM Consolidation

First published on ECM Trends from the Field. We recently covered why organizations are moving swiftly to Next Gen Repositories. No longer are modern organizations satisfied with the old “system of record” with all of its upkeep, overhead, and user charges. They want something from which relevant, actionable data can be derived and pulled out by Gen X & Y users as they demand it. A perfect example of this new “system of engagement” is being delivered as part of the actual migration from the old repository to the new. Let me explain. Instead of simply mass-migrating the existing data to the new target system, many are choosing to “Migrate in Place.” In other words, they leave the existing archive in place and allow users to continue to pull and view content. What results is a process by which users dictate the content that is actually required in the new system. Behind the scenes, the data being actively selected gets migrated dynamically to the new environment. If you are from the document scanning world, this is similar to “scan on demand,” which left paper archives in place until retrieved and scanned. Once retrieved, they live again as a digital asset and are ingested into workflow and repository systems with specific retention periods. One big assumption here is that the organization has decided that keeping the legacy system in place for some period is desirable. This may be due to factors such as: the legacy system can be accessed, with or without APIs; legacy system maintenance is no longer required to maintain system access; or knowledgeable staff is no longer around to properly administer the legacy system (“How can we shut down what we don’t even understand?”). What happens to data that never gets selected? It simply ages off, or is migrated based upon specific configurable requirements. For example, a life insurance company may have policies in force for the life of the insured (100 years?)
+ seven years, so those policy documents would obviously be chosen to move to the new system even if not selected by user retrievals. This type of “Migrate Based on Retention” process can be configured and automated. One other important note is that it’s not necessary to own APIs to the legacy system, thanks to the advent of new access methods such as the CMIS interface, an open access protocol supported by many ECM systems, and to repository adapters, which can be easily built by savvy integrators with or without a CMIS interface. One of the dangers of any mass data migration is what happens when you find entire libraries of data that don’t appear to have any disposition requirements. “What is this data and to whom does it belong?” Worse than that: “Who do we even ask to find out who owns the data?” Vital to any successful migration is maintaining a regular communication channel with the key user communities that access this data. Think about setting up a spreadsheet or table with content type, user group and “go-to” individuals’ contact details so that you can make these important disposition decisions expeditiously. The cost of this process is not in running the process but in deciding what to do with “orphan” data and in bringing the business and IT together long enough to decide. At the end of the day, you have many fewer orphan decisions to make when you migrate in place. “Isn’t setting all this up really expensive?” is one obvious question. The answer is: “not really.” The key is deploying a relatively lightweight repository with excellent process flow and decision capabilities. If done right, you can easily access both the legacy and new systems with zero coding. Here are the Do’s and Don’ts for Migrating-in-Place: Implement a lightweight repository with excellent process flow capabilities so that decision making can be automated. Don’t worry about using or buying APIs to the legacy system, as there are other ways to access the content.
Avoid orphan document situations by establishing regular touch points with business users. Set up simultaneous access to both systems, making the transition transparent to users.
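The migrate-in-place decision rules discussed above (user-driven migration, retention-based migration, age-off, and orphan handling) can be sketched as a small routine. This is a conceptual sketch only: the record fields, retention periods, and action names are all hypothetical, and a real deployment would drive these decisions from configurable policy, as the post describes.

```python
from datetime import date

# Hypothetical retention rules: e.g. life policies kept for the life of
# the insured (assumed 100 years) plus seven years.
RETENTION_YEARS = {"life_policy": 107, "correspondence": 7}

def migration_action(record: dict, today: date) -> str:
    """Decide what to do with one legacy record during migrate-in-place."""
    if record.get("retrieved"):
        return "migrate_now"        # user demand pulls content into the new system
    rule = RETENTION_YEARS.get(record["type"])
    if rule is None:
        return "ask_data_owner"     # orphan data: decide with the business
    if today.year >= record["created"].year + rule:
        return "age_off"            # retention expired; never needed in the new system
    return "migrate_on_retention"   # still in force; move regardless of retrievals

print(migration_action({"type": "correspondence", "created": date(2000, 1, 1)},
                       date(2013, 10, 1)))
```

Most of the cost, as the post notes, sits in the `ask_data_owner` branch; everything else can run unattended once the rules are agreed.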

Read More

San Diego, Sun and Some Scoop on the DIA Conference!


Each year, the Drug Information Association (DIA) holds a conference in the USA (and another in Europe) on the topic of Document Management and Agency Submissions. These conferences have been held for longer than I can remember, but at least the past 20 years. A real highlight of these conferences is getting the scoop from the Regulatory Agencies on what’s coming and what’s important to them (since they govern the process and approval of authorization to sell drugs). What’s better than San Diego, sun and some inside scoop? Unfortunately, some of the best laid plans go awry. Due to the current government shutdown here in the U.S., the FDA was not allowed to attend, and therefore the agenda had to be adjusted. Attendees were not happy about that, but there was still plenty to discuss based on existing programs. We had a sizable team attending, as well as a booth presence. I have received comments from the DIA Program committee that it is good to see us at DIA events this year. The program committee is happy to have us, and we’re happy to be here. There were several important points discussed at the annual meeting that deserve reiteration. In the absence of the FDA, the audience formulated the many questions they would have asked, and the EDM Program committee promised that (with avid note-taking) it would present the questions to the missing representatives so that they could answer them during a promised webinar. The questions covered everything from a possible requirement for Digital Signatures; an electronic IND; site selection data sets; datasets for nonclinical studies; electronic application forms; what action to take when a reviewer asks for something not in the specifications; and the timeline for IDMP implementation; to tracking correspondence (XML, two-way). Several international regulators did attend and gave some helpful insight from Health Canada and the Danish Health and Medicines Authority.
Key topics discussed by industry representatives included: xEVMPD, IDMP and UDI standards and their impact on RIM (Regulatory Information Management) and the SmPC; direct access to the eTMF for Health Authority inspections and changes in the way future inspections will be conducted; the complexity of the TMF and the role and future impact of (emerging) standards such as CareLex (OASIS), CDISC, and risk-based monitoring; Clinical Trials Clouds, including definitions, solutions and progress towards industry acceptance; and industry implementations of Digital Signatures. The attendance this year was lower than usual. We don’t know if the cause was the missing FDA or that California was too far away for a large bolus of industry attendees to travel. It was too bad that so many people missed the interesting subjects and the great chances to network with their peers. Still, it was a successful conference in spite of the challenges.

Read More

Making the Tough Call: Point Product or ECM Platform?


The decision to invest in content management sometimes leads customers to a seemingly either/or decision: choose a point product, or invest in an enterprise content management (ECM) platform? In my opinion, this is not a trade-off that Energy customers need to make. By choosing a “verticalized” solution with industry-specific workflows and integrations, customers can have the best of both worlds. These are some of the aspects I think are critical when considering so-called “point products.” Especially if you’re building a capital asset subject to regulatory compliance, or you face inspections, you’re better off having rich content management capabilities than relying on a simple product that just gets you through the initial process. While these key capabilities are often missing from individual point products, they can be part of a leading industry solution: True records management functionality will ensure records are not deleted and are proven to exist for the timetables required by regulations or contracts. The ability to purge documents that are out of date or no longer accurate is also important. Audit trails – for compliance’s sake, you want the ability to know what was done, by whom, and when. But also for ongoing process improvements, a supervisor’s review of logs and authorization sign-offs can indicate where things may be inefficient or incorrect, and can improve processes moving forward. Access control – security and compliance both require granting the right level of access to the right people. This control should be granular, to allow for denying or allowing access based on content types or groups of documents, for example. Content storage for a broad variety of content types is often overlooked in some point products, because they grew from a single-document-type system, such as engineering documents, while a robust platform can readily handle all types.
And these are some of the value-adds I believe an Energy industry solution built on an industry-leading ECM platform can bring to your implementation: Time to deployment is accelerated because you’re not reinventing the wheel and developing workflows from scratch. Configuration is the focus, not customization. Industry-standard workflows make sure the right change management processes are built around common requirements, whether creating a transmittal, signing off on a design review, or handing over content to somebody as part of a handover process. Knowledge of content types specific to the industry is already built in, to support not only standard Office documents, but also engineering drawings, design notes, reports, and certification data from the construction phase, for example. Document organization is already figured out, so to speak, for common industry-specific tasks you might perform. It saves time knowing how to organize documents and what metadata to add so you can find, track and pull them quickly for transmittals. Vertical solutions often use APIs or exports in common formats so content can be consumed by other solutions typical in the industry, such as project management, maintenance management, and ERP systems. Expertise often comes along with the technology, whether consultants who have completed implementations or strategists who can advise how best to go about the changes needed. Combining the critical functionality for content management mentioned above with the richness of industry knowledge means there’s nothing you have to give up. In fact, if you choose an enterprise-class industry solution based upon a robust content management platform, you have the best of everything. You can then use the same system across projects for scale, consistency, and ease of use. And you can share dashboards that look at certain sets of documents or entire projects to gain a closer understanding of true project status.
So before you feel you have to make a tough call between a point product versus a platform, think about choosing a vertical solution that has E) all of the above.

Read More

Everything You Wanted to Know about ECM at Enterprise World

This year’s OpenText Enterprise World conference is better than ever! I am looking forward to a great week of collaborating with customers and partners in Orlando. There is so much going on that I wanted to highlight some of the ECM-focused parts of the event. Training Sessions: Sunday through Tuesday offers over 30 ECM-related training courses. You can sign up to learn more about key areas of the portfolio including administration, upgrading, records management, developing extensions with WebReports and ActiveView, and more. Whatever your role or the state of your ECM implementation, there are training sessions for you. Check out the schedule and register for sessions. Partner Summit: Monday and Tuesday are partner days with keynotes from our executives and partner leadership team. The keynotes will be followed by ECM and Discovery breakout sessions that will zero in on the topics you have requested. ECM and Discovery strategy and products, sales tips and resources will be the focus. We’re looking forward to discussions with individual partners on their specific issues and requests. Techie Tuesday: We’ve listened to your input! This day has been added specifically to address your requests for more in-depth technical sessions on key topics. These are 90-minute technical deep dives that will help both customers and partners enhance their current and future ECM implementations. Topics include: new ways to integrate with ECM APIs and UI Widgets, customizing ECM with WebReports and ActiveView, and advice on scaling and deploying large ECM projects in the cloud or on premises. There are also topics listed in other areas that could be of interest to ECM technical people, including Discovery–Content Migration 101 and Building Applications with InfoFusion, in addition to the AppWorks sessions. All these sessions provide a great opportunity for both customers and partners to learn and connect with OpenText product team experts.
Enterprise World Keynotes: Wednesday and Thursday morning will provide a set of entertaining and informative keynote presentations from Mark Barrenechea, Muhi Majzoub, Kevin Cochrane, partners, customers and our very special guest speaker, William Shatner. They’ll be discussing the latest innovations…plus a whole lot more. I can’t give too much away, but I can tell you, you won’t want to miss these sessions! The ECM Track keynote promises to be exciting, educational, and enlightening. John Mancini, President of AIIM, will speak about Information Governance and why it’s imperative that you act now. Nic Carter, lead Product Manager for ECM, will detail key innovations within the ECM portfolio, and we will be joined by a customer and partner who will share their business challenges and how they addressed Information Governance with ECM. This keynote will set you up for the remainder of the sessions within the event, both from a business and technical viewpoint. ECM Breakouts: There are many ECM sessions on Wednesday afternoon in the ECM and Discovery tracks. ECM will also be highlighted and integrated within the IX, BPM and CEM tracks. Our most popular sessions are back: Content Server, RM, Email Management, Workflow and Enterprise Connect. In addition, there are panel sessions where customers share their insights, and breakouts that help with building applications, integrating with ERP systems, and implementing in the cloud. I strongly recommend you check out the breakouts in all the tracks, as you will find sessions that will help with both your ECM and Information Governance goals. Tech Talks: Do you want to spend time with our product experts, ask questions about your implementation, and share feedback directly with them? This set of sessions is held in a less formal environment and is designed to let you do just that.
The Enterprise Connect team will be there, as well as the WebReports/ActiveView team, OTDS, Search, Security Clearance, Archive, and Content Server upgrade and deployment specialists. Innovation Sessions: If you are looking to discuss broader issues with your peers, you will love this track. These sessions take innovative approaches to understanding and brainstorming on business issues around the Information Enterprise. There are always surprises and great discussions in these sessions. Industry Breakouts: Thursday we focus on industry-specific solutions and implementations. There are quite a few sessions that cover ECM implementations and products. Hear from customers and product teams about innovations and implementations in Public Sector, Energy, Finance, Life Sciences and Media/Entertainment. These breakouts are a great place to meet with and hear from your industry peers and share best practices. Expo: In the Expo area there will be pods staffed by product experts where you can see demos and ask specific questions. There will be an ECM lab where you can see an integrated demonstration featuring a number of new innovations. In addition, there are some fun games, such as the WebReports and ActiveView team contest. If you are not in a session, the Expo is the place to be to get quality time with the product experts! Innovation Lab: Do you want to get hands on with innovations that aren’t yet released and be part of shaping their direction? Sign up early for a time at the Innovation Lab to spend time in the design process for planned innovations. We need your feedback, and you’ll have some fun with it. Networking and meetings: This is some of the best time at Enterprise World. It’s your chance to meet with the OpenText team, your peers and other similar organizations. Take advantage of every opportunity to meet with us. Whether you are a customer, prospective customer, or partner, we look forward to the chance to meet with you at this event.
We learn from every meeting, and this helps us to better understand your business. It’s going to be an educational, informative and fun week at Enterprise World this year. I can’t wait to see you there! Not registered yet? You don’t want to miss this event. Register now.


Ignoring Privacy Compliance can Trigger a Public Problem

The paperless office inches closer every day. No surprise there. It has long been heralded as the promised land of organizational operations for its efficiencies, cost savings and productivity increases for both public and private entities. And ever-improving technologies such as the OpenText ECM platform have made the collection, management, and application of digital data effortless. That’s all well and good, but what fascinates me is how the digital information landscape offers intriguing data sharing possibilities. And that, of course, opens the door to a host of operational and compliance quandaries in the process. Let’s dig in. Here’s a theoretical example: National government agency “A” makes the investment and converts a previously physical data collection process to digital. It works great: reams of paper disappear along with the labor required to manage it all. Records are instantly retrievable by agency staff and can be trusted for accuracy and timeliness. The world is a wonderful place. However, with all this constituent-related content now condensed down to an electronic stream, opportunities for information sharing are hard to ignore: The constituents themselves have long complained that they have poor access to information collected by the agency. Hmmm, a simple query-driven internet portal wouldn’t hurt, would it? Other federal government departments, both within this agency and others, could certainly benefit from seeing this data. Why not develop a simple reporting function that draws from the database and allows them to generate a more complete constituent profile? Similar government bodies from regional, municipal and even foreign jurisdictions would find value in the data. No harm in taking requests to pass on data sets, is there? The same scenario applies to the private sector as well. Just substitute a multi-national financial institution and its related contemporaries into the equation above. 
The intentions driving these hypothetical situations, everything from increasing case-management efficiency to furthering public relations to improving decision-making accuracy, may be admirable, but there are a number of data management factors that must be considered and implemented first. Together, they form elements of a comprehensive information governance program that will reduce risk, improve effectiveness, and instill institutional confidence in the whole process. Legacy Systems: For many organizations, years of mergers, acquisitions and siloed development have resulted in a convoluted web of marginally interconnected data collection tools and repositories. The good news is that a best-in-class ECM solution can be configured to work within this environment, but job #1 requires stakeholders to get a handle on what’s where and why. Data Classification and Retention: Which information to keep? Where? For how long? And then what? With the amount of information flowing into and out of organizations growing at exponential rates, simply defaulting to archiving it all forever is no longer an option. Incorporating the best practices of records management is a great place to start, allowing organizations to clearly classify all forms of structured and unstructured information and define collection, access, archiving, and disposition parameters. Privacy Compliance: This is the big one. Every organization is subject to corporate, industry and government regulations defining, in the simplest terms, who can access which pieces of accumulated information and what they can use it for. Attaining compliance may involve addressing everything from geographical storage locations to security frameworks to auditable data trails. Form a working committee, incorporate an appropriate privacy policy, and source an optimal content management solution. Your brand credibility, operational efficiency, and legal fees depend on it. 
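The classification and retention parameters described above are typically captured in a machine-readable retention schedule that a records management system can evaluate automatically. Here is a minimal sketch in Python; the record types, retention periods, and disposition actions are hypothetical, purely for illustration:

```python
from datetime import date, timedelta

# Hypothetical retention schedule: record type -> (retention period, disposition)
RETENTION_SCHEDULE = {
    "case_file": (timedelta(days=7 * 365), "archive"),
    "correspondence": (timedelta(days=3 * 365), "destroy"),
    "financial_record": (timedelta(days=10 * 365), "archive"),
}

def disposition_due(record_type: str, created: date, today: date):
    """Return the disposition action if the record's retention period
    has elapsed, or None if the record must still be retained."""
    period, action = RETENTION_SCHEDULE[record_type]
    return action if today - created >= period else None

# A piece of correspondence created in 2010, reviewed in mid-2014,
# has exceeded its three-year retention period:
print(disposition_due("correspondence", date(2010, 1, 1), date(2014, 6, 1)))  # prints "destroy"
```

The point of encoding the schedule as data rather than folklore is that disposition decisions become consistent, repeatable, and auditable, which is exactly what a defensible information governance program needs.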
Without a comprehensive information governance policy encompassing the points above, there’s a very real possibility that instituting any level of information sharing will have serious operational, legal, or integrity repercussions. Non-compliant data sharing and privacy practices are one of the few areas where there is virtually no margin for error in the public’s eyes. Fortunately, introducing a solution is not as overwhelming as you may think. Believe me, as part of the ECM team here at OpenText, we’ve more than likely designed, developed, and implemented a program very similar to one that will work for you. The exact level of secure, defensible compliance and transparency you need is attainable. And it begins with educating yourself. Start with an information governance whitepaper, case study, or eBook and then let’s talk.


Why Attend Enterprise World? One Word – Discovery

The leaves are falling, the days are getting shorter, and here in Canada, the temperature is dropping precipitously. That means that OpenText Enterprise World is on its way – as is the opportunity to enjoy some warmer weather in Florida. Enterprise World kicks off on November 17th and wraps up on the 21st. Enterprise World is not just an opportunity to enjoy a warm climate; it is also the best opportunity to learn about the OpenText solutions that you already have and about the new things we are doing in Enterprise Information Management – not to mention network and have fun. Current and prospective customers will find a wide variety of ways to learn about OpenText Discovery applications, their use cases and customer successes. Discovery applications use search and content analytics to solve problems in Information Governance and Web Experience Management. We have also been working hard at OpenText to create and document APIs and reusable libraries that make sure that developing and maintaining your own applications can be fast and inexpensive. A few places to learn more about Discovery include: Techie Tuesday: Techie Tuesday is the day we’ve set aside to host deep-dive discussions on how to integrate, configure and customize OpenText solutions. It’s a rare opportunity to meet with OpenText technologists to get the information you need to get the most out of your organization’s OpenText products. The Discovery track will have three sessions. Two technical sessions around InfoFusion cover creating InfoFusion applications and using InfoFusion for content migration. The third session is for web site developers and web site managers. It covers developing effective and compelling site search to improve SEO and content findability. These are practical sessions led by the OpenText engineers, Product Managers and Consultants responsible for creating and deploying these products. A complete list of Discovery technical sessions can be found here. 
Break-Out Sessions: There are a wide variety of breakout sessions targeted at Information Management professionals. These sessions will highlight the variety of search and content analytics solutions that make up Discovery. They will also provide insight into the problems we are trying to solve and how we address them. Some of the big-picture issues are outlined in the SlideShare below: “Why Attend OpenText Enterprise World? One Word – Discovery” from Stephen Ludlow. A complete list of Discovery break-outs can be found here. Keynotes: The keynote presentations given by OpenText executives are a great way to understand where OpenText is headed from a strategic and product perspective. They are also full of stories about our customers, and how they have been successful in Enterprise Information Management. They are also entertaining and thought-provoking. I’m pretty sure William Shatner will be both, and more. Networking: Year after year, we hear the same thing. The most valued aspect of Enterprise World is the ability to network, chat and bend the ear of specialists and industry experts at OpenText and from amongst your peers in the wide variety of OpenText customers. This year will be no different. Supporting Discovery and ECM, you will find Product Managers, software developers, consultants, partners, implementers, computational linguists, Information Governance experts and much, much more. We are busy preparing the biggest and best Enterprise World for our 15th anniversary. We look forward to welcoming you, educating you, entertaining you, and most importantly, listening to you. If you haven’t already, please register soon.


Top 3 Questions about your ECM Migration Project

A successful document migration involves planning and the proper technology, as well as a strong methodology to see it through from start to finish. The following are some of the common questions Actuate has received about ECM migration best practices. Q: How long does an ECM migration typically take? Many of the methods used for extraction are time-consuming and, without the relevant expertise, can take years to perform on large datasets. However, with the proper professional expertise and tools, the time spent implementing these methods can be greatly reduced. It ultimately depends on many variables, including the volume of content, transformation of content, changing the index structure from the source system to the target, and others. All of the processes involved in managing these variables can be automated with technology. Q: We’re extracting data from a legacy system built before XML. How does that affect the process? With legacy systems built before XML, it’s particularly difficult to maintain existing metadata associations – an important step when you’re extracting data to move to another ECM system. In a system such as this, it’s complicated by the fact that different document types often imply different metadata rules and indexing requirements. One solution is to rebuild the indexes during migration through a technique known as “re-indexing.” Q: Besides converting content to a compatible format, we have heard that the transformation stage of the migration process can have added benefits. What might those be? On top of converting and repurposing data, the transformation process helps to organize data as well. Some of the print resources – such as fonts and images used by different document types – can lead to problems if they’re not dealt with properly: resources with duplicate names may lead to the wrong logo or signature showing up, for example. 
The transformation process allows those embedded resources to be extracted and catalogued, aiding the retrieval process later on. The transformation process also introduces additional benefits to the overall outcome of the migration project: Document proofing. Data is validated to ensure that what came out of one ECM system is exactly what goes into the other. Long-term storage. Data is transformed into standard formats to comply with records management and industry regulations, and can potentially be generated as PDF/UA, making documents fully accessible to the visually impaired. Reverse composition. The elements of the content can be extracted into formats such as XML, to be used in other applications. Enrichment. Color, images and other creative touches can be added to refresh old content. Compliance. Data can also be redacted to ensure that information covered by PCI or other compliance standards is not viewable. For more information on the document migration process, and to help ensure that your document migration is well-planned and non-disruptive to your business, read ECM Content Migration: Best Practices in Document Archive Convergence, co-written by Actuate and AIIM, a global community of information professionals dedicated to sharing thought leadership content.
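The duplicate-name problem mentioned above is one reason migration tools usually catalogue extracted resources by content rather than by filename. A hypothetical sketch in Python (the resource data is invented for illustration), hashing each embedded resource so that two different logos sharing the name “logo.png” remain distinct catalogue entries:

```python
import hashlib

def catalogue_resources(resources):
    """Catalogue embedded resources, given as (name, raw bytes) pairs,
    keyed by content hash so that resources sharing a name but differing
    in content stay distinct, while identical bytes are stored once."""
    catalogue = {}
    for name, data in resources:
        digest = hashlib.sha256(data).hexdigest()
        entry = catalogue.setdefault(digest, {"names": set(), "size": len(data)})
        entry["names"].add(name)
    return catalogue

# Two different images both named "logo.png" produce two distinct entries,
# while the signature that reuses the first logo's bytes is deduplicated:
resources = [
    ("logo.png", b"\x89PNG...brand-a"),
    ("logo.png", b"\x89PNG...brand-b"),
    ("signature.png", b"\x89PNG...brand-a"),  # same bytes as the first logo
]
cat = catalogue_resources(resources)
print(len(cat))  # prints 2: two distinct resources despite three references
```

Keying on a content digest is what makes later retrieval safe: a document that references “logo.png” is resolved to the exact bytes it was composed with, not whichever same-named file happened to be migrated last.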


Cloud Technology: Deeper Dive


I have talked about managing the risks associated with cloud implementation in previous blogs. Now let’s take a deeper dive into the technology so that you can better understand which offering best suits your needs. First, let’s take a look at the type of service, which, as described below, is generally classified into three categories depending on the “service” that is consumed and the level of control desired by the organization. Software as a Service (SaaS) – in this scenario, the Cloud Service Provider (“CSP”) provides all of the software, servers, storage capacity and infrastructure management. The cloud subscriber generally can control preference selections and limited administrative application settings. The tradeoff for simplicity and cost savings is that security provisions are carried out mainly by the CSP. Infrastructure as a Service (IaaS) – this is at the other end of the cloud spectrum. Cloud subscribers are able to maintain control of their software environment, but do not maintain any equipment. The basic computing infrastructure of servers, software and network equipment is provided to the customer as an on-demand service, upon which a platform to develop and execute applications can be established. Platform as a Service (PaaS) – this sits somewhere in between IaaS and SaaS. In this scenario, the vendor provides an operating system and database services in the cloud, on which the customer can deploy applications. This model is typically utilized when an organization wants to create and maintain control over its applications while reducing the cost and complexity of buying, housing and managing the underlying hardware and software components of the platform. Once you have determined the type of service, the next step is to choose the mode of deployment of the cloud services. Public cloud: The platform is managed for the client in a CSP data center. 
In a public cloud deployment, the infrastructure and other computational resources are made available via the Internet. The server hardware is physically located outside of the organization’s premises, and management of the cloud is fully outsourced. Private cloud: The computing environment is operated exclusively for an organization. It may be managed either by the organization or a third party, and it may be hosted outside or within the organization’s data center. A private cloud gives the organization greater control over the infrastructure and computational resources. Hybrid cloud: A hybrid cloud infrastructure consists of two or more clouds (typically a public cloud and an organization’s own private cloud) that remain unique entities but are bound together by standardized or proprietary technology that enables unified service delivery while also creating interdependency. Conclusion: Technology is just the first step. Once you determine the best technical platform, you will still need to: • Select a vendor • Protect your interests with policies and procedures for your organization’s cloud computing initiative • Prepare effective agreements with your CSP • Look into the need for any additional technical protections such as firewalls Stay tuned! More to come on these topics as we continue to explore the brave new world of cloud computing.
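The distinction between the three service models described above comes down to who manages which layer of the stack: the CSP or the subscribing organization. That split can be summarized as a small responsibility matrix. A sketch in Python follows; the layer breakdown is an illustrative simplification, not a formal standard:

```python
# Who manages each layer under each service model: the Cloud Service
# Provider ("CSP") or the subscribing organization. Illustrative only;
# real offerings vary in where they draw these lines.
LAYERS = ["application", "data", "runtime", "operating_system",
          "servers", "storage", "networking"]

RESPONSIBILITY = {
    # SaaS: the CSP runs everything; the subscriber only sets preferences.
    "SaaS": {layer: "CSP" for layer in LAYERS},
    # PaaS: the subscriber deploys and owns its applications and data.
    "PaaS": {layer: ("subscriber" if layer in ("application", "data") else "CSP")
             for layer in LAYERS},
    # IaaS: the CSP supplies raw infrastructure; everything above is the
    # subscriber's to build and manage.
    "IaaS": {layer: ("CSP" if layer in ("servers", "storage", "networking")
                     else "subscriber")
             for layer in LAYERS},
}

def subscriber_managed(model: str):
    """List the layers the subscribing organization manages under a model."""
    return [l for l, who in RESPONSIBILITY[model].items() if who == "subscriber"]

print(subscriber_managed("PaaS"))  # prints ['application', 'data']
```

Reading the matrix top-down makes the tradeoff in the post concrete: the further you move from SaaS toward IaaS, the more control you keep and the more operational and security responsibility you take on.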


The File Share Dilemma

You’re in IT management. Let me ask what keeps you up at night? Standard stuff like health and retirement savings? Or is it that new hire in marketing? The one leaving the office every night with confidential campaign plans copied to a flash drive. Or maybe it’s the R&D manager who’s using public file sync and share services to transfer sensitive product development specs between their work and home computers? If either of those scenarios is familiar, that’s what you should be stressing over. And for a couple of reasons: At the most basic level, that’s your organization’s critical information, its lifeblood, out there roaming beyond the firewall. At a higher level, it also means your enterprise probably doesn’t have a secure, compliant, user-friendly file sync and share solution integrated into its ECM platform. You’re not alone. If it makes you feel any better, many organizations are struggling to adapt to a rapidly evolving work environment that now encompasses anywhere, anytime, and on any device. To help put the changing landscape in perspective, here are some results I’ve pulled together from a few surveys: 65% of respondents have accessed work-related data on their mobile device, though only 10% have corporate-issued devices. Shockingly, over 50% said access to their devices wasn’t password protected. 78% of companies say the number of personal devices connecting to their networks has doubled over the past two years. However, less than 10% are fully aware of which devices are logging in. 93% of companies without an enterprise file sync and share platform say their employees are specifically using Dropbox, despite (or, more likely, unaware of) several recently documented security issues. BYOx has Arrived. What’s Your Response? Fact is, companies are expecting more out of their employees, and resourceful staff are doing their best to deliver. 
So much so that the concept of BYOD (Bring Your Own Device) is quickly morphing into BYOx, where “x” is defined as whatever’s necessary to get the job done: devices, applications, web services, cloud storage, and more. Good on the staff for showing initiative, but it’s now all on the infrastructure architects to provide them with a secure, productive sandbox to play in. I’m not alone in saying that adopting an “anything goes” policy for external information sharing and storage is a no-win proposition. It results in an inefficient, tangled mess for users and gruesome security and governance risks for information guardians. There really is only one true win-win in this new world, and it’s in the form of a cohesive, dedicated file sync and share application that’s built from the ground up with inherent security and compliance to excel at all three aspects of the corporate sync-and-share paradigm: usability, governance and security. The Best of All File Sharing Worlds is in One Simple Solution: So, at the most basic level, it seems there are two paths to meeting the demands of the next-gen workforce and workplace. Sadly, one involves trying to grow a business through public file sync and sharing tools created for non-business use. Tools that are incompatible with your tech environment, ask you to rely on someone else’s definition of security, and can’t tell you where your data’s been hanging out. Truth is, solutions like OpenText Tempo Box are the foundation for the future. Tempo Box is built on an ECM infrastructure and operates in the cloud, on-premise, or as a hybrid model that incorporates both. It’s time to take the leap and implement a true enterprise-grade sync-and-share solution that effortlessly brings the best advantages of external file sync and sharing (content creation, collaboration, and storage) back behind the firewall and into a secure, governable structure where it belongs. I guarantee you’ll sleep better. Try Tempo Box today!


Top 10 Ways to Fail at Information Governance

A successful information governance program helps organizations effectively use and manage their assets to drive maximum value, while minimizing information-related risks and costs. While many organizations see information governance as a difficult undertaking, it’s actually easier than they may think. In a recent webinar, Barclay Blair, an expert in the field and president of ViaLumina, a consulting firm that helps companies build information governance programs, discussed the keys to successfully implementing information governance. Here are his top ten pitfalls to avoid: Pitfall No. 1: Create a records management department of one. Best-in-class organizations know that records management—an important part of information governance—requires the right support. “Many companies that I go into today say, ‘Yes, we’re doing records management as part of information governance,’” said Blair. “[But] it’s one person. That’s not a recipe for success.” When setting your goals for information governance, be sure to consider how much time and effort an ongoing, successful system will entail. Pitfall No. 2: Set perfection as a goal. It’s as true in information governance as it is in life: If you set the bar too high, you’ll always fail. Before getting started, sit down with business leaders and decide on realistic information governance goals for your organization. Even if the system you implement is not 100 percent perfect, it will be a vast improvement over your current state. Pitfall No. 3: Implement technology before policy. “We did this with email about 20 years ago, and look where we are today,” said Blair. “Email is at the root of billions and billions of dollars of costs.” Choosing the right information governance technology is paramount, but without effective policies and guidelines in place, it can only go so far. Pitfall No. 4: Forgo formal planning. To make information governance work for your organization, you need to carefully consider all the requirements. 
When business leaders sit back and let this phase evolve organically, said Blair, they run into problems down the road. Pitfall No. 5: Don’t get senior management commitment. At the root of many information governance pains and failures is a lack of support from senior management. As you implement information governance, get buy-in from C-level executives. “Make sure your program has fundamental support, accountability, and mandate from the most senior levels,” said Blair. Pitfall No. 6: Don’t adapt corporate governance to information governance. If your organization is like many others and doesn’t have the right senior executives in place to sustain information governance, you may need to adapt roles to support the program. For example, said Blair, “The vast majority of CIOs in fact are not responsible for information: They’re responsible for information technology or infrastructure. And that’s a problem. But an even bigger problem is that the other C-level executives think that the CIO is in fact responsible for information.” Many organizations don’t have a C-level executive with authority over information in the enterprise, said Blair. “Unless and until that gap is filled, we will never achieve our goals of information governance, and we will never see the full value that information governance can provide.” Pitfall No. 7: Treat information governance as a project. Information governance is an ongoing program that needs constant support. Treating it like a one-off project will ensure that “once all of the fanfare has died down, there’s nobody to take the mantle and run it as part of your everyday enterprise,” said Blair. Pitfall No. 8: Don’t realistically estimate costs and benefits. Guessing in the dark at costs and benefits means you won’t be able to measure success or failure down the road. Spend the time to realistically estimate what your organization needs to do to build successful information governance before you move forward with building the system. 
Pitfall No. 9: Assume that all users will want information governance. Not all departments will see the importance of better information management or how it can help them. “The value of information governance is different for every group and department,” said Blair. “You need to articulate that and provide that value, otherwise you will fail.” Pitfall No. 10: Set unrealistic timelines and arbitrary deadlines. According to Blair, the importance of being realistic cannot be overestimated: Organizations need to understand that information governance cannot be implemented overnight. Know that success takes time and effort—and remember that your mother was right when she said anything worth doing is worth doing well. Want to learn more about information governance? Find out how to start your own information governance program at www.OpenText.com/InfoGov.
