Information Management

CSUN: Presenting PDF Accessibility at the Annual International Technology & Persons with Disabilities Conference

“PDF/UA makes certain that the PDF format isn’t the source of accessibility problems.” It’s a quote I use sometimes in my presentations on the PDF/UA format, sourced originally from a blog post by Matt May, Adobe’s Accessibility Evangelist. I saw May again at the Annual International Technology & Persons with Disabilities Conference, hosted by California State University, Northridge (CSUN). I’d originally quoted him during a different conference presentation back in January in Orlando, Florida, so imagine my surprise when the then-stranger approached me afterwards and introduced himself! It was just one of many surprises and unique experiences at the CSUN conference, held in San Diego, CA in March.

The 29th annual event was dedicated to technologies for people with disabilities, and among the attendees were representatives of organizations looking for accessibility solutions for high-volume, e-delivered PDF customer communications. With more and more companies offering those documents online, making them accessible to visually impaired and blind customers has become more important than ever – but companies want to be able to offer these accessible documents in a way that’s convenient and not cost-prohibitive, too.

To inform attendees of their options and discuss some of the solutions for accessibility compliance around PDFs, I presented twice during CSUN. The first presentation – PDF/UA: What Is It? Why Is It Relevant? – looked at the PDF/UA format. UA, in this case, stands for Universal Accessibility, and references a PDF file format based on international standards for PDF accessibility. These are technical standards, not best practices, which define how to represent PDF in a manner that allows the file to be accessible.
Included are code-level requirements for developers to allow for compatibility across the document content, the reader that displays it (like Adobe Acrobat Reader) and assistive technology such as the JAWS screen reader, often used by the blind and visually impaired to access the document. “That does not mean that a PDF/UA-compliant document will always be perfectly accessible – issues like poorly-built Word documents or other source material will, of course, carry their accessibility flaws no matter what format they’re converted into,” Matt May pointed out in his blog (another quote of his I sometimes use in my presentations), but it does help ensure the access to, navigation of, and full usability of a PDF for the visually impaired.

The second of my presentations, PDF Documents: Regulations, Risks, and Solutions for Compliance, was co-presented with Paul Schroeder and Darren Burton from the American Foundation for the Blind. It took a bigger-picture view of PDF accessibility, looking at the concerns large private companies and government agencies are facing when it comes to creating accessible, high-volume PDF communications. Specifically, we looked at the lawsuits they’re finding themselves in the midst of, and the structured negotiations and settlements being put together behind the scenes – all of which are adding to their heightened awareness regarding web and web content accessibility issues.

The session was packed with people – in fact, it ended up being standing room only – looking to find out what they could do about this new frontier of accessibility. Many didn’t even know that a solution now exists that can automate the conversion of high-volume non-accessible PDFs to fully accessible, standards-compliant PDF documents without the need for costly manual remediation, making the job much easier. I hope the attendees at the presentations found them informative and useful.
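To make the idea of code-level tagging requirements a little more concrete, here is a minimal, hypothetical sketch. It models a PDF logical structure tree as plain Python dicts (a real implementation would parse the actual tag tree out of the file) and checks two rules in the spirit of PDF/UA: figures need alternate text, and heading levels must not be skipped.

```python
# Hypothetical sketch only: the tag tree is hand-built dicts, not parsed
# from a real PDF, and the two checks are a tiny subset of what PDF/UA asks for.

def flatten(node):
    """Yield every node in document (reading) order."""
    yield node
    for child in node.get("children", []):
        yield from flatten(child)

def check_tags(root):
    """Collect PDF/UA-style structure problems from a simplified tag tree."""
    problems = []
    last_level = 0
    for node in flatten(root):
        tag = node.get("tag", "")
        # Figures must carry alternate text for screen readers such as JAWS.
        if tag == "Figure" and not node.get("alt"):
            problems.append("Figure missing alternate text")
        # Headings must not skip levels, e.g. H1 followed by H3 is invalid.
        if len(tag) == 2 and tag[0] == "H" and tag[1].isdigit():
            level = int(tag[1])
            if level > last_level + 1:
                problems.append(f"{tag} skips a heading level (last was H{last_level})")
            last_level = level
    return problems

doc = {"tag": "Document", "children": [
    {"tag": "H1"},
    {"tag": "Figure", "alt": "Chart of quarterly statement totals"},
    {"tag": "H3"},       # skips H2
    {"tag": "Figure"},   # no alternate text
]}

for issue in check_tags(doc):
    print(issue)
```

A real validator works against the full ISO 14289-1 requirements, but the flavor of the checks is the same: machine-verifiable structure rules, not subjective judgments.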
For me, as usual, CSUN was a hit, filled with great people and interesting learning experiences. Once again, I was excited to be part of it! View the videos of my CSUN presentations.

Read More

Responsive Web Experience Drove Me to Subscribe—and Stay

Let me start by saying I am a bit of a sports fan. Some of the sports I really like to watch aren’t all that popular in Canada. You guessed it – we’re not talking about hockey. My favorites are Rugby and Australian Rules Football (aka “Footy”). My Australian colleagues got me hooked on Footy with an amazing live game in Sydney—but I digress. While both games are best enjoyed live, and while we can sometimes do that with Rugby, we have not found live Footy in our area. So I set out to watch on the net.

My first experience was watching the Aussie Rules final the year before last. I had just come back from seeing the semi-final in Sydney and was determined to watch the Swans win the final. I got onto the AFL site, subscribed, and within a few minutes I was up and running. I started by buying just the one game to see how it went. I was amazed. I watched the game live (yes, it was in the wee hours of the morning) and shared some texts with the guys in Australia. While watching, I wanted to learn more about the players. As I was streaming from my laptop to my big screen TV, I pulled out the iPad and brought up the Swans team page with all the players and stats. I brought up the page for the other team as well. It all worked beautifully—great performance, great picture, and a terrific experience of the site on both my laptop and my iPad. And the Swans won in a close match, so it was even better.

I have to admit that while I was completely impressed—so much so that I have bought 2 season subscriptions since—I didn’t realize just how special this experience was. I knew the AFL used OpenText Web Experience Management software for their site, www.afl.com.au, but I didn’t really appreciate that responsive design was a big part of what drove the great experience. That is, until I tried to replicate the experience with my other favorite sport, rugby. Fast forward to this month, when I wanted to watch a particular rugby game which shall go nameless.
I was able to find the team’s site and read some of the info about the upcoming game. They didn’t have the ability to stream on the site, so I had to search around and find a place with live streaming of the game, which was not affiliated with the team. The quality was poor. I pulled up the team site on my iPad while watching the game. I was surprised that some of the text appeared over other images or text and made things hard to read. I looked up some player stats and couldn’t read them because they had dark text on a dark background. Clearly this site wasn’t created with responsive design: it was just fine on the laptop but fell apart on the iPad. The result: I spent much less time on this team’s site, and I cancelled the viewing subscription after one game.

The net of it was that, for this consumer, using the right web design software and methods directly resulted in customer satisfaction, greater time on the site, and sales. If you would like to learn more about how Web Experience Management can make a real difference in your internal and external web experiences, visit www.opentext.com/simplify. Coming soon: Rugby Fights Back, with some great web experiences. Stay tuned!

Read More

How you get error-free EDI documents that comply with your company’s business rules

EDI has been successfully adopted as the means to automate the exchange of business data, but this data can, of course, contain errors, such as incorrect order numbers, bad zip/postal codes, bad dates, etc. Also, some of your business partners may not be following your process rules, such as the timeframe within which an Advance Ship Notice must be sent. You need a system that can identify and fix these problems.

Best-in-class companies solve the bad-data and process-violation problem with “B2B operational intelligence,” which monitors EDI transactions as they flow through the trading network, before they enter your back-office systems. B2B operational intelligence acts as a firewall that protects your internal systems, identifying bad data and then rejecting or quarantining it in transit. It also provides visibility into potential problems, mechanisms to contain them, and alerting capabilities to notify those who can resolve the problems fastest. This in-transit processing performs all the value-matching and rule validations on each EDI transaction. Documents with missing or incorrect values are automatically set aside to be evaluated prior to processing by internal systems. This ability to react immediately to business anomalies prevents major issues resulting from bad data, and it helps make your company truly agile.

If you don’t have a B2B operational intelligence solution, I’d say this should be at the top of your shopping list in 2014 because, as Gartner noted in a recent webinar, it “delivers the most direct integration ROI to the business.” This is because it enables you to improve your business processes, and thus reduces your costs and/or improves revenues.

B2B Operational Intelligence vs. Business Intelligence

A lot of people ask me about this, so here is my take on it. Some business managers believe that their bad-data risk is mitigated by their ERP and/or translator.
While it is true that ERP systems are designed to catch certain data problems, they can only act on errors once they have entered the system. Clearly, too late! Also, many ERP systems and translators do not have the rigorous monitoring capabilities needed to eliminate data errors that are introduced from external sources, including business partners.

So, how is B2B operational intelligence different from Business Intelligence (BI) tools? The core difference is when the “intelligence” is available. Business intelligence solutions find problems after they occur, while B2B operational intelligence solutions detect problems and provide actionable information before internal processing begins, enabling proactive resolution of issues. B2B operational intelligence acts as the front line, enabling your internal systems to operate more efficiently.

Elements of an Effective B2B Operational Intelligence Solution

If, as a result of reading this far, you have a B2B operational intelligence solution on your shopping list, here is my checklist of the key elements a solution should provide:

- Seamless operation with supply chain processes
- Integration support for any structured data, including EDI, XML, CSV, or flat files
- Flexible business rules configuration that is powerful enough to handle even the most complex, company-specific rules and vendor compliance scenarios
- Scalable architecture that performs at the speed of your business
- An easy-to-use interface that requires minimal training for line-of-business managers, enabling self-service and reducing the number of requests to IT
- Role-specific visibility for your employees and business partners into operations and critical real-time events, with actionable information enabling timely decision-making
- Scorecard capability that provides visibility into supplier performance and which, when coupled with easily accessible detailed performance data, enables productive performance improvement discussions with suppliers
- Ability to create issue resolution workflows and alerts tailored to your specific business processes

B2B operational intelligence is a key factor in any successful B2B program. Click here to watch a recent Gartner webinar, “Roadmap for Improving Your Integration Strategy, the 7 Things You Must Know (and Do) About Integration”, to learn more about B2B operational intelligence and other key factors in your EDI program.
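To illustrate the in-transit validation idea, here is a hedged sketch in Python. The field names, rules, and sample transactions are invented for the example; a production B2B platform operates on translated EDI segments with far richer, configurable rule sets, but the control flow is the same: validate each transaction against company-specific rules, and quarantine anything that fails before it reaches the ERP.

```python
# Illustrative only: each "transaction" is a dict standing in for a
# translated EDI document; the rules below are invented examples.
import re

def valid_po_number(txn):
    # e.g. purchase order numbers must look like PO-123456 (invented format)
    return bool(re.fullmatch(r"PO-\d{6}", txn.get("po_number", "")))

def valid_zip(txn):
    # U.S. ZIP or ZIP+4
    return bool(re.fullmatch(r"\d{5}(-\d{4})?", txn.get("ship_to_zip", "")))

def asn_on_time(txn):
    # process rule: an Advance Ship Notice must be sent before shipment
    return txn.get("doc_type") != "ASN" or txn.get("sent_before_shipment", False)

RULES = [valid_po_number, valid_zip, asn_on_time]

def process(transactions):
    """Route each transaction to the accepted queue or the quarantine queue."""
    accepted, quarantined = [], []
    for txn in transactions:
        failures = [rule.__name__ for rule in RULES if not rule(txn)]
        (quarantined if failures else accepted).append((txn, failures))
    return accepted, quarantined

batch = [
    {"doc_type": "850", "po_number": "PO-123456", "ship_to_zip": "10001"},
    {"doc_type": "850", "po_number": "12345", "ship_to_zip": "ABCDE"},
]
accepted, quarantined = process(batch)
print(len(accepted), "accepted;", len(quarantined), "quarantined")
```

The quarantine list carries the names of the failed rules, which is the raw material for the alerting and visibility capabilities described above: the right person can be notified with the exact reason a document was set aside.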

Read More

How do you keep up with your customers?

Customers and employees today are becoming less and less loyal. Some years ago, the supplier you selected was the one you had for the rest of your life, like the bank you chose, or the electricity firm. The same goes for employees: people used to stay at the same workplace longer than they do today. Today it’s different. If we cannot deliver what our customers or employees expect, they will leave. If you cannot deliver what they want, when they want it, they will leave. And to top it off, if they are not happy, they will let everybody and anybody know about it via social networks and communities.

With online reviews and information available, your customers know more about your products, your service, your competitors, and pricing than you do. The Swedish institute for statistics just released its report for this year, and according to it, 25-30% of kids in Sweden aged 3-6 are using the Internet daily. Imagine when they grow up!

Many companies already have an engaging website, and social channels through which engaging content is sent out. Most have some sort of segmentation to personalize what is delivered to customers. However, most companies don’t have a streamlined, optimized way of doing this. Many steps are manual. Many organizations have departments divided into online marketing, social media, web and so on, which creates silos of information, making it difficult to reuse content and work efficiently. Marketing and IT are struggling to deliver exceptional customer experiences. Many are still implementing tools that are IT-centric instead of marketing-centric, using silos of information rather than integrated technologies to support a unified experience.

Forrester recently asked a panel of 100 customer experience professionals about their approaches to customer experience innovation. To read the results, download the whitepaper today!
http://www.opentext.com/campaigns/experience-suite/captivate-customers.htm

EFFECTIVE, MOBILE AND IN CONTROL

Looking closer at three points specifically will help us gear up to meet the expectations of today AND tomorrow. Our customers want to be EFFECTIVE, which means using every moment of the time they have, being MOBILE. And they want to be able to control what information they get. It’s not only corporations that face the issues of “big data” – consumers do too! They need a way to filter all the information available. They need to be IN CONTROL of the information they receive.

We as a company need to be EFFECTIVE: we need to provide customers with a great customer experience across all touch points without too much cost attached to it. We need to provide this great customer experience on every thinkable device that exists today, but also consider the devices that will be there tomorrow, enabling the customer to be MOBILE. And lastly, we need to be IN CONTROL of the content we send out to customers; we need to use insight and statistics to deliver the right content to the right person. We do not want to broadcast and push information at customers when they are not interested. That would probably be the easiest way to ruin a great experience.

So what does your company do to keep up? What is your strategy to retain control and productivity while keeping up with the high requirements of your customers?

Read More

Information Governance in the Energy Sector: Treating Information as a Valuable Resource

This is the first in a series of blogs that explore how Information Governance is delivering value to utilities, mining, and oil and gas companies in their core business processes. Information Governance is more than just “records management”: it is a means to manage risk, ensure HSE and regulatory compliance, and achieve operational excellence and competitive advantage from your information.

Even in heavily regulated industries such as energy and resources, the term “records management” (RM) has a somewhat negative connotation. Over the span of my career in engineering firms, energy companies, and public sector bodies responsible for governing the industry, I’ve come to accept that executives are reticent to invest in RM initiatives. In my experience, it typically takes some compelling event to drive major information and records management initiatives within these businesses—a scathing audit demonstrating non-compliance; a health, safety, or environmental incident that could have been avoided; or a costly claim or lawsuit. In these instances, poor information management (IM) or the reactive costs of eDiscovery are too high a price for the business to pay.

Today, it is hard to imagine that there is any executive—or employee, for that matter—who doesn’t recognize the value of information in their day-to-day operations. Especially when you consider that companies such as Google and Facebook are making such a lucrative business out of information alone, it’s clear that information is a valuable resource. Yet many utilities and resources companies continue to suffer from poor information management, often failing to invest in the tools that support even their most basic information management needs, let alone leveraging that information for innovation and competitive advantage. Information has value, and when properly managed, information can reduce risk as well as positively impact your overall revenue, efficiency, and profitability.
When effective information management is embedded in the processes and activities that your organization performs in the course of normal operations, you’re able to achieve increased compliance and realize the strategic value of that information. This is where Information Governance comes in. I am pleased that the IM industry has adopted this term to reflect the new era of information management. It more accurately reflects the paradigm shift we’ve been experiencing over the past 20 to 30 years, as electronic information in its many forms has replaced antiquated practices and tools rooted in paper-based processes.

Over the course of the next few weeks, we will be running a series of blogs that demonstrate how Information Governance supports business processes and challenges that are specific to energy and resources companies. You will hear from my colleagues, with their individual areas of expertise, about content-centric applications such as Engineering Document Management, Contract Management, Asset Information Management, Customer Communication and Customer Information Management. We will also cover two additional topics reflecting the challenges of the complex information technology (IT) landscape in modern energy companies: governing your SAP and SharePoint information, and managing legacy systems and information silos from mergers, acquisitions, and joint ventures.

I hope that you will join us for the entire blog series, as we demonstrate how Information Governance manages your risk, ensures your compliance, and delivers value to your business. For more information on the topic of Information Governance in the energy and resources sector, I encourage you to read the whitepaper we co-authored with PennEnergy, How to Mitigate Risk and Ensure Compliance: Govern Your Documents Accordingly, and visit our microsite.

Read More

Making the Most of Your CCM Initiative: Influencing Customer Behavior

Customer Communications Management (CCM) isn’t only about communicating with customers – it’s about ensuring that you communicate effectively. That means sending customers communications that are relevant to them. For our last post on making the most out of your CCM strategy, we’re going to look at how to successfully use CCM to influence customer behavior. This is the final step listed in the InfoTrends white paper Improve Your Enterprise Customer Communications Strategy in Five Vital Steps.

Data analytics and business intelligence tools can help companies better understand their customers: who they are and how they behave. By understanding their customers better, organizations can tailor communications directly to their customers’ needs, predict future requirements, and fuel upsell and cross-sell initiatives. Customer profiles – determined by online behavior, social media data and other factors – can help ensure that companies send more relevant, personalized communications, which leads to a higher rate of response. Meanwhile, technology such as personalized landing pages and QR codes can even help track printed communications.

This becomes even more relevant as companies push more marketing communications through their CCM platforms, allowing those platforms to become hubs that store, track and measure communications. The right CCM technology, with business intelligence and data analytics capabilities built in, will help provide the functionality to do so. Read the full white paper: Improve Your Enterprise Customer Communications Strategy in Five Vital Steps

Read More

Discovery Suite Release—Unlocking Value in Unstructured Information

Unstructured information is the most valuable untapped resource in the enterprise today, based on its potential to reveal hidden insights, impact decisions, and unveil new opportunities. And while some executives believe they are not maximizing the value of their unstructured information, others consider it to be more of a liability than an asset due to the cost and risk associated with unmanaged content.

Consider the following examples. A U.S. federal government agency had 19 organizational silos capturing and storing email indefinitely because they had no means of determining what could be responsibly deleted and what had long-term value to the agency and the national archive. An oil and gas company was spending millions of dollars on eDiscovery and storage (and risking non-compliance) due to the retention of mass amounts of unnecessary employee email.

The strategic use of well-governed information is a key competitive advantage. Organizations accumulate huge volumes of unstructured information. Buried within it resides the lifeblood of the organization—the intellectual property and collective knowledge of the organization. Unfortunately, it is often commingled with information that is Redundant, Obsolete and Trivial (ROT). To harness information’s true business potential, content needs to be easily available for reference, reuse, and analysis. For both the typical knowledge worker and compliance and legal specialists, unstructured information is tough to access because it resides within silos across the organization. As new content is created in increasingly new formats and locations, the digital enterprise will need technologies that not only capture and manage content but also provide critical analytics and discovery capabilities.

To enrich content and empower the enterprise, OpenText has released the OpenText Discovery Suite. OpenText Discovery Suite accelerates “time to value” by enabling people to find, understand, and leverage enterprise information.
With this release, Discovery Suite includes solutions for auto-classification, content migration, content analytics, semantic search, and eDiscovery. This new suite of applications delivers enhanced unified information access, support for enterprise-wide information governance, and extended discovery applications across the EIM portfolio.

Unified Information Access – Enterprise content is stored within separate stand-alone systems—silos of information that prevent effective “search and locate” across the enterprise. OpenText InfoFusion is an information access platform that replaces one-off information applications—and their associated indexes, connectors, hardware, and support—with a common information management platform. Discovery Suite, based on InfoFusion, connects critical enterprise applications with robust search capabilities and adds applications to solve specific problems for knowledge workers and information management professionals.

Improved Information Governance – What sets this release apart is its ability to support a transparent information governance program by automating the application of retention policies on unstructured content. The automatic classification of information helps organizations determine what information to keep and what can be disposed of, to help reduce the cost and risk associated with storage, litigation, and eDiscovery.

Extended Discovery Capabilities – The suite extends discovery capabilities to assess and enrich unmanaged content and, where necessary, to responsibly delete it or migrate it to the OpenText Content Suite. Discovery Suite also supports OpenText AppWorks and the developer’s need to create enterprise-grade applications that include search and content analytics.
Written once and running on mobile devices and web browsers without additional effort, applications created for Discovery Suite make enterprise information accessible to those who need it, when they need it, regardless of its format or where it is housed. Executives strive to achieve strategic competitive advantage by transforming the business while reducing operational cost and risk. OpenText Discovery Suite empowers organizations to discover, analyze, and act on volumes of content, support information governance and eDiscovery procedures, and protect their intellectual capital while maximizing its potential. To learn more about Discovery Suite, visit our product pages and read the press release.
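The auto-classification idea described above can be sketched at toy scale. Real content analytics engines use trained models rather than keyword lists, but the flow is the one described: classify a document, then look up the retention policy that applies to its class. The categories, keywords, and retention periods below are invented for illustration.

```python
# Deliberately simple sketch: keyword-based classification standing in
# for trained auto-classification models. All values are invented.

RETENTION_POLICIES = {   # retention period in years per document class
    "contract": 10,
    "invoice": 7,
    "correspondence": 2,
}

KEYWORDS = {
    "contract": ["agreement", "hereinafter"],
    "invoice": ["invoice", "amount due", "remit to"],
}

def classify(text):
    """Assign a document to the first class whose keywords appear in it."""
    text = text.lower()
    for category, words in KEYWORDS.items():
        if any(word in text for word in words):
            return category
    return "correspondence"   # default bucket for everything else

def disposition(text):
    """Return the document's class and how many years it must be kept."""
    category = classify(text)
    return category, RETENTION_POLICIES[category]

print(disposition("Invoice #991: amount due $4,200"))
```

Once every document carries a class and a retention period, the "what can be disposed of" question becomes a query rather than a manual review, which is where the cost and risk reduction comes from.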

Read More

PDF/UA Format and the Annual Assistive Technology Industry Association Conference

Back in January, I was honored to be able to present at the Assistive Technology Industry Association (ATIA) conference in Orlando. The conference has taken place annually since 1999 and focuses on assistive technology education. Geared primarily towards the user community, it showcases the range of assistive technologies currently available, with an extensive list of sessions, all designed to keep attendees informed on the latest and greatest.

For my own session – PDF/UA: What Is It? Why Is It Relevant? – I focused on the newest PDF file format, PDF/UA (universally accessible). Developed to define how to represent PDFs in a manner that allows for accessibility, the PDF/UA format is based on international standards and is designed to be compatible with screen readers such as JAWS and other assistive technologies. My presentation offered an overview of the PDF/UA format and looked at the challenges organizations face today in creating accessible online content, including high-volume digital PDF communications such as financial and health statements, bills, and notices. These documents pose a particular challenge for many organizations: they are difficult to tag manually for accessibility, and doing so is most often cost- and time-prohibitive.

But finding a way to make those documents accessible has shot up to the top of companies’ priority lists because of regulations and legislation like Section 508 of the Rehabilitation Act and the Americans with Disabilities Act – as well as the expensive lawsuits and settlement agreements that companies have found themselves faced with more and more. That means those organizations need to find a way to make those digital documents accessible, in keeping with the World Wide Web Consortium’s (W3C) universal standards for web accessibility, stated in the Web Content Accessibility Guidelines (WCAG).
While this is a standard, not a regulation, most accessibility requirements, standards and laws have been converging on WCAG 2.0 at the AA conformance level. Truly accessible documents contain proper tagging, markup and structures as defined by the PDF/UA standards, and are compatible with assistive technology, allowing everyone to access and navigate the document fully. They are barrier-free to people with or without disabilities and are WCAG 2.0, Level AA compliant.

My session at ATIA covered all of this and saw a range of attendees, including users of assistive technologies as well as representatives of organizations that want to better understand document accessibility and options for meeting compliance with these high-volume documents. Not only was the turnout great, but the questions I received were just as interesting. A lot revolved around specific PDF elements, or how the PDF/UA standard applied in certain document situations (for instance, in the case of multiple nested headers), while others asked whether PDF/UA was a guideline they could follow for making their PDFs accessible. I explained that the PDF/UA standard (ISO 14289-1) includes code-level requirements, geared toward developers, addressing compatibility across the document content, the reader that displays it (like Adobe Acrobat Reader) and the assistive technologies, like the screen readers used by the blind and visually impaired to access the document.

But the biggest questions related to how to address the challenges of manually creating an accessible document. Many of the attendees didn’t even know that there’s technology available now that allows for the automated creation of accessible PDFs, specifically high-volume e-delivered customer communications.
Even though the sheer volume of these high-volume digital communications makes them a particular challenge for companies, the PDF/UA document format provides a technical structure that allows for the automation of these documents with the right technology. That was good news for most of the attendees I saw at my presentation, who are looking for the right solutions to help them with their accessibility needs. Click to view the presentation “PDF/UA: What Is It? Why Is It Relevant?” from ATIA.
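Because PDF/UA is a technical standard, some of its conformance signals can be checked by machine, which is exactly what makes automation possible. The sketch below scans raw PDF bytes for two such signals commonly associated with PDF/UA claims: a MarkInfo dictionary with /Marked true (a tagged PDF), and a pdfuaid:part entry in the XMP metadata. This is a crude heuristic for illustration only, not a substitute for a full PDF/UA validator, and the sample bytes are a synthetic fragment, not a real file.

```python
# Heuristic sketch only: real conformance checking parses the PDF
# structure rather than scanning raw bytes.

def claims_pdfua(pdf_bytes: bytes) -> bool:
    """Return True if the file carries the two byte-level PDF/UA signals."""
    has_markinfo = b"/Marked true" in pdf_bytes      # tagged-PDF flag
    has_ua_id = b"pdfuaid:part" in pdf_bytes         # PDF/UA identifier in XMP
    return has_markinfo and has_ua_id

# Synthetic fragment standing in for a real file's contents:
sample = (b"%PDF-1.7 ... /MarkInfo << /Marked true >> ..."
          b" <pdfuaid:part>1</pdfuaid:part> ...")
print(claims_pdfua(sample))
```

A check like this only tells you that a document claims conformance; verifying that the tags are actually correct takes the kind of structure-level validation the standard defines.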

Read More

The Discovery Suite and the Rise of Big Content

Today, April 30th, 2014, OpenText announced the General Availability of the Discovery Suite. The Discovery Suite uses the power of search and content analytics to solve problems organizations experience with huge volumes of unstructured content. A great example of the type of problem that can be solved is Auto-Classification of content, which has become a very hot topic. Be sure to grab our whitepapers that discuss why it is time to consider Auto-Classification. Based on the availability of the Discovery Suite and interactions with customers and analysts, I would like to revisit some thinking from a previous blog.

The Rise of Big Content – Change the Conversation to Unlocking Value in Unstructured Information

I recently had the opportunity to sit on a panel with some of the leading lights in the field of Information Governance. One of the panelists, Alan Pelz-Sharpe, Research Director at 451 Group, took on the role of protagonist, pointing out that, despite the hype, Information Governance is falling flat with senior executives at many large corporations. He cited his research to point out that “less than one-third of senior management believed IG was very important at their organizations.” If, like me, you are a believer in the process and the benefits of Information Governance, this is probably both disturbing and perplexing.

Why is it that we have not been able to get the attention of the C-Suite when it comes to Information Governance? I think the answer is becoming obvious: Information Governance discussions inevitably end up focusing on risks and cost. To get the attention of execs, we need to talk about value – and more specifically, ways of contributing to the top line. We talk about costs because they are tangible and measurable. We talk about risks because of past experience, or events that happen to similar organizations.
Value, on the other hand, is typically acknowledged as a goal, but then it often fades into the background amongst the many challenges facing organizations as they take on Information Governance.

Big Content – A New Way to Look at Unstructured Content

Big Content is not the wake of our digital activities, as Big Data is, but the byproduct of the knowledge economy. It is the huge volume of freeform, unstructured information that organizations are creating and storing on the desktop, on mobile devices, and increasingly, in the cloud. Like the “3V’s” of Big Data, Big Content has significant challenges that need to be addressed in order for it to be useful. Big Content is Unintegrated, Unstructured, and Unmanaged. In order to exploit this information and gain insight from it, we need to solve the “3U’s” of Big Content.

Announcing the OpenText Discovery Suite – Solutions for Big Content

Integrating the Unintegrated: Instead of getting smaller, the list of systems that host unstructured information is growing in most organizations. The OpenText Discovery Suite eliminates these silos with a Unified Information Access Platform for enterprise sources and content. The Discovery Platform has connectors for critical enterprise applications, and it processes and indexes documents found in the enterprise. It also provides a rich library of UI components and APIs so that developers can embed search directly into line-of-business applications.

Bringing Structure to the Unstructured: Normalizing and enriching content brings structure and consistency to otherwise freeform content. OpenText Discovery Suite uses OpenText Content Analytics to bring structure to the unstructured. It extracts semantic information from content and applies it as structured metadata. Semantic metadata includes people, places, organizations, complex concepts, and even sentiment.
This, combined with file metadata and permissions metadata, provides a way to query and organize unstructured information in ways not possible before.

Managing the Unmanaged: Big Content, more than anything else, is unmanaged. We need to first manage the content so that we can differentiate between the valueless content and the content that is worth keeping. OpenText Discovery Suite has applications to manage specific Big Content problems that organizations struggle to solve every day. Search and content analytics alone do not solve business problems. Business logic, roles, reporting, and much more needs to be built on top of the platform in order to support the use case and provide a clear return on investment. Some of the Information Governance use cases include auto-classification of content, collection and early case assessment for eDiscovery, and remediation and migration to control the lifecycle of content. Some of the engagement and productivity use cases include site search, intranet, and extranet search applications.

The Way Forward – Making Big Content Valuable

In solving the “3U’s”, the Discovery Suite encapsulates the processes required to identify value in Big Content. Once the valueless content is being managed, it is possible to focus on securing and managing the valuable content. Finally, the value of the content can be amplified. Enriched content provides better access, greater productivity, increased collaboration, and content reuse. This allows us to have conversations about affecting both the bottom and top line.
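Auto-classification of the kind mentioned above can be illustrated with a toy sketch. This is purely a hypothetical keyword-scoring example, not how OpenText Content Analytics actually works; the category names and keyword lists are invented for illustration:

```python
# Toy auto-classification sketch: score a document against per-category
# keyword sets and apply the best-scoring category as structured metadata.
# Categories and keywords below are illustrative assumptions only.

CATEGORIES = {
    "contract": {"agreement", "party", "term", "liability"},
    "invoice": {"invoice", "amount", "due", "payment"},
    "hr": {"employee", "benefits", "leave", "policy"},
}

def classify(text: str) -> str:
    """Return the category whose keyword set best overlaps the document."""
    words = set(text.lower().split())
    scores = {name: len(words & keywords) for name, keywords in CATEGORIES.items()}
    best = max(scores, key=scores.get)
    # Fall back to "unclassified" when no keywords matched at all.
    return best if scores[best] > 0 else "unclassified"

doc = "This agreement binds each party to the liability terms below."
print(classify(doc))  # contract
```

A production engine would rely on trained models and semantic analysis rather than simple keyword overlap, but the core idea of mapping freeform text to structured category metadata is the same.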


Top 5 reasons you don’t need Information Governance

Enterprise Information Management companies like OpenText are pushing information governance as something that is imperative for business, but do you really need it in your life? Here are the top 5 reasons that your organisation doesn’t need an information governance program.

1. All your content and business information is stored and managed. Your IT department has a large and ever-increasing budget, so it can afford to keep all of your enterprise content and information forever – especially the duplicate and transient varieties. The more copies the better…

2. Your employees all follow company procedure to the letter. Your team is the best – they never fail to follow standard procedures because they have time to read and memorise them, and regularly review them in case company policy changes. So, they understand that information is the most valuable asset in your organisation; and they’d never use their USB sticks or unsecure public file sharing sites to store and share content…

3. Your industry is immune to oversight and regulation. There’s really no need to put legal holds on information if there is a case brought against you; the law does not apply to you, and you can afford the huge fines anyway, right?

4. Your employees don’t BYOD and they never use social media. There is no one under 60 in your company, and what’s wrong with phoning people up on landlines if you want to talk to them? Mobile phones and tablets are just a fad and would only encourage employees to play Angry Birds all day.

5. Your business runs best when you are completely uninformed. In your industry, it pays not to know what’s going on; that way you can’t be blamed for making bad decisions, yes?

For the rest of us who do need information governance, there is a wealth of videos, white papers, and other resources aimed at helping you understand how it helps minimize risk, ensure compliance, and maximize the value of information in your organization.


Making the Most of Your CCM Initiative: Automating and Integrating

There are several ways to make the most out of your Customer Communications Management (CCM) strategy. We’ve recently been outlining the 5 steps discussed in the InfoTrends white paper, Improve Your Enterprise Customer Communications Strategy in Five Vital Steps. The fourth tip relates less to the processes you put in place and more to how they are implemented, with the goal of automating and integrating through the right CCM technology. Most CCM systems are implemented in IT architectures that are patchy at best, with stakeholders often operating in silos and with limited knowledge of how the overall CCM process works. Legacy systems may be in place, and they may be less advanced in terms of their management and tracking capabilities. Corporate IT systems that could be considered part of CCM include enterprise resource planning, CRM, accounting/tax and archiving systems. Having a centralized approach to your CCM system can help ensure that you automate and integrate everything in the most efficient way possible. To ensure this, communications should be stored centrally, whether it’s in an archive, an Enterprise Content Management (ECM) system or a CCM solution. Other things to consider include:

- CCM technology should be easily integrated with other systems.
- Legacy communications should also be stored and tracked centrally, since modern CCM platforms and post-composition solutions can process and store legacy output in an archive.
- Vendors that invest in cloud solutions can offer the next level of integration, likely including on-premise data and cloud-based delivery.

Lastly, the right technology will also ensure that your communications are tracked and stored properly, giving business users and customer service representatives the information they need to communicate with customers. It also allows for the use of data analytics, offering information you can use for upsell and cross-sell opportunities.
We’ll finish off this series in our next blog post, by looking at one final step to successful CCM strategy: Influencing Customer Behavior. Read the full white paper: Improve Your Enterprise Customer Communications Strategy in Five Vital Steps


Regulatory Matters: What you should know about new FDA Supply Chain Security regulations

Last week, the European Medicines Agency sent warning letters to healthcare professionals across Europe about falsified and/or tampered vials of Herceptin, Roche’s potent drug for breast cancer. It appears that the vials were stolen in Italy, had their lot numbers modified, and were reintroduced into the supply chain. This is a growing problem, not only in the EU, but in the US as well. In response to the growing threat of counterfeit, adulterated, stolen and diverted medications entering the pharmaceutical supply chain, the US Food and Drug Administration (FDA) has implemented several important regulations highlighted within the Drug Supply Chain Security Act and the FDA Safety and Innovation Act (FDASIA). As with any global enterprise, the risks of maintaining supply chain integrity from manufacturing to distribution across international borders are massive. These risks are multiplied when we’re talking about a nearly trillion-dollar industry where a breakdown can mean injury or death. The FDA estimates that 40% of finished drugs and 80% of active ingredient precursors are imported, and has defined a rigorous process of inspections with the stated goal of preventing any type of illegal activity within the supply chain. Title VII of the FDASIA, signed into law in 2012, grants the FDA new authority to address these new challenges and better ensure the safety, effectiveness and quality of drugs imported into the United States. As stated on the FDA’s website, “Implementation of these authorities will significantly advance its globalization and harmonization strategies and support FDA’s ongoing quality-related initiatives. Further, these authorities will allow FDA to collect and analyze data to make risk-informed decisions, advance its risk-based approach to facility oversight, strengthen its partnerships with foreign regulators, and drive safety and quality throughout the supply chain through strengthened tools.
At the same time, implementation of Title VII of FDASIA is difficult and complex, and requires not only the development of new regulations, guidances and reports, but also major changes in FDA information systems, processes and policies.” Pharmaceutical companies and API manufacturers will need to become familiar with these regulations and determine how to leverage existing technologies, or implement new ones, to interface with the FDA. These include registration and listing of all drug/excipient manufacturers and importers. One critical aspect of Title VII is outlined in Section 706, which speaks to the types of records required by the FDA and the timelines to produce those records prior to an inspection or audit. Having a robust document and records management and recovery strategy has always been important, but under these new guidelines, getting the right information quickly to the FDA is essential to prevent delays in manufacturing or distribution. With over 300 life science implementations, OpenText has long provided validated ECM solutions for pharmaceutical records and process management, and through its tight integration with SAP, ensures compliance with current and emerging FDA and EMA regulations. In the next Regulatory Matters, I’ll go into more detail about the Drug Supply Chain Security Act, which outlines critical steps to build an electronic, interoperable system to identify and trace certain prescription drugs as they are distributed in the United States, also known as “Track and Trace.” In the meantime, feel free to contact me if you have any questions on how an Enterprise Information Management strategy can increase efficiency and innovation at your organization, while maintaining regulatory compliance. I will also be at the Gartner Supply Chain Executive Conference, May 20-22 in Phoenix, so if you’re in the area stop by the OpenText booth and say hello!


How I Learned to Stop Worrying about Compliance and Love Information Management

Even in heavily regulated industries such as energy and resources, the term “records management” (RM) has a somewhat negative connotation. Over the span of my career in engineering firms, energy companies and public sector bodies responsible for governing the industry, I’ve come to accept that executives are reticent to invest in RM initiatives. In my experience, it has typically been some compelling event that has driven major information and records management initiatives within these businesses – a scathing audit demonstrating non-compliance; a health, safety or environmental incident that could have been avoided; or a costly claim or lawsuit – where poor information management (IM) or the reactive costs of eDiscovery were too high a price for the business to pay. Today, it is hard to imagine that there is any executive, or employee for that matter, who doesn’t recognize the value of information to their day-to-day operations. Especially when you consider that companies such as Google and Facebook are making such lucrative business out of information alone. Yet, many utilities and resources companies continue to suffer from poor information management – often failing to invest in the tools that support even their most basic information management needs – let alone leveraging that information for innovation and competitive advantage. Information has value, and when properly managed, information can reduce risk as well as positively impact your overall revenue, efficiency and profitability. When effective information management is embedded in the processes and activities that you perform in the course of your normal operations, you achieve increased compliance – and you realize the strategic value of that information. This is where “Information Governance” comes in. I am pleased that the IM industry has adopted this term to reflect the new era of information management.
It more accurately reflects the paradigm shift we’ve been experiencing over the past 20-30 years, as electronic information, in its many forms, has replaced antiquated practices and tools rooted in paper-based processes. An effective Information Governance strategy not only addresses the myriad of information types in your business – paper, electronic documents, rich media, structured data and master data, or sensor and operational analysis data – but it manages the information in the core business processes and applications across the enterprise. The right Information Governance platform is the foundation that unifies the information silos and processes – and delivers value to the core operations of the business. Over the course of the next few weeks, we will be running a series of blogs that demonstrate how Information Governance supports business processes and challenges that are specific to energy and resources companies. You will hear from my colleagues, in their individual areas of expertise, about content-centric applications such as: Engineering Document Management; Contract Management; Asset Information Management; Customer Communication and Customer Information Management. We will also cover two additional topics reflecting the challenges of the complex information technology (IT) landscape in modern energy companies: governing your SAP and SharePoint information; and managing legacy systems and information silos from mergers, acquisitions and joint ventures. I hope that you will join us for the entire blog series, as we demonstrate how Information Governance manages your risk, ensures your compliance and delivers value to your business. For more information on Information Governance in the energy and resources sector, I encourage you to read the new whitepaper I co-authored with PennEnergy and check out all the information at Information Governance for Energy & Resources. This post originally appeared on ECM.guru.


Electronic and Physical Records Management: Why a Combined Approach is Best Practice

When many organizations make plans to implement Records Management, they focus on electronic records exclusively. Other organizations that are heavily paper-based may focus on physical records management. Best practice using current technology is to combine both approaches into a common Records Management platform. The ideal paradigm is one that allows users to search for records without regard to file type or media. If the result is an electronic record and the user has the correct permissions, the file can be retrieved and viewed immediately. If the result is a physical record and the user has the correct permissions, the user can initiate a check-out process to have the record delivered; the system will track the checked-out record until it is returned. All records, regardless of media or file type, including email, should be governed by the same records retention rules. Your records retention schedule should be media independent so that there are no unique rules for paper records versus electronic records.

What is included in Physical Records Management?

A Physical Records Management solution is much like an inventory control system or a warehouse management system. It includes locations: the building with address, room, row, shelf, and bin where boxes of records are stored. Locations and boxes are barcoded for rapid data entry when boxes are placed in a location and for checking out boxes for records requests. When records are placed in a box, the box identifier is entered into the system so that if a user requests that particular record, the specific box is known. When records are accessioned, they are typically accessioned in bulk as a group of boxes. These boxes must be stored on shelves, ideally in contiguous space in the same storage area, onsite or at a remote records storage facility. A key feature therefore is “Space Management,” which calculates the amount of storage space needed for the accession, providing available storage locations.
When records are moved from one location to another, groups of boxes are typically placed on pallets for shipment. Another key feature therefore is the ability to “containerize.” This feature provides the ability to group records into boxes, boxes onto pallets, and so on, so that the pallets can be tracked from one location to another, with visibility into which boxes are included on each pallet. When a user requests records, the solution sends a notification to the records center. Notifications are often batched together into a “pick list,” much like a warehouse management or inventory control system. The pick list may be sorted by location to make it easy for the staff to retrieve the requested boxes. The staff uses barcodes to inform the system that the box has been checked out, and the system records the requestor, request date, and fulfillment date and time. Reports provide lists of all checked-out records and any due dates for return. This solution can be used to track any physical objects, not just boxes of records. It can be used for evidence lockers for law enforcement, for storage of data tapes, CDs, DVDs, external hard drives, file cabinets, historical artifacts and so on. Tracking of location, check-out and check-in, and so on applies to these functions very well. For paper records stored in boxes, each box should include an inventory sheet that indicates the contents of the box. Ideally each individual record has an entry in the Records Management system with the same information that an electronic record would have, plus the Box ID for the container in which it is stored.

Summary

An Electronic Records Management system is a solution that uses software to manage all records, electronic and physical. If you are considering implementation of a solution, you should consider what features are supported for all media types.
A separate solution for physical records versus electronic records is not best practice, and ultimately this approach will lead to higher costs and policy that is inconsistent at best.


Making the Most of Your CCM Initiative: Enabling Business Users

Recently, we’ve been looking at different ways to make the most out of your Customer Communications Management (CCM) initiatives, discussing some of the strategies listed in the InfoTrends white paper, Improve Your Enterprise Customer Communications Strategy in Five Vital Steps. Our tips have so far involved taking a more centralized approach to your CCM strategy and engaging your customers through mobile technology. Another step to a more successful CCM strategy is finding ways to enable your business users, while reducing IT costs in the process. Standardized processes for document creation, production and fulfilment can sometimes make flexibility in CCM difficult – any changes along the way require approval from marketing and legal, and need the line of business (LOB) or department to sign off on costs. Only then can IT implement any changes. Such a long process not only means it takes a while before those changes are finally made, but can add to overall costs as well. The right technology can help streamline the process, putting less strain on IT. More specifically, template-based technology can help business users create, manage and deploy customer communications themselves. With this end in mind, tiered template communication systems are becoming the norm. In this system, there are three “tiers” of users:

- The IT user develops data and content constructs to make template creation easy.
- The business super user builds the templates and creates business rules.
- The business user then creates, modifies and schedules the communications.

This method takes a lot of the pressure off IT, and puts more power into the hands of users, creating a more efficient and flexible CCM system. In our next blog post, we’ll look at the fourth step for improving your CCM initiative: Automating and Integrating. Read the full white paper: Improve Your Enterprise Customer Communications Strategy in Five Vital Steps.


Accessibility and Health Insurers: What Section 508 and ADA Legislation Mean For Health, Medicaid and Medicare Insurance Organizations

First published on G3ict.org. More than ever, health insurance organizations – including Medicare and Medicaid programs – are tackling the issue of online accessibility, including accessible online PDFs. They have to: they face financial penalties and the risk of losing government contracts if they don’t. Two separate concerns have created this emerging demand:

1. Like federal government agencies, private health insurers are looking for ways to comply with Section 508 of the Rehabilitation Act. Since these private organizations have contracted with the federal government (Centers for Medicare and Medicaid Services) to offer the Medicare and Medicaid programs, access to the programs must meet this federal regulation. The regulation requires that they ensure access to and use of their websites and digital documentation by people with disabilities, including the blind or visually impaired who use screen reader software to visit the web and read their electronic documents. Non-compliance could lead to the loss of lucrative contracts for insurers.

2. The Americans with Disabilities Act (ADA) legislates accessibility in the United States, and currently this legislation doesn’t mention web and web content accessibility specifically, although that may be about to change. Since the 1990s, disabled plaintiffs have triumphed in many structured negotiations, agreements and lawsuits against large private organizations that did not make their websites and web content accessible to them. The rulings have generally fallen under the ADA’s Title III, which defines “places of public accommodation”. Judges have agreed in these lawsuits that a website and its content are indeed an extension of a brick-and-mortar business, a public accommodation, for those organizations otherwise required to comply with the ADA.
That risk of litigation is likely to increase if a proposal by the Civil Rights Division of the Department of Justice (DOJ) goes through, asking for an amendment to Title III to update the definition of “places of public accommodation” to definitively include websites and online information. To avoid the potential threat of large settlements, health insurers need to begin implementing measures to make their digital information and communications accessible. Ensuring compliance with both Section 508 and the ADA means not only creating accessible core content, but making sure that all online PDF documents are also accessible. That includes the thousands of informational PDF documents and collaterals typically associated with Medicare and Medicaid programs on an insurer’s website, but also a more onerous category of PDFs: the e-delivered communications like statements and notices, such as billing statements, EOBs (Explanation of Benefits), SBs (Summary of Benefits), etc. Why are these documents such an onerous challenge to make accessible? Traditionally, making PDFs accessible requires a manual tagging approach. Even when a document is created with accessibility in mind and converted to a tagged PDF, those tags still often need to be manually adjusted in order to give a screen reader user full navigation and usability of the document. This is a labor-intensive process that can be time and cost prohibitive at high volumes. Since insurers are generating these statements and notices for thousands or even millions of members every month, the page counts can be in the millions, hundreds of millions or even billions, making a manual process simply not scalable. That’s all changed now, though. New technology exists that can convert these high-volume e-delivered PDF communications on the fly, meeting the Web Content Accessibility Guidelines standard (WCAG 2.0 Level AA) and making them accessible to visually impaired customers on demand.
By negating the need for manual PDF remediation, this makes creating accessible statements and notices attainable. Plus, it’s more cost effective and far less time consuming, converting a single statement in milliseconds. Now the insurer’s visually impaired customers no longer have to wait for their accessible version, a traditional frustration for those customers, who were given less time to make critical health and financial decisions based on information in the documents. The issue of website and online document accessibility isn’t going to go away for health insurers. To keep contracts and avoid the risk of penalties, fines, lawsuits, and brand damage – while ensuring comparable access and opportunity for their blind and visually impaired customers – they’ll need to comply. The right plan of action, with accessibility technology in place, can help them do so. Are you a health, Medicare or Medicaid insurer? How could new PDF accessibility technology help you strategize your compliance plan moving forward? Offer your thoughts in the comments section below.


So Much Innovation, So Little Time

Ever think that progress is moving ever faster in today’s world – that technology is taking us places faster than our imaginations can envision? Well, our Innovation Tour Panel in Washington, D.C. last Wednesday not only reinforced the truth of that notion, but threw in a few even bigger ideas. Our topic was how Enterprise Information Management enables agencies to better perform their missions, and we heard of many agencies finding that to be true. If it wasn’t clear before the session that the value of convergence is a reality in every aspect of technology and information, then Data.gov Evangelist Jeanne Holm and CGI Federal COO Toni Townes-Whitley made it brilliantly obvious. They wove a persuasive message into a glorious tapestry of real-life examples from government agencies. And these examples all improve government performance by opening up information within, between, and among agencies and with stakeholders, including citizens and businesses. Jeanne’s exhortation – to organize data as “open by default” (consistent with the new Executive Order on Open Data), restrict only what is needed, and include rich metadata for internal use, search, and eventual publication – set the tone for the session. Toni followed up, advocating digitizing and integrating information from multiple sources. She cited the Environmental Protection Agency’s use of the ESRI geospatial platform, which pulls Department of Education school information, Health and Human Services’ cancer statistics, and Census population data and overlays them on EPA’s maps of Toxic Release Inventory (TRI) locations, enabling investigation of the relationships among chemical leaks, cancer rates, and impacted population centers. The potential of such analyses for public health is huge.
In noting the concerns of agency staff that their data is not ready to be open because of known limitations, Jeanne proffered the fearless efforts of USAID leaders who, when faced with releasing questionable data sets, used crowdsourcing, recruiting 147 volunteers, to “true up” the data. Her own home agency, NASA, found that exposing data even across their agency helped turn up instances of data duplication. Perhaps you, like me, have found that just asking questions about significant changes in regularly reported data turns up either mistakes or thoughtful analysis. Well, that’s just what these efforts reveal, too, and they broaden our perspectives as well. As Toni pointed out, multi-layered data mashups help you ask new questions and think about government information as a broader ecosystem – not just one program or stovepipe, not just one agency, not just federal, but across an agency, across government and with citizens – bringing them all into the mix. Such expanding horizons are already leading to international initiatives such as License2Share, Toni noted, a CGI and OpenText partnership in Norway between government regulators and the oil and gas industry. It’s based on a joint venture (JV) license sharing solution among industry partners for permit application and maintenance. The L2S platform hosts 40 clients, sharing information across 8 countries, 4 continents, and with 7,500 unique administrators. It promotes collaboration, is scalable, and is flexible enough to accommodate legislative change. At an even broader and more human level, the U.S. Department of Agriculture has partnered with the UK government and others across the globe to offer farming and nutrition data to assist farmers through the Global Open Data for Agriculture and Nutrition (http://godan.info/) initiative, particularly in underdeveloped countries.
In practical terms, Jeanne emphasized, this data can now enable a farmer on the hillside to treat his sick cow with information via SMS from the village phone. What do all these new possibilities mean for CIOs, no longer safely hidden behind infrastructure? Do they mean that, finally, CIOs must actually understand how their agencies work to accomplish mission activities and provide the technology that can contribute to quicker and better outcomes? Well, that answer is obvious. Toni mentioned that CIOs need to have new skills, ask more questions, and ask, not tell; curiosity is the first step. They need to frame the vision of the future and think about how the types of questions will be different over the coming 10 years. And that’s already happening. We’ve seen beginnings in PortfolioStat’s efforts to pull the C-suite operating executives into agency IT prioritization. In keeping with those insights, Jeanne noted, the Air Force CIO is taking a more mission-centric approach to defining Enterprise Architecture and to predictive analysis as well. The panel’s comments clearly demonstrated the value of enterprise information management for an enterprise of any size and complexity. Opening up data opens up new horizons and minds along the way. And linking and sharing it across enterprises, from agencies to global stakeholder conglomerates, enables both actionable analytics and improved outcomes. So, we’re not quite to world peace yet but, with EIM, we’re inching closer. Our panelists’ final advice made that clear. From Jeanne: “Be Brave; Share Data—June 1 is the National Day of Civic Hacking; Get Involved.” From Toni: “Think Big, Be Big, and Show Up Big!” Definitely, EIM is helping us all see the BIG Picture.


Part II: Rolling in the Deep: Navigating Compliance and Information Exchange

The Sarbanes-Oxley Act

In the first part of our blog series, Rolling in the Deep: Navigating Compliance and Information Exchange, we discussed A Healthy Knowledge of HIPAA, where we broke down a few simple tricks to making murky compliance issues a bit clearer. In Part II, we’re delving into the complicated regulations of the finance industry with the Sarbanes-Oxley Act, or as we like to call it, we’re here to help you with: Sorting Out SOX.

What is the Sarbanes-Oxley Act (SOX)?

After the financial scandals that rocked stakeholders and cost them billions of dollars (remember Enron?), Senator Paul Sarbanes and Representative Michael Oxley architected the Sarbanes-Oxley Act to increase the accountability of CEOs and CFOs. No longer would they be able to claim ignorance of overt financial fraud within their organizations, as the act requires them to sign financial reports indicating that the material therein is factual and accurate. The response to the act was monumental, and since then, similar laws have been approved in other countries. However, for our purposes, we will just be referring to publicly traded U.S. companies and the accounting firms that audit them, though international companies that have registered with the Securities and Exchange Commission are also bound by SOX.

What you’re on the hook for:

Section 302 requires corporations to create periodic financial reports, which the principal financial officers must sign to indicate that they believe the reports are true, without omission of important facts, and not misleading. They are also responsible for ensuring that internal controls are in place to prevent tampering with data, and for monitoring and testing those controls on a regular basis.
Section 404 states that management must create and maintain internal control structures, including the financial reporting procedures used, and provide descriptions of these methods in an annual control report, as well as assess the effectiveness of said controls. Section 906 dictates that if officers do not certify their financial reports, or do so but fail to meet all the requirements, they can be fined up to $1,000,000, imprisoned for up to ten years, or both. If the fraud was committed knowingly, the fine increases to $5,000,000 and the prison sentence to twenty years!

What it means for file transfer:

SOX makes senior officers legally responsible for maintaining complete control over the flow of financial data within their company. That means information exchange protocols are critical to being SOX-compliant. Financial officers have to be able to track where data has been, who has access to it, and where the information came from. Additionally, since they are required to create and maintain effective internal controls, data transmitted within the company must be secure. Most important is to pay attention to the business side, with IT playing the role of key facilitator.

Tips to help:

The best way to pass a SOX audit is to implement best-practice policies in your organization. We suggest the following:

- Web-enabled applications that access or move sensitive data must be SSL-encrypted and secured, along with their authentication credentials.
- Deploy all the common endpoint protection tools that would be required in any secure environment: primarily endpoint antivirus, malware protection, host intrusion prevention systems, and client firewalls.
- Document these procedures and policies to support a SOX audit.
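To make the audit-trail requirement concrete, here is a minimal sketch (hypothetical code, not an OpenText or SOX-mandated implementation) of the idea that every movement of financial data should leave a tamper-evident record of who moved what, when, and with what checksum:

```python
import hashlib
import json
import shutil
from datetime import datetime, timezone
from pathlib import Path

AUDIT_LOG = Path("transfer_audit.jsonl")

def sha256_of(path: Path) -> str:
    """A checksum lets auditors verify the file was not altered in transit."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def audited_transfer(src: Path, dest: Path, user: str) -> dict:
    """Copy a file and append a record to an append-only audit log."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "source": str(src),
        "destination": str(dest),
        "sha256": sha256_of(src),
    }
    shutil.copy2(src, dest)
    # Verify integrity after the copy, as an internal control (Section 302).
    assert sha256_of(dest) == record["sha256"], "integrity check failed"
    with AUDIT_LOG.open("a") as log:
        log.write(json.dumps(record) + "\n")
    return record
```

A real managed file transfer product would add encryption in transit, access control, and a log store that officers cannot edit; the sketch only illustrates the who/what/when/where traceability that the regulation demands.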
The Right Tools:

Having the right solution for managing your information exchange, one that provides full transparency, auditability, and industry-standard encryption for all your file movement, can make navigating compliance standards far easier. To learn more about secure information exchange, visit our OpenText Secure MFT page.

* This blog series is meant to be a high-level discussion and should not be used as direction for your compliance policies; please refer directly to the governing body for those policies.

Read More

PBS Adopts OpenText Media Management

A major press announcement at the NAB Show in Las Vegas, “PBS Adopts OpenText Media Management to Support Range of Media Assets,” revealed the latest strategy for PBS in managing digital assets for its network of more than 350 member stations. A key part of that strategy is DAM and process management. “As a public media enterprise, managing and sharing an increasingly rich and complex universe of assets is a growing challenge. We were seeking an asset management system that could meet a broad range of needs,” said Chris Contakes, PBS Vice President, Information Technology. “We were looking for a solution that would help us develop the ability for producers, member stations and PBS to create, distribute and exchange rich assets efficiently and effectively. Being able to assign specific end users to deliverables along with timelines and scope of work offered us complete visibility into our processes,” continued Contakes. “The OpenText system offers us a digital asset management solution to manage assets as deliverables with assignments and due dates in order to meet PBS’s promotional and programming timelines.” OpenText worked closely with PBS, establishing a proof of concept to demonstrate not only how OpenText Business Process Management (BPM) and Media Management are integrated, but also the value of managing processes throughout the entire digital media workstream, or supply chain, for digital assets. At OpenText, we recognize this as the Create-to-Consume information flow. The Create-to-Consume information flow interacts with many departments in an organization. Media organizations like PBS are recognizing that traditional linear workflows are overwhelmed as marketing, promotion, and omni-channel distribution of media content continue to grow. At NAB 2014, OpenText demonstrated the Create-to-Consume information flow based on the themes identified in working with PBS and common to the media and entertainment industry.
We mapped out a workstream demonstrating the flow of a project from the initial pitch or project commissioning, through production, finalization, and approvals, to omni-channel publishing and programming. Integrated with OpenText Business Process Management, the demo started with the orchestration and management activities associated with making a program, production, or even a marketing campaign. At NAB we showed how a team collaborated on a program pitch, got the green light, and then the system created a project structure in OpenText Media Management. This notified project members, assigned tasks, and set up a folder structure to start gathering and sharing content. Once initiated, the parallel streams for video production, photo shoots, and graphics production for DVD packaging were all able to collaborate, with status and progress being tracked throughout the work-in-progress production processes. Final approval was directed from the Media Management system for all the deliverables, including marketing campaign assets, program information, the product catalog, video promos, etc. One area of keen interest to NAB attendees was monetization, where we showed an integration with e-Commerce, in which assets populated a product catalog for an online store, allowing customers and partners to download or order content. The marketers and website producers had access to all the assets in Media Management. Integration with OpenText Web Experience Management and Customer Communication Management allowed automated customer communications and the ability to seamlessly build out webpages and microsites with video promos and images for a rich and engaging experience. The primary goal of the demonstration was to show the digital media workstream as an ecosystem with many interdependent and interrelated applications and activities: DAM, WEM, BPM, e-Commerce, work-in-progress, approvals, and omni-channel delivery all working together in a specific use case.
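The green-light-to-delivery flow described above can be sketched as a simple state model (all names hypothetical; this is not the OpenText BPM or Media Management API, only an illustration of the orchestration pattern):

```python
from dataclasses import dataclass, field

@dataclass
class Project:
    """A commissioned program with parallel production streams."""
    title: str
    status: str = "pitched"
    tasks: dict = field(default_factory=dict)    # stream name -> state
    folders: list = field(default_factory=list)  # shared content locations

def green_light(project: Project) -> Project:
    """Approval triggers setup: folders, task assignments, parallel streams."""
    project.status = "approved"
    project.folders = [f"{project.title}/{name}"
                       for name in ("video", "photo", "graphics", "marketing")]
    # Parallel work-in-progress streams, each tracked independently.
    project.tasks = {stream: "in_progress"
                     for stream in ("video_production", "photo_shoot", "dvd_graphics")}
    return project

def finalize(project: Project) -> Project:
    """Every stream must be done before deliverables go to omni-channel publishing."""
    if project.status == "approved" and all(
            state == "done" for state in project.tasks.values()):
        project.status = "published"
    return project
```

The point of the sketch is the dependency structure: approval fans out into parallel streams, and publication is gated on all of them completing, which is exactly what a BPM engine tracks at scale.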
Managing an ever-increasing number of people, processes, and technologies is a growing challenge for organizations. OpenText has a broad portfolio of products and expertise in enterprise information management to help companies struggling with a digital media strategy. Understanding the value of the digital media workstream, and how coordinated, collaborative processes can bridge the creative production side with downstream deliverables for marketing and consumption, can help put your company on a path to greater success.

Read More

Guest Blog: Evan Quinn, Research Director – Enterprise Management Associates

Guest Blog Post by Evan Quinn, Research Director – Enterprise Management Associates

How quickly are organizations deploying new applications to introduce new processes? And is the deployment speed meeting expectations?

Apps Should Match or Lead the Business

Organizations are in constant flux: new or changing employees, customers, partners, markets, ideas, decisions, operations, products, services, competitors. Companies that try to stand pat lose. Don’t be fooled by “brand,” which may make a company look solid and unmoving to the general public. Successful companies adapt constantly, sometimes reactively and unconsciously, sometimes proactively and consciously. The same holds true for applications, whether in-house or from software vendors, old or new. Therefore, the question needs to be expanded in scope to include (a) enhancements to current applications in addition to (b) new applications, since both have material impact on the business. Companies that do not commit to ensuring their applications reflect business change, preferably at the rate of the business but optimally just ahead of it, place a yoke around their ability to compete. What good are applications if they are running behind the business? The “automation” and “productivity” ROI of apps turns into a drag on business performance when the apps lose touch. Therefore, CEOs, CIOs, and line-of-business executives who make keeping their apps up to snuff a top priority, including building new apps if called for, are executing a winning formula. Organizations should be constantly deploying new or enhanced apps; if they aren’t, something is amiss.

BPM and iPaaS are the Answers

Even with a commitment to keeping apps aligned with the business, there are only so many resources: analysts, developers, support staff, budgets, and time are in limited supply.
BPM in general, and the concept of iPaaS (integration Platform-as-a-Service) in particular, offer a set of proven best practices and tools to help companies keep up, and even pull ahead. Working with an application that reflects important business processes but has not been documented in BPM diagrams is like visiting a new country without a map or GPS. Without that map of the processes and the corresponding data, analysts and developers fly blind when determining how to service new requirements and when trying to innovate. Without BPM, companies become dependent on local knowledge stuffed in developers’ and analysts’ heads, and are at risk when those experts find new jobs. Thus companies that use BPM successfully both avoid risk and enjoy reward. The next step after BPM, iPaaS, offers an opportunity to build new apps out of existing, proven processes. Based on BPM, and rendered as APIs, iPaaS helps analysts and developers quickly build and deploy more innovative apps and process sets without reinventing the process wheel each time. I like to call these “integrative applications” because they typically reflect true, optimized, complex business flows rather than a restricted set of imagined business processes.

In summary:

- Most companies are in a constant state of change regarding business processes, and should be.
- Applications, old and new, should therefore also change to reflect the nature of the business.
- Organizations, like it or not, are always deploying new apps and enhancing old apps, or else their IT is out of business alignment.
- BPM is the best available set of tools and practices to add reliability and speed to keeping apps in touch with the business.
- iPaaS offers a new layer on top of process integration to rapidly build and deploy new apps that reflect optimized process flows, adding productivity to developer and analyst work and ensuring better business outcomes.
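The core idea of rendering documented processes as reusable APIs might look like this minimal sketch (all names hypothetical; a real iPaaS would expose these over HTTP with connectors and monitoring). A process documented once becomes a callable building block that several "integrative applications" compose instead of re-coding the flow:

```python
from typing import Callable, Dict, List

# A "process" is an ordered list of named steps operating on a payload.
# Documenting it once (the BPM map) lets any app reuse it.
ProcessStep = Callable[[dict], dict]

class ProcessRegistry:
    """Registry of documented processes, exposed as callable APIs."""

    def __init__(self) -> None:
        self._processes: Dict[str, List[ProcessStep]] = {}

    def register(self, name: str, steps: List[ProcessStep]) -> None:
        """Record a mapped process so applications can reuse it by name."""
        self._processes[name] = steps

    def run(self, name: str, payload: dict) -> dict:
        """Execute a registered process end to end: the 'API' an
        integrative application calls instead of reinventing the flow."""
        for step in self._processes[name]:
            payload = step(payload)
        return payload
```

Because each app calls `run("order_to_cash", ...)` rather than embedding its own copy of the steps, an improvement to the mapped process immediately benefits every application built on it, which is the productivity argument made above.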
All that said, the business should never be satisfied with the speed of new app development and app enhancement. IT should directly feel and respond to, even ask for, the pressure that the business feels.

Read More