
Replacing Your Legacy Archiving System is a Pain. No More!

Large organizations rely heavily on rapidly evolving technology to thrive in today's competitive business environment. One of these vital solutions is the electronic archiving system, which is expected to maintain a comprehensive and accurate record of customer information such as statements, bills, invoices, insurance policies, scanned images, and other organizational information essential to the survival and growth of the enterprise. It is critically important that these assets be retained in an efficient and intelligent manner so they can be retrieved on demand for customer presentation, compliance, auditing, reporting, and more.

Like all information technology, archive systems need to be upgraded from time to time. Depending on the requirements of a progressive organization, this could even mean replacing the existing system with a brand-new solution. The first step toward an effective solution, however, is identifying the shortcomings of the current system in the context of your evolving business needs. Here are a few tell-tale signs that your archiving system hasn't been keeping up with your growth:

- Waning vendor support – The system doesn't receive enough attention from the vendor in terms of upgrades and support.
- Costly upgrades – It becomes prohibitively expensive to boost performance or add new capabilities and features.
- New media deficit – The system falls short on receiving and serving up content to the multitude of customer channels, including web, social, mobile, tablet, text messages, email, and print.
- Social disconnect – Perhaps the most easily recognizable symptom of an outdated archive system is the inability to connect with social media accounts such as Facebook and Twitter to capture and store customer information.
- Content inaccessibility – Users complain that they cannot extract data for targeted messaging, trans-promotional marketing, analytics, and other sales and marketing functions.
- Compliance infractions – The system cannot store or retrieve content as required, which could lead to investigations, fines, license revocations, or lawsuits.

If you can relate to one or more of these issues, then upgrading to a more contemporary solution may be the best way forward. One type of archive migration we have conducted for our customers, and have extensive experience in, is for the Mobius/ASG-ViewDirect® system. The challenges often highlighted for this system include some of those listed above, as well as other issues typically seen in legacy archive systems, such as the lack of a coherent product roadmap, high costs, and an outdated user experience. Customers are often certain about the need for migration but unsure about how to move to a new archive without disrupting critical business functions. The only real roadblock to improved performance, then, is the migration itself. The process can be laborious and cumbersome, with key performance factors including the ability to perform complex document migrations on time and within budget, maintain access for existing applications, repurpose information locked in legacy document formats, and meet regulatory requirements. While enterprise IT departments have stringent migration requirements, modernizing your archiving system doesn't have to be painful, and OpenText's ECM Migration service has a methodology in place to make sure it isn't.
The service provides a way to efficiently migrate content out of legacy archiving systems like Mobius/ASG-ViewDirect® and others to a more contemporary solution such as OpenText's Output Archive (formerly known as BIRT Repository). Unique benefits of using OpenText's ECM Migration service for Mobius migrations include the ability to migrate content out of Mobius without purchasing expensive Mobius APIs, and the capability to read directly from the underlying file structure using the Mobius Resource Extractor, bypassing the need for Mobius to be running.

Our ECM Migration methodology was designed around best practices gleaned from many successful engagements, and it uses award-winning technologies to automate migration in a lights-out environment without disrupting day-to-day business activities. The ECM Migration team has worked for decades with many ECM systems, including IBM® Content Manager OnDemand (CMOD), IBM® FileNet® Image Services, IBM® FileNet® P8, ASG-ViewDirect®, and others, and the maturity of our solution proves it. Our technology and DETAIL™ Methodology enable us to:

- Manage all aspects of a migration
- Cut up to six weeks off the initial planning
- Use standard logging on a single platform
- Provide multi-threaded support out of the box
- Implement process flows, advanced logic, and routers through drag-and-drop interfaces, without the need for scripting
- Connect to and pool connections with multiple databases and repositories
- Run processes concurrently, by thread or broken down by stage (i.e., Load, Extract, Convert)
- Handle massive volumes of data, documents, images, and metadata

So, if you think it's time to say goodbye to your current archiving system, know that there are experts who can help you define your requirements and deploy an appropriate solution that will take you where you want to go. And remember – organizations that evolve, thrive. Others perish.
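For readers curious what the staged, concurrent migration pattern described above can look like in practice, here is a minimal sketch in Java using standard-library thread pools. The class, stage, and method names are illustrative assumptions, not the actual DETAIL™ tooling; the point is simply how Extract, Convert, and Load stages can run concurrently across a batch of documents.

```java
import java.util.List;
import java.util.concurrent.*;

// Illustrative sketch only: a staged, multi-threaded migration pipeline in the
// spirit described above. Names are hypothetical, not the DETAIL(TM) tooling.
public class MigrationPipeline {

    record Document(String id, byte[] content) {}

    // Each stage gets its own thread pool so extract, convert, and load
    // can proceed concurrently on different documents.
    private final ExecutorService extractPool = Executors.newFixedThreadPool(4);
    private final ExecutorService convertPool = Executors.newFixedThreadPool(4);
    private final ExecutorService loadPool = Executors.newFixedThreadPool(4);

    public CompletableFuture<Void> migrate(String docId) {
        return CompletableFuture
                .supplyAsync(() -> extract(docId), extractPool)   // read from the legacy file structure
                .thenApplyAsync(this::convert, convertPool)       // repurpose the legacy document format
                .thenAcceptAsync(this::load, loadPool);           // ingest into the target archive
    }

    private Document extract(String docId) {
        // e.g., read directly from the legacy repository's underlying files
        return new Document(docId, new byte[0]);
    }

    private Document convert(Document doc) {
        // e.g., transform a legacy print stream into PDF plus metadata
        return doc;
    }

    private void load(Document doc) {
        // e.g., write into the new archive and log the result centrally
    }

    public static void main(String[] args) {
        MigrationPipeline pipeline = new MigrationPipeline();
        List<CompletableFuture<Void>> jobs =
                List.of("doc-1", "doc-2", "doc-3").stream().map(pipeline::migrate).toList();
        jobs.forEach(CompletableFuture::join); // wait for the batch to finish
        pipeline.extractPool.shutdown();
        pipeline.convertPool.shutdown();
        pipeline.loadPool.shutdown();
    }
}
```

A production migration adds restartability, centralized logging, and verification steps, but the staged pipeline is the core shape.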


Heat Wave: Summer Momentum for OpenText Analytics

Summertime, and the livin' is easy, Gershwin's famous song tells us. The lengthening days may beckon people to relax and slow down, but not at OpenText Analytics and Reporting. We are seizing the season and accelerating ambitious plans to help our customers win in a digital-first world. We've got momentum behind us – a Summer Wind, to quote another famous song – and we believe that recent analyst reports from Ventana Research and Forrester Research highlight that momentum.

Ventana Research: "Hot Vendor"

OpenText Analytics was named a Hot Vendor in the just-released Ventana Research Analytics and Business Intelligence Value Index. Ventana Research analysts categorize vendors as Hot, Warm, Cold, or Frigid. We think being called Hot is pretty cool, and it encourages us to push ourselves farther. Ventana's Value Index examines vendors' products across seven categories – five that evaluate the ability to support business processes associated with analytics and business intelligence, and two covering vendor validation and TCO/ROI. These seven factors are then boiled down to a single number for each vendor. OpenText Analytics earned its Hot rating with an 88.1 on a 100-point scale.

"Actuate (now OpenText) scored as a Hot Vendor with its highest scores in the categories of reliability, adaptability and TCO/ROI," says Tony Cosentino (@TonyCosentinoVR), VP and Research Director, Business Analytics at Ventana Research. "Actuate continues to make strides in the integration of its portfolio including API documentation, ease of embedding BIRT and with its discovery tool, BIRT Analytics."

Forrester Research: A "Leader" in Enterprise BI Platforms

We believe that The Forrester Wave: Enterprise Business Intelligence Platforms, Q1 2015 report also confirms the momentum behind OpenText Analytics, citing us as a Leader. Forrester's four evaluation categories are Leaders, Strong Performers, Contenders, and Risky Bets.

"OpenText Actuate differentiates by scaling to millions of reports and users," writes Boris Evelson (@bevelson), Vice President and Principal Analyst, in the report. "Its top use cases involve distributing complex, interactive online statements to customers of large financial services institutions." Evelson writes that Forrester tracks more than 100 vendors in the BI market but only deep-dives on the top 11 vendors for the Wave report. These 11, including OpenText Analytics, were chosen based on product fit, customer success, and Forrester client demand, with analysis weighted toward the needs of larger companies and other scenarios.

We're pleased to be named a Leader in Forrester's Wave report, and we intend to use that momentum to propel OpenText Analytics farther than ever. As yet another classic summer anthem goes: "Catch a wave and you're sittin' on top of the world."

Click to read Tony Cosentino's summary of the Ventana Value Index report, "Who's Hot in Analytics and Business Intelligence." Click to download the Forrester Wave: Enterprise Business Intelligence Platforms, Q1 2015 report. Click to view a free replay of Boris Evelson discussing the Forrester Wave report in an OpenText Analytics webinar. Surfing image by Chris Pizzitola, from Flickr.


OpenText Partners to Establish the Open Data Exchange (ODX)

This week, OpenText was honored to host the announcement of the Harper Government's support for the Open Data Exchange (ODX) at our Waterloo office. Minister Tony Clement, President of the Treasury Board of Canada and a long-time proponent of the open government and open data movements in Canada, officially introduced the initiative by announcing up to $3 million in funding for Communitech Corporation to establish the Open Data Exchange (ODX) here in Waterloo.

Both the open government and open data movements are stimulating innovation and fostering economic growth around the world. Open data is information that is accessible, available in a digital machine-readable format, and reusable under open license terms. Over the past decade, governments have launched initiatives to promote the reuse of open data, developing open license models, establishing regulatory frameworks, and making data publicly available on government websites. The ODX is a component of the "Government of Canada's Action Plan on Open Government 2.0" and a demonstration of the Government's continued commitment to establishing Canada as a leader in digital innovation.

The ODX brings together key players focused on unlocking the potential of open data for the economy. Its foundation is a partnership among the Federal Economic Development Agency for Southern Ontario (FedDev Ontario), the University of Waterloo, Communitech, the Canadian Digital Media Network (CDMN), and tech leaders OpenText and Desire2Learn (D2L). Representatives from each organization joined Tony Clement to share their thoughts on the importance of the ODX. From left to right, speakers included Jeremy Auger, Chief Strategy Officer, D2L; Kevin Tuer, Managing Director of the ODX and Vice-President, Digital Media for Communitech; Tom Jenkins, Chairman of OpenText Corporation; the Honourable Tony Clement; Peter Braid, Member of Parliament for Kitchener-Waterloo; and Dr. George Dixon, Vice-President of Research, University of Waterloo.

Data is a part of our DNA here at OpenText. We manage a third of the world's data behind the firewall. It's only fitting that we provide a temporary home for the ODX, which will reside at our Waterloo headquarters in the David Johnston Research & Technology Park in Waterloo, Ontario, until plans are finalized for a permanent home in uptown Waterloo.

Data is the raw material for new products and services. Open data promises unlimited potential in its combinations of datasets of information. The ODX is the engine that will help drive Ontario's growth and prosperity. Tech innovation is the oil that will power this engine in its race for the digital future. Please join me in welcoming the ODX to OpenText. Read the press release.


Accessible PDF Discussion: A Solution for High-Volume Customer Communications

To achieve complete web accessibility, organizations require a viable technology solution for automatically generating high-volume (i.e., enterprise-level) personalized customer communications in Accessible PDF format. Such a solution would give blind and visually impaired people immediate and equal access to electronic documents, which they currently do not have.

This series of blog posts explains why demand is increasing for customer communication documents in Accessible PDF format, describes current industry practices for producing these documents, and introduces a new technology option that enables organizations to keep PDF as their default electronic format for high-volume, web-delivered documents of record while enabling full accessibility and usability of those PDFs for individuals who are blind or visually impaired. In recent blog posts we examined the Drivers behind Web Content Accessibility, Best Practices for PDF and Accessible PDF, and Approaches to Generating Accessible PDFs. In this post, we will examine a new, patented, state-of-the-art technology solution that is specifically designed to generate high volumes of Accessible PDF. You can also access the complete white paper on this topic, Enterprise-Level PDF Accessibility: Drivers, Challenges and Innovative Solutions.

New Technology for Generating Enterprise-Level Accessible PDFs

A first-to-market, enterprise-level technology for automatically producing personalized customer communications in Accessible PDF format is now available on the market. The new technology converts print streams, documents, and data into Accessible PDFs, either in high-volume batches (i.e., thousands or millions) or individually on demand (dynamically, in real time). Organizations can use this type of innovative technology to simultaneously improve the customer experience for people with visual disabilities and comply with relevant accessibility legislation such as the ADA, Section 508 of the Rehabilitation Act, Section 255 of the Telecommunications Act, and the CVAA.

Accessibility Rules

Unlike manual remediation, this automated technology leverages a sophisticated, inherently flexible rules model, ensuring that each source document, whether in PDF or print stream format, completely incorporates the specified accessibility rules. Accessibility templates can be easily edited to ensure production continuity of recurring (e.g., monthly) high-volume transactional documents. Organizations may need to secure accessibility expertise (at least initially) to define PDF accessibility rules for each document type, although the various document authors and creators require no specialized accessibility knowledge because the automated technology includes a highly intelligent graphical user interface.

Quality Control

Visual PDF tag inspection and usability testing is not required on every page, as it is with manual remediation. Instead, quality control can be maintained with automated and manual accessibility/usability testing on small batches of documents. This patented technology utilizes the PDF/UA format (ISO 14289-1) and incorporates the Matterhorn Protocol to generate Accessible PDF output that has been independently tested and found to conform to WCAG 2.0, Level AA, by nationally and internationally recognized advocacy organizations and well-respected accessibility firms worldwide.

Deployment

This innovative technology can be deployed as a traditional on-premises or virtual software installation, or as a cloud-based solution.
Observed Effect on Traditional Format Production

Recent cloud deployments of this new technology are having an unexpected, positive effect on the production of traditional alternate formats such as Braille, large print, and audio. These traditional formats, like manual PDF remediation, can be time- and cost-intensive to produce, and can delay the delivery of customer communications. The rich, structured output from this automated technology has allowed the production of traditional formats, including Braille, large print, and audio, to be automated as well, lessening the labor resources needed for manual processing. Each personalized communication statement or notice can be produced more efficiently, reducing cost and delivery time for alternate hardcopy formats.

ePresentment Options

With the advent of affordable, scalable technology for automatically producing high-volume Accessible PDF documents, private organizations and government agencies suddenly have a number of new options for presenting online customer communications. For example, organizations can provide Accessible PDF communications by default, creating an inclusionary environment while meeting legislative mandates. This delivery scenario enables blind and visually impaired people to access their personal information at the same time as other customers, without the delays associated with requesting alternate hardcopy formats. Or, instead of producing Accessible PDFs by default, organizations with existing systems for online presentment of PDF documents may choose to provide customers with on-demand conversion to Accessible PDF format, both for current PDFs and for archived documents. For online users, this would mean replacing an inconvenient exception process with merely a few button clicks.

Implications for Organizations and Individuals

This innovative technology solution for enterprise-level PDF accessibility offers blind and visually impaired people equal (and instant) access to electronic documents, empowering them to be more independent, make more timely financial and other critical decisions, and participate more fully in the 24/7 digital world. Imagine: individuals who prefer digital technology can finally say "No, thanks" to exception processes, accommodations, late-arriving hardcopies in alternate formats, and other such hassles. For blind and visually impaired people who do not currently request documents in alternate formats, and instead rely on family, friends, and support workers to help them manage their affairs, this technology offers another avenue for becoming more independent.

With the arrival of this groundbreaking automated technology and its ready availability to organizations producing customer communication PDFs, more than 20 million blind and visually impaired Americans have an opportunity to use their buying power to effect meaningful change. They can achieve this by patronizing service providers that offer instant online access to Accessible PDF versions of bank and credit card statements, phone bills, insurance documents, and other routine (yet vital) communications. Likewise, they can demand a comparable level of service from government agencies, non-profits, and institutions of higher education. This type of technology is a game changer for industry, for government, and, most importantly, for blind and visually impaired people, who deserve equal access to the entire internet, including websites, web content, and web-delivered documents. For more information on this technology, visit ccm.actuate.com.
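To make the rules-model idea from the "Accessibility Rules" section above more concrete, here is a minimal, hypothetical sketch in Java. The product described in this post is configured through a graphical interface, and none of these class or field names reflect its actual API; the sketch simply shows how a template per document type (language, reading order, alt text) can drive a batch conversion.

```java
import java.util.List;
import java.util.Map;

// Hypothetical sketch of a rules model for batch Accessible PDF generation.
// Names are illustrative only and do not reflect any actual product API.
public class AccessiblePdfBatch {

    // A template pairs a document type with the accessibility rules to apply.
    record AccessibilityTemplate(String documentType,
                                 String language,           // e.g., "en-US", exposed to screen readers
                                 List<String> readOrder,    // logical reading order of named regions
                                 Map<String, String> altText) {} // alt text per image/graph region

    record SourceDocument(String documentType, String printStreamPath) {}

    // Apply the template matching each document's type; a recurring monthly
    // statement reuses the same template run after run.
    static void convertBatch(List<SourceDocument> batch,
                             Map<String, AccessibilityTemplate> templates) {
        for (SourceDocument doc : batch) {
            AccessibilityTemplate t = templates.get(doc.documentType());
            if (t == null) {
                throw new IllegalStateException("No accessibility rules defined for " + doc.documentType());
            }
            // A real system would tag headings, tables, lists, and images per
            // PDF/UA (ISO 14289-1) while rendering the output file.
            System.out.printf("Converting %s using template for %s (%s)%n",
                    doc.printStreamPath(), t.documentType(), t.language());
        }
    }

    public static void main(String[] args) {
        var template = new AccessibilityTemplate(
                "bank-statement", "en-US",
                List.of("header", "summary", "transactions", "footer"),
                Map.of("logo", "Bank logo", "spendChart", "Monthly spending by category"));
        convertBatch(
                List.of(new SourceDocument("bank-statement", "statements/2015-05.afp")),
                Map.of("bank-statement", template));
    }
}
```

The key design point is that accessibility expertise is captured once per document type, in the template, rather than being reapplied page by page as in manual remediation.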


Good Cloud. Bad Cloud. Why Cloud?

Confused about the cloud? You're not alone. Adoption is projected to grow at double digits despite plentiful guidance on why we should fear the cloud. Pundits tell us, "If your organization is not implementing the cloud, you're already behind." Yet it is easy to feel the cloud is just beyond our grasp. So let's take a look at some real-life use cases from sectors that are leading the way in enterprise adoption of the cloud.

Cloud Illusions

Ask a few CIOs about the cloud and you are likely to hear a wide range of responses, from concern that the cloud endangers security and privacy to elation that the cloud can be the ultimate platform for change. While much of this reflects well-reasoned advice and counsel, some is pure hype. When even The Onion takes on "that cloud thing that everyone is talking about," we should realize that we are at hype and jargon saturation. With all the noise around cloud computing, cloud storage, and cloud apps, and the debate about the pros and cons of public, private, and hybrid clouds, we need to consider what is real and what is merely illusion – and, moreover, why we should ultimately care. These beautiful lyrics from the '60s seem to foretell our current state of confusion over the cloud:

"I've looked at clouds from both sides now, from up and down, and still somehow it's cloud illusions I recall. I really don't know clouds at all." — Joni Mitchell, Both Sides Now, from the album Clouds

The cloud is a growing reality, and CIOs and IT teams need to clearly understand how it can best be applied to advance their strategic interests. IDC research forecasts that public cloud will grow at double digits and that spending on private cloud will top $24 billion by 2016. CompTIA predicts that the next decade will see cloud computing become even more accepted as a foundational building block. We are seeing the cloud go mainstream in the public sector, and Gartner predicts the cloud is moving into digital business, advising CIOs and other IT leaders to continually adapt to leverage increasing cloud capabilities. The Open Group Cloud project analyzed 24 business use cases driving adoption. In general, the rationale can be classified into five areas: agility, productivity, QoS, cost, and the ability to take advantage of new business opportunities — all of which have been guiding principles for applying technology in the past. So how well are our past years of enterprise hardware and software know-how translating to the cloud for large-scale applications? Here are three sectors that are forging the way with successful cloud implementations to drive efficiency, improve time to market, and effect business transformation.

The Cloud Drives Cost Efficiency

World Economic Forum research reveals that governments are adopting cloud services at higher-than-expected rates. The growing adoption of cloud technology is happening at all levels of government around the globe. We are already seeing the cloud play a role in changing how government agencies fundamentally spend money and allocate their IT resources.

"We came out with a cloud-first policy because… it offers a faster time to market, a reduction to risk, and hopefully a reduction in cost." — Carlos Ramos, CIO, State of California

While adoption is being driven in part by cloud-first mandates, the cloud is clearly aligned with government mission objectives. The public sector has embraced a data-driven approach — including open data and big data initiatives — to be responsive to citizens.
Cloud implementations are seen as a means of moving beyond data transparency to achieve a cost-effective state of operational excellence. Four Trends to Watch in 2015 highlights the cloud as a means to be responsive to citizens' wants, needs, and ideas. For municipalities, the cloud provides equal, on-demand, cost-effective access to a shared pool of computing resources. The City of Barcelona hosts 1.5 million guests for the La Mercè festival, using the cloud to help manage the surging foot, bike, auto, and public transportation traffic. The state of Delaware implemented a cloud-based CRM application for constituent tracking in two months, adopting a cloud-first policy that piggybacks on federal policy. The state set up a private cloud and virtualized 85 percent of its physical servers, saving $4 million per year. Delaware now has 70 applications in the cloud — from event notification to cybersecurity training. For central government organizations, including the US Department of the Interior, shared services are eclipsing "cloud-first" mandates as the driver behind cloud adoption. DOI's groundbreaking cloud initiative consolidates all the records information programs under one IT governance system, and this shared service is expected to save an estimated $59 million in taxpayer dollars by 2020.

The Cloud Supports Business Transformation

Gartner Research identified the financial services banking and insurance segments as two of the top cloud adopters. These segments are driven by the need for more innovation and the value they get from that innovation. Financial services firms are rewarded for systems that can process transactions faster and more securely, and they are providing new services, such as mobile banking and claims, that are ready-built for cloud-based systems. There is also growing competition with startups that are shifting the playing field. Way back in 2013 (a decade in cloud years), my article The Art of Banking: How Financial Services Approach Great Customer Experiences talked about how bankers would increasingly take innovation cues from consumer tech and smart retailers as they practice the art of banking. Over the past year, the cloud has proven to be both a major disrupter and an enabler of innovation. Like the other big research firms, IDC sees digital transformation as key for businesses and as a bridge that CIOs must learn to cross – and that bridge includes the disruptive influence of cloud computing. A recent article from Banking Technology, "Why I'm backing the banks," declares that traditional banks are now in a race to remain relevant as they face a slew of non-bank competitors whose offerings consumers increasingly value. Accenture found that one in five consumers would be happy to bank with PayPal — a cloud firm born in Silicon Valley. Though the cloud is often a cost-saving measure, CIOs are seeing its potential to create a flexible platform for future innovation. A poll of financial services sector decision makers revealed the top two benefits of adopting cloud platforms as cost savings (voiced by 62 percent of respondents) and a simplified IT environment (52 percent). It is this simplification of the IT environment that will enable banks to level the playing field with the upstarts:

"The newer entrants owe much of their success to their extreme agility with ICT: they have got where they are because they use technology better than anyone else. Yet, it would be premature to lament the passing of banks as we know them.
They are increasingly taking the tech start-ups' own medicine… [and the] search for innovation is rapidly pushing the cloud up banks' technology agendas."

While banking has definitely upped its cloud game in the last few years, insurance is perhaps the granddad of cloud adoption. In How Cloud Computing will Transform Insurance, Accenture highlighted insurance as being at the forefront of cloud growth and predicted that the cloud would transform the industry. On its list of reasons to adopt the cloud is the "ability to respond to market change and reshape operating model[s] to address new and emerging opportunities and challenges." An SMA study of cloud adoption trends in insurance found that 35 percent of participants said the cloud "provides companies with the flexibility needed to respond quickly to changing needs." In retrospect, while cost savings has been a driver for insurers to adopt the cloud, there are already a number of insurance cloud success stories that illustrate the cloud's real potential as a means of innovation and competitive advantage in a changing market with a changing customer demographic. Andre Nieuwendam, director of IT for United Property & Casualty, describes their cloud success in customer-centric terms: "From an insured perspective, there are many initiatives on the table that we want to be able to provide them: file a claim electronically, check billing, and interact with customer service people in a real-time environment. Being in the cloud has enabled us to meet all of these objectives in a very, very short period of time."

The Cloud Enables Speed to Market

In a recent Forbes article, "Cloud Is the Foundation for Digital Transformation," Ray Wang (@rwang0) highlights the cloud as the single most disruptive of all the new technologies: "Cloud not only provides a source of unlimited and dynamic capacity, but also helps users consume innovation faster." The idea of leveraging the cloud as a platform for speed in a changing market is appealing, and it especially resonates in the communications, media, and entertainment sector, one that Gartner has identified as second only to banking in cloud adoption. In Breaking Bad: How Technology is Changing Media & Entertainment, I wrote about the digital media supply chain and how entertainment and broadcast companies are experiencing no less than an industry revolution:

"Motion pictures used to be cut, approved, and canned for distribution and released in a series of 'windows' for consumption. With digital distribution this model stops working — all the traditional 'windows' of distribution are collapsing. This has a ripple effect all the way down the chain of production and accounting and requires new IT systems and applications to address the new paradigm."

According to Accenture's Content in the Cloud in the Broadcast and Entertainment Industry, the cloud can be the platform on which the digital media supply chain operates to better serve changing markets and consumption models: "Cloud technology is poised to make an impact by supporting the next round of breakthroughs… from proliferating devices that demand a more flexible business model, to new levels of IT capacity requirements that dictate highly scalable IT solutions, to competitive pressures for speed and innovation that call for better workflow, business analytics, and customer insight."

How Cloud Computing Will Save Hollywood tells the story of how Lionsgate is using the cloud to run its studio and compete with the "big guys" in the industry.
The cloud has been helping them deal with their dispersed global environments during film production: media complexity, an unprecedented influx of massive amounts of data, and unique data and workflow requirements.

Cloud Resolutions

Perhaps the cloud is not so mysterious after all. In a Gathering Clouds interview, David Linthicum (@DavidLinthicum) shared his perspective that businesses that adopt the cloud gain a strategic advantage: "… the companies who [adopt cloud] can turn on a dime…. These companies will be able to leverage their information in much more innovative ways." As industries increasingly digitize, the cloud is proving to be a useful partner to the CIO in an increasingly digital-first world. It is not surprising that KPMG's recent survey, Elevating Business in the Cloud, found that the top uses for the cloud are to drive cost efficiencies and to enact large-scale change, including enabling a flexible and mobile workforce, improving alignment and interaction with customers, suppliers, and business partners, and better leveraging data to make insightful business decisions. The key to success, as with any bright shiny new technology, is to apply the cloud to achieve critical business and mission objectives. As Jim Buczkowski of Ford Motor says, "The cloud is about delivering services, features, and information… to make the driving experience a better one."

So here's to accomplishing great things with the cloud! Just keep these tips from KPMG in mind as you resolve to make your cloud initiative a success:

- Make cloud transformation a continuous process.
- Drive cloud transformation from the top.
- Focus on strong leadership and engagement.
- Avoid silos.
- Measure success.

Plus one bonus tip from me: avoid the trap of "cloud for cloud's sake," lest we discover the biggest truth in Joni Mitchell's lyric is "So many things I would have done but clouds got in my way."

A version of this article first appeared in CMSWire.


OpenText and SAP Run Together for Exceptional Customer Impact

As we gear up for another year at SAPPHIRE, I'd like to reflect on the strong relationship that OpenText and SAP have shared for decades and look ahead to an exciting future together. For more than 20 years, we have worked together to empower the enterprise to manage its unstructured and structured information for business success. Our combined solutions make information more discoverable, manageable, secure, and valuable. Connecting SAP business suites with OpenText information suites delivers a powerful platform for innovation and opportunity. Together, we have:

- Transformed processing operations at Bumble Bee Foods from being 100 percent reliant on paper to being 100 percent digital, with automated processes reducing costs by over 50 percent and significantly increasing efficiency.
- Positioned Alagasco for future growth through increased sustainability and performance. Centralized information has helped break down organizational silos, speed up sales processes, and maintain business continuity.
- Created a culture of innovation at Distell by empowering employees to share best practices and collaborate. As well as increasing productivity, the organization has managed its intellectual capital more effectively to enhance and protect its brand.

As the world around us shifts to digital, the combined value that we deliver as partners grows exponentially. In celebration of this valued relationship, OpenText has been awarded the SAP Pinnacle Award for seven years in a row. Today, I'm pleased to announce that we have just received the 2015 SAP Pinnacle Award for "Solution Extension Partner of the Year," making OpenText a recipient for the past eight years. This category honors partners who co-innovate with SAP to deliver exceptional customer impact. OpenText was selected for this year's award based on our innovative approach, which enriches and extends the capabilities and scope of SAP products and applications.

OpenText was formally presented with the 2015 SAP Pinnacle Award at the SAP Global Partner Summit last evening, in conjunction with SAPPHIRE® NOW, SAP's international customer conference in Orlando, Florida. We're on hand at this event to showcase the latest advancements in joint OpenText and SAP releases. Look for us at booth #130 at the conference, where we'll be demonstrating the power and flexibility of products like SAP Document Presentment, SAP Invoice Management, and Tempo Box Premium.

We continue to build out the OpenText and SAP ecosystem. Our strategic solutions now support a broad range of SAP offerings — from the HANA database and analytics to Simple Finance and the HANA Enterprise Cloud. Recent releases include HANA integrations for SAP Document Presentment by OpenText and SAP Invoice Management by OpenText — both designed to deliver deeper insight and content value, enhancing an organization's process efficiency and its ability to make more strategic decisions. These extensions are available in the cloud, on premises, or as a hybrid solution. At Enterprise World 2014, our annual user conference, we introduced the OpenText Business Center for SAP Solutions, a platform for automating mission-critical business processes across the SAP business suite. We have now announced the general availability of this product. Using the OpenText Business Center for SAP, joint customers will be able to digitize entire processes in SAP — from capture to creation — without requiring complex configuration or programming resources.
In the Digital-First World, all of an organization's information and processes will be digital. This release is part of our commitment to simplify, transform, and accelerate business for the digital enterprise — enabling it to drive efficiency through digitization. In addition to expanding our support for SAP processes, we will also be introducing Tempo Box Value Edition and Tempo Box Premium, secure solutions for sharing and synchronizing both personal and SAP enterprise content across different platforms and devices. Both deliver tight integration with SAP Extended ECM, giving users greater freedom to share and work with business content on any device while still maintaining information governance and control. Tempo Box Value Edition and Tempo Box Premium enhance the SAP ecosystem by securely extending content tied to SAP business processes beyond the firewall to non-SAP users, including unlimited external users such as customers, suppliers, and partners across the business network.

The ability to manage unstructured information in the enterprise plays a pivotal role in digital transformation — and it is a key capability that the OpenText and SAP ecosystem delivers. Our partnership continues to drive product breakthroughs that produce impactful and tangible results for our customers. Together, we are laying the foundation for a Digital-First World for over 4,500 customers and 50+ million active users — across two decades of innovation and into the future. Read the press release. Visit our website.


Accessible PDF Discussion: Best Practices

Web accessibility is a blanket term covering every file format and content type on the internet, including PDF documents. Millions of Americans now receive web-delivered PDF versions of bank statements, government notices, utility and telecom bills, and other communications in lieu of mailed hardcopies. Until recently, none of these high-volume PDF documents were accessible, so blind and visually impaired people could not independently manage their affairs online, instead requiring sighted assistance from family, friends, or support workers.

This series of blog posts explains why demand is increasing for customer communication documents in Accessible PDF format, describes current industry practices for producing these documents, and introduces a new technology option that enables organizations to keep PDF as their default electronic format for high-volume, web-delivered documents of record while enabling full accessibility and usability of those PDFs for individuals who are blind or visually impaired. In a recent blog post, we examined the Web Content Accessibility Drivers. In this post, we will define and look at the best practices for using PDF, HTML, and Accessible PDF. For more information, download this recent white paper, Enterprise-Level PDF Accessibility: Drivers, Challenges and Innovative Solutions.

Accessible PDF

To maintain their independence, blind and visually impaired people have continued to participate in exception processes that replace paper (or PDF) with traditional alternate formats (e.g., Braille versions of their documents). Since alternate hardcopy formats can be labor-intensive to produce, blind and visually impaired people often receive important documents several weeks or months later than other customers. At best, an exception process delivers an inferior customer experience; at worst, it undermines human dignity and prevents people from making timely decisions about their finances, health, and other important affairs. For organizations, exception processes can be costly to set up and maintain. Clearly, customers and organizations alike would appreciate an affordable technology solution for producing high-volume, personalized communications in Accessible PDF format. This would allow tech-savvy customers to opt out of cumbersome exception processes and gain instant access to their documents, while organizations would simultaneously save money, meet regulatory standards, adhere to accessibility legislation, and serve their customers better.

Why PDF?

There are several compelling reasons why the PDF format is (and will continue to be) the de facto electronic document standard for high-volume customer communications such as statements, notices, and bills:

- PDFs provide an unchanging snapshot – Organizations require a document of record: a single, reliable visual presentation of business documents, including customer communications, at the time they are authored. In other words, a PDF document is the digital equivalent of a hardcopy.
- PDFs are portable – PDFs offer secure multi-platform support for viewing and managing documents on desktops, laptops, tablets, and smartphones, using Windows, iOS, Android, Linux, UNIX, and other operating systems.
- PDF is an open standard – Specifications for the PDF format were made freely available in 1993, and PDF was officially released as an open standard in 2008 (ISO 32000-1).
- PDF is commonly used as a regulatory standard for archives – In some industries, such as financial services and insurance, regulatory statutes require official electronic archive records to be in PDF format.
- PDF is already supported by extensive IT infrastructure – By all indications, PDF is not going away anytime soon. Government agencies and big companies have massive investments in IT infrastructure that produces millions of recurring PDF documents, such as personalized monthly notices and bills. The financial industry, an early adopter, has also invested heavily in technology that enables online presentment of PDF statements.

For these reasons (and others, no doubt), PDF is the format of choice for organizations that need to generate and distribute business communication documents to their customers.

PDF versus HTML

Closely tied to the question "Why PDF?" is the question "Why not HTML?" While HTML is ideal for delivering live, accessible web and mobile content, such as on-demand, customizable summaries of financial, telecom, or health data, it does not provide a single, reliable visual presentation of a document at the time it was authored. HTML supports the structure and semantics of content, but not its presentation. HTML also does not support archiving or portability, meaning HTML files are not "official" documents (i.e., documents of record) that can be stored and distributed as fixed entities, such as when a person provides documents to prove residence or creditworthiness. In industries such as health, finance, and insurance, some data contained on a customer statement often does not appear in the HTML presentation. HTML, when designed accessibly, is best suited for dynamic web transactions and visual presentations such as expense or timekeeping submissions, job or credit applications, summarized data trends, and marketing web pages (optimized for search engines). In short, PDF serves fixed documents of record, while HTML serves live, dynamic content.

The Difference between PDF and Accessible PDF

Unlike regular PDFs, Accessible PDF documents are designed to be barrier-free, universally accessible, and usable by people with or without disabilities. Although the top image layer of an Accessible PDF document is identical to that of a regular PDF, meaning the files are visually indistinguishable, the associated metadata inside the PDF is different. Accessible PDF documents contain distinct tagging, markup, and structure that enable assistive technologies, such as screen reader programs, to read the documents in the correct order, facilitate navigation, and provide complete information about visual elements such as images and graphs. Accessible PDFs include markup that defines:

- Logical read and tab order
- Text and headings
- Tables
- Non-text elements (e.g., images, graphs, figures)
- Lists
- Properties, fonts, and contrast
- Language
- Bookmarks
- Links (internal and external)

To meet a functional standard of accessibility, PDF documents should be designed to comply with WCAG 2.0, Level AA or higher. Ideally, documents are created in PDF/UA (Universal Accessibility) format, as defined by ISO 14289-1. In this post we defined PDF and Accessible PDF and examined their uses for effective customer communications. In the next blog post, we will take a look at how PDF documents are generated and what the accessibility implications are.
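As a rough illustration of what that markup amounts to in practice, the sketch below models a tagged document and checks a few of the items in the list above (declared language, logical reading order, alt text on non-text elements). It is a conceptual toy in plain Java, not a real PDF library; in an actual Accessible PDF, this information lives in the file's tag tree as defined by PDF/UA.

```java
import java.util.List;

// Conceptual sketch only: models a few of the markup requirements listed
// above. Real Accessible PDFs carry this as a tag tree inside the file.
public class TagCheck {

    record TaggedElement(String structureType,  // e.g., "H1", "P", "Table", "Figure"
                         int readOrder,         // position in the logical reading order
                         String altText) {}     // required for non-text elements

    record TaggedDocument(String language, List<TaggedElement> elements) {}

    static List<String> findAccessibilityGaps(TaggedDocument doc) {
        var gaps = new java.util.ArrayList<String>();
        if (doc.language() == null || doc.language().isBlank()) {
            gaps.add("Document language is not declared");
        }
        for (TaggedElement e : doc.elements()) {
            boolean nonText = e.structureType().equals("Figure");
            if (nonText && (e.altText() == null || e.altText().isBlank())) {
                gaps.add("Figure at reading position " + e.readOrder() + " has no alt text");
            }
        }
        return gaps;
    }

    public static void main(String[] args) {
        var doc = new TaggedDocument("en-US", List.of(
                new TaggedElement("H1", 1, null),
                new TaggedElement("P", 2, null),
                new TaggedElement("Figure", 3, null)));   // missing alt text -> flagged
        findAccessibilityGaps(doc).forEach(System.out::println);
    }
}
```

A screen reader relies on exactly this kind of structure: the reading order tells it what to speak in which sequence, and the alt text stands in for images it cannot describe on its own.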


What the Department of Homeland Security Knows About Data

What does the Department of Homeland Security know that you don't know? OK, that's a trick question. The answer (in this case) is this: it knows how to get from data to information.

On April 22, OpenText Analytics and Reporting hosted a webinar featuring Chris Chilbert, Chief Enterprise Architect at the Department of Homeland Security. With hundreds in attendance, Chilbert made a powerful case that data is worthless unless you can turn it into relevant information – and that information can then become knowledge, wisdom, and ultimately action. There's a human element in gaining information from data, Chilbert explained. People need to understand the organization they work in and the processes they use; those pieces help us put data in context and make better decisions. If you missed Chilbert's presentation, please check out this free replay.

After Chilbert's presentation, I talked for a few minutes about how OpenText Analytics and Reporting products support the Four Pillars of Business Analytics: data, people, processes, and technology. (Read more about the Four Pillars in this blog post and a free ebook.) Many webinar attendees asked follow-up questions – more than we had time to answer during the hour – so I've answered some of them here.

Q: Explain how OpenText Analytics secures data from unauthorized access.

A: OpenText Actuate Information Hub (iHub), our data visualization platform, provides multiple layers of security. These include authentication integration with Active Directory, single sign-on solutions, and two-factor authentication. iHub administrators can also control what a user can access by securing data at the row or column level, and by controlling page-level security in reports. To read more about the security features in iHub, check out the white paper The Ten Layers of Security in iHub.

Q: Can you give an example of social data and explain how iHub accesses it?

A: Twitter and Facebook are the most common social data sources today. OpenText Analytics has several social data connectors for iHub in our developer center, including the Facebook ODA Driver, BIRT Twitter Gadget, and Twitter JSON Search ODA. For other unstructured data – which social data is, in essence – we provide APIs that you can use to connect to and query any data source. You may want to take a look at these two blog posts from Kris Clark: Creating a Custom ODA and Use JSON as a Scripted Data Set.

Q: What mobile devices does iHub support?

A: We support all mobile devices by providing APIs that allow you to integrate content from iHub into your mobile application. This way, you get to select which mobile devices you want to support. For ideas and inspiration, take a look at two example applications we have created using iHub and its APIs: Aviatio, a mobile web application (GitHub link for Aviatio), and Gazetteer, an iOS hybrid application (GitHub link for Gazetteer).

Q: How does iHub work in a multitenant architecture model in a cloud environment?

A: Multitenant support is built into iHub. With multitenant support, each project instance within a cluster isolates several characteristics (including security, user and role management, and scheduling) so they can be managed independently, even as the instances all share cluster resources. This allows a single iHub installation to support multiple applications and projects with a variety of characteristics and requirements.
Developers use multitenant capabilities to build software-as-a-service (SaaS) solutions in the cloud. The benefits include lower cost, because hardware and software are used more efficiently; faster time to market, because adding a new project requires just a few administrative commands; and improved security, because each application instance has its own security processes. Technical benefits of using multitenancy with iHub include a reduced deployment burden, an eased administrative load, simplified system impact testing, and flexible backup and recovery.

Q: Does OpenText Analytics have an electronic scorecard to allow input of information from the bottom up, as well as from the top down?

A: Yes. With OpenText Analytics, users can input information at any level that may have a bearing on a specific key performance indicator (KPI). The flexibility of our scorecard function accommodates any performance framework, including Balanced Scorecard, Malcolm Baldrige, Six Sigma, and custom frameworks, and scales to meet the needs of large initiatives. The Briefing Book function of iHub scorecards allows users to create and deploy customized performance views. Briefing Book measures can be selected manually or filtered based on criteria such as performance, criticality, location, or ownership. Links to relevant standard and custom reports, maps, external documents, and websites can be added. Briefing Books can be defined as private or shared, and they include advanced security features to ensure that users only have access to the information they are entitled to see.

If you require more clarification on any of these answers, please leave a note in the comments. And be sure to check out the replay of my webinar with Chris Chilbert of the Department of Homeland Security.
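The row- and column-level security mentioned above is configured through iHub's administration tools rather than written in code, but the underlying idea is easy to sketch. Below is a hypothetical illustration in plain Java of filtering report rows and masking columns by role; none of these names correspond to actual iHub APIs.

```java
import java.util.List;
import java.util.Map;
import java.util.Set;

// Hypothetical sketch of row- and column-level security as a concept.
// iHub configures this declaratively; these names are illustrative only.
public class RowLevelSecurityDemo {

    record Row(Map<String, Object> columns) {}

    // Which regions a role may see (row-level), and which columns (column-level).
    static final Map<String, Set<String>> ROLE_REGIONS =
            Map.of("east-analyst", Set.of("EAST"), "admin", Set.of("EAST", "WEST"));
    static final Map<String, Set<String>> ROLE_COLUMNS =
            Map.of("east-analyst", Set.of("region", "revenue"),
                   "admin", Set.of("region", "revenue", "margin"));

    static List<Row> secure(List<Row> rows, String role) {
        Set<String> regions = ROLE_REGIONS.getOrDefault(role, Set.of());
        Set<String> cols = ROLE_COLUMNS.getOrDefault(role, Set.of());
        return rows.stream()
                .filter(r -> regions.contains(r.columns().get("region")))  // row-level filter
                .map(r -> new Row(r.columns().entrySet().stream()          // column-level mask
                        .filter(e -> cols.contains(e.getKey()))
                        .collect(java.util.stream.Collectors.toMap(
                                Map.Entry::getKey, Map.Entry::getValue))))
                .toList();
    }

    public static void main(String[] args) {
        List<Row> data = List.of(
                new Row(Map.of("region", "EAST", "revenue", 100, "margin", 0.21)),
                new Row(Map.of("region", "WEST", "revenue", 250, "margin", 0.18)));
        System.out.println(secure(data, "east-analyst")); // EAST row only, margin hidden
    }
}
```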


3 Questions: John Johnson of Dell Services Discusses Analytics and Reporting for IT Services

Dell Services, the global system integrator and IT outsourcing arm of Dell, provides support, application, cloud, consulting, and many other mission-critical IT services to hundreds of organizations worldwide across many sectors. The company collects and manages massive amounts of data concerning customer infrastructures, from simple, high-frequency metrics (such as CPU, memory, and disk utilization) to helpdesk tickets and service requests, including hardware and software asset information. Using this data to understand and respond to customer needs before they become a problem falls to John M. Johnson, Manager, Capacity Management at Dell Services. Johnson recently spoke with OpenText about the type of data Dell Services collects, the evolving ways his customers consume that data, and how he uses this data to plan for the future.

OpenText: You have a 12-terabyte data warehouse of performance metrics on your customers' systems and applications. Tell us about that data and how you use it.

Johnson: Our infrastructure reporting data warehouse has been around for seven-plus years. It collects aggregated information about more than a hundred customers, which is just a segment of our base. Originally we started the data warehouse to meet legal retention requirements, and it evolved to become the repository for ticketing data, service request data, and SLA performance data. Now it's an open warehouse where we continually add information related to our services delivery. It's fantastic data, and a fantastic amount of data, but we lacked two things: an automated way to present it, and a consistent process behind its presentation. My twenty capacity planners were spending too much of their valuable time churning out Excel reports to present the data to our clients, and far too little time understanding the data. A little less than two years ago we started using open source BIRT for report automation, to eliminate manual errors and consistency issues, and to remove the "personal analysis methods" that each engineer was introducing to the process. The next maturing of the process was to leverage iHub to further automate report generation, delivery, and presentation.

OpenText: Some of your customers and users get dynamic dashboards, while others get static reports. How do you decide who gets what?

Johnson: That's an easy answer: it begins with contract requirements. Those expectations are drawn out and agreed upon by legal counsel on both sides. Once those fundamental requirements are met, the question of "Who gets what?" is based simply on how they need and want the data. I have three customer bases: my services customers, my delivery teams, and peer technical teams who have reporting requirements. And everybody wants a different mix of data. DBAs want to see what's going on with their infrastructure – their top databases, hardware configurations, software versions and patch levels, cluster performance, and replication stats. Other teams, such as service delivery managers and the account teams, want to see pictures more on a financial level. They need answers to standard questions like, "What has the customer purchased, and is that service meeting the customer's expectations?" In some cases we handle customer applications in addition to their infrastructure.
In those cases, the customer needs reports on uptime, availability, performance, user-response time, outstanding trouble tickets, number of users on each system, and various other application metrics married with the infrastructure data. Those are all static reports we typically deliver on a monthly schedule, but we're looking to make that particular reporting a daily process with iHub Dashboards. Dashboards will serve three major groups:

1. Application owners, who will see what's going on with their infrastructure and applications in real time
2. Our service managers, who coordinate the daily delivery of our services around the world
3. Senior leaders at the director, VP, and CxO levels

That last group has much less interest in a single trouble ticket or server performance, but they do care about service levels and want to know how the infrastructure looks on a daily basis. I think executive-level dashboards will be big consumers of data in the future, so we're evolving and maturing our offering from a technical level – where we have traditionally been engaged – to the business level. Because that's where people buy.

OpenText: That is an ambitious plan to extend your reporting platform. How do you prioritize your projects, and what advice would you give to peers with similar plans?

Johnson: There's one overall strategy I try to employ with all my applications: apply modern, agile software development methodologies to them. You have to stay up to date on software patches and capabilities. You have to keep your platform relevant. We keep updates coming rapidly enough that our customers don't have to create workarounds or manual processes. Fortunately, iHub works well with how we manage upgrades. We manage reports as a unit of work inside of iHub, so I don't have to make monolithic changes. When I'm prioritizing projects, I first ask, "Who is my most willing customer?" The customer who's going to work with you provides the quickest path to success, and that success is the foundation upon which you build. Second is to expect to get your hands dirty and do a lot of the lifting. Most customers are always going to have trouble verbalizing what they need and how they want data to look. So you have to just get that first visualization done and ask, "Does this data, presented this way, answer your needs?" Don't be afraid of responses such as, "That is not what I wanted at all. I told you I wanted a report" – that's one of the most frustrating things about the job. You have to accept that you are a statistical artist, and visual presentation is something you own – embrace it and drive it. Fortunately, the ease of developing and managing data with iHub means we can respond to these inputs rapidly.
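For readers curious what the report automation Johnson describes can look like in code, here is a minimal sketch using the open source BIRT Report Engine API: it loads a report design and renders it to PDF. The design file name and parameter are illustrative assumptions, not Dell's actual reports, and a production deployment on iHub would schedule this server-side rather than run it by hand.

```java
import org.eclipse.birt.core.framework.Platform;
import org.eclipse.birt.report.engine.api.*;

// Minimal BIRT Report Engine sketch: run one report design and render it to
// PDF. File and parameter names here are illustrative.
public class RunCapacityReport {
    public static void main(String[] args) throws Exception {
        EngineConfig config = new EngineConfig();
        Platform.startup(config);
        IReportEngineFactory factory = (IReportEngineFactory) Platform
                .createFactoryObject(IReportEngineFactory.EXTENSION_REPORT_ENGINE_FACTORY);
        IReportEngine engine = factory.createReportEngine(config);
        try {
            // A .rptdesign file authored in the BIRT report designer
            IReportRunnable design = engine.openReportDesign("capacity_summary.rptdesign");
            IRunAndRenderTask task = engine.createRunAndRenderTask(design);
            task.setParameterValue("CustomerId", "ACME-001"); // hypothetical report parameter

            PDFRenderOption options = new PDFRenderOption();
            options.setOutputFormat("pdf");
            options.setOutputFileName("capacity_summary.pdf");
            task.setRenderOption(options);

            task.run();   // execute the report and write the PDF
            task.close();
        } finally {
            engine.destroy();
            Platform.shutdown();
        }
    }
}
```

Automating a run like this per customer, per month, is precisely what replaces the hand-built Excel reports Johnson mentions: the design is built once, and only the parameters change.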


Accessible PDF Discussion: Web Content Accessibility Drivers

Under pressure from advocacy groups and individuals, organizations such as government agencies, retailers, utilities, telcos, banks, credit card companies, non-profits, and institutions of higher education are now focusing on web and web content accessibility. This series of blog posts explains why demand is increasing for customer communication documents in Accessible PDF format, describes current industry practices for producing these documents, and introduces a new technology option that enables organizations to keep PDF as their default electronic format for high-volume, web-delivered documents of record while enabling full accessibility and usability of those PDFs for individuals who are blind or visually impaired. As a first step, we will look at web accessibility drivers. For more information, download this recent white paper, Enterprise-Level PDF Accessibility: Drivers, Challenges and Innovative Solutions.

Web Accessibility Drivers

Currently, over 285 million individuals across the world have visual impairments, including more than 20 million Americans with some form of vision loss, ranging from low vision to blindness. These individuals are valued customers — consumers with buying power — and employees, who have been swept along with everyone else on the internet wave that is revolutionizing how we live, work, play, and communicate. It is now a fact of daily life that organizations of all shapes and sizes urge and incentivize people to interact with them online through self-service websites (which are far less costly than traditional live customer service operations). To thrive and survive in the 24/7, on-demand digital world, blind and visually impaired people have embraced assistive technologies such as screen reader software to gain access to electronic, web, and mobile content. With an aging population, adoption and use of these technologies is exploding, driving demand for an online user experience comparable to that enjoyed by sighted customers. Laborious exception processes, in which blind and visually impaired customers request documents in traditional formats such as Braille (and then wait for them to be reproduced and delivered in the mail), are falling into disfavor as consumers become increasingly tech savvy, wanting (and expecting) instant access to both their personal account information and widely available digital content.

Caught off guard by this sudden technology-induced shift in customer expectations, a large segment of the private and public sectors has yet to offer websites and web content in a format that is universally accessible to all customers, with or without vision loss. Responding to public pressure, lawmakers are amending legislation and regulatory standards to address web accessibility issues. Also, a mounting number of successful lawsuits and settlements related to inaccessible websites and web content, including web-delivered personal communications, is helping to establish a "new normal" for accessibility on the internet.

Accessibility Legislation and Standards

Listed below are a few of the laws and standards that affect web accessibility:

- The Americans with Disabilities Act (ADA) states that government agencies, public accommodations, commercial facilities, and transportation organizations must take reasonable steps to provide access to their services.
- Section 508 of the U.S. Rehabilitation Act mandates how the U.S. Federal Government procures, develops, uses, and maintains Electronic and Information Technology (EIT).
Private-sector federal contractors and vendors must also comply with Section 508 in order to do business with federal agencies, or to deliver federally funded programs or services. Section 255 of U.S. Telecommunications Act requires telecommunication providers to make products and services, including billing services, accessible. Anticipated Updates to the Rehabilitation Act and Telecommunications Act are on the horizon. For Section 508, the goal is to broaden the scope to include Information and Communication Technology (ICT) such as mobile technology, establish a clear standard for accessible content, and achieve a stronger harmonization with other standards. Twenty-First Century Communications and Video Accessibility Act (CVAA) was signed into law in 2010, and is intended to help people with disabilities access broadband, digital, and mobile innovations such as the internet, television programming, mobile content, and emergency information. The Web Content Accessibility Guidelines (WCAG) was created by the non-profit organization World Wide Web Consortium (W3C). They help organizations and individuals develop truly accessible web content. This globally relevant set of standards and practices has been adopted by many countries, providing a solid foundation for web accessibility legislation. Legal Activity around Web Accessibility With accessibility legislation lagging behind the needs of blind and visually impaired people, national advocacy groups have brought web accessibility to the fore by suing highly visible organizations for failing to provide accessible websites and web content to their customers. Many judges have agreed with the interpretation of the ADA’s definition of a “place of public accommodation” to include websites. As a result, organizations have been compelled to retrofit their websites (and all electronic content, including PDF documents) with accessibility features. In conclusion, we believe that with the maturity of the legislation and standards, and the rise in litigation, it is in the best interests of organizations to understand and implement Web Accessibility.


EDI and B2B Insights – What Kinds of Analytics Do I Need?

In my last blog, Why Your EDI and B2B Processes Need Analytics, I provided examples of the kinds of B2B analysis questions you need to be able to answer in order to improve your business’s competitiveness. I have found three prerequisites for creating value from your B2B data via analytics:

A foundation of good-quality B2B data for analysis. If you are already automating your B2B processes, you likely have some of this critical data already, on which your analysis can be based.

A definition of what you want to measure and how the results can be visualized in a way that enables you to understand trends, markets, customers and suppliers. I provided some examples of these in that post.

The analytics tools to deliver the B2B data visualizations you need and to help you engage decision-makers.

Below is a “ladder” of the types of analytics capabilities, in sequence from the fundamental capabilities at the bottom to the advanced capabilities at the top. Most companies are beginning to incorporate the first few capabilities on the bottom rungs of the ladder, and will need to plan how to incorporate those at the top in order to compete successfully in their chosen markets.

Standard Reports – These are pre-defined, configurable reports that provide key information about the files and transactions you exchange with your trading partners. They typically include powerful capabilities to sort, filter, save, schedule and distribute. For example, you may wish to see a monthly report of all orders received from all your customers.

Ad hoc Querying & Reporting – This is the capability to search and generate custom reports on your B2B transactions by document type, trading partner, date, time, status, and more. You can define the fields to include on the report and tailor it to your specific needs.

Dashboards and Alerts – These provide both the timely “track and trace” data needed to address exception conditions and the summary data needed to identify performance trends, drill down to view specific details and export the data through integration with Microsoft Office. For example, below is a transaction dashboard that provides a visual summary of transaction activity and exception situations, along with volume trends by document type and trading partner. At a glance you can see which transactions need your immediate attention (e.g. purchase orders that have not been acknowledged), which documents account for the highest volume (e.g. invoices are in second place after carrier shipment statuses), and which trading partners account for the highest transaction volumes.

Configurable Dashboards – These provide the ability to integrate and combine EDI and non-EDI data from various applications and gain insights into supply chain trends concerning reliability, responsiveness, flexibility, trading partner performance, e-invoicing performance, and more. In addition, these dashboards can be integrated with workflows based on user-defined rules. For example, if the dashboard shows that order volume from a customer has dropped by 20% over a month, it can trigger a user-defined business process and/or send a notification to specific users in the organization (a minimal sketch of such a rule appears at the end of this post).

Predictive Modelling – This is an advanced capability that few businesses have today, but which is now possible with the latest analytics technology. Sophisticated statistical models that analyze B2B data from various applications can help you forecast trends and needs in inventory management, logistics and all other areas requiring strategic planning.

Scorecards – These are visual displays of Key Performance Indicators (KPIs), such as order acceptance, invoice accuracy, delivery punctuality, ASN timeliness, and fill rate. A scorecard enables you to measure, evaluate and analyze supplier performance. This information can then be used by buyers and suppliers during negotiations when justifying business awards and pricing.

Benchmark / Index – This capability benchmarks an organization’s performance against the industry and provides insight into the processes where you trail or exceed your competition, enabling your organization to take appropriate action.

If this blog has interested you and you would like to learn more, click here to watch this new on-demand webinar, Using Analytics to Unlock the Value of your B2B Data.
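As a concrete illustration of the user-defined rules mentioned under Configurable Dashboards above, here is a minimal sketch, in plain JavaScript, of a month-over-month volume-drop check. Every name in it (the function, the data shape, the notification stub) is hypothetical; it shows the pattern, not an actual OpenText analytics API.

```javascript
// Hypothetical sketch of a user-defined dashboard rule: flag customers whose
// order volume dropped by more than 20% month-over-month, then notify someone.
// All names and the data shape are invented for illustration.

const DROP_THRESHOLD = 0.20; // 20% month-over-month decline

// volumesByCustomer: { "ACME Corp": { lastMonth: 1250, thisMonth: 900 }, ... }
function findVolumeDrops(volumesByCustomer) {
  const alerts = [];
  for (const [customer, v] of Object.entries(volumesByCustomer)) {
    if (v.lastMonth === 0) continue; // skip new customers (avoid divide-by-zero)
    const change = (v.thisMonth - v.lastMonth) / v.lastMonth;
    if (change <= -DROP_THRESHOLD) {
      alerts.push({ customer, change });
    }
  }
  return alerts;
}

// Stand-in for triggering a business process or sending a notification.
function notify(alert) {
  console.log(
    `Order volume for ${alert.customer} fell ` +
    `${Math.round(-alert.change * 100)}% month-over-month`
  );
}

findVolumeDrops({
  "ACME Corp": { lastMonth: 1250, thisMonth: 900 }, // -28%: triggers an alert
  "Globex":    { lastMonth: 400,  thisMonth: 410 }, // +2.5%: no alert
}).forEach(notify);
```

In a real dashboard product the threshold, the comparison window and the notification target would all be user-configurable; the point is simply that the rule runs against data the B2B platform already collects.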


A Funny Thing Happened on the Way to Real-Time Payments

There is a tremendous amount of confusion over the various proposals and initiatives to make US domestic payments move and settle more quickly. How do Same Day ACH (Automated Clearing House), the Fed’s Task Force on Faster Payments, and The Clearing House’s plan to develop a real-time payment system relate to each other? It’s hard enough for a so-called payments expert to keep up with, never mind someone who has an actual business to run!

In the interest of full disclosure, I spent more than a dozen years starting in the late 1980s focused almost exclusively on ACH product development, a fact that seems a little embarrassing in hindsight. More than 15 years after expanding my horizons, most banks still rely on outdated technology that limits the processing of ACH transactions to a handful of batch windows per day, or fewer. After all, these banks aren’t running real-time core deposit systems, so what’s the point of processing payments more quickly? (I hope you can come up with at least five reasons.) No wonder that multiple efforts to expedite the processing of ACH transactions since 2010 have failed. The world is moving faster and alternative payment solutions are available to transfer funds quickly and efficiently, but ACH remains a next-day solution at best (with some very limited exceptions).

As we approach NACHA’s upcoming ballot on Same Day ACH (expected this quarter), let’s give credit where credit is due. If approved, a phased implementation of Same Day ACH settlement would begin in late 2016, with additional capabilities coming online in 2017 and 2018. Let me be clear: there absolutely is value in moving to same-day settlement for ACH transactions. But it isn’t enough. The US ACH network’s efficiency, ubiquity and ability to carry large amounts of data make it a key part of our future payments infrastructure, and not all payments need to be made in real time. NACHA has identified 10 use cases for same-day settlement, among them same-day payroll, expedited bill payment, B2B payments and account-to-account payments. For these use cases, Same Day ACH is a big move forward. Let’s hope that NACHA has garnered sufficient support to overcome the objections of members who blocked the 2012 Same Day ACH proposal.

Having said that, same day is not real time. Let me repeat: Same Day does NOT equal real time. For a growing number of use cases, the establishment of a new, efficient, low-cost payment network that would allow US consumers and businesses to move funds in real time is a no-brainer. Most of the developed world and many emerging-market countries have already established such systems or are in the process of building them.

Please indulge me by letting me use my own consulting business as an example. I maintain a personal deposit account at one bank and a business deposit account at a different bank. When I am paid for a consulting engagement (my only source of incoming cash), the funds go into my business account, usually by ACH but sometimes by check or wire despite my efforts to get clients on board with ACH. I need most of those funds to be transferred into my personal account to cover my non-business expenses. Most of the time, scheduling a payment between accounts in advance is perfectly fine. However, if it happens to be… let’s say April 15… and my quarterly estimated federal and state tax payments are due and I don’t have sufficient cash in my personal account, it would be ideal if I could move those funds on a real-time basis from one bank to another. Even Same Day ACH would help in a case like this, assuming that I had notification early enough in the day that a client invoice had been paid. But what happens today? I have to use my business bank’s online bill pay module to initiate a future-dated payment to my personal account (it usually requires 2-3 business days, and payments are settled through the ACH network). I can take a chance and hope that my incoming payment will be received in time to cover the bill pay transaction by the 15th, or I have to move funds from an investment account into my personal deposit account to make sure I can cover the tax payments. This is inefficient, risky and takes time away from my ability to do important things like writing blogs about payments.

So what are the prospects for real-time payments in the US? The Fed’s newly formed Faster Payments Task Force will have its first meeting in a few weeks, with a goal of identifying viable alternatives for creating such a network and establishing a rules framework… by the end of 2016. No timelines beyond 2016 have yet been established. I applaud the effort and plan to participate personally on the task force. But I do wonder whether a task force with such a broad set of constituents is the most effective way to make this happen. In my experience, some organizations will join the task force with the express purpose of slowing it down (if not shutting it down). In the meantime, The Clearing House (the operator of CHIPS and EPN, owned by 24 of the largest banks) has committed to building a real-time payment system, the timeline for which has not been announced publicly. Their effort will seemingly take place on a parallel track to the Fed’s Task Force, but at some point the two will need to come together, at least from a governance perspective. Perhaps The Clearing House will become the de facto real-time payment system, or one or more competitive schemes may develop out of the work being done by the Fed.

So what’s the bottom line? On a personal level, I am hopeful that by the third quarter of 2016 I will be able to transfer same-day funds between my business and personal accounts using ACH. Real-time payments… I believe they will happen, but it will be a longer and more difficult road before we see results. When I can pay my share of a group dinner by making an instant payment on my mobile phone to a friend who insists on using her credit card to earn loyalty points, I’ll be happy. When a young family member (to remain anonymous) calls on a Sunday morning and says he needs some cash in his account to buy a birthday gift for his father and I can send that payment immediately, I’ll be happy. When I am able to manage my liquidity easily and efficiently, I’ll be happy. After all, consumers and consultants from Singapore to Chile to Denmark to Nigeria (and about 15 other nations) can already do this today. It’s long past time for us to get on the express train.

On a related note, if you are attending next week’s NACHA Payments 2015 Conference, please don’t miss OpenText’s interactive Topical Talk on Tuesday at 9:15am in the Great Hall Pre-Function area. You can share your thoughts with other leaders in the payments space about how to prepare for the changes that are coming.


Sports on the Web: Newer, Better, Global

Are you a sports fan? Are you one of the lucky ones whose sports are carried on your local station at convenient times, or are you among the growing number of fans who follow sports that are more often carried in other countries, in their time zones, and not broadcast on local television stations? Modern sports viewing is now often enabled by strong web experiences, and a growing number of fans can enjoy their favourite sports on the web, at times of their convenience, regardless of where in the world the sport is played live. Many sports franchises are now taking advantage of fans’ shift from broadcast viewing to streaming, and there are some great sites powered by Web Experience Management software such as OpenText’s Customer Experience Suite. Cloud-enabled apps provide viewing and stats on all types of devices, and allow viewers to enjoy the sport and the commentary that goes with it, from both the official commentators and the other viewers.

As reported in a recent press release, UK-based Aberdeen Football Club (if you are from North America, think soccer) recently remodeled their site with OpenText’s Experience Suite to include real-time stats, commentary, a Twitter feed, pre- and post-game analysis and real-time photos. The omni-channel experience is critical, as 58% of their fans enjoy the site on mobile devices, often while they are watching the game live in the stadium. Check out www.afc.co.uk to see the latest.

This time of year sees some of my favourite sports back live and online. While the sites stay up all year sharing info, they come alive when the teams are back and playing again. It is rugby season again, and the 6 Nations site www.rbs6nations.com once again brought the tournament and all the news to the locals and to those of us in other parts of the world. The site is great, with game info and pictures, and my favourite feature is the running list of clips that summarize some of the great moments. Even if you don’t watch the full games, you can get a pretty good idea of the play, the emotion and certainly the outcomes from this site. And of course, my favourite Australian Rules Football (AFL) season has just started, so I will be spending increasing time on their site www.afl.com.au checking out the predictions, results and pictures, and watching highlights or full games at times that are convenient wherever in the world I am. Watching with two screens is a bonus, so I have the player info and stats handy while streaming the games from a second device.

In the immediacy era we live in, sports on the internet can be consumed at our leisure and our convenience. Thanks to strong web experience sites, we now have a PVR-like option for watching the games, and the apps provide extras like real-time stats and commentary. There is no substitute for the excitement of live sport, but when you can’t be there in person, web experiences are now a great alternative.


OpenText Featured on Bloomberg Business

For this year’s global Innovation Tour, we’ve taken our Digital-First World message on the road. The tour is already underway, with members of our executive leadership team presenting in major cities in Asia Pacific, including Mumbai, Tokyo, Sydney, and Singapore. As one of many highlights of the tour, OpenText CMO Adam Howatson was featured on Bloomberg Asia’s Brandstanding in Singapore. The interview covers a number of topics, ranging from the value of information in creating new services and opening up new revenue opportunities, to why the cloud is important for brands and businesses, and how OpenText is rewriting the rules of business for successful digital transformation. According to Howatson, “Being able to connect the way [the business is] represented on social platforms and the way that brands are represented and shared on the Internet and through our connected society, through to back office operations, to manufacturing and internal business processes… It truly is the organizations who are able to integrate that experience and that flow of information who will outperform their competitors in every industry.” Bloomberg Business delivers business and markets news, data, analysis, and video nationwide, featuring stories from Businessweek and Bloomberg News. As a global business network, Bloomberg has over 22 million visitors to its web video assets. Watch the video. Learn more about the Digital-First World by downloading the book Digital: Disrupt or Die.


3 Questions: Adam Dennison Discusses Analytics for CIOs

Adam Dennison, senior vice president and publisher of CIO and IDG Enterprise’s events, boasts a 15-year career in technology publishing. CIO Perspectives, the company’s regional event series, brings CIOs and senior IT executives together with top technology vendors for a day of thought-provoking, action-oriented discussions. “At a high level, we cover three main topics: strategy, innovation and leadership,” Dennison explains. “More specifically, emerging technologies, digital transformation, and security are three major initiatives for CIOs today.” We asked Dennison (@adamidg) how CIOs can leverage analytics in their work.

OpenText: You’re moderating a panel at CIO Perspectives called “Straight Talk about SMAC” in which you’ll share key stats on trends and investments in social, mobile, analytics and cloud. Can you give us a preview of a stat you find compelling or timely?

Dennison: Our research and discussions with CIOs show that currently 32 percent of enterprise IT investment is slated for edge technologies (like SMAC) vs. core technologies (like legacy systems and infrastructure). However, this will shift to 45 percent within the next one to three years. Additionally, 54 percent of enterprise CIOs plan to increase their spend with “newer vendors” within the next 12 months.

OpenText: Late last year you published CIOs Must Market IT’s Value, arguing that internal marketing of IT can help businesspeople understand the value of IT departments. Meanwhile, marketing efforts are increasingly driven by data and metrics. How do the two relate?

Dennison: My column on CIOs marketing IT to the business was more broad-based than just data and analytics. However, if asked to focus on the data aspect of it, I would say showing the business exactly what they are paying for is step one in the process. Explain to the business units through data, facts and transparency that they are getting true value from enterprise IT vs. a “do it yourself” solution.

OpenText: A lot of industry research – and anecdotal evidence, including a recent CIO Quick Takes – shows that CIOs are eager to use analytics to improve business insights. What drives CIOs’ push for analytics?

Dennison: What I read from the CIO Quick Takes was a laser focus on customers. Our CIO research shows us that 41 percent of CIOs’ current spend is on external customer experience, relationship and interaction. The only way to get closer to your customers and better serve them is to gather data on them and use that to your advantage. Our 2015 State of the CIO research also shows us that Data/Analytics is the number one tech priority for the next 12 months that will drive investment. Mobile and Cloud came in a close second and third respectively, but it’s all about data today.

Adam Dennison photo courtesy of IDG Enterprise. Used with permission.


6 F-Type Examples in DevShare You Can Use Now

With the release of OpenText Information Hub, Free Edition (formerly BIRT iHub F-Type), we added corresponding content to the DevShare portion of the Developer Center. One section of that content, F-Type Examples, houses sample apps and working examples developed specifically for iHub, Free Edition. When it comes to working with a new product, or using aspects of a product I know well but have not used before, I find examples valuable for getting started. Whether I need ideas for what is possible or confirmation that I am going down the right path, examples are particularly useful. Currently, six examples have been uploaded to the DevShare to demonstrate various ways that iHub, Free Edition can help you. They range from simple, well-designed reports to a full sample application. In this post, I will step you through these six examples and detail their various aspects and features.

The Six Examples

Call Center Example Dashboard – With the large number of call centers around the world and companies increasingly focusing on customer experience, business leaders demand metrics about call center performance, and expect those metrics on quick-reading, real-time dashboards. This example demonstrates how iHub, Free Edition can do exactly that by creating a dashboard to quickly and effectively analyze the metrics of a call center. It utilizes a well-designed data object that uses CSV files as the data source, includes multiple data sets with properly configured column properties (format, header, etc.), and uses a well-designed data model with proper categories and hierarchies. The use of multiple data sets within the data object is part of what makes this a well-designed data model: when used against a relational database, multiple data sets allow for optimal query trimming within the datadesign file, and the generated data object will have better compression. The dashboard itself has two tabs: Call Analysis (screenshot below) displays an interactive dashboard with drill-downs and selectors, and Calls By State displays a United States map; you click a state on the map to launch a sub-report for that state.

InfographiX Examples – This example can create three different infographics. These samples can inspire you with ways to display your own data in highly visual formats. The three included samples are:

Classic Models – This sample uses the Classic Models sample database that comes with iHub, Free Edition and displays information about the customers in the database by geography, purchasing habits and more in graphic form. The result is shown below.

Storage Statistics – This sample uses static data and displays statistics for data in a cloud-based storage system – for example, data formats (audio, video, photos, etc.), traffic by data type, and downloads vs. uploads.

Tornado – This sample uses a data object for its data source. It displays statistics about tornadoes in the United States, such as the number of tornadoes per month, relative numbers of tornadoes by strength (on the Enhanced Fujita, or EF, scale) and fatalities by state.

Chicago Crimes Example Dashboard – As a developer and a user, I am always interested in seeing location information integrated into applications. I find it particularly useful when a map is used to visualize location information, as it allows you to quickly analyze important geographic information about a data set. This web application demonstrates how we can interact with location information. It uses the Custom Visualizations feature in iHub 3.1 to display a Google Map of the Windy City, and adds custom icons and marker clustering to show crime data. It also provides a data selector gadget so users can choose which types of crimes to display. This example, which can be deployed to iHub as an application, shows you how to display a dashboard when opening a BIRT application.

Simple and Well Designed Reports – This example includes two well-designed reports that are simple and elegant. By “well designed,” we mean that elements of the data model have been configured to use libraries for styles, which unifies the whole report and makes its appearance consistent, and that the reports conform to industry standards for layout and design. If you are new to report development, these are good examples to use as a starting point for developing quality reports with a uniform look and feel. If you are more experienced, they are still worth going over, because you may find a feature you have not used before and they will reinforce good practices on the features you already know. These reports demonstrate how to use a report library, and they highlight several features – such as alternate rows, hyperlinks, highlighting, aggregations and formatting – that help you display complex data more efficiently and effectively. They also show how to use themes to standardize the look and feel of charts and tables, and demonstrate parameters with drop-down lists that use both static and scripted default values. A screenshot of one of the reports appears below.

BIRT with the Power of jQuery – This includes two examples that use jQuery to add functionality to BIRT reports (a generic sketch of both techniques follows these two descriptions).

Expand and Collapse – This example uses jQuery to automatically expand and collapse sections in a report. A plus (+) or minus (-) is added next to the report elements that, when clicked, will expand or collapse various sections of the report. For example, say you have organized your customers by region and country. With this added interactivity, you can start with a compact table and then expand it into the desired region and country in real time. This enables you to limit the displayed results without the need for filtering or re-rendering the report.

Highlight on Mouse Hover – This example (shown below) uses jQuery to highlight rows and columns when the user’s mouse hovers over them, which helps users navigate very wide or very tall tables. The jQuery used to achieve this behavior updates the CSS properties of the various elements, which gives you flexibility when modifying the styling of the report elements. The effects in this example are primarily achieved by modifying the background color, so any color you can specify through standard CSS can be used for the highlight.
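The DevShare source for these two examples is not reproduced here, but the underlying jQuery techniques are generic. Below is a minimal sketch of both behaviors against an ordinary HTML table, using invented class names (group-header, toggle, hl-row, hl-col); it illustrates the approach, not the actual example code, which targets BIRT-generated markup.

```javascript
// Minimal sketch of both jQuery techniques, using invented selectors;
// the DevShare examples apply the same ideas to BIRT report output.
$(function () {
  // 1. Expand/collapse: clicking a group-header row toggles its detail rows
  //    and flips the +/- indicator, with no re-rendering of the report.
  $("tr.group-header").on("click", function () {
    var header = $(this);
    header.nextUntil("tr.group-header").toggle(); // show/hide detail rows
    var toggle = header.find("span.toggle");
    toggle.text(toggle.text() === "-" ? "+" : "-");
  });

  // 2. Highlight on hover: add/remove a CSS class on the hovered row and
  //    on every cell in the hovered column.
  $("td").hover(
    function () {
      var cell = $(this);
      cell.parent().addClass("hl-row"); // highlight the whole row
      var col = cell.index();           // column position of this cell
      cell.closest("table").find("tr").each(function () {
        $(this).children("td").eq(col).addClass("hl-col");
      });
    },
    function () {
      $(".hl-row").removeClass("hl-row");
      $(".hl-col").removeClass("hl-col");
    }
  );
});

// Corresponding CSS; the highlight is just a background color:
// .hl-row td, td.hl-col { background-color: #fffbcc; }
```

Because the highlight is applied through a CSS class rather than inline styles, changing the highlight color is a one-line stylesheet edit.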
City Taxi Sample App – This example, as the name implies, is a sample application for an imaginary urban transportation company. It demonstrates how BIRT can power compelling embedded analytic applications. Features demonstrated by this application include:

Information graphics for displaying data in visually appealing formats
Columnar reports with filtering capabilities
Interactive reports that end users can modify from their browsers
Geospatial visualization built into a report
Dashboards designed for Big Data analysis

Additionally, this application demonstrates how BIRT can seamlessly add analytic capabilities to existing web applications. All content is presented as part of the HTML web pages, providing a consistent overall user experience for data analysis. The app also demonstrates how to use the JavaScript API to embed content with minimal coding (a rough sketch of this embedding pattern appears at the end of this post).

As you can see, this new DevShare section provides a unified area where iHub, Free Edition examples can be shared and distributed. The examples already available range from simple, elegant reports for people who are just getting started to full sample applications for developers who are ready to integrate BIRT into their own applications. Thank you for reading this post on the F-Type Examples in DevShare. We encourage you to download and work with the examples, then tell us what you like and what more you want to see.
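For orientation, here is the general shape of a JSAPI embed as I recall it from the Actuate/iHub documentation. Treat the class and method names (actuate.load, actuate.initialize, actuate.Viewer, setReportName, submit) as assumptions to verify against the JSAPI reference for your iHub version; the server URL, credentials and report path are placeholders.

```javascript
// Assumes the page already includes the JSAPI script from the iHub server,
// e.g. <script src="http://ihub.example.com:8700/iportal/jsapi"></script>,
// and contains an empty <div id="viewerContainer"></div>.
// All names below are recalled from documentation, not verified signatures.

actuate.load("viewer"); // load the viewer component of the JSAPI

actuate.initialize(
  "http://ihub.example.com:8700/iportal", // iHub/iPortal URL (placeholder)
  null,                                   // request options
  "anonymous",                            // user name (placeholder)
  "",                                     // password (placeholder)
  runReport                               // callback invoked once ready
);

function runReport() {
  // Render a report design into the container div with minimal coding.
  var viewer = new actuate.Viewer("viewerContainer");
  viewer.setReportName("/Applications/Demo/sample.rptdesign"); // placeholder path
  viewer.submit();
}
```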


OpenText Study Proves that B2B Integration Significantly Improves Supply Chain Performance

Over the past few months I have posted a few blogs highlighting the results of a new OpenText-sponsored study by IDC Manufacturing Insights. The study demonstrated a direct correlation between increased adoption of B2B integration technologies and improved supply chain performance. In fact, take a look at how key supply chain metrics improve through the adoption of B2B integration technologies. To wrap up this project, I wanted to highlight how you can download further information about the study. The following link will give you access to a recorded version of the webinar that we hosted with IDC in early March, a copy of the webinar slides, the executive white paper and, finally, the infographic shown below, which IDC created to help illustrate some of the key findings from the study. Click here to access this content. Finally, if you would like to read the various blogs that I have written in support of this new study, please click on the following links:

General Introduction to the Study
Automotive Industry Findings
High Tech Industry Findings
CPG Industry Findings


OpenText is Accelerating Healthcare Interoperability at HIMSS 15

We can’t wait to get to HIMSS 2015 in Chicago. As the event for technology and innovation in healthcare, we couldn’t think of a better place to preview our latest fax solution to help healthcare professionals redefine the way they exchange essential digital information. We know that many organizations rely on integrated, electronic fax as their primary method of exchanging information. And since these fax integrations are at the very core of how EMR information is sent and received, changing this process can be very complex. At the same time, powerful forces are driving healthcare organizations to share health information via Direct messaging in order to attest to Meaningful Use Stage 2 requirements. But does your organization know how to adopt Direct Project standards? That’s why we’re talking about RightFax Healthcare Direct at HIMSS: to show you how leveraging existing RightFax integrations and EMR systems can give you the interoperability you need without the complications.


Information Security in the Digital Age [Podcast]

This is the first of what we hope will be many podcasts in which we explore the technology and culture of Enterprise Information Management (EIM). We’re going to share stories about how OpenText is delivering world-class technology and improving our Customer Experience on a daily basis. In this installment, we hope to give you a better understanding of the current cyber security climate, show you what we’re doing to keep your data secure and protect your privacy, and tell you how you can protect yourself online. Our discussion on information security has been recorded as a podcast! If you’d like to listen but don’t see the player above, click here. If you don’t want to listen to the podcast, we’ve transcribed it for you below:

… The unknown unknown…

… If it was three in the morning and there was a bunch of guys standing down a poorly lit alley, would you walk down there by yourself? Probably not. Yet on the Internet, we do that continuously—we walk down that street—and then we’re shocked when negative things happen…

… People have an expectation that once they put a lock on their door they’re secure. And that might be the case in their home. But electronically it’s not quite so simple…

Are we safe online? Perhaps a better question is whether our information is safe online. 2014 was a banner year for information, data—what we now call cyber—security, and if analyst reports are any indication, security professionals are on high alert in 2015. International governing bodies have also placed an urgency on better understanding cyber security risks and putting in place strategies to ensure stable telecommunications and safeguard information. There has also been growing concern around data privacy. Though security and privacy work hand-in-hand, and it’s difficult to have data privacy without security, there is a difference between the two terms. Security involves the confidentiality, availability and integrity of data. It’s about collecting only the information that’s required, keeping that information safe, and destroying it when it’s no longer needed. Privacy, on the other hand, is about the appropriate use of data.

To help us through the topic of cyber security, we talked to Greg Murray, VP of Information Security and Chief Information Security Officer at OpenText. The OpenText security team is made up of specialists around the world who provide operational response, risk assessments and compliance. They also brief executive leadership regularly and keep development teams abreast of pertinent security information. More importantly, Greg and his team work with our customers to ensure their unique security needs are covered end-to-end.

“It starts early in the process,” says Greg. “It starts in the presales cycle where we try to understand the risks that [our customers] are trying to manage in their organization. We find out how they are applying security against that, and then that becomes a contractual obligation that we make sure is clearly stated in our agreement with the customer. From there, it goes into our operations center—or risk center, depending on what we’re looking at—and we ensure that whatever our obligations, we’re on top of them and following the different verticals and industries.”

Again, 2014 was a big year for cyber security in the news (I think we all remember the stories of not too long ago).
But while news agencies focused on the scope and possible future threats, Greg learned something else: “I think if we look at media, one probably would not have argued until last year that media was a high-threat area compared to something like aerospace defense. That has changed. Clearly that has changed. As a result, customers come back and say, ‘Hey, our environment has changed. What can you do to help us with that?’”

“What a financial institution requires is very different than what a manufacturing provider requires or a pharmaceutical organization. Some of that, as a provider to these organizations and customers, we can carry for them on their behalf. In other cases they must carry it themselves. A lot of the discussions that we have with customers are in regards to ‘Where’s that line?’”

“At the end of the day, there’s a collaboration. It’s not all on the customer, it’s not all on OpenText. We have to work together to be able to prove compliance and prove security across the environment.”

Regardless of the size, industry or location of an organization, security needs to be a top priority. This concept isn’t a new one. As Greg told Adam Howatson, OpenText CMO, in a recent Tech Talk interview, information security hasn’t evolved that much over the last 50 years (view the discussion on YouTube). Greg’s answer may surprise you, but after some digging I learned that back in 1998, the Russian Federation brought the issue of information security to the UN’s attention by suggesting that telecommunications were beginning to be used for purposes “inconsistent with the objectives of maintaining international stability and security.” Since then, the UN has been trying to increase transparency, predictability and cooperation among the nations of the world in an effort to police the Internet and private networks. Additionally, if you have seen the Alan Turing biopic The Imitation Game, you know that people have been trying to encrypt and decipher messages since the 1940s and probably even earlier. Today, the lack of physical borders online has certainly complicated things, but the information security game remains the same, and cooperation among allies remains the key.

“Are we all contributing together?” Greg asks. “If we’re all working together—just like Neighborhood Watch—we need that same neighborhood community watch on the internet. If you see stuff that doesn’t look right, you should probably report it.” “The bad guys are organized and we need to be organized as well. The more we share information and the more we work together… Particularly at OpenText, we have a lot of customer outreach programs and security work where we work hand-in-hand with customer security teams. By doing that, we improve not only our security, but we improve security across the industry.”

Recently I attended a talk given by Dr. Ann Cavoukian, former Ontario Privacy Commissioner and Executive Director of the Privacy and Big Data Institute at Ryerson University in Toronto. In it, she said that “privacy cannot be assured solely by compliance with regulatory frameworks; rather, privacy assurance must ideally become an organization’s default mode of operation.” She said that privacy—which, again, involves the appropriate use of information—must be at the core of IT systems, accountable business practices, physical design and networked infrastructure. Privacy needs to be built into the very design of a business.
And I think it’s evident from what Greg says about security, and from the way OpenText designs its software with users’ needs in mind, that our customers’ privacy and security are an essential part of what we offer.

“We have a tremendous number of technical controls that are in place throughout all of our systems. For us, though, it starts on the drawing board. That’s when we start thinking about security.”

“As soon as Product Management comes up with a new idea, we sit down with them to understand what they’re trying to achieve for the customer and how we’re going to secure it. So that by the time somebody’s uploading that document, it’s already gone through design, engineering, regression testing analysis, security penetration testing.”

“One of the other things we do is called threat modelling. Typically we look at the different types of solutions—whether they’re file transfer or transactional, for example—and we look across the industry to see who has been breached and how. We then specifically include that in all of our security and regression testing.”

You don’t need to look further than the OpenText Cloud Bill of Rights for proof of our dedication to information security and privacy. In it, we guarantee the following for our cloud customers:

You own your content
We will not lose your data
We will not spy on your data
We will not sell your data
We will not withhold your data
You locate your data where you want it

Not everyone is up front with their data privacy policy, but with people becoming more aware of information security and privacy concerns, organizations are going to find themselves facing serious consequences if they do not make the appropriate changes to internal processes and policy.

Data security doesn’t lie solely in the hands of cloud vendors or software developers, however. We asked Greg what users and IT administrators can do to protect themselves, and he said it comes down to three things:

“One is change your passwords regularly. I know it sounds kind of foolish, but in this day and age, if you can use two-factor or multi-factor authentication, that does make a big difference.”

“The second thing you can do is make sure your systems are patched. 95% of breaches happen because systems aren’t patched. When people ask, ‘What’s the sexy side of security?’, it’s not patching. But it works. And it’s not that expensive—it’s typically included free from most vendors.”

“The third thing is ‘think before you click.’ If you don’t know who it is or you don’t know what it is… Curiosity kills the cat, and curiosity infects computers.”

We hope you enjoyed our discussion on information privacy and cyber security. If you’d like to know more about the topics discussed today, visit opencanada.org, privacybydesign.com and of course Opentext.com. We also encourage you to learn more about security regulations and compliance by visiting the CCIRC and FS-ISAC websites.


Business Process: The Future of ECM

For years, enterprise content management (ECM) solutions were adopted primarily for two main use cases. The first was to achieve compliance, and many early adopters of ECM continue to successfully use it to address various regulatory requirements. The compliance use case drove functionality for records management, archiving, and information governance. A while back I wrote a blog post titled What Features Ensure Compliance? that elaborates on the functionality required for compliance use cases. The second use case was team effectiveness, with functionality such as collaboration, document sharing, and social capabilities. Collaboration is subject to frequent changes in direction, as every new technology promises an easier and more compelling user experience—from mobility and social software to file sync-and-share. The frequent feature churn in the collaborative use cases doesn’t sit well with compliance requirements, which often need the system to remain unchanged for several years (validated environments, anyone?).

ROI and Dependency on the User

Not only were the two primary use cases poorly aligned in their feature requirements, they shared two additional challenges. First, neither use case provides a very strong ROI. Sure, we marketers always calculate the savings in storage and the government fines that compliance solutions help you avoid. But let’s face it: preventing penalties is not exactly a hard ROI, and storage is cheap (or at least everybody thinks it is). The collaborative use cases are even worse—measuring the ROI here is fuzzy at best and often impossible. The second challenge was the dependency on users to do the right thing. For the compliance use cases, users were expected to diligently file their documents, weed out their inboxes, type in the metadata, and apply the right retention policies. Obviously, users are not very consistent at this, even if you try to force them. In the case of collaboration, users were expected to share their documents openly with others, comment in a productive way, and stay away from email and all the other collaboration tools around them. As it turns out, this type of behavior very much depends on the culture of the team—it works for some, but it will never work for others. The adoption of any collaboration solution is therefore usually very tribal.

So, is there any hope for ECM? Can we get an ROI and get employees to use it without someone watching over their shoulder?

ECM: Part of the Process

As it turns out, a third type of use case is emerging: the use of ECM as part of a business process. Business processes are something people already do—we don’t have to force anyone. That’s what companies, and working in them, are all about: everything we do is part of a business process. Business processes are also important, relevant, and very measurable. There is an ROI behind every business process. Every instance of a business process includes context, which can be used to populate metadata and to select the right policy automatically. Business processes can handle the automation of content management and don’t have to rely on the end user to do it.

But business processes don’t live in ECM. Sure, the process artifacts usually reside in a content repository, but it would be a stretch to claim that the entire business process happens in an ECM application. Nor does it live in the BPM application, even if that application may be the primary application for some users.
In fact, there is usually a master application from the structured-data world that rules the business process: enterprise resource planning (ERP), customer relationship management (CRM), product lifecycle management (PLM), supply chain management (SCM), etc. That’s why it is important for ECM to connect with these master applications through the business process. This is not just a simple way to link data sets or to hand over data from one system to another. Using modern, REST-based technology, it is possible to achieve integration that goes much deeper and involves users, roles, permissions, classifications, and of course the user experience (a minimal, hypothetical sketch of this pattern appears at the end of this post).

Deal with Content Chaos

ECM addresses some very important problems that every organization has to deal with. Given the volume and relentless growth of content in every enterprise, that content has to be managed. Yet ECM has struggled to achieve wide adoption because of the lack of tangible ROI and the difficulty of attracting end users. Tying ECM to a business process through a master application addresses both challenges. It may not solve every problem with content in the enterprise, and there will still be content outside of any business process, but it will go a long way toward dealing with what AIIM calls “Content Chaos”.

Click below to view my SlideShare presentation from the AIIM Conference 2015 on the challenges with traditional approaches to ECM and a solution provided by tying ECM to business processes: Business Process – the Future of ECM from Lubor Ptacek
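To make the process-context idea concrete, here is a minimal sketch, in JavaScript, of how a step in an invoice-approval process might file a document with metadata and a retention policy derived entirely from the process context rather than from the end user. The endpoint, field names and policy names are all invented for illustration; this shows the pattern described above, not an actual OpenText API.

```javascript
// Hypothetical sketch: a business-process step files a document in an ECM
// repository, deriving metadata and the retention policy from the process
// context instead of asking the user. Endpoint and names are invented.
// Runs on Node 18+ (global fetch and Buffer).

async function fileInvoiceDocument(processContext, documentBytes) {
  // The process instance already knows who, what and why, so nobody has to
  // type in metadata or pick a retention policy by hand.
  const metadata = {
    documentType: "invoice",
    customerId: processContext.customerId,
    orderNumber: processContext.orderNumber,
    approvedBy: processContext.currentApprover,
    // Policy selected automatically from the process type (invented names).
    retentionPolicy: processContext.processType === "invoice-approval"
      ? "finance-7-years"
      : "default-2-years",
  };

  const response = await fetch("https://ecm.example.com/api/documents", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      content: Buffer.from(documentBytes).toString("base64"),
      metadata,
    }),
  });
  if (!response.ok) {
    throw new Error(`Filing failed: ${response.status}`);
  }
  return response.json(); // e.g. { id: "...", version: 1 }
}
```

The point of the sketch is the division of labor: the master application owns the process context, and the ECM call consumes it, so classification and policy selection happen automatically as a side effect of work people are already doing.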
