Information Management

Thinking Small to Re-Envision the Future of Customer Apps and Big Data

Big Data is being re-examined for its role in powering data-enriched customer apps. Harnessing the power of Big Data is all the rage, and with good reason, given the potential to turn large volumes of data into game-changing market insights and strategies. So it's not a surprise that Big Data spending is expected to hit $50 billion by 2017 according to the latest market forecasts, and that more than 60% of organizations (with $1B in revenue) have active Big Data projects according to a recent Actuate survey.

Yet it may be time to start thinking "small" when it comes to Big Data: small in terms of the approaches and design perspectives that are all about delivering actionable insights and answers to the broadest possible audience. Even as many organizations strive to gain competitive advantage and better serve their customers by integrating and analyzing (and even sharing) their Big Data assets, several challenges emerge when it comes to delivering insights directly to customers and customer-facing staff:

How to access the ever-growing volume and diversity of data sources relevant to an organization's internal business users, when most traditional BI systems and approaches are better suited to "batch" processing of largely transactional data, without access to new Web, social, and mobile data?

How to manage and secure data assets and present large volumes of information to a large volume of end-customers in a highly consumable, engaging fashion (as a special type of Customer Experience Management, or "CEM") via both online and mobile delivery channels?

How to deliver these capabilities to everyday business users and end-customers (who aren't data scientists) with zero training, to help them perform everyday tasks like targeting a new campaign, understanding churn rates, tweaking an investment portfolio, or projecting next month's sales commissions?

How to merge (or at least begin to merge) the management of customer experience with the delivery of meaningful data-driven insights to end users? Customer experience today doesn't just revolve around how effective it is to interact with an organization; it's also increasingly linked with customers' individual personal business and lifestyle data.

These challenges, along with the success of consumer data-driven apps, like Amazon's product recommendations or the Nike FuelBand, in delivering Big Data insights in simple, smart ways, have inspired a number of us in the industry to re-examine the state of Big Data, and more specifically its role in powering data-enriched customer apps. In fact, this thinking has moved into the spotlight over the past 18 months as part of the small data movement (driven by the design principles: make it simple, make it smart, be responsive, be social) I've helped to advance via my research, blog, and sessions at various industry events. The movement is not only about shifting focus to the "last mile" of Big Data, but also about the approaches and design perspectives that support the delivery of relevant insights and answers to everyday users. As we'll explore below, this perspective can also help us re-envision the future face of both customer apps and data, as well as the tools that support the creators of those data-enriched customer experiences.

The Future of the Data-Driven Customer Experience

As data and insights play an increasingly important role in the end-customer journey, brands that tap the power of Big Data to better connect, inform, and motivate their customers at each step have an opportunity to gain significant competitive advantage.
Yet, this starts with a foundation that enables us to access, manage, and deliver today's customer data in a form that fits the needs and skills of every user. Building on our small data philosophy and definition, a series of design criteria and questions emerge for these "Data-enriched Customer Experience Apps":

1. Are all relevant data sources accessible? Are we able to deliver large volumes of new and historical individual data to each user on a regular basis? Are the resulting customer apps accessible in the broadest sense, by being available to all regardless of role, location, or physical ability?

2. Is data presented so it is understandable (and not overwhelming)? Can we enable (non-technical) users to access relevant data as a report, dashboard, or data export? Are we personalizing the experience by account, role, or user segment? Can users easily navigate and explore their personal data, and access other sources from within the app?

3. Are we being helpful and delivering actionable insights? Are we packaging insights and answers to support everyday tasks? Can users easily annotate and share learnings from their session? Can support teams seamlessly monitor the customer experience and help out as requested?

The Opportunity

With ubiquitous computing driving more customer choice, effective delivery of individualized insights will be the differentiator for more brands in 2014 and beyond. In many cases, personalized data will define the brand experience, especially where a large volume of both users and insights from disparate sources needs to be presented in a highly consumable fashion to everyday users (think asset management, healthcare portals, premium content services, etc.). In this environment, scalability and smarts, plus a zero-training interface and visual tools, are critical to delivering time to value, as well as experiences users will want to explore and share with their peers.

This article first appeared on 1to1 Media. Used with permission.

Read More

The 6 Most Important Requirements for Enterprise Document Archiving

Guest post by Richard Medina. First published on RichardMedinaDoculabs.com. In What Should You Do with Your Legacy Archiving System?, I offered a simple procedure for helping you decide what to do with your archiving systems. And in Selecting an Enterprise Archive Solution, I provided a quick overview of the three general solution options you have for archiving. In this post I'm going to outline the 6 most important requirements for enterprise document archiving.

An "archive" is a system that:

Securely stores primarily customer documentation

Retains the documents as long as needed

Purges documents when they are no longer needed for legal, compliance, or business purposes

Provides authorized users (both internal and external) with access to the documents for various purposes (e.g. for business processes, customer service, customer or agent self-service, and discovery)

The types of documents in scope for this kind of archive are primarily system-generated output (e.g. statements, EOBs, correspondence), and sometimes also include images and e-communications. The scope should not include unstructured "dynamic" documents such as Microsoft Office documents that require version control or collaboration capabilities. (But you'd be surprised what some folks put in archives.) The scope also does not include structured data such as information in line-of-business applications or databases. The following figure depicts the primary, secondary, and out-of-scope types of documents (shown as green, yellow, and red, respectively).

Here are the most important high-level requirements for enterprise document archiving:

Scalability and performance. The archive should provide the ability to handle (moderate, high, or very high, whichever applies to you) volumes of ingestion within the time windows needed to provide the business with access to documents when needed within business processes. In addition, the archive should provide reasonable response times for document search and retrieval, and the solution should be able to perform ingestion and archive functions without negatively impacting overall system performance for users.

Accessibility and availability. The archive should provide a mechanism for authorized users to search for and retrieve documents. In addition, the archive should provide the ability for certain external users to retrieve documents, such as e-presentment for customers or agents.

Security and protection. The archive should have the ability to restrict access to documents, such as documents that are private, confidential, privileged, secret, or essential to business continuity. This may include requirements for encryption of stored content.

Retention and integrity. The archive should be able to retain documents for defined periods of time, taking into account legal, regulatory, fiscal, operational, and historical requirements. In addition, the archive should provide a suitable guarantee of authenticity. Finally, (if this applies to you) the archive should provide the ability to retain information on an unalterable storage platform when needed (e.g. WORM storage for SEC 17a-4 compliance).

Disposition. The archive should support disposition (purging) of documents upon expiration of defined retention periods (both time- and event-based). In addition, the archive should support a formal approval process before purging, and it should support override of purging in cases where documents are under legal hold (see the sketch at the end of this post). Finally, the archive should enable authorized staff to periodically review and potentially modify retention periods.

Integration. The archive must provide a standards-based architecture and open API that allows integration with other systems or middleware components, including existing legacy systems in use at your company.
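To make the retention and disposition requirements a little more concrete, here is a minimal Python sketch of the purge decision described above. It is illustrative only: the field names, the year-based retention rule, and the approval flag are assumptions for the example, not features of any particular archive product.

```python
from datetime import date, timedelta
from dataclasses import dataclass
from typing import Optional

@dataclass
class ArchivedDocument:
    doc_id: str
    created: date
    retention_years: int              # time-based retention
    trigger_event: Optional[date]     # event-based retention starts here (e.g. account closed)
    legal_hold: bool = False
    approved_for_purge: bool = False  # formal approval step before disposition

def purge_eligible(doc: ArchivedDocument, today: date) -> bool:
    """A document may be purged only when its retention period has expired,
    it is not under legal hold, and disposition has been formally approved."""
    if doc.legal_hold or not doc.approved_for_purge:
        return False
    # Event-based retention counts from the trigger event; otherwise from creation.
    start = doc.trigger_event or doc.created
    expiry = start + timedelta(days=365 * doc.retention_years)
    return today >= expiry

# Example: a statement retained 7 years after account closure, currently under legal hold
doc = ArchivedDocument("stmt-001", date(2005, 3, 1), 7, date(2008, 6, 30), legal_hold=True)
print(purge_eligible(doc, date(2016, 1, 1)))   # False: the legal hold overrides expiry
```

The point of the sketch is simply that expiry (time- or event-based), legal hold, and formal approval are three independent checks, and a document should be purged only when all three allow it.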

Read More

Top Four Ways to Make the BPM Business Case to IT Stakeholders

Does anyone know the challenges, requirements, and issues you face in your job better than you do? Who better understands that when you're trying to resolve a case, you need to be able to instigate workflows, collaborate with everyone involved, and check in on the status anytime, anywhere, and on any device? That means that when it comes time to find a solution for better process efficiency, it's more often than not the line of business driving the initiative. However, since one of the biggest challenges users reported as a roadblock in their BPM implementations was conflict over process ownership between IT and business stakeholders*, it's imperative that you get equal buy-in from the line of business and IT.

You see, even though organizations of all sizes, business models and industries talk about the essential nature of IT and business alignment, achieving that balance is often easier said than done. Part of the challenge is the growing complexity and interdependencies of business processes, as organizations strive to break down traditional operating silos not only by sharing information but also by creating "cultures of collaboration" that help integrate and better leverage essential business processes. In particular, enterprises want to use technology to unify and strengthen the three pillars of their organization: people, processes and information. The successful integration of these three elements is at the heart of long-term, demonstrable business performance.

Another key concern IT and business leaders have with BPM is how best to utilize it in order to go past nice but unspectacular incremental improvements in favor of disruptive, revolutionary enhancements that help the organization achieve competitive advantage. Despite the desire to do more with BPM, according to a recent study from Forrester Research, 59% of enterprise architecture respondents use BPM not for strategic requirements, but rather for finding bottlenecks, refining processes, and reducing operating costs, which is not a great way to get a good ROI. Instead, the right solution needs to offer significant functionality and high performance, and be deployable in a variety of architectural designs for maximum flexibility, while still meeting the need for speed and agility.

Four key components to look for in your BPM solution:

Packaged applications: Look for applications that capture best practices and domain-specific components, and package them into turnkey applications that require only configuration and light customization to meet the needs of an organization.

Solution accelerators: By utilizing prebuilt and reusable components, developers in the line of business or IT can build applications using coarse-grained service components and prebuilt application elements, reducing time to value and drawing on the service delivery best practices built into the components (see the sketch at the end of this post).

Model-driven development: Robust modelling environments to develop applications using BPMN-based process modelling tools, or CMMN-based case modelling (the emerging standard for case management modelling), are also important. The platform should also include a service-oriented-architecture-compliant framework so developers can create, utilize and reuse Web services in the development of applications, and also leverage enterprise service bus functionality for more complex integration work that goes beyond typical Web services integration.

Code-level development: The solution should provide resources and support for developers, including code snippets, comprehensive libraries, reusable elements and services, and developer communities where developers can engage with others.

With these tips in mind, you'll not only be able to find a BPM solution that fits your needs, but also one IT can stand behind because of its flexibility to constantly adapt to changing requirements without reinventing the wheel each time. It's also important to note that the way applications are developed is usually a mix of these approaches across different solutions, and sometimes even within the same solution, so it's imperative to have all of them available to you. Now you can work with IT and find a solution that evolves with your organization, instead of working against it. Want to learn more about how to make the business case for BPM? Download the Tech Target white paper, "Selling the Benefits of Business Process Management for IT and Business Stakeholders" today.

*InfoWorld BPM Survey for OpenText 2014
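As a rough illustration of how prebuilt, coarse-grained components and a model-driven approach fit together, here is a hypothetical Python sketch. The component names and the toy "engine" are invented for this example; a real BPM platform would execute a BPMN or CMMN model rather than a Python list.

```python
# Hypothetical reusable "accelerator" components: each takes and returns a case dict.
def validate_claim(case):
    case["valid"] = bool(case.get("policy_id")) and case.get("amount", 0) > 0
    return case

def route_for_approval(case):
    case["approver"] = "senior_adjuster" if case["amount"] > 10_000 else "adjuster"
    return case

def notify_customer(case):
    case["notified"] = True
    return case

def run_process(steps, case):
    """A toy 'engine': execute the configured steps in order, as a model-driven
    tool would after translating a process diagram into an executable sequence."""
    for step in steps:
        case = step(case)
    return case

# The configured sequence (the 'model'), not code changes, determines the process.
claims_process = [validate_claim, route_for_approval, notify_customer]
print(run_process(claims_process, {"policy_id": "P-123", "amount": 18_000}))
```

The design point is that changing the process means changing the configured sequence (the model), not rewriting the components, which is what lets both line-of-business and IT developers work from the same building blocks.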

Read More

5 Ways to Enable Differentiation in Transaction Banking

Differentiate or die. This is a key business and marketing principle. This provocative phrase reminds us that the Natural Laws also have very real implications, outside of physics, in the world of business. At OpenText GXS, we have identified five basic rules that will put Banks and Financial Institutions on a path to differentiation. This blog is intended to help build your strategic agenda for business transformation, or simply to map out discussion items with your clients.

1 – Make it easy to do business with you

Right from the start, the customer experience commences with the Account Opening and Know Your Client (KYC) processes. This "business on-boarding" is where the client's Finance teams are coming to grips with intensive banking paperwork, sharing details of signatories, account structures, etc. First impressions count; you must make your clients' tasks easy and stress-free from the very start. At the very least, the Bank should leverage self-serve tools such as Client Community Management or Business Process Management (BPM) workflows to collect client information, which will become critical as part of the technical on-boarding and the ongoing BAU operational relationship. Internal SLAs or SLOs between the Bank's various on-boarding teams will allow a predictable and consistent client experience, with dates and timelines you can commit to.

2 – Step up your game with a service-oriented Banking channel

Typically, your client's priority is to optimize the way they operate their order-to-cash and purchase-to-pay cycles, while optimizing working capital. Historically, banks have serviced a few of the needs arising from these processes, such as payments, cash management, trade finance, and investment banking. These are multiple facets of the same processes for the customer; the least you have to do is to present access to those services in a consistent and simplified way. This is often called a "consistent cross-channel experience", with the principle of a technical "Single Pipe" between the client and the Bank. However, a single pipe on its own is not enough; the Bank needs to meet consumer-type expectations for user experience, meaning you need to step up your game.

3 – Shift your staff's focus to activities that deliver unique value

There is no valid economic model or competitive logic in running large IT empires when business partners and solution providers can do it better, faster, cheaper, and with the same levels of security. Banks should be able to focus their resources where real competitive advantage can be gained, making a measurable difference to the bottom line. Cloud technology and Managed Services bring unprecedented scale to Enterprise IT functions and the lines of business they support. Channel technology, client on-boarding and industry standards are only means to an end; everyone needs to comply and provide the same components. Why task your most valuable people with activities that are complex but not distinctive, when others are outperforming you effortlessly in the same areas using external providers?

4 – Create value within the customer's environment

Running a global treasury strategy has its unique challenges, and only a fraction of the associated internal challenges are exposed to the Bank. Financial Institutions need to start to show interest in the deeper needs of their corporate clients, offer to cross the bridge, and literally integrate business processes rather than just the technology. Resource and market pressures mean that corporates increasingly rely on their vendors for advice, innovation and cost reduction. Successful banks will need to offer business consulting skills, deep domain expertise and best-in-class delivery. Remember that clients expect to receive and consume this advice in a very different way from a consultancy project; this should be factored into the fabric of the commercial and operational relationship. SMEs, mid-market clients and larger corporate customers are all leveraging Cloud technology in some shape or form, making them very educated and clever buyers when it comes to consuming the Bank's services and products.

5 – Don't sell; challenge, advise and map their buying process

Clients have changing expectations of and attitudes towards Financial Technology and Banking. In only a few years from now, a generation of Millennials will bring new expectations about how Banks will need to treat them as customers. These demands will need to be met immediately, in a personalized, social and consultative manner. The bank must also be prepared to demonstrate the value it provides throughout the term of the partnership. Client feedback shows that a key driver of customer loyalty is the Bank relationship director offering unique, valuable perspectives on the market and educating clients on issues and outcomes.

Let's conclude with one last provocative thought: "Measure what matters most, but don't let what you measure become what matters most". Are your organisation's metrics around Transaction Services really rewarding differentiation? At OpenText we are keen to further discuss these items and spell out the success stories we have enabled with our Bank and Financial Institution clients. Feel free to contact me directly or come over and visit our booth at SIBOS in Boston in October 2014.

Read More

Top 4 Digital Disruptors

The future of information is changing, fast. From the Internet of Things to wearable technology to 3D printing, today's digital and technological disruptions are transforming the business world. "What I see coming over the next five years is like nothing I've seen before," said Mark Barrenechea, OpenText CEO, at a recent event. "Digital disruption is stronger and faster than any technology shift we've seen." Barrenechea identified four key disruptors in today's business world:

1. Changing Workforces

By the year 2020, 60 percent of OpenText's workforce will have been born in the age of the Internet, and their expectations will be set accordingly. "This workforce is dramatically different," said Barrenechea. "The generation is not hierarchical, they're social. They want to help each other. They're impatient, mobile, and collaborative. And if [OpenText is] not digital, how am I going to hire and retain the best workforce?" To meet end users' expectations, organizations need to give them easy access to each other and to their information. Workers expect to be able to work anywhere they want, from any device they want, in a secure, seamless environment. Learn more about the future of work and how to build a more collaborative business in the webinar on demand, Fuel the Speed of Innovation: Top 5 Ways to Inspire Collaboration in Your Organization.

2. Digital Transformation

"Digital changes everything," said Barrenechea. He asked the audience to consider the paper-driven, manual processes that are left in their organizations, and then to think of the benefits they'd see by digitizing them. For example, at OpenText the expense management process currently involves filling out a form, putting receipts in envelopes, and mailing them to distribution centers around the world where people manually enter the information. "It's a mundane process, but it's an important process," said Barrenechea. "And when we digitize it next year I'll be able to take $4 million of expenses out of my business." OpenText's goal over the next four years is to have platforms that allow our customers to be completely digital in everything they do. Ultimately, this enables businesses to grow faster, scale quickly, and enable a whole new process for innovation.

3. New Disruptive Technologies

"By 2020 one of the big nexuses of change will be new technology, and it's going to drive us to be more effective, more efficient, and more competitive," said Barrenechea. He listed a number of new technologies that are changing the business world, including:

Wearable technology: The data generated by wearable technology such as Google Glass will come into enterprises via the Internet of Things. To help support this data, OpenText has set out to provide a globally reliable, secure network.

5G: Think of how 4G changed the world and made it vastly more mobile. The same is set to happen with 5G.

Working on mobile devices: By the end of this year, Barrenechea predicts that OpenText will never again buy another PC. PCs will largely be gone, exchanged for tablets and other mobile devices.

New display technologies: The ability to fold up a device and take it with you will drive new display options over the next few years, giving way to more low-footprint choices like holographic displays.

3D printing: 3D printing is set to change the manufacturing world, shortening production cycles and reducing costs across the board.

4. The Cloud

More and more workloads and processes are moving out into cloud architectures. According to Barrenechea, part of what makes the cloud so disruptive is its misinterpretation in some organizations. The line of business thinks they no longer need the CIO: "They can go out and say, 'I'm going to move my marketing system out to the cloud and forget about the CIO'… But your data is now in many places, and may not be under the control of your CIO." As the world becomes more cloud-centric, leading organizations are looking for cloud systems that help them ensure security and maintain control over their information. "There's a nexus of forces coming for 2020," said Barrenechea. "What will change our workplace, what will change digital, what will make us innovate?"

Read More

Best Practices for Securely Sharing Healthcare Documents While Mobile

I recently read an article discussing best practices for securing mobile healthcare devices. This got me thinking about the larger set of issues surrounding information security, HIPAA compliance, Meaningful Use, and the potential for personal data leakage when accessing healthcare content on laptops, tablets and smartphones. Over recent years we have seen three healthcare trends converge, making the ability to view documents on mobile devices an imperative:

The increased use of mobile devices such as smartphones and tablets across healthcare stakeholders, including providers, payers and patients

The growing amount of unstructured data, such as PDFs of ECG results, radiology images, photos, transcription documents and more

The expanding use of Health Information Exchange (HIE) systems, including the release of medical information to patients

With up to 80 percent of a typical healthcare institution's data being traditional documents or other such "unstructured" content, the challenge arises when an authorized stakeholder (e.g., a provider, the patient, a case manager or caregiver) needs to access health information from their tablet or smartphone. How can healthcare organizations be sure that the information is viewable by authorized individuals, yet protected from prying eyes? A typical set of best practices surrounding content protection includes:

Ensure that users access and view information using only trusted apps

Protect content, ensuring that it can only be viewed by a designated individual and for a defined period

Secure stored data in a protected or encrypted format to ensure that it cannot be accessed should a device be the subject of theft or other malicious activities

Prevent the use of untrusted file-sharing apps for accessing secure documents

Record all viewing, printing and sharing actions in an audit log

Ensure that sensitive, personal information contained within documents is only available to verified individuals

To me, this last point stands out, with the looming deadline for meeting Stage 2 Meaningful Use requirements demanding both broader access to electronic records and protection of personal information. This makes including a comprehensive redaction (removal of sensitive content) strategy essential, especially when sharing healthcare records to mobile devices (a minimal sketch of redaction and audit logging appears at the end of this post). This will help protect against the bad habits of well-intended users as well as the actions of users with more malicious intentions.
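As a minimal, hedged sketch of the last two practices (recording actions in an audit log and redacting sensitive content), consider the following Python example. The patterns, identifiers, and log format are simplified assumptions for illustration; real PHI detection and tamper-evident audit trails require considerably more than this.

```python
import re
import json
from datetime import datetime, timezone

# Very simplified patterns for illustration only (real PHI detection is far broader).
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
MRN_PATTERN = re.compile(r"\bMRN[:\s]*\d{6,10}\b", re.IGNORECASE)

def redact(text: str) -> str:
    """Replace sensitive identifiers with placeholders before display or sharing."""
    text = SSN_PATTERN.sub("[REDACTED-SSN]", text)
    return MRN_PATTERN.sub("[REDACTED-MRN]", text)

def audit(user: str, doc_id: str, action: str) -> str:
    """Record a view/print/share action as an append-only JSON line."""
    return json.dumps({
        "ts": datetime.now(timezone.utc).isoformat(),
        "user": user, "doc": doc_id, "action": action,
    })

note = "Patient MRN: 00123456, SSN 123-45-6789, ECG results attached."
print(redact(note))
print(audit("dr.smith", "doc-789", "view"))
```

In practice the redaction step would run before a document is rendered to a mobile viewer, and every view, print, or share event would append a line like the one above to a centrally protected log.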

Read More

Advocacy: The Bridge between Selling & Buying

The impact of the Internet and the rise of social media are fueling a customer revolution in both B2C and B2B environments. Companies that don't have a comprehensive, consistent and rich branding strategy that INCLUDES social media run the risk of undermining their entire sales machine. Buyers don't want to be Sold to! OpenText CEO Mark Barrenechea noted in a recent Innovation Tour keynote address, "…customers have completed nearly 60% of their buying journey BEFORE engaging with a vendor's selling process." Utilizing the Internet, customers now have the means to retrieve essential information for themselves and will push back sales involvement as long as possible. The report referenced is published research conducted by the CEB Marketing Leadership Council in partnership with Google, entitled The Digital Evolution in B2B Marketing. According to the study of 1500 B2B professionals, 70+% "are readily turning to their personal networks and publicly available information—increasingly via digital and social media channels—to self-diagnose their problems and form opinions about solutions."

While Content Marketing provides critical information that customers need to inform their decisions, it is engagement that transforms that information into knowledge. For customers, even more so than for prospects, experience provides the emotional attachment to deeply bind the customer to the solutions they own, and ultimately to your company. Considering this shift in buying dynamics, corporations need to consider allocating a significant portion of their marketing efforts and funds directly toward their customers, as they represent not only direct revenue opportunity but the extended opportunity of their social advocacy to your prospects. I'm not suggesting bribery, but rather support of social communities outside and within your company, enriched by information and resources that help your customers meet or exceed their business goals with your products. This can take the form of eLearning tools, best practices guides, and tips-and-tricks content, particularly in rich media formats, which are both more engaging and more effective. But these are table stakes. Once customers are engaged, providing a forum with access to product specialists, industry experts and other customers who are committed to your solutions ensures that customers will attract prospects and validate your value proposition. Well-produced, quality content will be referenced and shared among the customer and prospect communities. The brand elements will ensure that your business is credited for that content and reflect that your business provides quality solutions and promotes business success.

While your customer engagement facilitates customer success and fosters advocacy, your brand links that advocacy to your company and products. Your social engagement provides the connection between customers and prospects. Clearly, your advocates then become the most valued guide in the buyer's journey and draw those buyers into your social space, which becomes a natural channel to your marketing assets and to engaging with your selling process. Social Media is empowering the Voice of the Consumer. If you're not proactively engaging in a dialogue with customers, then your brand is in the hands of strangers. Through our efforts, we in marketing engage directly or indirectly with every stage of the customer lifecycle, from Suspect to Opportunity and Close to Advocacy. As such, within the enterprise it's our responsibility to ensure strong, clear and dynamic articulation of our business' value proposition. To be validated it must resonate within every communication, at every touch point and in all forums of customer interaction; no small task for enterprise marketing.

The Experience Suite by OpenText is designed to assist in delivering your digital presence and the correspondence that enables you to amplify your online marketing strategy. Marketers are better able to manage their global brand and produce effective customer communications. The Experience Suite is a vehicle for Marketing and Line of Business owners to ensure the seamless transition of customers from Suspect to Advocate, where advocacy becomes the bridge between buyer and seller. To learn more about OpenText's Experience Suite visit http://www.opentext.com/what-we-do/products/customer-experience-management

Are you experiencing a shift in buying behavior? How engaged are you with your customers in social venues?

Read More

What’s the Difference Between Van Morrison and a Value Added Network?

Well, put simply, one is still doing the same thing it was doing more than forty-five years ago and the other has evolved into something very different, but which is which? When I was a child, my parents were constantly playing Van Morrison music in the background whilst I was trying to build intricate engineering models with my Meccano set! In fact, the late sixties were quite busy: Van Morrison launched what was to be a very successful solo career, the first EDI messages started to be exchanged, and I was born around this time as well. When I joined GXS back in early 2006 I was introduced to the world of hub-and-spoke communities and Value Added Networks, but this was at a time when the company was busy repositioning itself into something very different. After I joined GXS I started to hear terms such as the company being 'more than just a VAN', and as soon as I heard the VAN acronym I had flashbacks to when my parents were playing Van Morrison records; maybe it was because the name 'Van' had been so ingrained in my mind from a very early age!

Anyway, time moves on. GXS has evolved, and under the new ownership of OpenText™, the world's largest provider of Enterprise Information Management solutions, Trading Grid™, as our B2B network is called, is going to evolve still further and will strengthen the link between the internal and external enterprise. Moving EDI messages from one mailbox to another is still part of our business; however, the key growth area is our Managed Services offering, and this is perfectly timed with the global interest in moving to cloud-based services as a way to develop leaner, more scalable IT infrastructures. OpenText™ Trading Grid™ is essentially a network, something that our company has offered for many years, and it helps to connect companies together to allow them to undertake business with each other. Trading Grid™ provides the single entry point into an enterprise and allows you to connect to many different external trading partners. So using this analogy Trading Grid™ is a business or B2B Network, not just any B2B network but one that is processing more than 16 billion transactions each year. Once connected to Trading Grid™, companies can potentially connect with over 600,000 other businesses that are also making use of this network today. The former GXS company now sits under a business unit called Information Exchange, and this business unit includes services such as Secure Messaging and RightFax solutions, to name but a few. The most staggering number is the amount of commerce being transacted across Trading Grid over a one-year period. So, in the same way that Van Morrison's music was initially released on records and you can now download a complete digital set of his music from Apple's iTunes, in the world of EDI the Trading Grid™ network has evolved into offering cloud-based B2B integration services. This is significant progression in my mind!

In my last blog post I discussed how companies can get more out of a B2B Network, and during my keynote presentation at EDIFICE I cited several examples of different consumer and business networks. The so-called 'Network Effect' is transforming how both people and companies communicate with each other. There are personal networks such as Facebook and LinkedIn, and consumer networks or ecosystems which offer multiple services from within an environment, such as iTunes or Google. Finally, there are business networks, such as industry-specific ones like Exostar, and then B2B networks such as OpenText™ Trading Grid™. People have become used to connecting to a network and then using different services that reside on these particular networks. In the case of Trading Grid™, these additional services could be processing invoices across each of the 28 countries that make up the European Union, connecting to global banks via our SWIFT Bureau service, tracking the lifecycle of business transactions, through to managing the day-to-day collaboration between potentially thousands of trading partners and then providing direct integration with back-office business systems such as SAP and SAGE.

Three years ago I saw an image posted on the internet which highlighted all the interactions between different users on Facebook over a fixed period of time; all the Facebook interactions neatly defined a map of the world. Given that I look after the industry marketing for the manufacturing vertical at OpenText™, I was curious to see the type of network that could be formed by companies connected to Trading Grid™. For the purposes of that graphic I removed the names of the companies, but it quickly became apparent that if an automotive supplier is connected to Trading Grid™ then they would be able to undertake B2B with virtually any of their trading partners located anywhere in the world. I won't bore you with the details of all the individual B2B solutions used by these companies, but once I created the diagram, using a very small subset of our overall automotive customer base, there were some interesting observations. North American companies were very keen to move towards using cloud-based services (represented by the Managed Services, MS, icon), European companies were keen on using their own home-grown B2B platforms combined with our messaging platform, Trading Grid Messaging Service (TGMS), and the Japanese companies were moving away from behind-the-firewall B2B solutions to cloud-based services. The Japanese observation was probably a result of the recent natural disasters that have impacted the country and their desire to spread their production risk around the world. In fact, the automotive industry is truly global in nature, and when OEMs move into a new country such as Mexico, their key suppliers are expected to move quickly into the country with them. Only a cloud-based B2B infrastructure can provide this level of flexibility and scalability.

As I highlighted in an earlier blog relating to the Internet of Things (IoT), the B2B network as we know it today is going to evolve still further. For example, information from billions of connected devices across the supply chain will provide an end-to-end view of shipments that we have never experienced before. So just when today's CIOs have started to embrace Cloud, Mobile, Big Data and Social Networks, along comes the IoT, considered by many as one of the most disruptive technologies of our times. Needless to say, OpenText™ will embrace these disruptive technologies as part of our 2020 Digital Agenda, and we will help guide CIOs through this period of significant 'Digital Disruption'. So if you would like to learn why our B2B network is significantly more than just a VAN, then please visit our website for more information on Trading Grid™ and our future 2020 Digital Agenda.

So, just in case you haven't worked it out by now: after 45 years Van Morrison is still producing music, and it is the EDI VAN that has evolved into a cloud-based B2B Network. In closing, it is interesting that Van Morrison's latest album is called 'Born to Sing'; a bit like Trading Grid, 'Born to do B2B'.

Read More

What if you could reduce the time to launch new services from months to weeks?

Your business needs are changing constantly, and if you don't have solutions in place to increase agility and flexibility, you won't be able to stay competitive. While you're waiting for a custom BPM solution to be built, your competitors are passing you in the market, your organization isn't running at its optimum efficiency, and likely by the time it's finally ready, your business needs will have changed again. It's an endless cycle that needs to stop now.

AEGON Religare Life Insurance Company is a perfect example. It is a three-way joint venture between the Netherlands-based AEGON, an international life insurance, pension, and investment company; India's Religare, a leading integrated financial services group; and Bennett, Coleman & Company, India's largest media house. As you can imagine, trying to be process-efficient and incorporate real-time integration of systems and information across an infrastructure of that size was a massive challenge. They needed to launch operations across 14 independent systems at 25 locations on the first day. Not to mention, the Indian insurance market serves over 21 languages and a number of unique local cultures in a population exceeding one billion. Therefore, scaling and adapting processes to meet increasing volumes and support customer acquisition and retention was a huge challenge, especially when combined with their need to meet an aggressive time-to-market.

Watch the video to hear Srinivasan Iyengar, COO of AEGON Religare, discuss how they overcame this hurdle and managed to deploy a solution in days, not months. Want to learn more about how you can make the case for better process efficiency? Read the Tech Target white paper, "Selling the Benefits of Business Process Management for IT and Business Stakeholders."

Read More

Using Big Data to Develop a Customer-Centric Culture [Webinar]

"Customer experience is like gravity. You may want to ignore it, but you don't have that choice." This juicy quote comes from Tim Walters, Principal Analyst at Digital Clarity Group (DCG), who presented a recent webinar with Allen Bonde, Vice President of Product Marketing and Innovation at Actuate. The focus of their talk: the relationship between Big Data and Customer Experience (CX). CX and customer experience management (CEM) are becoming increasingly critical to companies scrambling to get the best picture of customer needs. The momentum behind CX and CEM has certainly been bolstered by Big Data. Structured and unstructured data, when used well, enable companies to make better decisions about customers and what they may need. Companies of all shapes and sizes want to use Big Data to create offers and experiences that drive lifelong brand loyalty. Companies that bring the power of Big Data to everyday business users, rather than keeping it in the realm of data scientists, can additionally drive better insight, campaign performance and organizational knowledge. During the webinar, Walters and Bonde address the paradigm of Big Data use and discuss how small data, digital disruption, experience delivery and advanced analytics help drive better customer experiences. As the webinar illustrates, "fast is the new big" when it comes to using data to empower customer-facing teams and encourage a better understanding of customer desires – and, as Bonde says, "allow visuals to be the language of the data." Check out more words of wisdom about using Big Data to develop a customer-centric culture. Register for the webinar here.

Read More

INTERVIEW: Essential Features of DAM

This interview was published by the DAM Coalition, a community for Digital Asset Management and media professionals. To see other interviews and articles visit www.DAMCoalition.com

DAM Coalition: Tell us about your history in digital asset management. How did you get involved with DAM?

John Price: My background is in TV production during the pre-digital days. Back when analog video was still going on I worked on the technical side of production in news environments, so I was directing, editing, producing and doing just about whatever needed to get done. It gave me great insight as to how the media process works. Then I moved into working with broadcast vendors in creating software to help them better manage all of the stuff they were doing. I helped them automate some of the processes that were involved in delivering their broadcast content. That evolved eventually into where I'm at with digital asset management and trying to help people manage those same processes. Now though, instead of going out to a single broadcast channel, I'm working with people as they work to get their message out across numerous channels and in multiple formats which need to be able to be viewed on many different devices. It's a much more dynamic system, and in a lot of ways a much more complex one as well.

How did that automation impact digital asset management?

In a huge way, because it allowed us to control so many more processes than we were ever able to do in the past, and that gave people the freedom to focus on more important things. As you reduce manual, repetitive tasks and increase productivity, creativity jumps. DAM has allowed more to get done in various ways for a greater amount of people. In the broadcast world, most of the issues were caused by human error. Somebody hit the wrong button…somebody played the wrong commercial at the wrong time…somebody didn't give the cue, etc. Those human errors can add up and they can be very costly. The whole prospect with automation was that it could reduce the amount of human intervention, which meant you could reduce the errors. The machine won't be making those mistakes. Of course, it gets that much more complex, because you have to configure the system, you have to create those rules and manage the exceptions, but automation continues to be a major part of almost any DAM system. The ability to take those manual and repetitive tasks off the table so your staff has more time to be more productive is an incredibly powerful concept.

When you're having a conversation about what a DAM system can do for someone's organization, do you find yourself having to temper expectations or is it more about trying to figure out what they want to accomplish?

Conversations typically go in two directions. My preferred direction is starting out with a high-level strategic path where the customer or company thinks in terms of what they're going to do with their digital media. Most companies understand that the amount of media they're generating is growing exponentially every year. What they don't see is that they need to manage all of that information. Many people don't think strategically about how they want to get from where they are today to where they want to be in the future. They aren't sure how they're going to go down that path. Let's say right now that as an organization, you have all of your assets on shared drives or on people's computers so you can't really find anything.
So you take a step back and identify that in five years, as an organization, you want to be able to access any of the rich media you own to deliver a better product to your customer. If that's where you want to go, you have quite a path to get there. But that's where it can get exciting, because then we can start to explore lots of different methods and ways to make that a reality. There's no one specific way to accomplish that goal, so we can look at what's going to be the best way for your organization in terms of the way you work now. We can maximize what you have working in order to map the path for where you need to go. If you start thinking strategically, you'll open up new opportunities. Your whole infrastructure can be energized by something like this, because it gives more people the ability to access information they want and need. That's the one conversation we usually have.

The other conversation takes that strategic thinking into account, but it's focused on what exactly DAM is. When we have that conversation, it crystallizes into a couple of core things that DAM needs to do for them. If it does those things well, it solves a lot of issues for the customer on a small and large scale. For me, the whole purpose of digital asset management is to maintain control and access to your digital assets. If I'm an organization and I'm putting lots of effort and cost into creating intellectual property, whether it's a logo or message or anything else, I want to be able to control that asset and be able to provide proper access to it. For some of those assets, I want the entire world to be able to access them so they can download and share all of that and use them however they want. Other assets I want to maintain very tight control over, so that only certain people can see or use them. It all boils down to control and access. So much of it is about managing users, because I'm going to have internal users, external users, partners, agencies and all sorts of different people who are going to be contributing, consuming and modifying content. If I can create a system that allows them to easily access what they need while providing specific limitations around that access, then I've created a system that works for everyone. This helps the organization become more productive and also helps it grow the system. Say you're a small marketing department and you just want to control your branding assets. Once you have that control and have everyone on the same page you can start looking at how you can expand that beyond marketing. Then you can pull in the product department and they can create consistent packaging. You can pull in the finance department so they can use the latest version of the logo. You can pull in whoever needs those assets. It really boils down to the control and access features that you can find in the typical DAM system.

"DAM" is just one of the many acronyms that people use, sometimes interchangeably with others (PAM, MAM, ECM, EIM, CEM, etc.). Do you think we should still be saying "DAM" when we want to talk about the management and decisions surrounding the ingestion, cataloguing, storage, retrieval and distribution of digital assets? Or is it a moot point?

DAM is a convenient and useful acronym; it has critical mass, and conceptually people have some idea of what it is. As you mentioned, we have all kinds of different combinations of letters, but you have to have something that you can hang all of these ideas on, and it's kind of become accepted that "DAM" is that hanger.
Whether they're documents in an ECM system or media that needs to go to a WCM, you're managing something that's both digital and an asset, so the term "DAM" seems to be totally appropriate. Really though, you can look at a DAM system in whatever way you want to express the concept. And that expression will be in whatever way you want to fulfill the purpose of the DAM system. It might be in a small department, it might be WCM, it might be ECM, it might be multiple DAM systems across your entire organization. And there are lots of people who work in all of those ways and more. But it still comes down to having these digital assets and managing them in a way that works for individuals and for the organization.

What sorts of people have become (to borrow a term from the DAM Guru Program) the "DAM guru" within an organization?

A lot of it springs out of a desire from an individual in an organization who figures that there must be a better way to do something. They get tired of spending four hours a day or 25 hours a week just trying to find a single image they need to use for a particular campaign, and that's without even knowing if they have the rights to it. People often want to and need to organize the chaos that's around them. As they start to organize they look for tools that will help them do that, and they find out that a DAM system can make their lives easier in many different ways. That's the exciting thing, because you're starting to see more and more organizations realize that these sorts of people are really providing a valuable service to the company as a whole. A lot of these organizations have tens of thousands or hundreds of thousands of assets, and so many of the people in that organization have no idea where anything is or even what they have available. We are starting to see more job titles and positions being created that are focused on DAM, but titles don't do the work. People do. And people will do what is needed so the work can be expanded.

What's an issue that you see come up over and over for people when they're dealing with their DAM?

Click the link to see the rest of the interview: http://damcoalition.com/damcoalitionexclusive/story/what-are-the-essential-features-of-a-dam-system

Read More

Modernize to Survive: e-Government Trends and Transformations

This is an excerpt from e-Government or Out of Government, the new book by Mark J. Barrenechea, President and CEO of OpenText, and Tom Jenkins, Chairman of OpenText. Get your complimentary copy of the full book here, and don't miss the webinar on demand featuring Mark Barrenechea, Digitize to Survive: Using e-Government to Optimize Programs and Performance.

The digital revolution is transforming politics and the nature of government. From education improvement and tax collection to better health care systems and job creation, technology is putting pressure on governments to make all aspects of government more accessible. Citizens' voices are "present" in public policymaking and their demands for an always-on, connected government are intensifying through social media. Governments are realizing that technology can act as an enabler to transform government processes, enhancing the provision of public services and increasing productivity. The challenge for modern government is complex, but two key trends emerge:

1) Society is being profoundly transformed by new Information and Communication Technologies (ICTs). Entire commercial sectors and old business models have already been swept away as consumers find new ways to access news media, music, films, books, and entertainment and satisfy daily needs. Governments, imprisoned by decades of regulation and bureaucratic habit, are challenged to become more responsive, citizen-centered, and flexible organizations.

2) Governments are experiencing financial constraint with increased demand for services. Productivity is the key to the future and to public support. To achieve this, governments need to manage their information assets more collaboratively and effectively. Information is the lifeblood of public service.

While the imperative of change has been recognized by leaders in government, the extent of change required is challenged by laws, regulations, and a public service culture inherited from an analog world. In government, traditional models of hierarchy define (and confine) how information flows. Accountability and performance management regimes inherent in these models work to reinforce hierarchy and inhibit collaboration and teamwork. But the issues of a modern society require open information sharing and collaboration across the organization. Digital natives coming to the public service today encounter a fossilized culture that is adapting too slowly to address today's issues.

2020: e-Government or Out of Government

While there are many possible scenarios for e-government over the coming decade, the challenge will be to achieve the productivity and service quality that new information technologies offer while remaining true to the fundamental values of a professional public service. The emergence of technologies like big data, mobile apps, and cloud computing could be used for extensive social control, eavesdropping, and the monitoring of all communications: tracking the locations and the shopping and entertainment preferences of all citizens, and intruding into all aspects of life. This is the scenario envisioned by novelists and film producers as they contemplate the future. These technologies are simultaneously liberating and can be used to encourage individual expression and participation in society's issues. Social media played a major role in the Arab Spring, the rash of demonstrations and protests, riots, and civil wars in the Arab world that began in December 2010. Social media enables NGOs and interest groups to mobilize faster than governments can respond. The issue facing all governments and societies is to determine where they want to be on the continuum of state control and individual freedom. This will determine the fundamental values and characters of their societies in the coming decades.

Many governments are in crisis, torn by conflicting interests and faced with insurmountable challenges that span national borders and require resources to be mobilized on a scale that often exceeds capacity. These governments are criticized for their failure to respond to market conditions and citizen demands. This is because theory and policy are rooted in approaches, solutions, principles, and technologies that have become obsolete. Outmoded approaches to governing are based on traditional values about organizational structure, processes, and performance. Transformation is the way forward for government today. As a strategy, e-government is already helping public sector organizations master change. To be effective in today's complex, connected, and fast-paced world, governments need to redesign their structures and processes to capitalize on a new set of actors and tools. The approach, strategies, standards, and technologies can all be defined as e-government. e-Government can enable a better government. But what does the future model of government look like? How will we be governed in 2020? Learn the answers to these questions and more in the webinar on demand, Digitize to Survive: Using e-Government to Optimize Programs and Performance.

Read More

Information Governance in the Energy Sector: Improving Operational Excellence in Asset Management

This post is part of our blog series that explores how Information Governance is deliver value to utilities, mining and oil and gas companies, in their core business processes. Information Governance is more than just “records management”: It is a means to manage risk, ensure HSE and regulator compliance, and achieve operational excellence and competitive advantage from your information. When it comes to capital-intensive assets such as power plants, oil rigs, or manufacturing facilities, you have much to gain by optimizing performance of operations and maintenance. Many of the industry’s sharpest and most chronic pain points are directly related to issues around the production assets. Doing everything to maximize return on assets and operational efficiency is critical and minimizing production downtime is a top asset management priority. As a consequence, plant maintenance and asset management systems have become increasingly prevalent over the last 15 years, providing operational control of maintenance and repair operations and supporting the end-to-end asset lifecycle, which can easily span decades. HSE (Health, Safety and Environment) and strict compliance requirements are another angle of complexity. Records must be maintained throughout the entire life of each asset and kept current any time you change key processes. Many types of equipment—such as pressure vessels and valves, furnaces, and heavy machinery— come with a risk of injury, fatality, and environmental impact. The public has high expectations for industrial responsibility, while government regulations require increased diligence, testing, and documentation. Keeping accurate corporate records is no longer optional but mandatory. To manage all your asset-related information—such as technical specifications, user manuals, maintenance records, service contracts, drawings, and more, in a traceable and auditable way—you need effective information governance. From the beginning of the asset lifecycle, the design and construction of new projects need to be properly managed and documented. Transitioning from asset construction to operations and maintenance down to final decommissioning along the lifecycle of the asset can also present specific challenges, such as the controlled document handover between EPC companies engineering and constructing the assets to the owners / operators of the asset. But with content spread all over the enterprise, how can you succeed? Accessibility, quality, and traceability of asset information is a huge issue—and organizations need a single source of truth The fact is that most asset management systems such as SAP Plant Maintenance or IBM Maximo focus on structured, transactional information. While this data is important, it’s only one part of the story. Equally important is unstructured information—the kind not stored in a database. Effective maintenance can be heavily impacted by poor access and asset-related documentation. When it comes to the speedy repair of breakdowns and outages, time is money, and speed of access to associated information and effective collaboration between the involved parties is critical. 
According to ARC Advisory Group, 1.5% of revenue is typically lost due to poor asset information management. Yet in many organizations there is still a substantial disconnect between transactional plant maintenance data and the wide range of supporting documents that need to be held against each individual asset item: bid documents, supplier correspondence, project contracts, process instructions, maintenance manuals, technical drawings, set-up specifications, health and safety sheets, inspection logs, instruction videos, and so on. This is a considerable range and flow of information that must be managed, accessible, transparent, and accurate at all times. It may reside in the main ERP system, within a dedicated enterprise asset management (EAM) system, or in any number of other repositories: document management, SharePoint, CAD, project management, the quality management system, and so on. Many companies still completely overlook the benefits of integrating unstructured information and business processes to enable process-centric collaboration and information management. The resulting challenges can be broken down into three main categories. So how can a proper asset information management system be implemented? Enterprise Information Management (EIM) can link all unstructured information, such as documents and files, wherever they reside in the organization, back to the asset data in the EAM system, creating a single version of the truth. What can you achieve with an EIM system integrated with your asset management system? Imagine that all your content is automatically stored with an appropriate linkage to your asset structure, and that key asset attributes are associated with the documents. Users can then navigate or search to readily find the content associated with a functional location, equipment, supplier, or material in the ERP and EAM systems, improving the productivity of maintenance workers and technicians. New documents such as job packs can be automatically created based on the asset data (e.g. equipment data, material data, work order data, standard operating procedures, and safety bulletins) and made accessible across multiple departments at an enterprise level, with proper records management and management of change enforced. As an example, OpenText Extended ECM for SAP Solutions provides full information transparency and business context for an asset: a business workspace gives a 360-degree view of an asset (say, an electric pump), including folders and documents with search and content filtering, EAM data, and the business context in the form of related EAM objects such as materials, maintenance notifications, and maintenance orders. The bottom line: with an EIM system, asset information is no longer distributed over multiple silos or disconnected from the leading EAM system. All stakeholders in the end-to-end asset lifecycle can access asset information from multiple user interfaces, such as SAP, Microsoft Windows®, Microsoft Office®, Microsoft SharePoint®, email, and the web. This improves knowledge management and collaboration by providing organization-wide access to information, and it helps take your assets successfully from the capital projects and handover phases, through a long lifecycle of efficient operations and maintenance, and ultimately through decommissioning.
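To make the linkage concrete, here is a minimal, hypothetical sketch (this is not OpenText's actual data model; the identifiers and attributes are illustrative) of how document metadata can carry key EAM asset attributes so that all content tied to a piece of equipment is one lookup away:

```python
# Hypothetical, simplified model of asset-aware document metadata. A real EIM
# platform stores this in its repository and synchronizes the attributes with
# the EAM system; here an in-memory index stands in for that repository.
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Document:
    doc_id: str
    title: str
    doc_type: str              # e.g. "maintenance manual", "inspection log"
    equipment_id: str          # key asset attributes copied from the EAM system
    functional_location: str

class AssetDocumentIndex:
    """Index documents by the asset attributes users actually search on."""
    def __init__(self) -> None:
        self.by_equipment = defaultdict(list)
        self.by_location = defaultdict(list)

    def add(self, doc: Document) -> None:
        self.by_equipment[doc.equipment_id].append(doc)
        self.by_location[doc.functional_location].append(doc)

    def for_equipment(self, equipment_id: str) -> list:
        return self.by_equipment.get(equipment_id, [])

# Example: everything linked to pump "EQ-10045" is retrievable in one call.
index = AssetDocumentIndex()
index.add(Document("D-001", "P-101 pump manual", "maintenance manual", "EQ-10045", "PLANT-A/UNIT-2"))
index.add(Document("D-002", "2014 pressure test", "inspection log", "EQ-10045", "PLANT-A/UNIT-2"))
for doc in index.for_equipment("EQ-10045"):
    print(doc.doc_type, "-", doc.title)
```

In a real deployment these attributes would be derived automatically from the EAM work order or functional location rather than entered by hand, which is essentially what the business workspace described above provides.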
Get tips on building a successful information governance strategy in the webinar on demand: Build a Winning Operational Risk Management Strategy with Information Governance.
Resources:
OpenText Extended ECM for SAP Solutions: http://www.opentext.com/What-We-Do/Products/OpenText-ECM-Suite-for-SAP®-Solutions/OpenText-Extended-ECM-for-SAP®-Solutions
Solution Brief "ECM for Enterprise Asset Management": http://socialassets.opentext.com/ecm-for-enterprise-asset-management-asset
Whitepaper "Information Drives Asset Performance": http://socialassets.opentext.com/information-drives-asset-performance-asset
ECM for Excellence in Production & Operation (90-second video): http://www.youtube.com/watch?v=ScDwXbS3t2E

Read More

Industry Perspective: Making PDF Documents Accessible – Part 2

Turn up the Volume: Create Accessible High-Volume PDF Documents. First published on Output Links. Read Part 1: Making Documents Accessible. Accessibility for Information and Communications Technology (ICT) is a relatively new subject, and one with plenty of difficulties involved. There are still questions and challenges that arise for those working in the field, and very few resources for them to turn to for answers. With that in mind, I want to address a couple of the most common questions, specifically as they relate to High Volume Transactional Output (HVTO) PDF documents. Upstream or downstream? Why not address PDF accessibility issues upstream, in the source document generation system, rather than downstream through an automated remediation process? It's a common question, and to really answer it we first need to decide what counts as a document generation system. To some, this may just be anyone creating documents in Word or Excel that are later printed to PDF for display online. But there's a different type of content too: namely, HVTO documents. The fact is, things like proper training and an understanding of accessibility principles will go a long way toward making any single document-based process more accessible. For example, if MS Office users are trained on how to make accessible content, they will likely get 80 percent of the way there. But HVTO documents come with their own accessibility challenges, and proper training may not be enough. Accessible HVTO upstream: What is an HVTO document? Statements, bills, and direct mail all fit the bill. These types of customer communications are generally designed and laid out using a template approach, merging data and content at runtime into a template that produces personalized output for millions of end customers. Could we apply the same concept of training on accessibility principles and requirements at this level to produce properly tagged, accessible PDF-UA documents? That is a possibility, though I would argue that 80 percent accessibility is not the goal for most big multinational firms with HVTO needs, which are concerned about lawsuits related to the Americans with Disabilities Act (ADA). If your company is concerned with ADA compliance, it needs to aim for completely usable and navigable PDFs that give visually impaired customers the same ability to understand communications that sighted individuals have. Case in point: I was working with one health insurance provider that was told by its print vendor to simply modify the templates used in its document generation system and then output properly tagged PDFs. The insurer thought it was a great idea and invested heavily in modifying its existing templates. When it submitted the brand-new output to the Centers for Medicare and Medicaid Services (CMS) for approval, it was rejected on the basis that the content was not accessible. One could argue this was a limitation of the document generation system, or a training and knowledge issue with the print vendor. I would argue it's likely both! Accessible HVTO downstream: Considering that most enterprises have multiple document generation systems outputting various print streams and electronic formats, there is a significant level of complexity in making everything accessible. In many cases it's easier to address the issue downstream, once the content has already been formatted for each customer. 
There is already a market-proven option to automate the transformation of any print or electronic format into PDF-UA. Such a system can ingest any document and output in batch or in real time. It can be set up to remediate content in existing repositories, or sit behind a portal to deliver PDF-UA in sub-second response times as requested by end customers. Finding solutions: Many governments, locally and globally, have written laws that require information technology and online content to be made accessible. So the second question I want to address is: What is the best solution for adhering to accessibility legislation in the case of high-volume documents? Automated technology for accessible PDFs: For a PDF to be considered accessible, it needs to be completely usable and navigable by people with vision loss, cognitive disabilities, blindness, and so on. Until recently, there have not been any automated technology options that effectively remediate HVTO content into accessible PDF-UA documents. The primary solution for years has required people with disabilities to identify themselves; companies would then follow an accommodation process. That process, though, has been plagued by problems such as privacy concerns, cost, and delays in document delivery. Making HVTO documents accessible is not a simple process. There are multiple requirements to adhere to, and important considerations in the development of your accessible HVTO. The fact is that most companies do not have the manpower to really understand all of the legislation and standards for accessibility. That is why many service organizations have been cropping up all over the world to help. These organizations are great at what they do and are valuable assets in building and executing an organization's accessibility plan. But it simply makes more sense to use automated, software-based approaches to solve the issues around HVTO accessibility. What if there were a software appliance that could be dropped into any company's IT environment and automate the remediation of HVTO? Would you or your company be interested in using it to automatically remediate your high-volume print streams for statements, policies, bills, notices, and the like? The idea of a software appliance that is downloadable and preconfigured to work out of the box is a revolutionary concept. It would also need to integrate seamlessly with other output management and back office IT infrastructure. In addition to being easy to deploy and integrate, it would need to be simple to use, so that no programmers are required to run the system. Benefits of an automated approach: This kind of solution does exist and, used in the right way, can deliver outstanding business results. By leveraging this approach to ensure compliance with accessibility regulations, companies can avoid costly lawsuits and legal penalties. A reduction in operational costs also comes from taking a more automated approach than a human-centric one. Finally, companies begin to gain a competitive advantage when they truly focus on improving the customer experience by delivering more accessible offerings.
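As a rough illustration of where automation slots into such a pipeline (this is not the remediation appliance described above, and the folder name is a placeholder), the sketch below uses the open source pypdf library to screen a batch of PDFs for the structure tags that PDF-UA requires; untagged files would be queued for remediation:

```python
# A rough pre-flight check, not a full PDF-UA validation: a PDF without
# structure tags cannot be accessible, so files failing this screen are
# routed to the remediation step. Assumes the open source pypdf library.
from pathlib import Path
from pypdf import PdfReader

def has_structure_tags(pdf_path: Path) -> bool:
    """Return True if the document catalog declares a structure tree (a tagged PDF)."""
    reader = PdfReader(str(pdf_path))
    catalog = reader.trailer["/Root"]
    return "/StructTreeRoot" in catalog

def screen_batch(folder: Path) -> list[Path]:
    """Return the PDFs in a folder that should be queued for remediation."""
    return [p for p in sorted(folder.glob("*.pdf")) if not has_structure_tags(p)]

if __name__ == "__main__":
    for path in screen_batch(Path("./statements")):   # hypothetical output folder
        print("Needs remediation:", path.name)
```

A real compliance pipeline would go much further, checking reading order, alternative text, and role mapping, but even this simple gate shows how tagging status can be verified automatically at high volume.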

Read More

7 Years Strong: OpenText Receives SAP Pinnacle Award

Information is at the heart of digital transformation today, driving innovation and helping organizations connect with their customers, employees, partners, and others in more meaningful ways. Leading organizations are putting solutions in place to improve the management of their unstructured information and keep up with this information-based economy. For more than two decades, OpenText and SAP® have worked together to help enterprises maximize the potential of their digital information. And yesterday, for the seventh year in a row, SAP awarded OpenText the 2014 SAP Pinnacle Award for Solution Extension Partner of the Year in the "Run Together" category. OpenText received the award at the SAPPHIRE NOW conference in recognition of the company's excellence in developing and growing its partnership and driving customer success. "Partners like OpenText are our force multipliers, and today, more than ever, they are essential to our customers' success," said Mark Ferrer, chief operating officer, Global Customer Operations and executive vice president, Ecosystem and Channels, SAP, in a recent press release. Our SAP partnership has led to many great innovations, including a few new ones. Last week we announced that OpenText Archive Server will now run on the SAP HANA platform. This expanded solution means that more than 4,200 current on-premise SAP Archiving by OpenText customers will have access to the same robust archive solution on SAP HANA Enterprise Cloud, enabling them to process business-critical content on a faster, smarter, and simpler real-time platform. We also announced a special offer: SAP Extended ECM 10.5 and SAP Document Access customers can get Tempo Box for free here. OpenText's solutions for SAP help customers maximize the potential of their information, ensuring business users are able to easily access, manage, and leverage their content in a secure environment. Organizations are able to drive business transformation by digitizing processes that leverage SAP technology innovations, including the SAP HANA platform as well as cloud and mobile offerings. We're thrilled to be recognized again with this award from SAP, and we can't wait to continue the conversation at SAPPHIRE NOW in Orlando, Florida. OpenText subject matter experts, customers, and partners have already spoken at some great sessions, and we'll continue with even more today and tomorrow (see our full session info here). If you're attending the event, don't miss our CEO Mark Barrenechea's keynote address tomorrow, June 5, at 9 a.m.

Read More

Customer Experience is not only Marketing’s responsibility

Business Processes give power to Customer Experience. Let me give you an example of how you might want to integrate your customer experience with your business processes. Let's say a dissatisfied customer posts an angry tweet about your product or company. This can be captured and analysed via sentiment analysis and used to initiate a resolution process within your company. Another example would be a Facebook post suggesting a new feature, which can kick off a product enhancement process. It's obvious that customer service teams with the necessary information at hand, and the ability to resolve questions or issues quickly, are essential to maintaining happy customers. Yet a survey by Forrester found that a third of companies still don't have access to such basic information as "where is my order?" Another example: suppose you run an e-commerce site selling snowboard gear. A customer heads to the site to buy a new board, sees what other people who bought that kind of board also looked at, goes through checkout, and pays with his credit card online. Once he's done, he gets an email with an order confirmation and an offer for a specially priced trip to the Alps, valid until 7:00 pm that evening. He thinks it looks interesting, so he clicks the link and opens the offer, but he doesn't have time to finish, leaves the site for something else, and as the day passes he forgets about the offer. At 5 pm a customer service rep calls him, asking whether he needs more information to go ahead and book the trip. This way you have connected your customer experience with your customer service function: instead of just letting him forget about the trip, a process is triggered within your company so that the rep calls the customer or sends an email reminder. Different customers have different needs, so a one-size-fits-all service model does not work. Integrating Business Process Management with your Customer Experience gives you the flexibility to tailor the experience to your customers and their current situation. To get the complete picture of a customer and deliver the best customer experience across your entire company, easy access to the documents and information stored about that customer is essential for customer service and sales. This is where connecting Enterprise Content Management with Customer Experience gives you the opportunity to get a 360-degree view of the customer. Going back to the snowboard buyer, customer service should have full visibility into the order placed through the online store, should easily be able to access all documents related to that product, and should have all information about the customer, such as contracts, emails, and past interactions and conversations, readily at hand while speaking with the customer, giving the best customer experience possible. One more example: a customer in a car accident could initiate an insurance claim from a mobile phone by simply taking a picture and submitting it via a mobile application, with data like location and time captured automatically, saving the customer hours on the phone. BUT this kind of information needs to be stored and managed according to regulations, not just stored anywhere. 
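Returning to the first example above, the angry tweet: here is a minimal, hypothetical sketch of the pattern (the sentiment scoring is a toy stand-in for a real analytics service, and the case fields are not a specific OpenText API) showing how a negative social post can trigger a resolution case in a BPM system:

```python
# Illustrative event-to-process pattern: score an inbound social post and,
# if it is negative enough, open a resolution case. Names, word list, and the
# threshold are placeholders, not a real product's API.
from dataclasses import dataclass
from typing import Optional

NEGATIVE_WORDS = {"angry", "broken", "refund", "terrible", "disappointed"}

@dataclass
class Case:
    customer_handle: str
    channel: str
    description: str
    priority: str

def score_sentiment(text: str) -> float:
    """Toy sentiment score in [-1, 0]; a real system would call an analytics service."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    hits = sum(1 for w in words if w in NEGATIVE_WORDS)
    return -min(1.0, hits / 3)

def handle_social_post(handle: str, channel: str, text: str) -> Optional[Case]:
    """Open a high-priority resolution case when sentiment falls below a threshold."""
    if score_sentiment(text) <= -0.3:
        return Case(handle, channel, text, priority="high")
    return None   # positive or neutral posts need no case

case = handle_social_post("@jane", "twitter", "Terrible service, my board arrived broken!")
if case:
    print("Opened case for", case.customer_handle, "with priority", case.priority)
```

The same pattern applies to the snowboard example: an abandoned offer is just another event that routes into a follow-up task for the customer service rep.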
How do you create a 360-degree view of your content and customer in an easy and efficient way so that you can deliver a great customer experience? Learn more in the expert webisodes: http://www.opentext.com/campaigns/experience-suite/expert-webisodes.htm

Read More

Into the Future: Replies from an Actuate Content Services Repository Survey

Actuate recently surveyed several customers and other large organizations using various repositories, asking them about their repository needs. In that survey, we asked three separate questions: Are there any problems or deficiencies with your existing repositories? What features are your business users requesting, and what regulations do you have to adhere to? What is your vision of where this technology is heading in the next few years? We covered the first two questions in our earlier blog post. In this one, we want to look at how respondents answered the third question: What is your vision of where this technology is heading in the next few years? As was the case with the others, this question drew a variety of answers, with everyone having different ideas for future repository use and features. Some answered with the industry as a whole in mind, while others considered just their own company's case. For example, one customer, considering the entire industry, suggested that he expected to see more emphasis on storage regulations. "There will be an integration of Big Data into Enterprise Content Management (ECM). Full text will become very important," said the Vice President of Information Services for a major life insurance organization. He predicted that the consolidation of imaging, print, and annotations will also become important, as will the collaborative use of stored content. Another customer, the Senior Architect of Enterprise Architecture for a major Canadian bank, expects to see the following trends: a shift from ECM to information management; analytics replacing content management; the ability to search across databases and repositories; and open source content services. Meanwhile, a state government respondent wants to see a system in the next few years that integrates all of the chosen file formats and document accessibility concerns, "while focused on access to storage, storage time frame and size of file." Other respondents considered their expectations for repository use internally. For instance, a financial services customer, this one the Director of IT Applications/Document Solutions, expects that more projects will be integrated into content management at his organization. Finally, the Manager of Security, Compliance and Network for an insurance organization answered: "Our vision is to have one platform and some strategic vendors bringing everything together for us to help us manage our content for use, storage and retrieval. As well we want to be compliant with all of the Americans with Disabilities Act requirements at that time." Does your organization have a repository system? We'd love to hear your expectations on the future of repository use, either internally or industry-wide, in the comments section.

Read More

OpenText Supports the Information Governance Initiative [Video]

Barclay Blair, founder of the Information Governance Initiative and a long-time consultant in the field, likes to compare Information Governance to a diamond: “It has many facets and it’s very hard.” He’s half joking, of course—but he makes a good point. Information Governance has come to the forefront of many business leaders’ minds as they look to better manage their information and find the business value in it. The Information Governance Initiative, which was founded in February, has set out to help organizations by better defining Information Governance, creating a common set of best practices, and bringing experts together to actively solve Information Governance problems. I recently sat down with Barclay after we both participated in an Information Governance panel discussion (watch the full event on demand here). In our one-on-one time, we discussed Barclay’s work on the Information Governance Initiative, of which OpenText is a founding member. Watch our conversation: Learn more about managing your enterprise information and finding the value in your content with OpenText Discovery Suite. The latest solution from OpenText helps organizations implement a successful Information Governance strategy to promote compliance, reduce costs, and support defensible disposition of information.

Read More

AIIM Research Findings on Information Governance

These days, it seems data leaks and security breaches are in the news almost constantly. It’s not a good problem, but the silver lining may be that it’s made organizations rethink their Information Governance strategies. The new Industry Watch report from AIIM Market Intelligence, Automating Information Governance, sheds new light on industry trends around the shifting Information Governance challenges. As Doug Miles, head of the AIIM Market Intelligence division, puts it: “The rules and risks have changed. We now need to keep all electronically stored information securely, compliantly, and available to the legal process, whether it’s work-in-progress documents, emails, collaboration tools, or any other repository of content.” I can’t wait to discuss AIIM’s findings with Doug during the upcoming webinar, Automating Information Governance: AIIM Research Findings and Industry Trends, on June 10. The landscape is changing quickly and organizations that don’t rethink their Information Governance strategies soon risk being left behind. Get the full AIIM Industry Watch report here. And see a preview of their findings in this video from Doug Miles:

Read More

Getting the Most from Your B2B Network

Two weeks ago I was on a Eurostar train bound for Brussels. I was attending the 122nd plenary of EDIFICE, a high tech industry association that GXS™ has been involved with for more than 25 years. EDIFICE works with leading high tech companies to develop best practices and new standards for deploying B2B solutions and services across the high tech supply chain. I have been attending EDIFICE events for the past six years, and I have found the sessions to be incredibly informative, both for understanding the B2B challenges faced by today's high tech industry and for the opportunity to network and share experiences with companies such as HP, Cisco, Infineon, Texas Instruments, and Xilinx. In fact, the attendee list for these events quite often reads like a who's who of the high tech industry. For more information on EDIFICE, please visit www.edifice.org. Each member company of the EDIFICE community is invited to sponsor plenary events, and on this occasion OpenText™ GXS™ had the pleasure of hosting the event with the theme 'Getting the Most from Your B2B Network'. I have presented at several EDIFICE events in the past, but this event provided my first opportunity to introduce OpenText, a company that very few people in the audience had heard of before. I delivered the keynote presentation for the event, which allowed me to introduce the world of Enterprise Information Management (EIM) and how it is likely to impact the world of B2B moving forward. In the future, companies will be able to get even greater value from their B2B network thanks to the availability of powerful EIM solutions from OpenText. What I wanted to do in this keynote was explain how companies could get more out of their B2B platform, i.e. once connected to a B2B network such as OpenText™ Trading Grid™, what can you do with the B2B information flowing across the network, either internally or externally across the extended enterprise? I also wanted to demonstrate that a B2B network can be used for a lot more than just exchanging EDI messages. So with this in mind, here is a recap of the ten points I discussed during the keynote to highlight how companies, when connected to a global B2B network, can get more out of their platform: 1. Enabling 100% Trading Partner Connectivity – To get a good return on the investment in your B2B network you need to make sure you can electronically enable 100% of your trading partner community. Many companies struggle to onboard their smallest suppliers, who are quite often located in emerging markets where ICT and B2B skills are limited. OpenText has a range of B2B enablement tools that can help a business exchange electronic business information, irrespective of the size or location of the supplier. From sending PDF-based business documents via secure messaging/MFT, to Fax-to-EDI solutions, through to Microsoft Excel-based tools and web forms, there are many tools available today to help a company onboard its smallest suppliers. 2. Simplify Expansion into New Markets – Many companies today are introducing lean and more flexible supply chains to allow them to be more responsive to constantly changing market and customer demands. Yet many struggle to establish a presence in a new market, either because they do not have a local footprint there or because they do not know how to connect a remote operation, in terms of a plant and its domestic suppliers, to a centralised B2B platform. 
The benefit of a cloud-based platform such as OpenText™ Trading Grid™ is that you can scale your B2B activities up or down as the needs of the business require. With over 600,000 companies conducting business across Trading Grid today and a B2B network that stretches into every major manufacturing and financial hub around the world, Trading Grid significantly reduces the time it takes to establish a 'B2B presence' in a remote market. Even if you currently work across multiple network providers in different countries, consolidating onto one cloud-based provider can certainly help improve the bottom line for your business. 3. Overcoming Regional Complexity Issues – Many businesses today are required to adhere to various regional compliance regulations. e-Invoicing compliance, for example, is an incredibly complex area that companies have to embrace: some countries, such as Mexico and Brazil, mandate electronic invoicing, whilst each of the 27 countries in the European Union has country-specific rules for dealing with VAT compliance, archiving, and applying digital signatures. OpenText GXS has implemented B2B projects all over the world and hence has the experience to shield your business from the complexity of dealing with a myriad of region-specific B2B standards and e-Invoicing regulations. 4. Improving Accuracy of Externally Sourced Information – If you spent $50,000 on a luxury car with a high-performance engine, you wouldn't pour low-grade oil into the engine, now would you? The same applies to a B2B platform, where information entering the platform can come from many internal and external sources. In fact, in one study conducted a few years ago we found that over a third of the information entering ERP systems comes from outside the enterprise. Now what happens if poor-quality information enters your B2B platform, gets processed, and then enters your ERP environment? All it takes is for an external supplier to enter the wrong part number or quantity on a shipping document, for example, and that error will slowly propagate its way through your business. What if you could implement business rules to check the quality of information entering your business against specific rule templates? Applying business rules, and hence improving the quality of B2B information, helps to improve operational efficiency by reducing manual rework as information flows across your extended enterprise (a minimal sketch of such a rule appears after this list). 5. Increasing Resilience to Supply Chain Disruptions – Global supply chains have been severely disrupted by many natural disasters in recent years. Companies have been trying to build increased resilience to supply chain disruptions, and having access to a single, global B2B platform can help minimise disruption. Utilising a cloud-based platform allows companies to access their B2B-related information anywhere in the world, irrespective of where disruption may be occurring. For example, many Japanese companies are starting to embrace cloud-based B2B platforms because doing so introduces flexibility into their B2B strategies and minimises disruption to their supply chain and production operations. Having every trading partner connected to a single, global B2B network allows you to quickly identify points of weakness across your supply chain during a period of disruption and to take remedial action as required. 
6. Adhering to Regulatory Compliance Initiatives – Businesses today have to embrace a variety of industry-specific regulatory initiatives. In addition, many companies have established their own compliance initiatives and expect their trading partners to adhere to them, in many cases as a condition of doing business. In the automotive industry, suppliers have to undertake an annual quality assessment known as the Materials Management Operations Guideline / Logistics Evaluation (MMOG/LE), and in the high tech industry, companies headquartered in North America now have to demonstrate that they do not source conflict minerals across their global supply chain. Many compliance initiatives require some form of assessment to be conducted across a supply chain, and the two examples above use spreadsheets as the basis of the assessment process. For this assessment to be effective, however, companies need up-to-date contacts for each and every supplier across their supply chain, and they need to make sure the assessments are conducted in a timely manner. OpenText GXS can provide a platform that centralises the management of supplier contact information, which in turn helps to significantly improve day-to-day communications and collaboration with a trading partner community. 7. Conducting Transaction-Based Trading Partner Analytics – One of the benefits of operating one of the world's largest B2B networks is that it can provide a significant source of data for analysing trading partner trends. As we know, Big Data analytics is becoming an increasingly important area for companies to embrace, but quite often they do not have the internal skills or knowledge to process the information and transactions flowing across their supply chain. OpenText Trading Grid processes over 16 billion B2B transactions each year, which can potentially provide a rich source of information for companies to leverage, allowing real-time decisions about the management of their supply chain operations. 8. Initiating Process-Based Transaction Flows – Many companies have implemented numerous business processes to manage different aspects of their operations. From managing production lines through to inventory replenishment, a strong process-centric approach to running a business is key to winning a competitive advantage in the market and increasing customer satisfaction levels. B2B transactions have sometimes had only a loosely coupled relationship with business processes, but many supply chain processes are becoming so complicated today, especially those supporting global manufacturing operations, that coupling business processes more tightly with B2B transaction flows makes increasing sense. Consider, for example, managing the transactions relating to a reverse logistics process in the consumer electronics sector. OpenText has significant experience in the Business Process Management space, and when eventually combined with OpenText Trading Grid this could offer a different way for supply chain directors and logistics managers to look after their trading partner communities. It is still early days in the OpenText and GXS integration process, but this could be a big growth area in the future. 
9. Achieving Pervasive Visibility of All Transactions – Achieving true end-to-end visibility has been high on the agenda of nearly every supply chain professional around the world. The introduction of powerful smartphones and tablets has only increased the desire to access B2B transactional information any time, any place, anywhere. It is already possible to inspect transactions as they flow across OpenText Trading Grid, but the next logical step is to make those transactions actionable in some way from a remote device. OpenText has an interesting suite of mobile development tools called AppWorks that will help to considerably accelerate the development of B2B-related apps that connect into the Trading Grid platform. 10. Integrating with 'Internet of Things' Devices – The Internet of Things (IoT) has the potential to completely transform the way companies manage their supply chains in the future. With billions of intelligent devices expected to be connected to the Internet over the next decade, companies will have access to significantly more information from their global supply chains. IoT-connected storage bins, forklift trucks, containers, warehouses, lorries, and in fact anything that is part of a supply chain have the potential to send information back to a centralised IoT hub via a simple internet connection. In the same way that we talk today about integrating B2B platforms with ERP systems, tomorrow we will be doing the same level of integration with IoT hubs. This is a subject I have discussed extensively in an earlier blog post of mine.
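Picking up point 4 above, here is a minimal, hypothetical sketch (not OpenText Trading Grid's actual rules engine; the field names, part number format, and rules are illustrative only) of the kind of data-quality gate a B2B platform can apply to an inbound order line before it reaches the ERP system:

```python
# Illustrative data-quality rules for an inbound B2B order line. A real platform
# would validate complete EDI/XML documents against configurable rule templates.
import re

PART_NUMBER_PATTERN = re.compile(r"^[A-Z]{2}-\d{6}$")   # assumed format, e.g. "AB-123456"

def validate_order_line(line: dict) -> list[str]:
    """Return a list of rule violations; an empty list means the line may pass to ERP."""
    errors = []
    part = line.get("part_number", "")
    if not PART_NUMBER_PATTERN.match(part):
        errors.append(f"part_number '{part}' does not match the expected format")
    qty = line.get("quantity", 0)
    if not isinstance(qty, int) or qty <= 0:
        errors.append(f"quantity '{qty}' must be a positive integer")
    if line.get("unit_price", 0) < 0:
        errors.append("unit_price cannot be negative")
    return errors

# Example: a supplier keys in a malformed part number and a zero quantity.
issues = validate_order_line({"part_number": "ab123456", "quantity": 0, "unit_price": 12.5})
for issue in issues:
    print("Rejected:", issue)   # route back to the supplier instead of into ERP
```

In practice such rules would be configured per trading partner and document type, but even this simple gate shows how poor-quality data can be caught at the network edge rather than propagating into ERP.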

Read More