Information Management

Bye-Bye Repetitive Marketing Tasks, Hello Compelling Customer Conversations

For nearly two decades, email has been the main message bearer in marketing. But with tightening regulations, it is becoming less and less viable to simply email your database—not to mention the ‘email fatigue’ we all feel; it is simply becoming a less effective tactic. As marketers look for alternative ways to get their message out, many organizations are opening deeper, more effective dialogues with customers through compelling content. Today’s customers want interactive conversations with organizations, to get to know not just the product but the organization behind it. Email as a marketing tool will be dead in five years or less, and marketers need to think quickly about what will replace it in the age of the digital customer. Content and conversations, self-service and self-selection will form the epicenter of B2B and B2C marketing. Creating rich, engaging and, most importantly, timely customer interactions from initial contact through to buying takes time, data and a deep understanding of both current and future customer requirements. And while many of today’s marketing leaders recognize this, most would admit that they just don’t have the insight they need to really deliver on a customer-centric approach. Today’s marketing automation tools can help create these digital experiences. They nurture close relationships and engage customers at every step of the decision journey to drive brand loyalty, revenue and customer lifetime value, freeing up marketers to focus on creating compelling content.

The Power of Content

Content alone is not enough. It must be compelling. It must be engaging. And it must be optimized to reach your customers at each touch point. Compelling content draws audiences into your message. They begin a journey with the brand, from awareness to consideration to decision and advocacy.
Unlike email, the ultimate interruption-driven marketing tool that pushes your message, content and experience marketing drives the journey through engagement with your customer, and is more efficient—costing 62% less than traditional campaigns.

The Journey

There is no single “channel” that today’s marketers can rely on to engage with customers. Customers today interact via multiple avenues, whether through social channels, a brand’s “owned” digital properties or more traditional routes like the media. In each case, the customer must experience a continuous, personalized and authentic digital journey that offers the best experience at every point of interaction and in every phase of the lifecycle. With a lineup of engaging content, customers can delve into the information they are looking for in their preferred medium. For instance, almost 50% of Internet users look for videos related to a product or service before visiting a store (Google, 2016). In a recent report on demand generation, 96% of B2B buyers said they want content with more input from industry thought leaders, and over 50% said they relied on content as they researched buying decisions—from both vendors and independent third parties. With all roads leading to the power of engaging and personalized content, it’s time to re-focus on the future of marketing.

The Freedom to Create More Content

Knowing how important content is, it’s time to balance your efforts. The bottom line is this: a big driver of today’s conversion rates is compelling content. The better the content, the better the conversion rate. But with all the technology, touch points and channels in play, there’s no question that marketers are making tough choices about where to spend their time. Automate your marketing processes and free your big thinkers to create the kind of content that speaks to your audiences in personal terms.
For more information on how you can automate your marketing operation, check out the OpenText Experience Suite.


Five Compliance Challenges Facing Your Organization in 2017


2017 is turning out to be a tumultuous year for compliance. A combination of Brexit, a Trump presidency and the reform of EU privacy rules has put regulatory change and uncertainty back into the spotlight. Mega-size fines have returned too, and compliance officers worry about personal liability more than ever.

1. The GDPR – the countdown is on

If your company hasn’t familiarized itself with the General Data Protection Regulation (GDPR) yet, you may already be behind. The GDPR was ratified in May 2016 and is designed to bring personal data protection into the digital age. It imposes stringent requirements on how companies store and handle the personal data of EU citizens. The regulation will have far-reaching impacts – from how organizations obtain consent and use cookies on their websites, to giving teeth to the right to be forgotten. Don’t assume that, because this is EU legislation, the GDPR won’t affect you: it affects any organization that collects and stores the personal data of EU citizens. With the GDPR becoming enforceable in May 2018, the countdown is on for organizations to prepare. The GDPR will impact not just the Compliance team but many other parts of the business.

Key Steps

An important first step is to gain clarity on the personal data processing practices and content within your organization, including:

• What personal data do you process?
• Where is it stored across the organization?
• Who has access to it?
• What consent has been provided, and where is it documented?
• Where is it transferred from and to (including to third parties and cross-border)?
• How is it secured throughout its lifecycle?
• Are there policies and processes in place to dispose of personal data?

Visit OpenText GDPR to learn more about the regulation and how OpenText can help.

2. Pressure on the Compliance function not letting up

Compliance officers have never had a higher profile than they do now, but with great power comes great responsibility.
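As a practical aid, the inventory questions listed under Key Steps can be captured per data category in a simple record and checked for gaps programmatically. The sketch below is a hypothetical illustration in Python; the class, field and method names are my own, not part of any OpenText product or of the regulation itself.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch of one entry in a personal-data inventory, mirroring
# the GDPR audit questions; all names here are illustrative only.
@dataclass
class PersonalDataRecord:
    category: str                 # what personal data is processed
    storage_locations: List[str]  # where it is stored across the organization
    accessors: List[str]          # who has access to it
    consent_reference: str        # where the consent provided is documented
    transfers: List[str] = field(default_factory=list)        # destinations, incl. third parties / cross-border
    security_controls: List[str] = field(default_factory=list)  # how it is secured through its lifecycle
    disposal_policy: str = ""     # process in place to dispose of the data

    def gaps(self) -> List[str]:
        """Return the inventory questions this record cannot yet answer."""
        missing = []
        if not self.consent_reference:
            missing.append("consent documentation")
        if not self.security_controls:
            missing.append("security controls")
        if not self.disposal_policy:
            missing.append("disposal policy")
        return missing
```

Running `gaps()` over every record then gives a rough first-pass readiness checklist, one category of personal data at a time.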
Pressure on the compliance function has been steadily increasing, and 2017 is no exception. For example, 69 percent of firms surveyed in 2016 expected regulators to publish even more regulations in the coming year, with 26 percent expecting significantly more. Personal liability also appears to be a persistent worry: 60 percent of survey respondents expect the personal liability of compliance officers to increase in the next 12 months, with 16 percent expecting a significant increase. In addition, the GDPR brings the rare explicit requirement to appoint a qualified compliance role, the Data Protection Officer (DPO). Though the GDPR does not establish the precise credentials DPOs must have, it does require that they have “expert knowledge of data protection law and practices.”

Key steps

Compliance officers don’t need to be technology experts, but they do need to know how to leverage governance, risk and compliance solutions to make their jobs easier. Other key steps include ensuring your policy framework is up to date and that staff understand, and are trained in, their compliance responsibilities. Read the AIIM white paper and infographic: Managing Governance, Risk and Compliance with ECM and BPM.

3. A new administration means changes in regulatory priorities

President Trump has been clear and consistent about his desire to reduce the number of regulations in place. From financial services to the environment, compliance officers are bracing for the changes and what they will mean for them. Most industry experts agree that even where regulations are streamlined or reformed, there will be plenty of work for your team to do to address the vacuum left by previous regulations or to interpret how the new regulations need to be applied. The picture may be uncertain at the moment, but you can be certain that any change means there will be work to do for your Compliance team.

Key steps

How do you prepare for the unknown?
Many pundits wisely advise that it’s business as usual and not to re-draft policies and procedures just yet. It is, however, a good time to evaluate your overall compliance program. For example, if your organization does not have its regulatory information management house in order, now is the time to clean up. Whether your firm is based in or works with the United States, the potential changes to the regulatory landscape mean that businesses will need to be adaptable in order to quickly take advantage of opportunities, mitigate risks and stay in compliance. Learn about OpenText compliance solutions. Continue to read compliance challenges 4 and 5 on page 2.


The Future of ECM is Coming Into Focus: Here’s Where to Start


Enterprise Content Management (ECM) is changing. It’s an idea everyone involved in content management has heard with increasing frequency over the past couple of years, driven by the reality that Enterprise Information Management needs are changing immeasurably. Enterprises are now embracing simple, lightweight solutions that nimbly address highly varied and very specific productivity and control issues. These applications can be either in the cloud or on-premises and — like blocks in a foundation — build on each other to deliver optimal ECM coverage. It’s a different approach from the traditional methodology of attempting to blanket the entire organization with an all-encompassing ECM solution. And it’s such a step-change in concept and practice that the respected analysts at Gartner have decided that ECM no longer cuts it as a sector name. In their estimation, the new topography is better encapsulated by “Content Services,” a nod to the decentralized, purpose-built applications that organizations find much easier and more effective to implement. Now, all this may promise unheralded levels of agility and integration, but I can also see LOB and IT execs shaking their heads and thinking: “We’ve just spent a decade investing in our ECM infrastructure. Are we starting all over?” The short answer — and the single most important thing to remember — is: No. The key to sustaining momentum and achieving future ECM success is building on what you already have. There’s just a new way of thinking about what comes next.

Resources to Speed your Learning Curve

Over the past month, I’ve had the opportunity to participate in a few initiatives with the industry experts at AIIM that have gone a long way toward explaining this path forward. First up was an extremely well-received roundtable webinar inspired by the completion of the OpenText acquisition of Documentum.
This must-see webinar for OpenText AND Documentum customers provides straightforward, detailed insight into our vision for the two product families, including future integration plans. The overview morphed into a wide-ranging panel discussion on what the overall future of ECM looks like and the competencies organizations need in order to excel in this new world. That was followed by a pair of related events: the release of an AIIM-authored eBook, “Revolution or Evolution? 10 Strategies to Navigate the Shift from ECM to Content Services,” and an accompanying webinar, Next-Gen Information Management — Succeeding in a New Era. Both provide excellent insight into the context behind this shift in ECM strategy, the constants that will always hold true, and the questions and actions organizations need to address to chart their journey. Start by reading the eBook, then view the accompanying webinar-on-demand. Next up is the annual AIIM Conference in Orlando, Florida, from March 14-16. OpenText and Documentum experts will be at the event, hosting a number of sessions on understanding and capitalizing on the evolution of ECM. If you’re planning on attending, book a 1:1 meeting with us; there’s no better way to get up to speed than connecting face-to-face and asking what this all means for you.


The 3 Most Asked Questions about Fax Technology in Healthcare


Freshly back from HIMSS 2017, I spent some time reflecting on the rich conversations I had with tradeshow attendees. These top three questions came up so consistently that I wanted to share them, just in case you missed them at HIMSS:

Q: How are healthcare organizations using fax solutions to save costs or be more efficient?

A: Two words: simplify and optimize. Fax and paper continue to dominate patient information exchange, accounting for as much as 90% of all exchanges. First, healthcare organizations should simplify their faxing by eliminating the security and compliance risk of standalone fax machines and manual faxing, replacing them with a secure, digital fax solution. This eliminates unnecessary paper and the costly, time-consuming task of manual faxing. Second, they should optimize their faxing by integrating the digital fax solution with Electronic Medical Record (EMR) systems, MFP devices, document management systems and other healthcare applications. By integrating electronic fax with the devices and applications they use the most, healthcare providers get access to the right patient information when and where they need it.

Q: What trends are you seeing with fax technology in healthcare?

A: There are two major trends in healthcare today: rising fax volumes (yes, you heard me right) and hybrid fax deployments. First, fax volumes are rising. As more patients enter the health system, attributable to more people having affordable access to healthcare and to the healthcare needs of the aging population, fax volumes increase, too. The second trend is the shift to hybrid fax deployments, which combine an on-premises fax server with cloud-based fax transmission. Hybrid deployments are becoming more and more popular because they simplify existing on-premises fax server deployments and allow healthcare organizations to use the cloud for just the transmission of the fax.
In addition to simplifying the deployment, the on-premises fax server keeps its integrations with EMR systems, MFP devices and other healthcare applications, and there is no change to the user experience or to established patient information exchange workflows.

Q: Where is fax technology headed, and how is OpenText innovating in healthcare?

A: As other forms of patient information exchange develop, such as Direct messaging and other forms of electronic exchange, it’s important that fax technology evolve to coexist with them, because fax is so deeply rooted in healthcare. When fax coexists with other forms of exchange, healthcare organizations can transition to new forms of exchange at their own pace or, just as importantly, at the pace of other providers in the healthcare continuum, with minimal or no change to the user experience (or, better yet, an improved user experience). For example, OpenText recently launched an innovative healthcare solution that combines fax and Direct messaging in a single solution, allowing healthcare organizations to convert an outbound fax to a Direct message whenever possible with no change to how they send a fax today. I’m already looking forward to HIMSS 2018 and the great conversations we will have then!


Regulatory Matters: Collaboration is key for Life (Sciences) in 2017 – Part Two


The Life Sciences sector is highly innovative. The Boston Consulting Group found that almost 20% of the world’s most innovative companies come from the sector. In fact, PwC suggests that Healthcare will surpass Computing as the largest industry by R&D spend by 2018.

Shining a light on the innovation paradox

Yet, for all the effort, there is still a lack of new products. Last year marked a six-year low for new drug approvals by the FDA. The rise of treatment-resistant superbugs has shone a light on the fact that there hasn’t been a completely new antibiotic for over 30 years. The poor return on R&D investment explains the paradox between rising innovation and declining new products. Deloitte found that returns on research and development investment at the top 12 pharmaceutical companies fell to just 3.7 percent in 2016 from a high of 10.1 percent in 2010. While many Life Sciences executives remain upbeat about the development of new medicines, it’s clear that two factors will drive success: achieving improved operating efficiencies internally and creating more strategic alliances externally.

The Internet of Things will increase the focus on cybersecurity

In 2014, the Financial Times found that cybersecurity for the healthcare and pharmaceutical industries worsened at a faster rate than for any other sector. As the sector becomes more and more IT-driven in terms of innovation, R&D and manufacturing, cybercrime has been increasing in areas such as intellectual property (IP) theft, international espionage and denial-of-service attacks. As the sector looks to embrace digital transformation and the Internet of Things (IoT), cybersecurity is likely to be at the top of every CIO’s priority list. The trend towards preventative and outcome-centric models relies on the ability to monitor and measure the health of individual patients. Whether for wearables or other intelligent medical devices, the requirement for some form of online connectivity creates a vulnerability.
At a recent cybersecurity conference, experts showed how items such as an insulin pump can be hacked. This represents a real threat to the individual, but it also raises the possibility of devices such as pacemakers being used to launch denial-of-service attacks on other targets. Addressing these concerns, the FDA has issued guidance to medical device manufacturers on mitigating and managing cybersecurity threats. The excitement around IoT has to be tempered with the need to deliver water-tight security. This stretches way beyond controlling access to user devices: it has to encompass data in transit as well as the management and storage of data within the life sciences company itself. Security-by-Design – built into all OpenText solutions – will become a foundational element of every part of the IT infrastructure for healthcare and pharmaceutical companies.

Achieve operational efficiencies to improve margin and time to market

With the focus firmly on value-based medicine, personalized care and population health, the Life Sciences sector is experiencing new levels of convergence and collaboration. Companies have begun to transform their business operations through collaborative product development and new service development. The ‘not invented here’ model is no longer appropriate for increasingly complex and expensive product lifecycles. As Deloitte points out: “Collaborating throughout the product development lifecycle is becoming an increasingly common and effective way for biopharma and medtech companies to offset mounting R&D costs, funding shortfalls, increasing disease complexity and technology advances”. In 2017, life sciences companies are transforming their traditional, linear supply chain into a network of dynamic, interconnected systems that integrates their ecosystem of partners.
This new supply chain model allows organizations to extend their value chain beyond product development into the enablement of care in an increasingly outcome-based healthcare environment. By creating a secure, open and integrated supply chain, organizations can reduce cost, increase quality and manage risk across the partner ecosystem. It also provides the foundation to quickly and easily extend the partner network for Life Sciences. As you evaluate your business strategies and priorities over the next 12-18 months, collaboration with trusted partners like OpenText can prepare your organization for the challenges ahead. Contact me at jshujath (@opentext.com) to discuss how we can help. If you missed the first blog in this two-part series, you can view it here.


OpenText WFO Video Series: Defining a Positive Customer Experience


To paraphrase that old adage about art: I may not know much about customer experience, but I know what I like. As professionals in the contact center business, we know a thing or two about customer experience because we live and breathe it every day. We would generally agree, I think, that customer experience is commonly described in terms of our customers’ personal opinions about their interactions. So a simple definition of a positive customer experience might include delivering a service that leaves customers feeling that they have been heard, that their issue has been satisfactorily resolved and that, yes, they just might recommend us to a few friends. But the full definition of a positive customer experience is much more nuanced, because it must take into consideration both the customer and business sides of the equation – the employees and technology that actually enable the delivery of effective interactions, as well as the voice-of-customer analysis and reporting that make it possible to gain insight into customer expectations. “What defines a positive customer experience?” is the first question asked of our 2017 Video Series speakers, and for his part Keith Dawson maintains that there are in fact two components to customer experience: what the customer perceives, and what impact it has on the business. Listen to what Keith has to say about the importance of “tangible business benefit” in understanding what defines a positive customer experience. While you’re at it, hear how the other Video Series speakers define what they mean by a positive customer experience. You might find that your own definition of a positive customer experience is confirmed… or perhaps tested and ultimately broadened. In all, our speakers answer eight important questions about driving awareness of the contact center within your organization and explain why this should be of interest to every contact center agent, supervisor, manager and executive.
So when you have a few moments, be sure to hear how our panel of experts answered all of these questions:

What defines a positive customer experience?
Why should customer experience be a top enterprise goal?
How can the contact center be positioned as a leader in customer experience?
How can the contact center align with the top priorities of executive leadership?
What’s the best way to coordinate contact center goals with other business units?
What performance goals resonate most with executive leadership?
What other tools demonstrate contact center impact to the executive team?
What are some lessons learned about reporting to the executive team?

And continue the conversation by commenting on our blog posts with #CCTRImpact or by using the “Get in Touch with a WFO Expert” form on the Video Series pages.


OpenText Positioned as a Leader Again in 2017 Gartner Magic Quadrant for Customer Communications Management (CCM) Software


Gartner recently published its 2017 Customer Communications Management Software Magic Quadrant, with OpenText positioned as a Leader in the report. Gartner evaluated both OpenText™ Communications Center and OpenText™ Exstream before the acquisition. However, Gartner focuses on each vendor’s technology execution, strategy and vision – not product specifics – so there is only one OpenText “dot” on the graphic. The 2015 CCM MQ positioned both OpenText Communications Center and Exstream in the “Leaders” quadrant. In the latest version, we believe we maintain a strong position as a Leader due to the breadth of capability across our CCM products and the strength of our direction to combine Communications Center and Exstream into a single platform, thereby bringing additional value to our current and prospective customers. Our position on the “Ability to Execute” axis improved over the average position of Communications Center and Exstream in the 2015 MQ. Our position on the “Completeness of Vision” axis held steady compared to Exstream’s placement in the last report, which we believe is a good sign that Gartner approves of our strategy to provide a single flagship CCM offering that enables business users to easily design and deliver omnichannel communications – including web, mobile, SMS and print channels. The future is bright and exciting as we bring our CCM products together into a single, powerful platform to meet any omnichannel CCM requirement. This is happening now and will bring benefits to all current customers of Communications Center Enterprise (formerly StreamServe), Exstream and Communications Center CRM (formerly PowerDocs). Get your copy of the full report here.


General Data Protection Regulation (GDPR) – What is it and how Does it Impact Enterprise Information Management


In May 2016, a new EU regulation was released to govern the protection of personal data: the General Data Protection Regulation (GDPR). It will apply from May 2018, after a two-year grace period. That leaves little more than a year for enterprises to evaluate what it means for them and how they need to prepare. As stated on the European Commission website: “The objective of this new set of rules is to give citizens back control over their personal data, and to simplify the regulatory environment for business.” Data protection laws are nothing new in the European Union. However, the new GDPR rules present significant changes to current data privacy regulations. For one, what used to be a directive is now a regulation with the full force of law, valid across all EU countries. And despite Brexit, the UK government has confirmed that the UK will implement the GDPR (read the UK Information Commissioner’s blog on this topic). The other important aspect is that the GDPR imposes substantial fines upon individuals and enterprises that do not adhere to the law. Minor breaches will be fined up to 10 million euros, or up to 2% of the total worldwide annual turnover of the preceding financial year for a business, whichever is higher. Major breaches will be fined up to 20 million euros, or up to 4% of the total worldwide annual turnover of the preceding financial year for a business, whichever is higher. And it should be re-emphasized that this is not just the turnover of the EU-located part of the enterprise, but the worldwide turnover of the enterprise.

Protecting Personal Data of EU Citizens – What does that mean?

As the GDPR protects the personal data of citizens of the European Union, it imposes duties upon enterprises that collect and manage personal data. Under the regulation, an entity that determines the purposes of processing is a “Data Controller,” and an entity that processes the data on its behalf is a “Data Processor.”
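The “whichever is higher” fine caps quoted above are easy to misread, so it can help to see them as plain arithmetic. A minimal sketch (the figures are the statutory caps quoted above; the function name is my own):

```python
def gdpr_fine_cap(worldwide_turnover_eur: float, major_breach: bool) -> float:
    """Upper bound of a GDPR fine: the higher of a fixed amount and a
    percentage of the total worldwide annual turnover."""
    fixed_cap = 20_000_000 if major_breach else 10_000_000
    turnover_cap = worldwide_turnover_eur * (0.04 if major_breach else 0.02)
    return max(fixed_cap, turnover_cap)

# For a group with EUR 2 billion worldwide turnover, a major breach is capped
# at 4% of turnover (EUR 80 million), well above the EUR 20 million floor.
```

Note that for smaller companies the fixed amount dominates, which is exactly why the “whichever is higher” wording matters.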
Data processing entities located in the EU are subject to the GDPR, but so are companies outside the EU that process the personal data of EU citizens: the regulation also applies to non-EU enterprises with contact points to the EU. Collecting and processing data is legitimate as long as it serves a justified purpose, as defined by the GDPR, for example “if data processing is needed for a contract, for example, for billing, a job application or a loan request; or if processing is required by a legal obligation …” Such justified purposes for storing and retaining personal data include laws that govern the retention of content, such as tax-relevant data and documents, where retaining the scanned vendor invoice or a customer bill is not only justified but an obligation.

What is the relevance of GDPR for Day-to-Day Business Processes?

Personal data relating to business partners, such as customers and suppliers, is processed and stored during the course of day-to-day business processes, in procure-to-pay as well as order-to-cash processes. To give some concrete examples, let’s take a look at an enterprise that uses SAP ERP to manage its processes and OpenText to attach business documents to those processes. It is not just about the data created and stored in the SAP database of the leading enterprise application (ERP, CRM, …); it is also about the business documents captured during these processes. Take, for example, an incoming vendor invoice on paper, which is scanned, attached to the transaction via ArchiveLink and then securely stored on OpenText™ Archive Center. Or, in an order-to-cash process, it is an incoming sales order and a delivery note to a client, which are linked to the SAP order and stored in OpenText.
In May 2018, the GDPR will start to apply, following a two-year transition period to allow the public and private sectors to get ready for the new rules. So how should enterprises prepare? With regard to storing personal data for a justified purpose, enterprises need to set up policies and procedures, not only to retain content as long as they are obliged to by law, such as taxation or product liability laws, but also to delete content in a timely fashion when it is no longer needed or the justified purpose for retention has expired. Learn more about OpenText’s capabilities to support GDPR requirements in the SAP environment in a forthcoming blog post, and also by reading our other blog entries here and here. You can also visit our web site to learn how OpenText EIM offers capabilities that can help customers prepare for GDPR, or listen to our webinar.
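The retain-then-delete obligation described above boils down to a simple rule: keep content while a legal retention period or a justified purpose still applies, and dispose of it once neither does. A hypothetical sketch of that rule (function and parameter names are illustrative, not an OpenText API):

```python
from datetime import date

def may_dispose(created: date, retention_years: int,
                purpose_expired: bool, today: date) -> bool:
    """True when content may (and should) be deleted: the statutory
    retention period has run out AND the justified purpose has expired.
    Simplified: ignores leap-day edge cases."""
    retention_end = date(created.year + retention_years,
                         created.month, created.day)
    return today >= retention_end and purpose_expired
```

In practice such a check would run per record class (invoices, delivery notes, applications), with the retention period driven by the applicable taxation or product liability law.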


Innovation: The Real Deal


The subject of innovation is surrounded by worn-out metaphors, which is why it’s so stimulating to find authentic instances of genuine innovation in practice. You know… real, tangible examples, like human commercial space travel, that redefine humanity’s reach and potential. Identifying this type of authentic innovation has been our remit ever since we first started planning the OpenText™ 2017 Innovation Tour many months ago. Now that we are just weeks away, I’m delighted to say that the speaker line-up for this year’s one-day event on 21st March doesn’t disappoint. Our keynote speaker is Stephen Attenborough, current Commercial Director, and former CEO, of Virgin Galactic, the world’s first commercial space travel business. Throughout the entire span of human history, only about 550 people have ever visited space. In fact, most of the seven billion of us on the planet have never even met anyone who has been there. And those few astronauts and cosmonauts who have are hardly representative of our species: they have all tended to be of a similar age, gender, ethnicity, language or professional background. Stephen and his colleagues at Virgin Galactic are using their innovation, imagination and expertise to change all that. To date, around 700 people from 50 different countries, aged from 10 to 90, from many different professions and speaking numerous languages, have signed up to become future Virgin Galactic astronauts. As Virgin Galactic’s first full-time employee, Stephen had the unique challenge of laying the commercial foundations to make one of the world’s most ambitious business projects a reality. Yet he wasn’t a ‘space nut’; he came from the financial services world. Sir Richard Branson is his boss. If ever there was a need for innovative thinking, it was here. He will be sharing with us the next exciting chapter in their journey.
Back closer to Earth, we’ve also got peer-level presentations across seven different tracks, including Manufacturing, Financial Services, Energy & Utilities, Public Services, Retail and Life Sciences. There’s something for everyone. The likes of Thames Water, FinTech Circle, IDG, the NHS and a former Sky, ITV and BBC presenter (to name just a few) are all coming together to share their respective expertise. And of course, our executive leadership team, including OpenText CEO and CTO Mark Barrenechea, OpenText President Steve Murphy and Executive VP of Engineering Muhi Majzoub, will be sharing key insights to help with your own innovation plans. In addition, some of our strategic partners, such as SAP, Accenture, Capgemini and Deloitte, will be providing high-level viewpoints that could directly impact both your industry and your future projects, all packed into a single day. If you’ve not yet registered, I’d strongly urge you to do so, as we’re approaching capacity. I look forward to seeing you there, and who knows, you might want to book a ticket on the world’s first space airline.

Read More

Regulatory Matters: Collaboration is key for Life (Sciences) in 2017 – Part One

Life Sciences

Life Sciences, like life itself, is constantly evolving. The rigid, product-based environment of complementary but discrete healthcare specialists is rapidly being replaced with a fluid ecosystem where growing, global value chains and strategic alliances drive innovation and price competitiveness. Secure collaboration is key, as Greg Reh, Life Sciences sector leader at Deloitte, says: “All of the pressures that life sciences companies are under, be they cost, regulatory or operational, in some way shape or form can be de-risked by adopting a much more collaborative approach to R&D, to commercialization, to manufacturing and distribution”. As increased collaboration touches every part of a Life Sciences business, there are a number of trends that will affect most companies during 2017. Prepare for uncertainty in the compliance landscape There has been a great deal written about the effect that the Trump administration will have on regulatory compliance. Amid all the uncertainty, Life Sciences companies can’t take a ‘wait and see’ attitude. One thing we do know for certain is that new legislation and regulations will keep coming. Whether it’s the pending regulations on medical devices in the EU or MACRA (the Medicare Access and CHIP Reauthorization Act) in the US, regulatory change does not stand still – not even for a new president! We also know that there is greater focus on enforcement. According to law firm Norton Rose Fulbright, almost one third of all securities class actions in the US in 2016 were against Life Sciences companies, a figure that had risen in each of the previous three years. The firm noted that 56% of claims in 2014 were for alleged misrepresentations or omissions. In response, companies have been placing focus on effective marketing content management to develop appropriate quality control over promotional and advertising materials. 
In addition, enforcement is becoming more stringent in areas such as the TCPA and FCPA – where last year the global generic drug manufacturer Teva agreed to pay $519 million to settle parallel civil and criminal charges. Within extended value chains, compliance becomes an increasingly collaborative process to ensure that information is available to the regulators. However, in compliance, collaboration works both ways. Life Sciences companies need to be more collaborative because global regulators and enforcement agencies are already cooperating with each other. As global regulators and agencies share information and work together, it becomes even more important to manage compliance risk across the organization and beyond. Consumer price sensitivity continues to drive value-based pricing models According to Statista, sales of unbranded generic drugs almost doubled between 2005 and 2015. In Japan, the government has an objective of substituting 80% of branded drugs with generics by 2020. There is increasing price sensitivity within both the buyer and regulator communities. Within many economies, the depressed fiscal environment limits the potential for healthcare spending. Governments and insurance companies want to shift payment from product sales to patient outcomes. In fact, the U.S. Centers for Medicare and Medicaid Services (CMS) wants 90% of all Medicare payments to be value-based by 2018. This value-based pricing model places extra burdens on drug companies but also offers opportunities for organizations to maintain profitability within branded drugs. It provides the opportunity to look ‘beyond the pill’ at the patient and what they’re doing. This requires end-to-end evidence management systems that exploit the masses of data created through managing patient outcomes to deliver value-added services around patient wellbeing, rather than simply selling more, or more expensive, drugs. 
At OpenText, we would expect most digital transformation efforts to include an element to enable the correct environment for value-based pricing, especially as operational efficiencies and time to market are improved. Part Two of this blog is available to read here.

Read More

Sending the Wrong Email can be an Opportunity to do the Right Thing

customer communications management

We all get them every day. Emails that we delete without reading. Yet companies invest countless hours in developing email campaigns and messaging to try to catch our attention or interest, just for us to ignore them. Last night my wife and I were discussing the top email subject headers that mean we will automatically delete a marketing email. My wife’s top flag was anything that gave her an order to do something. Yesterday’s winner in that category was an email she received from a company that shouted “This is important information you need – Don’t Delete!” – The first thing she did? Deleted that email. My pet peeve is over-friendly emails from people I’ve never met, like this example from yesterday: “Reminder – Hey Alan, did you have a chance to review my email?” My response: check the company on the email address, confirm it’s not someone I do business with, then hit the Delete button. Then there are the emails from companies that you do interact with on a regular basis, but when you read them you think “How did I end up on that mailing list?” You delete it and don’t give it much thought, beyond it ramping up an annoyance factor with the company that can eventually impact your overall customer experience. But great brands and customer-aware companies can use a well-defined customer communications management strategy to turn that “How did I end up on this list?” moment into a positive experience rather than a negative one. A case in point. My car. Although my family changes cars on a pretty regular basis, we are pretty brand loyal. At any given time you can bet that someone in the family is driving an example from this particular brand’s line-up. At the moment it’s me, and I am driving a fully tricked-out version of the company’s sportiest offering. It’s the tenth example of the brand we’ve owned. 
So imagine my surprise to receive an email from the company that was headed “We’re sorry to see you go.” It continued along the lines that the company had heard we had sold the car and wanted to ask a few questions about our experience with the brand, and why we’d moved on. Looking out the window I could still see my car sitting on the driveway. Yep, definitely on the wrong mailing list. I deleted the note, and didn’t think any more of it. Until two days later. A follow-up email arrived from the car company apologizing for the wrong email being sent. There was a well-worded message along the lines of “we know that you still own your car, and thanks for being a loyal customer.” This was followed with a note that, by way of apology, a small gift was in the mail (which arrived the next day). There was also an additional follow-up that laid out my ownership of the current car, and a note that as a token of thanks for my loyalty, if I headed to my local dealer within the next thirty days they would upgrade me from my 2015 model to the equivalent 2017 model at a stated lower APR. One mistake = good follow-up + bonus gift + acknowledgement of my customer loyalty + upsell offer. That’s good customer communications management: it helps strengthen relationships, develops good customer experience, and promotes more value and revenue across the customer lifecycle. I’m not ready to take up that trade-in offer just yet, but when it does come time to change my car again, guess which company will once again be top of my list?

Read More

2017. The Year Distributed Becomes Mainstream for Utilities?

Utilities

There’s been a great deal written about the regulatory uncertainty surrounding the new President, but one thing is certain: the influence of renewables and Distributed Energy Resources (DERs) will continue to grow. So, will 2017 be the year that Utility companies fully embrace DERs, and what will this new business model look like? The growth of DERs The loss of revenue due to DERs, and the justifiable concern of the utilities that DER users are not paying their fair share of grid maintenance costs, will need to be taken into account by regulators when they make rate design policies. At the same time, the Utilities are also beginning to see the opportunities that DERs bring to an aging infrastructure badly in need of modernization, allied with increasingly stagnant demand. Regardless of the new administration’s attitude to the EPA, the Clean Air Act or the Clean Power Plan, it is clear that the US government is keen to legislate in a way that ties grid modernization to the integration of DERs. Indeed, we are beginning to see more and more evidence of Utility companies investing in DERs as a means to abandon or defer upgrades to existing bulk generation and transmission/distribution assets. There are at least two reasons for this. The first is that renewable energy – especially solar – is rapidly reaching price parity with traditional energy sources, even natural gas; in some cases, solar and wind are proving, on average, more cost-effective than natural gas. The second is that Utility companies understand they need to change from a ‘cost-centric’ to a ‘customer-centric’ model to survive. 
Utilities companies are rapidly adapting to DERs While Utility companies struggle with stagnant or declining demand, which has led them to see any impingement from DERs as a serious competitive threat, customers have been faced with rising costs and declines in the quality of service, including unexpected power outages and planned rolling black-outs. So, the growing customer demand for DERs is completely understandable. It is not seen by most as a money-making scheme but more as a way of improving energy provision in a way that may lower their costs. It is in that context that Utility company executives have quickly turned their attention to the opportunities – not the threats – of DERs. It is instructive that in the State of the Electric Utility Survey 2015, 56% of utility sector respondents said they understood the opportunities of DERs but were unsure how to build a viable business model. A year later, they had begun work on those models – with the majority favoring partnership with third-party providers as the best route. Seizing the DER opportunity Whether acting as an aggregator for DER providers and microgrids or developing completely new supply chains, Utility companies can lower the cost of DER market entry while protecting existing revenue generation and beginning to explore entirely new service opportunities, away from bulk generation into niche and targeted supply. For this to succeed, two things must happen. First, Utility companies that have traditionally provided an end-to-end service must learn how to work in what ABB has neatly termed the energy neighborhood. ABB states: “Adopting the energy neighborhood perspective can help bridge historic silos in the energy market, which have been hindering the evolution of more flexible, efficient, sustainable, and environmentally friendly energy systems. 
By working together more, or at least consulting each other more regularly and proactively, utilities, DER operators and customers can make mutually beneficial decisions about assets, business operations and resources.” Secondly, the ability to communicate and share data and information across this neighborhood becomes essential, and proactively adopting digital is going to be a key requirement for Utilities. The DER market already requires sensors and meters to regulate quality and output, and the type of ecosystems being built for Utilities to integrate DERs into the grid requires complete transparency and visibility. The Utilities, DER companies and customers working together have to be able to make complete sense of the structured and unstructured data involved in service delivery. Coping with this level of digital disruption was recently covered in an interesting blog from OpenText CMO Adam Howatson, which you can read here. In practice, terms of service, SLAs and production and maintenance schedules will need to be combined with generation data and rating engines to ensure that every party is sure that they and the others are fully meeting their obligations. This is especially true with the trend towards Time of Use (ToU) and other demand-side rate design as a means to more effectively compensate DER providers. The challenge will be to implement new types of software – such as EIM – that can act as a central, integrated platform for communications, content sharing and data analytics, both within the Utility company and beyond, to connect and engage with customers, DER providers and, of course, the regulators. Successful integration of DERs with the existing grid is going to be critically important, as DERs are forecast to have a big impact on the “Duck Curve” – the net load forecast curve over the 24 hours of the day. 
The California system operator, CAISO, has performed detailed analysis of net load forecasts through to 2020 and has shown the need for steep ramping of resources and the possibility of over-generation risks. CAISO is also working with the industry and policymakers on rules and new market mechanisms that support and encourage the development of flexible resources to ensure a reliable future grid. The American Council for an Energy-Efficient Economy (ACEEE) has recently reported that Utilities can drive a 10% reduction in peak demand by using demand response capabilities, reducing the impact of the steepening Duck Curve. New EIM software, as an integrated platform for communications, will be crucial for the Utilities. It is essential for the successful sharing of content and of structured and unstructured data with all stakeholders, including DER providers, customers and system operators, and for introducing new demand response technology initiatives. Read more on page 2 to find out about regulation, and regulators taking center stage.
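To make the idea of demand-side rate design a little more concrete, here is a toy sketch of a time-of-use rating calculation. The rates, the peak window and the load profile are all invented for illustration; real ToU tariffs are far more elaborate.

```python
# Toy time-of-use (ToU) rating sketch. Rates ($/kWh) and the 24-hour load
# profile (kWh per hour) are invented purely for illustration.
PEAK_HOURS = range(16, 21)          # 4pm-9pm, the steep ramp of the duck curve
RATES = {"peak": 0.30, "off_peak": 0.10}

def tou_bill(hourly_load):
    """Price each hour's consumption at the peak or off-peak rate."""
    return sum(
        load * (RATES["peak"] if hour in PEAK_HOURS else RATES["off_peak"])
        for hour, load in enumerate(hourly_load)
    )

flat_load = [1.0] * 24              # 1 kWh every hour of the day
bill = tou_bill(flat_load)          # 5 peak hours * 0.30 + 19 * 0.10 = 3.40
print(round(bill, 2))
```

The same hour-by-hour logic, run in the other direction against generation data, is how a rating engine could compensate a DER provider more for feeding in power during the evening ramp than at midday.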

Read More

The Future of e-Delivery and its Impact on Customer Experience

e-delivery

It seems everyone is talking about “digital transformation,” but many are still unsure of what it is, why it is important, or even how to get started. Wikipedia defines digital transformation as the change associated with the application of digital technology in all aspects of human society. It has also been described as taking existing manual and paper-based processes and converting them to digital channels and documents – or “going paperless.” Many companies like yours are talking about improving the customer experience and going digital, but don’t know where to begin. Regardless of what it means to you, a lot of companies are now realizing that digital transformation often includes e-delivery – ensuring that emails containing bills, statements, ID cards and other business-critical information get to the intended recipients. They rely on their customer communications management (CCM) solution for this, but are often not aware of what is actually involved in reaching the recipient’s inbox. It can also be challenging to quantify the benefits of e-delivery and the costs associated with poor deliverability. To help you better understand the importance of email deliverability, we have a new recorded webcast available, which you can register for and view here. InfoTrends customer communications experts Matt Swain and David Stabel, and OpenText™ Exstream Manager of Product Strategy Avi Greenfield, present new research on consumer preferences and enterprise plans for e-delivery in this webcast. They also share key trends, challenges and best practices for managing e-delivery, including the impact that non-delivery can have on your bottom line. View the webcast.

Read More

Forget on-premises: InfoArchive, Docker and Amazon AWS

InfoArchive

There are two buzzwords that we have heard in the IT world for some time now: Cloud and Containerization. For me, 2016 proved that these two topics have changed from hype to reality, even in the biggest enterprises, and a lot of customers were asking for our solutions, like OpenText™ InfoArchive, in public clouds and/or running as Docker containers. While our engineering and PS teams are doing a great job in providing these solutions, I decided to walk this route myself. Follow me on the journey if you’re interested. I started my tests by creating a Docker Hub account. The account’s private repository will be used to store the InfoArchive Docker images and deploy them automatically from there. It is very easy to create a Docker container from InfoArchive – talk to me if you want to know more. It takes just a couple of steps and you’ll have your InfoArchive Docker container image ready. What’s next? Now let’s run this image in Amazon EC2 Container Service (ECS). Welcome to the “cloud world” If you’re new to the Amazon world you might have difficulty understanding some of the terminology around Amazon ECS; I hope this post will help you with that. ECS cluster In the first step we need an ECS cluster. An ECS cluster consists of EC2 instances and services. EC2 instances are our “good old” virtual machines and represent our available compute resources; the work that you assign to the cluster is described as “services”. The picture below shows that our InfoArchive cluster started with 3 micro servers (each of them automatically initiated by ECS from the amzn-ami… VM image below): Within a minute your cluster compute resources are running and waiting for you to assign them some work. Ignore the memory values in the screenshot below – I took the screenshot with 3 running tasks already occupying the memory. InfoArchive is a “classic” three-tiered architecture product: the native XML database xDB at the back end, the InfoArchive server as middleware and the InfoArchive web UI. 
To prepare for the scalability requirements of our deployment, we’ll run each of the tiers as dedicated containers, and we’ll “front-end” each of the tiers with an EC2 load balancer. This approach also simplifies the configuration of the container instances, since each container instance only has to connect to the underlying load balancer (with a known static hostname/IP) instead of trying to connect to the constantly changing IP addresses of the container instances. At a very high level, the architecture can be depicted as shown below: EC2 load balancers are set up quickly – my list (shown below) contains 4 instances, since I’ve also configured a dedicated public load balancer for xDB connectivity. With this step completed, the ECS cluster, its compute resources and the cluster load balancers are prepared. Let’s put InfoArchive on the cluster now.
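To make the one-container-per-tier layout concrete, here is a minimal sketch of how the three tiers might be described as ECS task definitions, written as Python dicts mirroring the JSON you would pass to ECS’s register-task-definition call. The repository name, image tags, ports and memory sizes are illustrative placeholders, not OpenText defaults.

```python
# Sketch of the three InfoArchive tiers as ECS task-definition dicts.
# All image names, ports and memory values below are illustrative.

def task_definition(family, image, port, memory_mib=512):
    """Build a minimal single-container ECS task definition."""
    return {
        "family": family,
        "containerDefinitions": [
            {
                "name": family,
                "image": image,
                "memory": memory_mib,
                "essential": True,
                # Map the container port onto the host so the tier's
                # load balancer can reach every container instance.
                "portMappings": [{"containerPort": port, "hostPort": port}],
            }
        ],
    }

# One task definition per tier; each tier sits behind its own load
# balancer, so containers only need the balancer's stable hostname,
# never the changing IPs of their peers.
TIERS = [
    task_definition("ia-xdb", "myrepo/infoarchive-xdb:latest", 2910),
    task_definition("ia-server", "myrepo/infoarchive-server:latest", 8765),
    task_definition("ia-webapp", "myrepo/infoarchive-webapp:latest", 8080),
]

for tier in TIERS:
    print(tier["family"], tier["containerDefinitions"][0]["portMappings"])
```

From here, each dict would be registered as a task definition and wrapped in an ECS service so the scheduler keeps the desired number of containers running per tier.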

Read More

WFO Video Series: Driving Contact Center Awareness

WFO video series

I have had the pleasure of speaking with Donna Fluss, President of DMG Consulting, on numerous occasions, and she often ends her sessions with a very compelling illustration – a boardroom table with one empty seat. She then asks the audience, “How do you earn your seat at the table?” Then, with the right amount of poise and firmness, she challenges the audience to align their day-to-day lives with the quarter-by-quarter business objectives of their organization. “It is up to you,” she says, “to establish the importance of the contact center in helping the enterprise achieve its strategic goals.” In much the same way, I often see the normal cast of characters: CEO, CFO, CMO, COO, CIO, CTO…involved in strategy, but painfully unaware of the role that their contact center plays in driving corporate customer experience goals. So to help you drive contact center awareness, OpenText WFO Software is launching a new video series with your journey in mind. Our 2017 video series is now online and features a great line-up of industry veterans and analysts. We asked each speaker for their view on questions such as: What defines a positive customer experience with your company? How do you align your contact center with the top priorities of your executive leadership? How do you align your goals with those of other business units? And the list goes on! Visit the Video Series, where you can easily navigate from question to question and from speaker to speaker, then listen to video commentaries from each panelist. We encourage you to share the insights. Each video clip is very short, and as they are published over the coming weeks you can share a specific video with your colleagues or via social media by clicking on the blue “share” box under each video. Use the hashtag #CCTRImpact. Finally, I want to extend my sincere thanks to Donna, Jason, Keith, Kate and Roger for their time in helping us bring this series to you. 
Their individual expertise is highly respected in our industry, and we all hope the advice they have offered in each video will help you get closer to having your place at the table.

Read More

What a Difference a Year Makes, Here and in IoT

Internet of Things (IoT)

This is my first anniversary at OpenText – and what a year it’s been. I’ve travelled the world and met some amazing people doing some amazing things. Special mention has to go to the 2016 Enterprise World, OpenText’s flagship event, and the IDC Manufacturing Summit in Lisbon, where we discussed the role of Digital Transformation in the sector. But, let’s talk about the Internet of Things (IoT). I wrote a blog in early 2016 predicting that it would be the year that IoT went mainstream in manufacturing, and I thought it might be good – unlike so many other analyst predictions – to go back and take a look at just how right I was! A year back, my argument was that IoT was beginning the move from theory to practice. Organizations were building IoT ecosystems that would fundamentally change the way they operated. My particular interest is the Industrial Internet of Things – or Industry 4.0 – which is about enabling manufacturers to work smarter and attain business goals such as: doing more with less by increasing the use of smart data to power business efficiencies; opening up new market opportunities that were previously inaccessible before disruptive technology was available; growing their business by increasing the value of their products through full lifecycle support, enhancing products with added lifelong services; and increasing product quality through real-time and virtual monitoring and predictive maintenance, thus retaining customer loyalty for life. Glance at recent research and my predictions are looking pretty good. According to McKinsey, the economic impact of IoT applications could be as much as $11 trillion by 2025 – up to $3.7 trillion of which will happen within factory environments. By 2019, says IDC, 75% of manufacturing value chains will undergo an operating model transformation, with digitally connected processes that improve responsiveness and productivity by 15%. 
More impressively, Tata Consulting has found that manufacturers utilizing IoT solutions saw an average 28.5% increase in revenue between 2013 and 2014. Indeed, OpenText’s own 2017 research has shown that 38% of European manufacturers surveyed have already implemented IoT solutions, with another 48% planning to within the next twelve months. Look out for more on this in a future blog. One company I highlighted as a great example of how IoT is already beginning to change everything was Tesla. I was lucky enough to test drive the Tesla S on its introduction to the UK, and the motoring and customer experience was like no other. It demonstrated functionality and capability that are real differentiators for the industry. Add to that a unique go-to-market, service and ownership model, and this car is an automotive game-changer in so many ways. Now, Tesla says it’s pretty close to having a driverless car that can travel from New York to Los Angeles without any human intervention. This is an incredible example of how quickly things have progressed in such a short period of time – and it’s only one of many. We are now at the stage where it is easy to point to factories that are already moving away from traditional centralized production processes to an integrated, highly automated network of devices and machines. Companies are already beginning to create flexible production processes to move from mass production to individual runs that can be achieved cost-effectively and just in time to meet unique customer demands. So, we’ve made a great start, but I’m not sure we can call IoT mainstream just yet. 
As the World Economic Forum points out, there are still some important challenges to be overcome: how to assure the interoperability of systems; how to guarantee real-time control and predictability when thousands of devices communicate at the same time; how to prevent disruptors, or competitors, taking control of highly networked production systems; and how to determine the benefit or return on investment in IoT technologies. This echoes my thoughts exactly. You can watch a webinar here that I held in partnership with The Manufacturer magazine in the UK. At the time, I made the point that organizations had to take much greater control of their data. By adding the technology that collects that data and channeling it through an Enterprise Information Management (EIM) system like OpenText’s, they are presented with suites of information on which to base much smarter and faster business decisions. To this I’d add the need for a powerful and easy-to-use analytics engine that can deliver both predictive and operational insight into the vast amounts of data created within any IoT ecosystem. Placing IoT at the heart of business strategy is also essential – and companies that have done this are starting to reap the rewards. One of my first engagements when I joined OpenText last year was to take in the inaugural IoTTechExpo Conference in London. Patrick Bass, CEO of ThyssenKrupp NA, gave an excellent presentation on how successful transformation projects need to be part of your business strategy. One year on, Andreas Schirenbeck, CEO of ThyssenKrupp Elevators, spoke about how IoT is now transforming their industry. I’ll come back to this in another blog soon. So, I’m going to take credit for being half right! 
The trend towards IoT implementation is coming on in leaps and bounds, but while organizations focus on building the interoperability of networks and devices, they must also make sure they have a platform that ensures they maximize the value of their data and information. And, if you’d like to wish me a happy anniversary, you can send me a tweet!

Read More

How We’re Using Discovery Analytics to Solve GDPR Challenges

discovery analytics

We’re still over a year away from the General Data Protection Regulation’s (GDPR) “go live” date, but the sense of dread at recent conferences is tangible. And understandably so: the GDPR imposes sweeping requirements on organizations to understand and protect the personal data they process and use. While records management and infosec have so far dominated the GDPR discussion, your lawyers and compliance teams are also gearing up with discovery analytics, including machine learning, to help them manage GDPR risk. The New Cost of Personal Data The GDPR introduces a slew of information governance regulations that attach to Personally Identifiable Information (PII), which is defined as any information relating to an individual. If that sounds broad, it’s because it is. Your name, your pictures, your email, your IP address – really anything that could be used to identify you is included. The GDPR creates personal rights in this data, like the right to be forgotten and the rights to audit your data, correct it, or transfer it. It also includes enhanced data breach notification and response obligations. Basically, if your organization touches consumer data in some fashion, you’re likely covered by the GDPR. And if your organization’s products or services regularly involve personal data, security takes on even more prominence. Failure to comply with the GDPR could incur fines of up to 20 million Euros or an enormous 4% of global turnover. These dramatic penalties have spurred organizations to conduct Privacy Impact Assessments (PIAs) and proactively audit their own data to measure risk and exposure. Understanding how and where you handle personal data is the first challenge, and a significant one, since PII can be embedded in nearly all your business documents – and some are more important than others. Finding a Needle in a Stack of Needles If a basic component of GDPR compliance is understanding your data, then naturally you need tools to search, identify, categorize, and flag documents. 
The traditional approach of manually reviewing contracts one by one for language about PII treatment, processing, or warehousing is unreliable and inefficient. During a breach response or a PII assessment, triage is key: you need to rapidly identify the most sensitive documents and tag them for special handling (more on that later). To do so, you need discovery analytics and machine learning. Pattern identification is a crucial technology for rapidly identifying simple documents containing standardized PII like credit cards, licenses, medical records, and more. But this technology on its own won’t identify all the documents necessary for a PIA, because not all PII is pattern-based; much of it is highly contextual. That’s where concept analysis, an unsupervised machine learning technique, comes in. This technology analyzes the co-occurrence of words and clusters documents together according to contextual themes – even if they lack specific keywords – and without any human feedback. These tools can, with astounding accuracy, distinguish between different contexts that influence how we interpret words. For instance, if the word “private” appears in a number of documents related to military ranks, the engine could group those documents apart from ones that feature the word “private” in relation to personal data. These automated tools can get you started on a privacy evaluation, but the ultimate analysis is too nuanced to rely exclusively on machine categorization. Human review is an indispensable element, so having document review workflows and administration tools is necessary. This means the ability to batch out documents in related groups to keep legal reviewers engaged with relevant content. And with a continuous machine learning algorithm running in the background, each decision that the legal team makes while eyeballing documents trains a recommendation engine. 
This algorithm can then evaluate the remaining documents and predict which ones are likely to contain sensitive data (much more on that interesting topic here). In this way, you can start with a known dataset (like your vendor contracts database) and then leverage analytics to identify unknown, risk-prone documents. As you review more documents and find more PII-laden content, the algorithm is constantly learning in the background, conducting broad sweeps of your remaining data to prioritize batches of content that are likely to contain PII. What’s more, these algorithms can run on an issue-specific basis – a crucial ability, since the GDPR distinguishes between “personal data” and “sensitive personal data.” Knowing is Half the Battle The broader impact of the GDPR will shake out over years; it’s still unclear how individuals will exercise their rights or how DPAs will enforce the rules. But organizations can take steps today towards understanding their risk exposure and doing what they can to mitigate potential consequences. OpenText™ Discovery combines tools like machine learning, pattern identification, and entity extraction with data visualizations, keywords, and metadata filters to help legal and compliance teams identify PII-carrying data. All of this is guided by a document review workflow that has been honed over years of litigation projects, with layered security. Learn more about GDPR readiness by watching our analyst webinar here.
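To give a feel for the pattern-identification step described above, here is a toy sketch in Python. The regexes are deliberately simple illustrations of matching standardized PII; a real discovery platform ships far richer, locale-aware pattern libraries.

```python
import re

# Toy pattern-identification pass for standardized PII.
# These regexes are illustrative, not exhaustive or production-grade.
PII_PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),   # 16 digits, optional separators
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def flag_pii(text):
    """Return the set of PII categories whose patterns match `text`."""
    return {label for label, rx in PII_PATTERNS.items() if rx.search(text)}

doc = "Cardholder jane.doe@example.com, card 4111 1111 1111 1111, SSN 123-45-6789."
print(sorted(flag_pii(doc)))  # → ['credit_card', 'email', 'us_ssn']
```

A triage pipeline could run a pass like this first to tag the obvious hits, leaving the contextual, non-pattern PII to concept analysis and human review.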

Read More

Information-Centric Design Broadens Variety of Use Cases for Low Code Application Platforms


In this post we welcome guest blogger Maureen Fleming, Vice President at IDC and lead analyst of IDC's IoT analytics and information management practice, as well as IDC's research covering process automation, API management and continuous analytics. The use of low code application platforms to build and deploy custom applications is one of the fastest-growing large technology markets. In fact, spending on low code is growing so fast that, by 2018, we expect enterprises to spend more on low code platforms than on traditional application platforms running developer-written custom code. This is true whether enterprises run custom applications in their datacenters or on a public cloud. The goal of low code platforms is to speed up development and minimize re-work by making it easy for business teams to work with developers to design and build applications. For smaller, tactical applications, developers may not be involved at all. Low code development evolved from either workflow-oriented tools or data-centric offerings, and teams had to choose which approach made the most sense for their application. As customers began demanding more capabilities to support a broader and more flexible spectrum of applications, some vendors began to offer both workflow and information-centric capabilities within the same platform. They saw value in not forcing the customer to choose, and in the greater flexibility of separating information-based functionality from workflow.

Low Code's Shift to Information-Centric Design

Products embracing information-centric design shift teams from building automation by linking functionality to specific nodes of a workflow to using the information structure as the driver of automation and the basis for functionality development. The foundation of this structure is the data entity, which abstracts data into subjects and their properties.
These properties are then used in the development of rules; in interaction, UI, and form design; in navigation; or as parameters that determine which workflow is called, the flow of a process, or the page flow of an application. By contrast, classic workflow automation uses business objects as a building block. Similar to data entities, forms are created from business object properties, and rules can use the same properties. Unlike data entities, however, the properties of business objects are always associated with the workflow. Information-centric design does not require an associated workflow; in fact, workflow becomes a function subordinate to the information structure. As a result of the shift to information-centric design, there has been a significant expansion in the capabilities and use of low code development, with corresponding improvements in ease of use for non-developers. Today, the same platform can automate a process and provide case management as well as deploy browser and mobile apps disassociated from any type of automated workflow. Workflow automation tools continue to be important, and with the shift to cloud architecture and the use of APIs, there are ways to access workflow as needed, either broadly or discretely, in support of a specific purpose. In fact, workflow has become more important in our increasingly distributed way of doing business. But for organizations investing in low code to give business teams autonomy over their automation, or to build strategic applications on a platform, software centered on information design that also supports workflow is the optimal choice across a broad spectrum of automation use cases.

More About Maureen Fleming

With more than 20 years of industry and analyst experience, Maureen Fleming is Program Vice President for IDC's Business Process Management and Middleware research area. In this role, Ms. Fleming examines the products and processes used for building, integrating, and deploying applications within an extended enterprise system.
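The data-entity idea described above can be illustrated with a small sketch in which an entity's properties drive both form generation and the choice of workflow. All names here (the entity, its fields, the workflow labels) are hypothetical and not any vendor's API:

```python
from dataclasses import dataclass

@dataclass
class DataEntity:
    """A data entity: a subject plus its properties."""
    subject: str
    properties: dict

def generate_form(entity):
    """Derive a form definition directly from the entity's properties."""
    return [{"field": name, "type": type(value).__name__}
            for name, value in entity.properties.items()]

def select_workflow(entity):
    """A rule over properties decides which workflow to invoke --
    workflow is subordinate to the information structure."""
    if entity.properties.get("amount", 0) > 10_000:
        return "manager_approval_flow"
    return "auto_approval_flow"

claim = DataEntity("ExpenseClaim", {"employee": "A. Smith", "amount": 12_500})
print(generate_form(claim))
print(select_workflow(claim))
```

Note the inversion this illustrates: the form and the routing rule are both derived from the entity, so adding a property updates them without touching any workflow node, whereas in classic workflow automation the same properties would live inside a specific workflow definition.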

Read More

Excellence in Sales Order Entry – From Document to Digital


Sales orders: the documents that carry the sweet smell of company success! Physical (or electronic) proof that your company sells products your customers like. Proof that you make money and create and retain jobs. So what could there be not to like about sales orders? Well, the question is: are your sales orders solely creating value and financial success for your company? Or are they also costing you money? Are they slowing down your business? Maybe even creating conflicts with your customers?

Fully digital sales order process – why?

In a digital world, you should consider automating your sales order entry process from beginning to end. The digital sales order process should start the minute a sales order enters your company: from document to digital. This should be independent of your input channel. Whether your sales orders reach you via EDI, email, fax or paper document, make sure to digitize them when they first touch your company. Many of our customers have EDI in place for 60-80% of their sales orders. However, the remaining 20-40% slows down their business, preventing them from having full insight into the status of ALL sales orders.

The impact

When our customers started to capture data from PDFs, emails and paper documents as well, they realized how valuable a fully automated, digital process is. With their model, from document to digital, they turned the sales order process into a fast, customer-friendly and fully transparent one. They now have full insight into the status of any sales order. If a customer has a request referring to a sales order, they can answer it within seconds, regardless of its input channel or process status. Reporting and transparency have improved dramatically. Management is now able to track the performance of the sales order process across countries, from month to month or year over year. Even performance tracking has become a simple activity.
It is fast and it is accurate, not only for the electronic input channels but for all sales orders. The extracted information also shows that, with the new integrated sales order automation, customers have been able to cut sales order cycle time in half by automating the remaining 20-40% of sales orders as well. Customer relationships have also improved, because disputes over orders, invoices and wrong deliveries have reached an all-time low. Analyzing sales orders also makes it possible to offer purchasing recommendations to customers, derived from the orders of other customers who regularly buy specific products in combination with other products. These cross-sell opportunities are well received by customers because they create value and often help meet core business needs. Have you identified a need to further digitize your sales order entry process? Take a look at how OpenText™ Business Center for SAP Solutions helps to improve the sales order process and much more.
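The principle of digitizing at first touch, independent of input channel, amounts to a normalization step that maps every raw order, however it arrived, onto one common digital record with a trackable status. A minimal sketch, with invented record fields and channel names rather than any product's actual data model:

```python
from dataclasses import dataclass

@dataclass
class DigitalOrder:
    """One common record for every sales order, whatever the channel."""
    order_id: str
    channel: str
    customer: str
    status: str = "captured"

def ingest(raw, channel):
    """Normalize a raw order into a DigitalOrder at first touch.
    In practice the extraction step differs per channel (EDI parsing,
    OCR for faxes and paper scans); here each payload is already a dict."""
    return DigitalOrder(order_id=raw["id"], channel=channel,
                        customer=raw["customer"])

orders = [
    ingest({"id": "SO-1001", "customer": "Acme"}, "edi"),
    ingest({"id": "SO-1002", "customer": "Globex"}, "email_pdf"),
]
# One status view across all channels, so any customer query about
# any order can be answered from the same place:
print({o.order_id: (o.channel, o.status) for o in orders})
```

Because every order lands in the same record shape, status reporting and cycle-time measurement work identically for the EDI majority and the formerly manual 20-40%.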

Read More

Ovum Names OpenText as a Leader in Web Experience Management


The research and analysis firm Ovum has released a report naming two OpenText products as leaders in web experience management, also commonly known as web content management (WCM). The Ovum Decision Matrix: Selecting a Web Experience Management Solution, 2016-17 report cited OpenText™ TeamSite and OpenText™ Web Experience Management as market leaders because of their strengths in technology and execution. Here are the strengths Ovum attributed to OpenText WCM solutions:

- Top-rated for maturity
- Strong roadmaps and long-term strategy
- Ease of use and interoperability
- Large portfolio of capabilities

Ovum considers web experience management a key element of digital transformation for today's enterprise organizations, and we at OpenText fully agree. Organizations need to attract, engage, and hold the attention of their customers through round-the-clock, connected digital experiences. While web experience management, or WCM, initially focused on websites, it now encompasses much more: digital asset management (DAM), web analytics, social, mobile experiences, and beyond. Sophisticated enterprise solutions cover the entire customer journey and connect with other key platforms for marketing automation, e-commerce, and customer relationship management. Market leaders not only score high in key capabilities but are also widely accepted as best-of-breed. Read more details in the report and take a look at OpenText WCM offerings.

Read More