Information Management

Innovation: The Real Deal

Innovation

The subject of innovation is surrounded by worn-out metaphors, which is why it’s so stimulating to find authentic instances of genuine innovation in practice. You know…real, tangible examples, like human commercial space travel, that redefine humanity’s reach and potential. Identifying this type of authentic innovation has been our remit ever since we first started planning the OpenText™ 2017 Innovation Tour many months ago. Now that we are just weeks away, I’m delighted to say that the speaker line-up for this year’s one-day event on 21st March doesn’t disappoint. Our keynote speaker is Stephen Attenborough, current Commercial Director, and former CEO, of Virgin Galactic, the world’s first commercial space travel business. Throughout the entire span of human history, only about 550 people have ever visited space. In fact, most of the seven billion of us on the planet have never even met anyone who has been there. And the few astronauts and cosmonauts who have are hardly representative of our species. They have all tended to be of a similar age, gender, ethnicity, language or professional background. Stephen and his colleagues at Virgin Galactic are using their innovation, imagination and expertise to change all that. To date, around 700 people from 50 different countries, ranging in age from 10 to 90, from many different professions and speaking numerous languages, have signed up to become future Virgin Galactic astronauts. As Virgin Galactic’s first full-time employee, Stephen had the unique challenge of laying the commercial foundations to make one of the world’s most ambitious business projects a reality. Except he wasn’t a ‘space nut’; he came from the financial services world, and Sir Richard Branson is his boss. If ever there was a need for innovative thinking, it was here. He will be sharing with us the next exciting chapter in their journey. Closer to earth, we’ve also got peer-level presentations across seven different tracks, including Manufacturing, Financial Services, Energy & Utilities, Public Services, Retail and Life Sciences. There’s something for everyone. The likes of Thames Water, FinTech Circle, IDG, the NHS, as well as a former Sky, ITV and BBC presenter (to name just a few) are all coming together to share their respective expertise. And of course, our executive leadership team, including OpenText CEO and CTO Mark Barrenechea, OpenText President Steve Murphy and Executive VP of Engineering Muhi Majzoub, will be sharing key insights to help with your own innovation plans. In addition, some of our strategic partners, such as SAP, Accenture, Capgemini and Deloitte, will also be providing high-level viewpoints that could directly impact both your industry and your future projects – all of this is packed into a single day. If you’ve not yet registered, I’d strongly urge you to do so as we’re approaching capacity. I look forward to seeing you there – and who knows, you might want to book a ticket on the world’s first space airline.

Read More

Regulatory Matters: Collaboration is key for Life (Sciences) in 2017 – Part One

Life Sciences

Life Sciences, like life itself, is constantly evolving. The rigid, product-based environment of complementary but discrete healthcare specialists is rapidly being replaced with a fluid ecosystem where growing, global value chains and strategic alliances drive innovation and price competitiveness. Secure collaboration is key, as Greg Reh, Life Sciences sector leader at Deloitte, says: “All of the pressures that life sciences companies are under, be they cost, regulatory or operational, in some way shape or form can be de-risked by adopting a much more collaborative approach to R&D, to commercialization, to manufacturing and distribution”. As increased collaboration touches every part of a Life Sciences business, there are a number of trends that will affect most companies during 2017. Prepare for uncertainty in the compliance landscape A great deal has been written about the effect that the Trump administration will have on regulatory compliance. Amid all the uncertainty, Life Sciences companies can’t take a ‘wait and see’ attitude. One thing we do know for certain is that new legislation and regulations will keep coming. Whether it’s the pending regulations on medical devices in the EU or MACRA (the Medicare Access and CHIP Reauthorization Act) in the US, regulatory change does not stand still – not even for a new president! We also know that there is greater focus on enforcement. According to law firm Norton Rose Fulbright, almost one third of all securities class actions in the US in 2016 were against Life Sciences companies, a figure that had risen in each of the previous three years. The firm noted that 56% of claims in 2014 were for alleged misrepresentations or omissions. In response, companies have been placing greater focus on effective marketing content management to develop appropriate quality control over promotional and advertising materials. In addition, enforcement is becoming more stringent in areas such as the TCPA and FCPA – where last year the global generic drug manufacturer Teva agreed to pay $519 million to settle parallel civil and criminal charges. Within extended value chains, compliance becomes an increasingly collaborative process to ensure that information is available to the regulators. In compliance, however, collaboration works both ways: Life Sciences companies need to be more collaborative because global regulators and enforcement agencies are already cooperating with each other. As regulators and agencies share information and work together, it becomes even more important to manage compliance risk across the organization and beyond. Consumer price sensitivity continues to drive value-based pricing models According to Statista, sales of unbranded generic drugs almost doubled between 2005 and 2015. In Japan, the government has an objective of substituting 80% of branded drugs with generics by 2020. There is increasing price sensitivity within both the buyer and regulator communities. Within many economies, the depressed fiscal environment limits the potential for healthcare spending. Governments and insurance companies want to shift payment from product sales to patient outcomes. In fact, the U.S. Centers for Medicare and Medicaid Services (CMS) wants 90% of all Medicare payments to be value-based by 2018. This value-based pricing model places extra burdens on drug companies, but it also offers opportunities for them to maintain profitability within branded drugs. It provides the opportunity to look ‘beyond the pill’ and focus more on the patient and what they’re doing. This requires end-to-end evidence management systems that exploit the masses of data created through managing patient outcomes to deliver value-added services around patient wellbeing, rather than simply selling more, or more expensive, drugs. At OpenText, we would expect most digital transformation efforts to include an element that enables the correct environment for value-based pricing, especially as operational efficiencies and time to market are improved. Part Two of this blog is available to read here.

Read More

Sending the Wrong Email can be an Opportunity to do the Right Thing

customer communications management

We all get them every day. Emails that we delete without reading. Yet companies invest countless hours in developing email campaigns and messaging to try to catch our attention or interest, just for us to ignore them. Last night my wife and I were discussing the email subject lines that mean we will automatically delete a marketing email. My wife’s top flag was anything that gave her an order to do something. Yesterday’s winner in that category was an email she received from a company that shouted “This is important information you need – Don’t Delete!” The first thing she did? Deleted that email. My pet peeve is over-friendly emails from people I’ve never met, like this example from yesterday: “Reminder – Hey Alan, did you have a chance to review my email?” My response: check the company on the email address, confirm it’s not someone I do business with, then hit the Delete button. Then there are the emails from companies that you do interact with on a regular basis, but when you read them you think, “How did I end up on that mailing list?” You delete it and don’t give it much thought, beyond it ramping up an annoyance factor with the company that can eventually impact your overall customer experience. But great brands and customer-aware companies can use a well-defined customer communications management strategy to turn that “How did I end up on this list?” moment into a positive experience rather than a negative one. A case in point: my car. Although my family changes cars on a pretty regular basis, we are pretty brand loyal. At any given time you can bet that someone in the family is driving an example from this particular brand’s line-up. At the moment it’s me, and I am driving a fully tricked-out version of the company’s sportiest offering. It’s the tenth example of the brand we’ve owned. So imagine my surprise to receive an email from the company headed “We’re sorry to see you go.” It continued along the lines that the company had heard we had sold the car and wanted to ask a few questions about our experience with the brand, and why we’d moved on. Looking out the window, I could still see my car sitting on the driveway. Yep, definitely on the wrong mailing list. I deleted the note and didn’t think any more of it. Until two days later. A follow-up email arrived from the car company apologizing for the wrong email being sent. There was a well-worded message along the lines of “we know that you still own your car, and thanks for being a loyal customer.” This was followed with a note that, by way of apology, a small gift was in the mail (it arrived the next day). There was also an additional follow-up that laid out my ownership of the current car, and a note that, as a token of thanks for my loyalty, if I headed to my local dealer within the next thirty days they would upgrade me from my 2015 model to the equivalent 2017 model at a stated lower APR. One mistake = good follow-up + bonus gift + acknowledgement of my customer loyalty + upsell offer. That’s good customer communications management: it helps strengthen relationships, develops good customer experience, and promotes more value and revenue across the customer lifecycle. I’m not ready to take up that trade-in offer just yet, but when it does come time to change my car again, guess which company will once again be top of my list?

Read More

2017. The Year Distributed Becomes Mainstream for Utilities?

Utilities

There’s been a great deal written about the regulatory uncertainty surrounding the new President, but one thing is certain: the influence of renewables and Distributed Energy Resources (DERs) will continue to grow. So, will 2017 be the year that Utility companies fully embrace DERs, and what will this new business model look like? The growth of DERs The challenges of lost revenue due to DERs, and the justifiable concern of the utilities that DER users are not paying their fair share of grid maintenance costs, will need to be taken into account by regulators when they make rate design policies. At the same time, the Utilities are also beginning to see the opportunities that DERs bring to an aging infrastructure badly in need of modernization, allied with increasingly stagnant demand. Regardless of the new administration’s attitude to the EPA, the Clean Air Act or the Clean Power Plan, it is clear that the US government is keen to legislate in a way that ties grid modernization to the integration of DERs. Indeed, we are beginning to see more and more evidence of Utility companies investing in DERs as a means to abandon or defer upgrades to existing bulk generation and transmission/distribution assets. There are at least two reasons for this. The first is that renewable energy – especially solar – is rapidly reaching price parity with traditional energy sources, even natural gas. In some cases, solar and wind are proving, on average, more cost-effective than natural gas. The second reason is that Utility companies understand they need to change from a ‘cost centric’ to a ‘customer centric’ model to survive. Utilities companies are rapidly adapting to DERs While Utility companies struggle with stagnant or declining demand, which has meant them seeing any impingement from DERs as a serious competitive threat, customers have been faced with rising costs and declines in the quality of service, including unexpected power outages and planned rolling black-outs. So, the growing customer demand for DERs is completely understandable. It is not seen by most as a money-making scheme but more as a way of improving energy provision services in a way that may lower the cost to them. It is in that context that Utility company executives have quickly turned their attention to the opportunities – not the threats – of DERs. It is instructive that in the State of the Electric Utility Survey 2015, 56% of the utility sector respondents said they understood the opportunities of DERs but were unsure how to build a viable business model. A year later, they had begun work on those models – with the majority favoring partnership with third-party providers as the best route. Seizing the DER opportunity Whether acting as an aggregator for DER providers and microgrids or developing completely new supply chains, Utility companies can lower the cost of DER market entry while protecting existing revenue generation and beginning to explore entirely new service opportunities away from bulk generation into niche and targeted supply. For this to succeed, two things must happen. First, Utility companies that have traditionally provided an end-to-end service must learn how to work in what ABB has neatly termed the energy neighborhood. ABB states: “Adopting the energy neighborhood perspective can help bridge historic silos in the energy market, which have been hindering the evolution of more flexible, efficient, sustainable, and environmentally friendly energy systems. By working together more, or at least consulting each other more regularly and proactively, utilities, DER operators and customers can make mutually beneficial decisions about assets, business operations and resources.” Secondly, the ability to communicate and share data and information across this neighborhood becomes essential, and proactively adopting digital is going to be a key requirement in Utilities. The DER market already requires sensors and meters to regulate quality and output, and the type of ecosystems being built for Utilities to integrate DERs into the grid requires complete transparency and visibility. The Utilities, DER companies and customers working together have to be able to make complete sense of the structured and unstructured data involved in service delivery. Coping with this level of digital disruption was recently covered in an interesting blog from OpenText CMO Adam Howatson, which you can read here. In practice, terms of service, SLAs and production and maintenance schedules will need to be combined with generation data and rating engines to ensure that every party is sure that they and others are fully meeting their obligations. This is especially true with the trend towards Time of Use (ToU) and other demand-side rate design as a means to more effectively compensate DER providers. The challenge will be to implement new types of software – such as EIM – that can act as a central, integrated platform for communications, content sharing and data analytics, both within the Utility company and beyond, to connect and engage with customers, DER providers and, of course, the regulators. Successful integration of DERs with the existing grid is going to be critically important, as DERs are forecast to have a big impact on the “Duck Curve” – the net load forecast curve for the 24 hours of the day. The California system operator, CAISO, has performed detailed analysis of net load forecasts through the year 2020 and has shown the need for steep ramping of resources and the possibility of over-generation risks. CAISO is also working with the industry and policymakers on rules and new market mechanisms that support and encourage the development of flexible resources to ensure a reliable future grid. The American Council for an Energy-Efficient Economy (ACEEE) has recently reported that Utilities can drive a 10% reduction in peak demand by using demand response capabilities and reduce the impact of the steepening Duck Curve. New EIM software as an integrated platform for communications will be crucial for the Utilities. It is essential for the successful sharing of content and structured and unstructured data with all stakeholders, including DER providers, customers and system operators, and for introducing new demand response technology initiatives. Read more on page 2 to find out about regulation, and regulators taking center stage.
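To make the Duck Curve idea above a little more concrete, here is a minimal sketch of how net load and the evening ramp can be computed from hourly demand and distributed solar output. All of the hourly figures are invented for illustration only; they are not CAISO forecasts or real utility data.

```python
# Illustrative sketch of the "Duck Curve": net load = total demand minus
# distributed (solar) generation, hour by hour. Numbers are made up, not CAISO data.

hourly_demand = [22, 21, 20, 20, 21, 23, 26, 28, 29, 29, 29, 29,
                 29, 29, 29, 30, 31, 33, 34, 33, 31, 28, 25, 23]  # arbitrary units
hourly_solar  = [0, 0, 0, 0, 0, 1, 3, 6, 9, 11, 12, 13,
                 13, 12, 11, 9, 6, 3, 1, 0, 0, 0, 0, 0]

net_load = [d - s for d, s in zip(hourly_demand, hourly_solar)]

# The "neck" of the duck: the steep evening climb in net load as solar output
# falls away while overall demand heads towards its peak.
steepest_3h_ramp = max(net_load[h + 3] - net_load[h] for h in range(len(net_load) - 3))

print("Net load by hour:", net_load)
print("Steepest 3-hour ramp that other resources must cover:", steepest_3h_ramp)
```

The steeper that evening ramp becomes, the more flexible generation or demand response is needed to cover it, which is the point ACEEE and CAISO are making above.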

Read More

The Future of e-Delivery and its Impact on Customer Experience

e-delivery

It seems everyone is talking about “digital transformation”, but many are still unsure of what it is, why it is important, or even how to get started. Wikipedia defines digital transformation as the change associated with the application of digital technology in all aspects of human society. It has also been described as taking existing manual and paper-based processes and converting them to digital channels and documents – or “going paperless.” Many companies like yours are talking about improving the customer experience and going digital, but don’t know where to begin. Regardless of what it means to you, a lot of companies are now realizing that digital transformation often includes e-delivery – ensuring that emails containing bills, statements, ID cards and other business-critical information get to the intended recipients. They rely on their customer communications management (CCM) solution for this, but are often not aware of what is actually involved in reaching the recipient’s inbox. It can also be challenging to quantify the benefits of e-delivery and the costs associated with poor deliverability. To help you better understand the importance of email deliverability, we have a new recorded webcast available; you can register and then view it here. In this webcast, InfoTrends customer communications experts Matt Swain and David Stabel, and OpenText™ Exstream Manager of Product Strategy Avi Greenfield, present new research on consumer preferences and enterprise plans for e-delivery. They also share key trends, challenges and best practices for managing e-delivery, including the impact that non-delivery can have on your bottom line. View the webcast.

Read More

Forget on-premises: InfoArchive, Docker and Amazon AWS

InfoArchive

There are two buzzwords that we have heard in the IT world for some time now: Cloud and Containerization. For me, 2016 proved that these two topics have changed from hype to reality, even in the biggest enterprises, and a lot of customers were asking for our solutions, like OpenText™ InfoArchive, in public clouds and/or running as Docker containers. While our engineering and PS teams are doing a great job in providing these solutions, I decided to walk this route myself. Follow me on the journey if you’re interested. I started my tests by creating a Docker Hub account. The account’s private repository will be used to store the InfoArchive Docker images and automatically deploy from there. It is very easy to create a Docker container from InfoArchive – talk to me if you want to know more. It takes just a couple of steps and you’ll have your InfoArchive Docker container image ready. What’s next? Now let’s run this image in Amazon EC2 Container Service (ECS). Welcome to the “cloud world” If you’re new to the Amazon world you might have difficulty understanding some of the terminology around Amazon ECS. I hope this post will help you with that. ECS cluster In the first step we need an ECS cluster. An ECS cluster consists of EC2 instances and services. EC2 instances are our “good old” virtual machines and represent our available compute resources. The work that you assign to the cluster is described as “services”. The picture below shows that our InfoArchive cluster started with 3 micro servers (each of them automatically initiated by ECS from the amzn-ami… VM image shown below): Within a minute your cluster compute resources are running and waiting for you to assign them some work. Ignore the memory values in the screenshot below – I took the screenshot with 3 running tasks already occupying the memory. InfoArchive is a “classic” three-tiered architecture product: the native XML database xDB at the backend, the InfoArchive server as middleware, and the InfoArchive web UI. To prepare for the scalability requirements of our deployment we’ll run each of the tiers as a dedicated container. We’ll “front-end” each of the tiers with an EC2 load balancer. This approach will also simplify the configuration of the container instances, since each container instance will only have to connect to the underlying load balancer (with a known static hostname/IP) instead of trying to connect to the constantly changing IP addresses of the container instances. On a very high level the architecture can be depicted as shown below: EC2 load balancers are set up quickly – my list (shown below) contains 4 instances since I’ve also configured a dedicated public load balancer for xDB connectivity. With this step completed, the ECS cluster, its compute resources and the cluster load balancers are prepared. Let’s put InfoArchive on the cluster now.
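As a rough companion to the console steps above, the sketch below shows how the same cluster-and-service setup could be scripted with the AWS SDK for Python (boto3). The cluster name, image name, memory size and port are placeholders I have invented for illustration; they are not the actual InfoArchive artifacts or settings, and AWS credentials must already be configured for the client.

```python
# Hedged sketch of an ECS cluster + service, using the AWS SDK for Python (boto3).
# Names, image, memory and port below are placeholders, not InfoArchive specifics.
import boto3

ecs = boto3.client("ecs", region_name="eu-west-1")

# 1. Create the cluster that will hold the EC2 container instances.
ecs.create_cluster(clusterName="infoarchive-demo")

# 2. Register a task definition for one tier (here: a hypothetical xDB backend container).
ecs.register_task_definition(
    family="infoarchive-xdb",
    containerDefinitions=[{
        "name": "xdb",
        "image": "my-dockerhub-account/infoarchive-xdb:latest",  # private Docker Hub repo
        "memory": 900,  # small enough to fit on a t2.micro container instance
        "portMappings": [{"containerPort": 2910, "hostPort": 2910}],  # port is illustrative
        "essential": True,
    }],
)

# 3. Run the tier as a long-lived service on the cluster (one container to start,
#    scaled out later; the EC2 load balancer wiring is configured separately).
ecs.create_service(
    cluster="infoarchive-demo",
    serviceName="infoarchive-xdb-service",
    taskDefinition="infoarchive-xdb",
    desiredCount=1,
)
```

In the walkthrough above the load balancers are created separately in EC2; create_service also accepts a loadBalancers parameter if you prefer to attach a service to its load balancer in the same call.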

Read More

WFO Video Series: Driving Contact Center Awareness

WFO video series

I have had the pleasure of speaking with Donna Fluss, President of DMG Consulting, on numerous occasions, and she often ends her sessions with a very compelling illustration – a boardroom table with one empty seat. She then asks the audience, “How do you earn your seat at the table?” Then, with the right amount of poise and firmness, she challenges the audience to align their day-to-day lives with the quarter-by-quarter business objectives of their organization. “It is up to you,” she says, “to establish the importance of the contact center in helping the enterprise achieve its strategic goals.” In much the same way, I often see the normal cast of characters – CEO, CFO, CMO, COO, CIO, CTO – involved in strategy, but painfully unaware of the role that their contact center plays in driving corporate customer experience goals. So, to help you drive contact center awareness, OpenText WFO Software is launching a new video series with your journey in mind. Our 2017 video series is now online and features a great line-up of industry veterans and analysts. We asked each speaker their view on questions such as: What defines a positive customer experience with your company? How do you align your contact center with the top priorities of your executive leadership? How do you align your goals with those of other business units? And the list goes on! Visit the Video Series, where you can easily navigate from question to question and from speaker to speaker, then listen to video commentaries from each panelist. We encourage you to share the insights. Each video clip is very short, and as they are published over the coming weeks you can share a specific video with your colleagues or via social media by clicking on the blue “share” box under each video. Use the hashtag #CCTRImpact. Finally, I want to extend my sincere thanks to Donna, Jason, Keith, Kate and Roger for their time in helping us to bring this series to you. Their individual expertise is highly respected in our industry, and we all hope the advice they have offered in each video will help you get closer to having your place at the table.

Read More

What a Difference a Year Makes, Here and in IoT

Internet of Things (IoT)

This is my first anniversary at OpenText – and what a year it’s been. I’ve travelled the world and met some amazing people doing some amazing things. Special mention has to go to the 2016 Enterprise World, OpenText’s flagship event, and the IDC Manufacturing Summit in Lisbon, where we discussed the role of Digital Transformation in the sector. But let’s talk about the Internet of Things (IoT). I wrote a blog in early 2016 predicting that it would be the year that IoT went mainstream in manufacturing, and I thought it might be good – unlike so many other analyst predictions – to go back and take a look at just how right I was! A year back, my argument was that IoT was beginning the move from theory to practice. Organizations were building IoT ecosystems that would fundamentally change the way they operated. My particular interest is the Industrial Internet of Things – or Industry 4.0 – which is about enabling manufacturers to work smarter and attain business goals such as:
– Doing more with less by increasing the use of smart data to power business efficiencies
– Opening up new market opportunities that were previously inaccessible before disruptive technology was available
– Growing their business by increasing the value of their products through full-life support, enhancing products with added lifelong services
– Increasing product quality through real-time and virtual monitoring and predictive maintenance, and thus retaining customer loyalty for life
Glance at recent research and my predictions are looking pretty good. According to McKinsey, the economic impact of IoT applications could be as much as $11 trillion by 2025 – up to $3.7 trillion of which will happen within factory environments. By 2019, says IDC, 75% of manufacturing value chains will undergo an operating model transformation, with digitally connected processes that improve responsiveness and productivity by 15%. More impressively, Tata Consulting has found that manufacturers utilizing IoT solutions in 2014 saw an average 28.5% increase in revenues between 2013 and 2014. Indeed, OpenText’s own 2017 research has shown that 38% of European manufacturers surveyed have already implemented IoT solutions, with another 48% planning to within the next twelve months. Look out for more on this in a future blog. One company I highlighted as a great example of how IoT is already beginning to change everything was Tesla. I had the luck to test drive the Tesla S on its introduction to the UK, and the motoring and customer experience was like no other. It demonstrated functionality and capability that are real differentiators for the industry. Add to that a unique go-to-market, service and ownership model, and this car is an automotive game changer in so many ways. Now, Tesla says it’s pretty close to having a driverless car that can travel from New York to Los Angeles without any human intervention. This is an incredible example of how quickly things have progressed in such a short period of time – and it’s only one of many. We are now at the stage where it is easy to point to factories that are already moving away from traditional centralized production processes to an integrated, highly automated network of devices and machines. Companies are already beginning to create flexible production processes to move from mass production to individual runs that can be achieved cost-effectively and just in time to meet unique customer demands. So, we’ve made a great start, but I’m not sure we can call IoT mainstream just yet. As the World Economic Forum points out, there are still some important challenges to be overcome:
– How to assure the interoperability of systems
– How to guarantee real-time control and predictability when thousands of devices communicate at the same time
– How to prevent disruptors, or competitors, from taking control of highly networked production systems
– How to determine the benefit or return on investment in IoT technologies
This echoes my thoughts exactly. You can watch a webinar here that I held in partnership with The Manufacturer magazine in the UK. At the time, I made the point that organizations had to take much greater control of their data. By adding the technology that collects that data and channeling it through an Enterprise Information Management (EIM) system like OpenText, they have been presented with suites of information on which to base much smarter and faster business decisions. To this I’d add the need for a powerful and easy-to-use analytics engine that can deliver both predictive and operational insight into the vast amounts of data created within any IoT ecosystem. Placing IoT at the heart of business strategy is also essential – and companies that have done this are starting to reap the rewards. One of my first engagements when I joined OpenText last year was to take in the inaugural IoT Tech Expo conference in London. Patrick Bass, CEO of ThyssenKrupp NA, gave an excellent presentation on how successful transformation projects need to be part of your business strategy. One year on, Andreas Schirenbeck, CEO of ThyssenKrupp Elevators, spoke about how IoT is now transforming their industry. I’ll come back to this in another blog soon. So, I’m going to take credit for being half right! The trend towards IoT implementation is coming on in leaps and bounds but, while organizations focus on building the interoperability of networks and devices, they must also make sure they have a platform to ensure they maximize the value in their data and information. And, if you’d like to wish me a happy anniversary, you can send me a tweet!

Read More

How We’re Using Discovery Analytics to Solve GDPR Challenges

discovery analytics

We’re still over a year away from the General Data Protection Regulation’s (GDPR) “go live” date, but the sense of dread at recent conferences is tangible. And understandably so: the GDPR imposes sweeping requirements on organizations to understand and protect the personal data they process and use. While records management and infosec have so far dominated the GDPR discussion, your lawyers and compliance teams are also gearing up with discovery analytics, including machine learning, to help them manage GDPR risk. The New Cost of Personal Data The GDPR introduces a slew of information governance regulations that attach to Personally Identifiable Information (PII), which is defined as any information relating to an individual. If that sounds broad, it’s because it is. Your name, your pictures, your email, your IP address – really anything that could be used to identify you is included. The GDPR creates personal rights in this data, like the right to be forgotten, the right to audit your data, and the right to correct it or transfer it. It also includes enhanced data breach notification and response obligations. Basically, if your organization touches consumer data in some fashion, you’re likely covered by the GDPR. And if your organization’s products or services regularly involve personal data, security takes on even more prominence. Failure to comply with the GDPR could incur fines of up to 20 million Euro or an enormous 4% of global turnover. The dramatic penalties have spurred organizations to conduct Privacy Impact Assessments (PIAs) and proactively audit their own data to measure risk and exposure. Understanding how and where you handle personal data is the first challenge, and a significant one, since PII can be embedded in nearly all your business documents and some are more important than others. Finding a Needle in a Stack of Needles If a basic component of GDPR is understanding your data, then naturally you need tools to search, identify, categorize, and flag documents. The traditional approach of manually reviewing contracts one by one for language about PII treatment, processing, or warehousing is unreliable and inefficient. During a breach response or a PII assessment, triage is key; you need to rapidly identify the most sensitive documents and tag them for special handling (more on that later). To do so, you need discovery analytics and machine learning. Pattern identification is a crucial technology for rapidly identifying simple documents containing standardized PII like credit cards, licenses, medical records, and more. But this technology on its own won’t identify all the documents necessary for a PIA, because not all PII is pattern-based and it is often highly contextual. That’s where concept analysis, an unsupervised machine learning algorithm, comes in. This technology analyzes the co-occurrence of words and clusters documents together according to contextual themes – even if they lack specific keywords – and without any human feedback. These tools can, with astounding accuracy, distinguish between the different contexts that influence how we interpret words. For instance, if the word “private” appears in a number of documents related to military ranks, the engine could group those documents separately from ones that feature the word “private” in relation to personal data. These automated tools can get you started on a privacy evaluation, but the ultimate analysis is too nuanced to rely exclusively on machine categorization. Human review is an indispensable element, so having document review workflows and administration tools is necessary. This means the ability to batch out documents in related groups to keep legal reviewers engaged with relevant content. And with a continuous machine learning algorithm running in the background, each decision your legal team makes while eyeballing documents will train a recommendation engine. This algorithm can then evaluate the remaining documents and predict which ones are likely to contain sensitive data (much more on that interesting topic here). In this way, you can start with a known dataset (like your vendor contracts database) and then leverage analytics to identify unknown, risk-prone documents. As you review more documents and find more PII-laden content, the algorithm is constantly learning in the background. It conducts broad sweeps of your remaining data to prioritize batches of content that are likely to contain PII. What’s more, these algorithms can run on an issue-specific basis – a crucial ability, since the GDPR distinguishes between “personal data” and “sensitive personal data.” Knowing is Half the Battle The broader impact of the GDPR will shake out over years; it’s still unclear how individuals will exercise their rights or how DPAs will enforce the rules. But organizations can take steps today towards understanding their risk exposure and doing what they can to mitigate potential consequences. OpenText™ Discovery combines tools like machine learning, pattern identification, and entity extraction with data visualizations, keywords, and metadata filters to help legal and compliance teams identify any PII-carrying data. All of this is guided by a document review workflow that has been honed over years of litigation projects, and layered security.
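For readers who want a feel for the two techniques described above – pattern identification and unsupervised concept analysis – here is a tiny, generic sketch using the Python standard library and scikit-learn. It is purely illustrative and is not the OpenText Discovery engine; the sample documents and the regex are made up.

```python
# Generic illustration of pattern identification and concept clustering;
# not the OpenText Discovery engine, just a sketch with re and scikit-learn.
import re
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

docs = [
    "Private Smith reported to the barracks at 0600.",          # "private" = military rank
    "Please treat the attached private data as confidential.",  # "private" = personal data
    "Card number 4111 1111 1111 1111 was charged twice.",       # pattern-based PII
]

# 1. Pattern identification: a simple regex catches standardized PII such as card numbers.
card_pattern = re.compile(r"\b(?:\d[ -]?){13,16}\b")
flagged = [d for d in docs if card_pattern.search(d)]

# 2. Concept analysis: unsupervised clustering groups documents by co-occurring terms,
#    separating different contexts of the same word without keywords or human feedback.
vectors = TfidfVectorizer(stop_words="english").fit_transform(docs)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

print("Pattern-flagged documents:", flagged)
print("Concept cluster per document:", labels.tolist())
```

In practice the clustering runs over thousands of documents and far richer features, and, as described above, reviewer decisions then feed a supervised model that prioritizes the remaining data for PII review.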

Read More

Information-Centric Design Broadens Variety of Use Cases for Low Code Application Platforms

low code

In this post we welcome guest blogger Maureen Fleming, Vice President at IDC and lead analyst for IDC’s IoT analytics and information management practice, as well as IDC’s research covering process automation, API management and continuous analytics. The use of low code application platforms to build and deploy custom applications is one of the fastest-growing large technology markets. In fact, spending on low code is growing so fast that, by 2018, we expect enterprises to spend more on low code platforms than they spend on traditional application platforms running developer-written custom code. This is true whether enterprises are running custom applications in their datacenters or on a public cloud. The goal of low code platforms is to speed up development and minimize re-work by making it easy for business teams to work with developers to design and build applications. For smaller, tactical applications, developers may not be involved at all. Low code development evolved from either workflow-oriented tools or from data-centric offerings. Teams had to choose which approach made the most sense for their application. As customers began demanding more capabilities to support a broader and more flexible spectrum of applications, some vendors began to offer both workflow and information-centric capabilities within the same platform. They saw value in not forcing the customer to choose, and also saw value in the greater flexibility of separating information-based functionality from workflow. Low Code’s Shift to Information-Centric Design Products embracing information-centric design shift teams from building automation by linking functionality to specific nodes of a workflow to using the information structure as the driver of automation and the basis for functionality development. The foundation of this structure is data entities, which abstract data into subjects and their properties. These properties are then used in the development of rules, in interaction and UI design, in forms, in navigation, or as parameters that can determine which workflow is called, the flow of a process, or the page flow of an application. By contrast, classic workflow automation uses business objects as a building block. Similar to data entities, forms are created from the business object’s properties and rules can use the same properties. Unlike data entities, the properties of business objects are always associated with the workflow. Information-centric design does not require an associated workflow; in fact, workflow becomes a function subordinate to the information structure. As a result of the shift to information-centric design, there has been a significant expansion in the capabilities and use of low code development, with corresponding improvements in ease of use for non-developers. Today, the same platform can automate a process and provide case management as well as deploy browser and mobile apps disassociated from any type of automated workflow. The use of workflow automation tools continues to be important, and with the shift to cloud architecture and the use of APIs, there are ways to access workflow as needed, either broadly or discretely, in support of a specific purpose. In fact, workflow has become more important in our increasingly distributed way of doing business. But for organizations investing in low code to forge autonomy for business teams requiring automation, or to use a platform to build strategic applications, identifying software that is centered around information design while also supporting workflow provides an optimal choice for use across a broad spectrum of automation use cases. More About Maureen Fleming With more than 20 years of industry and analyst experience, Maureen Fleming is Program Vice President for IDC’s Business Process Management and Middleware research area. In this role, Ms. Fleming examines the products and processes used for building, integrating, and deploying applications within an extended enterprise system.
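To ground the data-entity idea described above, here is a minimal, product-agnostic sketch; the ExpenseClaim entity, the generated form and the approval rule are invented examples, not features of any particular low code platform.

```python
# Minimal sketch of the distinction drawn above: the data entity, not a workflow node,
# is the primary building block, and rules, forms and any workflow hang off its properties.
from dataclasses import dataclass, fields

@dataclass
class ExpenseClaim:            # a "data entity": a subject plus its properties
    employee: str
    amount: float
    currency: str = "USD"
    approved: bool = False

def render_form(entity_cls):
    """Forms are derived from the entity's properties rather than hand-built per workflow step."""
    return [f.name for f in fields(entity_cls)]

def needs_manager_approval(claim: ExpenseClaim) -> bool:
    """A rule written against entity properties; it may or may not trigger a workflow."""
    return claim.amount > 500

claim = ExpenseClaim(employee="jdoe", amount=820.0)
print("Generated form fields:", render_form(ExpenseClaim))
if needs_manager_approval(claim):
    print("Routing to the (subordinate) approval workflow")
```

The point of the sketch is the direction of dependency: the form and the rule are derived from the entity’s properties, and workflow is invoked only when a rule calls for it.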

Read More

Excellence in Sales Order Entry – From Document to Digital

digital sales orders

Sales orders: the documents with the sweet smell of company success attached to them! Physical (or electronic) proof that your company sells products that your customers like. Proof that you make money and create and retain jobs. So what could there be not to like about sales orders? Well, the question here is: are your sales orders solely creating value and financial success for your company? Or are they also costing you money? Are they slowing down your business? Maybe even creating conflicts with your customers? Fully digital sales order process – why? In a digital world, you should consider automating your sales order entry process from beginning to end. The digital sales order process should start the minute a sales order enters your company – from document to digital. This should be independent of your input channel: whether your sales orders reach you via EDI, email, fax or paper document, make sure to digitize your sales orders when they first touch your company. Many of our customers have EDI in place for 60–80% of their sales orders. However, the remaining 20–40% slows down their business, preventing them from having full insight into and transparency of the status of ALL sales orders. The impact When our customers started to capture data from PDFs, emails and paper documents as well, they realized how valuable a fully automated, digital process is. With their ‘from document to digital’ model they turned the sales order process into a fast, customer-friendly and fully transparent process. They now have full insight into the status of any sales order. If a customer has a request referring to a sales order, they can answer it within seconds, regardless of its input channel or process status. Reporting and transparency have improved exponentially. Management is now able to track the performance of the sales order process across countries, from month to month or year over year. Even performance tracking is now a simple activity: it is fast and it is accurate – not only for the electronic input channel, but for all sales orders. The information extracted is also proof that, with the new integrated sales order automation, customers have been able to cut sales order cycle time in half by also automating the remaining 20–40% of sales orders. Customer relationships have also improved, because disputes over orders and invoices or wrong deliveries have reached an all-time low. Analyzing sales orders also makes it possible to offer purchasing recommendations to customers, based on the orders of other customers who regularly buy specific products in combination with others. These cross-sell opportunities are well received by customers as they create value and often help to meet their core business needs. Have you identified a need to further digitize your sales order entry process? Take a look at how OpenText™ Business Center for SAP Solutions helps to improve the sales order process and much more.

Read More

Ovum Names OpenText as a Leader in Web Experience Management

web experience management

The research and analysis firm Ovum has released a report naming two OpenText products as leaders in web experience management – also commonly known as web content management (WCM). The Ovum Decision Matrix: Selecting a Web Experience Management Solution, 2016-17 report cited OpenText™ TeamSite and OpenText™ Web Experience Management as market leaders because of their strengths in technology and execution. Here are the strengths Ovum attributed to OpenText WCM solutions:
– Top-rated for maturity
– Strong roadmaps and long-term strategy
– Ease of use and interoperability
– Large portfolio of capabilities
Ovum considers web experience management a key element of digital transformation for today’s enterprise organizations, and we at OpenText fully agree. Organizations need to attract, engage, and hold the attention of their customers through round-the-clock, connected digital experiences. While web experience management or WCM initially focused on websites, it now encompasses so much more: WCM, digital asset management (DAM), web analytics, social, mobile experiences, etc. Sophisticated enterprise solutions cover the entire customer journey and connect with other key platforms for marketing automation, e-commerce, and customer relationship management. Market leaders not only score high in key capabilities but are also widely accepted as best-of-breed. Read more details in the report and take a look at OpenText WCM offerings.

Read More

“One Step” to Make Agent Guidance Easier

Agent Guidance

Have you looked at the desktop of the average customer service employee lately? Even with unified communications and the consolidation of systems such as CRM and ERP, most desktops look more like a NASA command center than a helpful application to deliver a great customer experience. I have good news…and bad news. Let’s start with the bad news – your IT department has a long uphill journey to merge systems, unify the tools in use and reduce the chaos of customer data. The good news is that there are ways to provide agent guidance and overcome desktop application challenges that don’t include a forklift upgrade to a single desktop application. But don’t look for an easy trail to follow with the typical solutions on the market. Tools for guidance and automation are quite common from vendors the likes of OpenSpan (now Pega) and Cicero, but we find that today’s contact centers struggle to prioritize these efforts for several valid reasons – deployment and product complexity among them. In fact, in the report “These Overlooked Assistance Tools for Your Customer Service Agents Can Boost Productivity,” Brian Manusama and Jim Davies of Gartner evaluated the complexity, deployment, vendor and ROI level for such tools. Here is one of the tables in the report: Table 2. Technology Category Overview. In layman’s terms, I believe these tools are hard to configure, hard to use and hard to deploy. But why? First, most of these tools are designed to be professional-service-revenue generators and not happy-customers-that-use-it generators. Second, your IT department doesn’t want to deal with yet ANOTHER thing on the desktop to configure or install. This is exactly why the OpenText™ Qfiniti team has made Qfiniti Optimize, our agent desktop automation and analytics solution, native to the OpenText Qfiniti platform. If you’re using another call recording and quality management solution, then let us show you our integrated WFO suite. If you’re already using Qfiniti today, then most likely you have everything you need to push guidance and automation already installed and ready to test. To show you exactly what this means, we’re inviting you to see how easy it really is. We call it the Qfiniti Optimize One-Step. One Step. Give us one broken application workflow and let us show you how to message, guide, automate and monitor agents to improve AHT, compliance and accuracy. One Team. Give us one team of agents and let us enable Qfiniti Optimize in a matter of minutes, to try the “One Step” approach to improving their efficiency. One Month. Allow that team to use the automation and guidance during a one-month trial. Nothing to install, configured by you, and monitored by us. We think that you’ll like what you see, and the agents in your “One Team” beta group will like it too. I’ve thrown dozens of pizza parties in my time for call center agents, but perhaps your beta team will throw you and IT a pizza party for a change.

Read More

OpenText ApplicationXtender 8.1 SP1 is Here!

ApplicationXtender

ApplicationXtender is now part of OpenText. Like you, we’ve always known that ApplicationXtender was a jewel within the Dell EMC Enterprise Content Division, but now that we’re part of OpenText, it’s great hearing this validation from our new colleagues as well. If you’d like to know more about the ECD integration, we recommend you read Stephen Ludlow’s recent blog, which will also point you to the recording of an interesting AIIM webinar. OpenText invests in products and technologies that have the potential to address new markets and opportunities. ApplicationXtender fits into this category, offering a quick-to-implement and easy-to-use solution for companies and departments that don’t have the budget or IT support for a full-scale ECM implementation. As early proof of this commitment, we just launched the first service pack for ApplicationXtender 8.1, which includes:
– Certification/security updates for Microsoft Windows Server 2016, SQL Server 2016, and Microsoft Office 2016
– Image Capture supportability update and PDF rendering performance enhancements
– Microsoft Office Integration supportability update
– Cumulative patches
If you are new to ApplicationXtender or haven’t moved to ApplicationXtender 8.x yet, you may want to know that ApplicationXtender is a scalable, cost-efficient document management solution, optimized for line-of-business content. With ApplicationXtender 8, we’ve started our mobile and cloud-first journey, with a long list of enhancements over previous versions. Take a look:
– Mobile-enabled, with an intuitive user interface and no plugins required
– Cloud-ready, for public, private or hybrid cloud deployment
– Easy to learn: users can stay in their familiar business applications and easily view content without invoking the application that created it
– Rapid to deploy, requiring minimal IT involvement
– Based on open standards such as RESTful services and HTML5
– Available in English, German, Simplified Chinese, Brazilian Portuguese, Spanish, French, and Italian
We’re excited not only about this release of ApplicationXtender, but about its future as well! Learn more about ApplicationXtender here. Existing customers current on maintenance can access the latest release of the software by visiting the Dell EMC Support site.

Read More

Webinar Outlines a Bright Future as OpenText and Documentum Come Together

Documentum webinar

On February 2, I had the good fortune to participate in a very interesting webinar hosted by AIIM. Inspired by the new union of OpenText and Documentum, the event brought together a variety of experts to discuss what the pairing means to customers of both, the partner network, and the industry in general. And the interest was certainly there – registration numbers were some of the highest AIIM has ever seen. Well, we covered what we set out to, and more! The roundtable discussion and subsequent Q&A session were wide-ranging and dynamic, addressing the concerns of the customer base and future product integration plans, but also delving into a wide-open sharing of views around the present and future state of ECM and the skills that organizations are going to need in order to be successful. If you didn’t get the chance to attend the webinar, then setting aside some time to listen to the webinar-on-demand would be 60 minutes of your time well spent. Regardless of your current solution provider or your role in ECM, there are some thought-provoking perspectives on topics like “the difference between ECM and Content Services” and “the kinds of technology and business competencies an organization needs to have – or develop – in order to embrace this shift toward content services.” I hope you enjoy it as much as I did. Finally, please join us on February 14 as AIIM Chief Evangelist John Mancini and I connect once again to present Next-Gen Information Management – Succeeding in a New Era. We’ll be examining the emerging age of Content Services and what that means to the traditional concept and practice of ECM. Sit in to gain valuable insight into the changing definition of ECM and learn the next steps that will allow you to prepare for the future while maximizing your current investment. Here’s hoping everything OpenText and Documentum do together in the future is as great as our first webinar!

Read More

Why DAM Isn’t Just Pretty Pictures on the Web Any More

DAM webinar

Why do you purchase and implement a Digital Asset Management (DAM) platform in the first place? If you’re like me when I ran the content management team at another company, it was originally to control the flow of approved images to the company’s online presence. We were revamping the website and eCommerce platform, and a key part of the project was to improve the images used and to make sure that they were both brand and safety compliant. It didn’t take long for word to spread that we now had a single, safe source for brand-approved images. Soon we were talking to other groups in the company, and even our dealer network, about how they could contribute to, and access, the DAM. Instead of just storing the images selected for use on the website, we were soon storing every picture from a product photo shoot; then came interest from the company archives. In the space of eighteen months we had passed one million assets and over eight thousand users accessing them. But the most interesting part was the way that the DAM became the source for applications and use cases that we had never considered. We had developed a way to create lightweight 3D models of our products, and started storing the source files for those on the DAM too. Suddenly the DAM was the source driving Augmented Reality proof-of-concept innovations, being used to populate digital signage at dealer showrooms, as well as training, facilities planning, trade shows, coffee table art books, calendars, licensed merchandise, and more. At the point where I left the company we had recorded sixteen different use cases for the content stored in the DAM, and I’m sure there are even more now. The thing is, I was far from alone in witnessing how a good DAM platform can be used in different, powerful ways. Since joining OpenText I’ve seen other uses, such as:
– Media companies that use their DAM to deliver DVD packaging and advertising banners that automatically resize and place the correct logos and text based on the intended markets and distribution channels
– Drinks companies where the DAM is a central component of their high-profile sports sponsorship programs
– A rail company that uses the DAM to manage rail inspection videos from cameras mounted on the front of locomotives
– An aerospace engine company that uses its DAM to store and analyze images of parts from any engine involved in an accident
So how are you using your DAM platform? Join us on Wednesday, February 15th for a webinar on how to Unlock New Potential (and ROI) From Your DAM. Click here to register.

Read More

3 Tips to Gain Mindshare for Your Contact Center

Contact Center

One of my favorite discussion topics with OpenText WFO customers and other contact center professionals is the internal brand perception of the contact center within any organization. Contact center brand perception? Yes, exactly. Every enterprise contact center is perceived differently by other business units and C-level executives, depending on how the company approaches its customers and markets. Is the company strictly bottom-line driven, wringing every last dollar out of its budgets in order to maximize profits? Or is the company customer-centric, doing everything it can to improve customer service in order to compete effectively in the marketplace? I love talking about this at customer meetings and industry events because we all know the contact center holds the key to vast and rich customer information – exactly the kind of customer knowledge that every department and every executive should want to understand in some form or another. Better business decisions are made when more is known about customer preferences, behaviors and opinions. So why is it that the contact center is more often perceived as a cost center rather than a customer experience leader? Why are we constantly tasked with delivering better service, hitting higher sales targets and scoring higher customer satisfaction responses, but with ever-tightening personnel resources and budget dollars? I invite you to register now for a webinar on February 23, when Ken Landoline, Principal Analyst for Customer Engagement at Ovum, and I will explore this issue that I’m so passionate about. Make no mistake: this is an internal brand perception issue. But we will approach the discussion from a very practical point of view, offering you specific tips on how to secure greater investment and ensure organizational mindshare. In this webinar we will share proven methods for getting more in order to do more in your contact center:
– Learning what KPIs matter most: identify, provide and relay metrics that matter
– Setting up your dashboard: quickly identify information to make real-time decisions and predict behavior
– Becoming an indispensable resource: understand and coordinate contact center goals with others in your organization
With this actionable information in hand, you can then manage or influence up and be the agent of change who helps evolve the internal brand perception of your contact center from cost center to value center. I look forward to you joining us at this webinar. Doing More with Less? 3 Tips to Gain Budget and Mindshare for Your Contact Center Webinar Date: Thursday, February 23 Time: 2:00 PM ET / 1:00 PM CT / 12:00 PM MT / 11:00 AM PT Register Now

Read More

Empower Your Mobile Workforce

mobile workforce

Is your mobile workforce integrated into your IT based business processes? Can maintenance engineers look up data, order parts and enter job details directly at maintenance sites? Does your delivery staff need to get signatures for delivered items? Do you need to notify workers about their next task? Do you want your Sales team to submit orders right at the customer site? Do your mobile workers need to take photos of work sites and submit them to headquarters? Do you need to track the GPS position of mobile workers for further processing? There are endless scenarios where organizations can benefit from mobile IT solutions. Mobile infrastructure Mobile network infrastructure is at a point where high speed network access is available everywhere. And mobile devices like smartphones and tablets have internal databases and file storage capabilities that enable developers to create offline capable apps without problems. Technically there is no reason to exclude your mobile workforce in your IT processes today, and create a more efficient organization and a better, more responsive customer experience. Mobile workforce app development best practices All this can be easily done with low effort app development and usage of mobile devices like smartphones, phablets and tablets. The development effort for mobile workforce apps can be kept quite low when using multi-platform development tools like OpenText™ Gupta TD Mobile. Other than Apple and Google’s native development tools which require high effort platform dependent coding, TD Mobile offers development of mobile workforce apps for all mobile platforms from one source code, empowering organizations to develop mobile workforce apps that work on any mobile device at a small budget. Integrate mobile workforce apps into your IT infrastructure TD Mobile workforce apps can easily connect to any of your backend IT systems. Using the no-coding database access feature, developers can configure database access to all SQL and NoSQL databases without actually writing any code. If you are using a Web Service or REST service layer for your business logic, TD Mobile can easily access these services as well. Integrating business logic from SAP, SalesForce and OpenText AppWorks is easy as well. The data communication with the datacenter and the binding of backend-values to visual objects is automatic as well. TD Mobile is built to automate and reduce the development effort of mobile apps as much as possible. Mobile workforce apps require strong security and integration with office security systems. With TD Mobile all data communication between the data-center and the mobile device can be configured as strongly SSL encrypted. At the backend TD Mobile can interact with any security provider as LDAP and Microsoft Active Directory to force sign-on via corporate standards. Smartphones and tablets contain a host of device features like camera, GPS sensor, accelerometer, fingerprint reader and many software features and apps like the contacts database, email, notifications and much more. Developers crafting apps using TD Mobile can easily access the device and software features of smartphones and tablets. The built-in camera can be used to decode barcodes, the GPS feature can be used to detect the current position, notifications can be sent to an app user, to for example send the next work order. The possibilities of integrating device features are endless only limited by a developer’s imagination. 
Many organizations have grown international and require mobile workforce apps to adapt to the local language. Using OpenText Gupta TD Mobile, development teams can build multi-lingual apps from scratch: all required UI languages can be provided in a single app source, and at runtime the app chooses the language to use based on the current locale of the device (see the sketch at the end of this post).

Native app deployment and Web app deployment

TD Mobile apps can be deployed as native Android, iOS and Windows Phone apps. Developers can easily monetize their efforts, as native apps can be distributed through the Apple App Store, Google Play store and the Windows store, while Apple and Google enterprise deployment options can be used to roll out internal workforce apps. Additionally, or alternatively, TD Mobile apps can be deployed as Web apps that run on any device with a Web browser. This comes in very handy when the app needs to be used on desktop computers running Windows, Linux or macOS.

If you are still using paper-based processes for your mobile workforce, it is time to rethink: integrating your mobile workforce into your IT systems gives them access to everything they need from anywhere, speeds up internal processes, increases efficiency and delivers a better, more responsive customer experience. Download a free trial version of OpenText Gupta TD Mobile 2.1 now.
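As promised above, here is a minimal sketch of what locale-based language selection amounts to. It is illustrative only: the string tables and helper function are hypothetical, and TD Mobile resolves the UI language from the device locale itself rather than requiring hand-written lookup code like this.

```typescript
// Illustrative only: minimal locale-based string selection.
// String tables and helper are hypothetical, not TD Mobile's mechanism.

type Locale = "en" | "de" | "fr";

const uiStrings: Record<Locale, Record<string, string>> = {
  en: { nextJob: "Next job", submit: "Submit report" },
  de: { nextJob: "Nächster Auftrag", submit: "Bericht senden" },
  fr: { nextJob: "Prochaine intervention", submit: "Envoyer le rapport" },
};

// Pick the closest supported language from the device locale (e.g. "de-AT" -> "de").
function resolveLocale(deviceLocale: string): Locale {
  const lang = deviceLocale.split("-")[0];
  return (lang in uiStrings ? lang : "en") as Locale;
}

const locale = resolveLocale(navigator.language);
console.log(uiStrings[locale].nextJob); // label shown in the user's language
```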

Read More

100,000 Pieces of Content a Day

content

It often feels like we are being deluged by content: we are exposed to more stories, images, video, and audio than ever before. Yet most of that content (social media aside) has been sorted, indexed, written, edited, managed, and put through a publication process before it even gets to us, the consumer. How do those who produce the content deal with the vast amounts of raw information, text, and images that go to make up the stories we see? And how do they make their publishing efficient enough to keep up with the unedited real-time content streaming across the various social media platforms?

These were the sort of problems facing News UK, the publisher of some of the biggest and most popular British newspapers. The Times, Britain's oldest daily national, and The Sunday Times are among the world's best-known quality newspapers. The Sun is the most read British newspaper, with more than four million readers each day. News UK also operates a number of digital channels, including Sun Bingo, Sunday Times Wine Club, and Riviera Travel.

News UK receives and generates more than 100,000 new digital assets each day, and manages in excess of 25 million assets in total. The assets, including text, images, pages, video, graphics, and audio, needed to be captured, indexed, and quickly made available to users across the business. The existing digital asset management (DAM) system had served the business well, but was more suited to print media, with limited options for moving towards a converged, multichannel solution. It also lacked the ability to be easily integrated with the chosen editorial system. "We need to drive a greater responsiveness for global news coverage, rapidly publishing articles that provide a consistent, rich multimedia experience for readers across all channels and publication brands," says Simon Pumphrey, Systems Manager at News UK. "Against a backdrop of technical change, we have to ensure we remain at the forefront of how news is delivered, across all channels."

In looking for a replacement for the legacy system, News UK had clear requirements. "The new DAM solution had to be faster, easier to use, and more cost-effective than our existing system. It should also help us ensure compliance with usage rights of the assets we use, with comprehensive tracking, audit, and reporting. We wanted a browser-based solution, based on open standards, which would be straightforward to integrate with our editorial system. OpenText™ Content Hub for Publishers (CHP) meets all of these criteria and more," says Pumphrey.

CHP has been introduced as part of a large-scale transformation project to increase collaboration across editorial teams. The business-critical deployment of OpenText CHP allows News UK to collect the 100,000 or more new digital assets and news feeds submitted each day by multiple journalists, photographers, and agencies into a single system. The OpenText Content Analytics engine automatically tags these assets, ensuring content can be quickly found and retrieved across the various editorial desks. Not only can the assets be easily repurposed across The Times, The Sunday Times, and The Sun, but the solution also ensures the correct rights are associated with each asset, helping to mitigate the risk of digital rights infringement.

"In today's connected world, customers are choosing to engage with our newspapers across a growing number of devices and, increasingly, we need to manage the growing types of digital content to create a richer digital experience. We chose OpenText CHP as the scalability of the platform has enabled us to move from a print-centric process to one where journalists can associate multimedia content directly into different channels."

You can find more information about the News UK implementation of CHP here, and download the white paper on Content Hub strategy.
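For the technically curious, the sketch below models, in the broadest strokes, the workflow described in this story: an incoming asset is captured, automatically tagged, and stored with its usage-rights metadata so it can be found and safely reused across editorial desks. It is a conceptual illustration only, not the Content Hub for Publishers API, and all type and function names are hypothetical.

```typescript
// Conceptual sketch only: NOT the Content Hub for Publishers API.
// Models the described workflow: capture an asset, auto-tag it, keep its
// usage rights, and let editorial desks retrieve it by tag.

type AssetKind = "text" | "image" | "page" | "video" | "graphic" | "audio";

interface IncomingAsset {
  id: string;
  kind: AssetKind;
  source: string;        // e.g. staff photographer or wire agency
  rightsLicence: string; // usage rights attached at ingest
  payloadUrl: string;
}

interface IndexedAsset extends IncomingAsset {
  tags: string[];
  ingestedAt: string;
}

// Stand-in for an analytics engine that derives descriptive tags from an asset.
function autoTag(asset: IncomingAsset): string[] {
  return [asset.kind, asset.source]; // a real engine would analyse the content itself
}

const index: IndexedAsset[] = [];

function ingest(asset: IncomingAsset): void {
  index.push({ ...asset, tags: autoTag(asset), ingestedAt: new Date().toISOString() });
}

// Editorial desks retrieve assets by tag, with rights information preserved.
function findByTag(tag: string): IndexedAsset[] {
  return index.filter((a) => a.tags.includes(tag));
}
```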

Read More

Matthew Storm Joins the OpenText WFO Team

WFO

If you follow Matthew Storm on social media you may see him described as "marketer, speaker, traveler, foodie, a farmer's son, a customer service fanatic, lover of life and empty-nester." Sounds more like a transformer to me! Matthew has been working on mobile solutions for the past year as part of the OpenText team and, as of this month, has stepped up as the product marketing team leader for OpenText™ Experience Suite. This includes Workforce Optimization, Customer Communications Management, Digital Asset Management and Web Experience Management. Matthew has spent the last decade working overseas and leading a great team at NICE Systems, in addition to his years of contact center operational experience. He has a proven track record of delivering innovative and action-oriented results that unerringly focus on what matters most to both his internal and external customers. Fun and engaging, highly collaborative, deliberate when necessary but with a keen sense of speed to market – these are some of the attributes you'll appreciate most when working with Matthew. I spoke with him about his new role, and here are three reasons he's thrilled to be at OpenText:

Analyst Recognition

The Ian Jacobs-led team of Forrester researchers recently looked at what contact center teams are doing and what problems they're solving with WFO solutions. In their research, Forrester also took into account how easy it is to work with the vendors. At OpenText, we believe a valuable two-way relationship is based on how much a vendor listens to its customers. One of our references told Forrester, "We have had a seat at the table to influence the overall product road map." We are excited to bring new innovation to this space and honored to be categorized as a "Strong Performer". Download your copy of the Forrester Wave for Workforce Optimization here.

WFO Innovation

The OpenText™ Qfiniti team has been busy! In the past 18 months, the OpenText Qfiniti platform has released key innovations in agent guidance, desktop analytics, analytics-based QA, mobility, gamification, and managed services, all with a consistent user interface and unparalleled scale. Matthew was instrumental in the early stages of Qfiniti's birth in 2003 and said, "I'm proud to see that the fresh innovations to OpenText™ WFO Software are grounded in the longstanding best practices of usability and customer-driven advancement."

Connecting Customer Journeys

Finally, while many vendors talk about the multi-channel experience, OpenText has the depth of portfolio to actually "create" digital experiences on the web and social media that match the conversations happening in the contact center. Matthew recently shared with an analyst that, "every department in an organization thinks that their group sets the tone for the customer experience; but in reality each department is driving amazing silos of mixed delight. OpenText is more than just multichannel and journey-speak – our solutions touch every angle of customer experience management and seek to connect experiences to drive customer lifetime value." Learn more about OpenText Experience Suite today. Welcome to the team, Matt!

Read More