Fintech at SIBOS: From Everyday Banking to Science Fiction


In late September, we were at SIBOS 2016, the annual trade show hosted by SWIFT. It’s a major showcase for innovative financial technology (“Fintech”), featuring the latest news and analysis about banking and electronic payments. Over 8,500 people attended this year, with Eurasian, Middle Eastern, African, and Chinese banks showing a stronger presence than in the past. The show was in Geneva, and we were struck by the contrast between the timeless beauty of the Alpine setting (snow-covered Mont Blanc looming over the skyline, Lac Leman at the foot of our hotel, and our daily commute passing vineyards) and the dynamic, even futuristic, requirements of the 21st-century financial industry discussed at the conference. SIBOS has been morphing as SWIFT seeks out new directions. It offers a showcase for Fintech through its Innotribe program, continues to develop electronic communications standards for a wide range of uses, and this year focused more heavily on cyber-security to defend the SWIFT electronic transaction platform, after hackers exploited a SWIFT weakness to steal $81 million from the central bank of Bangladesh’s account at the Federal Reserve. We also heard a lot of buzz about new Fintech developments such as cybersecurity and artificial intelligence (e.g. creating algorithms that could manage simple investment portfolios as well as most human advisors).

Blockchain: From cyberpunk to competitive advantage

Of all these new technologies, blockchain was a key topic of discussion at SIBOS. The conference was full of questions about blockchain: How does it work? What does it mean for us? How will payments be affected? Who is working with it? And what are the real security issues? “Blockchain” is a type of distributed, online database that keeps a permanent, tamper-proof record of financial transactions. It was created to settle trades in Bitcoin, the virtual currency, and has since become popular for deals in “real” currencies because all parties can track the transaction securely, with no need for third-party verification. SWIFT is interested in blockchain technology even though – or maybe because – it could pose strong competition to SWIFT’s own secure payments service, which can take days or weeks to settle a complex transaction. Fintech competitors using blockchain are forecasting that they will be able to cut transaction time down to near-zero.

“Let’s get it done”

However, what interested us most about SIBOS 2016 is that despite all that buzz, there was still a core of business as usual – practical, “let’s get it done” business. The banks still face the same issues: “How do we do this payments business faster and cheaper, and take better advantage of the relationships it creates?” For example, client onboarding continues to be a challenge for many banks, and we provide a lot of value in this area. We had many discussions about how OpenText helps banks to improve their overall compliance, get to revenue sooner, and achieve higher customer satisfaction rates. We also had conversations about the wealth of data that banks hold and how they can make better use of it, for both themselves and their customers. While they may be experimenting with new technologies driven by the Fintech boom, the practical business in the next cycle will be in establishing value from the networks and relationships already in place.

OpenText’s role in the new Fintech world

Naturally, we feel OpenText has a role to play here.
To start with, OpenText™ Analytics solutions help financial companies extract more value from their information by liberating it from the many separate silos it’s often housed in and integrating these various streams of information into a complete, accurate picture of investment performance, customer satisfaction, user experience, or response to various marketing incentives. Our message attracted a lot of interest at SIBOS, where we had great meetings with clients and prospects. One of the highlights was our Oktoberfest party, co-sponsored with SAP; it was a great success, with more than 260 people attending. Next year, SIBOS will take place in Toronto, Canada’s financial capital – not far from our corporate headquarters in Waterloo. Who knows what strides the Fintech world will have taken by then? Gerry Gibney, Senior Financial Industry Strategist, and Senior Global Sales Director Nigel Hysom attended SIBOS and contributed this blog content.

Read More

How the Same Data Analytics Used in Politics Can Help Your Business


Every four years, about 230 million people turn out to vote in the U.S. presidential election. That’s a huge audience to influence, no matter how sophisticated your analytics. I mean, which data pool on which channel do you start with? Where do you look first? Just like businesses trying to reach customers, political campaigns can analyze sentiment, social media activity, and feedback from a variety of channels to spot trends, make predictions, and craft persuasive messages. A new OpenTextVoice article, 3 Things The Presidential Race Can Teach Business About Digital Strategy, explores how the same micro-targeting used in political data analytics can be applied in business. As November draws near, the data crunching will undoubtedly grow more feverish. To get a sense of how often the media is covering each candidate and in what context, check out the free online Election Tracker ’16 from OpenText. To read the full article, go here.

Read More

Power Up Your iHub Projects with Free Interactive Viewer Extensions

Interactive Viewer

We’ve updated and are republishing a series of helpful tips for getting the most out of the Interactive Viewer tool for OpenText™ iHub, with free extensions created by our software engineers. These extensions boost the appearance and functionality of Interactive Viewer, the go-to product for putting personalized iHub content in the hands of all users. Below are links to the full series of six blog posts. If you have any suggestions for further extensions or other resources, please let us know through the comments below.

1. Extend Interactive Viewer with Row Highlighting – A simple jQuery script for highlighting the row the user’s pointer is on.

2. Extend Interactive Viewer with a Pop-Up Dialog Box – Add a fully configurable pop-up dialog box to display details that don’t need to appear in every row.

3. Extend iHub Reports and Dashboards with Font Symbols – Dress up your iHub reports with nifty symbols.

4. Extend iHub Dashboards with Disqus Discussion Boards – Encourage conversation the easy way, by embedding a discussion board in the popular Disqus format.

5. Extend iHub Interactive Viewer with Fast Filters – Make column-based filtering easy by using JSAPI to build a Fast Filter – a selectable drop-down menu of distinct values that appears in the header of a column.

6. Extend Interactive Viewer with Table-Wide Search – Filter across multiple columns in an iHub table by creating a table-wide search box.

Read More

Your Medical Biography is a Digitized Trail of Big Data


A far cry from the walls of paper files in medical offices, today’s digitized records let patients see lab results, get refill alerts, and generally have more perspective on and control over their medical destiny. These new systems are even using natural language processing algorithms to grasp the nuances of language, just like humans do. A new OpenTextVoice article, Big Data Takes Turn As Storyteller, Pulls Together Health Narratives, shows how the digitization of health records is changing the way our medical histories are recorded. Now, we can save and analyze not only structured data like dates and procedure codes, but also unstructured information like doctor’s notes. All of these advancements are changing the medical paradigm to create a partnership between doctor and patient. They even support more sophisticated analysis, so that a treatment plan can be based on the full picture rather than just a snapshot. Get the full story here.

Read More

The Real U.S. Election Battlefield is Online


The playing field has narrowed, the debates have begun, and we are slowly reaching the end of the U.S. Presidential race. And what a race it is – not just on the campaign trail but online as well. Twitter, LinkedIn, Facebook, and more are the hub of political commentary. Flooded with voters debating the issues, the politicians, and the latest exploits of Donald Trump and Hillary Clinton, the people are speaking. A new OpenTextVoice article on Forbes explores how that data is changing the race. While President Obama made a big political play online, online campaigning has only grown more sophisticated in the past few years. With tools and advanced analysis, campaigns can use the posts to “see how the tweets, mentions and articles praise their candidate or cast them in a negative light.” One thing is for sure: whether you’re a Republican, Democrat, independent, undecided, or Donald J. Trump himself, the real race is online. Check out the full article on Forbes.

Read More

Hidden in the Information: Success with Advanced Analytics


In today’s world, information is all around us. And hidden in that information is the key to both understanding your customers and predicting their behavior. Pretty nice, huh? Well, here comes the tricky part. With so much information, how do you find the useful insights? A new OpenTextVoice article on Forbes explores how to uncover the information and apply it to your business. How can companies use that data to advance their business processes or turn an industry on its head? As the article points out, “Data is the foundation that allows transformative, digital change to happen.” Analytics is the key to unlocking the potential of data and turning it into something greater. Check out the full article on Forbes to find out how to use advanced analytics to take charge of an industry and launch the next Netflix, Airbnb or Uber.

Read More

With Cognitive Computing, Johnny Five is Alive… or at Least He Will Be!


Have you ever wanted your own robot advisor? Well, according to a new OpenTextVoice article on Forbes, your very own Lieutenant Commander Data, KITT, or C-3PO may be closer than you think. The article explains how unstructured data is filling the void in traditional computer systems to create cognitive computing systems that can mimic human thinking. Cognitive computing combines structured and unstructured data to enable organizations to make better decisions with intelligent systems that go beyond numbers and rows. These systems can “make predictions and recommendations that offer profound, actionable insights into a host of common business challenges.” Add Natural Language Processing to the mix, and you get a system that can think, “feel,” and interact like any other human. Check out the full article on Forbes to find out how cognitive computing is bringing the automated trusted advisor to life.

Read More

Understanding Customer Feelings: Unstructured Data Analytics Tells All


Complex database queries and focus groups can help a business track customer insight, but there’s another approach. In a new OpenTextVoice article, Meet The Algorithm That Knows How You Feel, you’ll learn how today’s businesses can analyze unstructured data using text analytics algorithms to actually grasp how customers are feeling. Unstructured data analytics lets businesses sift through documents, text messages, email, and social media posts to get to where the important insight lives. And now, using OpenText InfoFusion, businesses can manage, analyze, and understand information in ways that are transforming customer experience. As huge volumes of unstructured data come into the enterprise from multiple channels, the only way marketers can gain a competitive advantage is if they can see trends and preferences before the competition does. This insight not only fuels important business decisions but also changes the way a company relates to its customers. The article offers some history of the technology and also shares real-world applications. Get the full story here.

Read More

Turn Your Big Data into Big Value – Attend our Workshops

big data workshop

The ever-growing digitization of business processes and the rise of Big Data mean workplaces are drowning in information. Data analytics and reporting can help you find useful patterns and trends, or predict performance. The problem is that many analytics platforms require expert help in sorting through billions of lines of data. Even full-fledged data scientists spend 50 to 80 percent of their time preparing data, not getting insights from it. Moreover, there’s a built-in time lag if you need to ask your in-house data scientist to run the numbers when you, a line manager or subject expert, need answers right away. That’s why 95% of organizations want end users to be able to manage and prepare their own data, according to respected market researcher Howard Dresner. Luckily, there’s help. The OpenText™ Analytics Suite combines powerful, richly featured data analysis with self-service convenience. You can upload your own data by the terabyte, then access, blend, and explore it quickly, without coding. The analysis results can then be shared and socialized via the highly scalable, embedded BI and data visualization platform, which lets you design, deploy, and manage interactive reports and dashboards that can be embedded into any app or device. These reports can address a wide range of business questions, from “What are my customers’ spending habits using the credit card from XYZ Bank?” to “Which customers are most likely to respond to our new offer?” Sure, you may be thinking, this sounds great, but I want to see the solution in action before I decide. Here’s your chance. On Tuesday, Sept. 13, in San Diego, OpenText is offering a free, hands-on, full-day workshop on the OpenText Analytics Suite. This 6-hour session (including breakfast and lunch) provides a guided tour of the various modules within the suite and shows you how to build dynamic, interactive, visually compelling information applications. By the end of the workshop, you’ll understand how to:

Build interactive, visually rich applications from the ground up
Create data visualizations and reports using multiple data sources
Embed these visualizations and reports seamlessly into existing apps
Target profitable customers and markets using predictive analytics

Our visionary speakers offer a day that’s not only informative but entertaining and engaging. Check here for the full schedule of workshops and dates available.

Read More

Unlock the Value of Your Supply Chain Through Embedded Analytics

supply chain analytics

Over the past few months I have posted a couple of blogs relating to the use of analytics in the supply chain. The first one discussed the ‘why’ – the reasons for applying analytics to supply chain operations – in Understanding the Basics of Supply Chain Analytics. The second discussed the ‘how’ – the methods of obtaining meaningful insights from B2B transactions flowing between trading partners – in Achieve Deeper Supply Chain Intelligence with Trading Grid Analytics. The blogs were written in support of our recently announced OpenText™ Trading Grid Analytics, one of the many Business Network related offerings in Release 16, the most comprehensive set of products ever released by OpenText to enable companies to build out their digital platforms and enable a better way to work. Those who have followed my blogs over the years will know that I have worked with many analyst firms to produce white papers and studies, so it was only fitting that I should work with an outside analyst on a thought leadership white paper relating to analytics in the supply chain. I engaged with IDC to write a paper entitled Unlock the Value of Your Supply Chain Through Embedded Analytics. IDC has been producing some interesting content over the years in support of its ‘Third Platform’ model, which embraces IoT, cloud, mobile and big data and how companies can leverage these technologies for increased competitive advantage. The aim of our new analytics-related white paper was to discuss the business benefits of embedding analytics into the transaction flows across a business network. Compared to other business intelligence and end-user analytics solutions, OpenText is in a unique position: we own our Business Network and we are able to introspect the 16 billion EDI transactions flowing across it. IDC leveraged a relatively new management theory called VUCA, which stands for Volatility, Uncertainty, Complexity and Ambiguity, to discuss how analytics can bring better insights into business operations. VUCA was originally defined in the military field, and for our paper IDC realigned it to a more connected, information-centric and synchronized business network, namely Velocity, Unity, Coordination and Analysis. I am not going to highlight too much content from the paper, but here is one interesting quote: “It is the view of IDC that the best supply chains will be those that have the ability to quickly analyze large amounts of disparate data and disseminate business insights to decision makers in real time or close to real time. Businesses that consistently fail to do this will find themselves at an increasing competitive disadvantage and locked into a reactionary cycle of firefighting. Analytics really will be the backbone of the future of the supply chain.” I am not going to spoil the party by revealing any more from the paper! If you would like to learn more, please register for our joint webinar with IDC on 27th July 2016 at 11am EDT / 5pm CET; details are provided below.
This 40-minute webinar will allow you to:

Understand how embedded analytics can provide deeper supply chain intelligence
Learn how the VUCA management theory can be applied to a supply chain focused analytics environment and the expected business benefits that can be obtained
Find out why it is important to have trading partners connected to a single business network environment to maximize the benefits of applying analytics to supply chain operations
Learn how OpenText can provide a cloud-based analytics environment to support your supply chain operations

You can register for the webinar here.

Read More

Top 10 Trends To Watch in Financial Analytics (Infographic)

financial analytics

General Motors was a company facing challenges when they hired Daniel Akerson as CEO in 2010. His background was tied to financial analytics so it’s not a surprise that they then acquired AmeriCredit  in an all-cash transaction valued at approximately $3.5 billion. Daniel Akerson commented that he started “creating world-class systems in our IT and financial systems, to quickly capitalize on opportunities and protect product programs from exogenous risk“.  Fortune magazine announced recently that General Motors was working on the future of driving, which it believes will be “connected, seamless and autonomous.” Chief Financial Officers are not renowned for underestimating the power of innovation and statistics, better known today as “analytics.” Accenture has defined the CFO as the “Architect of Business Value,” yet if we go back to 2014,  we may recall many companies had deep silos of inconsistently defined, structured data that made it difficult to extract information from different parts of the business. We all know that, so what’s new for financial analytics? 1) Big data is getting bigger with the Internet Of Things → Tweet this. Having accurate data is still critical. ′In God we trust, all others must bring data′ said engineer and statistician W. Edwards Deming. We all agree that not all data is useful, but the wider the data source range, the more accurate the 360º view; so the challenge in a world with 6.4 billion connected devices is even bigger and finance organizations are challenged. 2) Digital is killing the finance organization → Tweet this. Big data coming from new structured and unstructured data sources are increasing risk but also offering new opportunities. Growing opportunities to store and analyze data from disparate sources is driving many leaders to alter their decision-making process. The number of mobile banking users in the US is now 111 million, in 2010 there were just 35 million as a comparison and today banks adopt e-commerce tactics, tracking customers through mobile to reduce churn and increase sales opportunities. 3) Data gravity will pull financial analytics to the cloud → Tweet this. Cloud data warehousing is becoming widely adopted, so the question for Information Technology now is “When?” When your data is already in the cloud, you better have analytics right there. Having your data within OpenText™ Cloud has the benefit of being with the leader in Enterprise Information Management and being able to load data via the front-end or back-end within a columnar database. 4) Compliance audit is still a challenge → Tweet this. Self-service data preparation is becoming key to better data audit. Analysts are saying that the next big market disruption is self service data preparation and that by 2019 it will represent 9.7% of the global revenue opportunity of the business intelligence market. OpenText™ Big Data Analytics, for example, provides financial analysts with easy-to-use tools for self-service data preparation like the following: Data preparation – tools like normalization, linear scaling, logistic scaling or softmax scaling Data enrichment – aggregates, decodes, expressions, numeric ranges, quantile ranges, parametric or ranking Data audit – count, kurtosis, maximum, mean, median, minimum, mode, skewness, sum, or standard deviation 5) Organizations should focus analytics on decisions, not reporting → Tweet this. 
Analysts introduced the concept of the algorithmic business to describe the next stage of digital business and forecast that by 2018, over half of large organizations globally will compete using advanced analytics and proprietary algorithms, causing the disruption of entire industries. 6) Analytics is not only about reporting any more → Tweet this. Analytics is maturing and priorities are naturally focused on the business. Algorithms will not only provide insights and support decision-making, but will be able to make decisions for you. 49% of companies are using predictive analytics and a further 37% are planning to use it within 3 years. Predictive analytics with big data has been the cornerstone of a successful financial services organization for some time now. What’s new is how easily any business user can access predictive insights with tools like OpenText™ Big Data Analytics. According to Francisco Margarite, CIO at Inversis Bank, “extracting and analyzing information used to take a considerable amount of time of our technical experts, and as a consequence, we were not providing the appropriate service. Now, we have provided more user autonomy and reduced the work overload in IT.” Here are some examples of how analytics can help: Venn Diagram: Identify which customers that purchased “Insurance A” and “Insurance C” did not purchase “Insurance B”, so you can get a profile and a shared list. Profile: Identify variables that describe customers by what they are and what they are not, so your company can personalize campaigns. Decision Tree: Identify which real estate is the most appropriate for each prospect according to the profile and features of the product – price, size, etc. Human Resources will probably find it useful analyzing social data to recruit people who already have an affinity with the company or its products. Forecast: Anticipate changes, trends and seasonal patterns. You can accurately predict sales opportunities or anticipate a volume of orders so you can also anticipate any associated risks. You can use sensor data to predict the failure of an ATM cash withdrawal transaction for example. Subscribers are also able to run easy-to-use pre-built Clustering, Association Rules, Logistic Regressions, Correlations, or Naive Bayes. 7) Financial data is the leading indicator when using analytics to predict serious events → Tweet this. Recent cases like CaixaBank or MasterCard prove that there is growing interest in using analytics to get ahead of social and political developments that affect corporate, and sometimes national interests. Financial data is a leading indicator. Now, we can also look at big data from social media to determine sentiment. 8) What will make emotional analytics really helpful is to have a stronger analytics model behind it → Tweet this. Risk or fraud can now be identified by sentiment analysis on social media, so you could determine future actions. But, if you look at sentiment or emotion alone, it essentially flags a lot of false positives. 9) The need to manage, monitor, and troubleshoot applications in real-time has never been more critical → Tweet this. Digital transformation grows as the reliance on new software grows, so the need for full-stack visibility and agility is the true business demand underpinning the growth of analytics. 10) To know your customer and to deliver relevant data is a key business differentiator → Tweet this. 
TransUnion, for instance, reported profit for 2015 after booking losses since 2012, when moving from relational databases to predictive and embedded analytics. Before the overhaul, IT spent half its time maintaining older equipment. About 60% of capital spending went to managing legacy systems, now it′s 40% to 45%, they said. First things first – no matter if you are forecasting risk of fraud, crime, or financial outcomes, once you have the insight from predictive analytics, you will need a way to communicate this to the right person. 77% of companies cite predictive capabilities as one of top reasons to deploy self-service business intelligence, but increasing loyalty by improving customer satisfaction is also a very relevant reason. Firms can now deliver a tailored, personalized experience where customers enjoy self-service access to their own account data and history, and are proactively provided recommendations and information about additional products and services to: Drive increased revenue via cross selling, by automatically suggesting new products and services to targeted customers Engage customers by providing self-service access to personal views of account data on any device Increase loyalty by enabling customers to interactively customize how they analyze their financial data Realize a more rapid time to market by leveraging its APIs to embed new features in existing customer applications OpenText™ Information Hub is providing great embedded analytics according to Dresner, and also offers insights from predictive analysis because of its unique integration with OpenText Big Data Analytics. Research “Operationalizing and Embedding Analytics for Action” from TDWI The report notes that operationalizing and embedding analytics requires more than static dashboards that are updated once a day, or less. It requires integrating actionable insights into your applications, devices and databases. Download the report here.
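
As a rough illustration of the self-service data preparation and data audit functions mentioned under trend 4 above, the sketch below applies them to a made-up numeric column in Python. It is not the product’s own implementation, just the standard formulas those names generally refer to (definitions vary slightly from tool to tool).

import numpy as np
import pandas as pd

# A hypothetical numeric column to prepare and audit
df = pd.DataFrame({"balance": [1200.0, 340.0, 9800.0, 150.0, 2750.0]})
x = df["balance"]

# Data preparation: common scaling transforms
df["linear_scaled"] = (x - x.min()) / (x.max() - x.min())   # linear (min-max) scaling to [0, 1]
df["normalized"] = (x - x.mean()) / x.std()                 # z-score normalization
df["softmax_scaled"] = 1 / (1 + np.exp(-df["normalized"]))  # logistic/softmax-style scaling into (0, 1)

# Data audit: the kind of summary statistics listed above
audit = pd.Series({
    "count": x.count(), "mean": x.mean(), "median": x.median(),
    "minimum": x.min(), "maximum": x.max(), "sum": x.sum(),
    "standard deviation": x.std(), "skewness": x.skew(), "kurtosis": x.kurt(),
})
print(df)
print(audit)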

Read More

How to Identify Return on Investment of Social Media Marketing (Infographic)

social media ROI

The B2B marketing leaders will be spending more money on technology than the CIO in 2017. Sure they may already spend a lot, but the interesting question here is: will they now finally be able to identify the revenue? It was not long ago when CMO’s were perfectly ok with having more and more new technology tools. The problem came when they were forced to maintain the tools and, also to measure the performance of those tools in order to prove that they were still needed. While each platform may include its own reporting tools, in an omnichannel world, having so many partial views of the truth makes little sense. Many business users and decision makers can’t see the forest for the trees when it comes to the current analytics environment. How will they manage then in an Analytics of Things of world with 6.4 billion connected devices? Are CMO’s prepared? According to a recent study on The State of Marketing Analytics by VentureBeat, “Analytics are key to showing value, yet the market is huge and fragmented.” Customers are no longer using a single channel to buy yet only 60% of marketers create integrated strategies. Probably the most painful source to measure is social media. It is painful because there’s no way that investors will accept engagement metrics such as impressions or likes as revenue. It is also painful because the CMO is being pushed by market analysts to invest more and more budget on social media. Social media spending is expected to climb to a 20.9% share of marketing budgets in 5 years even though analytics is not yet fully integrated or embedded according to a recent CMO Survey. ROI of social media in the form of adding new revenue sources, enhancing revenue sources and increasing revenues is more pronounced among companies with a more mature data-driven culture. The graph below shows where competitive advantage has been achieved as a result of data-driven marketing by stage of development according to Forbes. What’s worse for the laggards is that their immature analytics culture is resulting in the lowest profitability – they even don’t know often which analytics platforms they are paying for, according to the same research. “There is no substitute for hard work,” said Thomas Edison. In order to identify the ROI of social media marketing, we will go through the hard work by going through the requirements for your social media data below, including: Why you need integration and democratization of data in the cloud, including social media data Why you need self-service advanced and predictive analytics for your campaigns Why agility is so important when it comes to marketing analytics Why integrated reporting and insights should be easy to access by any marketer or business user Integration and democratization of data in the cloud, including social media data This is not just useful to get a single view of the truth. Working in an environment where each platform’s reporting is separate from the others, is not possible to calculate ROI of social media marketing. You may find a way to calculate the revenue of a Paid Per Click campaign in LinkedIn for example, but this is a very narrow view of what social media marketing is about. The first step to identify the real revenue of social media should be to integrate all disparate data sources in the cloud. They should be integrated, because that way you will be able to cross reference information, discover the real 360º customer view and the actual ROI. 
Also, it should be integrated in the cloud specifically so other business users can access and self-service the final insights. Some people may think they have a 360º customer view when they integrate Google Analytics and Salesforce to calculate the ROI of a paid campaign in Google Adwords. But they are still far from that view. According to Think with Google, for instance, customers in most industries in US click on a paid ad long after they were engaged in social. That means that a percentage of the ROI of the Pay Per Click campaign should be attributed to social to be accurate. What if instead of making assumptions you start to track which channel is the first, then which is next, and which is the latest? You can do this with tools like Piwik or Eloqua Insights because they track all different devices from which the customer is visiting your website as well as specific URLs they land on, and order the events by date. While this is fine if you have less than 200 sessions per day on your website, if you have more than that you will quickly understand why big data is more than just hype! Trust me, if you had the time to explore, analyze and export using Piwik or Eloqua Insights you would really know what patience means. With big data, even if you just want to use a selected small part of it, you need columnar database technology like the one used by OpenText™ Big Data Analytics. Having your data sources integrated at the start and managed by IT is fine, but when it comes to data audits, data cleansing and data enrichment you had better expect this to be self-service. B2B companies need to know as much as they can about the companies they market to and 75% of B2B marketers say that accurate data is critical for achieving their goals but lack of data on Industry, Revenue, and Employees is a problem in up to 87% of the examples. Bad data affects not just marketing but also sales, according to Forrester, executive buyers find that 77% of the sales people they meet don’t understand their issues and where they can help. As a result a lot of CMOs are taking the initiative to start profiling and creating more targeted leads so that sales don’t have this problem. Self-service advanced and predictive analytics for marketing campaigns According to a recent study by MDG Advertising, B2B organizations that utilize predictive analytics are 2x more likely to exceed their annual marketing ROI goal. The research offers interesting reasons why 89% of B2B organizations have predictive analytics on their roadmap. According to VentureBeat there are 3 main reasons why marketers aren’t that advanced in their analytical approaches – including skill gaps around data science. Easy-to-use tools can make it easier to run reports, but without a real understanding of data-driven approaches, the final report may not be accurate enough. Predictive lead scoring, for instance, can yield significant ROI and 90% of large organizations will have a Chief Data Officer or CDO by 2019. Meanwhile CMOs are not inactive and 55% of B2B organizations are already hiring for marketing analytics roles. Success in social media is not as easy as being 30 years old. You know that you will be 29 years old for 12 months and then it will automatically change to 30. This is easy to predict, but it is not scalable to social media. In terms of social media you need to go deeper than the surface to measure, for example, if it is worth having 30 social media accounts or maybe 10. 
You need to measure if it is worth having 30 blog articles or maybe 300. You will want to calculate if “Channel A” is worthwhile because it generated 300 conversions from unqualified leads or not. You may want to go further and identify which of the qualified leads bought “Product A” and “Product C” but haven’t yet bought “Product B”. Do you want better segmentations and profiles of those, so you can create a custom cross-selling campaign, based on information that you can get from LinkedIn, for instance? If so, you will need to go further than data visualization. Companies need advanced analytics to identify ROI. This is what will give you the insight on the ROI of your social media, but more than it could ensure success in the current digital era. You will find the following ad-hoc & pre-built tools at OpenText Big Data Analytics: Venn Diagram: Are you tired of reaching the wrong people? Smarter companies are reporting benefits doing data mining to target advanced segmentations and the most appropriate people with marketing material that resonates with them. Note that segmentations are based on data mining, but can be created by drag and drop of the database objects in the left column. Profile: Are you still re-marketing to visitors who landed on your website by mistake? Our drag and drop, easy-to-use tool is needed by marketing in the current customer-centric culture. B2B marketing goals for predictive analytics span the customer funnel including customer retention, customer lifetime value, customer effectiveness right up to customer acquisition – so having a customer profile is a must-have to begin. Association Rules: What if you could identify which users are likely to abandon with sentiment analysis of their activity in social media and help-desks – so you can reduce churn with a loyalty campaign? Would that help the ROI of your social media? You can find more predefined analysis screenshots and use case videos to help. Companies plan to increase spend on marketing analytics, but many will select the wrong capabilities or be unable to use them properly. Harvard Business Review alerts to this point “marketing analytics can have a substantial impact on a company’s growth, but companies must figure out how to make the best use of it”. Why agility is so important when it comes to marketing analytics You already know that advanced and predictive analytics is not new, if you think about how financial services has been using it. What’s really new is how easy analysis can be created as self-service now. In the real world, only 20% of organizations are able to deploy a model into operational use in less than 2 weeks according to TDWI, so don’t forget to ask for self-service and real-time advanced analytics. You will be happy that your analytics platform technology includes a columnar database when you get to this point. Why integrated reporting and insights should be easy to access by any marketer or business user 3 out of 4 marketers can’t measure and report on the contribution of their programs to the business. Isn’t that scary? To know the customer and how to deliver relevant data is a key business differentiator. Marketing analytics tools need to be more than a nicely displayed report, they need to allow decision makers to interact with the information according to McKinsey & Company. There’s a lot that has been written about the opportunities of using big data in supply chain and retail companies, and specifically the social media capabilities to reach their audiences. 
The retail analytics market is estimated to grow from 2.2 billion to 5.1 billion in 5 years but difficulty in sharing customer analytics is ranked as a top challenge by the Industry. Social media is a smart way to connect a customer with a specific local store, right? Instagram includes stats of impressions, reach, clicks and follower activity for businesses. There are a few tools with powerful capabilities to personalize, share and embedded HTML5 data visualizations available, but OpenText™ Information Hub is the only one that is tied to advanced and predictive analytics. 9 out of 10 sales and marketing professionals report the greatest departmental interest is in being able to access analytics within front-office applications and OpenText Information Hub is ranked as the top vendor in the latest Embedded Business Intelligence Market Study by analyst Howard Dresner – not without reason. Don’t forget to ask for an analytics platform that is perfectly fine to be scaled to unlimited users. Turn your social data into strategy, then gold Predictive analytics not only applies to what will happen next quarter, but also to what the user may want to find right now. Google doesn’t wait for you to make association rules that have probably helped you a few times – they make their big data work for millions of users in real-time. Now think about your company and one of your prospects sharing your content on social media or email. Do you create a campaign to track and nurture these actions? How do you react if a user won’t complete a form more than once? What do you do when one of your prospects is searching on your site having landed from a social media post about “Product A”? Are you able to identify that “Product C” will be the more likely purchase? There are musicians unhappy about piracy, but there are others tracking, mining and getting revenue from social data. Getting these insights on revenue before running new omnichannel campaigns will provide one voice communication, essential for a successful omnichannel strategy. What is also important: this will help with better data-driven decisions and greater return on investment of your social media budget. 86% of companies that deployed predictive analytics for two or more years saw increased marketing return on investment according to Forbes. Download the Research “Operationalizing and Embedding Analytics for Action” from TDWI The report notes that operationalizing and embedding analytics requires more than static dashboards that are updated once a day, or less. It requires integrating actionable insights into your applications, devices and databases. Download the report here.
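
To make the “Product A and Product C but not Product B” segmentation described earlier concrete, here is a minimal Python/pandas sketch on a hypothetical purchases table. It is illustrative only, not how OpenText Big Data Analytics implements its Venn diagram feature.

import pandas as pd

# Hypothetical purchases table: one row per (customer, product) purchase
purchases = pd.DataFrame({
    "customer_id": [1, 1, 2, 2, 2, 3, 3, 4],
    "product":     ["A", "C", "A", "B", "C", "A", "C", "B"],
})

# Set of products each customer has bought
bought = purchases.groupby("customer_id")["product"].agg(set)

# Customers who bought both A and C...
has_a_and_c = bought.apply(lambda p: {"A", "C"} <= p)
# ...but have not bought B yet
no_b = bought.apply(lambda p: "B" not in p)

targets = bought[has_a_and_c & no_b].index.tolist()
print(targets)  # [1, 3] -> the shared list for a "Product B" cross-selling campaign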

Read More

CRM for BI? There’s No Substitute for Full-Powered Analytics


As CRM systems get more sophisticated, many people think these applications will answer all of their business questions – for example, “Which campaigns are boosting revenues the most?” or “Where are my opportunities getting stuck?” Unfortunately, those people are mistaken.

You have a CRM

Some vendors of CRM (or customer relationship management) platforms like to talk about their analytic capabilities. Usually, those CRM systems have reporting capabilities and let users make some dashboards or charts to follow the evolution of their business: tracking pipeline, campaign effectiveness, lead conversion, quarterly revenue, and so on. But from the perspective of analytics, this is only the smallest fraction of the value of the information your CRM is capturing from your sales teams and your customers. A CRM system is perfect for its intended purpose: managing the relationship between your sales force and your customers (or potential customers) and all the information related to them. And yes, it gathers lots of data in its repository, ready to be mined. What you may not realize is that the tool that collects transactional data is not necessarily the best tool to take advantage of it. It is critical for a CRM to be agile and fast in its interactions with your sales force. If it’s not, it interferes with sales people selling, building relationships with customers, and contacting prospects. So an ideal CRM system should be architected to collect, structure, and deploy transactional data in a way that the platform can manage easily. Here’s the bad news: this kind of agile, transactional data structure isn’t great for analytics.

Complex questions and CRMs don’t match

Some CRM vendors try to add certain business intelligence capabilities to their applications, but they face a basic problem. Data that is optimized for quick transactional access is not prepared to be analyzed, and in this scenario there are lots of questions that can’t be answered easily:

Which accounts have gone unattended in the last 30 days?
Who owns specific accounts?
Where are my opportunities getting stuck?
At a given point in the sales cycle, which accounts are generating the most revenue?
How profitable is this account?
Which campaigns are influencing my opportunities?
Which products are sold together most frequently?

These are only a few of the hundreds of questions that can’t be answered with the data and analytic techniques a CRM offers alone. It becomes impossible if you want to blend CRM data with external data, because a closed system like a CRM doesn’t have that capability.

CRM and full-powered analytics software

CRM is a perfect tool for some things. However, it isn’t up to the demands of addressing the ever more complex questions companies need answered. Getting a 360-degree view of the business shouldn’t be a barrier to growing and increasing revenue. OpenText™ Big Data Analytics helps companies blend and prepare their data, and provides a fast, agile, and flexible toolbox to let business analysts look for answers to the questions that decision-makers require. CRM data is one important part of this equation; being more competitive using full-powered analytic software is another.
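
As an illustration of why questions like these belong in an analytical tool rather than in the CRM itself, here is a minimal Python/pandas sketch of the “unattended accounts” question, run against hypothetical CSV exports of accounts and activities (file and column names are made up).

import pandas as pd

# Hypothetical CSV exports from the CRM
accounts = pd.read_csv("accounts.csv")                    # account_id, account_name, owner
activities = pd.read_csv("activities.csv",
                         parse_dates=["activity_date"])   # account_id, activity_date

# Most recent activity per account
last_touch = (activities.groupby("account_id")["activity_date"]
              .max().rename("last_touch").reset_index())

merged = accounts.merge(last_touch, on="account_id", how="left")
cutoff = pd.Timestamp.today() - pd.Timedelta(days=30)

# Unattended: no activity at all, or none within the last 30 days
unattended = merged[merged["last_touch"].isna() | (merged["last_touch"] < cutoff)]
print(unattended[["account_id", "account_name", "owner", "last_touch"]])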

Read More

How to Identify the Return on Investment of Big Data (Infographic)


If you are a CIO, you know what goes on at a board of directors meeting. This is not the place to be confused when your CEO asks you a simple, commonly asked question, which requires a simple, accurate answer: “What is the ROI of Big Data costs?” Let’s be honest, Big Data is a big, painful issue. It is often recommended to use just a little information in your dashboard if you want to be heard. But at the same time, you want this information to be accurate, right? That’s when you are happy that your data is big. Why is data size so important? Data scientists, for instance, are always asking for big data because they know how predictive analysis can easily be inaccurate if there isn’t enough data on which to base your predictions. We all know this when it comes to the weather forecast – it is the same when it comes to risk anticipation or sales opportunities identification. What’s really new is how easily any software can access big data. If 90% of the world’s data today has been created in the last 2 years alone, what can we expect in the near future? For starters, the Internet of Things is anticipated to hugely increase the volumes of data businesses will have to cope with. ROI is all about results. Big data is here to stay and bigger data is coming, so the best we can do is to make it worth the trouble. “Cloud is the new electricity,” according to the latest IT Industry Outlook published by CompTIA. But I don’t have good news for you, if you feel comfortable just planning to move your data to the cloud. This is just the beginning. Experts often say that big data doesn’t have much value when simply stored; your spending on big data projects should be driven by business goals. So it’s not a surprise that there is increased interest in gaining insights from big data and making data-based decisions. In fact, advanced analytics and self-service reporting is what you should be planning for your big data. I’ll briefly tell you why: You need to support democratization of data integration and data preparation in the cloud You should enable software for self-service advanced and predictive analytics Big data insights and reporting should be put where they’re needed Why support democratization of data integration and data preparation in the cloud Big data analytics, recruiting talent and user experience top the CIO agenda for 2016, according to The Wall Street Journal; but these gaps will hardly be solved in time, because of the shortage of data-savvy people. Actually, according to analysts, there is an anticipated 100,000+ analytic talent shortage of people through to 2020. So, meanwhile CIOs find solutions to their own talent gaps; new software and cloud services appear in the market to enable business users to get the business insights and advance ROI of big data. Hopefully, someday, a data scientist can provide those insights but platforms like OpenText™ Big Data Analytics includes easy-to-use, drag-and-drop features to load and integrate different data sources from the front end or the back end, in the cloud. Now, I say hopefully because requirements for data scientists are no longer the same. Knowledge of coding is often not required. According to Robert J. Lake, what he requires from data scientists at Cisco is to know how to make data-driven decisions – that’s why he leaves data scientists to play with any self-service analytics tool that may help them to reach that goal. 
Data scientists spend around 80% of their time preparing data, rather than actually getting insights from it – so interest in self-service data preparation is growing. Leaving the data cleansing to data scientists may seem like a favor to their colleagues, but it is not a good idea in terms of agility and accuracy. That’s one reason cloud solutions like Salesforce are appreciated: they leave sales people time to collaborate – adding, editing, or removing information that gives a more precise view of their prospects, one that only they can provide with such precision. What if you could expect the same from a Supply Chain Management or Electronic Health Record system, where data audits depend on multiple worldwide data sources, with distinct processes and no dependency on data experts at all? In fact, 95% of organizations want end users to be able to manage and prepare their own data, according to noted market analyst Howard Dresner. Analysts predict that the next big market disruption is self-service data preparation, so expect to hear more about it in the near future.

Why you should enable self-service advanced and predictive analytics

Very small businesses may find desktop tools like Excel good enough for their data analysis, but after digital disruption these tools have become inadequate even for small firms. The need for powerful analytic tools is even greater for larger companies in data-intensive industries such as telecommunications, healthcare, or government. The columnar database has been proposed as the solution, as it is much speedier than relational databases when querying hundreds of millions or billions of rows. The speed of a cloud service depends on the volume of data as well as the hardware itself. Measuring the speed of this emerging technology is not easy, but even a whole NoSQL movement is advising that relational databases are not the best future option. Companies have been able to identify the ROI of big data using predictive analytics to anticipate risk or forecast opportunities for years. For example, banks, mortgage lenders, and credit card companies use credit scoring to predict customers’ profitability. They have been doing this even though complex algorithms require data scientists with hard-to-find expertise, not just to build them but to keep them running, which limits their spread through an organization. That’s why OpenText™ Big Data Analytics in the Cloud includes ad-hoc and pre-built algorithms like the following (a rough sketch of the Decision Tree use case appears at the end of this post):

Profile: If you are able to visualize a Profile of a specific segment of your citizens, customers or patients and then personalize a campaign based on the differentiating values of this segment, why would the ROI of the campaign not be attributed to the big data that previously stored it?

Forecasting: If the cloud application is able to identify cross-selling opportunities and a series of campaigns are launched, the ROI of those campaigns could be attributed to the big data that you previously secured. (In the Forecast results, the greater the number of stars shown, the stronger the evidence for non-randomness.)

Decision Tree: You should be able to measure the ROI of a new process based on customer risk identification during the next fiscal year and attribute it to big data that you previously stored in the cloud.

Association Rules: You can report the ROI of a new recruitment requirement based on an analysis of job abandonment information and attribute it to big data that you had previously enabled as a self-service solution.
This is actually when you are grateful for having so much information and having it so clean! Customer analytics for sales and marketing provide some of the classic use cases. Looking at the patterns from terabytes of information on past transactions can help organizations identify the reasons behind customer churn, the ideal next offer to make to a prospect, detect fraud, or target existing customers for cross-selling and up-selling. Put Big Data insights and reporting where they’re needed Embedded visualizations and self-service reporting are key to allow the benefits of data-driven decisions into more departments, because it doesn’t require expert intervention. Instead, non-technical users can spontaneously “crunch the numbers” on business issues as they come up. Today 74% of marketers can’t measure and report on the contribution of their programs to the business according to VisionEdge Marketing. Imagine that you as a CIO have adopted a very strong advanced analytics platform, but the insights are not reaching the right people – that is, in case of a hospital, the doctor or the patient. Let’s say the profile of the patient and drug consumption is available in someone’s computer, but that insight is not reachable by any user who can make the difference when a new action is required. The hospital’s results will never be affected in that case by big data and the ROI potential will not be achieved because the people who need the insights are not getting them, and the hospital will not change with or without big data. This is called invisible analytics. Consider route optimization of a Supply Chain – the classic “traveling salesman problem.” When a sizable chunk of your workforce spends its day driving from location to location (sales force, delivery trucks, maintenance workers), you want to minimize the time, miles, gas, and vehicle wear and tear, while making sure urgent calls are given priority. Moreover, you want to be able to change routes on the fly – and let your remote employees make updates in real-time, rather than forcing them to wait for a dispatcher’s call. Real-time analytics and reporting should be able to put those insights literally in their hands, via tablets, phones, or smart watches, giving them the power to anticipate or adjust their routes. OpenText™ Information Hub offers these capabilities as a powerful ad-hoc and built-in reporting tool that enables any user to personalize how their company wants information and the data visualization to be displayed. You should always ensure that the security and scalable capabilities of the tool you need is carefully selected, because in such cases you will be dealing not only with billions of rows, but also maybe millions of end users. As mentioned at the start of this blog, user experience is also at the top of the CIO’s agenda. True personalization that ensures the best user experience requires technology that can be fully branded and customized. The goal should be to adapt data visualizations to the same look and feel as the application to provide a seamless user experience. UPS gathers information at every possible moment and stores over 16 petabytes of data. They make more than 16 million shipments to over 8.8 million customers globally, receive on average 39.5 million tracking requests from customers per day, employ 399.000 people in 220 different countries. They spend $1 billion a year on big data but their revenue in 2012 was $ 54.1 billion. 
Identification of the ROI of big data depends on the democratization of the business insights coming from advanced and predictive analytics of that information. Nobody said it is simple, but it can lower operating costs and boost profits, which every business user recognizes as ROI. Moreover, when line-of-business users rather than technology users are driving the analysis, and the right people are getting the right insight when they need it, improved future actions should feed the wheel of big data with the bigger data that is coming. And surely you want it to come to the right environment, right?

Download the Internet of Things and Business Intelligence report from Dresner

The Internet of Things and Business Intelligence from Dresner Advisory Services is a 70-page research report that provides a wealth of information and analysis, offering value to consumers and producers of business intelligence technology and services. The business intelligence vendor ratings include scores for location intelligence, end-user data preparation, cloud BI, and advanced and predictive analytics – all key capabilities for business intelligence in an IoT context. Download here.
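
For readers who want a feel for the Decision Tree use case mentioned above, here is a small, hypothetical Python sketch using scikit-learn as a stand-in for the pre-built algorithm. The customer data is made up and the example is purely illustrative.

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Made-up customer history: the label marks customers who later proved high-risk
customers = pd.DataFrame({
    "income":        [52, 18, 75, 33, 41, 22, 90, 28, 61, 37],  # thousands per year
    "late_payments": [0, 4, 0, 2, 1, 5, 0, 3, 1, 2],
    "tenure_years":  [6, 1, 9, 2, 4, 1, 12, 2, 7, 3],
    "high_risk":     [0, 1, 0, 1, 0, 1, 0, 1, 0, 1],
})

X = customers[["income", "late_payments", "tenure_years"]]
y = customers["high_risk"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0, stratify=y)

# Train a small decision tree and score new customers by risk probability
model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
new_customers = pd.DataFrame({"income": [45, 20],
                              "late_payments": [1, 4],
                              "tenure_years": [5, 1]})
print(model.predict_proba(new_customers)[:, 1])  # probability of being high-risk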

Read More

Big Data: The Key is Bridging Disparate Data Sources

Big Data

People say Big Data is the difference between driving blind in your business and having a full 360-degree view of your surroundings. But adopting big data is not only about collecting data. You don’t get a Big Data club card just for changing your old (but still trustworthy) data warehouse into a data lake (or even worse, a data swamp). Big Data is not only about sheer volume of data. It’s not about making a muscular demonstration of how many petabytes you stored. To make a Big Data initiative succeed, the trick is to handle widely varied types of data, disparate sources, datasets that aren’t easily linkable, dirty data, and unstructured or semi-structured data. At least 40% of the C-level and high-ranking executives surveyed in the most recent NewVantage Partners’ Big Data Analytics Survey agree. Only 14.5% are worried about the volume of the data they’re trying to handle. One OpenText prospect’s Big Data struggle is a perfect example of why the key challenge is not data size but complexity. Recently, OpenText™ Analytics got an inquiry from an airline that needed better insights in order to head off customer losses. This low-cost airline had made a discovery about its loyal customers. Some of them, without explanation, would stop booking flights. These were customers that used to fly with them every month or even every week, but were now disappearing unexpectedly. The airline’s CIO asked why this was happening. The IT department struggled to push SQL queries against different systems and databases, exploring common scenarios for why customers leave. They examined:

The booking application, looking for lost customers (or “churners”). Who has purchased flights in previous months but not the most recent month? Which were their last booked flights?

The customer service ticketing system, to find out whether any of the “churners” found in the booking system had a recent claim. Were any of those claims solved? Closed by the customer? Was there any hint of customer dissatisfaction? What are the most commonly used terms in their communications with the airline – for example, prices? Customer support? Seats? Delays? And what was the tone or sentiment around such terms? Were they calm or angry? Merely irked, or furious and threatening to boycott the airline?

The database of flight delays, looking for information about the churners’ last bookings. Were there any delays? How long? Were any of these delayed flights cancelled?

Identifying segments of customers who left the company during the last month, whether due to claims unresolved or too many flights delayed or canceled, would be the first step towards winning them back. So at that point, the airline’s IT department’s most important job was to answer the CIO’s question: may I have this list of customers? The IT staff needed more than a month to get answers to these questions, because the three applications and their databases didn’t share information effectively. First they had to move long lists of customer IDs, booking codes, and flight numbers from one system to another. Then repeat the process when the results weren’t useful. It was a nightmare crafted of disparate data, complex SQL queries, transformation processes, and a lot of effort – and it delivered answers too late for the decision-maker. A new month came with more lost customers. That’s when the airline realized it needed a more powerful, flexible analytics solution that could effortlessly draw from all its various data sources.
Intrigued by the possibilities of OpenText Analytics, the airline asked us to demonstrate how we could solve its problem. Using OpenText™ Big Data Analytics, we blended the three disparate data sources and answered the CIO’s questions in just 24 hours. The true value of Big Data is getting answers out of data coming from several diverse sources and different departments – the genuine 360-degree view of the business that everyone is talking about. But without an agile, flexible way to get that view, value is lost in delay. Analytical repositories built on columnar technology – the foundation of OpenText Analytics solutions – exist to answer questions fast, while the decision-maker still needs them.
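To make the cross-source question concrete, here is a minimal sketch of the churn analysis in Python, assuming simple CSV exports from the three systems. The file names, columns, and logic are illustrative assumptions, not the airline’s actual schema or a depiction of the OpenText product.

```python
# Minimal sketch of the cross-source churn question, assuming hypothetical CSV
# exports from the booking, claims, and flight-delay systems. File names,
# column names, and logic are illustrative assumptions only.
import pandas as pd

bookings = pd.read_csv("bookings.csv", parse_dates=["flight_date"])
claims = pd.read_csv("claims.csv", parse_dates=["opened"])
delays = pd.read_csv("delays.csv")  # assumed columns: flight_number, delay_minutes

# Churners: customers who booked in earlier months but not in the latest month.
month = bookings["flight_date"].dt.to_period("M")
latest = month.max()
recent = set(bookings.loc[month == latest, "customer_id"])
earlier = set(bookings.loc[month < latest, "customer_id"])
churners = earlier - recent

# Last flight per churner, plus any unresolved claims and delays on that flight.
last_flights = (bookings[bookings["customer_id"].isin(churners)]
                .sort_values("flight_date")
                .groupby("customer_id").tail(1))
open_claims = (claims[claims["customer_id"].isin(churners) & (claims["status"] != "resolved")]
               .groupby("customer_id").size().reset_index(name="open_claims"))

report = (last_flights[["customer_id", "flight_number", "flight_date"]]
          .merge(open_claims, on="customer_id", how="left")
          .merge(delays, on="flight_number", how="left"))
print(report)  # the "list of customers" the CIO asked for, with context
```

The point is not the code itself but that the three sources have to be joined on shared keys (customer IDs, flight numbers) before the CIO’s question can even be asked – exactly the step that took the IT team more than a month across disconnected systems.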


Analytics at 16 — Better Insight Into Your Business


It’s been 16 years since the dawn of Business Intelligence version 2.0. Back then, the average business relied on a healthy diet of spreadsheets with pivot tables to solve complex problems such as resource planning, sales forecasting, and risk reporting. Larger organizations lucky enough to be cash-rich employed data scientists armed with enterprise-grade BI tools to give them better business insights. Neither method was perfect. Data remained in silos, accessible in days, not minutes. Methodologies such as Online Analytical Processing (OLAP), extract-transform-load (ETL), and data warehousing were helpful in computing and storing this data, but limitations on functionality and accessibility remained.

Fast forward to today. Our drive to digitize analytics and provide a scalable platform creates opportunities for businesses to use and access any data source, from the simplest flat files to the most complex databases and online data. Advanced analytics tools now come as standard with connectors for multiple disparate data sources and a remote data provider option for loading data from a web address. These improvements in business analytics capabilities give industry analysts a rosy outlook for the BI and analytics market: one is forecasting global revenue in the sector to reach $16.9 billion this year, an increase of 5.2 percent from 2015.

A Better Way to Work

While business leaders are clamoring for more modern analytics tools, what do key stakeholders — marketers and the business analysts who support them, end users, and of course IT and their development teams — really want in terms of outcomes? Simple: businesses want their analytics easy to use, fast, and agile. Leading technology analysts have commented that the shift to the modern BI and analytics platform has reached a tipping point, and that transitioning to a modern platform provides the opportunity to create business value from deeper insights into diverse data sources. Over the last few years, OpenText has positioned its Analytics software to serve in the context of the application (or device, or workflow) to deliver personalized information, drive user adoption, and delight customers. Our recent Release 16 of the Analytics Suite is helping to enable our vision of using analytics as “A Better Way to Work.” The Analytics Suite features common, shared services between the two main products — OpenText™ Information Hub (iHub) and OpenText™ Big Data Analytics (BDA) — such as single sign-on, a single security model, common access, and shared data. Additionally, iHub accesses BDA’s engine and analysis results to visualize them. The solution includes broadly functional APIs based on REST and JavaScript for embedding. Both iHub and BDA can be deployed either on-premises or as a managed service, serving business and technical users alike.

Understand and Engage

This focus drives our approach. At a high level, we enable two key use cases. First, advanced analytics harnesses the power of your data to help you better understand your market or your customers (or your factory or network in an Internet of Things scenario). Second, you engage those users and decision-makers with data-driven visual information such as charts, reports, and dashboards — on the app or device of their choice.
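For developers, the embedding story above (REST plus JavaScript APIs) usually boils down to a familiar pattern: authenticate, then request rendered content to place inside your own application. Here is a hedged sketch of that pattern in Python; the host, endpoint paths, and response fields are placeholders for illustration, not the documented iHub API.

```python
# Hedged sketch of the "authenticate, then request rendered content" pattern.
# The host, endpoint paths, and response fields below are placeholders for
# illustration only -- they are not the documented iHub REST API.
import requests

BASE = "https://ihub.example.com/api"  # hypothetical server

# Step 1: authenticate (placeholder route and payload).
login = requests.post(f"{BASE}/login", json={"user": "analyst", "password": "secret"})
login.raise_for_status()
token = login.json().get("authToken")  # assumed response field

# Step 2: ask the server for a rendered dashboard to embed in an application.
resp = requests.get(
    f"{BASE}/reports/customer-dashboard/render",  # placeholder route
    params={"format": "html"},
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()

with open("embedded_dashboard.html", "w", encoding="utf-8") as f:
    f.write(resp.text)  # an HTML fragment an application could embed
```

The JavaScript API serves the same purpose on the client side, letting a web application pull live visuals directly into its own pages.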
Whether you are looking to build smarter customer applications or harness the power of your big data to beat the competition to market, analytics is the bridge between your digital strategy and business insights that drive smarter decisions—for every user across the enterprise. Check out the Analytics Suite today and gain deeper insight into your business.


Achieve Deeper Supply Chain Intelligence with Trading Grid Analytics


In an earlier blog I discussed how analytics could be applied across supply chain processes to help businesses make more informed decisions about their trading partner communities. Big Data analytics has been used across supply chain operations for a few years; however, the real power of analytics is only realized when it is applied to the transactions flowing between trading partners. Embedding analytics into transaction flows lets companies take a more accurate ‘pulse’ of what is going on across supply chain operations. In this blog I would like to introduce a new offering that is part of our Release 16 launch: OpenText™ Trading Grid Analytics.

The OpenText™ Business Network processes over 16 billion EDI-related transactions per year, and this provides a rich seam of information to mine for improved supply chain intelligence. Last year, OpenText expanded its portfolio of Enterprise Information Management solutions with the acquisition of an industry-leading embedded analytics company. The analytics solution that OpenText acquired is being embedded within a number of cloud-based SaaS offerings connected to OpenText’s Business Network. Trading Grid Analytics provides the ability to mine transaction flows for both operational and business-specific metrics. I explained the difference between operational and business metrics in my previous blog, but to recap briefly:

- Operational metrics deliver the transactional data intelligence and volume trends needed to improve operational efficiencies and drive company profitability.
- Business metrics deliver the business process visibility required to make better decisions faster, spot and pursue market opportunities, mitigate risk, and gain business agility.

Trading Grid Analytics will initially offer nine out-of-the-box metrics (covering EDIFACT and ANSI X12 based transactions) – two operational and seven business metrics – all displayed in a series of highly graphical reporting dashboards.

Operational Metrics
- Volume by Document Type – Number and type of documents sent and received over a period of time (days, months, years)
- Volume by Trading Partners – Number and type of documents sent and received, ordered by top 10 and bottom 10 partners

Business Metrics
- ASN Timeliness – Number of timely ASN creation instances as a percentage of total ASNs for a time period
- Price Variance – The actual invoiced cost of a purchased item, compared to the price at the time of order
- Invoice Accuracy – Whether invoices accurately reflect orders placed in terms of product, quantities, and price by supplier, during a specified period of time
- Quantity Variance – The remaining quantity to be invoiced from a purchase order, equaling the difference between the quantity delivered and the quantity invoiced for goods received
- Order Acceptance – Fully acknowledged POs as a percentage of the total number of POs within a given period of time
- Top Partners by Spend – Top trading partners by economic spend over a period of time
- Top Products by Spend – Top products by economic spend over time

Supply chain leaders and procurement professionals need an accurate picture of what is going on across their trading partner communities so that they can, for example, identify leading trading partners and have the information on hand to support the negotiation of new supply contracts.
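To make two of these definitions concrete, here is a minimal sketch of how Price Variance and Invoice Accuracy could be computed from matched order and invoice records. The records and field names are invented for illustration; in Trading Grid Analytics these metrics come out of the box, driven by the EDIFACT and ANSI X12 documents themselves rather than hand-written scripts.

```python
# Illustrative calculation of two of the metrics defined above, using
# made-up order/invoice records; field names are not the Trading Grid schema.
orders = [
    {"po": "PO-1", "sku": "A", "qty": 100, "unit_price": 9.50},
    {"po": "PO-2", "sku": "B", "qty": 40,  "unit_price": 25.00},
]
invoices = [
    {"po": "PO-1", "sku": "A", "qty": 100, "unit_price": 9.80},   # price crept up
    {"po": "PO-2", "sku": "B", "qty": 40,  "unit_price": 25.00},  # matches the PO
]

accurate = 0
for order, invoice in zip(orders, invoices):
    # Price variance: invoiced cost vs. cost at the time of order.
    variance = (invoice["unit_price"] - order["unit_price"]) * invoice["qty"]
    print(f'{order["po"]}: price variance {variance:+.2f}')

    # Invoice accuracy: product, quantity, and price all match the order.
    if (invoice["sku"], invoice["qty"], invoice["unit_price"]) == \
       (order["sku"], order["qty"], order["unit_price"]):
        accurate += 1

print(f"Invoice accuracy: {accurate / len(invoices):.0%}")
```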
Trading Grid Analytics is a cloud-based analytics platform that offers:

- Better Productivity – Transaction-related issues can be identified and resolved more quickly
- Better Insight – Deeper insight into transactional and supply chain information drives more informed decisions
- Better Control – Improved visibility into exceptions and underperforming partners allows corrective action to be taken earlier in a business process
- Better Engagement – Collaborate more closely with top partners and mitigate risk with underperforming partners
- Better Innovation – A cloud-based reporting portal provides access any time, from anywhere

More information about Trading Grid Analytics is available here. You can also learn more about the benefits of supply chain analytics.


Unstructured Data Analytics: Replacing ‘I Think’ With ‘We Know’


Anyone who reads our blogs is no doubt familiar with structured data — data that is neatly settled in a database. Row and column headers tell it where to go, the structure opens it to queries, and graphic interfaces make it easy to visualize. You’ve seen the resulting tables of numbers and words everywhere from business to government and scientific research. The problem is all the unstructured data, which some research firms estimate could make up between 40 and 80 percent of all data. This includes emails, voicemails, written documents, PowerPoint presentations, social media feeds, surveys, legal depositions, web pages, video, medical imaging, and other types of content.

Unstructured Data, Tell Me Something

Unstructured data doesn’t give up its underlying patterns easily. Until recently, the only way to get a sense of a big stack of reports or open-ended survey responses was to read through them and hope your intuition picked up on common themes; you couldn’t simply query them. But over the past few years, advances in analytics and content management software have given us more power to interrogate unstructured content. Now OpenText is bringing together powerful processing capabilities from across its product lines to create a solution for unstructured data analytics that can give organizations a level of insight into their operations they might not have imagined before.

Replacing Intuition with Analytics

The OpenText solution for unstructured data analytics has potential uses in nearly every department and industry. Wherever people are looking intuitively for patterns and trends in unstructured content, our solution can dramatically speed up and scale out their reach. It can help replace “I feel like we’re seeing a pattern here…” with “The analytics tell us customers love new feature A but find new feature B really confusing; they wonder why we don’t offer potential feature C.” Feel more confident in your judgment when the analytics back you up.

The Technology Under the Hood

This solution draws on OpenText’s deep experience in natural language processing and data visualization. It scales to handle terabytes of data and millions of users and devices. Open APIs, including a JavaScript API (JSAPI) and REST, promote smooth integration with enterprise applications. And it offers built-in integration with other OpenText solutions for content management, e-discovery, visualization, archiving, and more. Here’s how it works:

1. OpenText accesses and harvests data from any unstructured source, including written documents, spreadsheets, social media, email, PDFs, RSS feeds, CRM applications, and blogs.
2. OpenText InfoFusion retrieves and processes the raw data; extracts people, places, and topics; and then determines the overall sentiment.
3. Visual summaries of the processed information are designed, developed, and deployed on OpenText Information Hub (iHub).
4. Visuals are seamlessly embedded into the app using iHub’s JavaScript API.
5. Users work with interactive analytic visualizations that let them reveal interesting facts and gain unique insights from the unstructured data sources.

Below are two common use cases we see for the OpenText solution for unstructured data analytics, but more come up every day, from retail and manufacturing to government and nonprofits. If you think of further ways to use it, let us know in the comments below.
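Before we get to those use cases, here is a deliberately tiny, conceptual sketch of what steps 1 and 2 above amount to: naive keyword-based topic and sentiment tagging over a few free-text snippets. It is illustrative only; the word lists, scoring, and sample text are invented, and this is not the InfoFusion or iHub API.

```python
# Conceptual sketch only: a toy version of the harvest -> extract -> visualize
# flow described above, using naive keyword matching on invented snippets.
from collections import Counter

documents = [
    "The mobile app keeps crashing when I check my mortgage balance.",
    "Great support today - my card replacement arrived early!",
    "Still waiting on my refinancing paperwork, very frustrating.",
]

POSITIVE = {"great", "early", "thanks", "love"}
NEGATIVE = {"crashing", "frustrating", "waiting", "complaint"}
TOPICS = {"mortgage", "refinancing", "card", "app"}

topic_counts, sentiment = Counter(), Counter()
for doc in documents:
    words = {w.strip(".,!?-").lower() for w in doc.split()}
    topic_counts.update(words & TOPICS)
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    sentiment["positive" if score > 0 else "negative" if score < 0 else "neutral"] += 1

print("Topics:", topic_counts.most_common())   # which themes dominate
print("Sentiment mix:", dict(sentiment))       # feed into a dashboard
```

A production pipeline replaces the keyword sets with trained natural language models and pushes the aggregates into dashboards, but the structural idea is the same: turn free text into countable topics and sentiment.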
Use Case 1: On-Demand Web Chat

A bank we know told us recently how its customer service team over the past year or two had been making significantly more use of text-based customer support tools—in particular, pop-up web chat. This meant the customer service managers were now collecting significantly more “free text” on a wide range of customer support issues, including new product inquiries, complaints, and requests for assistance. Reading through millions of lines of text was proving highly time-consuming, but ignoring them was not an option. The bank’s customer service team understood that having the ability to analyze this data would help them spot and understand trends (say, interest in mortgage refinancing) or frequent issues (such as display problems with a mobile interface). Identifying gaps in offerings, common problems, or complaints regarding particular products could help them improve their overall customer experience and stay competitive.

Use Case 2: Analysis of Complaints Data

Another source of unstructured data is the notes customer service reps take while on the phone with customers. Many CRM systems let users type in open-ended comments in addition to the radio buttons, checklists, and other data-structuring features for recording complaints, but they don’t offer built-in functionality to analyze this free-form text. A number of banking representatives told us they considered this a major gap in their current analytics capabilities. Typically, a bank’s CRM system will offer a “pick list” of already identified problems or topics that customer service reps can choose from, but such lists don’t always provide the level of insight a company needs about what’s making its customers unhappy. Much of the detail is captured in unstructured free-text fields that they have no easy way to analyze. If they could quickly identify recurring themes, the banks felt they could be more proactive about addressing problems. Moreover, the banks wanted to analyze the overall emotional tone, or sentiment, of these customer case records and other free-form content sources, such as social media streams. Stand-alone tools for sentiment analysis do exist, but they are generally quite limited in scope or difficult to customize. The banks wanted a tool that would integrate easily with their existing CRM system and combine its sentiment analysis with other, internally focused analytics and reporting functions—for example, to track changing consumer sentiment over time against sales or customer-service call volume.

A Huge, Beautiful Use Case: Election Tracker ’16

These are just two of the many use cases for the OpenText solution for unstructured data analytics; we’ll discuss more in future blog posts. You may already be familiar with the first application powered by the solution: the Election Tracker for the 2016 presidential race. The tracker, along with the interesting insights it sifts from thousands of articles about the campaign, has been winning headlines of its own. Expect to hear more about Election Tracker ’16 as the campaign continues. Meanwhile, if you have ideas on other ways to use our Unstructured Data Analytics solution in your organization, leave them in the comments section.
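Returning briefly to Use Case 2: the reporting the banks asked for – tracking sentiment over time against call volume – is easy to picture once the free text has been scored. Here is a minimal sketch, assuming each case record has already been given a sentiment score upstream; the data and field names are invented.

```python
# Minimal sketch of the reporting described in Use Case 2: monthly average
# sentiment of free-text case notes alongside case volume. Assumes each
# record has already been scored (e.g. -1..1) by an upstream pipeline.
import pandas as pd

cases = pd.DataFrame({
    "opened":    pd.to_datetime(["2016-01-04", "2016-01-18", "2016-02-02",
                                 "2016-02-20", "2016-03-07", "2016-03-11"]),
    "sentiment": [0.4, -0.6, -0.2, -0.8, 0.1, -0.5],   # illustrative scores
    "channel":   ["chat", "phone", "phone", "chat", "phone", "phone"],
})

monthly = (cases.assign(month=cases["opened"].dt.to_period("M"))
                .groupby("month")
                .agg(avg_sentiment=("sentiment", "mean"),
                     case_volume=("channel", "count")))
print(monthly)   # a trend a CRM team could chart next to sales or call volume
```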


The ‘It’ Role in IT: Chief Data Officer


A recent analyst report predicts that the majority of large organizations (90%) will have a Chief Data Officer by 2019. This trend is driven by the competitive need to improve efficiency through better use of information assets. To discuss this evolving role and the challenges facing the CDO, Enterprise Management 360 Assistant Editor Sylvia Entwistle spoke to Allen Bonde, a long-time industry watcher with a focus on big data and digital disruption.

Enterprise Management 360: Why are more and more businesses investing in the role of Chief Data Officer now?

Bonde: No doubt the Chief Data Officer is kind of the new ‘It’ role in the executive suite; there’s a lot of buzz around it. Interestingly enough, it spans operations, technology, even marketing, so we see this role in different areas of organizations. I think companies are looking for one leader to be a data steward for the organization, but they’re also looking for that same role to be a bit more focused on the future – to be the driver of innovation driven by data. So you could say this role is growing because it sits at the crossroads of digital disruption and technology, as well as the corporate culture and leadership needed to manage in this new environment. It’s a role that I think could become the new CIO in certain really data-centric businesses. It could also become the new Chief Digital Officer in some cases. If you dig a little into responsibilities, it’s really about overseeing data from legacy systems and all of your third-party partners, as well as managing data compliance and financial reporting. So it’s a role that has both an operational component and a visionary, strategic component. This is where it perhaps departs from a purely technical role – there’s almost a left-brain, right-brain aspect to the CDO. The role is empowered by technology, but ultimately it’s about people using technology in the right way, in a productive way, to move the business forward.

Enterprise Management 360: What trends do you think will have the biggest impact on the Chief Data Officer in 2016?

Bonde: In terms of the drivers, it’s certainly about the growth of digital devices and data, especially data coming from new sources. So there’s a lot of focus on device data with IoT, or omni-channel data coming from the different touch points in the customer journey. These new sources of data, even if we don’t own them per se, are impacting our business, so I think that’s a big driver. Then there’s this notion of how new devices are shifting consumption models. If you think about just a simple smartphone, this device is both creating and consuming data, changing the dynamic of how individuals create and interact with the data they consume. There’s a whole backdrop to this from a regulatory perspective, with privacy and other risk factors. Those drivers are motivating companies to say, “We need an individual or an office to take the lead in balancing the opportunity of data with the risk of data, making sure that, number one, we don’t get in trouble as a business and, number two, we take advantage of the opportunity in front of us.”

Enterprise Management 360: Homing in on the data side of things – what are the most important levers for ensuring a return on data management, and how do you think those returns should be measured?

Bonde: A common thread, wherever the CDO sits in the organization, is a focus on outcomes – and yes, it’s about technology; yes, it’s about adoption.
In terms of the so-called CDO agenda, I think the outcomes need to be pretty crisp – for example, we need to lower our risk profile, or we need to improve our margins for a certain line of business, or we’re all about bringing a product to market. So focusing on outcomes and getting alignment around those outcomes is the first and most important lever you have as a CDO. The second, I would argue, is adoption; you can’t do anything at scale with data if you don’t have widespread adoption. The key to unlocking the value of data as an asset is driving the broadest adoption, so that most people in an organization – including your customers and your partners – get value out of the insights you’re collecting. Ultimately this is about delivering insight into the everyday work the organization is doing, which is very different from classic business intelligence or even big data, which used to be the domain of a relatively small number of people within the organization who parceled out the findings as they saw fit. I think the CDO is breaking down those walls, but this also means the CDO is facing a much bigger challenge than simply getting BI tools into the hands of a few business analysts.

Enterprise Management 360: A lot of people see big data as a problem. Where would you draw the line between the hype and the fact of big data?

Bonde: It’s a funny topic to me, having started earlier in my career as a data scientist working with what were then very large data sets to see if we could manage risk. This was in a telco, and we didn’t call it big data then, but we had the challenge of lots of data from different sources and of trying to pull the meaning out, so that we could make smarter decisions about who might be a fraudulent customer or which markets we could go after. When you talk to CDOs about big data, I think the role has benefited from the hype around big data, because big data set the table for the need for a CDO. Yet CDOs aren’t necessarily focused just on big data; in fact, one CDO we were talking to said, “We have more data than we know what to do with.” They acknowledged that they already have lots of data, but their main struggle was in understanding it. It wasn’t necessarily a big data problem; it was a problem of finding the right data. We see this day to day when we work with different clients: you can do an awful lot by blending different data sources – not necessarily big data sources, just different types of data – and you can create insight from relatively small data sets if it’s packaged and delivered in the right way. IoT is a perfect example. People are getting excited about the challenges that managing data in the era of IoT will present, but it’s not really a big data problem, it’s a multi-source data problem. So I think the hype of big data has been useful for getting the market aligned with the importance of data – certainly the value of it when it’s collected, blended, cleaned, and turned into actual insights. But in a way the big data problem has shifted to a question of “How do we get results quickly?” Fast is the new ‘big’ when it comes to data. Getting results quickly matters more than the perfect answer: if we can get good, useful insights out quickly, that’s better than the perfect model or the perfect result. We hear this from CDOs – they want to work in small steps, they want to fail fast.
They want to run experiments that show the value of applying data in the frame of a specific business problem or objective. So to them, big data may have created the role, but on a day-to-day basis it’s more about fast data or small data – blending lots of different types of data and then putting them into action quickly. That’s where I think the cloud, and cloud services like Analytics-as-a-Service, are as important to the CDO as big data. Such a service may deal with big data, but it almost doesn’t matter what the size of the data is; what matters is how quickly you can apply it to a specific business problem. The CDO can be the keeper of the whole spectrum of how you collect, secure, and manage your data, but ultimately they’ll be judged by how successful they are at turning that data into practical insights for the everyday worker. That’s where the ROI and the return on data management lie across the whole spectrum – but if people can’t put the insights to use in a practical, easy way, it almost doesn’t matter what you’ve done at the back end.

Read more about the Chief Data Officer’s role in a digital transformation blog by OpenText CEO Mark Barrenechea.


Unstructured Data Analysis – the Hidden Need in Health Care


The ‘Hood

I recently had the opportunity to attend the HIMSS 2016 conference in Las Vegas, one of the largest annual conferences in the field of health care technology. As I walked through the main level of the Exhibit Hall, I was amazed at the size of some vendor booths and displays. Some were the size of a house. With walls, couches, and fireplaces, they really seemed like you could move in! I was working at the OpenText booth on the Exhibit Hall level below the main one, which was actually a converted parking garage floor. Some exhibitors called this lower level “The ‘Hood”. I loved it, though; people were friendly, the displays were great, and there were fresh-baked cookies every afternoon. I don’t know how many of the nearly 42,000 conference attendees I talked to, but I had great conversations with all of them. It’s an awesome conference for meeting a diverse mix of health care professionals and learning more about the challenges they face.

The Trick

Half of the people I talked to asked me about solutions for analyzing unstructured data. When I asked what kind of unstructured data they were looking to analyze, 70% of them said claim forms and medical coding. This actually surprised me. As a software developer with a data analysis background, I admit to not being totally up on health care needs. Claim forms and medical coding have always seemed very structured to me: certain fields get filled in on the claim form, and rigid medical codes get assigned to particular diagnoses and treatments. Seems straightforward, no? What I learned from my discussions was that claims data requires a series of value judgments to improve data quality. I also learned that while medical coding is the transformation of health care diagnoses, procedures, medical services, and equipment into universal medical codes, that information is taken from transcriptions of physicians’ notes and lab results. This unstructured information is an area where data analysis can help immensely. The trick now is: how do we derive value from this kind of data?

The Reveal

OpenText has an unstructured data analysis methodology that accesses and harvests data from unstructured sources. Its InfoFusion and Analytics products deliver a powerful knockout combination. OpenText InfoFusion trawls documents and extracts entities such as providers, diagnoses, treatments, and topics, and can then apply further analyses, such as determining sentiment. The OpenText Analytics product line then provides advanced exploration and analysis, dashboards, and reports on the extracted data and sentiment, with secure access throughout the organization through deployment on OpenText Information Hub (iHub). Users get interactive analytic visualizations that allow them to gain unique insights from the unstructured data sources.

The Leave-Behind

If you’re interested in learning more about our solution for unstructured data analytics, you can see it in action in this application. While it is not a health care solution, it demonstrates the power of unstructured data analysis, letting users visually monitor, compare, and discover interesting facts. If you’re interested in helping me develop an example using claim forms or medical coding data, please contact me. I definitely want to demonstrate this powerful story next year at the HIMSS conference. See you next year in the ‘Hood!
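For the technically curious, here is a deliberately tiny sketch of what “structuring” a physician’s note can look like: scanning free text for known terms and mapping them to codes. The note text, terms, and code mappings are invented for illustration; a real pipeline would rely on clinical terminology services and the extraction capabilities described above, not a hard-coded dictionary.

```python
# Toy sketch of turning free-text physician notes into coded, queryable rows.
# The term-to-code mapping and note text are illustrative only; a real
# pipeline would use a clinical terminology service, not a dictionary.
import re

CODE_MAP = {
    "type 2 diabetes": "E11.9",   # illustrative ICD-10-style codes
    "hypertension": "I10",
    "chest pain": "R07.9",
}

notes = [
    "Pt presents with chest pain; history of hypertension, well controlled.",
    "Follow-up for type 2 diabetes. Hypertension stable on current meds.",
]

rows = []
for i, note in enumerate(notes):
    lowered = note.lower()
    for term, code in CODE_MAP.items():
        for _ in re.finditer(re.escape(term), lowered):
            rows.append({"note_id": i, "term": term, "code": code})

for row in rows:
    print(row)   # structured rows an analytics layer can aggregate and chart
```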
