Analytics

Telco Accessibility 101: What’s Now Covered by U.S. Legislation

telco accessibility

In a word, everything. Name a telecommunications product or service and chances are it has a legal requirement to comply with federal accessibility laws. Let’s see…

- Mobile connectivity services for smartphones, tablets, and computers? Check.
- Smartphones, tablets, and computers? Check.
- Internet services (e.g., cable, satellite)? Check.
- Television services (e.g., cable, satellite, broadcast)? Check.
- Televisions, radios, DVD/Blu-ray players, DVRs, and on-demand video devices? Check.
- Email, texting, and other text-based communication? Check.
- VoIP communications and online video conferencing? Check.
- Fixed-line phone services? Check.
- Fixed-line telephones, modems, answering machines, and fax machines? Check.
- Two tin cans attached by a string? Check.

All of these products and services are covered by U.S. accessibility legislation (except the cans and string).

What laws are we talking about here? Mainly Section 255 of the Telecommunications Act of 1996, for products and services that existed before 1996, and the Twenty-First Century Communications and Video Accessibility Act (CVAA) of 2010, which picked up where Section 255 left off, defining accessibility regulations for broadband-enabled advanced communications services.

Web accessibility legislation, while not telco-specific, is also relevant. The Americans with Disabilities Act (ADA) doesn’t explicitly define commercial websites as “places of public accommodation” (because the ADA predates the Internet), but the courts have increasingly interpreted the law this way. Therefore, as “places of public accommodation,” company websites—and all associated content—must be accessible to people with disabilities. For more insight on this, try searching on “Netflix ADA Title III” or reading this article. (By the way, a web-focused update of the ADA is in the offing.)
Last but not least, we come to Section 508 of the Rehabilitation Act, which spells out accessibility guidelines for businesses wanting to sell electronic and information technology (EIT) to the federal government. If your company doesn’t do that, then Section 508 doesn’t apply to you.

What this means for businesses

Not unreasonably, telecommunications companies must ensure that their products and services comply with accessibility regulations and are also usable by people with disabilities. This usability requirement means that telecom service providers must offer contracts, bills, and customer support communications in accessible formats. For product manufacturers, usability means providing customers with a full range of relevant learning resources in accessible formats: installation guides, user manuals, and product support communications.

To comply with the legislation, telecommunications companies must find and implement cost-effective technology solutions that will allow them to deliver accessible customer-facing content. Organizations that fail to meet federal accessibility standards could leave themselves open to consumer complaints, lawsuits, and, possibly, stiff FCC fines.

Meeting the document challenge with accessible PDF

Telecommunications companies looking for ways to comply with federal regulations should consider a solution that can transform their existing document output of contracts, bills, manuals, and customer support communications into accessible PDF format. Why PDF? PDF is already the de facto electronic document standard for high-volume customer communications such as service contracts and monthly bills because it’s portable and provides an unchanging snapshot, a necessity for any kind of recordkeeping.

But what about HTML? Why not use that? While HTML is ideal for delivering dynamic web and mobile content such as on-demand, customizable summaries of customer account data, it doesn’t produce discrete, time-locked documents. Nor does it offer the archivability or portability of PDF: an HTML page is not an “official” document that can be stored and distributed as a fixed entity.

Document content is low-hanging fruit

Document inaccessibility is not a problem that organizations need to live with, because it can be solved immediately — and economically — with OpenText’s Automated Output Accessibility Solution, the only enterprise PDF accessibility solution on the market for high-volume, template-driven documents. This unique software solution enables telecommunications companies to quickly transform service contracts, monthly bills, product guides, and other electronic documents into WCAG 2.0 Level AA-compliant accessible PDFs. Whatever the data source, our performance numbers are measured in milliseconds, so customers receive their content right when they ask for it.

OpenText has successfully deployed this solution at government agencies as well as large commercial organizations, giving us the experience and expertise required to deliver accessible documents within a short time frame, with minimal disruption of day-to-day business. Fast, reliable, compliant, and affordable, our automated solution can help you serve customers and meet your compliance obligations. Learn more about the OpenText™ Automated Output Accessibility solution.

Read More

Banking Technology Trends: Overcoming Process Hurdles

Financial analytics

Editor’s Note: This is the second half of a wide-ranging interview with OpenText Senior Industry Strategist Gerry Gibney on the state of the global financial services industry and its technical needs. The interview has been edited for length and continuity.

Unifying Information for Financial Reporting

I heard a lot of discussion at the SIBOS 2016 conference in Geneva around financial reporting. Banks face procedural hurdles, especially if they’re doing merchant or commercial banking. A lot of them still have manual processes. In terms of the procedures, the bigger the bank, the bigger the problem. That’s because the information is often in many places. For example, different groups from the bank may approach their corporate banking customers to buy or use a service or product – which is great, but they have to track and report on it. Often in the beginning, these separate group reporting processes are manual. Eventually, they’ll want to automate the reporting and join the information to other data sources, but that’s the big challenge – it takes time to assemble and coordinate all the information streams and get them to work as an internal dashboard. A similar challenge is creating portals to track financial liquidity.

Another example is where clients ask for specific reports. The bank doesn’t want to say no, so it has to produce the reports manually, often as a rush job, and in a format that the client finds useful. The challenge is to take large amounts of data and summarize them so you can give people what they ask for with the periodicity, the look, and the format that they want.

Embedded Visualizations for our Customers’ Customers

That’s where we come in. A lot of the value we offer with OpenText Analytics is embedding our analytic and visualization applications in a client’s own application so that they can offer dashboards, windows, reporting, and so forth to their own internal or external customers.

The beauty of our embeddable business intelligence or analytics piece is that no one on the business side has to see it or work with it. It offers functionality that can be applied as needed, without IT adjustments on your part and without requiring people to enter data into bulky third-party programs. Tremendous capabilities are suddenly just there. Users can build a data map that automatically gathers and manages data, then organizes and reports it – in any format required, whether visual via charts and graphs, or numeric, if you prefer. Plus, it has powerful drill-down ability.

Flexibility to Cope with Regulatory Shifts

The other aspect of reporting is reporting to regulatory agencies. After the Great Recession and the banking crisis, governments worldwide have been stepping up their efforts to regulate the financial industry. Not just nations – local governments, too. In fact, the fastest-growing department in every bank now is regulatory compliance. There are ever-increasing workloads and more workflow, but without more people to deal with it.

The problem for banks is that the U.S. government presents a moving target. Dodd-Frank controls and the Volcker Rule required banks to end proprietary trading. There is potentially a new level of risk from government changes in requirements and the need for banks to produce new reports, sometimes even on things they weren’t aware they needed to report on. Banks and other financial institutions need a reporting solution that enables quick and easy production of whatever information government regulators are asking for. An ideal reporting solution will maximize flexibility in how you can look across both unstructured data and all the structured data in multiple silos. This is a good use case for ad hoc analytics and reporting – the power to create new types of reports, whatever regulators may require.

Financial Analytics: Understanding Your Customer

Another analytics-related topic I heard at SIBOS was the need to understand customers better and how to identify a good target customer. This is top-of-mind for banks. I’m amazed that people gather streams of data in their CRM systems and then don’t use it. Often their CRM systems are stand-alone, not connected to anything. They might contain information that’s extremely valuable and could enhance their efforts – for example, sales efforts, proposals, and pitch books. They could tie these things together, and then analyze their findings to correlate sales resources to results.

With a unified analytics flow, you can drive business by managing client relationships, figuring out through advanced analytics who is the best candidate for up-selling or cross-selling, as well as identifying new customers. Finding new insights by searching all these CRM systems is a tremendous value that analytics, especially embeddable analytics from OpenText, can deliver. Analytics can bring a tremendous amount of value to business operations and make them more efficient, productive, and profitable. You can’t ask more than that.

To learn more about how OpenText Analytics can help the financial services industry unlock the business value of existing data, consider our webinar, “Extracting Value from Your Data with Embedded Analytics,” Wednesday, Dec. 14, at 11 a.m. Pacific Time/2 p.m. Eastern. Click here for more information and to register.

Read More

Banking Trends from SIBOS: Technology Solutions to Tame Rampaging Workflows

banking trends from SIBOS

Editor’s Note: Gerry Gibney, Senior Industry Strategist at OpenText and resident expert on the financial services industry, was recently interviewed on the banking trends and technical needs he discovered at SIBOS (the annual trade show hosted by SWIFT, provider of global secure financial messaging services).

I always come back from SIBOS having learned new things; it’s one of the largest banking events in the world, and this year one of the big topics was domestic payments. Many people aren’t aware that for large banks, corporate internet banking payments represent around 24% of their revenue. They benefit from payment money while it is in their hands, and they can charge fees for the payment services. It’s a big market because payments have to be made, whether regular payments such as rent and utilities on buildings or one-time money transfers. And they add up. For bigger banks, we’re talking several hundred million dollars each. Of course, they would prefer to keep that balance in their bank or extract it over time.

I see a big role for OpenText here. Our BPM solution can be deployed to help with business networks, so banks can manage the workflow, the processes, and the controls. Managing the controls is important because with the SWIFT processes (payments and messaging), issues include: Who is authorized to send the money? Who else can do it? Who else can approve it? What if that person leaves? How do we add them into the system or remove them?

Automating Banking Workflow

Our own experience at OpenText is typical. Every year, our company goes through the payment permissions updating process. What do we need to know? What do we need to get? How do we get it? Where do we apply it? How many accounts are responsive? Doing business in, say, Hong Kong, Shanghai, or Japan, we may have 10 or 20 people with different signatory levels, each needing to sign an eight-page statement. Eight pages times 10 people, every year, for every account – that’s 80 pages per account every year, and that’s typical of many companies. A company might well have several hundred accounts with just one bank, and this has to be managed every year, with ever-changing rules – like regulators now requiring the CFO’s home address, for example.

Another workflow example is client onboarding, which has to be done every time. Even if the customer has 200 accounts and wants to add number 201, you still have to go through the onboarding process. So all the information is out there in different places – who knows how well protected it all is? OpenText’s security capabilities – our ability to add workflow, control workflow, minimize it, and automate it – add a lot of value.

OpenText is also a SWIFT service bureau. We help with payments reporting, via EDI and our Business Network, to enhance what banks do. We help banking in many areas, across all our solutions – for example, with analytics, on the content side for unstructured data, or helping with records management, which is strong on compliance. With embeddable analytics we can gather all sorts of information, whether it’s for bank employees internally or their clients and customers. This information can be transformed into reports and subjected to sophisticated analysis, helping companies find new ways to generate revenue from it. It can also help them track things more efficiently, comply with government regulations more easily, and improve the bottom line without increasing operating costs. In summary, it can be a tremendously powerful component of a bank’s overall offering.

The second half of this interview will be published next week.

Read More

Data Quality is the Key to Business Success

data quality

In the age of transformation, all successful companies collect data, but one of the most expensive and difficult problems to solve is the quality of that information. Data analysis is useless if we don’t have reliable information, because the answers we derive from it could deviate greatly from reality. Consequently, we could make bad decisions. Most organizations believe the data they work with is reasonably good, but they recognize that poor-quality data poses a substantial risk to their bottom line (The State of Enterprise Data Quality 2016 – 451 Research). Meanwhile, the idiosyncrasies of Big Data are only making the data quality problem more acute. Information is being generated at increasingly faster rates, while larger data volumes are innately harder to manage.

Data quality challenges

There are four main drivers of dirty data:

- Lack of knowledge. You may not know what certain data mean. For example, does the entry “2017” refer to a year, a price ($2,017.00), the number of widgets sold (2,017), or an arbitrary employee ID number? This can happen because the structure is too complex, especially in large transactional database systems, or because the data source is unclear (particularly if that source is external).
- Variety of data. This is a problem when you’re trying to integrate incompatible types of information. The incompatibility can be as simple as one data source reporting weights in pounds and another in kilograms, or as complex as different database formats.
- Data transfers. Employee typing errors can be reduced through proofreading and better training. But a business model that relies on external customers or partners to enter their own data runs a greater risk of “dirty” data, because it can’t control the quality of their inputs.
- System errors caused by server outages, malfunctions, duplicates, and so forth.

Dealing with dirty data

Correcting a data quality problem is not easy.
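Detection of these patterns can be partly automated. As a minimal sketch (in Python, with hypothetical field names; real pipelines would be driven by a schema, not hard-coded rules), a single pass over incoming records can flag duplicates, missing identifiers, and out-of-range values such as pounds mistakenly stored in a kilograms field:

```python
def find_dirty_records(records, weight_range=(0.0, 1000.0)):
    """Flag common dirty-data patterns in a list of record dicts.

    Returns a list of (record_index, issue) tuples covering duplicates,
    missing identifiers, and implausible values (e.g. a unit mix-up).
    """
    issues = []
    seen = set()
    for i, rec in enumerate(records):
        # Duplicate detection: same customer and date seen before.
        key = (rec.get("customer_id"), rec.get("date"))
        if key in seen:
            issues.append((i, "duplicate"))
        else:
            seen.add(key)
        # Missing identifier: record can't be joined to other sources.
        if rec.get("customer_id") is None:
            issues.append((i, "missing customer_id"))
        # Range check: catches pounds entered into a kilograms field.
        weight = rec.get("weight_kg")
        if weight is not None and not (weight_range[0] <= weight <= weight_range[1]):
            issues.append((i, "weight out of range"))
    return issues
```

Each flagged record can then be routed to a cleaning or review queue rather than silently entering the analysis.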
For one thing, correcting data quality is complicated and expensive, and the benefits aren’t apparent in the short term, so it can be hard to justify to management. And as I mentioned above, the data gathering and interpretation process has many vulnerable places where error can creep in. Furthermore, both the business processes from which you’re gathering data and the technology you’re using are liable to change at short notice, so quality correction processes need to be flexible. Therefore, an organization that wants reliable data quality needs to build in multiple quality checkpoints: during collection, delivery, storage, integration, recovery, and analysis or data mining.

The trick is having a plan

Monitoring so many potential checkpoints, each requiring a different approach, calls for a thorough quality assurance plan. A classic starting point is analyzing data quality when it first enters the system – often via manual input, or where the organization may not have standardized data input systems. The risk here is that data entry can be erroneous, duplicated, or overly abbreviated (e.g., “NY” instead of “New York City”). In these cases, data quality experts’ guidance falls into two categories.

First, you can act preventively on the process architecture, e.g., building integrity checkpoints, enforcing existing checkpoints better, limiting the range of data to be entered (for example, replacing free-form entries with drop-down menus), rewarding successful data entry, and eliminating hardware or software limitations (for example, if a CRM system can’t pull data straight from a sales revenue database).

The other option is to strike retrospectively, focusing on data cleaning or diagnostic tasks (error detection). Experts recommend these steps:

- Analyzing the accuracy of the data, either by making a full inventory of the current situation (trustworthy but potentially expensive) or by examining work and audit samples (less expensive, but not 100% reliable).
- Measuring the consistency and correspondence between data elements; problems here can affect the overall truth of your business information.
- Quantifying systems errors in analysis that could damage data quality.
- Measuring the success of completed processes, from data collection through transformation to consumption. One example might be how many “invalid” or “incomplete” alerts remain at the end of a pass through the data.

Your secret weapon: “data provocateurs”

None of this will help if you don’t have the whole organization involved in improving data quality. Thomas C. Redman, an authority in the field, presents a model for this in a Harvard Business Review article, “Data Quality Should Be Everyone’s Job.” Redman says it’s necessary to involve what he calls “data provocateurs” – people in different areas of the business (from top executives to new employees) who will challenge data quality and think outside the box for ways to improve it. Some companies are even proposing awards for employees who detect process flaws where poor data quality can sneak in. This not only cuts down on errors, it has the added benefit of promoting the idea throughout the whole company that clean, accurate data is important.

Summing up

Organizations are rightly concerned about data quality and its impact on their bottom line. The ones that take measures to improve their data quality are seeing higher profits and more efficient operations because their decisions are based on reliable data. They also see lower costs from fixing errors and spend less time gathering and processing their data. The journey towards better data quality requires involving all levels of the company. It also requires assuming costs whose benefits may not be visible in the short term but which will eventually end up boosting these companies’ profits and competitiveness.

Read More

Attention All Airlines: Is Your Inaccessible Document Technology Turning Away Customers?

accessible PDF

Imagine you’re an airline executive, and a small but significant percentage of your customers—let’s say 10% or less—download flight itineraries and boarding passes from your website only to find that the information in these documents is jumbled and, in some cases, missing altogether. What would you do? Would you be concerned enough to take action? Would it matter if these customers didn’t know their flight number, boarding gate, and seat assignment? After all, 90% or more of your customers would still be receiving this information as usual.

Before venturing an answer to these hypothetical questions, let’s pause for a quick look at your industry. Over the last 60 years, airline profit margins have averaged less than 1%, though the situation has been improving in recent years. The International Air Transport Association (IATA) reported a net profit margin of 1.8% in 2014 and 4.9% in 2015; industry profits are expected to reach 5.6% in 2016. With such narrow margins, it’s clear that airlines need every customer they can get, and the industry has little tolerance for inefficiencies.

Now back to your document problem… Even if fewer than 10% of customers were affected, it seems likely that you’d take steps to fix the problem and pull out all the stops to get it done as fast as possible, before the company loses many customers. Of course, the underlying assumption here is that a proven, economically feasible IT solution is available.

This might be happening at your airline—for real

All hypotheticals aside, a scenario like this could actually be playing out at your company right now. Consider: According to the 2014 National Health Interview Survey, 22.5 million adult Americans—nearly 10% of the adult population—reported being blind or having some sort of visual impairment. To access online flight booking tools, along with electronic documents such as itineraries and boarding passes, many of these people need to use screen reader programs that convert text into audio.
If, however, the documents aren’t saved in a format like accessible PDF (with a heading structure, defined reading order, etc.), they’re likely to come out garbled or incomplete in a screen reader. Of course, visually impaired customers could book their flights by phone and opt to receive Braille or large-print documents in the mail (expensive for your airline). Then again, theoretically, all of your other customers could book by phone, too. The point is, you don’t really want customers booking by phone, because your self-serve website is less costly to operate than customer call centers; electronic documents are cheaper than paper and postage, and much cheaper than Braille and large print.

So, wouldn’t it be nice if there were an affordable technology solution that you could plug in to serve up the documents that all of your customers—that’s the 90% plus the 10%—need to fly with your airline? Of course, it would be even better if the solution met the requirements of the new Department of Transportation (DOT) rules implementing the Air Carrier Access Act (ACAA), which have a compliance deadline of December 12, 2016. Customer satisfaction and regulatory compliance? Now that would be good.

OpenText Automated Output Accessibility Solution

OpenText has the only enterprise PDF accessibility solution for high-volume, template-driven documents. This unique software solution can dynamically transform electronic documents such as e-ticket itineraries/receipts and boarding passes into accessible PDFs that comply with the DOT’s new ACAA rules. Designed to be inserted between the back office (e.g., a passenger reservation system) and a customer-facing web portal, the OpenText™ Automated Output Accessibility Solution has minimal impact on existing IT infrastructure. Even better, the solution generates WCAG 2.0 Level AA-compliant output that has been tested and validated by prominent organizations and advocacy groups for visually impaired people.
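At the file level, “accessible PDF” means a tagged PDF: the document catalog carries entries that screen readers rely on, defined in the PDF specification (/MarkInfo with /Marked for the tagged-PDF flag, /StructTreeRoot for the heading/reading-order structure tree, /Lang for a declared document language). A minimal, illustrative check in Python (the function and the plain-mapping interface are our own sketch, not part of any particular PDF library):

```python
def accessibility_flags(catalog):
    """Report PDF document-catalog entries that screen readers depend on.

    `catalog` is the PDF document catalog as a mapping, e.g. as exposed
    by a PDF parsing library. Key names follow the PDF specification.
    """
    mark_info = catalog.get("/MarkInfo") or {}
    return {
        # The file declares itself a tagged PDF.
        "tagged": bool(mark_info.get("/Marked", False)),
        # A structure tree (headings, reading order) is present.
        "has_structure_tree": "/StructTreeRoot" in catalog,
        # A default document language is declared for the screen reader.
        "declares_language": "/Lang" in catalog,
    }
```

A document that fails any of these checks is a strong candidate for the garbled screen-reader output described above.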
OpenText has successfully deployed this solution at government agencies as well as large commercial organizations, giving us the experience and expertise required to deliver accessible documents within a short time frame, with minimal disruption of day-to-day business. As the de facto electronic document standard for high-volume customer communications, PDF offers both portability and an unchanging snapshot of information – necessities for a document of record. Contact us to discuss how we can help you deliver accessible, ACAA-compliant PDF documents to your customers. Remember, the DOT’s deadline is December 12, 2016.

Read More

Fintech at SIBOS: From Everyday Banking to Science Fiction

FinTech

In late September, we were at SIBOS 2016, the annual trade show hosted by SWIFT. It’s a major showcase for innovative financial technology (“Fintech”), featuring the latest news and analysis about banking and electronic payments. Over 8,500 people attended this year, with Eurasian, Middle Eastern, African, and Chinese banks showing a stronger presence than in the past. The show was in Geneva, and we were struck by the contrast between the timeless beauty of the Alpine setting (snow-covered Mont Blanc looming over the skyline, Lac Leman at the foot of our hotel, and our daily commute passing vineyards) and the dynamic, even futuristic, requirements of the 21st-century financial industry discussed at the conference.

SIBOS has been morphing as SWIFT seeks out new directions. It offers a showcase for Fintech through its Innotribe program; it continues to develop electronic communications standards for a wide range of uses; and this year it focused more on cyber-security, to defend the SWIFT electronic transaction platform (after hackers exploited a SWIFT weakness to steal $81 million from the central bank of Bangladesh’s account at the Federal Reserve). We also heard a lot of buzz about new Fintech developments such as cybersecurity and artificial intelligence (e.g., algorithms that could manage simple investment portfolios as well as most human advisors can).

Blockchain: From cyberpunk to competitive advantage

Of all these new technologies, blockchain was a key discussion topic at SIBOS. The conference was full of questions about it: How does it work? What does it mean for us? How will payments be affected? Who is working with it? And what are the real security issues? A blockchain is a type of distributed, online database that keeps a permanent, tamper-proof record of financial transactions.
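The tamper-evidence part comes from hash chaining: each block’s hash commits to the previous block’s hash, so altering any historical transaction invalidates every later link. A toy sketch in Python (purely illustrative; not how SWIFT or any production ledger is implemented, and omitting the distributed replication and consensus that real systems add):

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents, which include the previous block's hash."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, transactions):
    """Append a block whose hash commits to the entire prior history."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "prev_hash": prev, "transactions": transactions}
    block["hash"] = block_hash(block)  # hash computed over index, prev_hash, transactions
    chain.append(block)

def is_valid(chain):
    """Any tampering with an earlier block breaks a hash or a link."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != block_hash(body):
            return False  # block contents no longer match its stored hash
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False  # chain link to the previous block is broken
    return True
```

In a real deployment the same record is also replicated across many independent nodes, which is what makes the ledger distributed rather than just tamper-evident.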
Blockchain was created to settle trades in Bitcoin, the virtual currency, and has since become popular for deals in “real” currencies because all parties can track the transaction securely, with no need for third-party verification. SWIFT is interested in blockchain technology even though – or maybe because – it could pose strong competition to SWIFT’s own secure payments service, which can take days or weeks to settle a complex transaction. Fintech competitors using blockchain are forecasting that they will be able to cut transaction time down to near zero.

“Let’s get it done”

However, what interested us most about SIBOS 2016 is that despite all that buzz, there was still a core of business as usual – practical, “let’s get it done” business. The banks still face the same issues: How do we do this payments business faster and cheaper, and take better advantage of the relationships it creates? For example, client onboarding continues to be a challenge for many banks, and we provide a lot of value in this area. We had many discussions about how OpenText helps banks improve their overall compliance, get to revenue sooner, and achieve higher customer satisfaction rates. We also had conversations about the wealth of data that banks hold and how they can enable better use of it, for both themselves and their customers. While banks may be experimenting with new technologies driven by the Fintech boom, the practical business in the next cycle will be establishing value from the networks and relationships already in place.

OpenText’s role in the new Fintech world

Naturally, we feel OpenText has a role to play here.
To start with, OpenText™ Analytics solutions help financial companies extract more value from their information by liberating it from the many separate silos it’s often housed in and integrating these various streams of information into a complete, accurate picture of investment performance, customer satisfaction, user experience, or response to various marketing incentives. Our message attracted a lot of interest at SIBOS, where we had great meetings with clients and prospects. One of the highlights was our Oktoberfest party, co-sponsored with SAP; it was a great success, with more than 260 people attending.

Next year, SIBOS will take place in Toronto, Canada’s financial capital – not far from our corporate headquarters in Waterloo. Who knows what strides the Fintech world will have taken by then?

Read More

How the Same Data Analytics Used in Politics Can Help Your Business

election-digital-strategy

Every four years, about 230 million people turn out to vote in the U.S. presidential election. That’s a huge audience to influence, no matter how sophisticated your analytics. I mean, which data pool on which channel do you start with? Where do you look first? Just like businesses trying to reach customers, political campaigns can analyze sentiment, social media activity, and feedback from a variety of channels to spot trends, make predictions, and craft persuasive messages. A new OpenTextVoice article on Forbes.com, 3 Things The Presidential Race Can Teach Business About Digital Strategy, explores how the same micro-targeting used in political data analytics can be applied in business. As November draws near, the data crunching will undoubtedly grow more feverish. To get a sense of how often the media is covering each candidate, and in what context, check out the free online Election Tracker ’16 from OpenText. To read the full article on Forbes.com, go here.

Read More

Power Up Your iHub Projects with Free Interactive Viewer Extensions

Interactive Viewer

We’ve updated and are republishing a series of helpful tips for getting the most out of the Interactive Viewer tool for OpenText™ iHub, with free extensions created by our software engineers. These extensions boost the appearance and functionality of Interactive Viewer, the go-to product for putting personalized iHub content in the hands of all users. Below are links to the full series of six blog posts. If you have any suggestions for further extensions or other resources, please let us know through the comments below.

1. Extend Interactive Viewer with Row Highlighting – A simple jQuery script for highlighting the row the user’s pointer is on.
2. Extend Interactive Viewer with a Pop-Up Dialog Box – Add a fully configurable pop-up dialog box to display details that don’t need to appear in every row.
3. Extend iHub Reports and Dashboards with Font Symbols – Dress up your iHub reports with nifty symbols.
4. Extend iHub Dashboards with Disqus Discussion Boards – Encourage conversation the easy way, by embedding a discussion board in the popular Disqus format.
5. Extend iHub Interactive Viewer with Fast Filters – Make column-based filtering easy by using JSAPI to build a Fast Filter, a selectable drop-down menu of distinct values that appears in the header of a column.
6. Extend Interactive Viewer with Table-Wide Search – Filter across multiple columns in an iHub table by creating a table-wide search box.

Read More

Your Medical Biography is a Digitized Trail of Big Data

healthcare

A far cry from the walls of paper files in medical offices, today’s digitized records let patients see lab results, get refill alerts, and generally have more perspective and control over their medical destiny. These new systems are even using natural language processing algorithms to grasp the nuances of language, just like humans do. A new OpenTextVoice article on Forbes.com, Big Data Takes Turn As Storyteller, Pulls Together Health Narratives, shows how the digitization of health records is changing the way our medical histories are recorded. Now we can save and analyze not only structured data like dates and procedure codes, but also unstructured information like doctors’ notes. All of these advancements are changing the medical paradigm to create a partnership between doctor and patient. They’re even supporting more sophisticated analysis, so that a treatment plan can be based on the full picture rather than just a snapshot. Get the full story here.

Read More

The Real U.S. Election Battlefield is Online

datadrivenrace

The playing field has narrowed, the debates have begun, and we are slowly reaching the end of the U.S. Presidential race. And what a race it is, not just on the campaign trail but online as well. Twitter, LinkedIn, Facebook and more are the hubs of political commentary. Flooded with voters debating the issues, the politicians, and the latest exploits of Donald Trump and Hillary Clinton, the people are speaking. A new OpenTextVoice article on Forbes.com explores how that data is changing the race. While President Obama made a big political play online, the practice has only grown more sophisticated in the past few years. With tools and advanced analysis, campaigns can use the posts to “see how the tweets, mentions and articles praise their candidate or cast them in a negative light.” One thing is for sure: whether you’re a Republican, Democrat, independent, undecided, or Donald J. Trump himself, the real race is online. Check out the full article on Forbes.

Read More

Hidden in the Information: Success with Advanced Analytics

advanced-analytics

In today’s world, information is all around us. And hidden in that information is the key to both understanding your customers and predicting their behavior. Pretty nice, huh? Well, here comes the tricky part: with so much information, how do you find the useful insights? A new OpenTextVoice article on Forbes.com explores how to uncover that information and apply it to your business. How can companies use data to advance their business processes or turn an industry on its head? As the article points out, “Data is the foundation that allows transformative, digital change to happen.” Analytics is the key to unlocking the potential of data and turning it into something greater. Check out the full article on Forbes to find out how to use advanced analytics to take charge of an industry and launch the next Netflix, Airbnb or Uber.

Read More

With Cognitive Computing, Johnny Five is Alive… or at Least He Will Be!

trusted-advisor

Have you ever wanted your own robot advisor? Well, according to a new OpenTextVoice article on Forbes.com, your very own Lieutenant Commander Data, KITT or C-3PO may be closer than you think. The article explains how unstructured data is filling the void in traditional computer systems to create cognitive computing systems that can mimic human thinking. Cognitive computing combines structured and unstructured data to enable organizations to make better decisions with intelligent systems that go beyond numbers and rows. These systems can “make predictions and recommendations that offer profound, actionable insights into a host of common business challenges.” Add natural language processing to the mix and you get a system that can think, “feel” and interact like a human. Check out the full article on Forbes to find out how cognitive computing is bringing the automated trusted advisor to life.

Read More

Understanding Customer Feelings: Unstructured Data Analytics Tells All

customerfeelings

Complex database queries and focus groups can help a business track customer insight, but there’s another approach. In a new OpenTextVoice article on Forbes.com, Meet The Algorithm That Knows How You Feel, you’ll learn how today’s businesses can analyze unstructured data using text analytics algorithms to actually grasp how customers are feeling. Unstructured data analytics lets businesses sift through documents, text messages, email and social media posts to get to where the important insight lives. And now, using OpenText InfoFusion, businesses can manage, analyze, and understand information in ways that are transforming customer experience. As huge volumes of unstructured data come into the enterprise from multiple channels, the only way marketers can gain a competitive advantage is to see trends and preferences before the competition does. This insight not only fuels important business decisions but also changes the way a company relates to its customers. The article offers some history of the technology and also shares real-world applications. Get the full story here.

Read More

Turn Your Big Data into Big Value – Attend our Workshops

big data workshop

The ever-growing digitization of business processes and the rise of Big Data mean workplaces are drowning in information. Data analytics and reporting can help you find useful patterns and trends, or predict performance. The problem is that many analytics platforms require expert help to sort through billions of lines of data. Even full-fledged data scientists spend 50 to 80 percent of their time preparing data, not getting insights from it. Moreover, there’s a built-in time lag if you need to ask your in-house data scientist to run the numbers when you, a line manager or subject expert, need answers right away. That’s why 95% of organizations want end users to be able to manage and prepare their own data, according to respected market researcher Howard Dresner. Luckily, there’s help. The OpenText™ Analytics Suite combines powerful, richly featured data analysis with self-service convenience. You can upload your own data by the terabyte, then access, blend, and explore it quickly, without coding. The analysis results can then be shared and socialized via the highly scalable, embedded BI and data visualization platform, which lets you design, deploy, and manage interactive reports and dashboards that can be embedded into any app or device. These reports can address a wide range of business questions, from “What are my customers’ spending habits using the credit card from XYZ Bank?” to “Which customers are most likely to respond to our new offer?” Sure, you may be thinking, this sounds great, but I want to see the solution in action before I decide. Here’s your chance. On Tuesday, Sept. 13, in San Diego, OpenText is offering a free, hands-on, full-day workshop on the OpenText Analytics Suite. This 6-hour session (including breakfast and lunch) provides a guided tour of the various modules within the suite and shows you how to build dynamic, interactive, visually compelling information applications.
By the end of the workshop, you’ll understand how to:

- Build interactive, visually rich applications from the ground up
- Create data visualizations and reports using multiple data sources
- Embed these visualizations and reports seamlessly into existing apps
- Target profitable customers and markets using predictive analytics

Our visionary speakers offer a day that’s not only informative but entertaining and engaging. Check here for the full schedule of workshops and dates available.
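Blending multiple data sources, as the workshop covers, comes down to joining records on a shared key. Here is a minimal plain-Python sketch of that idea; the source names and values are hypothetical, not the Analytics Suite’s actual API:

```python
# Hypothetical records from two data sources, keyed by customer ID
crm = {"c1": "Acme", "c2": "Globex"}
billing = {"c1": 120.0, "c2": 80.0, "c3": 42.0}

# Inner join: keep only customers present in both sources
blended = {cid: (crm[cid], billing[cid]) for cid in crm.keys() & billing.keys()}
print(blended)  # {'c1': ('Acme', 120.0), 'c2': ('Globex', 80.0)} (order may vary)
```

The same join logic underlies any report that crosses two datasets, whether it runs in a spreadsheet, SQL, or a BI platform.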

Read More

Unlock the Value of Your Supply Chain Through Embedded Analytics

supply chain analytics

Over the past few months I have posted a couple of blogs relating to the use of analytics in the supply chain. The first one discussed the ‘why’, in terms of the reasons for applying analytics to supply chain operations: Understanding the Basics of Supply Chain Analytics. The second discussed the ‘how’, in terms of the methods of obtaining meaningful insights from B2B transactions flowing between trading partners: Achieve Deeper Supply Chain Intelligence with Trading Grid Analytics. The blogs were written in support of our recently announced OpenText™ Trading Grid Analytics, one of the many Business Network related offerings in Release 16. Release 16 is the most comprehensive set of products released by OpenText to enable companies to build out their digital platforms and enable a better way to work. Those who have followed my blogs over the years will know that I have worked with many analyst firms to produce white papers and studies, so it was only appropriate that I should be fortunate enough to work with an outside analyst on a thought leadership white paper relating to analytics in the supply chain. I engaged IDC to write a paper entitled Unlock the Value of Your Supply Chain Through Embedded Analytics. IDC has been producing some interesting content over the years in support of its ‘Third Platform’ model, which embraces IoT, cloud, mobile and big data, and how companies can leverage these technologies for increased competitive advantage. The aim of our new analytics-related white paper was to discuss the business benefits of embedding analytics into the transaction flows across a business network. Compared to other business intelligence and end-user analytics vendors, OpenText is in a unique position: we own our Business Network and are able to analyze the 16 billion EDI transactions flowing across it.
IDC leveraged a relatively new management theory called VUCA, which stands for Volatility, Uncertainty, Complexity and Ambiguity, to discuss how analytics can bring better insights into business operations. VUCA was originally defined in the military field, and for our paper IDC realigned it to describe a more connected, information-centric and synchronized business network: Velocity, Unity, Coordination and Analysis. I am not going to highlight too much content from the paper, but here is one interesting quote: “It is the view of IDC that the best supply chains will be those that have the ability to quickly analyze large amounts of disparate data and disseminate business insights to decision makers in real time or close to real time. Businesses that consistently fail to do this will find themselves at an increasing competitive disadvantage and locked into a reactionary cycle of firefighting. Analytics really will be the backbone of the future of the supply chain.” Now, I am not going to spoil the party by revealing any more from the paper! If you would like to learn more, please register for our webinar; details are provided below. OpenText will be hosting a joint webinar with IDC on 27th July 2016 at 11 am EDT, 5 pm CET. This 40-minute webinar will allow you to:

- Understand how embedded analytics can provide deeper supply chain intelligence
- Learn how the VUCA management theory can be applied to a supply chain focused analytics environment, and the expected business benefits that can be obtained
- Find out why it is important to have trading partners connected to a single business network environment to maximize the benefits of applying analytics to supply chain operations
- Learn how OpenText can provide a cloud-based analytics environment to support your supply chain operations

You can register for the webinar here.

Read More

Top 10 Trends To Watch in Financial Analytics (Infographic)

financial analytics

General Motors was a company facing challenges when it hired Daniel Akerson as CEO in 2010. His background was tied to financial analytics, so it’s no surprise that GM then acquired AmeriCredit in an all-cash transaction valued at approximately $3.5 billion. Akerson commented that he started “creating world-class systems in our IT and financial systems, to quickly capitalize on opportunities and protect product programs from exogenous risk.” Fortune magazine announced recently that General Motors was working on the future of driving, which it believes will be “connected, seamless and autonomous.” Chief Financial Officers are not renowned for underestimating the power of innovation and statistics, better known today as “analytics.” Accenture has defined the CFO as the “Architect of Business Value,” yet if we go back to 2014, we may recall that many companies had deep silos of inconsistently defined, structured data that made it difficult to extract information from different parts of the business. We all know that, so what’s new for financial analytics?

1) Big data is getting bigger with the Internet of Things → Tweet this. Having accurate data is still critical. “In God we trust; all others must bring data,” said engineer and statistician W. Edwards Deming. We all agree that not all data is useful, but the wider the data source range, the more accurate the 360º view; so the challenge in a world with 6.4 billion connected devices is even bigger, and finance organizations are feeling it.

2) Digital is killing the finance organization → Tweet this. Big data coming from new structured and unstructured data sources is increasing risk but also offering new opportunities. Growing opportunities to store and analyze data from disparate sources are driving many leaders to alter their decision-making processes.
The number of mobile banking users in the US is now 111 million, up from just 35 million in 2010, and today banks adopt e-commerce tactics, tracking customers through mobile to reduce churn and increase sales opportunities.

3) Data gravity will pull financial analytics to the cloud → Tweet this. Cloud data warehousing is becoming widely adopted, so the question for Information Technology now is “When?” When your data is already in the cloud, you had better have analytics right there. Having your data within the OpenText™ Cloud has the benefit of being with the leader in Enterprise Information Management and being able to load data via the front end or back end into a columnar database.

4) Compliance audit is still a challenge → Tweet this. Self-service data preparation is becoming key to better data audit. Analysts are saying that the next big market disruption is self-service data preparation, and that by 2019 it will represent 9.7% of the global revenue opportunity of the business intelligence market. OpenText™ Big Data Analytics, for example, provides financial analysts with easy-to-use tools for self-service data preparation, such as:

- Data preparation – normalization, linear scaling, logistic scaling or softmax scaling
- Data enrichment – aggregates, decodes, expressions, numeric ranges, quantile ranges, parametric or ranking
- Data audit – count, kurtosis, maximum, mean, median, minimum, mode, skewness, sum, or standard deviation

5) Organizations should focus analytics on decisions, not reporting → Tweet this. Analysts introduced the concept of the algorithmic business to describe the next stage of digital business, and forecast that by 2018 over half of large organizations globally will compete using advanced analytics and proprietary algorithms, causing the disruption of entire industries.

6) Analytics is not only about reporting any more → Tweet this. Analytics is maturing and priorities are naturally focused on the business.
Algorithms will not only provide insights and support decision-making, but will also be able to make decisions for you. 49% of companies are using predictive analytics, and a further 37% are planning to use it within three years. Predictive analytics with big data has been the cornerstone of a successful financial services organization for some time now. What’s new is how easily any business user can access predictive insights with tools like OpenText™ Big Data Analytics. According to Francisco Margarite, CIO at Inversis Bank, “extracting and analyzing information used to take a considerable amount of time of our technical experts, and as a consequence, we were not providing the appropriate service. Now, we have provided more user autonomy and reduced the work overload in IT.” Here are some examples of how analytics can help:

- Venn Diagram: Identify which customers who purchased “Insurance A” and “Insurance C” did not purchase “Insurance B”, so you can get a profile and a shared list.
- Profile: Identify variables that describe customers by what they are and what they are not, so your company can personalize campaigns.
- Decision Tree: Identify which real estate offering is the most appropriate for each prospect according to the profile and features of the product – price, size, etc. Human Resources will probably find it useful for analyzing social data to recruit people who already have an affinity with the company or its products.
- Forecast: Anticipate changes, trends and seasonal patterns. You can accurately predict sales opportunities or anticipate a volume of orders, so you can also anticipate any associated risks. You can use sensor data to predict the failure of an ATM cash withdrawal transaction, for example.

Subscribers are also able to run easy-to-use, pre-built Clustering, Association Rules, Logistic Regressions, Correlations, or Naive Bayes analyses.

7) Financial data is the leading indicator when using analytics to predict serious events → Tweet this.
Recent cases like CaixaBank or MasterCard prove that there is growing interest in using analytics to get ahead of social and political developments that affect corporate, and sometimes national, interests. Financial data is a leading indicator. Now, we can also look at big data from social media to determine sentiment.

8) What will make emotional analytics really helpful is having a stronger analytics model behind it → Tweet this. Risk or fraud can now be identified by sentiment analysis on social media, so you can anticipate future actions. But if you look at sentiment or emotion alone, it essentially flags a lot of false positives.

9) The need to manage, monitor, and troubleshoot applications in real time has never been more critical → Tweet this. Digital transformation grows as the reliance on new software grows, so the need for full-stack visibility and agility is the true business demand underpinning the growth of analytics.

10) To know your customer and to deliver relevant data is a key business differentiator → Tweet this. TransUnion, for instance, reported a profit for 2015 after booking losses since 2012, when it moved from relational databases to predictive and embedded analytics. Before the overhaul, IT spent half its time maintaining older equipment. About 60% of capital spending went to managing legacy systems; now it’s 40% to 45%, they said. First things first: no matter whether you are forecasting risk of fraud, crime, or financial outcomes, once you have the insight from predictive analytics, you will need a way to communicate it to the right person. 77% of companies cite predictive capabilities as one of the top reasons to deploy self-service business intelligence, but increasing loyalty by improving customer satisfaction is also a very relevant reason.
Firms can now deliver a tailored, personalized experience where customers enjoy self-service access to their own account data and history, and are proactively provided recommendations and information about additional products and services to:

- Drive increased revenue via cross-selling, by automatically suggesting new products and services to targeted customers
- Engage customers by providing self-service access to personal views of account data on any device
- Increase loyalty by enabling customers to interactively customize how they analyze their financial data
- Realize a more rapid time to market by leveraging APIs to embed new features in existing customer applications

OpenText™ Information Hub provides great embedded analytics according to Dresner, and also offers insights from predictive analysis because of its unique integration with OpenText Big Data Analytics. The TDWI research report “Operationalizing and Embedding Analytics for Action” notes that operationalizing and embedding analytics requires more than static dashboards that are updated once a day or less. It requires integrating actionable insights into your applications, devices and databases. Download the report here.
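The Venn-diagram example above (customers who purchased “Insurance A” and “Insurance C” but did not purchase “Insurance B”) reduces to simple set arithmetic. A minimal plain-Python sketch with made-up policyholder IDs:

```python
# Hypothetical policyholder IDs per product (in practice, pulled from your data)
bought_a = {"p1", "p2", "p3", "p5"}
bought_b = {"p2", "p4"}
bought_c = {"p1", "p3", "p4", "p5"}

# Purchased Insurance A and Insurance C, but not Insurance B:
# the shared list a Venn diagram would surface for a cross-sell campaign
target = (bought_a & bought_c) - bought_b
print(sorted(target))  # ['p1', 'p3', 'p5']
```

A visual Venn tool does the same intersection-and-difference work; the value of a platform is doing it across millions of IDs and letting you profile the resulting segment.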

Read More

How to Identify Return on Investment of Social Media Marketing (Infographic)

social media ROI

B2B marketing leaders will be spending more money on technology than the CIO in 2017. Sure, they may already spend a lot, but the interesting question here is: will they now finally be able to identify the revenue? It was not long ago that CMOs were perfectly OK with having more and more new technology tools. The problem came when they were forced to maintain those tools and also to measure their performance, in order to prove that they were still needed. While each platform may include its own reporting tools, in an omnichannel world, having so many partial views of the truth makes little sense. Many business users and decision makers can’t see the forest for the trees when it comes to the current analytics environment. How will they manage in an Analytics of Things world with 6.4 billion connected devices? Are CMOs prepared? According to a recent study on The State of Marketing Analytics by VentureBeat, “Analytics are key to showing value, yet the market is huge and fragmented.” Customers are no longer using a single channel to buy, yet only 60% of marketers create integrated strategies. Probably the most painful source to measure is social media. It is painful because there’s no way that investors will accept engagement metrics such as impressions or likes as revenue. It is also painful because the CMO is being pushed by market analysts to invest more and more budget in social media. Social media spending is expected to climb to a 20.9% share of marketing budgets within 5 years, even though analytics is not yet fully integrated or embedded, according to a recent CMO Survey. ROI of social media, in the form of adding new revenue sources, enhancing revenue sources and increasing revenues, is more pronounced among companies with a more mature data-driven culture. The graph below shows where competitive advantage has been achieved as a result of data-driven marketing, by stage of development, according to Forbes.
What’s worse for the laggards is that their immature analytics culture is resulting in the lowest profitability – they often don’t even know which analytics platforms they are paying for, according to the same research. “There is no substitute for hard work,” said Thomas Edison. In order to identify the ROI of social media marketing, we will do the hard work by going through the requirements for your social media data below, including:

- Why you need integration and democratization of data in the cloud, including social media data
- Why you need self-service advanced and predictive analytics for your campaigns
- Why agility is so important when it comes to marketing analytics
- Why integrated reporting and insights should be easy to access by any marketer or business user

Integration and democratization of data in the cloud, including social media data

This is not just useful for getting a single view of the truth. In an environment where each platform’s reporting is separate from the others, it is not possible to calculate the ROI of social media marketing. You may find a way to calculate the revenue of a pay-per-click campaign in LinkedIn, for example, but this is a very narrow view of what social media marketing is about. The first step to identifying the real revenue of social media should be to integrate all the disparate data sources in the cloud. They should be integrated because that way you will be able to cross-reference information and discover the real 360º customer view and the actual ROI. Also, the data should be integrated in the cloud specifically so other business users can access and self-serve the final insights. Some people may think they have a 360º customer view when they integrate Google Analytics and Salesforce to calculate the ROI of a paid campaign in Google AdWords. But they are still far from that view. According to Think with Google, for instance, customers in most industries in the US click on a paid ad long after they were first engaged in social.
That means that, to be accurate, a percentage of the ROI of the pay-per-click campaign should be attributed to social. What if, instead of making assumptions, you start to track which channel is first, which is next, and which is last? You can do this with tools like Piwik or Eloqua Insights, because they track all the different devices from which the customer is visiting your website as well as the specific URLs they land on, and order the events by date. While this is fine if you have fewer than 200 sessions per day on your website, if you have more than that you will quickly understand why big data is more than just hype! Trust me, if you had the time to explore, analyze and export using Piwik or Eloqua Insights, you would really know what patience means. With big data, even if you just want to use a selected small part of it, you need columnar database technology like that used by OpenText™ Big Data Analytics. Having your data sources integrated at the start and managed by IT is fine, but when it comes to data audits, data cleansing and data enrichment, you had better expect this to be self-service. B2B companies need to know as much as they can about the companies they market to: 75% of B2B marketers say that accurate data is critical for achieving their goals, but lack of data on industry, revenue, and employees is a problem in up to 87% of cases. Bad data affects not just marketing but also sales; according to Forrester, executive buyers find that 77% of the salespeople they meet don’t understand their issues or where they can help. As a result, a lot of CMOs are taking the initiative to start profiling and creating more targeted leads so that sales don’t have this problem.

Self-service advanced and predictive analytics for marketing campaigns

According to a recent study by MDG Advertising, B2B organizations that utilize predictive analytics are 2x more likely to exceed their annual marketing ROI goal.
The research offers interesting reasons why 89% of B2B organizations have predictive analytics on their roadmap. According to VentureBeat, there are three main reasons why marketers aren’t that advanced in their analytical approaches, including skill gaps around data science. Easy-to-use tools can make it easier to run reports, but without a real understanding of data-driven approaches, the final report may not be accurate enough. Predictive lead scoring, for instance, can yield significant ROI, and 90% of large organizations will have a Chief Data Officer (CDO) by 2019. Meanwhile, CMOs are not inactive: 55% of B2B organizations are already hiring for marketing analytics roles. Success in social media is not as easy to predict as turning 30. You know that you will be 29 for 12 months and then your age will automatically change to 30. That is easy to predict, but the approach doesn’t transfer to social media. In social media you need to go deeper than the surface to measure, for example, whether it is worth having 30 social media accounts or maybe 10. You need to measure whether it is worth having 30 blog articles or maybe 300. You will want to calculate whether “Channel A” is worthwhile given that it generated 300 conversions from unqualified leads. You may want to go further and identify which of the qualified leads bought “Product A” and “Product C” but haven’t yet bought “Product B”. Do you want better segmentations and profiles of those leads, so you can create a custom cross-selling campaign based on information that you can get from LinkedIn, for instance? If so, you will need to go further than data visualization. Companies need advanced analytics to identify ROI. This is what will give you insight into the ROI of your social media and, more than that, it could ensure success in the current digital era. You will find the following ad-hoc and pre-built tools in OpenText Big Data Analytics: Venn Diagram: Are you tired of reaching the wrong people?
Smarter companies are reporting benefits from doing data mining to target advanced segmentations and reach the most appropriate people with marketing material that resonates with them. Note that segmentations are based on data mining, but can be created by dragging and dropping the database objects in the left column. Profile: Are you still re-marketing to visitors who landed on your website by mistake? Our drag-and-drop, easy-to-use tool is what marketing needs in the current customer-centric culture. B2B marketing goals for predictive analytics span the customer funnel, including customer retention, customer lifetime value and customer effectiveness, right up to customer acquisition – so having a customer profile is a must-have to begin. Association Rules: What if you could identify which users are likely to abandon you, using sentiment analysis of their activity in social media and help desks, so you can reduce churn with a loyalty campaign? Would that help the ROI of your social media? You can find more predefined analysis screenshots and use-case videos to help. Companies plan to increase spend on marketing analytics, but many will select the wrong capabilities or be unable to use them properly. Harvard Business Review warns on this point: “marketing analytics can have a substantial impact on a company’s growth, but companies must figure out how to make the best use of it.”

Why agility is so important when it comes to marketing analytics

You already know that advanced and predictive analytics is not new, if you think about how financial services has been using it. What’s really new is how easily analysis can be created as self-service now. In the real world, only 20% of organizations are able to deploy a model into operational use in less than two weeks, according to TDWI, so don’t forget to ask for self-service and real-time advanced analytics. You will be happy that your analytics platform technology includes a columnar database when you get to this point.
Why integrated reporting and insights should be easy to access by any marketer or business user

Three out of four marketers can’t measure and report on the contribution of their programs to the business. Isn’t that scary? Knowing the customer and how to deliver relevant data is a key business differentiator. Marketing analytics tools need to be more than a nicely displayed report; they need to allow decision makers to interact with the information, according to McKinsey & Company. A lot has been written about the opportunities of using big data in supply chain and retail companies, and specifically about the social media capabilities to reach their audiences. The retail analytics market is estimated to grow from 2.2 billion to 5.1 billion in 5 years, but difficulty in sharing customer analytics is ranked as a top challenge by the industry. Social media is a smart way to connect a customer with a specific local store, right? Instagram includes stats on impressions, reach, clicks and follower activity for businesses. There are a few tools with powerful capabilities to personalize, share and embed HTML5 data visualizations, but OpenText™ Information Hub is the only one that is tied to advanced and predictive analytics. Nine out of ten sales and marketing professionals report that the greatest departmental interest is in being able to access analytics within front-office applications, and OpenText Information Hub is ranked as the top vendor in the latest Embedded Business Intelligence Market Study by analyst Howard Dresner – not without reason. Don’t forget to ask for an analytics platform that can comfortably scale to unlimited users.

Turn your social data into strategy, then gold

Predictive analytics applies not only to what will happen next quarter, but also to what the user may want to find right now.
Google doesn’t wait for you to build the association rules that have probably helped you a few times; it makes its big data work for millions of users in real time. Now think about your company and one of your prospects sharing your content on social media or email. Do you create a campaign to track and nurture these actions? How do you react if a user won’t complete a form more than once? What do you do when one of your prospects is searching on your site, having landed from a social media post about “Product A”? Are you able to identify that “Product C” will be the more likely purchase? There are musicians unhappy about piracy, but there are others tracking, mining and getting revenue from social data. Getting these insights on revenue before running new omnichannel campaigns will provide one-voice communication, essential for a successful omnichannel strategy. What is also important: this will help with better data-driven decisions and a greater return on investment for your social media budget. 86% of companies that deployed predictive analytics for two or more years saw increased marketing return on investment, according to Forbes. The TDWI research report “Operationalizing and Embedding Analytics for Action” notes that operationalizing and embedding analytics requires more than static dashboards that are updated once a day or less. It requires integrating actionable insights into your applications, devices and databases. Download the report here.
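The channel-ordering idea from this piece (track which channel comes first, which comes next, and which comes last for each customer) is the basis of first- and last-touch attribution. A toy sketch in plain Python; the dates and channel names are hypothetical, not output from Piwik or Eloqua Insights:

```python
from datetime import date

# Hypothetical touchpoints for one customer: (date, channel)
events = [
    (date(2016, 9, 3), "paid_search"),
    (date(2016, 8, 21), "social"),
    (date(2016, 9, 10), "email"),
]

# Order the events by date, then read off the first- and last-touch channels
events.sort(key=lambda e: e[0])
first_touch, last_touch = events[0][1], events[-1][1]
print(first_touch, last_touch)  # social email
```

With the timeline ordered, you can split conversion credit across channels however your attribution model dictates, instead of assuming the last click deserves all of it.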

Read More

CRM for BI? There’s No Substitute for Full-Powered Analytics

CRM

As CRM systems get more sophisticated, many people think these applications will answer all of their business questions – for example, “Which campaigns are boosting revenues the most?” or “Where are my opportunities getting stuck?” Unfortunately, those people are mistaken. You have a CRM Some vendors of CRM (or customer relationship management) platforms like to talk about their analytic capabilities. Usually, those CRM systems have reporting capabilities and let users build dashboards or charts to follow the evolution of their business: tracking pipeline, campaign effectiveness, lead conversion, quarterly revenue, and so on. But from the perspective of analytics, this is only the smallest fraction of the value of the information your CRM is capturing from your sales teams and your customers. A CRM system is perfect for its intended purpose: managing the relationship between your sales force and your customers (or potential customers) and all the information related to them. And yes, it gathers lots of data in its repository, ready to be mined. What you may not realize is that the tool that collects transactional data is not necessarily the best tool for taking advantage of it. It is critical for a CRM to be agile and fast in its interactions with your sales force. If it’s not, it interferes with salespeople selling, building relationships with customers, and contacting prospects. So an ideal CRM system should be architected to collect, structure, and deploy transactional data in a way that the platform can manage easily. Here’s the bad news: this kind of agile, transactional data structure isn’t great for analytics. Complex questions and CRMs don’t match Some CRM vendors try to add business intelligence capabilities to their applications, but they face a basic problem. 
Data that is optimized for quick transactional access is not structured for analysis, and in this scenario there are lots of questions that can’t be answered easily: Which accounts have gone unattended in the last 30 days? Who owns specific accounts? Where are my opportunities getting stuck? At a given point in the sales cycle, which accounts are generating the most revenue? How profitable is this account? Which campaigns are influencing my opportunities? Which products are sold together most frequently? These are only a few of the hundreds of questions that can’t be answered through the data and analytic techniques a CRM offers on its own. It becomes impossible if you want to blend CRM data with external data, because a closed system like a CRM doesn’t have that capability. CRM and full-powered analytics software CRM is a perfect tool for some things. However, it isn’t up to the demands of addressing the ever more complex questions companies need answered, and getting a 360-degree view of their business shouldn’t be a barrier to growing and increasing revenue. OpenText™ Big Data Analytics helps companies blend and prepare their data, and provides a fast, agile, and flexible toolbox to let business analysts look for answers to the questions that decision-makers require. CRM data is one important part of this equation; being more competitive using full-powered analytics software is another.
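To make the gap concrete, one of the questions above – “Which accounts have gone unattended in the last 30 days?” – becomes a few lines of logic once the activity data is extracted from the transactional system. A minimal Python sketch, with invented account names and dates:

```python
from datetime import datetime, timedelta

# Hypothetical CRM extract: last recorded touch-point per account.
last_activity = {
    "Acme Corp": datetime(2016, 5, 2),
    "Globex":    datetime(2016, 3, 1),
    "Initech":   datetime(2016, 4, 28),
}

def unattended_accounts(last_activity, today, days=30):
    """Accounts with no recorded activity in the last `days` days."""
    cutoff = today - timedelta(days=days)
    return sorted(name for name, last in last_activity.items() if last < cutoff)

print(unattended_accounts(last_activity, today=datetime(2016, 5, 15)))
# ['Globex']
```

The hard part in practice is not this query but extracting and blending the data out of the transactional store without slowing it down, which is exactly where a dedicated analytics platform earns its keep.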

Read More

How to Identify the Return on Investment of Big Data (Infographic)


If you are a CIO, you know what goes on at a board of directors meeting. This is not the place to be confused when your CEO asks you a simple, commonly asked question, which requires a simple, accurate answer: “What is the ROI of Big Data costs?” Let’s be honest, Big Data is a big, painful issue. It is often recommended to use just a little information in your dashboard if you want to be heard. But at the same time, you want this information to be accurate, right? That’s when you are happy that your data is big. Why is data size so important? Data scientists, for instance, are always asking for big data because they know how easily predictive analysis can be inaccurate if there isn’t enough data on which to base your predictions. We all know this when it comes to the weather forecast – it is the same when it comes to risk anticipation or sales opportunity identification. What’s really new is how easily any software can access big data. If 90% of the world’s data today has been created in the last two years alone, what can we expect in the near future? For starters, the Internet of Things is anticipated to hugely increase the volumes of data businesses will have to cope with. ROI is all about results. Big data is here to stay and bigger data is coming, so the best we can do is make it worth the trouble. “Cloud is the new electricity,” according to the latest IT Industry Outlook published by CompTIA. But if you feel comfortable just planning to move your data to the cloud, I don’t have good news for you: this is just the beginning. Experts often say that big data doesn’t have much value when simply stored; your spending on big data projects should be driven by business goals. So it’s no surprise that there is increased interest in gaining insights from big data and making data-based decisions. In fact, advanced analytics and self-service reporting are what you should be planning for your big data. 
I’ll briefly tell you why: You need to support democratization of data integration and data preparation in the cloud You should enable software for self-service advanced and predictive analytics Big data insights and reporting should be put where they’re needed Why support democratization of data integration and data preparation in the cloud Big data analytics, recruiting talent, and user experience top the CIO agenda for 2016, according to The Wall Street Journal; but these gaps will hardly be solved in time because of the shortage of data-savvy people. In fact, according to analysts, there is an anticipated shortage of more than 100,000 analytics professionals through 2020. So, while CIOs look for solutions to their own talent gaps, new software and cloud services are appearing in the market to enable business users to get business insights and advance the ROI of big data. Hopefully, someday, a data scientist can provide those insights, but platforms like OpenText™ Big Data Analytics include easy-to-use, drag-and-drop features to load and integrate different data sources from the front end or the back end, in the cloud. Now, I say hopefully because the requirements for data scientists are no longer the same. Knowledge of coding is often not required. According to Robert J. Lake, what he requires from data scientists at Cisco is to know how to make data-driven decisions – that’s why he leaves data scientists free to play with any self-service analytics tool that may help them reach that goal. Data scientists spend around 80% of their time preparing data rather than actually getting insights from it – so interest in self-service data preparation is growing. Leaving the data cleansing to data scientists may seem like a good idea to some of their colleagues, but it is actually not a good idea in terms of agility and accuracy. 
That’s the reason cloud solutions like Salesforce are appreciated: they leave salespeople time to collaborate – adding, editing, or removing information that gives a more precise view of their prospects, one that only they can provide with such precision. What if you could expect the same from a Supply Chain Management or Electronic Health Record system, where data audits depend on multiple worldwide data sources with distinct processes, and with no dependency on data experts at all? In fact, 95% of organizations want end users to be able to manage and prepare their own data, according to noted market analyst Howard Dresner. Analysts predict that the next big market disruption is self-service data preparation, so expect to hear more about it in the near future. Why you should enable self-service advanced and predictive analytics Very small businesses may find desktop tools like Excel good enough for their data analysis, but after digital disruption these tools have become inadequate even for small firms. The need for powerful analytic tools is even greater for larger companies in data-intensive industries such as telecommunications, healthcare, or government. The columnar database has been proposed as the solution, as it is much speedier than row-oriented relational databases when querying hundreds of millions or billions of rows. The speed of a cloud service depends on the volume of data as well as the hardware itself. Measuring the speed of this emerging technology is not easy, but even the whole NoSQL movement argues that relational databases are not the best option for the future. Companies have been able to identify the ROI of big data using predictive analytics to anticipate risk or forecast opportunities for years. For example, banks, mortgage lenders, and credit card companies use credit scoring to predict customers’ profitability. 
They have been doing this even though complex algorithms require data scientists, hard-to-find expertise, not just to build them but to keep them running. That limits their spread through an organization. That’s why OpenText™ Big Data Analytics in the Cloud includes ad-hoc and pre-built algorithms like: Profile: If you are able to visualize a profile of a specific segment of your citizens, customers, or patients and then personalize a campaign based on the differentiating values of this segment, why would the ROI of the campaign not be attributed to the big data you previously stored? Forecasting: If the cloud application is able to identify cross-selling opportunities and a series of campaigns is launched, the ROI of those campaigns can be attributed to the big data that you previously secured. Decision Tree: You should be able to measure the ROI of a new process based on customer risk identification during the next fiscal year and attribute it to the big data that you previously stored in the cloud. Association Rules: You can report the ROI of a new recruitment requirement based on an analysis of job-abandonment information and attribute it to the big data that you had previously enabled as a self-service solution. The greater the number of stars shown on the Forecast screenshot above, the stronger the evidence for non-randomness. This is when you are grateful for having so much information and having it so clean! Customer analytics for sales and marketing provides some of the classic use cases. Looking at the patterns in terabytes of information on past transactions can help organizations identify the reasons behind customer churn, determine the ideal next offer to make to a prospect, detect fraud, or target existing customers for cross-selling and up-selling. 
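The product’s built-in Forecasting algorithm is not spelled out here, but the general idea of trend forecasting can be sketched as a least-squares linear trend in plain Python; the revenue figures are invented:

```python
def linear_forecast(history, periods_ahead=1):
    """Fit y = a + b*t by least squares, then project future periods."""
    n = len(history)
    t_mean = (n - 1) / 2
    y_mean = sum(history) / n
    slope = (sum((t - t_mean) * (y - y_mean) for t, y in enumerate(history))
             / sum((t - t_mean) ** 2 for t in range(n)))
    intercept = y_mean - slope * t_mean
    return intercept + slope * (n - 1 + periods_ahead)

# Hypothetical monthly revenue with a steady upward trend.
revenue = [100, 110, 120, 130]
print(linear_forecast(revenue, periods_ahead=1))  # 140.0
```

Production forecasting adds seasonality, confidence intervals, and significance tests (the “stars” mentioned above), but the core remains fitting a model to history and projecting it forward.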
Put Big Data insights and reporting where they’re needed Embedded visualizations and self-service reporting are key to bringing the benefits of data-driven decisions to more departments, because they don’t require expert intervention. Instead, non-technical users can spontaneously “crunch the numbers” on business issues as they come up. Today 74% of marketers can’t measure and report on the contribution of their programs to the business, according to VisionEdge Marketing. Imagine that you as a CIO have adopted a very strong advanced analytics platform, but the insights are not reaching the right people – in the case of a hospital, the doctor or the patient. Let’s say the profile of the patient and drug consumption is available on someone’s computer, but that insight is not reachable by the users who can make the difference when a new action is required. In that case, the hospital’s results will never be affected by big data, and the ROI potential will not be achieved because the people who need the insights are not getting them. This is called invisible analytics. Consider route optimization of a supply chain – the classic “traveling salesman problem.” When a sizable chunk of your workforce spends its day driving from location to location (sales force, delivery trucks, maintenance workers), you want to minimize the time, miles, gas, and vehicle wear and tear, while making sure urgent calls are given priority. Moreover, you want to be able to change routes on the fly – and let your remote employees make updates in real time, rather than forcing them to wait for a dispatcher’s call. Real-time analytics and reporting should be able to put those insights literally in their hands, via tablets, phones, or smart watches, giving them the power to anticipate or adjust their routes. 
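The traveling salesman problem is NP-hard, so routing tools lean on heuristics. A common starting point is the nearest-neighbor heuristic, sketched here with invented stops on a 2-D grid:

```python
import math

def nearest_neighbor_route(start, stops):
    """Greedy traveling-salesman heuristic: from the current position,
    always drive to the closest unvisited stop next."""
    route, pos = [], start
    remaining = dict(stops)  # stop name -> (x, y) coordinates
    while remaining:
        nearest = min(remaining, key=lambda s: math.dist(pos, remaining[s]))
        route.append(nearest)
        pos = remaining.pop(nearest)
    return route

# Hypothetical service calls, with the depot at the origin.
stops = {"North": (0, 5), "East": (2, 0), "Far": (10, 10)}
print(nearest_neighbor_route((0, 0), stops))  # ['East', 'North', 'Far']
```

Re-running the heuristic over the remaining stops whenever an urgent call arrives is what makes the on-the-fly route updates described above possible.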
OpenText™ Information Hub offers these capabilities as a powerful ad-hoc and built-in reporting tool that enables any user to personalize how information and data visualizations are displayed. Make sure the security and scalability of the tool are carefully evaluated, because in such cases you will be dealing not only with billions of rows but possibly with millions of end users. As mentioned at the start of this blog, user experience is also at the top of the CIO’s agenda. True personalization that ensures the best user experience requires technology that can be fully branded and customized. The goal should be to adapt data visualizations to the same look and feel as the application to provide a seamless user experience. UPS gathers information at every possible moment and stores over 16 petabytes of data. The company makes more than 16 million shipments to over 8.8 million customers globally, receives on average 39.5 million tracking requests from customers per day, and employs 399,000 people in 220 different countries. It spends $1 billion a year on big data, while its revenue in 2012 was $54.1 billion. Identification of the ROI of big data depends on the democratization of the business insights coming from advanced and predictive analytics of that information. Nobody said it is simple, but it can lower operating costs and boost profits, which every business user identifies as ROI. Moreover, when line-of-business users rather than technology users are driving the analysis, and the right people are getting the right insights when they need them, improved future actions should feed the wheel of big data with the bigger data that is coming. And surely you want it to come to the right environment, right? 
Download the Internet of Things and Business Intelligence by Dresner The Internet of Things and Business Intelligence from Dresner Advisory Services is a 70-page research report that provides a wealth of information and analysis, offering value to consumers and producers of business intelligence technology and services. The business intelligence vendor ratings include scores for location intelligence, end-user data preparation, cloud BI, and advanced and predictive analytics – all key capabilities for business intelligence in an IoT context. Download here.

Read More

Big Data: The Key is Bridging Disparate Data Sources

Big Data

People say Big Data is the difference between driving blind in your business and having a full 360-degree view of your surroundings. But adopting big data is not only about collecting data. You don’t get a Big Data club card just for changing your old (but still trustworthy) data warehouse into a data lake (or even worse, a data swamp). Big Data is not only about the sheer volume of data. It’s not about making a muscular demonstration of how many petabytes you stored. To make a Big Data initiative succeed, the trick is to handle widely varied types of data, disparate sources, datasets that aren’t easily linkable, dirty data, and unstructured or semi-structured data. At least 40% of the C-level and high-ranking executives surveyed in the most recent NewVantage Partners’ Big Data Analytics Survey agree. Only 14.5% are worried about the volume of the data they’re trying to handle. One OpenText prospect’s Big Data struggle is a perfect example of why the key challenge is not data size but complexity. Recently, OpenText™ Analytics got an inquiry from an airline that needed better insights in order to head off customer losses. This low-cost airline had made a discovery about its loyal customers. Some of them, without explanation, would stop booking flights. These were customers who used to fly with the airline every month or even every week, but were now disappearing unexpectedly. The airline’s CIO asked why this was happening. The IT department struggled to push SQL queries against different systems and databases, exploring common scenarios for why customers leave. They examined: The booking application, looking for lost customers (or “churners”). Who has purchased flights in previous months but not the most recent month? Which were their last booked flights? The customer service ticketing system, to find out whether any of the “churners” found in the booking system had a recent claim. Were any of those claims solved? Closed by the customer? Was there any hint of customer dissatisfaction? 
What are the most commonly used terms in their communications with the airline – for example, prices? Customer support? Seats? Delays? And what was the tone or sentiment around such terms? Were they calm or angry? Merely irked, or furious and threatening to boycott the airline? The database of flight delays, looking for information about the churners’ last bookings. Were there any delays? How long? Were any of these delayed flights cancelled? Identifying segments of customers who left the company during the last month, whether due to unresolved claims or too many flights delayed or canceled, would be the first step towards winning them back. So at that point, the airline IT department’s most important job was to answer the CIO’s question: May I have this list of customers? The IT staff needed more than a month to get answers to these questions, because the three applications and their databases didn’t share information effectively. First they had to move long lists of customer IDs, booking codes, and flight numbers from one system to another, then repeat the process when the results weren’t useful. It was a nightmare crafted of dispersed data, complex SQL queries, transformation processes, and lots of effort – and it delivered answers too late for the decision-maker. A new month came with more lost customers. That’s when the airline realized it needed a more powerful, flexible analytics solution that could effortlessly draw from all its various data sources. Intrigued by the possibilities of OpenText Analytics, it asked us to demonstrate how we could solve its problems. Using Big Data Analytics, we blended the three disparate data sources. In just 24 hours we were able to answer the questions – OpenText™ Big Data Analytics had worked its magic. The true value of Big Data is getting answers out of data coming from several diverse sources and different departments. This is the pure 360-degree view of business that everyone is talking about. 
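The cross-system question the airline struggled with reduces to a few set operations once extracts from the three systems sit side by side. A toy Python sketch, where all customer IDs and reasons are invented:

```python
# Hypothetical extracts from the three systems, keyed by customer ID.
bookings_last_month = {"C1", "C2"}               # booked in the most recent month
bookings_prior      = {"C1", "C2", "C3", "C4"}   # booked in previous months
open_claims         = {"C3"}                      # unresolved service claims
delayed_last_flight = {"C4"}                      # last booking badly delayed

# Churners: active before, silent in the most recent month.
churners = bookings_prior - bookings_last_month

# Enrich each churner with a likely reason from the other two systems.
report = {
    customer: ("open claim" if customer in open_claims
               else "flight delay" if customer in delayed_last_flight
               else "unknown")
    for customer in sorted(churners)
}
print(report)  # {'C3': 'open claim', 'C4': 'flight delay'}
```

The logic is trivial; the month of IT effort went into getting the three extracts into one place, which is precisely the blending problem an analytics platform solves.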
But without an agile and flexible way to get that view, value is lost in delay. Analytical repositories built on columnar technologies – the foundation of OpenText Analytics solutions – are there to help answer questions fast when a decision-maker needs answers to business challenges.
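The advantage of a columnar layout for this kind of workload can be shown in miniature: a row store must touch whole records to aggregate one field, while a column store scans a single contiguous array. The booking data below is invented:

```python
# Row store: one record per booking; summing revenue touches every field.
rows = [
    {"id": 1, "route": "MAD-BCN", "revenue": 120.0},
    {"id": 2, "route": "BCN-LHR", "revenue": 80.0},
    {"id": 3, "route": "MAD-LHR", "revenue": 200.0},
]

# Column store: each attribute lives in its own contiguous array,
# so an aggregate reads only the one column it needs.
columns = {
    "id":      [1, 2, 3],
    "route":   ["MAD-BCN", "BCN-LHR", "MAD-LHR"],
    "revenue": [120.0, 80.0, 200.0],
}

row_total = sum(r["revenue"] for r in rows)  # scans whole records
col_total = sum(columns["revenue"])          # scans one column only
print(row_total == col_total)                # True
```

At billions of rows, reading one column instead of every record is the difference between interactive queries and overnight batch jobs.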

Read More