Analytics

For Usable Insights, You Need Both Information and the Right Analytical Engine


“It’s all about the information!” Chances are you’ve heard this before. If you are a Ben Kingsley or Robert Redford fan, you may recognize the line from Sneakers (released in 1992). Yes, 1992. Before the World Wide Web! (Remember, Netscape didn’t launch its commercially successful Navigator browser until 1994.) Actually, it’s always been about the information, or at least the right information – what’s needed to make an informed decision, not just an intuitive one. In many ways the information, the data, has always been there; it’s just that until recently, it wasn’t readily accessible in a timely manner. Today we may not realize how much data is available to us through technology, like the mobile device in your pocket – at 12GB, an iPhone 6S holds 2,000 times more than the 6MB programs IBM developed to monitor the Apollo spacecraft’s environmental data. (Which demonstrates the reality of Moore’s Law, but that’s another story.) Yet because it’s so easy to create and store large amounts of data today, far too often we’re drowning in data and experiencing information overload.

Drowning in Data

Chances are you’re reading this in between deleting that last email, before your next Tweet, while someone on your conference call repeats the information you provided yesterday. Bernard Marr, a contributor to Forbes, notes “that more data has been created in the past two years than in the entire previous history of the human race.” Marr’s piece has at least 19 other eye-opening facts about how much data is becoming available to us, but the one that struck me the most is that less than 0.5% of all that data is ever analyzed and used. Imagine the opportunities missed. Within the financial industry alone, the possibilities are limitless. For example, suppose a customer’s transaction patterns showed them buying more and more auto parts and making more frequent payments to their local garage. Combined with a recent increase in automatic payroll deposits, might that indicate the customer is a good prospect for a 0.9% new-car financing offer?

Or imagine the crises that could be avoided. Think back to February 2016 and the Bangladesh Bank heist, where thieves managed to arrange the transfer of $81 million to the Rizal Commercial Banking Corporation in the Philippines. While it’s reasonable to expect existing controls might have detected the theft, it turns out that a “printer error” alerted bank staff in time to forestall an even larger theft of up to $1 billion. The SWIFT interface at the bank is configured to print out a record each time a funds transfer is executed, but on the morning of February 5 the print tray was empty, and it took until the next day to get the printer restarted. The New York Federal Reserve Bank had sent queries to the bank questioning the transfers. What alerted them? A typo. Funds to be sent to the Shalika Foundation were addressed to the “Shalika fandation.” The full implications of this are covered in WIRED Magazine.

Analytics: Spotting Problems Before They Become Problems

Consider the difference if the bank had had a toolset able to flag the anomaly of a misspelled beneficiary in time to generate alerts and hold up the transfers for additional verification. The system was programmed to generate alerts as print-outs; it’s only a small step to have alerts like this sent as an SMS text or email to the bank’s compliance team, which might have attracted notice sooner.
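As a minimal illustration of the kind of rule-based screening described above, here is a sketch that checks outgoing payment instructions against a registry of known beneficiaries. The field names, rules, and thresholds are invented for the example; they are not drawn from SWIFT, Bangladesh Bank, or any OpenText product.

```python
# Illustrative sketch only: a rule-based screen over outgoing payment
# instructions. Field names, rules, and thresholds are hypothetical.
import difflib
from datetime import date

KNOWN_BENEFICIARIES = {"Shalika Foundation", "Acme Manufacturing Ltd"}

def flag_suspicious(payment, account_opened, prior_transfers, today=date(2016, 2, 5)):
    """Return a list of human-readable alerts for one payment instruction."""
    alerts = []

    # 1. Beneficiary name not on file but very close to a known name -> likely typo.
    name = payment["beneficiary_name"]
    if name not in KNOWN_BENEFICIARIES:
        close = difflib.get_close_matches(name, KNOWN_BENEFICIARIES, n=1, cutoff=0.8)
        if close:
            alerts.append(f"Beneficiary '{name}' resembles '{close[0]}' - possible misspelling")

    # 2. Beneficiary account opened only recently and never paid before.
    age_days = (today - account_opened).days
    if age_days < 365 and prior_transfers == 0:
        alerts.append("Beneficiary account is new and has never received funds from us")

    # 3. Unusually large amount for a first-time beneficiary.
    if prior_transfers == 0 and payment["amount_usd"] > 1_000_000:
        alerts.append("Large first-time transfer - hold for manual verification")

    return alerts

# Example: the kind of instruction that might have been held for review.
payment = {"beneficiary_name": "Shalika fandation", "amount_usd": 20_000_000}
for alert in flag_suspicious(payment, account_opened=date(2015, 5, 15), prior_transfers=0):
    print("ALERT:", alert)  # could just as easily be routed as SMS or email to compliance
```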
To best extract value from the business data available to you, you need two things: an engine and a network.

The engine should be like the one in OpenText™ Analytics, designed to perform the data-driven analysis needed. With the OpenText™ Analytics Suite, financial institutions can not only derive data-driven insights to offer value-added solutions to clients, they can also better manage the risk of fraudulent payment instructions based on insights derived from a client’s payment behavior. For example, at Bangladesh Bank, analytics might have flagged some of the fraudulent transfers to Rizal Bank in the Philippines by correlating the facts that the Rizal accounts had only been opened in May 2015, contained only $500 each, and had never previously been beneficiaries.

Business Network: Delivering Data to Analytical Engines

The other, equally important tool is the network. Just as trains need tracks, an analytical engine needs data (as well as a network to deliver it). Today, more and more of the data needed to extract value comes from outside the enterprise. The OpenText™ Business Network is one way thousands of organizations exchange the data needed to manage their business and provide the fuel for their analytical engines. For example, suppose a bank wanted to offer its customers the ability to generate ad-hoc reporting through their banking portal. With payment, collection, and reporting data flows delivered through OpenText Business Network Managed Services, the underlying data would be available to the bank’s analytical engine.

Obviously, much of the data involved in the examples I’ve provided would be sensitive, confidential, and in need of robust information security controls to keep it safe. That will be the subject of my next post.


Steel Mill Gains Insight, Makes Better Decisions Through Analytics


When you think of a steel mill, crucibles of glowing molten metal, giant molds, and rollers probably come to mind, not complex financial analysis. But like every other industry nowadays, steel mills – especially ones that specialize in scrap metal recycling – have to keep reviewing their material and production costs and the ever-changing demand for their products, so that they can perform efficiently in a competitive global market.

That was the case for North Star BlueScope Steel in Delta, Ohio, which produces hot-rolled steel coils, mostly for the automotive and construction industries. Founded in 1997, the company is the largest scrap steel recycler in Ohio, processing nearly 1.5 million tons of metal a year. To operate profitably, North Star BlueScope examines and analyzes its costs and workflow every month, pulling in data from all over the company, plus external market research. But it was hampered by slow and inefficient technology, centered on Microsoft Excel spreadsheets so large and unwieldy that they took up to 10 minutes just to open. Comparing costs for, say, the period of January through May required North Star staffers to open five separate spreadsheets (one for each month) and combine the information manually.

Luckily, the company was already using OpenText™ iHub as a business intelligence platform for its ERP and asset management systems. It quickly realized iHub would be a much more efficient solution for its monthly costing analysis than the Excel-based manual process.

Making Insights Actionable

In fact, North Star BlueScope Steel ended up adopting the entire OpenText™ Analytics Suite, including OpenText™ Big Data Analytics (BDA), whose advanced approach to business intelligence lets the company easily access, blend, explore, and analyze data. The results were impressive. The steel company can now analyze a much larger range of its data and get better insights to steer decision-making. For example, it can draw on up to five years’ worth of data in a single, big-picture report, or drill down to a cost-per-minute understanding of mill operations. Now it has a better idea of the grades and mixes of steel products most likely to generate higher profits, and the customers most likely to buy those products.

To learn more about how North Star BlueScope Steel is using OpenText Analytics to optimize its operations, plus its plans to embrace the Internet of Things by plugging data streams from its instruments about electricity consumption, material usage, steel prices, and even weather directly into Big Data Analytics, click here.
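For a sense of what the manual month-by-month consolidation described above replaces, here is a minimal sketch of the same roll-up done programmatically with pandas. The file names and column names are hypothetical, not North Star BlueScope’s actual spreadsheets.

```python
# Hypothetical sketch: consolidate monthly cost spreadsheets into one view.
# File and column names are illustrative, not real customer data.
import pandas as pd

months = ["2016-01", "2016-02", "2016-03", "2016-04", "2016-05"]

# Read each month's costing sheet and tag the rows with their month.
frames = []
for month in months:
    df = pd.read_excel(f"costing_{month}.xlsx")   # columns: cost_center, category, cost_usd
    df["month"] = month
    frames.append(df)

costs = pd.concat(frames, ignore_index=True)

# The January-May comparison that previously meant opening five workbooks by hand.
summary = (costs
           .groupby(["month", "category"], as_index=False)["cost_usd"]
           .sum()
           .pivot(index="category", columns="month", values="cost_usd"))
print(summary)
```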


Unlock Unstructured Data and Maximize Success in Your Supply Chain

By any standard, a successful business is one that can find new customers, discover new markets, and pursue new revenue streams. But today, succeeding via digital channels, delivering an excellent customer experience, and embracing digital transformation is the true benchmark. Going digital can increase your agility, and with analytics you can get the level of insight you need to make better decisions. Advances in analytics and content management software are giving companies more power to cross-examine unstructured content, rather than leaving them to rely on intuition and gut instinct. Now you can quickly identify patterns and gain a new level of visibility into business operations.

Look inside your organization to find the value locked within the information you have today. The unstructured data being generated every day inside and outside your business holds targeted, specific intelligence that is unique to your organization and can be used to find the keys to current and future business drivers. Unstructured data like emails, voicemails, written documents, presentations, social media feeds, surveys, legal depositions, web pages, videos, and more offer a rich mine of information that can inform how you do business. Unstructured content, on its own or paired with structured data, can be put to work to refine your strategy.

Predictive and prescriptive analytics offer unprecedented benefits in the digital world. Consider, for instance, the data collected from a bank’s web chat service. Customer service managers cannot read through millions of lines of free text, but ignoring this wealth of information is not an option either. Sophisticated data analytics allow banks to spot and understand trends, like common product complaints or frequently asked questions (a simple sketch of this kind of text mining appears at the end of this post). They can see what customers are requesting to identify new product categories or business opportunities. Every exchange, every interaction, and all of your content holds opportunity that you can maximize.

Making the most of relevant information is a core principle of modern enterprise information management. This includes analyzing unstructured information that is outside the organization, or passed between the company and trading partners across a supply chain or business network. As more companies use business networks, there is an increase in the types and amounts of information flowing across them: orders, invoices, delivery information, partner performance metrics, and more. Imagine the value of understanding the detail behind all that data, and the insight it could provide for future planning. Even better: imagine being able to analyze it fast enough to make a difference in what you do today. Here are two common, yet challenging, scenarios and their solutions.

Solving challenges in your enterprise

Challenges within the business network – A business network was falling behind in serving its customers. They needed to increase speed and efficiency within their supply chain to provide customers with deeper business process support and rich analytics across their entire trading partner ecosystem. With data analytics, the company learned more from their unstructured data—emails and documents—and was able to gain clearer insights into transactions flowing across the network. The new system allows them to identify issues and exceptions earlier, take corrective action, and avoid problems before they occur.
Loss of enterprise visibility – A retail organization was having difficulty supporting automatic machine-to-machine data feeds coming from a large number of connected devices within their business network. With the addition of data analytics across unstructured data sources, they gained extensive visibility into the information flowing across their supply chain. Implementing advanced data analytics allowed them to analyze information coming from all connected devices, which afforded a much deeper view into data trends. This intelligence allowed the retailer to streamline their supply chain processes even further.

Want to learn more? Explore how you can move forward with your digital transformation; take a look at how OpenText Release 16 enables companies to manage the flow of information in the digital enterprise, from engagement to insight.
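As referenced above, here is a minimal sketch of how free-text chat transcripts might be mined for recurring complaint topics. The transcripts and topic keywords are invented for illustration; real solutions such as OpenText InfoFusion use far richer text analytics than simple keyword counting.

```python
# Illustrative only: surface recurring themes in web-chat transcripts.
# Transcripts and topic keywords are made up for the example.
from collections import Counter

TOPIC_KEYWORDS = {
    "fees":       {"fee", "charge", "charged", "overdraft"},
    "mobile app": {"app", "login", "crash", "password"},
    "cards":      {"card", "declined", "blocked", "pin"},
}

def tally_topics(transcripts):
    """Count how many chats touch each topic at least once."""
    counts = Counter()
    for chat in transcripts:
        words = set(chat.lower().split())
        for topic, keywords in TOPIC_KEYWORDS.items():
            if words & keywords:
                counts[topic] += 1
    return counts

chats = [
    "Why was I charged an overdraft fee twice this month?",
    "The app keeps crashing when I try to log in",
    "My card was declined at the store even though I have funds",
    "Another overdraft charge appeared, please explain the fee",
]

for topic, n in tally_topics(chats).most_common():
    print(f"{topic}: mentioned in {n} of {len(chats)} chats")
```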


Westpac Bank Automates and Speeds Up Regulatory Reporting with OpenText Analytics


When Westpac Banking Corporation was founded in 1817 in a small waterfront settlement in Australia, banking was rudimentary. Records were kept with quill pens in leather-bound ledgers: pounds, shillings, and pence into the cashbox; pounds, shillings, and pence out. (Until a cashier ran off with half the fledgling bank’s capital in 1821, that is.)

Now, exactly 200 years after Westpac’s parent company opened its doors, it’s not only the oldest bank in Australia but the second-largest, with 13 million customers worldwide and over A$812 billion under management. Every year it does more and more business in China, Hong Kong, and other Asia-Pacific nations. The downside to this expansion: more forms to fill out. Managing the electronic and physical flow of cash across national borders is highly regulated, requiring prompt and detailed reports of transactions, delivered in different formats for each country and agency that oversees various aspects of Westpac’s business. These reports require information from multiple sources throughout the company. Until recently, pulling out and consolidating all these complex pieces of data was a manual, slow, labor-intensive process that often generated data errors, according to Craig Chu, Westpac’s CIO for Asia. The bank knew there had to be a better way to meet its regulatory requirements – but one that wouldn’t create its own new IT burden.

A successful proof of concept led to Westpac adopting an information management and reporting solution from OpenText™ Analytics. To hear Chu explain how Westpac streamlined and automated its reporting process with OpenText™ iHub and Big Data Analytics, and all the benefits his company has realized, check out this short video showcasing this success story. (Spoiler alert: self-service information access empowers customers and employees.) If you’d like to learn more about what the OpenText Analytics Suite could do for your organization, click here.


Post-Election Score: Pundits 0, Election Tracker 1


In the midst of post-election second-guessing over why so many polls and pundits failed to predict Donald Trump’s win, there was one clear success story: OpenText™ Election Tracker. Election Tracker, the web app that analyzed news coverage of the Presidential race from over 200 media outlets worldwide for topics and sentiment, was a great showcase for the speed, robustness, and scalability of the OpenText™ Information Hub (iHub) technical platform it was built on. With demands for more than 54,000 graphic visualizations an hour on Election Day, it ramped up quickly with no downtime, the kind of performance you’d expect from OpenText™ Analytics.

Moreover, by revealing patterns in the tone and extent of campaign news coverage, the tracker provided valuable extra insight into voter concerns that pre-election polls didn’t uncover, and that insight didn’t end on Election Day. It’s just one in a series of proofs-of-concept showing how our unstructured data analytics solutions shine at analyzing text and other unstructured data. They bring to the surface previously hard-to-see patterns in any kind of content stream: social media, customer comments, healthcare service ratings, and much more. OpenText Analytics solutions analyze these patterns and bring them to life in attractive, easy-to-understand, interactive visualizations. And if some unforeseen event ends up generating millions of unexpected clicks, Tweets, or comments that you need to sift through quickly, iHub offers the power and reliability to handle billions of data points on the fly.

Hello, Surprise Visitors!

Speaking of unforeseen events: some of the Election Tracker traffic was due to mistaken identity. On Election Day, so many people were searching online for sites with live tracking of state-by-state election results that electiontracker.us became one of the top results on Google that day. At peak demand, the site was getting nearly 8,000 hits an hour, more than 100 times the usual traffic.

Senior Director of Technical Marketing Mark Gamble, an Election Tracker evangelist, was the site administrator that day. “On November 8 at around 6 a.m. I was about to get on a flight when I started getting e-mail alerts from our cloud platform provider that the Election Tracker infrastructure was getting hammered from all those Google searches. I’d resolve that alert, and another one would pop up.”

“We had it running at just two nodes of our four-node cluster, to keep day-to-day operating costs down. Our technical team said, ‘Let’s spin up the other two nodes.’ That worked while I was changing planes in Detroit. But when I got off, my phone lit up again: demand was still climbing. It was just unprecedented traffic.”

“So we had our cloud provider double the number of cores, or CPUs, that run on each node. And that kept up with demand. The site took a bit longer to load, but it never once crashed. That’s the advantage of running in the cloud – you can turn up the volume on the fly.”

“Of course, the flexibility of our iHub-based platform is unique. All the cloud resources in the world won’t help you if you can’t quickly and efficiently take advantage of them.”

Easy Visualizing

Demand on the site was heightened by the Election Tracker’s live, interactive interface. That’s intentional, because OpenText Analytics solutions encourage users to take a self-service approach to exploring their data. “It’s not just a series of static pages,” explains Clement Wong, Director of Analytics On-Demand Operations.
“The infographics are live and change as the viewer adjusts the parameters. With each page hit, a visitor was asking for an average of seven visualizations. That means the interface is constantly issuing additional calls back and forth to the database and the analytic engine. iHub has the robustness to support that.” (In fact, at peak demand the Tracker was creating more than 15 new visualizations every second.)

“Some of the reporters who wrote about Election Tracker told us how much they enjoyed being able to go in and do comparisons on their own,” Gamble says. “For example, look at how much coverage each candidate got over the past 90 days, compared to the last 7 days, then filter for only non-U.S. news sources, or drill down to specific topics like healthcare or foreign policy. That way they didn’t have to look at static figures and then contact us to interpret for them; the application gave them the autonomy to draw their own conclusions.”

Great Fit for Embedding

“The self-service aspect is one reason that iHub and other OpenText Analytics solutions are a great fit for embedding into other web sites, for use cases such as bank statements or utility usage,” Gamble adds. “First of all, an effective embedded analytic application has to be highly interactive and informative, so people want to use it – not just look at ready-made pages, but feel comfortable exploring on their own.”

“Embedded analytics also requires seamless integration with the underlying data sources, so the visuals are integral and indistinguishable from the rest of the site, and it needs high scalability to keep up with growing usage.”

What’s Next?

The iHub/InfoFusion integration underlying the Election Tracker is already being used in other proofs-of-concept. One is helping consumer goods manufacturers analyze customers’ social media streams for their sentiments about the product and needs or concerns. “If you think of Election Tracker as the Voice of the Media, the logical next step is Voice of the Customer,” Gamble says. The Election Tracker is headlining the OpenText Innovation Tour, which just wrapped up in Asia and resumes in spring 2017.


Telco Accessibility 101: What’s Now Covered by U.S. Legislation


In a word, everything. Name a telecommunications product or service and chances are it has a legal requirement to comply with federal accessibility laws. Let’s see…

Mobile connectivity services for smartphones, tablets, and computers? Check.
Smartphones, tablets, and computers? Check.
Internet services (e.g., cable, satellite)? Check.
Television services (e.g., cable, satellite, broadcast)? Check.
Televisions, radios, DVD/Blu-ray players, DVRs, and on-demand video devices? Check.
Email, texting, and other text-based communication? Check.
VoIP communications and online video conferencing? Check.
Fixed-line phone services? Check.
Fixed-line telephones, modems, answering machines, and fax machines? Check.
Two tin cans attached by a string? Check.

All of these products and services are covered by U.S. accessibility legislation (except the cans and string). What laws are we talking about here? Mainly Section 255 of the Telecommunications Act of 1996, for products and services that existed before 1996, and the Twenty-First Century Communications and Video Accessibility Act (CVAA) of 2010, which picked up where Section 255 left off, defining accessibility regulations for broadband-enabled advanced communications services.

Web accessibility legislation, while not telco-specific, is also relevant. The Americans with Disabilities Act (ADA) doesn’t explicitly define commercial websites as “places of public accommodation” (because the ADA predates the Internet), but the courts have increasingly interpreted the law this way. Therefore, as “places of public accommodation,” company websites—and all associated content—must be accessible to people with disabilities. For more insight on this, try searching on “Netflix ADA Title III” or reading this article. (By the way, a web-focused update of the ADA is in the offing.) Last but not least, we come to Section 508 of the Rehabilitation Act, which spells out accessibility guidelines for businesses wanting to sell electronic and information technology (EIT) to the federal government. If your company doesn’t do that, then Section 508 doesn’t apply to you.

What this means for businesses

Not unreasonably, telecommunications companies must ensure that their products and services comply with accessibility regulations and are also usable by people with disabilities. This usability requirement means that telecom service providers must offer contracts, bills, and customer support communications in accessible formats. For product manufacturers, usability means providing customers with a full range of relevant learning resources in accessible formats: installation guides, user manuals, and product support communications. To comply with the legislation, telecommunications companies must find and implement cost-effective technology solutions that will allow them to deliver accessible customer-facing content. Organizations that fail to meet federal accessibility standards could leave themselves open to consumer complaints, lawsuits, and, possibly, stiff FCC fines.

Meeting the document challenge with accessible PDF

Telecommunications companies looking for ways to comply with federal regulations should consider a solution that can transform their existing document output of contracts, bills, manuals, and customer support communications into accessible PDF format. Why PDF?
PDF is already the de facto electronic document standard for high-volume customer communications such as service contracts and monthly bills because it’s portable and provides an unchanging snapshot, a necessity for any kind of recordkeeping. But what about HTML? Why not use that? While HTML is ideal for delivering dynamic web and mobile content such as on-demand, customizable summaries of customer account data, it doesn’t produce discrete, time-locked documents. Plus, HTML doesn’t support archiving or portability, meaning HTML files are not “official” documents that can be stored and distributed as fixed entities.

Document content is low-hanging fruit

Document inaccessibility is not a problem that organizations need to live with, because it can be solved immediately — and economically — with OpenText’s Automated Output Accessibility Solution, the only enterprise PDF accessibility solution on the market for high-volume, template-driven documents. This unique software solution enables telecommunications companies to quickly transform service contracts, monthly bills, product guides, and other electronic documents into WCAG 2.0 Level AA-compliant accessible PDFs. Whatever the data source, our performance numbers are measured in milliseconds, so customers receive their content right when they ask for it. OpenText has successfully deployed this solution at government agencies as well as large commercial organizations, giving it the experience and expertise required to deliver accessible documents within a short time frame, with minimal disruption of day-to-day business. Fast, reliable, compliant, and affordable, our automated solution can help you serve customers and meet your compliance obligations. Learn more about the OpenText™ Automated Output Accessibility Solution.


Banking Technology Trends: Overcoming Process Hurdles


Editor’s Note: This is the second half of a wide-ranging interview with OpenText Senior Industry Strategist Gerry Gibney on the state of the global financial services industry and its technical needs. The interview has been edited for length and continuity.

Unifying Information for Financial Reporting

I heard a lot of discussion at the SIBOS 2016 conference in Geneva around financial reporting. Banks face procedural hurdles, especially if they’re doing merchant or commercial banking. A lot of them still have manual processes. In terms of the procedures, the bigger the bank, the bigger the problem. That’s because the information is often in many places. For example, different groups from the bank may approach their corporate banking customers to buy or use a service or product – which is great, but they have to track and report on it. Often in the beginning, these separate group reporting processes are manual. Eventually, they want to automate the reporting and join the information to other data sources, but that’s the big challenge – it takes time to assemble and coordinate all the information streams and get them to work as an internal dashboard. A similar challenge is creating portals to track financial liquidity.

Another example is where clients ask for specific reports. The bank doesn’t want to say no, so it has to produce the reports manually, often as a rush job, and in a format that the client finds useful. The challenge is to take large amounts of data and summarize them so you can give people what they ask for with the periodicity, the look, and the format that they want.

Embedded Visualizations for our Customers’ Customers

That’s where we come in. A lot of the value we offer with OpenText Analytics is embedding our analytic and visualization applications in a client’s own application so that they can offer dashboards, windows, reporting, and so forth to their own internal or external customers. The beauty of our embeddable business intelligence or analytics piece is that no one on the business side has to see it or work with it. It offers functionality that can be applied as needed, without having to make IT adjustments on your part or requiring people to enter data into bulky third-party programs. Tremendous capabilities are suddenly just there. Users can build a data map that automatically gathers and manages data, then organizes and reports it – in any format required, whether visual via charts and graphs, or numeric, if you prefer. Plus, it has powerful drill-down ability.

Flexibility to Cope with Regulatory Shifts

The other aspect of reporting is reporting to regulatory agencies. After the Great Recession and the banking crisis, governments worldwide have been stepping up their efforts in regulating the financial industry. Not just nations – local governments also. In fact, the fastest-growing department in every bank now is regulatory compliance. There are ever-increasing workloads and more workflow, but without more people to deal with it. The problem, in the U.S. at least, is that regulation presents a moving target. Dodd-Frank controls and the Volcker Rule required banks to end proprietary trading. There is potentially a new level of risk from government changes in requirements and the need for banks to produce new reports, sometimes even on things they weren’t aware they needed to report on. Banks and other financial institutions need a reporting solution that enables quick and easy production of whatever information government regulators are asking for.
An ideal reporting solution will maximize your flexibility in looking across both unstructured data and all the structured data in multiple silos. This is a good use case for ad hoc analytics and reporting – the power to create new types of reports, whatever regulators may require.

Financial Analytics: Understanding Your Customer

Another analytics-related topic I heard about at SIBOS was the need to understand customers better and how to identify a good target customer. This is top-of-mind for banks. I’m amazed that people gather streams of data in their CRM systems and then don’t use it. Often their CRM systems are stand-alone, not connected to anything. They might contain information that’s extremely valuable and could enhance their efforts: sales efforts, proposals, and pitch books, for example. They could tie these things together, and then analyze their findings to correlate sales resources to results. With a unified analytics flow, you can drive business by managing client relationships, figuring out through advanced analytics who is the best candidate for up-selling or cross-selling, as well as identifying new customers. Finding new insights by searching all these CRM systems is a tremendous value that analytics, especially embeddable analytics from OpenText, can deliver. Analytics can bring a tremendous amount of value to business operations and make them more efficient, productive, and profitable. You can’t ask more than that.

To learn more about how OpenText Analytics can help the financial services industry unlock the business value of existing data, consider our Webinar, “Extracting Value from Your Data with Embedded Analytics,” Wednesday, Dec. 14, at 11 a.m. Pacific Time/2 p.m. Eastern. Click here for more information and to register.


Banking Trends from SIBOS: Technology Solutions to Tame Rampaging Workflows


Editor’s Note: Gerry Gibney, Senior Industry Strategist at OpenText and resident expert on the financial services industry, was recently interviewed on the banking trends and technical needs he discovered at SIBOS (the annual trade show hosted by SWIFT, provider of global secure financial messaging services).

I always come back from SIBOS having learned new things. It’s one of the largest banking events in the world, and this year one of the big topics was domestic payments. Many people aren’t aware that for large banks, corporate internet banking payments represent around 24% of their revenue. They benefit from payment money while it is in their hands, and they can charge fees for the payment services. It’s a big market because payments have to be made, whether regular payments such as rent and utilities on buildings or one-time money transfers. And they add up. For bigger banks, we’re talking several hundred million dollars each. Of course, they would prefer to keep that balance in their bank or extract it over time.

I see a big role for OpenText here. Our BPM solution can be deployed to help with business networks, so banks can manage the workflow, the processes, and the controls. Managing the controls is important because with the SWIFT processes (payments and messaging), issues include: Who is authorized to send the money? Who else can do it? Who else can approve it? What if that person leaves? How do we add them into the system or remove them?

Automating Banking Workflow

Our own experience at OpenText is typical. Every year, our company goes through the payment permissions updating process. What do we need to know? What do we need to get? How do we get it? Where do we apply it? How many accounts are responsive? Doing business in, say, Hong Kong, Shanghai, or Japan, we may have 10 or 20 people with different signatory levels, each needing to sign an eight-page statement. Eight pages times 10 people, every year, for every account – that’s 80 pages per account every year, and that’s typical of many companies. A company might well have several hundred accounts with just one bank, and this has to be managed every year, with ever-changing rules, like regulators now requiring the CFO’s home address, for example.

Another workflow example is client onboarding, which has to be done every time. Even if the customer has 200 accounts and they want to add number 201, you still have to go through the onboarding process. So all the information is out there in different places; who knows how well protected it all is? OpenText’s security capabilities, and our ability to add workflow, control workflow, minimize it, and automate it, add a lot of value.

OpenText is also a SWIFT service bureau. We help with payments reporting, via EDI and our Business Network, to enhance what banks do. We help banking in many areas, across all our solutions – for example, with analytics, on the content side for unstructured data, or helping with records management, which is strong on compliance. With embeddable analytics we can gather all sorts of information, whether it’s for bank employees internally or their clients and customers. This information can be transformed into reports and subjected to sophisticated analysis, helping companies find new ways to generate revenue from it. It can also help to track things more efficiently, comply with government regulations more easily, and improve the bottom line without increasing operating costs. In summary, it can be a tremendously powerful component of a bank’s overall offering.
The second half of this interview will be published next week.


Data Quality is the Key to Business Success


In the age of transformation, all successful companies collect data, but one of the most expensive and difficult problems to solve is the quality of that information. Data analysis is useless if we don’t have reliable information, because the answers we derive from it could deviate greatly from reality. Consequently, we could make bad decisions. Most organizations believe the data they work with is reasonably good, but they recognize that poor-quality data poses a substantial risk to their bottom line (The State of Enterprise Quality Data 2016 – 451 Research). Meanwhile, the idiosyncrasies of Big Data are only making the data quality problem more acute. Information is being generated at increasingly faster rates, while larger data volumes are innately harder to manage.

Data quality challenges

There are four main drivers of dirty data:

Lack of knowledge. You may not know what certain data mean. For example, does the entry “2017” refer to a year, a price ($2,017.00), the number of widgets sold (2,017), or an arbitrary employee ID number? This can happen because the structure is too complex, especially in large transactional database systems, or because the data source is unclear (particularly if that source is external).

Variety of data. This is a problem when you’re trying to integrate incompatible types of information. The incompatibility can be as simple as one data source reporting weights in pounds and another in kilograms, or as complex as different database formats.

Data transfers. Employee typing errors can be reduced through proofreading and better training. But a business model that relies on external customers or partners to enter their own data has a greater risk of “dirty” data, because it can’t control the quality of their inputs.

System errors caused by server outages, malfunctions, duplicates, and so forth.

Dealing with dirty data

Correcting a data quality problem is not easy. For one thing, it is complicated and expensive; benefits aren’t apparent in the short term, so it can be hard to justify to management. And as I mentioned above, the data gathering and interpretation process has many vulnerable places where error can creep in. Furthermore, both the business processes from which you’re gathering data and the technology you’re using are liable to change at short notice, so quality correction processes need to be flexible. Therefore, an organization that wants reliable data quality needs to build in multiple quality checkpoints: during collection, delivery, storage, integration, recovery, and analysis or data mining.

The trick is having a plan

Monitoring so many potential checkpoints, each requiring a different approach, calls for a thorough quality assurance plan. A classic starting point is analyzing data quality when it first enters the system – often via manual input, or where the organization may not have standardized data input systems. The risk is that data entry can be erroneous, duplicated, or overly abbreviated (e.g., “NY” instead of “New York City”). In these cases, data quality experts’ guidance falls into two categories. First, you can act preventively on the process architecture, e.g., building integrity checkpoints, enforcing existing checkpoints better, limiting the range of data to be entered (for example, replacing free-form entries with drop-down menus), rewarding successful data entry, and eliminating hardware or software limitations (for example, if a CRM system can’t pull data straight from a sales revenue database).
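As a rough illustration of the preventive side, here is a minimal sketch of entry-time integrity checks. The field names, controlled vocabularies, and ranges are invented for the example, not taken from any particular system.

```python
# Illustrative sketch of preventive, entry-time integrity checks.
# Field names, controlled vocabularies, and ranges are invented examples.
ALLOWED_CITIES = {"New York City", "Boston", "Chicago"}   # drop-down list instead of free text
WEIGHT_RANGE_KG = (0.1, 1000.0)                           # one agreed unit, one sane range

def validate_record(record):
    """Return a list of problems found in a single incoming record."""
    problems = []

    city = record.get("city", "")
    if city not in ALLOWED_CITIES:
        problems.append(f"city '{city}' is not in the controlled list (abbreviation or typo?)")

    weight = record.get("weight_kg")
    if not isinstance(weight, (int, float)):
        problems.append("weight_kg is missing or not numeric")
    elif not WEIGHT_RANGE_KG[0] <= weight <= WEIGHT_RANGE_KG[1]:
        problems.append(f"weight_kg {weight} is outside the expected range")

    return problems

# Reject or flag records at the point of entry, before they pollute downstream systems.
incoming = {"city": "NY", "weight_kg": -5}
for issue in validate_record(incoming):
    print("REJECTED:", issue)
```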
The other option is to strike retrospectively, focusing on data cleaning and diagnostic tasks (error detection). Experts recommend these steps:

Analyzing the accuracy of the data, either by making a full inventory of the current situation (trustworthy but potentially expensive) or by examining work and audit samples (less expensive, but not 100% reliable).

Measuring the consistency and correspondence between data elements; problems here can affect the overall truth of your business information.

Quantifying system errors in analysis that could damage data quality.

Measuring the success of completed processes, from data collection through transformation to consumption. One example might be how many “invalid” or “incomplete” alerts remain at the end of a pass through the data.

Your secret weapon: “data provocateurs”

None of this will help if you don’t have the whole organization involved in improving data quality. Thomas C. Redman, an authority in the field, presents a model for this in a Harvard Business Review article, “Data Quality Should Be Everyone’s Job.” Redman says it’s necessary to involve what he calls “data provocateurs,” people in different areas of the business (from top executives to new employees) who will challenge data quality and think outside the box for ways to improve it. Some companies are even proposing awards for employees who detect process flaws where poor data quality can sneak in. This not only cuts down on errors, it has the added benefit of promoting the idea throughout the whole company that clean, accurate data is important.

Summing up

Organizations are rightly concerned about data quality and its impact on their bottom line. The ones that take measures to improve their data quality are seeing higher profits and more efficient operations because their decisions are based on reliable data. They also see lower costs from fixing errors and spend less time gathering and processing their data. The journey towards better data quality requires involving all levels of the company. It also requires assuming costs whose benefits may not be visible in the short term, but which eventually will end up boosting these companies’ profits and competitiveness.


Attention All Airlines: Is Your Inaccessible Document Technology Turning Away Customers?


Imagine you’re an airline executive and a small but significant percentage of your customers—let’s say 10% or less—download flight itineraries and boarding passes from your website only to find that the information in these documents is jumbled up and, in some cases, missing altogether. What would you do? Would you be concerned enough to take action? Would it matter if these customers didn’t know their flight number, boarding gate, and seat assignment? After all, 90% or more of your customers would still be receiving this information as usual.

Before venturing an answer to these hypothetical questions, let’s pause for a quick look at your industry. Over the last 60 years, airline profit margins have averaged less than 1%, though the situation has been improving in recent years. The International Air Transport Association (IATA) reported a net profit margin of 1.8% in 2014 and 4.9% in 2015; industry profits are expected to be 5.6% in 2016. With such narrow margins, it’s clear that airlines need every customer they can get, and the industry has little tolerance for inefficiencies.

Now back to your document problem… Even if less than 10% of customers were affected, it seems likely that you’d take steps to fix the problem, and pull out all the stops to get it done as fast as possible, before the company loses many customers. Of course, the underlying assumption here is that a proven, economically feasible IT solution is available.

This might be happening at your airline—for real

All hypotheticals aside, a scenario like this could actually be playing out at your company right now. Consider: according to the 2014 National Health Interview Survey, 22.5 million adult Americans—nearly 10% of adult Americans—reported being blind or having some sort of visual impairment. To access online flight booking tools, along with electronic documents such as itineraries and boarding passes, many of these people need to use screen reader programs that convert text into audio. If, however, the documents aren’t saved in a format like accessible PDF (with a heading structure, defined reading order, etc.), they’re likely to come out garbled or incomplete in a screen reader.

Of course, visually impaired customers could book their flights by phone and opt to receive Braille or Large Print documents in the mail (expensive for your airline). Then again, theoretically, all of your other customers could book by phone, too. The point is you don’t really want customers booking by phone, because your self-serve website is less costly to operate than customer call centers; electronic documents are cheaper than paper and postage, and much cheaper than Braille and Large Print. So, wouldn’t it be nice if there was an affordable technology solution that you could plug in to serve up the documents that all of your customers—that’s 90% plus 10%—need to fly with your airline? Of course, it would be even better if the solution met the requirements of new Department of Transportation (DOT) rules implementing the Air Carrier Access Act (ACAA), which have a compliance deadline of December 12, 2016. Customer satisfaction and regulatory compliance? Now that would be good.

OpenText Automated Output Accessibility Solution

OpenText has the only enterprise PDF accessibility solution for high-volume, template-driven documents. This unique software solution can dynamically transform electronic documents such as e-ticket itineraries/receipts and boarding passes into accessible PDFs that comply with the DOT’s new ACAA rules.
Designed to be inserted between the back office (e.g., a passenger reservation system) and a customer-facing Web portal, the OpenText™ Automated Output Accessibility Solution has minimal impact on existing IT infrastructure. Even better, the solution generates WCAG 2.0 Level AA compliant output that has been tested and validated by prominent organizations and advocacy groups for visually impaired people. OpenText has successfully deployed this solution at government agencies as well as large commercial organizations, giving it the experience and expertise required to deliver accessible documents within a short time frame, with minimal disruption of day-to-day business. As the de facto electronic document standard for high-volume customer communications, PDF offers both portability and an unchanging snapshot of information, necessities for a document of record. Contact us to discuss how we can help you deliver accessible, ACAA-compliant PDF documents to your customers. Remember, the DOT’s deadline is December 12, 2016.


Fintech at SIBOS: From Everyday Banking to Science Fiction


In late September, we were at SIBOS 2016, the annual trade show hosted by SWIFT. It’s a major showcase for innovative financial technology (“Fintech”), featuring the latest news and analysis about banking and electronic payments. Over 8,500 people attended this year, with Eurasian, Middle Eastern, African, and Chinese banks showing a stronger presence than in the past. The show was in Geneva, and we were struck by the contrast between the timeless beauty of the Alpine setting (snow-covered Mont Blanc looming over the skyline, Lac Leman at the foot of our hotel, and our daily commute passing vineyards) and the dynamic, even futuristic, requirements of the financial industry in the 21st century discussed at the conference.

SIBOS has been morphing as SWIFT seeks out new directions. It offers a showcase for Fintech through its Innotribe program, continues to develop electronic communications standards for a wide range of uses, and this year focused more on cyber-security (after hackers exploited a SWIFT weakness to steal $81 million from the central bank of Bangladesh’s account at the Federal Reserve) to defend the SWIFT electronic transaction platform. We also heard a lot of buzz about new Fintech developments such as cybersecurity and artificial intelligence (e.g., creating algorithms that could manage simple investment portfolios as well as most human advisors).

Blockchain: From cyberpunk to competitive advantage

Of all these new technologies, blockchain was a key discussion at SIBOS. The conference was full of questions about blockchain: How does it work? What does it mean for us? How will payments be affected? Who is working with it? And what are the real security issues? A blockchain is a type of distributed, online database that keeps a permanent, tamper-proof record of financial transactions. It was created to settle trades in Bitcoin, the virtual currency, and has since become popular for deals in “real” currencies because all parties can track the transaction securely, with no need for third-party verification. SWIFT is interested in blockchain technology even though – or maybe because – it could pose strong competition to SWIFT’s own secure payments service, which can take days or weeks to settle a complex transaction. Fintech competitors using blockchain are forecasting that they will be able to cut transaction time down to near-zero.

“Let’s get it done”

However, what interested us most about SIBOS 2016 is that despite all that buzz, there was still a core of business as usual – practical, “let’s get it done” business. The banks still face the same issues: “How do we do this payments business faster and cheaper, and take better advantage of the relationships it creates?” For example, client onboarding continues to be a challenge for many banks, and we provide a lot of value in this area. We had many discussions about how OpenText helps banks to improve their overall compliance, get to revenue sooner, and achieve higher customer satisfaction rates. We also had conversations about the wealth of data that banks have and how they can enable better use of it, for both themselves and their customers. While they may be experimenting with new technologies driven by the Fintech boom, the practical business in the next cycle will be in establishing value from the networks and relationships already in place.
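To make the “tamper-proof record” idea above concrete, here is a toy hash-chain sketch. It illustrates only the chaining principle, not Bitcoin’s or SWIFT’s actual protocols, and the transactions are invented.

```python
# Toy illustration of the chaining idea behind a blockchain: each block stores
# a hash of the previous block, so altering any past transaction breaks the chain.
# This is a teaching sketch, not Bitcoin, SWIFT, or any production ledger.
import hashlib
import json

def block_hash(block):
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, transaction):
    previous = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": previous, "transaction": transaction})

def verify(chain):
    """True only if every block still points at the unaltered block before it."""
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

ledger = []
append_block(ledger, {"from": "Bank A", "to": "Bank B", "amount": 100})
append_block(ledger, {"from": "Bank B", "to": "Bank C", "amount": 40})
print(verify(ledger))            # True

ledger[0]["transaction"]["amount"] = 1_000_000   # try to rewrite history
print(verify(ledger))            # False: the tampering is immediately detectable
```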
OpenText’s role in the new Fintech world

Naturally, we feel OpenText has a role to play here. To start with, OpenText™ Analytics solutions help financial companies extract more value from their information by liberating it from the many separate silos it’s often housed in and integrating these various streams of information into a complete, accurate picture of investment performance, customer satisfaction, user experience, or response to various marketing incentives. Our message attracted a lot of interest at SIBOS, where we had great meetings with clients and prospects. One of the highlights was our Oktoberfest party co-sponsored with SAP; it was a great success, with more than 260 people attending.

Next year, SIBOS will take place in Toronto, Canada’s financial capital – not far from our corporate headquarters in Waterloo. Who knows what strides the Fintech world will have taken by then?


How the Same Data Analytics Used in Politics Can Help Your Business

Every four years, roughly 230 million Americans are eligible to vote in the U.S. presidential election. That’s a huge audience to influence, no matter how sophisticated your analytics. I mean, which data pool on which channel do you start with? Where do you look first? Just like businesses trying to reach customers, political campaigns can analyze sentiment, social media activity, and feedback from a variety of channels to spot trends, make predictions, and craft persuasive messages. A new OpenTextVoice article on Forbes.com, 3 Things The Presidential Race Can Teach Business About Digital Strategy, explores how the same micro-targeting used in political data analytics can be applied in business. As November draws near, the data crunching will undoubtedly grow more feverish. To get a sense of how often the media is covering each candidate and in what context, check out the free online Election Tracker ’16 from OpenText. To read the full article on Forbes.com, go here.


Power Up Your iHub Projects with Free Interactive Viewer Extensions


We’ve updated and are republishing a series of helpful tips for getting the most out of the Interactive Viewer tool for OpenText™ iHub, with free extensions created by our software engineers. These extensions boost the appearance and functionality of Interactive Viewer, the go-to product for putting personalized iHub content in the hands of all users. (If you don’t already have iHub installed, click here for a free trial.) Below are links to the full series of six blog posts. If you have any suggestions for further extensions or other resources, please let us know through the comments below.

1. Extend Interactive Viewer with Row Highlighting – A simple jQuery script for highlighting the row the user’s pointer is on.
2. Extend Interactive Viewer with a Pop-Up Dialog Box – Add a fully configurable pop-up dialog box to display details that don’t need to appear in every row.
3. Extend iHub Reports and Dashboards with Font Symbols – Dress up your iHub reports with nifty symbols.
4. Extend iHub Dashboards with Disqus Discussion Boards – Encourage conversation the easy way, by embedding a discussion board in the popular Disqus format.
5. Extend iHub Interactive Viewer with Fast Filters – Make column-based filtering easy by using JSAPI to build a Fast Filter, a selectable drop-down menu of distinct values that appears in the header of a column.
6. Extend Interactive Viewer with Table-Wide Search – Filter across multiple columns in an iHub table by creating a table-wide search box.


Your Medical Biography is a Digitized Trail of Big Data


A far cry from the walls of paper files in medical offices, today’s digitized records let patients see lab results, get refill alerts, and generally have more perspective and control over their medical destiny. These new systems are even using natural language processing algorithms to grasp the nuances of language, just like humans do. A new OpenTextVoice article on Forbes.com, Big Data Takes Turn As Storyteller, Pulls Together Health Narratives, shows how the digitization of health records is changing the way our medical histories are recorded. Now we can save and analyze not only structured data like dates and procedure codes, but also unstructured information like doctors’ notes. All of these advancements are changing the medical paradigm to create a partnership between doctor and patient. They even support more sophisticated analysis, so that a treatment plan can be based on the full picture rather than just a snapshot. Get the full story here.


The Real U.S. Election Battlefield is Online

The playing field has narrowed, the debates have begun, and we are slowly reaching the end of the U.S. Presidential race. And what a race it is, not just on the campaign trail but online as well. Twitter, LinkedIn, Facebook, and more are the hub of political commentary. Flooded with voters debating the issues, the politicians, and the latest exploits of Donald Trump and Hillary Clinton, the people are speaking. A new OpenTextVoice article on Forbes.com explores how that data is changing the race. While President Obama made a big political play online, online campaigning has only grown more sophisticated in the past few years. With tools and advanced analysis, campaigns can use the posts to “see how the tweets, mentions and articles praise their candidate or cast them in a negative light.” One thing is for sure: whether you’re a Republican, Democrat, independent, undecided, or Donald J. Trump himself, the real race is online. Check out the full article on Forbes.


Hidden in the Information: Success with Advanced Analytics

In today’s world, information is all around us. And hidden in that information is the key to both understanding your customers and predicting their behavior. Pretty nice, huh? Well, here comes the tricky part: with so much information, how do you find the useful insights? A new OpenTextVoice article on Forbes.com explores how to uncover the information and apply it to your business. How can companies use that data to advance their business processes or turn an industry on its head? As the article points out, “Data is the foundation that allows transformative, digital change to happen.” Analytics is the key to unlocking the potential of data and turning it into something greater. Check out the full article on Forbes to find out how to use advanced analytics to take charge of an industry and launch the next Netflix, Airbnb, or Uber.


With Cognitive Computing, Johnny Five is Alive… or at Least he Will be!

Have you ever wanted your own robot advisor? Well, according to a new OpenTextVoice article on Forbes.com, your very own Lieutenant Commander Data, KITT, or C3PO may be closer than you think. The article explains how unstructured data is filling the void in traditional computer systems to create cognitive computing systems that can mimic human thinking. Cognitive computing combines structured and unstructured data to enable organizations to make better decisions with intelligent systems that go beyond numbers and rows. These systems can “make predictions and recommendations that offer profound, actionable insights into a host of common business challenges.” Add Natural Language Processing to the mix and you get a system that can think, “feel,” and interact much like a human. Check out the full article on Forbes to find out how cognitive computing is bringing the automated trusted advisor to life.


Understanding Customer Feelings: Unstructured Data Analytics Tells all


Complex database queries and focus groups can help a business track customer insight, but there’s another approach. In a new OpenTextVoice article on Forbes.com, Meet The Algorithm That Knows How You Feel, you’ll learn how today’s businesses can analyze unstructured data using text analytics algorithms to actually grasp how customers are feeling. Unstructured data analytics lets businesses sift through documents, text messages, emails, and social media posts to get to where the important insight lives. And now, using OpenText InfoFusion, businesses can manage, analyze, and understand information in ways that are transforming customer experience. As huge volumes of unstructured data come into the enterprise from multiple channels, the only way marketers can gain a competitive advantage is to see trends and preferences before the competition does. This insight not only fuels important business decisions but also changes the way a company relates to its customers. The article offers some history of the technology and also shares real-world applications. Get the full story here.


Turn Your Big Data into Big Value – Attend our Workshops


The ever-growing digitization of business processes and the rise of Big Data mean workplaces are drowning in information. Data analytics and reporting can help you find useful patterns and trends, or predict performance. The problem is that many analytics platforms require expert help in sorting through billions of lines of data. Even full-fledged data scientists spend 50 to 80 percent of their time preparing data, not getting insights from it. Moreover, there’s a built-in time lag if you need to ask your in-house data scientist to run the numbers when you, a line manager or subject expert, need answers right away. That’s why 95% of organizations want end users to be able to manage and prepare their own data, according to respected market researcher Howard Dresner.

Luckily, there’s help. The OpenText™ Analytics Suite combines powerful, richly featured data analysis with self-service convenience. You can upload your own data by the terabyte, then access, blend, and explore it quickly, without coding. The analysis results can then be shared and socialized via the highly scalable, embedded BI and data visualization platform, which lets you design, deploy, and manage interactive reports and dashboards that can be embedded into any app or device. These reports can address a wide range of business questions, from “What are my customers’ spending habits using the credit card from XYZ Bank?” to “Which customers are most likely to respond to our new offer?”

Sure, you may be thinking, this sounds great, but I want to see the solution in action before I decide. Here’s your chance. On Tuesday, Sept. 13, in San Diego, OpenText is offering a free, hands-on, full-day workshop on the OpenText Analytics Suite. This 6-hour session (including breakfast and lunch) provides a guided tour of the various modules within the suite and shows you how to build dynamic, interactive, visually compelling information applications. By the end of the workshop, you’ll understand how to:

Build interactive, visually rich applications from the ground up
Create data visualizations and reports using multiple data sources
Embed these visualizations and reports seamlessly into existing apps
Target profitable customers and markets using predictive analytics

Our visionary speakers offer a day that’s not only informative but entertaining and engaging. Check here for the full schedule of workshops and dates available.


Unlock the Value of Your Supply Chain Through Embedded Analytics


Over the past few months I have posted a couple of blogs relating to the use of analytics in the supply chain. The first one discussed the ‘why’, in terms of the reasons for applying analytics to supply chain operations: Understanding the Basics of Supply Chain Analytics. The second discussed the ‘how’, in terms of the methods of obtaining meaningful insights from B2B transactions flowing between trading partners: Achieve Deeper Supply Chain Intelligence with Trading Grid Analytics. The blogs were written in support of our recently announced OpenText™ Trading Grid Analytics, one of the many Business Network related offerings in Release 16, the most comprehensive set of products released by OpenText to enable companies to build out their digital platforms and enable a better way to work.

Those who have followed my blogs over the years will know that I have worked with many analyst firms to produce white papers and studies, so it was only appropriate that I should be fortunate enough to work with an outside analyst on a thought leadership white paper relating to analytics in the supply chain. I engaged with IDC to write a paper entitled Unlock the Value of Your Supply Chain Through Embedded Analytics. IDC has been producing some interesting content over the years in support of their ‘Third Platform’ model, which embraces IoT, cloud, mobile, and big data, and how companies can leverage these technologies for increased competitive advantage. The aim of our new analytics-related white paper was to discuss the business benefits of embedding analytics into the transaction flows across a business network. Compared to other business intelligence and end-user analytics solutions, OpenText is in a unique position: we own our Business Network, and we are able to introspect the 16 billion EDI transactions flowing across it.

IDC leveraged a relatively new management theory called VUCA, which stands for Volatility, Uncertainty, Complexity, and Ambiguity, to discuss how analytics can bring better insights into business operations. VUCA was originally defined in the military field, and for our paper IDC realigned it for a more connected, information-centric, and synchronized business network: Velocity, Unity, Coordination, and Analysis. I am not going to highlight too much content from the paper, but here is one interesting quote:

“It is the view of IDC that the best supply chains will be those that have the ability to quickly analyze large amounts of disparate data and disseminate business insights to decision makers in real time or close to real time. Businesses that consistently fail to do this will find themselves at an increasing competitive disadvantage and locked into a reactionary cycle of firefighting. Analytics really will be the backbone of the future of the supply chain.”

I am not going to spoil the party by revealing any more from the paper! If you would like to learn more, please register for our webinar: OpenText will be hosting a joint session with IDC on 27th July 2016 at 11am EDT, 5pm CET; details are provided below.
This 40-minute webinar will allow you to:

Understand how embedded analytics can provide deeper supply chain intelligence
Learn how the VUCA management theory can be applied to a supply chain focused analytics environment, and the expected business benefits that can be obtained
Find out why it is important to have trading partners connected to a single business network environment to maximize the benefits of applying analytics to supply chain operations
Learn how OpenText can provide a cloud-based analytics environment to support your supply chain operations

You can register for the webinar here.
