Analytics

3 Questions: TDWI’s Fern Halper on Succeeding with Advanced Analytics

Dr. Fern Halper is director of TDWI (The Data Warehousing Institute) Research for advanced analytics. Well known in the analytics community, Halper has published hundreds of articles, research reports, speeches, and webinars on data mining and information technology over the past 20 years. She focuses on advanced analytics, including predictive analytics, social media analysis, text analytics, cloud computing, and “Big Data” approaches to analytics. Halper is also co-author of several “Dummies” books on cloud computing, the hybrid cloud, and Big Data. OpenText chatted with Halper about the evolution of data-driven and advanced analytics.

OpenText: More companies are embracing embedded analytics. Although the concept is not new, what has changed in this space? Where are we now with the technology?

Halper: Traditionally, the term “embedded analytics” referred to analytics built into an enterprise application, such as a CRM [customer relationship management] or an ERP [enterprise resource planning] system. This might have included reports or visualizations embedded into these applications. I’m seeing both the terminology and the technology changing to include visual analytics as a seamless part of a user interface, as well as actionable analytics in interactive dashboards, automated analytics, analytics that operate in real time to change behavior, and much more. The idea is to operationalize these analytics; in other words, to make the analytics part of the business or operational process. This brings the results closer to the decision makers and their actions.

There are a number of factors driving this trend. First, organizations want right-time/real-time analytics so they can make more timely decisions and become more competitive. They are beginning to see data and analytics as a way to improve business processes, drive operational efficiencies, and grow revenue.
Additionally, as the volume and frequency of data increase, it is often not possible to perform analysis and make decisions manually, so analytic actions need to become more automated.

OpenText: What part do visualizations and dashboards play in the success of an analytics deployment?

Halper: A lot of the embedding is happening in dashboards. In fact, in our most recent TDWI study on embedding analytics, 72% of the organizations that were already embedding analytics were doing it in dashboards. These dashboards are evolving to include more visualizations and to become more interactive. The use of these dashboards – by executives, marketing, sales, finance, and operations – can help to drive an analytics culture. In operations, it can help organizations become more effective and efficient. These dashboards can be used to inform actions, which is a great first step in operationalizing analytics and making it more pervasive.

OpenText: Can you provide examples of how embedded analytics is succeeding today? Where do you see the focus going next?

Halper: We’re seeing analytics embedded in devices, databases, dashboards, systems, and applications. While a lot of the focus is on applications and dashboards, companies are also moving to embed more advanced analytics into databases or into systems. For example, a data scientist might build a retention model. That retention model is then embedded into the database. As new data about a customer comes in, the customer is scored for the probability of defection. That information might be passed to others in the organization. Or the analytics might operate automatically – that is a focus. Examples include a recommendation engine or a preventive maintenance application. This is an evolving category. Customer support and operations are two major use cases. In many instances a main driver is automating analysis that is too complex for people to perform.
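Halper’s retention-model example can be sketched in a few lines. The model coefficients, field names, and threshold below are all invented for illustration; a real embedded model would be trained by a data scientist and deployed inside the database or a scoring service:

```python
import math

# Hypothetical coefficients exported from a trained logistic-regression
# retention model (all values invented for illustration).
MODEL = {"intercept": -1.2, "months_inactive": 0.45, "support_tickets": 0.30}

def defection_probability(customer):
    """Score one incoming customer record, as an embedded model would."""
    z = (MODEL["intercept"]
         + MODEL["months_inactive"] * customer["months_inactive"]
         + MODEL["support_tickets"] * customer["support_tickets"])
    return 1 / (1 + math.exp(-z))  # logistic function -> probability in (0, 1)

def on_new_record(customer, threshold=0.5):
    """The automated part: flag likely defectors as each new record arrives."""
    p = defection_probability(customer)
    return {"customer_id": customer["id"],
            "p_defect": round(p, 3),
            "alert": p >= threshold}

result = on_new_record({"id": 42, "months_inactive": 6, "support_tickets": 2})
```

The point is not the model itself but where it runs: once the scoring function lives next to the data, alerts or recommendations can fire without anyone opening a report.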
It can often involve making small automated decisions over and over again. This is one future path for operationalizing analytics, more commonly referred to as “enterprise decision management.”

In terms of value, in this study I saw that organizations that measured either top- or bottom-line impact were more likely to embed analytics than those that did not, by almost a two-to-one margin. They tended to be a bit more advanced analytically. They were also more likely to take automated action on their analytics, use alerts, and embed models into their systems.

Halper is expected to discuss the findings of her latest best practices report, “Operationalizing and Embedding Analytics for Action.” The webinar—sponsored by OpenText—will present her findings as well as best practices for getting started with operationalizing analytics. Register for the webinar to learn more.

Read More

Data Driven Digest for December 11: Holiday Lights

A holiday light show illuminates Canada's

Christmas and the New Year are approaching, so this week we’re sharing some data visualizations with connections to holiday celebrations. Pour yourself some eggnog (or glögg, or another favorite holiday beverage), put on some seasonal music, and settle in for some great watching. Enjoy!

Shed Some Light on the Issue

December 13 is celebrated as St. Lucia Day in several countries, from Sweden to Italy and even the island of Saint Lucia. As fits a (possibly legendary) Catholic saint whose name derives from the Latin word for light, lux/lucis, this is a celebration of light at a time when the winter solstice is approaching and the days are at their shortest.

Speaking of light: surrounded as we are by inexpensive electric light around the clock, it’s hard to imagine how dependent productivity once was on reliable light sources. Professor Max Roser at Our World in Data, one of our favorite resources, demonstrates how the Industrial Revolution both drove the demand for more light and filled it. His interactive graph, based on the work of economists Roger Fouquet and P.J.G. Pearson, shows how the price of lighting dropped sharply starting around 1750. That’s when new energy sources became available, starting with whale oil and kerosene (for lamps) and cheap beef tallow (for candles). The mid-19th century added arc lights and gas lamps. Then, once electricity became common around 1900, the price of illumination dropped to nearly nothing relative to what it had been in the Middle Ages.

Meanwhile, as lighting became cheaper, cleaner, and more convenient, everyone took advantage of it. Cities began putting up lamps to make streets safer. Factory owners added night shifts. Students, housewives, shoppers, entertainment-seekers – everyone felt liberated by the electric bulb.

This Little Light of Mine…

And of course, that leads us to Christmas lights.
In many countries, they’re a source of neighborhood, city, or even national pride (as shown by the Wonders of Winter show every year in our headquarters city of Waterloo, Canada, and the sound-and-light shows on Parliament Hill in Ottawa). Despite the huge cost advantage of electricity over earlier light sources, incandescent bulbs are not very energy-efficient, so Christmas lights can still cause a sizable bump in many household budgets (about $20-50 extra, depending on the price of a kilowatt-hour in your area and how intensely you decorate).

But in recent years, innovations in bulbs, especially small LEDs, have cut their energy demands considerably. The Western Area Power Administration (WAPA) reported in 2010 that a string of LED C7 bulbs (the thumb-sized ones used mostly outdoors) would cost only 23 cents to run during the entire holiday season, compared to $7.03 for conventional incandescent bulbs. (Miniature bulbs are cheaper than C7s even if you don’t switch to LEDs. The State of California estimates that a string of indoor lights, running a total of 300 hours a month, would cost $1.38 to operate if it’s made up of miniature bulbs, vs. $4.31 for C7 bulbs – a roughly 3-fold difference.)

We’re Burning Daylight

Want to take the best photos of your holiday light display? Professional photographer Jay P. Morgan has great tips on his blog, The Slanted Lens. For a pleasant, soft glow, shoot your photos just as the sun is going down. That way, there’s still some light in the sky to help illuminate your house, yard, and so forth, Morgan explains. If you wait until full darkness, the contrast between the lights and the rest of the image is too stark; details on the house won’t “pop,” and the house won’t show up well against the sky.

The data-visualization angle? His handy chart shows how the ideal moment for Christmas-card photography comes when the fading daylight drops to the same brightness level as your lights.
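The seasonal-cost figures earlier in this post all come from the same arithmetic: watts, divided by 1,000 to get kilowatts, times hours of use, times the utility rate. A quick sketch (the wattages and the $0.15/kWh rate below are illustrative assumptions, not figures from WAPA or the State of California):

```python
def season_cost(watts, hours, rate_per_kwh):
    """Operating cost = energy used (kWh) times the price per kWh."""
    return watts / 1000 * hours * rate_per_kwh

# Assumed figures: a 5 W LED string vs. a 175 W incandescent C7 string,
# each run 300 hours over the season at $0.15 per kilowatt-hour.
led = season_cost(5, 300, 0.15)             # about $0.23
incandescent = season_cost(175, 300, 0.15)  # about $7.88
```

Plug in your own string’s wattage (printed on the box) and your utility’s rate to see what your display actually costs.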
(He also illustrates how the color temperature drops along with the light level.)

When the Lights Go Down on the City

While we’re on the topic of light, let’s consider how much can be gleaned from high-altitude pictures of the Earth after dark. Images taken by NASA satellites show interesting correlations with human activities. The NASA scientists who compiled the satellite images into this impressive display, shared through the Visible Earth project, note:

The brightest areas of the Earth are the most urbanized, but not necessarily the most populated. (Compare Western Europe with China and India.) Cities tend to grow along coastlines and transportation networks. … The United States interstate highway system appears as a lattice connecting the brighter dots of city centers. In Russia, the Trans-Siberian railroad is a thin line stretching from Moscow through the center of Asia to Vladivostok. … Even more than 100 years after the invention of the electric light, some regions remain thinly populated and unlit. The interior jungles of Africa and South America are mostly dark, but lights are beginning to appear there. Deserts in Africa, Arabia, Australia, Mongolia, and the United States are poorly lit as well (except along the coast), along with the boreal forests of Canada and Russia, and the great mountains of the Himalaya.

And Roser, doing his own analysis of the Visible Earth images, points out that the level of lighting often marks a sharp political and economic divide, such as between North and South Korea. Prosperous South Korea glows after dark, especially around the capital, Seoul. But its northern counterpart, kept poor by decades of Communist dictatorship, is nearly invisible at night.

Meanwhile, we’re hoping to shed some light on a topic dear to our heart – analytics. On Jan. 12, 2016, we’re hosting a webinar featuring TDWI Research Director Fern Halper, who will talk about Operationalizing and Embedding Analytics for Action.
As Halper points out, what good is having analytic capacity in your business processes if nobody uses it? Analytics needs to be embedded into your systems so it can provide answers right where and when they’re needed. Uses include support for logistics, asset management, customer call centers, and recommendation engines – to name just a few. We hope you’ll dial in – we promise you’ll learn something!

We share our favorite data-driven observations and visualizations every week here. What topics would you like to read about? Please leave suggestions and questions in the comment area below.

Recent Data Driven Digests:
December 4: 2015 winners of the Kantar Information Is Beautiful Awards
November 27: Mapping music in color
November 20: Popular diets, parole risk assessment, hot startup universities

Read More

Data Driven Digest for December 4: Data Is Beautiful

The end of the year is approaching, which means that for many of us, it’s time to take stock: “Biggest News Stories of 2015,” “10 Beloved Celebrities We Lost This Year,” and the like. In our line of work—analytics—it’s a good opportunity to survey some of the best data visualization examples of the past 12 months.

So this week we’re sharing with you some of the 2015 winners of the Kantar Information Is Beautiful Awards, an annual contest organized by independent data journalist David McCandless (@infobeautiful), author of “Knowledge Is Beautiful” and “Information Is Beautiful.” Winners were selected by a public vote from an impressive long list of candidates. Their topics ranged from weather to politics to the growth of an unborn baby during pregnancy. Find yourself a comfortable seat and settle in for browsing—you’ll find a lot of great things to look at. Enjoy!

Red vs. Blue

If you live in the United States and feel, judging by the tone of political coverage, that politics has gotten more ruthlessly partisan in recent years, you’re not wrong. When political scientists crunched the numbers on the voting behavior of U.S. Representatives since the end of World War II, they found that across-the-aisle cooperation between members of different parties has dropped steadily over the last 40 years. The reasons for this increasing polarization are complex: Congressional districts designed to favor one party or the other, an increasingly mobile population more likely to elect candidates by party rather than by their stance on purely local issues, big money, politicians flying home on weekends to be with constituents rather than staying in Washington, D.C. to build relationships, and more.

The authors of the underlying paper documented their findings in a table of statistics. That’s fine, but what really sells this story is the visualization drawn up by Mauro Martino, an Italian designer who leads the Cognitive Visualization Lab for IBM’s Watson group.
His network diagrams show how, year by year, the blue dots representing Democrats and the red ones representing Republicans pull further apart. The images are hauntingly beautiful – they could be galaxies colliding or cells dividing. Yet they tell a story of increasing division, bitterness, and gridlock.

Hello world / Bonjour, le monde! / Nǐ hǎo, shìjiè!

Silver medalist in the Information Is Beautiful Awards’ data visualization category is “A World of Languages” by Alberto Lucas López (@aLucasLopez), graphics director at the South China Morning Post in Hong Kong. López’s diagram ingeniously carves up a circle representing the Earth into sectors for each of the major languages, and within them the countries where each language is spoken. He supplements his eye-catching image with smaller charts showing language distributions by country (curiously, Papua New Guinea is far ahead of the rest, with 839 separate languages spoken) and the most popular languages to learn. (No surprise to anybody that English far outstrips the rest, with 1.5 billion people learning it – nor that Chinese and two former colonial languages still spoken in many countries, French and Spanish, are next. But it says something about cultural influence, from Goethe to manga, that German, Italian, and Japanese are the runners-up.)

Working for a Living

Job recruiters and economists are keenly aware of rises and falls in national unemployment rates and of which industries or sectors are growing, but most businesses can derive value from this information too. The Wall Street Journal team of graphics editor Renee Lightner (@lightnerrenee) and data journalist Andrew Van Dam (@andrewvandam) created an interactive dashboard of unemployment and job gain/loss statistics in the U.S. that conveys an amazing amount of data from the past 10 years (and going back to 1950 in some views).
This job data tracker only tied for a bronze medal in the category of Interactive Data Visualization – which says something about the level of competition in this field. Because of the tracker’s thoughtful design, it can answer a wide range of questions. How about: What sectors of the economy have shown the most sustained growth? Health care and social services, followed by professional services and then restaurants – probably a sign that the economic recovery means people have more to spend on a dinner out. Most sustained losses? Manufacturing, government, and construction – even though construction also shows up on the dot-matrix chart as having frequent peaks. (Construction jobs went up 0.83% in February 2006 – something we now recognize as a symptom of the housing bubble.) Meanwhile, “Unemployment rate, overall” vividly charts the recessions of the past 60 years in a colorful mosaic that could easily be featured in an upscale décor magazine.

This is just a small sampling of the ingenious ways some of our best thinkers and designers have come up with to analyze and display data patterns. We share our favorites every Friday here. If you have a favorite you’d like to share, please comment below.

Recent Data Driven Digests:
November 27: Mapping music in color
November 20: Popular diets, parole risk assessment, hot startup universities
November 13: Working parents, NBA team chemistry, scoring different colleges

Read More

Data Driven Digest for November 27: Data Mapping Music

Étude Chromatique

One of the biggest motivations behind creating graphs, charts, dashboards, and other visualizations is that they make patterns in data easier to perceive. We humans are visually oriented creatures who can intuitively note patterns and rhythms, or spot a detail out of place, through imagery long before we can detect them in written reports or spreadsheets. Or sheet music, for that matter. This week, we present some examples of how to display music visually, which may get you thinking of other creative ways to visualize data and bring patterns to the surface. Enjoy!

If you’ve had any experience reading music, you may be able to tell some things about a piece by looking at its written score. For example, you could probably tell that this piece (an excerpt from Arvo Pärt’s “Spiegel im Spiegel”) is of a gentler, less agitated nature than this one (the introduction to “Why Do the Nations So Furiously Rage,” from Handel’s “Messiah,” which you might be hearing this holiday season). In fact, Handel and his contemporaries expected listeners to read along in the printed score and appreciate the “word-painting” with which they illustrated the text or mood of the music. The practice of word-painting has become less common as fewer and fewer people learn to read sheet music. But some composers have found other ways to illustrate their music.

The avant-garde composer Ivan Wyschnegradsky created “chromatic” music in both senses of the word. He used not only every note in the 12-note tuning system of classical Western music (where adjoining notes on a piano keyboard are a half-step apart, like A, B-flat, and B natural – what is called a chromatic scale), but also notes “between the cracks.” These “ultra-chromatic” pieces required special keyboards that could play two or three different notes between the keys of a regular piano.
It’s hard for people who don’t have perfect pitch to hear the difference between these so-called “quarter-tones,” but they lend a subtle eeriness to his music. (Here’s an example: “24 Preludes in Quarter-Tone System.”)

Then Wyschnegradsky turned to a familiar data-visualization technique: color. He started representing his music in rainbow-hued wheels, like this (picture via the Association Ivan Wyschnegradsky, Paris). Ever since childhood, he had been fascinated by rainbows. As an adult, he noted the parallels between the 12 colors of the chromatic spectrum (red, orange, yellow, green, blue, and purple, plus the intermediate hues of red-orange, orange-yellow, and so forth) and the 12 chromatic notes in classical music. And just as colors can shade into one another subtly (is this lipstick reddish-orange, or an orangey-red?), so can musical notes (think of a slide whistle or trombone). The parallels were too good to pass up. So Wyschnegradsky assigned a color on the spectrum to each of the dozens of quarter-tones in his musical system, then plotted his melodies in circles like a spiderweb or radar chart.

As Slate blogger Greta Weber wrote: Each cell on these drawings corresponds to a different semitone in his complex musical sequences. If you look closely enough, you can follow the spirals as if they were a melody and “listen” to the scores they represent.

Wyschnegradsky’s color-wheel scheme never caught on. But the patterns it brings to light have parallels in popular visualization systems, from traffic delays to weather. It’s clear that color helps illuminate.

Like what you see? Every Friday we share great data visualizations and embedded analytics. If you have a favorite or trending example, please comment below.

Recent Data Driven Digests:
November 20: Popular diets, parole risk assessment, hot startup universities
November 13: Working parents, NBA team chemistry, scoring different colleges

Read More

Data Driven Digest for November 20: The Last Mile of Big Data

In a recent study by the Digital Clarity Group, Thinking Small: Bringing the Power of Big Data to the Masses, our very own Allen Bonde (@abonde) – writing before he joined OpenText – noted that the best opinions are formed and actions are taken within the “Last Mile” of Big Data. By Last Mile, Allen means the most immediate information and data that is accessed or consumed. For application designers, meeting the Last Mile challenge requires understanding self-service use cases and leveraging tools that turn Big Data into “small data” that helps people perform specific tasks. Some of these use cases include insight, monitoring, and targeting. This week, we provide some examples of visualizations that crunch their fair share of Big Data on the back end but present it in a way that meets the Last Mile challenge. Enjoy.

Popular Diets

With the holidays coming up, we thought we’d look at dieting trends over the last few years. Health and science reporter Julia Belluz (@juliaoftoronto) assembled a review of the most-searched diets by year and metropolitan area, based on Google search data. Reaching back to 2005, the series of visualizations lets the viewer watch the slow yet steady spread of first the gluten-free and now the Paleolithic diets that command the news cycles and self-help bookshelves. Other diets covered include vegan eating, the low-carb diet, the South Beach Diet, and the Atkins Diet.

Parole Risk Assessment

While we leave the subject of incarceration to the experts, this interactive visualization from our friends over at FiveThirtyEight (@FiveThirtyEight) caught our eye. A recent trend in criminal sentencing is the use of predictive analytics and risk assessments to estimate how likely a prisoner is to commit the same crime in the future. Scores are determined by factors such as gender, county, age, current offense, number of prior arrests, and whether multiple charges were filed.
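In their simplest form, these instruments are additive questionnaires: points per factor, summed, then mapped to a risk band. A toy sketch of that pattern; every weight and cutoff here is invented for illustration, not taken from any real instrument:

```python
# Toy additive risk questionnaire. All weights and cutoffs are invented;
# real tools use validated, statistically calibrated items.
WEIGHTS = {"per_prior_arrest": 2, "age_under_25": 3, "multiple_charges": 1}

def risk_score(offender):
    """Sum points for each factor present in the offender's record."""
    score = WEIGHTS["per_prior_arrest"] * offender.get("prior_arrests", 0)
    if offender.get("age", 99) < 25:
        score += WEIGHTS["age_under_25"]
    if offender.get("multiple_charges"):
        score += WEIGHTS["multiple_charges"]
    return score

def risk_band(score, low=4, high=8):
    """Map a raw score to the low/medium/high bands used in sentencing."""
    if score <= low:
        return "low"
    return "high" if score >= high else "medium"

band = risk_band(risk_score({"prior_arrests": 1, "age": 22}))  # 5 points -> "medium"
```

The controversy, of course, is not the arithmetic but the choice and weighting of the factors, which is exactly what the FiveThirtyEight visualization lets readers explore.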
The authors of the FiveThirtyEight study point out that more than 60 risk assessment tools are being used across the U.S. Although they vary widely, “in their simplest form, they are questionnaires — typically filled out by a jail staff member, probation officer or psychologist — that assign points to offenders based on anything from demographic factors to family background to criminal history. The resulting scores are based on statistical probabilities derived from previous offenders’ behavior. A low score designates an offender as ‘low risk’ and could result in lower bail, less prison time or less restrictive probation or parole terms; a high score can lead to tougher sentences or tighter monitoring.”

The simulation is loosely based on the Ohio Risk Assessment System’s Re-Entry Tool, which is intended to assess the probability of prisoners reoffending after they are released. The visualization was produced in collaboration with The Marshall Project (@MarshallProj), a nonprofit news organization that covers the criminal justice system. Considering that the U.S. Attorney General’s office has endorsed the idea of risk assessment, it’s likely that visualizations like this will be used in the future to manage criminal sentencing.

School for Startups

It’s not who you know, but where you go to college, that could determine the success of your startup, according to our last visualization. Our friends over at DataBucket built a series of visualizations using the Crunchbase API to compare the 5,000 most-funded startups of the past 15 years and the education of each of their founders. They found that success has a pattern: graduating from a university that is prestigious, on the West Coast, focused on engineering, and/or home to a high-powered MBA program increases your chances of attracting savvy co-founders and benefactors with deep pockets.
“In terms of the average amount of funding graduates from each school get, Harvard, MIT, and Stanford get a standard amount of funding. Indian Institute of Technology has a disproportionately high average funding as well as a large number of founders,” the DataBucket authors comment. “Hangzhou Normal University and Zhejiang University of Technology are off the charts for average funding received. This is attributed completely to Jack Ma and Eddie Wu, [the] founders of Alibaba.”

Like what you see? Every Friday we share great data visualizations and embedded analytics. If you have a favorite or trending example, please comment below.

Recent Data Driven Digests:
November 13: Working parents, NBA team chemistry, scoring different colleges

Read More

3 Questions: James Taylor Surveys the Analytics Landscape

More and more, organizations turn to analytic tools to drive decision-making. But are business leaders using the right tools to make those choices efficient and effective? Despite the progress in advanced analytics tools, business leaders often favor antiquated methods for analyzing data. Or worse, they choose instinct over facts. We chatted with James Taylor (@jamet123), CEO at Decision Management Solutions, and asked how businesses today should be thinking about data analytics. His take? Advanced analytics software, like OpenText Analytics Suite, must provide analytic capabilities that align with business needs. The best analytics tools are geared toward the intended user and support decision-making, both human and automated.

OpenText: Let’s start with the Business Intelligence landscape. What’s changing? What should companies be looking for in the new powers of analytics?

James Taylor: The critical change we identified in the research, and we see this with our clients also, is the move from reporting and monitoring to deciding and acting. Where once companies invested in Business Intelligence simply to give themselves an ability to report on their data and perhaps track some key metrics, now they see the opportunity to use analytics to improve business decision-making. This in turn is broadening and deepening the capabilities they need. Because a typical organization makes a wide range of decisions, it needs a broad set of analytic capabilities; and because the focus is decision-making, it needs deeper, more capable analytics. Once organizations focus on analytics for decision-making, they will find they want to expand their footprint into more advanced kinds of analytic technologies. [But this will require them to integrate these] new capabilities and skills with their existing ones, creating a rich set of analytic capabilities that can be applied to meet their decision-making needs.

OpenText: What problems do modern analytics address?
Are companies still forecasting and analyzing their customers with Excel pivot tables and gut instinct?

James Taylor: Yes, sadly, some companies are still using Excel pivot tables and what they think of as gut instinct. It’s important to remember, though, that gut instinct is really just what we have synthesized from the data we have seen before – without rigor, documentation, or real analysis, but data nonetheless. Modern analytics allows organizations to bring all their decision-making to a more rational place, applying more robust and more formal analysis of a broader array of data to improve their decisions. Once an organization focuses analytics on decision-making, there is almost no limit to the problems it can address. Especially with the growth in external data available from APIs and cloud-based analytic platforms, the sky’s the limit.

OpenText: Where should organizations put their energy when it comes to analytics?

James Taylor: Organizations should make the switch to focusing analytics on decisions, not reporting or metrics, as fast as they can. This means getting more serious about understanding the decision-making that drives their business, especially the front-line and operational decisions that tend to be neglected. They need to think about a broad array of analytic capabilities, from self-service to highly embedded solutions, that support all the different roles engaged in decision-making. Tactically, they should focus on the problems they need to solve and let those drive the analytic capabilities they need, while putting those capabilities into an overall enterprise framework for long-term strategic success.

Decision Management Solutions’ 2015 assessment of the Analytics Landscape is currently available for download. Visit this page to download the free report and view the infographic.

Read More

Moving at the Speed of Cloud: Innovation, Insight, and Integration in the OpenText Cloud

As customers pursue digital transformation, they are increasingly looking to the cloud to provide agile business options and improved productivity while maintaining control of their critical information assets. At OpenText, we provide cloud and hybrid cloud options with seamless movement between on-premises and cloud. Here is a look at some of the recent and emerging cloud innovations:

Innovation

Our growth in SaaS apps is significant. Existing SaaS applications have advanced: OpenText Core gains many new features, including Outlook® integration, AD Sync, and Content Suite integration, while Archive Center adds Exchange Online support, file system and email handling, and CMIS and custom file support. There are also new SaaS offerings, from PowerDocs, which drives customized document creation, to iHub, which brings easily purchased and deployed analytics for any data source.

There are new, simple-to-buy, easy-to-deploy standard Managed Cloud Services (MCS) packages. MCS offerings provide complete flexibility for custom configurations and choices within cloud and hybrid cloud implementations. Now there are also simple standard offerings for Content Suite Cloud Edition, Media Management Cloud Edition, and Big Data Analytics in the cloud. These standard packages have a range of options for service level, functionality, and capacity growth. They are available today with currently shipping products and will also be part of the Cloud 16 release.

Industry solutions and feature sets have been added in the cloud. Healthcare Direct for RightFax, which will soon be available with Fax2Mail, provides the ability to translate fax transmissions into direct messages so customers can comply with recent US healthcare legislation without changing their business processes. ROSMA addresses Procurement Performance Management, and Core has added features requested by legal and professional services organizations.
Insight

As we amass greater and greater volumes of information, both structured and unstructured, gaining a true understanding of what the information shows becomes increasingly challenging. That is where analytics comes in. Big Data Analytics in the cloud and iHub provide analytics services and allow customers to create their own analytics reporting without needing a data scientist on staff. Analytics has been embedded in and/or integrated with all of the EIM suites to provide that insight for customers. The Trading Grid will provide business analytics and logistics Track and Trace for supply chain shipment visibility. Analysis of content and processes is provided through integration with Content Suite and Process Suite. Analytics as a Service in the cloud allows customers to “bring their own data” and very quickly begin to analyze it and gain insights that can drive their business.

Integration

We believe it is a hybrid world, and customers need the flexibility to integrate their cloud systems with on-premises systems. We provide complete flexibility to integrate with any system on-premises, in the OpenText Cloud, and with other clouds. Many of our Managed Cloud Services customers choose hybrid implementations. Recent innovations in this area include Archive Center CE with SAP® integration and integrations with SAP HANA and S/4HANA. There is also a new integration that brings the power of ECM together with Salesforce®. In the OpenText Cloud, customers can have integrations across EIM suites and with their other systems, whether on-premises, in other clouds, or operated and fully managed in our cloud. We see an increasing number of customers taking advantage of this option.

Cloud 16

While some of this innovation is available today, most of it comes together in the Cloud 16 release, which consolidates suites and applications in five major areas. This release will be followed by quarterly releases that bring innovations to customers on a more frequent basis.
OpenText Content Cloud 16 provides information governance options that are quickly and easily deployed and fully managed in cloud and hybrid cloud scenarios. OpenText Experience Cloud 16 empowers businesses to increase user engagement and improve customer satisfaction while avoiding time spent managing applications or infrastructure. OpenText Process Cloud 16 enables businesses to rapidly automate their business processes and have the platform managed by EIM specialists in the OpenText Cloud. OpenText Analytics Cloud 16 provides embedded analytics for EIM applications and for custom content sources, fully managed by EIM experts in the OpenText Cloud. OpenText Business Network 16 provides a B2B integration platform for managing transactions such as EDI and On Demand messaging, with platform services and Managed Services options. Customers can try beta versions of Cloud 16 now; it will be released in March 2016, followed by ongoing quarterly releases of new innovations. We are moving at the speed of cloud to bring innovative solutions to business challenges.

Read More

The Backstage Pass to OpenText Suite 16 & OpenText Cloud 16

We just shared some huge news with our attendees here at Enterprise World in Las Vegas: we’ve presented an exclusive future roadmap unveiling OpenText Suite 16 and OpenText Cloud 16, the next generation of our EIM offerings, which will become available in March 2016. We shared a lot of details about Suite 16 and Cloud 16 in our press release (you can read it here), but we couldn’t fit in all the thinking that went into this release.

Strategic principles of OpenText Suite 16 & Cloud 16

From what we’ve experienced and from what our customers tell us they want and need, we came up with a dream list of innovations for an enterprise to make the most of digital disruption. We focused on that list of strategic principles across our entire Suite 16 and Cloud 16 offerings. Below is what was going through our heads:

1. Deepen functionality across all suites. This is a major release, full of new features and innovations. Keeping in mind our customers’ suggestions, we significantly deepened functionality throughout the offerings. Practically every product line is raising the bar in terms of customer value proposition and competitive differentiation. Check out the press release for some of the highlights, and watch for more details as we launch the new suites and cloud offerings in March.

2. Help customers transition to the cloud. Right off the top, we wanted to be sure to offer all of our suites in the cloud, available as a subscription or as managed cloud services. That was a must-do, but then we took it further and added integration with other cloud products such as Salesforce® and Office 365®. We also added new Software-as-a-Service (SaaS) applications, such as OpenText Core, PowerDocs, and Archive Center, to enable as much flexibility and savings as possible—customers want what they need, and no one wants to pay for more than they need.

3. Focus on user experience and remove barriers to user adoption.
Enterprise productivity begins by removing the barriers to adoption—the easier it is to use, the more they use it—so we invested heavily in improving the user experience across all the suites (in the browser as well as on mobile devices). We put a big focus on an HTML5-based responsive experience, and customers who got to preview the new UI of products such as Content Suite or Media Management this week are raving about it. The improved user experience is one of the most noticeable innovations in Suite 16 and Cloud 16. In fact, this alone makes upgrading worthwhile.

4. Integrate with more enterprise applications and across the suites (with an extra focus on analytics). We’ve added integration with more enterprise apps, like Salesforce and SuccessFactors®, in addition to SAP®, Oracle®, and Microsoft®. And there is major integration between the suites. For example, OpenText Process Suite fully integrates with OpenText Content Suite, Archive Center, Media Management, Capture Center, and CCM—and speaking of CCM, that integrates with OpenText WEM, DAM, CS, Notifications…it goes on and on across all the suites, enabling us to solve customer problems no other vendor can solve. We also put an extra focus on analytics, which is itself a new suite, and it’s embedded into all the other suites, which brings out even more value from existing deployments.

5. Deliver information flows as a way to solve complex problems. All the products in the world are not going to solve real business problems if they’re not integrated in a way that follows the logical flow of information through the business processes and applications. That’s why we focus on the core information flow, from information-centric flows such as capture-to-disposition, create-to-consume, and incident-to-resolution all the way to business flows such as procure-to-pay.
For more efficient information flows, we’ve added automation to the procure-to-pay (P2P) business process and a new entity-modeling layer in the Process Suite platform, and we’ve extended process-centric collaboration and information sharing.

6. Provide more value from existing deployments. When you can get more value out of existing deployments, you reduce the total cost of ownership. New capabilities, pervasive use of analytics, a new UI, a focus on cost of ownership, cloud delivery, and subscription-based pricing bring more flexibility and value to the enterprise.

Each of these strategic principles makes upgrading to Suite 16 and Cloud 16 worthwhile. This is a milestone release that existing customers will love and want to upgrade to. Read more about OpenText Suite 16 and OpenText Cloud 16 here, and let me know what you think.

Read More

Data Driven Digest for November 13: Clear Visual Data

Often in data science, visualizations are criticized for having way too much information crammed into an overly busy design. We here at Data Driven Digest could not agree more. Making your data relevant in a visual way allows you to communicate clearly the story or feeling you wish to express. Like other forms of art, data visualization requires an easy-to-understand framework that catches your attention and inspires more thinking. “Music is powered by ideas. If you don’t have clarity of ideas, you’re just communicating sheer sound,” famed cellist Yo-Yo Ma once said. For this week, we provide some examples of how complex data can be displayed in an easy-to-understand fashion.

Layers of Working Parents: The number of U.S. homes in which both parents work full time has increased to 46 percent, up from 31 percent in 1970, according to the graphic above from a new Pew Research Center survey. The survey, conducted between September 15 and October 13 this year, illustrates some of the challenges in balancing work and family. Pew researchers interviewed more than 1,800 parents with children younger than 18 and cross-compared these results with its population surveys and other microdata. What is significant about this report is that it compresses a lot of information about changing family structure into a simple, clear format: a layered bar graph labeled directly on the chart instead of relying on a separate key with notes. The visualization is comprehensive, yet incomplete. The survey did not include data prior to 1970, because Pew only began asking about working couples once the women’s movement began taking hold. Additionally, same-sex parents were not included in the findings, as Pew researchers wanted to show a relevant comparison between couples then and now. Despite the very general, high-level overview, you can see significant dips in employment, such as the fallout of the 2008 subprime mortgage crisis.
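Direct labeling is the detail worth copying. As a rough sketch using the two shares cited above (the “2015” year label is our assumption, and the plotting helper needs the third-party matplotlib package, so it is defined but not invoked here):

```python
# Share of two-parent U.S. homes where both parents work full time (from the Pew figures above).
SHARES = {"1970": 31, "2015": 46}

def direct_labels(shares):
    """Build labels to draw on the bars themselves, instead of a separate key."""
    return [f"{year}: {pct}%" for year, pct in shares.items()]

def plot(shares, out="working_parents.png"):
    """Render the bar chart; requires matplotlib (third-party), so call it yourself."""
    import matplotlib.pyplot as plt
    years = list(shares)
    fig, ax = plt.subplots()
    bars = ax.barh(years, [shares[y] for y in years])
    ax.bar_label(bars, labels=direct_labels(shares))  # label on the bars, no legend
    ax.set_xlabel("% of two-parent homes, both parents full time")
    fig.savefig(out)
```

Putting the label on the bar removes one eye movement per data point, which is exactly why the Pew chart reads so cleanly.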
Plotting NBA Team Success: Scatter plot visualizations are often used to show correlations between two variables that aren’t tied to a linear time sequence, and they are great for identifying trends. While business users might want a lot of points on the X-Y graph, that can be overwhelming for the eyes. Adding filters is the best way to overcome this obstacle. Our friends over at Data Buckets are fans of U.S. professional basketball, an area benefiting immensely from data analysis. They’ve created a scatter plot based on the premise that great team chemistry and player retention are sure-fire ways for general managers to establish winning seasons year after year (i.e., the dynasty approach). “Plotting retention against chemistry allows us to classify NBA franchises into four categories as shown in the table above,” Data Buckets wrote. “Teams with low retention but above-average performance are indications of a newly formed core team, as newly acquired players have figured out how to work together in a short period of time. General managers with a high retention and above-average performance team have found a core group of players that work well together. In these two instances, GMs should not break up their rosters.” Filtering for certain teams shows that the Los Angeles Lakers know how to build great teams with good chemistry in most years, while the New York Knicks do not. The scatter plot predicts the Golden State Warriors, Atlanta Hawks, and Memphis Grizzlies should perform exceedingly well this season.

College Scorecard: Sometimes, the best visualizations come in the form of a list or scorecard. It’s as simple as that. The visualization presented above is derived from the U.S. Department of Education’s College Scorecard database and Bureau of Economic Analysis data, using a methodology developed by Jonathan Rothwell (@jtrothwell), a fellow at the Brookings Institution’s Metropolitan Policy Program.
Rankings for 3,173 colleges (1,507 two-year colleges and 1,666 four-year colleges) are scored against variables like curriculum value, STEM orientation, graduation rates, and faculty salaries. As Jonathan notes: “Value-added measures attempt to isolate the contribution of the college to student outcomes, as distinct from what one might predict based on student characteristics or the level of degree offered. It is not a measure of return on investment, but rather a way to compare colleges on a more equal footing, by adjusting for the relative advantages or disadvantages faced by diverse students pursuing different levels of study across different local economies.” The list’s simple design allows the user to filter and compile information so that parents and students can predict the long-term value of their college choices.
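Returning to the Data Buckets scatter plot for a moment: its four quadrants amount to a tiny classifier. A sketch under assumed 0–1 scales and made-up average thresholds; the quote only names the two above-average-performance quadrants, so the other two get placeholder labels:

```python
def classify(retention, performance, avg_retention=0.5, avg_performance=0.5):
    """Map a team's (retention, performance) point to a Data Buckets-style quadrant."""
    if performance >= avg_performance:
        if retention >= avg_retention:
            return "core group found"      # high retention, above-average performance
        return "newly formed core team"    # low retention, above-average performance
    # The article doesn't name the below-average quadrants; placeholders:
    return "stable but struggling" if retention >= avg_retention else "rebuilding"

def filter_teams(points, names):
    """Filters keep a busy scatter plot readable: show only the selected teams."""
    return {team: xy for team, xy in points.items() if team in names}
```

Filtering is what turns an overwhelming all-teams cloud into the Lakers-vs-Knicks comparison the article describes.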

Read More

What is the OT 9000? Find Out at Enterprise World

At OpenText’s US headquarters, there is a secret project underway that we expect will pave the way for the adoption of “Analytics Everywhere.” Lucky you if you attend this month’s Enterprise World in Las Vegas; you will get an advance look at the future of analytics. The project started earlier this year when the OpenText Analytics team was looking for a new way to design, deploy, and display IoT data. With the sheer number of people coming to Las Vegas for the Enterprise World show, we felt that the audience could provide us with real-time data, fed into a mobile data visualization app that we could run in a remote setting. The project was conceived and designed by the OpenText Analytics Innovation team (Kristopher Clark, Mark Gamble, Dan Melcher, Jesse Freeman, Clement Wong, Trevor Huston, and Brian Combs). These guys spent many a night tinkering with the design and the output. What they came up with are small black boxes—dubbed the OT9000—situated at strategic locations around the OpenText US corporate office. What does it do? Why does the orb glow red? That’s the secret. But what we can tell you is that the whole project uses a lot of new-age tech. The boxes were designed using a 3D printer. Each box contains a Raspberry Pi controller with a Wi-Fi radio. The software stack includes an MQTT broker and a MySQL database. The devices send information to OpenText Information Hub (iHub), which can be viewed on a mobile device or in an embedded dashboard. But perhaps we’ve said too much already. Here’s a little teaser for you that shows the technology in action.
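Given the components listed above (Raspberry Pi, MQTT broker, MySQL, iHub), a device like the OT9000 might publish its readings roughly like this. To be clear, this is our guess at the shape of the pipeline, not the team’s actual code: the topic name, broker host, and payload fields are all invented, and actually publishing requires the third-party paho-mqtt package:

```python
import json
import time

def make_reading(device_id, value):
    """Serialize one sensor reading as JSON (the field names are assumptions)."""
    return json.dumps({"device": device_id, "value": value, "ts": int(time.time())})

def publish_reading(payload, host="broker.example.com", topic="ot9000/telemetry"):
    """Send a reading to the MQTT broker; a subscriber on the topic would feed iHub/MySQL."""
    import paho.mqtt.client as mqtt  # third-party; only needed to actually publish
    client = mqtt.Client()
    client.connect(host, 1883)
    client.publish(topic, payload)
    client.disconnect()
```

MQTT’s publish/subscribe model is a natural fit here: many small battery- or Pi-class devices push tiny messages to one broker, and the dashboard side subscribes without ever knowing about individual boxes.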
To find out the answers to more of these questions, you’ll have to attend Enterprise World in Las Vegas and look for the sessions on the “Future of Embedded Analytics” and “Big and Small Data.” For more OpenText Analytics content, you’ll also want to check out the Data Driven Apps customer panel moderated by Forrester Principal Analyst Boris Evelson on Wednesday; a presentation on OpenText Analytics and Digital Asset Management on Thursday; and Reaching All Consumers through Accessibility, also on Thursday. Register today!

Read More

Session Spotlight: Analyze This

Recently, we announced OpenText Big Data Analytics, which lets users access, blend, explore, and analyze Big Data without depending on IT or data experts. To celebrate, we’ve compiled a list of must-attend Analytics sessions at Enterprise World 2015. Don’t miss out!

Read More

All Digital All the Time—Why No Enterprise is Safe from Disruption and Some will Not Survive

We hear a lot about digital disruption, the new mantra of the coming apocalypse. Yet disruption doesn’t have to be cataclysmic. In fact, many organizations are embracing it as a new way of working, and a growing number of Chief Digital Officers are making it their business to lead their enterprises safely along uncharted paths. Almost 200 sessions at Enterprise World 2015 will help attendees navigate digital disruption in their sectors. We specialize in enabling the digital world. Here’s a preview from our team of Industry Strategists.

Manufacturing

Today’s manufacturing industry continues to globalize operations, reduce costs, and embrace complex regional compliance regulations. Combine that with disruptive digital technologies such as 3D printing, wearable devices, and the Internet of Things, and today’s manufacturing CIOs have an immense digital challenge to overcome. CIOs may feel as though they are gambling away their IT budgets on digital projects that provide minimal returns. At Enterprise World 2015, we will be showcasing new cloud, analytics, and information management solutions. The ability for manufacturers to choose how they manage an enterprise information management environment, in either a cloud or hybrid approach, gives them the opportunity to scale their IT infrastructure according to the requirements of their business. Increased use of Big Data analytics can provide a deeper understanding of what is going on across a manufacturing operation or supply chain, and we will be showcasing our latest cloud-based analytics solution at EW15. Our Digital Disruption Zone will also showcase how new technologies such as 3D printers and wearable devices can work with OpenText EIM solutions.

Financial Services

Financial Services is still an industry in flux, routinely testing new business models. A recent Industry Insight blog wrote about a newly licensed bank in the UK called Atom Bank.
They have some capital, and their other asset is an application. That is all. This is amazing when you think about it. It could have happened without smartphones, but they really changed this game. The smartphones of today outperform the fastest and biggest computers of 1985. Essentially, they are a small and handy way to deliver chips to you, and you can make a phone call too! And they are easy to carry and fit in your pocket. Of course, it is all in the chips, which provide the digital technology. Now Volvo has stated that in the near future, if you buy one of their cars, they will provide the insurance.

Commercial Banking

North American corporate banking and corporate treasury organizations are facing digital disruption in many flavors, from different directions, and on different timeframes. In the short term, a myriad of disruptive regulations, technologies, and new players are bombarding banks and their corporate treasury clients, requiring massive, parallel changes to their payment systems. Figuring out a strategy for future-proofing a payments environment is critical, and a partnership with a provider that’s “been there, done that” in Europe and elsewhere is an important step. Longer term, digital disruption represents both an opportunity and a threat to the corporate banking business. Will the combination of open APIs for financial transactions, distributed ledgers (a la blockchain), and the Internet of Things make banks irrelevant to their corporate clients’ day-to-day business? Or will the traditional role of trusted intermediary allow banks to take an even more prominent role in B2B digital commerce as the provider of the equivalent of safe deposit boxes for high-assurance digital identity management? Or both? If you have thoughts on these topics, please join us at our Enterprise World Industry User Forums on Friday, November 13, where you will have an opportunity to share with your peers how you are tackling digital disruption in your organization.
Public Sector

When we talk about digital disruption, our examples typically come from business—Uber, for instance, where technology inspired the execution from the beginning. There were no internal processes to disable, SOPs to revise, legacy applications to transition or decommission, and, most significantly, no employees to retrain. These challenges have long impeded government’s ability to modernize. To digitize, public organizations have to apply technology to internal mission-delivery processes. That means not incrementally automating process steps as they are performed now, in silos, but envisioning how information can be shared across functional silos to take giant leaps: cutting service delivery times, increasing inspection and regulatory effectiveness, and improving facility and asset maintenance, investigative efficiency, and so on. To read more about how to move to Digital Government, take a look here.

Life Sciences

The Life Sciences industries are not immune to digital transformation. In fact, we’re focusing on the emergence of the Information Enterprise: what it is, how to manage it, and why it offers unprecedented opportunities for everyone involved, especially in a world of increasing regulatory scrutiny. As in previous years, there will be content useful for the traditional Life Sciences and Healthcare industries, with added focus on other FDA-regulated enterprises in Food Safety, Cosmetics, and Tobacco. Learn specific information management solutions and strategies for our industry by participating in breakout sessions, customer roundtables, industry-specific short talks in the Disruption Zone theater, and Meet Your Industry Peers breakfasts. New this year, we are introducing a Life Sciences user group program for Friday morning, entitled “EIM Best Practices for FDA-Regulated Industries,” to discuss best practices and learn from each other’s perspectives.
So most organizations, regardless of industry, are facing the same challenges and are looking for ways to optimize their work and their outcomes. That’s exactly why we host our annual Enterprise World: to help you create a better way to work. See you there!

Read More

Enterprise World 2015 and the OpenText Cloud

Are you wondering where the cloud will be covered at Enterprise World this year? Well, the simple answer is pretty much everywhere! Whether your enterprise is looking at solutions for information management, analytics, or business networks, there are things to explore at Enterprise World. We start the week with a variety of training sessions. In a half day to three days, you can become an expert in the technology areas that are most critical to your organization. While there are many great courses, and several apply to products available in the OpenText Cloud, there is a new half-day course on Big Data Analytics, now available in our cloud, which you should consider regardless of your other technology choices. U-TR-2-6150 Introduction to Big Data Analytics is offered on Nov 9. Register soon, as this one is bound to be quite popular. If your organization is actively looking at upgrading Content Suite and having us manage your application in the OpenText Cloud, there is a special half-day workshop you may be able to take advantage of. It will provide background on the upgrade process to the cloud and help you build your TCO business case for moving to the OpenText Cloud. Register now – space is limited. On Partner Day, we will meet with our partner community to share strategy, updates, and guidance on partnering with us on cloud opportunities. On Tuesday, the Expo floor opens, and with it the opportunity to meet with our product and cloud experts in all areas, as well as with partners. Great conversations take place in the Expo, and most of the pod areas will have people to address your cloud questions related to their solutions. If you are interested in digging into Managed Cloud Services, you will be able to find experts at the OpenText Cloud pod, the B2B Managed Services pod, and the Global Technical Services area. We look forward to meeting with you. Don’t miss a single one of the keynote presentations.
Our top executives will take us through inspiring and informative presentations that highlight many areas, including our cloud strategy, innovations, and advances. You will see the expansion of cloud across our entire portfolio and see some of the innovations in live demonstrations. There are many breakout sessions that focus on cloud strategy, innovations, customer insights, and more. Filter on Cloud under the Track or Product sections to see the many sessions devoted to cloud or hybrid cloud topics. Some cloud sessions of note include:

CLD-402 Building Your Cloud Strategy Featuring Forrester Research (Analyst Liz Herbert)
CLD-410 OpenText Cloud Strategy and Roadmap
CLD-412 Our ECM Journey to the Cloud featuring the New Zealand Transport Agency
CLD-400 How Managed Cloud Services Drives Essential Benefits in Your Cloud Strategy

Want to get hands-on and help shape new and emerging product offerings? Join us in the very popular Innovation Lab. Sign up early for these spots, as they fill up fast! Calling all developers! Here is a chance to roll up your sleeves in the Developer Lab and work with AppWorks to build mobile apps. Let your creativity show and present your innovative apps for a chance at a prize on Thursday. There will be many opportunities to meet with the OpenText teams and with other customers, plus more cloud learning opportunities. We look forward to seeing you at the MGM Grand for an action-packed and educational week.

Read More

Discover the Future of Embedded Analytics at Enterprise World

Whether you are looking to build smarter “data-driven” customer applications or tap the power of Big Data to beat the competition to market, using analytics to deliver better insight is key to any digital strategy. Encompassing a range of approaches, from basic reports and dashboards to more sophisticated advanced analytics for forecasting, optimization, and even machine learning (think AI), the analytics landscape is broad – and it is the bridge between all your data and smarter decisions. Yet to reach all users and deliver sustainable value through both better customer engagement and improved productivity, the business intelligence tools of the past no longer work. They don’t scale, they are not intuitive to use, and they often require extensive programming to deploy the most advanced features. Plus, many are built on proprietary foundations that make it hard to interface with other systems or easily embed visuals in your favorite app or device.

The embedded revolution

Just as embedded processors (and cheap memory) sparked a new generation of smart systems and consumer devices like the smartphone, embedded analytics – enabled by solutions like OpenText Information Hub (iHub), along with advances in big and small data processing, open source projects like BIRT and Hadoop, and rich APIs – has the potential to change the face of many categories of applications (and devices). Think hyper-personalized and portable customer experiences, or smarter trading grids that anticipate disruptions or automatically seek out the best deal. Or new views into markets or business operations that reveal previously unseen relationships or potential innovations. Meanwhile, we are all looking to get closer to our customers by gaining a true 360-degree view of what they want and how they are interacting with us and each other. This is where the OpenText Big Data Analytics offering (now available in the OpenText Cloud!) fits in.
Combining advanced and predictive analytics, delivered as an easy-to-use on-premises or cloud offering, BDA aims to bring the power of big data to everyday business analysts, creating one view of their customer base with a super-fast dedicated analytics database and pre-built algorithms for handling the most common marketing and operational analyses.

Discover more in Vegas

The future of analytics is clearly about these types of tools that serve the growing population of “citizen data scientists.” It’s also about delivering insights from new data sources (think IoT) to users on their device of choice, like smartphones, tablets, or even smart watches. All of these scenarios will be front and center in the “Future of Embedded Analytics” breakout session that Mark Gamble and I will be presenting in Vegas. We’ll explore the key requirements for transforming ordinary apps into data-driven powerhouses that can access information from any source and deliver insights to millions of users. We’ll also look specifically at the role of data in the customer journey and at how analytical apps are becoming smarter and more portable, powered by next-gen predictive analytics, cloud services, and open standards. Plus, we’ll showcase our embedded and mobile interfaces, including a behind-the-scenes look at our latest IoT “activity tracker” demo. Check out information on this and all the breakout sessions at Enterprise World 2015 online at http://www.opentext.com/campaigns/enterprise-world-2015/program/sessions.
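For a sense of what a “common marketing analysis” looks like in practice, consider RFM (recency, frequency, monetary) scoring of a customer base. This is a generic sketch of the technique, not BDA’s pre-built algorithm, and the toy purchase history is invented:

```python
from datetime import date

def rfm(purchases, today):
    """Score each customer on recency (days since last order), frequency, and spend.

    purchases maps a customer name to a list of (order_date, amount) tuples.
    """
    scores = {}
    for customer, orders in purchases.items():
        last_order = max(d for d, _ in orders)
        scores[customer] = {
            "recency_days": (today - last_order).days,
            "frequency": len(orders),
            "monetary": sum(amount for _, amount in orders),
        }
    return scores
```

An analyst would then segment on these three numbers, for example flagging high-spend customers whose recency is slipping as churn risks.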

Read More

3 Questions: John Johnson of Dell Services Discusses Analytics and Reporting for IT Services

Dell Services, the global system integrator and IT outsourcing arm at Dell, provides support, application, cloud, consulting, and many other mission-critical IT services to hundreds of organizations worldwide across many sectors. The company collects and manages massive amounts of data concerning customer infrastructures from simple, high-frequency metrics (such as CPU, memory, and disk utilization) to helpdesk tickets and service requests, including hardware and software asset information. Using this data to understand and respond to customer needs before they become a problem falls to John M. Johnson, Manager, Capacity Management at Dell Services. In an informal format, Johnson will explain how his team uses OpenText Actuate Information Hub (iHub) to ensure system reliability and keep their customers informed in a free webinar. Johnson recently spoke with OpenText about the type of data Dell Services collects, and the evolving ways his customers consume that data. He also spoke about how he uses this data to plan for the future. OpenText: You have a 12-terabyte data warehouse of performance metrics on your customers’ systems and applications. Tell us about that data and how you use it. Johnson: Our infrastructure reporting data warehouse has been around for seven-plus years. It collects aggregated information about more than a hundred customers, which is just a segment of our base. Originally we started the data warehouse to meet legal retention requirements, and it evolved to become the repository for ticketing data, service request data, and SLA performance data. Now it’s an open warehouse where we continually add information related to our services delivery. It’s fantastic data, and a fantastic amount of data, but we lacked two things: an automated way to present it, and a consistent process behind its presentation. 
My twenty capacity planners were spending too much of their valuable time churning out Excel reports to present the data to our clients, and far too little time understanding the data. A little less than two years ago, we started using open source BIRT for report automation, to eliminate manual errors and consistency issues, and to remove the “personal analysis methods” that each engineer was introducing to the process. The next step in maturing the process was to leverage iHub to further automate report generation, delivery, and presentation.

OpenText: Some of your customers and users get dynamic dashboards, while others get static reports. How do you decide who gets what?

Johnson: That’s an easy answer: it begins with contract requirements. Those expectations are drawn out and agreed upon by legal counsel on both sides. Once those fundamental requirements are met, the question of “Who gets what?” is very simply based on how they need and want the data. I have three customer bases: my services customers, my delivery teams, and peer technical teams who have reporting requirements. And everybody wants a different mix of data. DBAs want to see what’s going on with their infrastructure – their top databases, hardware configurations, software versions and patch levels, cluster performance, and replication stats. Other teams, such as service delivery managers and the account teams, want to see the picture on more of a financial level. They need answers to standard questions like, “What has the customer purchased, and is that service meeting the customer’s expectations?” In some cases we handle customer applications in addition to their infrastructure. In those cases, the customer needs reports on uptime, availability, performance, user response time, outstanding trouble tickets, number of users on each system, and various other application metrics married with the infrastructure data.
Those are all static reports we typically deliver on a monthly schedule, but we’re looking to make that particular reporting a daily process with iHub Dashboards. Dashboards will serve three major groups:

1. Application owners, who will see what’s going on with their infrastructure and applications in real time
2. Our service managers, who coordinate the daily delivery of our services around the world
3. Senior leaders at the director, VP, and CxO levels

That last group has much less interest in a single trouble ticket or server’s performance, but they do care about service levels and want to know how the infrastructure looks on a daily basis. I think the executive-level dashboards will be big consumers of data in the future, so we’re evolving and maturing our offering from a technical level – where we have traditionally been engaged – to the business level. Because that’s where people buy.

OpenText: That is an ambitious plan to extend your reporting platform. How do you prioritize your projects, and what advice would you give to peers with similar plans?

Johnson: There’s one overall strategy I try to employ with all my applications: apply modern, agile software development methodologies to them. You have to stay up to date on software patches and capabilities. You have to keep your platform relevant. We keep updates coming rapidly enough that our customers don’t have to create workarounds or manual processes. Fortunately, iHub works well with how we manage upgrades. We manage reports as a unit of work inside of iHub, so I don’t have to make monolithic changes. When I’m prioritizing projects, I first ask, “Who is my most willing customer?” The customer who’s going to work with you provides the quickest path to success, and that success is the foundation upon which you build. Second is to expect to get your hands dirty and do a lot of the lifting. Most customers are always going to have trouble verbalizing what they need and how they want data to look.
So you have to just get that first visualization done and ask, “Does this data, presented this way, answer your needs?” Don’t be afraid of responses such as, “That is not what I wanted at all. I told you I wanted a report.” That’s one of the most frustrating things about the job. You have to accept that you are a statistical artist, that visual presentation is something you own, and then embrace and drive it. Fortunately, the ease of developing and managing data with iHub means we can respond to these inputs rapidly. Learn more about how John Johnson and Dell Services automate and present monthly reports and analytics, provide dashboards and performance metrics to CIOs, and perform monitoring, capacity planning, and workload automation in a free OpenText Analytics webinar. Sign up today.
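The consistency problem Johnson describes – every engineer applying a “personal analysis method” to the same data – is what templated report generation solves: one template, every client, identical format. The sketch below is purely illustrative (the client names, metrics, and template are invented, and the real Dell pipeline runs on BIRT and iHub, not this code):

```python
# Minimal sketch of templated report automation: every client's report
# comes from the same template, so no engineer's personal formatting
# or analysis style creeps into the deliverable.
TEMPLATE = "{client}: CPU peak {cpu_peak}%, storage used {storage_pct}%"

def render_reports(metrics_by_client):
    """Produce identically formatted report lines for every client."""
    return [
        TEMPLATE.format(client=client, **metrics)
        for client, metrics in sorted(metrics_by_client.items())
    ]

reports = render_reports({
    "Acme": {"cpu_peak": 91, "storage_pct": 74},
    "Globex": {"cpu_peak": 55, "storage_pct": 62},
})
print(reports[0])  # Acme: CPU peak 91%, storage used 74%
```

Scheduling that function (or its BIRT equivalent) to run monthly is what turns twenty planners’ Excel churn into an automated delivery process.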


Data Driven Digest for October 9: Baseball Playoffs

Take me out to the data. Take me out to the stats. Design me some bar charts and Venn diagrams. I don’t care if I error correct. Cause it’s Hadoop, Hadoop for NoSQL. If you can’t graph it’s a shame. For it’s one, two, x and y-axis… at the old data viz game. We love our data the way we love our baseball—lots of visuals and power hitters that crush it over the wall for the walk-off win. The 2015 Major League Baseball playoffs are now underway with the Divisional Series brackets set. Every game, every inning, every play is packed full of data that can be, and has been, visualized. In the spirit of the grand old game, we present three data visualizations that really swing for the bleachers. Enjoy! FIRST BASE: Visualizing Last Night’s Game Wild Card hopefuls like the Astros and Cubs look to unseat the Royals and Cardinals. The Mets and Dodgers, along with the Rangers and Blue Jays, are suiting up to settle their annual rivalries. For fans of baseball box scores, the team over at Statlas (stats + atlas) has reprised its visualization recaps of previous games. Laid out like a transit map, each recap devotes a row to each at-bat, with detailed notations similar to those used by electrical engineers. The “Win Expectancy” column on the right is a statistical representation of the likelihood of the team winning the game, updated by the outcome of each batter and the impact of big plays. Chris Ring (@cringthis) and Mike Deal (@dealville) currently run the site, founded in 2014 by Dan Chaparian, Mike Deal, and Geoff Beck. The site has even more interactive features if you run it on a tablet. SECOND BASE: Rise and Fall of the 2014 Oakland Athletics Sometimes it’s good to analyze the trends in a team’s season to determine where things improved and where things fell apart. Such was the case for the hapless Oakland A’s, whose 2014 season started off like a rocket and then imploded in September with the trade of outfielder Yoenis Cespedes and injuries to starting pitchers Jarrod Parker and A.J. Griffin. 
Digital Splash Media owner Jeff Bennett (@DigitalSplash & @VizThinker) masterfully compiled several bar graphs and line graphs that illustrate the problems of the clubhouse that crafted the concept of Moneyball. THIRD BASE: Who Is the Greatest Player? Invariably, baseball conversations boil down to who was the greatest player of the game. A tricky question, to be certain, as it depends on whether the player is a pitcher or a position player. Does this player have the most hits or homers? How well did he capitalize on the number of innings played versus the number of plate appearances? Baseball statistics also offer WAR (Wins Above Replacement), which attempts to summarize a player’s total contributions to his team in one statistic. However, the debate continues. To help categorize and visualize the oceans of data behind these determinations, Northeastern University assistant professor of history Benjamin Schmidt has created an extensive and interactive Baseline Cherrypicker that you can use to see how baseball’s statistical leaders shape up against the field. “The x axis shows the starting year for any stat: the y axis shows the length of time being measured,” Schmidt writes in his explanation. “So, for example, if you go down 7 cells from ‘1940,’ it will look up the player who led the league in WAR for the 7 years following 1940, and show the sentence ‘Ted Williams led the majors with 48.28 WAR from 1940 to 1947.'” The visualization lets you home in on the patches of interest. If you just run your mouse over the chart and read the text that pops up, you’ll start to get the general idea, he added. The interactive visualization includes individual franchise and league leaderboards and would take about 120,000 single-spaced pages to compile. That’s a lot of baseball to debate.
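The “Win Expectancy” idea behind Statlas’s recaps can be sketched simply: keep a table mapping game states to a win probability and measure how much each plate appearance moves it. The table below is a made-up placeholder; real win-expectancy tables are estimated from decades of play-by-play data, and real game states track outs and baserunners, not just inning and score:

```python
# Toy win-expectancy table: (inning, home score differential) -> home
# team's chance of winning. Values are illustrative placeholders.
WIN_EXPECTANCY = {
    (7, 0): 0.50,
    (7, 1): 0.70,
    (8, 1): 0.76,
    (8, 2): 0.88,
}

def swing(state_before, state_after):
    """Win probability added (WPA) by one plate appearance."""
    return WIN_EXPECTANCY[state_after] - WIN_EXPECTANCY[state_before]

# A go-ahead run in the 7th is a "big play"; a quiet half-inning isn't.
print(round(swing((7, 0), (7, 1)), 2))  # 0.2
print(round(swing((8, 1), (8, 2)), 2))
```

Summing these per-batter swings over a game is what produces the rising and falling curve in the recap’s right-hand column.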
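Schmidt’s Cherrypicker logic — pick a start year and a window length, then find the player with the most summed WAR over that window — can be sketched in a few lines. The player names and WAR values here are invented for illustration; Schmidt’s tool runs the same kind of query over real historical leaderboards:

```python
# Rolling-window stat leader, the query behind each cell of the chart:
# x = start year, y = window length, cell = leader over that span.
def window_leader(war_by_player, start, length):
    """Return (player, total) with the highest summed WAR over the
    seasons start .. start + length - 1."""
    totals = {}
    for player, seasons in war_by_player.items():
        totals[player] = sum(
            war for year, war in seasons.items()
            if start <= year < start + length
        )
    leader = max(totals, key=totals.get)
    return leader, totals[leader]

# Invented seasons for two hypothetical players.
war = {
    "Player A": {1940: 6.0, 1941: 7.5, 1942: 8.0},
    "Player B": {1940: 5.0, 1941: 5.5, 1942: 5.0},
}
print(window_leader(war, 1940, 3))  # ('Player A', 21.5)
```

Each (start, length) pair on the chart is one call to a function like this, which is why hovering over different cells produces sentences like the Ted Williams example above.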


October OpenText Live Webinars

Introducing OpenText™ Extended ECM for Oracle® EBS (xECM for EBS) 10.5 SP2. With new features like E-Business Suite 12.2 support, Records Management (RM) enhancements, advanced archiving and our new Manufacturing Management solution accelerator, xECM for EBS is better than ever at synchronizing content, metadata and permissions across both systems. It also provides a deep integration between OpenText and Oracle systems that ensures transactional and business content is consolidated and managed for integrity, cost reduction and regulatory compliance. Join us to learn not only what we’re doing, but also how our customers are using xECM for EBS to accomplish their business goals. We’ll also preview how xECM for EBS fits into the OpenText Blue Carbon project, set for launch at Enterprise World 2015 this November. Register » Project & Program Management for OpenText™ Content Server Thursday, October 8, 2015 at 11:00 a.m. EDT (UTC -4) Project data is growing exponentially, but is often stored out of context or disconnected from project Gantt charts. Project managers require project management tools integrated with enterprise content, helping the enterprise to optimize Information Governance. Enterprises need a central database for all their project data. Join us to learn more about how Project & Program Management (PPM) for Content Server provides this on your existing ECM system, presented by OpenText partner KineMatik. This webinar will also discuss the latest features available for PPM 10.5.3. Register » Records Disposition Simplified Wednesday, October 14, 2015 at 11:00 a.m. EDT (UTC -4) Driven by actual customer need, the Cassia Records Disposition Approval module (RDA) was built to make it easy for users to sign off on records and to reduce the time it takes for records managers to process the records once they receive the approvals. 
RDA simplifies the sign-off process for approvers, simplifies records disposition support, simplifies the review process, and includes a reporting framework. Join us to learn how RDA can simplify your records management. Register » The Next Wave of Advanced Analytics Thursday, October 15, 2015 at 11:00 a.m. EDT (UTC -4) Are you ready for the next wave of analytics? Join us for a fast-paced webinar showcasing OpenText™ Actuate Big Data Analytics, Cloud Edition, an advanced Analytics-as-a-Service solution that brings the power of Big Data to everyday business analysts. Learn about the advantages of “Big Data Analytics” in the cloud, including our convenient capacity-based pricing and easy-to-use predictive algorithms. Plus, we’ll provide a quick-hit demo of the coolest features and share how easy it is to blend, explore and visualize your data. Gain a top-level understanding of the analytics lifecycle. Learn about the emerging requirements for Analytics-as-a-Service. See Big Data Analytics, Cloud Edition in action. Register » What’s New in OpenText™ RightFax 10.6 Feature Pack 2 Tuesday, October 20, 2015 at 11:00 a.m. EDT (UTC -4) The latest release of OpenText™ RightFax is packed with exciting new features for the user and administrator. Join Mike Stover, Product Manager, as he unveils the new functionality for RightFax that focuses on: User Experience: Designed with the user in mind, the latest release of RightFax includes several enhancements to the end-user experience, including newly redesigned tools and additional features. Compatibility: The latest releases of RightFax provide updated and new interoperability between RightFax and other industry software. Enterprise-Grade: OpenText has developed several new improvements that will increase the performance and scalability of the RightFax fax server. These enhancements help increase the productivity, throughput, and workload efficiency in the toughest enterprise environments. 
Compliance and Security: New features and functionality enhance the security and compliance-grade capabilities of a RightFax server environment. Administration and Use: New functionality makes it easier to manage and use RightFax. New tools allow for managing the environment more efficiently, resulting in demonstrable time savings for administrators. Register » Increase Your OpenText™ eDOCS DM User Adoption Thursday, October 22, 2015 at 11:00 a.m. EDT (UTC -4) OpenText™ eDOCS DM is a feature-rich product with different types of functionality for creating, saving, searching, and interacting with day-to-day content. This OpenText Live webinar session will focus on how you can help your end users get the most out of existing functionality and increase adoption. How often have we heard “it’s the little things that count”? Well, this session will cover some of the basic features that are often overlooked, provide tips and tricks for working with Dynamic Views, show more efficient email management using Email Filing, and more. All this will be presented with simplified DM profile forms—making it easier for users to save or search for their documents. Register »  


Data Driven Digest for October 2: Traffic and Public Transit

Does getting to work on Monday feel like a breeze or miles and miles of bad road? The average worker in the United States spends 200 hours a year commuting, at a cost of $2,600, according to recent polls. Despite constant congestion, data scientists are always coming up with ways to analyze traffic patterns to ensure you get to your desk by 9 a.m. Whether your transportation is trains or cars, we’ve got you covered this week. We’ve assembled three very innovative ways of showing and analyzing transportation data. Enjoy!   Big Data on the MTA: Well, let me tell you of the story of a man named Charlie On a tragic and fateful day He put ten cents in his pocket, kissed his wife and family Went to ride on the MTA The ballad made famous by the Kingston Trio’s song MTA (commonly known as Charlie on the MTA) reminds us that transit can be a real nightmare for some and a treasure trove for others. Ian Reynolds, of MIT’s electrical engineering and computer science department, boarded the train to data central with this homemade live data visualization. Taking GPS data from Boston’s own transit agency, Reynolds recreated the track system using Adafruit NeoPixel strips driven by an Arduino Uno, which in turn takes orders from a Python script running on a Raspberry Pi. “Every ten seconds or so, it calls the MBTA API to grab the GPS coordinates of all the trains in the system. It maps those to some LEDs, decides which ones actually need to be changed, and then sends that information to the Arduino, which does the bit pushing,” Reynolds wrote in his description. His next step is writing an app that lets him change the visualization and adjust the brightness. Reynolds posted his project on GitHub if you want to get under the hood. We’re just hoping this type of visualization helps Charlie get to Jamaica Plain once and for all.   Keep Calm and Visualize the Traffic Data: Of course, using data to alleviate traffic problems is not new. 
In the ’60s and ’70s, the British government often positioned workers with clipboards to tally the number of cars, and the various types, that passed through intersections. Decades later, traffic data was computerized onto a two-dimensional map, which showed various hotspots at selected intervals. In 2009, Jer Thorp (@blprnt) and a team of engineers saw a need to represent the data better in an interactive way. Their presentation, entitled “Visualising Transport Data for data.gov.uk, [itoworld.blogspot.com] by ITO,” was based on the traffic count data for the data.gov.uk Developer Camp. The team used more than 1,000 existing data sets from seven departments and community resources. The visualizations and resulting maps were built in nearly two days. The accompanying movies that document the presentation of the project are available, as are the resulting maps, at Flickr.   Go With The Flow: Monitoring traffic flows is also helpful for retailers. Kirkland, Washington-based INRIX recently provided a proof of concept that showed the migration of billions of anonymous data points from GPS or mobile networks. The data visualization showed anonymous data flowing in and around Manchester, UK during a popular time for school shopping. The map revealed the local migration from the suburbs to shopping areas in Trafford Centre and Manchester City Centre. Their analysis revealed that much of the traffic into the city was coming from the south. The novelty of the visualization is that it shows not only the traffic patterns but also, growing on a horizontal plane, the growing number of people in the city itself. The company said its data could be used by marketers, retailers and civic offices to keep people moving and market to specific demographics based on destinations. Reviews of the project are available at the Silicon.de site (in German). Like what you see? Every Friday we share great data visualizations and embedded analytics. 
If you have a favorite or trending example, tell us: Submit ideas to blogactuate@actuate.com or add a comment below. Subscribe (at left) and we’ll email you when new entries are posted. Recent Data Driven Digests: September 25: Visualizing data on a US map, including burger prices, presidential debates, and startup funding September 18: Original US Congressional debt, Euro spending, and world population based on income September 11: Cloud visualizations related to wind, bird migration, and El Niño Source: Actuate
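The update loop Reynolds describes for his MBTA display — poll train GPS fixes, map each to the nearest LED, and push only the pixels that changed — can be sketched as below. The coordinates and LED layout are illustrative stand-ins, not the real MBTA API responses or NeoPixel addressing from his project:

```python
# Sketch of the poll-map-diff loop: GPS fixes in, changed LEDs out.
import math

# (lat, lon) anchor for each LED along the modeled track (illustrative).
LED_COORDS = [(42.36, -71.06), (42.37, -71.07), (42.39, -71.08)]

def nearest_led(lat, lon):
    """Index of the LED closest to a train's GPS fix."""
    return min(
        range(len(LED_COORDS)),
        key=lambda i: math.hypot(LED_COORDS[i][0] - lat,
                                 LED_COORDS[i][1] - lon),
    )

def frame_diff(trains, previous_lit):
    """Map train positions to LEDs; return (lit, changed), where
    `changed` holds only the LED indexes that actually need updating."""
    lit = {nearest_led(lat, lon) for lat, lon in trains}
    changed = lit ^ previous_lit  # symmetric difference: turn on or off
    return lit, changed

# Two trains appear on an initially dark strip.
lit, changed = frame_diff([(42.361, -71.061), (42.392, -71.082)], set())
print(sorted(changed))  # [0, 2]
```

Sending only `changed` to the Arduino is the “decides which ones actually need to be changed” step in Reynolds’s description, which keeps serial traffic to the microcontroller minimal.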
