Analytics

Crowd-Sourcing the Internet of Things (Data Driven Digest for January 19, 2016)

Runner with fitness tracker passes the Golden Gate Bridge

The Internet of Things is getting a lot of press these days. The potential use cases are endless, as colleague Noelia Llorente has pointed out: refrigerators that keep track of the food inside and order more milk or lettuce whenever you’re running low; mirrors that can determine if you have symptoms of illness and make health recommendations for you; automated plantations, smart city lighting, autonomous cars that pick you up anywhere in the city…

So in this week’s Data Driven Digest, we’re looking at real-world instances of the Internet of Things that do a good job of sharing and visualizing data. As always, we welcome your comments and suggestions for topics in the field of data visualization and analysis. Enjoy!

The Journey of a Million Miles Starts with a Single Step

Fitness tracking has long been a popular use for the Internet of Things. Your correspondent was an early adopter, having bought Nike+ running shoes, with a special pocket for a small Internet-enabled sensor, back in 2007. (Nike+ is now an app that uses customers’ smartphones, smart watches, and so forth as trackers.) These sensors track where you go and how fast, and your runs can be uploaded and displayed on the Nike+ online forum, along with user-generated commentary – “trash talk” to motivate your running buddies, descriptions and ratings of routes, and so forth.

Nike is hardly the only online run-sharing provider, but its site is popular enough to have generated years of activity patterns from millions of users worldwide. Here’s one example, a heat map of workouts in the beautiful waterfront parks near San Francisco’s upscale Presidio and Marina neighborhoods. (You can see which streets are most popular – and, probably, which corners have the best coffeehouses…)

The Air That I Breathe

Running makes people more aware of the quality of the air that they breathe.
HabitatMap.org, an “environmental justice” nonprofit in Brooklyn, N.Y., is trying to make people more conscious of the invisible problem of air pollution through palm-sized sensors called AirBeams. These handheld sensors can measure levels of microparticulate pollution, ozone, carbon monoxide, and nitrogen dioxide (pollutants blamed for everything from asthma to heart disease and lung cancer), as well as temperature, humidity, ambient noise, and other conditions.

So far so good – buy an AirBeam for $250 and get a personal air-quality meter, whose findings may surprise you. (For example, cooking on a range that doesn’t have an effective air vent subjects you to more grease, soot, and other pollution than the worst smog day in Beijing.) But the Internet of Things is what really makes the device valuable. Just as with the Nike+ activity trackers, AirBeam users upload their sensor data to create collaborative maps of air quality in their neighborhoods.

Here, a user demonstrates how his bicycle commute across the Manhattan Bridge subjects him to a lot of truck exhaust and other pollution – a peak of about 80 micrograms of particulate per cubic meter (µg/m³), over twice the Environmental Protection Agency’s 24-hour limit of 35 µg/m³. And here’s a real-time aggregation of hundreds of users’ data about the air quality over Manhattan and Brooklyn. (Curiously, some of the worst air quality is over the Ozone Park neighborhood…)

Clearly, the network effect applies to these and many other crowd-sourced Internet of Things applications – the more data points users are willing to share, the more valuable the overall solution becomes.

OpenText Named a Leader in the Internet of Things

Speaking of sharing data points, OpenText was recently honored for its work in the Internet of Things by Dresner Advisory Services, a leading analyst firm in the field of business intelligence, with its first-ever Technical Innovation Awards.
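The crowd-sourced aggregation idea behind those AirBeam maps – many users’ readings averaged into cells of a shared map – can be sketched in a few lines. This is a minimal illustration, not HabitatMap’s actual pipeline; the grid size and the sample readings below are invented:

```python
from collections import defaultdict

EPA_24H_LIMIT = 35.0  # EPA 24-hour particulate limit, in µg/m³ (cited above)

def aggregate_readings(readings, cell_size=0.01):
    """Average particulate readings into lat/lon grid cells.

    `readings` is a list of (lat, lon, ug_per_m3) tuples, the kind of
    data AirBeam users upload; the cell size is an assumption here.
    """
    cells = defaultdict(list)
    for lat, lon, value in readings:
        key = (round(lat / cell_size), round(lon / cell_size))
        cells[key].append(value)
    return {key: sum(vals) / len(vals) for key, vals in cells.items()}

# Invented sample readings; flag cells whose average exceeds the EPA limit.
readings = [
    (40.7030, -73.9900, 80.0),   # e.g. a bridge bike path at rush hour
    (40.7031, -73.9901, 60.0),   # a second user, same grid cell
    (40.6780, -73.9440, 12.0),   # a quieter residential block
]
averages = aggregate_readings(readings)
hotspots = {k: v for k, v in averages.items() if v > EPA_24H_LIMIT}
```

The more users contribute readings, the more cells get filled in and the more trustworthy each cell’s average becomes – the network effect in miniature.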
You can view an infographic on Dresner’s Wisdom of Crowds research.

Recent Data Driven Digests:
January 15: Location Intelligence
January 5: Life and Expectations
December 22: The Passage of Time in Sun, Stone, and Stars

Read More

Next Step For Internet of Things: Analytics of Things

Infographic Internet of Things Dresner Report

“Theory is when you know everything but nothing works. Practice is when everything works but no one knows why. Between us, Theory and Practice agree: nothing works and nobody knows why.” – Anonymous

The Internet of Things (IoT) is clearly the next big step in the technology industry. It will be almost like a second Industrial Revolution, opening a world of incalculable, even greater possibilities in the digital age, potentially achieving greater independence and, therefore, greater efficiency.

Imagine the possibilities. Refrigerators that measure the food inside and replenish the right product from your preferred vendor. Mirrors that can determine if you have symptoms of illness and make health recommendations for you. Smart watches that monitor your vital signs and alert the health services if you have a problem or emergency. Traffic lights connected to a circuit of cameras that gauge traffic levels and crowd movement, preventing absurd waiting times in areas with little traffic. Automated plantations, smart city lighting, courier drones that deliver whatever you want, wherever you want it, autonomous cars that pick you up anywhere in the city…

It sounds like futuristic sci-fi. But what if this scenario were closer than we think?

I had the pleasure of attending Big Data Week in Barcelona (#BDW15) recently, which featured top speakers from industry-leading companies. My expectation was that I would listen to a lot of talks on technology, programming languages, Hadoop, R, and theories about the future of humans and business in this new era of Big Data. After hearing the first presentations from Telefonica (telecoms), BBVA Data & Analytics (finance), and the Smart Living Program at Mobile World Capital Barcelona (technology), I realized something: regardless of the industry, it was all about how insights from data produced by physical objects disrupt our lives as individuals, consumers, parents, and business leaders. It doesn’t matter which role you play.
Yes, it was all about the “Internet of Things” – or, to take it a step further, the “Analytics of Things.” These companies are already changing the way they do business by leveraging information from Internet-connected devices they already have. And that is just the beginning. Gartner estimates that by 2020, there will be more than 20 billion devices connected to the IoT. Dresner’s 2015 The Internet of Things and Business Intelligence report estimates that 67% of enterprises consider IoT to be an important part of their future strategies, and that 89% of information professionals expect predictive analytics to become just as important within the Internet of Things.

The data itself is not the point; the point is how Big Data analytics technologies enable organizations to collect, cross-reference, integrate, and analyze data from devices to design better products and make our lives easier.

So, have Theory and Practice finally converged? What if the future is right now? Take a look at the infographic based on the findings of the IoT Dresner Report, or download the full study to find out more.

Read More

Data Driven Digest for January 15: Location, Location, Location

Location intelligence is a trendy term in the business world these days. Basically, it means tracking the whereabouts of things or people, often in real time, and combining that with other information to provide relevant, useful insights. At a consumer level, location intelligence can help with things like finding a coffeehouse open after 9 p.m. or figuring out whether driving via the freeway or city streets will be faster. At a business level, it can help with decisions like where to build a new store branch that doesn’t cannibalize existing customers, or how to lay out the most efficient delivery truck routes.

Location intelligence is particularly on our mind now because OpenText was recently honored by Dresner Advisory Services, a leading analyst firm in the field of business intelligence, with its first-ever Technical Innovation Awards. Dresner recognized our achievements in three areas: Location Intelligence, Internet of Things, and Embedded BI. You’ll be hearing more about these awards later. In the meantime, we’re sharing some great data visualizations based on location intelligence. As always, we welcome your comments and suggestions. Enjoy!

Take the A Train

In cities all over North America, people waiting at bus, train, or trolley stops who are looking at their smartphones aren’t just killing time – they’re finding out exactly when their ride is due to arrive. One of the most popular use cases for location intelligence is real-time transit updates. Scores of transit agencies, from New York and Toronto to Honolulu, have begun tracking the whereabouts of the vehicles in their fleets and sharing that information in live maps. One of the latest additions is the St. Charles Streetcar line of the New Orleans Regional Transit Authority (NORTA) — actually the oldest continuously operating street railway in the world! (It was created in 1835 as a passenger railway between downtown New Orleans and the Carrollton neighborhood, according to the NORTA Web site.)
This is not only a boon to passengers; the location data can also help transit planners figure out where buses are bunching up or falling behind, and adjust schedules accordingly.

On the Street Where You Live

Crowdsourcing is a popular way to enhance location intelligence. The New York Times offers a great example with this interactive feature describing writers’ and artists’ favorite walks around New York City. You can not only explore the map and associated stories, but also add your own – like this account of a proposal on the Manhattan Bridge.

Shelter from the Storm

The City of Los Angeles is using location intelligence in a particularly timely way: an interactive map of resources to help residents cope with winter rainstorms (which are expected to be especially bad this year, due to the El Niño weather phenomenon). The city has created a Google Map, embedded in the www.elninola.com site, that shows rainfall severity and any related power outages or flooded streets, along with where residents can find sandbags, hardware stores, or shelter from severe weather, among other things. It’s accessible via both desktop and smartphones, so users can get directions while they’re driving. (Speaking of directions while driving in the rain, L.A. musician and artist Brad Walsh captured some brilliant footage of an apparently self-driving wheeled trashcan in the Mt. Washington neighborhood. We’re sure it’ll get its own Twitter account any day now.)

We share our favorite data-driven observations and visualizations every week here. What topics would you like to read about? Please leave suggestions and questions in the comment area below.

Recent Data Driven Digests:
January 5: Life and Expectations
December 22: The Passage of Time in Sun, Stone, and Stars
December 18: The Data Awakens

Read More

Data Scientist: What Can I Do For You?

Data Scientist and OpenText Big Data Analytics

After attending our first Enterprise World, I have just one word to define it: intense. In my memory there are a huge number of incredible moments: spectacular keynotes, lots of demos, and amazing breakout sessions. Now, trying to digest all of these experiences and collect the opinions, suggestions, and thoughts of the customers who visited our booth, I remember a wonderful conversation with a customer about data mining techniques, their best approaches, and where we can help with our products. From the details and the way he framed his questions, it was pretty clear that in front of me I had a data scientist – or at least someone who deeply understands this amazing world of data mining, machine learning algorithms, and predictive analytics.

To put it in context: the data scientist usually maintains a professional skepticism about applications that provide an easy-to-use interface, without a lot of options and knobs, for running prescriptive or predictive analytics algorithms. They love to tweak algorithms, writing their own code or accessing and modifying all the parameters of a given data mining technique, just to obtain the best model for their business challenge. They want full control over the process, and that is fully understandable. It is their comfort zone.

Data scientists push back against concepts like the democratization of predictive analytics. They have good reasons, and I agree with a large number of them. Most data mining techniques are pretty complex, difficult to understand, and require a lot of statistical knowledge just to be able to say, “Okay, this looks pretty good.” Predictive models need to be maintained and revised frequently, based on your business needs and the amount of data you expect to use during the training/testing process. More often than you might imagine, models can’t be reused for similar use cases.
Each business challenge has its own related data, and that data defines how the prescriptive or predictive model should be trained, tested, validated and, ultimately, applied in the business. On the other hand, a business analyst or a business user without a PhD can take advantage of predictive applications that package the most common algorithms in a box (a black box) and start answering their questions about the business. Moreover, their companies often can’t afford the expensive compensation of a data scientist, so they deal with all of this by themselves.

But what can we do for you, data scientist?

The journey starts with the integration of distinct sources – databases, text files, spreadsheets or even applications – into a single repository, where everything is connected. Exploring and visualizing complex data models with several levels of hierarchy offers a better approach to the business model than the all-too-common huge-table method. Having an analytical repository that reflects how the business flows helps with one of the hardest parts of the data scientist’s job: problem definition.

Collecting data is just the beginning; there is a huge list of tasks related to data preparation, data quality, and data normalization. This is where the business analyst or the data scientist loses much of their precious time, and we are here to help them, accelerating the journey from raw data to value. Once they have achieved the goal of clean data, the data scientist begins the step of analyzing the data, finding patterns, correlations, and hidden relationships. OpenText Big Data Analytics can provide an agile solution to perform all this analysis. Moreover, everything is calculated fast, using all your data – your big data – in a flexible trial-and-error environment.
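As a minimal illustration of the kind of preparation work described above – not how any particular product implements it – here is a toy pass that drops incomplete records, de-duplicates, and normalizes a numeric field; the field names and sample rows are invented:

```python
def prepare(rows):
    """Toy data-preparation pass: drop rows with missing values,
    de-duplicate by customer, then min-max normalize the numeric field."""
    # 1. Data quality: discard records with missing values.
    clean = [r for r in rows if r.get("revenue") is not None]
    # 2. De-duplication: keep the first record per customer.
    seen, deduped = set(), []
    for r in clean:
        if r["customer"] not in seen:
            seen.add(r["customer"])
            deduped.append(r)
    # 3. Normalization: rescale revenue to the [0, 1] range.
    values = [r["revenue"] for r in deduped]
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero when all values match
    return [{**r, "revenue_norm": (r["revenue"] - lo) / span} for r in deduped]

rows = [
    {"customer": "a", "revenue": 100.0},
    {"customer": "a", "revenue": 100.0},   # duplicate record
    {"customer": "b", "revenue": None},    # missing value
    {"customer": "c", "revenue": 300.0},
]
result = prepare(rows)
```

Even this toy version shows why preparation eats so much time: every step embeds a business decision (what counts as a duplicate, how to treat missing values) that someone has to make before any analysis starts.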
So, the answer to my question: OpenText Big Data Analytics can reduce the time spent on the preparation process, freeing up time for where it is really needed – analysis and decision making – even if the company is dealing with big data. So, why don’t you try it in our 30-day free trial or ask us for a demo?

Read More

Data Driven Digest for January 5: Life and Expectations

Welcome to 2016! Wrapping up 2015 and making resolutions for the year ahead is a good opportunity to consider the passage of time – and in particular, how much is left to each of us. We’re presenting some of the best visualizations of lifespans and life expectancy. So haul that bag of empty champagne bottles and eggnog cartons to the recycling bin, pour yourself a nice glass of kale juice, and enjoy these links for the New Year.

“Like Sands Through the Hourglass…”

It’s natural to wonder how many more years we’ll live. In fact, it’s an important calculation when planning for retirement. Figuring out how long a whole population will live is a solvable problem – statisticians have been forecasting life expectancy for nearly a century. And the news is generally good: life expectancies are going up in nearly every country around the world. But how do you figure out how many years are left to you, personally? (Short of consulting a fortune-teller, a process we don’t recommend, as the conclusions are generally not data-driven.)

UCLA-trained statistician Nathan Yau of the excellent blog Flowing Data came up with a visualization that looks a bit like a pachinko game. It runs multiple simulations predicting your likely age at death (based on age, gender, and Social Security Administration data) by showing little balls dropping off a slide to hit a range of potential remaining lifespans, everything from “you could die tomorrow” to “you could live to 100.” As the simulations pile up, they peak at the likeliest point.

One of the advantages of Yau’s simulator is that it doesn’t provide just one answer, the way many calculators do that ask about your age, gender, race, health habits, and so forth. Instead, it uses the “Monte Carlo” method of multiple randomized trials to get an aggregated answer. Plus, the little rolling, bouncing balls are visually compelling. (That’s academic-ese for “They’re fun to watch!”)

“Visually compelling” is the key.
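Yau’s simulator draws on real Social Security Administration actuarial data; as a rough sketch of the Monte Carlo mechanism – run many randomized trials and aggregate – the toy mortality curve below is invented, but the structure is the same:

```python
import random

# Toy per-year death probability (NOT the real SSA actuarial table):
# a small baseline risk that rises with age.
def death_prob(age):
    return min(1.0, 0.002 + 0.0001 * max(0, age - 30) ** 1.5)

def simulate_age_at_death(current_age, rng):
    """One 'ball' in the pachinko machine: one simulated lifetime."""
    age = current_age
    while rng.random() > death_prob(age) and age < 110:
        age += 1
    return age

def expected_age_at_death(current_age, trials=20_000, seed=42):
    """Monte Carlo estimate: average over many simulated lifetimes."""
    rng = random.Random(seed)
    outcomes = [simulate_age_at_death(current_age, rng) for _ in range(trials)]
    return sum(outcomes) / len(outcomes)

estimate = expected_age_at_death(35)
```

Each trial is one possible life; the pile-up of outcomes, not any single run, is what gives the distribution its peak at the likeliest age.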
As flesh-and-blood creatures, we can’t operate entirely in the abstract. It’s one thing to be told you can expect to live X more years; seeing that information as an image somehow has more impact in terms of motivating us to action.

That’s why the approach taken by Wait But Why blogger Tim Urban is so striking despite being so simple. He started with the assumption that we’ll each live to 90 years old – optimistic, but doable. Then he rendered that lifespan as a series of squares, one per year. What makes Urban’s analysis memorable – and a bit chilling – is when he illustrates the remaining years of life as the events in that life: baseball games, trips to the beach, Chinese dumplings, days with aging parents or friends. Here, he figures that 34 of his 90 expected winter ski trips are already behind him, leaving only 56 to go.

Stepping back, he comes to three conclusions:

1) Living in the same place as the people you love matters. I probably have 10X the time left with the people who live in my city as I do with the people who live somewhere else.

2) Priorities matter. Your remaining face time with any person depends largely on where that person falls on your list of life priorities. Make sure this list is set by you—not by unconscious inertia.

3) Quality time matters. If you’re in your last 10% of time with someone you love, keep that fact in the front of your mind when you’re with them and treat that time as what it actually is: precious.

Spending Time on Embedded Analytics

Since we’re looking ahead to the New Year, on Tuesday, Jan. 12, we’re hosting a webinar featuring TDWI Research Director Fern Halper, discussing Operationalizing and Embedding Analytics for Action. Halper points out that analytics need to be embedded into your systems so they can provide answers right where and when they’re needed. Uses include support for logistics, asset management, customer call centers, and recommendation engines—to name just a few. Dial in – you’ll learn something!
Fern Halper

We share our favorite data-driven observations and visualizations every week here. What topics would you like to read about? Please leave suggestions and questions in the comment area below.

Recent Data Driven Digests:
December 22: The Passage of Time in Sun, Stone, and Stars
December 18: The Data Awakens
December 11: Holiday Lights

Read More

Data Driven Digest for December 22: The Passage of Time in Sun, Stone, and Stars

Photo from RTE News, Ireland

Data visualization has been with us since the first cave-dweller documented the lunar month as a loop of symbols carved onto a piece of bone that hunters could carry with them to track the passage of the seasons. Obviously, technology has moved on in the past 34,000 years – have we told you lately about our iHub dashboards and embedded analytics? – but since the winter solstice (for the Northern Hemisphere) occurs Tuesday, Dec. 22, we thought this would be a good time to review some of the earliest efforts to create live, continuously updated reports of astronomical data, out of stone structures and the landscapes around them.

The fact that many of these calendars still exist after thousands of years, and still work, shows that our prehistoric ancestors must have considered visually recording the time of the year mission-critical: it let them predict hunting and harvest times, plus other seasonal events such as spring thaws, droughts, and monsoons. (Whether accurately predicting and planning for those events was part of their yearly job performance review, we leave to the archaeologists…)

So step outside and, if the weather permits, take a look at the sunrise or sunset and notice exactly where it hits the horizon, something our ancestors have done for thousands of generations. Then come back into your nice warm room and check out these links. Enjoy, and happy holidays!

Sun Daggers

The winter solstice is the time of year when the days are shortest and the nights are longest. As such, it was an anxious time for primitive people, who wondered when their world would stop getting darker and colder. That’s why early astronomer-priests (the Chief Data Officers of their time) designed calendars that made clear exactly when the day reached its minimum and the sun was at the lowest point on the horizon – and would then start returning.

One of the most impressive solar calendars is at Maeshowe, a 5,000-year-old “chambered cairn” in the Orkney Islands, north of Scotland.
It’s a long passage built of stone slabs dug into an artificial mound. The passage is oriented so that for a few days around the winter solstice every year, the rays of the setting sun reach all the way down to light up the door at the end. Two markers pinpoint the sun’s path on the exact date of the solstice: a monolith about half a mile away, called the Barnhouse Stone, and another standing stone at the entrance to Maeshowe (now missing, though its socket remains).

Even more impressive is Newgrange, a 5,000-year-old monument near Dublin, Ireland. Newgrange was built as a 76-meter-wide circular mound of stones and earth covering an underground passage, possibly used as a tomb. A hollow box above the passage lets in the rising sun’s light for about 20 minutes at dawn around the winter solstice. The beam starts on the passage’s floor, then gradually reaches down the whole 19-meter length of the passage, flooding it with light. It’s an impressive spectacle, one that attracts thousands of people to the Newgrange site for the six days each December that the sunbeam is visible.

Nor were early Europeans the only ones taking note of the sun’s travels across the landscape. At Fajada Butte, New Mexico, three stone slabs were positioned so that “dagger”-shaped beams of sunlight passing between the parallel slabs travel across carved spirals on the cliff face beneath at the summer and winter solstices and the spring and fall equinoxes.

Fajada Butte is part of the Chaco Canyon complex, inhabited between the 10th and 13th centuries by the Anasazi, or Ancestral Puebloans. They built impressively engineered cliff dwellings, some as high and densely populated as big-city apartment buildings, laid out 10-meter-wide roads that spanned modern-day New Mexico, and harvested snowmelt and rainwater for irrigation through a sophisticated system of channels and dams.
The Anasazi religion was apparently based on bringing the order seen in the heavens down to earth, so many of their sites were oriented north-south or towards complex alignments of sun, moon, and stars – which may explain why Fajada Butte was just one of many solar observatories they built. Researchers at the Exploratorium in San Francisco have designed an interactive simulator of how the Sun Daggers worked.

Looping Through Time

From the passage of time documented in stone and earth thousands of years ago to the wanderings of a Time Lord: just in time for the annual “Dr. Who” Christmas special (and the beloved sci-fi show’s 50th anniversary), our friends at the BBC have created a clever interactive map of the travels through time of all 11 incarnations of Dr. Who. This looping diagram ingeniously displays all the journeys by actor, episode, and whether the trip was into the past or the future, as well as the actual year. It’s not a linear chronology, but the course of a Time Lord’s adventures, like true love, never did run smooth.

Light on Embedded Analytics

Meanwhile, we’re hoping to shed some light on a topic dear to our heart – analytics. On Jan. 12, 2016, we’re hosting a webinar featuring TDWI Research Director Fern Halper, who will talk about Operationalizing and Embedding Analytics for Action. Halper points out that analytics need to be embedded into your systems so they can provide answers right where and when they’re needed. Uses include support for logistics, asset management, customer call centers, and recommendation engines—to name just a few. Dial in – we promise you’ll learn something!

We share our favorite data-driven observations and visualizations every week here. What topics would you like to read about? Please leave suggestions and questions in the comment area below.

Recent Data Driven Digests:
December 18: The Data Awakens
December 11: Holiday Lights
December 4: Winners of the Kantar Information Is Beautiful Awards

Read More

Data Driven Digest for December 18: The Data Awakens

It is a period of data confusion. Rebel businesses, striking from a hidden NoSQL base, have assembled their first embedded application against the evil Big Data. During the battle, data scientists managed to steal secret plans to the Empire’s ultimate weapon, the SPREADSHEET, a mass of rows and columns with enough frustration and lack of scale that it could crash an entire business plan.

While that may not be the plot of the new Star Wars film (or any, for that matter), the scenario may invoke a few cheers for the noble data scientists tasked with creating dashboards and visualizations to battle the dark side of Big Data. Find out more about how to battle your own Big Data problem with Analytics.

As the world enjoys the latest installment of the Star Wars franchise, it seemed fitting for us to acknowledge visualizations based on the movie series. Strong is the data behind the Force. Enjoy these examples.

The Force is Strong with This One

Image source: Bloomberg Business

At its core, the Star Wars movie franchise is about the battle between the light and dark sides of the Force. But how much time do the films spend exploring that mystical power that surrounds us and penetrates us, and binds the galaxy together? Amazingly, a mere 34 minutes out of the total 805 minutes amassed in the first six films.

The screen above is one of five outstanding visualizations of the use of the Force created by a team of data reporters and visualization designers at Bloomberg Business. Creators Dashiell Bennett (@dashbot), Tait Foster (@taitfoster), Mark Glassman (@markglassman), Chandra Illick (@chandraelise), Chloe Whiteaker (@ChloeWhiteaker), and Jeremy Scott Diamond (@_jsdiamond) really draw you in. They break down not only the time spent talking about the Force, but also which character uses the Force the most and what types of Force abilities are used. The team viewed each movie, compiled the data by hand, and then entered it into a spreadsheet.
If there were discrepancies, the team used the novelizations and screenplays of the films as references. While the project is engaging, it also digs deep, offering secondary layers of data such as the number of times Obi-Wan Kenobi uses the Jedi Mind Trick versus Luke Skywalker or Qui-Gon Jinn.

Great Shot, Kid, That Was One in a Million

Image source: Gaston Sanchez

Sometimes the technologies behind visualizations deserve acknowledgment. Our second entry is an example of an arc diagram compiled using the R language. The Star Wars tie-in here is a statistical text analysis of the scripts from the original trilogy (Episodes IV, V, and VI) using arc-diagram representations.

Arc diagrams are often used to visualize repetition patterns. The thickness of the arc lines can represent the frequency of connections between the source and the targets, or “nodes,” as they are often called. The form isn’t used more often because readers may not clearly see the correlation between the different nodes; however, arc diagrams are great for showing relationships where time or numerical values aren’t involved. Here, the chart shows which characters speak to each other most often, and the words they use most. (No surprise: “sir” and “master” are C-3PO’s most common utterances, while Han Solo says “hey,” “kid,” and “get going” a lot.)

Gaston Sanchez, a data scientist and lecturer with the University of California, Berkeley and Berkeley City College, came up with this arc diagram as part of a lecture he was giving on the use of arc diagrams with R. Sanchez showed how to use R’s “tm” and “igraph” packages to extract text from the scripts and compute adjacency matrices.

R has become embedded in the corporate world. R is an implementation of the S programming language, which was developed at Bell Labs in the mid-1970s. The language has been compared to Python as a way to dive into data analysis or apply statistical techniques.
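Sanchez’s R workflow (the “tm” and “igraph” packages) isn’t reproduced here, but the core data structure behind such an arc diagram – adjacency counts of who speaks after whom, plus per-speaker word frequencies – is easy to sketch in Python. The speakers and lines below are invented placeholders, not the actual screenplay text:

```python
from collections import Counter

# A toy script excerpt as (speaker, line) pairs — illustrative data only.
script = [
    ("C-3PO", "Sir, the odds are against us."),
    ("HAN",   "Never tell me the odds, kid."),
    ("C-3PO", "Yes, sir."),
    ("LUKE",  "Get going, Han."),
    ("HAN",   "Hey, I'm going."),
]

# Adjacency counts: one undirected edge per pair of consecutive speakers.
# An arc diagram draws each edge as an arc whose thickness reflects this count.
edges = Counter()
for (a, _), (b, _) in zip(script, script[1:]):
    if a != b:
        edges[tuple(sorted((a, b)))] += 1

# Per-speaker word frequencies, which could drive node or label sizing.
words = Counter()
for speaker, line in script:
    for w in line.lower().split():
        words[speaker, w.strip(".,'!?")] += 1
```

The same counting step is what an adjacency matrix encodes; the arc diagram is just one way of rendering those edge weights without a time axis.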
While R has typically been used by academics and researchers, more businesses are embracing it because it is seen as good for user-friendly analysis and graphical modeling.

This is the Data You are Looking For

Image source: Eshan Wickrema and Lachlan James

While “Star Wars: The Force Awakens” is expected to break box office records, it faces strong challengers to rank as one of the highest-grossing films of all time. According to stats from Box Office Mojo and Variety, the 1999 release of “Star Wars Episode I: The Phantom Menace” ranks number 20 on the list. When adjusted for inflation, the 1977 release of “Star Wars” ranks third on the list of all-time movies, behind “Gone with the Wind” and “Avatar.”

Looking at the first 100 days of release is one key to understanding the return on investment for a given film. Writers Eshan Wickrema and Lachlan James compared the stats of the first six Star Wars films against each other. What’s significant is that each film made more in revenue than its predecessor, with the prequel films making nearly twice the amount of “Return of the Jedi,” the most popular of the original trilogy.

We share our favorite data-driven observations and visualizations every week here. What topics would you like to read about? Please leave suggestions and questions in the comment area below.

Recent Data Driven Digests:
December 11: Illuminating the cost and output of Christmas lights
December 4: 2015 winners of the Kantar Information Is Beautiful Awards
November 27: Mapping music in color

Read More

Delivering Insights Interactively in a Customer-Centric World

Consumers today expect access to their information from wherever they are, at any time, and from any device. This includes information such as statements, bills, invoices, explanations of benefits, and other transactional records that help them better understand their relationship with a business. What’s more, they want to be able to interact with their data to gain greater insight through sorting, grouping, graphing, charting, or otherwise manipulating the information. This directly impacts customer satisfaction and loyalty – the better companies are at giving customers ubiquitous access to that information, the more opportunities they have to delight and retain their customers.

According to IDG’s 2015 “State of the CIO” report, customer experience technologies and mobile technologies are among CIOs’ top five priorities. The ability to access and interact with transactional data from any device falls squarely into both of these categories, so it’s no surprise organizations want to provide it. The chart below demonstrates the level of importance CIOs have placed on providing customers the ability to interact with their information.

However, understanding the importance of this need doesn’t necessarily align with organizations’ ability to deliver. Some interesting facts uncovered through IDG’s August 2015 survey on customer data interactivity:

17 percent of organizations cannot provide access to that data across multiple devices.
Two-thirds let customers access transactional data on any device, but only in a static format.
Only 18 percent can give customers truly interactive transactional data via the device of their choice.

The question, then, is why it is so difficult for companies to provide information in interactive formats. The IDG survey reveals that many companies lack not only a strategy to enable interactivity, but also the skilled resources to implement any strategy they might develop.
Today, attempts at cross-device interactivity are ad hoc and therefore difficult, even though customers expect to be able to slice and dice their own transaction histories at will, from the devices of their choice. What can companies do about this? Where should they begin? What are the best practices in achieving the goal of interactivity in this information-driven age?

To enable a better way to work in this age of disruption, OpenText recommends that IT leaders place priority on these three principles:

- Simplify access to data to reduce costs, improve efficiencies, and increase competitiveness.
- Consolidate and upgrade information and process platforms.
- Increase the speed of information delivery through integrated systems and visual presentation.

Simplifying access to data is a key component of this puzzle. For very large organizations, data exists in different formats in different archives across disparate departments. This can become a huge barrier to getting information in a consolidated fashion, in the appropriate formats. Some of this data needs to be extracted in its native format, and other data can be repurposed from archived statements and reports. The data can exist in legacy and newer formats, and transforming this information into a common format acceptable to any device requires organization.

Finally, accelerating the delivery of information in a device-agnostic manner can only be accomplished through integrated systems that can talk to each other and deliver data in visually compelling formats. All this requires an integrated look at the enterprise architecture and information flow. While this is very much achievable, it needs to be done in a systematic manner, with solutions that can address all the requirements as well as the barriers to opening and freeing up information flow across the organization.
OpenText, with its suite of products in the Output Transformation and Enterprise Content Management areas, has the toolkit to address different parts of this challenge with industry-leading, best-in-class solutions. Download the IDG report, “Customers are demanding insights from their data. Are you ready?” to learn more about these principles of success and how OpenText can help you deliver the interactive, digital experiences that today’s customers demand.

Read More

3 Questions: TDWI’s Fern Halper on Succeeding with Advanced Analytics

Dr. Fern Halper is director of TDWI (The Data Warehousing Institute) Research for advanced analytics. Well known in the analytics community, Halper has published hundreds of articles, research reports, speeches, and Webinars on data mining and information technology during the past 20 years. She focuses on advanced analytics, including predictive analytics, social media analysis, text analytics, cloud computing, and “Big Data” approaches to analytics. Halper is also co-author of several “Dummies” books on cloud computing, the hybrid cloud, and Big Data. OpenText chatted with Halper about the evolution of data-driven and advanced analytics.

OpenText: More companies are embracing embedded analytics. Although the concept is not new, what has changed in this space? Where are we now with the technology?

Halper: Traditionally, the term “embedded analytics” referred to analytics that was built into an enterprise application, such as a CRM [customer relationship management] or an ERP [enterprise resource planning] system. This might have included reports or visualizations embedded into these applications. I’m seeing both the terminology and technology changing to include visual analytics as a seamless part of a user interface as well as actionable analytics in interactive dashboards, automated analytics, analytics that operate in real time to change behavior, and much more. The idea is to operationalize these analytics; in other words, to make the analytics part of the business or operational process. This brings the results closer to the decision makers and their actions.

There are a number of factors driving this trend. First, organizations want right-time/real-time analytics so they can make more timely decisions and become more competitive. They are beginning to see data and analytics as a way to improve business processes, drive operational efficiencies, and grow revenue.
Additionally, as the volume and frequency of data increase, it is often not possible to perform analysis and make decisions manually, so analytic actions need to become more automated.

OpenText: What part do visualizations and dashboards play in the success of an analytics deployment?

Halper: A lot of the embedding is happening in dashboards. In fact, in our most recent TDWI study on embedding analytics, 72% of those organizations that were already embedding analytics were doing it in dashboards. These dashboards are evolving to include more visualizations and to become more interactive. The use of these dashboards – by executives, marketing, sales, finance, and operations – can help to drive an analytics culture. In operations, it can help organizations become more effective and efficient. These dashboards can be used to inform actions, which is a great first step in operationalizing analytics and making it more pervasive.

OpenText: Can you provide examples of how embedded analytics is succeeding today? Where do you see the focus going next?

Halper: We’re seeing analytics embedded in devices, databases, dashboards, systems, and applications. While a lot of the focus is in applications and dashboards, there is a move by companies to also embed more advanced analytics into databases or into systems. For example, a data scientist might build a retention model. That retention model is then embedded into the database. As new data about a customer comes in, the customer is scored for the probability of defection. That information might be passed to others in the organization. Or, the analytics might operate automatically – that is a focus. An example of this would include a recommendation engine or a preventive maintenance application. This is an evolving category. Customer support and operations are two major use cases. In many instances a main driver is automating analysis that is too complex for people to perform.
It can often involve making small automated decisions over and over again. This is one future path for operationalizing analytics, more commonly referred to as “enterprise decision management.”

In terms of value, in this study I saw that those organizations that measured either top- or bottom-line impact were more likely to embed analytics than those that did not, almost by a two-to-one margin. They tended to be a bit more advanced analytically. They were also more likely to take automated action on their analytics, use alerts, and embed models into their systems.

Halper is expected to discuss the research findings of her latest best practices report, “Operationalizing and Embedding Analytics for Action.” The webinar – sponsored by OpenText – will present her findings as well as best practices for getting started operationalizing analytics. Register for the webinar to learn more.
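The embedded retention model Halper describes (trained offline, then applied automatically as each new customer record arrives) might look like this minimal sketch. The feature names, weights, and threshold below are invented for illustration; a real deployment would learn them from historical data:

```python
# Sketch of an embedded retention model: a pre-trained logistic model
# scores each incoming customer record for probability of defection.
# Feature names and weights are invented for illustration only.
import math

WEIGHTS = {
    "support_tickets": 0.8,          # more complaints -> higher risk
    "months_since_last_order": 0.5,  # longer inactivity -> higher risk
    "tenure_years": -0.6,            # long-standing customers -> lower risk
}
BIAS = -1.0

def defection_probability(customer):
    """Logistic score in (0, 1): higher means more likely to defect."""
    z = BIAS + sum(w * customer.get(f, 0.0) for f, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

def route_customer(customer, threshold=0.5):
    """Automated action: flag high-risk customers for the retention team."""
    p = defection_probability(customer)
    return ("retention_team", p) if p >= threshold else ("no_action", p)
```

In a production system the scoring function would live next to the data (in the database or a stream processor), so every new record is scored and acted on without a human in the loop – which is exactly the “operationalized analytics” pattern described above.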

Read More

Data Driven Digest for December 11: Holiday Lights

A holiday light show illuminates Canada's

Christmas and the New Year are approaching, so this week we’re sharing some data visualizations with connections to holiday celebrations. Pour yourself some eggnog (or glögg or another favorite holiday beverage), put on some seasonal music, and settle in for some great watching. Enjoy!

Shed Some Light on the Issue

December 13 is celebrated as St. Lucia Day in several countries, from Sweden to Italy and even the Caribbean island nation of Saint Lucia. As fits a (possibly legendary) Catholic saint whose name derives from the Latin word for light, lux / lucis, this is a celebration of light at a time when the winter solstice is approaching and the days are at their shortest.

Speaking of light: now that we’re surrounded by inexpensive electric light around the clock, it’s hard to imagine how dependent productivity once was on reliable light sources. Professor Max Roser at Our World in Data, one of our favorite resources, demonstrates how the Industrial Revolution both drove the demand for more light and filled it. His interactive graph, based on the work of economists Roger Fouquet and P.J.G. Pearson, shows how the price of lighting dropped sharply starting about 1750. That’s when new energy sources became available, starting with whale oil and kerosene (for lamps) and cheap beef tallow (for candles). The mid-19th century added arc lights and gas lamps. Then, once electricity became common around 1900, the price of illumination dropped to nearly nothing, relative to what it had been in the Middle Ages.

Meanwhile, as lighting became cheaper, cleaner, and more convenient, everyone took advantage of it. Cities began putting up lamps to make streets safer. Factory owners added night shifts. Students, housewives, shoppers, entertainment-seekers – everyone felt liberated by the electric bulb.

This Little Light of Mine…

And of course, that leads us to Christmas lights.
In many countries, they’re a source of neighborhood, city, or even national pride (as shown by the Wonders of Winter show every year in our headquarters town of Waterloo, Canada, and the sound-and-light shows on Parliament Hill in Ottawa). Despite the huge cost advantage of electricity over earlier light sources, incandescent bulbs are not very energy-efficient. So Christmas lights can still cause a sizable bump in many household budgets (about $20-50 extra, depending on the price of a kilowatt-hour in your area and how intensely you decorate).

But in recent years, innovations in bulbs, especially small LEDs, have dropped their energy demands considerably. The Western Area Power Administration (WAPA) reported in 2010 that a string of LED C7 bulbs (the thumb-sized ones used mostly outdoors) would cost only 23 cents to run during the entire holiday season, compared to $7.03 for conventional incandescent bulbs. (Miniature bulbs are cheaper than C7s, even if you don’t switch to LEDs. The State of California estimates that a string of indoor lights, running a total of 300 hours a month, would cost $1.38 to operate if it’s made up of miniature bulbs, vs. $4.31 for C7 bulbs – a 3-fold price difference.)

We’re Burning Daylight

Want to take the best photos of your holiday light display? Professional photographer Jay P. Morgan has great tips on his blog, The Slanted Lens. For a pleasant, soft glow, shoot your photos just as the sun is going down. That way, there’s still some light in the sky to help illuminate your house, yard, and so forth, Morgan explains. If you wait until full darkness, the contrast between the lights and the rest of the image is too stark; details on the house won’t “pop” and the house won’t show up well against the sky.

The data-visualization angle? Morgan’s handy chart shows how the ideal moment for Christmas-card photography comes when the fading daylight drops to the same brightness level as your lights.
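As a footnote to the bulb-cost comparison earlier: those seasonal figures reduce to watts × hours × electricity rate. A quick sketch of the arithmetic, where the string wattages and the $0.15/kWh rate are assumptions for illustration, not figures from WAPA or the State of California:

```python
def season_cost(watts, hours, rate_per_kwh):
    """Energy cost of running a light string: kilowatts x hours x $/kWh."""
    return watts / 1000 * hours * rate_per_kwh

# Assumed figures: a 25-bulb C7 incandescent string at ~5 W per bulb
# vs. an LED equivalent at ~0.1 W per bulb, lit for 300 hours over the
# season, at an assumed $0.15 per kilowatt-hour.
incandescent = season_cost(25 * 5, 300, 0.15)   # 125 W string
led = season_cost(25 * 0.1, 300, 0.15)          # 2.5 W string
```

With these assumptions the incandescent string costs about $5.63 for the season and the LED string about 11 cents – a roughly 50-fold gap, the same order of magnitude as the WAPA comparison.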
(He also illustrates how the color temperature drops with the light level.)

When the Lights Go Down on the City

While we’re on the topic of light, let’s consider how much can be gleaned from high-altitude pictures of the Earth after dark. Images taken by NASA satellites show interesting correlations to human activities. The NASA scientists who compiled the satellite images into this impressive display and shared it through the Visible Earth project note:

The brightest areas of the Earth are the most urbanized, but not necessarily the most populated. (Compare Western Europe with China and India.) Cities tend to grow along coastlines and transportation networks. … The United States interstate highway system appears as a lattice connecting the brighter dots of city centers. In Russia, the Trans-Siberian railroad is a thin line stretching from Moscow through the center of Asia to Vladivostok. … Even more than 100 years after the invention of the electric light, some regions remain thinly populated and unlit. The interior jungles of Africa and South America are mostly dark, but lights are beginning to appear there. Deserts in Africa, Arabia, Australia, Mongolia, and the United States are poorly lit as well (except along the coast), along with the boreal forests of Canada and Russia, and the great mountains of the Himalaya.

And Roser, doing his own analysis of the Visible Earth images, points out that the level of lighting often marks a sharp political and economic divide, such as between North and South Korea. Prosperous South Korea glows after dark, especially around the capital, Seoul. But its northern counterpart, kept poor by decades of Communist dictatorship, is nearly invisible after dark.

Meanwhile, we’re hoping to shed some light on a topic dear to our heart – analytics. On Jan. 12, 2016, we’re hosting a Webinar featuring TDWI Research Director Fern Halper, who will talk about Operationalizing and Embedding Analytics for Action.
As Halper points out, what good is having analytic capacity in your business processes if nobody uses it? Analytics needs to be embedded into your systems so it can provide answers right where and when they’re needed. Uses include support for logistics, asset management, customer call centers, and recommendation engines – to name just a few. We hope you’ll dial in – we promise you’ll learn something!

We share our favorite data-driven observations and visualizations every week here. What topics would you like to read about? Please leave suggestions and questions in the comment area below.

Recent Data Driven Digests:

December 4: 2015 winners of the Kantar Information Is Beautiful Awards
November 27: Mapping music in color
November 20: Popular diets, parole risk assessment, hot startup universities

Read More

Data Driven Digest for December 4: Data Is Beautiful

The end of the year is approaching, which means that for many of us, it’s time to take stock: “Biggest News Stories of 2015,” “10 Beloved Celebrities We Lost This Year,” and the like. In our line of work—analytics—it’s a good opportunity to survey some of the best data visualization examples of the past 12 months. So this week we’re sharing with you some of the 2015 winners of the Kantar Information Is Beautiful Awards, an annual contest organized by independent data journalist David McCandless (@infobeautiful), author of “Knowledge Is Beautiful” and “Information is Beautiful.” Winners were selected by a public vote from among an impressive long list of candidates. Their topics ranged from weather to politics to the growth of an unborn baby during pregnancy. Find yourself a comfortable seat and settle in for browsing—you’ll find a lot of great things to look at. Enjoy!

Red vs Blue

If you live in the United States and feel, judging by the tone of political coverage, that politics has gotten more ruthlessly partisan in recent years, you’re not wrong. When political scientists crunched the numbers on the voting behavior of U.S. Representatives since the end of World War II, they found that across-the-aisle cooperation between members of different parties has dropped steadily in the last 40 years. The reasons for this increasing polarization are complex—Congressional districts designed to favor one party or the other, an increasingly mobile population more likely to elect candidates by party rather than their stance on purely local issues, big money, politicians flying home on weekends to be with constituents rather than staying in Washington, D.C. to build relationships, and more.

The authors of the underlying paper documented their findings in a table of statistics. That’s fine, but what really sells this story is the visualization drawn up by Mauro Martino, an Italian designer who leads the Cognitive Visualization Lab for IBM’s Watson group.
His network diagrams show how, year by year, the blue dots representing Democrats and the red ones representing Republicans pull further apart. The images are hauntingly beautiful – they could be galaxies colliding or cells dividing. Yet they are telling a story of increasing division, bitterness, and gridlock.

Hello, world! / Bonjour, le monde! / Nǐ hǎo, shìjiè!

The silver medalist in the Information is Beautiful Awards’ data visualization category is “A World of Languages” by Alberto Lucas López (@aLucasLopez), graphics director at the South China Morning Post in Hong Kong. López’s diagram ingeniously carves up a circle representing the Earth into sectors for each of the major languages, and within them the countries where each language is spoken. He supplements his eye-catching image with smaller charts showing language distributions by country (curiously, Papua New Guinea is far ahead of the rest, with 839 separate languages spoken) and the most popular languages to learn. (It’s no surprise that English far outstrips the rest, with 1.5 billion people learning it – nor that Chinese and two former colonial languages still spoken in many countries, French and Spanish, are next. But it says something about cultural influence, from Goethe to manga, that German, Italian, and Japanese are the runners-up.)

Working for a Living

Job recruiters and economists are keenly aware of rises and falls in national unemployment rates and which industries or sectors are growing, but most businesses can derive value from this information too. The Wall Street Journal team of graphics editor Renee Lightner (@lightnerrenee) and data journalist Andrew Van Dam (@andrewvandam) created an interactive dashboard of unemployment and job gain/loss statistics in the U.S. that conveys an amazing amount of data from the past 10 years (and going back to 1950 in some views).
This job data tracker only tied for a bronze medal in the category of Interactive Data Visualization – which says something about the level of competition in this field. Because of the tracker’s thoughtful design, it can answer a wide range of questions. How about: What sectors of the economy have shown the most sustained growth? Health care and social services, followed by professional services and then restaurants—probably a sign that the economic recovery means people have more to spend on a dinner out. Most sustained losses? Manufacturing, government, and construction—even though construction also shows up on the dot-matrix chart as having frequent peaks as well. (Construction jobs went up 0.83% in Feb. 2006—something we now recognize as a symptom of the housing bubble.)

Meanwhile, “Unemployment rate, overall” vividly charts the recessions of the past 60 years in a colorful mosaic that could easily be featured in an upscale décor magazine.

This is just a small sampling of the ingenious ways some of our best thinkers and designers have come up with to analyze and display data patterns. We share our favorites every Friday here. If you have a favorite you’d like to share, please comment below.

Recent Data Driven Digests:

November 27: Mapping music in color
November 20: Popular diets, parole risk assessment, hot startup universities
November 13: Working parents, NBA team chemistry, scoring different colleges

Read More

Data Driven Digest for November 27: Data Mapping Music

Étude Chromatique

One of the biggest motivations behind creating graphs, charts, dashboards, and other visualizations is that they make patterns in data easier to perceive. We humans are visually oriented creatures who can intuitively note patterns and rhythms, or spot a detail out of place, through imagery long before we can detect them in written reports or spreadsheets. Or sheet music, for that matter. For this week, we present some examples of how to display music visually, which may get you thinking of other creative ways to visualize data and bring patterns to the surface. Enjoy!

If you’ve had any experience reading music, you may be able to tell some things about a piece by looking at its written score. For example, you could probably tell that this piece (an excerpt from Arvo Pärt’s “Spiegel Im Spiegel”) is of a gentler, less agitated nature than this one (the introduction to “Why Do the Nations So Furiously Rage,” from Handel’s “Messiah,” which you might be hearing this holiday season). In fact, Handel and his contemporaries expected listeners to be reading along to the music in the printed score and appreciating the “word-painting” with which they illustrated the text or mood of the music. The practice of word-painting has become less common as fewer and fewer people in modern generations learn to read sheet music. But some composers have found other ways to illustrate their music.

The avant-garde composer Ivan Wyschnegradsky created “chromatic” music in both senses of the word. He used not only every note in the 12-note tuning system of classical Western music (where adjoining notes on a piano keyboard are a half-step apart, like A, B-flat, and B natural – what is called a chromatic scale), but also notes “between the cracks.” These “ultra-chromatic” pieces required special keyboards that could play two or three different notes between the keys of a regular piano.
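Wyschnegradsky’s quarter-tones split each of the 12 semitones in half, giving 24 equal steps per octave. Assuming equal temperament tuned around A4 = 440 Hz, the frequencies work out like this:

```python
def quarter_tone_freq(steps_from_a4, base=440.0):
    """Frequency of a pitch n quarter-tone steps above A4 in 24-tone
    equal temperament: each step multiplies frequency by 2**(1/24)."""
    return base * 2 ** (steps_from_a4 / 24)

a4 = quarter_tone_freq(0)            # 440.0 Hz
a_half_sharp = quarter_tone_freq(1)  # the note "between the cracks"
b_flat = quarter_tone_freq(2)        # an ordinary semitone above A4
```

Each quarter-tone step raises the frequency by only about 2.9 percent (50 cents in musicians’ terms), which helps explain why the differences are so hard to hear.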
It’s hard for people who don’t have perfect pitch to hear the difference between these so-called “quarter-tones,” but they lend a subtle eeriness to his music. (Here’s an example: “24 Preludes in Quarter-Tone System.”)

Then Wyschnegradsky turned to a familiar data-visualization technique: color. He started representing his music in rainbow-hued wheels, like this (picture via the Association Ivan Wyschnegradsky, Paris). Ever since childhood, he had been fascinated by rainbows. As an adult, he noted the parallels between the 12 colors of the chromatic spectrum (red, orange, yellow, green, blue, and purple, plus the intermediate hues of red-orange, orange-yellow, and so forth) and the 12 chromatic notes in classical music. And just as colors can shade into one another subtly (is this lipstick reddish-orange, or an orangey-red?), so can musical notes (as on a slide whistle or trombone). The parallels were too good to pass up. So Wyschnegradsky assigned a color on the spectrum to each of the dozens of quarter-tones in his musical system, then plotted his melodies in circles like a spiderweb or radar chart.

As Slate Magazine blogger Greta Weber wrote: “Each cell on these drawings corresponds to a different semitone in his complex musical sequences. If you look closely enough, you can follow the spirals as if it were a melody and ‘listen’ to the scores they represent.”

Wyschnegradsky’s color-wheel scheme never caught on. But the patterns it brings to light have parallels in popular visualization systems, from traffic delays to weather. It’s clear that color helps illuminate.

Like what you see? Every Friday we share great data visualizations and embedded analytics. If you have a favorite or trending example, please comment below.

Recent Data Driven Digests:

November 20: Popular diets, parole risk assessment, hot startup universities
November 13: Working parents, NBA team chemistry, scoring different colleges

Read More

Data Driven Digest for November 20: The Last Mile of Big Data

In a recent study by the Digital Clarity Group, Thinking Small: Bringing the Power of Big Data to the Masses, our very own Allen Bonde (@abonde), writing before he joined OpenText, noted that the best opinions are formed and actions are taken within the “Last Mile” of Big Data. By Last Mile, Allen means the most immediate information and data that is accessed or consumed. For application designers, meeting the Last Mile challenge requires understanding self-service use cases and leveraging tools that turn Big Data into “small data” that helps people perform specific tasks. Some of these use cases include insight, monitoring, and targeting. For this week, we provide some examples of visualizations that crunch their fair share of Big Data on the back end but present it in a way that meets the Last Mile challenge. Enjoy!

Popular Diets

With the holidays coming up, we thought we’d look at dieting trends over the last few years. Health and science reporter Julia Belluz (@juliaoftoronto) assembled a review of the most-searched diets by year and metropolitan area, based on Google search data. Reaching back to 2005, the series of visualizations lets the viewer see the slow yet steady spread of first the gluten-free diet and now the Paleolithic diet, which command the news cycles and self-help bookshelves. Other diets covered include vegan eating, the low-carb diet, the South Beach Diet, and the Atkins Diet.

Parole Risk Assessment

While we leave the subject of incarceration up to the experts, this interactive visualization from our friends over at FiveThirtyEight (@FiveThirtyEight) caught our eye. A recent trend in criminal sentencing is the use of predictive analytics and risk assessments to estimate how likely a prisoner is to commit another crime in the future. Scores are determined by factors such as gender, county, age, current offense, number of prior arrests, and whether multiple charges were filed.
The authors of the FiveThirtyEight study point out that more than 60 risk assessment tools are being used across the U.S. Although they vary widely, “in their simplest form, they are questionnaires — typically filled out by a jail staff member, probation officer or psychologist — that assign points to offenders based on anything from demographic factors to family background to criminal history. The resulting scores are based on statistical probabilities derived from previous offenders’ behavior. A low score designates an offender as ‘low risk’ and could result in lower bail, less prison time or less restrictive probation or parole terms; a high score can lead to tougher sentences or tighter monitoring.”

The simulation is loosely based on the Ohio Risk Assessment System’s Re-Entry Tool, which is intended to assess the probability of prisoners reoffending after they are released from prison. The visualization was produced in collaboration with The Marshall Project (@MarshallProj), a nonprofit news organization that covers the criminal justice system. Considering that the U.S. Attorney General’s office has endorsed the idea of risk assessment, it’s likely that visualizations like this will be used in the future to manage criminal sentencing.

School for Startups

It’s not who you know, but where you go to college, that could determine the success of your startup, according to our last visualization. Our friends over at DataBucket built a series of visualizations using data from the Crunchbase API to compare the 5,000 most-funded startups of the past 15 years and the education of each of their founders. They found that success has a pattern: graduating from a university that is prestigious, on the West Coast, focused on engineering, and/or home to a high-powered MBA program helps increase your chances of smarter founders and benefactors with deep pockets.
“In terms of average amount of funding graduates from each school gets, Harvard, MIT, and Stanford get a standard amount of funding. Indian Institute of Technology has a disproportionately high average funding as well as a large number of founders,” the DataBucket authors comment. “Hanzhou Normal University and Zhejiang University of Technology are off the charts for average funding received. This is attributed completely to Jack Ma and Eddie Wu, [the] founders of Alibaba.” Like what you see? Every Friday we share great data visualizations and embedded analytics. If you have a favorite or trending example, please comment below. Recent Data Driven Digests: November 13: Working parents, NBA team chemistry, scoring different colleges
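Returning for a moment to the risk-assessment questionnaires described above: in their simplest form they are additive point scales mapped to coarse bands. A minimal sketch of that mechanism, where the items, point values, and cutoffs are invented for illustration and do not reproduce the Ohio Re-Entry Tool or any real instrument:

```python
# Invented point values -- not the Ohio Risk Assessment System or any
# real instrument; real tools are validated against outcome data.
POINTS = {
    "prior_arrests_3_or_more": 2,
    "age_under_25": 1,
    "multiple_charges": 1,
    "stable_employment": -1,
}

def risk_score(answers):
    """Sum the points for every questionnaire item answered 'yes'."""
    return sum(POINTS[item] for item, yes in answers.items() if yes)

def risk_band(score):
    """Map a raw score to a coarse band, as the article describes."""
    if score <= 0:
        return "low"
    return "medium" if score <= 2 else "high"
```

The statistical validation step – calibrating the point values and cutoffs against previous offenders’ observed behavior – is where the real work (and the controversy) lies.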

Read More

3 Questions: James Taylor Surveys the Analytics Landscape

More and more, organizations turn to analytic tools to drive decision-making. But are business leaders using the right tools to make those choices efficient and effective? Despite the progress in advanced analytics tools, business leaders often favor antiquated methods for analyzing data. Or worse, they choose instinct over facts.

We chatted with James Taylor (@jamet123), CEO at Decision Management Solutions, and asked how businesses today should be thinking about data analytics. His take? Advanced analytics software, like OpenText Analytics Suite, must support analytic capabilities that align with business needs. The best analytics tools are geared toward the intended user and support decision-making, both human and automated.

OpenText: Let’s start with the Business Intelligence landscape. What’s changing? What should companies be looking for in the new powers of analytics?

James Taylor: The critical change we identified in the research, and we see this with our clients also, is the move from reporting and monitoring to deciding and acting. Where once companies invested in Business Intelligence simply to give themselves an ability to report on their data and perhaps track some key metrics, now they see the opportunity in using analytics to improve business decision-making. This in turn is broadening and deepening the capabilities they need. Because a typical organization makes a wide range of decisions, they need a broad set of analytic capabilities; and because the focus is decision-making, they need deeper, more capable analytics. Once organizations focus on analytics for decision-making, they will find they want to expand their footprint into more advanced kinds of analytic technologies. [But this will require them to integrate these] new capabilities and skills with their existing ones, creating a rich set of analytic capabilities that can be applied to meet their decision-making needs.

OpenText: What problems do modern Analytics address?
Are companies still forecasting and analyzing their customers with Excel pivot tables and gut instinct?

James Taylor: Yes, sadly, some companies are still using Excel pivot tables and what they think of as gut instinct. It’s important to remember, though, that gut instinct is really just what we have synthesized from the data we have seen before – without rigor, documentation, or real analysis, but data nonetheless. Modern analytics allows organizations to bring all their decision-making to a more rational place, applying more robust and more formal analysis of a broader array of data to improve their decisions. Once an organization focuses analytics on decision-making, there is almost no limit to the problems it can address. Especially with the growth in external data available from APIs and cloud-based analytic platforms, the sky’s the limit.

OpenText: Where should organizations put their energy when it comes to Analytics?

James Taylor: Organizations should be making the switch to focusing analytics on decisions, not reporting or metrics, as fast as they can. This means getting more serious about understanding the decision-making that drives their business, especially the front-line and operational decisions that tend to be neglected. They need to think about a broad array of analytic capabilities, from self-service to highly embedded solutions, that support all the different roles engaged in decision-making. Tactically, they should focus on the problems they need to solve and let those drive the analytic capabilities they need, while putting those capabilities into an overall enterprise framework for long-term strategic success.

Decision Management Solutions’ 2015 assessment of the Analytics Landscape is currently available for download. Visit this page to download the free report and view the infographic.

Read More

Moving at the Speed of Cloud: Innovation, Insight, and Integration in the OpenText Cloud

As customers pursue digital transformation, they are increasingly looking to the cloud for agile business options and improved productivity while maintaining control of their critical information assets. At OpenText, we provide cloud and hybrid cloud options with seamless movement between on-premises and cloud. Here is a look at some of the recent and emerging cloud innovations:

Innovation

Our growth in SaaS apps is significant. Existing SaaS applications have advanced: OpenText Core adds many new features including Outlook® integration, AD Sync, and Content Suite integration, while Archive Center adds Exchange Online support, file system and email handling, and CMIS and custom file support. And there are new SaaS offerings, from PowerDocs, which drives customized document creation, to iHub, which brings easily purchased and deployed analytics for any data source.

There are also new, simple-to-buy, easy-to-deploy standard Managed Cloud Services (MCS) packages. MCS offerings provide complete flexibility for custom configurations and choices within cloud and hybrid cloud implementations. Now there are also simple standard offerings in Content Suite Cloud Edition, Media Management Cloud Edition, and Big Data Analytics in the cloud. These standard packages have a range of options for service level, functionality, and capacity growth. They are available today with currently shipping products and will also be a part of the Cloud 16 release.

Industry solutions and feature sets have been added in the cloud. Healthcare Direct for RightFax, which will soon be available with Fax2Mail, provides the ability to translate fax transmissions into direct messages so customers can comply with recent US healthcare legislation without having to change their business processes. ROSMA addresses Procurement Performance Management, and Core has added features requested by legal and professional services organizations.
Insight

As we amass greater and greater volumes of information, both structured and unstructured, gaining a true understanding of what the information shows becomes increasingly challenging. That is where analytics comes in. Big Data Analytics in the cloud and iHub provide analytics services and allow customers to create their own analytics reporting without needing a data scientist on staff. Analytics has been embedded and/or integrated with all of the EIM Suites to provide that insight for customers. The Trading Grid will provide business analytics and logistics Track and Trace for supply chain shipment visibility. Analysis of content and processes is provided through integration with Content Suite and Process Suite. Analytics as a Service in the cloud allows customers to “bring their own data” and very quickly begin to analyze it and gain insights that can drive their business.

Integration

We believe it is a hybrid world, and customers need the flexibility to integrate their cloud systems with on-premises systems. We provide complete flexibility to integrate with any system on-premises, in the OpenText Cloud, and in other clouds, and many of our Managed Cloud Services customers choose hybrid implementations. Recent innovations in this area include Archive Center CE with SAP® integration, integrations with SAP HANA and S/4HANA, and a new integration that brings the power of ECM together with Salesforce®. In the OpenText Cloud, customers can have integrations across EIM suites and with their other systems, whether on-premises, in other clouds, or operated and fully managed in our cloud. We see an increasing number of customers taking advantage of this option.

Cloud 16

While some of this innovation is available today, most of it comes together in the Cloud 16 release, which consolidates suites and applications in five major areas. This release will be followed by quarterly releases that bring innovations to customers on a more frequent basis.
OpenText Content Cloud 16 provides information governance options that are quickly and easily deployed and fully managed in cloud and hybrid cloud scenarios. OpenText Experience Cloud 16 empowers businesses to increase user engagement and improve customer satisfaction while avoiding time spent managing applications or infrastructure. OpenText Process Cloud 16 enables businesses to rapidly automate their business processes and have the platform managed by EIM specialists in the OpenText Cloud. OpenText Analytics Cloud 16 provides embedded analytics for EIM applications and for custom content sources, fully managed by EIM experts in the OpenText Cloud. OpenText Business Network 16 provides a B2B integration platform for managing transactions such as EDI and On Demand messaging, with platform services and Managed Services options. Customers can try beta versions of Cloud 16 now; the release arrives in March 2016, followed by ongoing quarterly releases of new innovations. We are moving at the speed of cloud to bring innovative solutions to business challenges.


The Backstage Pass to OpenText Suite 16 & OpenText Cloud 16

We just shared some huge news with our attendees here at Enterprise World in Las Vegas: an exclusive future roadmap unveiling OpenText Suite 16 and OpenText Cloud 16, the next generation of our EIM offerings, which will become available in March 2016. We shared a lot of details about Suite 16 and Cloud 16 in our press release (you can read it here), but we couldn’t fit in all the thinking that went into this release.

Strategic principles of OpenText Suite 16 & Cloud 16

From what we’ve experienced and from what our customers tell us they want and need, we came up with a dream list of innovations to help an enterprise make the most of digital disruption. We focused on that list of strategic principles across our entire Suite 16 and Cloud 16 offerings. Below is what was going through our heads:

1. Deepen functionality across all suites. This is a major release, full of new features and innovations. Keeping our customers’ suggestions in mind, we significantly deepened functionality throughout the offerings. Practically every product line is raising the bar in terms of customer value proposition and competitive differentiation. Check out the press release for some of the highlights, and watch for more details as we launch the new suites and cloud offerings in March.

2. Help customers transition to the cloud. Right off the top, we wanted to be sure to offer all of our suites in the cloud, available as a subscription or as managed cloud services. That was a must-do, but then we took it further and added integration with other cloud products such as Salesforce® and Office 365®. We also added new Software-as-a-Service (SaaS) applications, such as OpenText Core, PowerDocs, and Archive Center, to enable as much flexibility and savings as possible: customers want what they need, and no one wants to pay for more than they need.

3. Focus on user experience and remove barriers to user adoption.
Enterprise productivity begins with removing the barriers to adoption: the easier a product is to use, the more people use it. So we invested heavily in improving the user experience across all the suites, in the browser as well as on mobile devices. We put a big focus on an HTML5-based responsive experience, and customers who got to preview the new UI of products such as Content Suite or Media Management this week are raving about it. The improved user experience is one of the most noticeable innovations in Suite 16 and Cloud 16; in fact, this alone makes upgrading worthwhile.

4. Integrate with more enterprise applications and across the suites (with an extra focus on analytics). We’ve added integration with more enterprise apps, like Salesforce and SuccessFactors®, in addition to SAP®, Oracle®, and Microsoft®. And there is major integration between the suites. For example, OpenText Process Suite fully integrates with OpenText Content Suite, Archive Center, Media Management, Capture Center, and CCM; CCM in turn integrates with OpenText WEM, DAM, CS, Notifications, and more. It goes on and on across all the suites, enabling us to solve customer problems no other vendor can solve. We also put an extra focus on analytics, which is itself a new suite and is embedded into all the other suites, bringing even more value out of existing deployments.

5. Deliver information flows as a way to solve complex problems. All the products in the world are not going to solve real business problems if they’re not integrated in a way that follows the logical flow of information through business processes and applications. That’s why we focus on core information flows, from information-centric flows such as capture-to-disposition, create-to-consume, and incident-to-resolution all the way to business flows such as procure-to-pay.
For more efficient information flows, we’ve delivered automation for the procure-to-pay (P2P) business process, added a new entity-modeling layer to the Process Suite platform, and extended process-centric collaboration and information sharing.

6. Provide more value from existing deployments. When you can get more value out of existing deployments, you reduce the total cost of ownership. New capabilities, pervasive use of analytics, a new UI, a focus on cost of ownership, cloud delivery, and subscription-based pricing all bring more flexibility and value to the enterprise.

Each of these strategic principles makes upgrading to Suite 16 and Cloud 16 worthwhile. This is a milestone release that existing customers will love and want to upgrade to. Read more about OpenText Suite 16 and OpenText Cloud 16 here, and let me know what you think.


Data Driven Digest for November 13: Clear Visual Data


Often in data science, visualizations are criticized for cramming too much information into an overly busy design. We here at Data Driven Digest could not agree more. Making your data relevant in a visual way lets you clearly communicate the story or feeling you wish to express. Like other forms of art, data visualization requires an easy-to-understand framework that catches your attention and inspires further thinking. “Music is powered by ideas. If you don’t have clarity of ideas, you’re just communicating sheer sound,” famed cellist Yo-Yo Ma once said. This week, we offer some examples of how complex data can be displayed in an easy-to-understand fashion.

Layers of Working Parents: The share of U.S. households in which both parents work full time has increased to 46 percent, up from 31 percent in 1970, according to the graphic above from a new Pew Research Center survey. The survey, conducted between September 15 and October 13 of this year, illustrates some of the challenges in balancing work and family. Pew researchers interviewed more than 1,800 parents with children younger than 18 and cross-compared the results with its population surveys and other microdata. What is significant about this report is that it compresses a lot of information about changing family structure into a simple, clear format: a layered bar graph labeled inline rather than relying on a separate key. The visualization is comprehensive, yet incomplete. The survey did not include data prior to 1970, because Pew only began asking about working couples once the women’s movement took hold. Additionally, same-sex parents were not included in the findings, as Pew researchers wanted a like-for-like comparison between couples then and now. Even at this very general, high-level overview, you can see significant dips in employment, such as the fallout of the 2008 subprime mortgage crisis.
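Part of what makes a layered bar graph like Pew’s so legible is that each layer is labeled inside its own segment instead of in a legend. Placing those labels comes down to computing each segment’s cumulative start and midpoint. Here is a minimal, library-free sketch of that arithmetic; the shares below are illustrative placeholders, not Pew’s exact breakdown:

```python
def segment_midpoints(shares):
    """Given the percentage shares of one stacked (layered) bar,
    return (start, midpoint) for each segment so a label can be
    drawn centered inside its own layer rather than in a key."""
    positions, start = [], 0.0
    for share in shares:
        positions.append((start, start + share / 2.0))
        start += share
    return positions

# Illustrative household-type shares for a single year's bar.
shares = [46.0, 17.0, 26.0, 11.0]
for (start, mid), share in zip(segment_midpoints(shares), shares):
    print(f"segment of {share}% starts at {start}%, label centered at {mid}%")
```

The same start offsets would feed a plotting library’s stacked-bar “bottom” positions; the midpoints give the label anchors.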
Plotting NBA Team Success: Scatter plots are often used to show correlations between two variables that aren’t tied to a linear time sequence, and they are great for identifying trends. While business users might want a lot of points on the X-Y graph, that can be overwhelming for the eyes; adding filters is the best way to overcome this obstacle. Our friends over at Data Buckets are fans of U.S. professional basketball, an area benefiting immensely from data analysis. They’ve created a scatter plot built around the premise that great team chemistry and player retention are sure-fire ways for general managers to establish winning seasons year after year, i.e., the dynasty approach. “Plotting retention against chemistry allows us to classify NBA franchises into four categories as shown in the table above,” Data Buckets wrote. “Teams with low retention but above-average performance are indications of a newly formed core team, as newly acquired players have figured out how to work together in a short period of time. General managers with a high retention and above-average performance team have found a core group of players that work well together. In these two instances, GMs should not break up their rosters.” Filtering for certain teams shows that the Los Angeles Lakers know how to build great teams with good chemistry in most years, while the New York Knicks do not. The scatter plot predicts that the Golden State Warriors, Atlanta Hawks, and Memphis Grizzlies should perform exceedingly well this season.

College Scorecard: Sometimes the best visualization comes in the form of a list or scorecard. It’s as simple as that. The visualization presented above is derived from the U.S. Department of Education’s College Scorecard database and the Bureau of Economic Analysis, using a methodology developed by Jonathan Rothwell (@jtrothwell), a fellow at the Brookings Institution’s Metropolitan Policy Program.
Rankings for 3,173 colleges (1,507 two-year colleges and 1,666 four-year colleges) are scored against variables like curriculum value, STEM orientation, graduation rates, and faculty salaries. As Rothwell notes: “Value-added measures attempt to isolate the contribution of the college to student outcomes, as distinct from what one might predict based on student characteristics or the level of degree offered. It is not a measure of return on investment, but rather a way to compare colleges on a more equal footing, by adjusting for the relative advantages or disadvantages faced by diverse students pursuing different levels of study across different local economies.” The list’s simple design lets users filter and compile information so that parents and students can gauge the long-term value of their college choices.
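The value-added idea Rothwell describes, subtracting the outcome one would predict from student characteristics from the observed outcome, can be sketched in a few lines. The school names, earnings figures, and single-number “prediction” below are invented for illustration and are not part of his actual model:

```python
# Toy value-added ranking: observed outcome minus predicted outcome.
# All schools and figures here are invented for illustration.
schools = [
    # (name, observed median earnings, earnings predicted from student traits)
    ("Alpha College", 52000, 45000),
    ("Beta University", 61000, 60000),
    ("Gamma Institute", 48000, 50000),
]

def value_added(observed, predicted):
    """The college's estimated contribution: what its students
    achieve beyond what their characteristics alone would predict."""
    return observed - predicted

# Rank on the residual, not the raw outcome, so schools serving
# less-advantaged students are compared on a more equal footing.
ranked = sorted(schools, key=lambda s: value_added(s[1], s[2]), reverse=True)
for name, observed, predicted in ranked:
    print(f"{name}: value added {value_added(observed, predicted):+d}")
```

Note that Beta University has the highest raw earnings but only the second-highest value added, which is exactly the distinction the methodology is designed to surface.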


What is the OT 9000? Find Out at Enterprise World

At OpenText’s US headquarters, there is a secret project underway that we expect will pave the way for the adoption of “Analytics Everywhere.” Lucky you if you are attending this month’s Enterprise World in Las Vegas; you will get an advance look at the future of analytics. The project started earlier this year when the OpenText Analytics team was looking for a new way to design, deploy, and display IoT data. With the sheer number of people coming to Las Vegas for the Enterprise World show, we felt the audience could provide us with real-time data, fed into a mobile data visualization app that we could run in a remote setting. The project was conceived and designed by the OpenText Analytics Innovation team (Kristopher Clark, Mark Gamble, Dan Melcher, Jesse Freeman, Clement Wong, Trevor Huston, and Brian Combs), who spent many a night tinkering with the design and the output. What they came up with are small black boxes, dubbed the OT9000, situated at strategic locations around the OpenText US corporate office. What do they do? Why does the orb glow red? That’s the secret. But what we can tell you is that the whole project uses a lot of new-age tech. The boxes were designed using a 3D printer. Each box contains a Raspberry Pi controller with a Wi-Fi radio. The software stack includes an MQTT broker and a MySQL database. The devices send information to OpenText Information Hub (iHub), where it can be viewed on a mobile device or an embedded dashboard. But perhaps we’ve said too much already. Here’s a little teaser that shows the technology in action.
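In a pipeline like this, the device’s main job is to package each sensor reading and publish it to the MQTT broker, from which it can be persisted to MySQL and surfaced in iHub. The OT9000’s actual message format is (naturally) secret, so the topic scheme, field names, and `publish` call below are assumptions for illustration only:

```python
import json
import time

def build_reading(device_id, sensor, value):
    """Package one sensor reading as a JSON payload, as a device
    like the OT9000 might do before publishing it over MQTT."""
    payload = {
        "device_id": device_id,
        "sensor": sensor,
        "value": value,
        "timestamp": int(time.time()),  # epoch seconds
    }
    return json.dumps(payload, sort_keys=True)

def topic_for(device_id, sensor):
    """Hypothetical topic hierarchy: one topic per device per sensor,
    so subscribers can filter with wildcards like office/+/motion."""
    return f"office/{device_id}/{sensor}"

# With a real broker you would hand these to an MQTT client
# (for example, paho-mqtt's client.publish(topic, payload)).
msg = build_reading("ot9000-03", "motion", 1.0)
print(topic_for("ot9000-03", "motion"), msg)
```

A subscriber process on the server side would then decode the JSON, insert the row into MySQL, and let iHub query that table for the dashboards.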
To find out the answers to these questions and more, you’ll have to attend Enterprise World in Las Vegas and look for the sessions on the “Future of Embedded Analytics” and “Big and Small Data.” For more OpenText Analytics content, you’ll also want to check out the Data Driven Apps customer panel moderated by Forrester Principal Analyst Boris Evelson on Wednesday, a presentation on OpenText Analytics and Digital Asset Management on Thursday, and Reaching All Consumers through Accessibility, also on Thursday. Register today!
