Analytics

Session Spotlight: Analyze This

Recently, we announced OpenText Big Data Analytics, which lets users access, blend, explore, and analyze Big Data without depending on IT or data experts. To celebrate, we've compiled a list of must-attend Analytics sessions at Enterprise World 2015. Don't miss out!

Read More

All Digital All the Time—Why No Enterprise Is Safe from Disruption and Some Will Not Survive

We hear a lot about Digital Disruption, the new mantra of the coming apocalypse. Yet disruption doesn't have to be cataclysmic. In fact, many organizations are embracing it as a new way of working, and a growing number of Chief Digital Officers are making it their business to lead their enterprises safely along uncharted paths. Almost 200 sessions at Enterprise World 2015 will help attendees navigate digital disruption in their sectors. We specialize in enabling the digital world. Here's a preview from our team of Industry Strategists.

Manufacturing

Today's manufacturing industry continues to globalize operations, reduce costs, and embrace complex regional compliance regulations. Combined with adopting disruptive digital technologies such as 3D printing, wearable devices, and the Internet of Things, today's manufacturing CIOs have to overcome an immense digital challenge. CIOs may feel as though they are gambling away their IT budgets on digital projects that provide minimal returns. At Enterprise World 2015, we will be showcasing new cloud, analytics, and information management solutions. The ability for a manufacturer to choose how it manages an enterprise information management environment, for example in either a cloud or hybrid approach, provides the opportunity to scale its IT infrastructure according to the requirements of its business. Increased use of Big Data analytics can provide a deeper understanding of what is going on across a manufacturing operation or supply chain, and we will be showcasing our latest cloud-based analytics solution at EW15. Our Digital Disruption Zone will also showcase how new technologies such as 3D printers and wearable devices can work with OpenText EIM solutions.

Financial Services

Financial Services is still an industry in flux, routinely testing new business models. A recent Industry Insight blog wrote about a newly licensed bank in the UK called Atom Bank.
They have some capital, and their other asset is an application. That is all. This is amazing when you think about it. Perhaps it could have happened without smartphones, but they really changed the game. The smartphones of today outperform the biggest and fastest computers of 1985. Essentially they are a small, handy way to deliver chips to you, and you can make a phone call too. Of course, it is all in the chips, which provide the digital technology. Now Volvo has stated that in the near future, if you buy one of their cars, they will provide the insurance.

Commercial Banking

North American Corporate Banking and Corporate Treasury organizations are facing digital disruption in many flavors, from different directions, and on different timeframes. In the short term, a myriad of disruptive regulations, technologies, and new players are bombarding banks and their corporate treasury clients, requiring massive, parallel changes to their payment systems. Figuring out a strategy for future-proofing a payments environment is critical, and a partnership with a provider that's "been there, done that" in Europe and elsewhere is an important step. Longer term, digital disruption represents both an opportunity and a threat to the corporate banking business. Will the combination of open APIs for financial transactions, distributed ledgers (a la blockchain), and the Internet of Things make banks irrelevant to their corporate clients' day-to-day business? Or will their traditional role as trusted intermediaries allow banks to take an even more prominent role in B2B digital commerce, as providers of the equivalent of safe deposit boxes for high-assurance digital identity management? Or both? If you have thoughts on these topics, please join us at our Enterprise World Industry User Forums on Friday, November 13, where you will have an opportunity to share with your peers how you are tackling digital disruption in your organization.
Public Sector

When we talk about digital disruption, our examples typically come from business—Uber, for instance, where technology inspired the execution from the beginning. There were no internal processes to disable, SOPs to revise, legacy applications to transition or decommission, and, most significant, no employees to retrain. These challenges have long impeded government's ability to modernize. To digitize, public organizations have to apply technology to internal mission-delivery processes: not incrementally automating process steps as they are performed now, in silos, but envisioning how they can share information across functional silos to take giant leaps that cut service delivery times, increase inspection and regulatory effectiveness, and improve facility and asset maintenance, investigative efficiency, and so on. To read more about how to move to Digital Government, take a look here.

Life Sciences

The Life Sciences industries are not immune to digital transformation. In fact, we're focusing on the emergence of the Information Enterprise: what it is, how to manage it, and why it offers unprecedented opportunities for everyone involved, especially in a world of increasing regulatory scrutiny. As in previous years, there will be content useful for the traditional Life Sciences and Healthcare industries, with added focus for other FDA-regulated enterprises in Food Safety, Cosmetics, and Tobacco. Learn specific information management solutions and strategies for our industry by participating in Breakout Sessions, Customer Roundtables, industry-specific short talks in the Disruption Zone theater, and Meet Your Industry Peers breakfasts. New this year, we are introducing a Life Sciences user group program for Friday morning, entitled "EIM Best Practices for FDA-Regulated Industries," to discuss best practices and learn from each other's perspectives.
Most organizations, regardless of industry, are facing the same challenges and looking for ways to optimize their work and their outcomes. That's exactly why we host our annual Enterprise World: to help you create a better way to work. See you there!

Read More

Enterprise World 2015 and the OpenText Cloud

Are you wondering where the cloud will be covered at Enterprise World this year? The simple answer is pretty much everywhere! Whether your enterprise is looking at solutions for Information Management, Analytics, or Business Networks, there are things to explore at Enterprise World.

We start the week with a variety of training sessions. In half a day to three days you can become an expert in the technology areas that are most critical to your organization. While there are many great courses, and several apply to products available in the OpenText Cloud, there is a new half-day course on Big Data Analytics, now available in our cloud, which you should consider regardless of your other technology choices. U-TR-2-6150 Introduction to Big Data Analytics is available on Nov 9. Register soon, as this one is bound to be quite popular. If your organization is actively looking at upgrading Content Suite and having us manage your application in the OpenText Cloud, there is a special half-day workshop you may be able to take advantage of. It will provide background on the upgrade process to the cloud and help you build your TCO business case for moving to the OpenText Cloud. Register now; space is limited.

On Partner Day we will meet with our partner community to share strategy, updates, and guidance on partnering with us on cloud opportunities. On Tuesday the Expo floor opens, and with it the opportunity to meet with our product and cloud experts in all areas, as well as with partners. Great conversations take place in the Expo, and most of the pod areas will have people to address your cloud questions related to their solutions. If you are interested in digging into Managed Cloud Services, you will find experts at the OpenText Cloud pod, the B2B Managed Services pod, and the Global Technical Services area. We look forward to meeting with you.

Don't miss a single one of the keynote presentations.
Our top executives will take us through inspiring and informative presentations that highlight many areas, including our cloud strategy, innovations, and advances. You will see the expansion of cloud across our portfolio, along with some of the innovations in live demonstrations. Many breakout sessions focus on cloud strategy, innovations, customer insights, and more. Filter on Cloud under the Track or Product sections to see the many sessions devoted to cloud or hybrid cloud topics. Some cloud sessions of note include:

CLD-402 Building Your Cloud Strategy, featuring Forrester Research analyst Liz Herbert
CLD-410 OpenText Cloud Strategy and Roadmap
CLD-412 Our ECM Journey to the Cloud, featuring the New Zealand Transport Agency
CLD-400 How Managed Cloud Services Drives Essential Benefits in Your Cloud Strategy

Want to get hands-on and help shape new and emerging product offerings? Join us in the very popular Innovation Lab. Sign up early, as these spots fill up fast! Calling all developers! Here is a chance to roll up your sleeves in the Developer Lab and work with AppWorks to build mobile apps. Let your creativity show and present your innovative apps for the chance at a prize on Thursday. There will be many opportunities to meet with the OpenText teams and with other customers, and more cloud learning opportunities. We look forward to seeing you at the MGM Grand for an action-packed and educational week.

Read More

Discover the Future of Embedded Analytics at Enterprise World

Whether you are looking to build smarter "data-driven" customer applications or to tap the power of Big Data to beat the competition to market, using analytics to deliver better insight is key to any digital strategy. Encompassing a range of approaches, from basic reports and dashboards to more sophisticated advanced analytics for forecasting, optimization, and even machine learning (think AI), the analytics landscape is broad, and it is the bridge between all your data and smarter decisions. Yet to reach all users and deliver sustainable value, through both better customer engagement and improved productivity, the business intelligence tools of the past no longer work. They don't scale, they are not intuitive to use, and they often require extensive programming to deploy the most advanced features. Plus, many are built on proprietary foundations that make it hard to interface with other systems or easily embed visuals in your favorite app or device.

The embedded revolution

Just as embedded processors (and cheap memory) sparked a new generation of smart systems and consumer devices like the smartphone, embedded analytics, enabled by solutions like the OpenText Information Hub (iHub) along with advances in big and small data processing, open source projects like BIRT and Hadoop, and rich APIs, has the potential to change the face of many categories of applications (and devices). Think hyper-personalized and portable customer experiences, or smarter trading grids that anticipate disruptions or automatically seek out the best deal. Or new views into markets or business operations that reveal previously unseen relationships or potential innovations. Meanwhile, we are all looking to get closer to our customers by gaining a true 360-degree view of what they want and how they are interacting with us and each other. This is where the OpenText Big Data Analytics offering (now available in the OpenText Cloud!) fits in.
Combining advanced and predictive analytics, delivered as an easy-to-use on-premises or cloud offering, BDA aims to bring the power of big data to everyday business analysts, creating one view of their customer base with a super-fast dedicated analytics database and pre-built algorithms for the most common marketing and operational analyses.

Discover more in Vegas

The future of analytics is clearly about these types of tools that serve the growing population of "citizen data scientists." It's also about delivering insights from new data sources (think IoT) to users on their device of choice, such as smartphones, tablets, or even smart watches. All of these scenarios will be front and center in the "Future of Embedded Analytics" breakout session that Mark Gamble and I will be presenting in Vegas. We'll explore the key requirements for transforming ordinary apps into data-driven powerhouses that can access information from any source and deliver insights to millions of users. We'll also look specifically at the role of data in the customer journey, and at how analytical apps are becoming smarter and more portable, powered by next-gen predictive analytics, cloud services, and open standards. Plus, we'll showcase our embedded and mobile interfaces, including a behind-the-scenes look at our latest IoT "activity tracker" demo. Check out information on this and all the breakout sessions at Enterprise World 2015 online at http://www.opentext.com/campaigns/enterprise-world-2015/program/sessions.
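To make "pre-built algorithms for common marketing analyses" concrete, here is a minimal sketch of one classic example, RFM (recency, frequency, monetary) scoring, in plain Python. This is an illustrative toy with invented data and function names, not the actual BDA product or its API:

```python
from datetime import date

# Toy transaction log: (customer, purchase_date, amount).
transactions = [
    ("alice", date(2015, 9, 1), 120.0),
    ("alice", date(2015, 9, 20), 80.0),
    ("bob",   date(2015, 6, 5), 300.0),
    ("carol", date(2015, 9, 28), 40.0),
    ("carol", date(2015, 8, 15), 60.0),
    ("carol", date(2015, 7, 2), 55.0),
]

def rfm(transactions, today):
    """Compute (recency in days, frequency, monetary total) per customer."""
    out = {}
    for cust, when, amount in transactions:
        r, f, m = out.get(cust, (None, 0, 0.0))
        days = (today - when).days
        r = days if r is None else min(r, days)
        out[cust] = (r, f + 1, m + amount)
    return out

scores = rfm(transactions, date(2015, 10, 1))
print(scores["carol"])  # (3, 3, 155.0)
```

An analyst would typically bucket each of the three numbers into quintiles to segment customers (e.g. "recent, frequent, high-spend"); tools like BDA package that kind of workflow so it does not have to be hand-coded.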

Read More

3 Questions: John Johnson of Dell Services Discusses Analytics and Reporting for IT Services

Dell Services, the global system integrator and IT outsourcing arm of Dell, provides support, application, cloud, consulting, and many other mission-critical IT services to hundreds of organizations worldwide across many sectors. The company collects and manages massive amounts of data about customer infrastructures, from simple high-frequency metrics (such as CPU, memory, and disk utilization) to helpdesk tickets and service requests, including hardware and software asset information. Using this data to understand and respond to customer needs before they become problems falls to John M. Johnson, Manager, Capacity Management at Dell Services. In a free webinar, presented in an informal format, Johnson will explain how his team uses OpenText Actuate Information Hub (iHub) to ensure system reliability and keep customers informed. Johnson recently spoke with OpenText about the type of data Dell Services collects, the evolving ways his customers consume that data, and how he uses it to plan for the future.

OpenText: You have a 12-terabyte data warehouse of performance metrics on your customers' systems and applications. Tell us about that data and how you use it.

Johnson: Our infrastructure reporting data warehouse has been around for seven-plus years. It collects aggregated information about more than a hundred customers, which is just a segment of our base. Originally we started the data warehouse to meet legal retention requirements, and it evolved to become the repository for ticketing data, service request data, and SLA performance data. Now it's an open warehouse where we continually add information related to our services delivery. It's fantastic data, and a fantastic amount of data, but we lacked two things: an automated way to present it, and a consistent process behind its presentation.
My twenty capacity planners were spending too much of their valuable time churning out Excel reports to present the data to our clients, and far too little time understanding the data. A little less than two years ago we started using open source BIRT for report automation, to eliminate manual errors and consistency issues and to remove the "personal analysis methods" that each engineer was introducing to the process. The next step in maturing the process was to leverage iHub to further automate report generation, delivery, and presentation.

OpenText: Some of your customers and users get dynamic dashboards, while others get static reports. How do you decide who gets what?

Johnson: That's an easy answer: it begins with contract requirements. Those expectations are drawn out and agreed upon by legal counsel on both sides. Once those fundamental requirements are met, the question of "Who gets what?" is very simply based on how they need and want the data. I have three customer bases: my services customers, my delivery teams, and peer technical teams who have reporting requirements. And everybody wants a different mix of data. DBAs want to see what's going on with their infrastructure: their top databases, hardware configurations, software versions and patch levels, cluster performance, and replication stats. Other teams, such as service delivery managers and the account teams, want a picture at more of a financial level. They need answers to standard questions like, "What has the customer purchased, and is that service meeting the customer's expectations?" In some cases we handle customer applications in addition to their infrastructure. In those cases, the customer needs reports on uptime, availability, performance, user response time, outstanding trouble tickets, number of users on each system, and various other application metrics married with the infrastructure data.
Those are all static reports we typically deliver on a monthly schedule, but we're looking to make that particular reporting a daily process with iHub Dashboards. Dashboards will serve three major groups:

1. Application owners, who will see what's going on with their infrastructure and applications in real time
2. Our service managers, who coordinate the daily delivery of our services around the world
3. Senior leaders at the director, VP, and CxO levels

That last group has much less interest in a single trouble ticket or server's performance, but they do care about service levels and want to know how the infrastructure looks on a daily basis. I think the executive-level dashboards will be big consumers of data in the future, so we're evolving and maturing our offering from a technical level, where we have traditionally been engaged, to the business level. Because that's where people buy.

OpenText: That is an ambitious plan to extend your reporting platform. How do you prioritize your projects, and what advice would you give to peers with similar plans?

Johnson: There's one overall strategy I try to employ with all my applications: apply modern, agile software development methodologies to them. You have to stay up to date on software patches and capabilities. You have to keep your platform relevant. We keep updates coming rapidly enough that our customers don't have to create workarounds or manual processes. Fortunately, iHub works well with how we manage upgrades. We manage reports as a unit of work inside of iHub, so I don't have to make monolithic changes. When I'm prioritizing projects, I first ask, "Who is my most willing customer?" The customer who's going to work with you provides the quickest path to success, and that success is the foundation upon which you build. Second, expect to get your hands dirty and do a lot of the lifting. Most customers are always going to have trouble verbalizing what they need and how they want data to look.
So you have to just get that first visualization done and ask, "Does this data, presented this way, answer your needs?" Don't be afraid of responses such as, "That is not what I wanted at all. I told you I wanted a report." That's one of the most frustrating things about the job, but you have to accept that you are a statistical artist, that visual presentation is something you own, and then embrace and drive it. Fortunately, the ease of developing and managing data with iHub means we can respond to these inputs rapidly.

Learn more about how John Johnson and Dell Services automate and present monthly reports and analytics, provide dashboards and performance metrics to CIOs, and perform monitoring, capacity planning, and workload automation in a free OpenText Analytics webinar. Sign up today.

Read More

Data Driven Digest for October 9: Baseball Playoffs

Take me out to the data. Take me out to the stats. Design me some bar charts and Venn diagrams. I don't care if I error correct. 'Cause it's Hadoop, Hadoop for NoSQL. If you can't graph, it's a shame. For it's one, two, x and y-axis… at the old data viz game.

We love our data the way we love our baseball: lots of visuals and power hitters that crush it over the wall for the walk-off win. The 2015 Major League Baseball playoffs are now underway, with the Divisional Series brackets set. Every game, every inning, every play is packed full of data that can be, and has been, visualized. In the spirit of the grand old game, we present three data visualizations that really swing for the bleachers. Enjoy!

FIRST BASE: Visualizing Last Night's Game

Wild Card hopefuls like the Astros and Cubs look to unseat the Royals and Cardinals, while the Mets and Dodgers, along with the Rangers and Blue Jays, are suiting up to settle their annual rivalries. For fans of baseball box scores, the team over at Statlas (stats + atlas) has reprised its visualization recaps of previous games. Laid out like a transit map, each row represents an at-bat, with detailed notations similar to the ones used by electrical engineers. The "Win Expectancy" column on the right is a statistical representation of the likelihood of the team winning the game, based on the outcome of each at-bat and the impact of big plays. Chris Ring (@cringthis) and Mike Deal (@dealville) currently run the site, founded in 2014 by Dan Chaparian, Mike Deal, and Geoff Beck. The site has even more interactive features if you run it on a tablet.

SECOND BASE: Rise and Fall of the 2014 Oakland Athletics

Sometimes it's good to analyze the trends in a team's season to determine where things improved and where things fell apart. Such was the case of the hapless Oakland A's, whose 2014 season started off like a rocket and then imploded in September with the release of outfielder Yoenis Cespedes and injuries to starting pitchers Jarrod Parker and A.J. Griffin.
Digital Splash Media owner Jeff Bennett (@DigitalSplash & @VizThinker) masterfully compiled several bar graphs and line graphs that illustrate the problems of the clubhouse that crafted the concept of Moneyball.

THIRD BASE: Who Is the Greatest Player?

Inevitably, baseball conversations boil down to: who was the greatest player of the game? A tricky question, to be certain, as it depends on whether the player is a pitcher or a position player. Does this player have the most hits or homers? How do you weigh the number of innings played against the number of plate appearances? Baseball statistics also offer WAR (Wins Above Replacement), which attempts to summarize a player's total contributions to their team in one statistic. However, the debate continues. To help categorize and visualize the oceans of data behind these determinations, Northeastern University assistant professor of history Benjamin Schmidt has created an extensive and interactive Baseline Cherrypicker that you can use to see how baseball's statistical leaders shape up against the field. "The x axis shows the starting year for any stat: the y axis shows the length of time being measured," Schmidt writes in his explanation. "So, for example, if you go down 7 cells from '1940,' it will look up the player who led the league in WAR for the 7 years following 1940, and show the sentence 'Ted Williams led the majors with 48.28 WAR from 1940 to 1947.'" The visualization lets you home in on the patches of interest. If you just run your mouse over the chart and read the text that pops up, you'll start to get the general idea, he added. The interactive visualization includes individual franchise and league leaderboards and would take about 120,000 single-spaced pages to compile. That's a lot of baseball to debate.
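Each cell in Schmidt's grid amounts to a simple computation: for a start year and a window length, sum each player's WAR over the window and take the maximum. A toy sketch of that lookup, using invented (not real) WAR numbers:

```python
# Hypothetical season-by-season WAR values; these are NOT real statistics.
war = {
    "Ted Williams": {1940: 10.9, 1941: 10.6, 1942: 10.6, 1946: 10.9, 1947: 9.9},
    "Joe DiMaggio": {1940: 6.3, 1941: 9.1, 1946: 4.9, 1947: 4.8},
}

def window_leader(war, start, length):
    """Return (player, total WAR) for the leader over seasons [start, start+length)."""
    totals = {
        player: sum(v for yr, v in seasons.items() if start <= yr < start + length)
        for player, seasons in war.items()
    }
    return max(totals.items(), key=lambda kv: kv[1])

player, total = window_leader(war, 1940, 8)
print(f"{player} led with {total:.1f} WAR from 1940 to 1947")
```

Schmidt's visualization precomputes exactly this kind of sentence for every (start year, window) cell, which is why hovering over the chart feels instantaneous.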

Read More

October OpenText Live Webinars

Introducing OpenText™ Extended ECM for Oracle® EBS (xECM for EBS) 10.5 SP2

With new features like E-Business Suite 12.2 support, Records Management (RM) enhancements, advanced archiving, and our new Manufacturing Management solution accelerator, xECM for EBS is better than ever at synchronizing content, metadata, and permissions across both systems. It also provides a deep integration between OpenText and Oracle systems that ensures transactional and business content is consolidated and managed for integrity, cost reduction, and regulatory compliance. Join us to learn not only what we're doing, but also how our customers are using xECM for EBS to accomplish their business goals. We'll also preview how xECM for EBS fits into the OpenText Blue Carbon project, set for launch at Enterprise World 2015 this November. Register »

Project & Program Management for OpenText™ Content Server
Thursday, October 8, 2015 at 11:00 a.m. EDT (UTC -4)

Project data is growing exponentially, but it is often stored out of context or disconnected from project Gantt charts. Project managers require project management tools integrated with enterprise content, helping the enterprise optimize Information Governance. Enterprises need a central database for all their project data. Join us to learn how Project & Program Management (PPM) for Content Server provides this on your existing ECM system, presented by OpenText partner KineMatik. This webinar will also cover the latest features available in PPM 10.5.3. Register »

Records Disposition Simplified
Wednesday, October 14, 2015 at 11:00 a.m. EDT (UTC -4)

Driven by actual customer need, the Cassia Records Disposition Approval (RDA) module was built to make it easy for users to sign off on records and to reduce the time it takes records managers to process the records once they receive the approvals.
RDA simplifies the sign-off process for approvers, simplifies records disposition support, simplifies the review process, and includes a reporting framework. Join us to learn how RDA can simplify your records management. Register »

The Next Wave of Advanced Analytics
Thursday, October 15, 2015 at 11:00 a.m. EDT (UTC -4)

Are you ready for the next wave of Analytics? Join us for a fast-paced webinar showcasing OpenText™ Actuate Big Data Analytics, Cloud Edition, an advanced Analytics-as-a-Service solution that brings the power of Big Data to everyday business analysts. Learn about the advantages of "Big Data Analytics" in the cloud, including our convenient capacity-based pricing and easy-to-use predictive algorithms. Plus, we'll provide a quick-hit demo of the coolest features and share how easy it is to blend, explore, and visualize your data.

Gain a top-level understanding of the analytics lifecycle
Learn about the emerging requirements for Analytics-as-a-Service
See Big Data Analytics, Cloud Edition in action

Register »

What's New in OpenText™ RightFax 10.6 Feature Pack 2
Tuesday, October 20, 2015 at 11:00 a.m. EDT (UTC -4)

The latest release of OpenText™ RightFax is packed with exciting new features for users and administrators. Join Mike Stover, Product Manager, as he unveils the new functionality for RightFax, focusing on:

User Experience: Designed with the user in mind, the latest release of RightFax includes several enhancements to the end-user experience, including newly redesigned tools and additional features.
Compatibility: The latest releases of RightFax provide updated and new interoperability between RightFax and other industry software.
Enterprise-Grade: OpenText has developed several new improvements that will increase the performance and scalability of the RightFax fax server. These enhancements help increase productivity, throughput, and workload efficiency in the toughest enterprise environments.
Compliance and Security: New features and functionality enhance the security and compliance-grade capabilities of a RightFax server environment.
Administration and Use: New functionality makes it easier to manage and use RightFax. New tools allow for managing the environment more efficiently, resulting in demonstrable time savings for administrators.

Register »

Increase Your OpenText™ eDOCS DM User Adoption
Thursday, October 22, 2015 at 11:00 a.m. EDT (UTC -4)

OpenText™ eDOCS DM is a feature-rich product with many types of functionality for creating, saving, searching, and interacting with day-to-day content. This OpenText Live webinar will focus on how you can help your end users get the most out of existing functionality and increase adoption. How often have we heard "it's the little things that count"? This session will cover some of the basic features that are often overlooked, and provide tips and tricks for working with Dynamic Views, more efficient email management using Email Filing, and more. All this will be presented with simplified DM profile forms, making it easier for users to save or search for their documents. Register »

Read More

Data Driven Digest for October 2: Traffic and Public Transit

Does getting to work on Monday feel like a breeze, or like miles and miles of bad road? The average worker in the United States spends 200 hours a year commuting, at a cost of $2,600, according to recent polls. Despite constant congestion, data scientists keep coming up with ways to analyze traffic patterns to ensure you get to your desk by 9 a.m. Whether your transportation is trains or cars, we've got you covered this week. We've assembled three very innovative ways of showing and analyzing transportation data. Enjoy!

Big Data on the MTA

Well, let me tell you of the story of a man named Charlie
On a tragic and fateful day
He put ten cents in his pocket, kissed his wife and family
Went to ride on the MTA

The ballad made famous by the Kingston Trio's song "MTA" (commonly known as "Charlie on the MTA") reminds us that transit can be a real nightmare for some and a treasure trove for others. MIT electrical engineering and computer science student Ian Reynolds boarded the train to data central with his homemade live data visualization. Taking GPS data from Boston's own transit agency, Reynolds recreated the track system using Adafruit NeoPixel strips driven by an Arduino Uno, which in turn takes orders from a Python script running on a Raspberry Pi. "Every ten seconds or so, it calls the MBTA API to grab the GPS coordinates of all the trains in the system. It maps those to some LEDs, decides which ones actually need to be changed, and then sends that information to the Arduino, which does the bit pushing," Reynolds wrote in his description. His next step is writing an app that lets him change the visualization and adjust the brightness. Reynolds posted his project on GitHub if you want to get under the hood. We're just hoping this type of visualization helps Charlie get to Jamaica Plain once and for all.

Keep Calm and Visualize the Traffic Data

Of course, using data to alleviate traffic problems is not new.
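As an aside, the heart of a tracker like Reynolds's, mapping GPS fixes to LED indices, fits in a few lines of Python. This is an illustrative sketch with invented coordinates, not code from his GitHub project, and the API-polling and serial-write steps are only indicated in comments as hypothetical helpers:

```python
# Hypothetical LED layout: one LED per stop along a line (invented coordinates).
LINE = [(42.33, -71.06), (42.34, -71.07), (42.35, -71.08), (42.36, -71.09)]

def nearest_led(lat, lon):
    """Map a train's GPS fix to the index of the closest LED on the strip."""
    dist2 = lambda stop: (stop[0] - lat) ** 2 + (stop[1] - lon) ** 2
    return min(range(len(LINE)), key=lambda i: dist2(LINE[i]))

def frame(positions):
    """Turn a list of (lat, lon) fixes into the sorted set of LEDs to light."""
    return sorted({nearest_led(lat, lon) for lat, lon in positions})

print(frame([(42.336, -71.063), (42.358, -71.081)]))  # [0, 2]

# The real loop polls the transit API every ~10 seconds and pushes a frame
# to the Arduino over serial, roughly:
#   while True:
#       positions = fetch_positions()     # hypothetical MBTA API call
#       push_to_leds(frame(positions))    # hypothetical serial write
#       time.sleep(10)
```

Computing the diff between consecutive frames, as Reynolds describes, keeps the serial traffic down to only the LEDs that actually change.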
In the '60s and '70s, the British government often positioned workers with clipboards to tally the number of cars, and the various types of cars, that passed through intersections. Decades later, traffic data was computerized onto a two-dimensional map, which showed various hotspots at selected intervals. In 2009, Jer Thorp (@blprnt) and a team of engineers saw a need to represent the data in a better, more interactive way. Their presentation, entitled "Visualising Transport Data for data.gov.uk [itoworld.blogspot.com] by ITO," was based on the traffic count data for the data.gov.uk Developer Camp. The team used more than 1,000 existing data sets from seven departments and community resources. The visualizations and resulting maps were built in nearly two days. The movies documenting the presentation of the project are available, as are the resulting maps, at Flickr.

Go With the Flow

Monitoring traffic flows is also helpful for retailers. Kirkland, Washington-based INRIX recently provided a proof of concept showing the migration of billions of anonymous data points gathered from GPS and mobile networks. The data visualization showed anonymous data flowing in and around Manchester, UK during a popular time for school shopping. The map revealed the local migration from the suburbs to shopping areas in the Trafford Centre and Manchester City Centre. Their analysis revealed that much of the traffic into the city was coming from the south. The novelty of the visualization is that it shows not only the traffic patterns on the horizontal plane but also, rising above it, the growing number of people in the city itself. The company said its data could be used by marketers, retailers, and civic offices to keep people moving and to market to specific demographics based on destinations. Reviews of the project are available at the Silicon.de site (in German). Like what you see? Every Friday we share great data visualizations and embedded analytics.
If you have a favorite or trending example, tell us: submit ideas to blogactuate@actuate.com or add a comment below. Subscribe (at left) and we'll email you when new entries are posted.

Recent Data Driven Digests:
September 25: Visualizing data on a US map, including burger prices, presidential debates and startup funding
September 18: Original US Congressional debt, Euro spending, and world population based on income
September 11: Cloud visualizations related to wind, bird migration and El Niño

Source: Actuate

Read More

Exploring iHub Examples: SF Wealth

This post is the third in a series exploring the free example applications that come bundled with OpenText Information Hub (iHub), Trial Edition. Read part 1 and part 2.

Financial services firms differentiate themselves and build loyalty by providing their customers with in-depth portals for exploring personalized financial data. Many top banks and other providers use OpenText Analytics and Reporting tools to present, analyze, and report on customer data, and the iHub example application we explore in this post showcases just a few of the capabilities at their disposal. The example application is called SF Wealth. It's a dashboard with five tabs – Home, My Spending History, My Wealth, Retirement Roadmap, and Portfolio Performance – each of which illustrates different capabilities. When you launch iHub, Trial Edition and open the Examples, click on the SF Wealth image and you'll see the screen below.

Home

What it is: A landing page – that is, the first screen a high-net-worth customer might see when logging into a financial institution's website.

What to look for: The page presents personalized information about the customer's portfolio status, progress toward goals, and risk position in snapshot form. Down either side you'll see links to outside resources of interest to our imaginary customer. In a real-world portal these HTML links could point to offers geared toward the customer's specific needs and wants, news headlines about companies in the customer's portfolio, and other relevant information. We've included them here to show how seamlessly third-party, outside sources can be incorporated with data that is unique to the customer.

My Spending History

What it is: A dashboard (shown above) that lets a customer explore spending in depth.
What to look for: To see how various aspects of the same data can be presented in different ways, first select the Restaurant category (in the left column) and watch the chart immediately to its right; it will show spending on restaurants, color-coded by account. Now clear the Categories, find Restaurant in the bar chart, and click its bar; you'll now see details about all of the restaurants where our user spent money. You can also experiment with the Month Range selectors and see how all four graphs change.

My Wealth

What it is: A dashboard (shown above) that gives the user multiple ways to gauge progress and performance.

What to look for: The seven elements on this dashboard mostly provide a variety of comparisons. The thermometer at left shows that our user has saved almost $968,000 – a good figure, until you compare it to the user's $2 million goal. Now check the map in the lower right corner; our user's savings are above average for Idaho, but below average for Illinois. (Maybe his money would go further – or he'd feel richer – if he moved.) Two of the gadgets on this dashboard are interactive. My Account Update lets you get a detailed report on any fund by clicking its name, and My Wealth vs. Market Indices lets you adjust the timeframe of the chart with a click. Lastly – and just for fun – check the world map in the lower center of the dashboard; our imaginary user has travel goals and is measuring a travel fund against those goals.

Retirement Roadmap

What it is: An interactive page (shown above) that lets the user experiment with a variety of retirement savings scenarios.

What to look for: This dashboard combines two data visualizations, a selector, and a crosstab, all of which interact with each other. The quickest way to see this in action is to enter a retirement year, choose an investment style – Conservative, Moderate, or Aggressive – and watch all three elements change.
If the user decides that a different investment strategy is needed, a financial advisor is just a click away using the Contact Us button. (Note that this example application uses artificial data. Please don't use it to plan your own retirement!)

Portfolio Performance

What it is: A page that packs a ton of financial performance information into an efficient table.

What to look for: The table (shown above) is a marvel of efficiency and demonstrates iHub's powerful capabilities. Each line in the table includes five charts in three styles, along with some of the numbers that underlie those charts. You'll also see scorecard-style red dots indicating a portfolio item that is – appropriately enough – in the red. With this table you can quickly identify which portfolio items are performing well year-over-year, and which items are riskier than others.

One More Thing

If you're following along in the example application, you probably noticed that we've cropped a toolbar out of the screenshots in this post. (It's shown above.) You also may have noticed the light blue grid behind the pages. Here's what those things tell you: you've actually been working in the visual dashboard design tool that is built into iHub, Trial Edition. Because that's so, you can rearrange and resize many of the elements on these dashboards. (Try it!) We also encourage you to create a new tab in the example application and experiment with these tools. You'll want to keep the iHub Dashboards documentation and tutorials handy as you do.

Next Up

In our next blog post in this series, we will explore another dashboard – the Call Center example application – from the user perspective.

Read More

Data Driven Digest for September 25: US Map Potpourri

In his book How the States Got Their Shapes, author Mark Stein notes that many of the lines that separate us are arbitrary at best, based on historical negotiations and treaties with other countries. It's not uncommon for communities to be split down the middle between two states just because an 18th-century surveyor drew a line there. Do the lines separate us, or do our behaviors? We ask because this week's Data Driven Digest focuses on three maps of the United States that are distinct more because of the behaviors of the people who live in them than the actual lines that separate them. Enjoy!

You Want Fries With That?: If you are hungry for data, our friends over at DataBucket serve up a hot dish of information. An interactive map built around prices at Five Guys Burgers and Fries restaurants tracks four items available at the restaurant: Bacon Burger, Fries, Bacon Cheeseburger and Hot Dog. It's no surprise that a meal in Midtown Manhattan costs nearly $3.00 more than one in Kansas City, Kansas. The map is novel, however, in that it displays data gathered from Five Guys' online ordering website. The 29-year-old chain currently has 1,000 locations in the United States, Canada, Britain and the United Arab Emirates, with 1,500 more under development. There is one item at the restaurant that is priced consistently: all roasted in-shell peanuts found in the dining area are free.

No Debate: With several months to go before the 2016 US Presidential election, we're taking a look at some early returns. Washington Post writer Philip Bump (@pbump) notes that the upcoming televised presidential debate in St. Louis, Missouri will mark the fourth time in the last 56 years that the city has hosted a presidential or vice presidential contest. Bump suggests the St. Louis bias may stem from familiarity with host Washington University in St. Louis as well as Missouri's past swing-state status. The auto-building animation in the Post's visualization is quite impressive.
You can clearly see the 28 states that have yet to host the marquee stump. Three other sites slated for next year include Dayton's Wright State University; Farmville, Va.'s Longwood University; and the University of Nevada, Las Vegas.

Startup Funding Nation: Silicon Valley startups get a lot of attention from venture capital firms, but how much money is actually involved? Our friends over at DataBucket took CrunchBase datasets drawn from GitHub and built them into an interactive map that tracks the average funding amount and number of companies for each state and funding round. The wealth distribution weighs so heavily toward California that the second-largest state for funding (New York) draws barely a quarter of the investments flowing into the Golden State. While any state can spawn startup seed money (yes, we are looking at you, South Dakota), emerging venture capital hotspots include Florida, Texas and Massachusetts. Startups are big money: VCs invested more than $48.3 billion across 4,356 deals in 2014, according to data released by the National Venture Capital Association. Series A funds are typically used to get a company going, while successive rounds are used to help grow the business. After receiving $1.6 billion in early funding, Uber is currently locking into place a $2.8 billion round to expand. Elon Musk's SpaceX grabbed a $1 billion cash infusion at a valuation of more than $10 billion.

Recent Data Driven Digests:
September 18: Original US Congressional debt, Euro spending, and world population based on income
September 11: Cloud visualizations related to wind, bird migration and El Niño
September 4: Seasons represented by fall color, energy production, wildfire smoke, air pollution
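As a back-of-envelope check on the NVCA figures quoted above ($48.3 billion across 4,356 deals in 2014), the average deal size works out to roughly $11 million:

```python
# Quick arithmetic on the NVCA 2014 figures cited in this post.
total_invested = 48.3e9   # dollars VCs invested in 2014
deal_count = 4356         # deals that money supported

average_deal = total_invested / deal_count
print(f"Average deal size: ${average_deal / 1e6:.1f} million")
```

Of course an average hides the skew: a single $2.8 billion Uber round is worth hundreds of typical Series A deals.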

Read More

Exploring iHub Examples: Integration Framework

This post is the second in a series exploring the free example applications that come bundled with OpenText Information Hub (iHub), Trial Edition. Read the first post here.

The ability to integrate and embed data within other applications is an iHub strength. This ability is particularly interesting to independent software vendors (ISVs) who want to embed analytics in their applications, and it's thoroughly demonstrated in the Integration Framework example application. Once you launch iHub, Trial Edition and open the Examples, click on the Integration Framework image and you'll see the Mashboard screen below. (More about that name in a bit.)

Mashboard

What it is: The first screen you get when entering the Integration Framework example application is a modern tile-style page – one with a few surprises up its sleeve.

What to look for: Some attractive visuals are immediately apparent, such as the subtle animation when you mouse over the four boxes at the top of the page. But to ISVs, more important is the way the page is constructed: the charts, maps and tables are created with iHub, but the boxes that contain them are created by the application itself. (That's where the name Mashboard comes from – the page is a cross between a mashup and a dashboard.) All of the visualizations on the page are interactive, and the minus/plus icons in the upper right of each tile in the dashboard let you minimize or maximize the individual visualization independently of the others. To explore the other Integration examples, pull down the menu icon at the top left corner of the page next to the words Sample Application. You'll see five choices: Reporting, Analytics, Web Tools, Charts, and Embedded Code. Select Reporting.

Reporting

What it is: Three different reports – Sales Order, Worldwide Sales, and On-Demand – that show iHub's ability to generate and integrate reports.
What to look for: Each of these reports illustrates different strengths and integration capabilities, so let's look at them individually.

Sales Order Report (shown above) combines infographic-style presentation of key metrics, a comparison bar chart, and two interactive tables of data. The lower right table uses an inline bar chart for each record (in the Actual vs Target column) as well as a red/yellow/green scorecard presentation for numbers. Click on any salesperson's name and you're taken to a dashboard for that individual. The first page of this report presents aggregated data, and subsequent pages give detailed reports. (Page navigation is found in the upper-right corner of the screen; it's cropped out of the screenshot above.) Go to one of the subsequent pages, Enable Interactivity (as explained in the previous Examples post), and you'll see that these aren't ordinary static reports. You can easily sort data in the tables, hide and rearrange columns, and otherwise fine-tune the reports' appearance and performance to suit your needs.

Worldwide Sales (shown above) is a report organized as a grouped table. Each page of the 20-page report presents data for a different geographic area; note how the infographic header changes as you click through the pages. The Order Status column uses a scorecard-style color code, so the type color changes depending on the displayed text.

On-Demand Report, when you first select it from the menu, appears as a nearly blank page. Choose an item using the pull-down Parameter Picker in the upper right corner, click Run Report, and iHub immediately generates a detailed invoice for the customer, like the one shown above. (This is not a static page, but rather a generated report.) Incidentally, iHub uses its JavaScript API (JSAPI) to display the parameter picker; significantly for ISVs, the JSAPI parameter module and iHub's viewer module are integrated seamlessly on a single page.
Analytics

What it is: A dashboard with four tabs – Inventory & Sales, Cross Tab, Performance, and Other Gadgets.

What to look for: Each tab shows a different way that data can be integrated and displayed in a dashboard. We'll consider each option individually.

The Inventory & Sales dashboard showcases iHub's treeview control. Explore the selectors along the left side of the dashboard; you can make selections, click Apply, and watch the graphs change. You can select any combination of whole countries and cities within countries; when you do, notice that the additional selectors at the top of the page (Cancelled, Disputed, etc.) also change according to the available data. Hovering over elements in the charts causes detailed figures to pop up, as seen above.

Cross Tab demonstrates that you can build a dashboard that contains a cross tab. (It sounds simple – and with iHub it is – but not all dashboard-creation products have this capability.) Selectors on the left side of the dashboard enable you to filter the data that is presented in the cross tab.

Performance shows a variety of simple meters and thermometers, based on data picked using the selectors at the left of the dashboard. Don't like the appearance of a meter? Click on the meter, click the triangle on its upper right corner, and select Change Type to see what options are available. (Developers control what other chart types are made available to users when the dashboard is created.)

Other Gadgets shows how you can embed a live web page – in this case, our own developer site – into an iHub report. This capability exists because many business dashboards also must provide portals to external data and content.

Web Tools

What it is: A direct link to iHub tools for building things: Dashboard, Interactive Crosstab, Report Studio, and My Documents.

What to look for: The four options here allow you to experiment with the visual developer tools that come with iHub.
There’s too much here to highlight in a blog post, so keep the iHub documentation and tutorials handy as you explore these examples on your own. But one thing you can easily do, even without reading the manual, is to compare the user interfaces of the Dashboard builder and Report Studio. You can also check out the many types of data visualizations that come with iHub, shown above. (We’ll talk more about your data visualization options in the next section.) Charts What it is: A collection of data visualization examples, both native to iHub and built using third-party technology. What to look for:  These visualizations are grouped into four broad categories: Lines – Columns/Bars, Pies – Donuts, Other Visualizations, and 3rd-Party Visualizations. On the Lines – Columns/Bars page, note that the three visualizations across the top show the same data in different formats. This illustrates the fact that choosing the correct visualization for a given situation is not always easy. Same goes for the Pies – Donuts page; the same data is presented in a number of different ways. To learn more about choosing the right data visualization for your needs, see 8 Tips to Big Data Visualization and UX Design, featuring a video of our own Mark Gamble discussing data visualization best practices. The Other Visualizations page shows six other data visualizations that come out-of-the-box with iHub, and the 3rd-Party Visualizations page presents just a few of the JavaScript-based visualizations that iHub can bind data to and render. Your options here are almost limitless; learn more about using iHub Custom Visualizations here. Embedded Code What it is: Three examples of how you can add code to iHub content to increase the content’s functionality and improve its appearance. What to look for: The first Embedded Code item, JQuery, shows how a few lines of JQuery code can be used to enhance standard iHub tables. 
The left table has extra highlighting (here's a blog post explaining how that's done), and the right table demonstrates one-click expand and collapse capability. In both cases, the report tables were developed using iHub's standard table tools, and jQuery code was applied after the fact to provide a different interactive experience.

The second Embedded Code option, GoogleMap, shows how a familiar map format can be embedded and integrated into an iHub application to provide location intelligence. The quickest way to see how data from iHub affects the map is to right-click the State column, choose Filter, and then set the Condition to Equal To CA. The map will zoom in on California and show entries only for that state.

The final Embedded Code item, Custom ToolBar, changes the Sales Order Report that we explored earlier by replacing the standard menu pull-down and the page navigation selectors with versions that better match the report's overall style. Again, this is accomplished with a few lines of code, and shows how seamlessly iHub visualization and reporting can integrate and blend with your own content.

Next Up

In our next blog post in this series, we will explore the SF Wealth example application and demonstrate some iHub capabilities particularly well-suited to financial institutions. Subscribe (at left) and you'll be notified when that post and others are published.

Read More

See Big Data Analytics in Action

Not since the mashup of chocolate and peanut butter have people been so excited about two great products that fit so well together: Analytics and the Cloud. Earlier this month, OpenText announced Big Data Analytics in the Cloud, an all-in-one software appliance built for business analysts who need to access, blend, explore, and analyze data fast without depending on IT or data experts. The need for Big Data Analytics should be obvious. Businesses need to understand their data requirements. They need to digest hundreds of tables and billions of rows of data from disparate data sources. With a powerful analytics tool on their side, companies speed up their time to value with the ability to integrate data from multiple sources and get a single view of their business. No complex data modeling or coding is required. They can clean, enrich and analyze billions of records in seconds and apply advanced and predictive techniques in a visual, intuitive way. But seeing is believing. That is why we have assembled a demonstration video that shows how Big Data Analytics works, along with some scenarios that may mirror your own needs. Check out the demonstration here: And if you are interested in testing out Big Data Analytics yourself, we also have a free trial available.

Read More

Data Driven Digest for September 18: Money and Finance

This week marks the 133rd anniversary of the opening of the Pacific Stock Exchange in San Francisco. The establishment was created to serve the interests of businesses that struck it rich mining for gold during the California Gold Rush. Nowadays, businesses mine data, hoping to strike it rich by analyzing it for clues about how to best serve their customers, streamline their operations, or gain a competitive advantage. In honor of those financial pioneers, this week we offer three different visualizations of financial data. Eureka!

U.S. Fiscal Responsibility: In 1789, the United States established its first loan to pay the salaries of the existing and future presidents and the Congress. As our friend Katy French (@katyifrench) posted in Visual News this week, bean counters in Washington kept great records and even produced stunning visualizations to represent trends. The graphic above represents the Fiscal Chart of Debt and Expenditures by the U.S. Government between 1789 and 1870. Note the spikes in military spending during the War of 1812 and the Civil War, as well as the first major accumulation of debt in 1861.

Euro Spending: How do Europeans spend their paychecks? That was the premise of a recent data plot developed by The Economist (@TheEconomist). Based on Eurostat data sets entitled Final consumption expenditure of households by consumption purpose, The Economist found life in the Euro zone is quite diverse. Living in Lithuania? Your budget is dominated by food and clothes. Lithuanians also spend more per capita on alcohol and tobacco than the rest of Europe. Meeting in Malta? Forget about eating at home: nearly 20 percent of Maltese spending goes toward restaurants and hotels. Spaniards spend the least on their transportation. Germans spend more on their furnishings than their E.U.
neighbors.

World Population Based on Income: Our friends over at Pew Research Center (@PewResearch) have come up with an interactive visualization of world population by income level. For example, the map above shows the density of people living on what Pew terms a middle income – daily income between $10.01 and $20. According to the map, 13 percent of the 7+ billion people in the world are middle income. The map has a second option that reveals the percentage-point change in that population between 2000 and 2011. It's a fascinating study of both financial statistics and data maps. The income groups are defined as follows: the poor live on $2 or less daily, low income on $2.01-10, middle income on $10.01-20, upper-middle income on $20.01-50, and high income on more than $50; figures are expressed in 2011 prices, using 2011 purchasing power parities.
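The Pew income bands defined above translate directly into a small lookup. This sketch is ours, not Pew's; it simply encodes the daily-income thresholds as stated:

```python
# Pew's daily-income bands (2011 PPP dollars), as defined in the post above.
# Each tuple is (upper bound inclusive, label); anything higher is "high income".
BANDS = [
    (2.00,  "poor"),
    (10.00, "low income"),
    (20.00, "middle income"),
    (50.00, "upper-middle income"),
]

def income_group(daily_income):
    """Classify a daily income figure into Pew's five groups."""
    for upper, label in BANDS:
        if daily_income <= upper:
            return label
    return "high income"

print(income_group(15.0))   # middle income
print(income_group(2.0))    # poor
print(income_group(75.0))   # high income
```

Note the boundaries are inclusive on the upper end, matching the $2.01-10, $10.01-20 phrasing: exactly $2.00 a day still counts as poor, and exactly $20.00 still counts as middle income.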

Read More

Digital Engagement: A New Business Requirement

Digital engagement isn’t an option anymore, it’s a requirement. Today’s consumers are savvy and fickle, and companies must work to earn their loyalty. They’re demanding more from the brands they love, and their tolerance for anything but a seamless, engaging, and compelling experience is flagging. In a digital world, organizations must digitize their customer journeys, from initial interest through to purchase and follow-on service or support. The best way to do this is to shift to a digital marketing strategy. One that creates consistent and compelling customer experiences at every touchpoint through omni-channel delivery, responsive design, and targeted communications and information. Digital technologies have introduced new customer touchpoints and increased opportunities to engage. Since consumers often use more than one channel to interact with a brand (in some instances they use five or six), delivering uniform and relevant messages across all channels is crucial for return on marketing investments and customer satisfaction. Omni-channel focuses on meeting consumer needs by pulling together programs to provide a cohesive brand experience across channels, platforms, and devices. To borrow from Bruce Lee, digital design should “be like water”. You put water into a cup, it becomes the cup. You put water into a bottle, it becomes the bottle. You put water into a teapot, it becomes the teapot. The same holds true for digital experiences. The transition from desktop to device to point-of-sale should be fluid. This is achieved through responsive design. Customers don’t see individual devices or channels; they look for a consistent and familiar brand experience that delivers relevant content. Nirvana on the customer journey is realized when a company anticipates the needs and wants of a customer and serves up targeted and tailored content, products, or services, in the moment of need, wherever the customer is. 
Organizations that can predict customer behavior have a better chance of fulfilling consumer needs. Analytics – analyzing data collected across various touchpoints of the customer journey (transactions, interactions, social media sites, and devices) – helps organizations discover valuable customer insights so that they can offer more personalized and satisfying experiences. The most effective way to target different audiences is to use messages that focus on the products and services with the greatest appeal for each segment. Using dynamically generated customer communications, organizations can create and automate their marketing campaigns. When correspondence is part of a digitized process, the end results are gains in efficiency and the ability to create superior customer experiences. As one of the foundational suites for Enterprise Information Management (EIM), Customer Experience Management (CEM) aims to create a richer, more interactive online experience across multiple channels without sacrificing requirements for compliance and information governance. CEM brings together all of the technologies required to re-architect back-office systems, consolidate customer data, and create digitized front-end experiences. Digital engagement starts inside the firewall and extends outside the enterprise and all along the supply chain. In the next post in this series, I'll explore how the supply chain is being disrupted and how enterprises can digitize key processes for greater collaboration, information exchange, and business agility. Find out how you can capitalize on digital disruption. To learn more, read my book, Digital: Disrupt or Die.

Read More

Exploring iHub Examples – First in a Series

There’s a big-box retail store not far from the OpenText Analytics office in San Mateo, California. Some of us (ahem) have been known to visit there at lunchtime to score free samples for lunch. And why not? Most people like to get something of value for free. If you download the 45-day free trial of OpenText Information Hub (iHub), you know the feeling. Not only do you get to try out a full-featured version of the software for free, but you’re also given a number of free sample applications that preview some of the software’s remarkable capabilities. To help you steer your cart around the wide aisles to find the good stuff, we have prepared a series of blog posts exploring what each sample apps is and what you should look for when you use it. We’ll count on you to imagine how the capability it demonstrates could be used in your organization. One other thing: You and your development team can learn a lot more about how these sample apps are constructed – and how they work – by exploring them in Analytics Designer, the free companion design tool for iHub. Brian Combs has published a step-by-step guide to help you do this. Finding the Samples When you launch iHub the first time you’re greeted with a main screen showing just one item: an HTML file called Examples. Click on it and you’ll see the Sample Content screen below – it’s your jumping-off point for all of the samples. (screenshot) The first sample we’ll explore is labeled Other Applications and can be found in the upper-right corner of the Sample Content screen. Click it and you’ll see three sample visualizations and one dashboard.  These samples (and others) are based on Classic Models or SF Wealth, two of the sample databases that come with iHub. They all present data in a clean, uncluttered format that invites further exploration. 
Customer Revenue Metrics

What it is: A report with a bar chart, a table of top customers (with scorecard arrows showing trends), and pie and bar charts that break out revenue in different ways.

What to look for: All of the elements of this report are interactive, so alter them to see what happens. For example, you can change the date range in the bar chart three different ways: by clicking the Zoom setting (upper left), by typing dates in the "from" and "to" boxes, or by moving the slider below the chart. (Modify one of these controls and the other two change in response.) Now click on any bar in the top bar chart for details on a single month. Next, hover over any segment of the pie chart; when a data point pops up, click on it for more detail.

Client Investment Portfolio

What it is: In essence, this is a periodic statement – like the one you might get from your investment advisor or broker – on steroids.

What to look for: This report is an ideal place to explore the power of iHub's Interactive Viewer. Click the menu button in the upper left corner of the report and select Enable Interactivity. Then click on %Change (the sixth column) and extra controls will appear to filter, sort, and otherwise modify the table. You can use these to sort the entire table based on a single parameter. (When you do this, the right column of the table – with its red and green tags that display the data in scorecard style – will sort accordingly.) Enabling Interactivity unlocks a wide range of capabilities that vary depending on the data or visualization you're working with. One other thing: click the name of one of the stocks in the portfolio (such as Coca-Cola Company), and a new tab will open with the Yahoo Finance page for that asset. This shows how reports in iHub can seamlessly connect with external assets.

Top Sales Performers

What it is: A ton of data about salespeople, presented in a compact, efficient format.
What to look for: While your eye may be drawn to the radar chart at the top of the first page, a sales manager might find the sub-tables under the chart more compelling. These tables demonstrate how complex, multi-layered data can be aggregated and organized in a number of different ways: the salespeople are ranked and their total sales are calculated, and within that level of organization, each salesperson's top customers and top products are listed in order. This type of consolidated, interactive information is invaluable to people who manage large, distributed sales forces.

Customer Sales Dashboard

What it is: A basic interactive dashboard of sales data.

What to look for: One big distinction between dashboards and reports is the presence of selectors on dashboards. In this simple example, the selectors are on the left, labeled Sales Territories, Customer Countries, and Year. Click on any element within those selectors and watch the data visualizations (also called Gadgets) on the dashboard respond. Now look at the Historical Revenue Analysis gadget in the lower right corner of the dashboard. If you find it difficult to distinguish between individual data lines in the graph, click the triangle in the gadget's upper-right corner and choose Maximize. The graph now fills the screen for easier exploration.

Next Up

In our next blog post in this series, we will walk you through the example called Integration Framework. Geared toward ISVs, this example showcases various capabilities iHub provides for embedding content within an application.

Photo courtesy of Sarah Murray

Read More

3 Questions: Content Marketing Expert Robert Rose on the Power of Analytics

Think your organization can tell the difference between good marketing content and great content? Only 36 percent of B2B marketers surveyed in 2014 by the Content Marketing Institute said they were effective at content marketing. To help increase effectiveness, marketing experts suggest improving content measurement methods. White papers, brochures and blogs get the message out; analytics tells a richer story.

Robert Rose is the Chief Strategy Officer for the Content Marketing Institute and a senior contributing consultant for Digital Clarity Group. Robert’s highly anticipated second book, Experiences: The Seventh Era of Marketing, is now available. His first book, Managing Content Marketing, spent two weeks as a top-ten marketing book on Amazon.com and is generally considered to be the “owner’s manual” of the Content Marketing process. Robert is also the co-host of the podcast PNR’s This Old Marketing, the number one podcast as reviewed by MarketingPodcasts.com.

We sat down with Robert to discuss the importance of transforming content into digital and the best ways to optimize value from analyzing that content.

OpenText: With the world migrating toward a digital-first approach, talk about the importance of content-driven experiences. How should marketing, and other departments, optimize their operations to gain the most value from their digital assets?

Robert Rose: The real trend is that content-driven experiences are the differentiation of the entire business these days. Whether you look at this as a layer of product development, an element of marketing, or the new way that you handle customer service, consumers now expect a better experience at any part of their particular journey. This means that marketing, and the development of content-driven experiences, must stretch across the entire customer journey. So this inherently means that the business has to evolve “content” as a strategic asset. It can simply no longer be a byproduct of what people produce as part of their jobs; it must be cohesively created, managed, published, optimized and measured as a function in the business. And in order to do that, the organization’s first step is to look at each of those tasks as a recognized function in the business. It must have actual organization, real responsibility, budget and measurability.

OpenText: The intersection of digital content, cloud delivery and Big Data analysis seems like the next step for so many organizations. What recommendations can you give to decision makers in their quest for a digital content supply chain?

Robert Rose: The key is to simplify. A great content-as-supply-chain process should actually reduce the amount of content being produced, but optimize its quality and efficacy. This means, ultimately, that the data it produces becomes higher quality and can be used to derive better meaning, and thus greater insight into how to improve the experiences being created. The classic mistake that most businesses make is that they create content in order to facilitate the sales, marketing and service of products, and then simply can’t keep up with the cadence that the product or service requires. Instead, they need to start with the customer and the experience they’re trying to deliver, then work backwards to see how content can be created to build that experience.

OpenText: Many organizations are successful in transforming their content and measuring its effectiveness. What are your top favorites, and what made them so successful?

Robert Rose: I think my current favorite is what Motorola Solutions has done by integrating technology and marketing into one common department. Eduardo Conrado is the Chief Innovation Officer (and wrote the introduction to my newest book). As the head of marketing and IT, he recognized that both were truly focused on the same goal: creating a more compelling customer experience. So he merged the two departments so that they work together. As he says, this really does create an environment where “technology can help you get closer to the customer.”

For more insight, Robert’s strategy whitepaper, The Marketing Transformation: From Managing Campaigns to Orchestrating Experiences, can be found at OpenText.

Read More

Opening iHub Examples in Analytics Designer

If you’re just getting started with OpenText™ Information Hub (iHub), you’ve probably found the Examples that ship with the product. A collection of sample applications, dashboards and reports, the Examples are meant to inspire you as you create your own iHub-based projects. The best way to understand how these sample applications work is to open them in Analytics Designer, the free commercial-grade companion design tool for iHub. Download Analytics Designer here.

This blog post shows you how to get the Sample Application content out of iHub and load it into Analytics Designer. The following steps are based on iHub 3.1.1 Trial Edition and Analytics Designer 4.4. Download iHub, Trial Edition here.

1. Launch Analytics Designer, and make sure that no report designs, data objects, or other elements are open.

2. Create a New BIRT Project in Analytics Designer, as shown in this screenshot:

3. Select Blank Project as shown below, and give your new project the same name as the application folder in iHub. (For this example we are using the name “Sample Application,” because that is the specific iHub example that we will move, but the process for the other examples is essentially the same.) Then click Finish.

4. Create a server profile for iHub. Select the Server Explorer tab in Analytics Designer, then click the New Server Profile button (shown with a red arrow below). Make sure that you select the Servers icon (as shown) before you click the New Server Profile button.

5. Provide a name for your Server Profile and enter the server connection information. For this example I am using the IP address for the iHub 3.1.1 virtual machine that I am running (marked with a red box, below). I am also using the default settings: Administrator for the User name, and no Password. (If you’ve changed the defaults on your iHub instance, you’ll need to use your own credentials.) When you’re done, click Finish.

6. Make sure that the project you want to download the files into – the one you created in Step 2 – is selected.

7. From the Analytics Designer’s File menu, select Download…

8. In the download dialog, check Download Files. For the Download Location, click the Browse button and navigate to the project where you want the sample application to reside. If you only have one project, you may only see a “/” for the path, as seen in the screenshot under step 9d below.

9. To download all the files and folders that are in the Sample Application folder in iHub, do the following:

a. Check Sample Application.

b. Uncheck the first folder under Sample Application.

c. Now re-check the first folder under Sample Application. You will see that all the folders and files under Sample Application are now checked, but that the Sample Application folder itself is not (as shown below). By doing this, you make sure that you download only the folders and files in the Sample Application folder into your project.

d. Click Download.

10. If you get a warning saying that a file already exists, click Yes to replace the file.

11. You can now expand your project in Analytics Designer and see all the content you just downloaded.

Now that you have the iHub Sample Applications in Analytics Designer, you can explore them, modify them, and even borrow from them when you create your own projects. If you’re just learning about software from OpenText Analytics, you can download iHub, Trial Edition here. You can also download Analytics Designer here – this companion design tool for iHub is free.

Read More

Data Driven Digest for September 11: Clouds

There’s big news at OpenText this week: Big Data Analytics is now available on the OpenText Cloud. This exciting event got us thinking about actual clouds – you know, the kind up in the sky – and other things that fly through the air. Not surprisingly, there are countless great data visualizations related to clouds and weather, so it was tough to choose just three to share.

Swirling Winds: Cameron Beccario (@cambecc) has created a stunning animated visualization, called simply earth, that does a beautiful job of presenting diverse atmospheric data. The visualization blends four data sources – weather conditions, ocean currents, ocean surface temperatures, and ocean waves – each updated at a different time interval. Click on the word “earth” in the lower left corner of the screen, and controls pop up (as shown in the screenshot above) that let you adjust the resulting visualization. By the way, the care Beccario takes to document his work is as impressive as the visualization itself.

Fly-By: Weather radar systems are (obviously) designed to monitor and record weather. But scientists at the European Network for the Radar Surveillance of Animal Movement (ENRAM) have developed ways to use meteorological technology to monitor bird migration. Last summer they developed a proof of concept (screenshot above) showing bird movement in the Netherlands over just a few days. The idea has taken off (so to speak), and now biodiversity scientists at the Research Institute for Nature and Forest (INBO) are exploring ways to use weather instruments to track other species. Check out the POC, and read more about the work on the INBO blog.

Data Flow: For Californians in a drought, no meteorological phenomenon is more important than El Niño. Warming waters in the Pacific Ocean affect the weather worldwide, and often help to bring needed precipitation to the western United States. In an effort to understand this year’s El Niño, Matt Rehme of the National Center for Atmospheric Research (NCAR) released a video comparing the epic 1997 El Niño with the one brewing this year. The result (embedded above, and linked here) demonstrates both the weaknesses and strengths of video for data visualization. One obvious shortcoming is that you can’t explore the data in depth; you just let the images flow by. But ease of consumption and sharing are a video’s strengths; indeed, Rehme’s video has been viewed some 61,000 times in less than a week.

Read More

Analytics in the Cloud: No Hardware, No Coding, No Punch Cards

Data analysis has come a long way. The British Census of 1911 was the first to use automatic data processing, with the advent of punch cards that could be sorted by machine. It was also the first census to ask multiple-choice questions, which helped Britain gather and analyze data on various segments, such as the number of carpenters compared to butchers in the country. Since then, data analysis has become so ubiquitous that more data is generated online every second today than was stored on the entire Internet 20 years ago.

If your business is going to survive this digital transformation, it needs to quickly access, blend, explore and analyze all of its data without depending on IT or data experts. The good news is that OpenText is addressing those needs with the launch of its Big Data Analytics in the Cloud. To address the needs of companies seeking Advanced Analytics (a $1.4 billion market, according to estimates from IDC), Big Data Analytics has a built-in high-speed columnar database that provides performance thousands of times faster than traditional relational databases. The software incorporates statistical algorithms, making it easy to do profiling, mapping, clustering, forecasting, decision trees and more without programming. Delivering these capabilities as a managed cloud service reduces investment in infrastructure and maintenance staff. In a nutshell: no hardware to buy and no coding required. You get all of your data in a single view and the ability to analyze billions of records in seconds.

Powerful Enough for All Your Data

Big Data Analytics is engineered to read virtually any data source. It includes native connectors for popular SQL databases, an Open Database Connectivity (ODBC) driver for creating custom connections, built-in ability to access flat files and CSV files, and a remote data provider option for loading files using a web address.
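To see why a columnar layout helps analytic queries, consider that an aggregate like “total revenue” only needs one field, so a column store scans a single contiguous array instead of walking every full row. Here is a toy sketch in plain Python (this is an illustration of the general idea, not how the product’s engine is implemented; the table and figures are made up):

```python
# Row-oriented storage: each record is a complete row (dict).
# Summing revenue must visit every row, touching fields it doesn't need.
rows = [
    {"customer": "Acme", "region": "EMEA", "revenue": 120},
    {"customer": "Globex", "region": "APAC", "revenue": 75},
    {"customer": "Initech", "region": "AMER", "revenue": 200},
]
row_total = sum(r["revenue"] for r in rows)

# Column-oriented storage: the same data kept field-by-field.
# The aggregate reads only the one array it needs, which is also
# far friendlier to compression and CPU caches at scale.
columns = {
    "customer": ["Acme", "Globex", "Initech"],
    "region": ["EMEA", "APAC", "AMER"],
    "revenue": [120, 75, 200],
}
col_total = sum(columns["revenue"])

assert row_total == col_total == 395
```

At three rows the difference is invisible; across billions of records, reading one column instead of whole rows is where the speedup comes from.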
On top of these powerful tools, Big Data Analytics gives everyday users access to advanced analytic algorithms formerly available only to data scientists. These tools and algorithms are optimized and hard-wired into the product and accessed via a toolbar in the Analysis window (as seen below).

Crosstab allows you to cross multiple data fields – either within the same database table or from different tables – and display the results as dynamic tables and graphics.

Venn diagrams visually identify coincidences and differences between up to five data segments for rapid discovery.

Bubble diagrams show the distribution of categorical data across two axes of numeric variables. A third variable can affect the size of the bubbles that represent the data. Results of bubble analyses can also be viewed in table form.

Evolution analysis shows data progression over time. Visually, evolution analyses resemble bubble diagrams, but the spheres representing data move to show time passing. The user can freeze playback and adjust the time interval.

Profile analysis groups values and determines their relatedness to a profile segment. Users can easily see how individual attributes contribute to the overall profile. Results are presented in a table that visually represents statistical relationships (known as Z-scores).

Map analysis displays data on a choropleth map, in which different colors or shades represent the magnitude of data values. Multiple maps with region names are encoded in the product, and new maps can be added.

Pareto analysis is the algorithmic expression of the 80/20 rule. It enables users to see if and how their data conforms to that rule.

If only Britain had this kind of technology back in 1911. Perhaps they could have predicted the British Music Invasion of the ’60s, or that one day Tim Berners-Lee would define the World Wide Web.
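The 80/20 idea behind Pareto analysis is simple enough to sketch in a few lines of plain Python: sort contributors from largest to smallest and count how many it takes to reach 80% of the total. This is only an illustration of the concept with hypothetical revenue figures, not the product’s implementation:

```python
def pareto_share(values, threshold=0.8):
    """Return the fraction of items needed to reach `threshold` of the total."""
    ordered = sorted(values, reverse=True)  # largest contributors first
    total = sum(ordered)
    running, count = 0.0, 0
    for v in ordered:
        running += v
        count += 1
        if running >= threshold * total:
            break
    return count / len(ordered)

# Hypothetical revenue per customer: a few big accounts, many small ones.
revenue = [500, 300, 120, 40, 15, 10, 8, 4, 2, 1]
share = pareto_share(revenue)
print(f"{share:.0%} of customers produce 80% of revenue")  # → 20%
```

Data that is spread evenly yields a share near 100%, while a share near 20% means the data conforms closely to the 80/20 rule.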
To find out more about how to optimize your data for a digital future, I recommend attending one of the upcoming Big Data Analytics webinars on September 22 or October 15.

Read More

Data Driven Digest for September 4: The Seasons

It’s Labor Day Weekend in North America – the traditional end of summer, if not the season’s end as marked by the equinox. To acknowledge the impending shift to autumn, this week’s Data Driven Digest is about seasons – how they affect vegetation, energy use, and air quality.

Fall Back: The very first Data Driven Digest we published almost a year ago contained a link to the lovely Fall Foliage Map published by SmokyMountains.com. So I was delighted to hear from David Angotti (@DavidAngotti) that the 2015 Fall Foliage Map is now online. Based on more than 37,000 data points, the map visualizes fall color creeping across the United States; move a slider (representing time) and watch the colors change. “The data behind the map is primarily a conglomeration of NOAA precipitation forecasts, daylight and temperature forecasts, historical precipitation data for the current year, and a deduction of many government and private party data sources available online,” says developer Wes Melton. “Once we have accumulated the data, there are manual changes we then make to the dataset based on our knowledge of the topic.”

Power Up: Energy use shifts logically with the seasons: in summer it’s used for cooling, while in winter energy is applied to lighting and heating. What’s less apparent is where the energy comes from. The Washington Post team of John Muyskens, Dan Keating and Samuel Granados published a terrific interactive site that explores how the United States generates electricity. Each circle on the map represents a power plant; the size of the circle represents the plant’s output, and the color of the circle reveals its power source. Roll over the main map (click through for the interactive version) to see a breakdown by state. Farther down in the site you’ll find an interactive bar chart showing the same data in a very different manner. If you’re interested in exploring more energy-related data, the U.S. Energy Information Administration’s Short-Term Energy Outlook publishes incredibly detailed reports on a monthly basis.

Smoke Signals: Our colleague Michelle Ballou, who lives smack dab in the middle of Washington State, visited our San Mateo office this week and talked about the choking smoke from seasonal wildfires in her area. Coincidentally, my colleague Michael Singer (@MichaelSinger) learned about BlueSky. A project of the United States Forest Service Research & Development arm, BlueSky links a variety of independent data models – fire information, fuel loading, fire consumption, fire emissions, and smoke dispersion – and uses predictive models to simulate and animate the cumulative impacts of smoke on air quality from forest, agricultural, and range fires. The Forest Service, by the way, collects and shares a wealth of public data sets.

Seeing Spots (Bonus Item): Shout out to Massimiliano Mauro (@MM_cco), an app designer for Wired Italia, for the stylish and functional 3D data visualization shown above. Called 3D and D3.js, it is a Braille-inspired three-dimensional data visualization that shows invisible air pollution (unlike the smoke in Michelle’s town, which is very visible). Each dot represents a day, and days are arranged by season to better reveal patterns in the data.

Read More