
What the Department of Homeland Security Knows About Data

What does the Department of Homeland Security know that you don’t know? OK, that’s a trick question. The answer (in this case) is this: it knows how to get from data to information. On April 22, OpenText Analytics and Reporting hosted a webinar featuring Chris Chilbert, Chief Enterprise Architect at the Department of Homeland Security. With hundreds in attendance, Chilbert made a powerful case that data is worthless unless you can turn it into relevant information – and that information can then become knowledge, wisdom and ultimately action. The graphic above is from Chilbert’s presentation.

There’s a human element in gaining information from data, Chilbert explained. People need to understand the organization they work in and the processes they use; those pieces help us put data in context and make better decisions. If you missed Chilbert’s presentation, please check out this free replay.

After Chilbert’s presentation, I talked for a few minutes about how OpenText Analytics and Reporting products support the Four Pillars of Business Analytics: data, people, processes, and technology. (Read more about the Four Pillars in this blog post and a free ebook.) Many webinar attendees asked follow-up questions – more than we had time to answer during the hour – so I’ve answered some of them here.

Q: Explain how OpenText Analytics secures data from unauthorized access.

A: OpenText Actuate Information Hub (iHub), our data visualization platform, provides multiple layers of security. These include authentication integration with Active Directory, single sign-on solutions, and two-factor authentication. iHub administrators can also control what a user can access by securing data at the row or column level, and by applying page-level security in reports. To read more about the security features in iHub, check out the white paper, The Ten Layers of Security in iHub.

Q: Can you give an example of social data and explain how iHub accesses it?

A: Twitter and Facebook are the most common social data sources today. OpenText Analytics has several social data connectors for iHub in our developer center, including the Facebook ODA Driver, BIRT Twitter Gadget and Twitter JSON Search ODA. For other unstructured data – which social data is, in essence – we provide APIs that you can use to connect to and query any data source. You may want to take a look at these two blog posts from Kris Clark: Creating a Custom ODA and Use JSON as a Scripted Data Set. (A minimal sketch of the scripted-data-set approach appears at the end of this post.)

Q: What mobile devices does iHub support?

A: We support all mobile devices by providing APIs that allow you to integrate content from iHub into your mobile application. This way you get to select which mobile devices you want to support. For ideas and inspiration, take a look at two example applications we have created using iHub and its APIs: Aviatio, a mobile web application (GitHub link for Aviatio), and Gazetteer, an iOS hybrid application (GitHub link for Gazetteer).

Q: How does iHub work in a multitenant architecture model in a cloud environment?

A: Multitenant support is built into iHub. With multitenant support, each project instance within a cluster isolates several characteristics (including security, user and role management, and scheduling) so they can be managed independently, even as the instances all share cluster resources. This allows a single iHub installation to support multiple applications and projects with a variety of characteristics and requirements.
Developers use multitenant capabilities to build software-as-a-service (SaaS) solutions in the cloud. The benefits include lower cost, because hardware and software are used more efficiently; faster time to market, because adding a new project requires just a few administrative commands; and improved security, because each application instance has its own security processes. Technical benefits of using multitenancy with iHub include reduced deployment burden, eased administrative load, simplified system impact testing, and flexible backup and recovery.

Q: Does OpenText Analytics have an electronic scorecard to allow input of information from the bottom up, as well as from the top down?

A: Yes, with OpenText Analytics users can input information at any level that may have a bearing on a specific key performance indicator (KPI). The flexibility of our scorecard function accommodates any performance framework, including Balanced Scorecard, Malcolm Baldrige, Six Sigma and custom frameworks, and scales to meet the needs of large initiatives. The Briefing Book function of iHub scorecards allows users to create and deploy customized performance views. Briefing Book measures can be selected manually or filtered based on criteria such as performance, criticality, location or ownership. Links to relevant standard and custom reports, maps, external documents and websites can be added. Briefing Books can be defined as private or shared, and include advanced security features to ensure that users only have access to the information they are entitled to see.

If you require more clarification on any of these answers, please leave a note in the comments. And be sure to check out the replay of my webinar with Chris Chilbert of the Department of Homeland Security.
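As promised above, here is a minimal sketch of the scripted-data-set approach mentioned in the social data answer. It follows the general pattern from Kris Clark’s post, written as the open() and fetch() event handlers of a BIRT scripted data set (BIRT scripts run under Rhino, so Java classes are available). The endpoint URL, payload shape and column names are illustrative assumptions, and older script engines may need a JSON library such as json2.js rather than a built-in JSON object:

```javascript
// open(): runs once per query. Fetch a JSON payload and cache its records.
// The URL and payload shape ({ "items": [ { "name": ..., "value": ... } ] })
// are hypothetical placeholders for your own service.
var url = new java.net.URL("https://example.com/metrics.json");
var reader = new java.io.BufferedReader(
    new java.io.InputStreamReader(url.openStream(), "UTF-8"));
var text = "";
var line;
while ((line = reader.readLine()) != null) {
    text += line;
}
reader.close();
// Declared without "var" so fetch() can see them across event handlers.
records = JSON.parse(text).items; // older Rhino builds: load json2.js first
index = 0;

// fetch(): runs once per row. Map one record to the data set columns,
// which must also be declared in the data set editor ("name", "value").
if (index >= records.length) {
    return false; // no more rows
}
row["name"] = records[index].name;
row["value"] = records[index].value;
index++;
return true;
```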


3 Questions: John Johnson of Dell Services Discusses Analytics and Reporting for IT Services

Dell Services, the global system integrator and IT outsourcing arm of Dell, provides support, application, cloud, consulting, and many other mission-critical IT services to hundreds of organizations worldwide across many sectors. The company collects and manages massive amounts of data concerning customer infrastructures, from simple, high-frequency metrics (such as CPU, memory, and disk utilization) to helpdesk tickets and service requests, including hardware and software asset information. Using this data to understand and respond to customer needs before they become a problem falls to John M. Johnson, Manager, Capacity Management at Dell Services. Johnson recently spoke with OpenText about the type of data Dell Services collects, the evolving ways his customers consume that data, and how he uses this data to plan for the future.

OpenText: You have a 12-terabyte data warehouse of performance metrics on your customers’ systems and applications. Tell us about that data and how you use it.

Johnson: Our infrastructure reporting data warehouse has been around for seven-plus years. It collects aggregated information about more than a hundred customers, which is just a segment of our base. Originally we started the data warehouse to meet legal retention requirements, and it evolved to become the repository for ticketing data, service request data, and SLA performance data. Now it’s an open warehouse where we continually add information related to our services delivery. It’s fantastic data, and a fantastic amount of data, but we lacked two things: an automated way to present it, and a consistent process behind its presentation. My twenty capacity planners were spending too much of their valuable time churning out Excel reports to present the data to our clients, and far too little time understanding the data. A little less than two years ago we started using open source BIRT for report automation, to eliminate manual errors and consistency issues, and to remove the “personal analysis methods” that each engineer was introducing to the process. The next maturing of the process was to leverage iHub to further automate report generation, delivery and presentation.

OpenText: Some of your customers and users get dynamic dashboards, while others get static reports. How do you decide who gets what?

Johnson: That’s an easy answer: it begins with contract requirements. Those expectations are drawn out and agreed upon by legal counsel on both sides. Once those fundamental requirements are met, the question of “Who gets what?” is based simply on how they need and want the data. I have three customer bases: my services customers, my delivery teams, and peer technical teams who have reporting requirements. And everybody wants a different mix of data. DBAs want to see what’s going on with their infrastructure – their top databases, hardware configurations, software versions and patch levels, cluster performance, and replication stats. Other teams, such as service delivery managers and the account teams, want to see the picture more on a financial level. They need answers to standard questions like, “What has the customer purchased, and is that service meeting the customer’s expectations?” In some cases we handle customer applications in addition to their infrastructure.
In those cases, the customer needs reports on uptime, availability, performance, user-response time, outstanding trouble tickets, number of users on each system, and various other application metrics married with the infrastructure data. Those are all static reports we typically deliver on a monthly schedule, but we’re looking to make that particular reporting a daily process with iHub Dashboards. Dashboards will serve three major groups:

1. Application owners, who will see what’s going on with their infrastructure and applications in real time
2. Our service managers, who coordinate the daily delivery of our services around the world
3. Senior leaders at the director, VP and CxO levels

That last group has much less interest in a single trouble ticket or server performance, but they do care about service levels and want to know how the infrastructure looks on a daily basis. I think the executive-level dashboards will be big consumers of data in the future, so we’re evolving and maturing our offering from a technical level – where we have traditionally been engaged – to the business level. Because that’s where people buy.

OpenText: That is an ambitious plan to extend your reporting platform. How do you prioritize your projects, and what advice would you give to peers with similar plans?

Johnson: There’s one overall strategy I try to employ with all my applications: apply modern, agile software development methodologies to them. You have to stay up to date on software patches and capabilities. You have to keep your platform relevant. We keep updates coming rapidly enough that our customers don’t have to create workarounds or manual processes. Fortunately, iHub works well with how we manage upgrades. We manage reports as a unit of work inside of iHub, so I don’t have to make monolithic changes.

When I’m prioritizing projects, I first ask, “Who is my most willing customer?” The customer who’s going to work with you provides the quickest path to success, and that success is the foundation upon which you build. Second, expect to get your hands dirty and do a lot of the lifting. Most customers are always going to have trouble verbalizing what they need and how they want data to look. So you have to just get that first visualization done and ask, “Does this data, presented this way, answer your needs?” Don’t be afraid of responses such as, “That is not what I wanted at all. I told you I wanted a report” – that’s one of the most frustrating things about the job. You have to accept that you are a statistical artist, that visual presentation is something you own, and then embrace and drive it. Fortunately, the ease of developing and managing data with iHub means we can respond to these inputs rapidly.


Sports on the Web: Newer, Better, Global

Are you a sports fan? Are you one of the lucky ones whose sports are carried on your local station at convenient times, or are you like a growing number of fans who follow sports carried more often in other countries, on their time zones, and not broadcast on local television stations? Modern sports viewing is now often enabled by strong web experiences, and a growing number of fans can now enjoy their favourite sports on the web, at times of their convenience, regardless of where in the world the sport is played live.

Many sports franchises are now taking advantage of the streaming vs. viewing methods fans are adopting, and there are some great sites powered by Web Experience Management software such as OpenText’s Customer Experience Suite. Cloud-enabled apps provide viewing and stats on all types of devices, and allow viewers to enjoy the sport and the commentary that goes with it – from both the official commentators and the other viewers. As reported in a recent press release, UK-based Aberdeen Football Club (if you are from North America, think soccer) recently remodeled their site with OpenText’s Experience Suite to include real-time stats, commentary, a Twitter feed, pre- and post-game analysis and real-time photos. The omni-channel experience is critical, as 58% of their fans use the site on mobile devices, often while they are watching the game live in the stadium. Check out www.afc.co.uk to see the latest.

This time of year sees some of my favourite sports back live and online. While the sites stay up all year sharing info, they come alive when the teams are back and playing again. It is rugby season again, and the 6 Nations site www.rbs6nations.com once again brought the tournament and all the news to the locals and those of us in other parts of the world. The site is great, with game info and pictures, and my favourite feature is the running list of clips that summarize some of the great moments. Even if you don’t watch the full games, you can get a pretty good idea of the play, the emotion and certainly the outcomes from this site. And of course, my favourite Australian Rules Football (AFL) season has just started, so I will be spending increasing time on their site www.afl.com.au checking out the predictions, results and pictures, and watching highlights or full games at times that are convenient wherever in the world I am. Watching with two screens is a bonus, so I have the player info and stats handy while streaming the games from a second device.

In the immediacy era we live in, sports on the internet can be consumed at our leisure and our convenience. Thanks to strong web experience sites, we now have a PVR-like option for watching the games, and the apps provide extras like real-time stats and commentary. There is no substitute for the excitement of live sport, but when you can’t be there in person, web experiences are now a great alternative.


OpenText Featured on Bloomberg Business

For this year’s global Innovation Tour, we’ve taken our Digital-First World message on the road. Already underway, the tour has seen members of our executive leadership team present in major cities in Asia Pacific, including Mumbai, Tokyo, Sydney, and Singapore. As one of many highlights of the tour, OpenText CMO Adam Howatson was featured on Bloomberg Asia’s Brandstanding in Singapore. The interview covers a number of topics, ranging from the value of information in creating new services and opening up new revenue opportunities, to why the cloud is important for brands and businesses, and how OpenText is rewriting the rules of business for successful digital transformation.

According to Howatson, “Being able to connect the way [the business is] represented on social platforms and the way that brands are represented and shared on the Internet and through our connected society, through to back office operations, to manufacturing and internal business processes… It truly is the organizations who are able to integrate that experience and that flow of information who will outperform their competitors in every industry.”

Bloomberg Business delivers business and markets news, data, analysis, and video worldwide, featuring stories from Businessweek and Bloomberg News. As a global business network, Bloomberg has over 22 million visitors to its web video assets. Watch the video. Learn more about the Digital-First World by downloading the book: Digital: Disrupt or Die.


3 Questions: Adam Dennison Discusses Analytics for CIOs

Adam Dennison, senior vice president and publisher of CIO and IDG Enterprise’s events, boasts a 15-year career in technology publishing. CIO Perspectives, the company’s regional event series, brings CIOs and senior IT executives together with top technology vendors for a day of thought-provoking, action-oriented discussions. “At a high level, we cover three main topics: strategy, innovation and leadership,” Dennison explains. “More specifically, emerging technologies, digital transformation, and security are three major initiatives for CIOs today.” We asked Dennison (@adamidg) how CIOs can leverage analytics in their work.

OpenText: You’re moderating a panel at CIO Perspectives called “Straight Talk about SMAC,” in which you’ll share key stats on trends and investments in social, mobile, analytics and cloud. Can you give us a preview of a stat you find compelling or timely?

Dennison: Our research and discussions with CIOs show that currently 32 percent of enterprise IT investment is slated for edge technologies (like SMAC) vs. core technologies (like legacy systems and infrastructure). However, this will shift to 45 percent within the next one to three years. Additionally, 54 percent of enterprise CIOs plan to increase their spend with “newer vendors” within the next 12 months.

OpenText: Late last year you published CIOs Must Market IT’s Value, arguing that internal marketing of IT can help businesspeople understand the value of IT departments. Meanwhile, marketing efforts are increasingly driven by data and metrics. How do the two relate?

Dennison: My column on CIOs marketing IT to the business was more broad-based than just data and analytics. However, if asked to focus on the data aspect of it, I would say showing the business exactly what they are paying for is step one in the process. Explain to the business units through data, facts and transparency that they are getting true value from enterprise IT vs. a “do it yourself” solution.

OpenText: A lot of industry research – and anecdotal evidence, including a recent CIO Quick Takes – shows that CIOs are eager to use analytics to improve business insights. What drives CIOs’ push for analytics?

Dennison: What I read from the CIO Quick Takes was a laser focus on customers. Our CIO research shows us that 41 percent of CIOs’ current spend is on external customer experience, relationship and interaction. The only way to get closer to your customers and better serve them is to gather data on them and use that to your advantage. Our 2015 State of the CIO research also shows us that data/analytics is the number one tech priority that will drive investment over the next 12 months. Mobile and cloud came in a close second and third respectively, but it’s all about data today.

Adam Dennison photo courtesy of IDG Enterprise. Used with permission.


6 F-Type Examples in DevShare You Can Use Now

With the release of OpenText Information Hub, Free Edition (formerly BIRT iHub F-Type), we added corresponding content to the DevShare portion of the Developer Center. One section of that content, F-Type Examples, houses sample apps and working examples developed specifically for iHub, Free Edition. When it comes to working with a new product, or using aspects of a product I know well but have not used before, I find examples valuable for getting started. Whether I need ideas for what is possible or confirmation that I am going down the right path, I find examples particularly useful. Currently, six different examples have been uploaded to the DevShare to demonstrate various ways that iHub, Free Edition can help you. These examples range from simple, well-designed reports to a full sample application. In this post, I will step you through these six examples and detail their various aspects and features.

The Six Examples

Call Center Example Dashboard – With the large number of call centers around the world and companies increasingly focusing on customer experience, business leaders demand metrics about call center performance, and expect those metrics on quick-reading, real-time dashboards. This example demonstrates how iHub, Free Edition can do exactly that by creating a dashboard to quickly and effectively analyze the metrics of a call center. It utilizes a well-designed data object that uses CSV files as the data source, includes multiple data sets with properly configured column properties (format, header, etc.), and uses a well-designed data model with proper categories and hierarchies. The use of multiple data sets within the data object is part of what makes this a well-designed data model: when used against a relational database, multiple data sets allow for optimal query trimming within the datadesign file, and the generated data object will have better compression. The dashboard itself has two tabs: Call Analysis (screenshot below) displays an interactive dashboard with drill-downs and selectors, and Calls By State displays a United States map; you click a state on the map to launch a sub-report for that state.

InfographiX Examples – This example can create three different infographics. These samples can inspire you with ways to display your own data in highly visual formats. The three included samples are:

Classic Models – Uses the Classic Models sample database that comes with iHub, Free Edition and displays information about the customers in the database by geography, purchasing habits and more in graphic form. The result is shown below.

Storage Statistics – Uses static data and displays statistics for data in a cloud-based storage system – for example, data formats (audio, video, photos, etc.), traffic by data type, and downloads vs. uploads.

Tornado – Uses a data object for its data source. It displays statistics about tornadoes in the United States, such as the number of tornadoes per month, relative numbers of tornadoes by strength (on the Enhanced Fujita, or EF, scale) and fatalities by state.

Chicago Crimes Example Dashboard – As a developer and a user, I am always interested in seeing location information integrated into applications. I find it particularly useful when a map is used to visualize location information, as it allows you to quickly analyze important geographic information about a data set. This web application demonstrates how we can interact with location information.
It uses the Custom Visualizations feature in iHub 3.1 to display a Google Map of the Windy City, and adds custom icons and marker clustering to show crime data. It also provides a data selector gadget so users can choose which types of crimes to display. This example, which can be deployed to iHub as an application, shows you how to display a dashboard when opening a BIRT application.

Simple and Well-Designed Reports – This example includes two well-designed reports that are simple and elegant. By “well designed,” we mean that elements of the data model have been configured to use libraries for styles, which unifies the whole report and makes its appearance consistent. Additionally, they conform to industry standards for layout and design. If you are new to report development, these are good examples to use as a starting point for developing quality reports with a uniform look and feel. If you are more experienced, they are still worth going over: you may find a feature you have not used before, and they will reinforce good practices on the features you already know. These reports demonstrate how to use a report library, and they highlight several features – such as alternate rows, hyperlinks, highlighting, aggregations and formatting – that help you display complex data more efficiently and effectively. The reports also show how to use themes to standardize the look and feel of charts and tables, and demonstrate parameters with drop-down lists that use both static and scripted default values. A screenshot of one of the reports appears below.

BIRT with the Power of jQuery – This includes two examples that use jQuery to add functionality to BIRT reports.

Expand and Collapse – Uses jQuery to automatically expand and collapse sections in a report. A plus (+) or minus (-) is added next to report elements that, when clicked, expands or collapses sections of the report. For example, say you have organized your customers by region and country. With this added interactivity, you can start with a compact table and then expand it into the desired country and region in real time. This lets you limit the displayed results without filtering or re-rendering the report.

Highlight on Mouse Hover – This example (shown below) uses jQuery to highlight rows and columns when the user’s mouse hovers over them, which helps users navigate very wide or very tall tables. The jQuery that achieves this behavior updates the CSS properties of the various elements, which gives you flexibility when modifying the styling of the report elements. The effects in this example are achieved primarily by modifying the background color, so any color you can specify through standard CSS can be used for the highlight.
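To give a flavor of the hover technique just described, here is a rough sketch, assuming the BIRT table renders as a plain HTML table. The selector (“table.birt-table”) and the colors are illustrative assumptions, not the exact code shipped in the DevShare example:

```javascript
// Sketch of the hover-highlight idea: when the mouse enters a cell,
// tint that cell's row and column; restore everything on leave.
$(document).on("mouseenter", "table.birt-table td", function () {
  var cell = $(this);
  var columnIndex = cell.index(); // position of this cell within its row
  // Highlight the whole row...
  cell.closest("tr").css("background-color", "#fff3c4");
  // ...and every cell in the same column of the table.
  cell.closest("table").find("tr").each(function () {
    $(this).children("td").eq(columnIndex).css("background-color", "#ffe9a8");
  });
}).on("mouseleave", "table.birt-table td", function () {
  // Clearing the inline styles lets the report's own CSS show through again.
  $("table.birt-table td, table.birt-table tr").css("background-color", "");
});
```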
City Taxi Sample App – This example, as the name implies, is a sample application for an imaginary urban transportation company. It demonstrates how BIRT can power compelling embedded analytic applications. Features demonstrated by this application include:

Information graphics for displaying data in visually appealing formats
Columnar reports with filtering capabilities
Interactive reports that end users can modify from their browsers
Geospatial visualization built into a report
Dashboards designed for Big Data analysis

Additionally, this application demonstrates how BIRT can seamlessly add analytic capabilities to existing web applications. All content is presented as part of the HTML web pages, providing a consistent overall user experience for data analysis. The app also demonstrates how to use the JavaScript API to embed content with minimal coding (a rough sketch of that pattern closes this post).

As you can see, this new DevShare section provides a unified area where iHub, Free Edition examples can be shared and distributed. The examples already available range from simple, elegant reports for people who are just getting started, to full sample applications for developers who are ready to integrate BIRT into their existing applications. Thank you for reading this post on the Examples DevShare. We encourage you to download and work with the examples, then tell us what you like and what more you want to see.
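As a postscript, embedding iHub content with the JavaScript API generally follows the pattern below. The host URL, report path and credentials are placeholders, and signatures can vary by release, so treat this as a sketch to check against the JSAPI documentation rather than working code:

```javascript
// Sketch of embedding a report with the iHub JavaScript API (JSAPI).
// Assumes the page already loads the API script from your iHub server,
// e.g. http://ihub.example.com:8700/iportal/jsapi (placeholder host).
actuate.load("viewer"); // request the viewer component

actuate.initialize(
  "http://ihub.example.com:8700/iportal", // iHub context root (placeholder)
  null,         // request options
  "myUser",     // placeholder credentials
  "myPassword",
  runReport     // callback fired once the API is ready
);

function runReport() {
  // "viewerDiv" is the id of an empty <div> on the host page.
  var viewer = new actuate.Viewer("viewerDiv");
  viewer.setReportName("/Applications/CityTaxi/Dashboard.rptdesign");
  viewer.submit(); // render the report into the div
}
```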


Information Security in the Digital Age [Podcast]

This is the first of what we hope will be many podcasts in which we explore the technology and culture of Enterprise Information Management (EIM). We’re going to share stories about how OpenText is delivering world-class technology and improving our customer experience on a daily basis. In this installment, we hope to give you a better understanding of the current cyber security climate, show you what we’re doing to keep your data secure and protect your privacy, and tell you how you can protect yourself online. Our discussion on information security has been recorded as a podcast! If you’d like to listen but don’t see the player above, click here. If you don’t want to listen to the podcast, we’ve transcribed it for you below:

… The unknown unknown…

… If it was three in the morning and there was a bunch of guys standing down a poorly lit alley, would you walk down there by yourself? Probably not. Yet on the Internet, we do that continuously—we walk down that street—and then we’re shocked when negative things happen…

… People have an expectation that once they put a lock on their door they’re secure. And that might be the case in their home. But electronically it’s not quite so simple…

Are we safe online? Perhaps a better question is whether our information is safe online. 2014 was a banner year for information, data—what we now call cyber—security, and if analyst reports are any indication, security professionals are on high alert in 2015. International governing bodies have also placed an urgency on better understanding cyber security risks and putting in place strategies to ensure stable telecommunications and safeguard information. There has also been growing concern around data privacy. Though security and privacy work hand-in-hand, and it’s difficult to have data privacy without security, there is a difference between the two terms. Security involves the confidentiality, availability and integrity of data. It’s about only collecting information that’s required, then keeping that information safe and destroying it when it’s no longer needed. Privacy, on the other hand, is about the appropriate use of data.

To help us through the topic of cyber security, we talked to Greg Murray, VP of Information Security and Chief Information Security Officer at OpenText. The OpenText security team is made up of specialists around the world who provide operational response, risk assessments and compliance. They also brief executive leadership regularly and keep development teams abreast of pertinent security information. More importantly, Greg and his team work with our customers to ensure their unique security needs are covered end-to-end.

“It starts early in the process,” says Greg. “It starts in the presales cycle, where we try to understand the risks that [our customers] are trying to manage in their organization. We find out how they are applying security against that, and then that becomes a contractual obligation that we make sure is clearly stated in our agreement with the customer. From there, it goes into our operations center—or risk center, depending on what we’re looking at—and we ensure that whatever our obligations are, we’re on top of them and following the different verticals and industries.”

Again, 2014 was a big year for cyber security in the news (I think we all remember the stories of not too long ago).
But while news agencies focused on the scope and possible future threats, Greg learned something else: “I think if we look at media, one probably would not have argued until last year that media was a high-threat area compared to something like aerospace defense. That has changed. Clearly that has changed. As a result, customers come back and say, ‘Hey, our environment has changed. What can you do to help us with that?’”

“What a financial institution requires is very different than what a manufacturing provider requires, or a pharmaceutical organization. Some of that, as a provider to these organizations and customers, we can carry for them on their behalf. In other cases they must carry it themselves. A lot of the discussions that we have with customers are in regards to ‘Where’s that line?’”

“At the end of the day, there’s a collaboration. It’s not all on the customer, it’s not all on OpenText. We have to work together to be able to prove compliance and prove security across the environment.”

Regardless of the size, industry or location of an organization, security needs to be a top priority. This concept isn’t a new one. As Greg told Adam Howatson, OpenText CMO, in a recent Tech Talk interview, information security hasn’t evolved that much over the last 50 years (view the discussion on YouTube). Greg’s answer may surprise you, but after some digging I learned that back in 1998, the Russian Federation brought the issue of information security to the UN’s attention by suggesting that telecommunications were beginning to be used for purposes “inconsistent with the objectives of maintaining international stability and security.” Since then, the UN has been trying to increase transparency, predictability and cooperation among the nations of the world in an effort to police the Internet and private networks. Additionally, if you have seen the Alan Turing biopic The Imitation Game, you know that people have been trying to encrypt and decipher messages since the 1940s, and probably even earlier.

Today, the lack of physical borders online has certainly complicated things, but the information security game remains the same, and cooperation among allies remains the key. “Are we all contributing together?” Greg asks. “If we’re all working together—just like Neighborhood Watch—we need that same neighborhood community watch on the internet. If you see stuff that doesn’t look right, you should probably report it. The bad guys are organized and we need to be organized as well. The more we share information and the more we work together… Particularly at OpenText, we have a lot of customer outreach programs and security work where we work hand-in-hand with customer security teams. By doing that, we improve not only our security, but we improve security across the industry.”

Recently I attended a talk given by Dr. Ann Cavoukian, former Ontario Privacy Commissioner and Executive Director of the Privacy and Big Data Institute at Ryerson University in Toronto. In it, she said that “privacy cannot be assured solely by compliance with regulatory frameworks; rather, privacy assurance must ideally become an organization’s default mode of operation.” She said that privacy—which, again, involves the appropriate use of information—must be at the core of IT systems, accountable business practices, physical design and networked infrastructure. Privacy needs to be built into the very design of a business.
And I think it’s evident from what Greg says about security, and the way OpenText designs its software with users’ needs in mind, that our customers’ privacy and security are an essential part of what we offer.

“We have a tremendous number of technical controls in place throughout all of our systems. For us, though, it starts on the drawing board. That’s when we start thinking about security.”

“As soon as Product Management comes up with a new idea, we sit down with them to understand what they’re trying to achieve for the customer and how we’re going to secure it. So that by the time somebody’s uploading that document, it’s already gone through design, engineering, regression testing analysis, and security penetration testing.”

“One of the other things we do is called threat modelling. Typically we look at the different types of solutions—whether they’re file transfer or transactional, for example—and we look across the industry to see who has been breached and how. We then specifically include that in all of our security and regression testing.”

You don’t need to look further than the OpenText Cloud Bill of Rights for proof of our dedication to information security and privacy. In it, we guarantee the following for our cloud customers:

You own your content
We will not lose your data
We will not spy on your data
We will not sell your data
We will not withhold your data
You locate your data where you want it

Not everyone is up front with their data privacy policy, but with people becoming more aware of information security and privacy concerns, organizations are going to face serious consequences if they do not make the appropriate changes to internal processes and policy.

Data security doesn’t lie solely in the hands of cloud vendors or software developers, however. We asked Greg what users and IT administrators can do to protect themselves, and he said it comes down to three things:

“One is change your passwords regularly. I know it sounds kind of foolish, but in this day and age, if you can use two-factor or multi-factor authentication, that does make a big difference.”

“The second thing you can do is make sure your systems are patched. 95% of breaches happen because systems aren’t patched. When people ask, ‘What’s the sexy side of security?’, it’s not patching. But it works. And it’s not that expensive—it’s typically included free from most vendors.”

“The third thing is ‘think before you click.’ If you don’t know who it is or you don’t know what it is… Curiosity kills the cat, and curiosity infects computers.”

We hope you enjoyed our discussion on information privacy and cyber security. If you’d like to know more about the topics discussed today, visit opencanada.org, privacybydesign.com and of course opentext.com. We also encourage you to learn more about security regulations and compliance by visiting the CCIRC and FS-ISAC websites.


Cloud Quotation Management: Mobility Redefined

Consider this. It’s Friday afternoon. The sun is shining and everyone at the office is enjoying the chance to eat lunch outside. You’re already considering your plans for the weekend. Then the phone rings with a customer on the other end. He needs a quotation. Right away.

You have two options. You can wolf down your lunch, return to your office, and ruin not just your own afternoon, but that of your co-workers and office staff too, trying to put everything together as fast as possible for the client in question. Maybe you’ll get home at 8 p.m., if you’re lucky, with no energy left to enjoy what remains of the day. Alternatively, you can take out your smartphone and have the quotation ready to go before your next bite, sending it directly to the customer or to your manager for approval. And you can have all of that done in a couple of minutes, with calculated prices and discounts, up-to-date information and your company’s corporate branding incorporated. Sounds impossible? It isn’t. In fact, that’s precisely what cloud quotation management is for.

It’s true that cloud computing has had its fair share of growing pains and inconsistencies over the past few years. At first, scalability, document production, storage and printing from the cloud were, at best, available only through unwieldy workarounds. And to make matters worse, there were all kinds of uncertainties regarding hosting and the countries in which sensitive data would be stored. But things have changed: many of those problems are now a thing of the past.

The impact of these changes couldn’t be clearer. Cloud-based CRM systems are becoming increasingly prevalent, and the number of cloud products, apps and solutions designed to increase productivity is growing day by day. This, in turn, has resulted in the slow but sure migration of processes to the cloud as well. And quotation management is no exception, with benefits that include not only completely new possibilities for sales departments, but also an unparalleled boost in efficiency.

Save Time by Using Cloud Quotation Management

There’s nothing worse than having to waste time. Commuting, traveling, waiting at the airport or train station – few other things are as maddeningly unproductive. That’s why vendors such as Salesforce.com, Cobra and SugarCRM are releasing solutions that push the envelope in terms of how useful smartphone business apps can be. And their users couldn’t be happier. This trend may or may not have been kick-started by Salesforce CEO Marc Benioff, who had a vision of managing his company through his smartphone. Now software vendors and app programmers are pursuing that same vision too, coming up with new business apps every second. The result? Even the most experienced salesperson can now increase their productivity and save valuable time by using cloud-based tools – including cloud quotation management – wherever they are.

Increase Reliability by Using Cloud Quotation Management

If you work with Customer Relationship Management (CRM) or Enterprise Resource Planning (ERP) systems, you are probably already aware that appropriate process-specific actions are triggered automatically. However, you may not be aware that you can take advantage of this exact same behavior when using cloud quotation management systems. Just like a local or on-premise CRM system, you can rely on processes and actions that will run automatically as soon as they’re initiated.
Cloud quotation management enables you to trigger these actions and processes from your smartphone or tablet. In fact, these processes can take care of most of the work in the background. They can be used, for example, with PowerDocs to prepare quotations, request approvals and send emails automatically.

Reduce Risk by Using Cloud Quotation Management

By mapping and integrating company-specific processes into your cloud quotation management system, you can minimize quotation errors before you even get started. For example, the editing process, the data retrieval process and the quotation writing process can all be centrally managed. In addition, the system can be set up in such a way that outside and inside salespeople can access the content they need based on permissions, create quotations with only a few taps by relying on smart querying processes, and send their quotations directly to their customers or to their supervisors for approval. And that’s regardless of the number of pages, documents and attachments involved. To put it simply, correctly delineating and defining the processes for cloud quotation management in advance takes a load off sales departments’ shoulders, making their work much easier overall.

Using the Right Cloud Quotation Management Tool

What else do you need to know about cloud quotation management? First of all, it doesn’t necessarily require a cloud-based CRM. Cloud quotation management solutions can also be effectively combined with on-premises systems, meaning that, for instance, you can merge data from ERP, CRM and inventory control systems and retrieve, modify or edit it while you’re with a customer. In addition, every large CRM vendor has cloud apps available, though these admittedly fall short when it comes to company-specific processes and mapping workflows in detail. Not surprisingly, this means that independent tools are often worth a look. In fact, these tools can come in very handy when switching between database systems or merging data from various systems, as they allow salespeople to continue working with the user interface they are familiar with, eliminating frustration and the need for time-consuming training in the process. PowerDocs is one such tool.

Be Ready for the Future Today, with Cloud Quotation Management

The revolution that is cloud computing, apps for business applications, and mobile working solutions has only just begun, and it will keep growing in the coming years. Not surprisingly, communications, as well as the speed at which business is conducted, are changing along with it. This is why it’s necessary to automate as many processes as possible now, in order to meet the future needs of customers today. Cloud quotation management is not only the perfect solution, but also an ideal way to optimize quotation management – all while making it easier and faster.


Digital Disruption: The Forces of Data Driven Smart Apps [Part 3]

Editor’s note: Shaku Atre (at right, above, at our Data Driven Summit last December) is the founder and managing partner of the Atre Group, Inc. of New York City, NY and Santa Cruz, California. (Read more about Atre here.) Atre has written a thorough and compelling treatise on the disruptive power of mobile apps, and supported her analysis and conclusions with templates and case studies. We are privileged to present her analysis here in four parts. In Part 1, she made the case for mobile apps and described some of the forces behind digital disruption. In Part 2, she expanded on those forces. In this post, Atre shares two templates for conceptualizing smart apps; on March 19 she presents case studies in the financial services, telecommunications, car rental, and pharmaceutical industries.

—–

Digital Disruption: The Forces of Data Driven Smart Apps [Part 3 of 4]

Copyright by Atre Group, Inc.

We will use the following two templates to conceptually plan and design smart apps in our case studies for various industries.

Template 1: Representation of the three major participants who create and use data as input to the app.

Figure 1

Publisher X of the app: From a particular industry. The apps, most likely, are stored on servers in the cloud. Responses of the customers, as well as those of potential customers, are collected on the cloud servers.

Functions provided by the publisher’s app: The app will provide certain functions. Let us consider a commercial bank providing a function such as “Basic Consumer and Business Advantage checking with mobile banking.”

Who are the publisher’s customers? E.g., consumers and small business owners. These are the primary users of the app and the primary beneficiaries.

What benefits do these customers (the primary beneficiaries) receive? The consumers save substantial time by not having to travel to an ATM. They can deposit checks much quicker and can have access to the funds much faster.

Potential customers (secondary beneficiaries) may become the publisher’s customers based on referrals from the publisher’s customers. Referrals could be direct contact between the primary and secondary beneficiaries, or they could happen via web-based reviews or some other vehicle. Once the potential customers become customers, they too receive the very same benefits as the primary beneficiaries. There are other types of beneficiaries as well: credit scorekeepers of the customers, for example.

Publisher of the app’s benefits: The publisher, in our case the bank, receives capital, which it can give out as loans to customers at higher interest rates than it pays the depositing entities.

Data is driven from each of the three participants to the other two. Decision making with analytics can take place when all participants use the apps with the appropriate tools.

Template 2: Representation of Data Collection, Data Integration, Data Analysis and Discovery, Data Visualization, and the use of Mobile Technology as the messenger to supply messages to the participants of Template 1.

Figure 2

Generic and high-level view of Big Data and app logic for the smart and novel apps for your customers – the primary and secondary beneficiaries of your industry – along with a multitude of interfaces:

Interface A: Customers are using the app and creating data, such as purchases, returns, money transactions, etc. This results in web logs on the cloud servers.
Responses may be sent back via the cloud servers to the customers.

Interface B: Potential customers are using the app and creating data, such as purchases, returns, money transactions, etc. This results in web logs on the cloud servers. Responses may be sent back by the cloud servers to the potential customers.

Interface C: Some data created by the customers may be sent directly to data storage. A similar interface may exist for potential customers.

Interface D: Customers’ invoices and archived internal data may be stored. Potential customers’ invoices and archived internal data may be stored.

Interface E: Data created by the customers and by the potential customers using the app and the cloud servers is transmitted to and from the customers’ data storage.

Interface F: Archived data, created by customers and potential customers, may be stored with the cloud servers and might be retrieved when necessary.

Interface G: Potential customers (secondary beneficiaries) create data with web logs, click streams, likes, dislikes, and sentiments.

Interface H: Customers’ and potential customers’ transactional data is stored in real-time storage.

Interface I: Potential customers’ (secondary beneficiaries’) data with web logs, click streams, likes, dislikes and sentiments is transferred to the Real Time Data Storage.

Interface J: Customers’ (primary beneficiaries’) data with web logs, click streams, likes, dislikes and sentiments is sent to the Real Time Data Storage.

Interface K: Cloud servers provide marketing and industry-related big data – such as competitive data, data collected by sensors or other means, and social media data (customers’ DNA) – as streams.

Interface L: Marketing and industry-related big data such as competitive data, data collected by sensors or other means, and social media data (customers’ DNA) is streamed to the Real Time Data Storage.

Interface M: Real-time data is transferred to the Analytics Logic Box, where data is analyzed and discoveries are made about customers’ and potential customers’ performance, likes and dislikes.

Interface N: Events are processed at mobile speed; the Analytics Logic Box and the Real Time “Make It Happen” Logic Box work together. Results are modified and travel between the two boxes in real time.

Interfaces O & P: At the same time, the Analytics Logic Box and the Real Time “Make It Happen” Logic Box work together with the Matching Your Products/Services Logic Box to match your products and/or services against the likes, dislikes, discoveries and analytics results of customers and potential customers.

Interface Q: The Real Time “Make It Happen” Logic Box provides a visual display of its results.

Interface R: The Matching Your Products/Services Logic Box provides the results of matches and/or mismatches to the Digital Responses Logic Box.

Interface S: Digital responses are created for communication with the smartphones of the customers and potential customers. If necessary, human staff contact the customers and/or potential customers.

Interfaces T & V: The Visualization Logic Box transfers digital responses, pictured in visual format, to the customers and the potential customers.

Interfaces U & W: Digital responses are sent by the app to the customers and the potential customers. The customers and the potential customers respond back to the app.
—–

On March 19, this four-part blog series will wrap up with Shaku Atre’s case studies for smart apps in the financial services, telecommunications, car rental and pharmaceutical industries. Subscribe (at left) to be notified when the posts are available.


Infusing the Supply Chain with Analytics

The strategic use of supply chain information is a key driver of competitive advantage in the Digital-First World. Once a company’s B2B processes are automated and transactions are flowing, visibility into those transactions can fuel better strategic and tactical decision-making across the entire business network. Insights from analytics allow trading partners to speed their decision-making, rapidly respond to changing customer and market demands, and optimize their business processes.

Our mission at OpenText is to enable our customers to prepare for and thrive in the digital future, and analytic capabilities will play a key role. As I’ve said in previous posts, analytic technologies represent the next frontier in extracting value from enterprise information. For this reason, we are infusing new analytic capabilities into all our core solutions. We recently added new analytic capabilities to the OpenText Trading Grid to help our customers easily access insights for improving their supply chains’ effectiveness. The OpenText Trading Grid is powered by the OpenText Cloud and is the world’s leading B2B integration network, processing more than 16 billion transactions per year and integrating 600,000 trading partners for more than 60,000 customers around the globe.

The solution offers easy-to-use dashboards that depict and summarize data trends and compare them to key performance indicators (KPIs) for the business. They allow companies to evaluate the performance of suppliers or the behavior of customers and use this information to improve processes and relationships. On a more granular level, ‘track and trace’ data highlights exception information about a specific order, shipment, or invoice (a minimal sketch of such an exception check appears at the end of this post). By taking prompt corrective action, companies can remedy a situation before process performance degrades or costs accumulate.

There are many, many scenarios in which analytic insights bring incredible benefit to the supply chain. Armed with information about the physical location of products or shipments, companies are able to plan operations with greater efficiency, reduce the number of items lost in transit, and fulfill orders with greater accuracy. They can replenish products as shortages are detected. When it comes to process automation, information about the performance of equipment is used to track degradation, order replacement parts, and schedule service before failure occurs.

Today, the digital supply chain is an information supply chain that coordinates the flow of goods, communications, and commerce internally and externally across an extended ecosystem of business partners. It seamlessly integrates data from supply chain processes and smart equipment, and tracks intelligent products, parts, and shipments tagged with sensors. Within a few short years it will expand to integrate data from the Internet of Things (IoT). Data will flow online from myriad devices, including wearable technologies, 3-D printers, and logistics drones. It is expected that we will see a thirty-fold increase in web-enabled physical devices by the year 2020. All of these devices producing volumes of data will create a network rich with information and insights. This is the future: a future in which analytics bring incredible value and competitive advantage to the supply chain. And we are only just beginning to envision it.

To learn more about OpenText Trading Grid Analytics, read our press release or visit our website.
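As a closing illustration of the track-and-trace idea above, here is a minimal sketch of an exception check over shipment records. The field names and data shape are illustrative assumptions, not the Trading Grid Analytics data model:

```javascript
// Illustrative track-and-trace exception check: flag any shipment that
// has not been delivered by its promised date. Field names are assumptions.
function findExceptions(shipments, now) {
  return shipments.filter(function (s) {
    return !s.delivered && now > s.promisedBy;
  });
}

// Hypothetical records; in practice these would come from B2B transaction data.
var shipments = [
  { id: "PO-1001", promisedBy: Date.parse("2015-04-20"), delivered: true },
  { id: "PO-1002", promisedBy: Date.parse("2015-04-18"), delivered: false }
];

// PO-1002 is past its promised date and undelivered, so it gets flagged.
findExceptions(shipments, Date.now()).forEach(function (s) {
  console.log("Exception: shipment " + s.id + " is past its promised date.");
});
```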


Apple Watch Validates the Power of Analytics

The Apple Watch is not only the company’s foray into the smartwatch and wearable technology space; it also validates the importance of digital disruption, cloud delivery and embedded analytics. Even before the company’s smartwatch made its formal debut this week, application developers representing companies that provide sports-related, entertainment, and productivity software were called on by Apple to create innovative app designs that highlighted the device as a customized timepiece, instant communications device, and health and fitness companion. The result: Apple executives announced 50 new apps that will all work immediately with the upcoming Apple Watch, including Instagram, MLB.com At Bat, Nike+ Running, OpenTable, Shazam, Twitter, WeChat, Uber, Salesforce, American Airlines and Honeywell Lyric thermostat.

What these applications have in common is that they all disrupt our notion of how content is delivered, as well as how and where information is analyzed and provided. For example, Apple demonstrated with the help of supermodel Christy Turlington Burns how apps might access data such as weight, blood pressure, glucose levels and asthma inhaler use. Third-party devices and apps can measure the data through a cloud delivery system and then notify the user to take appropriate actions. Other data that can be measured and analyzed comes from the watch’s accelerometer, taptic engine, haptic feedback, microphone and gyroscope, and the GPS sensors in your iPhone, to gain insight into the wearer’s gait, motor impairment, fitness, speech and even memory.

Apple Watch and Digital Disruption

In reviewing the potential of the Apple Watch, it is apparent that businesses will be able to capitalize on these digital disruptions in several areas. As a paperwork reduction initiative, Apple said its Watch can make it easier to recruit participants for large-scale research studies. Instead of sending out reams of survey packets, researchers can have participants complete tasks or submit surveys right from an app on their wrist, so researchers spend less time on paperwork and more time analyzing data. Using cloud-delivered analytics, researchers might then present an interactive informed consent process.

Here are some other ways Apple’s Watch creates opportunities for businesses to take advantage of a cloud-delivered embedded analytics engine:

Business Process Management (BPM): The most common process interaction is an approve/reject function. The Apple Watch is likely to raise the bar on how mobile devices handle BPM on the go. Status updates, alerts, reports and approval steps involved in a business process can be conducted on the watch.

Enterprise Content Management (ECM): Collaboration will be the likely use case here. Commenting and following comments from co-workers, trending topics, volume of interactions, as well as simple sharing of documents and folders, are tasks that may move to a watch.

Customer Experience Management (CEM): All content that you see on a watch represents the brand identity of the application provider and defines the essence of the customer experience. Watches will become a digital experience channel that needs to be treated as part of a consistent omni-channel strategy, while delivering the best possible experience given the capabilities and limitations of the device.

Information Exchange (IX): The near-field communication (NFC) sensor in the Apple Watch will enable a new class of applications that can interact with the physical world.
In a factory or warehouse, this may include actions such as retrieving information about a box of parts, ordering new parts when supplies are low, retrieving status updates on current shipments, or delivering supplier alerts.

Another Demo to Watch

While Apple put on an impressive show of its design strength, the company is by no means the first smartwatch maker to demonstrate applications that tap into Big Data and deliver analytics via the cloud. In November 2014, Actuate (now OpenText) combined the power of integrated Big Data access across multiple devices (including a smartwatch) with visualizations, open APIs and embedded analytics. Check out that demonstration in this video. Any thoughts on the Apple Watch, digital disruption or the future of analytics? Leave your comments below.

Read More

Step Aside Cloud, Mobile and Big Data, IoT has just Entered the Room

Mark Morley

This article provides a review of the ARC Advisory Group Forum in Orlando and expands on the ever-increasing importance of analytics in relation to the Internet of Things.

The room I am referring to here is the office of the CIO; or should that be the CTO, or the CDO (Chief Digital Officer)? You see, even as technology is evolving, the corporate role to manage digital transformation is evolving too. Since 2011, when Cloud, Mobile and Big Data technologies started to go mainstream, individual strategies to support each of these technologies have been evolving, and some would argue that in some cases they remain separate strategies today. However, the introduction of the Internet of Things (IoT) is changing the strategic agenda very quickly. IoT as a ‘collective & strategic’ term has caught the interest of the enterprise and the consumer alike, because it allows companies to define one strategy that potentially embraces elements of cloud, mobile and Big Data. I would argue that in terms of IoT, cloud is nearly a commodity term that has evolved into offering connectivity any time, any place, anywhere. Mobile has evolved from simply porting enterprise applications to HTML5 to wearable technology such as the Microsoft HoloLens. Finally, Big Data is broadening its appeal by focussing more on the analytics of information rather than just archiving huge volumes of data. In short, IoT has brought a stronger sense of purpose to cloud, mobile and Big Data.

Two weeks ago I was fortunate to attend the ARC Advisory Group Forum in Orlando, a great conference if you have an interest in the Industrial Internet of Things and the direction this is taking. The terminology being used here is interesting, as it is just another strand of the IoT; I will expand more on this naming convention a bit later in this post. There were over 700 attendees at the conference, and a lot of interest, as you would expect, from industrial manufacturers such as GE, ABB, ThyssenKrupp and Schneider Electric. These companies weren’t just attending as delegates; they were actually showcasing their own IoT-related technologies in the expo hall. In fact it was quite interesting to hear how many industrial companies were establishing state-of-the-art software divisions for developing their own IoT applications.

For me, the company that made the biggest impact at the conference was GE and its Intelligent Platforms division. GEIP focused heavily on industrial analytics and in particular how it could help companies improve the maintenance of equipment, either in the field or in a factory, by using advanced analytics techniques to support predictive maintenance routines. So how does IoT support predictive maintenance scenarios? It is really about applying IoT technologies such as sensors and analytics to industrial equipment, processing the information coming from the sensors in real time to identify trends in the data, and using those trends to predict when a component such as a water pump is likely to fail. If you can predict when a component is likely to fail, you can replace it as part of a predictive maintenance routine, and the piece of equipment is less likely to experience any unexpected downtime. In GE’s case they have many years of experience and knowledge of how their equipment performs in the field, and so they can utilise this historical data as well to determine the potential timeline of component failure.
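To make the predictive maintenance idea concrete, here is a minimal sketch of the kind of trend analysis involved. The component, readings and thresholds are all hypothetical, and a real system (like the ones GE described) would use far richer statistical models plus historical fleet data.

```python
# Minimal sketch of trend-based failure prediction for a sensor stream.
# All thresholds and readings are hypothetical illustrations.

def days_until_failure(readings, failure_threshold):
    """Fit a straight line to daily sensor readings (e.g. pump vibration)
    and extrapolate to estimate when the failure threshold is crossed."""
    n = len(readings)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(readings) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, readings))
             / sum((x - mean_x) ** 2 for x in xs))
    if slope <= 0:
        return None  # no degradation trend detected
    intercept = mean_y - slope * mean_x
    return (failure_threshold - intercept) / slope - (n - 1)

# Daily vibration readings (mm/s) from a water pump, trending upward.
vibration = [2.1, 2.2, 2.4, 2.5, 2.7, 2.9, 3.0, 3.2]
remaining = days_until_failure(vibration, failure_threshold=4.5)
if remaining is not None and remaining < 14:
    print(f"Schedule maintenance: ~{remaining:.0f} days to threshold")
```

A calculation this lightweight could also run on or near the equipment itself, which is essentially the ‘FOG’ processing idea discussed later in this post.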
In fact GE went to great lengths to discuss the future of the ‘Brilliant Factory’. The IoT has brought a sense of intelligence or awareness to many pieces of industrial equipment, and it was interesting learning from these companies about how they would leverage the IoT moving forwards. There were two common themes to the presentations and to what the exhibitors were showcasing in the expo hall.

The first was cyber-security. Over the past few months there has been no end of hacking-related stories in the press, and industrial companies are working very hard to ensure that connected equipment is not ‘hackable’. The last thing you want is a rogue country hacking into your network, logging into a machine on the shopfloor and stealing tool path cutting information for your next great product that is likely to take the world by storm. So device and equipment security is really a key focus area for industrial companies in 2015. Interestingly, it wasn’t just cyber-security of connected devices that was keeping CIOs awake at night; a new threat is emerging on the horizon. What if a complete plant full of connected devices could be brought down by a simple electromagnetic pulse (EMP)? This was another scenario discussed in one of the sessions at the conference, so encryption and shielding of data is a key focus area for many research establishments at the moment.

The second key theme at the conference was analytics. As we know, Big Data has been around for a few years now, but even though companies were good at storing terabytes of data on mass storage devices, they never really got the true value from the data by mining through it and looking for trends or pieces of information that could either transform the performance of a piece of equipment or improve the efficiency of a production process. By itself, Big Data is virtually useless unless something is done with it that results in actionable intelligence and insight delivering value to the organisation. An interesting quote from Oracle: 93% of executives believe that organisations are losing revenue as a result of not being able to fully leverage the information they have. So deriving value from information coming from sensors attached to connected devices is going to become a key growth sector moving forwards. It is certainly an area that the CIO/CTO/CDO is extremely interested in, as it can directly impact the bottom line and ultimately bring increased value to shareholders.

I guess it is no surprise, then, that the world’s largest provider of Enterprise Information Management solutions, OpenText, should acquire Actuate, a leading provider of analytics-based solutions. Last week the Information Exchange business unit of OpenText, which has a strong focus on B2B integration and supply chain, launched Trading Grid Analytics, a value-add service to provide improved insights into transaction-based information flowing across our cloud-based Trading Grid infrastructure. With 16 billion transactions flowing across our business network each year, there is a huge opportunity to mine this information and derive new value from these transactions, beyond the EDI-related information being transmitted between companies on our network. Can you imagine the benefits that global governments could realise if they could predict a country’s GDP based on the volume of order and production related B2B transactions flowing across our network?
Actuate is not integrated with Trading Grid just yet, but it will eventually become a core piece of technology for analysing information flowing across not just Trading Grid but our other EIM solutions as well. It is certainly an exciting time if you are a customer using our EIM solutions! Actuate has some great embedded analytics capabilities that will potentially help improve the overall operational efficiency of connected industrial equipment. In a previous blog I mentioned B2B transactions being raised ‘on device’; well, with semiconductor manufacturers such as Intel spending millions of dollars developing low-power chips to place on connected devices, the device will become even more ‘intelligent’ and almost autonomous in nature. I think we will see a lot more strategic partnerships announced between the semiconductor manufacturers and industrial equipment manufacturers such as GE and ABB.

Naturally, cloud, mobile and Big Data play a big part in the overall success of an IoT-related strategy. I certainly think we will see the emergence of more ‘FOG’-based processing environments. ‘FOG’, I hear you ask? Yes, another term, one I heard at a Cisco IoT World Forum two years ago. Basically, a connected device is able to perform some form of processing or analytics task in a FOG environment, which is much closer to the connected device than a traditional cloud platform. Think of FOG as being halfway between the connected device and the cloud; i.e., a lot of pre-processing can take place on or near the connected device before the information is sent to a central cloud platform.

So, coming back to the conference, there was actually another area that was partially discussed: IoT standards. I guess it is to be expected that, as this is a new technology area, it will take time to develop new standards for how devices are connected to each other and standard ways of transporting, processing and securing the information flows. But there is another area of IoT-related standards that is bugging me at the moment: the many derivatives of the term IoT that are emerging. IoT was certainly the first term, defined by Kevin Ashton, closely followed by GE introducing the Industrial Internet of Things, Cisco introducing the Internet of Everything, and then the German manufacturers introducing Industry 4.0. I appreciate that it has been the manufacturing industry that has driven a lot of IoT development so far, but what about other industries such as retail, energy, healthcare and other industry sub-sectors? Admittedly IoT is a very generic term, but already it is being associated more with consumer-related technologies such as wearable devices and connected-home devices such as Nest. So in addition to defining standards for IoT cyber-security, connectivity and data flows, how about introducing a standard naming convention that could support each and every industry? As there isn’t a suitable set of naming conventions, let me start the ball rolling by defining one! I think the following image nicely explains what I am thinking of here.

In closing, I would argue, based on the presentations I saw at the ARC conference, that the industrial manufacturing sector is the most advanced in terms of IoT adoption. Can you imagine what sort of world we will live in when all the industries listed above embrace IoT? One word: exciting!

Mark Morley currently leads industry marketing for the manufacturing sector at OpenText.
In this role Mark has a focus on automotive, high tech and the industrial sectors. Mark also defines the go-to-market strategy and thought leadership for applying B2B e-commerce and integration solutions within these sectors.

Read More

Data Driven Digest for February 27

Each Friday we share some favorite reporting on, and examples of, data driven visualizations and embedded analytics that came onto our radar in the past week. Use the “Subscribe” link at left and we’ll email you with new entries.

Om Data: Word clouds can be great,* but they’re not always enough; sometimes you also need to show relationships between words. That’s what the folks at Information is Beautiful did with the graphic above (part of a larger infographic; click through to see it). It charts the benefits of meditation and mindfulness based on more than 75 clinical studies; organizes them into four broad groups (cognitive, physical, emotional and social benefits); charts the strength of evidence supporting the claim by word size (as in a word cloud); and organizes it all to show where benefits overlap. (For example, “increased self-control & will power” is considered both a cognitive benefit and an emotional one.) The result is an enlightening visualization.

* You can now create word clouds in OpenText Analytics using the new custom visualizations capability in BIRT iHub 3.1. Learn about the new version here.

Wave Length: Politicians in Washington are once again arguing about immigration policy. If they want to inform their debate with data, they could start with the chart above. (Click through for the full interactive version.) Natalia Bronshtein has visualized nearly 200 years of U.S. immigration data from the federal Yearbook of Immigration Statistics. Hover your mouse anywhere on the graphic to see statistics on the many lands immigrants hail from, organized by country and decade. Bronshtein’s homepage has many more compelling interactive visualizations.

Where Are You: Data Science Central, a social network for big data, business analytics and data science practitioners, has an ongoing project to understand where its members live and work. The organization correlated its own membership list with data from job websites in the U.S. to come up with the map above; Livan Alonso took the data a step further to produce the map below. Vincent Granville, Data Science Central’s founder, has asked Alonso to slice and dice the data further, so this is definitely a work in progress as well as a labor of love.

Got a favorite or trending resource on embedded analytics and data visualization? Share it with the readers of the OpenText Analytics blog. Submit ideas to blogactuate@actuate.com or add a comment below. Subscribe (at left) and we’ll email you when new entries are posted.

Recent Data Driven Digests:
February 20: Lunar new year travels, mapping sounds, winning poker hands
February 13: Shopping malls, subway germs, advice for choosing visualizations
February 6: Anthony Davis, California farmland, Where’s Waldo?

Read More

Forget the Oscars, Tata Motors Won a Bigger Award in Mumbai

Last week I had the pleasure of attending our Innovation Tour event in Mumbai, the first leg of a multi-city tour of the world to showcase our Enterprise Information Management solutions and how they help companies move to the Digital-First World. The event was very well attended, and it was good to see keen interest being shown in our new offerings such as Actuate and Core, as well as our more mature EIM solutions. Enterprise World has traditionally been our key event of the year, but the Innovation Tour provides a way for OpenText to get closer to our customers around the world; Mumbai was no exception, with strong attendance in our expo hall.

I have been to India before, two years ago in fact, to meet with an automotive industry association that looks after the ICT needs of the entire Indian automotive industry. Back then, the discussion was focused around B2B integration. However, last week’s event in Mumbai showcased all solutions from the OpenText portfolio. One of the interesting solution areas being showcased by one of our customers was Business Process Management (BPM), and it is only fitting that one of our India-based customers won an award for their deployment of BPM. Why fitting? Well, India has long been the global hub for business process outsourcing, so I guess you could say there is a natural interest in improving the management of business processes in India. OpenText has a strong presence in the Indian market.

OpenText presented a number of awards during the event, and Tata Motors was the worthy winner of the award for the best deployment of BPM. Incidentally, Tata Motors also won the global Heroes Award at last year’s Enterprise World event for their deployment of our Cordys BPM solution. So who are Tata Motors, I hear you ask? Well, they are the largest vehicle manufacturer in India, with consolidated revenues of $38.9 billion. Tata Motors is part of a large group of companies which includes Tata Steel, Jaguar Land Rover in the UK, Tata Technologies and many other smaller companies that serve the domestic market in India. Tata Group is fast becoming a leading OpenText customer, showcasing many different EIM solutions. For example, Jaguar Land Rover uses OpenText Managed Services to manage B2B communications with over 1,200 suppliers following its divestiture from Ford in 2009. Tata Steel in Europe also uses our Managed Services platform to help consolidate eleven separate EDI platforms and three web portals onto a single, common platform. So, simplification and consolidation of IT and B2B infrastructures is a common theme across Tata Group, and Tata Motors is no different with their implementation of OpenText BPM.

Tata Motors has struggled over the years to exchange information electronically with over 750 vehicle dealers across India. Varying IT skills and multiple business processes, combined with having to use a notoriously difficult utilities and communications infrastructure across the country, were really starting to impact Tata Motors’ business. In addition, their IT infrastructure had to support over 35,000 users, and there were over 90 different types of business application in use across 1,200 departments of the company. So ensuring that accurate, timely information could be exchanged across both internal and external users was proving to be a huge problem for Tata Motors. Step forward, OpenText BPM!
Tata Motors decided to deploy our Cordys BPM solution as a SOA-based platform to connect all their business applications and, more importantly, provide a common platform to help exchange information electronically across their extensive dealer network. Even though they had deployed Siebel CRM across their dealer network, Tata Motors faced a constant challenge of having to process a high volume of manual, paper-based information; quite often this information would be inaccurate due to mis-keying. A simple mistake, but when scaled up across 750 dealers it can have a serious impact on the bottom line and, more importantly, on customer satisfaction levels with respect to new vehicle deliveries or spare parts related orders.

Tata Motors had a number of goals for this particular project:

Implement a Service-Oriented Architecture – The primary objective was to set up a SOA environment for leveraging existing services and hence avoid re-inventing the wheel. They also wanted to use this platform to streamline the current integrations between multiple business systems.

Process Automation / Business Process Management – They had a mix of manual, semi-automated and completely automated processes. Manual and semi-automated processes were inefficient and in some cases ineffective as well, and some of their automated processes were actually disconnected from actual business scenarios. The goal for implementing BPM was to bring these processes closer to the ‘business design’, thus improving efficiency and process adherence.

Uniform Web Services Framework – Tata Motors’ goal was to establish a single source of web services that could convert existing functionalities of underlying service sources into interoperable web services.

So, what were the primary reasons for Tata Motors choosing OpenText BPM? It was a SOA enabler; it offered strong business process automation capabilities; it was a comprehensive product for application development; it minimized application development time; and it improved cost effectiveness. Their BPM implementation covered two main areas:

Enterprise Applications Integration – This mainly deals with inward-facing functionalities of employee and manufacturing related process applications. They had many applications, but these shared a common fault: they did not follow SOA principles. Web services had to be developed inside every application, which was very inefficient from a time and resources point of view. In addition, if an application had to connect to SAP, it did so over an independent, unmanaged and insecure connection.

Customer Relationship & Dealer Management Systems Integration – Tata Motors is the biggest player in the commercial vehicles sector in India and one of the biggest in terms of passenger car sales, with over 750 dealers scattered across India. The dealerships are managed using a Siebel CRM-DMS implementation, but with many changes being rolled out across the system, a supporting platform was needed to effectively manage this process. Cordys became the primary environment for developing CRM-DMS applications.

So in summary, Cordys BPM has been integrated with SAP, Siebel CRM-DMS, Email/Exchange Server, Active Directory, Oracle Identity Manager, an SMS gateway and mobile applications across Android and iOS. The Cordys implementation also delivered a number of business benefits, including improved process efficiency, stronger process adherence, a SOA-based platform, and significant cost and time savings. The project has already achieved its ROI!
Moving forwards, OpenText BPM will act as a uniform, centrally managed and secure web services base for all applications used across the Tata Motors landscape, irrespective of the technology in which each is developed. The platform will also provide an evolving architecture to mobilise existing applications, and they plan to integrate with an in-house developed document management system. Finally, the go-forward plan is to move their Cordys implementation to the cloud for improved management of their infrastructure.

I have visited many car manufacturers over the years, and one company headquartered in the Far East had over 300 dealers in Europe, each of which had been allowed to implement its own CRM and DMS environments to manage its dealer business processes. Prior to the acquisition of GXS (my former company) by OpenText, I had to inform them that GXS didn’t have a suitable integration platform to help seamlessly connect all 300 dealers to a single platform. With OpenText BPM we can clearly achieve such an integration project now, and Tata Motors is certainly a shining light in terms of what is achievable from an extended enterprise application integration point of view. Congratulations, Tata Motors!

For more information on OpenText BPM solutions, please CLICK HERE. Finally, I just want to say many thanks to my OpenText colleagues in India; it was a very successful event and a team effort to make it happen. For more information on our Innovation Tour schedule, please CLICK HERE.

Read More

Could the Smart Trash Can Take Waste Out of the Supply Chain?

In my last post I introduced a vision for the smart trash can that would automatically identify the items you are throwing away. What would you do with the data collected? The waste management company may not have much use for it, but manufacturers and retailers who are trying to predict what consumers are going to buy next would find it very valuable.

How would the smart trash can work? There are a couple of different options. Version one (circa 2017) would probably rely on the use of RFID tags and readers. If manufacturers put RFID tags on each of the items you purchased, then the smart trash can would be able to identify them automatically. Version two (circa 2018) might add a camera to the lid. As items are being disposed of, the camera would automatically recognize them using the “visual search” technology found in Google Goggles. Perhaps the smart trash can might have multiple cameras at different depths that could see through trash bag liners to identify items at rest. Version three (circa 2020) might be more advanced, with capabilities to identify items based upon smell. Perhaps the trash can would be fitted with sensors that can detect odors and identify items based upon their chemical composition.

You might be wondering what data retailers and manufacturers use to forecast demand today, and whether smart trash cans would provide an improvement. Today, the primary data used for forecasting demand is the information about what shoppers are buying at individual stores, or what is called “Point-of-Sale” data. Every night, retailers and manufacturers run reports to understand how many of each item was sold in each store. They then try to guesstimate how much inventory they have on hand and whether or not they are going to run out of stock in the coming days (or weeks or months). If they are running low on inventory, then they will need to issue a replenishment order.

Would trash can data provide better insights than Point-of-Sale data? What provides better insights into future sales – what people are buying or what they are throwing away? Let’s first think about items that are regularly purchased – batteries, diapers, detergent, shampoo, soda, milk, bread and salty snacks. I would argue that monitoring consumption (via trash cans) of these repeat purchases is a better indicator of near-term demand. If someone throws out a milk container, they are very likely going to buy a new one in the next 24 hours. In many cases, the disposal of an item after it is consumed is the event that triggers the need to buy another one. But what about items which are not consistent, repeat purchases? Examples might include toys, electronics, clothing and shoes. For these inconsistent purchases you might question the validity of the correlation between waste patterns and future purchases. Just because you throw something away doesn’t mean that you are going to purchase it again – immediately or ever. The value of using the trash data is clear, though, for groceries and regular purchases. Will Nestle, Procter & Gamble and Tesco begin giving away free kitchen trash cans to consumers just to collect data and be optimally positioned for replenishment orders?
Or even better, what if your smart trash can was linked to your online grocery account? Items detected in your trash can (or recycling bin) could be automatically identified and then transmitted to the garbage truck upon pickup at your house. A replenishment algorithm could review your list of “always in stock” items to determine if the item should be replaced immediately. If yes, then a home delivery provider might visit a few hours later to drop off new supplies on your doorstep. Amazon Fresh could be extended to include Amazon Trash. Walmart might buy a waste management company. The smart trash can could create a myriad of new opportunities in the supply chain.
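Since this is all a thought experiment, the replenishment check itself could be surprisingly simple. Here is a hedged sketch; the item catalogue, the pantry count and the trigger are all invented for illustration.

```python
# Hypothetical sketch of the "always in stock" replenishment check
# described above. Item data and trigger logic are invented.

ALWAYS_IN_STOCK = {"milk", "diapers", "detergent", "batteries"}

def on_item_disposed(item_id, pantry_count):
    """Called when the smart trash can identifies a discarded item
    (via RFID tag, camera, or odor sensor)."""
    if item_id not in ALWAYS_IN_STOCK:
        return None  # inconsistent purchases: disposal may not predict demand
    if pantry_count > 0:
        return None  # a spare is already on hand
    return {"action": "reorder", "item": item_id, "quantity": 1}

order = on_item_disposed("milk", pantry_count=0)
if order:
    print(f"Replenishment order raised: {order}")  # hand off to home delivery
```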

Read More

4 Questions: Dieter Meuser Discusses Analytics in Manufacturing

Dieter Meuser and two colleagues founded iTAC Software AG in 1998 to commercialize a manufacturing execution system (MES) developed for Robert Bosch GmbH. The timing was ideal, as manufacturers sought ways to leverage the nascent Internet to automatically monitor and manage their plants, and thereby improve quality and efficiency to bolster the bottom line. Today, iTAC (Internet Technologies & Consulting) is one of the leading MES companies serving discrete manufacturers, and has customers throughout Europe, Asia and the Americas. iTAC.MES.Suite is a cloud-based, Java EE-powered application that enables IP-based monitoring and management of every aspect of a manufacturing plant. OpenText Analytics provides the business intelligence (BI) and analytics capabilities embedded in iTAC.MES.Suite. You can see the software in action in booth 3437 at the IPC APEX EXPO, held February 22-26 in San Diego, California. Learn more about iTAC’s participation in IPC APEX EXPO here, and learn more about the company at the end of this post.

Meuser, iTAC’s CTO, has extensive experience working with manufacturers worldwide. He’s also an expert on the German government’s Industry 4.0 initiative to develop and support “Smart Factories” – that is, manufacturing plants that leverage embedded systems and the Internet of Things (IoT) to drive efficiency and improve quality. We asked Meuser for his thoughts on these topics.

OpenText: iTAC has more than two dozen enterprise customers with plants in 20 countries. What do those customers say are their pain points, particularly with regard to data?

Meuser: The biggest single pain point is this: Companies have lots of data, but often they are unsure how to analyze it. Let me elaborate: Many types of data are recorded to fulfill manufacturers’ tracking and tracing requirements. (Called “traceability standards,” these include VW 80131, VW 80160, MBN 10447, GS 95017 and others.) Data collected via sensor networks, such as plant temperature or humidity, are part of these standards. The objective of collecting this data is to continuously improve manufacturing processes through correlation analysis (or Big Data analysis), accomplished by running the data through intelligent algorithms. But because manufacturers frequently aren’t sure which criteria they should use to analyze the data, analysis often does not happen to the extent that manufacturers want. As a result, data is collected and stored for possible later analysis. This can lead to a growing mountain of unanalyzed data and very little continuous improvement of processes. But it also illustrates why introducing data management right at the beginning of a tracking and tracing project is so important. Data management, supported by analytics, enables process optimization that otherwise would fall by the wayside.

OpenText: How do manufacturers use and analyze data – sensor data in particular – to improve their processes?

Meuser: Within manufacturing plants, the most common analysis is called Overall Equipment Effectiveness (OEE), based on integrated production data collection and machine data collection (PDC/MDC). This is done within the plant’s manufacturing execution system (MES). PDC/MDC can happen automatically if the plant’s systems are integrated, or manually via rich clients. The captured data can be evaluated in real time and analyzed over freely selectable time intervals.
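(Editor’s note: for readers unfamiliar with the metric, OEE is conventionally calculated as the product of three ratios: availability, performance and quality. Below is a minimal sketch of that standard calculation with invented shift figures; it does not show iTAC’s actual implementation.)

```python
# Standard OEE calculation: availability x performance x quality.
# The shift figures below are invented for illustration.

planned_time = 480          # minutes in the shift
downtime = 45               # unplanned stops (from MDC data)
ideal_cycle_time = 0.5      # minutes per part
parts_produced = 780        # total output (from PDC data)
good_parts = 760            # output minus scrap

availability = (planned_time - downtime) / planned_time
performance = (ideal_cycle_time * parts_produced) / (planned_time - downtime)
quality = good_parts / parts_produced

oee = availability * performance * quality
print(f"OEE = {oee:.1%}")   # ~79.2% here; world-class is often cited as 85%
```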
Common analyses include comparing planned changeover and turnaround times with actual values; comparing actual production (including scrap) with forecasts; and examining unexpected equipment breakdowns. Key Performance Indicators (KPIs) from these analyses feed into OEE, productivity and utilization measures.

Reducing non-conformance costs is another important business case for data analysis in both IoT and Industry 4.0. The availability of structured and unstructured sensor data related to product failures (and the costs associated with them) enables new opportunities to determine non-conformance. There is enormous potential in systematically analyzing causes of production failure. Failure cause catalogues (which many manufacturers have collected for decades) can be examined with the help of a modern data mining tool. Analyzing this data on the basis of quality, product and process data helps to reduce failure costs in a Smart Factory.

OpenText: What is the role of analytics and data visualization in IoT and Industry 4.0?

Meuser: A major objective of data analyses and visualizations in IoT and Industry 4.0 is automatic failure cause analysis. This is accomplished by measuring and testing product errors along with data about manufacturing machines, equipment and processes, then identifying inefficient processes in order to establish solutions. These solutions must be checked by process engineers who have years of experience. Humans and machines go hand in hand when we optimize product quality in an Industry 4.0 factory.

OpenText: What are the benefits of a Smart Factory?

Meuser: A Smart Factory consists of self-learning machines that can identify the causes of failure under specific conditions, determine appropriate measures to address a failure, and send messages to inform operators of problems. This is sometimes called a cyber-physical system (CPS). Combined with appropriate software models, it enables autonomous manufacturing machines (within certain limits) and supports the overall objective to optimize processes and avoid failures before they happen. The Smart Factory is enabled by modern data analysis techniques. It relies on data about products, processes, quality and environment (e.g. room temperature or humidity) as appropriate. The ability to interface an ERP system with production equipment creates continuous vertical integration that covers the entire value chain, from receiving to shipping.

Be sure to visit iTAC at the IPC APEX EXPO, and read more about iTAC.MES.Suite in this case study.

More about iTAC

iTAC Software AG is a leading provider of next-generation, platform-independent and cloud-based MES solutions for original equipment manufacturers (OEMs) and suppliers within the discrete manufacturing sector. The company has more than 20 years of experience in internet-based solutions for the discrete manufacturing sector, the Internet of Things (IoT) and the Industrial Internet. To date, iTAC has amassed an enviable portfolio of over 70 global enterprise customers across five primary industries: automotive, electronics/EMS/telecommunications technology, metal fabrication, energy/utilities and medical devices. Customers including Audi, Bosch, Continental, Hella, Johnson Controls, Lear, Schneider Electric, Siemens and Volkswagen rely on the iTAC.MES.Suite to optimize their production processes. iTAC’s product portfolio represents the solutions for the Smart Factory of tomorrow.
Its principal components are the iTAC.MES.Suite, the iTAC.Enterprise.Framework and iTAC.embedded.Systems, including its platform-independent iTAC.ARTES middleware and iTAC.Smart.Devices, the company’s new physical interface solutions.

Read More

3 Questions: Randy Pennington Discusses Data’s Role in Organizational Change

If there’s one thing today’s CIOs understand, it’s change. Everything about the CIO’s job is changing: technology, staffing, vendors, delivery models, and – most importantly – expectations. But ask CIOs if they have mastered change – and can deliver positive results from it – and their answers may not be so positive. That’s why Randy Pennington was asked to deliver the afternoon keynote address at CIO Perspectives in Dallas on February 26. Pennington has spent a quarter-century helping organizations deliver positive results in a world of constant change. He is the author of three top-rated business books, a consultant to companies large and small, and a sought-after public speaker. (Learn more about him at his website.) Pennington made time in his busy schedule to discuss how data and analytics can support organizational change.

OpenText: In your keynote at CIO Perspectives you plan to talk about how successful IT leadership requires constant change and adaptation. Can data help CIOs navigate the changes they face? What data – and data analysis tools – are most effective and helpful?

Pennington: Data can absolutely help CIOs determine what needs to change and how they are progressing in their efforts. The key point to remember is that the data has to be important from a business perspective, and that means looking at it through the eyes of the customer – or, in many cases, customers. Data analytics tools that show how the change will enable the operation to run faster, better, cheaper, and/or friendlier are critical for gaining initial buy-in from senior-level stakeholders. And once the change is underway, showing continual progress against agreed-upon metrics helps you fine-tune and, most important, maintain the effort after the initial push. People buy into change because it seems like a good idea or they see others doing something similar. They continue to support and sustain it only when it produces results that are important to them. CIOs must also remember that the data that is important to senior leaders may not be important to end users. To get buy-in from end users, we should also look at the data from their perspective. We tell our clients that there are only three things you can measure in an organization: activities, results, and perceptions. Results (the hard numbers) and perceptions (the feelings) are actually lagging indicators. Data related to the activities required to produce the desired result is the best leading indicator of success.

OpenText: You write, “Too often, we have treated people like data and things to be managed rather than as human beings with dreams, aspirations, and choices.” With that in mind, does data play a role in how organizations are organized and run? How can it be effectively marshalled to support change?

Pennington: Data plays a critical role in how organizations are organized and run. The best organizations are data-driven in how they make decisions. But there is an emotional aspect to every human decision, even those that are framed in the context of logical, factual analysis. Leaders run into problems when they assume that everyone will respond to the facts supported by the data. People support and take action to change for their own reasons, not for ours. There should be a good business reason to initiate change within the organization, and that decision should be supported by some form of data. But that alone will not energize and engage people.
We have to translate and connect that data to something that is important to the individual in order to secure buy-in for change.

OpenText: One theme of your book Results Rule! is this: “Delivering results is more about what you do than what you know. The heroes of the marketplace minimize the gap between knowing and doing.” How can people – particularly those in IT – translate what they know into what they do?

Pennington: One question that I often ask when I begin work with a client is this: “Do things need to change around here?” Everyone says yes, and everyone has an idea of what to do. Then I ask this follow-up question: “How long have you known that needs to be done?” The answers range from weeks to years. And that is the point. Most of us are aware of actions we could take, or at least areas on which we should focus, to improve our operations. We may not always have the knowledge or tools to do what we need to do, but we know what it is. One specific practice that IT leaders can use – and this sounds really obvious – is to put our good intentions about empowerment into action. Everyone on your team knows something that must be done to make things faster, better, cheaper, and/or friendlier. For instance, one client reduced the time to process travel reimbursements from three weeks to a little over four days. All it took was the leadership deciding to truly empower the people who knew what needed to be done. I think it is the same for many IT leaders. They are failing to fully utilize the knowledge and ideas of their staff, and that extends the gap between knowing and doing.

Randy Pennington (@RandyPennington) will share more ideas on Making Change Work in a closing keynote at CIO Perspectives, held February 26 in Dallas, Texas. Also on the program, OpenText’s Brian Combs will appear in Straight Talk about SMAC (social, mobile, analytics and cloud), a panel moderated by IDG Enterprise SVP and Publisher Adam Dennison (@adamidg). Learn more and sign up for CIO Perspectives today.

Read More

OpenText Enhances Portfolio with Analytic Capabilities

By Mark Barrenechea, President and Chief Executive Officer, OpenText

Analytics are a hot technology today, and it is easy to see why. They have the power to transform facts into strategic insights that deliver intelligence “in the moment” for profound impact. Think “Moneyball” and the Oakland A’s in 2002, when Billy Beane hired a number-crunching statistician to examine their odds and changed the game of baseball forever. Across the board – from sports analysis to recommending friends to finding the best place to eat steak in town – analytics are replacing intelligence reports with algorithms that can predict behavior and make decisions. Analytics can create the 1 percent advantage that makes the 100 percent difference between winning and losing.

Analytics represent the next frontier in deriving value from information, which is why I’m pleased to announce that OpenText has recently acquired Actuate to enhance its portfolio of products. With powerful predictive analytics technology, Actuate complements our existing information management and B2B integration offerings by allowing organizations to analyze and visualize a broad range of structured, semi-structured, and unstructured data. In a recent study, 96 percent of organizations surveyed felt that analytics will become increasingly important to their organizations in the next three years.

From a business perspective, analytics offer customers increased business process efficiencies, greater brand experience, and additional personalized insight for better and faster decisions. In a Digital-First World, organizations will tap into sophisticated analytics techniques to identify their best customers, accelerate product innovation, optimize supply chains, and identify the drivers of financial performance. Agile enterprises incorporate consumer and market data into decision making. People are empowered when they have easy access to agile, flexible, and responsive analytical tools and applications. Actuate enables developers to easily create business applications that leverage information about users, processes, and transactions generated by the various OpenText EIM suites. Customers will be able to view analytics for the entire EIM suite based on a common platform to reduce their total cost of ownership and get a comprehensive view for more elevated, strategic business insight.

Actuate is the founder of the popular open source integrated development environment (IDE), BIRT, and develops the world-class deployment platform, BIRT iHub™. BIRT iHub™ significantly improves the productivity of developers working on customer-facing applications. More than 3.5 million BIRT developers and OEMs use Actuate to build scalable, secure solutions that deliver personalized analytics and insights to more than 200 million customers, partners and employees. Designed to be embeddable, the platform lets developers enrich nearly any application, and these analytics-enriched applications can be delivered on premises, in the cloud, or in any hybrid scenario.

We are excited to welcome the Actuate team into the OpenText family as we continue to help drive innovation and offer the most complete EIM solution in the market. Read the press release on the acquisition here.

Read More

Will the Creation of ‘On Device’ or ‘On Thing’ Based B2B Transactions Ever Become a Reality?

Over the past five years, CIOs around the world have been rolling out their cloud-based B2B strategies. Whether deploying B2B on premise, in the cloud or as a hybrid environment, companies have been able to deploy B2B infrastructures according to their budget, strategy and technical capabilities. Infrastructure-as-a-Service, Platform-as-a-Service and Software-as-a-Service initiatives have been deployed with great effect, and numerous other ‘as-a-Service’ definitions have evolved. So where next for B2B-based infrastructures? Well, with nearly every CIO formulating a strategy in support of the Internet of Things, how about an ‘On Device’ or ‘On Thing’ based B2B strategy?

I have posted twenty or so blogs relating to cloud infrastructures since 2010, and over the past year I have spent some time looking at the Internet of Things and where this may go in relation to supply chains of the future. In a couple of my IoT-related blogs I provided some examples of how I thought IoT-connected devices could connect into an enterprise infrastructure (read about it here) and then initiate some form of closed-loop ordering process as part of a replenishment or predictive maintenance scenario. I read an article on CIO.com last September where the author described something called the Internet of ‘Things as a Service’, or TaaS for short. I didn’t realise it at the time of writing my own blogs, but this is exactly what I was describing: namely, that a connected device will be able to analyse its own consumption trends or wear rates and then place some form of order for replacement parts without any human intervention. OK, it sounds a bit far-fetched, but I can guarantee this is where things, no pun intended, will be going in the future.

Billions of dollars are being spent on developing onboard or embedded processing, sensing, storage and analytics technologies for IoT-based devices. Many companies such as Intel are betting huge research budgets to develop next-generation semiconductor chips that can be embedded on ‘things’. In fact, only last week OpenText acquired a leading analytics company, and they have been looking at embedded analytics for IoT devices. I will take a look at embedded analytics in relation to B2B in a future blog entry, as I believe it will transform how companies visualise, interact with and manage B2B-related information flowing across the extended enterprise.

Two weeks ago I had an interesting discussion with ARC Advisory Group relating to device or ‘thing’ level creation of B2B transactions. ARC use the term Industrial Internet of Things (IIoT) to describe their take on this area, as they are keen to differentiate themselves from more consumer-focused IoT devices such as wearable technology and home automation equipment. As I have mentioned before, there are many big players entering the IIoT space, for example GE (who originally coined the IIoT term), Cisco and Bosch, to name but a few. Could we see a piece of equipment in the field, for example a generator or excavator, initiating a B2B transaction by itself to order a replacement part that is just about to fail? For the purposes of this blog I just wanted to introduce the idea of a device or ‘thing’ derived B2B transaction, and you can read more in the ARC article that was written to support this.
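To illustrate the idea, here is a hedged sketch of what an ‘on thing’ transaction might look like. The wear model, part numbers and trigger threshold are all hypothetical, and a real implementation would emit a proper EDI or XML document over a secured B2B network rather than a bare JSON payload.

```python
# Hypothetical sketch of a device raising its own replacement-part order.
# Part numbers, thresholds and payload format are invented for illustration.
import json

WEAR_LIMIT = 0.80  # fraction of usable component life consumed

def check_and_order(device_id, part_number, wear_fraction):
    """Run on (or near) the device: if the monitored component is close
    to end of life, emit an order payload for the B2B network to route."""
    if wear_fraction < WEAR_LIMIT:
        return None  # component still healthy; no transaction raised
    return json.dumps({
        "transaction": "PURCHASE_ORDER",
        "originator": device_id,
        "line_items": [{"part": part_number, "quantity": 1}],
        "reason": f"wear at {wear_fraction:.0%} of usable life",
    })

order = check_and_order("excavator-0042", "PUMP-SEAL-7", wear_fraction=0.83)
if order:
    print(order)  # in practice: hand off to the B2B integration layer
```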

Read More

Expert Advice on Embedded BI with Howard Dresner [Webinar]

For once, your CEO and CIO agree on something: Your company needs to embed analytics into its applications. You’ve been tasked with researching which platform is best for you, and you probably have two items on your to-do list: Learn from an industry expert who thoroughly studies the many different embedded analytics platforms, and hear from a company that has successfully embedded analytics into its software. You can do both on January 22 by attending Embedded BI Market Study with Howard Dresner, a free webinar sponsored by Actuate.

Dresner, you probably know, is Chief Research Officer of Dresner Advisory Services, a respected technology analyst firm. Dresner (@howarddresner) coined the term “business intelligence” in 1989 and has studied the market drivers, technologies, and companies associated with BI and analytics ever since. It’s safe to say that nobody knows the sector better. In this webinar, Dresner will highlight the results of his recent Wisdom of Crowds report, the Embedded Business Intelligence Market Study, published in October 2014. Dresner’s study taps the expertise of some 2,500 organizations that use BI tools, focusing specifically on their efforts to embed analytics in other applications.

In the webinar, Dresner will cover three main subjects:

User intentions for – and perceptions of – embedded analytics, segmented by industry, types of users, architecture and vendor

Architecture needs and priorities (such as web services, HTML/iFrame and JavaScript API) for embedding, as identified by technologists who implement embedded analytics

Ratings of 24 embedded BI vendors, based on both the architecture and features the individual vendors offer, and the reasons Actuate garnered the top ranking

To add the user’s perspective, Dresner will then give the floor to Kevin Larnach, Executive Vice President of Operations at Elcom. Larnach will explain how Elcom embeds Actuate’s reporting solution in PECOS, its cloud-based e-procurement solution. Embedded analytics enables users of PECOS – a user base 120,000 strong, in more than 200 organizations, managing nearly $20 billion in total procurement spending annually – to access standard reports, slice and dice data for analysis, create custom reports and presentations of the data, and export transaction history to many different formats, all without IT expertise. As this diagram shows, PECOS touches all aspects of the procurement process. PECOS users include the Scottish Government (including health services, universities and colleges, and government departments), several health services groups in Britain, the Northern Ireland Assembly, several school districts in the United States, the Tennessee Valley Authority (TVA), and many other organizations and companies.

Elcom has identified over a billion dollars in audited savings that its customers have accrued thanks to embedded analytics – more than $500 million in the healthcare sector alone. Elcom’s application is truly an embedded analytics success story. The embedded analytics capability in PECOS, delivered with Actuate technology, is an important competitive differentiator for Elcom. Its competitors’ products either have limited fixed reporting, or don’t offer any standard reporting at all. Those competitors “are scrambling to adopt a flexible embedded approach such as the one enjoyed by PECOS users,” Elcom says. You’re sure to have questions for Dresner and Larnach, so the webinar will include a Q&A session.
(An Actuate technical expert will also be on hand if you have specific questions about our embedded analytics capabilities.) The webinar will be accompanied by live tweets using the hashtag #embeddedanalytics. Register today.

Read More