Cloud

8 Top Considerations for a Successful Cloud Partnership

As your organization embarks on creating a cloud strategy, there are many things to consider. What are the top benefits you are looking to achieve by moving workloads into the cloud? What areas of your business are you willing to consider as cloud or hybrid cloud implementations? Most of all, as you move into the cloud you are faced with extending your infrastructure and IT team to include your cloud provider. What are the key things you should consider as you determine which cloud partnerships you will embrace?

One Size Does Not Fit All

Developing a cloud strategy is an exercise in understanding your business processes, workloads, security rules, compliance requirements, and user adoption, just to name a few. This is not a simple task, and moving business to the cloud might mean combining several deployment and costing strategies. Options for on-premises, public cloud, private cloud (both on and off site), hybrid cloud, SaaS, PaaS, and IaaS all offer organizations flexibility, but they can also be confusing. Which offering is ideal for which business process to generate the highest efficiency and cost benefit? There is no one-size-fits-all solution, which means IT leaders need to be prudent about cloud options and work with their vendors to ensure they get the most value for their investment.

Information Matters

As we live in the digital age, many organizations recognize the value of information and place significant priority on protecting it. Information Governance, knowing what information you have, where it is, and what you need to do with it, has never been more important. When organizations look at moving information to the cloud, they need to be extra vigilant to ensure that their information policies and compliance regulations are both understood and upheld by their cloud partner. The value of information, and the level of control it requires, should play a part in an organization’s decision about which applications and which data will reside in the cloud, on premises, or as part of a hybrid cloud implementation.

Read More

Accessible PDF Discussion: A Solution for High-Volume Customer Communications

To help them achieve complete web accessibility, organizations require a viable technology solution for automatically generating high-volume (i.e., enterprise-level) personalized customer communications in Accessible PDF format. Such a solution would give blind and visually impaired people immediate and equal access to electronic documents, which they currently do not have. This series of blog posts explains why demand is increasing for customer communication documents in Accessible PDF format, describes current industry practices for producing these documents, and introduces a new technology option that enables organizations to keep PDF as their default electronic format for high-volume, web-delivered documents of record while enabling full accessibility and usability of those PDFs for individuals who are blind or visually impaired. In recent blog posts we examined the Drivers behind Web Content Accessibility, Best Practices for PDF and Accessible PDF, and Approaches to Generating Accessible PDFs. In this post, we will examine a new, patented, state-of-the-art technology solution that is specifically designed to generate high volumes of Accessible PDFs. You can also access the complete white paper on this topic, Enterprise-Level PDF Accessibility: Drivers, Challenges and Innovative Solutions.

New Technology for Generating Enterprise-Level Accessible PDFs

A first-to-market, enterprise-level technology for automatically producing personalized customer communications in Accessible PDF format is now available. The new technology converts print streams, documents, and data into Accessible PDFs, either in high-volume batches (i.e., thousands, millions) or individually on demand (dynamically in real time). Organizations can use this type of innovative technology to simultaneously improve the customer experience for people with visual disabilities and comply with relevant accessibility legislation such as the ADA, Section 508 of the Rehabilitation Act, Section 255 of the Telecommunications Act, and the CVAA.

Accessibility Rules

Unlike manual remediation, this automated technology leverages a sophisticated, inherently flexible rules model, ensuring that each source document, whether in PDF or print stream format, completely incorporates the specified accessibility rules. Accessibility templates can be easily edited to ensure production continuity of recurring (e.g., monthly) high-volume transactional documents. Organizations may need to secure accessibility expertise (at least initially) to define PDF accessibility rules for each document type, although document authors and creators require no specialized accessibility knowledge because the automated technology includes a highly intelligent graphical user interface.

Quality Control

Visual PDF tag inspection and usability testing is not required on every page, as it is with manual remediation. Instead, quality control can be maintained with automated and manual accessibility/usability testing on small batches of documents. (A small illustrative sketch of an automated structure check appears at the end of this post.) This patented technology utilizes the PDF/UA format (ISO 14289-1) and incorporates the Matterhorn Protocol to generate Accessible PDF output that has been independently tested and found to conform to WCAG 2.0, Level AA, by nationally and internationally recognized advocacy organizations and respected accessibility firms worldwide.

Deployment

This innovative technology can be deployed as a traditional on-premises or virtual software installation, or as a cloud-based solution.
Observed Effect on Traditional Format Production

Recent cloud deployments of this new technology are having an unexpected, positive effect on the production of traditional alternate formats such as Braille, large print, and audio. These traditional formats, like manual PDF remediation, can be time- and cost-intensive to produce, and can delay the delivery of customer communications. The rich, structured output from this automated technology has allowed production of traditional formats, including Braille, large print, and audio, to be automated, lessening the labor resources needed for manual processing. Each personalized communication, statement, or notice can be produced more efficiently, reducing cost and delivery time for alternate hardcopy formats.

ePresentment Options

With the advent of affordable, scalable technology for automatically producing high-volume Accessible PDF documents, private organizations and government agencies suddenly have a number of new options for presenting online customer communications. For example, organizations can provide Accessible PDF communications by default, creating an inclusionary environment while meeting legislative mandates. This delivery scenario enables blind and visually impaired people to access their personal information at the same time as other customers, without the delays associated with requesting alternate hardcopy formats. Or, instead of producing Accessible PDFs by default, organizations with existing systems for online presentment of PDF documents may choose to provide customers with on-demand conversion to Accessible PDF format, both for current PDFs and archived documents. For online users, this would mean replacing an inconvenient exception process with merely a few button clicks.

Implications for Organizations and Individuals

This innovative technology solution for enterprise-level PDF accessibility offers blind and visually impaired people equal (and instant) access to electronic documents, empowering them to be more independent, make more timely financial and other critical decisions, and participate more fully in the 24/7 digital world. Imagine: individuals who prefer digital technology can finally say “No, thanks” to exception processes, accommodations, late-arriving hardcopies in alternate formats, and other such hassles. For blind and visually impaired people who do not currently request documents in alternate formats, and instead rely on family, friends, and support workers to help them manage their affairs, this technology offers another avenue for becoming more independent. With the arrival of this groundbreaking automated technology and its ready availability to organizations producing customer communication PDFs, 20+ million blind and visually impaired Americans have an opportunity to use their buying power to effect meaningful change. They can achieve this by patronizing service providers that offer instant online access to Accessible PDF versions of bank and credit card statements, phone bills, insurance documents, and other routine (yet vital) communications. Likewise, they can demand a comparable level of service from government agencies, non-profits, and institutions of higher education. This type of technology is a game changer for industry, for government, and most importantly for blind and visually impaired people, who deserve equal access to the entire internet, including websites, web content, and web-delivered documents. For more information on this technology, visit ccm.actuate.com.
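A side note on the quality-control idea above: two of the structural prerequisites for a tagged, accessible PDF (a structure tree and the /Marked flag, both defined in the PDF specification) can be spot-checked programmatically. The sketch below is our own illustration using the open-source pdf-lib JavaScript library; it is not the vendor technology described in this post, and it is nowhere near a full PDF/UA or Matterhorn Protocol conformance test.

```javascript
// A rough spot-check: does a PDF carry the basic structural prerequisites
// for accessibility? Uses the open-source pdf-lib library; verify these
// API names against your pdf-lib version.
import { readFileSync } from 'fs';
import { PDFDocument, PDFName, PDFDict, PDFBool } from 'pdf-lib';

async function hasTaggingPrerequisites(path) {
  const doc = await PDFDocument.load(readFileSync(path));
  const catalog = doc.catalog;

  // A tagged PDF must have a structure tree (/StructTreeRoot in the catalog).
  const hasStructTree = catalog.has(PDFName.of('StructTreeRoot'));

  // /MarkInfo << /Marked true >> declares that the document is tagged.
  const markInfo = catalog.lookupMaybe(PDFName.of('MarkInfo'), PDFDict);
  const marked = markInfo
    ? markInfo.lookupMaybe(PDFName.of('Marked'), PDFBool)
    : undefined;

  return hasStructTree && marked === PDFBool.True;
}

// Example: flag untagged documents in a batch before deeper testing.
hasTaggingPrerequisites('statement.pdf').then((ok) =>
  console.log(ok ? 'tagging prerequisites present' : 'untagged or suspect'));
```

Passing such a check says nothing about whether the tags are correct or usable; that still requires the rule-based generation and sampled usability testing the post describes.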

Read More

Good Cloud. Bad Cloud. Why Cloud?

Confused about the cloud? You’re not alone. Adoption is projected to grow at double digits despite plentiful guidance on why we should fear the cloud. Pundits tell us, “If your organization is not implementing the cloud, you’re already behind.” Yet it is easy to feel the cloud is just beyond our grasp. So let’s take a look at some real-life use cases from sectors that are leading the way in enterprise adoption of the cloud.

Cloud Illusions

Ask a few CIOs about the cloud and you are likely to hear a wide range of responses, from concern that the cloud endangers security and privacy to elation that the cloud can be the ultimate platform for change. While much of this reflects well-reasoned advice and counsel, some is pure hype. When even The Onion takes on “that cloud thing that everyone is talking about,” we should realize that we are at hype and jargon saturation. With all the noise around cloud computing, cloud storage and cloud apps, and debate about the pros and cons of public, private and hybrid clouds, we need to consider what is real and what is merely illusion, and moreover why we should ultimately care. These beautiful lyrics from the 60s seem to foretell our current state of confusion over the cloud:

“I’ve looked at clouds from both sides now, from up and down, and still somehow it’s cloud illusions I recall. I really don’t know clouds at all.” — Joni Mitchell, Both Sides Now, from the album Clouds

The cloud is a growing reality. CIOs and IT teams need to clearly understand how it can best be applied to advance their strategic interests. IDC research forecasts that public cloud will grow at double digits and spending on private cloud will top $24 billion by 2016. CompTIA predicts that the next decade will see cloud computing become even more accepted as a foundational building block. We are seeing the cloud go mainstream in the public sector, and Gartner predicts the cloud is moving into digital business, advising CIOs and other IT leaders to continually adapt to leverage increasing cloud capabilities. The Open Group Cloud project analyzed 24 business use cases driving adoption. In general, the rationale can be classified in five areas: agility, productivity, QoS, cost and the ability to take advantage of new business opportunities — all of which have been guiding principles for applying technology in the past. So how well are our past years of enterprise hardware and software know-how translating to the cloud for large-scale applications? Here are three sectors that are forging the way with successful cloud implementations in order to drive efficiency, improve time to market, and effect business transformation.

The Cloud Drives Cost Efficiency

World Economic Forum research reveals that governments are adopting cloud services at higher than expected rates. The growing adoption of cloud technology is happening at all levels of government around the globe. We are already seeing cloud play a role in changing how government agencies fundamentally spend money and allocate their IT resources.

“We came out with a cloud-first policy because… it offers a faster time to market, a reduction to risk, and hopefully a reduction in cost.” — CIO Carlos Ramos, California

While adoption is being driven in part by cloud-first mandates, the cloud is clearly aligned with government mission objectives. The public sector has embraced a data-driven approach — including open data and big data initiatives — to be responsive to citizens.
Cloud implementations are seen as a means of moving beyond data transparency to achieve a cost-effective state of operational excellence. Four Trends to Watch in 2015 highlights the cloud as a means to be responsive to citizens’ wants, needs and ideas. For municipalities, the cloud provides equal, on-demand, cost-effective access to a shared pool of computing resources. The City of Barcelona hosts 1.5 million guests for the La Merce festival, using the cloud to help manage the surging foot, bike, auto and public transportation traffic. The state of Delaware implemented a cloud-based CRM application for constituent tracking in two months, adopting a cloud-first policy that piggybacks on federal policy. The state set up a private cloud and virtualized 85 percent of the state’s physical servers, saving $4 million per year. Delaware now has 70 applications in the cloud — from event notification to cybersecurity training. For central government organizations, including the US Department of the Interior, shared services are eclipsing “cloud-first” mandates as the driver behind cloud adoption. DOI’s groundbreaking cloud initiative consolidates all the records information programs under one IT governance system, and this shared service is expected to save an estimated $59 million in taxpayer dollars by 2020.

The Cloud Supports Business Transformation

Gartner Research identified the financial services banking and insurance segments as two of the top cloud adopters. These segments are driven by the need for more innovation and the value they get from that innovation. Financial services firms are rewarded for systems that can process transactions faster and more securely, and are providing new services, such as mobile banking and claims, that are ready-built for cloud-based systems. There is also growing competition with startups that are shifting the playing field. Way back in 2013 (a decade in cloud years), my article The Art of Banking: How Financial Services Approach Great Customer Experiences talked about how bankers would increasingly take innovation cues from consumer tech and smart retailers as they practice the art of banking. Over the past year, the cloud has proven to be both a major disrupter and an enabler of innovation. Like the other big research firms, IDC sees digital transformation as key for businesses and a bridge that CIOs must learn to cross, and that bridge includes the disruptive influence of cloud computing. A recent article from Banking Technology, “Why I’m backing the banks,” declares that traditional banks are now in a race to remain relevant as they face a slew of non-bank competitors whose offer models consumers increasingly value. Accenture found that one in five consumers would be happy to bank with PayPal — a cloud firm born in Silicon Valley. Though the cloud is often adopted as a cost-saving measure, CIOs are seeing its potential to create a flexible platform for future innovation. A poll of financial services sector decision makers revealed the top two benefits of adopting cloud platforms as cost savings (voiced by 62 percent of respondents) and a simplified IT environment (52 percent). It is this simplification of the IT environment that will enable banks to level the playing field with the upstarts: “The newer entrants owe much of their success to their extreme agility with ICT: they have got where they are because they use technology better than anyone else. Yet, it would be premature to lament the passing of banks as we know them.
They are increasingly taking the tech start-ups’ own medicine… [and the] search for innovation is rapidly pushing the cloud up banks’ technology agendas.” While banking has definitely upped its cloud game in the last few years, insurance is perhaps the granddad of cloud adoption. In How Cloud Computing will Transform Insurance, Accenture highlighted insurance as being at the forefront of cloud growth and predicted that the cloud would transform the industry. On its list of reasons to adopt the cloud is the “ability to respond to market change and reshape operating model[s] to address new and emerging opportunities and challenges.” An SMA study of cloud adoption trends in insurance found that 35 percent of participants said the cloud “provides companies with the flexibility needed to respond quickly to changing needs.” While cost savings has been a driver for insurers to adopt the cloud, there are already a number of insurance cloud success stories that illustrate the cloud’s real potential as a means of innovation and competitive advantage in a changing market with a changing customer demographic. Andre Nieuwendam, director of IT for United Property & Casualty, describes their cloud success in customer-centric terms: “From an insured perspective, there are many initiatives on the table that we want to be able to provide them: file a claim electronically, check billing, and interact with customer service people in a real-time environment. Being in the cloud has enabled us to meet all of these objectives in a very, very short period of time.”

The Cloud Enables Speed to Market

In a recent Forbes article, “Cloud Is the Foundation for Digital Transformation,” Ray Wang (@rwang0) highlights cloud as the single most disruptive of all the new technologies: “Cloud not only provides a source of unlimited and dynamic capacity, but also helps users consume innovation faster.” The idea of leveraging the cloud as a platform for speed in a changing market is appealing, and it especially resonates in the communications, media and entertainment sector, one that Gartner has identified as second only to banking in cloud adoption. In Breaking Bad: How Technology is Changing Media & Entertainment, I wrote about the digital media supply chain and how entertainment and broadcast companies are experiencing no less than an industry revolution: “Motion pictures used to be cut, approved, and canned for distribution and released in a series of ‘windows’ for consumption. With digital distribution this model stops working — all the traditional ‘windows’ of distribution are collapsing. This has a ripple effect all the way down the chain of production and accounting and requires new IT systems and applications to address the new paradigm.” According to Accenture’s Content in the Cloud in the Broadcast and Entertainment Industry, the cloud can be the platform on which the digital media supply chain operates to better serve changing markets and consumption models: “Cloud technology is poised to make an impact by supporting the next round of breakthroughs…from proliferating devices that demand a more flexible business model to new levels of IT capacity requirements that dictate highly scalable IT solutions to competitive pressures for speed and innovation that call for better workflow, business analytics, and customer insight.” How Cloud Computing Will Save Hollywood tells the story of how Lionsgate is using the cloud to run their studio and compete with the “big guys” in the industry.
The cloud has been helping them deal with their dispersed global environments during film production: media complexity, an unprecedented influx of massive amounts of data, and unique data and workflow requirements.

Cloud Resolutions

Perhaps the cloud is not so mysterious after all. In a Gathering Clouds interview, David Linthicum (@DavidLinthicum) shared his perspective that businesses that adopt cloud gain a strategic advantage: “… the companies who [adopt cloud] can turn on a dime…. These companies will be able to leverage their information in much more innovative ways.” As industries increasingly digitize, the cloud is proving to be a useful partner to the CIO in an increasingly digital-first world. It is not surprising that KPMG’s recent survey, Elevating Business in the Cloud, found the top uses for cloud are to drive cost efficiencies and enact large-scale change, including enabling a flexible and mobile workforce, improving alignment and interaction with customers, suppliers and business partners, and better leveraging data to make insightful business decisions. The key to success, as with any bright shiny new technology, is to apply the cloud to achieve critical business and mission objectives. As Jim Buczkowski of Ford Motor says, “The cloud is about delivering services, features, and information…to make the driving experience a better one.” So here’s to accomplishing great things with the cloud! Just keep these tips from KPMG in mind as you resolve to make your cloud initiative a success: make cloud transformation a continuous process; drive cloud transformation from the top; focus on strong leadership and engagement; avoid silos; and measure success. Plus one bonus tip from me: avoid the trap of “cloud for cloud’s sake,” lest we discover the biggest truth in Joni Mitchell’s lyric is “So many things I would have done but clouds got in my way.” A version of this article first appeared in CMSWire.

Read More

OpenText and SAP Run Together for Exceptional Customer Impact

As we gear up for another year at SAPPHIRE, I’d like to reflect on the strong relationship that OpenText and SAP have shared for decades and look ahead to an exciting future together. For more than 20 years, we have worked together to empower the enterprise to manage its unstructured and structured information for business success. Our combined solutions make information more discoverable, manageable, secure, and valuable. Connecting SAP business suites with OpenText information suites delivers a powerful platform for innovation and opportunity. Together, we have:

Transformed processing operations at Bumblebee Foods from being 100 percent reliant on paper to being 100 percent digital, with automated processes reducing costs by over 50 percent and significantly increasing efficiency.

Positioned Alagasco for future growth through increased sustainability and performance. Centralized information has helped break down organizational silos, speed up sales processes, and maintain business continuity.

Created a culture of innovation at Distell by empowering employees to share best practices and collaborate. As well as increasing productivity, the organization has managed its intellectual capital more effectively to enhance and protect its brand.

As the world around us shifts to digital, the combined value that we deliver as partners grows exponentially. In celebration of this valued relationship, OpenText has been awarded the SAP Pinnacle Award for seven years in a row. Today, I’m pleased to announce that we have just received the 2015 SAP Pinnacle Award for “Solution Extension Partner of the Year,” making OpenText a recipient for the eighth consecutive year. This category honors partners who co-innovate with SAP to deliver exceptional customer impact. OpenText was selected for this year’s award based on our innovative approach that enriches and extends the capabilities and scope of SAP products and applications. OpenText was formally presented with the 2015 SAP Pinnacle Award at the SAP Global Partner Summit last evening, in conjunction with SAPPHIRE® NOW, SAP’s international customer conference in Orlando, Florida. We’re on hand at this event to showcase the latest advancements in joint OpenText and SAP releases. Look for us at booth #130 at the conference, where we’ll be demonstrating the power and flexibility of products like SAP Document Presentment, SAP Invoice Management, and Tempo Box Premium. We continue to build out the OpenText and SAP ecosystem. Our strategic solutions now support a broad range of SAP offerings — from the HANA database and analytics to Simple Finance and the HANA Enterprise Cloud. Recent releases include HANA integrations for SAP Document Presentment by OpenText and SAP Invoice Management by OpenText — both designed to deliver deeper insight and content value, enhancing an organization’s process efficiency and its ability to make more strategic decisions. These extensions are available in the cloud, on premises, or as a hybrid solution. At Enterprise World 2014, our annual user conference, we introduced the OpenText Business Center for SAP Solutions, a platform for automating mission-critical business processes across the SAP business suite. We have now announced the general availability of this product. Using the OpenText Business Center for SAP, joint customers will be able to digitize entire processes in SAP — from capture to creation — without requiring complex configuration or programming resources.
In the Digital-First World, all of an organization’s information and processes will be digital. This release is part of our commitment to simplify, transform, and accelerate business for the digital enterprise — enabling it to drive efficiency through digitization. In addition to expanding our support for SAP processes, we will also be introducing Tempo Box Value Edition and Tempo Box Premium. These are secure solutions for sharing and synchronizing both personal and SAP enterprise content across different platforms and devices. Both deliver tight integration with SAP Extended ECM, giving users greater freedom to share and work with business content across any device while still maintaining information governance and control. Tempo Box Value Edition and Tempo Box Premium enhance the SAP ecosystem by securely extending content tied to SAP business processes beyond the firewall to non-SAP users, including unlimited external users such as customers, suppliers, and partners across the business network. The ability to manage unstructured information in the enterprise plays a pivotal role in digital transformation — and it is a key capability that the OpenText and SAP ecosystem delivers. Our partnership continues to drive product breakthroughs that produce impactful and tangible results for our customers. Together, we are laying the foundation for a Digital-First World for over 4,500 customers and 50+ million active users — across two decades of innovation and into the future. Read the press release. Visit our website.

Read More

What the Department of Homeland Security Knows About Data

What does the Department of Homeland Security know that you don’t know? OK, that’s a trick question. The answer (in this case) is this: It knows how to get from data to information. On April 22, OpenText Analytics and Reporting hosted a webinar featuring Chris Chilbert, Chief Enterprise Architect at the Department of Homeland Security. With hundreds in attendance, Chilbert made a powerful case that data is worthless unless you can turn it into relevant information – and that information can then become knowledge, wisdom and ultimately action. There’s a human element in gaining information from data, Chilbert explained. People need to understand the organization they work in and the processes they use; those pieces help us put data in context and make better decisions. If you missed Chilbert’s presentation, please check out this free replay. After Chilbert’s presentation, I talked for a few minutes about how OpenText Analytics and Reporting products support the Four Pillars of Business Analytics: data, people, processes, and technology. (Read more about the Four Pillars in this blog post and a free ebook.) Many webinar attendees asked follow-up questions – more than we had time to answer during the hour – so I’ve answered some of them here.

Q: Explain how OpenText Analytics secures data from unauthorized access.

A: OpenText Actuate Information Hub (iHub), our data visualization platform, provides multiple layers of security. These include authentication integration with Active Directory, single sign-on solutions, and two-factor authentication. iHub administrators can also control what a user can access by securing data at the row or column level, and by controlling page-level security in reports. To read more about the security features in iHub, check out the white paper, The Ten Layers of Security in iHub.

Q: Can you give an example of social data and explain how iHub accesses it?

A: Twitter and Facebook are the most common social data sources today. OpenText Analytics has several social data connectors for iHub in our developer center, including the Facebook ODA Driver, BIRT Twitter Gadget, and Twitter JSON Search ODA. For other unstructured data – which social data is, in essence – we provide APIs that you can use to connect to and query any data source. You may want to take a look at these two blog posts from Kris Clark: Creating a Custom ODA and Use JSON as a Scripted Data Set. (A short sketch of the scripted data set approach appears at the end of this post.)

Q: What mobile devices does iHub support?

A: We support all mobile devices by providing APIs that allow you to integrate content from iHub into your mobile application. This way you get to select which mobile devices you want to support. For ideas and inspiration, take a look at two example applications we have created using iHub and its APIs: Aviatio, a mobile web application (GitHub link for Aviatio), and Gazetteer, an iOS hybrid application (GitHub link for Gazetteer).

Q: How does iHub work in a multitenant architecture model in a cloud environment?

A: Multitenant support is built into iHub. With multitenant support, each project instance within a cluster isolates several characteristics (including security, user and role management, and scheduling) to allow them to be managed independently, even as the instances all share cluster resources. This allows a single iHub installation to support multiple applications and projects with a variety of characteristics and requirements.
Developers use multitenant capabilities to build software-as-a-service (SaaS) solutions in the cloud. The benefits include lower cost, because hardware and software are used more efficiently; faster time to market, because adding a new project requires just a few administrative commands; and improved security, because each application instance has its own security processes. Technical benefits of using multitenancy with iHub include reduced deployment burden, eased administrative load, simplified system impact testing, and flexible backup and recovery.

Q: Does OpenText Analytics have an electronic scorecard to allow input of information from the bottom up, as well as from the top down?

A: Yes, with OpenText Analytics users can input information at any level that may have a bearing on a specific key performance indicator (KPI). The flexibility of our scorecard function accommodates any performance framework, including Balanced Scorecard, Malcolm Baldrige, Six Sigma and custom frameworks, and scales to meet the needs of large initiatives. The Briefing Book function of iHub scorecards allows users to create and deploy customized performance views. Briefing Book measures can be selected manually or filtered based on criteria such as performance, criticality, location or ownership. Links to relevant standard and custom reports, maps, external documents and websites can be added. Briefing Books can be defined as private or shared, and include advanced security features to ensure that users only have access to the information they are entitled to see.

If you require more clarification on any of these answers, please leave a note in the comments. And be sure to check out the replay of my webinar with Chris Chilbert of the Department of Homeland Security.
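For readers curious about the “Use JSON as a Scripted Data Set” approach mentioned in the social data answer above, here is a rough sketch of the two event scripts a BIRT scripted data set uses. The feed URL, JSON shape, and column names are hypothetical placeholders; see the Kris Clark posts for the canonical walk-through.

```javascript
// "open" event of a BIRT scripted data set (Rhino JavaScript, which can
// call Java classes directly). Fetch and parse the JSON feed once per run.
var url = new java.net.URL("https://example.com/api/search.json");
var reader = new java.io.BufferedReader(
    new java.io.InputStreamReader(url.openConnection().getInputStream(), "UTF-8"));
var text = "", line;
while ((line = reader.readLine()) != null) text += line;
reader.close();
// Older Rhino engines may lack JSON.parse; eval() was the classic fallback.
rows = JSON.parse(text).results;
currentRow = 0;
```

The matching fetch script then walks the parsed array, handing BIRT one row per call:

```javascript
// "fetch" event: return true while there are rows left, false when done.
if (currentRow >= rows.length) return false;     // no more rows
row["author"]  = rows[currentRow].author;        // map JSON fields to the
row["message"] = rows[currentRow].text;          // columns defined in the design
currentRow++;
return true;
```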

Read More

3 Questions: John Johnson of Dell Services Discusses Analytics and Reporting for IT Services

Dell Services, the global system integrator and IT outsourcing arm of Dell, provides support, application, cloud, consulting, and many other mission-critical IT services to hundreds of organizations worldwide across many sectors. The company collects and manages massive amounts of data concerning customer infrastructures, from simple, high-frequency metrics (such as CPU, memory, and disk utilization) to helpdesk tickets and service requests, including hardware and software asset information. Using this data to understand and respond to customer needs before they become a problem falls to John M. Johnson, Manager, Capacity Management at Dell Services. Johnson recently spoke with OpenText about the type of data Dell Services collects, the evolving ways his customers consume that data, and how he uses this data to plan for the future.

OpenText: You have a 12-terabyte data warehouse of performance metrics on your customers’ systems and applications. Tell us about that data and how you use it.

Johnson: Our infrastructure reporting data warehouse has been around for seven-plus years. It collects aggregated information about more than a hundred customers, which is just a segment of our base. Originally we started the data warehouse to meet legal retention requirements, and it evolved to become the repository for ticketing data, service request data, and SLA performance data. Now it’s an open warehouse where we continually add information related to our services delivery. It’s fantastic data, and a fantastic amount of data, but we lacked two things: an automated way to present it, and a consistent process behind its presentation. My twenty capacity planners were spending too much of their valuable time churning out Excel reports to present the data to our clients, and far too little time understanding the data. A little less than two years ago we started using open source BIRT for report automation, to eliminate manual errors and consistency issues, and to remove the “personal analysis methods” that each engineer was introducing to the process. The next maturing of the process was to leverage iHub to further automate report generation, delivery and presentation.

OpenText: Some of your customers and users get dynamic dashboards, while others get static reports. How do you decide who gets what?

Johnson: That’s an easy answer: It begins with contract requirements. Those expectations are drawn out and agreed upon by legal counsel on both sides. Once those fundamental requirements are met, the question of “Who gets what?” is very simply based on how they need and want the data. I have three customer bases: my services customers, my delivery teams, and peer technical teams who have reporting requirements. And everybody wants a different mix of data. DBAs want to see what’s going on with their infrastructure – their top databases, hardware configurations, software versions and patch levels, cluster performance, and replication stats. Other teams, such as service delivery managers and the account teams, want to see the picture more on a financial level. They need answers to standard questions like, “What has the customer purchased, and is that service meeting the customer’s expectations?” In some cases we handle customer applications in addition to their infrastructure.
In those cases, the customer needs reports on uptime, availability, performance, user-response time, outstanding trouble tickets, number of users on each system, and various other application metrics married with the infrastructure data. Those are all static reports we typically deliver on a monthly schedule, but we’re looking to make that particular reporting a daily process with iHub Dashboards. Dashboards will serve three major groups:

1. Application owners, who will see what’s going on with their infrastructure and applications in real time

2. Our service managers, who coordinate the daily delivery of our services around the world

3. Senior leaders at the director, VP and CxO levels

That last group has much less interest in a single trouble ticket or server’s performance, but they do care about service levels and want to know how the infrastructure looks on a daily basis. I think the executive-level dashboards will be big consumers of data in the future, so we’re evolving and maturing our offering from a technical level – where we have traditionally been engaged – to the business level. Because that’s where people buy.

OpenText: That is an ambitious plan to extend your reporting platform. How do you prioritize your projects, and what advice would you give to peers with similar plans?

Johnson: There’s one overall strategy I try to employ with all my applications: apply modern, agile software development methodologies to them. You have to stay up to date on software patches and capabilities. You have to keep your platform relevant. We keep updates coming rapidly enough that our customers don’t have to create workarounds or manual processes. Fortunately, iHub works well with how we manage upgrades. We manage reports as a unit of work inside of iHub, so I don’t have to make monolithic changes. When I’m prioritizing projects, I first ask, “Who is my most willing customer?” The customer who’s going to work with you provides the quickest path to success, and that success is the foundation upon which you build. Second is to expect to get your hands dirty and do a lot of the lifting. Most customers are always going to have trouble verbalizing what they need and how they want data to look. So you have to just get that first visualization done and ask, “Does this data, presented this way, answer your needs?” Don’t be afraid of responses such as, “That is not what I wanted at all. I told you I wanted a report” – that’s one of the most frustrating things about the job. You have to accept that you are a statistical artist, and visual presentation is something you own; embrace and drive it. Fortunately, the ease of developing and managing data with iHub means we can respond to these inputs rapidly.

Read More

Sports on the Web: Newer, Better, Global

Are you a sports fan? Are you one of the lucky ones who enjoy sports carried on your local station at convenient times, or are you like the growing number of fans whose sports are carried more often in other countries, on their time zones, and not broadcast on local television stations? Modern sports viewing is now often enabled by strong web experiences, and a growing number of fans are able to enjoy their favourite sports on the web, at times of their convenience, regardless of where in the world the sport is played live. Many sports franchises are now taking advantage of the streaming vs. viewing methods fans are adopting, and there are some great sites powered by Web Experience Management software such as OpenText’s Customer Experience Suite. Cloud-enabled apps provide viewing and stats on all types of devices, and allow viewers to enjoy the sport and the commentary that goes with it – from both the official commentators and the other viewers. As reported in a recent press release, UK-based Aberdeen Football Club (if you are from North America, think soccer) recently remodeled their site with OpenText’s Experience Suite to include real-time stats, commentary, a Twitter feed, pre- and post-game analysis, and real-time photos. The omni-channel experience is critical, as 58% of their fans enjoy the site on mobile devices, often as they are watching the game live in the stadium. Check out www.afc.co.uk to see the latest. This time of year sees some of my favourite sports back live and online. While the sites stay up all year sharing info, they come alive when the teams are back and playing again. It is rugby season again, and the 6 Nations site www.rbs6nations.com once again brought the tournament and all the news to the locals and those of us in other parts of the world. The site is great with game info and pictures, and my favourite feature is the running list of clips that summarize some of the great moments. Even if you don’t watch the full games, you can get a pretty good idea of the play, the emotion and certainly the outcomes from this site. And of course, my favourite Australian Rules Football (AFL) season has just started, so I will be spending increasing time on their site www.afl.com.au checking out the predictions, results and pictures, and watching highlights or full games at times that are convenient wherever in the world I am. Watching with two screens is a bonus, so I have the player info and stats handy while streaming the games from a second device. In the immediacy era we live in, sports on the internet can now be consumed at our leisure and our convenience. Thanks to strong web experience sites we now have a PVR-like option for watching the games, and the apps provide extras like real-time stats and commentary. There is no substitute for the excitement of live sport, but when you can’t be there in person, web experiences are now a great alternative.

Read More

OpenText Featured on Bloomberg Business

For this year’s global Innovation Tour, we’ve taken our Digital-First World message on the road. The tour is already underway, and members of our executive leadership team have presented in major cities in Asia Pacific, including Mumbai, Tokyo, Sydney, and Singapore. As one of many highlights of the tour, OpenText CMO Adam Howatson was featured on Bloomberg Asia’s Brandstanding in Singapore. The interview covers a number of topics, ranging from the value of information in creating new services and opening up new revenue opportunities, to why the cloud is important for brands and businesses, and how OpenText is successfully rewriting the rules of business for successful digital transformation. According to Howatson, “Being able to connect the way [the business is] represented on social platforms and the way that brands are represented and shared on the Internet and through our connected society, through to back office operations, to manufacturing and internal business processes… It truly is the organizations who are able to integrate that experience and that flow of information who will outperform their competitors in every industry.” Bloomberg Business delivers business and markets news, data, analysis, and video nationwide, featuring stories from Businessweek and Bloomberg News. As a global business network, Bloomberg has over 22 million visitors to its web video assets. Watch the video. Learn more about the Digital-First World by downloading the book: Digital: Disrupt or Die.

Read More

3 Questions: Adam Dennison Discusses Analytics for CIOs

Adam Dennison, senior vice president and publisher of CIO and IDG Enterprise’s events, boasts a 15-year career in technology publishing. CIO Perspectives, the company’s regional event series, brings CIOs and senior IT executives together with top technology vendors for a day of thought-provoking, action-oriented discussions. “At a high level, we cover three main topics: strategy, innovation and leadership,” Dennison explains. “More specifically, emerging technologies, digital transformation, and security are three major initiatives for CIOs today.” We asked Dennison (@adamidg) how CIOs can leverage analytics in their work.

OpenText: You’re moderating a panel at CIO Perspectives called “Straight Talk about SMAC” in which you’ll share key stats on trends and investments in social, mobile, analytics and cloud. Can you give us a preview of a stat you find compelling or timely?

Dennison: Our research and discussions with CIOs show that currently 32 percent of enterprise IT investment is slated for edge technologies (like SMAC) vs. core technologies (like legacy systems and infrastructure). However, this will shift to 45 percent within the next one to three years. Additionally, 54 percent of enterprise CIOs plan to increase their spend with “newer vendors” within the next 12 months.

OpenText: Late last year you published CIOs Must Market IT’s Value, arguing that internal marketing of IT can help businesspeople understand the value of IT departments. Meanwhile, marketing efforts are increasingly driven by data and metrics. How do the two relate?

Dennison: My column on CIOs marketing IT to the business was more broad-based than just data and analytics. However, if asked to focus on the data aspect of it, I would say showing the business exactly what they are paying for is step one in the process. Explain to the business units through data, facts and transparency that they are getting true value from enterprise IT vs. a “do it yourself” solution.

OpenText: A lot of industry research – and anecdotal evidence, including a recent CIO Quick Takes – shows that CIOs are eager to use analytics to improve business insights. What drives CIOs’ push for analytics?

Dennison: What I read from the CIO Quick Takes was a laser focus on customers. Our CIO research shows us that 41 percent of CIOs’ current spend is on external customer experience, relationship and interaction. The only way to get closer to your customers and better serve them is to gather data on them and use that to your advantage. Our 2015 State of the CIO research also shows us that data/analytics is the number one tech priority for the next 12 months that will drive investment. Mobile and cloud came in a close second and third respectively, but it’s all about data today.

Adam Dennison photo courtesy of IDG Enterprise. Used with permission.

Read More

6 F-Type Examples in DevShare You Can Use Now

With the release of OpenText Information Hub, Free Edition (formerly BIRT iHub F-Type), we added corresponding content to the DevShare portion of the Developer Center. One section of that content, F-Type Examples, houses sample apps and working examples developed specifically for iHub, Free Edition. When it comes to working with a new product, or using aspects of a product I know well but have not used before, I find examples valuable for getting started. Whether I need ideas for what is possible or confirmation that I am going down the right path, I find examples particularly useful. Currently, six different examples have been uploaded to the DevShare to demonstrate various ways that iHub, Free Edition can help you. These examples range from simple, well-designed reports to a full sample application. In this post, I will step you through these six examples and detail their various aspects and features.

The Six Examples

Call Center Example Dashboard – With the large number of call centers around the world and companies increasingly focusing on customer experience, business leaders demand metrics about call center performance, and expect those metrics on quick-reading, real-time dashboards. This example demonstrates how iHub, Free Edition can do exactly that by creating a dashboard to quickly and effectively analyze the metrics of a call center. It utilizes a well-designed data object that uses CSV files as the data source, includes multiple data sets with properly configured column properties (format, header, etc.), and uses a well-designed data model with proper categories and hierarchies. Additionally, the use of multiple data sets within the data object is part of what makes this a well-designed data model: when used against a relational database, the multiple data sets allow for optimal query trimming within the datadesign file, and the generated data object will have better compression. The Dashboard itself has two tabs: Call Analysis displays an interactive dashboard with drill-downs and selectors, and Calls By State displays a United States map; you click a state on the map to launch a sub-report for that state.

InfographiX Examples – This example can create three different infographics. These samples can inspire you with ways to display your own data in highly visual formats. The three included samples are: Classic Models, which uses the Classic Models sample database that comes with iHub, Free Edition and displays information about the customers in the database by geography, purchasing habits and more in graphic form; Storage Statistics, which uses static data and displays statistics for data in a cloud-based storage system – for example, data formats (audio, video, photos, etc.), traffic by data type, and downloads vs. uploads; and Tornado, which uses a data object for its data source and displays statistics about tornadoes in the United States, such as the number of tornadoes per month, relative numbers of tornadoes by strength (on the Enhanced Fujita, or EF, scale) and fatalities by state.

Chicago Crimes Example Dashboard – As a developer and a user, I am always interested in seeing location information integrated into applications. I find it particularly useful when a map is used to visualize location information, as it allows you to quickly analyze important geographic information about a data set. This web application demonstrates how we can interact with location information.
It uses the Custom Visualizations feature in iHub 3.1 to display a Google Map of the Windy City, and adds custom icons and marker clustering to show crime data. It also provides a data selector gadget so users can choose which types of crimes to display. This example, which can be deployed to iHub as an application, shows you how to display a Dashboard when opening a BIRT application.

Simple and Well Designed Reports – This example includes two well-designed reports that are simple and elegant. By “well designed,” we mean that elements of the report have been configured to use libraries for styles, which unifies the whole report and makes its appearance consistent, and that the reports conform to industry standards for layout and design. If you are new to report development, these are good examples to use as a starting point for developing quality reports with a uniform look and feel. If you are more experienced, they are still worth going over, because you may find a feature you have not used before, and they will reinforce good practices on the features you already know. These reports demonstrate how to use a report library, and they highlight several features – such as alternate rows, hyperlinks, highlighting, aggregations and formatting – that help you display complex data more efficiently and effectively. The reports also show how to use themes to standardize the look and feel of charts and tables, and demonstrate parameters with drop-down lists that use both static and scripted default values.

BIRT with the Power of jQuery – This includes two examples that use jQuery to add functionality to BIRT reports. Expand and Collapse uses jQuery to automatically expand and collapse sections in a report. A plus (+) or minus (-) is added next to the report elements that, when clicked, will expand or collapse various sections of the report. For example, say you have organized your customers based on region and country. With this added interactivity, you can start with a compact table and then expand the table into the desired country and region in real time. This enables you to limit the displayed results without the need for filtering or re-rendering the report. Highlight on Mouse Hover uses jQuery to highlight rows and columns when the user’s mouse hovers over them, which helps users navigate through very wide or very tall tables. The jQuery used to achieve this behavior updates the CSS properties of the various elements, which gives you flexibility when modifying the styling of the report elements. The effects in this example are primarily achieved through modifying the background color, so any color you specify through standard CSS can be used for the highlight. (A short sketch of this technique appears at the end of this post.)

City Taxi Sample App – This example, as the name implies, is a sample application for an imaginary urban transportation company. It demonstrates how BIRT can power compelling embedded analytic applications. Features demonstrated by this application include: information graphics for displaying data in visually appealing formats; columnar reports with filtering capabilities; interactive reports that end users can modify from their browsers; geospatial visualization built into a report; and dashboards designed for Big Data analysis. Additionally, this application demonstrates how BIRT can seamlessly add analytic capabilities to existing web applications.
All content is presented as part of the HTML web pages, providing a consistent overall user experience for data analysis. The app also demonstrates how to use the JavaScript API to embed content with minimal coding (see the second sketch at the end of this post). As you can see, this new DevShare section provides a unified area where iHub, Free Edition examples can be shared and distributed. The examples already available range from simple, elegant reports for people who are just getting started, to full sample applications for developers who are ready to integrate BIRT into their existing applications. Thank you for reading this post on the F-Type Examples DevShare. We encourage you to download and work with the examples, then tell us what you like and what more you want to see.
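Two of the techniques described above lend themselves to short sketches. First, the Highlight on Mouse Hover idea: plain jQuery rewriting CSS background colors as the mouse moves over a rendered report table. The table selector and color below are hypothetical placeholders, not the actual DevShare code.

```javascript
// Sketch of the "Highlight on Mouse Hover" technique: jQuery updating CSS
// as the mouse moves over a rendered report table. The .report-table
// selector and color are placeholders.
$(function () {
  $('.report-table td').hover(
    function () {                       // mouse enters a cell
      var $cell = $(this);
      var col = $cell.index();          // column position within the row
      $cell.closest('tr').css('background-color', '#fff3c4');   // row
      $cell.closest('table').find('tr').each(function () {      // column
        $(this).children('td').eq(col).css('background-color', '#fff3c4');
      });
    },
    function () {                       // mouse leaves: clear all highlights
      $(this).closest('table').find('tr, td').css('background-color', '');
    }
  );
});
```

Second, the JavaScript API embedding mentioned for the City Taxi app. The sketch below follows the general connect-then-view pattern from the iHub JSAPI documentation; the host URL, credentials, container id, and report path are placeholders, and the exact call signatures should be verified against your iHub release.

```javascript
// Sketch of embedding iHub content via the JSAPI: connect to the server,
// then point a viewer at a report design. All URLs, credentials, and
// paths are placeholders; verify call signatures for your iHub release.
actuate.load('viewer');                        // request the viewer component
actuate.initialize(
  'http://ihub.example.com:8700/iportal',      // iHub web service endpoint
  null,                                        // request options
  'demo_user', 'demo_password',                // credentials
  function () {                                // runs once connected
    var viewer = new actuate.Viewer('reportContainer');   // target <div> id
    viewer.setReportName('/Applications/CityTaxi/dashboard.rptdesign');
    viewer.submit();                           // render into the container
  }
);
```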

Read More

Information Security in the Digital Age [Podcast]

This is the first of what we hope will be many podcasts in which we explore the technology and culture of Enterprise Information Management (EIM). We’re going to share stories about how OpenText is delivering world-class technology and improving our customer experience on a daily basis. In this installment, we hope to give you a better understanding of the current cyber security climate, show you what we’re doing to keep your data secure and protect your privacy, and tell you how you can protect yourself online. Our discussion on information security has been recorded as a podcast! If you’d like to listen but don’t see the player above, click here. If you don’t want to listen to the podcast, we’ve transcribed it for you below:

“… The unknown unknown…”

“… If it was three in the morning and there was a bunch of guys standing down a poorly lit alley, would you walk down there by yourself? Probably not. Yet on the Internet, we do that continuously—we walk down that street—and then we’re shocked when negative things happen…”

“… People have an expectation that once they put a lock on their door they’re secure. And that might be the case in their home. But electronically it’s not quite so simple…”

Are we safe online? Perhaps a better question is whether our information is safe online. 2014 was a banner year for information security – what we now call cyber security – and if analyst reports are any indication, security professionals are on high alert in 2015. International governing bodies have also placed an urgency on better understanding cyber security risks and putting in place strategies to ensure stable telecommunications and safeguard information. There has also been growing concern around data privacy. Though security and privacy work hand-in-hand, and it’s difficult to have data privacy without security, there is a difference between the two terms. Security involves the confidentiality, availability and integrity of data. It’s about only collecting information that’s required, then keeping that information safe and destroying it when it’s no longer needed. Privacy, on the other hand, is about the appropriate use of data. To help us through the topic of cyber security, we talked to Greg Murray, VP of Information Security and Chief Information Security Officer at OpenText. The OpenText security team is made up of specialists around the world who provide operational response, risk assessments and compliance. They also brief executive leadership regularly, and keep development teams abreast of pertinent security information. More importantly, Greg and his team work with our customers to ensure their unique security needs are covered end-to-end. “It starts early in the process,” says Greg. “It starts in the presales cycle where we try to understand the risks that [our customers] are trying to manage in their organization. We find out how they are applying security against that, and then that becomes a contractual obligation that we make sure is clearly stated in our agreement with the customer. From there, it goes into our operations center – or risk center, depending on what we’re looking at – and we ensure that whatever our obligations, we’re on top of them and following the different verticals and industries.” Again, 2014 was a big year for cyber security in the news (I think we all remember the stories of not too long ago).
But while news agencies focused on the scope and possible future threats, Greg learned something else: "I think if we look at media, one probably would not have argued until last year that media was a high threat area compared to something like aerospace defense. That has changed. Clearly that has changed. As a result, customers come back and say, 'Hey, our environment has changed. What can you do to help us with that?'"

"What a financial institution requires is very different than what a manufacturing provider requires or a pharmaceutical organization. Some of that, as a provider to these organizations and customers, we can carry for them on their behalf. In other cases they must carry it themselves. A lot of the discussions that we have with customers are in regards to 'Where's that line?'"

"At the end of the day, there's a collaboration. It's not all on the customer, it's not all on OpenText. We have to work together to be able to prove compliance and prove security across the environment."

Regardless of the size, industry or location of an organization, security needs to be a top priority. This concept isn't a new one. As Greg told Adam Howatson, OpenText CMO, in a recent Tech Talk interview, information security hasn't evolved that much over the last 50 years (view the discussion on YouTube). Greg's answer may surprise you, but after some digging I learned that back in 1998, the Russian Federation brought the issue of information security to the UN's attention by suggesting that telecommunications were beginning to be used for purposes "inconsistent with the objectives of maintaining international stability and security." Since then, the UN has been trying to increase transparency, predictability and cooperation among the nations of the world in an effort to police the Internet and private networks. Additionally, if you have seen the Alan Turing biopic The Imitation Game, you know that people have been trying to encrypt and decipher messages since at least the 1940s.

Today, the lack of physical borders online has certainly complicated things, but the information security game remains the same, and cooperation among allies remains the key. "Are we all contributing together?" Greg asks. "If we're all working together—just like Neighborhood Watch—we need that same neighborhood community watch on the internet. If you see stuff that doesn't look right, you should probably report it. The bad guys are organized and we need to be organized as well. The more we share information and the more we work together… Particularly at OpenText, we have a lot of customer outreach programs and security work where we work hand-in-hand with customer security teams. By doing that, we improve not only our security, but we improve security across the industry."

Recently I attended a talk given by Dr. Ann Cavoukian, former Ontario Privacy Commissioner and Executive Director of the Privacy and Big Data Institute at Ryerson University in Toronto. In it, she said that "privacy cannot be assured solely by compliance with regulatory frameworks; rather, privacy assurance must ideally become an organization's default mode of operation." She said that privacy—which, again, involves the appropriate use of information—must be at the core of IT systems, accountable business practices, physical design and networked infrastructure. Privacy needs to be built into the very design of a business.
And I think it's evident from what Greg says about security, and the way OpenText designs its software with the users' needs in mind, that our customers' privacy and security are an essential part of what we offer.

"We have a tremendous number of technical controls that are in place throughout all of our systems. For us, though, it starts on the drawing board. That's when we start thinking about security."

"As soon as Product Management comes up with a new idea, we sit down with them to understand what they're trying to achieve for the customer and how we're going to secure it. So that by the time somebody's uploading that document, it's already gone through design, engineering, regression testing analysis, security penetration testing."

"One of the other things we do is called threat modelling. Typically we look at the different types of solutions—whether they're file transfer or transactional, for example—and we look across the industry to see who has been breached and how. We then specifically include that in all of our security and regression testing."

You don't need to look further than the OpenText Cloud Bill of Rights for proof of our dedication to information security and privacy. In it, we guarantee the following for our cloud customers:

- You own your content
- We will not lose your data
- We will not spy on your data
- We will not sell your data
- We will not withhold your data
- You locate your data where you want it

Not everyone is up front with their data privacy policy, but with people becoming more aware of information security and privacy concerns, organizations are going to find themselves facing serious consequences if they do not make the appropriate changes to internal processes and policy.

Data security doesn't lie solely in the hands of cloud vendors or software developers, however. We asked Greg what users and IT administrators can do to protect themselves, and he said it comes down to three things:

"One is change your passwords regularly. I know it sounds kind of foolish, but in this day and age, if you can use two-factor or multi-factor authentication, that does make a big difference."

"The second thing you can do is make sure your systems are patched. 95% of breaches happen because systems aren't patched. When people ask 'What's the sexy side of security?', it's not patching. But it works. And it's not that expensive—it's typically included free from most vendors."

"The third thing is 'think before you click.' If you don't know who it is or you don't know what it is… Curiosity kills the cat, and curiosity infects computers."

We hope you enjoyed our discussion on information privacy and cyber security. If you'd like to know more about the topics discussed today, visit opencanada.org, privacybydesign.com and of course Opentext.com. We also encourage you to learn more about security regulations and compliance by visiting the CCIRC and FS-ISAC websites.

Read More

Cloud Quotation Management: Mobility Redefined

Consider this. It's Friday afternoon. The sun is shining and everyone at the office is enjoying the chance to eat lunch outside. You're already considering your plans for the weekend. Then the phone rings with a customer on the other end. He needs a quotation. Right away.

You have two options. You can wolf down your lunch, return to your office, and ruin not just your own afternoon, but that of your co-workers and office staff too – trying to put everything together as fast as possible for the client in question. Maybe you'll get home by 8 p.m., if you're lucky, with no energy to enjoy the rest of the day. Alternatively, you can take out your smartphone and have the quotation ready to go before your next bite, sending it directly to the customer or to your manager for approval. And you can have all of that done in a couple of minutes, with calculated prices and discounts, up-to-date information and your company's corporate branding incorporated. Sounds impossible? It isn't. In fact, that's precisely what cloud quotation management is for.

It's true that cloud computing has had its fair share of growing pains and inconsistencies over the past few years. At first, scalability, document production, storage and printing from the cloud were, at best, available only by using unwieldy workarounds. And to make matters worse, there were all kinds of uncertainties regarding hosting and the countries in which sensitive data would be stored. But things have changed: many of those problems are now a thing of the past.

The impact of these changes couldn't be clearer. Cloud-based CRM systems are becoming increasingly prevalent, and the number of cloud products, apps and solutions designed to increase productivity is growing day by day. This, in turn, has resulted in the slow but sure migration of processes to the cloud as well. And quotation management is no exception, with benefits that include not only completely new possibilities for sales departments, but also an unparalleled boost in efficiency.

Save Time by Using Cloud Quotation Management

There's nothing worse than having to waste time. Commuting, traveling, waiting at the airport or train station – few other things are as maddeningly unproductive. That's why vendors such as Salesforce.com, Cobra and SugarCRM are releasing solutions that push the envelope in terms of how useful smartphone business apps can be. And their users couldn't be happier. This trend may or may not have been kick-started by Salesforce CEO Marc Benioff, who had a vision of managing his company through his smartphone. Now software vendors and app programmers are pursuing that same vision too, coming up with new business apps all the time. The result? Even the most experienced salesperson can now increase their productivity and save valuable time by using cloud-based tools – including cloud quotation management – wherever they are.

Increase Reliability by Using Cloud Quotation Management

If you work with Customer Relationship Management (CRM) or Enterprise Resource Planning (ERP) systems, you are probably already aware that appropriate process-specific actions can be triggered automatically. However, you may not be aware that you can take advantage of this exact same behavior when using cloud quotation management systems. Just like a local or on-premise CRM system, you can rely on processes and actions that will run automatically as soon as they're initiated.
Cloud quotation management enables you to trigger these actions and processes from your smartphone or tablet. In fact, these processes can take care of most of the work in the background. They can be used, for example, with PowerDocs to prepare quotations, request approvals and send emails automatically.

Reduce Risk by Using Cloud Quotation Management

By mapping and integrating company-specific processes into your cloud quotation management system, you can minimize quotation errors before you even get started. For example, the editing process, the data retrieval process and the quotation writing process can all be centrally managed. In addition, the system can be set up in such a way that outside and inside salespeople will be able to access the content they need based on permissions, create quotations with only a few taps by relying on smart querying processes, and send their quotations directly to their customers or to their supervisors for approval. And that's regardless of the number of pages, documents and attachments involved. To put it simply, correctly delineating and defining the processes for cloud quotation management in advance takes a load off sales departments' shoulders, making their work much easier overall.

Using the Right Cloud Quotation Management Tool

What else do you need to know about cloud quotation management? First of all, it doesn't necessarily require a cloud-based CRM. Cloud quotation management solutions can also be effectively combined with on-premises systems, meaning that, for instance, you can merge data from ERP, CRM and inventory control systems and retrieve, modify or edit it while you're with a customer. In addition, every large CRM vendor has cloud apps available, though these admittedly fall short when it comes to company-specific processes and mapping workflows in detail. Not surprisingly, this means that independent tools are often worth a look. In fact, these tools can come in very handy when switching between database systems or merging data from various systems, as they allow salespeople to continue working with the user interface they are familiar with, eliminating frustration and the need for time-consuming training in the process. PowerDocs is one such tool.

Be Ready for the Future Today, with Cloud Quotation Management

The revolution that is cloud computing, business apps, and mobile working solutions has only just begun and will keep growing in the coming years. Not surprisingly, communications, as well as the speed at which business is conducted, are changing along with it. This is why it's necessary to automate as many processes as possible now, in order to meet the future needs of customers. Cloud quotation management is not only the perfect solution, but also an ideal way to optimize quotation management – all while making it easier and faster.

Read More

Digital Disruption: The Forces of Data Driven Smart Apps [Part 3]

Editor's note: Shaku Atre (at right, above, at our Data Driven Summit last December) is the founder and managing partner of the Atre Group, Inc. of New York City, NY and Santa Cruz, California. (Read more about Atre here.) Atre has written a thorough and compelling treatise on the disruptive power of mobile apps, and supported her analysis and conclusions with templates and case studies. We are privileged to present her analysis here in four parts. In Part 1, she made the case for mobile apps and described some of the forces behind digital disruption. In Part 2 she expanded on those forces. In this post, Atre shares two templates for conceptualizing smart apps, and on March 19 she presents case studies in the financial services, telecommunications, car rental, and pharmaceutical industries.

—–

Digital Disruption: The Forces of Data Driven Smart Apps [Part 3 of 4] Copyright by Atre Group, Inc.

We will use the following two templates to conceptually plan and design smart apps in our case studies for various industries:

Template 1: Representation of the three major participants who create and use data as input to the app.

Figure 1

- Publisher X of the app: From a particular industry. The apps, most likely, are stored on servers in the cloud. Responses of the customers, as well as those of potential customers, are collected on the cloud servers.
- Functions provided by the publisher's app: The app will provide certain functions. Let us consider a commercial bank providing a function such as "Basic Consumer and Business Advantage checking with mobile banking."
- Who are the publisher's customers? E.g., consumers and small business owners. These are the primary users of the app and the primary beneficiaries.
- What benefits do these customers (the primary beneficiaries) receive? The consumers save substantial time by not having to travel to an ATM. They can deposit checks much quicker and can have access to the funds much faster.
- Potential customers (secondary beneficiaries) may become the publisher's customers based on referrals from the publisher's existing customers. Referrals could be direct contact between the primary and secondary beneficiaries, or they could happen via web-based reviews or some other vehicle. Once the potential customers become customers, they too receive the very same benefits as the primary beneficiaries. There are other types of beneficiaries as well: credit scorekeepers of the customers, for example.
- Publisher of the app's benefits: The publisher, in our case the bank, receives capital, which it can lend to customers at higher interest rates than it pays the depositing entities.

Data is driven from each of the three participants to the other two. All participants can apply analytics to the app's data, with the appropriate tools, to support decision making.

Template 2: Representation of Data Collection, Data Integration, Data Analysis and Discovery, Data Visualization, and the use of Mobile Technology as the messenger to supply messages to the participants of Template 1.

Figure 2

Generic and high-level view of Big Data and app logic for the smart and novel apps for your customers, the primary and secondary beneficiaries of your industry, along with a multitude of interfaces.

Interface A: Customers are using the app and creating data, such as purchases, returns, money transactions, etc. This results in web logs on the cloud servers.
Responses may be sent back via the cloud servers to the customers.

Interface B: Potential customers are using the app and creating data, such as purchases, returns, money transactions, etc. This results in web logs on the cloud servers. Responses may be sent back by the cloud servers to the potential customers.

Interface C: Some data created by the customers may be sent directly to Data Storage. A similar interface may exist for potential customers.

Interface D: Customers' invoices and archived internal data may be stored. Potential customers' invoices and archived internal data may be stored.

Interface E: Data created by the customers and by the potential customers using the app and the cloud servers is transmitted to and from the customers' data storage.

Interface F: Archived data, created by customers and potential customers, may be stored with the cloud servers and might be retrieved when necessary.

Interface G: Potential customers (secondary beneficiaries) create data with web logs, click stream, likes, dislikes, and sentiments.

Interface H: Customers' and potential customers' transactional data is stored in Real Time Data Storage.

Interface I: Potential customers' (secondary beneficiaries') data with web logs, click stream, likes, dislikes and sentiments is transferred to Real Time Data Storage.

Interface J: Customers' (primary beneficiaries') data with web logs, click stream, likes, dislikes and sentiments is sent to Real Time Data Storage.

Interface K: Marketing and industry-related big data, such as competitive data, data collected by sensors or other means, and social media data (customers' "DNA"), is streamed from the cloud servers.

Interface L: Marketing and industry-related big data, such as competitive data, data collected by sensors or other means, and social media data (customers' "DNA"), is streamed to Real Time Data Storage.

Interface M: Real-time data is transferred to the Analytics Logic Box, where data is analyzed and discoveries are made about customers' and potential customers' performance, likes and dislikes.

Interface N: Events are processed at mobile speed; the Analytics Logic Box and the Real Time "Make It Happen" Logic Box work together. Results are modified and travel between the two boxes in real time.

Interfaces O & P: At the same time, the Analytics Logic Box and the Real Time "Make It Happen" Logic Box work with the Matching Your Products/Services Logic Box to match your products and/or services against the likes, dislikes, discoveries and analytics results for customers and potential customers.

Interface Q: The Real Time "Make It Happen" Logic Box provides a visual display of its results.

Interface R: The Matching Your Products/Services Logic Box provides results of matches and/or mismatches to the Digital Responses Logic Box.

Interface S: Digital responses are created for communication with the smartphones of the customers and potential customers. If necessary, human staff contacts the customers and/or potential customers.

Interfaces T & V: The Visualization Logic Box transfers digital responses, pictured in visual format, to the customers and to the potential customers.

Interfaces U & W: Digital responses are sent by the app to the customers and the potential customers. The customers and the potential customers respond back to the app.
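For readers who prefer code to diagrams, the real-time portion of Template 2 (roughly Interfaces H through S) can be reduced to a simple pipeline. The sketch below is our illustration, not part of Atre's templates, and every name in it is invented:

```typescript
// Hypothetical sketch of Template 2's real-time flow, reduced to plain
// functions. All names are illustrative, not a real API.
interface AppEvent {
  participant: "customer" | "potential_customer"; // primary vs. secondary beneficiary
  kind: "purchase" | "return" | "click" | "sentiment";
  payload: Record<string, unknown>;
}

const realTimeStorage: AppEvent[] = []; // Real Time Data Storage (Interfaces H-L)

function analyticsLogicBox(events: AppEvent[]): string[] {
  // Interface M: discover likes/dislikes from the accumulated events
  return events.filter(e => e.kind === "sentiment").map(e => String(e.payload["topic"]));
}

function matchProductsServices(discoveries: string[]): string[] {
  // Interfaces O & P: match the publisher's products/services against discoveries
  const catalog = ["mobile banking", "business checking"]; // illustrative catalog
  return catalog.filter(product => discoveries.some(d => product.includes(d)));
}

function digitalResponse(matches: string[]): void {
  // Interfaces S, U & W: push a digital response back to the participant's phone
  console.log("Offer sent to smartphone:", matches);
}

// One pass through the pipeline for a single incoming event.
realTimeStorage.push({
  participant: "customer",
  kind: "sentiment",
  payload: { topic: "mobile banking" },
});
digitalResponse(matchProductsServices(analyticsLogicBox(realTimeStorage)));
```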
—– On March 19, this four-part blog series will wrap up with Shaku Atre’s case studies for smart apps in financial services, telecommunications, car rental and pharmaceutical industries. Subscribe (at left) to be notified when the posts are available.    

Read More

Infusing the Supply Chain with Analytics

The strategic use of supply chain information is a key driver of competitive advantage in the Digital-First World. Once a company's B2B processes are automated and transactions are flowing, visibility into those transactions can fuel better strategic and tactical decision-making across the entire business network. Insights from analytics allow trading partners to speed their decision-making, rapidly respond to changing customer and market demands, and optimize their business processes.

Our mission at OpenText is to enable our customers to prepare for and thrive in the digital future. Analytic capabilities will play a key role. As I've said in previous posts, analytic technologies represent the next frontier in extracting value from enterprise information. For this reason, we are infusing new analytic capabilities into all our core solutions.

We recently added new analytic capabilities to the OpenText Trading Grid to help our customers easily access insights for improving their supply chains' effectiveness. The OpenText Trading Grid is powered by the OpenText Cloud and is the world's leading B2B integration network, processing more than 16 billion transactions per year and integrating 600,000 trading partners for more than 60,000 customers around the globe.

The solution boasts easy-to-use dashboards that depict and summarize data trends and compare them to key performance indicators (KPIs) for the business. They allow companies to evaluate the performance of suppliers or the behavior of customers and use this information to improve processes and relationships. On a more granular level, 'track and trace' data highlights exception information about a specific order, shipment, or invoice. By taking prompt corrective action, companies can remedy a situation before process performance degrades or costs accumulate. (A toy illustration of these two ideas appears at the end of this post.)

There are many, many scenarios in which analytic insights bring incredible benefit to the supply chain. Armed with information about the physical location of products or shipments, companies are able to plan operations with greater efficiency, reduce the number of items lost in transit, and fulfill orders with greater accuracy. They can replenish products as shortages are detected. When it comes to process automation, information about the performance of equipment is used to track degradation, order replacement parts, and schedule service before failure occurs.

Today, the digital supply chain is an information supply chain that coordinates the flow of goods, communications, and commerce internally and externally across an extended ecosystem of business partners. It seamlessly integrates data from supply chain processes and smart equipment, and tracks intelligent products, parts, and shipments tagged with sensors. Within a few short years it will expand to integrate data from the Internet of Things (IoT). Data will flow online from myriad devices, including wearable technologies, 3-D printers, and logistics drones. It is expected that we will see a thirty-fold increase in web-enabled physical devices by the year 2020. All of these devices producing volumes of data will create a network rich with information and insights.

This is the future. A future in which analytics bring incredible value and competitive advantage to the supply chain. And we are only just beginning to envision it. To learn more about OpenText Trading Grid Analytics, read our press release or visit our website.
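As promised above, here is a hypothetical sketch of the two kinds of insight described in this post: a supplier on-time-delivery KPI and a 'track and trace' exception list, both computed from transaction records. The field names are invented for illustration and are not the actual Trading Grid Analytics schema:

```typescript
// Toy model of B2B transactions for KPI and exception analysis.
// All names are illustrative assumptions, not a real schema.
interface Transaction {
  supplier: string;
  orderId: string;
  promised: Date;
  delivered?: Date; // undefined while still in transit
}

// KPI: fraction of a supplier's completed deliveries that arrived on time.
function onTimeRate(txns: Transaction[], supplier: string): number {
  const done = txns.filter(t => t.supplier === supplier && t.delivered);
  if (done.length === 0) return 1; // no completed deliveries yet
  const onTime = done.filter(t => t.delivered! <= t.promised);
  return onTime.length / done.length;
}

// "Track and trace" exceptions: orders past their promised date and
// still undelivered, so corrective action can be taken promptly.
function exceptions(txns: Transaction[], now: Date): Transaction[] {
  return txns.filter(t => !t.delivered && t.promised < now);
}
```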

Read More

Apple Watch Validates the Power of Analytics

The Apple Watch is not only the company's foray into the smartwatch and wearable technology space; it also validates the importance of digital disruption, cloud delivery and embedded analytics. Even before the company's smartwatch made its formal debut this week, application developers representing companies that provide sports-related, entertainment, and productivity software were called on by Apple to create innovative app designs that highlighted the device as a customized timepiece, instant communications device, and health and fitness companion. The result: Apple executives announced 50 new apps that will all work immediately with the upcoming Apple Watch, including Instagram, MLB.com At Bat, Nike+ Running, OpenTable, Shazam, Twitter, WeChat, Uber, Salesforce, American Airlines and Honeywell Lyric thermostat.

What these applications have in common is that they all disrupt our notion of how content is delivered as well as how and where information is analyzed and provided. For example, Apple demonstrated with the help of supermodel Christy Turlington Burns how apps might access data such as weight, blood pressure, glucose levels and asthma inhaler use. Third-party devices and apps can measure the data through a cloud delivery system and then notify the user to take appropriate actions. Other data that can be measured and analyzed include the watch's accelerometer, taptic engine, haptic feedback, microphone and gyroscope, plus the GPS sensor in your iPhone, to gain insight into the wearer's gait, motor impairment, fitness, speech and even memory.

Apple Watch and Digital Disruption

In reviewing the potential of the Apple Watch, it is apparent that businesses will be able to capitalize on these digital disruptions in several areas. On paperwork reduction, Apple said its Watch can make it easier to recruit participants for large-scale research studies. Instead of sending out reams of survey packets, participants can complete tasks or submit surveys right from an app on their wrist, so researchers spend less time on paperwork and more time analyzing data. Using cloud-delivered analytics, researchers might then present an interactive informed consent process.

Here are some other ways Apple's Watch creates opportunities for businesses to take advantage of a cloud-delivered embedded analytics engine:

- Business Process Management (BPM): The most common process interaction is an approve/reject function. The Apple Watch is likely to raise the bar on how mobile devices handle BPM on the go. Status updates, alerts, reports and approval steps involved in a business process can be conducted on the watch.
- Enterprise Content Management (ECM): Collaboration will be the likely use case here. Commenting and following comments from co-workers, trending topics, volume of interactions, as well as simple sharing of documents and folders are tasks that may move to a watch.
- Customer Experience Management (CEM): All content that you see on a watch represents the brand identity of the application provider and defines the essence of the customer experience. Watches will become a digital experience channel that needs to be treated as part of a consistent omni-channel strategy, while delivering the best possible experience given the capabilities and limitations of the device.
- Information Exchange (IX): The Near Field Communication (NFC) sensor in Apple Watch will enable a new class of applications that can interact with the physical world. In a factory or warehouse, this may include actions such as retrieving information about a box of parts, ordering new parts when supplies are low, retrieving status updates on current shipments, or delivering supplier alerts.

Another Demo to Watch

While Apple put on an impressive show of its design strength, the company is by no means the first smartwatch maker to demonstrate applications that tap into Big Data and deliver analytics via the cloud. In November 2014, Actuate (now OpenText) combined the power of integrated Big Data access across multiple devices (including a smartwatch) with visualizations, open APIs and embedded analytics. Check out that demonstration in this video.

Any thoughts on the Apple Watch, digital disruption or the future of analytics? Leave your comments below.

Read More

Step Aside Cloud, Mobile and Big Data, IoT has just Entered the Room

Mark Morley

This article provides a review of the ARC Advisory Group Forum in Orlando and expands on the ever-increasing importance of analytics in relation to the Internet of Things.

The room I am referring to here is the office of the CIO – or should that be CTO, or CDO (Chief Digital Officer)? You see, even as technology is evolving, the corporate role to manage digital transformation is evolving too. Since 2011, when Cloud, Mobile and Big Data technologies started to go mainstream, individual strategies to support each of these technologies have been evolving, and some would argue that in some cases they remain separate strategies today. However, the introduction of the Internet of Things (IoT) is changing the strategic agenda very quickly. For some reason IoT, as a 'collective & strategic' term, has caught the interest of the enterprise and the consumer alike. IoT allows companies to effectively define one strategy that potentially embraces elements of cloud, mobile and Big Data.

I would argue that in terms of IoT, cloud is nearly a commodity term that has evolved into offering connectivity any time, any place, anywhere. Mobile has evolved from simply porting enterprise applications to HTML5 to wearable technology such as Microsoft HoloLens, shown below. Finally, Big Data is broadening its appeal by focussing more on the analytics of information rather than just archiving huge volumes of data. In short, IoT has brought a stronger sense of purpose to cloud, mobile and Big Data.

Two weeks ago I was fortunate to attend the ARC Advisory Group Forum in Orlando, a great conference if you have an interest in the Industrial Internet of Things and the direction it is taking. The terminology being used here is interesting, as it is just another strand of the IoT; I will expand more on this naming convention a bit later in this post. There were over 700 attendees at the conference, and a lot of interest, as you would expect, from industrial manufacturers such as GE, ABB, ThyssenKrupp & Schneider Electric. These companies weren't just attending as delegates; they were actually showcasing their own IoT-related technologies in the expo hall. In fact it was quite interesting to hear how many industrial companies were establishing state-of-the-art software divisions for developing their own IoT applications.

For me, the company that made the biggest impact at the conference was GE and their Intelligent Platforms division. GEIP focused heavily on industrial analytics and in particular how it could help companies improve the maintenance of equipment, either in the field or in a factory, by using advanced analytics techniques to support predictive maintenance routines.

So how does IoT support predictive maintenance scenarios? It is really about applying IoT technologies such as sensors and analytics to industrial equipment, processing the information coming from the sensors in real time to identify trends in the data, and using those trends to predict when a component such as a water pump is likely to fail. If you can predict when a component is likely to fail, you can replace a faulty component as part of a predictive maintenance routine, and the piece of equipment is less likely to experience any unexpected downtime. In GE's case, they have many years of experience and knowledge of how their equipment performs in the field, and so they can utilise this historical data as well to determine the potential timeline of component failure.
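To make the predictive maintenance idea more concrete, here is a minimal sketch of one common approach: fit a linear trend to a sensor reading and estimate when it will cross a failure threshold. This illustrates the general technique only, not GE's actual analytics; the threshold, units and sample data are all invented:

```typescript
// Watch a sensor reading (e.g. pump vibration) for an upward trend and
// estimate when it will cross the failure threshold. Illustrative only.
interface Reading { t: number; value: number; } // t in hours, value e.g. vibration (mm/s)

function predictFailureTime(readings: Reading[], failureThreshold: number): number | null {
  if (readings.length < 2) return null;
  // Least-squares slope of value over time (simple linear trend)
  const n = readings.length;
  const meanT = readings.reduce((s, r) => s + r.t, 0) / n;
  const meanV = readings.reduce((s, r) => s + r.value, 0) / n;
  const slope =
    readings.reduce((s, r) => s + (r.t - meanT) * (r.value - meanV), 0) /
    readings.reduce((s, r) => s + (r.t - meanT) ** 2, 0);
  if (slope <= 0) return null; // no degradation trend detected
  const intercept = meanV - slope * meanT;
  return (failureThreshold - intercept) / slope; // estimated hour of failure
}

// Example: schedule replacement before the predicted failure time.
const history: Reading[] = [
  { t: 0, value: 2.1 }, { t: 24, value: 2.4 }, { t: 48, value: 2.9 }, { t: 72, value: 3.5 },
];
const eta = predictFailureTime(history, 7.0); // 7 mm/s is an assumed failure threshold
if (eta !== null) console.log(`Plan maintenance before hour ${Math.floor(eta)}`); // ~hour 254
```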
In fact GE went to great lengths to discuss the future of the 'Brilliant Factory'. The IoT has brought a sense of intelligence or awareness to many pieces of industrial equipment, and it was interesting learning from these companies about how they would leverage the IoT moving forwards. There were two common themes to the presentations and what the exhibitors were showcasing in the expo hall.

The first was cyber-security. Over the past few months there has been no end of hacking-related stories in the press, and industrial companies are working very hard to ensure that connected equipment is not 'hackable'. The last thing you want is a rogue country hacking into your network, logging into a machine on the shopfloor and stealing tool path cutting information for your next great product that is likely to take the world by storm. So device or equipment security is a key focus area for industrial companies in 2015. Interestingly, it wasn't just the cyber-security of connected devices that was keeping CIOs awake at night; a new threat is emerging on the horizon. What if a complete plant full of connected devices could be brought down by a simple Electro Magnetic Pulse (EMP)? This was another scenario discussed in one of the sessions at the conference. So encryption and shielding of data is a key focus area for many research establishments at the moment.

The second key theme at the conference was analytics. As we know, Big Data has been around for a few years now, but even though companies were good at storing TBs of data on mass storage devices, they never really got the true value from the data by mining through it and looking for trends or pieces of information that could either transform the performance of a piece of equipment or improve the efficiency of a production process. By itself, Big Data is virtually useless unless something is done with it that results in actionable intelligence and insight that delivers value to the organisation. An interesting quote from Oracle: 93% of executives believe that organisations are losing revenue as a result of not being able to fully leverage the information they have.

So deriving value from information coming from sensors attached to connected devices is going to become a key growth sector moving forwards. It is certainly an area that the CIO/CTO/CDO is extremely interested in, as it can directly impact the bottom line and ultimately bring increased value to shareholders. I guess it is no surprise, then, that the world's largest provider of Enterprise Information Management solutions, OpenText, should acquire Actuate, a leading provider of analytics-based solutions. Last week the Information Exchange business unit of OpenText, which has a strong focus on B2B integration and supply chain, launched Trading Grid Analytics, a value-add service to provide improved insights into transaction-based information flowing across our cloud-based Trading Grid infrastructure. With 16 billion transactions flowing across our business network each year, there is a huge opportunity to mine this information and derive new value from these transactions, beyond the EDI-related information that is being transmitted between companies on our network. Can you imagine the benefits that global governments could realise if they could predict a country's GDP based on the volume of order and production related B2B transactions flowing across our network?
Actuate is not integrated with Trading Grid just yet, but it will eventually become a core piece of technology to analyse information flowing across not just Trading Grid but our other EIM solutions. It is certainly an exciting time if you are a customer using our EIM solutions! Actuate has some great embedded analytics capabilities that will potentially help improve the overall operational efficiency of connected industrial equipment.

In a previous blog I mentioned B2B transactions being raised 'on device'. With semi-conductor manufacturers such as Intel spending millions of dollars developing low-power chips to place on connected devices, the device will become even more 'intelligent' and almost autonomous in nature. I think we will see a lot more strategic partnerships announced between the semi-conductor manufacturers and industrial equipment manufacturers such as GE and ABB.

Naturally, cloud, mobile and Big Data play a big part in the overall success of an IoT-related strategy. I certainly think we will see the emergence of more FOG-based processing environments. 'FOG', I hear you ask? Yes, another term, one I heard at a Cisco IoT World Forum two years ago. Basically, a connected device is able to perform some form of processing or analytics task in a FOG environment, which is much closer to the connected device than a traditional cloud platform. Think of FOG as being halfway between the connected device and the cloud, i.e. a lot of pre-processing can take place on or near the connected device before the information is sent to a central cloud platform.

So, coming back to the conference, there was actually another area that was partially discussed: IoT standards. I guess it is to be expected that, as this is a new technology area, it will take time to develop new standards for how devices are connected to each other and standard ways for transporting, processing and securing the information flows. But there is another area of IoT-related standards that is bugging me at the moment: the many derivatives of the term IoT that are emerging. IoT was certainly the first term, defined by Kevin Ashton, closely followed by GE introducing the Industrial Internet of Things, Cisco introducing the Internet of Everything, and then the German manufacturers introducing Industry 4.0. I appreciate that it has been the manufacturing industry that has driven a lot of IoT development so far, but what about other industries such as retail, energy, healthcare and other industry sub-sectors? Admittedly IoT is a very generic term, but already it is being associated more with consumer-related technologies such as wearable devices and connected home devices such as NEST. So in addition to defining standards for IoT cyber-security, connectivity and data flows, how about introducing a standard naming convention that could support each and every industry? As there isn't a suitable set of naming conventions, let me start the ball rolling by defining a common naming convention! I think the following image nicely explains what I am thinking of here.

In closing, I would argue, based on the presentations I saw at the ARC conference, that the industrial manufacturing sector is the most advanced in terms of IoT adoption. Can you imagine what sort of world we will live in when all the industries listed above embrace IoT? One word: exciting!

Mark Morley currently leads industry marketing for the manufacturing sector at OpenText.
In this role, Mark focuses on the automotive, high-tech and industrial sectors. He also defines the go-to-market strategy and thought leadership for applying B2B e-commerce and integration solutions within these sectors.

Read More

Data Driven Digest for February 27

Each Friday we share some favorite reporting on, and examples of, data driven visualizations and embedded analytics that came onto our radar in the past week. Use the "Subscribe" link at left and we'll email you with new entries.

Om Data: Word clouds can be great,* but they're not always enough; sometimes you also need to show relationships between words. That's what the folks at Information is Beautiful did with the graphic above (part of a larger infographic; click through to see it). It charts the benefits of meditation and mindfulness based on more than 75 clinical studies; organizes them into four broad groups (cognitive, physical, emotional and social benefits); charts the strength of evidence supporting each claim by word size (as in a word cloud); and organizes it all to show where benefits overlap. (For example, "increased self-control & will power" is considered both a cognitive benefit and an emotional one.) The result is an enlightening visualization.

* You can now create word clouds in OpenText Analytics using the new custom visualizations capability in BIRT iHub 3.1. Learn about the new version here.

Wave Length: Politicians in Washington are once again arguing about immigration policy. If they want to inform their debate with data, they could start with the chart above. (Click through for the full interactive version.) Natalia Bronshtein has visualized nearly 200 years of U.S. immigration data from the federal Yearbook of Immigration Statistics. Hover your mouse anywhere on the graphic to see statistics on the many lands immigrants hail from, organized by country and decade. Bronshtein's homepage has many more compelling interactive visualizations.

Where Are You: Data Science Central, a social network for big data, business analytics and data science practitioners, has an ongoing project to understand where its members live and work. The organization correlated its own membership list with data from job websites in the U.S. to come up with the map above; Livan Alonso took the data a step further to produce the map below. Vincent Granville, Data Science Central's founder, has asked Alonso to slice and dice the data further, so this is definitely a work in progress as well as a labor of love.

Got a favorite or trending resource on embedded analytics and data visualization? Share it with the readers of the OpenText Analytics blog. Submit ideas to blogactuate@actuate.com or add a comment below. Subscribe (at left) and we'll email you when new entries are posted.

Recent Data Driven Digests:
February 20: Lunar new year travels, mapping sounds, winning poker hands
February 13: Shopping malls, subway germs, advice for choosing visualizations
February 6: Anthony Davis, California farmland, Where's Waldo?

Read More

Forget the Oscars, Tata Motors Won a Bigger Award in Mumbai

Last week I had the pleasure of attending our Innovation Tour event in Mumbai, the first leg of a multi-city tour of the world to showcase our Enterprise Information Management solutions and how they help companies move to the digital-first world! The event was very well attended, and it was good to see keen interest being shown in our new offerings such as Actuate and Core alongside our other more mature EIM solutions. Enterprise World has traditionally been our key event of the year, but the Innovation Tour provides a way for OpenText to get closer to our customers around the world, and Mumbai was no exception, with keen interest shown in our expo hall.

I have been to India before, two years ago in fact, to meet with an automotive industry association that looks after the ICT needs of the entire Indian automotive industry. Back then, the discussion was focused around B2B integration. However, last week's event in Mumbai showcased all solutions from the OpenText portfolio. One of the interesting solution areas being showcased by one of our customers was Business Process Management (BPM), and it is only fitting that one of our India-based customers won an award for their deployment of BPM. Why fitting? Well, India has long been the global hub for business process outsourcing, so I guess you could say there is a natural interest in improving the management of business processes in India. OpenText has a strong presence in the Indian market.

OpenText presented a number of awards during the event, and Tata Motors was the worthy winner of the award for the best deployment of BPM. Incidentally, Tata Motors also won the global Heroes Award at last year's Enterprise World event for their deployment of our Cordys BPM solution.

So who are Tata Motors, I hear you ask? Well, they are the largest vehicle manufacturer in India, with consolidated revenues of $38.9 billion. Tata Motors is part of a large group of companies which includes Tata Steel, Jaguar Land Rover in the UK, Tata Technologies and many other smaller companies that serve the domestic market in India. Tata Group is fast becoming a leading OpenText customer, showcasing many different EIM solutions. For example, Jaguar Land Rover uses OpenText Managed Services to manage B2B communications with over 1,200 suppliers following its divestiture from Ford in 2009. Tata Steel in Europe also uses our Managed Services platform to help consolidate eleven separate EDI platforms and three web portals onto a single, common platform. So, simplification and consolidation of IT and B2B infrastructures is a common theme across Tata Group, and Tata Motors is no different with their implementation of OpenText BPM.

Tata Motors has struggled over the years to exchange information electronically with over 750 vehicle dealers across India. Varying IT skills and multiple business processes, combined with having to use a notoriously difficult utilities and communications infrastructure across the country, were really starting to impact Tata Motors' business. In addition, their IT infrastructure had to support over 35,000 users, and there were over 90 different types of business application in use across 1,200 departments of the company. So ensuring that accurate, timely information could be exchanged across both internal and external users was proving to be a huge problem for Tata Motors. Step forward, OpenText BPM!
Tata Motors decided to deploy our Cordys BPM solution as a SOA-based platform to connect all their business applications and, more importantly, provide a common platform to help exchange information electronically across their extensive dealer network. Even though they had deployed Siebel CRM across their dealer network, Tata Motors faced a constant challenge of having to process a high volume of manual, paper-based information, and quite often this information would be inaccurate due to mis-keying. A simple mistake, but when scaled up across 750 dealers, it can have a serious impact on the bottom line and, more importantly, on customer satisfaction levels with respect to new vehicle deliveries or spare parts related orders.

Tata Motors had a number of goals for this particular project:

- Implement a Service Oriented Architecture – The primary objective was to set up a SOA environment for leveraging existing services and hence avoid re-inventing the wheel. They also wanted to use this platform to streamline the current integrations between multiple business systems.
- Process Automation / Business Process Management – They had a lot of manual, semi-automated or completely automated processes. Manual or semi-automated processes were inefficient and in some cases ineffective as well. Some of their automated processes were actually disconnected from actual business case scenarios. So the goal for implementing BPM was to bring these processes nearer to 'business design', thus improving efficiency and process adherence.
- Uniform Web Services Framework – Tata Motors' goal was to establish a single source of web services that could convert existing functionalities of underlying service sources into inter-operable web services.

So, what were the primary reasons for Tata Motors choosing OpenText BPM? It is a SOA enabler, it offers business process automation capabilities, it is a comprehensive product for application development, and it minimizes application development time while improving cost effectiveness. Their BPM implementation covered two main areas:

- Enterprise Applications Integration – This mainly deals with inward-facing functionalities of employee and manufacturing related process applications. They had many applications, but those applications had a common fault: they did not follow SOA principles. Web services had to be developed inside every application, which was very inefficient from a time and resources point of view. In addition, if an application had to connect to SAP, it was an independent, unmanaged and insecure connection.
- Customer Relationship & Dealer Management Systems Integration – Tata Motors is the biggest player in the commercial vehicles sector in India and one of the biggest in terms of passenger car sales, with over 750 dealers scattered across India. The dealerships are managed using a Siebel CRM-DMS implementation, but with many changes being rolled out across the system, it needed a supporting platform to effectively manage this process. Cordys became the primary environment for developing CRM-DMS applications.

So in summary, Cordys BPM has been integrated with SAP, Siebel CRM-DMS, Email/Exchange Server, Active Directory, Oracle Identity Manager, an SMS gateway and mobile applications across Android and iOS. The Cordys implementation also delivered a number of business benefits, including improved process efficiency, stronger process adherence, a SOA-based platform, and significant cost and time savings. The project has already achieved its ROI!
Moving forwards, OpenText BPM will act as a uniform, centrally managed and secure web services base for all applications used across the Tata Motors landscape, irrespective of the technology in which they are developed. The platform will also provide an evolving architecture to mobilise existing applications, and they plan to integrate with an in-house developed document management system. Finally, the go-forward plan is to move their Cordys implementation to the cloud for improved management of their infrastructure.

I have visited many car manufacturers over the years, and one company headquartered in the Far East had over 300 dealers in Europe, each of which had been allowed to implement its own CRM and DMS environments to manage its dealer business processes. Prior to the acquisition of GXS (my former company) by OpenText, I had to inform them that GXS didn't have a suitable integration platform to help seamlessly connect all 300 dealers to a single platform. With OpenText BPM we can clearly achieve such an integration project now, and Tata Motors is certainly a shining light in terms of what is achievable from an extended enterprise application integration point of view. Congratulations Tata Motors!

For more information on OpenText BPM solutions, please CLICK HERE. Finally, I just want to say many thanks to my OpenText colleagues in India; it was a very successful event and a team effort to make it happen. For more information on our Innovation Tour schedule, please CLICK HERE

Read More

Could the Smart Trash Can Take Waste Out of the Supply Chain?

In my last post I introduced a vision for the Smart Trash Can that would automatically identify the items you are throwing away. What would you do with the data collected? The waste management company may not have much use for the data, but manufacturers and retailers who are trying to predict what consumers are going to buy next would find it very valuable.

How would the smart trash can work? There are a couple of different options. Version one (circa 2017) would probably rely on the use of RFID tags and readers. If manufacturers put RFID tags on each of the items you purchased, then the smart trash can would be able to identify them automatically. Version two (circa 2018) might add a camera to the lid. As items are being disposed of, the camera would automatically recognize each item using the "visual search" technology found in Google Goggles. Perhaps the smart trash can might have multiple cameras at different depths that could see through trash bag liners to identify items at rest. Version three (circa 2020) might be more advanced, with capabilities to identify items based upon smell. Perhaps the trash can would be fitted with sensors that can detect odors and identify items based upon their chemical composition.

You might be wondering what data retailers and manufacturers use to forecast demand today, and whether smart trash cans would provide an improvement. Today, the primary data used for forecasting demand is the information about what shoppers are buying at individual stores, or what is called "Point-of-Sale" data. Every night retailers and manufacturers run reports to understand how many of each item were sold in each store. They then try to guesstimate how much inventory they have on hand and whether or not they are going to run out of stock in the coming days (or weeks or months). If they are running low on inventory, then they will need to issue a replenishment order.

Would trash can data provide better insights than Point-of-Sale data? This raises a good question: what provides better insight into future sales – what people are buying, or what they are throwing away? Let's first think about items that are regularly purchased – batteries, diapers, detergent, shampoo, soda, milk, bread and salty snacks. I would argue that monitoring consumption (via trash cans) of these repeat purchases is a better indicator of near-term demand. If someone throws out a milk container, they are very likely going to buy a new one in the next 24 hours. In many cases, the disposal of an item after it is consumed is the event that triggers the need to buy another one.

But what about items which are not consistent, repeat purchases? Examples might include toys, electronics, clothing, shoes, etc. For these inconsistent purchases you might question the validity of the correlation between waste patterns and future purchases. Just because you throw something away doesn't mean that you are going to purchase it again – immediately or ever. The value of using the trash data is clear for groceries and regular purchases. Will Nestle, Procter & Gamble and Tesco begin giving away free kitchen trash cans to consumers just to collect data and be optimally positioned for replenishment orders?
Or, even better, what if your smart trash can was linked to your online grocery account? Items detected in your trash can (or recycling bin) could be automatically identified, then transmitted to the garbage truck upon pickup at your house. A replenishment algorithm could review your list of "always in stock" items to determine if the item should be replaced immediately. If yes, then a home delivery provider might visit a few hours later to drop off new supplies on your doorstep. Amazon Fresh could be extended to include Amazon Trash. Walmart might buy a waste management company. The smart trash can could create a myriad of new opportunities in the supply chain.
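Continuing the thought experiment, the replenishment logic described above might look something like the following sketch. Every name in it is invented for illustration:

```typescript
// Hypothetical replenishment check for a smart trash can: items detected at
// disposal time are screened against the household's "always in stock" list
// before any order is triggered. Illustrative only.
interface DisposalEvent { item: string; detectedAt: Date; } // e.g. via RFID or visual search

const alwaysInStock = new Set(["milk", "diapers", "detergent", "batteries"]);

function replenishmentOrders(events: DisposalEvent[]): string[] {
  // Only consistent, repeat purchases qualify for automatic replenishment;
  // a thrown-away toy or gadget does not imply a repurchase.
  return events.map(e => e.item).filter(item => alwaysInStock.has(item));
}

// Example: the trash can reports two disposals at pickup time.
const todaysOrders = replenishmentOrders([
  { item: "milk", detectedAt: new Date() },
  { item: "sneakers", detectedAt: new Date() }, // ignored: not a repeat purchase
]);
console.log(todaysOrders); // ["milk"] -> handed off to the online grocery account
```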

Read More

4 Questions: Dieter Meuser Discusses Analytics in Manufacturing

Dieter Meuser and two colleagues founded iTAC Software AG in 1998 to commercialize a manufacturing execution system (MES) developed for Robert Bosch GmbH. The timing was ideal, as manufacturers sought ways to leverage the nascent Internet to automatically monitor and manage their plants and thereby improve quality and efficiency to bolster the bottom line. Today, iTAC (Internet Technologies & Consulting) is one of the leading MES companies serving discrete manufacturers, and has customers throughout Europe, Asia and the Americas. iTAC.MES.Suite is a cloud-based, Java EE-powered application that enables IP-based monitoring and management of every aspect of a manufacturing plant. OpenText Analytics provides the business intelligence (BI) and analytics capabilities embedded in iTAC.MES.Suite. You can see the software in action in booth 3437 at the IPC APEX EXPO, held February 22-26 in San Diego, California. Learn more about iTAC's participation in IPC APEX EXPO here, and learn more about the company at the end of this post.

Meuser, iTAC's CTO, has extensive experience working with manufacturers worldwide. He's also an expert on the German government's Industry 4.0 initiative to develop and support "Smart Factories" – that is, manufacturing plants that leverage embedded systems and the Internet of Things (IoT) to drive efficiency and improve quality. We asked Meuser for his thoughts on these topics.

OpenText: iTAC has more than two dozen enterprise customers with plants in 20 countries. What do those customers say are their pain points, particularly with regard to data?

Meuser: The biggest single pain point is this: companies have lots of data, but often they are unsure how to analyze it. Let me elaborate: many types of data are recorded to fulfill manufacturers' tracking and tracing requirements. (Called "traceability standards," these include VW 80131, VW 80160, MBN 10447, GS 95017 and others.) Data collected via sensor networks, such as plant temperature or humidity, are part of these standards. The objective of collecting this data is to continuously improve manufacturing processes through correlation analysis (or Big Data analysis), accomplished by running the data through intelligent algorithms. But because manufacturers frequently aren't sure which criteria they should use to analyze the data, analysis often does not happen to the extent that manufacturers want. As a result, data is collected and stored for possible later analysis. This can lead to a growing mountain of unanalyzed data and very little continuous improvement of processes. But it also illustrates why introducing data management right at the beginning of a tracking and tracing project is so important. Data management, supported by analytics, enables process optimization that otherwise would fall by the wayside.

OpenText: How do manufacturers use and analyze data – sensor data in particular – to improve their processes?

Meuser: Within manufacturing plants, the most common analysis is called Overall Equipment Effectiveness (OEE), based on integrated production data collection and machine data collection (PDC/MDC). This is done within the plant's manufacturing execution system (MES). PDC/MDC can happen automatically if the plant's systems are integrated, or manually via rich clients. The captured data can be evaluated in real time and analyzed over freely selectable time intervals.
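For readers new to OEE, the standard textbook definition is OEE = Availability × Performance × Quality. The sketch below assumes that definition and uses invented field names; iTAC's actual PDC/MDC calculation may differ in its details:

```typescript
// Standard OEE calculation from one shift's PDC/MDC-style data.
// Field names and example numbers are illustrative assumptions.
interface ShiftData {
  plannedMinutes: number;  // planned production time
  downtimeMinutes: number; // unexpected breakdowns plus changeovers
  idealCycleSec: number;   // ideal seconds per unit
  unitsProduced: number;   // total output, including scrap
  unitsGood: number;       // output excluding scrap
}

function oee(s: ShiftData): number {
  const runMinutes = s.plannedMinutes - s.downtimeMinutes;
  const availability = runMinutes / s.plannedMinutes;          // time actually running
  const performance = (s.idealCycleSec * s.unitsProduced) / (runMinutes * 60); // speed vs. ideal
  const quality = s.unitsGood / s.unitsProduced;               // good output ratio
  return availability * performance * quality;
}

// Example shift: 480 planned minutes, 45 minutes down, one unit every 6 seconds.
console.log(oee({
  plannedMinutes: 480, downtimeMinutes: 45,
  idealCycleSec: 6, unitsProduced: 4000, unitsGood: 3900,
})); // ~0.813 (availability 0.906 x performance 0.920 x quality 0.975)
```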
Reducing non-conformance costs is another important business case for data analysis in both IoT and Industry 4.0. The availability of structured and unstructured sensor data related to product failures (and the costs associated with them) opens new opportunities to determine the causes of non-conformance. There is enormous potential in systematically analyzing the causes of production failure. Failure cause catalogues, which many manufacturers have collected for decades, can be examined with the help of a modern data mining tool. Analyzing this data against quality, product and process data helps reduce failure costs in a Smart Factory.

OpenText: What is the role of analytics and data visualization in IoT and Industry 4.0?

Meuser: A major objective of data analysis and visualization in IoT and Industry 4.0 is automatic failure cause analysis. This is accomplished by measuring and testing product errors alongside data about manufacturing machines, equipment and processes, then identifying inefficient processes in order to establish solutions. These solutions must be checked by process engineers who have years of experience. Humans and machines go hand in hand when we optimize product quality in an Industry 4.0 factory.

OpenText: What are the benefits of a Smart Factory?

Meuser: A Smart Factory consists of self-learning machines that can identify the causes of failure under specific conditions, determine appropriate measures to address a failure, and send messages to inform operators of problems. This is sometimes called a cyber-physical system (CPS). Combined with appropriate software models, it enables autonomous manufacturing machines (within certain limits) and supports the overall objective of optimizing processes and avoiding failures before they happen. The Smart Factory is enabled by modern data analysis techniques. It relies on data about products, processes, quality and environment (e.g., room temperature or humidity) as appropriate. The ability to interface an ERP system with production equipment creates continuous vertical integration that covers the entire value chain, from receiving to shipping.
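Meuser’s description of self-learning machines that flag failure causes and message operators can be illustrated with a deliberately simplified screening step: tally which failure codes dominate under each process condition and raise an alert when one looks disproportionate. The record fields, threshold and alerting hook below are all hypothetical stand-ins for what a real MES would provide, and, as Meuser notes, any correlation this surfaces still needs verification by experienced process engineers.

```python
from collections import defaultdict

# Hypothetical records: (failure_code, process_condition) pairs drawn from
# quality, product and process data, e.g. a binned room temperature.
records = [
    ("solder_bridge", "temp_high"),
    ("solder_bridge", "temp_high"),
    ("solder_bridge", "temp_normal"),
    ("misalignment",  "temp_normal"),
    ("misalignment",  "temp_high"),
]

def failure_shares_by_condition(records):
    """For each condition, compute each failure code's share of observed failures."""
    counts = defaultdict(lambda: defaultdict(int))
    totals = defaultdict(int)
    for failure, condition in records:
        counts[condition][failure] += 1
        totals[condition] += 1
    return {cond: {f: n / totals[cond] for f, n in fails.items()}
            for cond, fails in counts.items()}

def alert_operator(condition: str, failure: str, share: float) -> None:
    # Placeholder for the CPS-style message sent to plant operators.
    print(f"ALERT: {failure} accounts for {share:.0%} of failures under {condition}")

for condition, fails in failure_shares_by_condition(records).items():
    for failure, share in fails.items():
        if share > 0.5:  # arbitrary screening threshold for this sketch
            alert_operator(condition, failure, share)
```

With this toy data, only the solder-bridge/high-temperature pairing crosses the threshold, which is the signal a process engineer would then investigate.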
Be sure to visit iTAC at the IPC APEX EXPO, and read more about iTAC.MES.Suite in this case study.

More about iTAC

iTAC Software AG is a leading provider of next-generation, platform-independent, cloud-based MES solutions for original equipment manufacturers (OEMs) and suppliers within the discrete manufacturing sector. The company has more than 20 years of experience in internet-based solutions for discrete manufacturing, the Internet of Things (IoT) and the Industrial Internet. To date, iTAC has amassed an enviable portfolio of over 70 global enterprise customers across five primary industries: automotive, electronics/EMS/telecommunications technology, metal fabrication, energy/utilities and medical devices. Customers including Audi, Bosch, Continental, Hella, Johnson Controls, Lear, Schneider Electric, Siemens and Volkswagen rely on iTAC.MES.Suite to optimize their production processes. iTAC’s product portfolio represents the solutions for tomorrow’s Smart Factory. Its principal components are the iTAC.MES.Suite, the iTAC.Enterprise.Framework and iTAC.embedded.Systems, including the platform-independent iTAC.ARTES middleware and iTAC.Smart.Devices, the company’s new physical interface solutions.

Read More