B2B Integration

Wholesale Banking: The Drivers Behind Digital Channels

I use a tablet computer for many aspects of my daily life: paying bills, online shopping, social networking and entertainment. There is a slick and easy-to-use app for everything, and I can access it instantly for a very low price. These applications are all interconnected, pre-packaged, and running on industrial electronic highways behind the scenes, creating the "instantaneous" experience of the digital life. Strangely, five days a week between 8am and 6pm, my digital life goes six years back in time. This "retro" feeling is experienced by millions of office workers, largely because our consumer world at home is already digital. Expectations are shifting faster than reality, especially in the workplace. This article focuses on the impact of this digital "gap" in the world of wholesale banking. Enter Small and Medium Businesses (SMBs) on my left, Corporates on my right.

A Historical Difference Between the SMB and Corporate Market

Until recently, the first group ─ the SMB market ─ tended to use local currency bank accounts and domestic "low value" payments through historical clearing systems, with a fairly small financial supply chain. Business Online Banking was the most appropriate channel to capture their payables and receivables, either through web-based file upload/download or manual form input. In other cases, a small or medium business would outsource this function to a payment and payroll Service Bureau. In a last example, a medium business would have a direct "legacy" electronic submission method with the Bank (BACSTEL-IP in the UK). The treasury team is often just one or a small number of individuals with other responsibilities in the business, such as accounting and office administration.

The second group, Corporates, is more "industrial", requiring direct host-to-host integration for both low- and high-value payments coming from a Treasury system ─ part of an organized and optimized Treasury and Financial Supply Chain function. Payroll files, collection instructions, and payment submissions are the result of highly industrial processes, requiring an industrial relationship with the Bank. The day-to-day process is largely unattended; however, the treasury team usually consists of top professionals versed in the world of working capital optimization, intra-day liquidity and cash flow management.

SMBs Behaving Like Corporates and Vice-versa

On one hand, Small and Medium Businesses are becoming accustomed to commercial "digital" packages that enable them to automate their small treasury and finance operations. Accounting and treasury desktop software ─ usually available as an online or mobile version as well ─ now enables SMBs to "transact" with their Bank, instead of painstakingly browsing an online banking website to upload or type a series of records. SMBs are essentially becoming "host-to-host capable" through Internet-based communication protocols like SFTP and HTTPS (see the sketch at the end of this post). On the other hand, Corporates need to manage more and more exceptions in their industrial process, such as prompting a business signatory to execute an action, or "hijacking" a transaction and applying manual intervention outside the industrial flow. For example, high-value payments require multi-eye approval from business executives. Receivables reconciliation issues are flagged to accountants immediately via intra-day bank statement reporting.
This leads corporates to require more and more flexibility from their technology to act outside the industrial corporate-to-bank flow of information ─ typically through a digital and user-friendly online banking or mobile-based ecosystem. A corporate treasurer or signatory will be much more open to an iPad-based executive approval prompt from the Bank (with contextual information) than to going to the office and inserting a physical security token into a desktop computer or one of the treasury workstations. This is the beginning of the "omnichannel" age for wholesale banking.

What's Holding Back the Banks?

SMB and corporate channels were designed for their original purpose: online and mobile banking for the former group, host-to-host for the latter. These usually sit in silos separated from front- to back-office, with huge Program Management and IT lifecycle structures around them. Digital requirements and converging client markets mean only one thing ─ these banking platforms and the traditional approach are becoming obsolete. So are their mind-boggling costs for keeping the lights on or applying simple changes.

Digital Transformation ─ The Way Forward

Digital channels have a positive side effect on the bank's technology estate: they enable the bank to keep products lean and simple, free of client-specific customization. This is something that banks call their "vanilla flavour". Client customizations are built and maintained within the channel layer. Payment, trade finance, investment banking, and even consumer products remain leaner and easier to maintain throughout their lifecycle, with fewer dependencies and more predictable P&L changes over time. I sincerely believe that banks will soon be competing with a growing number of non-bank financial suppliers (independent trade finance organizations and PSD2 service providers). The Digital Transformation is necessary now to compete and differentiate tomorrow.
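To make the "host-to-host capable" point above concrete, here is a minimal sketch of what an SMB-side SFTP file submission might look like, using the widely used Python paramiko library. The host name, credentials, and file paths are hypothetical placeholders, not any bank's actual endpoint.

```python
# Illustrative sketch only: an SMB accounting package submitting a payment
# batch to its bank over SFTP. Host, credentials, and paths are hypothetical.
import os
import paramiko

BANK_HOST = "sftp.examplebank.com"   # hypothetical bank endpoint
BANK_PORT = 22

def submit_payment_file(local_path: str, remote_path: str) -> None:
    client = paramiko.SSHClient()
    client.load_system_host_keys()                      # trust only known hosts
    client.set_missing_host_key_policy(paramiko.RejectPolicy())
    client.connect(
        BANK_HOST,
        port=BANK_PORT,
        username="acme-treasury",
        key_filename=os.path.expanduser("~/.ssh/id_rsa"),
    )
    try:
        sftp = client.open_sftp()
        sftp.put(local_path, remote_path)               # upload the batch file
        sftp.close()
    finally:
        client.close()

submit_payment_file("payroll_2015_03.xml", "/inbound/payroll_2015_03.xml")
```

The point is less the code than how little of it there is: a few lines of commodity tooling now deliver what used to require a dedicated "legacy" submission channel.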


Infusing the Supply Chain with Analytics

The strategic use of supply chain information is a key driver of competitive advantage in the Digital-First World. Once a company's B2B processes are automated and transactions are flowing, visibility into those transactions can fuel better strategic and tactical decision-making across the entire business network. Insights from analytics allow trading partners to speed their decision-making, rapidly respond to changing customer and market demands, and optimize their business processes.

Our mission at OpenText is to enable our customers to prepare for and thrive in the digital future. Analytic capabilities will play a key role. As I've said in previous posts, analytic technologies represent the next frontier in extracting value from enterprise information. For this reason, we are infusing new analytic capabilities into all our core solutions. We recently added new analytic capabilities to the OpenText Trading Grid to help our customers easily access insights for improving their supply chains' effectiveness. The OpenText Trading Grid is powered by the OpenText Cloud and is the world's leading B2B integration network, processing more than 16 billion transactions per year and integrating 600,000 trading partners for more than 60,000 customers around the globe.

The solution boasts easy-to-use dashboards that depict and summarize data trends and compare them to key performance indicators (KPIs) for the business. They allow companies to evaluate the performance of suppliers or the behavior of customers and use this information to improve processes and relationships. On a more granular level, "track and trace" data highlights exception information about a specific order, shipment, or invoice. By taking prompt corrective action, companies can remedy a situation before process performance degrades or costs accumulate.

There are many, many scenarios in which analytic insights bring incredible benefit to the supply chain. Armed with information about the physical location of products or shipments, companies are able to plan operations with greater efficiency, reduce the number of items lost in transit, and fulfill orders with greater accuracy. They can replenish products as shortages are detected. When it comes to process automation, information about the performance of equipment is used to track degradation, order replacement parts, and schedule service before failure occurs.

Today, the digital supply chain is an information supply chain that coordinates the flow of goods, communications, and commerce internally and externally across an extended ecosystem of business partners. It seamlessly integrates data from supply chain processes and smart equipment, and tracks intelligent products, parts, and shipments tagged with sensors. Within a few short years it will expand to integrate data from the Internet of Things (IoT). Data will flow online from myriad devices, including wearable technologies, 3-D printers, and logistics drones. It is expected that we will see a thirty-fold increase in web-enabled physical devices by the year 2020. All of these devices producing volumes of data will create a network rich with information and insights.

This is the future. A future in which analytics bring incredible value and competitive advantage to the supply chain. And we are only just beginning to envision it. To learn more about OpenText Trading Grid Analytics, read our press release or visit our website.
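As a footnote, here is a toy illustration of the kind of supplier KPI such dashboards surface (emphatically not the Trading Grid Analytics implementation): computing an on-time delivery rate from raw transaction records. The field names and records are invented.

```python
# Toy sketch: on-time delivery rate per supplier from transaction records.
# Record fields are hypothetical, not the Trading Grid schema.
from datetime import date

shipments = [
    {"supplier": "Acme", "promised": date(2016, 3, 1), "delivered": date(2016, 3, 1)},
    {"supplier": "Acme", "promised": date(2016, 3, 5), "delivered": date(2016, 3, 8)},
    {"supplier": "Bolt", "promised": date(2016, 3, 2), "delivered": date(2016, 3, 2)},
]

def on_time_rate(records, supplier):
    relevant = [r for r in records if r["supplier"] == supplier]
    if not relevant:
        return None
    on_time = sum(1 for r in relevant if r["delivered"] <= r["promised"])
    return on_time / len(relevant)

for s in ["Acme", "Bolt"]:
    print(s, on_time_rate(shipments, s))  # compare against a KPI target, e.g. 95%
```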


Over 3M faxes a month can’t be wrong: Hybrid faxing is here!

At OpenText, we pride ourselves on our position as the #1 electronic fax provider for the enterprise. Between our best-selling on-premises solution (RightFax) and our outstanding fax service (Fax2Mail), OpenText is the undisputed leader in enterprise fax solutions. OpenText is ready to add another trophy to the case: Hybrid Faxing. What is hybrid faxing, you ask? Hybrid faxing combines an on-premises fax server with a cloud-based fax service to send and receive faxes. OpenText provides the most powerful hybrid fax solution for customers by combining RightFax with RightFax Connect, a solution that is blazing a path across the globe. With no capacity constraints, RightFax and RightFax Connect combine to provide instant, flexible capacity to handle any volume of faxing without congestion, busy signals or delays. And since RightFax Connect utilizes the OpenText Cloud to send and receive faxes, there is no need to implement, manage, troubleshoot or worry about your telephone connections to RightFax. Easy as pie. Hybrid faxing is here, RightFax with RightFax Connect is growing, and OpenText is proud to add another market-leading trophy to its case.


Digital Disruption: The Forces of Data Driven Smart Apps [Part 1]

Editor's note: Shaku Atre is the founder and managing partner of the Atre Group, Inc. of New York City, NY and Santa Cruz, California. Atre Group is a business intelligence, Big Data and data warehousing corporation, and Atre herself is an internationally renowned expert who lectures worldwide on everything databases. She is the author of six highly regarded books on database technology. Atre worked for IBM for 14 years in Germany and the United States, and was a partner at PricewaterhouseCoopers for a number of years. As an entrepreneur she founded a number of very successful IT companies in New York and California. Atre has written a thorough and compelling treatise on the disruptive power of mobile apps, and supported her analysis and conclusions with templates and case studies. We are pleased to present her powerful ideas in four blog installments this week and next.

—–

Digital Disruption: The Forces of Data Driven Smart Apps [Part 1 of 4]

Copyright by Atre Group, Inc.

What are they? They connect us with friends, find us restaurants, allow us to deposit checks at home without having to visit ATMs, let us pay our bills online, provide the news, help us avoid traffic congestion, and even let us find with one click the side effects of a drug before purchasing it. Millions of people are now dependent on these apps. Not only are they dependent on them – they are addicted to them. These apps are the little sisters and brothers of the big, multifunction applications that IT developers created and run as back-office jobs. These little apps perform only a few focused functions to give smartphone users quick results. Hundreds of businesses release their apps daily to reach mobile customers with their content digitally. Half of all app downloads are games and entertainment apps, but a close second is the tens of billions of downloads of business apps. Businesses want to make their customers more productive by enabling them to buy more, and they hope to make them loyal. They want to make their customers practically dependent on, and secretly addicted to, these ubiquitous apps.

The industries that have been running the show for decades are affected drastically by this disruptive force du jour. Let us consider a big industry in major cities like New York, Boston, and Chicago, just to name a few – taxis. "In major cities throughout the United States taxi medallion prices are tumbling as taxis face competition from car-service apps like Uber and Lyft — in NYC the Medallion price fell by 17%, in Chicago by 17%, in Boston by 20% last year," according to The New York Times, in a Nov 28, 2014 article by Josh Barro. And here we are not talking about a medallion price of only a few thousand dollars, but one close to a million dollars in its heyday. We need to see how organizations such as Uber and Lyft fare with their implementations in the next few months and years. These organizations are facing some problems, such as background checks of the drivers, in a number of US states as well as in some European countries.

Apps, Big Data, Embedded Analytics, and Smart Phones for Smart Users

Apps run on smartphones by digging into Big Data, carrying embedded analytics from transactional systems into smart apps with dashboards for easier visualization. Today's users run the apps, and the apps help users reach quick and smart decisions.
When conventional ways of doing business are thrown into turmoil by an agent, it is called disruption. Today's agents of disruption are mobile technology and the huge amounts of data being collected. Our business culture is being changed at a dizzying speed by mobile technology and Big Data. Our communication, our physical processes, our moving of goods across all industries are being challenged compared with the way they used to run in the past. What should we call it? We are seeing the face of Digital Disruption.

Do you want to be the roadkill of tomorrow, or even of today? There are a lot of "roadkill" organizations that were once leaders of their industries. They didn't pay attention to the disruptions. Their vision of the future was fogged by the success of the past. Does anyone remember the goliaths, once-upon-a-time innovators enjoying their inventiveness, such as Ashton-Tate, BlackBerry, Blockbuster, Borders, Borland, Compaq, Control Data, Data General, Honeywell, Kodak, Lucent, Lotus, Nokia, Novell, Polaroid, Tower Records, and many more? These giants of their industries suffered because they failed to come up with an appropriate response to disruptive technologies and got toppled by the nimble newcomers, masters of the disruption. If you don't want your organization to suffer the fate of these once-stellar performers, you have to master the disruptive forces. This cycle of disruption is going to repeat every few years whether we like it or not. The only difference will be that the time period of the cycle is compressed. There are enough Davids lurking in the wings today who could topple your organization, unless your vision becomes clear and you embrace the current digital disruption: change your business processes, use today's digital technology, and be an agent of change.

Digital disruption and its four forces – Apps, The Users, Big Data, and Embedded Analytics

1. Power of today's Apps: The three major components of an app are:

1. The user interface – also called the user experience or front end. The user interface is going to make or break your app.
2. The backend with data – a pretty face without data backing it up is going to be a death trap.
3. Everything in between, which is mainly the logic – if that doesn't support the pretty face and the data, then you are doomed too.

How to start with apps? Once the business decides to develop apps, there are a few questions, among others, that need to be answered:

- How to plan, design and develop an app?
- Should the app be developed in-house? Should an outside developer that specializes in app development be hired? Or should you "build your own app" using software where coding skills are not required? Be cautious with this claim – coding is possibly not required, but logical thinking definitely is.
- Which data to use? Where to get it from, where and how to store it, and how to express the results visually?
- How can you register responses, and how should those be acted upon? And how quickly?
- How to present the results digitally?
- Where and how to distribute the apps?
- Which mobile operating system platform or platforms to use: iOS (Apple's Xcode), Android (SDK), Windows? How to develop and deliver apps on multiple platforms?
- And finally, and very importantly, how to monetize the app?

Business and App – Synergy for Success

Any business exists because of its customers.
In your own organization, your employees are walking around with a mobile phone in their pockets, in many cases nowadays even two. And so are your customers. We are referring to the B2B2C (Business to Business to Consumer) interfaces here. Your customers could be businesses that have customers of their own. And these customers may be using desktop computers, laptops, sensors, and all types of mobile equipment to collect data about their consumers. And these consumers could be communicating with your customers, which are businesses. If some aspect of running your business needs to be improved, it would be beneficial to you if your customers, which are businesses, passed that information on to you. We are referring to C2B2B interfaces here.

Mobile technology is ubiquitous because:

- It is affordable
- It is easily available
- It is easy to use
- And on top of all of this, it makes everyone much more productive than not having it

Your customers are helped when you develop data driven Smart Apps for your business, and possibly when these businesses do the same for their own consumers – once again with the B2B2C interfaces. More importantly, input from consumers is very important too – representing the C2B2B interfaces. The apps are one of the biggest drivers of the use and creation of terabytes of data, referred to as "Big Data." Big Data is being collected about the customers who use all of the technology at hand. But one can't live by data alone. One needs to plan, design, and develop Smart Apps so that the customers can use the valuable collected data to become more successful. The success of your customers, by using the invaluable, humongous amounts of data collected, will be beneficial to many other parties as well. Your customers will be the primary beneficiaries, alongside others who will be secondary beneficiaries. Bear in mind that your success is very much dependent on your customers' success. In statistical terms, your success is positively correlated with theirs.

Three important criteria of any successful mobile app are:

1. Simplicity: the ability to eliminate steps or make a current process easier to perform.
2. Interaction: the ability to encourage greater user involvement that will increase productivity and loyalty.
3. Encouraging your customer's input: the ability to leverage user insight to improve the functionality of the app.

How are you going to write the best apps for your customers? You have to play the role of your customer: when was the last time you stood in your customers' shoes and saw what it is like to do business with your company and use your product or service? Today's empowered consumers expect personalized, relevant interactions, and so, for that reason and more, you must understand the customer's journey and inspire your organization and others on how to considerably improve the customer experience.

The smart apps one develops for customers must provide:

- Visualization: easily understandable icons.
- Simplicity: the ability to eliminate steps or make a current process easier to perform.
- Encouragement of your customer's input, with more interaction: not only "push me" but also "pull you".
- Interface: the ability to encourage greater user involvement that will increase loyalty.
- User insight: the ability to leverage user insight to improve the functionality of the app.
- Easy deployment of the ubiquitous mobile technology.
- "Cleverness" used in novel ways to implement data driven decision-making at mobile speed.
It is predicted that mobile app development projects targeting smartphones and tablets will outnumber PC projects by 5:1. We also have to consider the various interfaces that need to be covered in the user interface: not only B2B (Business to Business), B2C (Business to Consumer), and B2B2C (Business to Business to Consumer), but also C2B (Consumer to Business) with interaction and insights, C2C (Consumer to Consumer), where one consumer informs other consumers via reviews, texting, and phone calls, and C2B2B (Consumer to Business to Business) with interaction and insights.

2. The Users: The demography of the user base has changed. The number of users has increased enormously. Almost everyone above the age of 18 is using a smartphone in the Western world. There is staggering growth of smartphone users in emerging markets too. Among the enormous number of smartphone users, the percentage who are technology savvy, who know the system's internals, is going down. Technology savvy is a relative term. For me, a technology-savvy person can open hardware and fix it, or can build a working unit of hardware by assembling various parts. As far as software is concerned, it is someone who can write software and develop applications. The percentage of these hardware and software technology people is going down vis-à-vis the total number of smartphone users. The percentage of business savvy people is increasing. And the user base of people under the age of 25 is going up much faster. One of the big reasons is easy access to social media and to video games.

The emerging markets have staggering growth of digital technology. Global smartphone shipments reached 295 million in the 2nd quarter of 2014. Emerging market countries have become a world force and a big provider of Big Data and smartphone users. Please bear in mind that 80% of the world's humanity lives in these emerging markets of Brazil, China, India, Korea, Mexico, and Russia.

—–

Shaku Atre's next post will cover the other two forces of digital disruption: Big Data and Embedded Analytics. Look for it on Thursday, March 12. Subscribe to be informed when the post appears.


Musings on Prospects for Real-Time Payments in the US

As I sit in my office in Massachusetts after shoveling out from the 8th snowstorm in six weeks (but who is counting?), I have begun to wonder whether now is the time to simply declare that all banking going forward shall be digital. Just as our Commonwealth's new Governor banned all street traffic except for emergency vehicles for several days in late January, perhaps the Federal Reserve Bank should declare a permanent ban on all financial transactions involving paper except for emergencies. No paper checks, no paper deposits, no paper loan documents, no signature cards, etc. OK, maybe this is a little bit extreme, but being stuck at home for days at a time over a six-week period can result in some out-of-the-box thinking.

As we enter the middle of the 2nd decade of the 21st century, what is clear is that in many ways we are still living with 20th century infrastructure. Whether it is roads and bridges, public transportation, or archaic banking practices and systems, we need to stop taking a band-aid approach to our problems and recognize that the world around us has changed. A perfect example of this is the slow but hopeful progress being made toward the development of a modern retail payment system in the United States. With the Federal Reserve System and at least part of the banking industry, in the form of The Clearing House, announcing support for a real-time electronic payment system, we have an opportunity to finally break the bonds of the past and join the many other countries that have already made this investment. It's time to stop talking and start acting.

What is holding us back?

- Some banks are stuck on how to make the business case: they are in denial about the future of payments given the investment by non-bank providers and evolving consumer preferences.
- The cost of replacing batch-oriented systems that were developed decades ago and are still used by almost all banks to process payments is a hurdle.
- Some industry participants are hung up on how to support the small percentage of the population that doesn't have access to, or interest in, real-time payments, a percentage that shrinks every day.
- Trade associations and payment system operators have a perceived vested interest in the status quo.
- Merchants and large corporations are already overwhelmed with the multitude of new payment systems being introduced by banks and technology companies.

Consumer and business adoption of technology is radically changing expectations about the fundamental role of banks in society. Payment processing is at the heart of the traditional role of banks: being a trusted intermediary of financial value. But payment processing in 2015 and in the future is also at the heart of the new role of banks: being a secure and convenient intermediary of financial value and data. If banks don't hurry to embrace this new role, they will eventually be relegated to the role of payments provider of last resort.


Step Aside Cloud, Mobile and Big Data, IoT has just Entered the Room

Mark Morley

This article provides a review of the ARC Advisory Group Forum in Orlando and expands on the ever-increasing importance of analytics in relation to the Internet of Things.

The room I am referring to here is the office of the CIO. Or should that be the CTO, or the CDO (Chief Digital Officer)? You see, even as technology evolves, the corporate role to manage digital transformation is evolving too. Since 2011, when Cloud, Mobile and Big Data technologies started to go mainstream, individual strategies to support each of these technologies have been evolving, and some would argue that in some cases they remain separate strategies today. However, the introduction of the Internet of Things (IoT) is changing the strategic agenda very quickly. For some reason IoT, as a "collective & strategic" term, has caught the interest of the enterprise and the consumer alike. IoT allows companies to effectively define one strategy that potentially embraces elements of cloud, mobile and Big Data. I would argue that in terms of IoT, cloud is nearly a commodity term that has evolved into offering connectivity any time, any place or anywhere. Mobile has evolved from simply porting enterprise applications to HTML5 to wearable technology such as Microsoft HoloLens. Finally, Big Data is broadening its appeal by focusing more on the analytics of information rather than just archiving huge volumes of data. In short, IoT has brought a stronger sense of purpose to cloud, mobile and Big Data.

Two weeks ago I was fortunate to attend the ARC Advisory Group Forum in Orlando, a great conference if you have an interest in the Industrial Internet of Things and the direction this is taking. The terminology being used here is interesting, as it is just another strand of the IoT; I will expand more on this naming convention a bit later in this post. There were over 700 attendees at the conference, and a lot of interest, as you would expect, from industrial manufacturers such as GE, ABB, ThyssenKrupp and Schneider Electric. These companies weren't just attending as delegates; they were actually showcasing their own IoT-related technologies in the expo hall. In fact, it was quite interesting to hear how many industrial companies were establishing state-of-the-art software divisions for developing their own IoT applications.

For me, the company that made the biggest impact at the conference was GE and their Intelligent Platforms division. GEIP focused heavily on industrial analytics and in particular how it could help companies improve the maintenance of equipment, either in the field or in a factory, by using advanced analytics techniques to support predictive maintenance routines. So how does IoT support predictive maintenance scenarios? It is really about applying IoT technologies such as sensors and analytics to industrial equipment, processing the information coming from the sensors in real time to identify trends in the data, and predicting when a component such as a water pump is likely to fail. If you can predict when a component is likely to fail, you can replace the faulty component as part of a predictive maintenance routine, and the piece of equipment is less likely to experience any unexpected downtime. In GE's case, they have many years of experience and knowledge of how their equipment performs in the field, so they can utilise this historical data as well to determine the potential timeline of component failure (a simple illustration follows below).
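To make the idea concrete, here is a minimal sketch, not GE's actual method, just an illustration with invented numbers: fit a linear trend to recent sensor readings and extrapolate when the reading will cross a known failure threshold.

```python
# Minimal trend-based failure prediction sketch. The threshold, sampling
# interval, and vibration readings below are all invented for illustration.
import numpy as np

FAILURE_THRESHOLD = 12.0   # hypothetical vibration level (mm/s) at which the pump fails

def hours_until_failure(readings, interval_hours=1.0):
    """Fit a linear trend to hourly readings and extrapolate to the threshold."""
    t = np.arange(len(readings)) * interval_hours
    slope, intercept = np.polyfit(t, readings, 1)   # least-squares line
    if slope <= 0:                                  # no upward drift: nothing predicted
        return None
    crossing_time = (FAILURE_THRESHOLD - intercept) / slope
    return crossing_time - t[-1]                    # hours remaining from last reading

recent = [6.1, 6.4, 6.3, 6.9, 7.2, 7.6, 7.9, 8.4]   # simulated upward drift
eta = hours_until_failure(recent)
if eta is not None and eta < 72:
    print(f"Schedule maintenance: predicted failure in ~{eta:.0f} hours")
```

A production system would of course use far richer models and the kind of historical fleet data GE describes; the sketch only shows the shape of the calculation.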
In fact, GE went to great lengths to discuss the future of the "Brilliant Factory". The IoT has brought a sense of intelligence or awareness to many pieces of industrial equipment, and it was interesting learning from these companies how they would leverage the IoT moving forwards. There were two common themes to the presentations and to what the exhibitors were showcasing in the expo hall.

Firstly, cyber-security. Over the past few months there has been no end of hacking-related stories in the press, and industrial companies are working very hard to ensure that connected equipment is not "hackable". The last thing you want is a rogue country hacking into your network, logging into a machine on the shopfloor and stealing tool path cutting information for your next great product that is likely to take the world by storm. So device or equipment security is really a key focus area for industrial companies in 2015. Interestingly, it wasn't just cyber-security of connected devices that was keeping CIOs awake at night; a new threat is emerging on the horizon. What if a complete plant full of connected devices could be brought down by a simple Electro-Magnetic Pulse (EMP)? This was another scenario discussed in one of the sessions at the conference. So encryption and shielding of data is a key focus area for many research establishments at the moment.

The second key theme at the conference was analytics. As we know, Big Data has been around for a few years now, but even though companies were good at storing TBs of data on mass storage devices, they never really got the true value from the data by mining through it and looking for trends or pieces of information that could either transform the performance of a piece of equipment or improve the efficiency of a production process. By itself, Big Data is virtually useless unless something is done with it that results in actionable intelligence and insight that delivers value to the organisation. An interesting quote from Oracle: 93% of executives believe that organisations are losing revenue as a result of not being able to fully leverage the information they have. So deriving value from information coming from sensors attached to connected devices is going to become a key growth sector moving forwards. It is certainly an area that the CIO/CTO/CDO is extremely interested in, as it can directly impact the bottom line and ultimately bring increased value to shareholders.

I guess it is no surprise, then, that the world's largest provider of Enterprise Information Management solutions, OpenText, should acquire Actuate, a leading provider of analytics-based solutions. Last week the Information Exchange business unit of OpenText, which has a strong focus on B2B integration and supply chain, launched Trading Grid Analytics, a value-add service to provide improved insights into transaction-based information flowing across our cloud-based Trading Grid infrastructure. With 16 billion transactions flowing across our business network each year, there is a huge opportunity to mine this information and derive new value from these transactions, beyond the EDI-related information being transmitted between companies on our network. Can you imagine the benefits that global governments could realise if they could predict a country's GDP based on the volume of order and production related B2B transactions flowing across our network?
Actuate is not integrated with Trading Grid just yet, but it will eventually become a core piece of technology to analyse information flowing across not just Trading Grid but our other EIM solutions. It is certainly an exciting time if you are a customer using our EIM solutions! Actuate has some great embedded analytics capabilities that will potentially help improve the overall operational efficiency of connected industrial equipment. In a previous blog I mentioned B2B transactions being raised "on device"; with semiconductor manufacturers such as Intel spending millions of dollars developing low-power chips to place on connected devices, the device will become even more "intelligent" and almost autonomous in nature. I think we will see a lot more strategic partnerships announced between the semiconductor manufacturers and industrial equipment manufacturers such as GE and ABB.

Naturally, cloud, mobile and Big Data play a big part in the overall success of an IoT-related strategy. I certainly think we will see the emergence of more FOG-based processing environments. "FOG", I hear you ask? Yes, another term, one I heard at the Cisco IoT World Forum two years ago. Basically, a connected device is able to perform some form of processing or analytics task in a FOG environment, which is much closer to the connected device than a traditional cloud platform. Think of FOG as being halfway between the connected device and the cloud, i.e. a lot of pre-processing can take place on or near the connected device before the information is sent to a central cloud platform.

So, coming back to the conference, there was actually another area that was partially discussed: IoT standards. I guess it is to be expected that, as this is a new technology area, it will take time to develop new standards for how devices are connected to each other and standard ways for transporting, processing and securing the information flows. But there is another area of IoT-related standards that is bugging me at the moment: the many derivatives of the term IoT that are emerging. IoT was certainly the first term, defined by Kevin Ashton, closely followed by GE introducing the Industrial Internet of Things, Cisco introducing the Internet of Everything, and then the German manufacturers introducing Industry 4.0. I appreciate that it has been the manufacturing industry that has driven a lot of IoT development so far, but what about other industries such as retail, energy, healthcare and other industry sub-sectors? Admittedly, IoT is a very generic term, but already it is being associated more with consumer-related technologies such as wearable devices and connected home devices such as NEST. So in addition to defining standards for IoT cyber-security, connectivity and data flows, how about introducing a standard naming convention that could support each and every industry? As there isn't a suitable set of naming conventions, let me start the ball rolling by defining a common naming convention!

In closing, I would argue, based on the presentations I saw at the ARC conference, that the industrial manufacturing sector is the most advanced in terms of IoT adoption. Can you imagine what sort of world we will live in when all of these industries embrace IoT? One word: exciting!

Mark Morley currently leads industry marketing for the manufacturing sector at OpenText.
In this role, Mark focuses on the automotive, high-tech and industrial sectors. Mark also defines the go-to-market strategy and thought leadership for applying B2B e-commerce and integration solutions within these sectors.


Forget the Oscars, Tata Motors Won a Bigger Award in Mumbai

Last week I had the pleasure of attending our Innovation Tour event in Mumbai, the first leg of a multi-city tour of the world to showcase our Enterprise Information Management solutions and how they help companies move to the digital-first world! The event was very well attended, and it was good to see keen interest being shown in our new offerings, such as Actuate and Core, and in our other, more mature EIM solutions. Enterprise World has traditionally been our key event of the year, but the Innovation Tour provides a way for OpenText to get closer to our customers around the world; Mumbai was no exception, with keen interest shown in our expo hall.

I have been to India before, two years ago in fact, to meet with an automotive industry association that looks after the ICT needs of the entire Indian automotive industry. Back then, the discussion was focused around B2B integration. However, last week's event in Mumbai showcased all solutions from the OpenText portfolio. One of the interesting solution areas being showcased by one of our customers was Business Process Management (BPM), and it is only fitting that one of our India-based customers won an award for their deployment of BPM. Why fitting? Well, India has long been the global hub for business process outsourcing, so I guess you could say there is a natural interest in improving the management of business processes in India. OpenText has a strong presence in the Indian market.

OpenText presented a number of awards during the event, and Tata Motors was the worthy winner of the award for the best deployment of BPM. Incidentally, Tata Motors also won the global Heroes Award at last year's Enterprise World event for their deployment of our Cordys BPM solution. So who are Tata Motors, I hear you ask? Well, they are the largest vehicle manufacturer in India, with consolidated revenues of $38.9 billion. Tata Motors is part of a large group of companies which includes Tata Steel, Jaguar Land Rover in the UK, Tata Technologies and many other smaller companies that serve the domestic market in India. Tata Group is fast becoming a leading OpenText customer, showcasing many different EIM solutions. For example, Jaguar Land Rover uses OpenText Managed Services to manage the B2B communications with over 1,200 suppliers following divestiture from Ford in 2009. Tata Steel in Europe also uses our Managed Services platform to help consolidate eleven separate EDI platforms and three web portals onto a single, common platform. So, simplification and consolidation of IT and B2B infrastructures is a common theme across Tata Group, and Tata Motors is no different with their implementation of OpenText BPM.

Tata Motors has struggled over the years to exchange information electronically with over 750 vehicle dealers across India. Varying IT skills and multiple business processes, combined with having to use a notoriously difficult utilities and communications infrastructure across the country, were really starting to impact Tata Motors' business. In addition, their IT infrastructure had to support over 35,000 users, and there were over 90 different types of business application in use across 1,200 departments of the company. So ensuring that accurate, timely information could be exchanged across both internal and external users was proving to be a huge problem for Tata Motors. Step forward, OpenText BPM!
Tata Motors decided to deploy our Cordys BPM solution as a SOA-based backend platform to connect all their business applications and, more importantly, provide a common platform to help exchange information electronically across their extensive dealer network. Even though they had deployed Siebel CRM across their dealer network, Tata Motors faced a constant challenge of having to process a high volume of manual, paper-based information; quite often this information would be inaccurate due to mis-keying. A simple mistake, but when scaled up across 750 dealers, it can have a serious impact on the bottom line and, more importantly, on customer satisfaction levels with respect to new vehicle deliveries or spare parts related orders.

Tata Motors had a number of goals for this particular project:

- Implement a Service-Oriented Architecture – the primary objective was to set up a SOA environment for leveraging existing services and hence avoid re-inventing the wheel. They also wanted to use this platform to streamline the current integrations between multiple business systems.
- Process Automation / Business Process Management – they had a lot of manual, semi-automated or completely automated processes. Manual or semi-automated processes were inefficient and in some cases ineffective as well. Some of their automated processes were actually disconnected from actual business scenarios. So the goal for implementing BPM was to bring these processes closer to the "business design", thus improving efficiency and process adherence.
- Uniform Web Services Framework – Tata Motors' goal was to establish a single source of web services that could convert existing functionality of underlying service sources into interoperable web services.

So, what were the primary reasons for Tata Motors choosing OpenText BPM? It was a SOA enabler; it offered strong business process automation capabilities; it was a comprehensive product for application development; it minimized application development time; and it improved cost effectiveness. Their BPM implementation covered two main areas:

- Enterprise Applications Integration – mainly deals with the inward-facing functionality of employee and manufacturing related process applications. They had many applications, but those applications had a common fault: they did not follow SOA principles. Web services had to be developed inside every application, which was very inefficient from a time and resources point of view. In addition, if an application had to connect to SAP, it was an independent, unmanaged and insecure connection.
- Customer Relationship & Dealer Management Systems Integration – Tata Motors is the biggest player in the commercial vehicles sector in India and one of the biggest in terms of passenger car sales, with over 750 dealers scattered across India. The dealerships are managed using a Siebel CRM-DMS implementation, but with many changes being rolled out across the system, Tata Motors needed a supporting platform to effectively manage this process. Cordys became the primary environment for developing CRM-DMS applications.

So, in summary, Cordys BPM has been integrated with SAP, Siebel CRM-DMS, Email/Exchange Server, Active Directory, Oracle Identity Manager, an SMS gateway, and mobile applications across Android and iOS. The Cordys implementation also resulted in a number of business benefits, including improved process efficiency, stronger process adherence, a SOA-based platform, and significant cost and time savings. The project has already achieved its ROI!
Moving forwards, OpenText BPM will act as a uniform, centrally managed and secure web services base for all applications used across the Tata Motors landscape, irrespective of the technology in which each is developed. The platform will also provide an evolving architecture to mobilise existing applications, and they plan to integrate with an in-house developed document management system. Finally, the go-forward plan is to move their Cordys implementation to the cloud for improved management of their infrastructure.

I have visited many car manufacturers over the years, and one company headquartered in the Far East had over 300 dealers in Europe, each of which had been allowed to implement its own CRM and DMS environment to manage its dealer business processes. Prior to the acquisition of GXS (my former company) by OpenText, I had to inform them that GXS didn't have a suitable integration platform to help seamlessly connect all 300 dealers to a single platform. With OpenText BPM we can clearly achieve such an integration project now, and Tata Motors is certainly a shining light in terms of what is achievable from an extended enterprise application integration point of view. Congratulations Tata Motors!

For more information on OpenText BPM solutions, please CLICK HERE. Finally, I just want to say many thanks to my OpenText colleagues in India; it was a very successful event and a team effort to make it happen. For more information on our Innovation Tour schedule, please CLICK HERE.


Could the Smart Trash Can Take Waste Out of the Supply Chain?

In my last post I introduced a vision for the Smart Trash Can that would automatically identify the items you are throwing away. What would you do with the data collected? The waste management company may not have much use for the data, but manufacturers and retailers who are trying to predict what consumers are going to buy next would find it very valuable.

How would the smart trash can work? There are a couple of different options. Version one (circa 2017) would probably rely on the use of RFID tags and readers. If manufacturers put RFID tags on each of the items you purchased, then the smart trash can would be able to identify them automatically. Version two (circa 2018) might add a camera to the lid. As items are being disposed of, the camera would automatically recognize each item using the "visual search" technology found in Google Goggles. Perhaps the smart trash can might have multiple cameras at different depths that could see through trash bag liners to identify items at rest. Version three (circa 2020) might be more advanced, with capabilities to identify items based upon smell. Perhaps the trash can would be fitted with sensors that can detect odors and identify items based upon their chemical composition.

You might be wondering what data retailers and manufacturers use to forecast demand today, and whether smart trash cans would provide an improvement. Today, the primary data used for forecasting demand is the information about what shoppers are buying at individual stores, or what is called "Point-of-Sale" data. Every night, retailers and manufacturers run reports to understand how many of each item were sold in each store. They then try to guesstimate how much inventory they have on hand and whether or not they are going to run out of stock in the coming days (weeks or months). If they are running low on inventory, then they will need to issue a replenishment order.

Would trash can data provide better insights than Point-of-Sale data? It raises a good question: what provides better insight into future sales, what people are buying or what they are throwing away? Let's first think about items that are regularly purchased – batteries, diapers, detergent, shampoo, soda, milk, bread and salty snacks. I would argue that monitoring consumption (via trash cans) of these repeat purchases is a better indicator of near-term demand. If someone throws out a milk container, they are very likely going to buy a new one in the next 24 hours. In many cases, the disposal of an item after it is consumed is the event that triggers the need to buy another one. But what about items which are not consistent, repeat purchases? Examples might include toys, electronics, clothing, shoes, etc. For these inconsistent purchases you might question the validity of the correlation between waste patterns and future purchases. Just because you throw something away doesn't mean that you are going to purchase it again – immediately or ever. The value of using the trash data is clear for groceries and regular purchases. Will Nestle, Procter & Gamble and Tesco begin giving away free kitchen trash cans to consumers just to collect data and be optimally positioned for replenishment orders?
Or, even better, what if your smart trash can was linked to your online grocery account? Items detected in your trash can (or recycling bin) could be automatically identified and then transmitted to the garbage truck upon pickup at your house. A replenishment algorithm could review your list of "always in stock" items to determine whether the item should be replaced immediately (a toy sketch of such a check follows below). If yes, then a home delivery provider might visit a few hours later to drop off new supplies on your doorstep. Amazon Fresh could be extended to include Amazon Trash. Walmart might buy a waste management company. The Smart Trash Can could create a myriad of new opportunities in the supply chain.
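A hypothetical sketch of that replenishment check, with all names and structures invented, might be as simple as:

```python
# Toy sketch: when the smart trash can reports a disposed item, reorder it
# if it is on the household's "always in stock" list. Everything here is
# hypothetical; item identifiers and the ordering mechanism are invented.
always_in_stock = {"milk-1l", "diapers-size3", "aa-batteries"}

def on_item_disposed(item_id: str, order_queue: list) -> None:
    """Called when the can (via RFID or camera) identifies a discarded item."""
    if item_id in always_in_stock:
        order_queue.append(item_id)        # trigger same-day home delivery

orders = []
on_item_disposed("milk-1l", orders)        # a milk carton hits the trash...
on_item_disposed("magazine-0415", orders)  # ...but one-off items are ignored
print(orders)                              # ['milk-1l']
```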


4 Questions: Dieter Meuser Discusses Analytics in Manufacturing

Dieter Meuser and two colleagues founded iTAC Software AG in 1998 to commercialize a manufacturing execution system (MES) developed for Robert Bosch GmbH. The timing was ideal, as manufacturers sought ways to leverage the nascent Internet to automatically monitor and manage their plants and thereby improve quality and efficiency to bolster the bottom line. Today, iTAC (Internet Technologies & Consulting) is one of the leading MES companies serving discrete manufacturers, and has customers throughout Europe, Asia and the Americas. iTAC.MES.Suite is a cloud-based, Java EE-powered application that enables IP-based monitoring and management of every aspect of a manufacturing plant. OpenText Analytics provides the business intelligence (BI) and analytics capabilities embedded in iTAC.MES.Suite. You can see the software in action in booth 3437 at the IPC APEX EXPO, held February 22-26 in San Diego, California. Learn more about iTAC's participation in IPC APEX EXPO here, and learn more about the company at the end of this post.

Meuser, iTAC's CTO, has extensive experience working with manufacturers worldwide. He's also an expert on the German government's Industry 4.0 initiative to develop and support "Smart Factories" – that is, manufacturing plants that leverage embedded systems and the Internet of Things (IoT) to drive efficiency and improve quality. We asked Meuser for his thoughts on these topics.

OpenText: iTAC has more than two dozen enterprise customers with plants in 20 countries. What do those customers say are their pain points, particularly with regard to data?

Meuser: The biggest single pain point is this: companies have lots of data, but often they are unsure how to analyze it. Let me elaborate: many types of data are recorded to fulfill manufacturers' tracking and tracing requirements. (Called "traceability standards," these include VW 80131, VW 80160, MBN 10447, GS 95017 and others.) Data collected via sensor networks, such as plant temperature or humidity, are part of these standards. The objective of collecting this data is to continuously improve manufacturing processes through correlation analysis (or Big Data analysis), accomplished by running the data through intelligent algorithms. But because manufacturers frequently aren't sure which criteria they should use to analyze the data, analysis often does not happen to the extent that manufacturers want. As a result, data is collected and stored for possible later analysis. This can lead to a growing mountain of unanalyzed data and very little continuous improvement of processes. But it also illustrates why introducing data management right at the beginning of a tracking and tracing project is so important. Data management, supported by analytics, enables process optimization that otherwise would fall by the wayside.

OpenText: How do manufacturers use and analyze data – sensor data in particular – to improve their processes?

Meuser: Within manufacturing plants, the most common analysis is called Overall Equipment Effectiveness (OEE), based on integrated production data collection and machine data collection (PDC/MDC). This is done within the plant's manufacturing execution system (MES). PDC/MDC can happen automatically if the plant's systems are integrated, or manually via rich clients. The captured data can be evaluated in real time and analyzed via freely selectable time intervals.
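(Editor's aside: OEE is conventionally the product of availability, performance and quality. A minimal sketch with invented shift figures, not iTAC's implementation, looks like this:

```python
# OEE = Availability x Performance x Quality (the standard definition).
# All shift figures below are invented for illustration.
def oee(planned_min, downtime_min, ideal_cycle_min, total_count, good_count):
    run_time = planned_min - downtime_min
    availability = run_time / planned_min                    # share of planned time actually run
    performance = ideal_cycle_min * total_count / run_time   # actual vs. ideal output rate
    quality = good_count / total_count                       # share of good parts
    return availability * performance * quality

# One 8-hour shift: 45 min downtime, 1-minute ideal cycle, 400 parts, 380 good
print(f"OEE = {oee(480, 45, 1.0, 400, 380):.1%}")  # -> OEE = 79.2%
```
)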
Common analyses include comparing planned changeover and turnaround times with actual values; comparing actual production (including scrap) with forecasts; and examining unexpected equipment breakdowns. Key Performance Indicators (KPIs) in these analyses feed into OEE, productivity and utilization. Reducing non-conformance costs is another important business case for data analysis in both IoT and Industry 4.0. The availability of structured and unstructured sensor data related to product failures (and the costs associated with them) enables new opportunities to determine non-conformance. There is enormous potential in systematically analyzing causes of production failure. Failure cause catalogues (which many manufacturers have collected for decades) can be examined with the help of a modern data mining tool. Analyzing this data on the basis of quality, product and process data helps to reduce failure costs in a Smart Factory.

OpenText: What is the role of analytics and data visualization in IoT and Industry 4.0?

Meuser: A major objective of data analyses and visualizations in IoT and Industry 4.0 is automatic failure cause analysis. This is accomplished by measuring and testing product errors along with data about manufacturing machines, equipment and processes, then identifying inefficient processes in order to establish solutions. These solutions must be checked by process engineers who have years of experience. Humans and machines go hand in hand when we optimize product quality in an Industry 4.0 factory.

OpenText: What are the benefits of a Smart Factory?

Meuser: A Smart Factory consists of self-learning machines that can identify the causes of failure under specific conditions, determine appropriate measures to address a failure, and send messages to inform operators of problems. This is sometimes called a cyber-physical system (CPS). Combined with appropriate software models, it enables autonomous manufacturing machines (within certain limits) and supports the overall objective of optimizing processes and avoiding failures before they happen. The Smart Factory is enabled by modern data analysis techniques. It relies on data about products, processes, quality and environment (e.g. room temperature or humidity) as appropriate. The ability to interface an ERP system with production equipment creates continuous vertical integration that covers the entire value chain, from receiving to shipping.

Be sure to visit iTAC at the IPC APEX EXPO, and read more about iTAC.MES.Suite in this case study.

More about iTAC

iTAC Software AG is a leading provider of next-generation, platform-independent and cloud-based MES solutions for original equipment manufacturers (OEMs) and suppliers within the discrete manufacturing sector. The company has more than 20 years of experience in internet-based solutions for the discrete manufacturing sector, the Internet of Things (IoT) and the Industrial Internet. To date, iTAC has amassed an enviable portfolio of over 70 global enterprise customers across five primary industries: automotive, electronics/EMS/telecommunications technology, metal fabrication, energy/utilities and medical devices. Customers including Audi, Bosch, Continental, Hella, Johnson Controls, Lear, Schneider Electric, Siemens and Volkswagen rely on the iTAC MES Suite to optimize their production processes. iTAC's product portfolio represents the solutions for the Smart Factory of tomorrow.
Its principal components are the iTAC.MES.Suite, the iTAC.Enterprise.Framework and iTAC.embedded.Systems, including its platform-independent iTAC.ARTES middleware and iTAC.Smart.Devices, the company's new physical interface solutions.


Announcing BIRT iHub F-Type 3.1

Actuate (now OpenText) has released BIRT iHub F-Type 3.1, the latest version of our free report server. Like its commercial cousin BIRT iHub 3.1, the new release of BIRT iHub F-Type now features a REST API and support for third-party custom data visualizations, among other new features. For details about what's new in the product, see the Technical Summary of New Features. These new capabilities will help you embed analytics and data visualizations in a broader range of applications – a big and growing need among enterprises. This post gives details about the new REST API and Custom Visualization support. I'll also describe how we've tuned the data-out limitation in BIRT iHub F-Type to make dashboard creation more realistic. I'll finish by taking you step-by-step through the process of moving your content from BIRT iHub F-Type 3.0 to 3.1.

BIRT iHub REST API

APIs have grown tremendously over the last decade in both quantity and usage, driven in part by enhanced integration standards. REST (Representational State Transfer) APIs are lightweight APIs that allow simple Create, Read, Update and Delete (CRUD) operations using JSON responses over HTTP. The REST standard separates the data from the visualization, resulting in better code modularity and agility. The REST API in BIRT iHub F-Type 3.1 provides a variety of data access, report generation, and management services. Specifically, with the REST API you can:

- Retrieve data from a BIRT document or BIRT Data Object for integration into applications
- Integrate BIRT data visualizations into applications by combining the REST API with the JavaScript API (JSAPI)
- Generate new data visualization documents based on user interaction in your application
- Convert data visualizations into Adobe PDF or Microsoft Excel formats
- Schedule generation of data visualizations for later retrieval
- Upload files to and download files from a BIRT iHub Encyclopedia
- Retrieve BIRT iHub Encyclopedia contents

More information about the REST API can be found in the Technical Summary of New Features.
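By way of illustration, a REST client session might look roughly like the sketch below. The endpoint paths, field names, and port are hypothetical stand-ins (consult the Technical Summary of New Features for the actual API), but the shape of the exchange, authenticate and then request a rendered document over HTTP, is the point:

```python
# Hypothetical sketch of a REST client session against a report server.
# Endpoint paths and JSON field names are illustrative only, not the
# documented BIRT iHub REST API.
import requests

BASE = "http://ihub.example.com:5000/api"   # placeholder server address

# Authenticate (hypothetical endpoint and token field)
resp = requests.post(f"{BASE}/login",
                     json={"username": "admin", "password": "secret"})
token = resp.json()["authToken"]

# Request a PDF rendering of a stored report document (hypothetical path)
pdf = requests.get(f"{BASE}/files/SalesReport.rptdocument",
                   params={"format": "pdf"},
                   headers={"Authorization": token})
with open("SalesReport.pdf", "wb") as f:
    f.write(pdf.content)                    # save the converted output
```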
Adjustments to BIRT iHub F-Type data-out calculation

One feature in the new edition is specific to BIRT iHub F-Type: BIRT Data Object generation no longer counts toward the product’s data output limits. (Learn more about BIRT iHub F-Type output here.) This change enables BIRT iHub F-Type users to take better advantage of BIRT Dashboards, which rely on data objects. We made this change in direct response to BIRT iHub F-Type users, who told us loud and clear that creating BIRT Data Objects quickly used up their data-out capacity. Because we want you to use your data capacity to create dashboards and applications – not just the BIRT Data Objects that they rely on – we made the change.

Migrating Content to BIRT iHub F-Type 3.1

It is important to note that BIRT iHub F-Type 3.0 and 3.1 cannot coexist on the same server, and that you will not be able to use your old version of the software once the new version is installed. With those caveats in mind, follow these steps to migrate content from BIRT iHub F-Type 3.0 to the new version. I’d add – and you probably already know – that you shouldn’t use these instructions to migrate content for commercial-grade BIRT iHub deployments. Because commercial BIRT iHub deployments are typically much larger and involve complicated features like clustering, multi-tenancy, and large volumes of data in and out, please see the documentation for Windows and Linux.

1. Get a new activation code. You can get a new activation code on your own by downloading BIRT iHub F-Type a second time. You must use the same email address that you used for your initial download. A message will appear; choose the “click here” link in it and follow the instructions to get a new activation code.

2. Get your content in order. Once you have a new activation code, you need to gather the content you want to migrate from your existing installation. You must do this before you uninstall the old version. You can migrate files and resources the hard way, by downloading the designs, documents and data objects one by one, but an easier way is to use BIRT Designer Pro to move the content. (If you don’t yet use BIRT Designer Pro, get started here.) Follow these steps:

- Create a new Project in BIRT Designer Pro.
- Create a Server Profile in the Server Explorer tab (located under the layout in the Report editor), and enter information based on your installation. (You probably did this already if you used BIRT Designer Pro to design content for BIRT iHub F-Type 3.0.)
- Explore your existing BIRT iHub F-Type installation, locate the content you want to migrate to your new installation, and drag and drop it into your project in BIRT Designer Pro. Make sure you keep the folder structure intact: files in the Resources directory should be in the root directory of the project, and subdirectories of the Resources folder should be located off the root.

3. Install BIRT iHub F-Type 3.1. You will find detailed instructions here.

4. Publish your content. If you installed BIRT iHub F-Type on a different machine, create a new Server Profile in BIRT Designer Pro. To publish the files to the new server, the easiest way is to right-click the BIRT Project folder in the Navigator and select Publish… Choose the Publish Files tab, and select the content you want to publish. Make sure your files (report designs, report documents and dashboards, typically) are selected in the Publish Files window on top, and your resources (libraries, images, property files and data objects, typically) are selected in the Publish Resources window below. Click Publish to complete the migration.

5. Clear your cache. In some cases you might need to clear the browser cache for users who connected to the older version of BIRT iHub F-Type in the past.
This is necessary because of conflicts between different versions of the JavaScript libraries. You’re now ready to use the capabilities in BIRT iHub F-Type 3.1. Thanks for reading and for using the software. To learn more, check out the many resources online to help you get started with BIRT iHub F-Type, including video tutorials and an active developer forum.


Embedded Analytics – Making the Right Decision

Think back to the last big purchase you made. Maybe you bought or rented a home, purchased a car, or chose a new computer or mobile provider. If you’re a smart shopper, you considered your decision from many angles: Does the product or service meet my needs simply and elegantly? Is the manufacturer solid and reliable? Will my choice serve me well into the future?

We face similar questions when we decide on a third-party vendor to embed technology in the apps we build: Is the technology powerful enough? Is it easy to embed? Will the vendor be around in the future? Will the technology evolve and improve as my needs – and those of my customers – change over time?

Elcom International faced such a decision almost a decade ago. Elcom’s software product, PECOS, digitizes the procurement process; Kevin Larnach, the company’s Executive Vice President of Operations, describes PECOS as “Amazon for business,” with the extensive controls and business process integrations required by leading governments, utilities, businesses and healthcare providers. More than 120,000 suppliers provide products in PECOS through Elcom’s global supplier network, and PECOS is used by more than 200 organizations worldwide to manage more than $15 billion in total spending annually.

Elcom decided on Actuate to provide the analytics for PECOS. Thanks to embedded analytics, PECOS users avoid and reduce costs, get visibility into all aspects of the procurement process for oversight and audits, and reduce risks from rogue purchasing, off-contract purchasing, and slow, manual record-keeping. Larnach says embedded analytics has helped one PECOS user, the Scottish Government, accrue more than $1 billion in audited savings over the past seven years.

Larnach told Elcom’s embedded analytics story in a recent free Actuate-sponsored webinar. He shared the virtual stage with Howard Dresner (@howarddresner), Chief Research Officer at Dresner Advisory Services and author of the Embedded Business Intelligence Market Study, published in late 2014. An on-demand replay of the webinar is now available.

Dramatically Changing Requirements

As Elcom added features and capabilities to PECOS over the last decade, and as its user base grew, the decision to embed Actuate’s analytics technology has been vindicated. “We’ve been able to work with [Actuate] as a sole vendor,” Larnach says. Actuate “satisfies all of our embedded BI requirements, which have changed dramatically over the years.” Larnach used this graphic to show how PECOS and Actuate’s analytics capabilities grew and evolved together.

At the base of the pyramid we find basic transactional reporting. “Most of the embedded reporting and embedded analysis that we offered in early stages of our relationship [with Actuate] centered around transactional reporting,” Larnach says. This reporting isn’t limited to summary information; it includes line detail for each and every purchase. Accommodating user requests, Elcom built into PECOS an extensive library of templates, forms and business documents with embedded analytics.

The ability to provide consistent, repeatable analysis led Elcom’s customers to want more; specifically, they wanted to perform periodic analysis of their procurement data. (That’s the second layer up on the pyramid.) The request made good sense; after all, PECOS tracks details of every transaction, and therefore creates an audit trail that begs for analysis.
“Embedded BI provides analysis against those audit trails,” Larnach says, which both helps organizations uncover waste, fraud and abuse and drives improved user behavior that locks in savings and efficient business processes. This ability to provide ongoing analysis has led Elcom to add trend analysis and key performance indicators (KPIs) to PECOS – the third layer on the pyramid. Demand for those capabilities is growing among PECOS users, Larnach says. “We’re starting to do [dashboards and charting] as a standard feature of our software,” he explains, which leads to the tip of the pyramid: one-off analysis.

“I see [one-off analysis] as a huge growth area for our organization, especially for customers who have been with us for many years,” Larnach says. Those customers have large volumes of transactional data to analyze – a full terabyte of data, in the case of the Scottish Government – and they want to eke out every bit of savings from that data that they can find. “When you take on a solution like ours, there are big savings areas up front because it’s a dramatic change from manual business processes to electronic ones,” Larnach explains. “But over the years, as you use [PECOS] and look for continuous improvement, it becomes more and more difficult” to find savings. But that’s exactly where the one-off analysis capabilities now in PECOS help uncover “hidden gems,” Larnach says – such as a hospital system that saved hundreds of thousands of dollars by consolidating procurement from 11 types of latex gloves to three. “That could only be uncovered by the type of analysis that’s available through advanced BI tools – and some smart people, obviously,” Larnach says.

Check out our free webinar to hear more about how Elcom uses embedded analytics in PECOS, and learn more about the powerful e-Procurement solution on the Elcom website. It’s the right decision.

“Decide” image by Matt Wilson.

P.S. If you want to embed analytics and are trying to decide whether you should leverage a third-party platform or create your own analytics from scratch, Actuate offers a free white paper, “11 Questions: Build or Buy Enterprise Software,” that helps you make the best choice. And if you need help deciding among embedded analytics providers, check out our infographic, “12 Questions for Providers of Embedded Analytics Tools.”


Why Location Intelligence is Critical to Your App

Anytime I’m standing in front of a directory map at the mall, my eyes go straight to that little red dot that screams, “You are here!” As a consumer, it’s important to get your bearings before navigating through endless choices. Retailers also want to know where I am – and whether I fit their target demographic – because they want to attract me to their stores. The ability to track a customer’s location and pinpoint opportunities to upsell or cross-sell through analytics has been shown to improve customer engagement and brand loyalty.

So, can location-based technology combined with embedded analytics create a perfect storm for companies that want to increase customer engagement and improve revenue opportunities? Experts like John Hadl, Venture Partner at USVP and founder of mobile advertising agency Brand in Hand, say yes. “Location is the critical component that allows marketers to spend their mobile marketing dollars effectively,” notes Hadl.

Location-based technology and data analytics, used together, are often called “location intelligence.” Location intelligence improves the customer experience by allowing companies to visualize, analyze and track partnerships, sales, customers and prospects, according to research from consulting firm Pitney Bowes. Having both a person’s location and their relevant information sets the stage for some innovative approaches to customer experience.

Location Intelligence Meets Business Intelligence

Research shows that user interest in specific location intelligence features grew almost across the board in the last year. According to the Dresner Advisory Services’ 2015 Location Intelligence Market Study, the most important location intelligence features for users are map-based visualization of information, drill-down navigation, and maps embedded in dashboards. “From the supplier standpoint, we see vendors having a mixed view of the significance of Location Intelligence, with an increasing number this year saying it has critical importance,” writes Howard Dresner (pictured below). “Industry support is only somewhat aligned with user priorities for Location Intelligence features and geocoding, but GIS [Geographic Information Systems] integration and mobile feature support are well aligned.”

The report, which is part of Dresner’s Wisdom of Crowds survey series, also notes that sales and marketing departments are most apt to believe location intelligence will impact their jobs. Other findings include:

- Compared to 2014, location intelligence use is driving farther down organizational ranks.
- Respondents’ highest priority is the ability to map locations by province/state, country, and postal code.
- Governments, along with the retail and wholesale segment, are most interested in discrete geocoding features.

One challenge for organizations that hope to take advantage of location intelligence – aside from location accuracy itself – is the ability to map location data to the rest of the data set. Embedded analytics might be the solution to this obstacle. (A toy example of such a mapping appears at the end of this post.)

Daily Coffee, Chock Full of Data

Let’s look at how location intelligence works: In San Francisco you can’t walk more than a block or two before you hit some type of specialty coffee business (and yes, Starbucks does qualify). But all specialty coffee is not the same, and each neighborhood, because of its population, can provide different opportunities and potentially unique user experiences.
As you can see from the infographic below, analysts from Pitney Bowes using CAMEO software determined that residents of San Francisco’s Sunset District are cosmopolitan suburbanites who typically engage best with exotic coffee selections and ample space to turn a stroller around. Hipsters living near San Francisco’s Financial District are more likely to attend art shows or poetry readings, so they would prefer a coffee experience tailored to a mid-afternoon espresso and a late-evening mocha.

A mobile app can help these users find their ideal coffee experiences. Several coffee companies have them; they use embedded GPS information to help customers find their coffee – and often, to help the coffee find the customers as well. Blending location information with demographic data helps baristas provide customers with an improved experience.

For more thoughts on how embedded analytics plays a major part in location intelligence, check out our series of discussions about the new business intelligence. And, of course, subscribe to this blog for all the latest thinking about embedded analytics.

(Infographic courtesy of Pitney Bowes and CAMEO)
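P.S. For the technically minded, here is the kind of join that the “map location data to the data set” challenge boils down to. This is a toy sketch with invented column names and data – not Pitney Bowes’ or Actuate’s implementation – showing how a location key (here, a postal code) links transactions to geodemographic segments.

```python
# A toy sketch of "mapping location data to the data set":
# joining transaction records to geocoded neighborhood profiles.
# Column names and values are invented for illustration.
import pandas as pd

transactions = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "postal_code": ["94122", "94104", "94122"],
    "amount":      [4.50, 6.25, 3.75],
})

# Geodemographic profile per postal code (e.g., from a CAMEO-style dataset).
profiles = pd.DataFrame({
    "postal_code": ["94122", "94104"],
    "segment":     ["cosmopolitan suburbanite", "urban hipster"],
})

# The join that turns raw transactions into location intelligence.
enriched = transactions.merge(profiles, on="postal_code", how="left")
print(enriched.groupby("segment")["amount"].mean())
```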


OpenText Enhances Portfolio with Analytic Capabilities

By Mark Barrenechea, President and Chief Executive Officer, OpenText

Analytics are a hot technology today, and it is easy to see why. They have the power to transform facts into strategic insights that deliver intelligence “in the moment” for profound impact. Think “Moneyball” and the Oakland A’s in 2002, when Billy Beane hired a number-crunching statistician to examine the team’s odds and changed the game of baseball forever. Across the board – from sports analysis to recommending friends to finding the best place to eat steak in town – analytics are replacing intelligence reports with algorithms that can predict behavior and make decisions. Analytics can create the 1 percent advantage that makes the 100 percent difference between winning and losing.

Analytics represent the next frontier in deriving value from information, which is why I’m pleased to announce that OpenText has recently acquired Actuate to enhance its portfolio of products. With powerful predictive analytics technology, Actuate complements our existing information management and B2B integration offerings by allowing organizations to analyze and visualize a broad range of structured, semi-structured, and unstructured data.

In a recent study, 96 percent of organizations surveyed felt that analytics will become increasingly important to their organizations in the next three years. From a business perspective, analytics offer customers increased business process efficiencies, a greater brand experience, and additional personalized insight for better and faster decisions. In a Digital-First World, organizations will tap into sophisticated analytics techniques to identify their best customers, accelerate product innovation, optimize supply chains, and identify the drivers of financial performance. Agile enterprises incorporate consumer and market data into decision making. People are empowered when they have easy access to agile, flexible, and responsive analytical tools and applications.

Actuate enables developers to easily create business applications that leverage information about users, processes, and transactions generated by the various OpenText EIM suites. Customers will be able to view analytics for the entire EIM suite on a common platform, reducing their total cost of ownership and providing a comprehensive view for more elevated, strategic business insight.

Actuate is the founder of the popular open source integrated development environment (IDE), BIRT, and develops the world-class deployment platform, BIRT iHub™. BIRT iHub™ significantly improves the productivity of developers working on customer-facing applications. More than 3.5 million BIRT developers and OEMs use Actuate to build scalable, secure solutions that deliver personalized analytics and insights to more than 200 million customers, partners and employees. Because the platform is designed to be embeddable, developers can use it to enrich nearly any application, and these analytics-enriched applications can be delivered on premises, in the cloud, or in any hybrid scenario.

We are excited to welcome the Actuate team into the OpenText family as we continue to help drive innovation and offer the most complete EIM solution in the market. Read the press release on the acquisition here.


Is your EDI program strategic? If yes, find out which documents you need to implement. (Part 2)

In my last blog on this topic (Is your EDI program strategic? If yes, find out which documents you need to implement. (Part 1)) I introduced the Purchase Order Acknowledgment and the Purchase Order Change and discussed how you can derive benefits from both of these documents regardless of the industry sector you are in. In this blog, I focus on documents that are of specific benefit for anyone in, or working with, the retail sector. Here are a few you should definitely consider.

Product and Price Catalog

This is a key document that a supplier sends to its retailers. It enables the supplier to provide product and price information for the retailer to use during the purchasing process. This document, which is also known as a “sales catalog,” includes information about each product, such as:

- Item identification number
- Detailed item description, including color, size, dimensions and other unique identifiers
- Ordering requirements, such as lead time and required quantities

This is one that you would use most with third-party catalog providers, but it can also be used on a peer-to-peer basis with your main trading partners. The master product data in this document is re-used in many other supply chain transactions, so it’s important that it contains accurate data, reducing errors in purchase orders, ship notices and invoices, among other documents. This enables significant quality improvements and benefits both retailer and supplier. Ultimately, starting out with good data will speed product delivery to the retailer and eliminate discrepancies between purchase orders and invoices; for suppliers, this should result in faster payment.

Inventory and Product Activity Data Advice

Retailers usually send this document to their suppliers to give them information about inventory levels, sales numbers and other related product activity information, such as which items are on back-order. It should include:

- Item identification number
- On-hand inventory quantity by store
- Type of product movement, such as items sold, out-of-stock, received or on order
- Future demand calculations

(A simple sketch of such a record appears at the end of this post.) This versatile document can also be used in supplier-managed stock replenishment programs and sales forecasting. Furthermore, for drop-ship orders (those that are shipped directly to the consumer), it is probably the single most critical business document that can be exchanged. Most retailers’ e-commerce applications rely on inventory feeds from their suppliers to determine whether products are available for consumer purchase on their websites.

Delivery Confirmation

This document is also extremely useful for direct-to-consumer delivery. Most products sent via a drop-ship process travel with small package carriers. The supplier obtains a tracking number from the carrier and provides it to the retailer via the Advance Ship Notice document. Any automated communication typically ends at that point, so if order status is needed, the only way to get it is to contact the carrier. Instead, you could have complete end-to-end visibility of your order status if you ask the carrier to send a Delivery Confirmation document to confirm consumer receipt. This helps the supplier to close the purchase order and to manage the payment and settlement cycle.
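As promised above, here is a toy sketch of the data an Inventory and Product Activity Data Advice carries, expressed in Python as plain JSON. The field names, trading-partner IDs and values are invented for illustration; production integrations exchange standardized formats – for example ANSI X12 852 or EDIFACT INVRPT for product activity data – through an EDI or B2B gateway.

```python
# A toy sketch of an Inventory and Product Activity Advice record.
# This is illustrative JSON, not a validated EDI document - real
# exchanges use standards such as ANSI X12 852 or EDIFACT INVRPT.
import json
from datetime import date

advice = {
    "document": "inventory_product_activity_advice",
    "sender": "RETAILER-001",       # hypothetical trading-partner IDs
    "receiver": "SUPPLIER-042",
    "as_of": date.today().isoformat(),
    "lines": [
        {
            "item_id": "8412-BLK-M",        # item identification number
            "store": "0117",
            "on_hand_qty": 14,              # on-hand inventory by store
            "activity": {"sold": 6, "received": 20, "on_order": 0},
            "forecast_qty_next_week": 8,    # future demand calculation
        },
    ],
}

print(json.dumps(advice, indent=2))
```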
Click here if you are interested in learning more about EDI in the retail industry. And here are a couple of blogs about how EDI ASNs support retail-specific business processes:

- How EDI ASNs Enable Direct Store Delivery (DSD)
- How EDI ASNs Enable Drop-Shipping


Under the Hood – BIRT iHub F-Type: Understanding the Processes

While your customers don’t need to see the inner workings of your app, as a developer you need to be the master of its parts and processes. It’s time to get under the hood.

Hello BIRT community! My name is Jesse Freeman. Although I am not new to BIRT or Actuate, I am transitioning into a significantly more community-centric role. I have spent the last two years working as a Customer Support Engineer for Actuate, specializing in the server and designer products, and I am excited to bring my product and support knowledge to the larger BIRT community. I come from a Java/JavaScript background and am a big fan of multi-platform, open source and open standard technologies. I am an advocate of Linux operating systems and have used or dabbled with the majority of the larger Linux distributions; in particular, I am a big fan of Arch Linux and CentOS.

Over the next several weeks I will publish a series of blogs that bring my support knowledge to the community. The series will include posts on understanding the BIRT iHub F-Type’s processes and configuration, as well as troubleshooting. It will provide technical insight for anybody who will be configuring and/or maintaining a BIRT iHub F-Type installation. BIRT iHub F-Type is a free BIRT server released by Actuate. It incorporates virtually all the functionality of the commercially available BIRT iHub and is limited only by the capacity of output it can deliver on a daily basis, making it ideal for departmental and smaller-scale applications. When BIRT iHub F-Type reaches its maximum output capacity, additional capacity is available as an in-app purchase.

Understanding the Processes

The first topic of my Under the Hood blog series is Understanding the Processes. When I first started in support, one of the first pieces of information I learned was the breakdown of all of the processes and their specific roles, and this information was invaluable for the duration of my time providing support. Understanding the processes and their responsibilities provides insight into how the product works for configuration and integration purposes, and helps us understand where to look for more information if an issue arises. With that in mind, here is the list of the BIRT iHub F-Type processes and their responsibilities:

- ihubd – The daemon process responsible for the initial startup of BIRT iHub F-Type. The ihubd process starts the ihubc and ihubservletcontainer processes. If issues occur during startup, this is one of the first processes to examine.
- ihubservletcontainer – As the name implies, this process is the front-end servlet container for BIRT iHub F-Type. It is hosted out of a Tomcat integrated within BIRT iHub, which means anybody who is familiar with Tomcat should feel right at home when configuring or troubleshooting the process.
- ihubc – The parent of all other processes started by BIRT iHub, including the ihub, jsrvrihub and jfctsrvrihub processes. The ihubc process is the SOAP endpoint for BIRT iHub’s communication, the job dispatcher and the resource group manager, and it also takes requests from front-end applications such as the integrated Information Console.
- ihub – The ihub process is responsible for communication with the metadata database, as well as with the Report Server Security Extension (RSSE) if one has been implemented.
- jsrvrihub – Within a single installation there may be multiple jsrvrihub processes running simultaneously; a typical out-of-the-box installation will have at least two. One of these is used for the viewing of dashboards and the other for the transient execution and viewing of reports.
- jfctsrvrihub – The jfctsrvrihub process is used for the execution of background jobs on BIRT iHub. This includes any report that is explicitly scheduled to run at a specific time (or immediately), and it allows reports to be output to a directory within the iHub rather than viewed immediately within the current browser session.

Whether you are beginning an installation, working on an integration project, or troubleshooting an existing installation, this information will help you know which process to examine.

Thank you for reading. Subscribe to this blog and you will be the first to know when I publish the next post in my Under the Hood – BIRT iHub F-Type series, a review of the Primary Configuration Files. Download BIRT iHub F-Type today so you can follow along. If you have any questions, post them in the comments below or in the BIRT iHub F-Type forum.

-Jesse
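P.S. When troubleshooting, a quick way to confirm these processes are up on a Linux host is to scan the process table for the names above. The snippet below is just my own convenience sketch, not part of the product, and the matching is deliberately rough (for instance, “ihub” is a substring of the other names, so it will match whenever any of them is running).

```python
# A minimal sketch for checking that the BIRT iHub F-Type processes are up
# on a Linux host. It greps the process table for the names described in
# this post - a rough troubleshooting aid, not an official tool.
import subprocess

EXPECTED = ["ihubd", "ihubservletcontainer", "ihubc", "ihub",
            "jsrvrihub", "jfctsrvrihub"]

ps = subprocess.run(["ps", "-eo", "comm,args"],
                    capture_output=True, text=True).stdout

for name in EXPECTED:
    status = "running" if name in ps else "NOT FOUND"
    print(f"{name:24s} {status}")
```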


Banking & PSD2: The Oscars Nominations for Biggest Disruptors are…

Similar to the official Academy Awards nominations, I’m sure you already have a good idea of the personalities that drew everyone’s attention in the Banking and Payments industry. Let’s open the envelope and nominate the most disruptive artefacts of the Payment Services Directive 2 (PSD2) from a Bank perspective. And the nominees are…

Payment Initiation Services and Account Information Services

From a commercial, legal, technical and operational perspective, Payment Initiation Services is introduced as a new obligation for Banks and traditional Payment Services Suppliers. The idea is to allow third-party companies (also regulated under the PSD2) to make payments on behalf of traditional bank clients. This is the result of the growing number of payment outfits in the marketplace, doing ─ until now ─ fairly unregulated business and operating within the scope of their own creativity and commercial objectives. PSD2 puts a framework around that, with the consequence of carving out the traditional business of Banks’ Payments and Cash Management as we know it today.

Account Information Services is the corollary of the first concept. It is the second nominee of our ceremony, as it also opens the market for competition and innovation. While Payment Initiation Services is largely about “acting on behalf of the ultimate account owner”, Account Information Services basically allows third parties to act as aggregators across a number of banks, in terms of transaction visibility, reporting and all the traditional processes. The growing trend with corporates (wholesale banking) is the migration of treasury processes into the cloud with Payment Services Providers (PSPs) and historical treasury technology vendors, to optimize payments and cash management as an overlay to traditional Bank services. Account Information Services now puts a legal framework around this example. PSD2 is also largely designed to bring the same principles and benefits to the retail banking world.

Low-Hanging Fruit for Transaction Banking

Clearly, the PSD2 will have a number of variations in transposed domestic laws, opening the gates to various rules, processes and technical standards. Across all of the impacted Bank functions, Transaction Services will pick up a lot more scope and business logic. I believe that minimizing and “protecting” back-office platforms from direct PSD2 impacts is the first point to consider. The corollary of this idea is introducing, or fully leveraging, Digital Banking. For those who still haven’t heard about Digital Banking, the idea is to separate the way Bank clients consume services electronically from the way banking products, platforms and processes within the Bank are physically deployed. I wrote about Digital Banking in a previous blog. The low-hanging fruit for delivering on Payment Initiation Services as well as Account Information Services is:

- Digitize the Banking channels, enabling them to normalize client or third-party data (payments, reporting, “act on behalf”, message types). This typically includes data normalization, file mapping, enrichment and transformation. (A toy sketch of such a mapping appears at the end of this post.)
- Identify where pre-PSD2 processes in the middle- and back-office systems can be maintained. The expansion of KYC rules, client-third party relations and reconciliation processes can help normalize all PSD2 flows from the technical foundations of the Bank.
- Manage the operational client community and reference data. Above and beyond KYC and sales data, it becomes an imperative to keep track of the entire detailed inventory of client integration reference data, settings, file types and envelopes, certificates, “act on behalf” mandates, etc. The Business As Usual (BAU) teams from Transaction Banking will need to keep track of multi-layered business and technical relationships.
- Ensure all access controls, identity management, auditing, reconciliations and transaction management reflect the multi-layered model.

Minimum Compliance vs. Commercial Strategy

As of today, in early 2015, some European Banks have started to get ahead of the curve by spinning off their own independent “third-party PI” brand, to compete within and maintain their share of the PSP market. This is very apparent in the Nordics and Germany. On the other end of the spectrum, the majority of banks are battening down the hatches, bracing themselves with the minimum-compliance approach. Minimum compliance is basically fixing the new gaps opened by PSD2, largely around security, KYC and electronic banking. Entering the “third-party PSP world” as a new or independent brand ─ a Bank joint venture, a spin-off, or a subsidiary ─ is the only way to keep or expand one’s market share. A few smart European Banks have chosen the most aggressive strategy, executing a “land grab” from other Banks who chose minimum compliance.

A Final Observation

My personal take on PSD2 is that some Banks are still wearing the scars of the financial and emotional investment in SEPA, fresh in everyone’s minds. PSD2 looks like a further tightening of the bolts, when it actually introduces more disruption to the Banking business than SEPA did. When disruption comes, an organization can either do nothing or fully embrace it and ride the waves.
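P.S. To make the first low-hanging fruit ─ channel-layer data normalization ─ a little more concrete, here is a toy sketch of mapping a third-party payment initiation message into one canonical internal format, so that middle- and back-office systems stay insulated from PSD2-specific variations. Every field name, identifier and message shape below is invented for illustration; this is a thought experiment, not a reference implementation.

```python
# A toy sketch of channel-layer normalization: mapping a third party's
# payment-initiation message into one canonical internal format, so the
# back office never sees PSD2-specific variations. Field names invented.
from dataclasses import dataclass

@dataclass
class CanonicalPayment:
    debtor_iban: str
    creditor_iban: str
    amount: float
    currency: str
    initiated_by: str   # "account_owner" or a third-party PSP identifier
    on_behalf_of: str   # the ultimate account owner

def normalize_third_party_payment(msg: dict) -> CanonicalPayment:
    # One mapping per channel/message type would live in the digital
    # channel layer; enrichment (e.g., mandate checks) would follow.
    return CanonicalPayment(
        debtor_iban=msg["payer"]["iban"],
        creditor_iban=msg["payee"]["iban"],
        amount=float(msg["instructedAmount"]["value"]),
        currency=msg["instructedAmount"]["currency"],
        initiated_by=msg["pspId"],
        on_behalf_of=msg["payer"]["name"],
    )

example = {
    "pspId": "THIRD-PARTY-PSP-001",
    "payer": {"name": "ACME GmbH", "iban": "DE89370400440532013000"},
    "payee": {"iban": "FR1420041010050500013M02606"},
    "instructedAmount": {"value": "250.00", "currency": "EUR"},
}
print(normalize_third_party_payment(example))
```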


Will the Creation of ‘On Device’ or ‘On Thing’ Based B2B Transactions Ever Become a Reality?

Over the past five years CIOs around the world have been rolling out their cloud-based B2B strategies. Whether deploying B2B on premise, on cloud or as a hybrid environment, companies have been able to deploy B2B infrastructures according to their budget, strategy and technical capabilities. Infrastructure-as-a-Service, Platform-as-a-Service and Software-as-a-Service initiatives have been deployed to great effect, and numerous other ‘as-a-Service’ definitions have evolved. So where next for B2B-based infrastructures? Well, with nearly every CIO formulating a strategy in support of the Internet of Things, how about an ‘On Device’ or ‘On Thing’ based B2B strategy?

I have posted twenty or so blogs relating to cloud infrastructures since 2010, and over the past year I have spent some time looking at the Internet of Things and where it may go in relation to the supply chains of the future. In a couple of my IoT-related blogs I provided some examples of how I thought IoT-connected devices could connect into an enterprise infrastructure (read about it here) and then initiate some form of closed-loop ordering process as part of a replenishment or predictive maintenance scenario.

I read an article on CIO.com last September where the author described something called the Internet of ‘Things as a Service’, or TaaS for short. I didn’t realise it at the time of writing my own blogs, but this is exactly what I was describing: namely, that a connected device will be able to analyse its own consumption trends or wear rates and then place some form of order for replacement parts without any human intervention. OK, it sounds a bit far-fetched, but I can guarantee this is where things, no pun intended, will be going in the future.

Billions of dollars are being spent on developing onboard or embedded processing, sensing, storage and analytics technologies for IoT devices. Many companies, such as Intel, are betting huge research budgets on developing next-generation semiconductor chips that can be embedded on ‘things’. In fact, only last week OpenText acquired a leading analytics company, and they have been looking at embedded analytics for IoT devices. I will take a look at embedded analytics in relation to B2B in a future blog entry, as I believe it will transform how companies visualise, interact with and manage B2B-related information flowing across the extended enterprise.

Two weeks ago I had an interesting discussion with ARC Advisory Group relating to device- or ‘thing’-level creation of B2B transactions. ARC use the term Industrial Internet of Things (IIoT) to describe their take on this area, as they are keen to differentiate it from more consumer-focused IoT devices such as wearable technology and home automation equipment. As I have mentioned before, there are many big players entering the IIoT space, for example GE (who originally coined the IIoT term), Cisco and Bosch, to name but a few. Could we see a piece of equipment in the field, for example a generator or excavator, initiating a B2B transaction by itself to order a replacement part that is just about to fail? For the purposes of this blog I just wanted to introduce the idea of a device- or ‘thing’-derived B2B transaction ─ the sketch below shows roughly what such a flow might look like ─ and you can read more in the ARC article that was written to support this.
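Here is that speculative sketch: a field device monitors its own wear rate and, once past a threshold, posts a replacement-part order to a B2B gateway. The endpoint, identifiers and payload shape are all invented for illustration ─ a thought experiment about where ‘on thing’ B2B could go, not a description of any existing product.

```python
# A speculative sketch of a "thing"-initiated B2B transaction: a field
# device monitors its own wear rate and, past a threshold, posts a
# replacement-part order to a B2B gateway. Endpoint, part numbers and
# payload shape are all invented for illustration.
import json
import urllib.request

WEAR_THRESHOLD = 0.80  # assumed: reorder the part at 80% of rated wear

def read_wear_sensor() -> float:
    # Stand-in for an onboard sensor reading (0.0 = new, 1.0 = worn out).
    return 0.85

def place_order(part_number: str, quantity: int) -> None:
    order = {
        "device_id": "EXCAVATOR-0042",   # hypothetical asset identifier
        "part_number": part_number,
        "quantity": quantity,
        "reason": "predictive_maintenance",
    }
    req = urllib.request.Request(
        "https://b2b-gateway.example.com/orders",  # placeholder endpoint
        data=json.dumps(order).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        print("Order accepted:", resp.status)

if read_wear_sensor() >= WEAR_THRESHOLD:
    place_order(part_number="HYD-PUMP-7", quantity=1)
```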


Expert Advice on Embedded BI with Howard Dresner [Webinar]

For once, your CEO and CIO agree on something: your company needs to embed analytics into its applications. You’ve been tasked with researching which platform is best for you, and you probably have two items on your to-do list: learn from an industry expert who thoroughly studies the many different embedded analytics platforms, and hear from a company that has successfully embedded analytics into its software.

You can do both on January 22 by attending Embedded BI Market Study with Howard Dresner, a free webinar sponsored by Actuate. Dresner, you probably know, is Chief Research Officer of Dresner Advisory Services, a respected technology analyst firm. Dresner (@howarddresner) coined the term “business intelligence” in 1989 and has studied the market drivers, technologies, and companies associated with BI and analytics ever since. It’s safe to say that nobody knows the sector better. In this webinar, Dresner will highlight the results of his recent Wisdom of Crowds report, the Embedded Business Intelligence Market Study, published in October 2014. Dresner’s study taps the expertise of some 2,500 organizations that use BI tools, focusing specifically on their efforts to embed analytics in other applications. In the webinar, Dresner will cover three main subjects:

- User intentions for – and perceptions of – embedded analytics, segmented by industry, types of users, architecture and vendor
- Architecture needs and priorities (such as web services, HTML/iFrame and JavaScript API) for embedding, as identified by technologists who implement embedded analytics
- Ratings of 24 embedded BI vendors, based on both the architecture and features the individual vendors offer, and the reasons Actuate garnered the top ranking

To add the user’s perspective, Dresner will then give the floor to Kevin Larnach, Executive Vice President of Operations at Elcom. Larnach will explain how Elcom embeds Actuate’s reporting solution in PECOS, its cloud-based e-procurement solution. Embedded analytics enables users of PECOS – a user base 120,000 strong, in more than 200 organizations, managing nearly $20 billion in total procurement spending annually – to access standard reports, slice and dice data for analysis, create custom reports and presentations of the data, and export transaction history to many different formats, all without IT expertise. As this diagram shows, PECOS touches all aspects of the procurement process.

PECOS users include the Scottish Government (including health services, universities and colleges, and government departments), several health services groups in Britain, the Northern Ireland Assembly, several school districts in the United States, the Tennessee Valley Authority (TVA), and many other organizations and companies. Elcom has identified over a billion dollars in audited savings that its customers have accrued thanks to embedded analytics – more than $500 million in the healthcare sector alone. Elcom’s application is truly an embedded analytics success story.

The embedded analytics capability in PECOS, delivered with Actuate technology, is an important competitive differentiator for Elcom. Its competitors’ products either have limited, fixed reporting or don’t offer any standard reporting at all. Those competitors “are scrambling to adopt a flexible embedded approach such as the one enjoyed by PECOS users,” Elcom says. You’re sure to have questions for Dresner and Larnach, so the webinar will include a Q&A session.
(An Actuate technical expert will also be on hand if you have specific questions about our embedded analytics capabilities.) The webinar will be accompanied by live Tweets using the hashtag #embeddedanalytics. Register today.


Will We See 3D Printed Action Figures for Star Wars VII?

Forecasting the demand for merchandise and toys to support a new Star Wars film always requires a bit of magic and Jedi Arts. How many of each different action figure, light saber and book should be produced? It has been almost 10 years since the last film was released, so there is no recent supply chain data to analyze. And, of course, we want to avoid a 1977-style imbalance of the force (between supply and demand), when toys couldn’t be mass-produced in time for the holiday season.

What if we forgot about the traditional mass-production-in-China model? What if, instead, toy manufacturers used 3D printing to create action figures and other toys on demand? Here is how it would work. Imagine you are seven years old. You convince your Mom or Dad to take you to Walmart, Toys R Us or Target, where you could visit a special Star Wars kiosk in the toy section. The kiosk would allow you to select action figures to buy – new characters from Episode VII and older ones from the first two trilogies. There would be options to purchase 100+ standard “pre-designed” action figures ($5.99). Alternatively, you could customize the design with hundreds of other permutations ($10.99) – different weapons (lightsabers, guns, sticks), headwear (masks, helmets) and clothing (capes, gowns). After adding your selections, the kiosk would direct you to the in-store factory. You and your parents could watch through a glass window as your own Chewbacca is printed layer by layer. After printing, a robotic arm could package the action figures into personalized boxes and put them onto a mini conveyor belt. A store clerk would then retrieve the figures and hand them to the customer.

3D printers and scanners are already emerging in major retailers around the world. UK-based ASDA offers kiosks where customers can print a miniaturized version of themselves – 3D selfies. Customers stand in a body scanner similar to the one you see at the airport. It takes photos of you from different angles to create a three-dimensional model. The model is then sent to an on-site 3D printer, which creates your mini-me a few minutes later. I don’t think it is unreasonable to expect that within twelve months customized 3D printing stations could be introduced to major retailers to support the Star Wars movie launch. It would certainly make the buying experience much cooler, and there would be many more SKUs and customization options available.

It would be good for the environment too. The 3D printing model should reduce the carbon footprint of a typical Stormtrooper: the mass production process overseas would be eliminated, so less pollution and waste would be generated, and the transportation by ship and truck of the products from Shenzhen to retailers around the world would be eliminated, meaning less oil consumption and fewer carbon emissions.

There could be a perfect balance of the force(s) – of supply and demand. Only the exact quantity of toys desired would be produced. No more Jedi Arts to predict six months in advance how many R2-D2s to make. There should be fewer out-of-stocks as well. Stores would simply need to keep sufficient printer capacity in the stores and have enough of the associated materials on hand.

The customizable 3D printed action figures could also be available online. The experience from the in-store kiosk could be replicated on a mobile app or a web browser. Imagine if you place an order on your iPhone and one hour later an Uber driver pulls into your driveway with your Luke Skywalker.
Even cooler would be a mobile app that could project 3D holographic images of the characters. Imagine the ghosts of Anakin, Obi-Wan and Yoda being inserted into pretend battles with real physical action figures. The hologram apps may not be available by next December, but we should not be surprised if the technology is ready for the launch of Episode VIII.


Data Driven Summit – The Future of BIRT Analytics [Video]

What’s the best way to turn advanced analytics insight and intelligence into something that is useful in everyday apps and everyday situations? It’s a trick question from our perspective, because at Actuate we make tools that help organizations get a handle on Big Data, advanced analytics and visualizations. It’s a highly competitive market, because more and more organizations are required to make fast decisions that can have a big impact on their business – decisions that help reduce customer churn and anticipate risk, for example.

So to assist data geeks and non-data geeks alike, Actuate is adding new capabilities to its BIRT Analytics platform (version 5.0) in early 2015. The improvements will allow business users to analyze billions of rows of data in seconds and generate actionable, real-time results. BIRT Analytics runs as a server on Windows, Linux and Mac OS X. Users can also access enriched data or intelligence from BIRT Analytics through a public API and embed it into existing applications. This helps organizations discover hidden insights for forecasts, predictions or patterns for future success, while keeping abreast of the competition.

During Data Driven Summit 2014 – Actuate’s annual series of customer events – Mark Gamble (@heygamble) and Pierre Tessier (@puckpuck) delivered a “must see” live demo session exploring some of the advanced features in BIRT Analytics 5.0. Actuate VP of Product Marketing & Innovation Allen Bonde (@abonde) was also on hand to give his perspective. Here’s that discussion at the Data Driven Summit in Santa Clara, Calif.

We’ll be posting more of the Data Driven Summit 2014 video series here, including the other demonstrations, BIRT data visualization insights and panel discussions with industry insiders. Subscribe (at left) to be informed when new videos are posted.
