B2B Integration

Musings on Prospects for Real-Time Payments in the US

As I sit in my office in Massachusetts after shoveling out from the 8th snowstorm in six weeks (but who is counting?), I have begun to wonder whether now is the time to simply declare that all banking going forward shall be digital. Just as our Commonwealth's new Governor banned all street traffic except for emergency vehicles for several days in late January, perhaps the Federal Reserve should declare a permanent ban on all financial transactions involving paper except for emergencies. No paper checks, no paper deposits, no paper loan documents, no signature cards, and so on. OK, maybe this is a little extreme, but being stuck at home for days at a time over a six-week period can result in some out-of-the-box thinking.

As we enter the middle of the second decade of the 21st century, what is clear is that in many ways we are still living with 20th-century infrastructure. Whether it is roads and bridges, public transportation, or archaic banking practices and systems, we need to stop taking a band-aid approach to our problems and recognize that the world around us has changed. A perfect example is the slow but hopeful progress being made toward a modern retail payment system in the United States. With the Federal Reserve System and at least part of the banking industry, in the form of The Clearing House, announcing support for a real-time electronic payment system, we have an opportunity to finally break the bonds of the past and join the many other countries that have already made this investment. It's time to stop talking and start acting.

What is holding us back?

- Some banks are stuck on how to make the business case: they are in denial about the future of payments given the investment by non-bank providers and evolving consumer preferences.
- The cost of replacing the batch-oriented systems that were developed decades ago, and are still used by almost all banks to process payments, is a hurdle.
- Some industry participants are hung up on how to support the small percentage of the population that doesn't have access to, or interest in, real-time payments: a percentage that shrinks every day.
- Trade associations and payment system operators have a perceived vested interest in the status quo.
- Merchants and large corporations are already overwhelmed by the multitude of new payment systems being introduced by banks and technology companies.

Consumer and business adoption of technology is radically changing expectations about the fundamental role of banks in society. Payment processing is at the heart of the traditional role of banks: being a trusted intermediary of financial value. But payment processing in 2015 and beyond is also at the heart of the new role of banks: being a secure and convenient intermediary of financial value and data. If banks don't hurry to embrace this new role, they will eventually be relegated to being the payments provider of last resort.


Step Aside Cloud, Mobile and Big Data, IoT has just Entered the Room

Mark Morley

This article provides a review of the ARC Advisory Group Forum in Orlando and expands on the ever-increasing importance of analytics in relation to the Internet of Things.

The room I am referring to here is the office of the CIO, or should that be the CTO or CDO (Chief Digital Officer)? You see, even as technology evolves, the corporate role that manages digital transformation is evolving too. Since 2011, when cloud, mobile and Big Data technologies started to go mainstream, individual strategies to support each of these technologies have been evolving, and some would argue that in some cases they remain separate strategies today. However, the introduction of the Internet of Things (IoT) is changing the strategic agenda very quickly. For some reason IoT, as a collective and strategic term, has caught the interest of the enterprise and the consumer alike. IoT allows companies to define one strategy that potentially embraces elements of cloud, mobile and Big Data. I would argue that in terms of IoT, cloud has become nearly a commodity term, offering connectivity any time, any place, anywhere. Mobile has evolved from simply porting enterprise applications to HTML5 to wearable technology such as Microsoft HoloLens. Finally, Big Data is broadening its appeal by focusing more on the analytics of information rather than just the archiving of huge volumes of data. In short, IoT has brought a stronger sense of purpose to cloud, mobile and Big Data.

Two weeks ago I was fortunate to attend the ARC Advisory Group Forum in Orlando, a great conference if you have an interest in the Industrial Internet of Things and the direction it is taking. The terminology being used here is interesting, as it is just another strand of the IoT; I will expand on this naming convention a bit later in this post. There were over 700 attendees at the conference, and a lot of interest, as you would expect, from industrial manufacturers such as GE, ABB, ThyssenKrupp and Schneider Electric. These companies weren't just attending as delegates; they were showcasing their own IoT-related technologies in the expo hall. In fact, it was quite interesting to hear how many industrial companies are establishing state-of-the-art software divisions to develop their own IoT applications.

For me, the company that made the biggest impact at the conference was GE and its Intelligent Platforms division. GEIP focused heavily on industrial analytics, and in particular on how it could help companies improve the maintenance of equipment, either in the field or in a factory, by using advanced analytics techniques to support predictive maintenance routines. So how does IoT support predictive maintenance scenarios? It is really about applying IoT technologies such as sensors and analytics to industrial equipment, processing the information coming from the sensors in real time to identify trends in the data, and using those trends to predict when a component such as a water pump is likely to fail. If you can predict when a component is likely to fail, you can replace it as part of a predictive maintenance routine, and the piece of equipment is less likely to experience unexpected downtime. In GE's case they have many years of experience and knowledge of how their equipment performs in the field, so they can use this historical data as well to determine the potential timeline of component failure. A minimal sketch of the idea follows.
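To make the predictive maintenance idea concrete, here is a minimal sketch. It assumes a hypothetical stream of pump vibration readings and compares a rolling average against a failure threshold that would, in practice, be derived from historical failure data; the threshold, asset ID and scheduling hook are illustrative, not any vendor's actual API.

```python
from collections import deque

# Illustrative values only: in practice these come from historical
# failure data, as GE does with its years of field experience.
VIBRATION_FAILURE_THRESHOLD = 7.5   # mm/s RMS, hypothetical
WINDOW_SIZE = 50                    # readings in the rolling window

window = deque(maxlen=WINDOW_SIZE)

def on_sensor_reading(vibration_mm_s: float) -> None:
    """Process one real-time vibration reading from a pump sensor."""
    window.append(vibration_mm_s)
    if len(window) < WINDOW_SIZE:
        return  # not enough data yet to see a trend
    rolling_avg = sum(window) / len(window)
    # A rising average vibration is a common early indicator of wear;
    # schedule maintenance before the predicted failure point.
    if rolling_avg > VIBRATION_FAILURE_THRESHOLD:
        schedule_predictive_maintenance("pump-42", rolling_avg)

def schedule_predictive_maintenance(asset_id: str, reading: float) -> None:
    # Placeholder: a real system would raise a work order in the
    # maintenance or ERP system rather than print.
    print(f"Maintenance required for {asset_id}: avg vibration {reading:.2f}")
```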
In fact, GE went to great lengths to discuss the future of the 'Brilliant Factory'. The IoT has brought a sense of intelligence, or awareness, to many pieces of industrial equipment, and it was interesting to learn from these companies how they plan to leverage the IoT moving forwards. There were two common themes to the presentations and to what the exhibitors were showcasing in the expo hall.

The first was cyber-security. Over the past few months there has been no end of hacking-related stories in the press, and industrial companies are working very hard to ensure that connected equipment is not 'hackable'. The last thing you want is a rogue country hacking into your network, logging into a machine on the shop floor and stealing tool-path cutting information for your next great product that is likely to take the world by storm. So device and equipment security is a key focus area for industrial companies in 2015. Interestingly, it wasn't just the cyber-security of connected devices that was keeping CIOs awake at night; a new threat is emerging on the horizon. What if a complete plant full of connected devices could be brought down by a simple electromagnetic pulse (EMP)? This was another scenario discussed in one of the sessions at the conference, and encryption and shielding of data is a key focus area for many research establishments at the moment.

The second key theme at the conference was analytics. As we know, Big Data has been around for a few years now, but even though companies were good at storing terabytes of data on mass storage devices, they never really got the true value from that data by mining it for trends or pieces of information that could either transform the performance of a piece of equipment or improve the efficiency of a production process. By itself, Big Data is virtually useless unless something is done with it that results in actionable intelligence and insight delivering value to the organisation. An interesting quote from Oracle: 93% of executives believe that organisations are losing revenue as a result of not being able to fully leverage the information they have. So deriving value from the information coming from sensors attached to connected devices is going to become a key growth sector moving forwards. It is certainly an area that the CIO/CTO/CDO is extremely interested in, as it can directly impact the bottom line and ultimately bring increased value to shareholders.

I guess it is no surprise, then, that the world's largest provider of Enterprise Information Management solutions, OpenText, should acquire Actuate, a leading provider of analytics-based solutions. Last week the Information Exchange business unit of OpenText, which has a strong focus on B2B integration and supply chain, launched Trading Grid Analytics, a value-add service that provides improved insights into the transaction-based information flowing across our cloud-based Trading Grid infrastructure. With 16 billion transactions flowing across our business network each year, there is a huge opportunity to mine this information and derive new value from these transactions, beyond the EDI-related information being transmitted between companies on our network. Can you imagine the benefits that governments could realise if they could predict a country's GDP based on the volume of order- and production-related B2B transactions flowing across our network?
Actuate is not integrated with Trading Grid just yet, but it will eventually become a core piece of technology for analysing information flowing across not just Trading Grid but our other EIM solutions. It is certainly an exciting time if you are a customer using our EIM solutions! Actuate has some great embedded analytics capabilities that will potentially help improve the overall operational efficiency of connected industrial equipment. In a previous blog I mentioned B2B transactions being raised 'on device'; with semiconductor manufacturers such as Intel spending millions of dollars developing low-power chips to place on connected devices, the device will become even more 'intelligent' and almost autonomous in nature. I think we will see a lot more strategic partnerships announced between the semiconductor manufacturers and industrial equipment manufacturers such as GE and ABB.

Naturally, cloud, mobile and Big Data play a big part in the overall success of an IoT-related strategy. I certainly think we will see the emergence of more FOG-based processing environments. 'FOG', I hear you ask? Yes, another term, one I heard at a Cisco IoT World Forum two years ago. Basically, a connected device is able to perform some form of processing or analytics task in a FOG environment, which is much closer to the connected device than a traditional cloud platform. Think of FOG as being halfway between the connected device and the cloud, i.e. a lot of pre-processing can take place on or near the connected device before the information is sent to a central cloud platform. A minimal sketch of this idea appears at the end of this post.

Coming back to the conference, there was another area that was partially discussed: IoT standards. I guess it is to be expected that, as this is a new technology area, it will take time to develop new standards for how devices are connected to each other and standard ways of transporting, processing and securing the information flows. But there is another area of IoT-related standards that is bugging me at the moment: the many derivatives of the term IoT that are emerging. IoT was certainly the first term, defined by Kevin Ashton, closely followed by GE introducing the Industrial Internet of Things, Cisco introducing the Internet of Everything, and then the German manufacturers introducing Industry 4.0. I appreciate that it has been the manufacturing industry that has driven a lot of IoT development so far, but what about other industries such as retail, energy, healthcare and other industry sub-sectors? Admittedly IoT is a very generic term, but already it is becoming more associated with consumer-related technologies such as wearable devices and connected home devices such as Nest. So in addition to defining standards for IoT cyber-security, connectivity and data flows, how about introducing a standard naming convention that could support each and every industry? As there isn't a suitable set of naming conventions, let me start the ball rolling by defining one.

In closing, I would argue, based on the presentations I saw at the ARC conference, that the industrial manufacturing sector is the most advanced in terms of IoT adoption. Can you imagine what sort of world we will live in when all of these industries embrace IoT? One word: exciting!
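To illustrate the FOG idea described above, here is a minimal sketch, assuming a hypothetical device that pre-aggregates raw sensor samples locally and forwards only a compact summary to a central cloud endpoint; the endpoint URL and payload shape are invented for illustration and are not any particular vendor's API.

```python
import json
import statistics
import urllib.request

# Hypothetical cloud ingestion endpoint -- illustration only.
CLOUD_ENDPOINT = "https://cloud.example.com/ingest"

def fog_preprocess(samples: list) -> dict:
    """Reduce a batch of raw sensor samples to a compact summary
    near the device, instead of shipping every raw reading upstream."""
    return {
        "count": len(samples),
        "mean": statistics.mean(samples),
        "max": max(samples),
        "min": min(samples),
    }

def forward_to_cloud(device_id: str, samples: list) -> None:
    summary = fog_preprocess(samples)
    summary["device_id"] = device_id
    req = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=json.dumps(summary).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # send only the summary, not raw samples
```

The design point is simply that the summary is a fraction of the size of the raw sample stream, which is what makes processing "halfway between the device and the cloud" attractive.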
Mark Morley currently leads industry marketing for the manufacturing sector at OpenText. In this role Mark focuses on the automotive, high-tech and industrial sectors. Mark also defines the go-to-market strategy and thought leadership for applying B2B e-commerce and integration solutions within these sectors.


Forget the Oscars, Tata Motors Won a Bigger Award in Mumbai

Last week I had the pleasure of attending our Innovation Tour event in Mumbai, the first leg of a multi-city tour of the world to showcase our Enterprise Information Management solutions and how they help companies move to the digital-first world! The event was very well attended, and it was good to see keen interest being shown in our new offerings such as Actuate and Core, as well as our more mature EIM solutions. Enterprise World has traditionally been our key event of the year, but the Innovation Tour provides a way for OpenText to get closer to our customers around the world, and Mumbai was no exception, with strong engagement in our expo hall.

I have been to India before, two years ago in fact, to meet with an automotive industry association that looks after the ICT needs of the entire Indian automotive industry. Back then, the discussion was focused on B2B integration. However, last week's event in Mumbai showcased all solutions from the OpenText portfolio. One of the interesting solution areas being showcased by one of our customers was Business Process Management (BPM), and it is only fitting that one of our India-based customers won an award for their deployment of BPM. Why fitting? Well, India has long been the global hub for business process outsourcing, so I guess you could say there is a natural interest in improving the management of business processes in India. OpenText has a strong presence in the Indian market. OpenText presented a number of awards during the event, and Tata Motors was the worthy winner of the award for the best deployment of BPM. Incidentally, Tata Motors also won the global Heroes Award at last year's Enterprise World event for their deployment of our Cordys BPM solution.

So who are Tata Motors, I hear you ask? They are the largest vehicle manufacturer in India, with consolidated revenues of $38.9 billion. Tata Motors is part of a large group of companies that includes Tata Steel, Jaguar Land Rover in the UK, Tata Technologies and many other smaller companies that serve the domestic market in India. Tata Group is fast becoming a leading OpenText customer, showcasing many different EIM solutions. For example, Jaguar Land Rover uses OpenText Managed Services to manage B2B communications with over 1,200 suppliers following the divestiture from Ford in 2009. Tata Steel in Europe also uses our Managed Services platform to help consolidate eleven separate EDI platforms and three web portals onto a single, common platform. So simplification and consolidation of IT and B2B infrastructures is a common theme across Tata Group, and Tata Motors is no different with their implementation of OpenText BPM.

Tata Motors has struggled over the years to exchange information electronically with over 750 vehicle dealers across India. Varying IT skills and multiple business processes, combined with having to use a notoriously difficult utilities and communications infrastructure across the country, were really starting to impact Tata Motors' business. In addition, their IT infrastructure had to support over 35,000 users, and there were over 90 different types of business application in use across 1,200 departments of the company. Ensuring that accurate, timely information could be exchanged between both internal and external users was proving to be a huge problem for Tata Motors. Step forward, OpenText BPM!
Tata Motors decided to deploy our Cordys BPM solution as an SOA-based backbone platform to connect all their business applications and, more importantly, to provide a common platform for exchanging information electronically across their extensive dealer network. Even though they had deployed Siebel CRM across their dealer network, Tata Motors faced a constant challenge of having to process a high volume of manual, paper-based information, which was often inaccurate due to mis-keying. A simple mistake, but when scaled up across 750 dealers it can have a serious impact on the bottom line and, more importantly, on customer satisfaction levels with respect to new vehicle deliveries or spare-parts orders.

Tata Motors had a number of goals for this particular project:

- Implement a Service Oriented Architecture – The primary objective was to set up an SOA environment for leveraging existing services and hence avoid re-inventing the wheel. They also wanted to use this platform to streamline the current integrations between multiple business systems.
- Process Automation / Business Process Management – They had many manual, semi-automated and completely automated processes. Manual and semi-automated processes were inefficient, and in some cases ineffective as well. Some of their automated processes were actually disconnected from real business scenarios. The goal of implementing BPM was to bring these processes closer to the 'business design', improving efficiency and process adherence.
- Uniform Web Services Framework – Tata Motors' goal was to establish a single source of web services that could convert the existing functionality of underlying service sources into interoperable web services.

So what were the primary reasons for Tata Motors choosing OpenText BPM? It is an SOA enabler, it provides business process automation capabilities, it is a comprehensive product for application development, it minimizes application development time, and it improves cost effectiveness. Their BPM implementation covered two main areas:

- Enterprise Applications Integration – This mainly deals with the inward-facing functionality of employee- and manufacturing-related process applications. They had many applications, but those applications had a common fault: they did not follow SOA principles. Web services had to be developed inside every application, which was very inefficient from a time and resources point of view. In addition, if an application had to connect to SAP, it did so over an independent, unmanaged and insecure connection.
- Customer Relationship & Dealer Management Systems Integration – Tata Motors is the biggest player in the commercial vehicles sector in India and one of the biggest in passenger car sales, with over 750 dealers scattered across India. The dealerships are managed using a Siebel CRM-DMS implementation, but with many changes being rolled out across the system, it needed a supporting platform to manage this process effectively. Cordys became the primary environment for developing CRM-DMS applications.

In summary, Cordys BPM has been integrated with SAP, Siebel CRM-DMS, Email/Exchange Server, Active Directory, Oracle Identity Manager, an SMS gateway, and mobile applications on Android and iOS. The Cordys implementation also delivered a number of business benefits, including improved process efficiency, stronger process adherence, an SOA-based platform, and significant cost and time savings. The project has already achieved its ROI!
Moving forwards, OpenText BPM will act as a uniform, centrally managed and secure web services base for all applications used across the Tata Motors landscape, irrespective of the technology in which each is developed. The platform will also provide an evolving architecture to mobilise existing applications, and they plan to integrate with an in-house-developed document management system. Finally, the go-forward plan is to move their Cordys implementation to the cloud for improved management of their infrastructure.

I have visited many car manufacturers over the years, and one company headquartered in the Far East had over 300 dealers in Europe, each of which had been allowed to implement its own CRM and DMS environment to manage its dealer business processes. Prior to the acquisition of GXS (my former company) by OpenText, I had to inform them that GXS didn't have a suitable integration platform to seamlessly connect all 300 dealers to a single platform. With OpenText BPM we can clearly achieve such an integration project now, and Tata Motors is certainly a shining light in terms of what is achievable from an extended enterprise application integration point of view. Congratulations, Tata Motors! For more information on OpenText BPM solutions, please click here. Finally, I just want to say many thanks to my OpenText colleagues in India; it was a very successful event and a team effort to make it happen. For more information on our Innovation Tour schedule, please click here.


Could the Smart Trash Can Take Waste Out of the Supply Chain?

In my last post I introduced a vision for the Smart Trash Can that would automatically identify the items you are throwing away. What would you do with the data collected? The waste management company may not have much use for it, but manufacturers and retailers who are trying to predict what consumers are going to buy next would find it very valuable.

How would the smart trash can work? There are a couple of different options. Version one (circa 2017) would probably rely on the use of RFID tags and readers. If manufacturers put RFID tags on each of the items you purchased, then the smart trash can would be able to identify them automatically. Version two (circa 2018) might add a camera to the lid. As items are being disposed of, the camera would automatically recognize each item using the kind of "visual search" technology found in Google Goggles. Perhaps the smart trash can might have multiple cameras at different depths that could see through trash bag liners to identify items at rest. Version three (circa 2020) might be more advanced, with capabilities to identify items based upon smell. Perhaps the trash can would be fitted with sensors that can detect odors and identify items based upon their chemical composition.

You might be wondering what data retailers and manufacturers use to forecast demand today, and whether smart trash cans would provide an improvement. Today, the primary data used for forecasting demand is the information about what shoppers are buying at individual stores, or what is called "Point-of-Sale" data. Every night, retailers and manufacturers run reports to understand how many of each item were sold in each store. They then try to guesstimate how much inventory they have on hand and whether or not they are going to run out of stock in the coming days (or weeks, or months). If they are running low on inventory, then they will need to issue a replenishment order.

Would trash can data provide better insights than Point-of-Sale data? This raises a good question: what provides better insight into future sales, what people are buying or what they are throwing away? Let's first think about items that are regularly purchased: batteries, diapers, detergent, shampoo, soda, milk, bread and salty snacks. I would argue that monitoring consumption (via trash cans) of these repeat purchases is a better indicator of near-term demand. If someone throws out a milk container, they are very likely going to buy a new one in the next 24 hours. In many cases, the disposal of an item after it is consumed is the event that triggers the need to buy another one. But what about items that are not consistent, repeat purchases? Examples might include toys, electronics, clothing and shoes. For these inconsistent purchases you might question the validity of the correlation between waste patterns and future purchases. Just because you throw something away doesn't mean that you are going to purchase it again, immediately or ever. The value of using the trash data is clear for groceries and regular purchases. Will Nestle, Procter & Gamble and Tesco begin giving away free kitchen trash cans to consumers just to collect data and be optimally positioned for replenishment orders?
Or, even better, what if your smart trash can was linked to your online grocery account? Items detected in your trash can (or recycling bin) could be automatically identified and then transmitted to the garbage truck upon pickup at your house. A replenishment algorithm could review your list of "always in stock" items to determine whether an item should be replaced immediately (see the sketch below). If yes, then a home delivery provider might visit a few hours later to drop off new supplies on your doorstep. Amazon Fresh could be extended to include Amazon Trash. Walmart might buy a waste management company. The Smart Trash Can could create a myriad of new opportunities in the supply chain.
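As a thought experiment, the replenishment logic described above could be as simple as the following sketch; the item codes, the "always in stock" list and the ordering hook are all hypothetical.

```python
from datetime import datetime

# Hypothetical consumer profile: items to replace as soon as the
# empty container hits the trash can.
ALWAYS_IN_STOCK = {"milk-1l", "diapers-size3", "aa-batteries"}

def on_item_disposed(item_code: str, household_id: str) -> None:
    """Called when the smart trash can identifies a discarded item,
    e.g. via its RFID tag or visual search."""
    if item_code in ALWAYS_IN_STOCK:
        place_replenishment_order(household_id, item_code)

def place_replenishment_order(household_id: str, item_code: str) -> None:
    # Placeholder: a real service would call the household's online
    # grocery account and schedule a same-day delivery.
    print(f"[{datetime.now():%Y-%m-%d %H:%M}] reorder {item_code} "
          f"for household {household_id}")
```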


4 Questions: Dieter Meuser Discusses Analytics in Manufacturing

Dieter Meuser and two colleagues founded iTAC Software AG in 1998 to commercialize a manufacturing execution system (MES) developed for Robert Bosch GmbH. The timing was ideal, as manufacturers sought ways to leverage the nascent Internet to automatically monitor and manage their plants, and thereby improve quality and efficiency to bolster the bottom line. Today, iTAC (Internet Technologies & Consulting) is one of the leading MES companies serving discrete manufacturers, with customers throughout Europe, Asia and the Americas. iTAC.MES.Suite is a cloud-based, Java EE-powered application that enables IP-based monitoring and management of every aspect of a manufacturing plant. OpenText Analytics provides the business intelligence (BI) and analytics capabilities embedded in iTAC.MES.Suite. You can see the software in action in booth 3437 at the IPC APEX EXPO, held February 22-26 in San Diego, California. Learn more about iTAC's participation in IPC APEX EXPO here, and learn more about the company at the end of this post.

Meuser, iTAC's CTO, has extensive experience working with manufacturers worldwide. He's also an expert on the German government's Industry 4.0 initiative to develop and support "Smart Factories" – that is, manufacturing plants that leverage embedded systems and the Internet of Things (IoT) to drive efficiency and improve quality. We asked Meuser for his thoughts on these topics.

OpenText: iTAC has more than two dozen enterprise customers with plants in 20 countries. What do those customers say are their pain points, particularly with regard to data?

Meuser: The biggest single pain point is this: companies have lots of data, but often they are unsure how to analyze it. Let me elaborate. Many types of data are recorded to fulfill manufacturers' tracking and tracing requirements. (Called "traceability standards," these include VW 80131, VW 80160, MBN 10447, GS 95017 and others.) Data collected via sensor networks, such as plant temperature or humidity, are part of these standards. The objective of collecting this data is to continuously improve manufacturing processes through correlation analysis (or Big Data analysis), accomplished by running the data through intelligent algorithms. But because manufacturers frequently aren't sure which criteria they should use to analyze the data, analysis often does not happen to the extent that manufacturers want. As a result, data is collected and stored for possible later analysis. This can lead to a growing mountain of unanalyzed data and very little continuous improvement of processes. But it also illustrates why introducing data management right at the beginning of a tracking and tracing project is so important. Data management, supported by analytics, enables process optimization that otherwise would fall by the wayside.

OpenText: How do manufacturers use and analyze data – sensor data in particular – to improve their processes?

Meuser: Within manufacturing plants, the most common analysis is Overall Equipment Effectiveness (OEE), based on integrated production data collection and machine data collection (PDC/MDC). This is done within the plant's manufacturing execution system (MES). PDC/MDC can happen automatically if the plant's systems are integrated, or manually via rich clients. The captured data can be evaluated in real time and analyzed over freely selectable time intervals.
Common analyses include comparing planned changeover and turnaround times with actual values; comparing actual production (including scrap) with forecasts; and examining unexpected equipment breakdowns. Key Performance Indicators (KPIs) in these analyses feed into OEE, productivity and utilization. Reducing non-conformance costs is another important business case for data analysis in both IoT and Industry 4.0. The availability of structured and unstructured sensor data related to product failures (and the costs associated with them) creates new opportunities to determine non-conformance. There is enormous potential in systematically analyzing causes of production failure. Failure cause catalogues, which many manufacturers have collected for decades, can be examined with the help of a modern data mining tool. Analyzing this data on the basis of quality, product and process data helps to reduce failure costs in a Smart Factory.

OpenText: What is the role of analytics and data visualization in IoT and Industry 4.0?

Meuser: A major objective of data analyses and visualizations in IoT and Industry 4.0 is automatic failure cause analysis. This is accomplished by measuring and testing product errors along with data about manufacturing machines, equipment and processes, then identifying inefficient processes in order to establish solutions. These solutions must be checked by process engineers who have years of experience. Humans and machines go hand in hand when we optimize product quality in an Industry 4.0 factory.

OpenText: What are the benefits of a Smart Factory?

Meuser: A Smart Factory consists of self-learning machines that can identify the causes of failure under specific conditions, determine appropriate measures to address a failure, and send messages to inform operators of problems. This is sometimes called a cyber-physical system (CPS). Combined with appropriate software models, it enables autonomous manufacturing machines (within certain limits) and supports the overall objective of optimizing processes and avoiding failures before they happen. The Smart Factory is enabled by modern data analysis techniques. It relies on data about products, processes, quality and environment (e.g. room temperature or humidity) as appropriate. The ability to interface an ERP system with production equipment creates continuous vertical integration that covers the entire value chain, from receiving to shipping.

Be sure to visit iTAC at the IPC APEX EXPO, and read more about iTAC.MES.Suite in this case study.

More about iTAC

iTAC Software AG is a leading provider of next-generation, platform-independent and cloud-based MES solutions for original equipment manufacturers (OEMs) and suppliers within the discrete manufacturing sector. The company has more than 20 years of experience in internet-based solutions for the discrete manufacturing sector, the Internet of Things (IoT) and the Industrial Internet. To date, iTAC has amassed an enviable portfolio of over 70 global enterprise customers across five primary industries: automotive, electronics/EMS/telecommunications technology, metal fabrication, energy/utilities and medical devices. Customers including Audi, Bosch, Continental, Hella, Johnson Controls, Lear, Schneider Electric, Siemens and Volkswagen rely on the iTAC.MES.Suite to optimize their production processes. iTAC's product portfolio represents the solutions for the Smart Factory of tomorrow.
Its principal components are the iTAC.MES.Suite, the iTAC.Enterprise.Framework and iTAC.embedded.Systems, including its platform-independent iTAC.ARTES middleware and iTAC.Smart.Devices, the company's new physical interface solutions.
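To make the OEE metric Meuser describes above concrete, here is a minimal worked calculation. The formula is the standard product of availability, performance and quality; the shift figures are illustrative only.

```python
def oee(planned_time_min: float, downtime_min: float,
        ideal_cycle_time_min: float, total_count: int,
        good_count: int) -> float:
    """Overall Equipment Effectiveness = availability x performance x quality."""
    run_time = planned_time_min - downtime_min
    availability = run_time / planned_time_min          # time actually running
    performance = (ideal_cycle_time_min * total_count) / run_time  # speed vs ideal
    quality = good_count / total_count                  # good parts ratio
    return availability * performance * quality

# Illustrative shift: 480 planned minutes, 45 minutes of downtime,
# a 1.2-minute ideal cycle time, 330 parts produced, 315 of them good.
print(f"OEE: {oee(480, 45, 1.2, 330, 315):.2%}")  # -> OEE: 78.75%
```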


Announcing BIRT iHub F-Type 3.1

Actuate (now OpenText) has released BIRT iHub F-Type 3.1, the latest version of our free report server. Like its commercial cousin BIRT iHub 3.1, the new release of BIRT iHub F-Type now features a REST API and support for third-party custom data visualizations, among other new features. For details about what's new in the product, see the Technical Summary of New Features. These new capabilities will help you embed analytics and data visualizations in a broader range of applications – a big and growing need among enterprises. This post gives details about the new REST API and Custom Visualization support. I'll also describe how we've tuned the data-out limitation in BIRT iHub F-Type to make dashboard creation more realistic. I'll finish by taking you step by step through the process of moving your content from BIRT iHub F-Type 3.0 to 3.1.

BIRT iHub REST API

APIs have grown tremendously over the last decade in both quantity and usage, driven in part by enhanced integration standards. REST (Representational State Transfer) APIs are lightweight APIs that allow simple Create, Read, Update and Delete (CRUD) operations using JSON responses over HTTP. The REST approach separates the data from the visualization, resulting in better code modularity and agility. The REST API in BIRT iHub F-Type 3.1 provides a variety of data access, report generation and management services. Specifically, with the REST API you can:

- Retrieve data from a BIRT document or BIRT Data Object for integration into applications
- Integrate BIRT data visualizations into applications by combining the REST API with the JavaScript API (JSAPI)
- Generate new data visualization documents based on user interaction in your application
- Convert data visualizations into Adobe PDF or Microsoft Excel formats
- Schedule generation of data visualizations for later retrieval
- Upload files to and download files from a BIRT iHub Encyclopedia
- Retrieve BIRT iHub Encyclopedia contents

More information about the REST API can be found in the Technical Summary of New Features.
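As a rough illustration of how an application might drive a report server over REST, the sketch below uses Python's requests library. The base URL, endpoint paths, field names and authentication scheme shown here are invented for the example – they are not the documented BIRT iHub API, so consult the Technical Summary of New Features for the real endpoints.

```python
import requests

# All values below are illustrative; consult the BIRT iHub
# documentation for the actual endpoints and payloads.
IHUB = "http://localhost:8700/api"   # hypothetical base URL

# 1. Authenticate and obtain a token (hypothetical endpoint).
auth = requests.post(f"{IHUB}/login",
                     json={"username": "administrator", "password": ""})
token = auth.json()["authId"]

# 2. List files in the Encyclopedia volume (hypothetical endpoint).
files = requests.get(f"{IHUB}/files",
                     headers={"AuthToken": token}).json()

# 3. Request PDF conversion of a report document (hypothetical).
pdf = requests.get(f"{IHUB}/files/sales_report/render",
                   params={"format": "pdf"},
                   headers={"AuthToken": token})
with open("sales_report.pdf", "wb") as out:
    out.write(pdf.content)
```

The pattern – authenticate, browse the repository, render to a target format – mirrors the capabilities listed above, whatever the exact endpoint names turn out to be.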
Custom Visualization Support

Data visualizations turn information into valuable insights. Previous versions of BIRT iHub provided charts, maps, gadgets and other visualizations, and the new Custom Visualization functionality in BIRT iHub 3.1 takes things a step further. Custom Visualization support enables developers to integrate third-party visualizations from D3, Google Charts, any version of Highcharts, amCharts and many others into BIRT content using external JavaScript libraries. This is accomplished by marshaling BIRT Data Objects for consumption by third-party visualizations. Using BIRT Data Objects for data provisioning offers many advantages over connecting your data directly to the visualizations. Integrating third-party visualizations into your content requires minimal coding, boosting developer productivity. BIRT iHub capabilities, such as the ability to print and export to formats including PDF and Microsoft PowerPoint, are available to all third-party visualizations.

Adjustments to the BIRT iHub F-Type data-out calculation

One feature in the new edition is specific to BIRT iHub F-Type: BIRT Data Object generation no longer counts toward the product's data output limits. (Learn more about BIRT iHub F-Type output here.) This change enables BIRT iHub F-Type users to take better advantage of BIRT Dashboards, which rely on data objects. We made this change in direct response to BIRT iHub F-Type users, who told us loud and clear that creating BIRT Data Objects quickly used up their data-out capacity. Because we want you to use your data capacity to create dashboards and applications – not just the BIRT Data Objects that they rely on – we made the change.

Migrating Content to BIRT iHub F-Type 3.1

It is important to note that BIRT iHub F-Type 3.0 and 3.1 cannot coexist on the same server, and that you will not be able to use your old version of the software once the new version is installed. With those caveats in mind, follow these steps to migrate content from BIRT iHub F-Type 3.0 to the new version. I'd add – and you probably already know – that you shouldn't use these instructions to migrate content for your commercial-grade BIRT iHub deployments. Because commercial BIRT iHub deployments are typically much larger and have complicated features like clustering, multi-tenancy, and large volumes of data in and out, please see the documentation for Windows and Linux.

1. Get a new activation code. You can get a new activation code on your own by downloading BIRT iHub F-Type a second time. You must use the same email address that you used for your initial download. You will see a message with a "click here" link; choose that link and follow the instructions to get a new activation code.

2. Get your content in order. Once you have a new activation code, you need to gather the content you want to migrate from your existing installation. You must do this before you uninstall the old version. You can migrate files and resources the hard way, by downloading the designs, documents and data objects one by one. But an easier way is to use BIRT Designer Pro to move the content. (If you don't yet use BIRT Designer Pro, get started here.) Follow these steps:
- Create a new Project in BIRT Designer Pro.
- Create a Server Profile in the Server Explorer tab (located under the layout in the Report editor), and enter information based on your installation. (You probably did this already if you used BIRT Designer Pro to design content for BIRT iHub F-Type 3.0.)
- Explore your existing BIRT iHub F-Type installation, locate the content you want to migrate to your new installation, and drag and drop it into your project in BIRT Designer Pro. Make sure you keep the folder structure intact: files in the Resources directory should be in the root directory of the project, and subdirectories of the Resources folder should be located off the root.

3. Install BIRT iHub F-Type 3.1. You will find detailed instructions here.

4. Publish your content. If you installed BIRT iHub F-Type on a different machine, create a new Server Profile in BIRT Designer Pro. Now we want to publish the files on the new server. The easiest way is to right-click on the BIRT Project folder in the Navigator and select Publish… Choose the Publish Files tab, and select the content you want to publish. Make sure your files (report designs, report documents and dashboards, typically) are selected in the Publish Files window on top, and your resources (libraries, images, property files and data objects, typically) are selected in the Publish Resources window below. Click Publish to complete the migration.

5. Clear your cache. In some cases you might need to clear the browser cache for users who connected to the older version of BIRT iHub F-Type in the past.
This is necessary due to conflicts with different versions of JavaScript libraries. You’re now ready to use the capabilities in BIRT iHub F-Type 3.1. Thanks for reading and for using the software. To learn more, check out the many resources online to help you get started with BIRT iHub F-Type, including video tutorials and an active developer forum.


Embedded Analytics – Making the Right Decision

Think back to the last big purchase you made. Maybe you bought or rented a home, purchased a car, or chose a new computer or mobile provider. If you're a smart shopper, you considered your decision from many angles: Does the product or service meet my needs simply and elegantly? Is the manufacturer solid and reliable? Will my choice serve me well into the future?

We face similar questions when we decide on a third-party vendor to embed technology in the apps we build: Is the technology powerful enough? Is it easy to embed? Will the vendor be around in the future? Will the technology evolve and improve as my needs – and those of my customers – change over time?

Elcom International faced such a decision almost a decade ago. Elcom's software product, PECOS, digitizes the procurement process; Kevin Larnach, the company's Executive Vice President of Operations, describes PECOS as "Amazon for business," with extensive controls and business process integrations required by leading governments, utilities, businesses and healthcare providers. More than 120,000 suppliers provide products in PECOS through Elcom's global supplier network, and PECOS is used by more than 200 organizations worldwide to manage more than $15 billion in total spending annually.

Elcom decided on Actuate to provide the analytics for PECOS. Thanks to embedded analytics, PECOS users avoid and reduce costs, get visibility into all aspects of the procurement process for oversight and audits, and reduce risks from rogue purchasing, off-contract purchasing, and slow, manual record-keeping. Larnach says embedded analytics has helped one PECOS user, the Scottish Government, accrue more than $1 billion in audited savings over the past seven years.

Larnach told Elcom's embedded analytics story in a recent free Actuate-sponsored webinar. He shared the virtual stage with Howard Dresner (@howarddresner), Chief Research Officer at Dresner Advisory Services and author of the Embedded Business Intelligence Market Study, published in late 2014. An on-demand replay of the webinar is now available.

Dramatically Changing Requirements

As Elcom added features and capabilities to PECOS over the last decade, and as its user base grew, the decision to embed Actuate's analytics technology has been vindicated. "We've been able to work with [Actuate] as a sole vendor," Larnach says. Actuate "satisfies all of our embedded BI requirements, which have changed dramatically over the years."

Larnach used a pyramid graphic to show how the capabilities in PECOS and Actuate's analytics capabilities grew and evolved together. At the base of the pyramid is basic transactional reporting. "Most of the embedded reporting and embedded analysis that we offered in the early stages of our relationship [with Actuate] centered around transactional reporting," Larnach says. This reporting isn't limited to summary information; it includes line detail for each and every purchase. Accommodating user requests, Elcom built into PECOS an extensive library of templates, forms and business documents with embedded analytics. The ability to provide consistent, repeatable analysis led Elcom's customers to want more; specifically, they wanted to perform periodic analysis of their procurement data. (That's the second layer up the pyramid.) The request made good sense; after all, PECOS tracks details of every transaction, and therefore creates an audit trail that begs for analysis.
"Embedded BI provides analysis against those audit trails," Larnach says, which helps organizations uncover waste, fraud and abuse, and also drives improved user behavior that locks in savings and efficient business processes. This ability to provide ongoing analysis has led to Elcom adding trend analysis and key performance indicators (KPIs) to PECOS – the third layer of the pyramid. Demand for those capabilities is growing among PECOS users, Larnach says. "We're starting to do [dashboards and charting] as a standard feature of our software," he explains, which leads to the tip of the pyramid: one-off analysis.

"I see [one-off analysis] as a huge growth area for our organization, especially for customers who have been with us for many years," Larnach says. Those customers have large volumes of transactional data to analyze – a full terabyte of data, in the case of the Scottish Government – and they want to eke out every bit of savings from that data that they can find. "When you take on a solution like ours, there are big savings areas up front because it's a dramatic change from manual business processes to electronic ones," Larnach explains. "But over the years, as you use [PECOS] and look for continuous improvement, it becomes more and more difficult" to find savings. That's exactly where the one-off analysis capabilities now in PECOS help uncover "hidden gems," Larnach says – such as a hospital system that saved hundreds of thousands of dollars by consolidating procurement from 11 types of latex gloves to three. "That could only be uncovered by the type of analysis that's available through advanced BI tools – and some smart people, obviously," Larnach says.

Check out our free webinar to hear more about how Elcom uses embedded analytics in PECOS, and learn more about the powerful e-Procurement solution on the Elcom website. It's the right decision.

P.S. If you want to embed analytics and are trying to decide whether you should leverage a third-party platform or create your own analytics from scratch, Actuate offers a free white paper, "11 Questions: Build or Buy Enterprise Software," that helps you make the best choice. And if you need help deciding among embedded analytics providers, check out our infographic, "12 Questions for Providers of Embedded Analytics Tools."


Why Location Intelligence is Critical to Your App

Anytime I'm standing in front of a directory map at the mall, my eyes go straight to that little red dot that screams, "You are here!" As a consumer, it's important to get your bearings before navigating through endless choices. Retailers also want to know where I am – and whether I fit their target demographic – because they want to attract me to their stores. The ability to track a customer's location and pinpoint opportunities to upsell or cross-sell through analytics has been shown to improve customer engagement and brand loyalty.

So, can location-based technology combined with embedded analytics create a perfect storm for companies that want to increase customer engagement and improve revenue opportunities? Experts like John Hadl, Venture Partner at USVP and founder of mobile advertising agency Brand in Hand, say yes. "Location is the critical component that allows marketers to spend their mobile marketing dollars effectively," notes Hadl.

Location-based technology and data analytics, used together, are often called "location intelligence." Location intelligence improves the customer experience by allowing companies to visualize, analyze and track partnerships, sales, customers and prospects, according to research from consulting firm Pitney Bowes. Having both a person's location and their relevant information sets the stage for some innovative approaches to customer experience.

Location Intelligence Meets Business Intelligence

Research shows that user interest in specific location intelligence features grew almost across the board in the last year. According to Dresner Advisory Services' 2015 Location Intelligence Market Study, the most important location intelligence features for users are map-based visualization of information, drill-down navigation, and maps embedded in dashboards. "From the supplier standpoint, we see vendors having a mixed view of the significance of Location Intelligence, with an increasing number this year saying it has critical importance," writes Howard Dresner. "Industry support is only somewhat aligned with user priorities for Location Intelligence features and geocoding, but GIS [Geographic Information Systems] integration and mobile feature support are well aligned." The report, which is part of Dresner's Wisdom of Crowds survey series, also notes that sales and marketing departments are most apt to believe location intelligence will impact their jobs. Other findings include:

- Compared to 2014, location intelligence use is driving farther down organizational ranks.
- Respondents' highest priority is the ability to map locations by province/state, country, and postal code.
- Governments, along with the retail and wholesale segment, are most interested in discrete geocoding features.

One challenge for organizations that hope to take advantage of location intelligence – aside from being precise – is the ability to map location data to the data set. Embedded analytics might be the solution to this obstacle (a minimal sketch of the idea appears at the end of this post).

Daily Coffee, Chock Full of Data

Let's look at how location intelligence works: in San Francisco you can't walk more than a block or two before you hit some type of specialty coffee business (and yes, Starbucks does qualify). But all specialty coffee is not the same, and each neighborhood, because of its population, can provide different opportunities and potentially unique user experiences.
Analysts from Pitney Bowes, using CAMEO software, determined that residents of San Francisco's Sunset District are cosmopolitan suburbanites who typically engage best with exotic coffee selections and ample space to turn around a stroller. Hipsters living near San Francisco's Financial District are more likely to attend art shows or poetry readings, so they would prefer a coffee experience tailored to a mid-afternoon espresso and a late-evening mocha. A mobile app can help these users find these ideal coffee experiences. Several coffee companies have them; they use embedded GPS information to help customers find their coffee – and often, to help the coffee find the customers as well. Blending location information with demographic data helps baristas provide customers with an improved experience.

For more thoughts on how embedded analytics plays a major part in location intelligence, check out our series of discussions about the new business intelligence. And, of course, subscribe to this blog for all the latest thinking about embedded analytics. (Demographic analysis courtesy of Pitney Bowes and CAMEO.)
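To illustrate the "map location data to the data set" challenge mentioned above in the simplest possible terms, here is a minimal sketch that joins purchase events to demographic segments by postal code; the segment labels, postal codes and event records are all invented for the example.

```python
# Hypothetical demographic segments keyed by postal code.
SEGMENTS = {
    "94116": "cosmopolitan suburbanite",   # around the Sunset District
    "94104": "hipster",                    # near the Financial District
}

# Point-of-sale or mobile-app events carrying a location.
events = [
    {"customer": "a123", "postal_code": "94116", "purchase": "pour-over"},
    {"customer": "b456", "postal_code": "94104", "purchase": "espresso"},
]

# Enrich each event with its demographic segment -- the essence of
# joining location data to the business data set.
for event in events:
    segment = SEGMENTS.get(event["postal_code"], "unknown")
    print(f'{event["customer"]} ({segment}): {event["purchase"]}')
```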


OpenText Enhances Portfolio with Analytic Capabilities

By Mark Barrenechea, President and Chief Executive Officer, OpenText

Analytics is a hot technology today, and it is easy to see why. It has the power to transform facts into strategic insights that deliver intelligence "in the moment" for profound impact. Think "Moneyball" and the Oakland A's in 2002, when Billy Beane hired a number-crunching statistician to examine their odds and changed the game of baseball forever. Across the board, from sports analysis to recommending friends to finding the best place to eat steak in town, analytics is replacing intelligence reports with algorithms that can predict behavior and make decisions. It can create the 1 percent advantage that makes the 100 percent difference between winning and losing.

Analytics represents the next frontier in deriving value from information, which is why I'm pleased to announce that OpenText has recently acquired Actuate to enhance its portfolio of products. With powerful predictive analytics technology, Actuate complements our existing information management and B2B integration offerings by allowing organizations to analyze and visualize a broad range of structured, semi-structured, and unstructured data. In a recent study, 96 percent of organizations surveyed felt that analytics will become increasingly important to their organizations in the next three years.

From a business perspective, analytics offers customers increased business process efficiencies, a greater brand experience, and additional personalized insight for better and faster decisions. In a Digital-First World, organizations will tap into sophisticated analytics techniques to identify their best customers, accelerate product innovation, optimize supply chains, and identify the drivers of financial performance. Agile enterprises incorporate consumer and market data into decision making. People are empowered when they have easy access to agile, flexible, and responsive analytical tools and applications.

Actuate enables developers to easily create business applications that leverage information about users, processes, and transactions generated by the various OpenText EIM suites. Customers will be able to view analytics for the entire EIM suite on a common platform, reducing their total cost of ownership and gaining a comprehensive view for more elevated, strategic business insight. Actuate is the founder of BIRT, the popular open source integrated development environment (IDE), and develops the world-class deployment platform BIRT iHub™. BIRT iHub™ significantly improves the productivity of developers working on customer-facing applications. More than 3.5 million BIRT developers and OEMs use Actuate to build scalable, secure solutions that deliver personalized analytics and insights to more than 200 million customers, partners and employees. Designed to be embeddable, the platform lets developers enrich nearly any application, and these analytics-enriched applications can be delivered on premises, in the cloud, or in any hybrid scenario.

We are excited to welcome the Actuate team into the OpenText family as we continue to help drive innovation and offer the most complete EIM solution in the market. Read the press release on the acquisition here.


Is your EDI program strategic? If yes, find out which documents you need to implement. (Part 2)

In my last blog on this topic (Is your EDI program strategic? If yes, find out which documents you need to implement. (Part 1)), I introduced the Purchase Order Acknowledgment and the Purchase Order Change and discussed how you can derive benefits from both of these documents regardless of the industry sector you are in. In this blog, I focus on documents that are of specific benefit for anyone in, or working with, the retail sector. Here are a few you should definitely consider.

Product and Price Catalog

This is a key document that a supplier sends to its retailers. It enables the supplier to provide product and price information for the retailer to use during the purchasing process. This document, which is also known as a "sales catalog," includes information about each product such as:

- Item identification number
- Detailed item description, including color, size, dimensions and other unique identifiers
- Ordering requirements, such as lead time and required quantities

This is one that you would use most with third-party catalog providers, but it can also be used on a peer-to-peer basis with your main trading partners. The master product data in this document is re-used in many other supply chain transactions, so it is important that it contains accurate data, so that errors can be reduced in purchase orders, ship notices and invoices, among other documents. This enables significant quality improvements and benefits both retailer and supplier. Ultimately, starting out with good data will speed product delivery to the retailer and eliminate discrepancies between purchase orders and invoices; for suppliers, this should result in faster payment.

Inventory and Product Activity Data Advice

Retailers usually send this document to their suppliers to give them information about inventory levels, sales numbers and other related product activity, such as which items are on back-order. It should include the following (a simplified sketch of such a record appears at the end of this post):

- Item identification number
- On-hand inventory quantity by store
- Type of product movement, such as items sold, out-of-stock, received or on order
- Future demand calculations

This versatile document can also be used in supplier-managed stock replenishment programs and sales forecasting. Furthermore, for drop-ship orders (those that are shipped directly to the consumer), it is probably the single most critical business document that can be exchanged. Most retailers' e-commerce applications rely upon inventory feeds from their suppliers to determine whether products are available for consumer purchase on their websites.

Delivery Confirmation

This document is also extremely useful for direct-to-consumer delivery. Most products sent via a drop-ship process travel with small package carriers. The supplier obtains a tracking number from the carrier and provides it to the retailer via the Advance Ship Notice document. Any automated communication typically ends at that point, so if order status is needed, the only way to get it is to contact the carrier. Instead, you could have complete end-to-end visibility of your order status if you ask the carrier to send a Delivery Confirmation document to confirm consumer receipt. This helps the supplier close the purchase order and manage the payment and settlement cycle.
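As a rough illustration of the data elements an Inventory and Product Activity Data Advice carries, here is a minimal sketch that assembles a simplified record with the fields listed above. The field names and structure are simplified stand-ins for illustration, not a formal EDI layout such as the X12 846.

```python
def build_inventory_advice(item_id: str,
                           on_hand_by_store: dict,
                           movement_type: str,
                           forecast_qty: int) -> dict:
    """Assemble a simplified inventory/product-activity record with
    the data elements described above."""
    return {
        "item_id": item_id,                     # item identification number
        "on_hand_by_store": on_hand_by_store,   # quantity per store
        "movement_type": movement_type,         # sold, out-of-stock, received...
        "forecast_qty": forecast_qty,           # future demand calculation
    }

advice = build_inventory_advice(
    item_id="SKU-10472",
    on_hand_by_store={"store-001": 14, "store-002": 0},
    movement_type="out-of-stock",
    forecast_qty=120,
)
print(advice)
```

A drop-ship supplier would generate one such record per item and feed it to the retailer's e-commerce application, which is what drives the "available for purchase" decision described above.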
And here are a couple of blogs about how EDI ASNs support retail-specific business processes:

- How EDI ASNs Enable Direct Store Delivery (DSD)
- How EDI ASNs Enable Drop-Shipping

The post Is your EDI program strategic? If yes, find out which documents you need to implement. (Part 2) appeared first on All About B2B.

Read More

Under the Hood – BIRT iHub F-Type: Understanding the Processes

While your customers don't need to see the inner workings of your app, as a developer, you need to be the master of its parts and processes. It's time to get under the hood.

Hello BIRT community! My name is Jesse Freeman. Although I am not new to BIRT or Actuate, I am transitioning into a significantly more community-centric role. I have spent the last two years working as a Customer Support Engineer for Actuate, specializing in the server and designer products. I am excited to bring my product and support knowledge to the larger BIRT community. I come from a Java/JavaScript background and am a big fan of multi-platform, open source and open standard technologies. I am an advocate of Linux operating systems and have used or dabbled with the majority of the larger Linux distributions. In particular, I am a big fan of Arch Linux and CentOS.

Over the next several weeks I will publish a series of blogs that bring my support knowledge to the community. The series will include posts on understanding the BIRT iHub F-Type's processes and configuration, as well as troubleshooting. It will provide technical insight for anybody who will be configuring and/or maintaining a BIRT iHub F-Type installation. BIRT iHub F-Type is a free BIRT server released by Actuate. It incorporates virtually all the functionality of the commercially available BIRT iHub and is limited only by the volume of output it can deliver on a daily basis, making it ideal for departmental and smaller-scale applications. When BIRT iHub F-Type reaches its maximum output capacity, additional capacity is available as an in-app purchase.

Understanding the Processes

The first topic of my Under the Hood blog series is Understanding the Processes. When I first started in support, one of the first things I learned was the breakdown of all of the processes and their specific roles. This information was invaluable throughout my time providing support. Understanding the processes and their responsibilities provides insight into how the product works for configuration and integration purposes, and helps us understand where to look for more information if an issue arises. With that in mind, here is the list of the BIRT iHub F-Type processes and their responsibilities:

ihubd – This is the daemon process responsible for the initial startup of BIRT iHub F-Type. The ihubd process starts the ihubc and ihubservletcontainer processes. If issues occur during startup, this is one of the first processes to examine.

ihubservletcontainer – As the name implies, this process is the front-end servlet container for BIRT iHub F-Type. It is hosted out of a Tomcat integrated within BIRT iHub, which means anybody who is familiar with Tomcat should feel right at home when configuring or troubleshooting it.

ihubc – This is the parent of all other processes started by BIRT iHub, including the ihub, jsrvrihub and jfctsrvrihub processes. The ihubc process is the SOAP endpoint for BIRT iHub's communication, the job dispatcher and the resource group manager, and it also takes requests from front-end applications such as the integrated Information Console.

ihub – The ihub process is responsible for communication with the metadata database, as well as the Report Server Security Extension (RSSE) if one has been implemented.

jsrvrihub – Within a single installation there may be multiple jsrvrihub processes running simultaneously. A typical out-of-the-box installation will have at least two: one is used for viewing dashboards and the other for transient execution and viewing of reports.

jfctsrvrihub – The jfctsrvrihub process is used for the execution of background jobs on BIRT iHub. This includes any report that is explicitly scheduled to run at a specific time (or immediately), and it allows reports to be output to a directory within the ihub process rather than viewed immediately in the current browser session.
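One practical aside before wrapping up: when troubleshooting, it helps to confirm which of these processes are actually running. Here is a quick sketch in Java 9+; the match on "ihub" in the command name is an assumption, since executable names and paths vary by platform and install location.

```java
public class IhubProcessList {
    public static void main(String[] args) {
        // List running processes whose command path contains "ihub".
        // The name match is an assumption; adjust for your OS and install path.
        ProcessHandle.allProcesses()
                .filter(p -> p.info().command()
                        .map(cmd -> cmd.contains("ihub"))
                        .orElse(false))
                .forEach(p -> System.out.println(
                        p.pid() + "  " + p.info().command().orElse("?")));
    }
}
```

On Linux, plain ps piped through a grep for "ihub" accomplishes the same thing.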
Whether beginning an installation, working on an integration project, or troubleshooting an existing installation, this information will help you know which process to examine. Thank you for reading. Subscribe to this blog and you will be the first to know when I publish the next post in the Under the Hood – BIRT iHub F-Type series, a review of the Primary Configuration Files. Download BIRT iHub F-Type today so you can follow along. If you have any questions, post them in the comments below or in the BIRT iHub F-Type forum. -Jesse

Read More

Banking & PSD2: The Oscars Nominations for Biggest Disruptors are…

Similar to the Academy Awards' official nominations, I'm sure you already have a good idea of the personalities that drew everyone's attention in the Banking and Payments industry. Let's open the envelope and nominate the most disruptive artefacts of the Payment Services Directive 2 (PSD2) from a Bank's perspective.

And the nominees are… Payment Initiation Services and Account Information Services

From a commercial, legal, technical and operational perspective, Payment Initiation Services is introduced as a new obligation for Banks and traditional Payment Services Suppliers. The idea is to allow third-party companies (also regulated under the PSD2) to make payments on behalf of traditional bank clients. This is the result of the growing number of payment outfits in the marketplace, doing, until now, fairly unregulated business and operating within the scope of their own creativity and commercial objectives. PSD2 puts a framework around that, with the consequence of carving out the traditional business of Banks' Payments and Cash Management as we know it today.

Account Information Services is the corollary of the first concept. It is the second nominee of our ceremony, as it also opens the market to competition and innovation. While Payment Initiation Services is largely about "acting on behalf of the ultimate account owner", Account Information Services basically allows third parties to act as aggregators across a number of banks, in terms of transaction visibility, reporting and all the traditional processes. The growing trend among corporates (wholesale banking) is the migration of treasury processes into the cloud with Payment Services Providers (PSPs) and historical treasury technology vendors, to optimize payments and cash management as an overlay to traditional Bank services. Account Information Services now puts a legal framework around this example. PSD2 is also largely designed to extend the same principles and benefits to the retail banking world.

Low-Hanging Fruit for Transaction Banking

Clearly, the PSD2 will have a number of variations in transposed domestic laws, opening the gates to various rules, processes and technical standards. Across all of the impacted Bank functions, Transaction Services will pick up a lot more scope and business logic. I believe that minimizing and "protecting" back-office platforms from direct PSD2 impacts is the first point to consider. The corollary of this idea is introducing, or fully leveraging, Digital Banking. For those who still haven't heard about Digital Banking, the idea is to separate the way Bank clients consume services electronically from the way banking products, platforms and processes within the Bank are physically deployed. I wrote about Digital Banking in a previous blog.

The low-hanging fruit to deliver on Payment Initiation Services as well as Account Information Services is:

1. Digitize the Banking channels, enabling them to normalize client or third-party data (payments, reporting, "act on behalf", message types). This typically includes data normalization, file mapping, enrichment and transformation.

2. Identify where pre-PSD2 processes in the middle- and back-office systems can be maintained. The expansion of KYC rules, client/third-party relations and reconciliation processes can help normalize all PSD2 flows from the technical foundations of the Bank.

3. Operational client community and reference data management.
Above and beyond KYC and sales data, it becomes imperative to keep track of the entire detailed inventory of client integration reference data, settings, file types and envelopes, certificates, "act on behalf" mandates, etc. The Business As Usual (BAU) teams in Transaction Banking will need to keep track of multi-layered business and technical relationships.

4. Ensure all access controls, identity management, auditing, reconciliations and transaction management reflect the multi-layered model.

Minimum Compliance vs. Commercial Strategy

As of today, in early 2015, some European Banks have started to get ahead of the curve by spinning off their own independent "third-party PI" brand, to compete within and maintain their share of the PSP market. This is very apparent in the Nordics and Germany. At the other end of the spectrum, the majority of banks are battening down the hatches, bracing themselves with a minimum-compliance approach. Minimum compliance basically means fixing the new gaps opened by PSD2, largely around security, KYC and electronic banking. Entering the "third-party PSP world" as a new or independent brand (a Bank joint venture, a spin-off, or a subsidiary) is the only way to keep or expand one's market share. A few smart European Banks have chosen the most aggressive strategy, executing a "land grab" from other Banks that chose minimum compliance.

A Final Observation

My personal take on PSD2 is that some Banks are wearing the scars of the financial and emotional investment in SEPA, still fresh in everyone's minds. PSD2 looks like a further tightening of the bolts, when it actually introduces more disruption to the Banking business than SEPA did. When disruption comes, an organization can either do nothing or fully embrace it and ride the waves.

Read More

Will the Creation of ‘On Device’ or ‘On Thing’ Based B2B Transactions Ever Become a Reality?

Over the past five years CIOs around the world have been rolling out their cloud-based B2B strategies. Whether deploying B2B on premise, in the cloud or as a hybrid environment, companies have been able to deploy B2B infrastructures according to their budget, strategy and technical capabilities. Infrastructure-as-a-Service, Platform-as-a-Service and Software-as-a-Service initiatives have been deployed to great effect, and numerous other 'as-a-Service' definitions have evolved. So where next for B2B infrastructures? Well, with nearly every CIO formulating a strategy in support of the Internet of Things, how about an 'on device' or 'on thing' based B2B strategy?

I have posted twenty or so blogs relating to cloud infrastructures since 2010, and over the past year I have spent some time looking at the Internet of Things and where this may go in relation to the supply chains of the future. In a couple of my IoT-related blogs I provided some examples of how I thought IoT-connected devices could connect into an enterprise infrastructure (read about it here) and then initiate some form of closed-loop ordering process as part of a replenishment or predictive maintenance scenario. I read an article on CIO.com last September where the author described something called the Internet of 'Things as a Service', or TaaS for short. I didn't realise it at the time of writing my own blogs, but this is exactly what I was describing: a connected device will be able to analyse its own consumption trends or wear rates and then place some form of order for replacement parts without any human intervention. OK, it sounds a bit far-fetched, but I can guarantee this is where things, no pun intended, will be going in the future.

Billions of dollars are being spent on developing onboard or embedded processing, sensing, storage and analytics technologies for IoT devices. Many companies such as Intel are betting huge research budgets on developing next-generation semiconductor chips that can be embedded on 'things'. In fact, only last week OpenText acquired a leading analytics company, and they have been looking at embedded analytics for IoT devices. I will take a look at embedded analytics in relation to B2B in a future blog entry, as I believe it will transform how companies visualise, interact with and manage B2B-related information flowing across the extended enterprise.

Two weeks ago I had an interesting discussion with ARC Advisory Group relating to device or 'thing' level creation of B2B transactions. ARC use the term Industrial Internet of Things (IIoT) to describe their take on this area, as they are keen to differentiate it from more consumer-focused IoT devices such as wearable technology and home automation equipment. As I have mentioned before, there are many big players entering the IIoT space, for example GE (who originally coined the term 'Industrial Internet'), Cisco and Bosch, to name but a few. Could we see a piece of equipment in the field, for example a generator or excavator, initiating a B2B transaction by itself to order a replacement part that is just about to fail? For the purposes of this blog I just wanted to introduce the idea of a device- or 'thing'-derived B2B transaction; you can read more in the ARC article that was written to support this, and the sketch below shows what such a transaction might look like in practice.
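Everything in this sketch is hypothetical: the endpoint URL, the JSON shape and the threshold are placeholders, and a real deployment would route the order through a proper B2B gateway using an agreed message standard such as EDI or XML. It simply illustrates the closed loop: a sensor trend crosses a threshold, and the device itself submits a replacement-part order.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class DeviceOrderSketch {
    // Hypothetical endpoint: a real device would target your B2B gateway
    private static final String B2B_ENDPOINT = "https://b2b.example.com/orders";

    public static void main(String[] args) throws Exception {
        double vibrationMmPerSec = readVibrationSensor();
        // Threshold chosen purely for illustration; a real system would use
        // historical failure data and predictive models, as GE does
        if (vibrationMmPerSec > 7.1) {
            String order = "{\"part\":\"WATER-PUMP-SEAL-42\",\"qty\":1,"
                    + "\"reason\":\"predicted-failure\"}";
            HttpURLConnection conn =
                    (HttpURLConnection) new URL(B2B_ENDPOINT).openConnection();
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "application/json");
            conn.setDoOutput(true);
            try (OutputStream out = conn.getOutputStream()) {
                out.write(order.getBytes(StandardCharsets.UTF_8));
            }
            System.out.println("Order submitted, HTTP " + conn.getResponseCode());
        }
    }

    private static double readVibrationSensor() {
        return 7.4; // stub: substitute a real sensor reading
    }
}
```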

Read More

Expert Advice on Embedded BI with Howard Dresner [Webinar]

For once, your CEO and CIO agree on something: Your company needs to embed analytics into its applications. You've been tasked with researching which platform is best for you, and you probably have two items on your to-do list: learn from an industry expert who thoroughly studies the many different embedded analytics platforms, and hear from a company that has successfully embedded analytics into its software. You can do both on January 22 by attending Embedded BI Market Study with Howard Dresner, a free webinar sponsored by Actuate.

Dresner, you probably know, is Chief Research Officer of Dresner Advisory Services, a respected technology analyst firm. Dresner (@howarddresner) coined the term "business intelligence" in 1989 and has studied the market drivers, technologies, and companies associated with BI and analytics ever since. It's safe to say that nobody knows the sector better. In this webinar, Dresner will highlight the results of his recent Wisdom of Crowds report, the Embedded Business Intelligence Market Study, published in October 2014. Dresner's study taps the expertise of some 2,500 organizations that use BI tools, focusing specifically on their efforts to embed analytics in other applications. In the webinar, Dresner will cover three main subjects:

- User intentions for – and perceptions of – embedded analytics, segmented by industry, type of user, architecture and vendor
- Architecture needs and priorities (such as web services, HTML/iFrame and JavaScript API) for embedding, as identified by the technologists who implement embedded analytics
- Ratings of 24 embedded BI vendors, based on both the architecture and the features the individual vendors offer, and the reasons Actuate garnered the top ranking

To add the user's perspective, Dresner will then give the floor to Kevin Larnach, Executive Vice President of Operations at Elcom. Larnach will explain how Elcom embeds Actuate's reporting solution in PECOS, its cloud-based e-procurement solution, which touches all aspects of the procurement process. Embedded analytics enables users of PECOS – a user base 120,000 strong, in more than 200 organizations, managing nearly $20 billion in total procurement spending annually – to access standard reports, slice and dice data for analysis, create custom reports and presentations of the data, and export transaction history to many different formats, all without IT expertise. PECOS users include the Scottish Government (including health services, universities and colleges, and government departments), several health services groups in Britain, the Northern Ireland Assembly, several school districts in the United States, the Tennessee Valley Authority (TVA), and many other organizations and companies. Elcom has identified over a billion dollars in audited savings that its customers have accrued thanks to embedded analytics – more than $500 million in the healthcare sector alone. Elcom's application is truly an embedded analytics success story.

The embedded analytics capability in PECOS, delivered with Actuate technology, is an important competitive differentiator for Elcom. Its competitors' products either have limited fixed reporting or don't offer any standard reporting at all. Those competitors "are scrambling to adopt a flexible embedded approach such as the one enjoyed by PECOS users," Elcom says.

You're sure to have questions for Dresner and Larnach, so the webinar will include a Q&A session. (An Actuate technical expert will also be on hand if you have specific questions about our embedded analytics capabilities.) The webinar will be accompanied by live tweets using the hashtag #embeddedanalytics. Register today.

Read More

Will We See 3D Printed Action Figures for Star Wars VII?

Forecasting the demand for merchandise and toys to support a new Star Wars film always requires a bit of magic and Jedi arts. How many of each action figure, lightsaber and book should be produced? It has been almost 10 years since the last film was released, so there is no recent supply chain data to analyze. And, of course, we want to avoid a 1977-style imbalance of the force (between supply and demand), when toys couldn't be mass-produced in time for the holiday season. What if we forgot about the traditional mass-production-in-China model? What if, instead, toy manufacturers used 3D printing to create action figures and other toys on demand? Here is how it would work.

Imagine you are seven years old. You convince your Mom or Dad to take you to Walmart, Toys R Us or Target, where you could visit a special Star Wars kiosk in the toy section. The kiosk would allow you to select action figures to buy – new characters from Episode VII and older ones from the first two trilogies. There would be options to purchase 100+ standard "pre-designed" action figures ($5.99). Alternatively, you could customize the design with hundreds of other permutations ($10.99) – different weapons (lightsabers, guns, sticks), headwear (masks, helmets) and clothing (capes, gowns). After adding your selections, the kiosk would direct you to the in-store factory. You and your parents could watch through a glass window as your own Chewbacca is printed layer by layer. After printing, a robotic arm could package the action figures into personalized boxes and put them onto a mini conveyor belt. A store clerk would then retrieve the figures and hand them to the customer.

3D printers and scanners are already emerging in major retailers around the world. UK-based ASDA offers kiosks where customers can print a miniaturized version of themselves – 3D selfies. Customers stand in a body scanner similar to the one you see at the airport. It takes photos of you from different angles to create a three-dimensional model. The model is then sent to an on-site 3D printer, which creates your mini-me a few minutes later. I don't think it is unreasonable to expect that within twelve months customized 3D printing stations could be introduced to major retailers to support the Star Wars movie launch. It would certainly make the buying experience much cooler, and there would be many more SKUs and customization options available.

It would be good for the environment too. The 3D printing model should reduce the carbon footprint of a typical Stormtrooper. The mass production process overseas would be eliminated, so less pollution and waste would be generated. The transportation of products by ship and truck from Shenzhen to retailers around the world would be eliminated, meaning less oil consumption and lower carbon emissions. There could be a perfect balance of the force(s) – of supply and demand. Only the exact quantity of toys desired would be produced. No more Jedi arts to predict six months in advance how many R2-D2s to make. There should be fewer out-of-stocks as well. Stores would simply need to keep sufficient printer capacity and enough of the associated materials on hand.

The customizable 3D printed action figures could also be available online. The experience from the in-store kiosk could be replicated in a mobile app or a web browser. Imagine placing an order on your iPhone, and one hour later an Uber driver pulls into your driveway with your Luke Skywalker. Even cooler would be a mobile app that could render 3D holographic images of the characters. Imagine the ghosts of Anakin, Obi-Wan and Yoda being inserted into pretend battles with real physical action figures. The hologram apps may not be available by next December, but we should not be surprised if the technology is ready for the launch of Episode VIII.

The post Will We See 3D Printed Action Figures for Star Wars VII? appeared first on All About B2B.

Read More

Data Driven Summit – The Future of BIRT Analytics [Video]

What's the best way to turn advanced analytics insight and intelligence into something that is useful in everyday apps and everyday situations? It's a trick question from our perspective, because at Actuate we make tools that help organizations get a handle on Big Data, advanced analytics and visualizations. It's a highly competitive market because more and more organizations are required to make fast decisions that can have a big impact on their business – decisions that help reduce customer churn and anticipate risk, for example. So to assist data geeks and non-data geeks alike, Actuate is adding new capabilities to its BIRT Analytics platform (version 5.0) in early 2015. The improvements will allow business users to analyze billions of rows of data in seconds and generate actionable, real-time results. BIRT Analytics runs as a server on Windows, Linux and Mac OS X. Users can also access enriched data or intelligence from BIRT Analytics and embed it into existing applications via a public API. This helps organizations discover hidden insights for forecasts, predictions or patterns for future success, while keeping abreast of the competition. During Data Driven Summit 2014 – Actuate's annual series of customer events – Mark Gamble (@heygamble) and Pierre Tessier (@puckpuck) delivered a "must see" live demo session exploring some of the advanced features in BIRT Analytics 5.0. Actuate VP of Product Marketing & Innovation Allen Bonde (@abonde) was also on hand to give his perspective. Here's that discussion at the Data Driven Summit in Santa Clara, Calif. We'll be posting more of the Data Driven Summit 2014 video series here, including the other demonstrations, BIRT data visualization insights and panel discussions with industry insiders. Subscribe to be informed when new videos are posted.

Read More

Join Santa Claus on his Journey to the Digital First World!

When OpenText acquired GXS in January 2014, little did the company know that it would also be acquiring a customer widely regarded as having one of the most secretive businesses in the world. Over the years, many companies have decided to outsource the management of their B2B environment, and in 2008 GXS signed a Managed Services contract with its most high-profile customer, Santa Claus Enterprises in the North Pole. Over the years I have kept in close contact with this particular customer, as they have been a shining example of how to deploy the full portfolio of B2B solutions from OpenText. Each year, just before Santa's busiest period, I have provided a summary of the enhancements to their B2B environment. The evolution of Santa's B2B environment is documented in the blogs below; feel free to take a look through, as they also provide some interesting insights into what it takes to deliver millions of Christmas presents on just one night of the year.

2013 – Santa deploys the Internet of Things across his North Pole operations
2012 – Santa begins to evaluate the information flowing across SantaNet and implements a Big Data strategy
2011 – OpenText Active Community gets rolled out across Santa's trading partner community to improve day-to-day collaboration across his Present Delivery Network, and he also gets nominated for a B2B Heroes award
2010 – Santa evaluates how cloud computing and mobile devices could improve North Pole operations
2009 – Santa completes deployment of OpenText Managed Services and begins to embrace social media tools
2008 – OpenText Managed Services chosen to support Santa's new B2B hub; OpenText Intelligent Web Forms deployed to create SantaNet

Santa's little helpers, namely his army of elves, were asked by Santa to review the portfolio of Enterprise Information Management (EIM) solutions from OpenText to see where further benefits could be gained by automating manual business processes and digitising the remainder of his business operations. Many companies are embarking on a digital journey to improve the way in which different departments manage and get access to their corporate information. In fact, 'Digital Transformation' projects are high on the agenda of many CIOs around the world at the moment, and OpenText is in a unique position to provide a one-stop shop to transform companies into digital businesses.

In August I received an email from Sonja Lundström, Santa's trusted advisor and executive assistant, inviting me to go up to the North Pole to provide a digital business briefing for Santa and his executive board. Santa's board members comprise senior executives from some of the world's leading toy manufacturers, including Mattel, Hasbro and Lego. As with previous trips up to the North Pole, I was asked to check in at the Elf Air desk at a secret terminal at Schiphol Airport just outside Amsterdam. This year I had the privilege of travelling on one of Santa's new Airbus A380s, a converted passenger plane that allows Santa, when required, to expedite the shipment of thousands of parcels to any one of his Present Distribution Hubs located in strategic locations around the world. The plane I travelled on, call sign ELF020, was one of a fleet of ten aircraft that Santa had chartered for the 2014 holiday season. Sixteen hours after leaving Amsterdam I was checking into the North Pole Ice Hotel, a stone's throw from the entrance to Santa's primary toy manufacturing and distribution facility. I decided to get an early night as I knew the following day would be quite busy!
The next day I walked across to Santa's factory and was whisked up to the executive briefing centre, where I was introduced to Santa's board members. Five minutes later, the main man himself walked through the frosted glass doors to the board room. Following introductions, Santa's Chief Elf Information Officer provided an update on their current IT and B2B related projects. I have documented many of these projects quite extensively in the earlier articles listed at the beginning of this blog. Needless to say, I was very impressed by the ROI that Santa had obtained by deploying OpenText Managed Services. Santa's core B2B platform, the Present Delivery Network, processes billions of transactions each year, and over the last five years Santa had seen a 40% growth in new present orders through SantaNet, a web-form-based toy ordering environment that our company set up in 2008. The growth in new orders had come from the so-called omni-channel effect, with children placing toy orders through PCs, mobiles and tablet devices.

In addition to deploying a world-leading B2B platform, Santa's team rolled out their 'Internet of Santa's Things' infrastructure, a high-profile initiative to provide improved visibility across Santa's Present Delivery Network. The Internet of Things has become one of the most talked-about disruptive digital technologies of 2014; Santa had no concerns about deploying his IoST environment, and he certainly proved to be a digital trailblazer in this particular area.

In addition, Santa had embraced a number of other disruptive technologies during 2014. Last year I discussed how Santa's elves were using Google Glass in their warehouses to improve their toy pick rates. In addition to Glass, Santa had tested some other high-profile disruptive technologies. A few years ago Santa invited Steve Jobs to his factory, and following lengthy discussions Santa Claus Enterprises became a leading member of Apple's beta test program. As soon as the early iWatch wearable devices were revealed to the world's media in 2014, Apple despatched a shipment of iWatches for every elf in the factory. These came pre-loaded with a number of festive mobile apps to help improve the day-to-day efficiency of Santa's team of elves.

3D printing was rolled out across Santa's production department, not just for manufacturing proof-of-concept toy designs but to build scale models of new sleigh designs that would then be refined in Santa's onsite wind tunnel. Sleigh research budgets have increased significantly over the years, and 3D printing was helping to develop the most aerodynamically refined sleigh in the world. The final area of digital disruption that Santa embraced in 2014 was advanced robotics. Santa had heard that Foxconn, a leading contract manufacturer to Apple, was deploying up to a million 'Foxbots' across their manufacturing operations. Santa decided that he wanted to deploy 'Elfbots' to bring similar efficiencies to his own production operations. Santa is now working with Andy Rubin, head of Google's newly formed robotics division, to define a development plan for his network of 2,000 Elfbots.

Santa has done a great job of ensuring that he can seamlessly connect with the little children around the world. So in many ways Santa's operations were already significantly digitally enabled, but now that GXS had been acquired by OpenText there was scope for the deployment of further digital information tools.
After all, many of the new disruptive technologies such as connected IoST devices were producing high volumes of unstructured data that would need to be archived, analysed and acted upon as required. After the CEIO had provided his updates, it was time for me to take to the floor. I provided Santa and the board with a high-level introduction to OpenText, and they were very impressed with the joint customer base and the opportunities available to embrace new Enterprise Information Management solutions. Even though Santa had consolidated many back-end business systems, such as his Elf Resources Platform (ERP), there were still many different information silos located within the various departments of his operations. Just finding the right information at the right time proved to be a challenge on occasion. To gain further efficiencies across Santa's operations it would be important to ensure that all departments could feed off a centralised digital information hub. This hub would be accessible any time, any place or anywhere, which would be useful considering the global nature and complexity of Santa's operations.

OpenText solutions are divided across five key 'pillars', with Santa's B2B solutions falling under the Information Exchange pillar. Before I had even explained each of the five solution pillars, Santa could immediately see that there was a significant opportunity to increase the footprint of OpenText solutions across his business. Santa said that he would like OpenText to become his trusted guide during his journey into the digital-first world. But first he wanted me to highlight how OpenText could manage different types of information at the key stages of a toy's lifecycle. I mapped out some of the key process stages across Santa's manufacturing operations and overlaid, where appropriate, the five key solution pillars as they apply to each stage of the lifecycle of a toy (which in reality could represent any manufactured product). Now I could go into detail about how OpenText can help manage information across each of these twelve process steps, but for the purposes of this article, let me just expand on five of them.

Toy Design & Engineering – At this phase of a toy's lifecycle, any information associated with the design of a toy will need to be centrally managed and archived in an Enterprise Content Management (ECM) solution. Typical files managed at this stage include 3D CAD/CAM models, 3D printer files, 2D drawings, production-related information, and high-quality rendered images and 3D animations. A Digital Asset Management solution from OpenText would allow Santa's marketing elves and outside PR agencies to review and download high-quality rendered images and videos for use in promotional materials. Information Exchange (IX) solutions such as Managed File Transfer allow Santa's design elves to send large design files anywhere across the extended enterprise, including to contract manufacturers.

Procurement / Supplier Onboarding – This is the part of the toy's lifecycle that GXS, now Information Exchange, has been supporting over the past few years, from onboarding suppliers and ensuring they can exchange B2B transactions electronically to providing back-end integration to Santa's ERP platform. In addition, it is important for a procurement team to work collaboratively with their suppliers, and all proposal, contract and contact information will need to be centrally managed.
The procurement elves may need to undertake some form of Governance, Risk and Compliance (GRC) assessment across their trading partner community. GRC is becoming an increasingly important area for many companies, and new regulations such as conflict minerals compliance need to be adhered to and managed in an effective way. Just as an aside, Santa takes Corporate Social Responsibility really seriously, so much so that he would like to set up an Elf Information Management System (EIMS) to help with the day-to-day management of his elves and ensure the quality of their welfare whilst working in the toy factory.

Plant Maintenance and Asset Management – Santa has an army of elves conducting proactive maintenance on shop-floor manufacturing and assembly equipment. Given the tight production schedule that Santa has each year, his elves ideally need quick access to maintenance and machine test procedures, 2D maintenance drawings, and equipment test and compliance certificates. Even ensuring that Santa's elves adhere to the latest Elf and Safety procedures has become a challenge over the years. The elves already have access to ruggedized tablet devices for use on the shop floor. Using AppWorks, OpenText's mobile app development platform, Santa's elves would be able to get remote access to any information archived in the central content management system. In addition, the elves need to follow a standard process for maintaining each piece of equipment, and OpenText's Business Process Management (BPM) solution would be able to manage all the process steps involved in maintaining Santa's production equipment more effectively. Can you imagine what would happen on the 24th of December each year if the toy production lines were halted due to a malfunctioning assembly robot?

Online Customer Experience – The SantaNet portal had worked well over the years and allowed the little children of the world to log in to a portal and submit their present wish lists! At this stage of the toy's lifecycle, various web-related assets will need to be created and managed; for example, product brochures, toy promotion videos and animations will need to be accessed by different elves across the extended enterprise and by outside video production agencies. OpenText Customer Experience Management (CEM) solutions are ideal for this purpose. Given the connected nature of today's children, Santa would be able to set up a best-in-class 'Young Person Experience Management' offering that would leverage OpenText's Web Experience Management offering. In addition, all other internal websites used by his elves could be upgraded with the latest portal technologies offered by OpenText.

Recalls and Warranty Repair – The final stage of a toy's lifecycle relates to the potential recall or repair of toys. Unfortunately, not every toy delivered via the chimney makes it safely down to the fireplace, and breakages can occur. Santa established a toy repair and recall centre ten years ago; however, many of the processes used to recover broken toys from the world's children are quite lengthy and prone to delays due to the amount of manual paperwork that needs to be processed. In addition to repairs, sometimes toys have to be recalled, perhaps due to poor-quality workmanship by Santa's elves. Whether repairing broken toys or recalling faulty toys, Santa's elves could significantly improve operational efficiencies by deploying OpenText's Business Process Management (BPM) solution.
BPM will ensure that every toy that needs to be repaired or recalled follows a strict series of process steps. This ensures that a consistent and repeatable repair/recall process can be established, which helps to improve Child Satisfaction Levels, a key metric used by Santa to keep the world's children happy with their toys. In addition to providing an overview of these five solution areas, I explained to Santa that OpenText was looking at how the different pillar solutions could be integrated together. I also showed a new fast-moving video which helps to describe the OpenText Cloud.

To wrap up my presentation to Santa and the board, I discussed new development areas and highlighted a recent announcement concerning OpenText's intention to acquire the business intelligence company Actuate. Last year, when I visited Santa Claus Enterprises HQ, I was shown the latest beta version of SantaPad, a Big Data analytics engine for processing toy consumption trends across the little boys and girls of the world. Actuate could potentially provide the business intelligence platform to significantly improve the Big Data analytics capabilities across Santa's operations. Santa was so excited by this news that he requested a briefing on Actuate's capabilities, as and when it was convenient for OpenText to provide one.

We had just gone over our two-hour presentation slot with Santa, so I closed by summarising how OpenText helps businesses move to a 100% digital business. Firstly, OpenText can help to Simplify Santa's back-end platforms to manage enterprise-wide business information, irrespective of which application the information was originally created in. Secondly, OpenText can help to Transform information from literally any format to another and ensure that digital information can be exchanged both internally across the elf community and externally across third-party contract manufacturers and logistics providers. Thirdly, OpenText can help to Accelerate the adoption of digital technologies, allowing faster business decisions to be made. Santa's operations would ultimately become more responsive to changing consumer demand and increased competition from new emerging toy markets.

This brought our meeting to a close, and I had a number of actions to follow up on with my colleagues back at OpenText! In closing, Santa wished OpenText and our global customers Season's Greetings and a Happy New Year, and he said he was looking forward to working closely with OpenText during 2015 and beyond. So it just leaves me to say season's greetings and best of luck for 2015!

Read More

Fax to the rescue – again!

Most of us can't imagine a business day without email. It's the lifeline of most communication both inside and outside an organization. But what happens when you can't use your email? That's what Sony employees and executives are faced with right now, in light of the recent hack of their network and email system. It seems that Sony employees have taken to picking up the phone, writing handwritten notes and, you guessed it, sending faxes as a way to communicate with one another. Old school, you say? Old reliable, we say. Among those alternatives, fax is the most reliable and secure form of communication. Because a fax is a point-to-point transmission between two parties, it is far harder to intercept or hack. A phone call can be overheard, a handwritten note stolen. An electronic fax can go from one party to the other, safely and securely. The next time you need to communicate something that you don't want to see in a headline, we suggest you FAX it. See the story here: http://uk.businessinsider.com/sony-execs-use-fax-machines-after-hack-wiped-out-email-2014-12

Read More

Demand Forecasting for Star Wars VII Merchandise – Use the Force

One year from today (December 18, 2015), the seventh episode of the Star Wars saga will be released in movie theaters around the world. The movie will only last two hours, but kids will relive it for years afterwards with the Star Wars action figures and other new toys that will accompany the film. Forecasting demand for toys and merchandise associated with a major movie release can be quite challenging. When the original Star Wars action figures were released in late 1977, there was a huge supply shortage the following Christmas.

A small toy company named Kenner had licensed the rights to produce toys for the original Star Wars film. With numerous production delays and budget overruns, few associated with the film expected Star Wars to be a success before its release. Consequently, Kenner had not bothered to manufacture any merchandise in time for the release. Needless to say, the film was a smashing success, breaking box office sales records and creating a loyal fan base of millions seemingly overnight. As a result, Kenner had to develop a merchandising strategy to capitalize on the film's widespread popularity. But the development of plastic toys required over a year. The action figures needed to be designed, sculpted and tested. Expensive and time-consuming steel molds were needed to support the manufacturing activities. Starting in mid-summer, Kenner would not be able to get the products to market in time for the all-important Christmas holiday season. Although Kenner held the license rights to the biggest movie in history, it could not capitalize on the opportunity because it could not get its merchandise to market fast enough.

Toy manufacturers won't make the same mistake with Episode VII. With filming already completed in Iceland and Abu Dhabi earlier this year, I suspect design has already started on action figures, replica ships and other merchandise in preparation for launch day. Manufacturing of the toys will need to start four to six months in advance (July-September 2015) to allow adequate time for product to be shipped from China via ocean freight to stores around the world. Forecasting demand for the toys will require the usual guesswork. Which action figures will be the most popular? Will it be the older versions of Princess Leia, Luke Skywalker and Han Solo, or will it be new additions Poe Dameron and Kylo Ren? Only the force could be used to make these predictions accurately. Episode VII's release seven days before the Christmas holiday will complicate matters even further. Millions of kids will view the movie in the first few days and then make last-minute additions to their holiday wish lists. Santa and his elves will have to work quickly to respond to changing demand patterns right up until Christmas Eve. I think the best way to launch the toys would be not to mass-manufacture them in China, but instead to produce them on demand in retail stores with 3D printers.

Read More

Accessible Communications Deadlines Looming for Public & Private Sectors in Ontario

Do you communicate with your customers electronically? An important deadline is looming for businesses that operate in Ontario. If your organization uses PDF documents in its Customer Communications strategy, then you should be concerned about the deadline for providing accessible communication supports in those PDF documents. The Accessibility for Ontarians with Disabilities Act (AODA) was established in 2005 to help fight discrimination against people with disabilities in Ontario. Then, in 2010, the Ontario Government enacted the Integrated Accessibility Standards Regulation (IASR) under the AODA. The IASR sets deadlines for organizations to provide accessible communication supports, prescribed based on the type and size of organization. The Ontario Government and Legislative Assembly have been on the hook to meet this requirement since January 1st, 2014. The public and private sectors are expected to comply according to the following schedule:

- Large Public Sector Organizations: as of January 1st, 2015
- Small Public Sector Organizations and Large Private Organizations: as of January 1st, 2016
- Small Private Organizations: as of January 1st, 2017

A large organization is one that employs 50 or more people. Programs such as paperless billing, which may involve electronic delivery of PDF-based statements, invoices and bills, are almost ubiquitous in industry today due to their cost savings and various corporate green initiatives. People with disabilities such as blindness, partial vision loss and cognitive disabilities that interfere with reading ability should be just as able to take advantage of these socially responsible programs as anybody else. Not having communication supports built into these documents creates barriers to their participation and violates their rights under the law.

Accessible communication supports for PDFs are provided by adhering to the PDF for Universal Accessibility (PDF/UA) standard, which requires including a tag structure and other metadata within the file. PDF/UA-compliant documents tell assistive technologies such as screen readers what the meaningful content is and in what order it should be read. This includes information such as language specification, identification of the document hierarchy, and alternate text for images used as content. Without these, screen readers see a PDF document as essentially empty.

Many organizations take a manual approach to providing accessible formats, responding to customer requests and sending documents to service providers at tremendous cost (from $5 to $35 per page) to be converted on demand. However, this is an exclusionary approach, requiring people with disabilities to inform these organizations of their disability, which to them is private information. In addition, delays in having documents made accessible by hand can disadvantage the consumer, especially when the information requested is time-sensitive. An automated transformation approach, as provided by Actuate's Document Accessibility Solution, can solve this problem by providing consumers with virtually instant access to accessible versions of their statements, allowing them the same timely access to information enjoyed by their sighted compatriots. Because it integrates easily with an organization's existing Customer Communications Management (CCM) systems, it can be provided inclusively, eliminating the need for consumers to divulge private information.
This can all be done at a small fraction of the per-page cost of the labor-intensive manual remediation approach. It not only improves an organization's public image, but also reduces expenses, improving the bottom line. Organizations facing these deadlines under the IASR should already be thinking about how they will comply. To find out more about Actuate's Document Accessibility Solution, simply send your request for information to ccminfo@actuate.com, and we would be delighted to start that discussion with you.
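For developers who want to see what the tag structure described above looks like in practice, here is a minimal pre-flight sketch using the open source Apache PDFBox library (version 2.x; this is not part of Actuate's solution). It checks a few necessary, though by no means sufficient, conditions for a tagged PDF:

```java
import java.io.File;

import org.apache.pdfbox.pdmodel.PDDocument;
import org.apache.pdfbox.pdmodel.PDDocumentCatalog;

public class TaggedPdfCheck {
    public static void main(String[] args) throws Exception {
        try (PDDocument doc = PDDocument.load(new File(args[0]))) {
            PDDocumentCatalog catalog = doc.getDocumentCatalog();

            // A tagged PDF declares /MarkInfo /Marked true in its catalog
            boolean marked = catalog.getMarkInfo() != null
                    && catalog.getMarkInfo().isMarked();

            // ...and carries a structure tree describing reading order and hierarchy
            boolean hasStructureTree = catalog.getStructureTreeRoot() != null;

            // PDF/UA also expects a document language, e.g. "en-CA"
            String language = catalog.getLanguage();

            System.out.println("Marked as tagged:  " + marked);
            System.out.println("Structure tree:    " + hasStructureTree);
            System.out.println("Document language: " + language);
        }
    }
}
```

Passing these checks does not make a document PDF/UA compliant (full validation also covers alternate text and correct tag semantics), but failing them is a sure sign that a screen reader will see the file as essentially empty.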

Read More