
10 More Features a Cloud-Based B2B Integration Platform Needs to Offer


In my last post I shared 10 features that a cloud-based B2B integration platform should offer beyond simple message translation and protocol mediation – services such as compression, encryption and data enrichment. But there are many more features that the leading platforms have already started to offer:

1) Digital Signatures
In Plain English – Put your electronic John Hancock on it to confirm that you actually sent it.
Example – You recently shipped 5,000 brake pads from your plant in Stuttgart to your customer outside Paris. Now it is time to send an invoice. The German tax authorities require that all electronic invoices be transmitted with a digital signature. If there is an audit, government authorities can be sure that fields on the invoice were not manipulated to evade Value Added Tax. The cloud-based integration service should be able to apply the appropriate digital signatures to comply with German law.

2) Message/File Split
In Plain English – Take one message and split it into two.
Example – You need to make 100 payments to various suppliers for goods and services rendered. Some payments are to be made by check, others by wire transfer, and the remainder by Automated Clearinghouse (ACH). You upload a file to your bank listing each supplier's name, payment amount, payment method and payment date. But before the file reaches the bank's processing systems, a cloud-based integration service splits the list of payments into three separate files – one for checks, one for wires, one for ACH. The three files are then routed on to the corresponding payment processing systems in the bank. (A minimal sketch of this pattern appears after the list.)

3) Message/File Merge
In Plain English – Take two (or more) messages and merge them into one.
Example – Your online retail customer requires that you send an inventory update once an hour for the high-volume consumer electronics products they sell on their website. But the inventory data is housed in four different applications behind your firewall. Instead of sending four different files to the retailer, you prefer to consolidate the information into a single report. The cloud-based integration service should be able to collect the files from the four different applications and merge them into a single report.

4) Long Term Archiving
In Plain English – Keep a copy of the message on file for several years.
Example – You exchange invoices electronically in Europe with your customers and suppliers. The tax regulations in several EU member states require copies of invoices to be kept on file for up to 10 years. A cloud-based integration service could store the invoices in their original format for a period of years to comply with the regulation.

5) Alerting
In Plain English – Tell me when something is wrong with my message.
Example – The map translating a $1M purchase order from your largest customer into your SAP system just failed. You have a two-hour service level commitment to acknowledge orders from this account. Your sales organization needs to phone the customer to confirm the order, but how do they even know it exists? A cloud-based integration service could notify individual sales representatives when errors or exceptions occur for big customers.

6) Quarantine
In Plain English – Put incoming data in a holding cell until it gets cleaned up.
Example – Your supplier sends you an invoice for $500,000 for widgets it delivered last week. But the invoice does not contain the general ledger code of the buying organization or a reference to your purchase order number.
If the invoice goes into your ERP system, it becomes your accounting team's problem to track down the missing information – unless there is a way to force your supplier to enter the missing fields before the invoice is allowed through your firewall. Cloud-based integration services should be able to quarantine messages containing bad data.

7) Search and Replace
In Plain English – Replace "This" with "That".
Example – You recently changed the name of your primary product. Your marketing team insists that the brand be represented properly in any outbound communications, including EDI/XML. Just as in your word processor, you need to replace the words "Big Widget" with "Ultra Widget." Your cloud-based integration service should be able to perform a global search and replace within inventory reports, ship notices and other electronic messages.

8) Duplicate Checking
In Plain English – Don't send the same document twice.
Example – Your supplier is having problems with their B2B integration environment. It is unstable and has been restarted three times in the past week. Each time the supplier restarts its B2B environment, it attempts to resume processing all the transactions in its queue. But the queue gets corrupted, causing the same messages to be sent more than once. As a result, you receive duplicate invoices for $2M each. Will your accounting team catch this before you make a double payment? Your cloud-based integration service should be able to perform duplicate matching on transactions.

9) Carbon Copy
In Plain English – Send a copy to a third party.
Example – Just as with email, you can carbon copy interested parties on EDI/XML documents in addition to the intended recipient. You have outsourced fulfillment and inventory management of certain products to a third-party logistics provider (3PL). Whenever you receive purchase orders for those items, you want the 3PL to receive a copy as well so they can process the order.

10) Out-of-Sequence Processing
In Plain English – Make sure the documents are received in chronological order.
Example – At 9:56AM a status message is received from a trucking company stating that your shipment is on the road 100 miles away and will arrive this afternoon. At 10:30AM you receive a shipment notification from the manufacturer that the same order has just shipped from the original factory 500 miles away. Have the laws of physics been violated? No – the trucking company simply sends its EDI messages more frequently than the manufacturer. Companies often send EDI/XML documents in batches once per hour (or a few times a day). Cloud-based integration services can re-sequence documents into chronological order so that there is no confusion about timelines.
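To make the file-split pattern (feature 2) concrete, here is a minimal sketch in Python. It assumes a hypothetical CSV payment file with a payment_method column containing CHECK, WIRE or ACH; a real integration service would of course work against its customers' actual file formats and routing rules.

```python
import csv
from collections import defaultdict

def split_payment_file(path):
    """Split one payment file into one output file per payment method."""
    batches = defaultdict(list)
    with open(path, newline="") as src:
        reader = csv.DictReader(src)
        fields = reader.fieldnames
        for row in reader:
            # Group rows by the (hypothetical) payment_method column.
            batches[row["payment_method"].upper()].append(row)
    for method, rows in batches.items():
        # Writes e.g. payments_CHECK.csv, payments_WIRE.csv, payments_ACH.csv,
        # each of which would then be routed to the matching bank system.
        with open(f"payments_{method}.csv", "w", newline="") as out:
            writer = csv.DictWriter(out, fieldnames=fields)
            writer.writeheader()
            writer.writerows(rows)

split_payment_file("payments.csv")
```

The merge feature (3) is essentially the inverse of this loop, and duplicate checking (8) can be approximated by keeping a store of hashes of recently processed messages and rejecting any repeat.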


Santa Deploys the ‘Internet of Things’ Across his North Pole Operations

Over the past five years I have been providing updates on one of our more secretive customers, based at a large factory at the North Pole. Nearly all of our B2B solutions have been implemented across the supply chain and distribution network of the big man himself, Santa Claus. I have listed five years' worth of project updates below; simply click on the web links to learn how cloud B2B integration helps Santa with his operations.

2008 – GXS Managed Services chosen to support Santa's new B2B hub, and GXS Intelligent Web Forms deployed to create SantaNet
2009 – Santa completes deployment of GXS Managed Services and begins to embrace social media tools
2010 – Santa evaluates how cloud computing and mobile devices could improve North Pole operations
2011 – GXS Active Community (formerly RollStream) is rolled out across Santa's trading partner community to improve day-to-day collaboration across his Present Delivery Network, and Santa is nominated for a B2B Hero award by GXS
2012 – Santa begins to evaluate the information flowing across SantaNet and implements a Big Data strategy

Five years on from the initial discussions with Santa's IT team, I have just returned from a three-day trip to the North Pole. Getting an audience with Santa has always been difficult, especially at this time of year, and it was whilst returning from a business trip to Amsterdam that I received an email from one of his many assistants: "Please can you come up to Santa's factory at the North Pole, as he would like to update you on how we have expanded our Present Delivery Network Hub during 2013. A seat has been reserved for you on ELF001, which will be leaving from Schiphol Airport at 21:00hrs."

I cancelled my flight home and boarded ELF001, a 747 'DreamLifter', up to North Pole International Airport. I showed a picture of one of these aircraft in last year's update, and it always amazes me how much you can fit into one; Santa leases several each year to help distribute over 520 million presents to his global network of Present Distribution Hubs. When I arrived at Santa HQ I was whisked through the fast-track security channel (security is normally tighter here than at any TSA checkpoint found at North American airports) and taken straight to the 'Project Dasher' war room, where the global deployment of GXS Managed Services was initially masterminded. In the corner of the room was Santa, sitting by the fire reading a copy of our new EDI Basics book (download a copy HERE).

One thing you quickly learn about Santa, probably due to the nature of his job, is that he knows most of the IT and technology trends that have made news in recent years. He always looks for ways to continuously improve his operations, and during 2013 he has been implementing a new project relating to the 'Internet of Things'. The Internet of Things relies on machine-to-machine connectivity via the internet to exchange real-time information from one device to another. 'The Internet of Santa's Things' (IoST) has now been deployed across his entire operation. Santa already had good visibility across his operations, but the connected nature of the Internet of Things has allowed him to take enterprise-wide visibility to an entirely new level. He already had his trading partners connected to his Present Delivery Network Hub, and last year he spent a lot of time implementing a Big Data strategy to analyse the information flowing across this platform.
Santa quickly realised that if he could somehow connect his digital and physical supply chains together, he would obtain even greater operational efficiencies. However, there was one major stumbling block to deploying IoST: he had to connect every machine and piece of equipment to the internet. Santa decided to extend this still further by connecting every employee and reindeer to IoST as well, but more on this later. Santa had to sign partnership agreements with numerous network and industrial automation providers, as well as one of the world's largest mobile network companies, to allow all aspects of his operation to connect to IoST. Every piece of equipment that needed to be connected to IoST had to have a WiFi card attached to its main control board. The extended real-time connectivity that IoST now provides gives Santa some interesting insights into his operation. Here are just a few examples of how IoST is beginning to help.

Every piece of warehouse and logistics equipment within Santa's global network of Present Distribution Hubs is now connected to the internet. Increased connectivity across his distribution network has helped to remove traditional blind spots where inventory levels had previously been difficult to monitor. Every stock movement via 'internet-connected' machines such as fork-lift trucks and pallet movers can now be accounted for, providing a more accurate view of inventory levels. Given Santa's tight schedule over the Christmas period, every second saved through improved visibility helps to improve the overall service to his customers, the children of the world.

Closed-loop processes were implemented to allow the automatic ordering of toy parts. As soon as parts inventory fell below a certain level, sensors in the storage bins sent a message to the order management system, and electronic orders for new parts were sent directly to the supplier with no elf intervention at all. This closed-loop ordering process has helped to significantly reduce buffer stocks of toy parts, which are now ordered on demand, as they are required.

Santa's army of warehouse associates, the elves, work in warehouses ten times the size of Amazon's largest warehouse in the world. To improve elf productivity it is important for presents to be located quickly in their respective storage locations within the warehouse. To maximise efficiency, each elf has been issued with Google Glass, which helps to locate specific presents in the warehouse and provides instant access to product information. The extensive network of sensors and other connected devices transmits a constant stream of information across the warehouse and factory locations, and Google Glass provides one of many visual 'entry points' into the information flowing across Santa's Present Distribution Network.

The North Pole Union of Elves (NPUE) has been keeping a close eye on the working environment of Santa's elves in recent years. The elves work tirelessly during the Christmas period to fulfil 'orders' from the little children of the world, and to protect their health each elf has been issued with a slightly modified Jawbone device to wear on the wrist during the working day. The Jawbone device monitors the work, health and sleep patterns of each elf.
As each elf clocks out at the end of the day, information from the Jawbone device is uploaded to a central database, and the health of each elf is analysed overnight to ensure they are working within the union's guidelines. So as well as the machines, every one of the 10,567 elves is connected to IoST.

Santa started to implement a Big Data strategy in late 2012, but now, with every machine and elf connected to IoST, he has a huge amount of data available to help him make better-informed management decisions. Santa heard from one of the leading industry analysts that Big Data analytics was going to become a more important area in coming years, which is why he has decided to sponsor a degree at a local Elf University to help elves analyse these rich data sets and essentially become world-leading information scientists.

The scale of Santa's operation is huge, and so are the energy bills to run his numerous factories and distribution facilities. Santa's business sits at the heart of an area often depicted in news reports about global warming, and for this reason he has an added interest in preserving the environment. The Internet of Things presented Santa with an opportunity to monitor energy levels across his facilities and take corrective action to improve the energy efficiency of these operations. Over the past year Santa has built up a global network of technology partners: one example was Google, discussed earlier, and another is NEST, a producer of leading-edge thermostats and smoke-detection systems. These devices are installed in every major work area of Santa's operation, and each device is connected to IoST, allowing Santa to remotely monitor and control the temperature to ensure his elves have an optimum environment to work in. Santa also worked with local water and electricity companies to find ways of monitoring energy usage. Power for his factories and warehouses can be supplied either from water-based power generation equipment located under the Arctic ice floes or from the giant solar panels and wind turbines that have been installed across the North Pole.

Santa takes great pride in the operational efficiency of his elves on the production lines as they assemble toys. To improve his levels of factory automation, Santa has also worked with leading industrial automation companies to install automatic assembly and packaging machines. Needless to say, these pieces of production equipment have also been connected to IoST. In fact, since connecting to IoST the production line has experienced virtually zero downtime, thanks to the 'predictive maintenance' processes that have been put in place. In the old days Santa's maintenance elves would carry out preventative maintenance on production equipment; with the introduction of IoST, Santa implemented a predictive maintenance process that harnesses the information transmitted from each machine and allows the elves to decide whether parts need replacing. This process has been so successful that even Santa's contract manufacturers have connected their production equipment to IoST so that they can enjoy the benefits of predictive maintenance.

Some people think that Santa's reindeer have some form of nuclear power source, as they are able to go around the world in a matter of hours.
In fact, following the elves' adoption of Jawbone devices, Santa thought it only fair to ensure the well-being of the other key members of his extended staff: his herd of reindeer. Santa had read in a recent edition of Wired magazine that scientists had experimented with health-monitoring chips embedded under the skin to monitor key body functions. He worked with one of the world's leading universities conducting research into animal health to see if a chip could be embedded under the skin of a reindeer, allowing each reindeer's health to be monitored remotely 24/7. The chip monitors nearly a dozen key body functions and transmits this information back to Santa HQ via IoST every minute, so that Santa can replace a reindeer as and when required and ensure that the herd is working at optimal performance.

Santa's sleigh contains nearly as many sensors as a Formula One race car. The 150 sensors placed at strategic points on the sleigh monitor everything from speed, sleigh distortion and temperature to weight and on-board inventory levels. The information from 'Sleigh Force One' is transmitted to Santa HQ via IoST; each burst, nearly 2GB of data, is transmitted every five minutes and archived on a bank of storage devices within Santa's newly upgraded data centre. Remember those barges mysteriously floating off the California coast recently? Google said they were going to become facilities for showcasing new products. In fact, one of the barges is a new data centre that was recently towed up to the Arctic Circle, and it sits at the hub of Santa's IoST infrastructure. A team of elves in the Mission Control facility at Santa's HQ constantly monitors the sleigh to make sure it is perfectly balanced and optimised during its journey through some of the harshest weather conditions in the world. The sleigh also contains a temperature-controlled locker, and sensors placed on presents stored in this locker ensure that the temperature is maintained at a specific level.

The one other area where IoST has been applied is to the presents themselves. Whilst on the sleigh, sensors monitor the condition and temperature of the presents; when a present is dropped down a chimney, the sensors switch mode and start transmitting information about its condition after delivery. As presents are sent out from Santa HQ, two identifying labels are automatically applied to each one. The first label has an embedded RFID device linked directly to IoST; the second is a QR code for the benefit of the children's parents. Once scanned, the QR code takes you to a website which not only shows what the present is and includes downloadable instructions, but, more importantly, describes how the packaging should be recycled or disposed of. This initiative alone has helped boost Santa's green credentials, and as the information is transmitted across IoST it has helped Santa achieve REACH and RoHS compliance, ensuring materials are disposed of safely.

Following the implementation of IoST, Santa also upgraded SantaPAD, a mobile app developed last year to help him keep track of his operations whilst delivering presents around the world.
SantaPAD v2.1 now provides details of every device connected to IoST, through a machine-to-machine equivalent of Facebook, which means Santa can monitor the 'pulse' of his operations from anywhere in the world simply by using his iPad app.

So it has been quite a year for Santa; each time I visit he manages to extend the functionality of his IT infrastructure in a different way. Santa's Internet of Things strategy has been his most ambitious project to date, and it has stretched into every area of his operations. It certainly provides a great case study in how other companies could deploy the Internet of Things across their own production and logistics operations, and I am sure Santa will be open to showing companies around his new and fully connected operation. 2014 will see many companies start to embrace the Internet of Things; Santa's big-bang approach to rolling out IoST across his operations has worked out well, but not every company will want to take this approach.

Providing an update on Santa's B2B platform traditionally means that this is one of my last blogs of the year, so with this in mind I just want to offer season's greetings and best wishes for 2014. See you next year!


Top Ten Trends that will Impact Automotive Supply Chains in 2014

I recently published my thoughts on some of the key high-tech industry trends for 2014, so I thought I would follow up that blog with my predictions on what could happen across the automotive industry in 2014. The industry is going through an exciting period of change, with significant global expansion, the introduction of new technologies and a global desire to produce greener vehicles. So let me outline some of the key supply chain and B2B related trends that are likely to impact the global automotive industry in 2014:

1. Increased adoption of global vehicle platforms will simplify and consolidate supply chains – Companies such as VW Group have proven that, if implemented correctly, global car platforms can bring significant benefits to an automotive manufacturer. Despite the high initial investment, consolidated suppliers/parts/sub-systems, simplified production systems and streamlined logistics flows all help to justify the investment. With the trend for global expansion growing, especially among Far Eastern automotive companies at the moment, I would expect more automotive manufacturers to start rolling out global car platforms or vehicle architectures during 2014.

2. 'Internet of Automotive Things' becomes more deeply embedded within both vehicle and production environments – 2013 saw the 'Internet of Things' go mainstream. 2014 will see all participants in the automotive supply chain working to get their 'machines' connected to the internet. Production equipment, logistics networks and aftermarket service infrastructures will become connected to a common enterprise platform to allow information flows to be analysed and acted upon. Every car manufacturer will begin to offer a 'connected car' within their respective range of vehicles.

3. Automotive OEMs follow Tesla's and BMW's lead by developing dedicated electric vehicle brands – Exponential growth in sales of Tesla and BMW i-Series electric vehicles in 2014 will see many other vehicle manufacturers introduce dedicated platforms and sub-brands for their electric vehicles. So far, many car manufacturers have entered the electric vehicle market by 'electrifying' existing vehicle platforms; from a packaging point of view, many of these vehicles are not suitable for housing large battery packs or electric motors. To be successful in 2014, vehicle manufacturers will have to follow Tesla's and BMW's lead by developing dedicated, lightweight and 'connected' vehicle platforms.

4. China accelerates global expansion plans with acquisition of key suppliers and struggling western OEMs – China has so far failed to set the world alight with its own car brands. Lack of quality, limited brand awareness and having to compete against strong western brands have all contributed to the limited global expansion of China's domestic automotive industry. Increasing wealth in China will see a continued stream of western companies being acquired by Chinese manufacturers; the acquisition of Volvo Cars by Geely has shown how successful this can be. What if Chinese domestic OEMs could sign agreements in 2014 to use under-utilised production facilities in Europe and North America? This would serve to increase production levels globally, China would get a foothold in other markets, and the whole supply base would be rejuvenated.
5. Adoption of cloud B2B platforms accelerates due to continued consolidation of global ERP and legacy B2B environments – The continued globalisation of the automotive industry in 2014 will see stronger efforts to upgrade old legacy B2B environments. Continued expansion into the '2nd wave' of emerging markets will require an extension of IT infrastructures into North Africa, Vietnam and Thailand. Limited IT skills in these countries will see cloud-based solutions being deployed to allow all suppliers to be connected to a centralised B2B hub. The introduction of 'connected plants' to support Internet of Things strategies will see increased consolidation of ERP instances to provide a single view of ERP information across multiple automotive plants.

6. Automotive OEMs form an alliance to lobby regional governments to invest in electric charging infrastructures – Range anxiety is the number one barrier to electric vehicle adoption, and the automotive industry is going to need the help of regional governments to overcome it. Cities such as Amsterdam have successfully implemented charging networks, and manufacturers such as Tesla have even decided to fund the development of their own charging infrastructure to help drive electric vehicle adoption. However, if automotive companies are to meet stringent government-set emissions targets by 2020, governments will need to invest in regional charging infrastructure to give consumers an incentive to switch to electric vehicles.

7. 3D printing technology matures and moves from conceptual design applications to limited use in production environments – This technology has been around for more than twenty years, but in 2013 it was introduced to the general consumer. Automotive companies have been using 3D printing for rapid prototyping at the concept design stage of a vehicle's development for many years. Increased awareness of the technology will now see it deployed in certain production and aftermarket service situations where parts can be manufactured at a production or service centre location. Production of castings and housings will be one of the initial beneficiaries of this technology in 2014.

8. More countries adopt global B2B communication and message standards to support international operations – Increased globalisation of production has complicated logistics flows and supplier on-boarding initiatives. We are already seeing ERP and B2B platforms being consolidated to support these global operations. In 2014 we will see increased interest in adopting global standards such as OFTP2 for communications, along with the soon-to-be-introduced global message set being developed by the German automotive industry. I would expect more regions to follow Germany's lead in using global standards, and I would also expect regional industry associations such as AIAG in North America and JAMA in Japan to take a close look at the EDIFACT-based global message set (which is being developed by manufacturers such as VW Group, BMW, Hella and Bosch) to see how it can be applied in their own countries.

9. Strategic partnerships announced between high tech and automotive OEMs – Over the past few years we have seen a number of strategic partnerships announced between, for example, Panasonic and Toyota, and Ford and Microsoft. In 2014 I would expect to see a new generation of partnerships emerge, thanks to increased consumer interest in connecting electronic devices to in-car entertainment systems. To date we have seen traditional consumer electronics vendors form partnerships with the automotive industry; moving forwards, I would expect Google, Apple and other consumer-centric high tech brands to develop stronger relationships with the automotive industry. Will downloadable apps become commonplace in 2014? Will wearable devices interact with vehicles? Will Google's Android and Apple's iOS platforms form the basis of future in-car software platforms?

10. Europe and other regions follow North America in rolling out regulations to minimise use of conflict minerals – The United States is one of the first countries to try to significantly reduce the amount of conflict minerals flowing across supply chains. New regulations being introduced in 2014 by the Securities and Exchange Commission (SEC) will require companies to demonstrate that they are not using conflict minerals in their supply chain operations. In 2014 I would expect Europe, Japan and other key industrialised regions to begin evaluating their own conflict minerals reporting laws. AIAG has already been working extensively with the automotive industry in North America; I would expect it to work closely with other industry associations such as Odette in Europe and JAMA in Japan to share key learnings and best practices. This will help to develop a unified approach to the removal of conflict minerals from global automotive supply chains during 2014.


Finally, Some Hard Facts About EDI – (1) EDI Still #1 By Far

Although EDI has been around since the 1980s, new technologies have emerged that many thought would replace it – portals and specialized multi-party trading exchanges that held out the promise of lower cost, faster partner onboarding and greater participation rates by trading partners (suppliers, customers, logistics providers, financial institutions) than traditional EDI. But according to the new study "EDI: Workhorse of the Value Chain", published by industry analyst and founder of Supply Chain Insights, Lora Cecere, "nothing could be further from the truth. It [EDI] is the workhorse of the extended supply chain." Here are two of the interesting statistics from the report that back up Lora Cecere's statement:

1. While 70% of all orders are automated via EDI/XML, portals and/or exchanges, EDI is the method of choice. The data shows:

For orders received from customers (sales orders), 55% are received via EDI (either fully integrated or requiring minimal manual intervention), while only 10% are received via a portal and 7% via exchanges. The rest are sent via other, including manual, methods.

The results are similar for companies sending orders (purchase orders) to their suppliers: 62% send them via EDI, while only 7% are sent via a portal and 1% via trading exchanges.

This is to be expected, because EDI provides a universally accepted, cross-industry standard format that enjoys significant adoption. When a business offers a portal option to its trading partners, some manual intervention in the process is usually required, causing cycle-time delays and increasing the likelihood of errors. Multi-party exchanges (e.g. Elemica, GHX) are usually industry-specific platforms that target specific markets and require significant investments in specialized technologies and processes, and they have not yet enjoyed the level of adoption of EDI. Lora Cecere notes: "After a decade of hype, the use of multi-party trading exchanges for purchase order fulfillment is in its infancy in procure-to-pay processes."

2. EDI users are quite satisfied with how it improves supply chain performance. Survey participants were asked to rate how effectively the implementation of various EDI documents improved supply chain performance, on a scale of 1-7, where a rating of 1 indicated "not at all effective" and 7 indicated "very effective." The results show:

All the EDI documents significantly improved supply chain efficiency for more than half the respondents, who gave a rating of between 5 and 7. The documents with the greatest impact on improving the supply chain were orders and advance ship notices, rated between 5 and 7 by 84% and 80% of respondents, respectively.

So, EDI is the most commonly used B2B e-commerce technology, and its users are quite satisfied with its performance. In fact, for many companies, EDI has become the lifeblood of their business, making them more efficient, driving down costs and increasing customer satisfaction. It is a means by which they can differentiate themselves from their competition, providing the visibility into ordering and delivery processes that enables business success. No wonder EDI is indeed alive and well. In future blogs I will discuss additional statistics from Supply Chain Insights' report. If you would like to read the report, get your copy here.


What was Driving B2B in 2013? – Top Ten Most Popular Blog Posts

As we reach the end of 2013, I thought it would be a good time to review the top ten most popular posts from Driving B2B for the year. The Top Ten list below was derived from Google Analytics and, for the purposes of this blog, looks at which posts, from all my blog entries over the years, were most visited during 2013.

1. Why is India's Automotive Industry Growing so Quickly? – Even though I posted this blog in March 2012, it has been the most viewed post on Driving B2B for 2013. I think it highlights the continued interest in setting up automotive operations in India. Growing consumer wealth, an interest in premium-level cars and low-cost labour are all contributing towards continued inward investment in the country from the world's automotive industry. Interest in Jaguar Land Rover, recently discussed in this blog, which is owned by India's TATA Motors, has also helped to fuel the growth of the automotive industry in India.

2. How will Cloud Computing Benefit the Manufacturing Industry? – This blog was originally posted in February 2011 and each year proves to be one of the most popular posts on Driving B2B. Cloud computing has transformed the way in which businesses operate. The manufacturing sector is truly global in nature, and this blog described how the industry could benefit from deploying cloud-based infrastructures to manage the supply of direct materials to production operations. Today's manufacturers need to enter new markets quickly to remain competitive, and cloud B2B infrastructures provide the scalability and flexibility to achieve this.

3. Top Ten Trends That Will Impact High Tech Supply Chains in 2014 – This blog was posted in August 2013 and has been the most popular of those posted on Driving B2B during 2013. The high tech industry is currently going through an exciting period with the widespread adoption of tablet devices, wearable devices and the 'Internet of Things'. The adoption of these three tech trends alone has transformed the high tech industry over the past 12 months and is likely to continue driving consumer interest in 2014. This blog describes the top ten tech trends likely to impact high tech supply chains in 2014.

4. How the 'Internet of Things' will Impact B2B and Global Supply Chains – This blog was only posted in October this year, yet it has proved very popular, receiving a lot of views in just a two-month time frame. That is not surprising, as the Internet of Things has been one of the most talked-about tech trends of 2013. This year companies have been learning about IoT, but I think in 2014 we will see more companies actually start to deploy machine-to-machine connectivity environments. I will continue to keep a close eye on this emerging sector as we go through 2014, as it will help to drive convergence between the physical and digital supply chains.

5. How the Brazilian Government Plans to Stimulate Growth Across Their Automotive Industry – This blog post was originally posted in July 2012 and has continued to prove popular amongst visitors to Driving B2B. As with India, Brazil is also seeing exponential growth in its automotive industry. Imported automotive parts and vehicles are subject to high taxes, and as a way of boosting its domestic automotive industry the Brazilian government introduced the INOVAR directive in 2012. This has proved very successful, as many of the global car manufacturers have now invested billions of dollars in setting up new plants in the country.

6. Build to Order or Build to Stock? – This is one of my oldest blogs to appear in the Top Ten list for 2013; it demonstrates the automotive industry's continued interest in reducing inventory levels by adopting build-to-order production processes. Changing consumer demand, combined with exponential growth in premium car sales in the emerging markets, has led many vehicle manufacturers to implement build-to-order production. This in turn has led to increased adoption of Just-In-Time production techniques, which have helped to considerably reduce the volume of parts flowing across automotive supply chains.

7. Arriba, Arriba, The Automotive Industry Speeds Up its Investments in Mexico – This blog post was originally posted in 2012 and continues the theme, seen throughout this Top Ten list, of continued interest in the emerging markets. Over the last two years Mexico has emerged as one of the most important automotive manufacturing hubs in the world. Many Far Eastern and European vehicle manufacturers have established a presence in the country, as it provides an ideal stepping stone for exporting vehicles into the lucrative North American market.

8. Why Eastern Europe Could Benefit from the 'Perfect Storm' Currently Brewing in the High Tech Industry – This blog was originally posted in October 2012 and discusses how Eastern Europe has become a magnet for inward investment from the high tech industry. Restructuring across the industry and a continued interest from Far Eastern companies in entering the Western European market have led to significant inward investment in the region. Close proximity to Germany, for example, and access to a highly skilled, low-cost workforce have made this one of the fastest-growing high tech investment regions in the world.

9. How Cloud B2B Integration Enables Michelin's International Operations – This blog was posted in September this year and promoted our recent joint webinar with Michelin, which has become one of the most popular manufacturing-related webinars we have produced this year. Michelin is one of the world's leading manufacturers and distributors of tyres, and this blog discusses how the company uses cloud-based B2B integration to connect with its extensive trading partner community around the world.

10. German Automotive Industry to Move From VDA to Global EDIFACT Messages – This blog was posted in June this year and discussed the German automotive industry's move towards using global EDIFACT messages rather than VDA messages. German automotive companies such as VW have globalised their operations in recent years and found the VDA message set too restrictive for managing global logistics flows. The introduction of the Global Message set aims to standardise the way in which the German automotive industry exchanges messages with its global trading partner community.


EDI will Support Japan’s Implementation of the WCO SAFE Framework for Imported Goods

In June 2005 the World Customs Organisation (WCO) council adopted the SAFE Framework, a globally agreed set of standards to secure and facilitate global trade. It acts as a deterrent to international terrorism, secures revenue collection and promotes trade facilitation worldwide. Trade facilitation has been highlighted by many industry analysts as one of the key benefits, as the framework provides a much higher level of visibility into activities across a global supply chain.

Many countries have introduced their own variant of the SAFE Framework. In the European Union, the Entry Summary Declaration (ENS) was introduced by the 27 member countries for all modes of transport. Mexico, Australia and Brazil have their own rules in place to support the framework, and China is currently working on its own version. The United States was one of the first to implement the framework, partly to help minimise terror-related activities, and in early 2010 it extended the rule by introducing the '10+2' compliance initiative. Under the 10+2 ruling, before merchandise arrives in the United States, importers, customs brokers or freight forwarders must submit ten data elements: the first eight no later than 24 hours before the cargo is loaded onto its U.S.-bound vessel, and the last two no later than 24 hours prior to the ship's arrival at a U.S. port. Logistics carriers must also submit two pieces of information, hence the name 10+2.

In March 2014, Japan will become the latest country to adopt the SAFE Framework with the introduction of its Advanced Filing Requirement (AFR). The AFR goes into effect on 1st March 2014, and the penalty phase will apply ten days later, on 10th March 2014. The Japan AFR requires all logistics companies to submit electronic shipping details 24 hours prior to the vessel's departure for all Bills of Lading covering any goods intended to land in Japan. Japan will be getting tough on companies that do not submit information electronically, with a potential $5,000 fine or a year in jail; a more serious option could be the Japanese Customs Agency refusing to issue a permit to discharge the cargo.

To ensure compliance, many companies have been collaborating with the Nippon Automated Cargo and Port Consolidated System (NACCS), the private-sector IT division of Japan Customs. NACCS has been working tirelessly over the past 12 months to educate the market on what must be sent to Japan Customs before a shipment leaves its port of origin. NACCS requires nearly thirty pieces of information to be submitted, and the exact data fields required can be found HERE.

The introduction of the WCO SAFE Framework has helped global customs and border control agencies to significantly improve the way they process inbound shipments and to remove significant paperwork from their processes. It has also allowed customs agencies to modernise their IT infrastructures to support global carriers more efficiently. Improved visibility of information relating to inbound shipments has not only helped to minimise terror-related activities but has also contributed to a reduction in the amount of counterfeit goods or parts entering today's global supply chains. As with other countries, Japan has decided that information to support the AFR should be sent electronically via EDI, the global 'language' of today's B2B activities across trading partner communities. Businesses today face a growing regulatory compliance burden, and EDI adoption is helping companies adhere to these new regulations and ensure that the wheels of global trade keep turning.
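The arithmetic behind the AFR deadline is simple, but worth making explicit. Here is an illustrative sketch in Python, with hypothetical example values rather than an actual NACCS message, of checking whether a filing meets the 24-hours-before-departure rule:

```python
from datetime import datetime, timedelta, timezone

# Japan's AFR requires shipping details 24 hours before vessel departure.
AFR_LEAD_TIME = timedelta(hours=24)

def filing_deadline(vessel_departure: datetime) -> datetime:
    """Latest moment the shipping details may be submitted."""
    return vessel_departure - AFR_LEAD_TIME

# Hypothetical example: vessel departs 15 March 2014 at 18:00 UTC.
departure = datetime(2014, 3, 15, 18, 0, tzinfo=timezone.utc)
submitted = datetime(2014, 3, 14, 9, 30, tzinfo=timezone.utc)

if submitted <= filing_deadline(departure):
    print("Filing is on time")
else:
    print("Filing is late: risk of a fine or refusal to discharge cargo")
```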


Commsbursting

Every night, retailers around the world batch up their Point of Sale (POS) data and transmit it to their suppliers shortly after the stores close. Suppliers can gain a great deal of insight from inspecting retail sales data: they can understand how many of each of their products were sold and in which stores. If they are lucky, the retailer may provide insights about the other items in the shopper's basket purchased at the same time. And if they are really lucky, the retailer may provide demographics about the specific shopper who made the purchase, which suppliers can use to understand the profile of their target customer.

Transmitting POS data is critical to demand sensing in the consumer products supply chain, but it often comes at the expense of a migraine headache for the IT department. The root cause of these headaches is that each retailer sends data differently. Some retailers send each supplier the data for their specific SKUs; others send over their entire day's results across all their suppliers. Some retailers send five fields for each purchase transaction, while others might send as many as 100. Some days only a subset of the POS data may be transmitted, if the stores are unable to report in a timely manner. And during busy periods such as the post-Thanksgiving holiday buying season there may be a spike in sales for particular items, which means much larger files. As a result, the size of the POS files to be received can vary considerably from day to day.

Why are large POS files a problem for IT? Well, they can be a big problem if the IT infrastructure for which they are destined wasn't expecting them. First, there may not be enough disk space to accommodate the files. Suppose you have a 100GB hard drive, and your NOC has set an alert to notify them when the disk has less than 10% capacity remaining. At 9pm the disk has 12GB of space remaining, but the retailers send over unusually large files starting at 10pm whose combined size exceeds 14GB. Suddenly you are out of space and cannot accept the data. Large files also tend to choke firewalls and local area network capacity, and they can monopolize the resources of the Managed File Transfer or B2B integration software that processes them. If your B2B/MFT software becomes too overloaded, you may get what is effectively a Denial of Service (DoS) condition for other transactions: a customer who wants to send you a $1 million purchase order via EDI is unable to do so, and your HR team, which needs to submit the weekly payroll run to the local bank, is unable to do so either.

POS files are not the only types of large files. These days, companies are exchanging many different types of unstructured data – images, videos, audio files, telemetry, logs and entire databases. And you never know when you will receive one of these files, because the sender rarely gives any warning. In today's era of nearly ubiquitous broadband and virtually free storage, why should they be bothered with worrying about file sizes?

By now you can probably guess where this is leading. Couldn't cloud computing help with this problem? The large file transfer problem should be easily solved with the principles of elasticity that cloud providers bring to storage and processing power. Enter commsbursting, a new technique used by cloud-based integration providers such as GXS to automatically provision additional file transfer capacity in situations where large files threaten to deny service.
For example, if processing or storage capacity usage hits a certain threshold (e.g. 70%), the cloud provider could auto-provision additional resources to accommodate the spike in traffic.

Commsbursting is not only useful for large files; it can be used to respond to any type of spike in demand. For example, most of the payment clearinghouses around the world operate only during business hours – a payment processing window might run from 9am to 4pm. As 4pm approaches, there is a surge in payment requests as accounting groups rush to get transactions recorded with today's date. The pre-cutoff volumes are relatively unpredictable, but banks do not want to be in the position of telling a client that they could not process an important wire transfer submitted at 3:55pm because their IT infrastructure could not accommodate the load. Using a commsbursting model to handle the spike in payment transaction volume is an elegant and economical solution for a financial institution.

So you may be wondering: how do you get commsbursting for your B2B integration environment? Technically, there is no reason why you could not build the capability into a private cloud in your own data center, but the ROI of holding additional capacity to burst into may not be very compelling. With a public cloud, however, the economics are far more favorable: a provider can recover the costs from a wide base of customers, each of whom benefits at only a fraction of the investment that would be required in-house.
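As a minimal sketch of the idea, assuming a hypothetical provisioning hook rather than any specific provider's API, a commsbursting monitor might look something like this:

```python
import shutil

# Burst when 70% of local storage capacity is in use (the threshold
# mentioned above); real services would also watch CPU, queue depth, etc.
BURST_THRESHOLD = 0.70

def disk_utilization(path="/"):
    """Fraction of the filesystem holding `path` that is currently used."""
    usage = shutil.disk_usage(path)
    return usage.used / usage.total

def provision_burst_capacity():
    # Placeholder: in practice this would call the cloud provider's API
    # to attach extra storage or spin up additional transfer nodes.
    print("Provisioning additional file transfer capacity...")

if disk_utilization() >= BURST_THRESHOLD:
    provision_burst_capacity()
```

In a production service this check would run continuously, and the extra capacity would be released again once the spike subsides.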


EDI Benefits – Hard facts now available!

EDI has helped simplify and improve commerce between trading partners for years, and its benefits continue as it improves more business processes such as electronic procurement, automated receiving, electronic invoicing and electronic payments. EDI can reduce the cost of personnel and office space, improve data quality, speed up business cycles, improve efficiency and provide strategic business benefits. Sometimes, in order to decide whether to embark on a new EDI program or expand current EDI projects, companies seek quantitative data to educate their executive teams and drive a more holistic understanding of EDI processes and benefits across the greater organization. They want hard facts that answer questions such as:

How many days earlier can orders be shipped when EDI is used for the ordering process?

What are the actual cost savings that companies realize when they use EDI Advance Ship Notices (ASNs)?

What are the actual cost savings that companies realize when they use barcode labels or RFID tags?

What are the top challenges that businesses face when EDI is not used as part of the ordering process?

How do supplier companies really benefit when they comply with their customers' requests to exchange business documents via EDI?

Want the answers to these questions and more? Watch this 30-minute webinar, during which industry analyst and founder of Supply Chain Insights, Lora Cecere, discussed the key findings and takeaways from her new study, EDI: Workhorse of the Value Chain; A Closer Look at B2B Connectivity Benchmarks in the Extended Supply Chain.


Reduce Risk with Secure Information Exchange

Every business is an information business, which is why digitization is a key corporate priority. Every enterprise exchanges massive amounts of information on a daily basis, both inside and outside the firewall. With the volume, variety and velocity of this information exchange, security has become a top priority for the CIO. Security risks are intensified by globalization, a reliance on email for business collaboration, BYOD and mobile access, growing regulatory pressures, bad actors, and cloud computing.

With more than 80 percent of enterprise data residing outside of structured ERP systems, much of this information is in transit outside the business firewall. In fact, there have been more than 600 million information breaches in the U.S. alone since 2005 (privacyrights.org), and one-third of these breaches were due to unintended disclosure, insider fraud or the use of mobile devices. Poor information exchange systems and practices lead to security threats at many levels: to employees, projects, competitive advantage, national security, reputation, brand and, in some cases, the business itself.

In view of these challenges, IT departments have a mandate to transform reactive, unsecured and uncontrolled information exchanges into safer, more flexible, managed and compliant processes. CIOs need easy, integrated and trusted solutions to support all of their business information exchanges, from vendor invoicing, payroll submissions and transfer of healthcare records to securing corporate IP. OpenText has responded to these customer needs with our Information Exchange Suite, an integrated set of cloud-based messaging services comprising Secure Email, Secure MFT (Managed File Transfer), Fax, EDI (Electronic Data Interchange) and Notifications. Recent innovations in the newly released Information Exchange Suite include cloud-based secure messaging, large-file transfer acceleration, data leak prevention, a real-time audit trail, and integration with desktop, mobile and other systems. Each service is designed to support the business requirement of exchanging information with anyone, anywhere, in any format, while instilling confidence that the exchange of information is accessible, efficient and trusted. Enterprises are empowered with the infrastructure for secure and reliable exchange of information to help improve operational performance, reduce risk and enable enterprise agility.

Imagine the productivity gains and competitive edge your organization would realize if the exchange of information could be made faster, easier and more secure! I'm pleased to share these advancements with our customers, and more information is available here. Next month, I look forward to providing an update on GXS.


How Does the Automotive Industry Plan to Embrace the New Dodd-Frank Conflict Minerals Law?

In an earlier blog entry I discussed how a new ruling being introduced in the United States is likely to impact manufacturing supply chains around the world. The ruling will essentially make companies more accountable for where they source certain materials that go into their products. The Dodd-Frank law's conflict minerals provision has been introduced to try to curb the funding of rebel groups in the Democratic Republic of Congo and its immediate neighbouring countries. Four key minerals, tin, tungsten, tantalum and gold (collectively known as the 3TG minerals), are affected by the ruling. For a more general introduction to 3TG minerals and how the Dodd-Frank law will impact their sourcing, please see my previous blog on this subject area, Click Here.

The manufacturing industry is probably going to be the hardest hit by this new ruling, as every manufacturer must be able to demonstrate what minerals are in their products and, more importantly, where they were sourced from. The high tech and automotive industries will be severely affected, and in my previous blog I discussed what steps the high tech industry was taking to address the issue. This blog will briefly review how the new ruling will impact the automotive industry.

A great many parts within a vehicle are likely to use 3TG minerals in some shape or form. Tantalum is probably the most widely used of the four and appears in many different areas of a vehicle. Given that a car may contain over three thousand individual components, you can imagine the huge task facing the OEMs in trying to identify not just which components use 3TG minerals, but in what quantities as well. It has been estimated that merely adhering to this new law will cost the automotive industry between $3 billion and $4 billion.

With the reporting period already underway in North America and reports due to the Securities and Exchange Commission (SEC) by 31st May 2014, there is a growing sense of urgency amongst automotive companies to introduce compliance procedures for their supply chains. In April 2011, six manufacturers – Chrysler, Ford, General Motors, Honda, Nissan and Toyota – issued a joint letter to their respective suppliers informing them of the new reporting requirements and requesting their cooperation in identifying which parts may contain 3TG minerals. The six companies outlined three basic steps:

1. Determine which parts/assemblies incorporate one or more of the identified conflict minerals or their derivatives
2. Assess the supply chains associated with those parts/assemblies
3. Engage with suppliers to identify the smelters used in a supply chain to process the conflict minerals, or validate the origin of the conflict minerals as recycled/scrap

In August 2012, the SEC finalised the rule for complying with the conflict minerals provision of the Dodd-Frank law. These rules, especially given the global nature of the automotive industry, have major implications for nearly every automotive company, not just in North America. Even companies headquartered outside the US, and those which do not report to the SEC, may be subject to conflict minerals requests from customers who do report to the SEC or who are in the supply chains of these companies or their tier suppliers. One of the key ways that companies achieve compliance is by ensuring that the internal organisation is aligned, and the purchasing department will typically take the lead on this initiative.
After all, the purchasing department is responsible for the day to day communications with the supply chain and manages all of those interactions. When you consider that there are over 190,000 automotive manufacturers and suppliers potentially involved, the route to compliance could be a long and complex one, and it will require significant resources to ensure compliance and make sure that filings are submitted to the SEC on time.

The Automotive Industry Action Group (AIAG) has been working tirelessly over the last couple of years to educate the industry on what it needs to do to ensure compliance. They have taken the Electronic Industry Citizenship Coalition (EICC) conflict minerals reporting template and created their own web based version of the reporting tool to help companies survey their trading partner communities.

A recent study by the analyst firm PwC, referenced in the chart below, highlighted some of the key challenges the automotive industry faces in ensuring compliance across its supply chains. The biggest, according to 31% of respondents, is getting accurate and complete information from the relevant suppliers. Identifying which companies across the automotive supply chain may be unknowingly supplying parts containing 3TG minerals is one challenge; being able to reach out to them efficiently is another challenge altogether. For many companies, contact information is held within many different back end business systems, and keeping this data up to date is an ongoing struggle.

GXS Active Community is an enterprise-wide collaboration platform that could potentially help to address this particular issue. Providing a web based collaboration platform that allows suppliers to update their own contact information as and when required helps to ensure that you can reach out to your supplier community in a more efficient manner. At the end of the day, if a supplier wishes to do business with a prospective customer then it is in their own interest to at least make sure they are contactable. Active Community not only allows companies to keep up to date information about each and every supplier, it also provides a platform for sending out regular communications to a trading partner community. Ensuring that suppliers adhere to various compliance procedures is becoming a key part of the trading partner management process, and Active Community provides various tools to allow regular assessments to be sent out for completion by a supply base. For example, the EICC reporting template could easily be replicated within Active Community, an example of which is shown below, and a supplier community would be able to use this platform to ensure that they meet the compliance requirements of the conflict minerals reporting law.

As discussed earlier, it is not just companies in North America that will need to submit evidence to the SEC that their supply chains do not contain conflict minerals. The European Union completed a public consultation in June 2013, and a recommendation on the steps to be taken by European member countries will be announced by the end of 2013.
The European Commissioner for Trade recently gave a speech on the responsible sourcing of conflict minerals, and his speech can be read here. It is widely expected that the European Union will embrace the OECD framework highlighted in my earlier blog on this subject, and other regions are likely to adopt the framework moving forwards as well. Given that three Japanese OEMs were involved in the initial communication to their supply base in North America, Japan’s government is expected to take a closer look at this initiative in the near future too. The automotive industry is fortunate to have a proactive group of industry bodies – namely AIAG in North America, Odette in Europe and JAMA in Japan – that are willing to work closely with regional suppliers to ensure that they meet the various compliance initiatives. Odette and JAMA will hopefully be able to utilise much of the initial work undertaken by AIAG to support their own conflict minerals reporting initiatives. I recently recorded a webinar on conflict minerals and how GXS Active Community could be used as a potential reporting platform. GXS will make this webinar available shortly, so please feel free to register to learn more about the new conflict minerals law and how GXS can help ensure compliance across your supply chain.
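To make the scale of the reporting task more concrete, here is a minimal sketch in Python of the kind of roll-up an OEM might run once supplier declarations have been collected. Every data structure and field name here is hypothetical, loosely modelled on the sort of fields (part, mineral, smelter, origin) found in the EICC/AIAG template discussed above:

```python
# Hypothetical supplier declarations, loosely modelled on the fields in the
# EICC/AIAG conflict minerals reporting template (part, mineral, smelter, origin).
declarations = [
    {"supplier": "Acme Castings", "part": "brake-calliper", "mineral": "tantalum",
     "smelter": "Smelter A", "origin_known": True},
    {"supplier": "Volt Electronics", "part": "ecu-capacitor", "mineral": "tantalum",
     "smelter": None, "origin_known": False},
    {"supplier": "Shine Plating", "part": "connector-pins", "mineral": "gold",
     "smelter": "Smelter B", "origin_known": True},
]

# Step 1: which parts incorporate one of the 3TG minerals?
parts_with_3tg = {d["part"] for d in declarations
                  if d["mineral"] in ("tin", "tungsten", "tantalum", "gold")}

# Steps 2 and 3: which suppliers still need follow-up because the smelter
# or the origin of the mineral is unknown?
follow_up = [d["supplier"] for d in declarations
             if d["smelter"] is None or not d["origin_known"]]

print("Parts containing 3TG minerals:", sorted(parts_with_3tg))
print("Suppliers needing follow-up:", follow_up)
```

The hard part, of course, is collecting the declarations from thousands of suppliers in the first place – which is exactly where a community management platform such as Active Community comes in.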

Read More

Security with Digital Certificates: Should you generate your own or use a Certificate Authority?

In the world of B2B, the recommended approach for ensuring the security of the documents you exchange with business partners – such as your suppliers, customers, logistics providers and financial institutions – via the Internet is the same encryption approach used by many communications protocols, such as AS2 and SFTP. These protocols use a system of public and private keys – one set for the sending company and one set for the receiving company – while leveraging digital certificates to enable the easy exchange and management of the key pairs. (See How Digital Certificates Help Ensure the Security of EDI Data.)

One of the decisions you’ll need to make when using this approach is how you will generate the digital certificates your company uses. You have two options: (1) you can generate your own, using special software, or (2) you can use one of the Certificate Authorities (CAs), such as VeriSign or Entrust, to generate and manage them on your behalf. If the digital certificate is generated by a CA, it is usually valid for one or two years. If you generate it yourself, you can make it valid for a longer period. When certificates expire, they need to be renewed or replaced, and you must provide the new certificate to your trading partners in advance of expiration to ensure that the critical business documents you exchange, such as purchase orders and invoices, can continue to flow without interruption.

For an annual fee, a certificate authority will issue digital certificates and can also provide additional services, such as:

- Revocation. If a certificate is compromised – for example, the private key has been lost or stolen – the CA can “revoke” it before it expires. Revoked certificates are put on a revocation list that is automatically checked by your software to verify the certificate prior to its use.
- Identity verification. The CA ensures that the certificate holder is who they claim to be by verifying their credentials. This adds an additional level of assurance of the trustworthiness of any business partners with whom you are exchanging documents.
- Regular re-verification. Prompted by the expiration date within your partner’s certificate, the CA will verify the identity of your trading partner on a regular basis, increasing the security of the system still further.

The alternative to using a CA is to get everyone in your community to “self-generate” certificates, allowing them to set their own expiration dates. The benefits of this approach include:

- It’s free, as many B2B software applications include a certificate self-generation capability.
- You may have fewer administration headaches, because everyone can set longer certificate expiration dates, say 5 or 10 years. Then, instead of having to update your system with everyone’s new certificate every one or two years, as would be necessary for CA-issued certificates, you only need to do it every 5-10 years.

However, longer expiration dates reduce the overall security of the system, since no organization is “policing” it and confirming that a certificate really does belong to the party it appears to come from. And if your trading partners set the rules, you may need to support both models, with some partners asking you to use a certificate from a CA while others will accept self-generated certificates. Whichever route you choose, you must be careful not to lose access to your private key (by forgetting your own password, for instance), since neither a CA nor a system that self-generates certificates can retrieve it.
In these circumstances, you would need to generate a new certificate and distribute it to all of your trading partners, and you or your partners may need to re-send some documents if they were sent using the old key. To learn more about the best options for B2B Communications, watch this webinar: How to Determine the Best Communications Protocol for B2B Integration
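As a footnote, here is what the self-generation option can look like in practice – a minimal sketch using the open-source Python cryptography library (one of many tools that can do this; most B2B packages wrap the same steps in a GUI). The host name and the 10-year validity period are illustrative assumptions:

```python
import datetime
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa

# Generate the private/public key pair.
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Self-signed: subject and issuer are the same organisation.
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, u"b2b.example.com")])

cert = (
    x509.CertificateBuilder()
    .subject_name(name)
    .issuer_name(name)
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(datetime.datetime.utcnow())
    # The long expiration that self-generation allows -- here, 10 years.
    .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=3650))
    .sign(key, hashes.SHA256())
)

print(cert.not_valid_after)
```

The certificate is what you distribute to your trading partners; the private key never leaves your systems and, as noted above, nobody can recover it for you if it is lost.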

Read More

Driving Innovation & Growth

The growth of unstructured information inside the enterprise is staggering. In fact, experts estimate that over 80 percent of data in organizations is unstructured, growing at a rate of over 36 percent year-over-year (1). Managing this information across different formats, devices, and applications is a challenge for organizations that’s not going away. There is profound value in this unstructured information. In fact, your company’s future depends on it.

How is the enterprise dealing with all of these new data types? Well, according to Forrester, they’re not. In a survey conducted in May of 2013, only 13 percent of respondents had a formal information management strategy in place (2), which is why we’ve spent the last year focusing our efforts on our biggest synchronized software release to date. Announced at Enterprise World, this finely choreographed release features software advancements across our Enterprise Information Management (EIM) suite designed to help organizations manage huge amounts of data and unlock the untapped value of their information to create competitive advantage. Our latest release features over 300 integration points and stronger synchronization across five suites of software: Content Suite, Process Suite, Experience Suite, Information Exchange Suite, and Discovery Suite.

The OpenText Vision: A Holistic View of EIM

Let me touch on some of the highlights, suite by suite:

Content Suite: reduce costs through security features, information governance, and content lifecycle management. Innovations include an easier-to-use interface, APIs, reports, a report writer, and Archive in the Cloud.

Process Suite: automate processes to improve performance with new Smart Process Apps, including Case Management and Case Intelligence, with flexible deployment on premises or in the Cloud.

Experience Suite: create the best possible, consistent experience with every interaction through enhancements like omni-channel publishing and adaptive media, web and social analytics, ecommerce connectors, and our new HTML5 user experience.

Information Exchange Suite: build trust and reliability, and reduce risk, with the secure exchange of information from any user on any device to any destination. Innovations include an advanced messaging services layer for fax, notification, and EDI services, real-time audit trails, and data loss prevention capabilities.

Discovery Suite: empower people to find, understand, and leverage enterprise information for greater insight and better decision making, with solutions for auto-classification, content migration, content analytics, semantic search, eDiscovery – and a CIO dashboard.

With our latest release, we’re also introducing AppWorks, common RESTful services, and an EIM developer platform that accelerates the speed of development and introduces opportunities for innovation to our customers and partners. With AppWorks, developers can begin to write code using our suites within hours as opposed to weeks, using standard languages such as Java, JavaScript, and HTML5. The EIM suites outlined above are integrated through AppWorks to leverage the value of the combined suites as a comprehensive EIM platform.

Support for Sophisticated Information Flows

AppWorks builds out our holistic EIM strategy by supporting complete and integrated information flows to maximize the value of information across the enterprise. EIM is the next generation of enterprise software.
Our latest release delivers a strong technology foundation for our customers to build on and establishes EIM as the mission-critical solution to drive insight, innovation, and growth.

(1) Ray Paquet, “Technology Trends You Can’t Afford to Ignore”, Gartner Inc., http://www.gartner.com/it/content/1503500/1503515/january_19_tech_trends_you_cant_afford_to_ignore_rpaquet.pdf (accessed 10 Nov. 2012).
(2) Alan Weintraub, “The Enterprise Information Management Barbell Strengthens Your Information Value,” Forrester Research, Inc., July 15, 2013.

Read More

The Rise of the Machine Connected Supply Chain

Two weeks ago I was fortunate to get an invite from Cisco to attend their ‘Internet of Things’ World Forum in Barcelona. As I have discussed before, the Internet of Things is going to herald the fourth industrial revolution, and Cisco, SAP, Oracle, GE and many other tech and industrial companies want to be part of this new and exciting sector. Some of the subjects being discussed at the forum, especially in healthcare, would not have looked out of place in the 2003 film ‘Terminator 3: Rise of the Machines’ – hence the title of this particular blog entry! I personally found the event to be one of the best conferences I have attended in recent years, and there were a lot of conceptual ideas discussed which gave plenty of food for thought.

One such idea, which for some reason I have not been able to forget since the conference, is the thought that any type of machine could potentially have its own avatar or Facebook-style profile page. To put it another way: in your personal life you may have an online profile such as Facebook or LinkedIn to keep in touch with both friends and work colleagues. What if every machine connected to the internet had its own avatar or online profile as well? I thought this was quite an interesting concept, but what would it look like and how would it work? I like challenges such as this, so I had to try and come up with a concept, using Facebook as a basic template, as you will see below.

The other reason I am interested in this is from a B2B and supply chain point of view. Here at OpenText we offer a collaboration – or rather, community management – platform called Active Community, which allows companies to maintain a centralised database of contacts for all trading partners across a supply chain. If machines were somehow able to interact directly with the supply chain, either upstream or downstream, then it could potentially introduce a new set of operational benefits and efficiencies. The Internet of Things offers the potential to connect the physical and digital supply chains in a way that has not been possible before. But what if individual machines could somehow interact with users, or vice versa – could a Facebook type of ‘asset management’ environment provide a neat way of managing machine-to-machine communications across a supply chain?

Even though the concept of the machine-based avatar was discussed at the Forum, there was relatively little information offered as to how it could work. I am sure there are some conceptual ideas out there already, but I thought I would offer my own vision. One of the discussion panels at the forum included a speaker from Caterpillar, who said that every machine that leaves their factory includes a WiFi module so that it can be connected to the internet and information can be analysed and exchanged remotely when it is out in the field. We are just entering the fourth industrial revolution, and the focus so far has been on simply getting machines connected to the internet. I have really only scraped the surface of this particular subject area, and every aspect of the operation of today’s supply chains is likely to be impacted in some way by the Internet of Things – from logistics networks, warehouses and distribution centres through to improved monitoring of inventory levels in factories and retail stores. I think we are heading for exciting times!
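For what it’s worth, here is a minimal sketch, in Python, of how a machine’s ‘avatar’ might be modelled: a static profile plus a feed of status posts, just as a social network pairs a profile page with a timeline. Every name and field here is invented purely for illustration:

```python
import datetime

# A hypothetical "avatar" for a single connected machine: a static profile
# plus a feed of status updates, much like a profile page and its timeline.
machine_profile = {
    "id": "excavator-0042",
    "manufacturer": "Example Heavy Industries",
    "model": "EX-350",
    "location": {"lat": 41.39, "lon": 2.17},  # e.g. a site near Barcelona
    "followers": ["maintenance-team", "parts-supplier", "fleet-manager"],
    "feed": [],
}

def post_status(profile, text, **readings):
    """The machine 'posts' a status update, with optional sensor readings."""
    profile["feed"].append({
        "time": datetime.datetime.utcnow().isoformat(),
        "text": text,
        "readings": readings,
    })

# The machine posts to its own timeline; followers (human or system) react,
# for example by raising a purchase order for a replacement part upstream.
post_status(machine_profile, "Hydraulic filter at 85% of service life", filter_wear=0.85)
post_status(machine_profile, "Scheduled maintenance due in 40 engine hours")

for post in machine_profile["feed"]:
    print(post["time"], "-", post["text"])
```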

Read More

Integration-as-a-Product (IaaP) – The Forthcoming Wikipedia Entry

Integration-as-a-Product (IaaP) was a late twentieth century model in which corporations utilized their in-house IT organizations to operate integration technology behind the firewall. The model worked as follows. Corporations would license integration software from technology vendors. The software would be installed on servers running in a corporate data center. Vendors typically received license fees of hundreds of thousands of dollars up front, as well as 20% ongoing maintenance fees. Payment to the vendor was irrevocable regardless of whether the business outcome sought by the corporation was achieved or not.

Corporations struggled to successfully implement integration as a product for three decades. After 20 years of experimentation, the industry had achieved only 30-50% adoption of electronic commerce technologies. It was followed by a shift in thinking towards a different approach – see “Integration-as-a-Service” or “Cloud-Based Integration.”

As with the dinosaurs, there are many theories as to why Integration-as-a-Product failed. Some of the most popular root causes are believed to be:

- Proliferation of XML standards and IP communications options, resulting in overwhelming complexity for IT organizations.
- Inability of point-to-point connections to scale beyond 20% of a community (see the arithmetic sketched below).
- Ineffectiveness of IT organizations at convincing small businesses to participate, due to the costs, resources and training required to purchase and implement software.
- Lack of accountability amongst technology vendors to ensure success; payment to the vendor was made irrespective of the business outcome.
- Failure of IT organizations to keep pace with trading partner growth, caused by the rapid rise in outsourcing of functions such as manufacturing, logistics, distribution and aftermarket service.
- Widespread acceptance of easier-to-use and lower-cost cloud and SaaS models.

See “Epic Fail” for other examples of similar failed business models.
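On the point-to-point scaling claim, the arithmetic is simple: a fully meshed community of n trading partners needs n(n-1)/2 connections, whereas a hub (or cloud) model needs only n. A quick sketch:

```python
def connections(n):
    point_to_point = n * (n - 1) // 2  # every partner wired to every other
    hub_and_spoke = n                  # every partner wired once, to the hub
    return point_to_point, hub_and_spoke

for n in (10, 100, 1000):
    p2p, hub = connections(n)
    print(f"{n} partners: {p2p} point-to-point links vs {hub} hub connections")
    # 10 -> 45 vs 10; 100 -> 4,950 vs 100; 1,000 -> 499,500 vs 1,000
```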

Read More

Securities – Collateral for Dummies, Damage Free – Part 3

In my third and final blog in this series, I am covering the basics of aggregated collateral reporting and segregated client collateral reporting.

Aggregated collateral reporting

Collateral is almost always diversified and fragmented across many locations, making it difficult to optimise its use across a legal entity or portfolio. Banks, Brokers, Asset Managers and Corporates are starting to look into, or build, collateral reporting services that aggregate all of the collateral posted, as a preliminary step to optimising its use globally. The key business stakeholders of this process are usually in the Collateral Management, Margining and Treasury functions, which receive the service from Product Managers, Operations Managers, Technology Managers, Treasurers, COOs and CAOs. Banks and Brokers are the stakeholders who look to create and offer this service, while Asset Managers and Corporates are the ones who subscribe to it.

In order to successfully create an aggregated reporting service, organisations need to integrate, capture and normalise the relevant daily and intra-day feeds coming from their venues. These feeds are usually delivered in many different formats and at different times of the day. GXS often becomes the delivery channel for them; data normalisation, translation capabilities and processing logic are key to creating a “standard” format that can be loaded into the reporting service as a Straight-Through Process (STP).

Segregated client collateral

As discussed in part 1 of this blog series, client collateral must now be segregated from a Bank’s or Broker’s own assets. In some cases the collateral must be deposited at a Clearing House in a segregated account, in the client’s name, and regulations require that this be reported to regulators periodically. Legally Segregated, Operationally Commingled (LSOC) is the rule adopted by the CFTC; it is the basis for the complete legal segregation model, which determines how margin for cleared swaps will be held for the benefit of customers of a Futures Commission Merchant (FCM). Futures clearing, collateral management and liquidity services all have a business stake in the process, with Operations Managers, Product Managers and IT running and delivering the service.

As a result of the regulations and industry best practices, there is a larger volume of segregated accounts, increased operational and management effort, and an orders-of-magnitude increase in the reports on balances issued to regulators and clients. Add to this the need to manage diverse file formats and underlying systems, and there is an opportunity for a consolidated, systematic and automated approach to segregated client collateral reporting. GXS often assists the FCM in creating the connectivity, formats and feeds needed to comply with these regulations, enabling their clients’ reporting on LSOC accounts.

The goal of this three-part blog series was to help put some of the industry buzzwords and themes in context. I hope this was useful and I look forward to receiving feedback on how your organisation is dealing with these challenges.
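As a simplified illustration of the aggregation step, the Python sketch below normalises two differently formatted (and entirely invented) venue feeds into one standard record layout and totals the posted collateral by currency – the essence of what a normalisation and translation layer does before data is loaded into a reporting service:

```python
# Two hypothetical venue feeds, each reporting posted collateral in its own format.
feed_venue_a = [
    {"acct": "FUND-1", "ccy": "USD", "collateral_posted": "1500000"},
    {"acct": "FUND-2", "ccy": "EUR", "collateral_posted": "750000"},
]
feed_venue_b = "FUND-1|USD|250000\nFUND-3|USD|900000"  # pipe-delimited flat file

def normalise_a(feed):
    for row in feed:
        yield {"account": row["acct"], "currency": row["ccy"],
               "amount": float(row["collateral_posted"])}

def normalise_b(feed):
    for line in feed.splitlines():
        account, currency, amount = line.split("|")
        yield {"account": account, "currency": currency, "amount": float(amount)}

# Aggregate the normalised records into one global position per currency.
totals = {}
for record in list(normalise_a(feed_venue_a)) + list(normalise_b(feed_venue_b)):
    totals[record["currency"]] = totals.get(record["currency"], 0.0) + record["amount"]

print(totals)  # {'USD': 2650000.0, 'EUR': 750000.0}
```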

Read More

How Digital Certificates Help Ensure the Security of EDI Data

When you exchange EDI documents via the Internet, the security of your data is of vital importance. It is critical that only the intended recipient can read the sensitive data being transmitted, such as purchase orders, invoices, or remittance advices. While encryption technologies have long been used to achieve the level of security needed for this sensitive data, their usage can be hampered by the difficulty of exchanging the “keys” upon which they depend. Digital certificates resolve the key exchange and management issues.

There are two basic kinds of cryptography. The first is called “symmetric key encryption,” which involves the use of a single encryption/decryption key, often called a “shared secret.” The key can be a code of any length, for example, 768 bits or more. The longer and more random the key is, the greater the security achieved. To use this approach for B2B, that long key would need to be exchanged with every company with which a business exchanges documents.

There are several issues with the symmetric key approach. First, how do you exchange the key in a secure fashion? Just as you shouldn’t exchange passwords or credit card numbers via email, email is not a good vehicle for the shared secret – it’s not secure! Also, if you exchange documents with multiple partners, you probably want each partner to have a different key. That way, if one partner inadvertently gets documents intended for another partner, he cannot decrypt them, because his key works only on documents intended for him. Managing all these keys can become a logistical nightmare.

A better approach is to use “asymmetric encryption,” which uses a set of two keys – a “public key” that is used to encrypt and a “private key” that is used to decrypt – combined with a “digital certificate,” which makes the key exchange and management process very easy. The public key is called “public” because everyone who sends you documents can use the same key. There are no security worries about the public key falling into unauthorized hands, since this key cannot be used to decrypt or read your messages. It can only be used to encrypt messages being sent to you. Your second key, the “private” key, is not shared with anyone else. This key is accessed by your communications software and is used to decrypt documents sent to you by partners that have encrypted them using your public key.

The digital certificate is actually an electronic “container” for the public key and other important information, such as organization name, email address, and server identification. The certificate is formatted in a standard way, enabling software to immediately read the certificate and “know” where to find the specific pieces of data needed. The certificate can be exchanged via email because the information in it is all public, so there’s no security concern. In addition, the digital certificate enables you to keep track of which public key belongs to which company – extremely helpful when you need to manage hundreds or even thousands of keys for all the business partners to whom you are sending documents, each with its own public key. You can obtain a digital certificate for your company from an authorized certificate authority – such as VeriSign or Thawte – that acts as a trusted third party who vouches for the validity of the keys. Or, you can use special software to create your own digital certificate.
In the B2B world, asymmetric encryption combined with digital certificates is the better approach. The public and private keys help ensure that (1) the data is encrypted during transmission over the Internet and (2) only the intended recipient is capable of decrypting it. The digital certificate makes the process easy and manageable. To learn more about the best options for B2B communications, watch this webinar: How to Determine the Best Communications Protocol for B2B Integration.
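To make the mechanics concrete, here is a minimal sketch of asymmetric encryption using the open-source Python cryptography library: the sender encrypts with the recipient’s public key, and only the matching private key can decrypt. (Real AS2/SFTP implementations handle this internally, and typically encrypt a symmetric session key rather than the document itself; RSA-OAEP on a short message is used here purely for illustration.)

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# The recipient generates the key pair. The public key is what gets shared
# (inside a digital certificate); the private key never leaves the recipient.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Sender: encrypt a (short) message with the recipient's PUBLIC key.
ciphertext = public_key.encrypt(b"PO#12345: 5000 brake pads", oaep)

# Recipient: only the PRIVATE key can decrypt it.
plaintext = private_key.decrypt(ciphertext, oaep)
print(plaintext)  # b'PO#12345: 5000 brake pads'
```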

Read More

We Need Hypervisors for Translators

In my last post, I talked about how most map development tools and translation engines on the market are either “too soft” or “too hard” to meet the variety of B2B integration needs a company has. This creates a dilemma for companies trying to standardize on just one mapping environment and one translation tool. A new cloud-based service, called a translation hypervisor, will allow you to run not just one but multiple translators to support your B2B integration needs. The principle is similar to the hypervisors you are familiar with that allow you to run multiple instances of an operating system on the same server.

Recall the example from my last post. Let’s say you are a company that has a lot of maps. About half of these maps are relatively simple. They require straightforward mapping of data fields from your ERP application into an industry standard such as OAGi XML. But the other half of the maps are complex. Perhaps you have a demanding group of customers that each want you to integrate directly to their business applications. These customers want business logic included in the maps, along with calls to databases and APIs to enrich or validate data. Which type of mapping and translation tool would you choose for this scenario?

Historically, companies would have tried to standardize on a single translator across the enterprise to reduce costs. In the case above, however, this forces companies to standardize on complex mapping and translation tools to support the most challenging requirements. Developing simple maps becomes a lot more complicated and time-consuming than necessary. It’s like forcing developers to use Java when they could have used Microsoft Excel.

The translation hypervisor changes all that. The customer in the example above could simply subscribe to two different translation services. Simpler maps could run on a lower-tier service with no “bells and whistles.” More complex maps could run on a high-end service designed for performance and rich functionality. Requests for translation from external business applications would be routed to the hypervisor. The hypervisor looks up which map is most appropriate for the inbound request, then routes it to the appropriate translator.

How does the customer benefit? Rather than taking a one-size-fits-all approach, the translation hypervisor allows companies to obtain the optimal economics for each specific type of map. Simpler maps can run on the lower-cost cloud-based translation service, while the complex maps run on a higher-tier, higher-priced service. These economics would be difficult for most IT organizations to obtain in-house. Sure, there is no technical reason why a large IT organization could not build a multi-translator service in its data centers. However, the ROI for building such a capability would not be justifiable for most companies, because their map volumes are relatively low. But when you can aggregate a group of customers’ maps in the cloud, the ROI from running multiple translators becomes much more compelling. Most cloud providers service a wide range of customers (some big and some small) with a wide range of mapping needs (some easy and some hard). Using a hypervisor approach allows the flexibility to offer customers maps and cloud-based translation services at a range of price points and implementation time frames.
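Here is a crude Python sketch of the routing idea, with invented names throughout: the hypervisor keeps a registry mapping each document type to the translator tier registered for it, and dispatches each inbound request accordingly:

```python
# Hypothetical translator tiers sitting behind the hypervisor.
def simple_translator(doc):
    return f"[basic tier] translated {doc['type']}"

def complex_translator(doc):
    # Imagine database lookups, API calls and business logic happening here.
    return f"[premium tier] translated and enriched {doc['type']}"

class TranslationHypervisor:
    """Routes each inbound document to the translator registered for its map."""
    def __init__(self):
        self.routes = {}

    def register(self, doc_type, translator):
        self.routes[doc_type] = translator

    def translate(self, doc):
        translator = self.routes[doc["type"]]  # look up the right map/engine
        return translator(doc)

hv = TranslationHypervisor()
hv.register("850-purchase-order", simple_translator)   # simple map, low-cost tier
hv.register("custom-erp-invoice", complex_translator)  # complex map, high-end tier

print(hv.translate({"type": "850-purchase-order"}))
print(hv.translate({"type": "custom-erp-invoice"}))
```

The real service would, of course, route on far richer criteria (sender, standard, map version, SLA tier), but the registry-plus-dispatch pattern is the heart of it.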

Read More

How GXS Addresses the B2B Challenges Faced by Today’s Automotive Industry

GXS has been supporting companies across the automotive industry for more than forty years. Today, GXS works with many of the world’s leading automotive OEMs and tier 1 suppliers, helping to address B2B challenges such as supporting international expansion and ERP integration projects. I have been at GXS for just over seven years now, and 2013 represents a major turning point for the industry. The restructuring faced by many companies after the recent economic recession is now starting to pay off, and the industry is going through a renaissance. But what are the industry trends around the world? What are the B2B-related challenges faced by today’s automotive industry? And how does GXS help to address these challenges?

Over the past few months I have been experimenting with a presentation tool called Prezi; with a bit of patience you can produce fairly eye-catching presentations that make PowerPoint-based presentations look very old fashioned indeed! I thought I would share my latest Prezi presentation. The video is only 11 minutes long, but I hope it gives you a good idea of the challenges faced by today’s automotive industry and how B2B integration solutions can help to address them. If you would like to access the Prezi presentation used to create this video, then please click here. You will also find another presentation in my Prezi profile that provides a high level introduction to Cloud B2B integration.

Read More

Standardizing on a Single Mapping and Translation Tool for B2B Integration

There are a wide variety of mapping tools available on the market today. Some are designed to be easy to use by IT professionals without much formal training in map development. These might come with wizards or other assisted mapping tools to guide you through the process, or with libraries of pre-built maps that require only slight customization. At the other extreme are mapping tools designed to support very complex scenarios, but which require more experienced IT professionals to use.

Runtime translation engines, which execute the maps as requested, also come in different shapes and sizes. Some translators perform better with certain types of standards (e.g. EDI or XML) than others. Some are designed for very high volumes of complex maps that require lots of processing overhead; these might support the creation of complex business logic or web services calls to enrich data. At the other extreme are translators designed for lower processing volumes and simpler maps.

But most companies have a very diverse mix of maps that don’t fit into the “sweet spot” of one particular mapping and translation tool. Let’s say you are a company that develops a lot of maps. About half of these maps are relatively simple. They require straightforward mapping of data fields from your ERP application into an industry standard such as EDI or OAGi XML (see the sketch at the end of this post). But the other half of the maps are complex. Perhaps you have a demanding group of customers that each want you to integrate directly to their business applications. These customers want business logic included in the maps, along with calls to databases and APIs to enrich or validate data. Which mapping and translation tool would you choose for this scenario? Neither is really a good choice. The simple mapping and low performance translators are too soft. The complex mapping and high performance translators are too hard.

The Goldilocks Problem with Translators – Some are too soft. Some are too hard. (Image source: http://kindergartencrayons.blogspot.com)

Historically, most companies have selected a single mapping and translation tool to support their B2B integration programs. Conventional wisdom suggests that you can lower costs and improve productivity by standardizing on one vendor’s technology for mapping and translation; using multiple mapping tools and translators from multiple vendors is viewed as highly inefficient. So what do you do? A new cloud-based service called a translation hypervisor is emerging that offers a cost-effective answer to this dilemma.
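To illustrate what a “simple” map looks like, here is a toy Python example that renames ERP fields onto an invented target layout using a plain lookup table. The point is that a map like this needs no business logic, database calls or heavyweight tooling – which is exactly why forcing it through a high-end translator is overkill:

```python
# A "simple map": a pure field-for-field renaming from ERP field names to a
# made-up target standard. No business logic, lookups or enrichment needed.
FIELD_MAP = {
    "order_number": "PO_ID",
    "order_date":   "PO_DATE",
    "ship_to_code": "SHIP_TO",
    "total_amount": "AMOUNT",
}

def translate_simple(erp_record):
    return {target: erp_record[source] for source, target in FIELD_MAP.items()}

erp_record = {"order_number": "4711", "order_date": "2013-11-01",
              "ship_to_code": "DC-09", "total_amount": "1999.00"}
print(translate_simple(erp_record))
```

The complex half of the portfolio is everything this sketch is not: conditional logic, database and API calls, and enrichment – and that is the gap the hypervisor bridges.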

Read More

Securities – Collateral for Dummies, Damage Free

In this blog, I am introducing another industry buzzword: Straight-Through Processing. To illustrate what this means, I am looking at the use case of the typical collateral services offered by a Bank. Let’s examine at a high level what the challenges are from a people, process and technology perspective, and what can be done in the wider context of the new daily reporting regulations. To do this, I’m using real-life examples shared by Curt Brill, a seasoned professional from various global securities and funds services functions.

Collateral Services offered by Banks

Collateral services are a growth area for Banks; however, there is little automation in place to support them, especially for daily collateral calculations and reporting. Clients traditionally send inbound instructions via fax, email, or spreadsheet, and the bank matches broker and asset manager/corporate instructions manually before acting. To run this process daily, and for many thousands of accounts, banks need to receive instructions in a standard manner. Automation is key, and it is only possible with a consistent, consolidated set of IT processes and business instructions that enable the removal of manual intervention. This is known as Straight-Through Processing (STP).

Unfortunately there isn’t currently a unique, simple and highly adopted industry standard or best practice. Things can vary tremendously between regions, because of legacy systems or IT compromises made by each stakeholder and counterparty in the value chain. There is an increasing amount of convergence, with only a handful of standards and best practices globally; however, because the devil is in the details, each Bank needs to bridge that final gap with its own choice of operating model, IT infrastructure, and integration and messaging technology. Straight-Through Processing isn’t only the ultimate goal for Banks; the mirror processes happening with Brokers and Clearing Houses also require automation. There are at least a few areas where STP should apply as much as possible: Collateral Management, Broker Dealer Services, Liquidity Services, Collateral Segregation, CCP Clearing and Collateralisation.

GXS operates services that augment the level of STP by enabling our clients to send instructions in non-standard formats through our platform; we can translate and enrich them with reference data to create a standard format that banks can process. This improves client servicing and onboarding for these bank services. From an Operations and Post-Trade perspective, it allows an audit trail to be kept, from client input formats through to bank standards, helping banks resolve client inquiries. The Bank can further automate the process flow by leveraging operational tools to reject or process instructions that do not match or contain errors. From a client and counterparty on-boarding perspective, Banks can deliver bespoke portal applications that clients can use to instruct their banks in a standard manner. Alongside portal applications, GXS also provides connectivity and messaging services (SFTP, MQ, etc.) for clients to send in instructions. From a reporting perspective, the flow of inbound and outbound data allows Banks to centrally store, process and extract detailed information to calculate accurate collateral positions and issue daily reports.
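As a toy illustration of the matching step that STP automates, the Python sketch below pairs broker instructions with client instructions by trade reference and flags any disagreement for manual repair – the exception queue that the operational tools mentioned above would surface. All field names and values are invented:

```python
# Hypothetical normalised instructions, after translation from fax/email/spreadsheet.
broker_instructions = [
    {"trade_ref": "T-1001", "amount": 5_000_000, "ccy": "USD"},
    {"trade_ref": "T-1002", "amount": 2_250_000, "ccy": "EUR"},
]
client_instructions = [
    {"trade_ref": "T-1001", "amount": 5_000_000, "ccy": "USD"},
    {"trade_ref": "T-1002", "amount": 2_200_000, "ccy": "EUR"},  # amounts disagree
]

def match(brokers, clients):
    """Match instruction pairs on trade reference; flag any disagreement."""
    clients_by_ref = {c["trade_ref"]: c for c in clients}
    matched, exceptions = [], []
    for b in brokers:
        c = clients_by_ref.get(b["trade_ref"])
        if c and c["amount"] == b["amount"] and c["ccy"] == b["ccy"]:
            matched.append(b["trade_ref"])      # straight-through: no touch
        else:
            exceptions.append(b["trade_ref"])   # route to manual repair queue
    return matched, exceptions

matched, exceptions = match(broker_instructions, client_instructions)
print("STP:", matched)            # ['T-1001']
print("Exceptions:", exceptions)  # ['T-1002']
```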

Read More