EDI

Finally, Some Hard Facts About EDI – (1) EDI Still #1 By Far

Although EDI has been around since the 1980s, many thought newer technologies would replace it. Portals and specialized multi-party trading exchanges held out the promise of lower cost, faster partner onboarding, and greater participation rates by trading partners (suppliers, customers, logistics providers, financial institutions) than traditional EDI. But according to the new study "EDI: Workhorse of the Value Chain," published by industry analyst and founder of Supply Chain Insights, Lora Cecere, "nothing could be further from the truth. It [EDI] is the workhorse of the extended supply chain." Here are two of the interesting statistics from the report that back up her statement:

1. EDI is the method of choice. While 70% of all orders are automated via EDI/XML, portals, and/or exchanges, EDI dominates. The chart shows that for orders received from customers (sales orders), 55% are received via EDI (either fully integrated or requiring minimal manual intervention), while only 10% are received via a portal and 7% via exchanges; the rest arrive via other methods, including manual ones. The results are similar for companies sending orders (purchase orders) to their suppliers: 62% send them via EDI, while only 7% use a portal and 1% use trading exchanges. This is expected, because EDI provides a universally accepted, cross-industry standard format with significant adoption. When a business offers a portal option to its trading partners, some manual intervention is usually required, causing cycle-time delays and increasing the likelihood of errors. Multi-party exchanges (e.g. Elemica, GHX) are usually industry-specific platforms that target specific markets and require significant investments in specialized technologies and processes, and they have not yet achieved EDI's level of adoption. As Lora Cecere notes: "After a decade of hype, the use of multi-party trading exchanges for purchase order fulfillment is in its infancy in procure-to-pay processes."

2. EDI users are quite satisfied with how it improves supply chain performance. Survey participants were asked to rate how effectively the implementation of various EDI documents improved supply chain performance, using a scale of 1-7, where 1 indicated "not at all effective" and 7 indicated "very effective." As the chart below shows, every EDI document significantly improved supply chain efficiency for more than half the respondents, who gave ratings between 5 and 7. The documents with the greatest impact were orders and advance ship notices, rated between 5 and 7 by 84% and 80% of respondents, respectively.

So, EDI is the most commonly used B2B e-commerce technology, and its users are quite satisfied with its performance. In fact, for many companies EDI has become the lifeblood of their business, making them more efficient, driving down costs, and increasing customer satisfaction. It is a means by which they differentiate themselves from the competition, and it provides the visibility into ordering and delivery processes that enables business success. No wonder EDI is indeed alive and well. In future blogs, I will discuss additional statistics from Supply Chain Insights' report. If you would like to read the report, get your copy here.

Read More

What was Driving B2B in 2013? – Top Ten Most Popular Blog Posts

As we reach the end of 2013, I thought it would be a good time to review the top ten most popular posts on Driving B2B for the year. The Top Ten list below was derived from Google Analytics and looks at which posts, from all my blog entries over the years, were most visited during 2013.

1. Why is India's Automotive Industry Growing so Quickly? – Even though I posted this blog in March 2012, it has been the most viewed post on Driving B2B for 2013. I think it helps to highlight the continued interest in setting up automotive operations in India. Growing consumer wealth, an interest in premium-level cars and low cost labour are all contributing towards continued inward investment in the country from the world's automotive industry. Interest in Jaguar Land Rover, recently discussed in this blog and owned by India's TATA Motors, has also helped to fuel the growth of the automotive industry in India.

2. How will Cloud Computing Benefit the Manufacturing Industry? – This blog was originally posted in February 2011 and each year proves to be one of the most popular reads on Driving B2B. Cloud computing has transformed the way in which businesses can operate. The manufacturing sector is truly global in nature, and this blog described how the industry could benefit from deploying cloud-based infrastructures to manage the supply of direct materials to their production operations. Today's manufacturers need to enter new markets quickly to remain competitive, and cloud B2B infrastructures provide the scalability and flexibility to achieve this.

3. Top Ten Trends That Will Impact High Tech Supply Chains in 2014 – This blog was posted in August 2013 and has been the most popular of the posts published on Driving B2B during 2013. The high tech industry is currently going through an exciting period with the widespread adoption of tablet devices, wearable devices and the 'Internet of Things'. The adoption of these three tech trends alone has transformed the high tech industry over the past 12 months and is likely to continue driving consumer interest in 2014. This blog describes the top ten tech trends that are likely to impact high tech supply chains in 2014.

4. How the 'Internet of Things' will Impact B2B and Global Supply Chains – This blog was only posted in October this year, yet it has quickly become one of the most popular, receiving a large number of views in just a two-month time frame. That is not surprising, as the Internet of Things has been one of the most talked-about tech trends of 2013. This year companies have been learning about IoT, but I think in 2014 we will see more companies actually start to deploy machine-to-machine connectivity environments. I will continue to keep a close eye on this emerging sector as we go through 2014, as it will help to drive convergence between the physical and digital supply chains.

5. How the Brazilian Government Plans to Stimulate Growth Across Their Automotive Industry – This post was originally published in July 2012 and has continued to prove popular amongst visitors to Driving B2B. As with India, Brazil is also seeing exponential growth in its automotive industry. Imported automotive parts and vehicles are subject to high taxes, and as a way of boosting its domestic automotive industry the Brazilian government introduced the INOVAR directive in 2012. This has proved very successful, as many of the global car manufacturers have now invested billions of dollars in setting up new plants in the country.

6. Build to Order or Build to Stock? – This is one of my oldest blogs to appear in the Top Ten list for 2013, yet it demonstrates a continued interest in the automotive industry in reducing inventory levels by adopting build-to-order production processes. Changing consumer demand, combined with exponential growth in premium car sales in the emerging markets, has led many vehicle manufacturers to implement build-to-order production. This in turn led to increased adoption of Just-In-Time production techniques, which helped to considerably reduce the volume of parts flowing across automotive supply chains.

7. Arriba, Arriba, The Automotive Industry Speeds Up its Investments in Mexico – This post was originally published in 2012 and continues the theme in this Top Ten list of continued interest in the emerging markets. Over the last two years Mexico has emerged as one of the most important automotive manufacturing hubs in the world. Many Far Eastern and European vehicle manufacturers have established a presence in the country, as it provides the ideal stepping stone for exporting vehicles into the lucrative North American market.

8. Why Eastern Europe Could Benefit from the 'Perfect Storm' Currently Brewing in the High Tech Industry – This blog was originally posted in October 2012 and discusses how Eastern Europe has become a magnet for inward investment from the high tech industry. Restructuring across the industry and a continued interest from Far Eastern companies in entering the Western European market has led to significant inward investment in the region. Close proximity to Germany, for example, and access to a highly skilled, low cost workforce have made this one of the fastest-growing high tech investment regions in the world.

9. How Cloud B2B Integration Enables Michelin's International Operations – This blog was posted in September this year and promoted our recent joint webinar with Michelin, which has become one of the most popular manufacturing-related webinars we have produced this year. Michelin is one of the world's leading manufacturers and distributors of tyres, and this post discusses how the company uses cloud-based B2B integration to connect with its extensive trading partner community around the world.

10. German Automotive Industry to Move From VDA to Global EDIFACT Messages – This blog was posted in June this year and discussed the German automotive industry's move towards using Global EDIFACT messages rather than VDA messages. German automotive companies such as VW have globalised their operations in recent years and found the VDA message set too restrictive for managing global logistics flows. The introduction of the Global Message set aims to standardise the way in which the German automotive industry exchanges messages with its global trading partner community.

Read More

EDI will Support Japan’s Implementation of the WCO SAFE Framework for Imported Goods

In June 2005 the World Customs Organisation (WCO) council adopted the SAFE Framework, a globally agreed set of standards to secure and facilitate global trade that acts as a deterrent to international terrorism, secures revenue collection and promotes trade facilitation worldwide. Trade facilitation has been highlighted by many industry analysts as one of the key benefits, as the framework provides a much higher level of visibility into activities across a global supply chain.

Many countries have introduced their own variant of the SAFE Framework. In the European Union, the Entry Summary Declaration (ENS) was introduced by the 27 member countries for all modes of transport. Mexico, Australia and Brazil have their own rules in place to support the framework, and China is currently working on its own version. North America was one of the first to implement the ruling, partly to help minimise terror-related activities, and in early 2010 it extended the rule by introducing the '10+2' compliance initiative. Under the 10+2 ruling, before merchandise arrives in the United States, importers, customs brokers or freight forwarders must submit ten data elements: the first eight no later than 24 hours before the cargo is loaded on its U.S.-bound vessel, and the last two no later than 24 hours prior to the ship's arrival at a U.S. port. Logistics carriers must also submit two pieces of information, hence the name 10+2.

In March 2014, Japan will become the latest country to adopt the SAFE Framework with the introduction of its Advanced Filing Requirement (AFR). The AFR goes into effect on 1st March 2014 and the penalty phase will apply ten days later, on 10th March 2014. The Japan AFR requires all logistics companies to submit electronic shipping details 24 hours prior to the vessel's departure for every bill of lading covering goods intended to land in Japan. Japan will be getting tough on companies that do not submit information electronically, with a potential $5,000 fine or a year in jail; a more serious option could be the Japanese Customs Agency refusing to issue a permit to discharge the cargo. To ensure compliance, many companies have been collaborating with the Nippon Automated Cargo and Port Consolidated System (NACCS), the private sector IT division of Japan Customs. NACCS has been working tirelessly over the past 12 months to educate the market on what must be sent to Japan Customs ahead of a shipment leaving its port of origin. NACCS requires nearly thirty pieces of information to be submitted, and the exact data fields required can be found HERE.

The introduction of the WCO SAFE Framework has helped global customs and border control agencies to improve significantly the way in which they process inbound shipments and to remove a great deal of paperwork from their processes. It has also allowed customs agencies to modernise their IT infrastructures to support global carriers more efficiently. Improved visibility of information relating to inbound shipments has not only helped to minimise terror-related activities but has also contributed towards a reduction in the amount of counterfeit goods or parts entering today's global supply chains. As with other countries, Japan has decided that information to support the AFR should be sent electronically via EDI, the global 'language' of today's B2B activities across trading partner communities. Businesses today face an increasing amount of regulatory compliance, and EDI adoption is helping companies adhere to these new regulations and ensure that the wheels of global trade continue turning.
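To make this more concrete, here is a hedged sketch in Python of the kind of shipment record an AFR-style advance filing might carry, together with the 24-hour deadline check. The field names and values are simplified illustrations only; the actual data elements are defined by the NACCS specification referenced above.

```python
from datetime import datetime, timedelta

# Illustrative AFR-style filing record (simplified field names, invented values).
afr_filing = {
    "bill_of_lading_number": "MAEU123456789",
    "vessel_name": "EXAMPLE MAERSK",
    "voyage_number": "1403E",
    "port_of_loading": "SGSIN",      # UN/LOCODE for Singapore
    "port_of_discharge": "JPTYO",    # UN/LOCODE for Tokyo
    "shipper": {"name": "Example Exports Pte Ltd", "address": "Singapore"},
    "consignee": {"name": "Example Imports KK", "address": "Tokyo, Japan"},
    "cargo_description": "Automotive parts",
    "number_of_packages": 420,
    "gross_weight_kg": 18500,
    "container_numbers": ["MSKU1234567"],
}

# The filing must reach Japan Customs no later than 24 hours before the vessel
# departs its port of loading, so a simple deadline check looks like this:
departure = datetime(2014, 3, 5, 18, 0)
filing_deadline = departure - timedelta(hours=24)
print("Submit AFR data for", afr_filing["bill_of_lading_number"], "by", filing_deadline)
```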

Read More

Commsbursting

Every night, retailers around the world batch up their Point of Sale (POS) data and transmit it to their suppliers shortly after the stores close. Suppliers can gain a great deal of insight from inspecting retail sales data. They can understand how many of each of their products were sold and in which stores. If they are lucky, the retailer may provide insight into what other items were in the shopper's basket and purchased at the same time. And if they are really lucky, the retailer may provide demographics about the specific shopper who made the purchase, which suppliers can use to understand the profile of their target customer.

Transmitting POS data is critical to demand sensing in the consumer products supply chain, but it often comes at the expense of a migraine headache for the IT department. The root cause of these headaches is that each retailer sends data differently. Some retailers send each supplier the data for their specific SKUs; others send their entire day's results across all suppliers. Some retailers send five fields for each purchase transaction, while others might send as many as 100. Some days only a subset of the POS data may be transmitted if stores are unable to report in a timely manner. And during busy periods, such as the post-Thanksgiving holiday buying season, there may be a spike in sales for particular items, which means much larger files. As a result, the size of the POS files to be received can vary considerably from day to day.

Why are large POS files a problem for IT? They can be a big problem if the IT infrastructure for which they are destined wasn't expecting them. First, there may not be enough disk space to accommodate the files. Suppose you have a 100GB hard drive, and your NOC has set an alert to notify them when the disk has less than 10% capacity remaining. At 9pm the disk has 12GB of space remaining, but the retailers start sending over unusually large files at 10pm, whose combined size is in excess of 14GB. Suddenly you are out of space and cannot accept the data. Large files also tend to choke firewalls and local area network capacity, and they can monopolize the resources of the Managed File Transfer or B2B integration software that processes them. If your B2B/MFT software becomes too overloaded you may get what is effectively a Denial of Service (DoS) condition for other transactions: a customer who wants to send you a $1 million purchase order via EDI is unable to do so, and your HR team, which needs to submit the weekly payroll run to the local bank, is unable to do so either.

POS files are not the only type of large file. These days companies are exchanging many different types of unstructured data – images, videos, audio files, telemetry, logs and entire databases. And you never know when you will receive one of these files, because the sender rarely gives any warning. In today's era of nearly ubiquitous broadband and virtually free storage, why should they be bothered with worrying about file sizes?

By now you can probably guess where this is leading. Couldn't cloud computing help with this problem? The large file transfer problem should be easily solved with the principles of elasticity that cloud providers bring to storage and processing power. Enter commsbursting, a new technique being used by cloud-based integration providers such as GXS to automatically provision additional file transfer capacity in situations where large files threaten to deny service. For example, if processing or storage utilization hits a certain threshold (e.g. 70%), the cloud provider could auto-provision additional resources to accommodate the spike in traffic.

Commsbursting is not only useful in situations with large files; it can be used to respond to any type of spike in demand. For example, most of the payment clearinghouses around the world operate only during business hours – a payment processing window might run from 9am to 4pm. As a result, near the end of the window, as 4pm approaches, there is a surge in payment requests as accounting groups rush to get transactions recorded with today's date. The pre-cutoff volumes are relatively unpredictable, but banks do not want to be in the position of telling a client that they could not process an important wire transfer submitted at 3:55pm because their IT infrastructure could not accommodate the load. Using a commsbursting model to handle the spike in payment transaction volume is an elegant and economical solution for a financial institution.

So you may be wondering – how do you get commsbursting for your B2B integration environment? Technically, there is no reason why you could not build the capability into a private cloud in your own data center, but the ROI of holding additional capacity in reserve to burst into may not be very compelling. With a public cloud, however, the economics are far more favorable: a provider can recover the costs from a wide base of customers, each of whom benefits at only a fraction of the investment that would be required in-house.
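To illustrate the idea, here is a minimal sketch in Python of a commsbursting-style rule. The thresholds and the provision/release calls are illustrative assumptions, standing in for whatever capacity API a cloud provider actually exposes.

```python
# Minimal sketch of a commsbursting-style rule: burst when utilization crosses
# a threshold, release the extra capacity once the spike has passed.

BURST_THRESHOLD = 0.70    # start bursting at 70% utilization (example value)
RELEASE_THRESHOLD = 0.40  # release burst capacity once utilization falls back

def check_and_burst(used_gb, total_gb, burst_active, provision, release):
    """Decide whether to add or release burst capacity for a file-transfer node."""
    utilization = used_gb / total_gb
    if not burst_active and utilization >= BURST_THRESHOLD:
        provision(extra_gb=total_gb)   # e.g. temporarily double the capacity
        return True
    if burst_active and utilization <= RELEASE_THRESHOLD:
        release()
        return False
    return burst_active

# Example: a 100GB volume with 88GB already used would trigger a burst.
active = check_and_burst(
    88, 100, False,
    provision=lambda extra_gb: print(f"provisioning +{extra_gb}GB of burst capacity"),
    release=lambda: print("releasing burst capacity"),
)
```

The same pattern applies to the payment-window example above: the metric being watched would simply be queued transactions rather than disk space.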

Read More

EDI Benefits – Hard facts now available!

EDI has helped simplify and improve commerce between trading partners for years, and its benefits continue to grow as it is applied to more business processes such as electronic procurement, automated receiving, electronic invoicing, and electronic payments. EDI can reduce the cost of personnel and office space, improve data quality, speed up business cycles, improve efficiency, and provide strategic business benefits. Sometimes, in order to decide whether to embark on a new EDI program or expand current EDI projects, companies seek quantitative data to educate their executive teams and build a more holistic understanding of EDI processes and benefits for the greater organization. They want hard facts that answer questions such as:

- How many days earlier can orders be shipped when EDI is used for the ordering process?
- What are the actual cost savings that companies realize when they use EDI Advance Ship Notices (ASNs)?
- What are the actual cost savings that companies realize when they use barcode labels or RFID tags?
- What are the top challenges that businesses face when EDI is not used as part of the ordering process?
- How do supplier companies really benefit when they comply with their customers' requests to exchange business documents via EDI?

Want the answers to these questions and more? Watch this 30-minute webinar in which industry analyst and founder of Supply Chain Insights, Lora Cecere, discussed the key findings and takeaways from her new study, EDI: Workhorse of the Value Chain; A Closer Look at B2B Connectivity Benchmarks in the Extended Supply Chain.

Read More

Reduce Risk with Secure Information Exchange

Every business is an information business, and this is why digitization is a key corporate priority. Every enterprise exchanges massive amounts of information on a daily basis, both inside and outside of the firewall. Given the volume, variety, and velocity of the information being exchanged, security has become a top priority for the CIO. Security risks are intensified by globalization, a reliance on email for business collaboration, BYOD and mobile access, growing regulatory pressures, bad actors, and cloud computing. With more than 80 percent of enterprise data residing outside of structured ERP systems, much of this information is in transit outside the business firewall. In fact, more than 600 million records have been breached in the U.S. alone since 2005 (privacyrights.org), and one-third of these breaches were due to unintended disclosure, insider fraud, or the use of mobile devices. Poor information exchange systems and practices lead to security threats at many levels: to employees, projects, competitive advantage, national security, reputation, brand, and in some cases the business itself.

In view of these challenges, IT departments have a mandate to transform reactive, unsecured, and uncontrolled information exchanges into safer, more flexible, managed, and compliant processes. CIOs need easy, integrated, and trusted solutions to support all of their business information exchanges – from vendor invoicing, payroll submissions, and the transfer of healthcare records to securing corporate IP. OpenText has responded to these customer needs with our Information Exchange Suite, an integrated set of cloud-based messaging services that comprises Secure Email, Secure MFT (Managed File Transfer), Fax, EDI (Electronic Data Interchange) and Notifications. Recent innovations in the newly released Information Exchange Suite include cloud-based secure messaging, large-file transfer acceleration, data leak prevention, real-time audit trails, and integration with desktop, mobile and other systems. Each service is designed to support the business requirement for exchanging information with anyone, anywhere, and in any format, while instilling confidence that the exchange of information is accessible, efficient, and trusted. Enterprises are empowered with the infrastructure for secure and reliable exchange of information to help improve operational performance, reduce risk, and enable enterprise agility. Imagine the productivity gains and competitive edge your organization would realize if the exchange of information could be made faster, easier and more secure! I'm pleased to share these advancements with our customers, and more information is available here. Next month, I look forward to providing an update on GXS.

Read More

How Does the Automotive Industry Plan to Embrace the New Dodd-Frank Conflict Minerals Law?

In an earlier blog entry I discussed how a new ruling being introduced in North America is likely to impact manufacturing supply chains around the world. The ruling essentially makes companies more accountable for where they source certain materials that go into their products. The Dodd-Frank law's conflict minerals provision has been introduced to try to curb the funding of rebel groups in the Democratic Republic of Congo and its immediate neighbouring countries. Four key minerals – tin, tungsten, tantalum and gold, collectively known as the 3TG minerals – are impacted by this ruling. For a more general introduction to 3TG minerals and how the Dodd-Frank law will impact their sourcing, please see my previous blog on this subject area, Click Here.

The manufacturing industry is probably going to be the hardest hit by this new ruling, as every manufacturer must be able to demonstrate which minerals are in their products and, more importantly, where they were sourced from. The high tech and automotive industries will be severely affected, and in my previous blog I discussed what steps the high tech industry was taking to address this issue. This blog will briefly review how the new ruling will impact the automotive industry. The image below shows just a small selection of parts within a vehicle that are likely to use 3TG minerals in some shape or form. Tantalum is probably the most widely used of these minerals and appears in many different areas of a vehicle. Given that a car may contain over three thousand individual components, you can imagine the huge task facing the OEMs in trying to identify not just which components use 3TG minerals, but in what quantities as well. It has been estimated that simply adhering to this new law will cost the automotive industry between $3 billion and $4 billion.

With the reporting period already underway in North America and reports due to the Securities and Exchange Commission (SEC) by 31st May 2014, there is a growing sense of urgency amongst automotive companies to introduce compliance procedures for their supply chains. In April 2011, six manufacturers – Chrysler, Ford, General Motors, Honda, Nissan and Toyota – issued a joint letter to their respective suppliers informing them of the new reporting requirements and requesting their cooperation in identifying which parts may contain 3TG minerals. The six companies outlined three basic steps (a simple illustration of the first step appears at the end of this post):

- Determine which parts/assemblies incorporate one or more of the identified conflict minerals or their derivatives
- Assess the supply chains associated with those parts/assemblies
- Engage with suppliers to identify the smelters used in a supply chain to process the conflict minerals, or validate the origin of the conflict minerals as recycled/scrap

In August 2012, the SEC finalised the rule for complying with the conflict minerals provision of the Dodd-Frank law. These rules, especially given the global nature of the automotive industry, have major implications for nearly every automotive company, not just in North America. Even companies headquartered outside of the US, and those which do not report to the SEC, may be subject to conflict minerals requests from customers who do report to the SEC or who sit in the supply chains of those companies or their tier suppliers. One of the key ways that companies achieve compliance is by ensuring that the internal organisation is aligned, and the purchasing department will typically take the lead on this initiative.
After all, the purchasing department is responsible for day-to-day communications with the supply chain and manages all of those communications and interactions. When you consider that there are over 190,000 automotive manufacturers and suppliers potentially involved, the route to compliance could be a long and complex one. It will require significant resources to ensure compliance and to make sure that filings are submitted to the SEC on time. The Automotive Industry Action Group (AIAG) has been working tirelessly over the last couple of years to educate the industry on what it needs to do to ensure compliance. AIAG has taken the Electronic Industry Citizenship Coalition (EICC) conflict minerals reporting template and created its own web-based version of the reporting tool to help companies survey their trading partner communities.

A recent study by the analyst firm PwC, referenced in the chart below, highlighted some of the key challenges the automotive industry faces in ensuring compliance across its supply chains. Respondents said that getting accurate and complete information from relevant suppliers would be their biggest challenge; in fact, 31% identified it as the greatest challenge in the conflict minerals compliance process. Identifying which companies across the automotive supply chain may be unknowingly supplying parts containing 3TG minerals is one challenge, but being able to reach out to them efficiently is another challenge altogether. A further difficulty faced by many companies today is that contact information is potentially held within many different back-end business systems, and keeping this data up to date is an ongoing challenge.

GXS Active Community is an enterprise-wide collaboration platform that could potentially help address this particular issue. A web-based collaboration platform that allows suppliers to update their own contact information as and when required helps ensure that you can reach your supplier community in a more efficient manner. At the end of the day, if a supplier wishes to do business with a prospective customer, then it is in its own interest to at least make sure it is contactable. Active Community not only allows companies to keep up-to-date information about each and every supplier, it also provides a platform for sending regular communications to a trading partner community. Ensuring that suppliers adhere to various compliance procedures is becoming a key part of the trading partner management process today, and Active Community provides tools that allow regular assessments to be sent out for completion by a supply base. For example, the EICC reporting template could easily be replicated within Active Community, an example of which is shown below, and a supplier community would be able to use this platform to meet the compliance requirements of the conflict minerals reporting law.

As discussed earlier, it is not just companies in North America that will need to submit evidence to the SEC that their supply chains do not contain conflict minerals. The European Union completed a public consultation in June 2013, and a recommendation on the necessary steps to be taken by European member countries will be announced by the end of 2013.
The European Commissioner for Trade recently gave a speech on responsible sourcing of conflict minerals, which can be read here. It is widely expected that the European Union will embrace the OECD framework highlighted in my earlier blog on this subject, and other regions are expected to embrace the framework as well moving forwards. Given that three Japanese OEMs were involved in the initial communication to their supply base in North America, Japan's government is expected to take a closer look at this initiative in the near future too. The automotive industry is fortunate to have a proactive group of industry bodies – namely AIAG in North America, Odette in Europe and JAMA in Japan – that are willing to work closely with regional suppliers to ensure that they meet the various compliance initiatives. Odette and JAMA will hopefully be able to utilise much of the initial work undertaken by AIAG to support their own conflict minerals reporting initiatives. I recently recorded a webinar on conflict minerals and how GXS Active Community could be used as a potential reporting platform. GXS will make this webinar available shortly, so please feel free to register to learn more about the new conflict minerals law and how GXS can help ensure compliance across your supply chain.
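To illustrate the first of the three steps outlined earlier (determining which parts incorporate conflict minerals), here is a hedged sketch in Python. The bill of materials, part numbers and material declarations are invented for illustration and are not drawn from any real supplier data.

```python
# Illustrative sketch: walk a bill of materials and flag parts whose declared
# materials include any of the 3TG minerals, so they can be targeted for a
# conflict minerals survey. All data below is invented.
THREE_TG = {"tin", "tungsten", "tantalum", "gold"}

bill_of_materials = [
    {"part": "capacitor-7741", "supplier": "Supplier A", "materials": {"tantalum", "nickel"}},
    {"part": "wiring-harness-22", "supplier": "Supplier B", "materials": {"copper", "tin"}},
    {"part": "door-seal-9", "supplier": "Supplier C", "materials": {"rubber"}},
]

def flag_3tg_parts(bom):
    """Return the parts (and their suppliers) that need a conflict minerals survey."""
    return [
        {"part": item["part"], "supplier": item["supplier"],
         "minerals": sorted(item["materials"] & THREE_TG)}
        for item in bom
        if item["materials"] & THREE_TG
    ]

for hit in flag_3tg_parts(bill_of_materials):
    print(f"{hit['part']} from {hit['supplier']} contains: {', '.join(hit['minerals'])}")
```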

Read More

Security with Digital Certificates: Should you generate your own or use a Certificate Authority?

In the world of B2B, the recommended approach for ensuring the security of the documents you exchange with business partners – such as your suppliers, customers, logistics providers and financial institutions – via the Internet is the same encryption approach used by many communications protocols, such as AS2 and SFTP. These protocols use a system of public and private keys – one set for the sending company and one set for the receiving company – while leveraging digital certificates to enable the easy exchange and management of the key pairs. (See How Digital Certificates Help Ensure the Security of EDI Data.)

One of the decisions you'll need to make when using this approach is how to generate the digital certificates your company uses. You have two options: (1) you can generate your own, using special software, or (2) you can use one of the Certificate Authorities (CAs), such as Verisign and Entrust, to generate and manage them on your behalf. If the digital certificate is generated by a CA, it is usually valid for one or two years; if you generate it yourself, you can make it valid for a longer period. When certificates expire, they need to be renewed or replaced, and you must provide the new certificate to your trading partners in advance of expiration to ensure that the critical business documents you exchange, such as purchase orders and invoices, can continue to flow without interruption.

For an annual fee, a certificate authority will issue digital certificates and can also provide additional services, such as:

- If a certificate is compromised – for example, the private key has been lost or stolen – the CA can "revoke" it before it expires. Revoked certificates are placed on a revocation list that your software automatically checks to verify a certificate prior to its use.
- The CA ensures that the certificate holder is who they claim to be by verifying their credentials. This adds an additional level of assurance about the trustworthiness of the business partners with whom you are exchanging documents.
- Prompted by the expiration date within your partner's certificate, the CA will verify the identity of your trading partner on a regular basis, increasing the security of the system still further.

The alternative to using a CA is to get everyone in your community to "self-generate" certificates, allowing them to set their own expiration dates. The benefits of this approach include:

- It's free, as many B2B software applications include a certificate self-generation capability.
- You may have fewer administration headaches, because everyone can set longer certificate expiration dates, say 5 or 10 years. Then, instead of having to update your system with everyone's new certificate every one or two years, as would be necessary for CA-issued certificates, you only need to do it every 5-10 years.

However, longer expiration dates reduce the overall security of the system, since no organization is "policing" the system and confirming that a certificate really does belong to the party it appears to come from. If your trading partners set the rules, you may need to support both models, with some partners asking you to use a certificate from a CA while others will accept self-generated certificates. Whichever route you choose, you must be careful not to lose access to your private key (by forgetting your own password, for instance), since neither a CA nor a system that self-generates certificates can retrieve it. In these circumstances, you would need to generate a new certificate and distribute it to all of your trading partners, and you or your partners may need to re-send some documents if they were sent using the old key. To learn more about the best options for B2B communications, watch this webinar: How to Determine the Best Communications Protocol for B2B Integration.
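As an illustration of the self-generation option discussed above, here is a minimal sketch using the Python cryptography package to create a self-signed certificate with a deliberately long, five-year validity period. The organization and host names are invented, and a real deployment would also need to protect the private key and distribute the certificate to trading partners.

```python
from datetime import datetime, timedelta
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa

# Generate the key pair; the private key must never leave your control.
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# For a self-signed certificate the subject and issuer are the same party.
name = x509.Name([
    x509.NameAttribute(NameOID.ORGANIZATION_NAME, u"Example Trading Co"),   # illustrative
    x509.NameAttribute(NameOID.COMMON_NAME, u"as2.example-trading.com"),    # illustrative
])

certificate = (
    x509.CertificateBuilder()
    .subject_name(name)
    .issuer_name(name)
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(datetime.utcnow())
    .not_valid_after(datetime.utcnow() + timedelta(days=5 * 365))  # the long validity discussed above
    .sign(key, hashes.SHA256())
)

# The PEM-encoded certificate (public information) is what you hand to partners.
print(certificate.public_bytes(serialization.Encoding.PEM).decode())
```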

Read More

Driving Innovation & Growth

The growth of unstructured information inside the enterprise is staggering. In fact, experts estimate that over 80 percent of data in organizations is unstructured and is growing at a rate of over 36 percent year over year (1). Managing this information across different formats, devices, and applications is a challenge for organizations that is not going away. There is profound value in this unstructured information – in fact, your company's future depends on it. How is the enterprise dealing with all of these new data types? Well, according to Forrester, it isn't: in a survey conducted in May 2013, only 13 percent of respondents had a formal information management strategy in place (2). This is why we've spent the last year focusing our efforts on our biggest synchronized software release to date. Announced at Enterprise World, this finely choreographed release features software advancements across our Enterprise Information Management (EIM) suite designed to help organizations manage huge amounts of data and unlock the untapped value of their information to create competitive advantage. Our latest release features over 300 integration points and stronger synchronization across five suites of software: Content Suite, Process Suite, Experience Suite, Information Exchange Suite, and Discovery Suite.

The OpenText Vision: A Holistic View of EIM

Let me touch on some of the highlights, suite by suite:

- Content Suite: reduce costs through security features, information governance, and content lifecycle management. Innovations include an easier-to-use interface, APIs, reports, a report writer, and Archive in the Cloud.
- Process Suite: automate processes to improve performance with new Smart Process Apps, including Case Management, Case Intelligence, and flexible deployment on premise or in the Cloud.
- Experience Suite: create the best possible, consistent experience with every interaction through enhancements like omni-channel publishing and adaptive media, web and social analytics, ecommerce connectors, and our new HTML5 user experience.
- Information Exchange Suite: build trust and reliability, and reduce risk with the secure exchange of information from any user on any device to any destination. Innovations include an advanced messaging services layer for fax, notification and EDI services, real-time audit trails, and data loss prevention capabilities.
- Discovery Suite: empower people to find, understand, and leverage enterprise information for greater insight and better decision making with solutions for auto-classification, content migration, content analytics, semantic search, eDiscovery – and a CIO dashboard.

With our latest release, we're also introducing AppWorks, common RESTful services, and an EIM developer platform that accelerates the speed of development and introduces opportunities for innovation to our customers and partners. With AppWorks, developers can begin to write code using our suites within hours as opposed to weeks, using standard languages such as Java, JavaScript, and HTML5. The EIM suites outlined above are integrated through AppWorks to leverage the value of the combined suites as a comprehensive EIM platform.

Support for Sophisticated Information Flows

AppWorks builds out our holistic EIM strategy by supporting complete and integrated information flows to maximize the value of information across the enterprise. EIM is the next generation of enterprise software. Our latest release delivers a strong technology foundation for our customers to build on and establishes EIM as the mission-critical solution to drive insight, innovation, and growth.

(1) Ray Paquet, "Technology Trends You Can't Afford to Ignore", Gartner Inc., http://www.gartner.com/it/content/1503500/1503515/january_19_tech_trends_you_cant_afford_to_ignore_rpaquet.pdf (accessed 10 Nov. 2012).
(2) Alan Weintrub, "The Enterprise Information Management Barbell Strengthens Your Information Value," © 2013, Forrester Research, Inc., July 15, 2013.

Read More

The Rise of the Machine Connected Supply Chain

Two weeks ago I was fortunate to get an invite from Cisco to attend their 'Internet of Things' World Forum in Barcelona. As I have discussed before, the Internet of Things is going to herald the fourth industrial revolution, and Cisco, SAP, Oracle, GE and many other tech and industrial companies want to be part of this new and exciting sector. Some of the subjects discussed at the forum, especially in healthcare, would not have looked out of place in the 2003 film 'Terminator 3: Rise of the Machines' – hence the title of this particular blog entry! I personally found the event to be one of the best conferences I have attended in recent years, and there were a lot of conceptual ideas discussed which gave plenty of food for thought.

One such idea, which for some reason I have not been able to forget since the conference, is the thought that any type of machine could potentially have its own avatar or Facebook-style profile page. To put it another way: in your personal life you may have an online profile such as Facebook or LinkedIn to keep in touch with friends and work colleagues. What if every machine connected to the internet had its own avatar or online profile as well? I thought this was quite an interesting concept, but what would it look like and how would it work? I like challenges such as this, so I had to try and come up with a concept, using Facebook as a basic template, as you will see below.

The other reason I am interested in this is from a B2B and supply chain point of view. Here at OpenText we offer a collaboration – or rather, community management – platform called Active Community, which allows companies to maintain a centralised database of contacts for all trading partners across a supply chain. If machines were somehow able to interact directly with the supply chain, either upstream or downstream, it could potentially introduce a new set of operational benefits and efficiencies. The Internet of Things offers the potential to connect the physical and digital supply chains in a way that has not been possible before. But what if individual machines could somehow interact with users, or vice versa – could a Facebook-type 'asset management' environment provide a neat way of managing machine-to-machine communications across a supply chain?

Even though the concept of the machine-based avatar was discussed at the Forum, relatively little information was offered as to how it could work. I am sure there are some conceptual ideas out there already, but I thought I would offer my own vision of how this could work. One of the discussion panels at the forum included a speaker from Caterpillar who said that every machine that leaves their factory includes a WiFi module so that it can be connected to the internet and information can be analysed and exchanged remotely when it is out in the field. We are just entering the fourth industrial revolution, and the focus so far has been on simply getting machines connected to the internet. I have really only scratched the surface of this particular subject area, and every aspect of the operation of today's supply chains is likely to be impacted in some way by the Internet of Things – from logistics networks, warehouses and distribution centres through to improved monitoring of inventory levels in factories and retail stores. I think we are heading for exciting times!
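As a purely conceptual sketch – my own illustration, not a description of any existing product or of the ideas presented at the Forum – a machine's 'profile page' might be little more than a structured record that people and systems can follow and react to. Every field and value below is invented.

```python
# Conceptual sketch of a machine "profile": an asset that posts status updates
# which its followers (people, portals, analytics systems) can act on.
machine_profile = {
    "asset_id": "TRUCK-000123",             # illustrative serial number
    "type": "Mining truck",
    "location": {"lat": -23.36, "lon": 119.73},
    "followers": ["maintenance-team", "parts-supplier-portal", "fleet-analytics"],
    "recent_activity": [
        {"timestamp": "2013-11-12T06:40:00Z", "event": "Ordered replacement filter kit"},
        {"timestamp": "2013-11-11T22:15:00Z", "event": "Completed 12h duty cycle"},
    ],
}

def post_status(profile, timestamp, message):
    """The machine 'posts' a status; every follower is notified and can respond."""
    profile["recent_activity"].insert(0, {"timestamp": timestamp, "event": message})
    return [f"notify {follower}: {message}" for follower in profile["followers"]]

print(post_status(machine_profile, "2013-11-12T09:05:00Z",
                  "Fuel level below 15% - schedule refuelling"))
```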

Read More

Integration-as-a-Product (IaaP) – The Forthcoming Wikipedia Entry

Integration-as-a-Product (IaaP) was a late twentieth century model in which corporations used their in-house IT organizations to operate integration technology behind the firewall. The model worked as follows: corporations would license integration software from technology vendors, and the software would be installed on servers running in a corporate data center. Vendors typically received license fees of hundreds of thousands of dollars up front, as well as ongoing maintenance fees of 20%. Payment to the vendor was irrevocable regardless of whether the business outcome sought by the corporation was achieved.

Corporations struggled to successfully implement integration as a product for three decades. After 20 years of experimentation, the industry had only achieved 30-50% adoption of electronic commerce technologies. The model was followed by a shift in thinking towards a different approach – see "Integration-as-a-Service" or "Cloud-Based Integration."

As with the dinosaurs, there are many theories as to why Integration-as-a-Product failed. Some of the most popular root causes are believed to be:

- Proliferation of XML standards and IP communications options, resulting in overwhelming complexity for IT organizations.
- Inability of point-to-point connections to scale beyond 20% of a community.
- Ineffectiveness of IT organizations in convincing small businesses to participate, due to the costs, resources and training required to purchase and implement software.
- Lack of accountability amongst technology vendors to ensure success: payment to the vendor was made irrespective of the business outcome.
- Failure of IT organizations to keep pace with trading partner growth caused by the rapid rise in outsourcing of functions such as manufacturing, logistics, distribution and aftermarket service.
- Widespread acceptance of easier-to-use and lower-cost cloud and SaaS models.

See "Epic Fail" for other examples of similar failed business models.

Read More

Securities – Collateral for Dummies, Damage Free – Part 3

In my third and final blog in this series, I am covering the basics of aggregated collateral reporting and segregated client collateral reporting.

Aggregated collateral reporting

Collateral is almost always diversified and fragmented across many locations, making it difficult to optimise its use across a legal entity or portfolio. Banks, brokers, asset managers and corporates are starting to investigate or build collateral reporting services that aggregate all of the collateral posted, as a preliminary step to optimising its use globally. The key business stakeholders in this process are usually in the Collateral Management, Margining and Treasury functions, and the service is delivered by Product Managers, Operations Managers, Technology Managers, Treasurers, COOs and CAOs. Banks and brokers are the stakeholders who look to create and offer this service, while asset managers and corporates are the ones who subscribe to it. In order to successfully create an aggregated reporting service, organisations need to integrate, capture and normalise the relevant daily and intra-day feeds coming from their venues, which are usually delivered in many different formats and at different times of the day. GXS often becomes the delivery channel for these feeds; data normalisation, translation capabilities and processing logic are key to creating a "standard" format that can be loaded into the reporting service as a Straight-Through Process (STP).

Segregated client collateral

As discussed in part 1 of this blog series, client collateral must now be segregated from a bank's or broker's own assets. In some cases the collateral must be deposited at a clearing house in a segregated account, in the client's name, and regulations require that this be reported to regulators periodically. Legally Segregated, Operationally Commingled (LSOC) is the set of rules adopted by the CFTC; it is the basis for the complete legal segregation model, which determines how margin for cleared swaps will be held for the benefit of customers of a Futures Commission Merchant (FCM). Futures clearing, collateral management and liquidity services all have a business stake in the process, with Operations Managers, Product Managers and IT running and delivering the service. As a result of the regulations and industry best practices, there is a larger volume of segregated accounts, increased operational and management effort, and an increase of several orders of magnitude in the reporting of balances to regulators and clients. Add to this the need to manage diverse file formats and underlying systems, and there is an opportunity for a consolidated, systematic and automated approach to segregated client collateral reporting. GXS often assists the FCM in creating the connectivity, formats and feeds needed to comply with these regulations and to enable their clients' reporting on LSOC accounts.

The goal of this three-part blog series was to help put some of the industry buzzwords and themes in context. I hope it has been useful, and I look forward to receiving feedback on how your organisation is dealing with these challenges.
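To illustrate the aggregation step described above, here is a minimal sketch in Python: once each venue's feed has been normalised into a common record shape, collateral can be rolled up by legal entity and currency. The venue names and figures are invented examples, not real positions.

```python
# Minimal sketch of aggregated collateral reporting after feed normalisation.
from collections import defaultdict

normalised_feeds = [
    {"entity": "FUND-A", "venue": "Clearing House 1", "ccy": "USD", "market_value": 25_000_000},
    {"entity": "FUND-A", "venue": "Tri-party Agent",  "ccy": "USD", "market_value": 10_500_000},
    {"entity": "FUND-A", "venue": "Bilateral CSA",    "ccy": "EUR", "market_value":  4_200_000},
]

def aggregate(records):
    """Roll up collateral market value by (legal entity, currency)."""
    totals = defaultdict(float)
    for rec in records:
        totals[(rec["entity"], rec["ccy"])] += rec["market_value"]
    return totals

for (entity, ccy), total in aggregate(normalised_feeds).items():
    print(f"{entity}: {total:,.0f} {ccy} of collateral posted across venues")
```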

Read More

How Digital Certificates Help Ensure the Security of EDI Data

When you exchange EDI documents via the Internet, the security of your data is of vital importance. It is critical that only the intended recipient can read the sensitive data being transmitted, such as purchase orders, invoices, or remittance advices. While encryption technologies have long been used to achieve the level of security needed for this sensitive data, their usage can be hampered by the difficulty of exchanging the "keys" upon which they depend. Digital certificates resolve the key exchange and management issues.

There are two basic kinds of cryptography. The first is called "symmetric key encryption," which involves the use of a single encryption/decryption key, often called a "shared secret." The key can be a code of any length, for example 768 bits or more; the longer and more random the key, the greater the security achieved. To use this approach for B2B, that long key would need to be exchanged with every company with which a business exchanges documents. There are several issues with the symmetric key approach. First, how do you exchange the key in a secure fashion? Just as you shouldn't exchange passwords or credit card numbers via email, email is not a good vehicle for the shared secret – it's not secure! Also, if you exchange documents with multiple partners, you probably want each partner to have a different key. That way, if one partner inadvertently receives documents intended for another partner, they cannot decrypt them, because their key works only on documents intended for them. Managing all these keys can become a logistical nightmare.

A better approach is to use "asymmetric encryption," which uses a set of two keys – a "public key" that is used to encrypt and a "private key" that is used to decrypt – combined with a "digital certificate," which makes the key exchange and management process very easy. The public key is called "public" because everyone who sends you documents can use the same key. There are no security worries about the public key falling into unauthorized hands, since this key cannot be used to decrypt or read your messages; it can only be used to encrypt messages being sent to you. You have a second key, called a "private" key, which is not shared with anyone else. This key is accessed by your communications software and is used to decrypt documents sent to you by partners that have encrypted them using your public key.

The digital certificate is actually an electronic "container" for the public key and other important information, such as organization name, email address, and server identification. The certificate is formatted in a standard way, enabling software to immediately read the certificate and "know" where to find the specific pieces of data needed. The certificate can be exchanged via email, because the information in it is all public, so there is no security concern. In addition, the digital certificate enables you to keep track of which public key belongs to which company – this is extremely helpful when you need to manage hundreds or even thousands of keys for all the business partners to whom you are sending documents, each with its own public key. You can obtain a digital certificate for your company from an authorized certificate authority – such as VeriSign or Thawte – that acts as a trusted third party vouching for the validity of the keys, or you can use special software to create your own digital certificate.
In the B2B world, asymmetric encryption combined with digital certificates is the better choice. The public and private keys help ensure that (1) the data is encrypted during transmission over the Internet and (2) only the intended recipient is capable of decrypting it. The digital certificate makes the process easy and manageable. To learn more about the best options for B2B communications, watch this webinar: How to Determine the Best Communications Protocol for B2B Integration.
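As a small illustration of the asymmetric model, the sketch below uses the Python cryptography package to generate a key pair, encrypt a short message with the public key and decrypt it with the private key. (In practice, protocols such as AS2 encrypt the document with a one-time symmetric key and use the recipient's public key to protect that key, but the principle is the same.)

```python
# A minimal sketch of asymmetric encryption: the sender encrypts with the
# recipient's public key, and only the recipient's private key can decrypt.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# The recipient generates a key pair and shares only the public key.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# The sender encrypts a (small) message with the recipient's public key.
message = b"PO-12345: 100 units, ship by 2014-03-01"   # illustrative document content
ciphertext = public_key.encrypt(
    message,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None),
)

# Only the holder of the private key can recover the original document.
plaintext = private_key.decrypt(
    ciphertext,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None),
)
assert plaintext == message
```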

Read More

We Need Hypervisors for Translators

In my last post, I talked about how most map development tools and translation engines on the market are either "too soft" or "too hard" to meet the variety of B2B integration needs a company has. This creates a dilemma for companies trying to standardize on just one mapping environment and one translation tool. A new cloud-based service, called a translation hypervisor, will allow you to run not just one but multiple translators to support your B2B integration needs. The principle is similar to the hypervisors you are familiar with, which allow you to run multiple instances of an operating system on the same server.

Recall the example from my last post. Let's say you are a company that has a lot of maps. About half of these maps are relatively simple: they require straightforward mapping of data fields from your ERP application into an industry standard such as OAGi XML. But the other half of the maps are complex. Perhaps you have a demanding group of customers that each want you to integrate directly to their business applications. These customers want business logic included in the maps, along with calls to databases and APIs to enrich or validate data. Which type of mapping and translation tool would you choose for this scenario?

Historically, companies would have tried to standardize on a single translator across the enterprise to reduce costs. In the case above, however, this forces companies to standardize on complex mapping and translation tools to support the most challenging requirements. Developing simple maps becomes a lot more complicated and time-consuming than necessary – it's like forcing developers to use Java when they could have used Microsoft Excel.

The translation hypervisor changes all that. The customer in the example above could simply subscribe to two different translation services. Simpler maps could run on a lower-tier service with no "bells and whistles," while more complex maps could run on a high-end service designed for performance and rich functionality. Requests for translation from external business applications would be routed to the hypervisor, which looks up the map most appropriate for the inbound request and then routes it to the appropriate translator.

How does the customer benefit? Rather than taking a one-size-fits-all approach, the translation hypervisor allows companies to obtain the optimal economics for each specific type of map. Simpler maps can be run on the lower-cost cloud-based translation service, while the complex maps run on a higher-tier, higher-priced service. These economics would be difficult for most IT organizations to achieve in-house. Sure, there is no technical reason why a large IT organization could not build a multi-translator service in its data centers. However, the ROI for building such a capability would not be justifiable for most companies, because their map volumes are relatively low. But when you can aggregate a group of customers' maps in the cloud, the ROI from running multiple translators becomes much more compelling. Most cloud providers service a wide range of customers (some big and some small) with a wide range of mapping needs (some easy and some hard). Using a hypervisor approach provides the flexibility to offer customers maps and cloud-based translation services at a range of price points and implementation time frames.
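Here is a conceptual sketch in Python – not GXS's actual implementation – of the routing idea: the hypervisor looks up the map registered for an inbound document and dispatches it to a translator tier suited to that map's complexity. The registry entries and translator tiers are invented for illustration.

```python
# Conceptual sketch of translation-hypervisor routing.
from dataclasses import dataclass

@dataclass
class MapProfile:
    map_id: str
    complexity: str  # "simple" or "complex" (illustrative classification)

# Hypothetical registry of maps, keyed by (sender, document type).
MAP_REGISTRY = {
    ("ACME", "850"): MapProfile("acme_po_to_oagis", "simple"),
    ("MEGACORP", "custom-xml"): MapProfile("megacorp_direct_integration", "complex"),
}

# Two translator tiers: a basic, low-cost engine and a high-end engine.
TRANSLATORS = {
    "simple": lambda doc, m: f"[basic translator] {m.map_id} applied to {doc}",
    "complex": lambda doc, m: f"[high-end translator] {m.map_id} applied with business logic to {doc}",
}

def route(sender: str, doc_type: str, document: str) -> str:
    """Route a translation request to the most appropriate translator tier."""
    profile = MAP_REGISTRY[(sender, doc_type)]
    return TRANSLATORS[profile.complexity](document, profile)

print(route("ACME", "850", "<raw EDI purchase order>"))
print(route("MEGACORP", "custom-xml", "<custom XML order>"))
```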

Read More

How GXS Addresses the B2B Challenges Faced by Today’s Automotive Industry

GXS has been supporting companies across the automotive industry for more than forty years. Today, GXS works with many of the world's leading automotive OEMs and tier 1 suppliers, helping to address B2B challenges such as supporting international expansion and ERP integration projects. I have been at GXS for just over seven years now, and 2013 represents a major turning point for the industry: the restructuring undertaken by many companies after the recent economic recession is now starting to pay off, and the industry is going through a renaissance. But what are the industry trends around the world, what are the B2B-related challenges faced by today's automotive industry, and how does GXS help to address them? Over the past few months I have been experimenting with a presentation tool called Prezi; with a bit of patience you can produce fairly eye-catching presentations that make PowerPoint-based presentations look very old fashioned indeed! I thought I would share my latest Prezi presentation. The video is only 11 minutes long, but I hope it gives you a good idea of the challenges faced by today's automotive industry and how B2B integration solutions can help to address them. If you would like to access the Prezi presentation used to create this video, then please click here. You will also find another presentation in my Prezi profile that provides a high-level introduction to cloud B2B integration.

Read More

Standardizing on a Single Mapping and Translation Tool for B2B Integration

There are a wide variety of mapping tools available on the market today. Some are designed to be easy to use by IT professionals without much formal training in map development. These might come with wizards or other assisted mapping tools to guide you through the process. Or they might come with libraries of pre-built maps that require only slight customization. At the other extreme are mapping tools designed to support very complex scenarios, but which require more experienced IT professionals to use.

Runtime translation engines, which execute the maps as requested, also come in different shapes and sizes. Some translators perform better with certain types of standards (e.g. EDI and XML) than others. Some translators are designed for very high volumes of complex maps that require lots of processing overhead. These might support the creation of complex business logic or web services calls to enrich data. At the other extreme are translators designed for lower processing volumes and simpler maps.

But most companies have a very diverse mix of maps that don’t fit into the “sweet spot” of one particular mapping and translation tool. Let’s say you are a company that develops a lot of maps. About half of these maps are relatively simple. They require straightforward mapping of data fields from your ERP application into an industry standard such as EDI or OAGi XML. But the other half of the maps are complex. Perhaps you have a demanding group of customers, each of which wants you to integrate directly to their business applications. These customers want business logic included in the maps, along with calls to databases and APIs to enrich or validate data. Which mapping and translation tool would you choose for this scenario? Neither is really a good choice. The simple mapping tools and low-performance translators are too soft. The complex mapping tools and high-performance translators are too hard.

(The Goldilocks Problem with Translators – Some are too soft. Some are too hard. Image Source: http://kindergartencrayons.blogspot.com)

Historically, most companies have selected a single mapping and translation tool to support their B2B integration programs. Conventional wisdom suggests that you can lower costs and improve productivity by standardizing on one vendor’s technology for mapping and translation. Using multiple different mapping tools and translators from multiple vendors is viewed as highly inefficient. So what do you do? A new cloud-based service called a translation hypervisor is emerging that offers a cost-effective answer to this dilemma.
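To make the “too soft / too hard” contrast more concrete, here is a rough sketch of what a simple map versus a complex map might look like if expressed as code. The field names, minimum-order rule and pricing lookup service are invented for illustration only – real maps are built in the vendor’s mapping tool, not hand-written like this.

# Illustration only: a "simple" map vs. a "complex" map (hypothetical fields).
import urllib.request, json

def simple_map(erp_order: dict) -> dict:
    """Straightforward field-to-field mapping from an ERP record
    into a generic order structure."""
    return {
        "OrderNumber": erp_order["order_id"],
        "OrderDate":   erp_order["created_on"],
        "BuyerName":   erp_order["customer_name"],
    }

def complex_map(erp_order: dict, price_service_url: str) -> dict:
    """Mapping that also embeds business logic and an API call to
    enrich and validate the data before it is sent to the partner."""
    target = simple_map(erp_order)

    # Business logic: reject orders below a partner-specific minimum value.
    if erp_order["total_value"] < 100.0:
        raise ValueError("Order below partner's minimum order value")

    # Enrichment: look up the partner's own part number via a pricing API.
    with urllib.request.urlopen(f"{price_service_url}?sku={erp_order['sku']}") as resp:
        enriched = json.load(resp)
    target["PartnerPartNumber"] = enriched["partner_part_number"]
    target["UnitPrice"] = enriched["contract_price"]
    return target

The first function is the kind of work a wizard-driven tool handles easily; the second is where the high-end mapping environments earn their keep.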

Read More

Securities – Collateral for Dummies, Damage Free

In this blog, I am introducing another industry buzzword: Straight-Through Processing. To illustrate what this means, I am looking at the use case of the typical collateral services offered by a bank. Let’s examine at a high level what the challenges are from a people, process and technology perspective, and what can be done in the wider context of the new daily reporting regulations. In order to do this, I’m using real-life examples shared by Curt Brill, a seasoned professional from various global securities and funds services functions.

Collateral Services offered by Banks

Collateral services are a growth area for banks; however, there is little automation in place to support them, especially for daily collateral calculations and reporting. Clients traditionally send inbound instructions via fax, email, or spreadsheet, and the bank matches broker and asset manager/corporate instructions manually before acting. To carry out this process daily and for many thousands of accounts, banks need to receive instructions in a standard manner. Automation is key, and it is only possible with a consistent, consolidated set of IT processes and business instructions that enable the removal of manual intervention. This is known as Straight-Through Processing (STP).

Unfortunately, there isn’t currently a single, simple and widely adopted industry standard or best practice. Things can vary tremendously between regions because of legacy systems or IT compromises made by each stakeholder and counterparty in the value chain. There is an increasing amount of convergence, with only a handful of standards and best practices globally; however, because the devil is in the details, each bank needs to bridge that final gap with its own choice of operating model, IT infrastructure, and integration and messaging technology. Straight-Through Processing isn’t only the ultimate goal for banks; the mirror processes happening with brokers and clearing houses also require automation. There are at least a few areas where STP should apply as much as possible: Collateral Management, Broker Dealer Services, Liquidity Services, Collateral Segregation, CCP Clearing and Collateralisation.

GXS operates services that raise the level of STP by enabling our clients to send instructions in non-standard formats through our platform; we can translate and enrich them with reference data to create a standard format banks can process. This improves client servicing and onboarding for these bank services. From an operations and post-trade perspective, this allows banks to keep an audit trail from client input formats through to bank standards, which helps them resolve client inquiries. The bank can further automate the process flow by leveraging operational tools to reject or process instructions that do not match or have errors. From a client and counterparty onboarding perspective, banks can deliver bespoke portal applications that clients can use to instruct their banks in a standard manner. Alongside portal applications, GXS also provides connectivity and messaging services (SFTP, MQ, etc.) for clients to send in instructions. From a reporting perspective, the flow of inbound and outbound data allows banks to centrally store, process and extract detailed information to calculate accurate collateral positions and issue daily reports.
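As a rough illustration of the translation and enrichment step described above, the sketch below normalizes a spreadsheet-style collateral instruction into a simplified standard structure. The field names, reference data and “standard” layout are invented for illustration; real collateral messages are far richer and would typically follow industry formats such as the SWIFT securities message series.

# Illustration only: normalizing a non-standard collateral instruction
# into a simplified "standard" structure with reference-data enrichment.
import csv, io

# Hypothetical reference data used to enrich the instruction.
ACCOUNT_REFERENCE = {
    "ACME-FUND-01": {"custody_account": "CU123456", "base_currency": "USD"},
}

def normalize_instruction(csv_row: str) -> dict:
    """Translate one client-supplied CSV instruction into a consistent
    internal format a bank's collateral system could process (STP)."""
    reader = csv.DictReader(io.StringIO(csv_row),
                            fieldnames=["client_id", "counterparty", "amount", "value_date"])
    raw = next(reader)

    ref = ACCOUNT_REFERENCE.get(raw["client_id"])
    if ref is None:
        raise ValueError(f"Unknown client {raw['client_id']}: route to manual repair queue")

    return {
        "custody_account": ref["custody_account"],
        "counterparty":    raw["counterparty"].strip().upper(),
        "amount":          float(raw["amount"]),
        "currency":        ref["base_currency"],
        "value_date":      raw["value_date"],
    }

if __name__ == "__main__":
    print(normalize_instruction("ACME-FUND-01,Broker XYZ,2500000,2013-11-15"))

The point is the pattern: accept whatever the client can send, enrich it from reference data, and either produce a standard instruction or push the exception into an operational repair queue.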

Read More

What’s the difference between the 4 types of EDI acknowledgments?

When you send a document electronically to your business partner, it is critical that you know for sure whether or not it was received. Furthermore, if the document you are sending is important, such as a purchase order for a critical item, you must be confident that your supplier not only received the order, but is committed to fulfilling it. In the world of electronic data interchange (EDI), there are 4 types of acknowledgments that can help to answer the question: “Did you receive my document?”

1) Basic communications-level status message – This is a basic status message provided by all communications protocols, from the most basic to the most sophisticated – e.g., FTP, SFTP, AS2. The computer that receives the transmission notifies the sending computer of receipt of a certain amount of data – e.g., “We received the 256 bytes of data you sent.” The data can be of any type, such as a text file, EDI data, or spreadsheets. This status message is exchanged at the communications protocol level.

2) Message Disposition Notification (MDN) – The MDN is a special notification that is a key component of the AS2 communications standard. Because AS2 places EDI documents in an additional envelope to enable secure transmission over the Internet, you need to know that the EDI message was successfully extracted from that envelope, decrypted so that it can be processed by the recipient’s EDI translator, and that the electronic signature was validated. Your AS2 communications software will generally manage both the communication status and the envelope extraction status. Having this message is a critical first step to indicate that the document arrived successfully. But, as the sender of EDI data, you also need to know the answer to: “Was your system able to open the document and read it?” Enter the FA…

3) Functional Acknowledgment (FA) document – The FA is a status document that was specifically defined for the exchange of EDI documents. It is an electronic “receipt” from the EDI translator of the receiving computer to the EDI translator of the sender’s computer, indicating that the document was both received and read successfully. Specifically, the receiver’s translator was able to open the EDI envelope and confirm that the contents within were structurally and syntactically valid according to the EDI standard being used. The FA – often referred to as a 997 in the ANSI standard or a CONTRL message in the EDIFACT standard – is different from the communications-level status message in two ways: 1) the FA is exchanged at the translator level, while the communications status message is exchanged at the communications protocol level; 2) the FA confirms that the document was readable, which is not addressed at all by the communications status message. The FA certainly improves a sender’s level of confidence in document receipt, but it still does not indicate that the receiver is acting upon the business-level contents. For example, if the document is a purchase order, the FA does not acknowledge that the order will be fulfilled. For that we need the next acknowledgment level – the business-level acknowledgment…

4) Business-level acknowledgment – The business-level acknowledgment goes far beyond the traditional FA. It confirms the content of the document received and also that the receiver is taking appropriate action.
For example, upon a supplier’s receipt of an EDI Purchase Order, the supplier responds with a Purchase Order Acknowledgment, which can tell a buyer, down to the line item level, whether the order is accepted, including quantities and shipping windows. If the supplier is unable to meet the Purchase Order requirements, the acknowledgment can include specific information about what quantities they can fulfill, and whether they need to split shipments across multiple dates. Other examples of business-level acknowledgment documents include the Purchase Order Change Acknowledgment and Application Advice documents. The way you monitor business status will depend on your own business processes and the software you are using internally to manage those processes. Ensuring that documents don’t “get lost in the system” typically requires tracking the progress of the document in the four ways described above. Three of them apply to any standard that automates the exchange of documents; the MDN applies only to transmissions exchanged using AS2 or AS3. To learn more about B2B communications, including how acknowledgments are handled, watch this webinar, Which Communications Protocol is Best for B2B Integration?
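For a concrete sense of what a functional acknowledgment carries, here is a minimal sketch that inspects the AK5 and AK9 segments of an X12 997 to determine whether the original transaction set was accepted. The segment and element separators are simplified assumptions, and a production translator handles far more variation than this toy parser.

# Illustration only: extracting accept/reject status from a simplified X12 997.
def parse_997_status(edi_text: str, element_sep: str = "*", segment_sep: str = "~") -> dict:
    """Return the acknowledgment codes from AK5 (transaction set response)
    and AK9 (functional group response) segments of a 997."""
    status = {"transaction_sets": [], "group": None}
    for segment in edi_text.split(segment_sep):
        elements = segment.strip().split(element_sep)
        if elements[0] == "AK5":
            # AK5*A = accepted, AK5*E = accepted with errors, AK5*R = rejected
            status["transaction_sets"].append(elements[1])
        elif elements[0] == "AK9":
            status["group"] = elements[1]
    return status

if __name__ == "__main__":
    sample = "ST*997*0001~AK1*PO*1234~AK2*850*000000001~AK5*A~AK9*A*1*1*1~SE*6*0001~"
    print(parse_997_status(sample))   # {'transaction_sets': ['A'], 'group': 'A'}

A business-level acknowledgment such as the 855 Purchase Order Acknowledgment carries far more than these codes – line-item quantities, prices and ship dates – which is exactly why it sits at the top of the four levels described above.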

Read More

How the ‘Internet of Things’ will Impact B2B and Global Supply Chains

Over the past few months, CIOs and executives around the world have been trying to embrace a new set of IT-related buzzwords, namely the Internet of Things (IoT), the Internet of Everything and the Industrial Internet. All three terms are essentially used to describe machine-to-machine (M2M) connectivity across the internet. The IoT relies on any machine or device being connected to the internet, via fixed wire or wireless communications links, and then being able to transmit information in one form or another. Countless research articles describing these three terms have been published on the internet, and I do not want to spend too much time discussing them in detail; instead I will discuss how they relate to B2B and the supply chain and how they are going to change the way in which companies work with each other in the future.

The IoT has provided a much needed boost to the high tech and manufacturing sectors, but the technology being deployed is usable across virtually any industry sector, and therein lies the business opportunity. The most widely published figure estimating the market size for the IoT was produced by Cisco, which believes the market will be worth $14.4 trillion by 2020. Cisco has taken the lead in terms of developing thought leadership in this area; their recent Internet of Everything study provided some interesting insights, including the benefits chart shown below. IDC estimates an IoT market size of $9 trillion by 2020. Either of these estimates is a very big number, which is why IBM, GE, Infineon, Qualcomm and many other companies are investing significant amounts of money in IoT-based technologies and services. IDC suggests there are three enablers driving the IoT, namely:

- On-going development of smart cities, cars and houses
- Enhanced connectivity infrastructure
- An increasingly connected culture where everyone wants to be connected to the internet at home, at work or in the car

IDC goes on to predict that there will be 212 billion ‘things’ connected to the internet by 2020. It is important to stress that the IoT is in its infancy, but wired connected devices have been in use for many years. The idea of the IoT initially became popular through the Auto-ID Center, a non-profit collaboration of private companies and academic institutions that pioneered the development of a web-like infrastructure for tracking shipments around the world through the use of RFID tags carrying electronic product codes.

The IoT relies on web-enabling virtually any type of product or piece of equipment so that data about the object can be captured and communicated. Once captured, the information is transferred from the remote device and passed, via some form of middleware, to an integration platform. Ideally, from a business point of view, all connected devices would be connected to the same integration platform to allow them to work seamlessly with back-office business environments such as supply chain management and ERP platforms. McKinsey & Company recently said in a report that the manufacturing sector is likely to see the most benefit from the IoT, and went on to predict that we are about to enter the fourth industrial revolution, or Industry 4.0. The industrial internet will see the world of manufacturing become more and more networked until everything is interlinked with everything else.
GE sees so much potential in the industrial internet that it has set up a software division, called GE Software, based in California, to look at how it can exploit this across the various products and services that it offers. McKinsey also believes that the IoT will lead to an exponential growth in data flowing across the extended enterprise, and companies will have to acquire personnel with the necessary data analysis experience to be able to process this information. These people will have to design robust algorithms for processing IoT-related information and then translate what happens in the physical world into a format that can be handled in the digital world. This requires mathematical, domain, market and context know-how. In the connected world, the physical world should be integrated with business processes, including delivering data, sending events and processing rules. The area of Big Data may just be starting to gain acceptance across different industry sectors, but the IoT will see interest in Big Data analytics grow exponentially over the coming years.

With a piece of integration software or middleware acting as the interface between the physical and digital supply chains, how can companies leverage this connection and, more importantly, how could it help to streamline supply chain processes across the extended enterprise? There are a number of ways in which the IoT could add value to supply chain strategies, not just in manufacturing, but in other sectors such as retail as well. We are at the very early stages of understanding how the IoT will impact the enterprise, but from a supply chain management point of view, here are three initial areas where the IoT could impact global supply chains:

Pervasive Visibility – relates to the way in which shipments are tracked at every stage of their journey from their point of manufacture to their point of delivery. The IoT not only provides ‘information everywhere’ but will offer ‘visibility everywhere’ as well. RFID is one such technology that was introduced to provide improved visibility of shipments, but it has sometimes struggled to offer full end-to-end visibility across a supply chain because the infrastructure to track RFID tags has not existed on a truly widespread, end-to-end basis. As more pieces of equipment, infrastructure and vehicles are connected to the internet, the traditional ‘black spots’ or visibility gaps across a supply chain where shipment visibility is limited will begin to disappear. Connected devices or infrastructure will help to plug these visibility gaps and allow shipments to be tracked end to end across a supply chain. The IoT will also allow companies to have two-way communications with their shipments at each stage of their journey across the supply chain. For example, a piece of equipment could be remotely contacted and instructed to go into an ‘installation mode’ before it arrives at the site where it will be delivered.

Proactive Replenishment – efficient inventory management has always been a challenge across the retail industry, especially when one considers the various channels through which consumers can purchase goods today. Whether buying online or through traditional brick-and-mortar stores, managing inventory levels and being able to replenish stocks efficiently is a constant challenge. It is not just retail outlets; keeping vending machines stocked is another area that could be considerably improved.
Many vending machines are in remote locations, and the only way a company gets to monitor stock levels is to physically visit the vending machine and replenish the stock as required. What if the vending machine were connected to the IoT, so that when it detects low stock for a particular item it automatically places an order for new stock, which can then be delivered before the item runs out? What if this were expanded to normal retail outlets, with low-stock signals from sensors fitted to shelving in the stores triggering the ordering of new stock? In France, many supermarkets have RFID-based price tags that allow pricing information to be updated centrally; extending this capability to check stock levels would change the way in which retail outlets manage their inventory.

Predictive Maintenance – in an earlier blog I highlighted a case for how the IoT could support the replacement of parts in serviceable products such as industrial equipment and office equipment. If a piece of equipment is able to self-diagnose a potential problem and then place an order for a replacement part, the part can be fitted before it fails. I used the example of a car engine detecting reduced flow rates across a water pump. A seal on the water pump could be leaking, causing inefficient operation of the cooling system. Before the water pump completely fails, the car’s ECU sends information via the internet to a local service centre about the potential problem and at the same time places an order for a new seal to be delivered directly to the location where the owner normally has the vehicle serviced. The service centre then automatically checks the service schedule and emails the owner to notify them of an impending issue with their car. This scenario could be applied to an aircraft, a piece of construction equipment or even a fax machine in an office; any serviceable equipment could be connected to the IoT to help detect potential problems and get them resolved as soon as possible. Direct integration with a B2B platform allows all ordering and shipment related documents to be created and tracked automatically so that service centres know exactly when the replacement parts will be delivered.

Key to all three of these areas is the ability to integrate the physical and digital supply chains. Companies will need access to a cloud-based integration platform that can integrate with a wide variety of connected devices, equipment and services. An M2M API or middleware layer that sits between the piece of equipment and the supply chain management environment will be key to providing the link between the physical and digital supply chains. Common standards will therefore have to be developed to achieve this seamlessly, and IBM has started the ball rolling by proposing MQTT, a protocol it co-developed, as the basis of how machines will communicate with each other across the internet. But this is only one part of the equation, as document/file standards and ways to process the information being transmitted between these devices must also be developed. The key challenge to widespread adoption of the IoT is achieving seamless interoperability: common standards need to be developed to allow machines to communicate at a technical level, and across different borders and cultures as well.
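As a rough sketch of how a connected device might trigger replenishment through a B2B platform, the example below uses MQTT to listen for low-stock events and hand them to a hypothetical ordering function. It is written against the paho-mqtt 1.x client library (assumed to be installed; newer versions require a callback-API version argument), and the broker address, topic names and payload layout are all invented for illustration.

# Illustration only: an MQTT listener that turns low-stock events from
# connected vending machines into replenishment orders (hypothetical names).
import json
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"          # hypothetical MQTT broker
TOPIC = "vending/+/stock-level"        # one topic per machine

def place_replenishment_order(machine_id: str, sku: str, quantity: int) -> None:
    """Stand-in for the call into a B2B integration platform that would
    create and track the purchase order and shipment documents."""
    print(f"Ordering {quantity} x {sku} for machine {machine_id}")

def on_message(client, userdata, message):
    event = json.loads(message.payload)
    # Business rule: reorder when stock falls below the reorder point.
    if event["stock_level"] < event["reorder_point"]:
        place_replenishment_order(event["machine_id"], event["sku"],
                                  event["reorder_quantity"])

client = mqtt.Client()
client.on_message = on_message
client.connect(BROKER, 1883)
client.subscribe(TOPIC)
client.loop_forever()

The interesting part is not the protocol plumbing but the hand-off: once the event reaches the integration platform, the same B2B documents used today (purchase orders, ship notices, invoices) can flow without anyone keying in an order.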
In North America, an alliance of ten companies, including Cisco and GE, is working to lobby the US government on the importance of developing open standards that will encourage broad adoption of the IoT. The alliance is aiming to address the following IoT-related issues:

- Co-engineering cyber and physical systems
- Identifying cyber-security issues and solutions
- Addressing concerns about interoperability
- Identifying ways to maintain robust wireless connections
- Setting standards for real-time data collection and analytics

It is not just in North America where IoT-related standards are being discussed. The European Research Cluster on the Internet of Things has undertaken some interesting research over the past couple of years; one of the more notable reports tried to define all the areas that have to be addressed to develop an IoT-related platform. In China, the government sees the IoT as being able to offer a key competitive advantage in the global economy. So as not to miss out on the next big internet revolution, it has instructed numerous government departments to come up with policies for how the IoT can be deployed across China.

Pervasive, proactive and preventative: three words that begin to define the benefits of the IoT, especially from a supply chain perspective. The IoT will allow the seamless exchange of information in real time between a shipment, its surroundings and a common, cloud-based integration platform that is used to connect all trading partners across the extended enterprise. I have not had time to discuss other areas such as smart grids and how, for example, the IoT will impact the electric vehicle industry; I will expand on these in future blogs. I am fortunate to be attending the Internet of Things World Forum in Barcelona in two weeks’ time and will provide an update on the latest IoT-related trends when I get back from the conference. In the meantime, have you given any thought as to how your business could benefit from or embrace the IoT?

Read More

The End of Mapping (in B2B Integration)

Maps are one of the most painful and costly aspects of a B2B integration program. But maps are also the most important part of your B2B program. Maps are the key conduit through which information flows from your enterprise applications (SAP, Oracle, Infor) into the systems of your business partners (customers, suppliers, banks). That means that every time you make changes to the data structures in your enterprise applications, you also need to make modifications to the associated maps. Every time you win a new account, you need to create a new map that converts the information the customer sends into the format of your enterprise application systems. And maps cannot easily be ported from one vendor’s translation tool to another vendor’s. If you want to change translators, you often need to rewrite your maps completely.

Wouldn’t it be great if we could find a way to stop having to use maps? Shouldn’t we be able to develop an algorithm that can automatically map and convert the data from one file format into the format of another? This type of artificial intelligence technology is certainly within reach (if not already present). Consider that:

- IBM’s Watson can interpret natural language queries and respond with answers faster than even the world’s smartest Jeopardy players.
- Siri can interpret a wide variety of human speech patterns and respond within seconds with answers to most common questions.
- Shazam can listen to 30 seconds of music playing on a radio and match it to the artist and title of the song.
- Google Goggles can scan an item via a cell phone camera and identify it with a high degree of accuracy.

So why can’t B2B translators consume structured files and automatically identify how the data fields in the source file map to the respective fields in a target? Fields like bank account numbers, street addresses, part numbers and unit prices follow a relatively consistent pattern that should be easily recognizable. Optical Character Recognition (OCR) software is able to do this today. OCR applications can scan and match fields on documents such as invoices and paper checks with 90% or greater accuracy. It is time to replace the expensive and time-consuming process of mapping with an algorithm.

But how would you build such an algorithm? The algorithm should require minimal inputs. For example, it might be exposed as a RESTful API that requires only the name of the receiver, the type of transaction and the input file format. You should only need to tell the algorithm that you want to send an invoice to Walmart from the attached EDI file, or that you want to send a wire transfer instruction to HSBC via the attached SAP IDoc. The algorithm would maintain a library of known blueprints for translating documents, which it builds up over time. For example, the algorithm might have a set of blueprints for converting wire transfer instructions into the respective HSBC format, and another set of blueprints for converting invoices into Walmart’s preferred format. Of course, there will be new translation scenarios for which no blueprints exist. In these scenarios the algorithm will need to auto-magically identify the input fields and correlate them with the associated output fields. This could be done by reading the tags next to each data field. For example, “Invoice #” next to a field is a dead giveaway. The key to success in building such an algorithm will be having enough sample data to test and optimize it until it is 99.999% accurate.
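To give a feel for the field-recognition step, here is a toy sketch that guesses what a source field represents by matching simple patterns and label hints. The patterns, labels and fallback rules are invented for illustration; a real algorithm would be trained and tuned on large volumes of sample transactions, which is exactly the point of the next paragraph.

# Toy sketch: guessing what source fields represent so they can be
# auto-mapped to target fields (hypothetical patterns and labels).
import re

FIELD_PATTERNS = {
    "invoice_number": re.compile(r"^INV[-\d]+$", re.IGNORECASE),
    "unit_price":     re.compile(r"^\d+\.\d{2}$"),
    "iban":           re.compile(r"^[A-Z]{2}\d{2}[A-Z0-9]{10,30}$"),
    "postal_code_us": re.compile(r"^\d{5}(-\d{4})?$"),
}

LABEL_HINTS = {
    "invoice #": "invoice_number",
    "price":     "unit_price",
    "zip":       "postal_code_us",
}

def classify_field(label: str, sample_value: str) -> str:
    """Guess the semantic type of a field from its label and a sample value.
    A label hint (e.g. "Invoice #") is a dead giveaway; otherwise fall back
    to pattern matching on the value itself."""
    for hint, field_type in LABEL_HINTS.items():
        if hint in label.lower():
            return field_type
    for field_type, pattern in FIELD_PATTERNS.items():
        if pattern.match(sample_value.strip()):
            return field_type
    return "unknown"

if __name__ == "__main__":
    print(classify_field("Invoice #", "INV-20431"))   # invoice_number
    print(classify_field("Amount", "129.99"))         # unit_price
    print(classify_field("Code", "90210"))            # postal_code_us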
We can take some lessons from companies such as Facebook and Google here. How did Google develop its online “translation” capabilities? It identified hundreds of books which had been reprinted in different languages and used the known associations between words and phrases across those languages to build its language translation algorithm. How did Google develop its speech recognition capabilities? It offered free “411” information lookups to gather a wide sample set of human speech patterns that it could use to build its voice recognition software. Cloud-based B2B integration providers could take a similar approach. They already have massive amounts of transactions being processed. These same transactions could be replayed in a test environment to refine and optimize an algorithm. It could take several years to optimize such an algorithm, but whoever cracks the code on this first will be able to save their customers thousands of hours of mapping activities every year.

Read More