Digital Transformation

B2B Integration – There’s No App for That?

“The App Store” is on pace to hit ten billion downloads any day now.  My four-year-old decided to drop my iPhone face-down onto the ceramic tile in my kitchen last week.  Needless to say, this shattered any plans I had for buying more apps.  Despite Apple’s ownership of the “The App Store” name, it does not hold a monopoly on the idea.  In fact, app stores are popping up everywhere.  Of course, the best-known app stores have been launched by mobile device and tablet manufacturers such as RIM, Google, Nokia and Microsoft.  But the idea is rapidly expanding beyond tablets and phones into new areas.  Earlier this month Apple launched a new App Store for Mac users.  GPS navigation device maker TomTom announced plans for an app store in 2010.  And cloud computing vendors such as Microsoft, Salesforce.com and Intuit have app stores for add-ons to their enterprise applications or infrastructure offerings.  Even my dry cleaner told me that she is planning to launch an app store later this year.

Finding Context in a River of Data

In my last blog entry, I discussed the river of data that flows between trading partners.  This river of data has the potential to deliver far beyond its role in business process execution, but there are challenges.  The first challenge is establishing a context.  The splash of water to the left could be the drop that breaks the dam, the drink that saves someone from thirst, the coolant that prevents the engine from overheating, or any one of a dozen other potentials.  The point is that you cannot tell from the picture, because it lacks any sort of context (it’s a splash from a larger body of water; wouldn’t want to leave you in suspense…).  When a purchase order, ship notice, payment, or change in inventory status comes flowing across the wire, it too needs a context to be truly useful.  But what do I mean exactly by context?  It is one of those concepts that is easiest to understand through examples.

Types of Context in B2B

Process: what business process is this transaction a part of?  One of the most common processes GXS helps customers with is the “order process”, but there are actually many intersecting processes going on in any real interaction.  For instance, orders (hopefully) result in shipments and (sadly, for the buyer) invoices.  The shipment and invoicing processes intersect the order process, so a given transaction (say, an Advance Ship Notice) may participate in many processes simultaneously.  The ability to see a document in the context of a process is very valuable.  Order processes are typically not too bad in this regard, as the originating order number is often part of the downstream transaction flow.  The trick to process context is associating a given transaction with a completely defined process so that you know how it is supposed to be handled.

Customer/Supplier (Partner): what business relationship is this transaction a part of?  Hopefully knowing your partner is not an issue, but partner context is not about identity; it is about the relationship.  If a supplier is going to be late on an order, is it part of a pattern?  Am I about to violate a service level for a customer with whom I am in the midst of renegotiating a big contract?  This is theoretically the realm of the CRM/SRM/ERP infrastructures, but very few of them operate well in real time, as events are flowing in off the wire.  The ability to put a B2B transaction in the right partner context quickly can have a big impact.

Product: what product line is addressed by this transaction?  Is it a new product, a retired product, or a product that doesn’t exist (i.e. a data error)?  Product context is also the entry point for trade promotions management in some industries.  This has been an interesting area of late, as customers have sought to tie B2B execution systems to master data management, precisely to ensure a correct product context.

Financial: how big is this transaction?  What effect does it have on the financial metrics of both organizations?  Traditionally the realm of ERP systems, the ability to look into the river of data and see the money flowing can be very powerful.  Sometimes establishing this context is as simple as integrating effectively with the ledgers in the traditional ERP systems.  Other times, smaller companies may look for help from a SaaS (software as a service) offering to deliver financially oriented reports on what is flowing (because their internal systems lag reality a bit).

Logistical: what does this transaction tell me about the logistics involved?  Specifically, is this shipment/payment/etc. going to be where I need it (either physically or financially) when I need it?  As an example, if a big promotion is planned for a given product, and several containers of it are held up in customs, that could be a big issue.  If a truck of critical components for a factory is delayed, is that an annoyance, or will it grind production to a halt?  Part of the logistics context is understanding how things move, and what ultimate impact a disruption could have.  If you know that an item is being shipped by boat, rail and truck, and it is delayed in a rail yard, can the truck make up the time?  When is the product needed?  To establish a logistical context, you need to know the planned logistics (often available in shipping transactions) and the historical performance of the modes and carriers involved.

Market/competitive: has there been a change in the order pattern?  If a few customers (or even one) change their order pattern, it could be a sign of a competitor move.  If several customers (or suppliers) suddenly change behavior, it could be a sign of a shift in the market.  Change is really only something you can observe within the context of a given relationship.  The reason B2B in context is a critical area to observe is that averages and aggregates can sometimes be misleading.  If the market is growing, but a big customer is shrinking, you may not see the customer issue in the totals.

There are many kinds of context I have not even discussed (geographical, regulatory, legal), but you get the idea.  The challenge is to decide which contexts make sense, and learn to see the river of data flowing between you and your partners through the lens of those contexts.
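The idea of layering several contexts onto one transaction at once can be sketched in a few lines of Python. All field names and rules below are illustrative assumptions for this post, not a GXS product API:

```python
# Sketch: deriving process, partner, and financial context for one
# B2B transaction. Field names and rules are illustrative only.

def derive_contexts(txn, order_index, partner_history):
    """Attach several independent contexts to a raw transaction."""
    contexts = {}

    # Process context: tie the transaction back to its originating order.
    order = order_index.get(txn.get("order_number"))
    if order:
        contexts["process"] = {"order": order["order_number"],
                               "ship_by": order["ship_by"]}

    # Partner context: not identity but relationship, e.g. is lateness a pattern?
    history = partner_history.get(txn.get("partner_id"), [])
    late = sum(1 for h in history if h["late"])
    contexts["partner"] = {"late_rate": late / len(history) if history else 0.0}

    # Financial context: how big is this transaction?
    contexts["financial"] = {"value": sum(line["qty"] * line["unit_price"]
                                          for line in txn.get("lines", []))}
    return contexts
```

A real implementation would pull the order index and partner history from live systems rather than in-memory dictionaries; the point is that each context is a separate association layered onto the same piece of data.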

The Cloud may have the Computing, but the River has the data….

Organizations around the world are busy extolling, exploring or arguing with the notion of Cloud Computing, but there is another metaphor that may grow in importance as Cloud Computing matures: the River of data that flows between organizations and their partners, which can be tapped for power and profit.

Every day, companies exchange millions of electronic documents with each other that have some very special characteristics:
– the data matches actual transactions occurring in the “real world” (orders, shipments, payments)
– the data is structured in a way that provides meaning; it can be processed and turned into information
– the transaction data has relationships to other transactions in the River that can be correlated

Unfortunately, for most organizations, the potential to tap into this powerful source of information remains just that: potential.  The challenges inherent in examining, correlating, and acting upon the massive flows of data generated by even modest enterprises have been overcome only rarely, and often not at a sufficient scale to truly exploit the opportunity.  In this post I’d like to look at some of the challenges that make this difficult; I will explore the opportunities in future posts.

Some of the Challenges….

Context: even today, most B2B transactions utilize traditional EDI standards like ANSI X12 or EDIFACT.  Despite the standardization of documents, it is frequently tricky to understand the context in which a given transaction is operating (e.g. for a given ASN, or ship notice, what order process is it part of?).  Without context, it is a free-floating piece of data, requiring some associations to turn it into information.  If a critical shipment is one day away from delivery to a store about to run a promotion, that is information; if Shipment #101 has been processed, and I don’t know what process it is part of, that’s just data.

Timeliness: while immediacy is a great benefit of the data flowing between trading partners, age is its enemy.  With today’s more efficient supply chains, data starts to go stale very rapidly.  The challenge is to connect data flowing between partners to other information and to business processes before it is too late to act upon the information.  The data flow is not unlike electricity generated by a dam, which must be sent over wires to be consumed in real time.  Data warehouses can use the last three years of data to help forecast, but a logistics system has to act upon what is happening in the chain today to deliver ROI.

Exceptions: until “the perfect order” is actually achieved, some of the most critical issues between partners are those that generate exceptions, either technically (cannot process an order or logistics document) or on the business side (wrong product, amount, price).  The latency involved in resolving these exceptions is truly scary, with most of them basically “failing out” to a manual process.  Despite the fact that the “success case” may be 100% automated, the most common “automated” exception handling is an email alert (how full is your inbox?).

Integration: the “father of all challenges” when it comes to automation (and its role in resolving the previous three challenges) is the ability to integrate the flow of data into the many systems that are capable of rapidly providing context, and of identifying and handling exceptions.  This challenge has always been daunting, but recent developments in technology have started to improve the integration possibilities.

Scale: the sheer volume of data, while being its greatest strength, may also be its biggest challenge.  Since millions of transactions flow between partners on a given day, to understand what is happening, all of those transactions have to be interpreted, put into context and analyzed for meaning.  Right now!  If the approach is too methodical, you succumb to the timeliness challenge; but if the approach is too loose, you may generate so many “false exceptions” that you actually degrade business performance rather than improving it.

These challenges are by no means the only, or even necessarily the most difficult, obstacles to successfully navigating the River, but I want to start focusing on the opportunities in my next post.  In the meantime, please share additional challenges to tapping into the flow without drowning.
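The tension between the timeliness and exceptions challenges can be made concrete with a small triage sketch. The four-hour staleness threshold and the field names are arbitrary assumptions for illustration:

```python
from datetime import datetime, timedelta

def triage(transactions, now, max_age=timedelta(hours=4)):
    """Split an incoming stream into actionable exceptions, stale items,
    and clean passes. Threshold and field names are illustrative only."""
    actionable, stale, ok = [], [], []
    for txn in transactions:
        if now - txn["received"] > max_age:
            stale.append(txn)        # too old to act upon: the timeliness trap
        elif txn.get("errors"):
            actionable.append(txn)   # a real exception worth handling now
        else:
            ok.append(txn)
    return actionable, stale, ok
```

Tune `max_age` too loose and the "actionable" bucket fills with false exceptions; too tight and everything lands in "stale" before anyone can respond.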

E-Prescriptions – The Fastest Growing B2B Networks?

In a post earlier this month I explored the question of who has the largest B2B integration network in the world.  The competition is primarily between the top vendors in the financial services, health care and supply chain segments, each of which has been using B2B technologies for over 20 years now.  But another equally interesting question is who has the fastest growing B2B network?  One of the candidates must surely be in the US Electronic Health Records (EHR) market.  The topic of Electronic Health Records has been receiving a great deal of publicity in the past year since the American Recovery and Reinvestment Act (ARRA) was passed.  ARRA stipulates that the Federal Government will provide financial incentives for health care providers which can demonstrate “meaningful use” of EHRs.  EHRs can include a variety of information about a specific patient including demographics, medications, immunizations, allergies, laboratory data, radiology reports and past medical history.  In my last post I described one of the fastest growing segments of EHR, the sharing of medication data between providers, pharmacies, and payers via e-prescription networks.  In this post I will outline who these e-prescription networks are and how fast they are growing.

Who has the Largest B2B Integration Network?

Since IBM’s acquisition of Sterling Commerce in May there has been a lot of discussion about the consolidation of B2B networks (and what it means for the industry).  One of the topics discussed amongst B2B integration professionals is – Who has the largest B2B network?  Numerous analysts and thought leaders declared GXS to be the largest following the successful acquisition of Inovis in June.  However, I am not convinced that GXS really is the biggest.  Amongst the traditional supply chain oriented EDI networks, GXS earns top ranking for network-based revenues, annual transaction volumes and number of companies connected.  However, if you were to consider other types of similar B2B networks in adjacent industries such as financial services or health care, then the #1 position is not as clear.

EBICS – The Standard for Corporate-to-Bank Communication?

2010 is an exciting time in the world of B2B integration standards for the European banking sector.  In this case, I am not referring to the continued rollout of SEPA in the EuroZone, but rather to the extended reach of EBICS.  EBICS, the Electronic Banking Internet Communications Standard, is a highly secure file transfer protocol used in the French and German banking communities for the exchange of cash management related transactions.  EBICS is the successor to an earlier standard named BCS, which was used in the German banking sector from the mid-1980s until the end of 2007.  BCS refers to the Banking Communication Standard, which was developed by the German Credit Committee, known in German as the Zentraler Kreditausschuss (ZKA) (longer, but no doubt easier to say than Eyjafjallajokull).

Does Your Supply Chain Have An Early Warning System?

New Predictive Intelligence-Based Financial Industry Risk Ranking Offers a Great Model for Global Supply Chains

The dust, or ash cloud if you prefer, has all but settled from the recent Icelandic volcanic eruption, an eruption that grounded flights yet gave flight to heightened awareness of the need to intelligently predict and respond to business disruptions.  In the days following the event, I was heartened to see humorous (and futile) attempts to master the pronunciation of Eyjafjallajokull quickly make way for discussions of supply chain risk and resiliency.

B2B E-Commerce from 2010 to 2020 – Predictions for the Next Ten Years

In each of the past few years, GXS has published a list of predictions for the coming year.  With the start of a new decade, we have taken a different approach this year.  Instead of issuing predictions about just 2010, I asked a group of eight GXS Subject Matter Experts (SMEs) to offer their opinions about how changing market conditions and new technologies will impact B2B e-Commerce over the next 10 years.  We developed 10 different predictions on a wide range of topics including cloud computing, SaaS, mobility, SOA, agile development, open source, social media, sustainability, emerging markets and demand driven supply chains.

Could Community Source be a Better Approach for Supply Chain Applications?

In my last GXS post I described the relatively small number of B2B integration vendors which have embraced open source.  In this post, I will discuss a type of open source called "community source" which I think has high applicability for the supply chain and B2B integration.  In community source, a group of companies (versus individual users) unites to develop software that solves a common business problem.  Typically the application being developed either is not available from a commercial software vendor, or is available only via a cost-prohibitive licensing model.  There are several different models of community source, but the most popular is called a “gated community,” in which only selected member organizations contribute to the development.  The gated community concept differs from traditional open source, to which the general public can contribute.  The focus of community source is not to build commercial applications for resale, but rather to create useful software that the developer community can leverage for business benefit.

Integration Smackdown: Documents versus API in B2B

I have been focused of late (a rare luxury for me) on exploring integration technologies for the emerging cloud computing providers, especially the SaaS guys.  Last week I attended Dreamforce 2009, which was an absolutely fantastic experience.  Salesforce.com has posted a bunch of the material on the site, and it is worthwhile.

While at Dreamforce, I attended training on integration with Salesforce.com, both to and from.  I enjoyed the class, and found the relatively rich set of alternatives (remember that this is a very young company that has quadrupled its workforce in the last few years) to be a good toolbox for integration.  One of the more intriguing aspects is how the company's engineers balance control of service levels (i.e. their ability to prevent you from hurting their performance when you integrate) with access.  One of their key technologies is a set of "governors" to control the resources available.

While I really enjoyed the exposure to API-level integration, I cannot help but compare all of this to how we traditionally do integration between business partners, as well as to SAP, which is via standard documents.  Apparently I'm not alone, as Benoit Lheureux of Gartner wrote a thoughtful blog post on how cloud integration will affect traditional integration.  From my point of view, API-level integration (and yes, this includes web services) is usually more powerful and functional than integration driven through documents (disclaimer: yes, I realize that at its heart web services is basically the exchange of XML documents over HTTP, but for this discussion, it feels like an API).  But document-driven exchange is easier, more interoperable, and often enjoys much higher adoption.  And a major part of the "document advantage" is the ability to separate the handling of the document from "transport" (communications), which often allows the use of familiar technologies to perform the integration.

For my purposes, API-level integration is the calling of specific low-level services, with a set of arguments, often (though not always) using web services.  Although web services is "standard", the service calls are typically specific to a given SaaS provider (i.e. proprietary).  Additionally, the service calls may change based on the configuration done for a given customer instance (i.e. custom proprietary).  Document-level integration, in contrast, uses the basic pattern of sending in a document in a defined format, often (again, not always) in XML.  This document may vary from customer to customer, but most of it is the same, and customers can often submit the document in a variety of ways (web services, FTP, etc).

SAP is a pretty popular ERP system in the current world, and it supports a wide variety of integration technologies.  In our B2B managed services world, IDocs over ALE is by far the most common integration technology, despite the fact that there are often very good reasons to use a lower-level approach (directly invoking RFCs after sending data using FTP, for instance).  Why?  Among many, many other reasons, customers and integrators like solutions that work predictably across multiple environments.  IDocs, like X12, EDIFACT, OAG, etc., are defined entities that do not change when fields or custom logic are added to an application.  But possibly a bigger reason is the ability to use existing technology and processes to perform the work.  SAP IDocs can be manipulated using standard translation technology, and then sent via ALE or other communications methods.  The ALE protocol can be implemented in a comms gateway that you already own.  Modern communications gateways have extensive support for web services today, but that is really a kind of toolbox for building custom adapters to custom services, with each one a "one-off".

This problem can be intensified if the service you are connecting to changes over time as additional data fields are added to objects (a product object is especially susceptible to change).  API-level integration is usually much more brittle for this reason, and it is one of the characteristics that led many enterprises to switch to message-oriented middleware and attempt to impose canonical standards ("canonical standards" is a fancy way of saying a common format, usually in XML, for frequently used documents, like customer; canonical standards are often adopted from outside, especially the OAGi standards).  Integrating to a single system via API is often fun; maintaining dozens or hundreds of these integrations is not.

A common pattern is for emerging technology categories to start with low-level APIs, and gradually build up to more abstract interfaces that are easier to use and less brittle.  Already, Salesforce is offering a bulk load service, as well as a "meta-data" API that allows tooling to simplify integration.  Over time, I fully expect that most major SaaS providers will provide integration methods that feel a lot like trading documents, and that B2B teams will take the lead in using them.  In the long battle between document and API integration, document-style integration will dominate, though API and service-level integration will play critical roles.
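The brittleness argument can be made concrete with a toy example; the functions and field names here are hypothetical, not any vendor's actual interface. A document consumer reads only the fields it needs, so a new field in the document does not break it, whereas a fixed API signature forces a code change on every caller when the service grows:

```python
# Sketch: document-style vs API-style integration under change.
# Function and field names are hypothetical.

def consume_document(doc):
    """Document style: read only the fields you need; extras are ignored."""
    return {"po": doc["order_number"], "total": doc["total"]}

def call_create_order_api(order_number, total):
    """API style: a fixed signature; adding a field means a new signature,
    and a code change for every caller."""
    return {"po": order_number, "total": total}

old_doc = {"order_number": "PO-7", "total": 120.0}
new_doc = {"order_number": "PO-7", "total": 120.0, "promo_code": "SPRING"}

# The same consumer handles both document versions unchanged:
assert consume_document(old_doc) == consume_document(new_doc)
```

Multiply this by dozens of partners and hundreds of object changes per year, and the appeal of stable, defined documents like IDocs or X12 becomes obvious.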

Record Cash Holdings by Large Corporations makes B2Bank Integration More Important than Ever

The Wall Street Journal published an excellent article on November 3rd about how companies are holding more cash on average than at any time in the past 40 years.  “In the second quarter, the 500 largest nonfinancial US firms by total assets held about $994 billion in cash and short term investments, or 9.8% of their assets.”  With nearly $1 trillion in cash holdings, the opportunity for the Transaction Banking divisions at major financial institutions has never been greater.  However, some institutions have an advantage over others in the areas of sales, client delivery and product features due to their superior B2Bank capabilities.

We Need a Napster-like Service for B2B File Transfers

In my last two posts I described how B2B integration infrastructures at major corporations are choking on the increasing number of large files crossing their firewalls.  Most of the existing approaches to large file transfer are expensive, proprietary and complex.  It is interesting to note that corporations have yet to achieve a simple, open, universal file sharing model comparable to what consumers enjoyed with the original Napster in 1999.  Yet business users expect to be able to share large files at work the same way they do in their everyday consumer lives.  Recent technology advances compound the problem.  Broadband connectivity has become nearly ubiquitous in most developed countries.  For example, Verizon FiOS offers an Internet package that allows home users to buy connectivity at speeds up to 50Mbps.  Furthermore, storage costs continue to decline rapidly.  Grocery stores now sell 16GB USB memory sticks in the checkout aisles for $20.  Despite these advances in networking and storage technologies, the corporate world continues to struggle with large file transfer.

Finding the Opportunity in Crisis

Among the primary definitions of the word "crisis" found in Merriam-Webster's dictionary, I think the one that best describes the past 18 months in the banking sector is "an unstable or crucial time or state of affairs in which a decisive change is impending".  As I look around, I definitely see signs of decisive change across the industry.  And as the best crisis management teams will tell you, what people really remember is not the triggering event that begins a crisis but the long-term response to it.  Already, most US businesses are giving the Federal Reserve high marks for its response to last year's meltdown.  But what about the perception among corporates, your clients, of the banking industry's response?

B2B Integration could help improve tracking of Pandemics such as H1N1 Swine Flu

I was watching the movie I Am Legend on HBO Sunday evening.  I’m not sure if there is any correlation between HBO’s decision to broadcast the film in May and the outbreak of the H1N1 Swine Flu.  However, it did start me thinking about pandemics and what could be done to better contain these outbreaks before they turn all of Manhattan into nocturnal, cannibalistic zombies.  The widespread outbreaks of H1N1 in Mexico and the US have made this subject top of mind for everyone from politicians to economists.  Of course, pandemics are yet another area in which B2B interoperability and integration technologies could play a significant role.

The Center for Information Technology Leadership (CITL) published a comprehensive report on how B2B interoperability in the US health care community could not only reduce costs but improve the quality of care.  Much of the data cited in this post is sourced from the 2004 report entitled The Value of Healthcare Information Exchange and Interoperability.  See my January post on how the Obama administration could save $75B annually from B2B interoperability in health care for more background information.

Tracking Pandemics at the State, Local and Federal Level

State laws require providers and laboratories to report cases of certain diseases to local and state public health departments.  Nationally “notifiable” diseases are forwarded by the state agencies on to the Centers for Disease Control and Prevention (CDC).  Connections between the states and the CDC are electronic and highly automated.  However, the first mile between the providers and the local and state agencies is highly manual.  Providers typically submit data via phone, fax, hard copy forms or very basic B2B communications methods such as a web portal.  For larger provider groups operating in multiple regions, notifications to state health agencies become even more cumbersome.  The 50 US states maintain more than 100 different systems to collect data, each with its own communications mode.

The most closely monitored “notifiable” diseases are frequently under-reported in the US.  Various studies conducted between 1970 and 1999 showed that only 79% of all STD, tuberculosis and AIDS cases were reported to public health agencies.  Reporting rates for other diseases were much lower, at 49%.  There are several reasons for the reporting challenges, but certainly one of the key issues is the ease with which the information can be transmitted to health authorities.  There is no question that the primitive communications methods used to collect provider data are a critical barrier to success.  However, even more problematic is the dependency upon overworked and understaffed provider personnel to take the time to consistently file the reports.

Electronic Health Records – Public Health Benefits

A better methodology for reporting on “notifiable” diseases would be to eliminate the need for human initiation altogether.  The process could be completely automated by connecting the health care providers’ Health Information Systems and Practice Management Systems, which contain the patient data, to public health and safety tracking systems.  However, connecting the tens of thousands of medical practices to the hundreds of different public health systems could prove quite an ambitious integration project.  A less complex and costly alternative would leverage the concept of Electronic Health Records (EHR).  The EHR would significantly simplify tracking of public health epidemics without the need for bespoke integration between various state agencies and each different medical provider.

The EHR provides a comprehensive set of information about each patient including demographics, medications, immunizations, allergies, physician notes, laboratory data, radiology reports and past medical history.  EHR information could be stored in a series of centralized repositories deployed around the country.  Each repository could contain the full medical records or just pointers to the locations of the records.  Triggers could be set up to automatically identify trends in data sets that might not otherwise be noticed, helping to provide an early warning system for potential disease outbreaks.  In the event of a pandemic or bioterrorist event, public health officials could easily access de-identified EHR data such as physician’s notes, patient demographics and medical history.  Without the dependency upon manual data entry, the latency of information flow could be reduced and the quality of information collected could be improved.  Administrative costs would be reduced considerably: the average cost to send a report manually is $14, as compared to only $0.03 electronically.  CITL estimated that the use of electronic data flow from providers and laboratories to public health agencies would reduce administrative costs by $195M annually.  CITL did not quantify the potential economic savings from early identification of pandemics and bioterrorist events, but there is no question that these could be in the billions of dollars.

B2B Interoperability and EHR

Of course, a key technology enabler for EHR is interoperability between the various health care providers and the corresponding state, local and federal agencies.  Medical data is transmitted between providers, payers and public agencies using a variety of B2B standards including DICOM, HL7, NCPDP, and HIPAA-compliant EDI transactions.  EHRs could aggregate the available data related to prescriptions, claims, lab reports and radiology images into an electronic record.  Additional services could be layered onto the B2B integration framework, such as data quality checks to ensure the completeness of records and business activity monitoring to identify behavioral trends.

Another concept evangelized in the CITL report is the idea of a National Electronic Disease Surveillance System (NEDSS).  The NEDSS would collect data from a number of relevant sources outside of the health care system which could be useful for monitoring.  Examples might include 911 call analysis, veterinary clinic activity, OTC pharmacy sales, school absenteeism, health web-site traffic and retail sales of facial tissue and orange juice.  Such practices were deployed by the US Department of Defense and the Utah Department of Health during the Salt Lake City Olympics in 2002.  Such an effort would require integrating additional state and local agencies, educational institutions and retail chains electronically using B2B.
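The automated reporting flow described above can be sketched in a few lines; the disease list, field names, and event structure here are simplified illustrations for this post, not real HL7 or CDC formats:

```python
# Sketch: triggering "notifiable disease" reports straight from an EHR
# feed, with no human initiation. Disease list and fields are
# simplified illustrations, not real HL7/CDC structures.

NOTIFIABLE = {"tuberculosis", "measles", "h1n1"}

def auto_reports(ehr_events):
    """Yield de-identified reports for notifiable diagnoses only."""
    for event in ehr_events:
        if event["diagnosis"] in NOTIFIABLE:
            yield {"diagnosis": event["diagnosis"],
                   "zip": event["zip"],            # coarse demographics only
                   "onset": event["onset_date"]}   # patient identifiers dropped

events = [
    {"diagnosis": "h1n1", "zip": "20850", "onset_date": "2009-05-01",
     "patient_id": "p1"},
    {"diagnosis": "sprained ankle", "zip": "20850", "onset_date": "2009-05-02",
     "patient_id": "p2"},
]
reports = list(auto_reports(events))
assert len(reports) == 1 and "patient_id" not in reports[0]
```

Because the trigger fires on the record itself, under-reporting from overworked provider staff simply drops out of the equation.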

The Importance of Integrating B2B and ERP Platforms in the Automotive Industry

Integration, this is an important word on the minds of many CIOs around the world, and yet many companies underestimate the importance of it, especially when trying to implement and work with countless back office systems. Let me try and compare a company trying to use a mixture of different back office systems to a musical orchestra. Each musician in an orchestra has their part to play in the musical performance, if one musical instrument is out of tune or a beat is missed for some reason then the audience will notice it straight away. It is no different for a company trying to integrate back office systems, the reason for doing this is to ensure a smooth flow of information around the organization, making sure that the relevant people in the various departments get the right information at the right time. Without backend integration, companies will struggle to orchestrate the flow of information across their extended enterprise. Whilst on the subject of orchestras, this was one of my favourite automotive adverts from last year, which was performed on actual car parts. Enterprise Resource Planning (ERP) systems have been used for many years to manage materials and enterprise resources within a company. Now unless a company makes everything in house, which is very rare these days, companies will have to manage a range of different suppliers and interact with customers on a daily basis. So there is clearly a potential barrier to information flow here, an ERP system is managing the flow of information within a company, but by itself it cannot manage a global network of trading partners as well. There has to be some level of integration so that the barrier can be removed and information can flow freely across the extended enterprise. 
Imagine being able to utilize ERP-related information across your trading partner community; imagine being able to receive information smoothly from trading partners without any re-keying of information or having to interrogate multiple business systems. CIOs should not under-estimate the power of integration, and outsourcing the management of the B2B-to-ERP integration process to a trusted partner such as GXS deserves serious consideration.

SAP is the most widely used ERP platform in the automotive industry today, and many companies rely on SAP software to manage their global manufacturing operations. Many European automotive companies, particularly those in Germany, have globalised their manufacturing operations in the past few years. They have established operations in ‘old’ emerging markets such as China, Brazil and India and the ‘new’ emerging markets of Thailand and Vietnam. As the automotive companies stretched their operations around the world, it became more important to provide integration to multiple SAP instances and a means to manage their ever-expanding community of trading partners. Ideally, the car companies need a single, consolidated view of their resources across the extended enterprise. An integrated B2B and ERP platform allows automotive companies to improve how they manage inventory levels and trading partner relationships, and of course to reduce costs across their business. GXS has worked with many companies over the years to integrate their B2B and SAP ERP platforms; in fact, we are currently working with one global automotive tier 1 supplier to integrate 80 instances of SAP scattered across their global network of manufacturing plants. As you would expect, we have established a valuable knowledge base of information associated with integrating our Trading Grid® B2B platform with many SAP installations. 
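The "no re-keying" point above is really about automated translation between document formats. As a rough illustration only (the field names are invented and are not a GXS or SAP schema), here is the kind of mapping step that turns an inbound trading-partner order into the record shape an internal ERP system expects:

```python
# Illustrative translation step: map a partner's order fields onto
# internal ERP field names so nothing has to be re-keyed by hand.
# All field names here are invented for the sketch.
def to_erp_record(partner_order):
    """Transform an inbound partner order into an internal ERP record."""
    return {
        "order_id":   partner_order["PO_NUMBER"],
        "vendor":     partner_order["SUPPLIER_CODE"],
        "line_items": [
            {"sku": line["ITEM"], "qty": int(line["QTY"])}
            for line in partner_order["LINES"]
        ],
    }

inbound = {
    "PO_NUMBER": "4500012345",
    "SUPPLIER_CODE": "TIER1-DE",
    "LINES": [{"ITEM": "BRAKE-PAD-01", "QTY": "400"}],
}
print(to_erp_record(inbound)["order_id"])  # 4500012345
```

In practice a B2B platform maintains thousands of such maps, one per partner format pair, which is exactly why a centralized mapping service is valuable.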
However, we can offer companies much more than this; let me briefly explain. GXS has over 40 years of experience working with companies in the automotive sector; in fact, over 70% of the world’s Top 100 automotive suppliers are connected to our Trading Grid infrastructure.

- GXS can provide seamless SAP integration to a global B2B platform where nearly all the major suppliers already have pre-configured connections to our service.
- We offer a mapping centre of excellence which provides a centralized location for managing over 30,000 maps.
- We can provide support for a number of different SAP IDOC documents, for example SHPMNT03, INVOIC01, DELVRY01 and ORDERS01. As you would expect, GXS is fully certified by SAP for Netweaver® integrations.
- We offer support for the broadest set of communication standards in the industry, whether it is mediating between legacy communication protocols, the latest internet communication protocols or FTP over SAP ALE. Either way, we can make sure that you can connect to your trading partners irrespective of their technical capability.
- Global 24/7, multi-lingual support services ensure that your trading partners can be on-boarded and supported anywhere in the world. GXS recently acquired Interchange in Brazil, which now allows us to work more closely with this important automotive manufacturing region.
- You would have peace of mind knowing that your SAP implementation is integrated to one of the most modern and highly available B2B infrastructures in the world. Your global instances of SAP would be connected to a future-proof B2B infrastructure with the flexibility to grow as your company grows.

I have only scratched the surface here; in future blog entries I will cover each of the above in more detail and explain how they contribute to the smooth integration of many ERP-related back office environments around the world.
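When a manufacturer runs many SAP instances, every inbound IDoc has to be delivered to the right one. Here is a deliberately simplified sketch of that routing idea only; the table contents and helper are invented for illustration and real Trading Grid / SAP ALE routing is far richer:

```python
# Hypothetical routing table: pick which SAP instance an inbound IDoc
# should be delivered to, keyed by its message type and the plant it
# concerns. Instance names and plant codes are invented examples.
ROUTES = {
    # (idoc_type, plant) -> SAP instance identifier
    ("ORDERS01", "DE01"): "sap-eu-prod",
    ("DELVRY01", "DE01"): "sap-eu-prod",
    ("ORDERS01", "BR02"): "sap-latam-prod",
    ("INVOIC01", "BR02"): "sap-latam-prod",
}

def route_idoc(idoc_type, plant):
    """Return the target SAP instance, or raise if no route is defined."""
    try:
        return ROUTES[(idoc_type, plant)]
    except KeyError:
        raise ValueError(f"no route for {idoc_type} at plant {plant}")

print(route_idoc("ORDERS01", "BR02"))  # sap-latam-prod
```

Multiply this by 80 SAP instances and thousands of trading partners, and the value of managing the routing centrally becomes obvious.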

Read More

Should the FCC ban Async and Bisync?

legacy B2B protocols

Today is February 17th, which has become the equivalent of Y2K in the television broadcast industry. Today was also the original deadline set by the Federal Communications Commission (FCC) to discontinue analog broadcast television signals throughout the US. Congress recently passed the Digital TV Delay law, which extends the deadline to June 12th to allow consumers more time to complete the transition. While June 12th has become the new “drop dead” date, Congress specified that broadcasters are no longer obligated to continue analog transmissions during this extension period. However, 1100 of the nation’s 1800 TV stations will continue their legacy transmissions to maximize advertising distribution. The FCC’s ability to phase out older technology in favor of more modern, cost-effective protocols begs a question in my mind: if the FCC can apply such a policy to TV broadcasts, should we consider enacting similar legislation for outdated B2B communications protocols such as async, bisync and X.25?

Y2K for the Television Industry

The official motivation for the US switch to digital TV is to free up wireless broadcast spectrum, which is in high demand by other user groups. Rather than being used for analog TV broadcasts, the frequencies could be allocated to municipal fire, police and emergency rescue departments to support public safety efforts. Of course, the upgrade to newer technology has numerous benefits for consumers and broadcasters as well, such as improved picture quality and a wider variety of programming options. After the cutover, older televisions, which do not have a digital receiver and are not connected to a cable or satellite service provider, will not be able to receive the new transmissions. 
Consumers with older TVs will need to follow one of three courses of action to watch TV:

- Purchase a digital converter box, subsidized by the US government
- Upgrade to a newer model television
- Subscribe to a cable or satellite service

The February deadline was extended to June due to a much higher than anticipated population of non-digital TV users. The US government originally estimated that only 15% of households received analog TV signals via free antennas. However, actual utilization of the analog broadcasts appears to be closer to 35%, because many homes have extra TVs not connected to cable or satellite networks.

Legacy B2B Communications Protocols

I think most government officials were surprised to learn of such a high population of legacy TV technology still in use in 2009. I suspect a similar level of disbelief would result if a study of the use of legacy communications in the B2B integration sector were conducted. One would assume that with all of the e-commerce technology businesses have today, such as AS2, FTP and Web Forms, legacy technologies such as async, bisync and X.25 have become virtually extinct. Unfortunately, this is not the case. For those of you not familiar with these legacy communications protocols, which is by no means something to be embarrassed about, here is a brief introduction.

Async – a communications protocol in which start and stop signals are used to encapsulate individual characters and words. Async was originally developed for use with teletypewriters in the early 20th century.  Asynchronous signalling was very popular for dial-up modem access to time-sharing computers and bulletin board systems during the 1970s, 80s and 90s.  Here is a link to the wikipedia page on async.

Bisync – an acronym for the “Binary Synchronous Communication” protocol introduced by IBM in the 1960s.  The bisync protocol was the most popular file transfer protocol during the 1970s and 1980s. 
EDI applications were a primary user of bisync, as were ATMs, cash registers and radar systems.

X.25 – a Wide Area Network (WAN) protocol used with leased lines, ISDN connections and traditional telephone lines.  X.25 was especially popular in the 1980s within the financial services industry to connect ATMs to banking networks. Here is a link to the wikipedia page on X.25.

I have never seen an official report on the use of legacy communications in B2B integration. However, I am confident there are over 10,000 companies still using legacy dial-up connections such as async, bisync and X.25 throughout the US. What are the implications of the business world’s continued dependency on these ancient networking standards?  Can commercial entities effectively phase out these technologies on their own, or will government regulation be required?
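The start/stop framing that defines async is simple enough to sketch in a few lines. This is a toy model of the classic 8-N-1 configuration (8 data bits, no parity, 1 stop bit), not any particular modem implementation:

```python
# Minimal model of asynchronous framing: each character is wrapped in
# a start bit (0) and a stop bit (1), so the receiver can find
# character boundaries without sharing a clock with the sender.
def frame_byte(byte):
    """Return the 10-bit sequence for one byte: start bit, 8 data bits
    (least-significant bit first, as on the wire), stop bit."""
    data_bits = [(byte >> i) & 1 for i in range(8)]
    return [0] + data_bits + [1]

def unframe(bits):
    """Recover the byte from a 10-bit async frame."""
    assert bits[0] == 0 and bits[-1] == 1, "bad start/stop bit"
    return sum(bit << i for i, bit in enumerate(bits[1:9]))

frame = frame_byte(ord("A"))   # ord("A") == 65 == 0b01000001
print(frame)                   # [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]
print(chr(unframe(frame)))     # A
```

Those two framing bits per character are pure overhead, which is one reason synchronous protocols like bisync, which send whole blocks between sync characters, achieved better throughput on the same lines.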

Read More

Cloud Computing may Change Outsourcing/BPO

cloud outsourcing

Recent events (economic, technological, business) may be driving the convergence of two traditionally separate sources of value in the managed services or outsourcing world: specialized knowledge and scale. Although some organizations have traditionally applied both, it was far more common for smaller service providers to compete on specialized knowledge, with higher infrastructure costs, and for larger firms to compete mostly on scale, with less specialized knowledge and focus than their smaller rivals. But recent research and trends show that the advent of cloud computing, and of large commodity cloud service providers, may allow specialized managed services providers to stay focused and still achieve compelling scale in the infrastructure. Referred to by Gartner as “ADAMs” (Alternative Delivery Models), these providers leverage not just cloud architectures but also Software as a Service (SaaS) platforms. And there is no reason why this has to be only a two-tier model!

I recently posted a link via Twitter, Facebook, etc. regarding a beta service called CloudMQ. This service, not yet commercialised, may be an indicator. It aims to offer standards-based (JMS, or Java Message Service), business-grade messaging (guaranteed delivery, etc.) “in the cloud”, offered to customers via the internet. So far, this sounds normal enough, until you realize the entire service is hosted on Amazon’s EC2 and S3 services, which are themselves infrastructure services!  And we are only at the start of this. Commercial services like Amazon change the economics, and when the economics change, the models quickly follow…

Read More