Digital Transformation

EBICS – The Standard for Corporate-to-Bank Communication?

2010 is an exciting time in the world of B2B integration standards for the European banking sector. In this case, I am not referring to the continued rollout of SEPA in the Eurozone, but rather the extended reach of EBICS. EBICS, the Electronic Banking Internet Communication Standard, is a highly secure file transfer protocol used in the French and German banking communities for the exchange of cash management transactions.

EBICS is the successor to an earlier standard named BCS that was used in the German banking sector from the mid-1980s until the end of 2007. BCS, the Banking Communication Standard, was developed by the German Credit Committee, known in German as the Zentraler Kreditausschuss (ZKA) – longer, but no doubt easier to say than Eyjafjallajokull. BCS was originally designed to offer a standardized approach to corporate-to-bank communication. In the early days of electronic banking, many technology vendors were developing their own proprietary protocols in lieu of a widely adopted standard. BCS was based upon the ISO 8571 standard commonly known as FTAM (File Transfer Access and Management). German banks were mandated to offer BCS as a communication option starting in the 1990s. The mandate, along with the publicly available specifications, led to widespread adoption of BCS across German industry.

Challenges began to emerge shortly after 2000 with the rise of the Internet as a network for business transactions. As with other transaction types, the industry began to seek IP-based alternatives for electronic banking. The ZKA responded by evolving BCS into a more fully functional, Internet-based standard called EBICS.

EBICS is a transport protocol that offers secure transmission of files independent of the payload. As a result, EBICS can be used to exchange a variety of banking information such as account statements, securities holdings, and debit and credit payment orders. EBICS supports the traditional SWIFT FIN/MT message classes such as the MT 103 and MT 940, and it can easily be extended to support the new ISO 20022 XML (SWIFT MX) messages. The protocol is based upon HTTP, with network encryption using TLS. Given the nature of the financial transactions EBICS carries, high levels of security are incorporated. EBICS also offers a checkpoint/restart capability that enables a file transmission to gracefully resume in the event of an interruption rather than having to start over (sketched at the end of this post).

EBICS has been offered in Germany since January 2008. Adoption has been much quicker than for other B2B messaging standards. The success is due in part to the fact that the traditional alternative, the legacy BCS standard, will no longer be supported after the end of this year. All of the major German banks – Deutsche Bank, Commerzbank, Dresdner (Allianz) and HypoVereinsbank – support the protocol, along with the smaller, regional institutions.

The big news in 2010, however, is not just the decommissioning of BCS, but rather the adoption of EBICS within the French market. A new version of EBICS (2.4) has been developed to meet the needs of the local French payments and banking sector. Much like in Germany, EBICS will be replacing a legacy communications protocol in France called ETEBAC (Echange TElematique BAnque Clients).
ETEBAC is scheduled for end-of-life in 2011, which is driving a sense of urgency for French corporates and financial institutions to migrate to new connectivity options. EBICS is not the only option for French corporates: treasurers can choose to connect to their financial institutions via SWIFTNet or, alternatively, via a web-based portal provided by the bank itself.

ETEBAC has a rich history much like that of BCS. The original versions (v1 and v2) were unidirectional in nature. ETEBAC1 enabled corporate clients to upload a file (e.g. a payment instruction) to a bank using a secure transmission over the phone network. Conversely, ETEBAC2 offered the ability to download a file (e.g. an account statement) from a bank. The most popular version was ETEBAC3, which supported bidirectional exchange of files via the TRANSPAC network. Fourth and fifth versions of the ETEBAC standards were developed but never reached the popularity of ETEBAC3.

The rollout of EBICS in France and Germany offers an interesting case study in the evolution of B2B communications standards. In my next post I will highlight what we can learn from the new EBICS protocol.
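For the technically curious, here is a rough sketch of the checkpoint/restart idea described above, written in Python. To be clear, this is not the actual EBICS wire format (real EBICS wraps segmented, signed and encrypted payloads in XML order messages); the endpoint, header name and checkpoint scheme below are my own inventions for illustration.

```python
import os
import requests  # any HTTP client would do

SEGMENT_SIZE = 1024 * 1024  # 1 MB segments; an arbitrary choice


def upload_with_restart(url, path, checkpoint_file):
    """Upload a file in numbered segments, resuming after the last
    acknowledged segment if a previous attempt was interrupted."""
    # Read the last checkpoint (segment number) if one exists
    start = 0
    if os.path.exists(checkpoint_file):
        with open(checkpoint_file) as f:
            start = int(f.read().strip() or 0)

    with open(path, "rb") as f:
        f.seek(start * SEGMENT_SIZE)
        segment = start
        while True:
            chunk = f.read(SEGMENT_SIZE)
            if not chunk:
                break
            # TLS protects the channel; EBICS additionally signs and
            # encrypts the payload itself
            resp = requests.post(
                url,
                data=chunk,
                headers={"X-Segment-Number": str(segment)},  # hypothetical
                timeout=60,
            )
            resp.raise_for_status()
            segment += 1
            # Persist the checkpoint only after the server acknowledges
            with open(checkpoint_file, "w") as cp:
                cp.write(str(segment))
```

The essential point is that the sender persists a checkpoint only after each segment is acknowledged, so a dropped connection costs at most one segment rather than the whole file.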

Read More

Does Your Supply Chain Have An Early Warning System?

New Predictive Intelligence-Based Financial Industry Risk Ranking Offers A Great Model for Global Supply Chains

The dust (or ash cloud, if you prefer) has all but settled from the recent Icelandic volcanic eruption – an eruption that grounded flights yet gave flight to heightened awareness of the need to intelligently predict and respond to business disruptions. In the days following the event, I was heartened to see humorous (and futile) attempts to master the pronunciation of Eyjafjallajokull quickly make way for discussions of supply chain risk and resiliency.

Systemic Risk Ratings For Financial Institutions

With ash clouds serving as the perfect backdrop for risk mitigation initiatives, it was very interesting to see New York University's Stern School of Business launch the NYU Stern Systemic Risk Rankings last week – a weekly rating and ordering, by level of risk, of the largest U.S. financial institutions and the risk each brings to the financial system. The rankings draw on numerous elements of market data, from 1990 to the present, and provide an early warning to help identify threats to the overall health of the financial system. The Systemic Risk Contribution Index (SRISK%) ranks firms by the percentage of total system risk each is expected to contribute in a future crisis, taking into consideration factors such as company size, exposure to loss of market capitalization, and leverage. While the new rating will serve as a dashboard to monitor and react to the volatility of corporate health, I can't help but wonder whether a similar early warning system couldn't have prevented the recent financial industry meltdown, credit default swaps notwithstanding.

Past Performance IS Indicative Of Future Results

Despite what mutual fund prospectuses are obligated to warn us, it is vital to look into the past performance of a company's trading partners, along with measures that analyze prevailing internal and external events, to measure risk. An early warning system similar to the NYU Stern risk ranking is imperative for corporations to predict and prevail in the face of potentially devastating supply chain disruptions. Risk and mitigation strategies are arguably subjective and often highly dependent upon the structure and dynamics of each supply chain. Yet accurate, effective and timely KPIs, benchmarks and scorecards that track the operational performance of vital supply chain partners such as vendors, contract manufacturers, carriers and agents are absolutely imperative to managing risk. I will elaborate on the various facets of supply chain risk that can be discerned by analyzing operational data in subsequent posts.

Where There Is Risk, There Is… Insurance!

You know supply chain risk is no longer a theoretical or esoteric concept when there is a structured insurance product to help you hedge against it. Zurich Financial Services Group has launched two new products – Supply Chain Risk Assessment and Supply Chain Insurance – to help businesses protect their supply chains in this modern, global marketplace. Corporations would be well served to consider such offerings alongside the internal initiatives required to monitor and manage risk factors before they snowball into expensive disruptions. Operational dashboards, facilitated by a nimble way of analyzing the operational data exchanged between trading partners and correlating it to seemingly exogenous market events, deliver predictive intelligence.
This is the vital early warning system that every supply chain needs.
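To make the mechanics of a contribution-style index like SRISK% concrete, here is a toy sketch in Python. The firm names and numbers are made up, and the real NYU Stern methodology is far more sophisticated; the point is simply that each entity is ranked by its share of total expected system risk.

```python
# Toy illustration of a risk-contribution index: each entity's expected
# loss in a stress scenario, expressed as a share of the system total.
expected_shortfall = {
    "Firm A": 120.0,  # hypothetical expected capital shortfall ($B)
    "Firm B": 75.0,
    "Firm C": 30.0,
}

total = sum(expected_shortfall.values())
rankings = sorted(
    ((name, 100 * loss / total) for name, loss in expected_shortfall.items()),
    key=lambda pair: pair[1],
    reverse=True,
)

for name, pct in rankings:
    print(f"{name}: {pct:.1f}% of system risk")
```

The same pattern maps directly onto a supply chain: swap firms for suppliers or carriers and expected capital shortfall for expected disruption cost, and you have the skeleton of a partner risk scorecard.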

Read More

Could Community Source be a Better Approach for Supply Chain Applications?

In my last GXS post I described the relatively small number of B2B integration vendors that have embraced open source. In this post, I will discuss a type of open source called "community source" which I think has high applicability for supply chain and B2B integration. In community source, a group of companies (versus individual users) unites to develop software that solves a common business problem. Typically the application being developed either is not available from a commercial software vendor, or is available only via a cost-prohibitive licensing model. There are several different models of community source, but the most popular is called a “gated community,” in which only selected member organizations contribute to the development. The gated community concept differs from traditional open source, to which the general public can contribute. The focus of community source is not to build commercial applications for resale, but rather to create useful software that the developer community can leverage for business benefit.

Read More

Integration Smackdown: Documents versus API in B2B

I have been focused of late (a rare luxury for me) on exploring integration technologies for the emerging cloud computing providers — especially the SaaS guys. Last week I attended Dreamforce 2009, which was an absolutely fantastic experience. Salesforce.com has posted a bunch of the material on its site, and it is worthwhile.

While at Dreamforce, I attended training on integration with Salesforce.com, in both directions. I enjoyed the class, and found the relatively rich set of alternatives (remember that this is a very young company that has quadrupled its workforce in the last few years) to be a good toolbox for integration. One of the more intriguing aspects is how the company’s engineers balance control of service levels (i.e. their ability to prevent you from hurting their performance when you integrate) with access. One of their key technologies is a set of “governors” to control the resources available.

While I really enjoyed the exposure to API-level integration, I cannot help but compare all of this to how we traditionally do integration between business partners, as well as with SAP, which is via standard documents. Apparently I’m not alone, as Benoit Lheureux of Gartner wrote a thoughtful blog post on how cloud integration will affect traditional integration.

From my point of view, API-level integration (and yes, this includes web services) is usually more powerful and functional than integration driven through documents (disclaimer — yes, I realize that at its heart web services is basically the exchange of XML documents over HTTP, but for this discussion, it feels like an API). But document-driven exchange is easier, more interoperable, and often enjoys much higher adoption. And a major part of the “document advantage” is the ability to separate the handling of the document from “transport” (communications), which often allows use of familiar technologies to perform the integration.

For my purposes, API-level integration is the calling of specific low-level services, with a set of arguments, often — though not always — using web services. Although web services is “standard”, the service calls are typically specific to a given SaaS provider (i.e. proprietary). Additionally, the service calls may change based on the configurations done for a given instance setup for a customer (i.e. custom proprietary). Document-level integration, in contrast, uses the basic pattern of sending in a document in a defined format, often — again, not always — in XML. This document may vary from customer to customer, but most of it is the same, and customers can often submit the document in a variety of ways (web services, FTP, etc).

SAP is a pretty popular ERP system in the current world, and it supports a wide variety of integration technologies. In our B2B managed services world, IDocs over ALE is by far the most common integration technology — despite the fact that there are often very good reasons to use a lower-level approach (directly invoking RFCs after sending data using FTP, for instance). Why? Among many, many other reasons, customers and integrators like solutions that work predictably across multiple environments. IDocs, like X12, EDIFACT, OAG, etc, are defined entities that do not change based on adding fields or custom logic to an application. But possibly a bigger reason is the ability to use existing technology and processes to perform the work.
SAP IDocs can be manipulated using standard translation technology, and then sent via ALE or other communications methods. The ALE protocol can be implemented in a comms gateway that you already own. Modern communications gateways have extensive support for web services today, but that is really a kind of toolbox for building custom adapters to custom services, with each one a “one-off”. This problem is intensified if the service you are connecting to changes over time as additional data fields are added to objects (a product object is especially susceptible to change).

API-level integration is usually much more brittle for this reason, and it is one of the characteristics that led many enterprises to switch to message-oriented middleware and attempt to impose canonical standards (“canonical standards” is a fancy way of saying a common format — usually in XML — for frequently used documents, like customer; canonical standards are often adopted from outside, especially the OAGi standards). Integrating to a single system via API is often fun; maintaining dozens or hundreds of these is not.

A common pattern is for emerging technology categories to start with low-level APIs and gradually build up to more abstract interfaces that are easier to use and less brittle. Already, in the case of Salesforce, they are offering a bulk load service, and they are also offering a “meta-data” API that allows tooling to simplify integration. Over time, I fully expect that most major SaaS providers will provide integration methods that feel a lot like trading documents, and that B2B teams will take the lead in using them. In the long battle between document and API integration, document-style integration will dominate, though API and service-level integration will play critical roles…
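To make the contrast concrete, here is a hedged sketch of the two styles in Python. The endpoints, field names and document format are all invented for illustration; a real implementation would follow the provider's actual API or a standard document definition such as an OAGi BOD.

```python
import requests  # any HTTP client would do

# --- API-style integration: call specific, provider-defined operations ---
# Each call is coupled to this provider's object model; if a field is
# added or renamed, every caller may need rework.
def create_customer_api(base_url, session_id, name, region):
    return requests.post(
        f"{base_url}/services/customer/create",  # hypothetical endpoint
        json={"sessionId": session_id, "name": name, "region": region},
    )

# --- Document-style integration: send a complete, defined document ---
# The document format is stable and transport-independent; the same XML
# could travel over HTTP, FTP or AS2 without changing its structure.
CUSTOMER_DOC = """<?xml version="1.0"?>
<Customer>
  <Name>{name}</Name>
  <Region>{region}</Region>
</Customer>"""

def create_customer_document(base_url, name, region):
    doc = CUSTOMER_DOC.format(name=name, region=region)
    return requests.post(
        f"{base_url}/inbound/documents",  # hypothetical endpoint
        data=doc,
        headers={"Content-Type": "application/xml"},
    )
```

The document version survives field additions and can be produced by existing translation tooling, which is precisely the “document advantage” described above.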

Read More

Record Cash Holdings by Large Corporations makes B2Bank Integration More Important than Ever

The Wall Street Journal published an excellent article on November 3rd about how companies are holding more cash on average than at any time in the past 40 years: “In the second quarter, the 500 largest nonfinancial US firms by total assets held about $994 billion in cash and short term investments, or 9.8% of their assets.” With nearly $1 trillion in cash holdings, the opportunity for the Transaction Banking divisions at major financial institutions has never been greater. However, some institutions have an advantage over others in the areas of sales, client delivery and product features due to their superior B2Bank capabilities.

Why are Companies Hoarding Cash?

The WSJ article provided some fascinating insights into the motivations behind large corporations such as PepsiCo, Alcoa and Texas Instruments holding massive amounts of cash on their balance sheets: “A hangover from the financial crisis a year ago, when companies couldn’t raise money or had to pay much higher rates than usual,” stated Citigroup’s head of Financial Strategy. Texas Instruments “decided…to amass cash so they could seize opportunities to buy cheap manufacturing capacity, technology and other assets.” “The cash provides operating and strategic flexibility…We’re very happy to have it sit in our bank account and earn a modest interest rate,” said Google, whose cash represents 58% of total assets.

The Importance of Information Reporting

With record amounts of cash on the balance sheet, the need to maintain visibility into the location of funds is extremely important. Corporate treasurers need to know the cash balances in their various bank accounts and the positions in their various securities holdings. Treasurers need the account balances to determine whether they will have a net surplus or deficit of cash after all the payables and receivables for the day have cleared. Ideally, corporations do not want to leave any cash in non-interest-bearing demand deposit accounts. Instead, treasurers prefer to put the money to use in overnight sweeps or other short-term investment instruments designed to generate some return on the funds.

Financial institutions must be prepared to provide electronic copies of intra-day and end-of-day statements to their corporate clients. Ideally, corporate customers would like rich information reporting capabilities from their banks that detail individual transactions and exception scenarios. The information should be available in multiple formats. Banks should offer web-based reporting services, downloads into Microsoft Excel spreadsheets and the option to integrate directly with the corporation’s Treasury Workstation or ERP system.

Multi-national corporations such as the ones outlined above, with their high cash balances, typically maintain relationships with several different financial institutions around the world. One of the key challenges experienced by multi-national corporations is the varying levels of reporting capabilities offered by their cash management banks. The level of detail, timing and formats in which statements are available can vary by geography and institution. Several of the global money center banks have established multi-bank reporting applications to simplify the process for corporate treasurers. Multi-bank reporting services such as Citigroup’s Treasury Vision provide visibility into cash flows both within their own accounts as well as those held at competitors.
To provide these services, integration is required with correspondent banks in foreign countries, and often with direct competitors. Financial institutions can leverage the SWIFT network to gather bank statements or connect directly to other banks using secure FTP over the Internet. Banks that can leverage integration to provide more detailed and timely cash reporting to their customers will have a competitive advantage in the marketplace.

Faster Access to Deposits and Transaction Fees

Not only can integration provide financial institutions with a competitive advantage in sales scenarios, but B2B can accelerate time to revenue as well. Corporations put their cash management business out to bid every few years via RFP. Winning the cash management business of a large multi-national is highly prized by banks, because the routine payment and collection services generate significant recurring fee revenue. Furthermore, a corporation’s cash management bank is typically the primary lender and primary holder of deposits, which in today’s market can be significant.

A key challenge that many financial institutions face after winning new business is on-boarding the new clients to their cash management services. The complexity of technical integration between the bank’s product applications and the customer’s ERP systems is often a key factor in delaying the process. I recall a conversation I had with the President of Innovative Banking Solutions earlier this year. He stated that in many cases it takes less time to build a house than it does to establish connectivity between a corporation and its bank. Financial institutions with poor B2Bank integration skills can take up to six months to on-board new clients, leaving significant money on the table. These financial institutions are not only losing out on six months of transaction fees, but they are also missing significant opportunities to leverage massive cash deposits for lending purposes. By contrast, those financial institutions that excel at customer integration and rapid on-boarding will benefit from faster access to large deposits and transaction fees.
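As a simple illustration of why treasurers want machine-readable statements, here is a toy Python sketch that aggregates end-of-day balances from several banks' statement feeds into a net cash position by currency. The CSV layout is invented for clarity; a real implementation would parse MT 940 statements or ISO 20022 camt.053 messages.

```python
import csv
from pathlib import Path


def net_cash_position(statement_dir):
    """Sum end-of-day balances across statement files from multiple banks.

    Assumes each bank drops a CSV with 'account', 'currency' and
    'balance' columns -- a stand-in for real MT 940 / camt.053 feeds.
    """
    position = {}
    for path in Path(statement_dir).glob("*.csv"):
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                ccy = row["currency"]
                position[ccy] = position.get(ccy, 0.0) + float(row["balance"])
    return position


# A treasurer would compare this position against the day's expected
# payables and receivables to decide how much to sweep into overnight
# investments.
print(net_cash_position("statements/"))
```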

Read More

We Need a Napster-like Service for B2B File Transfers

In my last two posts I described how B2B integration infrastructures at major corporations are choking on the increasing number of large files crossing their firewalls. Most of the existing approaches to large file transfer are expensive, proprietary and complex. It is interesting to note that corporations have yet to achieve a simple, open, universal file sharing model comparable to what consumers enjoyed with the original Napster in 1999. Yet business users expect to be able to share large files at work the same way they do in their everyday consumer lives.

Recent technology advances compound the problem. Broadband connectivity has become nearly ubiquitous in most developed countries. For example, Verizon FiOS offers an Internet package that allows home users to buy connectivity at speeds up to 50Mbps. Furthermore, storage costs continue to decline rapidly. Grocery stores now sell 16GB USB memory sticks in the checkout aisles for $20. Despite these advances in networking and storage technologies, the corporate world continues to struggle with large file transfer.

A Napster-like File Transfer Service for Business Users

So how do we solve the large file transfer problem in B2B? I think we need a Napster-like service, but for business users. Unfortunately, for many people, the original Napster service reminds them of college students illegally sharing MP3s with one another in an effort to outwit Hollywood’s recording industry. From my perspective, Napster was the key catalyst for the explosion of digital media content that has revolutionized the home entertainment experience. I also credit Napster as a key pioneer in the large file transfer segment. Napster provided end-users with the ability to exchange high volumes of MP3 files across the Internet.

Napster Changed the World

Although each MP3 was only a few megabytes in size, the capability was revolutionary at the time it was introduced. If only we could create a file transfer service that was as easy-to-use, as widely-deployed and as reasonably priced as Napster, but for corporate use. Key features would be:

Ease of use – Users familiar with Internet browsing would be able to upload or download. There would be no need to write file transfer scripts or otherwise engage IT resources.

No vendor lock-in – No proprietary communications protocols would be used to perform the file transfers. Open standards would be used. Furthermore, no desktop software clients would be necessary.

Low cost – The service should be considerably cheaper than the cost of licensing MFT software packages. Users should have the option of paying per file transmission or subscribing to unlimited usage for a monthly recurring fee.

The introduction of such a commercial service would create a viral effect, promoting widespread adoption quickly throughout the business community. There are, however, a few legitimate concerns that business users would have with sharing their critical IP and sensitive documents over a Napster-like service. Therefore, I would propose a few key additional services for business users (a rough sketch follows below):

Store-and-Forward – Files would only be housed on the hosted servers temporarily. Once the transmission to the receiving entity is completed, only metadata such as file size, sender, receiver and date/time would be retained for tracking purposes.

Data Security – Users could encrypt sensitive file transmissions to protect sensitive corporate IP.
The service should offer higher levels of security, such as 128-bit SSL encryption, in a premium package.

The good news is that such a B2B file sharing service is not far-fetched. With the recent explosion in cloud computing, numerous highly scalable, secure, reliable file transfer services have been introduced by start-ups. Hosted file sharing services such as Gigasize, Leapfile and YouSendIt are available today with relatively large transmission capabilities and very affordable prices. We have also introduced a first-generation file sharing service here, but perhaps most interesting is a cloud-computing file transfer service that I will detail in a forthcoming post.
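To illustrate the store-and-forward behaviour proposed above, here is a toy Python sketch in which the payload lives on the service only until delivery completes, after which only tracking metadata survives. The class and field names are my own and purely illustrative.

```python
import hashlib
import time


class StoreAndForward:
    """Toy store-and-forward relay: payloads are held only until
    delivery, then purged, leaving metadata behind for tracking."""

    def __init__(self):
        self._pending = {}   # transfer_id -> payload bytes
        self.metadata = {}   # transfer_id -> tracking record

    def upload(self, transfer_id, sender, receiver, payload):
        self._pending[transfer_id] = payload
        self.metadata[transfer_id] = {
            "sender": sender,
            "receiver": receiver,
            "size": len(payload),
            "sha256": hashlib.sha256(payload).hexdigest(),
            "uploaded_at": time.time(),
            "delivered": False,
        }

    def deliver(self, transfer_id, send_func):
        """Push the payload to the receiver, then purge it."""
        send_func(self._pending[transfer_id])
        del self._pending[transfer_id]                   # payload gone...
        self.metadata[transfer_id]["delivered"] = True   # ...metadata kept
```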

Read More

Finding the Opportunity in Crisis

Among the primary definitions of the word “crisis” found in Merriam-Webster’s dictionary, I think the one that best defines the past 18 months in the banking sector is “an unstable or crucial time or state of affairs in which a decisive change is impending“. As I look around, I definitely see signs of decisive change across the industry. And as the best crisis management teams will tell you, what people really remember is not the triggering event that begins a crisis but the long-term response to it. Already, most US businesses are giving the Federal Reserve high marks for its response to last year’s meltdown. But what about the perception of corporates (your clients) of the banking industry’s response?

All good crisis responses begin with a plan. In many ways, the banking industry as a whole was caught off guard by the scope and depth of the meltdown and was caught without a plan. In fact, throughout much of the early response to the meltdown, the basic steps of crisis management were either forgotten or completely overlooked. But hindsight is 20/20, and the question for most banks should not be “what did you miss?” but rather “what can you do to turn this situation around and create an opportunity out of the crisis?”

I’ve written quite extensively about the need for visibility in the financial supply chain. A number of the problems that created the environment for the financial meltdown had to do with visibility, or rather the lack of end-to-end visibility. But an issue of equal if not greater importance also exists, and it is one that is deeply ingrained in the industry. The issue is a lack of flexibility, not only in the industry’s IT systems but also in its business processes.

Flexibility has a number of definitions, but for the purposes of this discussion I am speaking specifically about market flexibility as it relates to the ability of banking systems to adapt to market demands. Banking systems are notoriously slow to change. While there are many good reasons for the prevalence of legacy systems in the banking space, the primary reasons that banks haven’t changed or updated their systems are the inherent risks and costs of replacement. A 2008 pre-meltdown survey found that most banks would still be using the same legacy-based technologies for at least another 5 years. Now 5 years may not seem like that long, but in technology 5 years represents a lifetime. In this short span of time, government requirements and standards have changed, while initiatives like SEPA and customer demands for banking as a service (BaaS) type offerings have accelerated. Yet most banking systems remain fundamentally unchanged. They are still largely reliant on IT systems that are ten, twenty or more years out of date.

Despite the age, limited functionality and high costs of maintaining legacy systems, most banks see themselves caught in a “lose/lose” situation when it comes to addressing these problems. We all know that “rip and replace” is not an option, and even with the promise of service-oriented architecture (SOA) based solutions, the simple truth is that the world will not stop while the bank’s technology plays catch-up. Banks must be able to respond today, even while the debate about legacy systems vs. modern systems continues. Which means there must be an interim strategy that delivers, and enables the bank to address market changes in a flexible but still repeatable way.
This is the promise and the reality of integration: to allow banks to leverage the inherent value still to be found in their legacy systems while also adding much-needed functionality to address market changes fluidly and, most importantly, quickly. Integration services providers like us have helped and continue to help banks navigate the path between the past and the future. At the crossroads of legacy systems vs. modern ones is the data, so while avoiding a sales pitch like the plague, it is critical to note that the key area where integration vendors excel is in “bringing data together”. This expertise, along with experience in process, work-flow and systems integration, helps make walking the tightrope between the old and the new easier for banks and bankers. So while it’s just a guess on my part, I suspect that at least some of the banks that are still spending big on IT right now are doing so by working with integration services providers to identify, mine and capitalize on the opportunities to be found in this crisis.

Read More

B2B Integration could help improve tracking of Pandemics such as H1N1 Swine Flu

I was watching the movie I Am Legend on HBO Sunday evening. I’m not sure if there is any correlation between HBO’s decision to broadcast the film in May and the outbreak of the H1N1 Swine Flu. However, it did start me thinking about pandemics and what could be done to better contain these outbreaks before they turn all of Manhattan into nocturnal, cannibalistic zombies. The widespread outbreaks of H1N1 in Mexico and the US have made this subject top of mind for everyone from politicians to economists. Of course, pandemics are yet another area in which B2B interoperability and integration technologies could play a significant role.

The Center for Information Technology Leadership (CITL) published a comprehensive report on how B2B interoperability in the US health care community could not only reduce costs but improve the quality of care. Much of the data cited in this post is sourced from that 2004 report, entitled The Value of Healthcare Information Exchange and Interoperability. See my January post on how the Obama administration could save $75B annually from B2B interoperability in health care for more background information.

Tracking Pandemics at the State, Local and Federal Level

State laws require providers and laboratories to report cases of certain diseases to local and state public health departments. Nationally “notifiable” diseases are forwarded by the state agencies on to the Centers for Disease Control and Prevention (CDC). Connections between the states and the CDC are electronic and highly automated. However, the first mile between the providers and the local and state agencies is highly manual. Providers typically submit data via phone, fax, hard copy forms or very basic B2B communications methods such as a web portal. For larger provider groups operating in multiple regions, notifications to state health agencies become even more cumbersome. The 50 US states maintain more than 100 different systems to collect data, each with its own communications mode.

The most closely monitored “notifiable” diseases are frequently under-reported in the US. Various studies conducted between 1970 and 1999 showed that only 79% of all STD, tuberculosis and AIDS cases were reported to public health agencies. Reporting rates for other diseases were much lower, at 49%. There are several reasons for the reporting challenges, but certainly one of the key issues is the ease with which the information can be transmitted to health authorities. There is no question that the primitive communications methods used to collect provider data are a critical barrier to success. However, even more problematic is the dependency upon overworked and understaffed provider personnel to take the time to consistently file the reports.

Electronic Health Records – Public Health Benefits

A better methodology for reporting on “notifiable” diseases would be to eliminate the need for human initiation altogether. The process could be completely automated by connecting the Health Information Systems and Practice Management Systems which contain the patient data to public health and safety tracking systems. However, connecting the tens of thousands of medical practices to the hundreds of different public health systems could prove quite an ambitious integration project. A less complex and costly alternative would leverage the concept of Electronic Health Records (EHR).
The EHR would significantly simplify tracking of public health epidemics without the need for bespoke integration between the various state agencies and each different medical provider. The EHR provides a comprehensive set of information about each patient including demographics, medications, immunizations, allergies, physician notes, laboratory data, radiology reports and past medical history. EHR information could be stored in a series of centralized repositories deployed around the country. Each repository could contain the full medical records or just pointers to the locations of the records. Triggers could be set up to automatically identify trends in data sets that might not otherwise be noticed, helping to provide an early warning system for potential disease outbreaks (a toy example follows at the end of this post). In the event of a pandemic or bioterrorist event, public health officials could easily access de-identified EHR data such as physician’s notes, patient demographics and medical history.

Without the dependency upon manual data entry, the latency of information flow could be reduced and the quality of information collected could be improved. Administrative costs would be reduced considerably: the average cost to send a report manually is $14, as compared to only $0.03 electronically. CITL estimated that the use of electronic data flow from providers and laboratories to public health agencies would reduce administrative costs by $195M annually. CITL did not quantify the potential economic savings from early identification of pandemics and bioterrorist events, but there is no question that these could be in the billions of dollars.

B2B Interoperability and EHR

Of course, a key technology enabler for EHR is interoperability between the various health care providers and the corresponding state, local and federal agencies. Medical data is transmitted between providers, payers and public agencies using a variety of B2B standards including DICOM, HL7, NCPDP, and HIPAA-compliant EDI transactions. EHRs could aggregate the available data related to prescriptions, claims, lab reports and radiology images into an electronic record. Additional services could be layered onto the B2B integration framework, such as data quality checks to ensure the completeness of records and business activity monitoring to identify behavioral trends.

Another concept evangelized in the CITL report is the idea of a National Electronic Disease Surveillance System (NEDSS). The NEDSS would collect data from a number of relevant sources outside of the health care system which could be useful for monitoring. Examples might include 911 call analysis; veterinary clinic activity; OTC pharmacy sales; school absenteeism; health web-site traffic; and retail sales of facial tissue and orange juice. Such practices were deployed by the US Department of Defense and the Utah Department of Health during the Salt Lake City Olympics in 2002. Such an effort would require integrating additional state and local agencies, educational institutions and retail chains electronically using B2B.
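The trigger idea is easy to sketch. Below is a toy early-warning rule in Python that flags a disease when the latest daily case count runs well above its trailing average. Real surveillance systems use far more rigorous statistical methods, and the feed, names and numbers here are hypothetical.

```python
def outbreak_alerts(daily_counts, window=7, threshold=2.0):
    """Flag diseases whose latest daily case count exceeds `threshold`
    times their trailing average -- a crude stand-in for the automated
    triggers an EHR-fed surveillance system could run continuously."""
    alerts = []
    for disease, counts in daily_counts.items():
        if len(counts) <= window:
            continue  # not enough history to establish a baseline
        baseline = sum(counts[-window - 1:-1]) / window
        if baseline and counts[-1] > threshold * baseline:
            alerts.append((disease, counts[-1], baseline))
    return alerts


# Hypothetical feed aggregated from EHR repositories:
feed = {"influenza-like illness": [12, 15, 11, 14, 13, 12, 16, 41]}
for disease, today, baseline in outbreak_alerts(feed):
    print(f"ALERT: {disease}: {today} cases vs ~{baseline:.0f}/day baseline")
```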

Read More

The Importance of Integrating B2B and ERP Platforms in the Automotive Industry

Integration: this is an important word on the minds of many CIOs around the world, and yet many companies underestimate the importance of it, especially when trying to implement and work with countless back office systems. Let me try to compare a company using a mixture of different back office systems to a musical orchestra. Each musician in an orchestra has their part to play in the performance; if one instrument is out of tune or a beat is missed for some reason, the audience will notice it straight away. It is no different for a company trying to integrate back office systems. The reason for doing this is to ensure a smooth flow of information around the organization, making sure that the relevant people in the various departments get the right information at the right time. Without backend integration, companies will struggle to orchestrate the flow of information across their extended enterprise. Whilst on the subject of orchestras, one of my favourite automotive adverts from last year was performed on actual car parts.

Enterprise Resource Planning (ERP) systems have been used for many years to manage materials and enterprise resources within a company. Now, unless a company makes everything in house, which is very rare these days, it will have to manage a range of different suppliers and interact with customers on a daily basis. So there is clearly a potential barrier to information flow here: an ERP system is managing the flow of information within a company, but by itself it cannot manage a global network of trading partners as well. There has to be some level of integration so that the barrier can be removed and information can flow freely across the extended enterprise. Imagine being able to utilize ERP-related information across your trading partner community; imagine being able to smoothly receive information from trading partners without any re-keying of information or having to interrogate multiple business systems. Integration: CIOs should not under-estimate the power that this word has, and outsourcing the management of the B2B-to-ERP integration process to a trusted partner such as GXS should be given serious consideration.

SAP is the most widely used ERP platform in the automotive industry today, and many companies rely on SAP software to manage their global manufacturing operations. Many European automotive companies, particularly those in Germany, have globalised their manufacturing operations in the past few years. They have established operations in ‘old’ emerging markets such as China, Brazil and India and the ‘new’ emerging markets of Thailand and Vietnam. As the automotive companies stretched their operations around the world, it became more important to provide integration to multiple SAP instances and to provide a means to manage their ever-expanding community of trading partners. Ideally, the car companies need a single, consolidated view of their resources across their extended enterprise. An integrated B2B and ERP platform allows the automotive companies to improve how they manage inventory levels and trading partner relationships and, of course, to reduce costs across their business. GXS has worked with many companies over the years to integrate their B2B and SAP ERP platforms; in fact, we are currently working with one global automotive tier 1 supplier to integrate 80 instances of SAP that are scattered across their global network of manufacturing plants.
As you would expect, we have been able to establish a valuable knowledge base of information associated with integrating our Trading Grid® B2B platform with many SAP installations. However, we can offer companies much more than this; let me briefly explain.

GXS has over 40 years’ experience of working with companies in the automotive sector; in fact, over 70% of the world’s Top 100 automotive suppliers are connected to our Trading Grid infrastructure. GXS can provide seamless SAP integration to a global B2B platform where nearly all the major suppliers already have pre-configured connections to our service.

We offer a mapping centre of excellence which provides a centralized location for managing over 30,000 maps. We can also provide support for a number of different SAP IDoc documents, for example SHPMNT03, INVOIC01, DELVRY01 and ORDERS01 (a simplified sketch of IDoc handling follows below). As you would expect, GXS is fully certified by SAP for NetWeaver® integration.

We offer support for the broadest set of communication standards in the industry. Whether it is mediating between legacy communication protocols and the latest internet communication protocols, or indeed FTP over SAP ALE, we can make sure that you can connect to your trading partners irrespective of their technical capability.

Providing global 24/7, multi-lingual support services ensures that your trading partners can be on-boarded and supported anywhere in the world. GXS recently acquired Interchange in Brazil, which now allows us to work more closely with this important automotive manufacturing region.

Finally, you would have peace of mind knowing that your SAP implementation would be integrated with one of the most modern and highly available B2B infrastructures in the world. This means that your global instances of SAP would be connected to a future-proof B2B infrastructure that has the flexibility to grow as your company grows.

Now, I have only scratched the surface here. In future blog entries I will cover each of the above in more detail and explain how they help contribute to the smooth integration of many ERP-related back office environments around the world.
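To give a flavour of what IDoc handling involves, here is a deliberately simplified Python sketch of splitting an IDoc-style flat file into segments. The fixed-width layout below is invented for illustration; real segment definitions (the EDI_DC40 control record, E1EDK01 header and so on) come from SAP’s own metadata.

```python
def parse_idoc_flat_file(path):
    """Split an IDoc-style flat file into (segment_name, raw_data) pairs.

    Assumes a simplified layout where the first 10 characters of each
    line carry the segment name -- purely illustrative, not the real
    SAP record format.
    """
    segments = []
    with open(path, encoding="latin-1") as f:
        for line in f:
            name = line[:10].strip()       # e.g. a header segment name
            data = line[10:].rstrip("\n")  # the segment's field data
            segments.append((name, data))
    return segments


# A translation map would then convert these segments into the trading
# partner's format (EDIFACT ORDERS, X12 850, etc.) before transmission.
for name, data in parse_idoc_flat_file("orders01.idoc"):
    print(name, data[:40])
```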

Read More

Should the FCC ban Async and Bisync?

Today is February 17th, the date that was to be the television broadcast industry’s equivalent of Y2K. It was the original deadline set by the Federal Communications Commission (FCC) to discontinue analog broadcast television signals throughout the US. Congress recently passed the Digital TV Delay law, which has extended the deadline to June 12th to allow consumers more time to complete the transition. While June 12th has become the new “drop dead” date, Congress specified that broadcasters are no longer obligated to continue analog transmissions during this extension period. However, 1,100 of the nation’s 1,800 TV stations will continue their legacy transmissions to maximize advertising distribution. The FCC’s ability to phase out older technology in favor of more modern, cost-effective protocols begs a question in my mind: if the FCC can apply such a policy to TV broadcasts, should we consider enacting similar legislation for outdated B2B communications protocols such as async, bisync and X.25?

Y2K for the Television Industry

The official motivation for the US switch to digital TV is to free up wireless broadcast spectrum, which is in high demand by other user groups. Rather than using the spectrum for analog TV broadcasts, the frequencies could be allocated to municipal fire, police and emergency rescue departments to support public safety efforts. Of course, the upgrade to newer technology has numerous benefits for consumers and broadcasters as well, such as improved picture quality and a wider variety of programming options. After the cutover, older televisions which do not have a digital receiver and are not connected to a cable or satellite service provider will not be able to receive the new transmissions. Consumers with older TVs will need to follow one of three courses of action to watch TV:

Purchase a digital converter box, subsidized by the US government

Upgrade to a newer model television

Subscribe to a cable or satellite service

The February deadline was extended to June due to a much higher than anticipated population of non-digital TV users. The US government originally estimated that only 15% of households received analog TV signals via free antennas. However, actual utilization of the analog broadcasts appears to be closer to 35%, due to the fact that many homes have extra TVs not connected to cable or satellite TV networks.

Legacy B2B Communications Protocols

I think most government officials were surprised to learn of such a high population of legacy TV technology still in use in 2009. I suspect a similar level of disbelief would result if a study of the use of legacy communications in the B2B integration sector were conducted. One would assume, with all of the e-commerce technology businesses have today, such as AS2, FTP and Web Forms, that legacy technologies such as async, bisync and X.25 have become virtually extinct. Unfortunately, this is not the case. For those of you not familiar with these legacy communications protocols (which is by no means something to be embarrassed about), here is a brief introduction:

Async – a communications protocol in which a start and stop signal are used to encapsulate individual characters and words. Async was originally developed for use with teletypewriters in the early 20th century. Asynchronous signalling was very popular for dial-up modem access to time-sharing computers and bulletin board systems during the 1970s, 80s and 90s.
Here is a link to the Wikipedia page on async.

Bisync – an acronym for the “Binary Synchronous Communication” protocol introduced by IBM in the 1960s. The bisync protocol was the most popular file transfer protocol during the 1970s and 1980s. EDI applications were a primary user of bisync, as were ATMs, cash registers and radar systems.

X.25 – a Wide Area Network (WAN) protocol used with leased lines, ISDN connections and traditional telephone lines. X.25 was especially popular in the 1980s within the financial services industry to connect ATMs to banking networks. Here is a link to the Wikipedia page on X.25.

I have never seen an official report on the use of legacy communications in B2B integration. However, I am confident there are over 10,000 companies still using legacy dial-up connections such as async, bisync and X.25 throughout the US. What are the implications of the business world’s continued dependency on these ancient networking standards? Can commercial entities effectively phase out these technologies on their own, or will government regulation be required?
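For the curious, the start/stop framing that defines async is simple enough to show in a few lines of Python. This is a toy bit-level illustration, not a working modem driver.

```python
def frame_byte(value):
    """Frame one data byte the way asynchronous signalling does:
    a start bit (0), eight data bits least-significant first, and
    a stop bit (1). Toy illustration only."""
    bits = [0]                                    # start bit
    bits += [(value >> i) & 1 for i in range(8)]  # data bits, LSB first
    bits.append(1)                                # stop bit
    return bits


# 'A' (0x41) framed for transmission:
print(frame_byte(ord("A")))  # [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]
```

The overhead is visible immediately: ten bits on the wire for every eight bits of data, which is one reason async throughput lagged behind synchronous protocols like bisync.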

Read More

Cloud Computing may Change Outsourcing/BPO

Recent events (economic, technological, business) may be driving the convergence of two traditionally separate sources of value in the managed services or outsourcing world: specialized knowledge and scale. Although some organizations have traditionally applied both, it was far more common for smaller service providers to compete on specialized knowledge (with higher infrastructure costs) and for larger firms to compete mostly on scale, with less specialized knowledge and focus than their smaller rivals.

But recent research and trends show that the advent of cloud computing, and of large commodity cloud service providers, may allow specialized managed services providers to stay focused and still provide compelling scale in the infrastructure. Referred to as “ADAMs” (Alternative Delivery Models) by Gartner, these providers leverage not just cloud architectures but also Software as a Service (SaaS) platforms. And there is no reason why this has to be only a two-tier model!

I recently posted a link via Twitter, Facebook, etc regarding a beta service called CloudMQ. This service, not commercialised yet, may be an indicator. It aims to offer standards-based (JMS, or Java Message Service), business-grade messaging (guaranteed delivery, etc) “in the cloud” (offered to customers via the internet). So far, this sounds normal enough, until you realize the entire service is hosted on Amazon’s EC2 and S3 services, which are themselves infrastructure services! And we are only at the start of this. Commercial services like Amazon change the economics, and when the economics change, the models quickly follow…
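“Business grade” messaging largely comes down to guaranteed, at-least-once delivery. Here is a toy Python sketch of the acknowledge-or-redeliver contract; a real JMS provider (or a service like CloudMQ) would implement this durably and across machines, and the class below is purely illustrative.

```python
import collections


class GuaranteedQueue:
    """Toy at-least-once delivery queue: a message stays 'in flight'
    until the consumer acknowledges it, and is redelivered otherwise."""

    def __init__(self):
        self._queue = collections.deque()
        self._in_flight = {}
        self._next_id = 0

    def send(self, body):
        self._next_id += 1
        self._queue.append((self._next_id, body))

    def receive(self):
        msg_id, body = self._queue.popleft()
        self._in_flight[msg_id] = body   # held until acknowledged
        return msg_id, body

    def acknowledge(self, msg_id):
        del self._in_flight[msg_id]      # delivery is now final

    def recover(self):
        """Requeue unacknowledged messages (e.g. after a consumer crash)."""
        for msg_id, body in self._in_flight.items():
            self._queue.append((msg_id, body))
        self._in_flight.clear()
```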

Read More