Compliance

SWIFT and Sibos 2011: Financial Services Growth in 2012 and Beyond

Amid a week in which Moody’s downgraded three top US banks and the global stock market fell into bear territory, SWIFT held its annual Sibos conference in Toronto, Ontario, Canada. Billed as the world’s premier financial services event, this year’s Sibos brought together more than 7,000 leaders from financial institutions, market infrastructures, multinational corporations and technology partners from around the globe. Facilitated and organized by SWIFT, the global provider of secure messaging services, this year’s conference program focused on four big topics: regulation revisited, technology, the changing landscape and new expectations. A number of “big ideas” were discussed during the plenary sessions, including pursuing growth, emerging markets and the impact of regulation.

In the opening plenary, John Havens, President and Chief Operating Officer of Citigroup, stated that to survive higher capital requirements, lower leverage rates and higher funding levels, the banking industry will have to “fundamentally restructure in order to recapture much of that lost revenue.” Banks will need to become leaner, fitter and more ruthless in how they manage costs and pursue revenues. However, Havens saw opportunity for banks to help corporate customers navigate the new macro-economic landscape and understand and access new markets.

In a plenary session dedicated to “Where’s the Growth in 2012 and Beyond,” Tim Keaney, Vice Chairman & CEO Asset Servicing, BNY Mellon, said that he now spends as much of his technology budget on compliance as he spent three years ago on developing new products and services and accessing new markets for clients. Keaney admitted it is challenging to explain to customers that regulation is raising costs and increasing lead times. Customers understand that the cost of doing business is higher, but they expect associated benefits from increased regulation.

The speakers advocated proactive client engagement. Keaney said, “When there is a change affecting our clients, it is an opportunity. Clients are very willing to sit down and talk about what we can do differently to help them; usually that will result in a good product idea that clients are willing to pay for.” Paul Simpson, Head of Global Transaction Services, Bank of America Merrill Lynch (BAML), concurred, stating, “Just because I’ve served clients in a particular way for 15 years, have I really stepped back in the last two years and asked whether it’s the optimal way to serve the client going forward?” Simpson also discussed the huge growth he has seen at BAML in middle market companies expanding into high growth countries. He said that banks need to step up and be more supportive of this global growth.

M.D. Mallya, Chairman of the Indian Banks’ Association, stated that the number of emerging market companies in the global top 1000 used to be 10 to 15 and now numbers in the hundreds. Mallya also talked about this shift from West to East: “Money is flowing east, follow the money.” John Coverdale, Group General Manager, Head of Global Transaction Banking, HSBC Holdings plc, pointed out that the emerging markets are not dealing with legacy systems, whether regulatory or infrastructure based. This provides a situation where there will be regulatory arbitrage (and advantage).
One speaker talked about how “profit challenged” institutions see opportunity in outsourcing non-core functions to specialists—not simply “taking somebody else’s mess for less.” In a separate session, David Robertson, Partner, Treasury Strategies, voiced a similar view: “Our clients of all sizes are globalizing profoundly. In response, banks and other providers are expanding their footprint, but they’re also seeking deeper and more comprehensive partnerships.” How do these trends impact transaction banking technology? Some emerging economies are growing three times as fast as the US and EMEA. Large corporate and middle market companies expanding into these markets require sophisticated treasury technology and associated bank integration solutions to manage cash flows, working capital and liquidity. Technology firms able to service their clients across international footprints, whether the client is a financial institution or a corporation, can deliver global solutions with local resources and knowledge.

Read More

How to Get the US Treasury to Pay Its Bills Faster

With the entire world focused on the US debt ceiling debate over the past few weeks, many suppliers have become increasingly nervous about the federal government’s ability to pay its bills. Fortunately, three weeks ago the US Treasury announced a new program which will enable suppliers to be paid even faster than before. And the program does not involve raising taxes or cutting social programs. By the end of 2012 the Treasury Department will implement the Internet Payment Platform (IPP).

The key to the IPP program is the requirement for suppliers to exchange information electronically with government agencies. Agencies can send electronic purchase orders for items they buy such as computers, furniture, office supplies and professional services. Suppliers can respond with order acknowledgements, shipping notices and electronic invoices. The official press release stated: “Treasury estimates that adopting IPP across the federal government would reduce the cost of entering invoices and responding to invoice inquiries by as much as 50 percent or $450 million annually. These government-wide savings equal roughly one quarter of the $2.1 billion of the efficiency savings that the President’s 2012 Budget called upon agencies to identify.”

IPP is an example of Business-to-Government (B2G) technology, which has been growing in usage since the late 1990s. For example, the Department of Defense’s Wide-Area Workflow platform processes more than 7 million invoices a year from over 90,000 vendors. B2G technologies have also become widely used in many European nations in the past few years. Starting in 2005, Denmark mandated that all vendors selling to the public sector submit invoices electronically. Since then Norway, Sweden, Spain, Switzerland, Finland and Italy have all signed mandates shifting public sector invoicing to electronic formats.

Paper invoices are fraught with problems which cause delays. Paper invoices can get lost in the mailing, routing, workflow or approval process. Because there is no validation on a paper invoice before it is submitted, the bill may not have all the necessary information for approval. For example, the invoice may not have the appropriate accounting information required to enter the expense in the general ledger. Or the supplier may not have provided the address or bank account number that the payment should be routed to. The quantities, prices and descriptions of items on the invoice may not match what was on the original purchase order or shipping documentation. Each of these problems creates an exception scenario that requires contacting the supplier for resolution. Exceptions lead to additional cost for the government and risk of delayed payment to the supplier.

Electronic invoicing reduces the likelihood of lost invoices by creating an audit trail of each step in the workflow. Invoices entered via the IPP web site can be validated to ensure all of the required fields have been entered. Electronic invoices can also be automatically matched to the purchasing and shipping documents. The result is the removal of manual handling of invoices. This straight-through processing accelerates approval and payment cycles.

IPP is already in use by the Department of the Interior, the Small Business Administration and the Bureau of Engraving and Printing. The Department of Agriculture and the Social Security Administration are in the process of deploying IPP. The vision for IPP is for all government agencies to use the platform.
Such a shared service model would not only reduce costs for federal agencies, but it would also simplify processes for suppliers. Companies selling to the US government would have a single portal by which to receive purchase orders, submit invoices and check on payment status. Doing business with the government remains a complex art attempted only by those who know the system. Unfortunately, the complexity deters many suppliers from engaging in government sales, which lowers competition. As taxpayers we all benefit from programs which encourage competition and lower prices for federal purchases. IPP is a great example of a cross-agency solution which will not only increase competition, but also help to reduce budget deficits in the coming years.
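The field validation and document matching described above are what make straight-through processing possible. As a rough illustration, the Python sketch below applies document-level validation and a simple two-way match against the purchase order; the field names, tolerances and matching rules are hypothetical assumptions, not taken from the actual IPP specification.

```python
# Illustrative sketch only: field names and matching rules are hypothetical,
# not the actual IPP invoice schema.
REQUIRED_FIELDS = {"invoice_number", "supplier_id", "po_number",
                   "remit_to_account", "line_items"}

def validate_invoice(invoice: dict, purchase_order: dict) -> list[str]:
    """Return a list of exception messages; an empty list means the
    invoice can flow straight through to approval."""
    exceptions = []

    # 1. Field-level validation: catch missing data at submission time,
    #    not weeks later during approval.
    missing = REQUIRED_FIELDS - invoice.keys()
    if missing:
        exceptions.append(f"Missing required fields: {sorted(missing)}")
        return exceptions

    # 2. Two-way match: every invoice line must agree with the PO line.
    po_lines = {line["sku"]: line for line in purchase_order["line_items"]}
    for line in invoice["line_items"]:
        po_line = po_lines.get(line["sku"])
        if po_line is None:
            exceptions.append(f"Line {line['sku']} not on PO {invoice['po_number']}")
        elif line["qty"] > po_line["qty"] or line["unit_price"] != po_line["unit_price"]:
            exceptions.append(f"Quantity/price mismatch on line {line['sku']}")

    return exceptions
```

In a production system the same idea would extend to a three-way match that also checks the shipping notice, but the two-way version is enough to show why exceptions drop when invoices arrive electronically.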

Read More

GSA Advantage – An Example of How B2G Technologies Can Help to Lower the US Federal Deficit

Just a few weeks remain until the August 2nd deadline set by the Treasury Department for raising the US government’s $14.3 trillion debt ceiling. President Obama held a news conference this morning rejecting any proposals for a short-term 30, 60 or 90 day extension. If a deal is not reached in the next few weeks, the Treasury Department will begin a complex process of trying to decide who to pay and who not to pay. It may choose to pay bondholders first to avoid risk of default. Payments to social security, military service members and government personnel could be suspended indefinitely. Regardless of what political solution prevails, there will undoubtedly be a renewed focus on how to reduce government expenses. B2B e-commerce, or I should say B2G (Business-to-Government) e-commerce technologies, can play a critical role in reducing the cost of government purchases.

One of the best examples of B2G technologies is the GSA Advantage web site. GSA offers a purchasing portal that federal agencies can use to acquire goods and services. GSA Advantage looks similar to most retail B2C e-commerce sites. However, it is specially designed for use by government personnel. Products and services are organized into approximately 20 categories such as building and industrial supplies; IT solutions and electronics; and vehicles and watercraft. There are a number of aspects of GSA’s site that I find particularly compelling.

First, access to GSA Advantage is not limited to government personnel. Anyone can browse the site. Of course, only government employees can actually make purchases. I love the transparency offered to taxpayers on exactly what products the government can buy. Furthermore, I like being able to view the actual prices being paid for the merchandise. Private-sector procurement professionals should visit the GSA site regularly to understand what the “most favored nation” pricing for a particular item is.

Second, GSA Advantage offers advanced search and filtering capabilities specifically designed for government buyers. For example, you can filter product searches to include only companies participating in the AbilityOne program, such as National Industries for the Severely Handicapped. Skilcraft, which is the brand name for products manufactured by the National Industries for the Blind, offers 3,000 different SKUs ranging from office supplies to call center services. GSA Advantage also allows you to search for products offered by small businesses. There are many different categories of small businesses, including “Women-Owned Small Businesses” and “Veteran-Owned Small Businesses.” Another category is Historically Underutilized Business Zones, which are called HUBZones. Examples of qualified HUBZone companies might be those based within the boundaries of a Native American Indian reservation or those businesses established on former military base locations.

Third, all of the items being sold on GSA Advantage are from the GSA Schedule, which means that terms and conditions have been pre-negotiated. As a result, discounts, payment terms, shipping costs, warranty coverage, return policies and customer support levels have already been standardized in advance. GSA Advantage greatly simplifies the process for acquiring goods and services. A buyer selects items from the web site by adding them to a shopping cart. A checkout process occurs in which the government employee uses a SmartPay commercial credit card to complete the purchase.
There is no invoicing or payment process required, as the financial settlement occurs via the SmartPay process. B2G technologies such as GSA Advantage play a critical role in lowering government expenses for everyday purchases. By leveraging e-commerce technologies, public sector organizations can make it easier for more suppliers to sell goods and services to the government. As we all know, increased competition from more suppliers will lead to lower costs. B2G technologies also lower the administrative costs associated with purchases. The combination of commercial credit cards and online B2B storefronts such as GSA Advantage can yield savings of $50 to $100 per order by eliminating the need for time-consuming purchase order requisition and invoice approval processes. Of course, GSA Advantage is just one of many opportunities for the government to use B2G e-commerce technologies to reduce costs. More examples and thoughts to come in future posts.

Read More

Introducing GXS Data Quality and Compliance Service (DQC) – Foundation for “ERP Firewalls”

For those of us in Product Management, there aren’t many things more fulfilling than new product introductions. It gives me great pleasure to share details around an exciting product launch! GXS today announces general availability of the next-generation Data Quality and Compliance service (DQC) for the GXS Trading Grid, the world’s largest integration cloud. Data Quality and Compliance is a multi-enterprise PaaS solution that provides real-time visibility and control by automatically tracking in-flight B2B transactions and processes against business and compliance rules to detect and prevent errors. It applies both document-level and business process rule validation to data, allowing for proactive recognition of business-impacting problems.

An AMR Research (now Gartner) report entitled ERP Projects Create Significant B2B Opportunities found that one-third of all data housed in an ERP system originated outside the organization. This study found that external data came from three key sources: customers and distributors (43%), suppliers and contract manufacturers (31%), and third-party logistics providers and transportation carriers (17%). Furthermore, research conducted in the discrete manufacturing segment found that, on average, 2.9% of transactions originating from external trading partners required exception processing or error handling. The consequences of exception processing resulting from poor data quality are numerous and highlight the need for B2B integration solutions to play a vital role in monitoring and improving the quality, accuracy and timeliness of supply chain data exchanged between organizations.

GXS Data Quality and Compliance (DQC) delivers capabilities for validation, exception management, collaboration and reporting, allowing specific problems to be effectively resolved and repeat problems to be driven out of the business altogether. Visibility tools allow users to research issues, highlight trends and conduct root-cause analysis of business problems and data quality issues. These tools are designed to improve the quality and timeliness of B2B transactions in audit mode. DQC introduces several features that have evolved with input from beta customers in the retail, consumer products and manufacturing sectors. Highlights of these capabilities are:

Compliance-Guide Modeling – A flexible, business-rule modeling database captures and maps a company’s exact trading partner requirements, such as compliance guides, routing guides and service level agreements.

Transaction Validation – The validation service sits on top of the GXS stack and utilizes a validation engine to evaluate inbound/outbound transactions against documented rules and requirements, flagging errors and issues with drill-down to root-cause data.

Exception Management – The service provides exception management capabilities that can be as simple as enabling notifications to trigger real-time email alerts by role and organization. Complex exception management processes are enabled via a flexible workflow engine, including generation of production issues and alerts for exceptions that users are expected to take action on.

Online Reporting and Analytics – Automated reports provide visibility into partner performance and key metrics. Reports can be customized, shared, subscribed to, exported and drilled into, making it easy to collaborate with partners on issues.
Scorecards – The service provides a consolidated scorecard of key performance indicators (KPIs) for the business, along with individual trading partner scorecards measured against specific metrics and SLAs.

We’re actively working with several global organizations to help them leverage the Data Quality and Compliance Service and achieve their supply chain goals.
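To make the rule-driven validation described above more concrete, here is a minimal sketch of how a document can be evaluated against a small set of business rules, producing exceptions suitable for alerting and scorecarding. The rule names, fields and logic are invented for illustration; this is not the GXS DQC rule engine or its data model.

```python
# Hypothetical sketch of business-rule validation of in-flight B2B documents.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    rule_id: str
    doc_type: str                          # e.g. "ASN" or "Invoice"
    description: str
    check: Callable[[dict], bool]          # returns True when the document complies

RULES = [
    Rule("R001", "ASN", "ASN must be sent on or before the ship date",
         lambda doc: doc["asn_sent_date"] <= doc["ship_date"]),
    Rule("R002", "Invoice", "Invoice total must equal the sum of its lines",
         lambda doc: doc["total"] == sum(l["amount"] for l in doc["lines"])),
]

def validate(document: dict) -> list[dict]:
    """Evaluate every applicable rule and return exceptions that can feed
    email alerts, workflow, reporting and partner scorecards."""
    return [
        {"rule": r.rule_id, "partner": document["partner_id"], "detail": r.description}
        for r in RULES
        if r.doc_type == document["doc_type"] and not r.check(document)
    ]
```

The same exception records, aggregated by trading partner over time, are what turn individual errors into the trend and root-cause views mentioned above.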

Read More

Insurance Technology: Time to Get Your Head in the Clouds

As the spring 2011 conference season winds down, I attended the ACORD/LOMA Insurance Systems Forum in San Diego, CA. ACORD (Association for Cooperative Operations Research and Development) is a global, nonprofit standards development organization serving the insurance industry. LOMA (Life Office Management Association) provides training and education for insurance professionals worldwide. The event is billed as the premier business and technology event for insurance professionals.

One of my goals at ACORD/LOMA was to better understand cloud computing in the insurance industry. There were several sessions that touched on cloud and Software-as-a-Service (SaaS). One of the most interesting was “Cloud Computing for Insurers: Time to Get Your Head in the Clouds” by Bob Hirsch, Director, Technology Strategy and Architecture, Deloitte Consulting LLP. Bob provided some interesting thoughts on why cloud isn’t more prevalent in insurance. One reason is that cloud vendors have been slow to meet the regulatory demands of insurance. Another is that vendors are not in the “core” space; most cloud implementations are at the “edge for specific workloads.” Insurance firms also have concerns about data loss, security and privacy, audit and assurance, backup and disaster recovery, vendor “lock-in” and IT organizational readiness. Bob described vendor “lock-in” as the inability to easily migrate your company’s information from the cloud provider’s data center to your own if you decide to bring processing back in-house.

Bob suggested that with quality datasets, computing advances and maturing tools, analytics could become a strategic cornerstone of the enterprise. As an example, he talked about the cost savings from moving volatile computing needs to the cloud. Bob explained that insurance companies need to run stochastic models each quarter to estimate risk. Large insurers are running grids of 2,500 nodes and growing for this type of computing. Running the models can take 24 to 48 hours, but the rest of the time the servers are idle. Bob stated that current grid systems can be modified to be cloud aware and “burst” capacity to clouds as needed, by storing the grid image in the cloud and deploying it across servers for periods of peak demand. Bob also walked through a cost/benefit analysis of Monte Carlo simulations for hedge funds, which have limited in-house IT resources. The analysis showed in-house monthly costs of $14,280 vs. $6,930 for cloud, a 51% savings.

For the moment, Bob said that smaller insurance firms are ahead of larger ones in using cloud-based applications. This is because insurance systems are very fragmented within larger organizations, and larger firms are slow to consolidate systems across the enterprise.
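For readers who want to check the math behind the quoted savings, the arithmetic works out as follows; the assumptions underlying the two monthly figures were not broken down in the session, so this is purely a restatement of the numbers presented.

```python
in_house_monthly = 14_280   # monthly in-house cost quoted in the session
cloud_monthly = 6_930       # monthly cloud cost quoted in the session

savings = in_house_monthly - cloud_monthly          # $7,350 per month
savings_pct = savings / in_house_monthly * 100      # ~51%, the figure cited
print(f"Monthly savings: ${savings:,} ({savings_pct:.0f}%)")
```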

Read More

How ENS Helps to Secure Supply Chain Shipments into the European Union

For many years customs agencies around the world first learned of an ocean-based shipment’s existence as it approached a port. This notification came in the form of a manifest which was prepared and filed by the carrier of the goods. These manifests often contained very vague descriptions of the contents of a shipment, and a more detailed customs declaration was sent by the importer at a later date. This process was in place for years, but over the past ten years heightened security tensions around the world have led many countries to tighten the screening of all imported goods.

The United States was one of the first countries to tighten its import procedures, with the introduction of the Advanced Trade Data Initiative, which was then superseded by the 10+2 compliance procedure in early 2010. Ten pieces of information about the shipment are provided by the importer and two are provided by the carrier. This information allows U.S. Customs to pre-screen shipments and make sure that goods are safe for import. The more formal name of this import procedure is the Importer Security Filing (ISF), and my colleague Pradheep Sampath wrote a blog on this subject just after it was introduced in North America in early 2010.

Other countries have been busy trying to replicate the success of the 10+2 system, and in January 2011 the member countries of the European Union introduced a similar pre-screening process called an Entry Summary Declaration, or ENS for short. Each of the 27 member countries making up the EU has been actively establishing its program, albeit on slightly different time schedules and in some cases using different names. For example, in the UK the local version of ENS is called the Import Control System (ICS) and was introduced on 2nd November 2010.

The main differences between the North American ISF and the EU’s ENS are twofold. Firstly, unlike ISF, which is filed by the importer and the carrier, the carrier is solely responsible for filing the EU’s ENS declaration. Secondly, unlike the 12 elements contained in ISF, ENS has nearly twice as many data elements. As you can imagine, getting 27 different countries to agree on what information would be required was not easy, but an agreement was reached, and all shipments into the European countries now require the following 22 pieces of information to be sent to the port of entry before the shipment leaves its point of origin:

Seller/Consignor (EORI #)
Buyer/Consignee (EORI #)
House BL Number
Master BL Number
Carrier
Person Entering the Filing
Notify Party
Country of Origin
At least the first four digits of the HTS Number (Commodity Harmonized Tariff Schedule of the EU)
Place of Loading Location
First Port of Entry in EU
Description of Goods (not required if a four- or six-digit HTS code is provided)
Packaging Type Code
Number of Packages
Shipment Marks and Numbers
Container Number
Container Seal Number
Gross Weight in Kilos
UN Dangerous Goods Code
Transportation Method of Payment Code
Date of Arrival at First Port in EU
Declaration Date

Following the introduction of ISF in North America, U.S. Customs and Border Protection estimated that as of July 2010 nearly 80% of importers were ISF compliant, and this figure is likely to be even higher now. Further information on the ISF procedure is available for download here. There are two significant benefits that customs agencies around the world are seeing from the introduction of ISF, ENS and other shipping notification processes.
Firstly, the submission of a standard set of information for each imported consignment means that more accurate information about consignments is being used across the supply chain. Secondly, because shipping information has to be sent ahead of time, preferably electronically via EDI, supply chains are becoming more secure, and this is helping, for example, to reduce the amount of counterfeit goods entering western economies. If any counterfeit goods are found, the additional information submitted via ISF or ENS makes it a lot easier to track down where they originated. Further information about ENS can be found at the European Customs Portal, and I will discuss how GXS can help improve the visibility of global shipments in a future blog entry.
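For readers who prefer to see the declaration as a record, the sketch below lays out the 22 data elements listed above as a simple data structure. The field names are informal shorthand of my own, not the official EU ICS/ENS message schema, and no validation logic is implied.

```python
# Rough illustration of the ENS data elements as a record; field names are
# informal shorthand, not the official EU ICS/ENS message definitions.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class EntrySummaryDeclaration:
    consignor_eori: str           # Seller/Consignor (EORI number)
    consignee_eori: str           # Buyer/Consignee (EORI number)
    house_bl: str
    master_bl: str
    carrier: str
    filer: str                    # person entering the filing
    notify_party: str
    country_of_origin: str
    hts_code: str                 # at least the first four digits of the HTS number
    place_of_loading: str
    first_port_of_entry: str
    goods_description: str        # optional when a 4- or 6-digit HTS code is given
    packaging_type_code: str
    number_of_packages: int
    marks_and_numbers: str
    container_number: str
    container_seal_number: str
    gross_weight_kg: float
    un_dangerous_goods_code: Optional[str]
    transport_payment_method: str
    arrival_date_first_port: date
    declaration_date: date
```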

Read More

CPSIA – The Certificate Exchange Challenge

The Consumer Product Safety Improvement Act (CPSIA) debate is heating up once again as another key deadline approaches in February. Back in December 2009, the CPSC decided to extend a stay on certification and third party testing for children’s products subject to lead content limits until February 10, 2011. Under this decision, products were still required to meet the limits for lead composition, but the requirement to demonstrate certification and third party testing was extended to apply only to products manufactured after February 10, 2011. There is a growing debate about whether the deadlines should be extended further.

One of the key provisions of the CPSIA legislation is that manufacturers are required to make General Certificates of Conformity (GCC) available to retailers and distributors. The certificates attest to compliance with the law and its principles to minimize the use of toxins such as lead and phthalates in children’s products. A GCC must include not only the product and importer, but also the date and place of manufacture as well as the date and place of testing. State Attorney General inspectors can audit stores and demand to see proof of testing certificates, which retailers must produce within 24-48 hours.

The management of certificates for CPSIA illustrates one of the common challenges with information exchange in the supply chain. The easiest way for a supplier or distributor to comply would be to post their certificates to a secure web portal. Retail customers could be notified of the URL and offered credentials for accessing the certificates posted on the supplier’s site. Of course, the challenge with such an approach is the burden placed upon the retailer. To ensure compliance, retailers would need to log in to hundreds of supplier portals weekly to download and/or verify the appropriate certificates.

An alternative approach to managing certificates is for retailers to request that suppliers push the certificates directly to them. The certificates could be exchanged via an e-mail message, file transfer or manual file upload to a retailer’s portal. While greatly simplifying the process for retailers, such an approach creates a significant burden for suppliers. Suppliers must create a process for routing certificates corresponding to specific lots to each retailer. Such a process is highly error prone. If a supplier routes the certificate to the wrong retailer or fails to transmit certain certificates, then both parties could be non-compliant with the regulations. CPSIA conformity certificates would not be the first documents to be exchanged using such an inefficient, point-to-point model. Retailer compliance guides for shipment labeling, carrier routing and EDI policies are shared via supplier portals today.

An alternative and more efficient model would be to create a centralized clearinghouse for the exchange of CPSIA certificates. Suppliers would publish all certificates to a centralized location, tagging each with details such as the product identifier (GTIN) and the recipient of the goods (i.e. the retailer). Retailers would be able to access all the GCCs for all suppliers and all product categories in one central location. Of course, security procedures would need to be enforced to ensure that retailers only have access to the certificates for products which they ultimately received. A centralized clearinghouse simplifies the process for both the retailer and the supplier.
Each party has a single location to publish or subscribe to certificates, thereby reducing the level of effort required to comply and lowering the probability of errors. Enterprise Community Management vendor Rollstream has developed such a clearinghouse, which it calls the Certificate Exchange Network (CEN). The CEN is an online platform where manufacturers can post or link to existing certificates of conformity directly. Retailers can then download on demand any product certificate needed to demonstrate CPSIA compliance. Through a partnership with the National Association of Chain Drug Stores (NACDS), Rollstream has already achieved significant levels of adoption amongst grocery, chain drug and mass merchandise retailers and suppliers in the pharmaceutical and consumer products segments. Despite the ongoing lobbying efforts of selected manufacturers and retailers, it seems unlikely that the CPSIA will be repealed any time soon. As the February deadline approaches, it will be interesting to see if the focus amongst industry participants shifts towards ensuring compliance with the regulations. Solutions such as the Rollstream CEN certainly provide an example of how technology can be used to simplify the administrative burden for all parties.
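To illustrate the publish/subscribe clearinghouse model discussed above, here is a simplified sketch of how certificates could be tagged by product and recipient and then retrieved only by the retailer that received the goods. This is an illustration of the concept only, not the Rollstream CEN API or data model.

```python
# Simplified sketch of a certificate clearinghouse with recipient-scoped access.
from collections import defaultdict

class CertificateExchange:
    def __init__(self):
        # (gtin, retailer_id) -> list of certificate documents
        self._certs = defaultdict(list)

    def publish(self, supplier_id: str, gtin: str, retailer_id: str, gcc: dict):
        """Supplier posts one GCC to a single central location, tagged with
        the product identifier and the recipient of the goods."""
        self._certs[(gtin, retailer_id)].append({"supplier": supplier_id, **gcc})

    def retrieve(self, retailer_id: str, gtin: str) -> list[dict]:
        """Retailer sees only certificates for products it actually received,
        which is the access control requirement noted above."""
        return self._certs.get((gtin, retailer_id), [])
```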

Read More

The Periodic Table of the Supply Chain

In the past decade a new wave of regulations has been passed in the areas of greenhouse gas emissions, waste recycling, rare earth minerals and consumer product safety, and these are changing product design and supply chain management strategies. At the current pace of regulation, one will need a PhD in chemistry and a law degree to be able to manage supply chain operations. Below is a list of some of the elements on the watch list:

#6 – Carbon (C) – has become synonymous with the greenhouse gas emissions associated with global warming. Consequently, carbon compounds, primarily CO2, are the target of numerous emissions reduction regulations and the focal point of various non-governmental organizations.

#27 – Cobalt (Co) – is widely used in rechargeable batteries and recordable media. Sometimes mined in Central Africa amid various human rights violations, cobalt is considered a “conflict mineral.” The mineral is widely dispersed in the Earth’s crust, providing a number of different sourcing options.

#39 – Yttrium (Y) – has a variety of commercial applications ranging from powering superconductors to fighting cancer cells. Only a few tons of yttrium are produced each year, the majority of which originates in China. In September 2009, China placed limitations on exports of rare earth minerals such as yttrium to protect the environment and conserve scarce resources.

#48 – Cadmium (Cd) – is primarily used in batteries to power electric devices. Cadmium poses numerous health risks, including possible links to cancer. Cadmium is one of six substances banned through the European Restriction of Hazardous Substances (RoHS) directive, which governs the content of electronics.

#50 – Tin (Sn) – is used widely in electronics as a protective coating or for solder. Tin is sometimes combined with other elements into alloys. Sometimes mined in Central Africa as a conflict mineral, tin can be good or bad depending upon where it comes from.

#60 – Neodymium (Nd) – is used in lasers, electric motors, microphones, professional loudspeakers, in-ear headphones and computer hard disks. Neodymium was used in the original Sony Walkmans due to its magnetic properties. The majority of mining occurs in China, where the government has imposed export restrictions on such rare earth minerals.

#63 – Europium (Eu) – is used in television sets, LED lights and fluorescent lamps. In the 20th century europium was used to display the color red in TV sets. Although it was named after Europe, the majority of mining occurs in China today. Europium falls under China’s rare earth mineral regulations.

#65 – Terbium (Tb) – is used in television sets, fluorescent lamps, naval sonar systems and solid state devices. Terbium is also amongst the rare earth minerals predominantly mined in China.

#66 – Dysprosium (Dy) – is used in nuclear reactors and high intensity lighting. Dysprosium can also be used in drive motors for hybrid electric vehicles. 99% of the world’s dysprosium is mined in China. The rare earth metal has experienced significant price increases in recent years, which will likely continue with China’s export restrictions.

#74 – Tungsten (W) – has a high melting point, making the element popular for use in high temperature environments such as Cathode Ray Tubes (CRTs) and light bulbs. Tungsten is also used extensively in integrated circuits. Tungsten is included in the category of conflict minerals originating in Central Africa.

#73 – Tantalum (Ta) – is a good conductor of heat and electricity.
Tantalum is often used in capacitors found in mobile phones and personal computers. One of the primary conflict minerals, tantalum from Central Africa should only be sourced from mines employing fair labor practices.

#79 – Gold (Au) – Because of its electrical conductivity and resistance to corrosion, gold is popular in audio, video and computer connector cables. Gold is also used in mission-critical computers, spacecraft and jet aircraft engines. Mined in Central Africa, gold can be a conflict mineral as well.

#80 – Mercury (Hg) – is extremely toxic. Historically, mercury was used in various switches and batteries. Today its uses include neon signs, cosmetics and medicines. It is amongst the banned substances governed by the RoHS directive. Mercury is also regulated via the US Environmental Protection Agency’s Clean Water Act and Clean Air Act.

#82 – Lead (Pb) – is a poison that can cause brain and blood disorders when ingested, particularly by small children. Lead is used in a variety of products from radiation shielding to construction materials. Consumer product safety regulations such as the CPSIA minimize the use of lead in paints, especially for children’s toys, books, apparel and games. Lead is also amongst the banned substances included in the European RoHS directive.

Read More

How the Military Health System Supports Our Veterans with Electronic Health Records

Tomorrow is Veterans Day in the US, so I thought a post about the military was appropriate. This morning, I had the opportunity to attend an excellent session at the WEDI Fall Conference on how the Military Health System (MHS) uses Electronic Health Records (EHR). The speaker was Nancy Orvis from the Office of the Assistant Secretary of Defense. I was surprised to learn how advanced the military’s EHR system is compared to the commercial sector in the US.

About the Military Health System

The MHS is one of the largest health systems in the world, with 9.7M beneficiaries and a $50B budget. MHS supports active duty personnel in the Army, Navy and Air Force as well as their dependents (immediate family members). Retired military personnel are supported as well. MHS is very different from almost any other health care provider in the world in that it truly has global operations. MHS has 59 fixed-location hospitals, 364 medical clinics and 275 dental clinics. Additionally, MHS operates a variable number of temporary locations in “in theatre” regions such as Iraq and Afghanistan. For example, there are currently 15 hospitals, 262 resuscitative sites and operations on 25 Navy ships.

Electronic Health Records

The military EHR system is known as AHLTA (Armed Forces Health Longitudinal Technology Application). AHLTA originated from a presidential directive in 1997 that reinforced the “need for a centralized, longitudinal patient record for military personnel accessible across the DoD enterprise.” The system was fully deployed in 2004 and is now considered to be one of the most expansive EHR applications in the US. AHLTA has been deployed with a phased approach, introducing broader sets of functionality with each release. The system is currently on version 3.3. There are desktop, browser and mobile versions of the AHLTA application to support the various use cases at fixed-location facilities and for in-theatre needs. AHLTA enables retrieval of a beneficiary’s health record at a point of care anywhere within the military’s health operations. On an annual basis, AHLTA supports:

2.2M prescriptions
1.8M outpatient visits
103.4K dental procedures
21.8K inpatient admissions
2,300 babies

Unique Challenges for Military EHR

Some of the challenges that MHS faces include:

Consistency – Military personnel have a relatively short deployment at each location, switching every 2-3 years between posts in the US, Europe or Japan. To ensure optimal productivity of personnel, the EHR user interface needs to look and act the same no matter where a provider is accessing the system.

Storage – Health records for military personnel are considered federal government records. Consequently, military EHRs fall under the jurisdiction of national archive laws, which require that records be stored for 75 years.

Sharing – Selected retirees and active duty personnel may be qualified to receive benefits from the Veterans Administration or Social Security. However, universal access to all patient records cannot be provided to the other agencies, as many of the 9.7M beneficiaries are not eligible for the extra benefits.

Speed – Increasingly, injured military personnel are being moved from local, in-theatre care centers back to centralized facilities in the US such as the Walter Reed Medical Center in DC or San Antonio, TX. The transfers can occur in less than 24 hours. All of the EHR data updated in the field must be available to the providers treating the injured personnel back in the US.
Supply Chain

MHS also uses other B2B integration technology such as EDI to support its procurement, materials management and logistics processes. MHS places an average of 1,500 orders per day, for a total of $4.5B annually in pharmaceuticals and medical-surgical supplies to stock its various facilities. Not all of the materials are purchased in the US; there is local procurement in Europe and Japan as well. To streamline the supply chain, MHS uses the ANSI X12 EDI standards for product catalogs (832), purchase orders (850), order changes (860) and advance shipment notices (856).

Revenue Cycle Management

MHS acts as both a provider and a payer. Not only does it provide care at its own facilities, but it also offers beneficiaries access to third-party commercial health services in the US. After care delivery, these third-party providers seek reimbursement from MHS for services rendered. To facilitate the processing of claims, MHS supports the ANSI EDI 4010 (soon to be 5010) standards for eligibility verification (271), claims submission (837), claims status (276/277) and electronic remittance advice (835).
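As a quick reference, the transaction sets mentioned above can be summarized in a simple lookup table. This is just an illustration of the standard X12 numbering for these document types, not an artifact from any MHS system.

```python
# Standard ANSI X12 transaction sets referenced above, as a lookup table.
SUPPLY_CHAIN_TRANSACTIONS = {
    "832": "Price/Sales Catalog (product catalogs)",
    "850": "Purchase Order",
    "860": "Purchase Order Change Request",
    "856": "Advance Ship Notice",
}

REVENUE_CYCLE_TRANSACTIONS = {
    "270/271": "Eligibility Inquiry / Response",
    "837": "Health Care Claim",
    "276/277": "Claim Status Inquiry / Response",
    "835": "Claim Payment / Remittance Advice",
}
```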

Read More

E-Prescription Networks – Making Health Care Less Painful

One of the most annoying parts of the US health care system is all the paperwork you have to fill out.  Every time I visit a new provider, I have to fill out a whole series of forms answering questions about my demographics, medical history and current prescriptions.  Even when you visit the same provider consistently, you are still asked a series of repetitive questions.  For example, every time I visit my primary care physician, both the nurse who checks my vitals and the physician who sees me ask for a list of current medications.  If I fail to list one (e.g. the Ambien Rx I requested nine months ago for an international flight), then they query me about any disconnects between their records and my answer.  Why is it my responsibility as the patient to know the names and strengths of all the medications I am taking?  Why doesn’t the provider track this, since they are the ones recommending the medications in the first place?  Surely, the process of tracking prescription medications could be simplified for patients.  The good news is that there has been an effort underway for almost a decade now to digitize the exchange of prescription data throughout the health care system.

Read More

Tracking Terrorists on B2B Networks – SWIFT EU Data Privacy Debate Continues

In my last post, I outlined how the TFTP (Terrorist Finance Tracking Program) administered by the US Central Intelligence Agency has been leveraging data from the SWIFT network to identify money laundering activities of terrorist cells. The program was made public back in June of 2006 by the New York Times. Nonetheless, in 2010 it continues to generate a significant amount of controversy due to data privacy concerns in both the US and the European Union. In fact, the European Parliament rejected a proposed formal agreement that would legitimize US access to data involving European citizens back in February. I highlight the SWIFT TFTP program not as a criticism of SWIFT, but rather because it provides an interesting case study in the growing regulation of B2B e-Commerce activities by local, state and national governments.

Controversy in the US

Although most of the focus on TFTP in the past 12 months has been within the European Parliament, there was a fair amount of controversy in the US back in 2006. The secret data mining practices of TFTP were first exposed by the New York Times in a June 23rd, 2006 article by Eric Lichtblau and James Risen. The journalists interviewed over 20 government officials about the classified program to gather further details. However, the story was almost not published, as the Bush Administration requested the New York Times not disclose the program, fearing it would jeopardize TFTP’s effectiveness. Bill Keller, the newspaper’s executive editor, said: “We have listened closely to the administration’s arguments for withholding this information, and given them the most serious and respectful consideration. We remain convinced that the administration’s extraordinary access to this vast repository of international financial data, however carefully targeted use of it may be, is a matter of public interest.” It is amazing to me that the program was in operation for nearly 5 years before being disclosed to the public. Perhaps more interesting is that the program became effectively permanent without any Congressional approval or formal authorization. TFTP’s access to SWIFT data was justified under the International Emergency Economic Powers Act, which gives the President authority to “investigate, regulate or prohibit” foreign transactions in responding to “an unusual and extraordinary threat.” The Act enabled government agencies to access financial data without seeking court-approved warrants or subpoenas.

Criticism within the European Union

Shortly after the publication of the New York Times article, SWIFT came under high levels of scrutiny by both the European Union and Belgian (SWIFT is headquartered outside Brussels) governments. The primary concern was the access to financial data of European citizens granted to the US government. SWIFT was originally accused of breaking data protection laws. It is unclear whether any EU Parliament Members or European Commission members were aware of the TFTP program prior to the New York Times story. However, SWIFT has disclosed that its 25-member Board of Directors as well as many of the central banks which leverage its services were notified. SWIFT offered a strong response to Parliamentary criticisms of its actions, stating, “SWIFT is subject to lawful subpoenas in the United States because it has substantial business and operations there, including data storage.” In its defense, SWIFT cites the controls and audits put into place to ensure restricted use of the data.
“SWIFT negotiated with the US over the scope and oversight of the subpoenas to protect the confidentiality of its members’ data and obtained extraordinary protections and assurances as to the purpose, confidentiality, oversight and control of limited sets of data produced under the subpoenas. These protections ensure that only a limited set of data is accessed.” In fact, in 2003 SWIFT threatened to suspend data sharing efforts with the US due to concerns about the scope and use of the data. However, a meeting between SWIFT executives and US officials including Alan Greenspan solidified a longer-term agreement. A key outcome of the negotiation was that an outside auditing firm would be used to verify that data searches are focused on terrorist-related activities as opposed to tax fraud, drug trafficking or other illegal activities.

Status of the SWIFT Data Transfer Deal within the EU

In November 2009, a nine-month interim agreement was reached which would allow the US to continue mining selective SWIFT records for terrorist activities. However, shortly thereafter, in February, the European Parliament rejected the deal due to concerns about inadequate data privacy protection for European citizens. EU Parliament Members have called for additional stipulations in the data sharing agreement such as:

Reciprocal access to data about the activities of US citizens if the EU chose to create a similar Terrorist Finance Tracking Program
Stricter limitations on data usage, such as maximum retention periods of 5 years and embargoes on sharing the data with 3rd party governments
Remedies for EU citizens to seek restitution from the US government if data is misused, and the ability to terminate the agreement upon evidence of privacy violations

While the European Parliament and US Government have been negotiating agreements, SWIFT has been busy reorganizing its operations to mitigate the risk of EU citizen data privacy breaches. Historically, SWIFT has operated two data centers, one in the US and the other in the EU. For business continuity purposes, all transactions and data are mirrored into both centers. Consequently, all data related to European transactions is stored both in the Belgian data center and the US data center, which operates under US jurisdiction. SWIFT has recently brought a third data center online in Switzerland. The third center will allow SWIFT to logically segregate data while maintaining business continuity. Transactions that occur within the EU will be replicated to the Swiss and Belgian centers, but not to the US. Such an approach should mitigate the risk of the US seeking access to intra-EU transactions. However, transactions which involve money transfers between the EU and the US may continue to be hosted in the US center. For those interested in more details, there is an excellent series of articles on the Finextra site which provides a chronology of the events leading up to the current negotiations.
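To make the data-segregation idea described above more tangible, here is a hypothetical sketch of zone-based replication routing: intra-EU traffic is mirrored only within European jurisdiction, while transatlantic traffic continues to be mirrored to the US center. The country list, data center names and routing logic are invented for illustration and are not SWIFT’s actual implementation.

```python
# Hypothetical sketch of zone-based replication routing; not SWIFT's design.
EU_ZONE = {"BE", "DE", "FR", "NL", "IT", "ES"}   # illustrative subset of EU countries

def replication_targets(sender_country: str, receiver_country: str) -> set[str]:
    """Decide which data centers receive a copy of a message."""
    if sender_country in EU_ZONE and receiver_country in EU_ZONE:
        # Intra-EU traffic stays within European jurisdiction.
        return {"BELGIUM_DC", "SWITZERLAND_DC"}
    # Transatlantic and other traffic continues to be mirrored to the US center
    # for business continuity, as before.
    return {"US_DC", "BELGIUM_DC"}
```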

Read More

TFTP – A Government-to-Business Bulk Data Transfer Utility

The tech sector headlines over the past few weeks have been dominated by discussions about the privacy policies of Facebook, Google and Yahoo, both in Europe and the US.  Why doesn’t anything controversial like this ever happen in the B2B e-Commerce sector? You might be surprised to learn that consumer and business privacy issues do arise in B2B.  And you would probably be even more surprised to learn that the issues are so significant that they make the front page of the New York Times.  And you would be in disbelief if I told you that these data privacy concerns are considered strategic issues by the European Parliament, the US Federal Bureau of Investigation and the Central Intelligence Agency. In this post, I want to highlight an example of how B2B networks are being used to track the money-laundering activities of Al Qaeda terrorist cells through a system called TFTP.

What is TFTP?

If you are a B2B e-Commerce practitioner or regular reader of this blog, you might have guessed that TFTP is yet another proprietary Managed File Transfer protocol that is creating barriers to frictionless commerce.  Unfortunately, you would be incorrect.  TFTP refers to the Terrorist Finance Tracking Program that was initiated by the Bush Administration shortly after the September 11th attacks. The objective of the program is to identify and cut off the international sources of financing that terrorist cells in the US depend upon to conduct their operations locally.  Nine of the hijackers responsible for September 11th funneled money from Europe and the Middle East to SunTrust bank accounts in Florida.

What is the relationship between TFTP and B2B e-Commerce?

For almost nine years, the TFTP program has been leveraging data about international financial transactions that is transmitted over the SWIFT network. SWIFT is a specialized business process network which connects all the major financial institutions around the globe for the purpose of exchanging information related to payments, securities, foreign exchange and letters of credit transactions.  SWIFT is not a bank and does not process payments, exchange currencies or facilitate securities trades. Instead, SWIFT is an information network that allows its customers, primarily financial institutions, to send instructions about which payments to execute, how to settle securities trades and cash positions in bank accounts.  SWIFT does not refer to itself as a B2B e-Commerce network.  However, I classify it as one because its primary business is the exchange of standards-based messages and structured files between different institutions. SWIFT is the biggest financial network, with connections to over 9,000 financial institutions in over 200 countries.  With every major financial institution on the planet connected to it, SWIFT routes in excess of $6 trillion daily between institutions.  Although SWIFT supports a wide variety of transaction types, the network’s strong suit has always been activities related to cross-border, high-value payments.  Consequently, SWIFT possesses a rich source of international financial transaction data that would be of great interest to US government agencies seeking to track terrorist money flows. And that is exactly the nature of the relationship that has existed between the US TFTP and SWIFT since a few weeks after the September 11th attacks.
The US government claims that by mining data from SWIFT transactions it has been able to identify thousands of terrorist related funding activities, including several high profile arrests. Of particular interest to the US are transactions originating in the United Arab Emirates or Saudi Arabia destined for the accounts of US businesses and individuals with known terrorist affiliations.  For example, SWIFT data provided a link which helped to locate Riduan Isamuddin, believed to be responsible for a 2002 bombing of a Bali resort, in Thailand in 2003. More in a future post…

Read More

B2B Integration could help improve tracking of Pandemics such as H1N1 Swine Flu

I was watching the movie I Am Legend on HBO Sunday evening.  I’m not sure if there is any correlation between HBO’s decision to broadcast the film in May and the outbreak of the H1N1 swine flu.  However, it did start me thinking about pandemics and what could be done to better contain these outbreaks before they turn all of Manhattan into nocturnal, cannibalistic zombies.  The widespread outbreaks of H1N1 in Mexico and the US have made this subject top of mind for everyone from politicians to economists.  Of course, pandemics are yet another area in which B2B interoperability and integration technologies could play a significant role. The Center for Information Technology Leadership published a comprehensive report on how B2B interoperability in the US health care community could not only reduce costs but also improve the quality of care.  Much of the data cited in this post is sourced from the 2004 report entitled The Value of Healthcare Information Exchange and Interoperability.

Tracking Pandemics at the State, Local and Federal Level

State laws require providers and laboratories to report cases of certain diseases to local and state public health departments.  Nationally “notifiable” diseases are forwarded by the state agencies on to the Centers for Disease Control and Prevention (CDC).  Connections between the states and the CDC are electronic and highly automated.  However, the first mile between the providers and the local and state agencies is highly manual.  Providers typically submit data via phone, fax, hard copy forms or very basic B2B communications methods such as a web portal.  For larger provider groups operating in multiple regions, notifications to state health agencies become even more cumbersome.  The 50 US states maintain more than 100 different systems to collect data, each with its own communications mode. The most closely monitored “notifiable” diseases are frequently under-reported in the US.  Various studies conducted between 1970 and 1999 showed that only 79% of all STD, tuberculosis and AIDS cases were reported to public health agencies.  Reporting rates for other diseases were much lower, at 49%.  There are several reasons for the reporting challenges, but certainly one of the key issues is the ease with which the information can be transmitted to health authorities.  There is no question that the primitive communications methods used to collect provider data are a critical barrier to success.  However, even more problematic is the dependency upon overworked and understaffed provider personnel to take the time to consistently file the reports.

Electronic Health Records – Public Health Benefits

A better methodology for reporting on “notifiable” diseases would be to eliminate the need for human initiation altogether.  The process could be completely automated by connecting the health care providers’ Health Information Systems and Practice Management Systems, which contain the patient data, to public health and safety tracking systems.  However, connecting the tens of thousands of medical practices to the hundreds of different public health systems could prove quite an ambitious integration project.  A less complex and costly alternative would leverage the concept of Electronic Health Records (EHR).  The EHR would significantly simplify tracking of public health epidemics without the need for bespoke integration between various state agencies and each different medical provider.
The EHR provides a comprehensive set of information about each patient including demographics, medications, immunizations, allergies, physician notes, laboratory data, radiology reports and past medical history.  EHR information could be stored in a series of centralized repositories deployed around the country.  Each repository could contain the full medical records or just pointers to the locations of the records.  Triggers could be set up to automatically identify trends in data sets that might not otherwise be noticed, helping to provide an early warning system for potential disease outbreaks.  In the event of a pandemic or bioterrorist event, public health officials could easily access de-identified EHR data such as physician’s notes, patient demographics and medical history.  Without the dependency upon manual data entry, the latency of information flow could be reduced and the quality of information collected could be improved.  Administrative costs would also be reduced considerably: the average cost to send a report manually is $14, compared to only $0.03 electronically.  CITL estimated that the use of electronic data flow from providers and laboratories to public health agencies would reduce administrative costs by $195M annually.  CITL did not quantify the potential economic savings from early identification of pandemics and bioterrorist events, but there is no question that these could be in the billions of dollars.

B2B Interoperability and EHR

Of course, a key technology enabler for EHR is interoperability between the various health care providers and the corresponding state, local and federal agencies.  Medical data is transmitted between providers, payers and public agencies using a variety of B2B standards including DICOM, HL7, NCPDP and HIPAA-compliant EDI transactions.  EHRs could aggregate the available data related to prescriptions, claims, lab reports and radiology images into an electronic record.  Additional services could be layered onto the B2B integration framework, such as data quality checks to ensure the completeness of records and business activity monitoring to identify behavioral trends.

Another concept evangelized in the CITL report is the idea of a National Electronic Disease Surveillance System (NEDSS).  The NEDSS would collect data from a number of relevant sources outside of the health care system which could be useful for monitoring.  Examples might include 911 call analysis, veterinary clinic activity, OTC pharmacy sales, school absenteeism, health web-site traffic and retail sales of facial tissue and orange juice.  Such practices were deployed by the US Department of Defense and the Utah Department of Health during the Salt Lake City Olympics in 2002.  Such an effort would require integrating additional state and local agencies, educational institutions and retail chains electronically using B2B.
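The automated reporting idea described above amounts to a simple trigger: when a result matches a notifiable condition, a report is generated and sent electronically instead of waiting for busy provider staff to file it. Below is a minimal sketch of that trigger; the condition codes, field names and message format are invented for illustration and are not an actual HL7 or CDC feed.

```python
# Illustrative sketch of automated notifiable-disease reporting.
NOTIFIABLE_CONDITIONS = {"A15": "Tuberculosis", "U07": "Novel influenza A (H1N1)"}

COST_MANUAL, COST_ELECTRONIC = 14.00, 0.03   # per-report costs cited by CITL

def report_if_notifiable(lab_result: dict, send) -> None:
    """Fire a report to the state public health agency automatically,
    removing the dependency on manual initiation."""
    code = lab_result["diagnosis_code"]
    if code in NOTIFIABLE_CONDITIONS:
        send({
            "condition": NOTIFIABLE_CONDITIONS[code],
            "patient_zip": lab_result["patient_zip"],   # de-identified demographics
            "result_date": lab_result["result_date"],
        })
```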

Read More

BIAN, TWIST, SWIFT: Why Standards Matter

standards organizations

There is a great presentation that has circulated the web for the past few years called Shift Happens, which focuses on "the speed of change." It makes some great observations about population, technology and societal changes that have happened in a really short span of time. In fact, when you focus only on technology, the speed of change is staggering.

Someone once told me that "technology is anything that has been created since I was born." And to prove that point, every Fall someone sends me an email listing all the things that the newest class of college undergrads has never lived without, like Wi-Fi, GPS or iPods. Needless to say, every Fall I get a little depressed…but only a little…because, hey, at least I don't have millions of dollars invested in technology that is obsolete or soon to be obsolete. Many bankers have not been so lucky. Today, many banks' investments in technology are so far behind the needs of the marketplace for security, transparency and flexibility that they might not be able to afford to recover or compete.

One of the reasons standards organizations exist is to help mitigate the risks associated with the speed of change: not only the speed of change in regulations and processes, but also in technology. Many standards organizations are working to help banks keep pace with the speed of change while also helping them keep an eye on the investments they have made, are making and will make, so that at the end of the day millions aren't spent on technology that will be obsolete before the next annual report.

Articles abound about the banking industry's focus on replacing outdated and expensive legacy core systems in the quest to add new capabilities, ones that can address issues around risk, regulation and customer retention. Not surprisingly, the focus on economic stability has put replacement plans on hold for many, although these issues really can't wait for better days to be fixed. Which is one reason we should all be glad that the alphabet soup of standards organizations is still moving forward with strategies, road maps and updates that may help make life a little less complicated when credit starts flowing again.

For example, the Banking Industry Architecture Network (BIAN) announced a change to its mission statement and also that it will publish its first set of deliverables. BIAN's goal is to be the standard for service-oriented architecture (SOA) in the banking space, which should equate to a clearer understanding of the technology needs required for growth. The operative word, of course, is "should," because while BIAN is focused on architecture, it is not the only standards organization contributing to the technology conversation, and others may have a completely different take on the direction technology needs to take to prepare for change. Particularly when you consider that many standards organizations are made up almost exclusively of a single group of players from a given business segment, with a singular focus on their specific area of interest. This laser focus often leads to a less-than-ideal adoption rate, because the standard does not have a holistic value proposition that resonates across the organization.
Which is one reason that groups with more diverse memberships, like the Transaction Workflow Innovation Standards Team (TWIST), which helps corporates standardize platforms, matching & netting services, settlement and clearing services, and the integration of the relevant systems (i.e., ERP, payment and reporting systems), are equally important to the "speed of change" conversation. Since TWIST's membership includes the corporate sector as well as the banking and technology vendor segments, it approaches the concept of change differently than a group like the Society for Worldwide Interbank Financial Telecommunication (SWIFT), which until fairly recently was made up solely of bankers. TWIST, with its focus on the automation of corporate financial supply chains, looks at technology needs from the perspective of gaining interoperability by building on existing technology investments. By taking a modular approach to the adoption of its standards, TWIST lets adopters use its recommendations on good-practice workflows, message standards and data security how they want and when they wish.

This modular approach to implementation is in some ways very similar to our Managed Services approach to B2B integration. How so? I knew you were going to ask. Well, I won't do a sales pitch, because how much fun is that for anyone, but I will highlight a few bullets based on TWIST's game plan to show the similarities, overlap and complementary tactics, and you can decide for yourself.

"The TWIST standards aim to enable straight through processing (STP) from end-to-end of the three processes, irrespective of the way the processes are transacted, the service providers that are involved and the system infrastructure that is used. By standardizing:
- Information flows
- Business process
- Electronic communications (whether direct or indirect) between market participants and service providers
- Platforms, matching & netting services as well as settlement and clearing services, and the methods of integrating the relevant systems"

Our Managed Services offering aims to help businesses and banks connect with any corporate client or trading partner, regardless of location, size or B2B technical capabilities, by supporting:
- Information flows – Optimizing the flow and quality of technical data and information
- Business process – Performing all of the day-to-day management of a customer's B2B infrastructure, including systems-health monitoring, data backup, and network and database management
- Electronic communication – Providing a broad range of trading partner connectivity options including FTP, FTP/S, MQ Series, AS2, HTTP/S, XML, etc. (a rough sketch of this kind of multi-channel delivery appears at the end of this post)
- Integrating relevant systems – Delivering a team of experts who are proficient in SAP and Oracle B2B integration and have a deep knowledge of industry standards

Okay, hopefully you don't feel as if you've navigated a sales pitch, but as I said, there are similarities in the approaches, largely because both TWIST and OpenText are working to create a "win/win" environment: one that operates to meet the current and future needs of its customers and members. The promise of standards is similar to the promise inherent in a compelling managed services offering: to simplify the complex, create a repeatable methodology that can serve the current and future needs of the organization, and help contain costs by leveraging existing or past technology investments. Read more here.
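As promised above, here is a minimal, entirely hypothetical sketch of what "a broad range of connectivity options" can look like in practice: the same file delivered over two of the channels mentioned, FTP/S and HTTP/S, behind one common interface. Every hostname, credential and payload is made up, and a real managed-services platform adds certificate management, retries, audit trails and many more protocols.

```python
import ftplib
import io
import urllib.request

# Hypothetical endpoints throughout; the point is simply that one payload can be
# routed over whichever channel a given partner supports.

def send_via_ftps(payload: bytes, filename: str) -> None:
    """Deliver a file over FTP/S (explicit TLS), one of the 'classic' channels."""
    session = ftplib.FTP_TLS("ftps.partner-bank.example.com")
    session.login(user="corp_treasury", passwd="not-a-real-password")
    session.prot_p()  # encrypt the data channel as well as the control channel
    session.storbinary(f"STOR {filename}", io.BytesIO(payload))
    session.quit()

def send_via_https(payload: bytes, filename: str) -> None:
    """Deliver the same file over HTTP/S to a partner's upload endpoint."""
    request = urllib.request.Request(
        url=f"https://integration.partner-bank.example.com/upload/{filename}",
        data=payload,
        headers={"Content-Type": "application/xml"},
        method="PUT",
    )
    with urllib.request.urlopen(request) as response:
        response.read()  # a real integration would check the status and log the result

# One payment-status payload, routed to whichever channel the partner has adopted
CHANNELS = {"ftps": send_via_ftps, "https": send_via_https}
payload = b"<PaymentStatusReport>...</PaymentStatusReport>"
CHANNELS["https"](payload, "payment_status_20111003.xml")
```

The point of the modular shape is the same one TWIST makes about its standards: a partner can adopt whichever channel fits its existing investment, and the rest of the flow does not have to change.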

Read More

Bisync 2020 – The Case for FCC Regulation of B2B Communications

B2B communications

In my last post, I commented on the continued use of legacy communications protocols such as async, bisync and X.25 for B2B e-commerce. I have never seen an official report on the use of legacy communications in B2B integration, but I am confident that more than 10,000 companies are still using legacy dial-up connections. In this post, I want to continue the discussion by exploring the implications of the business community's continued use of these ancient technologies. I also want to explore the roles of commercial entities and government regulators in phasing out these legacy communications protocols.

Who and Why?

The primary users of async, bisync and X.25 are small businesses that established EDI programs over a decade ago and see no benefit in upgrading to newer technology. These companies are still running EDI software on a Windows 95-based PC with a dial-up connection to their VAN provider. Many of the companies running the older technology do not even know what async or bisync is. They just know that the PC sitting in the corner automatically phones home every night and magically retrieves all of their purchase orders. Why don't these customers upgrade to newer technology? Most small businesses lack the resources, budget and technology expertise to upgrade to a newer B2B communications protocol such as AS2 or FTP. Furthermore, most are reluctant to make changes to their EDI configuration for fear of disrupting electronic commerce with customers. Would you want to be the EDI manager responsible for losing a $2M order from Wal-Mart during a cutover from bisync to AS2?

Challenges of Legacy Communications in B2B

Does it matter that so many businesses are still using legacy communications technology for B2B? That is a subject of significant debate. But I have listed below a few disadvantages of such pervasive use of the older technologies:
- Business Disruption – In the early days of EDI, communications functionality was bundled into desktop software packages. Many of the developers of these early EDI translator packages no longer offer support for the older software. Consequently, if the older software breaks, there is a significant risk that users will not be able to quickly remedy the problem. The result could be a disruption to order flow with trading partners, which could have a ripple effect across the value chain.
- Limited Functionality – Users of legacy communications technology are only able to conduct the types of B2B transactions supported by the original EDI software package. In most cases, the older software does not support the newer XML schemas introduced in recent years to automate a wider range of business processes. Consequently, the ability to develop more collaborative business processes between buyers and suppliers is constrained by legacy technology.
- Outdated Information – Users of legacy communications tend to operate in an off-line batch mode. EDI documents are exchanged with the VAN once a day or sometimes once a week. Consequently, these companies receive updates to orders, forecasts, shipments and bank balances only once a day or once a week. The overall supply chain becomes less agile when companies cannot exchange order, logistics, inventory and payment data in near-real time to respond to changing market conditions (the sketch below shows the kind of always-on exchange that newer protocols such as AS2 make possible).
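For contrast with the nightly dial-up batch, here is a minimal, hypothetical sketch of pushing an X12 interchange to a trading partner over an AS2-style HTTPS connection. The endpoint, identifiers and interchange are all made up, and real AS2 (RFC 4130) typically also signs and encrypts the payload as S/MIME and exchanges MDN receipts, so a production implementation would use a proper AS2 package rather than raw HTTP calls.

```python
import email.utils
import urllib.request

# Illustrative only: hypothetical endpoint, AS2 identifiers and interchange content.
x12_interchange = (
    b"ISA*00*          *00*          *ZZ*SUPPLIER       *ZZ*RETAILER       "
    b"*111003*1200*U*00401*000000001*0*P*>~GS*PO*SUPPLIER*RETAILER*..."
)

request = urllib.request.Request(
    url="https://as2.retailer.example.com/as2/inbound",
    data=x12_interchange,
    method="POST",
    headers={
        "Content-Type": "application/edi-x12",
        "AS2-Version": "1.0",
        "AS2-From": "SUPPLIER-AS2-ID",
        "AS2-To": "RETAILER-AS2-ID",
        "Message-ID": email.utils.make_msgid(),
        "Disposition-Notification-To": "edi@supplier.example.com",
    },
)

# The document moves the moment it is ready, instead of waiting for tonight's dial-up batch.
with urllib.request.urlopen(request) as response:
    print("Partner responded with HTTP status", response.status)
```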
Why Consider a Regulatory Approach?

There have been several attempts by industry to force technology upgrades, each of which has failed:
- Lower-Cost Substitutes – The advent of the Internet in the late 1990s introduced a number of substitute technologies that small businesses could use for EDI, such as AS2, FTP and web forms. Despite aggressive sales efforts by vendors, there remains a significant population of small businesses unwilling to upgrade their B2B technology.
- Product End of Life – Commercial service providers such as the major telecommunications carriers have discontinued support for legacy protocols such as X.25, async and bisync. However, carrier efforts have been handicapped by their large customers, which have trading partners still using the legacy protocols. These corporations are major buyers of telecom services and use their purchasing power to negotiate extensions to the end-of-life dates for the legacy B2B protocols.
- Pricing Deterrents – A number of VAN providers have attempted to raise the prices of async and bisync dial-up services in an attempt to encourage customers to transition to more modern communications protocols. The new pricing models met with considerable outrage from the end-user community, and ultimately the service providers were forced to abandon the pricing policies and extend indefinite support for the older communications protocols.

With vendor-led efforts to drive technology upgrades failing, it seems that the only remaining alternative might be public policy. Or we could accept the fact that bisync will be alive and well in 2015, 2020 or maybe even 2050. Should the FCC impose an end-of-life date for legacy B2B communications protocols? Should there be government subsidies to enable upgrades to AS2 and FTP? Post your thoughts and let me know what you think.

Read More