Compliance

E-Invoicing Will See Increased Government Adoption in 2012

This past weekend I shared my thoughts on the likely changes to global trade instruments in 2012.  Whilst Bank Payment Obligations and Supply Chain Finance will be popular topics in 2012, I think the fastest growing area of the financial supply chain will continue to be e-invoicing.  However, the growth (on a percentage basis) in the commercial sector may be eclipsed by adoption in the public sector this year.

Read More

Impacting Business with Enterprise Architecture: What the Future Holds for EA Efforts

Submitted by: Process Matters Blogger on: January 5, 2012

Cliché as it may be, I can’t stop myself from turning the page on the calendar of a new year and turning my mind to my personal goals for the year. Naturally, many organizations have a tendency to follow suit. Bolstered by this spirit of the possible, organizations begin to envision themselves achieving their goals – to rethink the way their business operates with renewed desire to drive innovation, increase speed to market and dramatically improve customer service. To bring life to those enterprise aspirations, business and technology leaders should look to 2012 as a year to continue improving their collaborative efforts to achieve business change. There are no indications that the new year will bring any relief from the increasing pace of technology and business change, nor the increasing demands from more educated and socially connected customers. 2011 continued the trend toward business driving IT, and 2012 offers the opportunity to make this shift pay off for organizations. Many organizations that are focused on bridging the gap between business and IT groups will achieve far more benefits if they fuse these two groups into business teams working collaboratively to drive transformations. So, what does 2012 have in store for EA teams? In its yearly series, Gartner Inc. recently predicted that many organizations will begin to leverage EA tools to drive business value and impact. According to the report, “Gartner Predicts: Opportunities for EA to Lead Business Transformation in Turbulent Times,” December 1, 2011, Phillip Allega, Betsy Burton, et al., “EA practitioners will begin to shift their focus to begin to think about their role differently and, in many cases, employ a new way of working.” With only 40% of EA programs worldwide reporting to IT, EA’s focus must shift from IT and operations to delivery of demonstrable business value. As I read through the report, I found the following assumptions particularly interesting.
The managed diversity approach – “By 2015, 25% of Global 1000 organizations will produce cohesive EA artifacts that support the diversity of complex business ecosystems.” When undergoing a business transformation initiative, organizations must account for global operational diversity. According to the report, “the managed diversity style defines choices or options for what projects or customers can leverage without defining only strict, rigid standards. Managed diversity does not mean that there are no standards, but rather that EA planning achieves a balance between the need for a set of standards that help control costs and the need for a diversity of solutions to increase innovation, business growth and competitive advantage across locations that the organization operates in.” Properly executed, EA can help organizations strike the delicate balance of identifying and propagating best practices, maximizing technology investments, ensuring compliance with local regulatory bodies and reducing risk, while retaining the flexibility to adapt business systems to compete in global markets. With flexible but defined guardrails, organizations typically see a significant increase in their speed of execution when teams are empowered to leverage the elements they need with the guidance to avoid critical mistakes. Working together – “By year-end 2014, 50% of Global 1000 organizations will support EA as a collaborative business and IT effort.” Successful organizations have already started moving their EA teams out of IT and into the business. Gartner’s survey results indicate that while 68% of EA programs in the US report to the IT organization, this picture is already considerably different worldwide. China, which primarily looks to EA for business transformation initiatives, reports to business leadership 76% of the time. This shift in reporting relationships naturally drives changes in the focus and composition of project teams.
The complementary skills, perspectives and insights of enterprise architects and business people can combine to produce dramatically better results. Organizations cannot drive business growth without carefully selecting the members of the project team. I particularly appreciated Gartner’s caution: “do not assume that just because business leaders are collaborating and engaging in EA, the effort will be ‘business strategy driven.’” Executive leadership should be mindful that they have defined a clear business strategy that includes actionable directives to provide the context in which these collaborative teams can drive execution. Increased focus on the decision process – “Through year-end 2014, 60% of organizations will continue to focus EA on assurance, rather than governance.” According to the report, there are two key challenges when implementing EA governance: 1) practitioners lack training and a critical understanding of the topic, and 2) they focus exclusively on control and assurance. This is a problem because EA practitioners often lack an understanding of how the business uses information to make business decisions. Quite honestly, it is easier to focus on control and standardization because this space is more comfortable for individuals with a technical background. Increasing collaboration between IT and business can be part of the solution, but only if architects dig deeper to understand the decision process, the relative value of investment priorities in the context of the business strategy, and which standards provide value to the organization. This level of understanding requires more than collaboration between the groups. It requires a true respect for, and commitment to understanding, how the organization defines and drives business value and how architects can then become part of driving that change. What is your take on Gartner’s predictions for EA this year? Do you see your EA team driving or reacting to these predicted trends? Will this be your year to deliver strategic business value? Leave your comments below and we can discuss.

Read More

How do you go about expanding your B2B platform into a new market? – Part2

In my previous blog entry I discussed some of the more general considerations for extending a B2B platform into a new or emerging market. In this blog I would like to continue by discussing some of the more technical considerations that have to be taken into account. From understanding the technical capabilities of your trading partner community through to deploying the right B2B tools, ensuring that you have 100% trading partner participation is crucial to the smooth running of your supply chain. So what should you consider? Understand the technical capabilities of your supplier – When connecting with a trading partner for the first time it is worth undertaking a B2B connectivity and technology audit. For example: How does the trading partner connect to the internet at the moment? What type of business applications do they currently use? Do they have any internal resources to look after their IT infrastructure? If they use business applications, are these behind the firewall or hosted and delivered as-a-service? Do they connect electronically with any other customers, and if so, how do they achieve this? The sooner you can understand the technical landscape of your trading partner, the sooner you can start planning to support their needs and deploy your B2B/IT infrastructure. You also have to bear in mind that the type of document they send could depend on the communications infrastructure available to them; for example, a small supplier in Thailand might only have a slow dial-up connection. To ensure 100% participation by your trading partner community you must be able to embrace each and every need of your trading partners, no matter how small they might be or what level of IT adoption they might have. Are you deploying the right B2B solution for each user?
– When dealing with trading partners in emerging markets it is important to understand the current B2B capabilities (if any) of the trading partners you will be dealing with. If you are working with a trading partner in India or China, for example, there is a high probability that they are using paper-based or even Microsoft Excel-based ways of exchanging information with their customers. If they use paper, then to ensure 100% integration with your B2B processes you may need to offer a web-form-based method for these particular suppliers to submit information electronically to your B2B platform. Alternatively, Microsoft Excel is one of the most commonly used business applications in China, so you might want to think about how you can exchange these types of files and, more importantly, utilise the information contained within a spreadsheet. If you can find a way of integrating spreadsheet content into your B2B platform, it will minimise the re-keying of information and hence minimise errors entering your business systems. So what about back end integration? – Another reason for ensuring that your trading partner community is 100% enabled is to ensure that externally sourced information can enter other back end business systems, such as ERP platforms, as seamlessly as possible. For example, if you are running SAP, will your trading partner be able to send you an SAP IDOC file directly, or will it need to be converted by your B2B provider or internally by your own resources? Which accounts package are they using, and will you be able to integrate with it? Given that your production lines or equipment may be waiting to receive B2B related information from your trading partner, it is important to consider back end integration and how these trading partners will connect to your back end IT infrastructure.
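To make the spreadsheet-integration idea concrete, here is a minimal sketch of translating supplier spreadsheet rows into the canonical records a B2B platform might expect. It assumes the supplier's Excel file has been exported to CSV, and the column names and target field names are hypothetical stand-ins, not part of any real GXS interface:

```python
import csv
import io

# Hypothetical mapping from a supplier's spreadsheet columns to the
# canonical field names used by the B2B platform. Both sides of this
# mapping are illustrative only.
COLUMN_MAP = {
    "PO No":      "purchase_order_number",
    "Item Code":  "sku",
    "Qty":        "quantity",
    "Unit Price": "unit_price",
}

def rows_to_canonical(csv_text):
    """Translate supplier spreadsheet rows into canonical B2B records,
    so the data never has to be re-keyed by hand."""
    records = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        record = {field: row[col].strip() for col, field in COLUMN_MAP.items()}
        # Basic type validation stops bad data reaching back end systems.
        record["quantity"] = int(record["quantity"])
        record["unit_price"] = float(record["unit_price"])
        records.append(record)
    return records

sample = "PO No,Item Code,Qty,Unit Price\n4501234,ABC-99,10,2.50\n"
print(rows_to_canonical(sample))
```

Even a small translation layer like this captures the point of the paragraph above: once spreadsheet content is mapped and validated automatically, re-keying (and the errors that come with it) disappears.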
Successfully integrating with back office systems will bring additional benefits to your B2B platform, including reducing the rework of incorrect data and speeding up the flow of information across your extended enterprise. Have you thought about extending supply chain visibility? – If you are sourcing goods from a trading partner in an emerging market, then as well as ensuring that business documents can be exchanged electronically, it is important to ensure that the physical supply chain can be monitored end-to-end. Key to this will be the ability to monitor transactions across borders, customs agencies and multi-modal methods of logistics and transportation. If goods are delayed at a country border, then the customer ordering the goods needs to be made aware of the situation. Being able to tell your customers when their products or goods will be delivered can often be a key measure of customer satisfaction and a source of competitive advantage. So in addition to conducting a technology audit, it may be worthwhile finding out which logistics partners and countries the supplier already does business with. Have you thought about how your business might expand or contract? – In these uncertain economic times it is important that your B2B platform can scale up or down depending on the exact needs of your business. If you have five Chinese suppliers on-boarded to your B2B platform, what happens if you need to onboard a further ten trading partners in a short time frame? Would you be able to scale up your B2B infrastructure accordingly? Would you be able to support the inevitable changes to your existing B2B platform in order to accommodate these new trading partners? Does your existing B2B infrastructure and accompanying service contract have the flexibility to accommodate a volume increase in B2B traffic?
Factoring in likely changes to your B2B infrastructure at the planning stage, if you know what they are likely to be, can save you a lot of time in the long run. GXS has been helping companies globalise their B2B infrastructures for many years, from onboarding trading partners in China to connecting to a new third party logistics provider. GXS offers a number of B2B tools for onboarding suppliers in remote locations; please click here to find out more, or click here if you would like to learn more about our community onboarding and enablement offerings.

Read More

Why the Time is Right for Manufacturers to Adopt e-Invoicing

In previous blog entries I have often described the manufacturing industry as being truly global in nature. The industry has to work across different country borders, languages and time zones, and customers expect their goods to be delivered on time irrespective of where a manufacturing company’s production facilities are based. One of the key drivers for electronic invoicing, or e-Invoicing, adoption across Europe has been the country governments. They have been working feverishly to get suppliers to their public sector organisations to automate the way in which they exchange invoices, both in country and across other member countries of the European Union. As well as the significant cost savings this can bring to a manufacturer, it also introduces green benefits by helping to reduce the amount of paper based invoices that flow across supply chains. Improved cash flow was highlighted as one of the key factors in the survival of many smaller suppliers during the most recent economic downturn. Many smaller suppliers use paper based methods for invoicing their customers, but to increase speed of payment they should be considering the adoption of e-Invoicing. For the purposes of this blog entry, let me describe some of the issues that are driving many companies in Europe to adopt an e-Invoicing strategy. The first challenge facing manufacturers in Europe is that they have to operate across the 27 different countries that make up the European Union. Each country applies its own tax rate to purchases, each has its own requirements for applying digital signatures to electronic invoices and, as electronic invoices are required to be archived, yes, you guessed it, each country has its own rules on how long electronic invoices must be archived for. With these three issues alone, it is no wonder electronic invoicing has taken some time to gain adoption across Europe.
Of course, European based manufacturers may be using contract manufacturers in emerging markets such as Brazil or Mexico, but even these countries have their own ways of processing electronic invoices. So how do manufacturers get round these issues? The analyst firm Gartner compiled a report in 2009 which looked at European specific e-Invoicing regulations. The European Association of Corporate Treasurers identified that the average processing cost of a paper based invoice in Europe was around 30 euros, and that by using e-Invoicing a cost saving of 80% is possible. Confirming this, other case studies highlighted that e-Invoicing has reduced the cost of processing an invoice to less than 5 euros. e-Invoicing offers many other benefits, including improvements in accounts payable processes by reducing invoice processing time and minimising manual intervention, which leads to a reduction in operating expense. This factor alone prompts some companies to start e-Invoicing projects and many others to begin evaluating them. As manufacturing companies source from many different suppliers across Europe and indeed the rest of the world, e-Invoicing can be deceptively challenging, which is why it is important to choose the right vendor to help implement an e-Invoicing strategy at either a European or a global level. Implementing an e-Invoicing strategy may be a daunting challenge, especially as it will affect internal business processes, mutual agreements among business partners, financial transactions, taxes and legal compliance and, of course, the associated IT infrastructure that supports all of this. Each company will implement an e-Invoicing strategy in a slightly different way because of how internal business processes must be adhered to.
In addition, each implementation will have to cater for a diverse range of users, for example IT staff, payments/accounting staff and, of course, tax auditors. Each of these groups will have varying levels of IT understanding, so it is important that users are shielded from the complexities of such systems. For this reason many e-Invoicing solutions in Europe are delivered as a service, not just so that users can use the system with ease, but to ensure that the system is deployed quickly and smoothly so that tangible benefits can be realised in a short space of time. The diagram below highlights typical order-to-pay and order-to-cash processes. In 2001 the European Commission implemented a directive (2001/115/EC) to provide a way for countries in the EU to simplify, modernise and harmonise the conditions laid down for invoicing in respect of VAT. A key goal of the directive was to promote the efficient cross-border creation, transmission, acceptance, storage and retrieval of invoices. In 2005, Denmark became the first European country to make e-Invoicing mandatory for the public sector. In 2012 Norway will become the latest European country to adopt e-Invoicing, as it will be asking all SMEs supplying the Norwegian public sector to send invoices electronically. Further information on this can be found on the Pan-European Public Procurement Online (PEPPOL) portal by clicking here. The Nordic countries have traditionally been early adopters of e-Business technologies, helped by some of the best broadband and communications infrastructure in Europe. The e-Invoicing directive requires that invoicing parties guarantee the authenticity and integrity of e-Invoices in transport and in storage, through the use of e-signatures, EDI or other means.
Some member states are more prescriptive than others, so it is up to each government tax office to rule whether a specific method of guaranteeing authenticity and integrity is acceptable. Many government tax offices around the world simply do not have the necessary knowledge to do this. Companies that implement an e-Invoicing solution can realise significant benefits: more streamlined payment processes, reduced human effort to process invoices, and a complete set of data that can automatically reconcile the invoice with the goods or services received, typically via integration with an ERP package such as SAP or Oracle. e-Invoicing provides benefits for senders as well as receivers, for example improved customer satisfaction, reduced admin costs in credit collection, more effective capital management and cash-flow control. So why is e-Invoicing starting to go mainstream now? There is strong user demand due to the significant cost savings that can be realised; e-Invoicing solutions, such as those offered by GXS, are increasingly mature and effective; more governments are mandating e-Invoicing, especially in the EU; and more and more user cases are being released that highlight the benefits realised by companies today. Manufacturers have just emerged from one of the most severe economic downturns of recent years and, if the news is to be believed, we are not out of the woods yet, with a potential double dip recession looming in the near future. So how do manufacturers go about implementing an e-Invoicing solution whilst at the same time remaining focused on their core competency, which of course is manufacturing goods or products?
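To illustrate the integrity half of the authenticity-and-integrity requirement discussed above, the sketch below shows how a receiver can detect whether an invoice was altered in transit or storage. Note this is a simplification: EU-compliant schemes typically rely on advanced electronic signatures backed by certificates, whereas this example uses an HMAC over the invoice bytes with a hypothetical shared key, which is closer in spirit to EDI-style integrity controls:

```python
import hmac
import hashlib

# Hypothetical key agreed between the trading partners; a real deployment
# would use certificate-based electronic signatures instead.
SHARED_KEY = b"demo-key-agreed-between-partners"

def seal_invoice(invoice_bytes):
    """Sender: compute a tag that travels (and is archived) with the invoice."""
    return hmac.new(SHARED_KEY, invoice_bytes, hashlib.sha256).hexdigest()

def verify_invoice(invoice_bytes, tag):
    """Receiver/auditor: confirm the invoice was not altered after sealing."""
    return hmac.compare_digest(seal_invoice(invoice_bytes), tag)

invoice = b"<Invoice><Total currency='EUR'>100.00</Total></Invoice>"
tag = seal_invoice(invoice)
print(verify_invoice(invoice, tag))                 # unaltered invoice
print(verify_invoice(invoice + b" tampered", tag))  # altered invoice
```

Because the tag is archived alongside the invoice, the same check also supports the multi-year archiving rules mentioned earlier: an auditor can re-verify integrity long after the transaction took place.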
How do manufacturers navigate their way through implementing e-Invoicing, let alone ensure that invoices have the correct e-signature applied and that e-Invoices are archived for the correct period of time, depending on which country they are doing business with? To address these issues and to help companies get a better understanding of how e-Invoicing solutions should be implemented, GXS will shortly be launching a new microsite called e-Invoicing Basics. This new educational resource provides an introduction to the area of e-Invoicing: what it consists of, how it should be deployed and the business benefits it can bring to a company. You can get a preview of the new microsite by clicking here. The site is very similar in nature to our popular EDI Basics microsite, which was originally launched nearly five years ago. In a future blog entry I will take a deeper look at the area of e-Invoicing and how it is being deployed across multi-country business environments.

Read More

SWIFT and Sibos 2011: Financial Services Growth in 2012 and Beyond

Amid a week in which Moody’s downgraded three top US banks and the global stock market fell into bear territory, SWIFT held its annual Sibos conference in Toronto, Ontario, Canada. Billed as the world’s premier financial services event, this year’s Sibos brought together more than 7,000 leaders from financial institutions, market infrastructures, multinational corporations and technology partners from around the globe. Facilitated and organized by SWIFT, the global provider of secure messaging services, this year’s conference program focused on four big topics: regulation revisited, technology, the changing landscape and new expectations. A number of “big ideas” were discussed during the plenary sessions, including pursuing growth, emerging markets and the impact of regulation. In the opening plenary, John Havens, President and Chief Operating Officer of Citigroup, stated that to survive higher capital requirements, lower leverage rates and higher funding levels, the banking industry will have to “fundamentally restructure in order to recapture much of that lost revenue.” Banks will need to become leaner, fitter and more ruthless in how they manage costs and pursue revenues. However, Havens saw opportunity for banks to help corporate customers navigate the new macro-economic landscape and understand and access new markets. In a plenary session dedicated to “Where’s the Growth in 2012 and Beyond,” Tim Keaney, Vice Chairman & CEO Asset Servicing, BNY Mellon, said that he now spends as much of his technology budget on compliance as he spent three years ago on developing new products and services and accessing new markets for clients. Keaney admitted it is challenging to explain to customers that regulation is raising costs and increasing lead times. Customers understand that the cost of doing business is higher, but they expect associated benefits from increased regulation. The speakers advocated proactive client engagement.
Keaney said “When there is a change affecting our clients, it is an opportunity. Clients are very willing to sit down and talk about what we can do differently to help them–usually that will result in a good product idea that clients are willing to pay for.” Paul Simpson, Head of Global Transaction Services, Bank of America Merrill Lynch (BAML), concurred, stating “Just because I’ve served clients in a particular way for 15 years, have I really stepped back in the last two years and asked whether it’s the optimal way to serve the client going forward?” Simpson also discussed the huge growth he’s seen at BAML in middle market companies expanding into high growth countries. He said that banks need to step up and be more supportive of this global growth. M.D. Mallya, Chairman, Indian Banks’ Association stated that the number of emerging market companies in the global top 1000 used to be 10 to 15, and now numbers into the hundreds. Mallya also talked about this shift from West to East: “Money is flowing east, follow the money.” John Coverdale, Group General Manager, Head of Global Transaction Banking, HSBC Holdings plc, pointed out that the emerging markets are not dealing with legacy systems, whether regulatory or infrastructure based. This provides a situation where there will be regulatory arbitrage (and advantage). One speaker talked about how “profit challenged” institutions see opportunity in outsourcing non-core functions to specialists—not simply “taking somebody else’s mess for less.” In a separate session, David Robertson, Partner, Treasury Strategies, voiced a similar view: “Our clients of all sizes are globalizing profoundly. In response, banks and other providers are expanding their footprint, but they’re also seeking deeper and more comprehensive partnerships.” How do these trends impact transaction banking technology? Some emerging economies are growing three times as fast as the US and EMEA. 
Large corporate and middle market companies expanding into these markets require sophisticated treasury technology and associated bank integration solutions to manage cash flows, working capital and liquidity. Technology firms able to serve their clients across international footprints, whether the client is a financial institution or a corporation, can deliver global solutions with local resources and knowledge.

Read More

How to Get the US Treasury to Pay Its Bills Faster

With the entire world focused on the US debt ceiling debate over the past few weeks, many suppliers have become increasingly nervous about the federal government’s ability to pay its bills. Fortunately, three weeks ago the US Treasury announced a new program which will enable suppliers to be paid even faster than before. And the program does not involve raising taxes or cutting social programs. By the end of 2012 the Treasury Department will implement the Internet Payment Platform (IPP).

Read More

Insurance Technology: Time to Get Your Head in the Clouds

As the spring 2011 conference season winds down, I attended the ACORD/LOMA Insurance Systems Forum in San Diego, CA. ACORD (Association for Cooperative Operations Research and Development) is a global, nonprofit standards development organization serving the insurance industry. LOMA (Life Office Management Association) provides training and education for insurance professionals worldwide. The event is billed as the premier business and technology event for insurance professionals. One of my goals at ACORD/LOMA was to better understand cloud computing in the insurance industry. Several sessions touched on cloud and Software-as-a-Service (SaaS). One of the most interesting was “Cloud Computing for Insurers: Time to Get Your Head in the Clouds” by Bob Hirsch, Director of Technology Strategy and Architecture, Deloitte Consulting LLP. Bob provided some interesting thoughts on why cloud isn’t more prevalent in insurance. One reason is that cloud vendors have been slow to meet the regulatory demands of insurance. Another is that vendors are not in the “core” space; most cloud implementations are at the “edge for specific workloads.” Insurance firms also have concerns about data loss, security and privacy, audit and assurance, backup and disaster recovery, vendor “lock in” and IT organizational readiness. Bob described vendor “lock in” as the inability to easily migrate your company’s information from the cloud provider’s data center to your own if you decide to bring processing back in-house. Bob suggested that with quality datasets, computing advances and maturing tools, analytics could become a strategic cornerstone of the enterprise. As an example, he talked about the cost savings from moving volatile computing needs to the cloud. Bob explained that insurance companies need to run stochastic models each quarter to estimate risk. Large insurers are running grids of 2,500 nodes and growing for this type of computing.
Running the models can take 24 to 48 hours, but the rest of the time the servers sit idle. Bob stated that current grid systems can be modified to be cloud aware and “burst” capacity to clouds as needed, by storing the grid image in the cloud and deploying it across servers during periods of peak demand. Bob also walked through a cost/benefit analysis of Monte Carlo simulations for hedge funds, which have limited in-house IT resources. The analysis showed in-house monthly costs of $14,280 vs. $6,930 for cloud, a 51% saving. For the moment, Bob said, smaller insurance firms are ahead of larger ones in using cloud-based applications. This is because insurance systems are very fragmented within larger organizations, which are slow to consolidate systems across the enterprise.
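The savings arithmetic from that analysis is easy to reproduce; the dollar figures below are the ones Bob quoted, and the helper function is just an illustration:

```python
def cloud_savings(in_house_monthly, cloud_monthly):
    """Percentage saved by moving a volatile workload to the cloud."""
    saved = in_house_monthly - cloud_monthly
    return round(100 * saved / in_house_monthly)

# Figures from the Deloitte Monte Carlo cost/benefit example cited above:
# $14,280/month in-house vs. $6,930/month in the cloud.
print(cloud_savings(14280, 6930))  # → 51
```

The 51% figure holds only while the workload stays "bursty"; a grid that is busy most of the month would narrow the gap considerably.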

Read More

How ENS Helps to Secure Supply Chain Shipments into the European Union

For many years, customs agencies around the world first learned of an ocean based shipment’s existence as it approached a port. This notification came in the form of a manifest which was prepared and filed by the carrier of the goods. These manifests often contained very vague descriptions of the contents of a shipment, and a more detailed customs declaration was sent by the importer at a later date. This process was in place for years, but over the past ten years heightened security tensions around the world have led many countries to tighten up the screening of all imported goods. The United States was one of the first countries to tighten its import procedures, with the introduction of the Advanced Trade Data Initiative, which was then superseded by the 10+2 compliance procedure in early 2010. Ten pieces of information about the shipment are provided by the importer and two by the carrier. This information allows U.S. Customs to pre-screen shipments and make sure that goods are safe for import. The more formal name of this import procedure is the Importer Security Filing (ISF), and my colleague Pradheep Sampath wrote a blog on this subject just after it was introduced in early 2010. Other countries have been busy trying to replicate the success of the 10+2 system, and in January 2011 the member countries of the European Union introduced a similar pre-screening process called the Entry Summary Declaration, or ENS for short. Each of the 27 member countries making up the EU has been actively establishing its program, albeit on slightly different time schedules and in some cases under different names. For example, the UK’s version of ENS is called the Import Control System (ICS) and was introduced on 2nd November 2010. The main differences between the North American ISF and the EU’s ENS are twofold. Firstly, unlike ISF, which is filed by the importer and the carrier, the carrier is solely responsible for filing the EU’s ENS declaration.
Secondly, unlike the 12 elements contained in ISF, ENS has nearly twice as many data elements. As you can imagine, getting 27 different countries to agree on what information should be required was not easy, but an agreement was reached, and all shipments into the European countries now require the following 22 pieces of information to be sent to the port of entry before the shipment leaves its point of origin:

1. Seller/Consignor (EORI #)
2. Buyer/Consignee (EORI #)
3. House BL Number
4. Master BL Number
5. Carrier
6. Person Entering the Filing
7. Notify Party
8. Country of Origin
9. At least the first four digits of the HTS Number (Commodity Harmonized Tariff Schedule of the EU)
10. Place of Loading Location
11. First Port of Entry in EU
12. Description of Goods (not required if a four or six digit HTS code is provided)
13. Packaging Type Code
14. Number of Packages
15. Shipment Marks and Numbers
16. Container Number
17. Container Seal Number
18. Gross Weight in Kilos
19. UN Dangerous Goods Code
20. Transportation Method of Payment Code
21. Date of Arrival at First Port in EU
22. Declaration Date

Following the introduction of ISF in North America, U.S. Customs and Border Protection estimated that as of July 2010 nearly 80% of importers were ISF compliant, and this figure is likely to be even higher now. Further information on the ISF procedure is available for download here. There are two significant benefits that customs agencies around the world are seeing from the introduction of ISF, ENS and other shipping notification processes. Firstly, the submission of a standard set of information to support an imported consignment of goods means that more accurate information about consignments is being used across the supply chain. Secondly, with shipping information having to be sent ahead of time, preferably electronically via EDI, supply chains are becoming more secure, which is helping, for example, to reduce the amount of counterfeit goods entering western economies.
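Because every one of the elements above must be filed before the shipment leaves its point of origin, an ENS filing is in essence a completeness check, and a carrier's filing system can validate a draft declaration the same way. A minimal sketch, using shortened, illustrative stand-ins for a subset of the 22 elements listed above:

```python
# Shortened, hypothetical field names standing in for a subset of the
# 22 ENS data elements; a real filing would carry all of them.
REQUIRED_ENS_FIELDS = {
    "consignor_eori", "consignee_eori", "house_bl", "master_bl",
    "carrier", "country_of_origin", "hts_code", "first_port_of_entry",
    "gross_weight_kg", "declaration_date",
}

def missing_ens_fields(declaration):
    """Return the required fields that are absent or empty, so the
    declaration can be corrected before the shipment departs."""
    return sorted(f for f in REQUIRED_ENS_FIELDS if not declaration.get(f))

draft = {"consignor_eori": "GB123456789000", "carrier": "ACME Lines",
         "hts_code": "8471"}
print(missing_ens_fields(draft))
```

Catching an incomplete declaration at origin is far cheaper than having the consignment held at the first EU port of entry.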
If any counterfeit goods are found, the additional information submitted via ISF or ENS makes it much easier to track down where they originated. Further information about ENS can be found at the European Customs Portal, and I will discuss how GXS can help improve the visibility of global shipments in a future blog entry.
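The 22 ENS data elements above lend themselves to a simple pre-submission completeness check. The sketch below is illustrative only: the field names are simplified stand-ins, not the official EU schema, and the one conditional rule shown is the goods-description/HTS trade-off described in the list.

```python
# Illustrative pre-submission check for an ENS filing.
# Field names are simplified stand-ins, not the official EU schema.

REQUIRED_FIELDS = {
    "consignor_eori", "consignee_eori", "house_bl", "master_bl",
    "carrier", "filer", "notify_party", "country_of_origin",
    "place_of_loading", "first_port_of_entry", "packaging_type_code",
    "package_count", "marks_and_numbers", "container_number",
    "container_seal", "gross_weight_kg", "transport_payment_code",
    "arrival_date", "declaration_date",
}

def validate_ens(filing: dict) -> list:
    """Return a list of problems; an empty list means the filing looks complete."""
    problems = [f"missing: {f}" for f in sorted(REQUIRED_FIELDS - filing.keys())]
    # Goods description may be omitted only if a 4+ digit HTS code is given
    hts = str(filing.get("hts_code", ""))
    if len(hts) < 4 and not filing.get("goods_description"):
        problems.append("need goods_description or a 4+ digit hts_code")
    return problems
```

A carrier's filing system would run a check like this before the shipment leaves its point of origin, since a rejected ENS can hold cargo at the first EU port of entry.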

Read More

CPSIA – The Certificate Exchange Challenge

The Consumer Product Safety Improvement Act (CPSIA) debate is heating up once again as another key deadline approaches in February.  Back in December 2009, the CPSC decided to extend a stay on certification and third-party testing for children's products subject to lead content limits until February 10, 2011. Under this decision, products were still required to meet the limits for lead composition, but the requirement to demonstrate certification and third-party testing was deferred to apply only to products manufactured after February 10, 2011.  There is a growing debate about whether the deadlines should be extended further.

Read More

The Periodic Table of the Supply Chain

In the past decade a new wave of regulations has been passed in the areas of greenhouse gas emissions, waste recycling, rare earth minerals and consumer product safety, and these are changing product design and supply chain management strategies.  At the current pace of regulation, one will soon need a PhD in chemistry and a law degree to manage supply chain operations.  Below is a list of some of the elements on the watch list:

Read More

How the Military Health System Supports Our Veterans with Electronic Health Records

Tomorrow is Veterans Day in the US, so I thought a post about the military was appropriate.  This morning, I had the opportunity to attend an excellent session at the WEDI Fall Conference on how the Military Health System (MHS) uses Electronic Health Records (EHR).  The speaker was Nancy Orvis from the Office of the Assistant Secretary of Defense.  I was surprised to learn how advanced the military's EHR system is compared to the commercial sector in the US.

Read More

E-Prescription Networks – Making Health Care Less Painful

One of the most annoying parts of the US health care system is all the paperwork you have to fill out.  Every time I visit a new provider, I have to complete a whole series of forms answering questions about my demographics, medical history and current prescriptions.  Even when you visit the same provider consistently, you are still asked a series of repetitive questions.  For example, every time I visit my primary care physician, both the nurse who checks my vitals and the physician who consults me ask for a list of current medications.  If I fail to list one (e.g. the Ambien Rx I requested nine months ago for an international flight), then they query me about any disconnects between their records and my answer.  Why is it my responsibility as the patient to know the names and strengths of all the medications I am taking?  Why doesn't the provider track this, since they are the ones recommending the medications in the first place?  Surely, the process of tracking prescription medications could be simplified for patients.  The good news is that an effort has been underway for almost a decade now to digitize the exchange of prescription data throughout the health care system.
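The reconciliation the nurse performs by hand, comparing the patient's self-reported medications against the provider's records, is exactly the kind of check an e-prescription network automates. A toy sketch (the medication data is invented for illustration):

```python
def reconcile(reported, on_file):
    """Compare a patient's self-reported medication list against provider records.

    Returns the disconnects a nurse would otherwise have to query by hand.
    """
    reported = {m.lower() for m in reported}
    on_file = {m.lower() for m in on_file}
    return {
        "not_reported_by_patient": sorted(on_file - reported),
        "unknown_to_provider": sorted(reported - on_file),
    }

# Invented example: the patient forgot the zolpidem (Ambien) prescribed
# months earlier, which the provider's records still list
result = reconcile(
    reported=["Lisinopril", "Metformin"],
    on_file=["lisinopril", "metformin", "zolpidem"],
)
```

With prescription data flowing electronically between prescribers and pharmacies, the "on_file" side of this comparison is populated automatically rather than relying on the patient's memory.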

Read More

TFTP – A Government-to-Business Bulk Data Transfer Utility

The tech sector headlines over the past few weeks have been dominated by discussions about the privacy policies of Facebook, Google and Yahoo, both in Europe and the US.  Why doesn't anything controversial like this ever happen in the B2B e-Commerce sector?  You might be surprised to learn that consumer and business privacy issues do arise in B2B.  And you would probably be even more surprised to learn that the issues are so significant that they make the front page of the New York Times.  And you would be in disbelief if I told you that these data privacy concerns are considered strategic issues by the European Parliament, the US Federal Bureau of Investigation and the Central Intelligence Agency.  In this post, I want to highlight an example of how B2B networks are being used to track the money-laundering activities of al-Qaeda terrorist cells through a system called TFTP.

What is TFTP?

If you are a B2B e-Commerce practitioner or a regular reader of this blog, you might have guessed that TFTP is yet another proprietary Managed File Transfer protocol creating barriers to frictionless commerce.  Unfortunately, you would be incorrect.  TFTP refers to the Terrorist Finance Tracking Program, which was initiated by the Bush Administration shortly after the September 11th attacks.  The objective of the program is to identify and cut off the international sources of financing that terrorist cells in the US depend upon to conduct their operations locally.  Nine of the hijackers responsible for September 11th funneled money from Europe and the Middle East to SunTrust bank accounts in Florida.

What is the relationship between TFTP and B2B e-Commerce?

For almost nine years, the TFTP program has been leveraging data about international financial transactions transmitted over the SWIFT network.
SWIFT is a specialized business process network which connects all the major financial institutions around the globe for the purpose of exchanging information related to payments, securities, foreign exchange and letters of credit transactions.  SWIFT is not a bank and does not process payments, exchange currencies or facilitate securities trades.  Instead, SWIFT is an information network that allows its customers, primarily financial institutions, to send instructions about which payments to execute, how to settle securities trades, and cash positions in bank accounts.  SWIFT does not refer to itself as a B2B e-Commerce network.  However, I classify it as one because its primary business is the exchange of standards-based messages and structured files between different institutions. SWIFT is the biggest financial network on the planet, with connections to over 9,000 financial institutions in over 200 countries.  With every major financial institution on the planet connected to it, SWIFT routes in excess of $6 trillion daily between institutions.  Although SWIFT supports a wide variety of transaction types, the network's strong suit has always been cross-border, high-value payments.  Consequently, SWIFT possesses a rich source of international financial transaction data that is of great interest to US government agencies seeking to track terrorist money flows. And that is exactly the nature of the relationship that has existed between the US TFTP and SWIFT since a few weeks after the September 11th attacks.  The US government claims that by mining data from SWIFT transactions it has been able to identify thousands of terrorist-related funding activities, leading to several high-profile arrests.  Of particular interest to the US are transactions originating in the United Arab Emirates or Saudi Arabia destined for the accounts of US businesses and individuals with known terrorist affiliations.
For example, SWIFT data provided a link which helped to locate Riduan Isamuddin, believed to be responsible for a 2002 bombing of a Bali resort, in Thailand in 2003. More in a future post…
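To make the idea concrete, here is a toy illustration of the kind of rule-based screening described above. The records and watch lists are entirely invented; real SWIFT messages (MT 103 and the like) carry far richer data, and the actual TFTP analytics are not public.

```python
# Toy rule-based screen over invented transaction records. Real SWIFT
# messages are far richer, and the actual TFTP analytics are not public.

WATCH_COUNTRIES = {"AE", "SA"}      # origin countries of interest (illustrative)
WATCH_ACCOUNTS = {"US-ACCT-0042"}   # accounts with known affiliations (invented)

def flag_transactions(transactions):
    """Yield transactions matching both an origin rule and a destination rule."""
    for tx in transactions:
        if (tx["origin_country"] in WATCH_COUNTRIES
                and tx["beneficiary_account"] in WATCH_ACCOUNTS):
            yield tx

sample = [
    {"id": 1, "origin_country": "AE", "beneficiary_account": "US-ACCT-0042", "amount": 9500},
    {"id": 2, "origin_country": "GB", "beneficiary_account": "US-ACCT-0042", "amount": 120000},
]
hits = list(flag_transactions(sample))
```

The point is simply that because SWIFT traffic is standards-based and structured, screening rules like these can be applied at network scale rather than bank by bank.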

Read More

B2B Integration could help improve tracking of Pandemics such as H1N1 Swine Flu

I was watching the movie I Am Legend on HBO Sunday evening.  I'm not sure if there is any correlation between HBO's decision to broadcast the film in May and the outbreak of the H1N1 swine flu.  However, it did start me thinking about pandemics and what could be done to better contain these outbreaks before they turn all of Manhattan into nocturnal, cannibalistic zombies.  The widespread outbreaks of H1N1 in Mexico and the US have made this subject top of mind for everyone from politicians to economists.  Of course, pandemics are yet another area in which B2B interoperability and integration technologies could play a significant role. The Center for Information Technology Leadership (CITL) published a comprehensive report on how B2B interoperability in the US health care community could not only reduce costs but improve the quality of care.  Much of the data cited in this post is sourced from that 2004 report, entitled The Value of Healthcare Information Exchange and Interoperability.  See my January post on how the Obama administration could save $75B annually from B2B interoperability in health care for more background information.

Tracking Pandemics at the State, Local and Federal Level

State laws require providers and laboratories to report cases of certain diseases to local and state public health departments.  Nationally "notifiable" diseases are forwarded by the state agencies to the Centers for Disease Control and Prevention (CDC).  Connections between the states and the CDC are electronic and highly automated.  However, the first mile between the providers and the local and state agencies is highly manual.  Providers typically submit data via phone, fax, hard-copy forms or very basic B2B communications methods such as a web portal.  For larger provider groups operating in multiple regions, notifications to state health agencies become even more cumbersome.
The 50 US states maintain more than 100 different systems to collect this data, each with its own communications mode. Even the most closely monitored "notifiable" diseases are frequently under-reported in the US.  Various studies conducted between 1970 and 1999 showed that only 79% of all STD, tuberculosis and AIDS cases were reported to public health agencies.  Reporting rates for other diseases were much lower, at 49%.  There are several reasons for the reporting challenges, but certainly one of the key issues is the ease with which the information can be transmitted to health authorities.  There is no question that the primitive communications methods used to collect provider data are a critical barrier to success.  However, even more problematic is the dependency upon overworked and understaffed provider personnel to take the time to consistently file the reports.

Electronic Health Records – Public Health Benefits

A better methodology for reporting on "notifiable" diseases would be to eliminate the need for human initiation altogether.  The process could be completely automated by connecting the health care providers' Health Information Systems and Practice Management Systems, which contain the patient data, to public health and safety tracking systems.  However, connecting the tens of thousands of medical practices to the hundreds of different public health systems could prove quite an ambitious integration project.  A less complex and costly alternative would leverage the concept of Electronic Health Records (EHR).  The EHR would significantly simplify the tracking of public health epidemics without the need for bespoke integration between the various state agencies and each medical provider. The EHR provides a comprehensive set of information about each patient, including demographics, medications, immunizations, allergies, physician notes, laboratory data, radiology reports and past medical history.
EHR information could be stored in a series of centralized repositories deployed around the country.  Each repository could contain the full medical records or just pointers to the locations of the records.  Triggers could be set up to automatically identify trends in data sets that might not otherwise be noticed, helping to provide an early warning system for potential disease outbreaks.  In the event of a pandemic or bioterrorist event, public health officials could easily access de-identified EHR data such as physicians' notes, patient demographics and medical history.  Without the dependency upon manual data entry, the latency of information flow could be reduced and the quality of information collected could be improved.  Administrative costs would also be reduced considerably: the average cost to send a report manually is $14, compared to only $0.03 electronically.  CITL estimated that the use of electronic data flows from providers and laboratories to public health agencies would reduce administrative costs by $195M annually.  CITL did not quantify the potential economic savings from early identification of pandemics and bioterrorist events, but there is no question that these could run into the billions of dollars.

B2B Interoperability and EHR

Of course, a key technology enabler for EHR is interoperability between the various health care providers and the corresponding state, local and federal agencies.  Medical data is transmitted between providers, payers and public agencies using a variety of B2B standards, including DICOM, HL7, NCPDP, and HIPAA-compliant EDI transactions.  EHRs could aggregate the available data related to prescriptions, claims, lab reports and radiology images into a single electronic record.  Additional services could be layered onto the B2B integration framework: data quality services could ensure the completeness of records, and business activity monitoring could identify behavioral trends.
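As a back-of-envelope check on the cost figures above, the implied annual report volume can be derived from the per-report costs and the CITL savings estimate. Note that this volume is my own derivation for illustration, not a number stated in the report.

```python
# Back-of-envelope check on the CITL savings figure. The implied annual
# report volume is derived here for illustration, not taken from the report.

manual_cost = 14.00            # $ per report, manual submission
electronic_cost = 0.03         # $ per report, electronic submission
annual_savings = 195_000_000   # CITL estimate of annual administrative savings

implied_reports = annual_savings / (manual_cost - electronic_cost)
print(f"Implied reports per year: {implied_reports:,.0f}")  # roughly 14 million
```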
Another concept evangelized in the CITL report is the idea of a National Electronic Disease Surveillance System (NEDSS).  The NEDSS would collect data from a number of relevant sources outside of the health care system which could be useful for monitoring public health.  Examples might include 911 call analysis; veterinary clinic activity; OTC pharmacy sales; school absenteeism; health web-site traffic; and retail sales of facial tissue and orange juice.  Such practices were deployed by the US Department of Defense and the Utah Department of Health during the 2002 Salt Lake City Olympics.  Such an effort would require electronically integrating additional state and local agencies, educational institutions and retail chains using B2B.
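A minimal version of the surveillance trigger such a system might run can be sketched as follows: flag any day when a signal, such as OTC cold-remedy sales, exceeds its recent baseline by a fixed multiple. The data and thresholds here are invented for illustration.

```python
# Minimal syndromic-surveillance trigger: flag a day whose count exceeds
# a multiple of its trailing average. Data and thresholds are illustrative.

def flag_anomalies(daily_counts, window=7, threshold=2.0):
    """Return indices of days whose count exceeds threshold x the trailing mean."""
    flagged = []
    for i in range(window, len(daily_counts)):
        baseline = sum(daily_counts[i - window:i]) / window
        if baseline and daily_counts[i] > threshold * baseline:
            flagged.append(i)
    return flagged

# Invented data: steady OTC sales, then a spike on the final day
sales = [100, 102, 98, 101, 99, 100, 103, 97, 101, 260]
```

Real surveillance systems use far more sophisticated statistics, but the principle is the same: automated triggers over routinely collected B2B data streams can surface an outbreak days before manual reports arrive.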

Read More

BIAN, TWIST, SWIFT: Why Standards Matter


There is a great presentation that has circulated around the web for the past few years called Shift Happens, which focuses on "the speed of change". It makes some great observations about population, technology and societal changes that have happened in a remarkably short span of time. In fact, when you focus only on technology, the speed of change is staggering. Someone once told me that "technology is anything that has been created since I was born." And to prove that point, every fall someone sends me an email listing all the things that the newest class of college undergrads have never lived without, like Wi-Fi, GPS or iPods. Needless to say, every fall I get a little depressed…but only a little…because hey, at least I don't have millions of dollars invested in technology that is obsolete or soon to be obsolete. But many bankers have not been so lucky. Today, many banks' investments in technology are so far behind the needs of the marketplace for security, transparency and flexibility that they might not be able to afford to recover or compete. One of the reasons standards organizations exist is to help mitigate the risks associated with the speed of change, not only in regulations and processes but also in technology. And many standards organizations are working to help banks keep pace with the speed of change while also helping them keep an eye on the investments they have made, are making and will make, so that at the end of the day millions aren't spent on technology that will be obsolete before the next annual report. Articles abound about the banking industry's focus on replacing outdated and expensive legacy core systems in the quest to add new capabilities, ones that can address issues around risk, regulation and customer retention. But not surprisingly, the focus on economic stability has put replacement plans on hold for many, although these issues really can't wait for better days to be fixed.
This is one reason we should all be glad that the alphabet soup of standards organizations is still moving forward with strategies, road maps and updates that may help make life a little less complicated when credit starts flowing again. For example, the Banking Industry Architecture Network (BIAN) announced a change to its mission statement and also that it will publish its first set of deliverables. BIAN's goal is to be the standard for service-oriented architecture (SOA) in the banking space, which should equate to a clearer understanding of the technology needs required for growth. The operative word, of course, is "should", because while BIAN is focused on architecture, it is not the only standards organization contributing to the technology conversation, and others may have a completely different take on the direction technology needs to take to prepare for change. This is particularly true when you consider that many standards organizations are made up almost exclusively of a single group of players from a given business segment, with a singular focus on their specific area of interest. This laser focus often leads to a less than ideal adoption rate because the standard does not have a holistic value proposition that resonates across the organization. That is one reason that groups with more diverse memberships, like the Transaction Workflow Innovation Standards Team (TWIST), which helps corporates standardize platforms, matching and netting services, and settlement and clearing services, and integrate the relevant systems (i.e. ERP, payment and reporting systems), are equally important to the "speed of change" conversation. Since TWIST is comprised of members from the corporate sector as well as the banking and technology vendor segments, it approaches the concept of change differently than a group like the Society for Worldwide Interbank Financial Telecommunication (SWIFT), which until fairly recently was comprised solely of bankers.
TWIST, with its focus on the automation of corporate financial supply chains, looks at technology needs from the perspective of gaining interoperability by building on existing technology investments. By taking a modular approach to the adoption of its standards, TWIST lets adopters use its recommendations on good-practice workflows, message standards and data security how and when they wish. This modular approach to implementation is in some ways very similar to our Managed Services approach to B2B integration. How so? I knew you were going to ask. I won't do a sales pitch, because how much fun is that for anyone, but I will highlight a few bullets based on TWIST's game plan to show the similarities, overlaps and complementary tactics, and you can decide for yourself. "The TWIST standards aim to enable straight through processing (STP) from end-to-end of the three processes, irrespective of the way the processes are transacted, the service providers that are involved and the system infrastructure that is used. By standardizing: Information flows; Business process; Electronic communications (whether direct or indirect) between market participants and service providers; Platforms, matching & netting services as well as settlement and clearing services, and the methods of integrating the relevant systems." Our Managed Services aim to help businesses and banks connect with any corporate client or trading partner, regardless of location, size or B2B technical capabilities, by supporting: Information flows – optimizing the flow and quality of technical data and information; Business process – performing all of the day-to-day management of a customer's B2B infrastructure, including systems-health monitoring, data backup, and network and database management; Electronic communication – providing a broad range of trading partner connectivity options including FTP, FTP/S, MQ Series, AS2, HTTP/S, XML, etc.
Integrating relevant systems – delivering a team of experts who are proficient in SAP and Oracle B2B integration and have a deep knowledge of industry standards. Okay, hopefully you don't feel as if you've navigated a sales pitch, but as I said, there are similarities in the approaches, largely because both TWIST and OpenText are working to create a "win/win" environment: one that operates to meet the current and future needs of its customers and members. The promise of standards is similar to the promise inherent in a compelling managed services offering: to simplify the complex, create a repeatable methodology that can serve the current and future needs of the organization, and help contain costs by leveraging existing or past technology investments. Read more here.

Read More

Bisync 2020 – The Case for FCC Regulation of B2B Communications


In my last post, I commented on the continued use of legacy communications protocols such as async, bisync and X.25 for B2B e-commerce.  I have never seen an official report on the use of legacy communications in B2B integration; however, I am confident that over 10,000 companies are still using legacy dial-up connections. In this post, I want to continue the discussion by exploring the implications of the business community's continued use of these ancient technologies. I also want to explore the roles of commercial entities and government regulators in phasing out these legacy communications protocols.

Who and Why?

The primary users of async, bisync and X.25 are small businesses that established EDI programs over a decade ago and see no benefit in upgrading to newer technology. These companies are often still running EDI software on a Windows 95-based PC with a dial-up connection to their VAN provider. Many of the companies running the older technology do not even know what async or bisync is. They just know that the PC sitting in the corner automatically phones home every night and magically retrieves all of their purchase orders. Why don't these customers upgrade to newer technology?  Most small businesses lack the resources, budget and technical expertise to upgrade to a newer B2B communications protocol such as AS2 or FTP. Furthermore, most are reluctant to make changes to their EDI configuration for fear of disrupting electronic commerce with customers. Would you want to be the EDI manager responsible for losing a $2M order from Wal-Mart during a cutover from bisync to AS2?

Challenges of Legacy Communications in B2B

Does it matter that so many businesses are still using legacy communications technology for B2B? That is a subject of significant debate.
But I have listed below a few disadvantages of such pervasive use of the older technologies:

Business Disruption – In the early days of EDI, communications functionality was bundled into desktop software packages. Many of the developers of these early EDI translator packages no longer offer support for the older software. Consequently, if the older software breaks, there is a significant risk that users will not be able to quickly remedy the problem. The result could be a disruption to order flow with trading partners, which could have a ripple effect across the value chain.

Limited Functionality – Users of legacy communications technology are only able to conduct the types of B2B transactions supported by the original EDI software package. In most cases, the older software does not support the newer XML schemas introduced in recent years to automate a wider range of business processes. Consequently, the ability to develop more collaborative business processes between buyers and suppliers is constrained by the legacy technology.

Outdated Information – Users of legacy communications tend to operate in an off-line batch mode. EDI documents are exchanged with the VAN once a day or sometimes once a week. Consequently, these companies receive updates to orders, forecasts, shipments and bank balances only once a day or once a week. The overall supply chain becomes less agile when companies cannot exchange order, logistics, inventory and payment data in near-real time to respond to changing market conditions.

Why Consider a Regulatory Approach?

There have been several attempts by industry to force technology upgrades, each of which has failed:

Lower-Cost Substitutes – The advent of the Internet in the late 1990s introduced a number of substitute technologies that small businesses could use for EDI, such as AS2, FTP and Web Forms.
Despite aggressive sales efforts by vendors, there remains a significant population of small businesses unwilling to upgrade their B2B technology.

Product End of Life – Commercial service providers such as the major telecommunications carriers have discontinued support for legacy protocols such as X.25, async and bisync. However, carrier efforts have been handicapped by their large customers, which have trading partners still using the legacy protocols. These corporations are major buyers of telecom services and use their purchasing power to negotiate extensions to the end-of-life dates for the legacy B2B protocols.

Pricing Deterrents – A number of VAN providers have attempted to raise the prices of async and bisync dial-up services in an attempt to encourage customers to transition to more modern communications protocols. The new pricing models were met with considerable outrage from the end-user community.  Ultimately, the service providers were forced to abandon the pricing policies and extend indefinite support for the older communications.

With vendor-led efforts to drive technology upgrades failing, it seems that the only remaining alternative might be public policy. Or we could accept the fact that bisync will be alive and well in 2015, 2020 or maybe even 2050.  Should the FCC impose an end-of-life date for legacy B2B communications protocols? Should there be government subsidies to enable upgrades to AS2 and FTP?  Post your thoughts and let me know what you think.
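For the companies that do make the jump, the nightly "phone home" over bisync becomes something as simple as a scheduled script polling the VAN mailbox over FTP. A sketch, with the host, credentials and file-naming convention all invented placeholders rather than any particular VAN's interface:

```python
# Sketch of a nightly VAN-mailbox poll after a bisync-to-FTP migration.
# Host, credentials, directory names and file extensions are invented.

import ftplib
import pathlib

def is_edi_file(name):
    """VAN mailboxes can mix EDI with receipts; pick up only the interchanges.

    The .edi/.x12 naming convention here is an assumption, not a standard.
    """
    return name.lower().endswith((".edi", ".x12"))

def fetch_edi_mailbox(host, user, password, remote_dir="/outbox", local_dir="inbound"):
    """Download waiting EDI interchanges and clear them from the mailbox."""
    local = pathlib.Path(local_dir)
    local.mkdir(exist_ok=True)
    with ftplib.FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd(remote_dir)
        for name in filter(is_edi_file, ftp.nlst()):
            with open(local / name, "wb") as fh:
                ftp.retrbinary(f"RETR {name}", fh.write)
            ftp.delete(name)  # mailbox semantics: remove after pickup
```

Scheduled via cron or Task Scheduler, a script like this replicates the batch pickup the legacy dial-up connection performed, which is one reason the migration is technically simple even if organizationally hard.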

Read More