Supply Chain

Finding the Opportunity in Crisis

Among the primary definitions of the word “crisis” in Merriam-Webster’s dictionary, I think the one that best describes the past 18 months in the banking sector is “an unstable or crucial time or state of affairs in which a decisive change is impending.” As I look around, I definitely see signs of decisive change across the industry. And as the best crisis management teams will tell you, what people really remember is not the triggering event that begins a crisis but the long-term response to it. Already, most US businesses are giving the Federal Reserve high marks for its response to last year’s meltdown. But what about corporates’–your clients’–perception of the banking industry’s response?

All good crisis responses begin with a plan. In many ways, the banking industry as a whole was caught off guard by the scope and depth of the meltdown and was caught without one. In fact, throughout much of the early response to the meltdown, the basic steps of crisis management were either forgotten or completely overlooked. But hindsight is 20/20, and the question for most banks should not be “what did you miss?” but rather “what can you do to turn this situation around and create an opportunity out of the crisis?”

I’ve written quite extensively about the need for visibility in the financial supply chain. A number of the problems that created the environment for the financial meltdown had to do with visibility, or rather the lack of end-to-end visibility. But an issue of equal if not greater importance also exists, and it is one that is deeply ingrained in the industry: a lack of flexibility, not only in IT systems but also in business processes. Flexibility has a number of definitions, but for the purposes of this discussion I am speaking specifically about market flexibility as it relates to the ability of banking systems to adapt to market demands. Banking systems are notoriously slow to change.
While there are many good reasons for the prevalence of legacy systems in the banking space, the primary reasons that banks haven’t changed or updated their systems are the inherent risks and costs of replacement. A 2008 pre-meltdown survey found that most banks will still be using the same legacy-based technologies for at least another 5 years. Now, 5 years may not seem like that long, but in technology 5 years represents a lifetime. In this short span of time, government requirements and standards have changed, while initiatives like SEPA and customer demand for banking-as-a-service (BaaS) type offerings have accelerated. Yet most banking systems remain fundamentally unchanged. They are still largely reliant on IT systems that are ten, twenty or more years out of date.

Despite the age, limited functionality and high costs of maintaining legacy systems, most banks see themselves caught in a “lose/lose” situation when it comes to addressing these problems. We all know that “rip and replace” is not an option, and even with the promise of service-oriented architecture (SOA) based solutions, the simple truth is that the world will not stop while the bank’s technology plays catch-up. Banks must be able to respond today, even while the debate about legacy versus modern systems continues. That means there must be an interim strategy, one that enables the bank to address market changes in a flexible but still repeatable way. This is the promise and the reality of integration: to allow banks to leverage the inherent value still to be found in their legacy systems while also adding much-needed functionality to address market changes fluidly and, most importantly, quickly. Integration services providers like us have helped and continue to help banks navigate the path between the past and the future.
At the crossroads of the legacy-versus-modern debate is the data. So, while avoiding a sales pitch like the plague, it is critical to note that the key area where integration vendors excel is in “bringing data together.” This expertise, along with experience in process, workflow and systems integration, helps make walking the tightrope between the old and the new easier for banks and bankers. So while it’s just a guess on my part, I suspect that at least some of the banks that are still spending big on IT right now are doing so by working with integration services providers to identify, mine and capitalize on the opportunities to be found in this crisis.

Read More

How to Win Friends and Influence Supply Chain Finance

In 1937, Dale Carnegie published How to Win Friends and Influence People because he believed that financial success is due 15% to professional knowledge and 85% to “the ability to express ideas, to assume leadership, and to arouse enthusiasm among people.” Right now, it seems that many business executives would benefit from reminders about the wisdom of Carnegie’s approach to doing business.

In many ways, globalization has transformed the marketplace. It is difficult today for buyers and suppliers to meet face-to-face or to make the type of handshake deals that were once commonplace in business. The trust that was once created by frequent lunchtime meetings and rounds on the golf course now relies on conference calls, email and Skype. Yet despite the lack of personal interaction, business relationships–those built on met obligations, assumed responsibility and, yes, trust–continue to develop and grow. As a result, business operations continue to expand across the globe, and financial instruments like open account transactions, which rely largely on trust, increase in influence. So much so that open accounts are now the norm rather than the exception. According to a recent survey conducted by scholars at Carnegie Mellon, Trust across Borders: Buyer-Supplier Trusting Global B2B e-Commerce, trust is a critical component in all business relationships, but nowhere more so than in global B2B ecommerce. So it is truly unfortunate that the current credit crunch has not only created a power struggle among some buyers and suppliers but is also, in some cases, eroding trust.

The balance of power in the buyer-supplier relationship has shifted dramatically over the past few years, in favor of the buyer. In fact, many believe that today, more than ever, it is a “Buyer’s Market.” As such, buyers are demanding that suppliers recognize and accept all that a buyer’s market entails, including accepting payment terms far beyond the traditional 30 days.
The credit crunch has exacerbated the numbers, forcing more and more buyers to put the squeeze on their suppliers so that they–the buyers–can hold onto their cash a little longer. Some companies are resorting to outright bullying of their suppliers, which is really not the best strategy when you’re trying to foster a strong relationship built on trust. But unfortunately, negotiation and finesse can often go out the window when credit is tight and cash is king. Not surprisingly, the pressure and bullying are not going over very well with suppliers, particularly those who have to look for alternative short-term financing to offset extended payment terms and keep their operations running. This type of stand-off is not a win/win for either side, but it can be a win/win/win if, say, a “friendly” bank is available to mediate the process and make sure that everyone comes out with what they really need…namely cash.

A few days ago, a post on the Procurement Blog from David Rae asked if “Supply Chain Finance (SCF) was the elephant in the room,” because while corporates recognize the potential value of an SCF program, most haven’t seriously considered implementing one. Why not? Simple: because they don’t know where to start. So as a bank–as financial professionals–why isn’t the banking industry doing more to educate the corporate market about the value of SCF and offering solutions that support it? Or, as Dale Carnegie might wonder, where is the industry’s 85% for expressing ideas, assuming leadership and arousing enthusiasm? Well, first, the definition of supply chain finance is fairly broad and not necessarily one that rolls off the tongue. All of the processes related to the sale of all or part of a trade transaction to a third party–whether the invoice or the actual goods–that can be utilized by both buyer and supplier to improve cash management and, ultimately, working capital can be housed under the umbrella term Supply Chain Finance.
So factoring, reverse factoring, forfaiting, discounting, etc., all provide opportunities that can be financially beneficial to supply chain participants. Today, perhaps more than at any time in the past, banks have an opportunity to add value to their relationships with the corporate client community–by helping to reduce the tensions that exist in the buyer-supplier relationship as everyone struggles to retain cash, and by getting into the SCF market with new products and solutions that make the process easier. However, most banks, even those with great product offerings in the trade finance space, do not have the resources or the expertise required to on-board trading partner communities quickly; the flexibility to make prompt changes to data or processes based on new requirements; the visibility into transactions to proactively mitigate the impact of supply chain disruptions; or the capabilities to support trading partners who come to the table with varying levels of technical sophistication. Far too many banks are spinning their wheels and wasting resources trying to build rather than buy the capabilities they need to support SCF, because they believe that buying will ultimately erode any competitive advantage gained from the investment. They might be right…but then again, they could be very wrong. If David Rae is right that the US receivables management market alone could top $1.3 trillion, then banks that are waiting to build their own SCF solutions might just miss the boat–particularly if the competitors they are so concerned with decide to buy now.

Read More

BIAN, TWIST, SWIFT: Why Standards Matter


There is a great presentation that has circulated the web for the past few years called Shift Happens, which focuses on “the speed of change.” It makes some great observations about population, technology and societal changes that have happened in a really short span of time. In fact, when you focus only on technology, the speed of change is staggering. Someone once told me that “technology is anything that has been created since I was born.” And to prove that point, every fall someone sends me an email with all the things that the newest class of college undergrads has never lived without, like Wi-Fi, GPS or iPods. Needless to say, every fall I get a little depressed…but only a little…because, hey, at least I don’t have millions of dollars invested in technology that is obsolete or soon to be obsolete.

But many bankers have not been so lucky. Today, many banks’ investments in technology are so far behind the needs of the marketplace for security, transparency and flexibility that they might not be able to afford to recover or compete. One of the reasons standards organizations exist is to help mitigate the risks associated with the speed of change–not only the speed of change in regulations and processes but also in technology. Many standards organizations are working to help banks keep pace with the speed of change while also helping them keep an eye on the investments they have made, are making and will make, so that at the end of the day, millions aren’t spent on technology that will be obsolete before the next annual report. Articles abound about the banking industry’s focus on replacing outdated and expensive legacy core systems in the quest to add new capabilities, ones that can address issues around risk, regulation and customer retention. But not surprisingly, the focus on economic stability has put replacement plans on hold for many, even though these issues really can’t wait for better days to be fixed.
Which is one reason we should all be glad that the alphabet soup of standards organizations is still moving forward with strategies, road maps and updates that may help make life a little less complicated when credit starts flowing again. For example, the Banking Industry Architecture Network (BIAN) announced a change to its mission statement and also that it will publish its first set of deliverables. BIAN’s goal is to be the standard for service-oriented architecture (SOA) in the banking space, which should equate to a clearer understanding of the technology needs required for growth. The operative word, of course, is “should,” because while BIAN is focused on architecture, it is not the only standards organization contributing to the technology conversation, and others may have a completely different take on the direction technology needs to take to prepare for change. Particularly when you consider that many standards organizations are made up almost exclusively of a single group of players from a given business segment, with a singular focus on their specific area of interest. This laser focus often leads to a less than ideal adoption rate, because the standard does not have a holistic value proposition that resonates across the organization. Which is one reason that groups with more diverse memberships, like the Transaction Workflow Innovation Standards Team (TWIST), which helps corporates standardize platforms, matching & netting services as well as settlement and clearing services, and integrate their relevant systems (i.e., ERP, payment and reporting systems), are equally important to the “speed of change” conversation. Since TWIST is comprised of members from the corporate sector as well as the banking and technology vendor segments, it approaches the concept of change differently than a group like the Society for Worldwide Interbank Financial Telecommunication (SWIFT), which until fairly recently was comprised solely of bankers.
TWIST, with its focus on the automation of corporate financial supply chains, looks at technology needs from a perspective of gaining interoperability by building on existing technology investments. By utilizing a modular approach to the adoption of its standards, TWIST lets adopters use its recommendations on good-practice workflows, message standards and data security how and when they wish. This modular approach to implementation is in some ways very similar to our Managed Services approach to B2B integration. How so? I knew you were going to ask. Well, I won’t do a sales pitch, because how much fun is that for anyone, but I will highlight a few bullets based on TWIST’s game plan to show the similarities, overlaps and complementary tactics, and you can decide for yourself.

“The TWIST standards aim to enable straight through processing (STP) from end-to-end of the three processes, irrespective of the way the processes are transacted, the service providers that are involved and the system infrastructure that is used. By standardizing:

- Information flows
- Business processes
- Electronic communications (whether direct or indirect) between market participants and service providers
- Platforms, matching & netting services as well as settlement and clearing services, and the methods of integrating the relevant systems”

Our Managed Services offering aims to help businesses and banks connect with any corporate client or trading partner, regardless of location, size or B2B technical capabilities, by supporting:

- Information flows: optimizing the flow and quality of technical data and information
- Business processes: performing all of the day-to-day management of a customer’s B2B infrastructure, including systems-health monitoring, data backup, and network and database management
- Electronic communications: providing a broad range of trading partner connectivity options, including FTP, FTP/S, MQ Series, AS2, HTTP/S, XML, etc.
- Integrating relevant systems: delivering a team of experts who are proficient in SAP and Oracle B2B integration and have a deep knowledge of industry standards

Okay, hopefully you don’t feel as if you’ve navigated a sales pitch, but as I said, there are similarities in the approaches, largely because both TWIST and OpenText are working to create a “win/win” environment–one that operates to meet the current and future needs of its customers and members. The promise of standards is similar to the promise inherent in a compelling managed services offering: to simplify the complex, create a repeatable methodology that can serve the current and future needs of the organization, and help contain costs by leveraging existing or past technology investments.

Read More

PAXLST and CUSRES – How EDI Keeps Our Planes Safe from Terrorists

The technologies being deployed by government agencies such as the US Transportation Security Administration (TSA) to keep us safe in the air continue to advance year over year. Throughout many airports in the US, the TSA has deployed millimeter wave scanners, explosive detection systems and Threat Image Projection software. There are now specialized types of computer cases that can pass through security checkpoints without the laptop having to be removed. There is even a Cast-Ray system being deployed to screen passengers with casts, braces and heavy bandages. Perhaps the biggest improvement I am looking forward to in the next twelve months is avoiding the need to pack my toiletries in a clear 3-1-1 plastic bag. However, one of the most important technologies being utilized to support advanced passenger screening against terrorist watch lists receives little publicity at all: EDI.

In January of this year, the TSA began the first phase of implementation for its Secure Flight program. Secure Flight is one of the many homeland security recommendations put forth by the 9/11 Commission. The program calls for airline operators to transfer responsibility for watch list monitoring and passenger screening to the US Federal government, specifically the TSA. Through government ownership, the risk of security breaches is minimized and a higher level of consistency can be enforced across airlines. In the first phase of the program, the TSA will screen only US domestic flights. In future versions of the program, monitoring will expand to include international flights as well.

Secure Flight Powered by EDI

The process starts with airline operators submitting a file for each scheduled flight that includes the names of all passengers as well as their date of birth, gender and full itinerary. The TSA performs a matching process against its No Fly List, which contains approximately 2,500 known terrorists restricted from boarding any commercial aircraft in or out of the US.
A Secondary Security Screening Selection process is then performed to identify passengers requiring additional inspection. While the criteria for this Selectee List are not published, a few known triggers exist, such as one-way itineraries, tickets purchased in cash and tickets purchased on the day of travel.

In 2002, UN/EDIFACT created two new EDI documents specifically designed for passenger screening by border control authorities: PAXLST and CUSRES. The two documents were adopted by the International Air Transport Association (IATA) and the World Customs Organization (WCO) shortly thereafter for commercial and government use. Selected nations, such as the US via its Department of Homeland Security, have extended or customized the documents to satisfy country-specific regulatory requirements.

PAXLST and CUSRES

PAXLST is a message created by an airline operator containing detailed information about the passengers and crewmembers on board the plane. It must be submitted to the TSA for advance approval prior to passenger boarding. PAXLST includes a number of data elements for each flight, such as origin, destination, time of departure and scheduled time of arrival. Extensive detail on each individual passenger is included, such as full name, gender, date of birth, citizenship, home address and e-mail address. PAXLST can include passport data such as country of issuance, expiration date and document number. If a travel agency was used, then PAXLST should include the name of the specific agent who ticketed the reservation. Even flight-time data such as the passenger’s seat assignment and baggage tag are included. Perhaps most interesting is that the payment details are listed in PAXLST. For the credit or debit card used for the reservation, the cardholder’s name and account number are included along with the total charges, the currency and the date of booking. I wonder if the TSA is PCI-compliant?
CUSRES is the message returned from a border control authority to the airline operator, which either acknowledges receipt of a PAXLST or, if appropriate, provides actions to be taken. CUSRES includes reference data about specific flights and passengers, as well as instructions for additional inspection or denial of boarding approval. The TSA supports both synchronous and asynchronous transmission of PAXLST and CUSRES messages with airline operators. Messages can be submitted via one of the two primary airline EDI networks, SITA (Société Internationale de Télécommunications Aéronautiques) and ARINC (Aeronautical Radio, Inc.). Alternatively, transmissions can occur over an encrypted Internet channel (VPN) using web services or the IBM WebSphere MQ protocol.
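To make the message flow above more concrete, here is a minimal Python sketch that assembles a toy PAXLST-style EDIFACT string. It is illustrative only: the segment layouts, qualifiers and version identifiers shown are simplified assumptions for demonstration, not a conformant rendering of the TSA’s customized PAXLST.

```python
# Illustrative only: a toy PAXLST-style message. Real PAXLST/CUSRES
# implementations follow the UN/EDIFACT directories plus agency-specific
# rules; every segment layout below is a simplified assumption.

def build_paxlst(flight, passengers):
    """Assemble a simplified EDIFACT-like passenger list as one string."""
    segments = [
        "UNH+1+PAXLST:D:05B:UN:IATA",          # message header (version illustrative)
        f"TDT+20+{flight['number']}",           # transport details: flight number
        f"LOC+125+{flight['origin']}",          # departure airport
        f"LOC+87+{flight['destination']}",      # arrival airport
        f"DTM+189:{flight['departure']}:201",   # scheduled departure date/time
    ]
    for pax in passengers:
        segments.append(f"NAD+FL+++{pax['surname']}:{pax['given']}")  # passenger name
        segments.append(f"ATT+2++{pax['gender']}")                    # gender attribute
        segments.append(f"DTM+329:{pax['dob']}")                      # date of birth
    segments.append(f"UNT+{len(segments) + 1}+1")  # trailer: segment count incl. UNH/UNT
    return "'".join(segments) + "'"  # EDIFACT segments end with an apostrophe

flight = {"number": "UA123", "origin": "SFO", "destination": "JFK",
          "departure": "0906011430"}
passengers = [{"surname": "DOE", "given": "JANE", "gender": "F", "dob": "700101"}]
message = build_paxlst(flight, passengers)
print(message)
```

A CUSRES response would travel the other way, echoing the flight and passenger references along with the boarding instruction codes.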

Read More

Where Supplier Portals Went Wrong


There is no dispute that portals have changed supply chains forever. However, as with many disruptive technologies, not only has a new set of benefits been introduced, but also a new set of challenges. The early success experienced by portals led to an overly aggressive expansion of their usage in the supply chain. Consequently, a number of unforeseen negative consequences have arisen from supplier portals. Listed below are four of the major problems created by portals.

#1 – Too Many Portals

Small suppliers can provide better service to their customers by engaging with them electronically for routine transactions such as PO acknowledgements, advance shipment notifications and invoice submissions. However, most small suppliers do not sell to just one customer; most sell to multiple companies. Consider a small German auto parts manufacturer that sells to Volkswagen, Daimler and BMW, or a small American food supplier that sells to Safeway, Kroger and Wal-Mart. Each of these suppliers has to log in to three separate portals on a daily basis to interact with its customers. While the customers benefit by automating supply chain processes, the supplier’s costs are higher due to the added complexity and duplication in its daily business processes.

#2 – Swivel Chair

Most of the data that suppliers key into customers’ portals already exists in digital format within their own in-house systems. While small suppliers may not be running SAP or Oracle, many of them do run small business accounting packages such as Sage, MYOB or QuickBooks. When a purchase order is received in a customer portal, it must then be rekeyed into the supplier’s accounting system. Similarly, when a small supplier needs to send an invoice, they have to re-key data that is already in their billing system into the customer’s portal.
This process is frequently referred to as “swivel chair” because the supplier has to switch back and forth between two different applications. The re-keying process not only duplicates effort for the supplier, but also increases the probability of typographical errors.

#3 – Manual Processes

Contending with multiple portals is not a problem limited only to small suppliers. Much of the information and functionality contained on a portal cannot be obtained through any other means. For example, information about changes to routing guides, contact details and business processes can only be downloaded from a portal. Many buying hubs make sourcing opportunities, payment status and performance scorecard information available only via the portal. In other cases, processes such as collaborative design and joint promotions planning can only occur through manual interaction on a portal. Requiring suppliers to pull information rather than pushing it to them creates more work for suppliers and introduces the opportunity for missed communications.

#4 – EDI Replacement

Some buying hubs have embraced portals holistically for all supplier interactions. These organizations have suspended the exchange of point-of-sale data, manufacturing forecasts and product catalogs via EDI and XML transactions. Instead, suppliers are required to process the data manually. For example, an electronics supplier receiving a manufacturing forecast from an OEM would have to download the data from the portal and then re-key it into their ERP application. Once the forecast was analyzed, the supplier would then have to take the results from the ERP system and re-key them into the customer’s portal. These examples are becoming more and more common in the supply chain.
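The re-keying described above is exactly the step a thin system-to-system bridge removes. A minimal sketch, assuming a portal that exports purchase orders as CSV and an accounting package that accepts simple records; every field name here is a hypothetical assumption, not any vendor’s real schema:

```python
import csv
import io

# Hypothetical mapping from a customer portal's CSV export columns to the
# supplier's accounting-system field names (all names are invented).
PORTAL_TO_ACCOUNTING = {
    "PO Number": "order_id",
    "Item SKU": "sku",
    "Qty": "quantity",
    "Unit Price": "unit_price",
}

def translate_portal_export(csv_text):
    """Turn a portal CSV export into accounting-ready records,
    replacing the manual 'swivel chair' rekeying step."""
    records = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        rec = {dest: row[src] for src, dest in PORTAL_TO_ACCOUNTING.items()}
        rec["quantity"] = int(rec["quantity"])       # normalize numeric fields
        rec["unit_price"] = float(rec["unit_price"])
        records.append(rec)
    return records

export = """PO Number,Item SKU,Qty,Unit Price
PO-1001,WIDGET-7,250,1.85
"""
orders = translate_portal_export(export)
print(orders[0])
```

Even a translation this small eliminates double entry and the typographical errors that come with it, which is the core argument for machine-to-machine integration over portal rekeying.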
EIDX Insights on Portals

The Electronics Industry Data Exchange (EIDX) group, which was recently disbanded by its parent CompTIA, performed some very insightful analysis of portal usage in the high tech sector a few years ago. EIDX found that manually processing data such as a sales forecast via a portal took an average of 90 minutes, compared to only 5 minutes on average with EDI. Due to the extensive time commitments required, portals increased the number of support staff needed, shifting the ratio of trading partners per staff member from 84:1 with EDI to 5:1 with portals. This is clear, quantitative evidence that when used as an EDI replacement, portals are in fact leading a regression in the level of supply chain automation between trading partners. 92% of the EIDX survey respondents stated that they preferred system-to-system communications via EDI or XML over portals. However, most indicated that they believed portal usage would increase rather than decrease in the coming years.

Why would a buying organization push the use of portals over EDI and XML when there is such an obvious decrease in productivity? I will offer some theories:

Simplified Interfaces – Large, multi-national corporations have too many entry points for B2B communications. Organizations that have grown by acquisition have a spaghetti-like mess of VAN connections, web forms and direct Internet connections. Centralizing all B2B transactions into a portal greatly simplifies the technology architecture for the buyer. Enrollment, security, reporting and enhancements can all be centralized into the portal, significantly reducing the cost of maintenance.

Data Quality – Through the use of a portal connected to an ERP system, buying organizations can ensure that only high quality data is received from suppliers. Portals can enforce the completion of all mandatory fields by a supplier. Data in fields can be limited to only a small list of choices available in a pull-down box on the user interface.
Such data quality enforcement cannot be accommodated easily in traditional machine-to-machine EDI transactions. Furthermore, errors can be minimized through the use of “turnarounds,” which are pre-populated forms based upon a related document. For example, 80% of the data in an invoice can be generated from the original purchase order. While these arguments provide positive ROI for buyers, they add significant costs for suppliers. Such win-lose propositions are bad for the supply chain. Dr. Hau Lee of Stanford University once stated that “instead of company to company competition, we are now in an era of supply chain to supply chain competition.” When developing a portal strategy, companies should evaluate the overall cost and complexity impact not only on their own business, but on their trading partners as well.
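The EIDX averages cited earlier make for a stark back-of-the-envelope calculation. A quick sketch: the 90-minute and 5-minute per-forecast figures come from the EIDX analysis above, while the weekly transaction volume is a made-up assumption for illustration.

```python
# Per-forecast processing times (minutes), per the EIDX analysis cited above.
MINUTES_PER_FORECAST = {"portal": 90, "edi": 5}

def annual_hours(channel, forecasts_per_week, weeks=52):
    """Hours per year spent processing forecasts over a given channel."""
    return MINUTES_PER_FORECAST[channel] * forecasts_per_week * weeks / 60

# Hypothetical supplier handling 20 forecasts per week:
portal_hours = annual_hours("portal", 20)
edi_hours = annual_hours("edi", 20)
print(f"Portal: {portal_hours:.0f} h/yr, EDI: {edi_hours:.0f} h/yr, "
      f"ratio: {portal_hours / edi_hours:.0f}x")
```

At these rates the portal channel consumes eighteen times the labor of EDI for the same transaction volume, which is exactly the regression in automation that the EIDX figures describe.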

Read More

Benefits of Supplier Portals


Portals were one of the innovative new technologies introduced during the dot-com era. But unlike marketplaces, exchanges and other failed concepts, portals successfully gained widespread adoption. In fact, portals have been both a disruptive and transformative force in the supply chain. As we look back on the past ten years of evolution, I think it is interesting to discuss both the positive and negative impacts portals have had on the supply chain. Below are some of the benefits of supplier portals. In my next post, I will outline some of the negative impacts of portals, which will help to make the case for my AS5 proposal.

A few of the benefits enabled by the introduction of supplier portals include:

Broader Supplier Enablement – Portals enabled a new tier of suppliers to automate routine supply chain execution transactions such as purchase orders, ship notices and commercial invoices. EDI had gained a critical mass of usage amongst larger companies. However, smaller businesses often struggled to find the resources, budget and in-house expertise to implement EDI. Portals filled this white space in the market quickly. Anyone with a PC and an Internet connection could connect to a portal with minimal training and investment. As a result, the barrier to entry for e-commerce was lowered, enabling tens of thousands of small suppliers to interact with customers electronically.

New Business Process Automation – Portals enabled a new group of business processes, such as strategic sourcing, collaborative design and demand planning, to be automated. Historically, these processes occurred over the phone, via e-mail correspondence or in face-to-face meetings. Due to their complex nature, these supply chain practices were too sophisticated to automate through machine-to-machine transactions. By moving these processes online, portals reduced not only the cost of these transactions, but also the latency of information sharing and the barriers to adoption.

Supplier Self-Service – Portals offer a lens into the buyer’s ERP system. Inquiries that would once have required a time-consuming game of phone tag can instead be performed with just a few mouse clicks. For example, a high percentage of the call volume to accounts payable organizations is from collections personnel in the supplier organization attempting to determine when an invoice will be paid. Portals give suppliers the ability to perform self-service inquiries online whenever they need to know the status of an expected payment.

Collaborative Processes – Portals provide both supplier and buyer a single, shared view of data. Historically, personnel from the buyer and the supplier each viewed data in their own business applications, which were hopelessly out of sync. With portals, both supplier and buyer share a common view of data such as performance scorecards. The newfound visibility enables the two parties to collaborate on corrective actions to improve overall supply chain performance. Dispute resolution is another process that benefited from the shared view on a portal.

Change Management – Supply chains are constantly changing. Buyers open new distribution centers, manufacturing plants and retail stores, which changes routing guides. As business process re-engineering occurs, new and improved forecasting, purchasing, labeling, shipping and invoicing procedures are introduced. Portals provide an online resource for buyers to communicate changes to contact details, routing guides and business processes to the supplier community. Historically, these changes had to be communicated to each supplier through direct mail, phone conversations or vendor conferences.

Read More

ISO20022: If You Build It…Will They Come?


Today was the opening day of the SWIFT Operations Forum. This forum, like many industry events, has relevant information for the business and technical sides of the bank, with enough content to satisfy both groups without overwhelming either. A neat trick if you can pull it off, and hats off to SWIFT: so far they are succeeding. Anyway, during today's various sessions, I started noticing how often ISO20022 was mentioned, both as a current concern and as the next big adoption wave for both the bank and its wholesale customers. Now I know ISO20022 isn't news. I've read about it, written about it and seen numerous presentations about it for at least the past two years if not longer, and I'm sure most of you have as well. But sometimes we have to look back to look forward. Because despite generating a lot of headlines and interest, many people haven't made up their minds about just how important ISO20022 adoption really is to the future of the financial services industry. So I think a little refresher on the subject can't hurt, particularly at a time when the banking industry is feeling more than a little pressure to invest strategically. ISO20022 was created by the International Organization for Standardization (which makes me wonder why it's not IOS20022, but I digress) and was designed to provide the financial industry with a common platform for the development of messages in a standardized XML schema. Take note: ISO20022 is not a standard, it is a tool to develop standards. Which makes sense when you remember that the ISO20022 payment standards are just the first ones in the market. Their head start is primarily because of the Single Euro Payments Area (SEPA), which has adopted ISO20022 as its standard to improve efficiencies around electronic cross-border payments. Nonetheless, ISO20022 is promoted as the universal financial industry message scheme, which as of today is really just a marketing message. Why? Because so far the adoption of ISO20022 is far from universal.
During one of today's sessions, a speaker made a throwaway statement about his organization's investment in ISO20022: "Everyone had it so we made the investment and discovered, well…not everyone has it." I'm paraphrasing the comment but the gist of it is accurate. I'm not going to dispute that there is value in or a need for a universal message scheme within the financial services industry, especially one that can deliver a global set of common standards based on technology as well-known, widely used and cost-effective as XML. Anyone who does business with global organizations can see the value in implementing an agreed-upon standard like this. But just because there is value doesn't mean the value is equal or equally compelling across the market. The biggest push for ISO20022 comes from Europe, which is not surprising given the highly fragmented nature of that market. Last week Kristin White provided a great overview of the uphill battle to ISO20022 adoption outside of Europe. She notes, as have others, that the people promoting ISO20022 haven't been as adept at promoting a business case that would make industry-wide investment worthwhile, particularly when no one can predict what the future will bring. Which makes it hard to imagine that people will stand up in large numbers to fight for ISO20022 adoption in their organizations today. But I could be wrong. I have another day here at SOFA tomorrow and ISO20022 is the focus of one of the sessions. Maybe it will become the industry standard for financial services sooner rather than later, but then again maybe not. What do you think?
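For readers who haven't seen ISO20022 up close, here is a small illustration of what the XML actually looks like when handled programmatically. The fragment below is loosely modeled on the pain.001 customer credit transfer initiation message; the element names and namespace follow the published pattern, but the document is heavily simplified and not schema-valid, so treat it as a sketch rather than a reference.

```python
import xml.etree.ElementTree as ET

# Illustrative fragment loosely based on the ISO20022 pain.001
# (customer credit transfer initiation) message -- simplified,
# not a schema-valid document.
SAMPLE = """<?xml version="1.0" encoding="UTF-8"?>
<Document xmlns="urn:iso:std:iso:20022:tech:xsd:pain.001.001.03">
  <CstmrCdtTrfInitn>
    <GrpHdr>
      <MsgId>MSG-2009-0001</MsgId>
      <NbOfTxs>1</NbOfTxs>
    </GrpHdr>
  </CstmrCdtTrfInitn>
</Document>"""

NS = {"iso": "urn:iso:std:iso:20022:tech:xsd:pain.001.001.03"}

def message_id(xml_text: str) -> str:
    """Pull the group-header message identifier out of the payload."""
    root = ET.fromstring(xml_text)
    return root.find(".//iso:GrpHdr/iso:MsgId", NS).text

print(message_id(SAMPLE))  # MSG-2009-0001
```

The point of the standardized schema is exactly this: once two banks agree on the message definitions, either side can extract fields like the message identifier without any bilateral mapping work.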

Read More

What Will the Next ASX Standard Be?


By ASX, I am referring to the Applicability Statements 1-4, which are popular standards for exchanging electronic documents between business partners. Each of the ASX standards has been modeled on a popular Internet protocol. The introduction of a new number in the ASX standards is not intended to replace prior versions. Instead, each release (AS1, AS2, etc.) offers features and benefits appropriate for different e-commerce applications. Companies may choose to use both AS2 and AS4, for example, in different scenarios depending upon the security profile, file size and business value of the message.

Applicability Statements #1-4

The four ASX standards introduced to date have enjoyed varying levels of success. Here is a quick recap:

AS1 – First, there was AS1, based on SMTP. SMTP, of course, is the standard utilized for e-mail exchanges. EDI documents were transmitted between companies using the SMTP protocol. AS1 was the beginning of the revolution. Although it is not used extensively today, AS1 helped to fundamentally change thinking around how the Internet could be used for B2B communications.

AS2 – Next, AS2 was introduced, based on the HTTP protocol. AS2 emulated the interactions between a web browser and a web server in order to exchange EDI documents. AS2 was wildly successful. Adoption was catalyzed by Wal-Mart's announcement that it would utilize the standard with its supplier community. AS2 remains one of the most widely used protocols for B2B communications today, especially in the US retail sector.

AS3 – The third in the series was AS3, based upon FTP, the file transfer protocol. FTP has been a popular standard for exchanging messages between companies for many years. As a result, there were high expectations for how AS3 would change the B2B communications landscape, especially in the area of large file transfer. Unfortunately, AS3 has largely been a failure from my perspective, as it has yet to achieve any meaningful adoption.

AS4 – The most recently released standard is AS4, based upon SOAP and the web services model. Although web services have become the de facto paradigm for message exchange for numerous web-based applications, there has been little adoption of SOAP for B2B communications. Early indications suggest that AS4 will enjoy more adoption than AS3. Members of the European aerospace community are piloting the standard today. However, it is unclear whether AS4 will outperform AS2.

The big question in my mind is: what is next? Will there be a subsequent release of the ASX standards, in other words, an AS5? And, if so, what network protocol will AS5 be based upon? There are a few remaining Internet standards that have not been standardized for B2B communications. A few that initially came to my mind included:

Telnet – Used primarily in LAN environments to establish a command line interface connection with a remote server.

Gopher – Named after the University of Minnesota sports mascot, Gopher is an Internet search and retrieval protocol popular before the introduction of web browsers.

NNTP – Network News Transfer Protocol was a popular standard for reading and posting articles to Usenet groups and news servers throughout the 1990s.

Given the declining popularity of these three protocols, I would suspect that none is an ideal choice for AS5. There are other application layer IP protocols such as DNS, BGP, SNMP, SIP and DHCP, but none of these are designed for the types of transmissions required for B2B e-commerce. After some thinking I finally came to the conclusion that RSS, Really Simple Syndication, would be the ideal candidate for AS5. In my experience, it rarely works with any level of consistency or reliability. Therefore it would benefit from a standard.
Furthermore, there is a growing need to transmit not just structured transactional data but free form text, images, video and audio content between businesses engaged in electronic commerce.
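To give a flavor of how lightweight the AS2 model described above really is, here is a sketch of the HTTP headers an AS2 sender attaches to a transmission, following RFC 4130. The partner identifiers and e-mail address are made up for illustration, and the S/MIME signing and encryption steps that real AS2 software performs on the payload are omitted entirely.

```python
from email.utils import formatdate, make_msgid

def as2_headers(from_id: str, to_id: str, mdn_address: str) -> dict:
    """Build the core HTTP headers for an AS2 POST (per RFC 4130).

    S/MIME signing and encryption of the EDI payload are omitted here;
    a real implementation would wrap the body before sending.
    """
    return {
        "AS2-Version": "1.1",
        "AS2-From": from_id,                    # sender's AS2 identifier
        "AS2-To": to_id,                        # receiver's AS2 identifier
        "Message-ID": make_msgid(),             # unique per transmission
        "Date": formatdate(usegmt=True),
        "Content-Type": "application/edi-x12",  # raw X12 payload
        # Ask the receiver to return a receipt (an MDN):
        "Disposition-Notification-To": mdn_address,
    }

hdrs = as2_headers("ACME-CORP", "WIDGETS-INC", "edi-admin@example.com")
print(hdrs["AS2-From"], "->", hdrs["AS2-To"])  # ACME-CORP -> WIDGETS-INC
```

The signed receipt requested by the `Disposition-Notification-To` header is a big part of why AS2 succeeded: it gives the sender non-repudiable proof of delivery over plain HTTP.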

Read More

Introducing EDInomics

If you were to survey CIOs at Global 2000 companies about their 10 most important IT applications, I can almost guarantee you that B2B e-commerce wouldn't make their list. In fact, many of the CIOs would even go so far as to tell you that they shouldn't have to pay for these technologies any more. But I find this to be a very naive view, especially when you step back and look at the bigger picture of how industries are restructuring in today's economy. Companies are becoming more and more specialized, outsourcing more and more business functions to third parties. This transformation is occurring in every industry and every business process. In the supply chain, retailers and OEMs are outsourcing functions such as design, development, manufacturing, logistics, sales and service to third parties. As business process outsourcing becomes more popular, we are seeing more finance, IT and HR disciplines being transitioned to third party specialists. This is significant because it means that companies are more dependent than ever on business partners! Consequently, enterprise systems are more dependent upon external applications to get the data they need to function. My own qualitative analysis suggests that over 50% of corporate data now originates outside the enterprise. That means if you cannot connect to your business partners, your IT systems won't function properly and you cannot run your business. Does that sound like an area to under-invest in? Yet, unfortunately, most enterprises don't take such a strategic view towards B2B e-commerce. In this blog I will explore the continually evolving role that B2B e-commerce technology plays in supporting the needs of industry. I have responsibility for the vertical industry strategy at GXS, which means that my team monitors the trends in sectors like automotive, retail, high tech, pharmaceuticals, logistics, banking and insurance.
As a global company, one of our challenges is trying to understand the differences and similarities across regions of the world such as Europe, Asia, North and South America. There is a lot of fascinating activity occurring in the B2B segment today. Whole new communities of users in emerging markets such as China, India and Eastern Europe are just starting to use B2B. In mature markets such as the US, Western Europe and Japan, corporations are using B2B not just as an instrument to cut costs, but as a growth enabler. Every day I see examples of how customers are utilizing B2B technologies to accelerate time to market for new products and to differentiate their offerings by becoming easier to do business with. I refer to this collective set of activities by which B2B is helping to improve business performance as "EDInomics." In this blog, I will share my insights on EDInomics, the primary goal being to challenge many of the current misconceptions that outsiders and, unfortunately, many insiders have about B2B, EDI and the related vendors. I think you will find that B2B is a highly underrated technology that isn't on the decline, but, in fact, has yet to peak… Steve Keifer © Copyright 2007 GXS, Inc. All Rights Reserved.

Read More

Bisync 2020 – The Case for FCC Regulation of B2B Communications

B2B communications

In my last post, I commented on the continued use of legacy communications protocols such as async, bisync and X.25 for B2B e-commerce. I have never seen an official report on the use of legacy communications in B2B integration. However, I am confident there are over 10,000 companies still using legacy dial-up connections. In this post, I want to continue the discussion by exploring the implications of the business community's continued use of these ancient technologies. I also want to explore the roles of commercial entities and government regulators in phasing out these legacy communications protocols.

Who and Why?

The primary users of async, bisync and X.25 are small businesses that established EDI programs over a decade ago and see no benefit in upgrading to newer technology. These companies are still running EDI software on a Windows 95-based PC with a dial-up connection to their VAN provider. Many of these companies running the older technology do not even know what async or bisync is. They just know that the PC sitting in the corner automatically phones home every night and magically retrieves all of their purchase orders. Why don't these customers upgrade to newer technology? Most small businesses lack the resources, budget and technology expertise to upgrade to a newer B2B communications protocol such as AS2 or FTP. Furthermore, most are reluctant to make changes to their EDI configuration for fear of disrupting electronic commerce with customers. Would you want to be the EDI manager who was responsible for losing a $2M order from Wal-Mart during a cutover from bisync to AS2?

Challenges of Legacy Communications in B2B

Does it matter that so many businesses are still using legacy communications technology for B2B? That is a subject of significant debate.
But I have listed below a few disadvantages of such pervasive use of the older technologies:

Business Disruption – In the early days of EDI, communications functionality was bundled into desktop software packages. Many of the developers of these early EDI translator packages no longer offer support for the older software. Consequently, if the older software breaks, there is a significant risk that users will not be able to quickly remedy the problem. The result could be a disruption to order flow with trading partners, which could have a ripple effect across the value chain.

Limited Functionality – Users of legacy communications technology are only able to conduct the types of B2B transactions supported by the original EDI software package. In most cases, the older software does not support the newer XML schemas introduced in recent years to automate a wider range of business processes. Consequently, the ability to develop more collaborative business processes between buyers and suppliers is constrained by legacy technology.

Outdated Information – Users of legacy communications tend to operate in an off-line batch mode. EDI documents are exchanged with the VAN once a day or sometimes once a week. Consequently, these companies receive updates to orders, forecasts, shipments and bank balances only once a day or once a week. The overall supply chain becomes less agile when companies cannot exchange order, logistics, inventory and payment data in near-real time to respond to changing market conditions.

Why Consider a Regulatory Approach?

There have been several attempts by industry to force technology upgrades, each of which has failed:

Lower Cost Substitutes – The advent of the Internet in the late 1990s introduced a number of substitute technologies that small businesses could use for EDI such as AS2, FTP and Web Forms. Despite aggressive sales efforts by vendors, there remains a significant population of small businesses unwilling to upgrade their B2B technology.

Product End of Life – Commercial service providers such as the major telecommunications carriers have discontinued support for legacy protocols such as X.25, async and bisync. However, carrier efforts have been handicapped by their large customers, which have trading partners still using the legacy protocols. These corporations are major buyers of telecom services that use their purchasing power to negotiate extensions to the end-of-life of the legacy B2B protocols.

Pricing Deterrents – A number of VAN providers have attempted to raise the prices of async and bisync dial-up services in an attempt to encourage customers to transition to more modern communications protocols. The new pricing models of VAN providers were met with considerable outrage from the end-user community. Ultimately, the service providers were forced to abandon the pricing policies and extend indefinite support periods for the older communications protocols.

With vendor-led efforts to drive technology upgrades failing, it seems that the only remaining alternative might be public policy. Or we could accept the fact that bisync will be alive and well in 2015, 2020 or maybe even 2050. Should the FCC impose an end-of-life date for legacy B2B communications protocols? Should there be government subsidies to enable upgrades to AS2 and FTP? Post your thoughts and let me know what you think.

Read More

X Marks the Spot: SEC mandate for XBRL is only the beginning

The vision of a true international financial reporting standard just got a little closer to reality. Last Thursday, the SEC announced April 13, 2009 as the effective date for the 500 largest U.S. public companies to begin to file their financial results using the Extensible Business Reporting Language (XBRL), an XML-based standard used to analyze, exchange and report information by using tagged data elements. But this announcement should come as no surprise to anyone in the financial services space, since discussions about XBRL have been ongoing for more than a decade.
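To see what "tagged data elements" means in practice, here is a simplified sketch of a single XBRL fact. The element and namespace names follow the us-gaap taxonomy pattern, but the fragment is illustrative only, not a complete or valid XBRL instance, and the revenue figure is invented.

```python
import xml.etree.ElementTree as ET

# Simplified XBRL-style instance fragment containing one tagged fact.
# The namespaces follow the real us-gaap taxonomy pattern, but this is
# an illustration, not a valid XBRL instance document.
FRAGMENT = """<xbrl xmlns="http://www.xbrl.org/2003/instance"
      xmlns:us-gaap="http://fasb.org/us-gaap/2009-01-31">
  <us-gaap:Revenues contextRef="FY2008" unitRef="USD" decimals="-6">
    108276000000
  </us-gaap:Revenues>
</xbrl>"""

root = ET.fromstring(FRAGMENT)
fact = root.find("{http://fasb.org/us-gaap/2009-01-31}Revenues")
# The tag identifies the concept; attributes carry period and units.
print(fact.get("contextRef"), int(fact.text.strip()))  # FY2008 108276000000
```

Because every figure is tagged with a concept, a reporting period and a unit, an analyst's software can compare revenues across filers automatically instead of scraping numbers out of PDF tables.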

Read More

Should the FCC ban Async and Bisync?

legacy B2B protocols

Today is February 17th, which has become the equivalent of Y2K in the television broadcast industry. Today was also the original deadline set by the Federal Communications Commission (FCC) to discontinue analog broadcast television signals throughout the US. Congress recently passed the Digital TV Delay law, which has extended the deadline to June 12th to allow consumers more time to complete the transition. While June 12th has become the new "drop dead" date, Congress specified that broadcasters are no longer obligated to continue analog transmissions during this extension period. However, 1100 of the nation's 1800 TV stations will continue their legacy transmissions to maximize advertising distribution. The FCC's ability to phase out older technology in favor of more modernized, cost-effective protocols raises a question in my mind. If the FCC can apply such a policy to TV broadcasts, should we consider enacting similar legislation for outdated B2B communications protocols such as async, bisync and X.25?

Y2K for the Television Industry

The official motivation for the US switching to digital TV is to free up wireless broadcast spectrum, which is in high demand by other user groups. Rather than using the spectrum for analog TV broadcasts, the frequencies could be allocated to municipal fire, police and emergency rescue departments to support public safety efforts. Of course, the upgrade to newer technology has numerous benefits for consumers and broadcasters as well, such as improved picture quality and a wider variety of programming options. After the cutover, older televisions, which do not have a digital receiver and are not connected to a cable or satellite service provider, will not be able to receive the new transmissions.
Consumers with older TVs will need to follow one of three courses of action to watch TV: purchase a digital converter box subsidized by the US government, upgrade to a newer model television, or subscribe to a cable or satellite service. The February deadline was extended to June due to a much higher than anticipated population of non-digital TV users. The US government originally estimated that only 15% of households received analog TV signals via free antennas. However, actual utilization of the analog broadcasts appears to be closer to 35% due to the fact that many homes have extra TVs not connected to cable or satellite TV networks.

Legacy B2B Communications Protocols

I think most government officials were surprised to learn of such a high population of legacy TV technology still in use in 2009. I suspect a similar level of disbelief would be experienced if a study of the use of legacy communications in the B2B integration sector were conducted. One would assume, with all of the e-commerce technology businesses have today such as AS2, FTP and Web Forms, that legacy technology such as async, bisync and X.25 has become virtually extinct. Unfortunately, this is not the case. For those of you not familiar with these legacy communications protocols, which is by no means something to be embarrassed about, here is a brief introduction:

Async – a communications protocol in which start and stop signals are used to encapsulate individual characters and words. Async was originally developed for use with teletypewriters in the early 20th century. Asynchronous signaling was very popular for dial-up modem access to time sharing computers and bulletin board systems during the 1970s, 80s and 90s. Here is a link to the Wikipedia page on async.

Bisync – an acronym for the "Binary Synchronous Communication" protocol introduced by IBM in the 1960s. The bisync protocol was the most popular file transfer protocol during the 1970s and 1980s. EDI applications were a primary user of bisync, as were ATMs, cash registers and radar systems.

X.25 – a Wide Area Network (WAN) protocol used with leased lines, ISDN connections and traditional telephone lines. X.25 was especially popular in the 1980s within the financial services industry to connect ATMs to banking networks. Here is a link to the Wikipedia page on X.25.

I have never seen an official report on the use of legacy communications in B2B integration. However, I am confident there are over 10,000 companies still using legacy dial-up connections such as async, bisync and X.25 throughout the US. What are the implications of the business world's continued dependency on these ancient networking standards? Can commercial entities effectively phase out these technologies on their own, or will government regulation be required?
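The start/stop framing that gives async its name is simple enough to sketch in a few lines. The function below frames a single character the way a classic asynchronous serial link would: a start bit, eight data bits sent least-significant first, and a stop bit, with parity omitted for simplicity.

```python
def async_frame(byte: int) -> str:
    """Frame one character as an async serial link would:
    a start bit (0), eight data bits least-significant first,
    and a stop bit (1). Parity is omitted for simplicity."""
    data_bits = [(byte >> i) & 1 for i in range(8)]  # LSB transmitted first
    return "0" + "".join(str(b) for b in data_bits) + "1"

# 'A' is 0x41 = 0b01000001, so the LSB-first data bits are 10000010.
print(async_frame(ord("A")))  # 0100000101
```

The framing overhead is part of why async was so slow: every 8-bit character costs at least 10 bits on the wire, before you even consider the dial-up modem underneath.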

Read More

How can MMOG/LE Improve B2B Processes Across a Supply Chain?

So what does MMOG/LE stand for and how is it used? Well, the MMOG part stands for Materials Management Operations Guideline and was initially developed by the Automotive Industry Action Group (AIAG) in North America. The LE part refers to Logistics Evaluation and was developed by the European Odette organisation. When combined, MMOG/LE provides a powerful continuous improvement tool which helps companies improve the efficiency and accuracy of their materials management processes whilst helping to reduce costs, errors and waste. By including logistics in the evaluation tool, it allows companies to improve the efficiency of their transport and distribution network. MMOG/LE has now become a recognised global standard for improving materials and logistics processes. AIAG and Odette have developed a self-assessment tool which allows companies to evaluate their own internal business processes and identify gaps. The tool has been successfully used by major OEMs such as Ford, Chrysler, PSA and Renault as a way to evaluate both new and existing suppliers and make sure that their business processes comply with the high standards expected by the OEMs. In some cases suppliers must agree to take the MMOG/LE self-assessment as part of their potential contract with an OEM. In particular, OEMs use the assessment with their suppliers to:

Understand management commitment

Look at capacity planning of the supplier

Implement EDI integration to ERP and automatic processing of data

Measure the ability to interface with Tier 2 suppliers, especially those in emerging markets

Develop an improvement plan and measure quality levels

MMOG/LE has been in use since 2004 and many companies have seen significant improvements, including an 85% reduction in premium freight, 80% reduction in obsolescence costs, 43% reduction in inventory carrying costs and a 20% reduction in data entry time.
The self-assessment looks at six different areas of a supply chain process and comprises over 200 different questions. The six chapters of the assessment look at strategy and improvement, work organisation, capacity and production planning, customer interface, production and product control, and the supplier interface. Depending on responses, companies are then given one of three grades. The aim of the assessment is to identify gaps in any key business processes, and this then allows the suppliers to take corrective action to improve and fill in the gaps. More importantly, suppliers will uncover critical areas where automation and new IT systems can significantly increase plant efficiency and streamline processes. MMOG/LE essentially ensures that information flow follows materials flow. Now from an information flow point of view, it is clear that B2B has a role to play here to help suppliers address some of the gaps within their MMOG/LE assessment: for example, improving global connectivity and electronic document support, improving visibility of shipments and orders, providing integration between B2B and ERP, providing supplier enablement and improving availability of service. In fact, the MMOG/LE assessment helps to confirm what we have known for years: that B2B solutions and services have a key role to play in improving an OEM's supply chain. MMOG/LE has been so successful in helping improve business processes and information flow in the automotive industry that other industry sectors such as healthcare and retail have started to use the assessment tool as well. With the current downturn in the global economy, I believe MMOG/LE will be used by many more companies in 2009 as a way of not only improving the efficiency of their supply chains but, more importantly, identifying areas where manual processes can be replaced with automated processes. In addition, MMOG/LE helps to identify areas where costs can be reduced, which in this tough economic climate is not a bad thing.
B2B solutions and services can certainly help with automating a lot of manual business processes and for this reason I will be spending some time over the next few months taking a closer look at how B2B solutions can help resolve some of the issues and potential gaps highlighted by the MMOG/LE assessment process. I will report back on my findings very soon, so watch this space. In the meantime, if you would like further information on how your company can use the assessment tool then please visit the AIAG or ODETTE websites.
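As a thought experiment, the gap-analysis idea behind a weighted self-assessment can be sketched in a few lines of code. To be clear, this is not the official MMOG/LE scoring algorithm; the weights, thresholds and pass conditions below are hypothetical, chosen only to illustrate how weighted questions and critical-gap checks might combine into one of three grades.

```python
# A toy sketch of a gap-analysis scorer in the spirit of the MMOG/LE
# self-assessment. The real tool weights its 200+ questions by
# criticality and assigns one of three grades; the thresholds and
# weights below are hypothetical, purely for illustration.

def classify(answers):
    """answers: list of (weight, met) pairs, weight in {1, 2, 3};
    met is True when the criterion is fully satisfied."""
    total = sum(w for w, _ in answers)
    earned = sum(w for w, met in answers if met)
    pct = 100.0 * earned / total
    # Any failed highest-weight question blocks the top grade.
    critical_gap = any(w == 3 and not met for w, met in answers)
    if pct >= 90 and not critical_gap:
        return "A", pct
    if pct >= 75:
        return "B", pct
    return "C", pct

grade, score = classify([(3, True), (3, True), (2, True), (1, False)])
print(grade, round(score))  # B 89
```

Even in this toy form, the design point is visible: a supplier cannot buy its way to the top grade on easy questions alone, because a single failed critical criterion caps the result.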

Read More

Managing Cash–Visibility is Key

With Thanksgiving and Black Friday now behind us, the Holiday Season has officially kicked off in the States. For consumers, corporations and even for banks, the next few weeks will focus not just on what you can buy but also on how much you have to spend, save and loan at the end of each day. Now I have a fairly extensive gift list (big family = big holiday budget). But like most people, I am thinking before I buy, and more often than not I'm paying with cash and not credit. Because come January, I want some money in my pocket. So all through the holiday season, probably every day because again I have a lot of gifts to buy, I will be checking my online banking statement to make sure that I haven't spent more than I can see. And according to a recent survey of U.S. Treasurers, I'm not the only one thinking that seeing is definitely believing.

Read More

OAGi and CIDX – The Long Tail Gets a Little Shorter

B2B standards

Earlier this month CIDX, the Chemical Industry Data Exchange organization, voted to transition its operational and governance roles to the Open Applications Group (OAGi) and the American Chemistry Council (ACC). CIDX will transfer its intellectual property to OAGi, which will create a new Chemical Industry Council to manage the standards going forward. I view this as a positive development for the B2B e-commerce sector. There are far too many e-business standards in the marketplace today. While these standards can create significant efficiencies and competitive advantages within vertical industries, they complicate matters for trading partners who need to collaborate across industry sectors. Rationalizing the standards efforts under a common governance and organizational model will promote more interoperability and ultimately wider adoption of standards.

More about CIDX

I do not have a background in the chemical industry so I cannot comment extensively on the use of CIDX within the sector. The CIDX web site offers very little information about its practice to non-members, despite advertising the standards as "free." However, I did find information on the history, membership, financials and business processes supported by CIDX:

History – CIDX was founded in 1985, but the most interesting work to me began in 2000 when the Chem eStandards were developed for XML-based buying, selling and delivery of chemical products.

Membership – includes industry leaders such as BASF, Bayer, Celanese, Dupont, Eastman Chemical, Shell Chemical and Rohm & Haas.

Business Processes – are very similar to other industries: RFQ, Catalog, Purchase Order, Credit upon Proof of Sale, Financials, Logistics and Forecasting.

Financials – CIDX generated $860K in revenue with a net income of approximately $40K in 2006, according to its annual report.
OAGi – The Long Term Winner in the XML Standards Race

I read a Forrester report recently which stated that "industry initiatives like OAGi have failed to deliver consensus on any one set of standards…" I would have to respectfully disagree with this opinion. If anyone is having success linking the efforts of XML-based standards across industries, it is OAGi. The CIDX announcement is an excellent example. In my opinion, OAGi is a clear contender for the short head segment of the B2B standards curve I refer to as the Long Tail. OAGi has been active in the automotive sector for many years now and has recently taken on a more prominent role in the high tech and electronics sector as well. Industry insiders tell me that OAGi could potentially replace 1SYNC's RosettaNet division as the primary standards organization for high tech. OAGi is even active in the financial services sector, collaborating with SWIFT and others on the new ISO 20022 XML standard for payments, securities and cards transactions. Another reason I think OAGi is poised to be the long term winner in the XML space is the reputation it enjoys in the marketplace. Some standards organizations have a reputation for being highly political and bureaucratic. However, I hear nothing but good things about OAGi from its participants.

Promoting Cross Industry Standards

CIDX is a good match for OAGi, as it has always shared a similar vision of cross-industry standards. Even before the recent announcements with OAGi, CIDX worked closely with other industry associations such as papiNet in the forestry sector, GUSI in the consumer products industry and RAPID in the agriculture community. Under OAGi's leadership, the opportunity for collaboration with other industry sectors should be accelerated, enabling even faster growth of XML.

Read More

It’s Not Easy—Green Banking Continues to Grow

green banking

"It's not easy being green."  When this tune started floating around my head for this post, I thought Kermit the Frog had first sung these words about twenty years ago. Imagine my surprise when I checked Wikipedia and found out the song is over forty years old. And what was once a frog's simple lament about growing comfortable in his own somewhat hunter green skin has, over the years, been co-opted by the green movement as a reminder of the difficulties of living green, with Kermit's blessing of course. So for the purposes of this post, I'm going to focus on the environmental Kermit, my earliest exposure to and hero of the green movement. Kermit the Frog and the Ad Council anti-pollution ad that featured the crying Native American made a big impression on me, and I'm not the only one. For many people of my generation, these impressions spurred us to action both then and now. I've been an avid recycler for more than a decade, but I remember organizing a neighborhood clean-up when I was still watching the Frog on a daily basis. Somehow I think some of the banking decision makers who are helping to implement green initiatives have these same cultural references working in their subconscious too. This might help to explain why, despite the economic upheaval, so many financial institutions and banks are moving forward with their green initiatives. Citi announced the opening of a 305,000-square-foot facility to house its computer systems and components in Georgetown, Texas. This facility will employ about 50 people and is only the first step that Citi is taking toward more environmentally friendly data centers. Other organizations are taking similar steps to help eliminate paper waste and reduce their carbon footprint. While it would be great to think that the focus on eco-friendly operations is driven only by altruism, the truth is that it makes good business sense as well.
Let’s take a look at my own employer, which recently formed an alliance with Verizon Business targeting transactions among vendors, partners and customers. The companies said that by automating the estimated 40 billion business-to-business transactions conducted worldwide each year, companies would cut 2.3 billion pounds of CO2, the equivalent of removing about 200,000 cars from the road for a year. According to the Aberdeen Group, a company that processes 500,000 invoices per year can cut manual process costs by 60 percent or more if automation is implemented. For financial institutions that support wholesale clients, this green trend toward cost savings and streamlined operations can translate into additional capital to fund new growth. Good business returns and good press are compelling reasons for any company to go green. More and more banks are “Finding New Opportunity in Green Banking.” And I’d like to think that the little green frog is playing a part in these decisions. What about you? Did Kermit influence you to take some green action? If not Kermit, then who, and what type of action are you taking?

Read More

Green Coffee XML and the Long Tail


The Green Coffee Association’s XML standard offers an excellent example of the Long Tail of B2B Standards. The standard automates a highly specialized set of business processes within a niche industry subsector. Tremendous benefits can be derived by market participants as a result of the flexibility offered by the wide variety of tendering, payment, pricing and performance management terms that can be modeled in the XML. Such benefits would not be practically achievable with a more generalized standard such as EDI.

Benefits of Green Coffee XML

- Lower Days Sales Outstanding (DSO) for sellers of coffee bean products, achieved by creating an electronic audit trail of commercial transactions that reduces the likelihood of post-shipment quality and invoicing disputes
- Improved order fulfillment rates, accomplished by routing contracts directly from buyer procurement applications to seller order management systems, thereby obviating the need for error-prone human interactions
- Reduced total landed costs for transportation, realized by standardizing and digitizing the import and export processes that often delay shipments at international borders and result in unexpected penalties or fines

Challenges of Green Coffee XML

Specialized e-commerce standards such as Green Coffee XML offer unparalleled levels of automation and efficiency within a particular market sub-segment. However, such niche XML standards create challenges for market participants whose core business utilizes a different e-commerce framework. Green Coffee XML offers an all-inclusive solution for the agricultural businesses that grow the coffee beans and the specialized brokers who act as middlemen in sales transactions. However, the standards complicate e-commerce scenarios for other market participants such as consumer products manufacturers, food service retailers, government trade ministries, transportation providers, commercial insurers and banking institutions.
Consumer Products Manufacturers

Brand owners such as Kraft and P&G who purchase large quantities of green coffee beans can benefit from the GCA standards during procurement processes. However, Green Coffee XML is not the only standard utilized by consumer products manufacturers. Other non-coffee suppliers of ingredients, raw materials or packaging materials are adopting the GUSI XML standards. Retail customers expect their suppliers to exchange information in the EDI document standard via AS2 Internet transmission protocols. Furthermore, retailers expect product branding, pricing, packaging, promotion, taxation and regulatory data to be transmitted using the Global Data Synchronization (GDS) standards. A wealth of information about coffee bean quality can be exchanged using GCA XML, but the data must be transformed into GDS XML formats for distribution to downstream retailers.

Banking Institutions

Financial institutions provide risk mitigation and working capital solutions to coffee buyers and sellers. Examples include letters of credit and post-export supply chain finance. Cost-effective and timely processing of such services requires electronic communication with the buyer and seller. Consequently, banks engaged in coffee-related transactions must either embrace the GCA XML standards or develop a process for mapping data to and from their own preferred standards. Financial institutions utilize the SWIFT FIN MT standards for international trade processing. Additionally, a new set of Trade Services Utility (TSU) standards is being developed for supply chain finance.

Transportation Vendors

Ocean, rail and ground freight carriers as well as third party logistics providers offer a variety of transportation, warehousing, freight forwarding and customs clearing services to buyers and sellers of coffee products.
The cost-effective and timely processing of coffee-related shipments necessitates electronic communications between buyers, sellers and government trade ministries. Consequently, transportation vendors engaged in coffee-related shipments must either embrace the GCA XML standards or develop a process for mapping data to and from their own preferred standards. Transportation providers utilize higher volumes of EDI than any other e-commerce standard. However, due to the broad range of industries serviced by logistics providers, a myriad of standards including AS2, RosettaNet, Odette, STAR and OAG are being embraced by transportation vendors. The emerging multi-standard model in the transportation industry is customer-friendly, but costly and complex for the carriers, who must manage highly customized e-commerce infrastructures. Similar challenges exist for other trading partners involved in coffee-related transactions.

Government Trade Ministries

Trade ministries are charged with monitoring the import and export of coffee products across their borders. Imports and exports must be properly classified to ensure the appropriate taxation of goods. Incomplete or inaccurate documentation will result in delayed processing and financial penalties to either the buyer or supplier. To expedite trade processing, governments support electronic interfaces with importers and exporters of goods. However, government ministries focus their e-commerce capabilities on broader, industry-neutral standards such as EDI or ebXML. Such an approach creates challenges for long tail standards adopters such as green coffee bean buyers and sellers.

Commercial Insurers

Insurers offer policies that compensate the buyer or seller of a coffee-related transaction for losses, damages or theft that occur during the transportation of goods from origin to destination. Such losses might occur from inclement weather, collisions, stranding, pirates or acts of war.
Damages might be caused by seawater, fire, smoke or chemical contact with the coffee goods. Insurers require documentation of the shipment contents and value in order to underwrite a marine coverage policy. To expedite the processing of the policy, insurers offer electronic interfaces for submitting documents such as shipment advices, bills of lading and packing lists. Similarly, insurers can distribute insurance certificates electronically to a buyer, seller, shipper or banker facilitating the trade. Insurers utilize specialized standards to support underwriting, policy rating, claims processing and customer billing processes. For example, the ACORD standards offer a highly specialized set of transactions designed to orchestrate insurance transactions. Both ANSI X12 EDI and EDIFACT offer insurance document sets as well. Varying standards between insurers and their customers create challenges for all parties. Green Coffee XML offers tremendous return on investment and unparalleled levels of efficiency for producers and brokers of coffee. However, the value proposition becomes substantially diluted for other market participants whose core businesses utilize alternative e-commerce frameworks. Consequently, there are tradeoffs to consider when evaluating whether to adopt specialized, industry sub-sector-level e-commerce standards or to rely on less robust but more widely adopted standards such as EDI. As cross-industry trade continues to grow, there will be an increasing level of tension between competing e-commerce frameworks.
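To make the mapping burden concrete, here is a minimal sketch of the kind of transformation a bank or carrier might perform on an inbound contract message. The element names, field names and sample values are purely illustrative assumptions; the actual GCA schema and any institution’s internal record layout will differ.

```python
import xml.etree.ElementTree as ET

# Hypothetical GCA-style contract fragment. Element names and values are
# illustrative only, not taken from the real Green Coffee XML schema.
GCA_CONTRACT = """
<Contract>
  <ContractNumber>GC-2008-1142</ContractNumber>
  <Buyer>Acme Roasters</Buyer>
  <Seller>Andes Coffee Brokers</Seller>
  <Quantity unit="bags">250</Quantity>
  <PricePerUnit currency="USD">142.50</PricePerUnit>
</Contract>
"""

def map_gca_to_internal(xml_text):
    """Flatten a GCA-style contract into the internal record a bank or
    carrier system might expect (field names are assumptions)."""
    root = ET.fromstring(xml_text)
    qty = root.find("Quantity")
    price = root.find("PricePerUnit")
    return {
        "reference": root.findtext("ContractNumber"),
        "counterparties": (root.findtext("Buyer"), root.findtext("Seller")),
        "quantity": int(qty.text),
        "unit": qty.get("unit"),
        "unit_price": float(price.text),
        "currency": price.get("currency"),
        # Derived field: total transaction value for credit/underwriting use.
        "total_value": int(qty.text) * float(price.text),
    }

record = map_gca_to_internal(GCA_CONTRACT)
print(record["reference"], record["total_value"])
```

Every niche standard a trading partner supports implies one more map like this to build, test and maintain, which is exactly why the value proposition dilutes for participants whose core business runs on a different framework.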

Read More

What Can Domino’s Pizza Teach Us About Supply Chain Visibility?

supply chain visibility

Last weekend we had some friends over to visit and rather than cooking we decided to place an order for pizza delivery from Domino’s. Ordering a pizza today is amazingly convenient. We didn’t even have to pick up the phone. Instead we placed the delivery order online using the Domino’s web site and a credit card. The fact that you can order a pizza online should probably come as little surprise in today’s increasingly Internet-centric world. What did surprise me, however, was the new Domino’s Pizza Tracker site, which provides a step-by-step update on the status of your order. Using the phone number that you used to place the order, you can log in to a graphical interface that monitors the progress of your delivery. The web site also displays the street address of your delivery, the complete contents of your purchase and the exact time of your order. The tracker shows five different steps throughout the food preparation and delivery process:

- Order placement
- Food preparation
- Baking
- Boxing and packaging
- Delivery en route

My tracker even told me the name of the driver who was coming to our house! All that was missing was a link to the driver’s MySpace page so you could learn more about them before they arrived. Domino’s launched this new tracker service on January 30th. I am not sure exactly how it works, but the system somehow ties the local operations systems in each store to the B2C web site on a real-time basis. One thing I will bet on is that it doesn’t use RFID. The tracker site is one of the more customer-friendly web experiences I have seen in recent years. In fact, there is even a survey that consumers can fill out after the delivery to comment on their experience. So as I ate my pizza, I started thinking about the idea of providing customers real-time visibility into the status of their orders.
And I began to relate this to some of the customer visits I have conducted recently with multi-national corporations in the automotive, high tech and retail industries. One thing that consistently surprises me is how little visibility many of the largest companies have into inventory moving through their supply chains. Most of the order management portals large manufacturers provide their customers don’t come anywhere close to the supply chain visibility that Domino’s offers its consumers. So I ask you – if the average Domino’s franchise with an employee base of 10 and annual revenues of $580K can notify me of the exact time that a pizza is boxed and taken out the door for delivery, why can’t a multi-billion dollar manufacturer of airplane parts tell its customers when a $50M shipment of goods has arrived at a port and cleared US customs? The good news is that many manufacturing leaders seem to be recognizing the challenges associated with lack of supply chain visibility. We have seen a recent surge in interest amongst large manufacturers seeking to provide their key distributors and customers with better visibility into outbound shipments. The highest concentration of demand seems to be with industrial goods manufacturers who are seeking better quality data on ocean freight traversing long distance trans-Pacific routes. Consumer products companies are slightly ahead of their peers in the automotive, high tech, aerospace and industrial machinery sectors when it comes to supply chain visibility. In the industrial sector, it is not uncommon to find that neither buyer nor supplier has visibility into the location and status of shipments once they leave the manufacturing plant. Interestingly, this situation applies even to high value goods, including consumer electronics such as high definition plasma televisions, kitchen appliances such as refrigerators and computing equipment such as robotic tape backup libraries.
Not only are these expensive goods targets for theft and counterfeiting, but many of them can be easily damaged without proper handling. Due to the value of the goods, suppliers often purchase cargo insurance to protect against these types of threats. Also due to the value of the goods, many suppliers seek out third party financing for their inventory in transit. The financing frees up working capital that would otherwise be tied up in long distance supply chains. So the point here is that we have multi-million dollar shipments of industrial goods insured and financed by third parties and no one other than the transportation provider holding the actual inventory seems to know the location at any point in time. Should industrial manufacturers hire pizza delivery persons to help them?  That is probably unnecessary. But they should consider investing further in transportation management applications, particularly in logistics visibility suites that offer customers insights into the status of inbound freight.
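The milestone-based visibility described above can be sketched as a simple state model. This is a hypothetical illustration only: the stage names, class and field names are my own assumptions, and a production logistics visibility suite would add carrier feeds, exception alerts and much more.

```python
from datetime import datetime

# Milestones, in order, for a hypothetical freight tracker modeled on the
# five-step pizza tracker described above (stage names are illustrative).
STAGES = ["order_placed", "picked_up", "in_transit", "customs_cleared", "delivered"]

class ShipmentTracker:
    """Record timestamped milestone events for one shipment and report status."""

    def __init__(self, shipment_id):
        self.shipment_id = shipment_id
        self.events = {}  # stage name -> timestamp

    def record(self, stage, when=None):
        # Accept only known milestones; timestamp defaults to "now".
        if stage not in STAGES:
            raise ValueError("unknown stage: " + stage)
        self.events[stage] = when or datetime.now()

    def current_stage(self):
        # The furthest milestone reached so far, in the defined order.
        reached = [s for s in STAGES if s in self.events]
        return reached[-1] if reached else "not_started"

    def progress(self):
        # Fraction of milestones completed, e.g. to drive a progress bar.
        return len(self.events) / len(STAGES)

tracker = ShipmentTracker("PO-8841")
tracker.record("order_placed")
tracker.record("picked_up")
tracker.record("in_transit")
print(tracker.current_stage(), tracker.progress())  # in_transit 0.6
```

The point of the sketch is how little machinery is required: once each handoff in the chain emits an event, a buyer-facing portal can answer “where is my shipment?” the same way the pizza tracker does.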

Read More

Battle of the Supply Chains

supply chain

One of the industry associations we have been working with recently is the Global Supply Chain Forum sponsored by Stanford University. The forum comprises representatives from many of the world’s largest manufacturing companies as well as some of Stanford’s leading faculty, such as supply chain thought leader Dr. Hau Lee. Dr. Lee has introduced a number of revolutionary ideas over the past few years, but one particular insight stands out in my mind: “Instead of company-to-company competition, we are now in an era of supply chain-to-supply chain competition.” This is a concept that I think becomes more critical with every day that goes by. To illustrate the point, let us examine the high tech industry, and more specifically the sub-sector that manufactures computers and related peripherals. This is a relatively young sector, first established in the 1960s and 1970s, yet during its short history the supply chain model has undergone a radical transformation.

Mainframe Value Chain

When the first mainframes were introduced, a single vendor often functioned as the sole source for all computing needs. OEMs such as IBM and Honeywell manufactured not only the finished mainframe product but most of the components as well, including the memory, storage (DASD) and processors. The operating system, database and even some applications were developed by the same vendor who manufactured the hardware. If the mainframe broke or needed an upgrade, the hardware OEM provided the repair and service.

2008 PC Value Chain

Contrast the mainframe model with the complex, multi-tiered value chain in today’s computer industry. I work on an “IBM Thinkpad.” However, while the logo on my laptop says IBM, the manufacturer of the machine is actually a Chinese company, Lenovo. Although Lenovo is the OEM, it only contributes a small fraction of the content of the laptop.
The components inside the laptop are sourced from third party suppliers (Kingston for memory, Seagate for storage, Intel for microprocessors). Also noteworthy is the fact that Lenovo does not typically sell the machine directly to end users. My laptop was purchased through our company’s preferred distributor, CDW. The software on the machine is made by another group of specialized companies. Microsoft publishes the Windows operating system and Office application suite. Other software vendors such as Adobe, Symantec and Apple provide applications for document viewing, desktop security and digital music. And when my laptop breaks, who do I call? Not Lenovo, but a third party such as a high tech distributor, third party logistics provider or a contract manufacturer for warranty support and repair. The point here is that the computer industry has migrated from a vertically integrated model to a highly specialized, heavily outsourced model. This type of model, in which OEMs outsource much of the manufacturing and supply chain management to suppliers, is growing more common in all discrete manufacturing sectors. Examples can be found not only in high tech, but also in aerospace, automotive, consumer products and industrial equipment.

Supply Chain versus Supply Chain

The key take-away from the discussion above is that OEM manufacturers are increasingly dependent upon a community of outsourcing partners to achieve success. Factors that can go wrong (and do go wrong) are, in many cases, completely out of the control of the OEM. In these new value chain models, companies are not actually competing with other companies; instead, their supply chains are competing with other supply chains. This crucial concept, first introduced by Dr. Lee, is critical for channel masters in today’s supply chains to understand.
However, while it may seem obvious, the majority of today’s leading retailers and manufacturers continue to structure models that prioritize the near-term financial performance of their own company above the overall long-term competitiveness of their supply chains. The term “partner” is used ever more frequently to describe suppliers in a value chain, yet the approach of most channel masters remains more adversarial than collaborative. The largest exception is, of course, the Japanese manufacturing community, which has structured itself around keiretsu relationships between OEMs and key suppliers. Consider the following “company-centric” paradigms that are becoming more commonplace in today’s supply chains.

Performance Scorecards and Penalties – Retailers and manufacturing OEMs have instituted elaborate chargeback mechanisms that penalize suppliers for problems arising during routine order fulfillment. Not only are these penalties designed with the goal of optimizing the buyer’s business processes, but each retailer and manufacturer has different measurement criteria. As a result, suppliers are forced to comply with terms such as delivering during tightly monitored 2-hour receiving windows and labeling pallets with customer-specific serialized barcodes and text. While these processes simplify receiving for the buyer, they add cost and complexity for the supplier and friction to the overall relationship.

Open Account – Large buyers are moving from their traditional letter of credit processes with overseas suppliers towards open account models. The goal of the migration is to reduce banking fees for the buyer, but in many cases the side-effects for suppliers are significant. Without a bank-guaranteed letter of credit to use as collateral for short term financing, suppliers struggle to fund raw materials purchases, manufacturing plant payrolls and other operating expenses.
Extended Payment Terms – In an effort to hold on to cash longer, buyers are extending payment terms with suppliers to periods of 60 or 90 days. Extended terms create a cash flow issue for suppliers, who must now seek out short term loans to fund their operations. For smaller suppliers with lower credit ratings, these expensive short term loans compromise profit margins and increase the overall cost of goods sold.

Vendor Managed Inventory – More and more customers are looking for their suppliers (or a third party) to hold title to inventory until the point of consumption or sale to the end-customer. Buyers prefer these models because they shift the inventory carrying costs to the supplier’s balance sheet, along with the risk of product obsolescence and retail shrinkage. For high volume channels, large suppliers can benefit from the added demand visibility and end-customer insights available through a VMI program. However, for many buyer-supplier relationships the risks and costs are heavily unbalanced in favor of the customer.

What do suppliers, as valued partners in the relationship, receive in exchange for these terms? Buyers will offer appealing terms to suppliers willing to engage in customer-centric business processes:

- Greater share of a customer’s wallet as the supplier becomes the preferred vendor for a particular product line
- Broader scope of services that may include value added services that increase the average revenue per unit sold

Suppliers must weigh the pros and cons of such arrangements to determine their best strategy. Often the tradeoff is a choice between revenues and profitability.

What are the EDInomics of supply chain to supply chain competition?

B2B integration technology can be the key to unlocking the potential of collaborative relationships in a value chain. B2B can be used to enable a variety of strategies such as multi-echelon demand visibility, collaborative product development and third party supply chain finance.
But the technology is rendered ineffective unless the channel master in a relationship has a long-term, supply-chain-wide perspective on its activities. Unhealthy suppliers introduce performance drag, cost overhead and higher risks to the overall supply chain. While these factors may not be visible in the buyer’s next quarterly income statement, they will most certainly define the buyer’s long term success. After all, as Dr. Lee states, “The weakest link in the supply chain defines the supply chain.”

Read More