
So What Was Driving B2B in 2011? Top 10 Blog Posts

I have been a GXS blogger for four years now, and during 2011 I posted nearly forty blog entries covering a range of B2B and manufacturing related topics. Over this time Google Analytics has provided me with some interesting feedback on which blog posts were being read the most, which keywords were used to find the posts in the first place, where my readership is located around the world and which other industry trends were helping to drive visitors to our blog site. The great thing about Google Analytics is that we can see what interests the readers of our blogs, and this helps us to prioritise the various marketing projects that we conduct throughout the year. The manufacturing industry went through some difficult times in 2011, especially as a result of the natural disasters in Japan and Thailand. This inevitably provided me with some blog content during the year, but what else was capturing the interest of visitors to my blog in 2011? Well, let's find out!

1. Which B2B Trends will the Manufacturing Industry Focus on in 2011 – The year started with me providing an overview of the key B2B trends which I thought would be on the minds of CXOs across the manufacturing sector during 2011.

2. How will Cloud Computing Benefit the Manufacturing Industry? – Cloud computing was one of the more important IT trends of 2011. The global nature of the manufacturing industry, combined with the simplified way in which cloud computing allows manufacturers to manage their B2B environments, meant that this became one of the most popular blog topics of 2011.

3. Which Technology Trends will Drive Growth in the High Tech Industry in 2010 – The high tech industry in 2010 was starting to undergo a major restructuring to accommodate new consumer-driven trends such as tablet devices. The tablet single-handedly shook up the PC market in 2010 and into 2011. This blog looked at how tablet devices and cloud computing would impact both the consumer and enterprise markets in 2010, and these trends continued to impact the high tech market in 2011 as well.

4. Build-to-Order or Build-to-Stock? – This blog entry has been incredibly popular since I originally posted it in 2009. Once again, changing consumer demand for more customized or tailored products has meant that manufacturers have had to change the way in which they source goods and manufacture them. In the automotive sector, for example, volume car producers such as GM and Ford were having to embrace the sourcing and production best practices used for years by premium car manufacturers such as BMW and Mercedes.

5. Geely Acquires Volvo Cars, What Will This Mean for the Global Automotive Industry? – In 2009 the Chinese car company Geely made an audacious acquisition of Volvo Cars. This immediately positioned Geely on the world automotive stage, but what plans would they have for Volvo, and how would the Chinese automotive industry in general benefit? Global car manufacturers continued to invest billions of dollars in China in 2010 and 2011, and at the time this blog attempted to highlight some of the key benefits that Geely would obtain through the acquisition.

6. Where Can I Charge My Electric Car? – The electric car industry started to gain momentum in 2011 with the launch of two key electric vehicles, the Nissan Leaf and the Chevrolet Volt. Even though car manufacturers have been keen to announce plans for numerous electric vehicles, there is one problem: the charging infrastructure to support these vehicles hardly exists. This blog highlighted some of the key players in the electric vehicle charging market and how they would have to work closely with the manufacturers to roll out their infrastructures around the world.

7. B2B Challenges Across the Manufacturing Industry – Global expansion, increased use of contract manufacturers and improving visibility of global shipments were some of the key business challenges faced by the manufacturing industry in 2010, and these issues continued into 2011 for many companies. There are many B2B tools and services available today to help companies address these challenges, and this blog helped to highlight some of them.

8. Five Reasons Why Cloud Computing Helps to Develop Greener Supply Chains – Much was discussed about cloud computing in 2011, not just in my blog posts but across the wider IT industry in general. Many business benefits of cloud computing have been identified; however, many companies were unsure how cloud computing could help boost their green credentials. This blog helped to highlight some of the green benefits that could be obtained through the adoption of cloud computing.

9. B2B Adoption Across the Global Steel Industry – The global steel industry has come under significant cost pressures in recent years, and many steel companies have had to restructure their operations to remain competitive in the market. There has also been extensive consolidation across the industry, and this has led to a need to update out-of-date or legacy IT infrastructures. This blog entry illustrated how B2B technologies are used across the steel industry and how they can help to address these issues.

10. How Will Tablet Devices Impact the High Tech Industry in 2011? – Increasing demand from consumers completely changed the dynamics of the traditional laptop and notebook PC markets in 2011. Tablet devices, especially Apple's, have captured significant market share and changed the traditional structure of today's high tech supply chains. Apple has been keen to sign a number of exclusive supply deals with, for example, battery and screen providers, and this has changed the dynamics with other high tech industry suppliers such as Intel and Toshiba. This blog highlighted how the tablet market was impacting the high tech supply chain.

So there you have it, my top ten most read blog posts from 2011. I hope you found the content useful, and I look forward to providing more insights into what is driving B2B across the manufacturing sector as we progress through 2012 🙂


Considerations for Deploying a Cloud Integration Platform

Over the past twelve months I have posted a number of blog entries relating to cloud-based B2B integration, and I will summarise these posts in a follow-up blog. In the meantime I thought it might be useful to highlight some of the more general things that need to be taken into consideration when looking to deploy cloud-based B2B environments. Much has been written about cloud-based environments, so I thought I would spend a few moments giving my own thoughts on what needs to be considered when deploying a cloud integration platform for the first time.

Don't try and do too much at once – Moving to a cloud-based environment can bring many benefits to a company, but where do you start, and how do you avoid trying to 'boil the ocean', so to speak? Before undertaking a cloud-related B2B project you need to think about what you want to move to the cloud environment. For example, are you looking to move your invoicing process to the cloud to simplify the way in which you work with trading partners in multiple countries? Do you need to improve end-to-end visibility across your supply chain? Due to a lack of internal skills, are you looking to move a complex back office integration process to the cloud? Are you looking to onboard trading partners in a particular country such as India or China? Or are you looking to move an entire business process such as reverse logistics into the cloud? To ensure that your cloud project is a success and to realise a quick ROI, it may be easier to move one application or business process to the cloud at a time rather than trying the all-or-nothing approach. For example, let us say that in 2012 you set your company the task of moving your invoicing process to a cloud-based infrastructure. Once the project has been implemented and is regarded as a success, you can then scale up your cloud-based B2B platform to take on another task, for example introducing a cloud-based returns or reverse logistics project.

Think about 'cloud participation' – The success or failure of a cloud-based project will depend on how successful you are at encouraging internal and external users to use the new platform. For example, what are the security implications of allowing external trading partners to have access to your cloud platform? How will remote workers get access to the platform? Will you need to provide or upgrade mobile/tablet devices in order to allow users access to the new platform? Will you need to offer multi-language support? How will users be supported if they encounter problems with the cloud platform? How will you train users in the use of the new cloud-based application? Do you have any checking procedures in place to ensure that user-entered data is accurate? If you can take the aforementioned points into account, it will certainly help to simplify the deployment of your new cloud platform. The cloud also allows users to interact with information in a totally different way, for example using app-based tablet devices such as Apple's iPad. This is likely to raise yet another important question: does your company have a mobile B2B strategy? If you can find a way to improve the user experience then there is more chance of your cloud project being a success.

Connecting with other business applications – Cloud-based environments can work well in isolation, but to ensure maximum ROI on a cloud platform deployment you will need to consider which other back office or enterprise systems will need to connect to the cloud platform. For example, you may have spent a high proportion of your budget deploying a cloud platform to allow external trading partners to send you information electronically, but what if you then need to get that information into an ERP system? An earlier research project sponsored by GXS highlighted that on average 34% of information entering an ERP system comes from outside the enterprise. Therefore, ensuring that information can flow seamlessly between your cloud-based B2B platform and your ERP system should be a high priority at the planning stage of your new cloud platform. What other systems will need to be connected to your cloud platform? Transport or warehouse management systems? Third party applications such as RFID or bar code scanning systems? Or perhaps accounting packages? As with deciding what to deploy in the cloud, think about what you need to integrate with. Do not try and integrate with all your applications at the same time; come up with a project plan for which business applications need to be connected to the cloud platform and what level of integration is required. Once again, think small and integrate your business applications gradually to ensure that the operation of mission-critical business processes is not impacted in any way.

Understand the costs involved – Have you ever undertaken an in-depth analysis of how much it costs to run a behind-the-firewall software installation? Software licenses, maintenance updates, server and associated network management costs, IT personnel to manage the infrastructure: the costs escalate very quickly. A traditional software environment is regarded as a normal capital expense: servers and software have to be purchased up front, and software/hardware are carried as a long-term capital asset. Meanwhile a cloud-based environment is regarded as an operating expense: software is used on a pay-as-you-go basis and, most importantly of all, nothing appears on the balance sheet. Now, I have only scratched the surface here with the types of costs involved in running a software versus a cloud-based environment, but it would be worth spending some time evaluating your existing infrastructure, and I think you will be surprised at how much it costs to maintain on a day-to-day basis. If you understand the full costs involved in running a normal software environment, then you may start to ask why, from a financial perspective, you did not consider using a cloud-based B2B environment before.

Make sure you choose the right cloud integration provider – The key to successfully deploying a cloud-based infrastructure is ensuring that you have chosen the correct third party cloud provider to work with. Over the past year a number of cloud solution providers have entered the market; however, many so-called cloud providers have merely 'cloud washed' their existing software portfolio to give the impression that they are a cloud provider when they are not. There are also providers who offer a cloud solution for supply chain visibility, others who provide e-invoicing and others providing collaboration-type tools. Do you really want to work with different cloud providers when moving your B2B platform to the cloud, each possibly offering a different level of service, with no integration between the cloud applications and different structures for how these cloud platforms are managed and paid for on a monthly basis? Would it be easier to identify a single cloud provider that can move your entire B2B process into the cloud? A provider that can offer a high-availability infrastructure with a single service level agreement covering all applications used in the cloud environment? Can the vendor support multiple languages to allow trading partners anywhere in the world to use the platform, and does the vendor offer a 24/7 multilingual support service so that problems can be resolved as quickly as possible?
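The capital-versus-operating expense comparison above can be sketched with a few lines of code. All of the figures below are hypothetical placeholders, not benchmarks; substitute your own licence, hardware, staffing and subscription numbers to evaluate your own infrastructure.

```python
# Illustrative cumulative-cost comparison: behind-the-firewall software
# (up-front capital expense plus recurring costs) versus a cloud
# subscription (pay-as-you-go operating expense). Figures are examples only.

def on_premise_cost(years, licence=250_000, hardware=80_000,
                    annual_maintenance=50_000, annual_it_staff=120_000):
    """Up-front licence and server purchase plus yearly running costs."""
    return licence + hardware + years * (annual_maintenance + annual_it_staff)

def cloud_cost(years, annual_subscription=150_000):
    """Subscription only -- no up-front capital asset on the balance sheet."""
    return years * annual_subscription

for years in (1, 3, 5):
    print(f"Year {years}: on-premise {on_premise_cost(years):>9,} "
          f"vs cloud {cloud_cost(years):>9,}")
```

Even a rough model like this makes the hidden day-to-day costs of an in-house installation easier to see side by side with a subscription.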
Please remember to look out for my next blog entry where I will summarise all the cloud related posts and resources that I have posted across my blog, Youtube and Slideshare accounts this year.


Top 5 Tips for Successfully Deploying ERP/B2B Integration Projects

Today's ERP environments are at the centre of many enterprise IT infrastructures, from managing employee or customer related information through to delivering the information required to run complex production systems. Many people regard an ERP system as being internally focused; however, without reliable connections to outside trading partners, for example banks, suppliers or logistics partners, it can be very difficult to run an ERP system at 100% operational efficiency. In 2009 GXS sponsored a research study which showed that on average 34% of data entering an ERP system came from outside the enterprise. Considering that many CIOs focus on the internal infrastructure supporting an ERP platform, very little effort seems to be applied to ensuring that externally sourced information can just as easily enter the ERP platform. If you have unreliable connectivity between your ERP platform and external trading partners and a network outage occurs, then potentially a third of the information feeding your ERP system will be lost. This could in turn lead to downstream business or production systems not receiving information from the ERP system, so the importance of external connectivity to trading partners should not be underestimated. So how do you go about connecting ERP and B2B systems? What should you prioritise first? What do you need to take into consideration when trying to connect trading partners located in different parts of the world? (In my previous blog I started to discuss the considerations for onboarding trading partners in an emerging market.) The remainder of this blog will discuss some of the key things to consider when embarking on an ERP/B2B integration project.

Make sure you have the correct resources in place – One of the challenges with running an ERP/B2B integration project is ensuring that you have the correct resources available to design, implement and manage the integrated platform from beginning to end. When undertaking such a project you need to find out whether you have the internal skills to undertake the integration work itself, and you will need a project manager who can reach out to all users, both internal and external, so that their B2B connections can be tested and any required document maps can be created. Finally, you will need a support function that can resolve any issues and help to manage the integrated platform on a day-to-day basis. If you do not have the resources, or you find that B2B/EDI resources are being redeployed on other projects (leaving your B2B platform exposed), then it could be an opportunity to outsource the integrated platform to an outside vendor as a way to overcome many of these issues.

Decide on the options for integrating between ERP and B2B systems – When integrating between ERP and B2B systems there are a number of communication and document exchange related standards that need to be reviewed. For example, in the case of SAP, should users connect via SAP ALE, via VPN or across a secure internet connection? Should IDOCs be transmitted across AS2 or SFTP? What about integrating via web services? Do you need to support SAP PI based integration? How do you know when your ERP documents have been processed across the integration platform? Will you need status alerts to provide automatic notification of when documents have been successfully converted from one format into another?

Make sure your ERP/B2B integration platform is highly available – Ensuring that your trading partners can connect and send information to your integration platform is crucial to the smooth operation of your ERP system. If you lose B2B connectivity for some reason and external information from trading partners is prevented from reaching your ERP system, then there is a chance that this could impact other back office systems and downstream production processes. Your external trading partners should ideally be connected to your ERP system across a highly available B2B integration platform. Choosing the right B2B vendor to work with can make or break the success of deploying an ERP/B2B integration platform.

Check the quality of externally sourced information – As highlighted earlier in this blog entry, a high proportion of the data entering your ERP system comes from outside the enterprise. What happens if a supplier sends through information that has an incorrect address or part number included? The document would have to be intercepted before it enters your ERP system, reworked to correct the information and then resubmitted to the queue to be processed by the ERP application. Checking the quality of the information before it enters your ERP platform will allow you to minimise any downtime caused by having to rework data. Given that ERP systems are at the heart of many companies, it is important to implement what can best be described as an ERP firewall around your ERP applications. This firewall would continuously monitor all inbound business documents and identify any that do not contain a complete and correct set of data before they enter your ERP system.

Provide a fully scalable integration platform – When designing an integrated ERP/B2B platform it is important to plan for future expansion or growth of the platform. For example, your company could acquire a business running legacy SAP or Oracle applications, or you may need to support a different ERP module or different types of ERP documents. You also need to ensure that the platform is scalable and that you can deploy it across any business unit or manufacturing plant located anywhere in the world. The easiest way to achieve this would be to build your integrated platform in a cloud-based environment. This helps with the way in which users get access to the platform, it helps to minimise time spent integrating to new ERP modules and, most importantly of all, integration in the cloud shields users from the complexities of trying to develop the integration platform themselves.

Now, having read these tips, you might be thinking: where do I start? There is a high chance that you may not have the internal integration expertise to undertake such a project, or that your existing IT and support infrastructure is unable to support it. Most importantly of all, you need to maintain continuity of your existing IT projects, production facilities and customer service environments. The simplest solution to address these concerns would be to place the integration project in a cloud-based environment. GXS Trading Grid is the world's largest integration cloud platform, and we have undertaken many such integration projects for hundreds of companies across different industry sectors around the world. To find out how GXS can help move your ERP/B2B integration project to the cloud, please click here.
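The "ERP firewall" idea described above can be sketched in a few lines: validate each inbound business document before it is passed to the ERP queue, so incomplete or incorrect data is routed back for rework instead of causing downtime downstream. The field names and rules here are hypothetical examples, not a real ERP interface.

```python
# Minimal sketch of pre-ERP document validation ("ERP firewall").
# Field names and rules are illustrative assumptions only.

REQUIRED_FIELDS = ("partner_id", "part_number", "quantity", "ship_to_address")

def validate_document(doc: dict) -> list:
    """Return a list of problems; an empty list means the document may pass."""
    errors = [f"missing field: {f}" for f in REQUIRED_FIELDS if not doc.get(f)]
    if doc.get("quantity"):
        try:
            if int(doc["quantity"]) <= 0:
                errors.append("quantity must be positive")
        except (TypeError, ValueError):
            errors.append("quantity is not a number")
    return errors

# A supplier document arriving without a ship-to address is intercepted
# before it reaches the ERP system and routed back for correction.
inbound = {"partner_id": "SUP-042", "part_number": "BRK-9917", "quantity": "25"}
problems = validate_document(inbound)
if problems:
    print("Rejected for rework:", problems)
else:
    print("Forwarded to ERP queue")
```

In practice such checks would sit in the B2B integration layer, applied to every inbound document regardless of which trading partner or format it came from.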


Upgrading Your Treasury Management System? Five Questions to Consider about Bank Connectivity

Are you in the process of upgrading your Treasury Management System (TMS)? So are many other Global 1000 organizations. At GXS, we work with dozens of multi-national corporations each year embarking on treasury transformation projects. Some are seeking better information about offshore cash balances. Others are focused on hedging currency and interest rates. But one common consideration that all treasury project teams need to address is the optimal approach to bank connectivity. A TMS is only effective if it is fed with real-time data about cash positions, interest rates, payables, receivables and foreign exchange rates. As a result, most finance organizations rethink their approach to bank connectivity along with a TMS upgrade. The good news (or bad news, depending upon your point of view) is that there are many different options for connecting to your bank. You can connect over SWIFT, via direct Internet connection, or simply use your bank's web-based application. There are also many different message standards available to exchange data with your financial institutions. You can choose from global standards such as ISO 20022 XML and EDI, or regional file formats such as BAI2. Below are five of the most common questions that we at GXS are asked by multi-national companies about bank connectivity.

1. What messaging standard(s) should I use? There are three considerations. First, what standard does your treasury workstation vendor prefer? Increasingly, vendors are adopting the newer ISO 20022 XML standard. Some treasury vendors support a mix of messaging standards including BAI2, EDIFACT, EDI, SWIFT FIN and SAP IDOC messages. While your treasury vendor may be flexible, your financial institution may not be. An important second consideration, therefore, is what standards your banks support. If your primary cash management banks cannot send or receive ISO 20022 XML, then you may want to stick with EDI or traditional SWIFT FIN messages. The issue of standards may become less significant if you have access to a strong team of integration experts, so an important third consideration is what proficiencies your internal IT organization has. Many larger companies have partnered with a "B2B Managed Services" vendor that staffs hundreds of mapping and integration experts. These Managed Services vendors can quickly and cost effectively transform data from one format to another. Other companies provide similar services in-house with "Integration Competency Centers." Using one of these two approaches, the issue of standards can often be effectively neutralized, eliminating manual intervention in payment initiation, receivables posting, bank reconciliations and cash forecasting.

2. Should I consider replacing my one-to-one bank connections with SWIFT connectivity? GXS recommends implementing SWIFT either before or during a TMS upgrade to eliminate unnecessary host-to-host bank connections and bank-specific online cash management systems. Corporates who have implemented SWIFT cite benefits such as improved visibility of worldwide cash as well as the ability to receive acknowledgements and confirmations. Using SWIFT can also simplify integration through standardized connectivity and file formats. With almost 10,000 financial institutions connected to SWIFTNet, many corporates cite the ability to add new banks easily as another key benefit. In recent years SWIFT has expanded the breadth of services available to corporate clients: both securities and international trade transactions can be exchanged over SWIFT. Those already using SWIFT for payments and cash reporting may want to consider expanding into value added services such as electronic bank account management (eBAM) or exceptions and investigations (E&I).

3. Should I migrate all of my bank connections to SWIFT? Implementing bank connectivity through SWIFT is not an "all or nothing" proposition. You should perform a cost/benefit analysis comparing your current connectivity approaches to SWIFT on a bank-by-bank basis to determine the most cost effective integration method for each institution. How does the cost of your current proprietary host-to-host, online cash management and multibank data aggregation solutions compare to SWIFT? It may make sense to use a combination of both approaches. For example, you might use Internet-based secure FTP for information reporting data and bulk payments with your top two cash management banks, and use SWIFTNet for payments and reporting related to international bank accounts. Another consideration is whether all of your banks are SWIFT-enabled for the services you require; for example, many financial institutions lack flexible capabilities for FileAct. For those banks that are not SWIFT-ready, you will need to establish direct connections or work with them to implement SWIFT.

4. What is the best option for me to connect with the SWIFT network? So you have completed your cost/benefit analysis and are considering migrating some or all of your bank connections to SWIFT. The next question then becomes how best to connect to SWIFTNet. There are three options, the appeal of which will vary depending on your needs. The first option is direct connectivity, in which your own IT organization manages all aspects of implementing, testing and maintaining your SWIFT software and infrastructure. Many IT organizations lacking experience with SWIFT are surprised by the level of effort required: direct connectivity necessitates the deployment of SWIFT's proprietary software, implementation of specialized security procedures and completion of extensive testing programs. If you are not fully sold on the SWIFT value proposition but would like to experiment with the service, then Alliance Lite may be the best option. Alliance Lite is a packaged offering providing connectivity through a web browser on a standard PC, and it is best suited to companies with basic requirements and limited volumes. The third option is to use a SWIFT Service Bureau provider to manage your SWIFT connectivity. A SWIFT Service Bureau eliminates the need to invest in the infrastructure, maintenance and personnel to support your SWIFT messaging needs while still allowing you to tailor the implementation to meet your specific needs.

5. For the SWIFT Service Bureau option, how do I choose a provider? Up to 70% of corporates connect to SWIFT through a service bureau because it is usually more cost-effective and doesn't require highly specialized in-house SWIFT experts. Even those organizations that end up using another option typically evaluate this alternative before deciding on their implementation approach. Choosing a SWIFT Service Bureau can be a daunting task, with more than 140 organizations worldwide offering service bureau services. A list of SWIFT Service Bureau providers can be found using SWIFT's Partner Locator. This tool also highlights those bureaus that have earned the "SWIFTReady Connectivity" label, awarded to providers that meet best-in-class operational requirements. SWIFTReady vendors must meet more stringent requirements for availability, security, resilience, non-repudiation and guaranteed message delivery. Most corporates issue a SWIFT Service Bureau Request for Proposal (RFP) or Request for Information (RFI) tailored to their organization's unique needs. Typical criteria include the ability to handle bank on-boarding, provide 24×7 global support, meet audit standards (SAS 70 Type II, SSAE 16 & 3420), employ SWIFT-certified experts, and support file translation and transformation.

Maximize your TMS investment by automating, integrating and optimizing financial data gathering for complete, timely and accurate TMS analysis and action. GXS offers a variety of bank connectivity options which can be implemented in tandem with your TMS deployment. If you would like to learn more, contact us and we will be happy to discuss different approaches with you.
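To give a feel for the ISO 20022 XML standard mentioned above, here is a sketch of building the group header of a pain.001 (customer credit transfer initiation) message with Python's standard library. This fragment is deliberately simplified and is not schema-complete; real messages must validate against the full pain.001 XSD, and the values shown are made-up examples.

```python
# Simplified sketch of an ISO 20022 pain.001 group header.
# Not schema-valid on its own -- illustration of the structure only.
import xml.etree.ElementTree as ET

NS = "urn:iso:std:iso:20022:tech:xsd:pain.001.001.03"

def group_header(msg_id, created, nb_of_txs, ctrl_sum):
    ET.register_namespace("", NS)  # emit the ISO 20022 namespace as default
    doc = ET.Element(f"{{{NS}}}Document")
    init = ET.SubElement(doc, f"{{{NS}}}CstmrCdtTrfInitn")
    hdr = ET.SubElement(init, f"{{{NS}}}GrpHdr")
    for tag, value in (("MsgId", msg_id), ("CreDtTm", created),
                       ("NbOfTxs", str(nb_of_txs)), ("CtrlSum", ctrl_sum)):
        ET.SubElement(hdr, f"{{{NS}}}{tag}").text = value
    return ET.tostring(doc, encoding="utf-8", xml_declaration=True)

xml_bytes = group_header("PAY-2012-0001", "2012-01-15T09:30:00", 2, "1500.00")
print(xml_bytes.decode())
```

In a production treasury flow this kind of message generation and the reverse transformation (for bank statements such as camt formats) is exactly the mapping work that a Managed Services vendor or an Integration Competency Center would take on.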


How do you go about expanding your B2B platform into a new market? – Part 2

In this blog I would like to discuss some of the more technical considerations that have to be taken into account when expanding to new markets. From understanding the technical capabilities of your trading partner community through to deploying the right B2B tools, ensuring that you have 100% trading partner participation is crucial to the smooth running of your supply chain.  So what should you consider? Understand the technical capabilities of your supplier – When connecting with a trading partner for the first time it would be worthwhile undertaking a B2B connectivity and technology audit.  For example how does the trading partner connect to the internet at the moment?, what type of business applications do they currently use?, do they have any internal resources to look after their IT infrastructure?, if using business applications, are they behind the firewall or hosted and delivered as-a-service?, do they connect electronically with any other customers?, if so how do they achieve this?,  the sooner you can understand the technical landscape of your trading partner, the sooner you can start planning to support their needs and deploy your B2B/IT infrastructure. You also have to bear in mind that the type of document they send through could be dependent on the communications infrastructure that might be available, for example a small supplier in Thailand might only have a slow speed dial up connection.  To ensure 100% participation by your trading partner community you must be able to embrace each and every need of your trading partner, no matter how small they might be or what level of IT adoption they might have. Are you deploying the right B2B solution for each user? – When dealing with trading partners in emerging markets it is important to understand the current B2B capabilities (if they exist) of the trading partners that you will be dealing with. 
If you are working with a trading partner in India or China for example, there will be a high probability that they could be using paper or even Microsoft Excel based ways of exchanging information with their customers. If using paper and to ensure 100% integration with your B2B processes then you may need to offer a web form based method for these particular suppliers to submit information electronically to your B2B platform. Alternatively, Microsoft Excel is one of the most commonly used business applications in China so you might want to think about how you can exchange these types of files and more importantly utilise the information contained within a spreadsheet.  If you can find a way of integrating spreadsheet content to your B2B platform then it will minimise any re-keying of information and hence minimise any errors getting into your business systems. So what about back end integration? – Another reason for ensuring that your trading partner community is 100% enabled is to ensure that externally sourced information can enter other back end business systems, such as ERP platforms, as seamlessly as possible. For example if you are running SAP, will your trading partner be able to send you an SAP IDOC file directly or will this be required to be converted by your B2B provider or internally by your own resources?  Which accounts package are they using and will you be able to integrate to it? Given that your production lines or equipment may be waiting to receive B2B related information from your trading partner it is important to consider back end integration and how these trading partners will connect to your back end IT infrastructure. Successfully integrating to back office systems  will help to bring additional benefits to your B2B platform, this includes reducing rework of incorrect data and speeding up the flow of information across your extended enterprise. Have you thought about extending supply chain visibility? 
– If you are sourcing goods from a trading partner in an emerging market, then as well as ensuring that business documents can be exchanged electronically, it is important to ensure that the physical supply chain can be monitored end-to-end as well. Key to this will be the ability to monitor transactions across borders, customs agencies and multi-modal logistics and transportation. If goods are delayed at a country border then the customer ordering the goods needs to be made aware of the situation. Being able to tell your customers when their goods will be delivered is often a key measure of customer satisfaction and a source of competitive advantage. So in addition to conducting a technology audit, it may be worthwhile finding out which logistics partners and countries the supplier already does business with.

Have you thought about how your business might expand or contract? – In these uncertain economic times it is important that your B2B platform can scale up or down depending on the exact needs of your business. If you have five Chinese suppliers on-boarded to your B2B platform, what happens if you need to onboard a further ten trading partners in a short time frame? Would you be able to scale up your B2B infrastructure accordingly? Would you be able to support the inevitable changes to your existing B2B platform in order to accommodate these new trading partners? Does your existing B2B infrastructure, and the service contract behind it, have the flexibility to absorb an increase in B2B traffic volumes? Factoring likely changes into your B2B infrastructure at the planning stage, if you know what they are likely to be, can save you a lot of time in the long run. GXS has been helping companies globalise their B2B infrastructures for many years; from onboarding trading partners in China to connecting to a new third party logistics provider, GXS has the solutions to achieve this.
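To make the spreadsheet integration idea above a little more concrete, here is a minimal sketch in Python. The column names and document fields are hypothetical, invented purely for illustration, and this is not a real GXS interface; it simply assumes the supplier's spreadsheet has been exported to CSV:

```python
import csv
import io

# Hypothetical mapping from a supplier's spreadsheet columns to the
# fields of a structured document, avoiding manual re-keying.
SPREADSHEET_COLUMNS = {
    "PO Number": "po_number",
    "Part No": "part_number",
    "Qty Shipped": "quantity",
    "Ship Date": "ship_date",
}

def rows_to_documents(csv_text):
    """Convert each spreadsheet row into a structured document (dict)."""
    reader = csv.DictReader(io.StringIO(csv_text))
    documents = []
    for row in reader:
        doc = {field: row[col].strip() for col, field in SPREADSHEET_COLUMNS.items()}
        doc["quantity"] = int(doc["quantity"])  # validate numeric fields up front
        documents.append(doc)
    return documents

sample = """PO Number,Part No,Qty Shipped,Ship Date
4500012345,ABC-100,250,2011-11-01"""
print(rows_to_documents(sample))
```

Validating fields such as quantities at the point of conversion, rather than after the data has reached a business system, is what stops re-keying errors from propagating downstream.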

Read More

Are you ready for the ‘Perfect Storm’ that will hit the Mobile B2B Space in 2012?

Over the past couple of years a number of technology developments have brought us to the edge of a mobile B2B revolution in the enterprise computing space. Enterprise adoption of cloud computing technologies, the introduction of tablet devices and the development of enterprise specific apps will lead to the mobilisation of the B2B space in 2012.

Now, I may have been slightly ahead of the curve in July 2007 when I posted a blog entitled “Mobile EDI, Can the Apple iPhone Deliver?”. In that blog I proposed a number of ways in which Apple’s iPhone could be used to view how B2B related information flows across supply chains, and I introduced the concept of Trading Grid Mobile as a potential platform for interacting with GXS’ Trading Grid platform. Since that post we have seen the introduction of the Apple iPad and, with it, the opportunity to put yet more information in front of the enterprise user. We held our Integration Customer Forum in Washington two weeks ago and it was great to see a GXS app running on an iPad. My colleague Doug Kern discussed this in his earlier blog, but essentially the app allows a user to view exceptions that may arise as B2B transactions flow across our network; for example, did the information get processed without causing any errors?

Many of the analyst firms are just starting to post their thoughts on what they perceive will be the big technology trends of 2012, and mobile devices feature quite highly in all the analyst reports that I have read so far. So why has mobile suddenly started to get on the radar of so many CIOs around the world? Well, there are a number of factors behind the perfect mobile B2B storm that is brewing; here are my thoughts.

Firstly, since I originally blogged about this subject in July 2007, mobile devices (especially smart phones and, more importantly, tablet devices) have hit the market in their millions.
Apple has completely cornered the market with its iPad tablet, and nearly every computer manufacturer has tried to rush a tablet device to market. What has also happened, and I have done this myself, is that employees have been taking their tablets into work and asking their corporate IT departments to connect them to corporate resources such as webmail. As a result, IT departments have had to be somewhat flexible, and while learning how to embrace tablet devices in the enterprise, CIOs have decided to accept them into their IT infrastructures.

Secondly, over the past 24 months you haven’t been able to open an IT related magazine without finding some sort of article on cloud computing or SaaS. One of the difficulties of running software or an app on a mobile device has been optimising the software’s performance against the speed of the processor, ensuring that there is plenty of on-device storage and memory, avoiding battery drain and, finally, making best use of the available screen space so that you can interact with applications as easily as possible. As software applications can now be hosted in the cloud, and cloud environments can take on all the processing and storage requirements, it has become easier for any device to connect to remote applications in the cloud. High speed 4G networks are just starting to be rolled out, and these will help to speed up the interaction with, and transfer of data to, cloud based services. In addition, battery technology has moved on considerably, and the screen space available on an iPad means that applications can be used without having to reach for a magnifying glass to read text or graphics!

Thirdly, the usability of the many business related apps hitting the market now means that users can simplify and automate many mundane tasks.
I certainly believe that tablet devices will help to mobilise the ‘second economy’ described in Steve Keifer’s recent blog post. In a previous blog entry I discussed the idea of a ‘Corporate App Store’ from which employees, depending on their role within the business, could download the apps they need to do their job. I think many CIOs have been reluctant to use hosted software solutions and downloadable apps because of the perceived security risk of downloading them and connecting them to back office systems. But, once again, things have moved on very quickly, and the ability to use both ‘behind the firewall’ and cloud based software services is now becoming commonplace in the enterprise market.

Fourthly, many of the software vendors who have traditionally earned the bulk of their revenues from behind the firewall software have now introduced ‘as-a-service’ solutions that allow mobile apps to interact with ‘enterprise level’ software such as ERP and CRM solutions. SAP’s Business ByDesign solution comes to mind in this respect, and SAP has recently announced a whole collection of new apps that allow its enterprise customers to connect to their SAP systems via smart phone and tablet devices. SAP even actively promotes the use of iPads within its own business to help improve the flow of information across the company.

Finally, from a supply chain perspective, B2B vendors such as GXS are actively looking at how these mobile devices can be used to streamline supply chains and provide improved visibility of transactions across B2B networks and their associated supply chains. EDI documents, barcodes, QR codes, RFID tags: all of this is information that can be scanned, transmitted or exchanged electronically by the latest generation of mobile devices such as Apple’s iPad 2.
More importantly for GXS, we are seeing increasing demand from our customer base to mobilise our B2B applications, and the app we demonstrated at our Integration Customer Forum in Washington two weeks ago won rave reviews from the attendees. The words “When can we have it!” seemed to resonate around the walls of the demonstration area on a frequent basis! So the perfect storm will bring new business opportunities and new ways of running, managing and streamlining supply chains. Are you ready? Batten down the hatches, the perfect mobile B2B storm is about to hit your business :) More on this, and on our apps, in a future blog entry.

Read More

GXS Welcomes SAP to the ‘Top Table’ of B2B Integration

Yesterday, SAP announced that it would be acquiring Crossgate to help improve the way in which companies connect to trading partners outside of the enterprise. Over the years, SAP has grown into one of the largest providers of enterprise solutions. Many GXS customers use SAP to manage production processes, HR/payroll systems, and transportation and warehouse environments. The B2B integration top table is certainly getting busy now, and my colleague Steve Keifer discussed some of the other recent B2B integration related acquisitions in this blog entry.

GXS has been sitting at the B2B integration top table for over forty years now, so let me try and draw an analogy here between B2B integration and a fine dining experience! Not easy when I have zero cooking skills in the kitchen and I think Gordon Ramsay would lose his patience with me. Once you reach the top table of B2B integration, what can you expect, and what etiquette should be used to manage a customer’s B2B integration requirements? From an à la carte menu point of view, our customers have always been able to pick and choose the B2B services to suit their needs, perhaps choosing to use a couple of mailboxes, source some mapping skills, or use a simple web based environment to exchange business documents with their customers as and when required. GXS has been offering these core B2B services for many years and we have gained significant experience in helping companies automate their manual business processes. Irrespective of which industry a company operates in or which business or production process needs to be supported, GXS has unrivalled experience in the B2B integration space.

The other choice at the B2B integration top table is an ‘all you can eat’ banquet menu. This could be considered the equivalent of an outsourced environment, whereby you can have as much B2B functionality as you like for a fixed or predictable cost.
In restaurant terms, they provide the fine cuisine and you enjoy the fine dining experience. GXS has been offering fully hosted, cloud based B2B environments for over 20 years now and, as with any good restaurant, it takes time to gain the knowledge needed to offer a fine customer experience. In addition to the à la carte B2B menu options discussed earlier, you may decide to implement an e-Invoicing solution or a platform to improve the way in which you communicate and collaborate with your trading partner community. GXS has the experience to offer all of these services and more. GXS has worked with many companies around the world; akin to enjoying global cuisine, we understand their cultures and are able to speak to and support our customers in multiple languages.

When you reach the B2B integration top table you need to be able to support customers ranging from large global enterprises with thousands of trading partners through to small and medium sized companies that just need to send a document as an EDI file. You also need to offer integration services to more than one ERP system. Many companies today have grown through acquisition, and it takes time, sometimes years, to move acquired business units onto a single ERP platform. At GXS, we don’t mind which ERP platform you may be using; we can provide integration to SAP, Oracle, JD Edwards and many other leading ERP environments. In fact we have undertaken more SAP related integration projects than for any other ERP platform, with over 120 integration projects completed to date.

In 2009 GXS sponsored a research study to try and get a better understanding of why ERP to B2B integration is so important for companies, and of the ROI that can be expected if these two systems are fully integrated. One of the most surprising findings to emerge from the study was that, on average, 34% of the data entering an ERP system comes from outside the enterprise.
This highlights the importance of being able to seamlessly integrate ERP and B2B environments. From an SAP perspective, GXS Managed Services helps companies remove the complexity of integrating these two environments; you can read more about this in an earlier blog that I posted on the subject. Now, as with any fine dining experience at a top restaurant, a certain etiquette towards customers is established through many years of experience. Improving customer satisfaction levels is top of mind for any CEO, and GXS is no different. So if you would like to experience the B2B equivalent of a fine dining experience, take a look at our website, where you will be able to learn more about GXS Trading Grid, the world’s largest integration cloud platform. If you would like to learn more about how we can help significantly increase the ROI of SAP/B2B integration projects then why not visit our dedicated microsite, CLICK HERE>>>, or alternatively, to learn more about how GXS connects with SAP environments, CLICK HERE>>>

Read More

How the Cloud Helps Manufacturers Reduce B2B Complexity

The analyst firm IDC recently undertook a research study to get a better understanding of cloud computing adoption levels in the manufacturing industry. One of the major conclusions from the research was that manufacturers are looking at cloud computing as a key driver of IT productivity in the future. The study revealed that 22% of those surveyed were already running some form of cloud based service, and a further 44% said they were currently implementing, or had firm plans to implement, such services. One of the key drivers for adopting cloud services is the continuing need to reduce complexity across manufacturing operations, and I will highlight some examples of this later. Another key figure from the study was that the percentage of companies managing infrastructure internally is expected to fall to about 38%, down from 54.8%. To achieve this significant drop, manufacturers are likely to increase levels of IT related outsourcing.

Now, B2B complexity is not a new subject: I discussed it in a previous blog entry, which can be found here, and in fact GXS sponsored a study last year which looked at complexity across supply chains. IDC’s findings were similar to our own, in that to reduce complexity, companies will need to consider outsourcing or moving certain operations to a cloud based environment. In my previous blog entry I discussed the basics of cloud computing, and an earlier entry looked at why manufacturers should think about moving their B2B environments to the cloud. But once they have their B2B environment operating on an integration cloud platform, how will it help to improve the way in which they run their operations and, more importantly, how can they increase the ROI of their B2B related projects? Before I discuss what an integration cloud platform actually is, I thought it would be useful to explain why integration in the cloud has become such a talking point amongst analysts and researchers over the past year or so.
The increasing use of cloud based environments has led to different types of cloud being used; these were explained in my previous blog, namely private versus public versus community clouds. Each cloud may be developed to serve a different requirement or a particular industry need, but there comes a time when different clouds need to be connected together to ensure the seamless flow of information between one business application and another. The analyst firm Gartner coined the term Cloud Service Brokerage (CSB), which essentially describes a service that negotiates and mediates the relationships between many different providers of cloud services and the ‘cloud consumers’. Think of a CSB as a black box where all the integration between different cloud services is carried out behind the scenes for you. To help illustrate what a CSB looks like, I created the diagram to the left, which will hopefully make it a bit clearer. What the CSB provides is a single point of connectivity for applications, with information routed and processed accordingly to the end cloud consumer.

The manufacturing sector is one of the most global of all industry sectors: many production plants need to be connected to a central B2B hub as well as to ERP systems such as SAP and Oracle, not forgetting the aftermarket and returns processes that have to be catered for too. Manufacturers are also likely to have numerous external trading partners who will need to connect to internal business applications via the integration cloud platform. As the diagram below shows, an integration cloud platform such as GXS Trading Grid can help simplify the external connections into your B2B platform by providing a single point of entry. GXS Trading Grid then takes care of routing the information and converting it, for example to an IDOC document if it is entering an SAP system.
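The single-point-of-entry idea can be illustrated with a toy broker that routes an inbound document and converts it to the receiver’s preferred format. The partner names, formats and converter functions below are all invented for illustration and do not represent GXS Trading Grid’s actual APIs:

```python
# Toy illustration of the Cloud Service Brokerage pattern: one entry
# point receives a document, looks up the receiver's target system,
# and applies the matching conversion before onward routing.

def to_idoc(doc):
    # Stand-in for a real flat-file-to-SAP-IDOC conversion step.
    return {"format": "IDOC", "segments": [("E1EDK01", doc["order_id"])]}

def to_xml(doc):
    return {"format": "XML", "body": "<order id='%s'/>" % doc["order_id"]}

CONVERTERS = {"SAP": to_idoc, "WEB": to_xml}          # per-system converters
PARTNER_SYSTEMS = {"acme-manufacturing": "SAP",       # hypothetical partners
                   "small-supplier": "WEB"}

def broker(doc, receiver):
    """Single point of connectivity: route and convert per receiver."""
    target = PARTNER_SYSTEMS[receiver]
    return CONVERTERS[target](doc)

print(broker({"order_id": "PO-1001"}, "acme-manufacturing")["format"])  # prints IDOC
```

The point of the pattern is that senders only ever talk to `broker`; adding a new partner or format changes the lookup tables, not the senders.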
So, we have discussed some of the more generic business benefits of cloud computing, but what benefits can manufacturers expect to see from using an integration cloud platform such as GXS Trading Grid? Let me explain 🙂

Offers Scalability and Flexibility – B2B environments can be expanded or contracted depending on the needs of the business. For example, seasonal fluctuations, or bringing contractors on board to help with new manufacturing contracts, may require additional B2B capacity. The cloud allows manufacturers to roll out new B2B functionality very quickly from a central location, and it is ideal for on-boarding trading partners or contract manufacturers who may be located in an emerging market. These additional plants and offices can be brought online with ease.

Simplifies Connectivity – The cloud offers any-to-any mediation between different B2B communication and document standards. It is not uncommon for manufacturers to want to use a mix of modern and legacy standards, and the cloud provides the ability to use any of them. Regional standards can be embraced within the cloud as well, providing seamless connectivity anywhere in the world.

Improves End-to-End Visibility – Cloud based B2B environments provide end-to-end visibility of global business transactions, along with a single view of the B2B transactions flowing between global trading partners. So if goods are being manufactured in and then exported from China, for example, they can be fully tracked across multi-modal logistics partners and border control agencies to their point of delivery.

Encourages Trading Partner Collaboration – All trading partners can be connected to the cloud with ease, using a mix of quick to deploy, simple to use web tools. The cloud effectively removes the traditional collaboration barriers between trading partners based in developed and emerging markets.
A further benefit of having all cloud consumers on the same integration cloud platform is that they can all be administered and authenticated from a central contact database.

Simplifies Integration to Business Systems – Integration to numerous business systems takes place through a single cloud environment. The integration cloud platform has the ability to check the quality of data from external trading partners before it enters a downstream business system such as ERP. This ‘ERP firewall’ helps to ensure that only accurate data enters the ERP system, avoiding the rework of bad data and protecting production systems from its impact. New users of the integration cloud platform can also draw on expertise gained through previous integration projects, for example by leveraging templates and document maps that were developed for other cloud based B2B projects.

For further information on GXS Trading Grid, and to review other resources relating to how GXS can help move your business to an integration cloud platform, please click here.
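The ‘ERP firewall’ concept can be sketched as a simple pre-ERP validation step: inbound transactions are checked and rejected before they can pollute a downstream system. The field names and rules below are illustrative assumptions only, not a real GXS implementation:

```python
# Minimal sketch of an 'ERP firewall': validate inbound trading partner
# data before it is allowed into a downstream ERP system.
REQUIRED_FIELDS = ("po_number", "part_number", "quantity")

def validate(transaction):
    """Return a list of problems; an empty list means safe to pass on."""
    errors = []
    for field in REQUIRED_FIELDS:
        if not transaction.get(field):
            errors.append("missing field: %s" % field)
    qty = transaction.get("quantity")
    if isinstance(qty, int) and qty <= 0:
        errors.append("quantity must be positive")
    return errors

good = {"po_number": "PO-77", "part_number": "X-9", "quantity": 10}
bad = {"po_number": "PO-78", "part_number": "", "quantity": -5}
print(validate(good))  # []
print(validate(bad))
```

Only transactions that come back with an empty error list would be routed onwards; everything else is returned to the trading partner for correction instead of being re-keyed later.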

Read More

Understanding the Basics of Cloud Computing

Earlier in the year I posted a blog entry discussing why manufacturing companies should consider using cloud based environments. One thing I didn’t do in that particular post was discuss exactly what a cloud computing environment is made up of, and what benefits companies can expect from adopting such services. There are many different resources on the internet covering the subject of cloud computing, so I thought it would be useful to pull together a brief introduction in one blog entry. I will use this entry to cover the basics of cloud computing, and in my next entry I will discuss how GXS’ integration cloud platform, GXS Trading Grid, can help manufacturers improve and simplify their B2B processes.

So, first things first, what is cloud computing? Well, as you would expect, you could ask five different people and get five different answers, but one of the more succinct definitions that I have heard comes from the analyst firm Forrester Research, who define cloud computing as “a standardised IT capability (services, software or infrastructure) delivered via internet technologies in a pay-per-use, self-service way”.
Now, definitions are great, but in this case I believe the easiest way to describe a cloud environment is to simply define the characteristics of how it operates:

Dynamic – one of the keys to cloud computing is on-demand provisioning
Massively Scalable – the service must react immediately to your needs
Multi-tenant – cloud computing, by its nature, delivers shared services
Self-service – as a user, you can use the service as often as you require
Per-usage based pricing – you should only ever pay for the amount of service you use
IP-based architecture – cloud architectures are based on virtualised, internet based technologies

Over the past couple of years, efforts have been made to define the different types of cloud computing environments that are available and how they should be used by different ‘cloud consumers’:

Public Cloud – available to the general public or a large industry group, and owned by an organisation selling cloud services
Community Cloud – shared by several organisations and supporting a specific community that has shared concerns
Private Cloud – operated solely for a single organisation or company
Hybrid Cloud – a combination of two or more of the above; they remain unique entities but are bound together by standardised technologies

As cloud technology has evolved in recent years a number of technology layers have been introduced within the cloud, and the following definitions of ‘cloud layers’ seem to be pretty much accepted by many of the analyst firms around the world:

Infrastructure-as-a-Service (IaaS) – delivers the computing infrastructure, typically via platform virtualisation, as a service. It is an evolution of virtual private server offerings. IaaS includes the servers, memory and storage that allow a customer to scale up or down as required by the business. The infrastructure can be used by customers to run their own software, with only the amount of resource that is needed at a given moment in time.
Platform-as-a-Service (PaaS) – delivers a computing platform and/or solution stack as a service, often consuming cloud infrastructure and sustaining cloud applications. PaaS allows users to develop new applications in the cloud without having to purchase the software or infrastructure in-house. Essentially it provides everything needed to support how a company builds and delivers web applications and services in the cloud.

Software-as-a-Service (SaaS) – delivers software over the internet without the need to install applications on the customer’s own computers. SaaS applications are run from one centralised location, which means that the software can be accessed from any location over the internet. Centralised management of applications helps to simplify maintenance, and applications are consumed on a pay-as-you-go basis.

The web based nature of cloud computing means that it can offer companies and users significant benefits over more traditional behind the firewall software environments:

Global Accessibility – the cloud platform can be accessed anywhere in the world using nothing more than an internet browser
Easy to Maintain – upgrades to the environment are managed and distributed from a central location, eliminating the need to upgrade on a local basis
Scalability – the platform can be expanded or contracted depending on the requirements of the business
Rapid Deployment – new users, plants or offices can be brought online very quickly, providing a significant competitive advantage, especially in emerging markets
Flexibility – new modules can be introduced to the cloud computing environment with ease
Low Cost – the pay-as-you-go nature of cloud environments means that IT and B2B teams have improved long term predictability of the costs of operating the infrastructure
Consistent User Experience – users are presented with an identical environment irrespective of where they log in from, and it can be tailored on a role by role basis
Central Authentication – access to the cloud environment can be managed from a central location, simplifying user administration

As well as these generic business benefits, cloud computing offers many other advantages when compared to more traditional behind the firewall infrastructures. The benefits of moving from a traditional IT environment to a cloud based environment are summarised in the table below.

So you want to implement a cloud computing environment? But how do you persuade your Financial Director or Chief Financial Officer that cloud computing can bring significant cost savings to the business? The table below summarises the on-premise versus cloud computing costs that can be expected.

Finally, there are many green benefits of cloud computing to consider. Earlier in the year I wrote a blog on this subject, but in summary cloud computing allows companies to:

Lower power consumption requirements, as in-house data centres and server infrastructures can be disbanded
Dispose of less computer and network equipment packaging as new equipment arrives at your office or business location
Minimise travel requirements for IT implementation teams, as all systems are administered and deployed from a central location
Access applications from low power mobile devices such as smart phones and tablets, thanks to the lightweight nature of cloud computing applications
Reduce the paper flowing across the supply chain, as all trading partners will be exchanging business documents electronically

I realise that I have only scratched the surface of cloud computing, but hopefully this blog entry has been useful for you. In my next blog entry I will focus on how B2B environments can benefit from cloud computing and, in particular, how a PaaS type environment such as GXS Trading Grid can help move your company’s B2B environment to the cloud. In the meantime, if you would like to learn more about GXS cloud offerings then please click here.
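The on-premise versus cloud cost argument can be illustrated with some simple arithmetic. All of the figures below are invented for illustration; a real business case would use your own upfront and running costs:

```python
# Hypothetical comparison of on-premise (large upfront spend plus
# monthly operations) versus pay-as-you-go cloud (monthly fee only).
def breakeven_month(upfront=50000.0, monthly_ops=2000.0,
                    cloud_fee=3500.0, horizon=120):
    """First month at which cumulative on-premise cost drops below cloud,
    or None if it never does within the horizon."""
    for m in range(1, horizon + 1):
        if upfront + monthly_ops * m < cloud_fee * m:
            return m
    return None

print(breakeven_month())  # with these assumed figures: month 34
```

With these assumed numbers the cloud is cheaper for almost three years, which is the usual shape of the CFO conversation: pay-as-you-go wins early, and the answer further out depends entirely on how the running costs compare.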

Read More

Insurance Technology: Time to Get Your Head in the Clouds

As the spring 2011 conference season wound down, I attended the ACORD/LOMA Insurance Systems Forum in San Diego, CA. ACORD (Association for Cooperative Operations Research and Development) is a global, nonprofit standards development organization serving the insurance industry. LOMA (Life Office Management Association) provides training and education for insurance professionals worldwide. The event is billed as the premier business and technology event for insurance professionals.

One of my goals at ACORD/LOMA was to better understand cloud computing in the insurance industry. There were several sessions that touched on cloud and Software-as-a-Service (SaaS). One of the most interesting was “Cloud Computing for Insurers: Time to Get Your Head in the Clouds” by Bob Hirsch, Director of Technology Strategy and Architecture, Deloitte Consulting LLP. Bob provided some interesting thoughts on why cloud isn’t more prevalent in insurance. One reason is that cloud vendors have been slow to meet the regulatory demands of insurance. Another is that vendors are not in the “core” space; most cloud implementations are at the “edge for specific workloads.” Insurance firms also have concerns about data loss, security and privacy, audit and assurance, backup and disaster recovery, vendor “lock in”, and IT organizational readiness. Bob described vendor “lock in” as the inability to easily migrate your company’s information from the cloud provider’s data center to your own if you decide to bring processing back in-house.

Bob suggested that with quality datasets, computing advances and maturing tools, analytics could become a strategic cornerstone of the enterprise. As an example, he talked about the cost savings from moving volatile computing needs to the cloud. Bob explained that insurance companies need to run stochastic models each quarter to estimate risk. Large insurers are running grids of 2,500 nodes and growing for this type of computing.
Running the models can take 24 to 48 hours, but the rest of the time the servers sit idle. Bob stated that current grid systems can be modified to be cloud aware and “burst” capacity to clouds as needed, by storing the grid image in the cloud and deploying it across servers during periods of peak demand. Bob also walked through a cost/benefit analysis of Monte Carlo simulations for hedge funds, which have limited in-house IT resources. The analysis showed in-house monthly costs of $14,280 vs. $6,930 for the cloud, a 51% saving. For the moment, Bob said, smaller insurance firms are ahead of larger ones in using cloud-based applications. This is because insurance systems are very fragmented within larger organizations, which are slow to consolidate systems across the enterprise.
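The 51% figure from the session checks out with a couple of lines of arithmetic:

```python
# Savings from the in-house vs cloud monthly costs quoted above.
def savings_pct(in_house, cloud):
    """Percentage saved by moving from in_house to cloud cost."""
    return round(100 * (in_house - cloud) / in_house)

print(savings_pct(14280, 6930))  # 51
```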

Read More

Five Reasons Why Cloud Computing Helps to Develop Greener Supply Chains

Last year, in support of Earth Day 2010, I posted a blog relating to the volcano that erupted in Iceland. I said at the time that it was nature’s way of reminding us that we should be thinking about greening our supply chains and making our logistics networks more efficient. This year Japan faced a significant natural disaster which not only brought hardship and distress to many families but also led to global supply chains being significantly impacted by parts shortages from Japan based suppliers. Globalisation can be a great thing, BUT you must remember that wherever a new manufacturing plant is established, you must be able to ensure the prompt delivery of parts. I am sure many Japanese manufacturers, especially those in the automotive sector, will be re-thinking their supplier and logistics strategies to support their global networks of plants going forward.

With Earth Day 2011 fast approaching, I thought I would discuss some of the green benefits that can be obtained by implementing a cloud based B2B infrastructure. Much has been written about the benefits of cloud based infrastructures, and in an earlier blog entry I discussed how manufacturers could benefit from cloud based environments. There have been many research articles about how cloud environments are quick to deploy and easy to maintain on an ongoing basis, but I have seen very little written about the green and sustainable reasons for deploying a cloud based infrastructure. Therefore I thought I would highlight a few green benefits in this blog entry.

1. Lower power consumption requirements – This is one of the most important green benefits that a company can realise by adopting a cloud based B2B infrastructure. Many companies around the world have spent millions of dollars establishing their own in-house data centres or server environments.
These range from investing in highly available power supply infrastructures, with uninterruptible power supplies (UPS) backed by diesel generators, and extensive lighting infrastructures, through to implementing complex networking and air conditioning systems. Combined, these produce a relatively large carbon footprint from a day-to-day operations point of view. As Cloud based environments are hosted by an external vendor, this not only removes the cost of supporting an in-house data centre or server infrastructure but, more importantly, can significantly reduce a company’s carbon footprint by cutting the power consumed by the supporting utility infrastructures mentioned above. 2. Less equipment packaging required for server and networking hardware – In-house data centres require numerous servers, storage devices, networking equipment etc., and these typically arrive from IT suppliers in large, over-packed boxes containing cardboard, wood, plastic and polystyrene. Once the equipment has been delivered, the packaging needs to be disposed of carefully or recycled. In addition, the software installed on the servers will typically arrive on a CD-ROM with an extensive paper based installation and setup manual. Cloud computing removes the need to buy and run extensive computer servers and other associated infrastructure, so there is an opportunity to minimise the amount of wasteful packaging material used to transport this equipment to an office or manufacturing location. 3. Minimises travel requirements for IT implementation teams – Many companies have globalised their operations over the years and, in a bid to reduce operational costs, many have established a presence in emerging markets such as China and India.
One thing that is often overlooked is that when entering a new emerging market you will need to secure local IT implementation resources to help set up your IT or B2B environment. Due to the limited availability of skilled IT resources, many companies deploy their own implementation teams, which means flying personnel into the region and perhaps keeping them onsite for a few weeks. Typically these employees will be seconded to the new plant not only to get everything set up but also to cross-train local staff in the future maintenance of the equipment. So if a North American manufacturer sets up a new plant in China, how many people will it have to fly across the world to support the operation, and how many tonnes of greenhouse gases will the planes emit carrying staff and equipment to the new plant? As Cloud computing environments are hosted by an outside provider and are typically very easy to deploy, you can in most cases remove the need to send employees halfway around the world, as these environments can be brought online and monitored remotely. 4. Less paper required, as a hosted platform encourages full participation from a trading partner community – Many companies have struggled to encourage all their trading partners, especially those in emerging markets, to send information electronically. Instead, many smaller suppliers still use manual paper based processes; in China, for example, the fax is still seen as one of the main business communication methods. There are also various systems for exchanging shipping related information between logistics carriers and across customs and border control agencies. The very nature of Cloud based environments means that they are quick to deploy, easy to use and simple to maintain on a daily basis.
The use of web based forms to replicate paper based form content means that even the smallest or least technically aware supplier or border control agency can simply enter or view information directly via the web based portal environment. Paper copies of web forms can still be printed off if required, but in a Cloud environment this is more of an on-demand process. Once you enter information via web based forms, it automatically feeds into a hosted application within the Cloud environment. Cloud based environments significantly reduce the amount of paper flowing across the extended enterprise, and at the same time encourage less technically capable trading partners to participate in your B2B program. 5. Increased availability of information improves supply chain efficiency – Many companies store their business information in multiple enterprise systems across many different servers located in different countries around the world. Trying to track down the information you require, and then access it via some form of networked computer system, can be difficult at the best of times. This is made more difficult if you are working remotely and need to connect to a business system via a laptop. Over the past couple of years, smartphones and tablet devices have changed the way users access enterprise information on the move. Whether you are using an office based desktop PC or a laptop, it will need to be connected to a power supply or charged up in order to do any lengthy or meaningful work. An Apple iPad, on the other hand, is extremely eco-friendly from a power consumption point of view, especially when you consider that on a full charge the iPad’s battery will last for ten hours.
In addition, the very fact that information is held in a central location and is accessible anywhere in the world via the internet means that logistics carriers, for example, can process shipping information much more quickly, minimizing border control delays and ensuring that shipments reach their destination in a much shorter period of time. Cloud based environments allow users to access information using more power friendly mobile devices and help all trading partners access one central source of information. This helps to minimize supply chain disruption and improve the green efficiency of logistics carriers. So in summary, Cloud based infrastructures offer companies a way to completely change how they deploy and manage their IT or B2B environments and, more importantly, help to significantly reduce a company’s overall carbon footprint. I am sure the recent events in Japan will make manufacturers think very carefully about their future data centre strategies and provide an opportunity to introduce greener ways of working. I am also sure that, as well as the global disruption to parts supplies, many Japanese based suppliers will have seen some form of disruption to their IT and B2B infrastructures following the recent earthquake and tsunami. The main reason for this is that, culturally, Japanese companies typically prefer to implement behind-the-firewall software solutions rather than depend on external providers to host their IT or B2B solutions for them. Given the green benefits mentioned above and the ability to maintain business continuity in the face of a natural disaster, I would expect to see more Japanese companies taking an interest in Cloud based B2B infrastructures in the near future. For further information about Earth Day 2011, please click here. For further information on GXS Trading Grid, the world’s largest Integration Cloud Platform, please click here.
The Earth Day website highlights the number of ‘Acts of Green’ occurring around the world. Well, I guess you could say that every electronic transaction passing across GXS Trading Grid could be considered an ‘Act of Green’, as it removes an equivalent piece of paper from the supply chain. For more information on how GXS can help to introduce green supply chains, and to get an idea of how many business transactions we are processing at the moment, please visit our dedicated green supply chain microsite by clicking here.

Read More

Why SaaS Makes Cents for Supply Chain

Applications such as sales force automation, human resources and expense reporting have enjoyed most of the success and publicity in the early years of SaaS. But there is a growing level of interest in utilizing SaaS for a broader range of business applications. Supply chain is one area in which vendors are investing in SaaS models. A recent Forrester study estimated that SaaS is already the primary model for 51-90% of Global Trade Management applications. Other applications such as transportation management, spend analysis, e-procurement and e-invoicing are also transitioning to SaaS, with 26-50% of the market using cloud-based providers. For supply chain applications, SaaS solves one of the biggest challenges: how to fund multi-enterprise solutions.

Read More

Dell reboots their supply chain. Again.

The sight must have raised eyebrows. A dozen geeks marching conga-line-style through the cubes at Dell, placing a mass of network cables on an exec’s desk as a trophy to a major milestone. OK, conga line might be a stretch, but you get the picture. The occasion was to celebrate a milestone in Dell’s massive B2B platform transformation in 2009, supporting the switch from in-house manufacturing to third-party providers (e.g., ODMs like Foxconn) and the expansion of their retail customer connections (e.g., Best Buy). The numbers? 1,900 trading partners migrated across 30 countries. Three legacy B2B platforms consolidated into one integration cloud platform. 200 servers decommissioned, along with 20 networks, 20 datacenter racks, 10 databases and 6 TB of storage. Partner onboarding times dropped from months to days. While this was written up a while back, a couple of recent items bring it back to the surface. First, this week Dell announced they’d doubled quarterly earnings. They “crushed it”, as Gary V would say. This is good news not only for shareholders but could also be seen as a sign of a rebounding economy (disclosure: GXS is a Dell customer; we use PowerEdge servers in Trading Grid, our integration cloud platform). Second, the world of B2B managed services is going through a renaissance, as cloud computing helps blur the lines between traditional B2B e-commerce and any-to-any (A2A) integration. This “Integration Brokerage” market (a term coined by Gartner) is growing at 18-20% and describes an IT managed service that delivers people, methodologies and cloud-based integration, such as integration platform-as-a-service (iPaaS), for B2B e-commerce and cloud services integration projects. What about you? Will an ERP project spur the need to modernize your B2B platform? Will there be conga lines in your future?

Read More

How will Cloud Computing Benefit the Manufacturing Industry?

I have been working in the IT industry for nearly 20 years now, always very closely with the manufacturing sector. In this time I have been fortunate to work for companies in the Product Lifecycle Management (PLM), Enterprise Resource Planning (ERP) and Business to Business (B2B) IT sectors. I have seen new IT trends come and go, but one trend that I think will be around for some time is Cloud Computing. The manufacturing sector is truly global in nature and has a diverse range of suppliers working in both established and emerging markets. I have discussed B2B in the emerging markets in previous blog posts, so I do not intend to cover why companies should establish a presence in these regions, as the benefits are fairly well documented. What I would like to cover in this blog post is why cloud computing will help change the way manufacturers run their operations around the world. It is really only in the last few years that collaborative tools have started to be more widely used across manufacturing companies. Whether it is exchanging large CAD/CAM files, sharing production information or simply improving the way suppliers exchange information with their customers, in my mind collaboration has been undertaken in a somewhat ad-hoc way. This may be because of the limitations of the various IT systems or, more likely, resistance to using collaborative tools across the manufacturing sector. From a design perspective, collaboration tools have seen relatively slow adoption due to the sensitive nature of product information and the reluctance to send design information outside a company’s firewall. But with new, secure communication protocols such as OFTP2 being introduced to the automotive sector, and the ability to encrypt information to make it more secure, will manufacturers start to lose their inhibitions and exchange design information more freely?
With the drive to lower costs, manufacturers have established a presence in emerging markets such as China and India, and one of the first problems they face is connecting a remote plant or office to the company’s central IT platform. In many cases, a western manufacturer will send IT personnel to the new plant to help get the IT infrastructure up and running. They may have to stay onsite for a lengthy period to ensure that the remote employees are up to speed with the new IT systems. In fact, there is a chance that the remote employees may have to be supported onsite for many months until the production systems stabilize and information begins to flow seamlessly from the various IT systems back to the HQ environment. Now, what if you could simply connect a new plant into a ‘Manufacturing Cloud’, and this cloud contained everything required to get the new plant online and up and running as quickly as possible, with minimum resources applied? Cloud environments are widely regarded as being made up of three core components: Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS). Together, these three components have the ability to replicate an entire behind-the-firewall IT environment within a hosted or Cloud based environment. Many software companies now offer SaaS applications, for example SAP with their Business ByDesign solution, which is effectively an ERP system in the Cloud. From a PaaS perspective, GXS Trading Grid offers the world’s largest Integration Cloud Platform, allowing more than 150,000 trading partners to connect with each other. The combination of SaaS, IaaS and PaaS will allow manufacturers to establish a presence in virtually any location around the world.
They will be able to connect plants to an IT infrastructure using a ‘Manufacturing Cloud Template’, i.e. everything required to get the plant up and running. For example, PLM collaboration tools, ERP applications, B2B and community management tools will all be pre-configured according to a user’s role within the business and deployable with minimal effort. If Facebook can have 500 million users interacting and exchanging information on their website on a daily basis, then it must be possible to replicate this concept across, say, 1,000 employees at the new plant and 500 trading partners who may be providing goods and services on a daily basis to keep the new plant running. In fact, a recent article in IndustryWeek by ARC President Andy Chatha highlighted the importance of building Cloud based, company-wide networks to bridge the gap between the various departments within a manufacturing operation. Andy goes on to say that U.S. manufacturers in particular “need to become more innovation centric to increase their global competitiveness”, and increased use of technology is key to this. Many of today’s manufacturers are seeing their global aspirations eroded by legacy IT systems which make it difficult to extend an IT infrastructure into other regions of the world. Cloud based infrastructures allow manufacturers to overcome these restrictions and to seek new business opportunities in any region of the world. Key to Cloud adoption across the manufacturing sector will be support from all management levels within the company, including the CEO. If companies are able to realize significant cost savings by adopting cloud based services, and real time information can flow freely across a seamless ‘collaboration network’, then the company will be able to make more informed decisions about strategy and the future growth of the business.
Another quote I liked from Andy was that “manufacturers have done a good job over the past decade of optimizing and automating their supply chains.  Now the focus needs to be on “end-to-end” integration that fosters collaboration between R&D, engineering, operations and functions”.  
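The ‘Manufacturing Cloud Template’ idea above can be sketched in a few lines: a new plant is described once, and the services each employee needs are derived from a pre-configured, role-based template rather than installed by hand. This is purely illustrative; every role and service name below is invented, not a real product catalogue.

```python
# A hypothetical sketch of a role-based 'Manufacturing Cloud Template'.
# All role and service names are invented for illustration only.

PLANT_TEMPLATE = {
    "design_engineer": ["plm_collaboration", "cad_file_exchange"],
    "production_planner": ["erp_saas", "b2b_messaging"],
    "supplier_manager": ["b2b_messaging", "community_management"],
}

def provision_plant(plant_name, staff):
    """Return the cloud services to enable for each employee, by role."""
    rollout = {}
    for person, role in staff.items():
        # Unknown roles get no services rather than failing the rollout.
        rollout[person] = PLANT_TEMPLATE.get(role, [])
    print(f"{plant_name}: {len(staff)} users provisioned from template")
    return rollout

services = provision_plant(
    "New Plant", {"li": "design_engineer", "chen": "production_planner"}
)
```

The point of the sketch is that adding plant number two is a data change (a new `staff` mapping), not a months-long onsite IT project.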

Read More

Will AS4 Become the Communications Standard for Cloud Based Integration Services?

2010 has certainly been the year when Cloud Computing started to get onto the agenda of many CIOs around the world. Nearly every major ICT conference discusses Cloud Computing in some shape or form, whether it is the infrastructure to support it or the applications that run within the cloud environment. In early November I attended the 112th EDIFICE plenary in Amsterdam, which was hosted by Cisco. EDIFICE is an industry organisation that helps drive the adoption of B2B standards and processes across the High Tech industry. I always find these plenary sessions useful for learning about the latest High Tech industry trends, and one that caught my eye at this session was how Cisco was deploying an AS4 based environment called Web Services Externalization (WS-X). WS-X essentially acts as a gateway supporting different message formats and has the ability to route requests to the relevant service providers. The framework utilizes a number of modules from Apache, including Axis2/Java (an open source SOAP implementation), Sandesha (a reliability module) and Rampart (a security module). Now, I have been at GXS for nearly five years, and in that time I have seen a lot of interest in AS2, primarily amongst our retail customers, and some of our manufacturing customers have implemented AS2 as well. A few years ago I heard that AS3 was on the starting blocks and was going to be the next big thing, but for one reason or another it doesn’t seem to have seen widespread adoption, at least compared to AS2. AS3 even promised the capability to transfer large files across the internet, but even this feature didn’t help it gain momentum in the market. AS2 had one advantage over AS3 in so much as the world’s largest retailer, Walmart, decided to standardize on AS2 for all their trading partner communications.
At that time, trading partner communications via the internet was receiving a lot of interest from many retailers, and Walmart decided to take the initiative to standardize on an approach for its deployment across their business. To date AS3 has not had this level of interest, and I think it is really a solution looking for a problem. But now AS4 is here, and you may be asking whether companies should take an interest in it when AS3 failed to gain any significant market traction. Well, I guess it is all about timing. With so much interest in everything Cloud at the moment, companies are starting to look at ways not only of deploying enterprise applications to a cloud based environment but also of providing integration to other cloud based services, whether private or public clouds. AS4, with its web services capabilities, has the potential to become the cloud based communications standard moving forwards. AS4 is quite similar to AS2 in many ways, however it operates within a web services context and, unlike AS2, has enhanced interaction patterns and acknowledgement receipts. AS4 has the following characteristics:

– Acknowledgement receipts, enabling reliable message delivery and retry in the event of a lost message
– Password authentication, digital signatures and encryption, confirming the authenticity of the sender and ensuring that the message is unaltered whilst in transit
– Large file compression and transfer support
– Error generation, reporting any errors to the message sender or the message receiver
– Message exchange patterns, allowing a rich variety of interactions between the sender and receiver

Many companies today use a variety of communication protocols when exchanging B2B documents with their trading partners. AS4 adopts the ‘just enough’ design principle and defines a lightweight profile based on ebMS 3.0.
The lightweight nature of AS4 means that it is a relatively low cost communications standard to implement and is therefore ideal for use with trading partners based in emerging markets, for example, who typically have limited technical capabilities. Many high tech suppliers are establishing a presence in the ‘new’ emerging markets of Vietnam and Thailand. As mentioned previously, AS4 has the potential to become the standard for inter-cloud integration. From an integration perspective there are two key layers that make up an integration stack: the messaging layer and the payload layer. To achieve cloud interoperability, Cisco is of the opinion that AS4 is the appropriate standard for the messaging layer. Cisco strives to use industry standards where possible and, in order to support transactions with high tech trading partners, the payload layer may use either OAGIS or RosettaNet PIP standards. A key challenge in cloud computing is interoperability among the various cloud providers. This will continue to be a challenge until interoperability requirements are standardized to support business exchanges; AS4 helps to address this challenge for the messaging layer. The combination of standardized transports and message content will help reach critical adoption levels, continuing to drive down costs and improving time to capability for business exchanges over the internet. Cisco is one of the first companies that I have heard of that is actively using AS4 and, as with the ‘Walmart effect’ caused by their AS2 supplier mandate, I wonder whether Cisco will help drive wider adoption of AS4 in the High Tech market space, especially given their position in the market for setting up data centres to host cloud based infrastructures. If nothing else, AS4 is perfectly timed to meet the challenging needs of integrating cloud based services.
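The receipt-and-retry behaviour that sets AS4 apart from plain HTTP push can be illustrated with a toy sketch. To be clear, this is not the AS4/ebMS 3.0 wire format (which uses signed SOAP signal messages); it only mimics the pattern: keep resending a message until the receiver returns a receipt referencing it, up to a retry limit.

```python
# Toy illustration of AS4-style reliable delivery: resend until a
# receipt referencing our message arrives. NOT the real wire format --
# a real AS4 receipt is a signed ebMS 3.0 signal message.

import hashlib

def make_receipt(message_id, payload):
    # Stand-in for a non-repudiation receipt: references the original
    # message id and a digest of its payload.
    digest = hashlib.sha256(payload).hexdigest()
    return {"ref_message_id": message_id, "payload_digest": digest}

def send_with_retry(send_fn, message_id, payload, max_retries=3):
    """Resend until a receipt for our message id comes back."""
    for attempt in range(1, max_retries + 1):
        receipt = send_fn(message_id, payload)  # may return None (lost)
        if receipt and receipt["ref_message_id"] == message_id:
            return receipt, attempt
    raise RuntimeError(f"no receipt after {max_retries} attempts")

# Simulate a receiver that loses the first transmission.
calls = {"n": 0}
def flaky_receiver(message_id, payload):
    calls["n"] += 1
    return None if calls["n"] == 1 else make_receipt(message_id, payload)

receipt, attempts = send_with_retry(flaky_receiver, "msg-001", b"<Order/>")
print(f"delivered on attempt {attempts}")  # delivered on attempt 2
```

The same reception-awareness idea is what lets AS4 tolerate the flaky connectivity common among smaller trading partners in emerging markets.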

Read More

Finding Context in a River of Data

I’ve discussed the river of data that flows between trading partners. This river of data has the potential to deliver far beyond its role in business process execution, but there are challenges. The first challenge is establishing a context. A splash of water could be the drop that breaks the dam, the drink that saves someone from thirst, the coolant that prevents the engine from overheating, or any one of a dozen other potentials. When a purchase order, ship notice, payment, or change in inventory status comes flowing across the wire, it too needs a context to be truly useful — but what do I mean exactly by context? It’s one of those concepts that is easiest to understand through examples. Types of Context in B2B. Process: what business process is this transaction a part of? One of the most common processes we help customers with is the “order process”, but there are actually many intersecting processes going on in any real interaction. For instance, orders (hopefully) result in shipments and (sadly, for the buyer) invoices. The shipment and invoicing processes intersect the order process — so a given transaction (say, an Advance Ship Notice) may participate in many processes simultaneously. The ability to see a document in the context of a process is very valuable. Order processes are typically not too bad in this regard, as the originating order number is often part of the downstream transaction flow. The trick to process context is associating a given transaction with a completely defined process so that you know how it is supposed to be handled. Customer/Supplier (Partner): what business relationship is this transaction a part of? Hopefully knowing your partner is not an issue — but partner context is not about identity, it is about the relationship. If a supplier is going to be late on an order, is it part of a pattern? Am I about to violate a service level for a customer with whom I am in the midst of renegotiating a big contract?
This is theoretically the realm of the CRM/SRM/ERP infrastructures, but very few of them operate well in real time, as events are flowing in off the wire. The ability to put a B2B transaction into the right partner context quickly can have a big impact. Product: what product line is addressed by this transaction? Is it a new product, a retired product, or a product that doesn’t exist (i.e. a data error)? Product context is also the entry point for trade promotions management in some industries. This has been an interesting area of late, as customers have sought to tie B2B execution systems to master data management, precisely to ensure a correct product context. Financial: how big is this transaction? What effect does it have on the financial metrics of both organizations? Traditionally the realm of ERP systems, the ability to look into the river of data and see the money flowing can be very powerful. Sometimes establishing this context is as simple as integrating effectively with the ledgers within the traditional ERP systems. Other times, smaller companies may look for help from a SaaS (software as a service) offering to deliver financially oriented reports on what is flowing (because their internal systems lag reality a bit). Logistical: what does this transaction tell me about the logistics involved? Specifically, is this shipment/payment/etc. going to be where I need it (either physically or financially) when I need it? As an example, if a big promotion is planned for a given product, and several containers of it are held up in customs, that could be a big issue. If a truck of critical components for a factory is delayed, is that an annoyance or will it grind production to a halt? Part of the logistics context is understanding how things move, and what ultimate impact a disruption could have. If you know that an item is being shipped by boat, rail and truck — and is delayed in a rail yard — can the truck make up the time? When is the product needed?
To establish a logistical context, you need to know the planned logistics (often available in shipping transactions) and the historical performance of the modes and carriers involved. Market/competitive: has there been a change in the order pattern? If a few customers (or even one) change their order pattern, it could be a sign of a competitor move. If several customers (or suppliers) suddenly change behavior, it could be a sign of a shift in the market. Change is really only something you can observe within the context of a given relationship. The reason B2B in context is a critical area to observe is that averages and aggregates can sometimes be misleading. If the market is growing, but a big customer is shrinking, you may not see the customer issue in the totals. There are many kinds of context I have not even discussed (geographical, regulatory, legal), but you get the idea. The challenge is to decide which contexts make sense, and to learn to see the river of data flowing between you and your partners through the lens of those contexts.
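The process-context idea above boils down to a join: an incoming ship notice only becomes information once it is linked back to the order it belongs to, and an orphaned one is itself an exception worth surfacing. Here is a minimal sketch of that join; the field names are invented for illustration.

```python
# Minimal sketch of 'process context': join an incoming ship notice
# (ASN) to the order it references, flagging orphans as exceptions.
# Field names are illustrative, not any real EDI schema.

orders = {
    "PO-1001": {"customer": "RetailerA", "promised": "2011-03-15"},
}

def contextualize(asn, orders):
    """Attach order context to an ASN, or mark it as an exception."""
    order = orders.get(asn["order_ref"])
    if order is None:
        # No matching order: free-floating data, route to exception handling.
        return {**asn, "context": None, "exception": "unknown order"}
    return {**asn, "context": order, "exception": None}

enriched = contextualize({"asn_id": "ASN-77", "order_ref": "PO-1001"}, orders)
orphan = contextualize({"asn_id": "ASN-78", "order_ref": "PO-9999"}, orders)
```

The same lookup pattern extends to the other context types: swap the `orders` table for a partner relationship store, a product master, or a ledger, and the join key changes while the shape of the problem stays the same.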

Read More

The Cloud may have the Computing, but the River has the data….

Organizations around the world are busy extolling, exploring or arguing with the notion of Cloud Computing, but there is another metaphor that may grow in importance as Cloud Computing matures — the River of data that flows between organizations and their partners, which can be tapped for power and profit. Every day, companies exchange millions of electronic documents with each other that have some very special characteristics:

– the data matches actual transactions occurring in the “real world” (orders, shipments, payments)
– the data is structured in a way that provides meaning; it can be processed and turned into information
– the transaction data has relationships to other transactions in the River that can be correlated

Unfortunately, for most organizations, the potential to tap into this powerful source of information remains just that: potential. The challenges inherent in examining, correlating, and acting upon the massive flows of data generated by even modest enterprises have been overcome only rarely, and often not at a sufficient scale to truly exploit the opportunity. In this post I’d like to look at some of the challenges that make this difficult, and I will explore the opportunities in future posts. Some of the challenges….

Context: even today, most B2B transactions utilize traditional EDI standards like ANSI X12 or EDIFACT. Despite the standardization of documents, it is frequently tricky to understand the context in which a given transaction is operating (e.g. for a given ASN — ship notice — what order process is it part of?). Without context, it is a free floating piece of data, requiring some associations to turn it into information. If a critical shipment is one day away from delivery to a store about to run a promotion, that is information — if Shipment #101 has been processed and I don’t know what process it is part of, that’s just data.

Timeliness: while immediacy is a great benefit of the data flowing between trading partners, age is its enemy. With today’s more efficient supply chains, data starts to go stale very rapidly. The challenge is to connect data flowing between partners to other information and to business processes before it is too late to act upon it. The data flow is not unlike electricity generated by a dam, which must be sent over wires to be consumed in real time. Data warehouses can use the last three years of data to help forecast, but a logistics system has to act upon what is happening in the chain today to deliver ROI.

Exceptions: until “the perfect order” is actually achieved, some of the most critical issues between partners are those that generate exceptions, either technically (cannot process an order or logistics document) or on the business side (wrong product, amount, price). The latency involved in resolving these exceptions is truly scary, with most of them basically “failing out” to a manual process. Despite the fact that the “success case” may be 100% automated, the most common “automated” exception handling is an email alert (how full is your inbox?).

Integration: the “father of all challenges” when it comes to automation (and its role in resolving the previous three challenges) is the ability to integrate the flow of data into the many systems that are capable of rapidly providing context, and of identifying and handling exceptions. This challenge has always been daunting, but recent developments in technology have started to improve the integration possibilities.

Scale: the sheer volume of data, while being its greatest strength, may also be its biggest challenge. Since millions of transactions flow between partners on a given day, to understand what is happening, all of those transactions have to be interpreted, put into context and analyzed for meaning. RIGHT NOW! If the approach is too methodical, you succumb to the timeliness challenge, but if the approach is too loose, you may generate so many “false exceptions” that you actually degrade business performance rather than improving it.

These challenges are by no means the only — or even necessarily the most difficult — obstacles to successfully navigating the River, but I want to start focusing on the opportunities in my next post. In the meantime, please share additional challenges to tapping into the flow without drowning.
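The timeliness challenge above has a simple operational form: an event is only worth routing to a decision-maker while it is still fresh enough to act on. The sketch below illustrates that with an arbitrary four-hour action window; both the window and the event shapes are illustrative assumptions, not anything from a real logistics system.

```python
# Sketch of the 'timeliness' challenge: transactional data loses value
# quickly, so only fresh events are routed for action. The 4-hour
# window and event fields are arbitrary illustrative choices.

from datetime import datetime, timedelta

FRESHNESS_WINDOW = timedelta(hours=4)

def actionable(events, now):
    """Split events into those fresh enough to act on, and stale ones."""
    fresh, stale = [], []
    for event in events:
        bucket = fresh if now - event["received"] <= FRESHNESS_WINDOW else stale
        bucket.append(event)
    return fresh, stale

now = datetime(2011, 4, 1, 12, 0)
events = [
    {"id": "SHIP-1", "received": datetime(2011, 4, 1, 11, 0)},  # 1h old
    {"id": "SHIP-2", "received": datetime(2011, 4, 1, 6, 0)},   # 6h old
]
fresh, stale = actionable(events, now)
# SHIP-1 is still within the action window; SHIP-2 has aged out.
```

Tuning that window is exactly the methodical-versus-loose trade-off described under Scale: too tight and everything looks stale, too loose and you drown in “false exceptions”.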


Who has the Largest B2B Integration Network?

Since IBM’s acquisition of Sterling Commerce in May there has been a lot of discussion about the consolidation of B2B networks (and what it means for the industry).  One of the topics discussed amongst B2B integration professionals is: who has the largest B2B network?  Numerous analysts and thought leaders declared GXS to be the largest following the successful acquisition of Inovis in June. However, I am not convinced that GXS really is the biggest.  Amongst the traditional supply chain oriented EDI networks, GXS earns top ranking for network-based revenues, annual transaction volumes and number of companies connected.  However, if you were to consider other types of similar B2B networks in adjacent industries such as financial services or health care, then the #1 position is not as clear.

For purposes of this analysis, I consider a B2B network to be a vendor providing a private cloud specifically designed for exchanging standardized messages to automate cross-enterprise business processes.  Many such networks exist to support different business processes ranging from supply chain to financial services to health care.  Below are five of the largest that I am aware of, in addition to GXS:

Ariba – describes its Supplier Network as “the largest transacting network in the world.”  Ariba’s network boasts over 300,000 suppliers which process over $120B in annual spend in 70 currencies across 130 countries.  Buyers and suppliers leverage Ariba’s network to exchange product/service catalog data, purchase orders and commercial invoices.  Over 23M purchase orders and 11.5M invoices are processed annually.  Ariba supports a diverse range of standards including cXML, EDI, CIF and punchout.  The company reported $339M in annual revenues for 2009.  Ariba embeds its Supplier Network within its various spend management, e-procurement and e-sourcing applications, making it challenging to derive what percentage of revenue comes specifically from the network.

Emdeon – is the largest player in the Health Care Revenue Lifecycle Management segment (formerly referred to as HIPAA clearinghouse).  Emdeon states that its clearinghouse “has connections to more payers, providers and vendors than any other healthcare business in the marketplace.”  Emdeon’s network connects to 340,000 health care providers; 81,000 dentists; 55,000 pharmacies; 5,000 hospitals; and 1,200 government or commercial payers (e.g. Medicare, Medicaid, HMOs, PPOs).  One out of every two commercial health care claims delivered electronically in the US was processed by Emdeon.  In total, Emdeon processed 5.3B health-care related transactions in 2009, resulting in $900M+ in revenues.  The most common transactions include eligibility verifications, referral authorizations, claims submissions and remittance advices, which are typically in the HIPAA EDI standard formats.

SWIFT – is a co-operative organization owned by a consortium of financial institutions around the world.  SWIFT has arguably the largest B2B network in both the banking and capital markets segments.  Over 9,000 financial institutions representing over 200 countries utilize SWIFT’s network to exchange payment instructions, account statements, letter-of-credit documents and securities holdings reports.  SWIFT supports both its own set of standards (FIN/MT messages) and numerous third-party standards such as ISO 15022, ISO 20022 XML, FpML and FIX.  In 2009, SWIFT processed 3.76B messages, approximately 49% of which were payment-related and 43.9% securities-related.  SWIFT generated €596M of revenue in 2009, of which €360M was from traffic.

NYSE Technologies (NYFIX) – there are a number of large FIX networks.  Each operates as a product line within a much larger IT or financial services firm, which makes comparative analysis challenging.  For example, NYSE Technologies acquired the largest remaining independent vendor, NYFIX, in 2009.  Pre-acquisition, NYFIX reported over $70M in 2008 revenues from FIX-related products serving 1,000 firms.  Sungard’s Global Network is quite large as well, but the only information publicly available is the number of connections: 2,500 financial institutions.  Thomson’s TradeWeb Routing Network is another large FIX network, with 2,000 institutional customers trading 1.2B shares representing $280B daily.  Thomson claims to process 12-15% of daily NYSE and NASDAQ volumes but, like Sungard, provides no information about revenues generated from the service.

Omgeo – is a joint venture between Thomson Financial and the Depository Trust Corporation (DTC).  Omgeo provides a network to facilitate securities clearing transactions (middle office) between investment managers, broker/dealers and custodian banks.  The primary transactions exchanged include allocations, confirmations, affirmations and post-trade matching reports (see end of post for clarification).  Omgeo provides services to 6,000 clients representing 2,400 different financial institutions.  The joint venture does not report revenues publicly, but a 2006 press release stated that Omgeo generated an impressive $250M annually.

GXS – is the largest player in the traditional EDI network space.  GXS focuses primarily on supply chain transactions to support the retail, automotive, high tech and manufacturing value chains.  The highest volume transactions processed on the Trading Grid include purchase orders, shipment statuses, commercial invoices, payment instructions and product catalogs.  While EDI remains the most popular choice amongst its customers, GXS also supports a wide range of XML and vendor-specific file formats (e.g. SAP IDoc).  GXS claims 40,000 direct customers and 150,000 connected trading partners.  The company maintains direct operations (employees and offices) in 20 countries throughout the world.  In its most recent SEC filing, GXS reported that its 2009 revenues combined with Inovis were $487M (pro forma, unaudited), the majority of which were from network-based services.

There are numerous other vendors in the B2B integration sector that I did not list above.  In a future post, I will provide more details on the other large networks.  But of those that I did list, which do you think should earn the title of largest B2B integration network?  Of course, the answer depends upon how you measure “largest.”  Several different metrics could potentially be used, such as:

Annual revenues generated
Number of customers (or connections)
Annual transaction volumes
Number of countries serviced

If you were to rank based upon publicly available revenue reports, then Emdeon would clearly place #1, even though it only focuses on the US market.  SWIFT, which operates around the world in multiple financial sub-verticals, would rank #2.  GXS, following its Inovis merger, would be #3.  There would be some competition for #4 between Ariba, Sterling Commerce, Omgeo, NYSE Technologies (NYFIX) and others.  That is my perspective, but comment below to let me know what you think.

Note that the securities transactions processed on FIX networks are trade-related (front office) and precede the stage of the lifecycle in which Omgeo is involved.  And the securities transactions processed on the SWIFT network are typically later in the lifecycle (settlement or asset servicing) than Omgeo’s.
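For what it’s worth, the revenue-based ranking above can be reproduced from the figures cited in this post. A quick sketch, assuming an illustrative 1.39 USD/EUR rate to convert SWIFT’s euro revenues (the exchange rate is my assumption, not from any filing):

```python
# Annual revenue figures (USD millions) as cited in this post; SWIFT's €596M
# is converted at an assumed 1.39 USD/EUR rate purely for comparability.
revenues = {
    "Emdeon": 900,
    "SWIFT": round(596 * 1.39),
    "GXS (incl. Inovis)": 487,
    "Ariba": 339,
}

# Rank vendors by revenue, highest first.
ranking = sorted(revenues.items(), key=lambda kv: kv[1], reverse=True)
for rank, (vendor, rev) in enumerate(ranking, start=1):
    print(f"#{rank} {vendor}: ${rev}M")
```

Of course, swap the sort key for transaction volumes or connection counts and the ordering changes considerably, which is exactly why "largest" is so slippery.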


Does Your Supply Chain Have An Early Warning System?

New Predictive Intelligence-Based Financial Industry Risk Ranking Offers A Great Model for Global Supply Chains

The dust, or ash cloud if you prefer, has all but settled from the recent Icelandic volcanic eruption. An eruption that grounded flights, yet gave flight to heightened awareness of the need to intelligently predict and respond to business disruptions. In the days following the event, I was heartened to see humorous (and futile) attempts to master the pronunciation of Eyjafjallajokull quickly give way to discussions of supply chain risk and resiliency.

Systemic Risk Ratings For Financial Institutions

With ash clouds serving as the perfect backdrop for risk mitigation initiatives, I found it very interesting to see New York University’s Stern School of Business launch the NYU Stern Systemic Risk Rankings last week: a weekly rating and ordering, by level of risk, of the largest U.S. financial institutions according to the risk each brings to the financial system. The rankings utilize numerous elements of market data, from 1990 to the present, and provide an early warning that will help identify threats to the overall health of the financial system. The Systemic Risk Contribution Index (SRISK%) ranks firms by the percentage of total system risk each is expected to contribute in a future crisis. The rankings take into consideration factors such as company size, exposure to loss of market capitalization, and leverage. While the new rating will serve as a dashboard to monitor and react to the volatility of corporate health, I can’t help but wonder whether a similar early warning system couldn’t have prevented the recent financial industry meltdown, credit default swaps notwithstanding.

Past Performance IS Indicative Of Future Results

Unlike what mutual fund prospectuses are obligated to warn us, it is vital to look into the past performance of a company’s trading partners, along with measures that analyze prevailing internal and external events, to measure risk. An early warning system similar to the NYU Stern risk ranking is imperative for corporations to predict and prevail in the face of potentially devastating supply chain disruptions. Risk and mitigation strategies are arguably subjective and often highly dependent upon the structure and dynamics of each supply chain. Yet accurate, effective and timely KPIs, benchmarks and scorecards that track the operational performance of vital supply chain partners, such as vendors, contract manufacturers, carriers and agents, are absolutely imperative to managing risk. I will elaborate on the various facets of supply chain risk that can be discerned by analyzing operational data in subsequent posts.

Where There Is Risk, There Is… Insurance!

You know supply chain risk is no longer a theoretical or esoteric concept when you have a structured insurance product to help you hedge against it. Zurich Financial Services Group has launched two new products, Supply Chain Risk Assessment and Supply Chain Insurance, to help businesses protect their supply chains in this modern, global marketplace. Corporations would be well served to consider such offerings alongside their own internal initiatives to monitor and manage risk factors before they snowball into expensive disruptions. Operational dashboards, facilitated by a nimble way of analyzing operational data exchanged between trading partners and correlating it to seemingly exogenous market events, deliver predictive intelligence. This is the vital early warning system that every supply chain needs.
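To make the SRISK% idea concrete, here is a toy sketch of such a contribution index. The firm names and expected capital shortfall figures are invented, and the real NYU Stern methodology is far richer, but the core arithmetic (each firm’s share of the total expected shortfall in a crisis) looks like this:

```python
# Hypothetical expected capital shortfalls ($B) for each firm in a crisis.
# Negative shortfalls (a capital surplus) contribute zero to system risk.
shortfalls = {"Bank A": 120.0, "Bank B": 80.0, "Bank C": 50.0, "Bank D": -10.0}

total = sum(max(s, 0.0) for s in shortfalls.values())

# SRISK%: each firm's percentage contribution to total expected shortfall.
srisk_pct = {firm: 100.0 * max(s, 0.0) / total for firm, s in shortfalls.items()}

# Order firms from most to least systemically risky.
ranking = sorted(srisk_pct.items(), key=lambda kv: kv[1], reverse=True)
```

A supply chain analogue would replace capital shortfalls with, say, each partner’s expected fulfillment shortfall during a disruption, and rank vendors or carriers by their contribution to total exposure.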


Integration Smackdown: Documents versus API in B2B

I have been focused of late (a rare luxury for me) on exploring integration technologies for the emerging cloud computing providers, especially the SaaS guys.  Last week I attended Dreamforce 2009, which was an absolutely fantastic experience. Salesforce.com has posted a bunch of the material on its site, and it is worthwhile.

While at Dreamforce, I attended training on integration with Salesforce.com, both to and from.  I enjoyed the class, and found the relatively rich set of alternatives (remember that this is a very young company that has quadrupled its workforce in the last few years) to be a good toolbox for integration.  One of the more intriguing aspects is how the company’s engineers balance control of service levels (i.e. their ability to prevent you from hurting their performance when you integrate) with access.  One of their key technologies is a set of “governors” to control the resources available.

While I really enjoyed the exposure to API-level integration, I cannot help but compare all of this to how we traditionally do integration between business partners, as well as to SAP, which is via standard documents.  Apparently I’m not alone, as Benoit Lheureux of Gartner wrote a thoughtful blog post on how cloud integration will affect traditional integration.

From my point of view, API-level integration (and yes, this includes web services) is usually more powerful and functional than integration driven through documents (disclaimer: yes, I realize that at its heart web services is basically the exchange of XML documents over HTTP, but for this discussion it feels like an API).  But document-driven exchange is easier, more interoperable, and often enjoys much higher adoption.  And a major part of the “document advantage” is the ability to separate the handling of the document from “transport” (communications), which often allows the use of familiar technologies to perform the integration.
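As a thought experiment, a “governor” of the kind mentioned above can be sketched as a simple sliding-window rate limiter. This is purely illustrative and does not reflect Salesforce’s actual governor mechanics or limits:

```python
import time

class Governor:
    """Minimal sliding-window sketch of a resource 'governor': callers are
    cut off once they exhaust an allotment of calls within a time window."""

    def __init__(self, limit, window_seconds):
        self.limit = limit           # max calls allowed per window
        self.window = window_seconds
        self.calls = []              # timestamps of recent calls

    def allow(self, now=None):
        now = time.monotonic() if now is None else now
        # discard calls that have aged out of the window
        self.calls = [t for t in self.calls if now - t < self.window]
        if len(self.calls) < self.limit:
            self.calls.append(now)
            return True
        return False
```

A provider can wrap each incoming API call in a check like this and reject or queue the work once the limit is hit, which is exactly the service-level protection (at the cost of some access) described above.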
For my purposes, API-level integration is the calling of specific low-level services, with a set of arguments, often (though not always) using web services.  Although web services is “standard”, the service calls are typically specific to a given SaaS provider (i.e. proprietary).  Additionally, the service calls may change based on the configuration of a given customer’s instance (i.e. custom proprietary).  Document-level integration, in contrast, uses the basic pattern of sending in a document in a defined format, often (again, not always) in XML.  This document may vary from customer to customer, but most of it is the same, and customers can often submit the document in a variety of ways (web services, FTP, etc.).

SAP is a pretty popular ERP system in the current world, and it supports a wide variety of integration technologies.  In our B2B managed services world, IDocs over ALE is by far the most common integration technology, despite the fact that there are often very good reasons to use a lower-level approach (directly invoking RFCs after sending data using FTP, for instance).  Why?  Among many, many other reasons, customers and integrators like solutions that work predictably across multiple environments.  IDocs, like X12, EDIFACT, OAG, etc., are defined entities that do not change when fields or custom logic are added to an application.  But possibly a bigger reason is the ability to use existing technology and processes to perform the work.  SAP IDocs can be manipulated using standard translation technology, and then sent via ALE or other communications methods.  The ALE protocol can be implemented in a comms gateway that you already own.

Modern communications gateways have extensive support for web services today, but that is really a kind of toolbox for building custom adapters to custom services, with each one a “one-off”.  This problem is intensified if the service you are connecting to changes over time as additional data fields are added to objects (a product object is especially susceptible to change). API-level integration is usually much more brittle for this reason, and it is one of the characteristics that led many enterprises to switch to message-oriented middleware and attempt to impose canonical standards (“canonical standards” is a fancy way of saying a common format, usually in XML, for frequently used documents, like customer; canonical standards are often adopted from outside, especially the OAGi standards).  Integrating to a single system via API is often fun; maintaining dozens or hundreds of these integrations is not.

A common pattern is for emerging technology categories to start with low-level APIs and gradually build up to more abstract interfaces that are easier to use and less brittle.  Already, in the case of Salesforce, they are offering a bulk load service, as well as a “metadata” API that allows tooling to simplify integration.  Over time, I fully expect that most major SaaS providers will provide integration methods that feel a lot like trading documents, and that B2B teams will take the lead in using them.

In the long battle between document and API integration, document-style integration will dominate, though API and service-level integration will play critical roles…
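To make the contrast concrete, here is a small sketch of the two styles. The client object, method names and document schema are all hypothetical; the point is that the API call is bound to one provider’s (possibly customized) schema, while the document can be built once and shipped over whatever transport is convenient:

```python
import xml.etree.ElementTree as ET

# --- API-style integration: provider-specific calls and arguments.
# (Hypothetical client; real SaaS APIs differ per provider and per org config.)
def api_upsert_account(client, name, region, custom_field_17):
    # Brittle: tied to one provider's schema, including its custom fields.
    return client.call("upsertAccount", name=name, region=region,
                       Custom_Field_17__c=custom_field_17)

# --- Document-style integration: build a standard document, then transport
# it by whatever channel is convenient (web services, FTP, AS2, ...).
def build_order_document(order_id, sku, qty):
    order = ET.Element("Order", id=order_id)
    line = ET.SubElement(order, "Line")
    ET.SubElement(line, "SKU").text = sku
    ET.SubElement(line, "Qty").text = str(qty)
    return ET.tostring(order, encoding="unicode")

doc = build_order_document("PO-1001", "WIDGET-9", 25)
```

The document function has no idea how `doc` will travel, which is precisely the separation of handling from transport that gives documents their interoperability advantage.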
