Every night, retailers around the world batch up their Point of Sale (POS) data and transmit it to their suppliers shortly after the stores close. Suppliers can gain a great deal of insight from inspecting retail sales data: how many of each of their products were sold, and in which stores. If they are lucky, the retailer may also share what other items were in the shopper’s basket and purchased at the same time. And if they are really lucky, the retailer may provide demographics about the specific shopper who made the purchase, which suppliers can use to understand the profile of their target customer.
Transmitting POS data is critical to demand sensing in the consumer products supply chain, but it often comes at the expense of a migraine headache for the IT department. The root cause of these headaches is that each retailer sends data differently. Some retailers will send each supplier the data for their specific SKUs. Others send over their entire day’s results across all their suppliers. Some retailers send five fields for each purchase transaction while others might send as many as 100 fields. Some days, only a subset of the POS data may be transmitted if the stores are unable to report in a timely manner. And during busy seasons, like the post-Thanksgiving holiday buying season, a spike in sales for particular items means much larger files. As a result, the size of the POS files to be received can vary considerably from day to day.
Why are large POS files a problem for IT? Well, they can be a big problem if the IT infrastructure for which they are destined wasn’t expecting them. First, there may not be enough disk space to accommodate the files. Suppose you have a 100GB hard drive. Your NOC has set an alert to notify them when the disk has less than 10% capacity remaining. At 9pm the disk has 12GB of space remaining. But starting at 10pm the retailers send over unusually large files whose combined size exceeds 14GB. Suddenly you are out of space and cannot accept the data.
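To make the arithmetic concrete, here is a minimal sketch of the kind of pre-flight capacity check a file transfer gateway could perform before accepting an inbound file. The function name and the 10% threshold are purely illustrative, mirroring the scenario above:

```python
GB = 1024 ** 3
ALERT_THRESHOLD = 0.10  # NOC alert fires when free space drops below 10%

def can_accept(total_bytes: int, free_bytes: int, incoming_bytes: int) -> bool:
    """True if the incoming file fits without breaching the free-space alert threshold."""
    return free_bytes - incoming_bytes >= total_bytes * ALERT_THRESHOLD

# The scenario from the text: 100GB disk, 12GB free at 9pm, 14GB of POS files at 10pm.
print(can_accept(100 * GB, 12 * GB, 14 * GB))  # → False
```

Of course, rejecting the file only moves the problem upstream; the point of the elastic approach described below is to avoid having to reject it at all.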
Large files also tend to choke firewalls and saturate local area network capacity. They can also monopolize the resources of the Managed File Transfer (MFT) or B2B integration software that processes the files. If your B2B/MFT software becomes too overloaded, you may get what is effectively a Denial of Service (DoS) condition for other transactions. So a customer who wants to send you a $1 million purchase order via EDI is unable to do so. Your HR team, which needs to submit its weekly payroll run to the local bank, is unable to do so.
POS files are not the only type of large file. These days, companies are exchanging many different types of unstructured data – images, videos, audio files, telemetry, logs and entire databases. And you never know when you will receive one of these files, because the sender rarely gives any warning. In today’s era of nearly-ubiquitous broadband and virtually free storage, why should they be bothered with worrying about the size of files?
By now you can probably guess where this is leading. Couldn’t cloud computing help with this problem? The large file transfer problem should be easily solved with the principles of elasticity that cloud providers bring to storage and processing power.
Enter commsbursting, a new technique being used by cloud-based integration providers such as GXS to automatically provision additional file transfer capacity in situations where large files threaten to deny service. For example, if processing or storage utilization hits a certain threshold (e.g. 70%), then the cloud provider can auto-provision additional resources to accommodate the spike in traffic.
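The trigger logic can be sketched in a few lines. This is a hypothetical illustration of the threshold model described above, not GXS’s actual implementation; the 70% threshold and the idea of adding capacity in fixed increments are assumptions for the example:

```python
BURST_THRESHOLD = 0.70  # e.g. burst when utilization exceeds 70%

def check_and_burst(used: float, capacity: float, burst_increment: float) -> float:
    """Return the (possibly enlarged) capacity after checking current utilization.

    In a real cloud environment the capacity increase would be an API call to
    the provider to spin up additional storage or processing resources.
    """
    if used / capacity > BURST_THRESHOLD:
        capacity += burst_increment  # auto-provision extra file-transfer capacity
    return capacity

# 80GB used of 100GB is over the 70% threshold, so 50GB more is provisioned.
print(check_and_burst(80, 100, 50))  # → 150
```

The same check would run in reverse once the spike subsides, releasing the extra capacity so you only pay for it while it is needed.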
Commsbursting is not only useful in situations with large files. It can be used to respond to any type of spike in demand. Most of the payment clearinghouses around the world, for example, operate only during business hours – a payment processing window might run from 9am to 4pm. As the 4pm cutoff approaches, there is a surge in payment requests as accounting groups rush to get transactions recorded with today’s date. The pre-cutoff volumes are relatively unpredictable. But no bank wants to be in the situation of telling a client that it could not process an important wire transfer submitted at 3:55pm because its IT infrastructure could not accommodate the load. Using a commsbursting model to handle the spike in payment transaction volume is an elegant and economical solution for a financial institution.
So you may be wondering – how do you get commsbursting for your B2B integration environment? Technically, there is no reason why you could not build the capability into a private cloud in your own data center. But the ROI of keeping idle capacity on hand just to burst into may not be very compelling. With a public cloud, however, the economics are far more favorable. A provider can recover the costs from a wide base of customers, each of whom benefits at only a fraction of the investment that would be required in-house.