Companies with global supply chains need the agility and flexibility to respond to pressures that can arrive as suddenly as a volcanic eruption in Iceland or a tsunami in Japan. When their supplier lists run to the hundreds or even thousands, responding could well mean shifting manufacturing to another part of the world during a disruption, with the assurance that certified suppliers elsewhere can meet immediate demand.
Navigating changes like this isn’t easy.
The first hurdle is onboarding new suppliers, a process for which companies have historically used EDI (electronic data interchange). EDI defines a set of standard document formats that trading partners agree to use. The intent is that once partners test a common interface for transferring purchase orders, invoices, and other documents between their respective systems, each party can feed that information into its own internal systems.
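To make this concrete, here is a minimal sketch of what reading one such EDI document, an X12 850 purchase order, can look like. This is illustrative only, not any particular vendor's implementation: the sample message is invented, and the segment and element separators are assumptions (real interchanges declare their delimiters in the ISA envelope header).

```python
# Illustrative sketch: parsing a simplified X12 850 purchase order.
# Separators are assumed; production EDI reads them from the ISA header.
SEGMENT_TERMINATOR = "~"
ELEMENT_SEPARATOR = "*"

def parse_segments(raw: str) -> list[list[str]]:
    """Split a raw X12 message into segments, each a list of elements."""
    return [
        seg.split(ELEMENT_SEPARATOR)
        for seg in raw.strip().split(SEGMENT_TERMINATOR)
        if seg.strip()
    ]

def extract_line_items(segments: list[list[str]]) -> list[dict]:
    """Pull quantity, unit, and price from PO1 (baseline item) segments."""
    items = []
    for seg in segments:
        if seg[0] == "PO1":
            items.append({
                "quantity": int(seg[2]),
                "unit": seg[3],
                "unit_price": float(seg[4]),
            })
    return items

# Hypothetical purchase order: header, begin segment, one line item, trailer.
raw_po = "ST*850*0001~BEG*00*NE*PO-1001**20240101~PO1*1*100*EA*9.95~SE*4*0001~"
items = extract_line_items(parse_segments(raw_po))
print(items)  # [{'quantity': 100, 'unit': 'EA', 'unit_price': 9.95}]
```

Even in this toy form, the sketch hints at why certification is slow: both partners must agree on every segment layout, element position, and code value before documents can flow reliably.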
Unfortunately, certifying suppliers for EDI can be an iterative, time-consuming process unlikely to meet rapid time-to-market requirements. Further complicating matters, agile supply chain execution requires continuous collaboration between sourcers and suppliers. That collaboration depends on a common set of data spanning both traditional structured data and unstructured big data. How else can companies exchange, share, and collaborate over the engineering documents, product photos, videos, and other supporting data needed to design, produce, and distribute products?
This is also where a cloud-based supply chain solution that already has thousands of suppliers onboarded and pre-certified, along with common repositories of transactional and big data, can be an asset. For some companies, a cloud-based supplier community where the vendor manages both big and traditional data can cut the timeframe for onboarding a new supplier from weeks to days, with immediate impact on time to market.
Does this mean that companies now struggling with their big data initiatives should automatically seek out cloud-based alternatives?
Not necessarily.
Harnessing big data as an asset is a strategic path to mission-critical business intelligence. For this reason, companies should take an active role in developing their own big data strategies rather than simply outsourcing them. They should also watch for new, affordable big data offerings coming to market from vendors, aimed at companies that cannot afford their own supercomputers but still want to process big data internally.
However, this doesn’t mean that big data cloud services should be avoided.
There are excellent reasons to look to the cloud for immediate big data relief. What we have seen so far is that the companies that gain the most from this strategy come to the cloud with compelling business cases that the cloud is well positioned to handle, such as the supplier community example above.
A cloud-based big data solution also works for organizations just getting their feet wet with big data. In these situations, especially if the cloud provider has industry and analytics expertise in the company’s business vertical, the company can learn from big data practices and analytics that it can later apply on its own, if it chooses.
This is why many companies today are keeping their big data options open, seriously considering the cloud when it delivers a best-of-breed big data solution affordably and painlessly. At the same time, companies need to keep their long-term big data strategies in mind. That demands a strategic big data focus within the organization, one that should never be outsourced.