One of the mistakes companies make with Big Data is assuming it's only about new data. It's not. It's also about the volume growth of traditional transactional data, which according to one survey is growing 50-60% a year. That growth is overwhelming transactional applications and analytics infrastructure, forcing significant hardware upgrades to keep pace. It's also about trying to take advantage of the variety of new data, which is growing 3-4x a year.
Companies that realize this are working through these roadblocks. With regard to budget, they're changing their architecture to avoid expensive hardware upgrades. They're offloading processing from source systems and data warehouses by moving to real-time data integration technologies running on commodity hardware (see the webinar recording Tackling Big Data Using Informatica PowerCenter Grid at http://vip.informatica.com/cathertoninformaticacom7562?elqPURLPage=10297). They're also addressing the challenges of IT know-how and the storage bulge by analyzing data more carefully, archiving what they're not using, and adopting a tiered storage approach. This frees up cash that can go toward new investments in big data projects. All these savings help them invest in handling the variety of data, and in innovations that lead to new revenue-generating data products and services.
Data integration and data cleanup are often 80% of the work in Big Data analytics, so organizations are investing in no-code visual development environments to build these data flows, which also lets them draw on more readily available resources such as ETL developers.
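To make that 80% figure concrete, here is a minimal sketch of the kind of cleanup-and-merge logic an ETL flow encapsulates: normalizing fields, standardizing date formats across source systems, and deduplicating records. The field names and source systems are illustrative assumptions, not anything from a specific product; visual tools like the one described above generate flows of this shape without hand-written code.

```python
from datetime import datetime

def _parse_date(value):
    # Different source systems export dates differently; try each known format.
    for fmt in ("%Y-%m-%d", "%m/%d/%Y"):
        try:
            return datetime.strptime(value.strip(), fmt).date().isoformat()
        except ValueError:
            continue
    return None  # flag unparseable dates rather than guess

def clean_record(raw):
    """Normalize one raw record: trim whitespace, standardize the date,
    and lowercase the email so duplicates can be matched."""
    return {
        "name": raw["name"].strip().title(),
        "email": raw["email"].strip().lower(),
        "signup_date": _parse_date(raw["signup_date"]),
    }

def integrate(*sources):
    """Merge records from several sources, deduplicating on email."""
    merged = {}
    for source in sources:
        for raw in source:
            rec = clean_record(raw)
            merged.setdefault(rec["email"], rec)  # first source wins
    return list(merged.values())

# Hypothetical records from two source systems with inconsistent formatting.
crm = [{"name": " alice smith ", "email": "Alice@Example.com",
        "signup_date": "2012-03-01"}]
web = [{"name": "Alice Smith", "email": "alice@example.com ",
        "signup_date": "03/01/2012"}]
customers = integrate(crm, web)  # one clean, deduplicated record
```

Even this toy version shows why the plumbing dominates: every new source adds formats to reconcile and conflicts to resolve before any analytics can run.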