On first glance, Big Data seems to be the domain of only the largest enterprises. The technologies are new and complex, and the implementation costs staggering. Even the staff seemingly needed to run a Big Data operation is hopelessly out of reach of most smaller businesses, with data scientist salaries cracking the US$300,000 mark, a figure many business owners only dream of paying themselves, let alone a data scientist. Like many other technologies, however, Big Data need not be the sole province of the big guys.

Small is beautiful

Larger companies with larger datasets seem like ripe candidates for Big Data, but in many of these organizations the data are scattered across a wide variety of systems and platforms, and along with larger budgets and superior infrastructure come dramatically increased data quality issues. Smaller companies often have data that are more consolidated and, in some cases, have designed their systems and processes to generate and capture cleaner data.

At least half the battle of generating intelligence from data is having readily available data to begin with. So while your smaller organization might not be able to afford a massive Big Data initiative, you can start deriving actionable information from your data while the big competitor is still cleaning and consolidating.

Furthermore, like ERP, CRM, and other enterprise platforms, Big Data is rapidly commoditizing. Just as anyone with a credit card can subscribe to one of the cloud-based business software platforms and have Fortune 500 IT services in a matter of minutes, Big Data analytics are slowly moving out of the confines of the enterprise server room.

Big Data is not just about volume

The “Big” in Big Data is usually the focus of most talk about the technology, with massive storage arrays and in-memory databases assumed to be unavoidable costs of joining the Big Data party. However, Big Data is also about speed: generating actionable information in near real time. This speed is not only the ability to perform a technical function but also the “speed to market” of the data itself: essentially, the time from when a data point occurs to when a decision based on that data point is executed.

It’s generally easier in a small organization to institute new processes and procedures, and to make and execute rapid decisions based on data, than it is in most large organizations. For many smaller businesses, competing on speed has long been a go-to tactic against larger competitors. While the industry giant is crafting RFPs for its multi-million-dollar Big Data infrastructure, the small competitor can use open-source and cloud-based tools to perform more rudimentary analysis, and act on the results before the larger competitor even has its project approved, as the sketch below illustrates.
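To make that concrete, here is a minimal sketch of the kind of rudimentary, open-source analysis a small business could run today with Python and the pandas library. The file name and column names (a sales.csv with date, region, and revenue columns) are hypothetical placeholders chosen for illustration, not a prescription.

    # A minimal sketch of "rudimentary analysis" with open-source tools.
    # Assumes a hypothetical sales.csv with columns: date, region, revenue.
    import pandas as pd

    # Load raw transaction data exported from an existing system.
    sales = pd.read_csv("sales.csv", parse_dates=["date"])

    # Aggregate revenue by week and region to surface trends quickly.
    weekly = (
        sales.set_index("date")
             .groupby("region")["revenue"]
             .resample("W")
             .sum()
             .reset_index()
    )

    # Flag weeks where revenue deviates more than two standard deviations
    # from each region's mean: a simple, actionable anomaly signal.
    stats = weekly.groupby("region")["revenue"].agg(["mean", "std"])
    weekly = weekly.join(stats, on="region")
    weekly["flag"] = (weekly["revenue"] - weekly["mean"]).abs() > 2 * weekly["std"]

    print(weekly[weekly["flag"]])

Nothing here requires a storage array or a data scientist; a script like this can run on a laptop the same day the question is asked, which is precisely the speed advantage described above.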

Built-in intelligence

As with all technologies, successful implementation relies more on organizational acceptance and willingness to change than on buying the right hardware and software. With Big Data in particular, if an organization lacks the discipline to gather relevant data, develop the right questions to explore through analysis, and then muster the organizational gumption to act on the results, all the money and personnel advantages in the world will be for naught.