Three years ago, Cloudera raised
what was then considered a mammoth pile of $40 million in cash. Today, the
Hadoop vendor is reportedly raising $200 million at a $2
billion valuation. It’s a colossal amount of money, topping MongoDB’s $150
million raise just last year. Ironically, it may not be nearly enough.

In what amounts to a tectonic shift
in how the industry stores, manages, and analyzes data, $200 million may be the
barest of table stakes.

You say you want a revolution?

Let’s be clear. Something big is happening here. Call it big
data. Call it whatever you want. We’re witnessing the first major change in
data infrastructure in 40 years, and investors are piling on. As Cloudera chief
strategy officer Mike Olson put it some three years ago when I
questioned his need to raise $40 million:

“I have been in the database industry for a
quarter century. I’ve [been deeply involved in] open source software for a
long, long time. I have been watching this roulette wheel go around a whole
bunch of times now, and I am convinced that I have the pattern. I know where
it’s going to stop next. You want to tell me how much money I
should put on that spot?”

At the time, it wasn’t clear that Hadoop was going to become the
800-pound elephant in the room and would quickly assemble a vibrant community
constantly upgrading and extending its functionality. It wasn’t obvious that
some NoSQL databases would move well beyond the web companies to go mainstream
in the enterprise. Big data was just a catchphrase that mostly meant “VCs’ latest hype.”

No more.

As Hortonworks’ vice president of strategy Shaun Connolly highlights:

“Apache Hadoop didn’t disrupt the
datacenter, the data did. The explosion of new types of data in recent years
has put tremendous pressure on the datacenter, both technically and
financially, and an architectural shift is underway, where Enterprise Hadoop is
playing a key role in the resulting modern data architecture.”

Today, there is a veritable gold rush to put data to better use than is
possible or practicable with legacy technology. In fact, companies are rushing
into Hadoop and other technologies so fast that key considerations like
security sometimes take a back seat, as Gartner’s Merv Adrian points out. Over
time, however, such issues will get ironed out. For now, adoption seems to be
the top consideration.

Picking winners

Given how much is at stake (literally tens of billions of
dollars), it’s not surprising that VCs are betting big and trying to pick the
winner in advance. That, however, is hard.

Peter Goldmacher, an analyst with Cowen & Co., acknowledges that “picking winners is hard,”
yet stresses that “it’s clear that this opportunity is not an opportunity for
legacy vendors,” however much they may want to join the party. In open source,
she who contributes most gains both influence over project direction and the
ability to directly monetize the code. We’re seeing the same dynamic with Hadoop.

Despite all the money pouring into vendors, it’s important to
remember that the biggest beneficiaries of Hadoop and the changing data
landscape are the enterprises that put it to use. As Goldmacher notes:

“[N]early all Big Data customers
initially buy technology to solve problems they know they have, and once the
users get facility with the technology, they are able to envision entirely new
opportunities to drive growth. It is a significant mental hurdle to transition
from technology being a limitation to technology being an enabler, but as with
all things Big Data related, that transition is accelerating. We believe that
as this transition gains momentum and scale, the legacy vendors will
increasingly struggle.”

Customers are the biggest beneficiaries of big data, assuming
they learn to use data effectively. This starts, as Goldmacher says, with
solving known problems, then moving quickly to tackle entirely new
challenges. Is that worth $200 million? Yes. It’s actually worth much, much
more.

Do you agree? Share your opinion in the discussion thread below.