So much effort has been focused on the volume of Big Data and on how we manage it. Yet the more I look at the Big Data picture and the key areas where companies will have to execute, the more I think about Big Data velocity, and what we will have to do to be sure we have the technology and the bandwidth to carry it.

My thoughts were seconded the other day in a conversation with Jeff Kagan, who has been a well-known “voice” for the telecommunications industry over the past thirty years.

“For the last ten years, I have seen the [telecommunications] industry move through analog, digital, 1G (first generation), 2G, 2.5G, 3G and 4G communications and technology advancements,” said Kagan. “During that same time, we started ten years ago with consumers taking their cell phones, keys, and wallets with them before leaving the house.

“This is changing. Within the next five years, the smart phone is going to become the center of our world. The smart phone will start the car, make phone calls, send texts and emails, send photos, track locations, check bank account balances and send secure payments for consumer transactions. Literally, you won’t have to carry anything else with you but your smart phone when you leave home.”

This is major for businesses: there is hardly a company today that won’t tell you that providing a pleasant “customer experience” to everyday consumers is paramount.

To build great customer relationships, businesses will want near real-time analytics on every kind of data emanating from consumer mobile devices, sharpening their understanding of consumer demographics and buying patterns. But even with the most powerful analytics processors in your data center, this near real-time intelligence gathering from incoming streams of raw Big Data can’t happen instantaneously unless the network has the throughput to carry the velocity of potentially hundreds of thousands of individual data streams as they occur.
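To make the throughput point concrete, here is a rough back-of-envelope sketch in Python. The device counts, event sizes, and link speed are purely illustrative assumptions, not figures from any vendor or study.

```python
# Rough capacity check: can a single network link keep up with a large
# number of concurrent consumer data streams? Every figure below is a
# hypothetical assumption used for illustration, not a measurement.

def required_gbps(num_streams: int, kb_per_event: float, events_per_sec: float) -> float:
    """Aggregate inbound bandwidth needed, in gigabits per second."""
    bytes_per_sec = num_streams * kb_per_event * 1024 * events_per_sec
    return bytes_per_sec * 8 / 1e9  # bytes/s -> bits/s -> Gb/s

# Assume 500,000 active mobile devices, each emitting a 2 KB event
# (location ping, transaction record, etc.) twice per second.
need = required_gbps(num_streams=500_000, kb_per_event=2.0, events_per_sec=2.0)
link = 10.0  # a hypothetical 10 Gbps enterprise uplink, for comparison
print(f"Required: {need:.1f} Gbps, available: {link:.1f} Gbps")
print("Link keeps up" if need <= link else "The network is the bottleneck")
```

Even with modest per-event sizes, the arithmetic adds up quickly once the stream count climbs into the hundreds of thousands, which is exactly why the network, not the analytics engine, can become the limiting factor.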

This is where this week’s press announcement from Internet2, the university, corporate, government and education technology consortium, comes in. The press release describes how Internet2 has upgraded its network backbone to a 100 Gigabit Ethernet-enabled, 8.8-terabit-per-second optical network.

The upgrade will allow National Science Foundation-supported projects like XSEDE (pronounced “exceed”) to bring together 17 supercomputers and visualization and data analysis engines, along with data storage resources, data collections, computational tools and services, to support science, research and educational projects across the U.S. All of this is Big Data moving at high velocity.

Also participating in the effort are enterprise-space commercialization vendors, which means some of this research and technology will come downstream to companies that will need accelerated networks (as well as other technology assets) to help them manage Big Data at the high speeds it is likely to travel.

Are we ready yet for high-velocity Big Data?

Several leading-edge, high-velocity Big Data applications are already hard at work in industries and activity areas like financial services, stock brokerage, weather tracking, movies and entertainment, and online retail. However, most companies are still getting their arms around Big Data analytics in more “static” data environments, characterized by data warehouses that can tell you about recent or longer-term history and trends, but not about real-time events. In other words, most companies have yet to fully venture into the high-velocity dimension of Big Data.
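To see the difference in miniature, here is a small, hypothetical Python sketch of the streaming side: instead of loading a warehouse and querying history after the fact, it maintains a trailing sixty-second count of events per category as they arrive. The event categories and window size are illustrative assumptions, not anything from a particular product.

```python
# Minimal sliding-window aggregation: a taste of "high velocity" analytics.
# Counts events per key over the trailing window as the stream arrives.
from collections import deque
from time import time
from typing import Optional

class SlidingWindowCounter:
    def __init__(self, window_secs: int = 60):
        self.window_secs = window_secs
        self.events = deque()   # (timestamp, key) pairs, oldest first
        self.counts = {}        # key -> count within the current window

    def add(self, key: str, ts: Optional[float] = None) -> None:
        """Record one event and expire anything older than the window."""
        now = ts if ts is not None else time()
        self.events.append((now, key))
        self.counts[key] = self.counts.get(key, 0) + 1
        cutoff = now - self.window_secs
        while self.events and self.events[0][0] < cutoff:
            _, old_key = self.events.popleft()
            self.counts[old_key] -= 1
            if self.counts[old_key] == 0:
                del self.counts[old_key]

# Usage: feed purchase events from a live stream and read counts at any moment.
window = SlidingWindowCounter(window_secs=60)
for category in ["electronics", "groceries", "electronics"]:
    window.add(category)
print(window.counts)   # {'electronics': 2, 'groceries': 1}
```

In practice this kind of logic runs inside stream-processing platforms rather than hand-rolled classes, but the pattern is the same: the data is summarized in flight rather than at rest.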

The good news is that technologies are now arriving that will help Big Data velocity efforts with built-in business rules, automation, and new ways to store and access data. On the network side, the pipes will be getting bigger. All of this makes now the time for businesses to map out their high-velocity Big Data strategies.