
Bandwidth and latency must factor into your big data strategy

The big data pipeline is getting more crowded. Learn how to improve your company's big data throughput when going over the public internet.


There are big data networking issues your company may be overlooking: bandwidth and latency. Even though it may seem daunting to go down that path, it's better to address this topic now, because more sites will rely on the public internet to transfer data streams and large chunks of data from location to location.

For example, in television broadcasting, some companies are taking a hard look at using cloud-based production facilities for their broadcasts. Latency and bandwidth issues could arise if transmissions of big data video signals are slow.

"Bandwidth and latency are both significant concerns right now in the broadcast industry," said Mike Cronk, vice president of core technology at Grass Valley, which provides broadcast television systems and solutions. For broadcasters, this creates trepidation when it comes to deploying video production in the cloud, so they continue to rely on their in-house broadcast and production facilities. Nevertheless, there are several compelling value propositions for broadcasters if they consider a cloud-based approach to production.

SEE: Video: The 3 trends that are defining next-gen big data deployments (TechRepublic)

"We have seen instances when sportscasters will be broadcasting a major sporting event like a soccer match and the broadcaster anticipates far greater demand for the event than his usual programming has," said Shawn Carnahan, chief technical officer at Telestream, which provides cloud-based and standard video production products. "This is where a cloud-based production option becomes appealing, because you can add more production capability in the cloud to accommodate the larger viewer base for the event—and then tear it down when the event is over. You pay for only what you consume."

Unsurprisingly, some of the broadcasters who are early adopters of cloud production are in the sports space. From a latency and bandwidth perspective, these broadcasters can afford to be early adopters because many sports stadiums have upgraded their facilities with fast and direct data pipelines into the cloud. In the Pac-12 conference, for instance, all 12 schools have dedicated circuits established for sports events, which frees broadcasters to use the cloud because there is a dedicated line. NEP The Netherlands, a broadcast services provider, has invested in fiber lines to all of the stadiums in Holland.

SEE: Super Bowl 51 makes digital history with record-breaking data usage (TechRepublic)

But not every industry has participants who are willing to invest in expensive communications infrastructure, and not every company wants to build more robust data pipelines to tap into cloud-based data. Here are two steps companies can take to improve their big data throughput if they are going over the public internet.

1: Invest in direct data communications lines

For larger companies, if your business case warrants it, or if you already have the infrastructure, it's smart to invest in direct data communications lines. Major cloud services providers, including Microsoft Azure with ExpressRoute and Amazon AWS with Direct Connect, offer direct, dedicated, and redundant (for failover) data communications lines, as do other cloud services providers.
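To give a concrete sense of what provisioning such a line involves, here is a minimal sketch of requesting a dedicated AWS Direct Connect port with Python's boto3 library. The region, facility code, and connection name below are placeholders; the actual facility and bandwidth options will depend on your carrier and colocation arrangements.

    import boto3

    # Direct Connect is managed per region; us-east-1 is a placeholder choice.
    dx = boto3.client("directconnect", region_name="us-east-1")

    # List the colocation facilities where a physical cross-connect can be made.
    for loc in dx.describe_locations()["locations"]:
        print(loc["locationCode"], "-", loc["locationName"])

    # Request a dedicated 1 Gbps port at a chosen facility (placeholder values).
    conn = dx.create_connection(
        location="EqDC2",                         # placeholder facility code
        bandwidth="1Gbps",
        connectionName="bigdata-dedicated-link",  # hypothetical name
    )
    print(conn["connectionId"], conn["connectionState"])

Note that the API call only reserves the port; the physical cross-connect at the colocation facility still has to be ordered through the facility or a Direct Connect partner.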

Some companies that want Quality of Service (QoS) guarantees on their external network frameworks choose to invest in their own infrastructures, using protocols like multiprotocol label switching (MPLS), which speeds up forwarding by directing traffic along pre-established label-switched paths instead of performing a full routing-table lookup at every hop.
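MPLS itself lives in the carrier's network, but applications can cooperate with QoS policies by marking their traffic. The sketch below, which assumes a Linux host and a placeholder endpoint, tags a socket with the DSCP Expedited Forwarding class so that QoS-aware routers along the path can prioritize it. Whether the marking is honored depends entirely on the network operator's policy.

    import socket

    DSCP_EF = 46           # "Expedited Forwarding," the highest-priority DSCP class
    TOS_EF = DSCP_EF << 2  # DSCP occupies the top 6 bits of the IP TOS byte

    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    # socket.IP_TOS is defined on Linux; other platforms may differ.
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_EF)
    sock.connect(("data-ingest.example.com", 9000))  # placeholder endpoint
    sock.sendall(b"latency-sensitive payload")
    sock.close()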

2: Tune your wide area network (WAN)

There are a host of WAN optimization tools (physical and virtual) that help companies make the most of the internet bandwidth they have. These tools work by eliminating redundant data transmissions, compressing and prioritizing data, staging data in local caches, and streamlining chatty protocols. If your big data cleaning business rules can be aligned with the prioritization and deduplication functions of these WAN tools, the performance of data streaming and large file transfers can be further improved.
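As a rough illustration of two of those techniques, the sketch below deduplicates and compresses a data stream before it goes over the wire. The chunk size, hashing scheme, and in-memory cache are illustrative stand-ins for what commercial WAN optimizers do at much larger scale, often in dedicated hardware.

    import hashlib
    import zlib

    CHUNK_SIZE = 64 * 1024  # 64 KB chunks; an arbitrary, illustrative choice
    seen_chunks = set()     # stands in for the synchronized cache on the far end

    def prepare_for_wan(stream):
        """Yield (kind, payload) tuples ready to send over the WAN."""
        while True:
            chunk = stream.read(CHUNK_SIZE)
            if not chunk:
                break
            digest = hashlib.sha256(chunk).digest()
            if digest in seen_chunks:
                # Redundant data: send only the 32-byte fingerprint and let
                # the far side replay the chunk from its cache.
                yield ("ref", digest)
            else:
                seen_chunks.add(digest)
                # New data: compress it before it hits the wire.
                yield ("data", zlib.compress(chunk))

In a real deployment, both endpoints keep their caches synchronized so that a fingerprint always resolves to the same chunk; that bookkeeping is where the appliances earn their keep.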

Conclusion

Bandwidth and latency should be critical elements of your big data and networking strategies, because you can't support the heavy traffic demands of streamed or file-based big data payloads without the pipelines that are needed to carry them. Also, be sure to include the network management group in your big data and analytics planning.
