Can High-Performance Interconnects Benefit Hadoop Distributed File System?

Provided by: The Ohio Society of CPAs
Topic: Big Data
Format: PDF
During the past several years, the MapReduce computing model has emerged as a scalable model capable of processing petabytes of data. The Hadoop MapReduce framework has enabled large-scale Internet applications and has been adopted by many organizations. The Hadoop Distributed File System (HDFS) lies at the heart of this software ecosystem. It was designed to operate on and scale with commodity hardware, such as inexpensive Linux machines connected with Gigabit Ethernet. Meanwhile, the field of High-Performance Computing (HPC) has been witnessing a transition to commodity clusters.
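To illustrate HDFS's commodity-cluster orientation mentioned above, the following is a minimal sketch of an `hdfs-site.xml` fragment. The property names are standard Hadoop configuration keys; the values shown are illustrative assumptions, not the settings evaluated in the paper:

```xml
<!-- hdfs-site.xml: illustrative settings for a small commodity cluster -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <!-- 3-way block replication across DataNodes (the Hadoop default) -->
    <value>3</value>
  </property>
  <property>
    <name>dfs.blocksize</name>
    <!-- 128 MB block size (134217728 bytes), the default in Hadoop 2+ -->
    <value>134217728</value>
  </property>
</configuration>
```

Replication and block size are the two knobs most directly tied to how HDFS spreads I/O across cheap nodes and their network links, which is why interconnect bandwidth matters to its performance.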
