A Comprehensive View of Hadoop MapReduce Scheduling Algorithms

Provided by: International Journal of Computer Networks and Communications Security (IJCNCS)
Topic: Software
Format: PDF
Hadoop is a Java-based programming framework that supports the storage and processing of large data sets in a distributed computing environment, making it well suited to high-volume data. It uses HDFS to store data and MapReduce to process it. MapReduce is a popular programming model for data-intensive applications on shared-nothing clusters; its main objective is to parallelize job execution across multiple nodes. In recent years, the attention of researchers and companies has increasingly turned toward Hadoop, and as a result many scheduling algorithms have been proposed.
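To make the map/reduce flow concrete, here is a minimal, self-contained sketch of the MapReduce model as plain Java (word counting), without the Hadoop API. The class name `WordCountSketch` and the use of Java streams are illustrative assumptions; in real Hadoop, the map and reduce phases would run as distributed tasks across cluster nodes, with the framework handling the shuffle.

```java
import java.util.*;
import java.util.stream.*;

// Illustrative sketch of the MapReduce model (not the Hadoop API):
// the map phase emits (word, 1) pairs, a shuffle groups them by key,
// and the reduce phase sums the counts for each key.
public class WordCountSketch {

    // Map phase: split one input line into (word, 1) pairs.
    static Stream<Map.Entry<String, Integer>> map(String line) {
        return Arrays.stream(line.toLowerCase().split("\\s+"))
                .filter(w -> !w.isEmpty())
                .map(w -> Map.entry(w, 1));
    }

    // Shuffle + reduce: group pairs by key and sum the values.
    // TreeMap keeps the output in deterministic (sorted) key order.
    static Map<String, Integer> run(List<String> lines) {
        return lines.stream()
                .flatMap(WordCountSketch::map)
                .collect(Collectors.groupingBy(
                        Map.Entry::getKey,
                        TreeMap::new,
                        Collectors.summingInt(Map.Entry::getValue)));
    }

    public static void main(String[] args) {
        List<String> input = List.of("hadoop stores data", "hadoop processes data");
        System.out.println(run(input));
        // prints {data=2, hadoop=2, processes=1, stores=1}
    }
}
```

In Hadoop proper, each map task processes one input split from HDFS in parallel, which is where the model's node-level parallelism comes from; the scheduler's job is deciding which tasks run on which nodes and when.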
