Large-scale Incremental Data Processing with Change Propagation

Provided by: Max Planck Institute for Software Systems
Topic: Big Data
Format: PDF
Incremental processing of large-scale data is an increasingly important problem, given that many processing jobs run repeatedly with similar inputs, and that the de facto standard programming model (MapReduce) was not designed to efficiently process small updates. As a result, new systems specifically targeting this problem have been proposed. Unfortunately, these approaches require the adoption of a new programming model, breaking compatibility with existing programs and increasing the burden on the programmer, who is now required to devise an incremental update mechanism.
