IBM's System S stream computing demo ushers in real-time analysis of streaming data, a paradigm shift away from the seek-store-query transaction model that has been the bedrock of computing.
From e-mails, SMSs, and text to voice, images, and video, it's a deluge of information out there! The traditional model of seeking data, storing it, and then processing it for information cannot keep pace with the full swath of data being created and made available every instant.
Stream computing introduces a new model that analyzes data streams live. The focus is on correlating the data around the specific problem to be solved, irrespective of the type of data (text, video, voice, RFID, GPS, images) or the hardware. It's about software algorithms that process each data type in real time.
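To make the contrast with store-then-query concrete, here is a minimal sketch of the streaming idea: an operator examines each event as it arrives, keeping only a small sliding window rather than a full archive. This is purely illustrative; the class and parameter names are my own assumptions, not System S APIs.

```python
from collections import deque

# Hypothetical sketch (not IBM's System S API): a tiny stream operator
# that keeps a sliding window over incoming events and flags a pattern
# the moment it appears, instead of storing everything for later queries.

class SlidingWindowDetector:
    def __init__(self, size, predicate):
        self.window = deque(maxlen=size)   # only the last `size` events are kept
        self.predicate = predicate         # condition tested on each window

    def push(self, event):
        """Feed one event; return True if the window now matches."""
        self.window.append(event)
        return self.predicate(list(self.window))

# Example: alert when three consecutive sensor readings exceed a threshold.
detector = SlidingWindowDetector(
    size=3,
    predicate=lambda w: len(w) == 3 and all(r > 100 for r in w),
)

stream = [90, 105, 110, 120, 80]
alerts = [detector.push(r) for r in stream]
print(alerts)  # the window (105, 110, 120) is the only one that triggers
```

The point of the sketch is that the answer is produced while the data flows past; nothing is written to a database and queried later.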
The project, announced at a conference in New York and the result of four years of research, runs on 800 processors and can be scaled to "tens of thousands" (New York Times).
The officials said the new computer system has the ability to assemble applications on the fly, based on the inquiry it is trying to solve, by using a new software architecture that pulls in the components it needs when they are needed to handle a specific task. (Quote from eWeek)
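One way to picture that "assemble on the fly" idea is a component registry: handlers for each data type are registered up front, and the pipeline for a given inquiry is built only when the inquiry arrives. The names below (`component`, `assemble_and_run`, the inquiry dict) are assumptions for illustration, not details of IBM's architecture.

```python
# Illustrative sketch only (assumed names, not eWeek's or IBM's design):
# a registry of processing components, pulled in per inquiry so the
# "application" is assembled at the moment the task arrives.

REGISTRY = {}

def component(data_type):
    """Register a handler for one kind of input data."""
    def wrap(fn):
        REGISTRY[data_type] = fn
        return fn
    return wrap

@component("text")
def analyze_text(payload):
    return f"text:{len(payload)} chars"

@component("audio")
def analyze_audio(payload):
    return f"audio:{len(payload)} samples"

def assemble_and_run(inquiry):
    """Pull in only the components the inquiry needs, then run them."""
    pipeline = [REGISTRY[t] for t in inquiry["types"] if t in REGISTRY]
    return [stage(inquiry["payload"]) for stage in pipeline]

print(assemble_and_run({"types": ["text"], "payload": "hello stream"}))
```

A text-only inquiry pulls in just the text component; adding `"audio"` to the inquiry's types would pull in the audio component too, with no change to the components themselves.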
With the shift, as Electronic Design puts it, toward "non-uniform and complex memory hierarchies, rapidly increasing core (and thread) counts, and the integration of specialized acceleration units," the days ahead will be very interesting and radical for computing as a whole. What radical trends do you feel will define the next generation of computing? Share your views.