Video: Using Message Passing Interface (MPI) for parallel programming

Intel's James Reinders explains how Message Passing Interface (MPI), an API for parallel programming, works. He also explains how it forces developers to write code that often scales better than code written for shared memory.
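The key idea Reinders describes is that MPI processes share nothing by default: data moves only through explicit send/receive calls, which makes communication costs visible and encourages scalable designs. Running real MPI requires an MPI runtime (e.g. `mpirun`), but the style can be sketched with Python's `multiprocessing` pipes as a rough analogy to `MPI_Send`/`MPI_Recv` (the function names below are illustrative, not part of MPI):

```python
# Message-passing sketch: worker processes exchange data only through
# explicit messages (analogous to MPI_Send/MPI_Recv), never shared memory.
from multiprocessing import Process, Pipe

def worker(conn):
    # Receive a chunk of work, compute a partial sum, send the result back.
    chunk = conn.recv()        # analogous to MPI_Recv
    conn.send(sum(chunk))      # analogous to MPI_Send
    conn.close()

def parallel_sum(data, n_workers=2):
    # Split the data, hand each worker its chunk as a message,
    # then gather the partial results - a mini scatter/reduce.
    chunks = [data[i::n_workers] for i in range(n_workers)]
    parents, procs = [], []
    for chunk in chunks:
        parent_conn, child_conn = Pipe()
        p = Process(target=worker, args=(child_conn,))
        p.start()
        parent_conn.send(chunk)  # explicit message, like MPI_Send
        parents.append(parent_conn)
        procs.append(p)
    total = sum(conn.recv() for conn in parents)
    for p in procs:
        p.join()
    return total

if __name__ == "__main__":
    print(parallel_sum(list(range(100))))  # 4950
```

Because every byte exchanged is an explicit message, the same structure maps directly onto a cluster where processes live on different machines, which is why MPI code often scales beyond what shared-memory designs allow.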

Note: This video was originally published as part of ZDNet's Parallelism Breakthrough series - sponsored by Intel.

By Bill Detwiler

Bill Detwiler is Editor in Chief of TechRepublic and the host of Cracking Open, CNET and TechRepublic's popular online show. Prior to joining TechRepublic in 2000, Bill was an IT manager, database administrator, and desktop support specialist in the ...