Video: Using Message Passing Interface (MPI) for parallel programming

Intel's James Reinders explains how the Message Passing Interface (MPI), an API for parallel programming, works. He also explains how it pushes developers to write code that often scales better than code written for shared memory.

Note: This video was originally published as part of ZDNet's Parallelism Breakthrough series, sponsored by Intel.