
Video: Using Message Passing Interface (MPI) for parallel programming

Intel's James Reinders explains how Message Passing Interface (MPI), an API for parallelism, works. He also explains how it forces developers to write code that often scales better than code written for shared memory.
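To illustrate the message-passing model Reinders describes, here is a rough sketch in Python using the standard `multiprocessing` module rather than MPI itself (a real MPI program would use an MPI runtime, e.g. via mpi4py or C's `MPI_Send`/`MPI_Recv`). The point is the same: each process owns its data privately and communicates only by explicit messages, never through shared memory. The function and variable names below are illustrative, not from the video.

```python
# Sketch of the message-passing model (an MPI analogue, not MPI itself):
# each "rank" computes on its own private chunk of data and reports its
# partial result back through an explicit message channel.
from multiprocessing import Process, Queue

def worker(rank, chunk, out):
    # No shared state: this process sees only its own chunk,
    # and sends its partial sum back as a message.
    out.put((rank, sum(x * x for x in chunk)))

def parallel_sum_of_squares(data, nprocs=4):
    out = Queue()
    # Scatter: deal the data out round-robin to nprocs workers.
    chunks = [data[i::nprocs] for i in range(nprocs)]
    procs = [Process(target=worker, args=(r, chunks[r], out))
             for r in range(nprocs)]
    for p in procs:
        p.start()
    # Gather: collect one message per worker, then combine.
    partials = [out.get() for _ in procs]
    for p in procs:
        p.join()
    return sum(value for _rank, value in partials)

if __name__ == "__main__":
    print(parallel_sum_of_squares(list(range(100))))  # -> 328350
```

Because all communication is explicit, the same decomposition maps naturally onto MPI's scatter/gather collectives across machines, which is one reason message-passing code often scales beyond a single shared-memory node.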


Note: This video was originally published as part of ZDNet's Parallelism Breakthrough series, sponsored by Intel.

About Bill Detwiler

Bill Detwiler is Managing Editor of TechRepublic and Tech Pro Research and the host of Cracking Open, CNET and TechRepublic's popular online show. Prior to joining TechRepublic in 2000, Bill was an IT manager, database administrator, and desktop supp...

