

How do Apps use Multiple Processors?

By accolon
I'd like to understand how applications use multiple processors on a server, if indeed they use them fully.

Take an Office app like Excel, for example, on a Terminal Server with 4 processors. Will the Excel executable actually make use of all 4 processors, with different parts of the job at hand being handed off to the other processors?

Or will it simply use the one processor it is 'assigned to', so that the only real advantage in this example is that three other processors remain available for other programs/apps/users on the terminal server?

Trying to get a crystal-clear understanding, so if anyone knows of a good white paper on the subject, feel free to post a link. :-)



How do Apps use Multiple Processors?

by jimandkathy In reply to How do Apps use Multiple ...

Applications do not use multiple processors directly; it is the operating system that splits the workload by scheduling their threads across the CPUs. Windows 9x cannot use multiple processors at all. Windows 2000 Professional and Windows XP Professional can each use two, and the Server editions support more.

In order for an application to be spread over two (or more) CPUs, it must be written with multiple threads that the OS can schedule as it sees fit. Most applications, even threaded ones, effectively run on one CPU, but programs with very large workloads or data sets can use more. Database engines, large servers, and ISP software can all take advantage of the extra CPUs.
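To make "written to allow threads" concrete, here is a minimal sketch in Python using the standard threading module, with a hypothetical word-count workload (the function names and the splitting scheme are illustrative, not from any particular app). The app only expresses the work as independent threads; which processor each thread lands on is entirely the OS scheduler's decision. One caveat in this particular language: CPython's global interpreter lock keeps pure-Python threads from executing simultaneously on separate CPUs, which is a good example of why a threaded application can still behave as if it had only one processor.

```python
import threading

def count_words(lines, results, slot):
    # Each thread handles only its own slice of the input;
    # the OS decides which processor the thread actually runs on.
    results[slot] = sum(len(line.split()) for line in lines)

def threaded_word_count(lines, n_threads=2):
    # Split the work into one slice per thread (last slice takes the remainder).
    size = max(1, len(lines) // n_threads)
    slices = [lines[i * size:(i + 1) * size] for i in range(n_threads - 1)]
    slices.append(lines[(n_threads - 1) * size:])

    results = [0] * n_threads
    threads = [threading.Thread(target=count_words, args=(s, results, i))
               for i, s in enumerate(slices)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()  # wait for every worker before combining results
    return sum(results)
```

The structure is what matters: the program hands the OS several schedulable units instead of one, and the OS is then free (but not obliged) to run them on different processors.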

Most end-user programs found in the average home or office will make little (if any) use of multiple CPUs. If you run a lot of programs at once, though, the extra CPUs can help. For instance, you can run your anti-virus scan, defragment a drive, and still play Diablo without slowing down your computer.
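The same effect that helps several programs at once also works inside a single application if it splits itself into separate processes, since separate processes are always free to run on separate CPUs. A minimal sketch, assuming a hypothetical chunked-sum workload and Python's standard ProcessPoolExecutor:

```python
from concurrent.futures import ProcessPoolExecutor
import os

def chunk_sum(chunk):
    # Each worker is a separate process, so the OS may run
    # the workers on different processors at the same time.
    return sum(chunk)

def parallel_sum(data, workers=2):
    # Divide the data into roughly equal chunks, one batch per worker.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(chunk_sum, chunks))

if __name__ == "__main__":
    # os.cpu_count() reports how many processors the OS sees.
    print(os.cpu_count(), parallel_sum(list(range(1000))))
```

The `if __name__ == "__main__"` guard matters on Windows, where worker processes re-import the main module at startup.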
