General discussion

Ed Bott's Microsoft Challenge--8/10/00

By ebott
A network administrator passes along this puzzle. He's baffled, because he hasn't seen any improvement in performance on his five-user Windows 2000 network after increasing server RAM from 64MB to 128MB. That should be plenty of memory for such a small network, but he isn't seeing the performance gains he expected; in fact, his network seems to be running slower than before. How can this TechRepublic member figure out where his memory is being used up? Are Windows 2000's performance monitoring tools enough, or should he invest in third-party tools?

by Tony Hogan In reply to Ed Bott's Microsoft Chall ...

The existing tools should be sufficient to monitor the situation. Monitoring is a black art unto itself, so finding the "right" items to monitor is the majority of the battle.

Regarding the slowness, the additional memory and the [assumed] increased swap file size may be leading to additional swapping. There are also some older chipsets/motherboards [if this is an older machine] that can't cache above 64MB, so the additional memory actually causes a decrease in system performance.

The earlier answers on memory are also applicable. CAS latencies and the like can come into play, where the system effectively adjusts the entire memory pipeline to the lowest common denominator. In the case of servers, you definitely get what you pay for.

-tony
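On finding the "right" items to monitor: the idea can be sketched as a simple threshold check (Python here purely for illustration). The counter names are genuine Windows 2000 System Monitor counters, but the threshold values are rough rules of thumb, not official Microsoft limits.

```python
# Rough sanity checks on memory counters sampled from System Monitor.
# The thresholds below are illustrative rules of thumb only.
THRESHOLDS = {
    "Memory\\Pages/sec": 20,     # sustained hard-page traffic suggests thrashing
    "Paging File\\% Usage": 70,  # pagefile close to full
}

def flag_counters(samples):
    """Return the counters whose sampled value exceeds its threshold."""
    return [name for name, value in samples.items()
            if name in THRESHOLDS and value > THRESHOLDS[name]]

samples = {"Memory\\Pages/sec": 45, "Paging File\\% Usage": 30}
print(flag_counters(samples))  # only Pages/sec exceeds its threshold
```

A heavily swapping server would show up here immediately in Pages/sec, which is exactly the symptom to look for before buying any third-party tools.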

by ebott In reply to Ed Bott's Microsoft Chall ...

The question was auto-closed by TechRepublic

by fyousuff In reply to Ed Bott's Microsoft Chall ...

The administrator should:
1. Check the disk paging.
2. Run the disk defragmenter, check the performance, and then increase the swap files on the server. He/she should see some performance difference.
3. If the above does not work, check the server cache.

by chrisengland2 In reply to Ed Bott's Microsoft Chall ...

So many good points already covered - I hope I can add something new...
Free space is probably running low on the primary partition. If the hard drive is one partition, it may be possible to transfer that drive's image to a bigger hard drive in order to gain more free space. The other solution would be to add an additional hard drive.

In the first situation, it would be possible to increase the size of the page-file after transferring the original drive's image to the larger hard drive.

In the second situation (adding a second hard drive), I would partition and format the second drive, create a new page-file on it (about twice the RAM size), and then minimize the page-file on the primary drive to about 20MB min\40MB max. That would give the primary drive room to "breathe".

Of course, if the system has a single hard drive with two partitions (and plenty of free space on the second partition), it would accomplish the same thing to add a new page-file to the second partition, and then minimize the one on the primary partition.
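The sizing rule above works out to a quick calculation (Python purely for illustration; the 2x-RAM and 20MB/40MB figures are the poster's rules of thumb, not Microsoft requirements):

```python
def pagefile_plan(ram_mb):
    """Page-file layout when a second drive (or partition) is available,
    per the rule of thumb above: ~2x RAM on the secondary drive, and a
    small 20MB/40MB stub left on the primary."""
    return {
        "secondary_initial_mb": 2 * ram_mb,  # about twice the RAM size
        "secondary_maximum_mb": 2 * ram_mb,
        "primary_minimum_mb": 20,            # minimized stub on the boot drive
        "primary_maximum_mb": 40,
    }

# For the 128MB server in the challenge:
plan = pagefile_plan(128)
print(plan["secondary_initial_mb"])  # 256
```

Moving the bulk of the page-file off the boot partition also cuts head contention between paging I/O and normal file access on a single-spindle server.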

by levihilomen In reply to Ed Bott's Microsoft Chall ...

Check the memory usage in Task Manager to determine where all his memory is being used up. Remove the NetBEUI protocol, as it uses broadcasts on the network. I'm not putting down Windows 2000's performance monitoring tools, but third-party tools can be helpful. Lastly, try increasing the paging file size.

by mferrel In reply to Ed Bott's Microsoft Chall ...

IMHO, 128MB is the recommended amount of memory for a standard W2K Pro user. That jumps to 256MB if they are working in graphics or with engineering software. 256MB is also my minimum for a W2K server. At 64MB, I'm surprised info access wasn't equated to parallel-port ZIP access! Obviously other things also matter, such as the pagefile and just what is running on top of the server. In nearly all cases, $ spent on RAM beats $ spent on CPU!
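Those rules of thumb as a quick lookup (Python purely for illustration; the figures are this poster's recommendations, not official Microsoft minimums):

```python
# Poster's rule-of-thumb RAM minimums in MB (not official requirements).
RECOMMENDED_RAM_MB = {
    ("workstation", "standard"): 128,
    ("workstation", "graphics"): 256,
    ("server", "any"): 256,
}

def meets_recommendation(role, workload, installed_mb):
    """True if installed RAM meets the rule of thumb for this role/workload."""
    needed = RECOMMENDED_RAM_MB.get((role, workload))
    return needed is not None and installed_mb >= needed

# The challenge's server, even after the upgrade to 128MB, still falls
# short of the 256MB server recommendation:
print(meets_recommendation("server", "any", 128))  # False
```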
