General discussion

  • Creator
    Topic
  • #2084281

    Ed Bott’s Microsoft Challenge–8/10/00

    Locked

    by ebott ·

    A network administrator passes along this puzzle. He’s baffled, because he hasn’t seen any improvement in performance on his five-user Windows 2000 network after increasing server RAM from 64MB to 128MB. That should be plenty of memory for such a small network, but he isn’t seeing the performance gains he expected; in fact, his network seems to be running slower than before. How can this TechRepublic member figure out where his memory is being used up? Are Windows 2000’s performance monitoring tools enough, or should he invest in third-party tools?

All Comments

  • Author
    Replies
    • #3781040

      Ed Bott’s Microsoft Challenge–8/10/00

      by ronnonf ·

      In reply to Ed Bott’s Microsoft Challenge–8/10/00

      I think Task Manager and Performance Monitor should be enough.

    • #3781036

      Ed Bott’s Microsoft Challenge–8/10/00

      by dc1 ·

      In reply to Ed Bott’s Microsoft Challenge–8/10/00

      Performance Monitor is really all the admin will need. 128MB really isn’t that much of an improvement for an NT server, especially if he is running Active Directory and this is the only server in the forest.

    • #3781006

      Ed Bott’s Microsoft Challenge–8/10/00

      by zbrain75 ·

      In reply to Ed Bott’s Microsoft Challenge–8/10/00

      You said nothing about the hardware your network administrator is using. Did he add quality memory? Is his server hardware compatible with Microsoft’s Windows 2000?

      Beyond making sure the server is using quality hardware, Windows 2000 performance monitoring tools should be enough to determine whether his system really has performance problems.

      He should also consider whether his performance problem is a network problem. What kind of network is he using? Is it a Token Ring network, Ethernet, or some other architecture? What is the speed of his network? Could a station other than the server be slowing it down? He may want to do some network monitoring to be sure his slowdown is not a network problem, and he should be sure there is no network interference from a malfunctioning NIC.

      He should also consider how the users are using the network. What kinds of applications are being run? Are they run on the server or the client side? How memory- and CPU-intensive are they? How

    • #3780994

      Ed Bott’s Microsoft Challenge–8/10/00

      by wpatrey ·

      In reply to Ed Bott’s Microsoft Challenge–8/10/00

      Increasing server RAM will only help if the server RAM was maxed out before. You can use the monitoring tools to figure it out quickly by selecting the right parameters to watch. Unfortunately, it could be many things, but if you think it is related to server RAM, try assigning more memory to the network card:
      1. Find your network card’s IRQ (from the adapter properties in the network properties window).
      2. Open the system.ini file.
      3. Add Irq[n]=4096 (where n is the network card’s IRQ) to the [386Enh] section.

      This will reserve RAM for your network card, so that even if you have a shortage of memory, your network card won’t.
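
      For reference, a minimal sketch of the system.ini entry described above, assuming a hypothetical NIC on IRQ 10 (substitute your card’s actual IRQ). Note that the [386Enh] section is a Windows 9x-era setting, so treat this as the poster’s suggestion rather than a verified Windows 2000 fix:

          [386Enh]
          Irq10=4096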

    • #3780959

      Ed Bott’s Microsoft Challenge–8/10/00

      by yorkster ·

      In reply to Ed Bott’s Microsoft Challenge–8/10/00

      It doesn’t sound like RAM was the bottleneck in the first place. I would set up Performance Monitor to log a number of objects to determine where his bottleneck is. First I would remove the added RAM so that I could establish a baseline set of performance metrics, then I would add the RAM back in and log the same objects to see where the bottleneck is.

    • #3780891

      Ed Bott’s Microsoft Challenge–8/10/00

      by brian lusk ·

      In reply to Ed Bott’s Microsoft Challenge–8/10/00

      I have to agree with all of the above suggestions. Particular items you want to watch in Performance Monitor include the paging file, the number of pages going to it, and so on.

      You should also monitor network traffic (bytes in and bytes out) for several days. An additional idea is to use a network sniffer to watch for excessive traffic while the network is idle.

      Personally, I would check that the NIC is working correctly with the latest drivers and that there is sufficient hard disk space to work with. Then I would change the swap file parameters to force Windows to create a new one that reflects the added RAM; that is a common culprit for performance hits after upgrades.

      Good luck, and hope this helps the Administrator!

      Brian Lusk
      brianlusk@netzero.net

    • #3771979

      Ed Bott’s Microsoft Challenge–8/10/00

      by bill.parks ·

      In reply to Ed Bott’s Microsoft Challenge–8/10/00

      How vague. What performance gains is he expecting? Disk? If so, change his Server properties from “Balance” to “Optimize for File Sharing”; this will increase his disk cache for file sharing. Authentication? Name resolution? Use PerfMon to grab some basic counters:
      Disk: Logical Disk – Avg. Disk Queue Length (if consistently high, upsize the disk subsystem).
      CPU: System – % Total Processor Time (if consistently over 80%, upgrade the CPU).
      System: Server – Work Item Shortages (if consistently increasing, the server is failing requests, usually due to CPU or memory).
      Memory: Memory – Pages/sec and Paging File – % Usage (if these two are always high, add memory).
      Create the page file at two times physical RAM for both Min and Max to prevent excessive fragmentation, and/or put it on another physical drive. There is no need to invest in third-party tools; the same performance counters PerfMon reads are what most of them use anyway.
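
      As a concrete illustration, the counters above correspond roughly to the following counter paths in a Windows 2000 System Monitor counter log (exact object and counter names vary slightly between NT 4.0 and Windows 2000, so treat these as a sketch rather than exact strings):

          \LogicalDisk(_Total)\Avg. Disk Queue Length
          \Processor(_Total)\% Processor Time
          \Server\Work Item Shortages
          \Memory\Pages/sec
          \Paging File(_Total)\% Usage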

    • #3771871

      Ed Bott’s Microsoft Challenge–8/10/00

      by techytype ·

      In reply to Ed Bott’s Microsoft Challenge–8/10/00

      All things remaining equal, and assuming no changes such as new software are taking place, my comment is simply this: when was the last time the server hard drives were defragmented? My newest personal favourite for NT or W2K is Diskeeper by Executive Software. The crown used to belong to Symantec’s Speed Disk (and it still does for 9x).

    • #3771869

      Ed Bott’s Microsoft Challenge–8/10/00

      by jvincent ·

      In reply to Ed Bott’s Microsoft Challenge–8/10/00

      Windows 2000’s performance monitoring tools should suffice for tracking down memory-hogging applications. In particular, I would use Performance Monitor and Task Manager. Specifically, I would check the page file size and increase it in proportion to the increase in RAM.

    • #3771852

      Ed Bott’s Microsoft Challenge–8/10/00

      by jamiec ·

      In reply to Ed Bott’s Microsoft Challenge–8/10/00

      I agree with the ‘how vague’ statement; you first need to determine what this server’s function is. It’s quite possible that it is overloaded with functions: maybe it’s the PDC, IIS, Exchange, and SQL Server all rolled up into one! That might work for such a small network, but it would be better to split the functions off to different servers.

    • #3771846

      Ed Bott’s Microsoft Challenge–8/10/00

      by jshilling ·

      In reply to Ed Bott’s Microsoft Challenge–8/10/00

      Is the administrator using WINS on this server? And if so, has he upgraded to Windows 2000 Service Pack 1? Windows 2000 has a known issue with WINS; in fact, it loses approximately 1MB of RAM per minute. If he is using WINS on this server, then perhaps this is his problem, and he needs to apply SP1.

    • #3771795

      Ed Bott’s Microsoft Challenge–8/10/00

      by joek83 ·

      In reply to Ed Bott’s Microsoft Challenge–8/10/00

      Alright, it is like this, Bott Daddy: he should examine what programs are using his RAM via the Windows Task Manager, under the Processes tab. Perhaps the five peons on this network are partaking in a Quake 3 tournament unbeknownst to the network admin… this would also explain why last quarter’s sales are way down. Lastly, he should check that the new RAM matches the bus speed of his other RAM stick.

      I surf, therefore I am.

    • #3771792

      Ed Bott’s Microsoft Challenge–8/10/00

      by tony_giboney ·

      In reply to Ed Bott’s Microsoft Challenge–8/10/00

      Was the real problem a lack of memory, or something he is overlooking?

      The problem is slowly getting worse, so he may want to look at disk fragmentation on his server and his workstations. He may want to check out Microsoft’s KB articles Q13539, Q227463, and Q223146. There is also a mass of information on this subject in the form of white papers from NSTL on the highly recommended Diskeeper web site: http://wwww.execsoft.com/whats-new/whitepaper.com

      Also note that the version of Diskeeper that comes packaged with Win2K has its limitations (see MS KB Q223146); among them, it can’t be run as a regular user and can’t be scheduled (so you can’t run it in the off hours).

      I have had good results with Diskeeper 5.0 for Win2K, running it in the off hours on my servers and workstations; the improvements are noticed by all.

    • #3771790

      Ed Bott’s Microsoft Challenge–8/10/00

      by gkleffner ·

      In reply to Ed Bott’s Microsoft Challenge–8/10/00

      Once again, the problem is too vague. Not enough details. If this administrator would start Task Manager and click on the Performance tab, it will show the amount of available memory and the peak Commit Charge. If there is very little available memory and the peak is significantly greater than 128MB, the server is still too low on memory. Then click on the Processes tab and then on the Mem Usage column header; this will sort the processes by memory usage, and it will be simple enough to determine where memory is being consumed.

      If it appears that there isn’t enough memory, do not resort to increasing the size of the pagefile; this will only increase the bottleneck at the disk subsystem and will not improve performance. If this Windows 2000 server is the AD server and it is running DNS, WINS, possibly DHCP, and applications, I would recommend installing NT 4.0 instead. Further problem determination could be done using Performance Monitor, but there is certainly no need for third-party tools.
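
      To make the “sort by Mem Usage” step concrete, here is a minimal sketch of the same idea in Python using the third-party psutil package (an assumption on my part; nothing like this ships with Windows 2000). It simply ranks processes by resident memory, the way sorting Task Manager’s Processes tab does:

          # Rank processes by resident memory, similar to sorting Task Manager's
          # Processes tab by the Mem Usage column.
          import psutil

          procs = []
          for p in psutil.process_iter(attrs=["pid", "name", "memory_info"]):
              mem = p.info["memory_info"]
              if mem is None:            # access was denied for this process; skip it
                  continue
              procs.append((mem.rss, p.info["pid"], p.info["name"] or "?"))

          # Print the ten largest working sets, biggest first.
          for rss, pid, name in sorted(procs, reverse=True)[:10]:
              print(f"{name:<30} pid={pid:<6} {rss // (1024 * 1024)} MB")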

    • #3771589

      Ed Bott’s Microsoft Challenge–8/10/00

      by pvp ·

      In reply to Ed Bott’s Microsoft Challenge–8/10/00

      With so few facts to go on, the first item on the list is whether the paging file was increased in proportion to the RAM increase. If not, that alone will drop performance like a rock.
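
      As a rough worked example of “in proportion” (using the 2x-RAM figure other posters in this thread suggest; the starting pagefile size here is only an assumption):

          # RAM went from 64MB to 128MB, so scale the pagefile by the same factor.
          old_ram_mb, new_ram_mb = 64, 128
          old_pagefile_mb = 128                    # assumed starting size (2x the old RAM)
          scale = new_ram_mb / old_ram_mb          # 2.0
          print("new pagefile:", int(old_pagefile_mb * scale), "MB")   # 256 MB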

    • #3771548

      Ed Bott’s Microsoft Challenge–8/10/00

      by tony hogan ·

      In reply to Ed Bott’s Microsoft Challenge–8/10/00

      The existing tools should be sufficient to monitor the situation. Monitoring is a black art unto itself, so finding the “right” items to monitor is the majority of the battle.

      In regards to the slowness, the additional memory and the (presumably) increased swap file size may be leading to additional swapping. There are also some older chipsets and motherboards (if this is an older machine) that can’t cache above 64MB, so the additional memory actually causes a decrease in system performance.

      The aforementioned answers on memory quality are also applicable. CAS latencies and the like can come into play, where the system effectively adjusts the entire memory pipeline to the lowest common denominator. In the case of servers, you definitely get what you pay for.

      -tony

    • #3770753

      Ed Bott’s Microsoft Challenge–8/10/00

      by fyousuff2013 ·

      In reply to Ed Bott’s Microsoft Challenge–8/10/00

      The administrator should:
      1. Check the disk paging.
      2. Run the disk defragmenter, check the performance, and then increase the swap file on the server. He/she should see some performance difference.
      3. If the above does not work, check the server cache.

    • #3770752

      Ed Bott’s Microsoft Challenge–8/10/00

      by chrisengland2 ·

      In reply to Ed Bott’s Microsoft Challenge–8/10/00

      So many good points already covered – I hope I can add something new…
      Free space is probably running low on the primary partition. If the hard drive is one partition, it may be possible to transfer that drive’s image to a bigger hard drive in order to achieve more free space. The other solution would be to add an additional hard drive.

      In the first situation, it would be possible to increase the size of the page-file after transferring the original drive’s image to the larger hard drive.

      In the second situation (adding a second hard drive), I would partition and format the second drive, create a new page-file on it (about twice the RAM size), and then minimize the page-file on the primary drive to about 20MB min/40MB max. That would give the primary drive room to “breathe”.

      Of course, if the system has a single hard drive with two partitions (and plenty of free space on the second partition), it would accomplish the same thing to add a new page-file to the second partition and then minimize the one on the primary partition.
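
      For reference, a minimal sketch of what the resulting pagefile configuration might look like in the registry, assuming the second scenario above (128MB of RAM, a small pagefile left on C: and a roughly 2x-RAM pagefile on a new D: drive). This value is normally set through System Properties rather than edited by hand, and the drive letters and sizes here are only illustrative:

          HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management
            PagingFiles (REG_MULTI_SZ):
              c:\pagefile.sys 20 40
              d:\pagefile.sys 256 256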

    • #3770593

      Ed Bott’s Microsoft Challenge–8/10/00

      by levihilomen ·

      In reply to Ed Bott’s Microsoft Challenge–8/10/00

      Check the Task Manager memory usage to determine where all his memory is being used up. Remove the NetBEUI protocol, as it uses broadcasts on the network. I’m not putting down Windows 2000’s performance monitoring tools, but third-party tools will be helpful. Lastly, try increasing your paging file size.

    • #3752516

      Ed Bott’s Microsoft Challenge–8/10/00

      by mferrel ·

      In reply to Ed Bott’s Microsoft Challenge–8/10/00

      IMHO, 128MB is the recommended amount of memory for a standard W2K Pro user. That jumps to 256MB if they are working with graphics or engineering software. 256MB is also my minimum for a W2K server. At 64MB, I’m surprised information access wasn’t being compared to parallel-port ZIP drive access! Obviously other things also come into play, such as the pagefile and just what is running on top of the server. In nearly all cases, $ spent on RAM beats $ spent on CPU!

    • #3749386

      Ed Bott’s Microsoft Challenge–8/10/00

      by moshker ·

      In reply to Ed Bott’s Microsoft Challenge–8/10/00

      I agree that he should do some monitoring to determine where his bottleneck is. There are also utilities for benchmarking and diagnosing memory specifically; check out http://tucows.com and http://wintune.winmag.com/. Fragmentation might be a problem, but he should also make sure that all the hardware he is using is on the hardware compatibility list (www.microsoft.com/hcl). Also, what chipset is he running? Older TX and LX chipsets run no faster with 128MB than they do with 64MB because of a caching limitation, among other things.

    • #3764093

      Ed Bott’s Microsoft Challenge–8/10/00

      by ebott ·

      In reply to Ed Bott’s Microsoft Challenge–8/10/00

      This question was auto closed due to inactivity

Viewing 22 reply threads