
Measuring the impact of Terminal Services on your server

As you can probably guess, Terminal Services adds overhead to your server and can slow it down. But how do you find out how much? Brien Posey shows you how, using the Windows 2000 Performance Monitor.

As I mentioned in the Daily Drill Down "Introducing Windows 2000 Terminal Services," Terminal Server clients can place quite a load on your server. But exactly how much do they affect your server and how do you find out? You can answer these questions by using the Windows 2000 Performance Monitor.

So that you can see exactly how much load is placed on a server, I’m going to run a series of tests through the Windows 2000 Performance Monitor. First, I’ll establish some baseline values. Next, I’ll take a reading again with Terminal Services enabled but with no clients connected. After that, I’ll take some more readings with one client connected but not really doing much. Finally, I’ll take a reading with a single Terminal Server client attached and running a high-demand application.

Author’s note
Obviously, the readings that I get will be specific to my system. The amount of strain Terminal Services places on your system depends highly on your specific hardware and on what else is running on your server. For this particular series of tests, I’ll be using a low-end server. Because I am only performing the tests with a single client, the impact will be easier to judge on one of my slower machines. The server is a 400-MHz Pentium II with 256 MB of RAM. Even so, this test will give you a feel for the effects of Terminal Server clients and for what to check on your own server. Just remember, your mileage may vary.

Establishing the baseline
The first step in the process is to establish a baseline reading. I started by taking readings with the Terminal Services and the Terminal Services Licensing services disabled. I watched the processor, the memory, the hard disk, and the network bandwidth. To do so, I tracked the following counters in Performance Monitor:
  • Processor/% Processor Time
  • Memory/Available Mbytes
  • Physical Disk/% Disk Time
  • Redirector/Bytes Total/sec
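If you'd rather capture these averages programmatically than read them off the Performance Monitor graph, the general idea looks like the sketch below. It's plain Python with made-up sample values standing in for the counter data (on a real server, the samples would come from Performance Monitor or a command-line collector); none of it is part of the tests in this article.

```python
import statistics

# Made-up stand-in samples for the four counters listed above.
# On a real server these would be periodic readings collected from
# Performance Monitor; the values here are for illustration only.
samples = {
    "Processor/% Processor Time": [7.0, 9.5, 8.1, 10.2, 9.7],
    "Memory/Available Mbytes":    [35.1, 35.6, 35.4, 35.7, 35.6],
    "Physical Disk/% Disk Time":  [2.9, 4.1, 3.2, 3.8, 3.5],
    "Redirector/Bytes Total/sec": [1.2, 1.7, 1.4, 1.6, 1.5],
}

def average_counters(samples):
    """Reduce each counter's sample series to a single average reading."""
    return {name: statistics.mean(values) for name, values in samples.items()}

for name, avg in average_counters(samples).items():
    print(f"{name}: {avg:.3f}")
```

The point is simply that each "reading" quoted below is an average over many individual samples, which matters later when we look at how spikes can distort those averages.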

After loading the counters and giving them some time to stabilize, I took the following readings:
  • The average percentage of processor time used was 8.9 percent, as shown in Figure A.
  • The average amount of available memory was 35.49 MB.
  • The average percentage of disk time used was 3.492 percent.
  • The average amount of network traffic flowing through the redirector was 1.495 bytes per second.

Figure A
Without Terminal Services running, the processor was running at an average of 8.9 percent of its total capacity.


Testing Terminal Server’s impact
After I had taken some initial readings, it was time to enable Terminal Services and the Terminal Services Licensing service. This simulated an idle Terminal Server. After enabling these services, I gave the server some time to stabilize and then took the following readings:
  • The average percentage of processor time used was 16.5 percent.
  • The average amount of available memory was 37 MB.
  • The average percentage of disk time used was 24.184 percent.
  • The average amount of network traffic flowing through the redirector was 563 bytes per second.

As you can see, merely activating the services had an impact on the system, roughly doubling processor time and increasing disk time by almost seven times. Also, as you can see in Figure B, the average amount of redirector traffic was thrown off by a huge burst of network traffic. After this burst, there was almost no network traffic.

Figure B
After enabling Terminal Services, there was a large burst of network traffic followed by almost no network traffic.


Next, I initiated a Terminal Services connection from the client, which had an almost immediate impact. Again, after allowing the readings to stabilize a bit, I saw the following results:
  • The average percentage of processor time used was 13.340 percent.
  • The average amount of available memory was 43.48 MB.
  • The average percentage of disk time used was 60.786 percent.
  • The average amount of network traffic flowing through the redirector was 78.554 bytes per second.

Finally, it was time to generate a heavy workload on Terminal Services. To do so, I ran several applications at the same time on the client through its Terminal Services connection. Here are the results of this test:
  • The average percentage of processor time used was 12.955 percent.
  • The average amount of available memory was 28.9 MB.
  • The average percentage of disk time used was 49.170 percent.
  • The average amount of network traffic flowing through the redirector was 165.649 bytes per second.
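To make the four sets of averages easier to compare, you can express each phase as a multiple of the baseline. The short Python sketch below does exactly that with the figures reported above; it's an illustration I've added after the fact, not something that was part of the original tests.

```python
# Average readings from the four test phases described above:
# baseline, Terminal Services idle, one idle client, one busy client.
phases = {
    "baseline":    {"cpu_pct": 8.9,    "mem_mb": 35.49, "disk_pct": 3.492,  "net_bps": 1.495},
    "ts_idle":     {"cpu_pct": 16.5,   "mem_mb": 37.0,  "disk_pct": 24.184, "net_bps": 563.0},
    "one_client":  {"cpu_pct": 13.340, "mem_mb": 43.48, "disk_pct": 60.786, "net_bps": 78.554},
    "busy_client": {"cpu_pct": 12.955, "mem_mb": 28.9,  "disk_pct": 49.170, "net_bps": 165.649},
}

def versus_baseline(phase):
    """Express each counter in a phase as a multiple of its baseline value."""
    base = phases["baseline"]
    return {name: phases[phase][name] / base[name] for name in base}

for phase in ("ts_idle", "one_client", "busy_client"):
    ratios = versus_baseline(phase)
    readable = ", ".join(f"{name} x{value:.2f}" for name, value in ratios.items())
    print(f"{phase}: {readable}")
```

Running this shows, for example, that merely enabling the services pushed disk time to nearly seven times its baseline value while processor time roughly doubled.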

Examining the numbers
Now that all of the tests have been completed, let’s take a look at the numbers and what they really mean. Often, the average that Performance Monitor reports isn’t truly representative of actual performance. For example, in Figure A, the processor was running at about 8 percent right off the bat. However, in that graph, you can see several spikes, which were generated by the process of taking screen captures. Therefore, the true baseline processor time was probably much lower.
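One way to see how much a few spikes can distort an average is to compare the mean of a sample series against its median, which resists outliers. The numbers below are invented to mimic a mostly idle processor with two screen-capture spikes; they are not the article's actual data.

```python
import statistics

# Invented processor-time samples: mostly idle, with two brief spikes
# of the sort that taking screen captures produced in Figure A.
cpu_samples = [3, 4, 3, 5, 4, 3, 45, 4, 3, 4, 50, 3, 4, 3, 4]

mean = statistics.mean(cpu_samples)      # pulled upward by the two spikes
median = statistics.median(cpu_samples)  # closer to the typical idle load

print(f"mean:   {mean:.1f}")   # about 9.5 percent
print(f"median: {median:.1f}") # 4.0 percent
```

Here the median of 4 percent describes what the processor was doing most of the time far better than the spike-inflated mean does.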

When I started Terminal Services, the processor time doubled. You can see two large spikes in Figure C that were generated by the physical act of starting the services. The graph in Figure C to the left of these spikes indicates the actual overhead that the system is normally subjected to after loading Terminal Services.

Figure C
After enabling Terminal Services, the processor was running at an average of 16.5 percent of its total capacity and spiked twice.


You can see that the average in Figure C isn’t truly representative by comparing it with the numbers generated once the client actually started working. In both of those cases, the average processor time is lower than when Terminal Services had just started and was running at idle. That’s only because the startup spikes inflated the idle average; in reality, attaching clients to the Terminal Server causes the processor to work harder than it does when no clients are attached.

Now, let’s take a look at the memory that was available. You’ll notice that as I progressed through the early tests, the available memory actually increased. What the test doesn’t tell you, though, is that this increase came from Windows paging data out of physical RAM into virtual memory, which doesn’t perform anywhere near as quickly as RAM. When I logged on through the Terminal Server client and started placing a light load on the server, the amount of available memory began steadily decreasing, and by the time I placed a heavy load on the server, it had dropped dramatically.

The measurement of how much the hard disk is used is a little more accurate than some of the other numbers we’ve looked at. Initially, there wasn’t much disk activity, and it started increasing as I launched Terminal Services. When I attached a client to the Terminal Server, though, the hard disk was absolutely hammered. Eventually, the hard disk had a chance to catch up with the workload somewhat, but as you can see, Terminal Services was still hitting the hard disk really hard.

The one thing about this test that really surprised me was that Windows displayed a minimal amount of network traffic throughout all of the tests. My theory is that because I was using a really fast network connection with only one client, the traffic did actually pass through my network, as shown by the numbers, but did so in such a quick burst that it may not have appeared on the graph. Another possible reason for the low amount of network traffic is that I had data compression enabled.

Conclusion
In this Daily Feature, I showed you how to use the Windows 2000 Performance Monitor to measure the impact that Terminal Services has on your server. Although your results will vary with your hardware and workload, tracking the processor, memory, disk, and redirector counters will give you a good picture of what Terminal Server clients are costing you.