
Don't be fooled by these 10 PC performance myths

Much of what you hear about boosting PC performance is outdated -- and some of it was never true to begin with. Here's the real story.

PC enthusiasts are always looking to squeeze more speed out of their machines. Unfortunately, a number of incorrect or outdated performance tips have been around long enough to become myths. Here are 10 of these myths -- and the truth about them. As always, I am sure you'll be able to think of plenty more. So be sure to post your own myth-busting in the forums!

1: Vista and Windows 7 require many times more RAM than XP

When people first move from Windows XP to Windows Vista or Windows 7 and bring up a RAM usage meter, they often panic. What they see is something like Figure A.

Figure A

Does Windows 7 really use more than 8 GB of RAM?!

Wow, that looks scary, doesn't it? The system is doing just about nothing (1% CPU usage), yet it appears to need 8.84 GB of physical RAM just to idle. Here's what is really happening.

Starting in Vista, Windows got aggressive about RAM use. The engineers at Microsoft made it pre-allocate RAM and pre-cache commonly used items (the SuperFetch feature), even if they are not actually in use at the moment. For example, if you use Word a lot, Windows will keep Word in memory, ready to go. Obviously, this chews up a ton of RAM -- and why not? You weren't using that RAM anyway, and you will eventually need it, most likely for exactly the things Windows is preparing for. Applications start much faster as a result.
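If you want to see the distinction for yourself, the raw numbers are easy to pull. Here's a minimal Python sketch (Windows only, calling the Win32 GlobalMemoryStatusEx function via ctypes). The key point is that the "available" figure includes standby (cached) pages, so a system that looks nearly full in a RAM meter can still have plenty of memory ready to hand back the instant a program asks for it.

    import ctypes

    class MEMORYSTATUSEX(ctypes.Structure):
        # Field layout of the Win32 MEMORYSTATUSEX structure.
        _fields_ = [
            ("dwLength", ctypes.c_ulong),
            ("dwMemoryLoad", ctypes.c_ulong),
            ("ullTotalPhys", ctypes.c_ulonglong),
            ("ullAvailPhys", ctypes.c_ulonglong),
            ("ullTotalPageFile", ctypes.c_ulonglong),
            ("ullAvailPageFile", ctypes.c_ulonglong),
            ("ullTotalVirtual", ctypes.c_ulonglong),
            ("ullAvailVirtual", ctypes.c_ulonglong),
            ("ullAvailExtendedVirtual", ctypes.c_ulonglong),
        ]

    stat = MEMORYSTATUSEX()
    stat.dwLength = ctypes.sizeof(stat)
    ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(stat))

    print(f"Total physical RAM: {stat.ullTotalPhys / 2 ** 30:.2f} GB")
    print(f"Available (free + cached standby): {stat.ullAvailPhys / 2 ** 30:.2f} GB")
    print(f"Memory load reported by Windows: {stat.dwMemoryLoad}%")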

2: More RAM is always faster

More RAM is not a guarantee of a faster machine, although more RAM has rarely hurt. Actually, even that isn't quite true! Larger-capacity modules often run at a slower bus speed than smaller ones, so in theory, more RAM can be mildly harmful to performance. More important is the Dual Channel vs. Triple Channel issue. If you have a choice between 12 GB of RAM in Triple Channel mode and 16 GB in Dual Channel mode, the 12 GB will be faster, so long as you rarely need to go to the swap file. And since Windows pre-allocates RAM and caches often-used items, the extra RAM could conceivably make a difference -- assuming you are a huge RAM user.
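The channel trade-off is easy to put rough numbers on. Here's a back-of-the-envelope sketch, assuming DDR3-1333 modules (swap in your own parts): each 64-bit channel moves 8 bytes per transfer, so peak theoretical bandwidth is simply the transfer rate times 8 times the number of channels. Keep in mind these are theoretical peaks; measured gains in real applications are usually far smaller.

    # Theoretical peak memory bandwidth: transfer rate (MT/s) x 8 bytes per 64-bit channel x channels.
    # DDR3-1333 is only an assumed example; plug in the speed of your own modules.
    def peak_bandwidth_gb_per_s(megatransfers_per_s, channels):
        return megatransfers_per_s * 1e6 * 8 * channels / 1e9

    for channels, label in [(2, "dual channel"), (3, "triple channel")]:
        peak = peak_bandwidth_gb_per_s(1333, channels)
        print(f"DDR3-1333, {label}: ~{peak:.1f} GB/s theoretical peak")
    # DDR3-1333, dual channel: ~21.3 GB/s theoretical peak
    # DDR3-1333, triple channel: ~32.0 GB/s theoretical peak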

3: Anti-malware apps kill performance

Yes, anti-malware apps have an effect on performance. And at one time, that effect was massive: back in the day, many PC slowdown issues could be solved simply by removing an antivirus package. In recent years, things have changed.

It used to be that anti-malware apps essentially had to hijack the OS to see what was going on with the file system and RAM, and that was where the slowdown occurred. That is no longer the case. Windows now provides official hooks that let anti-malware applications inspect files and sign off on them in an orderly, supported fashion. As a result, anti-malware apps still carry a performance cost, but it's minimal.

4: If you clear the browser history, you'll gain some speed

On a regular basis, I see advice like this bandied about:

  • Delete your browser history to speed things up.
  • Clear your cookies for more speed.
  • Empty your browser cache to make the Web fly!

Guess what? It's bunk. The only thing that clearing the history could make faster is the display of suggestions from your browser (which quietly pares the list as needed for performance anyway).

Dumping the cookies won't do anything, since they don't sit in memory; they are merely read and sent along to the server with matching requests, and they're so small that they won't slow things down noticeably. And the browser cache? It makes things faster! Think about it: What's going to be faster when your browser needs an image, CSS, or JavaScript file -- re-downloading it from the site or pulling it off the local hard drive? Emptying your cache was a storage-space tip in the 90s, when drive space was at enough of a premium that the browser cache could be a big chunk of it. Somehow, the tip eventually morphed into a bogus performance trick.
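To get a feel for the gap, here's a small, hypothetical Python timing sketch (example.com stands in for any site; run it against whatever you like). It fetches a resource over the network once, the way an empty cache forces a browser to, and then reads the saved copy back from disk, the way a warm cache would.

    import time
    import tempfile
    import urllib.request

    URL = "http://example.com/"  # placeholder URL; any small static resource will do

    # First access: nothing cached, so we go out over the network.
    start = time.time()
    data = urllib.request.urlopen(URL).read()
    print(f"network fetch: {len(data)} bytes in {(time.time() - start) * 1000:.1f} ms")

    # Store a local copy, roughly what a browser cache does.
    with tempfile.NamedTemporaryFile(delete=False, suffix=".html") as f:
        f.write(data)
        cached_path = f.name

    # Second access: serve the same bytes straight from the local disk.
    start = time.time()
    with open(cached_path, "rb") as f:
        cached = f.read()
    print(f"local cache:   {len(cached)} bytes in {(time.time() - start) * 1000:.1f} ms")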

5: Registry cleaning is a miracle worker

This is another one I see all the time. In theory, yes, a smaller registry will have an effect on performance. But that presumes your applications are constantly hitting the registry and that your registry is in such poor shape that the junk makes up a significant part of it. And even then, guess what? You've optimized the data in a database that is already designed to be fast, resides in RAM, and is only a few megabytes in size anyway.

Unless you're running on a PDP-11, working with a database the size of the registry is so blazing fast that you could slash it to 1% of its current size and still not see a real difference. That said, cleaning the registry can have some benefits (especially clearing out leftover entries from apps you've uninstalled but may reinstall), but performance is not going to be one of them. This is a tip that made a lot more sense a long time ago but is no longer important.
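If you want to see just how cheap registry reads are, here's a minimal sketch using Python's built-in winreg module to time a few thousand lookups of a standard value (the key path shown is a common one on any Windows install; adjust it if you like). Each lookup comes back on the order of microseconds, which is why trimming a few hundred stale entries buys you nothing you can feel.

    import time
    import winreg  # Windows-only module in the Python standard library

    KEY_PATH = r"SOFTWARE\Microsoft\Windows NT\CurrentVersion"
    LOOKUPS = 10_000

    start = time.time()
    for _ in range(LOOKUPS):
        # Open the key, read one value, and close it again -- deliberately the slow way.
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
            value, _value_type = winreg.QueryValueEx(key, "ProductName")
    elapsed = time.time() - start

    print(f"{LOOKUPS} registry lookups of {value!r} took {elapsed:.3f}s "
          f"({elapsed / LOOKUPS * 1e6:.1f} microseconds each)")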

6: Having more cores is always better

Having more CPU cores is not going to slow you down. But in many cases, you are simply wasting your money. Few applications are multi-threaded in a truly parallel way, where the application is grinding away on all your cores at once to solve hard problems. Writing parallel processing code is hard to do (I know from first-hand experience) -- and it's even harder to do right.

Many of the most demanding applications, like games and graphics processing, push the hardest work onto the GPU rather than the CPU -- and in many cases that even includes non-graphics work (like bulk cryptography). So yes, while having extra cores is great, don't expect that putting dual quad-core CPUs in your box is going to buy you any extra speed, unless you run those rare applications that are really optimized for it or you do a ton of work with virtual machines.
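If you're curious what "optimized for it" actually looks like, here's a toy Python sketch (the prime-counting workload is hypothetical and embarrassingly parallel, which most desktop software is not). The serial loop keeps one core busy no matter how many you have; the process pool spreads the same chunks across every core.

    import time
    from concurrent.futures import ProcessPoolExecutor

    def count_primes(bounds):
        # CPU-bound busywork: count primes in a half-open range.
        lo, hi = bounds
        count = 0
        for n in range(max(lo, 2), hi):
            if all(n % d for d in range(2, int(n ** 0.5) + 1)):
                count += 1
        return count

    if __name__ == "__main__":
        chunks = [(i, i + 50_000) for i in range(0, 200_000, 50_000)]

        start = time.time()
        serial_total = sum(count_primes(c) for c in chunks)      # one core does everything
        print(f"serial:   {serial_total} primes in {time.time() - start:.2f}s")

        start = time.time()
        with ProcessPoolExecutor() as pool:                       # one worker per core by default
            parallel_total = sum(pool.map(count_primes, chunks))
        print(f"parallel: {parallel_total} primes in {time.time() - start:.2f}s")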

7: Drive RPMs are all that matter

When measuring drive performance, people love to look at how fast the platters spin. While higher-RPM drives can theoretically read large chunks of data faster and perhaps seek a little quicker, the better number to look at is actually the seek time. Little data transfer happens in long, drawn-out reads or writes; most of it is small random access. Seek time is therefore very important to performance. Also look for a larger on-drive cache and the overall transfer-rate figure.
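Here's a back-of-the-envelope sketch with assumed but typical numbers (a 7,200 RPM drive, a 9 ms average seek, and 100 MB/s sequential transfer): for a small random read, nearly all the time goes to moving the heads and waiting for the platter, not to moving the data itself.

    # Rough cost of a single random 4 KB read on a conventional hard drive.
    # Every figure here is an assumption typical of a 7,200 RPM consumer drive.
    rpm = 7200
    avg_seek_ms = 9.0                               # average head travel
    rotational_latency_ms = 0.5 * 60_000 / rpm      # wait half a revolution, on average
    transfer_ms = 4 / (100 * 1024) * 1000           # 4 KB moved at 100 MB/s

    total_ms = avg_seek_ms + rotational_latency_ms + transfer_ms
    print(f"seek {avg_seek_ms:.2f} ms + rotation {rotational_latency_ms:.2f} ms "
          f"+ transfer {transfer_ms:.3f} ms = {total_ms:.2f} ms per random read")
    print(f"that works out to roughly {1000 / total_ms:.0f} random reads per second")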

8: You should empty the Recycle Bin for more speed

This is another of those tips that made sense 10 years ago but is outdated now. Emptying the Recycle Bin will obviously free up disk space. But where is the performance boost going to come from? I suppose that if you have a huge amount of data in the Recycle Bin and dump it, and then perform a defrag, it is possible that you'll suddenly get such a well-optimized disk that there will be a noticeable difference. This presumes, of course, that you regularly create large amounts of data in the middle of the physical disk and then remove it. Unless you are constantly installing and uninstalling large applications, and creating and then removing large amounts of data, emptying the Recycle Bin is not going to give you a noticeable speed improvement.

9: You need a fancy hard drive for ultimate performance

For a long time now, specialty drives like the Western Digital VelociRaptor have been used to get the best disk speed around. There's no doubt that these drives are fast. But did you know that you can get just about the same speed from less expensive drives? The secret of the VelociRaptor's performance is that it uses small platters, so the heads never have far to move. If you can find a drive with similar cache and RPM specs, you can "short stroke" the disk. Essentially, you partition only the first portion of the drive -- the fast outer tracks -- format it, and use just that partition, leaving the rest untouched. Keeping head travel that short gives you the benefit of the smaller platters without the cost of the specialty drive.

10: One big disk is fine

People assume that because it's rare for multiple applications to pound on the hard drive at the same time, one large disk is fine for performance. And yes, it's unlikely that two applications will simultaneously be fighting for a ton of disk access, unless you are running a server, running VMs, or doing some crazy multitasking. Splitting your data between two disks (the common scheme of OS and apps on one disk, documents on another) really does not give much of a performance gain. At the same time, multiple disks can be a huge performance boost... when put into a RAID array. Check out the information on Wikipedia about what different RAID levels can do for the read and write times of your PC, and you'll see why RAID is a hidden performance gem.
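As a very rough guide (the real numbers depend heavily on the controller, the stripe size, and the workload), here's a small sketch of the usual rules of thumb for how striping and mirroring scale throughput with the number of drives.

    # Rule-of-thumb throughput scaling for two common RAID levels.
    # 120 MB/s per drive is an assumed figure for a single 7,200 RPM disk.
    single_drive_mb_s = 120

    for drives in (2, 4):
        striped = single_drive_mb_s * drives   # RAID 0: reads and writes spread over all drives
        print(f"RAID 0 with {drives} drives: ~{striped} MB/s sequential (ideal case, no redundancy)")

    # RAID 1: every write goes to both drives (~1x), but reads can be served by either drive.
    print(f"RAID 1 with 2 drives: ~{single_drive_mb_s} MB/s writes, "
          f"up to ~{single_drive_mb_s * 2} MB/s concurrent reads, survives one drive failure")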

About

Justin James is the Lead Architect for Conigent.

Comments
Shrike49

While all the above have some merit, I believe the whole crux of PC performance has to start with the machine itself. Hardware, starting with the motherboard and BIOS, what's on board and what's not, is just the tip of the iceberg. Then come the RAM and HDD and on-board features, etc. But wait, we are not finished yet: the operating system and its setup, and then the application programs, also add to performance issues. Bottom line... you need to look at the whole situation, including the age of the computer and the software in use! So generalising about performance myths with the above in mind is a bit of a waste of time. Still, it's food for thought!

andrew232006

1. Vista (requires 1 GB, 2 GB to be usable in my opinion) requires a ton more RAM than XP (which can run on 64 MB). What Windows thinks I may need RAM for isn't always what I will need it for.
2. Not always faster, but it is usually a cheap and effective upgrade. It is rarely slower -- who buys slower RAM to upgrade their PC?
6. Who here isn't running more than 6 programs or services right now?

cquirke

I was interested in Tips 7, 9 and 10 as regards to hard drives and speed. My approach has always been to restrict 90% of hard drive activity to 5% of the hard drive's head span, via partitioning, shell folder relocation and control over activities that automatically grope hard drive content underfoot. It helps to have a large quantity of content at a single head position, to minimize head clicks. If reading the wrong part of the right cylinder, the drive could read the content anyway and cache it somewhere, but while heads are moving, no data can flow. If you can minimize most head travel to 5% of the drive, you are on the way to SSD's absence of head travel, without any of the downsides of SSDs.

Small-capacity high-RPM drives won't do that for you (Tip 7), and partitioning the first part of a large drive helps as per Tip 9. This works better than keeping a large partition empty, if the OS and file system insist on placing often-accessed structures in the middle of the volume, as NTFS appears to do. No matter how brain-dead the OS logic or how fragmented the volume, head travel will never be beyond the end of the partition.

You can then apply Tip 10, setting up the rest of a large drive to hold material that is not accessed that often; wads of music, videos and pics, huge but seldom-played games, etc. Part of that involves relocating the shell folders, but you also need to kill any background processes that automatically grope file systems; System Restore, search indexers and media thumbnailers, background "whole system" av scans, etc. The payoff is a PC that stays as fast as it was when empty, even if it's filling up with content.

But you have to keep an eye on how OS and file systems change, as most design assumes "one big C:" - for example, the way Windows keeps inactive copies of installation material and updated code files within C:, without leveraging a shell folder that would make it easier to relocate this stuff elsewhere. Previous Versions and file system attempts to maintain undoability may also bloat up volume space use, and spread the position of material across the volume even when the volume appears to be largely "empty".

cquirke

On your Tips (4) and (8): The impact of thousands of directory entries is high on FATxx, which uses a linear lookup method. In contrast, NTFS uses a B-tree directory structure, which should mitigate that performance impact - though not sure if it does so completely. A full Recycle Bin and large TIF may both incur a large number of directory entries in single directories, though this will be offset in TIF due to multiple subdirs there. TIF is unlikely to be on FATxx in Vista/7, but there may be Recycle Bins on FATxx, and that may hurt as the contents of all bins are displayed in any bin (or "the" bin). Then again, if this display is done from an Index.dat rather than by enumerating each bin's contents, the pain is moved to whenever that .dat is updated.

I don't see any value in huge Temporary Internet Files (TIF). IE used to duhfault to absurdly large caches, as a % of volume size; that stopped in IE 7 or so, but started again in IE 9. When these are repeated across multiple user accounts, the effect is that much worse. Consider a user with a 3G monthly ADSL capacity limit, i.e. the total Internet content that can be consumed is limited by the ISP to 3G per month. What is the point of a 5G web cache, in such cases? What is the point of storing bits of web pages that are a month stale, when the site is likely to have changed by that time, forcing a refresh anyway?

If the connection's fast enough to fill a huge cache, it's fast enough to benefit less from heroic caching. If a connection is so slow it needs caching to speed it up, it won't fill an enormous cache anyway. A large cache means lots of small files that persist for quite a while, and then get deleted - which fragments C: into swiss cheese. Even if NTFS fully encloses such file content with the metadata store, that store will get large enough to hurt.

cquirke

On Tip (6): Multiple cores... I think there's great value in more than one core, but for many users, that's where the advantages stop - especially if you lose clock speed to gain cores. When a thread hogs a core, it shows up as 99% CPU use in Task Manager on a single-core PC, which virtually stops running. In contrast, a dual-core will have that thread pegging 50% CPU and the rest of the system does stay running. But unless you're planning on running multiple threads that hog a core each, most of the advantage is gained with 2 cores.

More common is where a PC virtually stops running, but Task Manager shows CPU to be 98% idle. That is either because whatever was hogging the CPU also delayed Task Manager startup until it let the CPU go, or there is an off-CPU bottleneck at work. Suspect the first when there's no response to Ctl+Alt+Del for ages, until Task Manager pops up along with all the other long-ignored keystrokes and mouse clicks. Suspect the second when there's no drama in getting Task Manager up, yet CPU is 98% idle even when the system is still "running" like mud.

cquirke

Yes, Vista/7 need more RAM than XP. I think the blogger's coming from the same first-world perspective that says "always multiply Microsoft's RAM requirements by 4" and was prolly used to running XP in 2G RAM - as may be appropriate, if you want to run a bunch of apps as well as the OS itself. XP used to be sweet in 512M RAM, as Vista/7 are in 2G, but the types of apps, antivirus etc. you'd be using with XP today may make it quite slow in 512M; 1G would be a more reasonable baseline, which is closer to Vista/7's 2G, though at modern RAM prices and capacities, 4G is a no-brainer (and 64-bit is a must). The blogger's original point has got lost here, with all the squabbling over 2000/XP vs. Vista/7 RAM requirements, and that is: don't use reported "memory in use" as your target for physical RAM. Remember, the larger the RAM, the larger the swap and hibernate files, and the greater your head travel will be within a small-C:-for-speed model... but that's another story :-)

harishkumar09

More cores are indeed better, because while very few programs are coded to take advantage of multiple cores, most users these days run multiple applications simultaneously. At present, I have 10 browser tabs open (that's one application), an anti-virus program, a video encoding program converting avi to divx, two notepads and I don't know what else! The OS will run each of these on a different core. So there is a huge benefit. Even in the days of single-core computers people were running multiple applications simultaneously, thanks to the Windows multitasking feature. People who have made the shift to dual core can notice the huge difference in speeds. Plus, video encoding programs handle problems which naturally lend themselves to parallel processing, so I am sure that application alone runs on two threads. And the GPUs which the author talks about also have multiple cores to handle the naturally parallelizable graphics calculations.

janitorman

such as the Samsung 470 series. There is no spin speed, as there are no moving parts, there is no wait time to seek a sector, as all are available just as quickly.. etc. These are the wave of the future. Sure, the current generation is expensive, and has a limited number of read/writes compared to a traditional drive, but future models will probably correct this. Also, I challenge you to find the spin speed of the drive in your iPad, Kindle, etc. It won't be there, as they run on SD technology. A traditional drive wouldn't even FIT in one of those devices. Solid State drives work differently from traditional drives, and there is NO NEED to ever defragment them, as the data isn't usually stored in a fragmented form to start with! Happy Computing!

chsalam

If these are the myths, then what are the top real actions that could be taken that will actually enhance performance? Thanks!

tpirog

I regularly uninstall Norton and McAfee from my customers' computers and see huge performance increases. Who puts together this nonsense? Why propagate lies?

wwoef

I often see the claim that defragging the hard disk regularly improves performance. Yet, is this really the case? Can anyone shed some light on that? I have the feeling that here too the improvements are extremely small, if any...

baikosta

I consider myself a power user and I don't accept this nonsense about XP. XP is too outdated; in Win 7 everything (even the install itself) is easier, everything is better. At work I use XP, 1 GB RAM, and some 2.66 GHz Celeron, and this outdated PC has trouble with some basic multitasking - IE with 3-4 tabs, 3 Word documents, Lotus Notes and 3-4 Excel spreadsheets. The biggest problem with XP is that now it comes with outdated hardware and that's it. I have been using Win 7 since it came out and I will never go back to XP, and I will switch to Win 8 the moment it is officially released. This discussion is like old cars vs. new cars. New cars are crap, they break down a lot, etc... but try comparing a 150 HP diesel car which is faster, safer, better in every way, cleaner and more fuel efficient than some 20-year-old 75 HP petrol piece of crap (I drive one like this :)). Either you go with technology and pay for it becoming better, or you hang on to the old things and shout how good they are, when in fact they aren't.

metaphysician

I have no idea how accurate the statement is about antivirus programs no longer causing the performance hit they used to. It's going to take a lot of convincing to get me to try one of the biggies again. No names, but you know who the two biggies are. One of them crippled my server so badly that I felt like I was back in the old timesharing days after the major computer classes got out. It's going to be a long time before I try one of them again. Yes, I do run antivirus software on my Windows machines. I found a company whose software does a fine job without killing performance.

emailadsspam

Sorry Jessie, but I can't buy your "Oceanfront Property in Arizona"

dovelewis

I have always agreed with most of these and I learned a little from the others and the ensuing opinions. Now that we know for certain what will not work, what will? thanks

JonathanPDX

I find that clearing the Temp folder under the user's profile is more effective than clearing the Temporary Internet Files folder. The same goes for the Windows\Temp folder.

pgit

the bottom line. :) I don't tell people any of this will speed up performance, with the exception perhaps of multiple cores, but I do tell folks these are (mostly) common-sense things that make the system simpler, and with fewer items in the registry there may be less of an attack surface for intruders to poke at. No harm in cleaning up disk space and compacting things with a good defrag. If I'm going to work on an individual's computer I'll tell them I'm going to do some of this stuff, clean the registry, clean up disk space etc., and I make it a point that I'm doing this "for free," a complimentary service. They think they are getting something (worthwhile) for nothing. That always bolsters the bottom line in the long run. Happy customers are paying customers. Anything I can do to make them feel better about bringing their woes to me is worth my effort. These "myths" are nonetheless worth bringing up, but definitely not as a promise to speed up performance. They just make for a "healthier" system.

dl_wraith

So, we've had the obligatory "Windoze users are silly. Linux is bettar" comments but so far no Mac fanboys saying things like: "MacOS doesn't get infected", "MacOS doesn't need defragging", "MacOS doesn't need a registry", "MacOS manages RAM a lot better than Windows ever did", "MacOS is better than Windows". I'm disappointed, Mac users - the Linux crowd have had their dig; where are you when there's an opportunity to put the boot in to poor old Windows?

Obviously I'm being facetious, but just in case anyone takes me seriously I'll go and find my asbestos shirt. :) Don't get me wrong, I agree that Linux and MacOS have their advantages, but it always amuses me how much balefire is produced in the comments for articles like this. As far as I can see, this was never meant to be an in-depth article, merely a pointer to some commonly reproduced advice that may no longer be relevant to today's computing environment. Even then, the spirit of some of the author's comments seems to have been a little lost along the way.

To those of you with well-written and well-intentioned comments expanding on the author's points and clarifying things, bravo and thanks. To those of you who have gotten a little heated about this article: if you disagree with the author, why not write your own article and submit it to TR? There are many ways to cut the performance cake (as you clearly know) and there's always room for another take on the subject.

Thumper33

Even if we take your point at face value, that Vista/7 preloads programs (such as Word from your example) into RAM, how can you state it is not being "used"? If Windows has 6 GB of Word preloaded into RAM, what happens when you open Photoshop? It has to flush the RAM and then load Photoshop from scratch. So then it "preloads" Photoshop, but when you come back later, now you want Word. Well, flush and reload.

Also, you show a machine with more than 8 GB of RAM. Who in the world has that, other than computer techies and IT geeks? Sure, I have a machine with 12 GB of RAM; it's also a water-cooled, overclocked quad core running folding@home 24/7. Your standard user will not have more than 4 GB of RAM, and even that is ONLY because of Windows Vista and 7. Vista and 7 are amazing resource hogs, especially with everything new about them turned on (Aero, etc.). If you turn them down or get the base edition (with functionality hacked out of it as well as none of the features they advertised to sell it) then you might get away with 2 GB of RAM, but then, why are you upgrading to 7 if you are not going to use the features of 7?

Sure, Windows 7 may not seem to use a lot of RAM if you have 8 GB; you might not notice the flushing and reloading if you have that much and a fast dual-core CPU. However, again, the people with 2 GB of RAM will most certainly notice slower performance on a Vista/7 machine.

Spitfire_Sysop

I have a real quibble about the suggestion that triple channel memory is faster than dual channel memory. It is not. This same issue came up when they released dual channel memory, and the real-world testers found no measurable performance increase. The idea was that with the extra bandwidth there would be performance gains later on. Triple channel has proven this not to be the case. The increased bandwidth is actually increased overhead. People are seeing small performance reductions when comparing dual channel and triple channel setups on the same motherboard and the same RAM chips. Read it and weep: http://www.overclock.net/intel-memory/706703-triple-channel-memory-mode-faster-better.html

blarman

#1. Ummm... I've been using Windows since Windows 3.0, and each successive version has REQUIRED higher and higher levels of memory to run and run effectively. In general, 2x the "suggested" specs will give good performance. Yes, the OS is going to try and cache more in Windows 7 than it did in Windows XP, but I run both and don't notice any kind of significant difference in the two even when XP has 1/2 the RAM (2 GB vs 4 GB) because Windows 7 is ALSO trying to run Aero and a whole bunch of other services at the same time. So, in reality, it DOES require more RAM to run. PS - using an example PC with 8 GB of RAM? Get real.

#2 is true, but again, how many of us are running desktops with more than even 4 GB of RAM?

#3. This one depends. If you load up one of those complete suites like McAfee or Symantec, these WILL bog down your system compared to running a light AV package like Trend Micro AV + Ad-Aware/Spybot or Malware Bytes. I run the heavy package on my work computer (7) and the light package on my home computer (XP), and the boot times are significantly faster on my home computer.

#4. True. On yesterday's small hard drives, space mattered because it caused fragmentation. On today's drives, there is plenty of space to use to avoid fragmentation.

#5. This one is a definite maybe, not a myth, especially when it is talking about device drivers. And it's a simple fact that comparing the registry to a database is misleading at best. The registry is a linked-list, not a database, and has no management engine to regulate itself. The biggest problem with the registry is uninstalled programs not cleaning up after themselves.

#6. This one is generally a myth for the reasons specified by the author: few applications actually are coded to take advantage of multiple CPU's. But there is a difference between having multiple cores and having multiple CPU's. With multiple CPU's, there is actually I/O management overhead to control the bus and reads from memory, etc. With cores, there is only one bus, it's just that the processing can get split up on a hardware basis rather than a software basis. Most software isn't using lots of data in parallel, however, so the benefit of having multiple cores really isn't all that utilized by most programs.

#7 - 10. Seek time is the best indicator of drive performance, but drive RPM's are a large factor in seek time. Just the way these two are worded really blows out of proportion the difference here. And if performance is the primary driver, you can't beat solid state. Also keep in mind that ALL versions of RAID entail a management cost. Striping tends to minimize that cost and provide excellent data retrieval by distributing the load amongst the disks in the group. Mirroring maximizes the management cost but provides data security. Cleaning up your disk is again a matter of space - on today's huge hard drives, these are less and less of an issue. The real reason disk space is an issue is because of the lack of a dedicated swap area for Windows. A fragmented swap space can really drag down performance, which is why I always set a static swap file size in the system Environment area equal to my physical RAM size. That way that space is always allocated on boot and the OS never has to try and fight with the file system for space to use.

tamarow82

I read all of this and my next question is... what are your suggestions for increasing speed on your computer (e.g. web surfing, running multiple applications)? Also, I encourage my XP users to dump their temp files - c:\Windows\Temp, c:\Documents and Settings\username.domain\Local Settings\Temp and c:\Documents and Settings\username.domain\Local Settings\Temporary Internet Files - to make sure their computers stay free of infection. (Most often I find that viruses hide there as well as in the System Restore files, so I turn SR down and off.) What is your opinion on that?

fedm235

Note also that a large recycle bin, when full, takes longer to make room for new objects when that is required. A smaller recycle bin size is more quickly cleaned. Of course, if you set your bin to empty every time you log out, this is not an issue.

wdewey@cityofsalem.net

In my experience antivirus does impact performance, especially if there is not enough RAM. A couple of people I know run AVG Free, and after one of the updates their computers were horribly slow. They had about 256 MB RAM. After upgrading to 1 GB RAM their machines ran much faster.

I have also gotten numerous complaints of slow machines at my work. AV is scheduled to run at night, but some people will shut their machines off. This causes the antivirus to run a scan in the morning while they're trying to get work done, and it greatly impacts the performance of the machine. Simply having them reboot at night instead of shutting down stopped the complaints. Hooks may make AV run faster, but adding a file check process to every call is going to eat up CPU cycles. If you have a lot of unused CPU cycles this may not be noticeable, but if your machine is already maxed out then it can greatly impact performance.

Internet cache cleaning: Every time you open a web page your computer scans through the cache to determine if it needs to download something. If you have 10,000 items it's probably no big deal, but if you have 1,000,000 items it can take some time, which can impact performance. One item where I have verified this myself is Adobe Reader. I have some public-use computers and every once in a while Adobe Reader will take 5 to 10 minutes to open a document from the web. After emptying the temp folder and web cache, Acrobat documents open fine again.

Bill

M.R.

So what is Intel using as a metric when it assigns CPU numbers? According to the numbers, more cores = more performance. As mentioned, this depends on the OS and apps. If they aren't able to properly use multiple cores, then wouldn't you be better off with a cheaper CPU with fewer cores but a higher core frequency? Not to mention cache, etc.?

codepoke

You put your foot in the gluepot, of course, but great information. I've long suspected most of these, but it's nice to see them in print.

oldbaritone

If that were the case, #5 would be true. But the more important part of registry cleaning is the consistency check. When the registry is full of dead-end paths and pointers to nowhere, error routines have to spend time figuring out what to do. Often there is also a timeout before the error, taking several milliseconds more. Registry clean "tidies up" these loose ends, so that Windows can go directly to the proper handler for the function that needs to happen right now. It's the consistency check, not the database compaction, that is the important part of a registry clean.

Who Am I Really

XP Pro SP3 on a 2002 Compaq Presario 6330ca OEM system, config here: http://bizsupport1.austin.hp.com/bizsupport/TechSupport/Document.jsp?objectID=c00009597&lang=en&cc=us&taskId=101&prodSeriesId=213854&prodTypeId=12454

Current config:
- XP Pro SP3 (most updates except IE7, 8 & WiMP11)
- P4 2.0GHz
- 1GB DDR 400MHz (max the board / BIOS will allow)
- Intel 845 chipset
- Intel ICH4 2 x ATA-100 E-IDE channels
- Asus 128MB 8x AGP
- C:\ a 40GB partition of a 320GB WD Scorpio Blue 2.5" E-IDE 5,400 RPM HDD w/ 8MB cache
- D:\ 500GB E-IDE ATA-100 7200.9 Seagate Barracuda w/ 2MB cache
- E:\ 500GB E-IDE ATA-100 7200RPM WD Caviar AAJB w/ 8MB cache
- F:\ LiteOn 48x CD-ROM

The system is configured for performance with as much XP bloat turned off, disabled, or outright removed. The five biggest offenders:
- System Restore: Disabled
- Indexing Service: Disabled, and the catalog deleted etc.
- Autorun / Autoplay: disabled on all drive letters and types
- User Tracking: Disabled, and all related settings: "no recent doc history, no personalized menus, etc."
- Recycle Bin: Disabled, and the Recycled folder replaced with 2x 0 byte files named Recycled and Recycler (both must be present and have the attributes RHS applied, or the folder reappears with every file delete)

Using a 40GB partition on a 320GB 2.5" 5,400 RPM laptop drive as the boot drive (short stroked to 12.5% of total HDD size), I get < 45 seconds to the desktop, and that's including Ctrl+Alt+Delete and password entry. Also, using UPHClean, I get < 8 second shutdown time.

Note, this system in 2002 shipped with only 256MB RAM and XP Home w/ no Service Pack. Vista / Win7 shipped with no Service Pack on 1GB RAM systems - that's a 4x increase in default configurations. I guess that means "many times more RAM than XP". Have fun getting Vista or Win7 to start or even install on a system with only 256MB RAM. If I run the Vista or Win7 Upgrade Advisor on this 2002 system in its original config, the tool basically says: get lost, this system will have no chance of running Vista or Win7. Even in its current config, the Upgrade Advisor would still tell me the same: you need to upgrade the entire system, and Intel isn't supplying Vista or Win7 drivers for the 845 chipset. It's always been this way - minimum double the required hardware of previous Windows versions, as well as at least 1 - 3 years newer hardware than the OS release date. From Win3.1x/WFWG to Win7, double or more each time.

alan.t.kelly

Of all the problems I've had with PCs over the last decade or so, RAID has caused me the most hassle. Cheap RAID solutions (on the motherboard) in particular have caused me nightmares (entry-level HP servers). As the rest of the post seems aimed at relative novices, I would not recommend RAID. Instead, as a more manageable performance solution, I would recommend installing 2 hard drives, one with the OS installed and one with the user's files on. It is much easier to back up and restore in case of a hardware failure and should go quite a way in terms of a performance improvement.

TheNerdyNurse

So you've told me all the things that won't help Windows 7 speed up. Six months out of the box and my once speedy system is now crawling. So now that I know what won't help, I need to know what will.

Here2serveu

Sorry, this was just weak. Example: the registry. Pointers to invalid locations do cause slowdowns, and anti-virus will often monitor the registry, and other programs do too. Task Manager will never show these, and yet the whole system bogs down. Say you have a Lenovo system with its registry monitor, the Intel wireless registry monitor, and then you install Trend Micro. Talk about pain. Yes, AV/malware products can cause a major hit on performance. The OS is so weak that 3rd party vendors are trying to protect themselves from getting blamed, and in the end the conflicts make a mess. Just try it. You'll see. I fix windoze and Mac but live in Linux and BSD.

Tony Hopkinson

a machine, or an app on a machine. The real problem is they are optimisations, depending on the circumstances they might have no effect or make things worse. More cores is a classic. If the app isn't threaded (and probably parallelised), it will only use one at a time anyway. If it is and you don't give it enough cores, if poorly written it will spend more time switching threads than processing. Two hard drives on one controller, waste of time performance wise unless it has two caches... There ain't no free lunch, any time you optimise for X you are less than optimal on A to W, Y and Z.

Dukhalion

If You have like 30000+ files in your browser cache then Windows has to check all those files to see if You already have the required file, and if not, then it goes to the net and gets them. The initial checking takes a lot of time if You have too many files there. And usually You will want the latest updated pages, not old information from the cache, but Windows will still have to check all those cached files first. Keep the browser cache at a reasonable minimum size, no more than a thousand files. I once dealt with a computer that had become very slow, and it had over 100000 files in the cache. I deleted them all and yes, Windows DID fly. Most of those "myths" are myths from a specific narrow angle. Usually those "myths" are in fact true. Lots and lots of specialists have verified that.

jpdecesare

When you build a performance system (in my case, video editing), the speed rating and amount of RAM is crucial, and more is always better, especially because you'll buy RAM with fast timings and ease of overclocking, so bigger RAM can (not automatically "will") give you a noticeable performance boost. For the office person on a low-profile Dell, maybe not. The one that really blew me away was #5, that registry utilities offer no performance gain. That's correct and incorrect. Registry tools are just like chiropractors, you need to find the right one. Uniblue's RegistryBooster has done WONDERS for bloated machines that people have run for over a year. In the past I've always reimaged every year to start clean. No need to any longer, RB is a great chiropractor. The best part is it's not a subscription, you buy it and you're done. Just IMO.

silviustd

I'd like to see an opinion about regularly defragmenting the HDD.

M.R.

SSDs as a performance boost pretty much go without saying, although there may be some cheap ones with poor performance. Because they have not reached the size or price of standard HDs, they are not a small upgrade. Of course, neither is a RAID config. My experience with SSDs has not been so good. I'm finding a higher failure rate than standard HD drives, while having less pre-failure notification. Files become corrupt without HD error warnings, and diags always show a clean drive. Probably a growing-pain issue that will go away, but I'm not too excited about them yet at these prices.

jeslurkin

I too would like to see some positive info along this line. And thanks for the short-stroking info.

Tony Hopkinson

Depends on what you are doing, but our build machines (and devs) all have a folder to build object files to that we exclude from a certain big-name A/V, as it not only crippled performance, it was locking files as well, causing intermittent build failures. Lots of stuff to build, but we are talking about a 40% reduction in build time, just from doing that.

gechurch

I hate to break it to you, but if someone is reading your registry they don't need to do much attacking any more - they're already in.

AnsuGisalas

just feed your subords [i]performance cake[/i]. Quick, before the ATF catches on to it!

gechurch

1) Don't take it at face value. Google it instead. Prefetch is a good term to start with. (Spoiler - what he says is true!)

2) There's no easy way to show the true state of RAM usage (which is why Task Manager numbers sometimes seem contradictory). A page of RAM storing cached data is the classic example. How do you classify it? Is it in use, because there's data in there? Or is it free, because no program is referencing that page, and it can be wiped and used to store something else? The answer (according to Windows) is both; it's in use, but also available. Your whole question is kind of the author's point - don't be too bothered by high RAM usage numbers in Task Manager - those numbers don't mean that your computer needs that much RAM.

3) There are a couple of levels of prefetching. Windows does it at an OS level - it looks at which files get loaded after Windows starts, and it makes them contiguous on the hard drive, and loads them all in one read so they are already in physical RAM before the request comes through. The same thing happens for all applications (check out C:\Windows\Prefetch - this is where the prefetch data is stored).

4) "Flush and reload" is incorrect. Vista/7 make much better use of available RAM by holding on to cached data for much longer; it only gets rid of the cache if it needs the RAM for something else. So all the files Photoshop needs stay resident in RAM, and the same with Word.

5) Although not overly common now, there are plenty of machines being marketed to home users now with 8GB RAM. I often get users asking me about what they need for a home PC, and they have been told they need "at least 8GB RAM".

6) You are incorrect with your 2GB RAM number. Both Vista and 7 run well on 2GB RAM, even without all the bells and whistles turned off.

wdewey@cityofsalem.net

If your OS or application can't use multiple cores, then you are better off with a more powerful single core. I say "more powerful processor" because frequency isn't the only determining characteristic of a good processor. Bill

gechurch

I'm surprised to hear you've had trouble with RAID on HP servers. I've set up RAID on lowly ML110 "servers" (I can't call the ML110 G5s proper servers - they are just a desktop with a good PSU and onboard RAID). I've never had any issues at all. Presumably you have just been RAIDing the two drives many low-end servers come with, so have hopefully been using RAID1. If this is the case you shouldn't be expecting much performance increase since it's just a mirror (although you get some benefit on concurrent reads). I'm also surprised to read your last sentence. One logical drive should be easier to backup than two (not that it makes much of a difference either way). But on the restore side... well that's the whole point of RAID. If a drive dies you keep on running with no data loss... just replace the failed drive and let the software automatically rebuild onto the new drive.

AnsuGisalas

sysinternals autoruns can show you some stuff you don't need. Winpatrol is good for cross-checking. Taking care of the page file can help a lot, sysinternals have tools for that. Uninstalling crap programs you don't need REALLY helps, some of them load themselves for no reason. Also acrobat and other useful programs have "speed load" programs that load themselves but are not necessary (especially if the system is running well, because it's not wearing ten pairs of quick-load suspenders). And of course, you want to make sure you're not botted - MalwareBytes Anti-Malware (MBAM) provides a free off-line scanner which will find a lot of hard-to-find critters. To make certain, you can run Combofix from www.bleepingcomputer.com - but read the documentation first, it's a hardcore tool which doesn't like to be disturbed while working. The nice folks at bleepingcomputer forums can help you make sense of the scan logs if something is found. Combofix is specifically good at finding rootkits, something that most versions of commercial AV cannot. Defragmenting helps, and so does realigning the files (defragging so that the files line up according to frequency of use). Finally, don't fill up your hard drives. Leave them half-empty, otherwise your file system will have to spend a lot of time to keep itself sane.

gechurch

Of course Task Manager will "never show these"! Task Manager shows you the list of programs running, pointers to invalid locations can't run! I agree that there are still some terribly slow AV programs out there (although many are fast these days). Trend Micro, as of last time I saw it (1 year ago) was one of the worst.

gechurch

IE takes the same amount of time to check the cache whether there's one file in there or 100,000. If you think about it you will see the way you think the cache check works isn't even possible. Even if IE did open every file in its cache and read its contents, how could it possibly know whether that file is the one being served by the web site it's about to visit? There has to be a link, and that link is index.dat. It maps URIs (like http://www.site.com/logo.jpg) to a file in the cache (like A782HFB.jpg).

fedm235

I follow what Dukhalion says and keep my browser cache set smaller. The default cache size in browsers seems large, since each object is tested against the web site for whether it is current or not. This is a performance hit, where you balance disk access and net access, and the size of the objects. (Web sites are now using more images and larger objects than they did when networks were slower.) I looked at the access dates on objects in my cache over a week and determined that 8 to 10 MB is large enough for my needs, not the 20 to 40 MB that browsers default to.

Brian Doe

I wish Windows could be made to use EXT3 or EXT4 filesystems, as these don't require defragging. However, since this isn't reality, my advice for Windows users is to set your built-in defragmentation program to automatically defrag your drives once a week during off-peak hours (such as some time after midnight on a Sunday morning, if your computer supports Wake on Event). Defragging more often than this is not only a waste of time for little noticeable gain, but could actually shorten your hard drive's life. Important note: If you have any SSDs installed, DO NOT defrag these drives! Not only is it completely unnecessary, but you will SIGNIFICANTLY shorten the SSD's life!

gechurch

NTFS is pretty darn good... it doesn't get anywhere near as fragmented as FAT32. And out of the box, Windows does exactly what you recommend - it schedules a defrag for 1am every Wednesday night.