
Quickly find space-hogging files with PowerShell

A key Windows maintenance chore is determining what data is consuming disk space. Here are a couple of scripts that make it easy to get that critical file information.

Few things are more frustrating than knowing you need to address a low drive space issue but not knowing where to look. Windows offers no quick way to total up file sizes on an NTFS volume, much less to show you where the largest consumers of storage are located. Fortunately, PowerShell -- included with Windows 7 and Windows Server 2008 R2 -- can help the Windows Server (and client) administrator locate the areas of highest disk consumption.

For most low drive space issues, determining what files are over a certain size will be most helpful. You can easily perform this task using a simple PowerShell script created with PowerGUI. Just download the script and save it as a .PS1 file to run in PowerShell on your PC.
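The download itself is not reproduced here, but a rough sketch of the approach appears below. It assumes the script prompts for a path, size, and unit of measure with Read-Host and caches them in the $global:fdir, $global:fsize, and $global:fmz variables referenced by the reset snippet later in this article; the actual download may differ in its prompts and output formatting.

# Sketch only: prompt for values if they are not already set
if (-not $global:fdir) { $global:fdir = Read-Host "Directory to scan (local path or UNC, e.g. \\server\share)" }
if (-not $global:fsize) { $global:fsize = [int](Read-Host "Minimum file size") }
if (-not $global:fmz) { $global:fmz = Read-Host "Unit of measure (KB, MB or GB)" }

# Convert the size and unit to a byte threshold
switch ($global:fmz) {
    "GB" { $threshold = $global:fsize * 1GB }
    "MB" { $threshold = $global:fsize * 1MB }
    default { $threshold = $global:fsize * 1KB }
}

# Walk the path recursively, keep only files over the threshold, largest first
$largefiles = Get-ChildItem -Path $global:fdir -Recurse -ErrorAction SilentlyContinue |
    Where-Object { -not $_.PSIsContainer -and $_.Length -gt $threshold } |
    Sort-Object Length -Descending

$largefiles | Format-Table FullName, @{Label="Size (MB)"; Expression={"{0:N1}" -f ($_.Length / 1MB)}} -AutoSize

Because the values are cached in global variables, rerunning the .PS1 file in the same PowerShell session repeats the same query until those variables are cleared.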

This script queries a path either on a local disk or on a remote system's UNC path to see what files exist over a specified size. Figure A shows this script run on a remote server named DB1 for files over 100 MB in the E$ path:

Figure A

The script returns the SQL Server 2008 installation executable that was left on disk, a common occurrence that can easily be remediated by deleting the file or by storing that type of data centrally. After that bit of housekeeping is done on the queried path, you can rerun the .PS1 file to confirm whether any files over the threshold remain on disk. If you want to rerun the query against a new system or with a new file size threshold, you can clear out the cached values (path, size, unit of measure) by running this PowerShell script (also included in our download):

$global:fdir = ""

$global:fmz = ""

$global:fsize = ""

At that point, the first script can be rerun with new values for path, size, and unit of measure. You can use the results to resolve problems immediately, or to notify desktop users and application owners about their large disk consumption.
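For example, if the two scripts were saved as get-largefiles.ps1 and clear-values.ps1 (placeholder names; use whatever names you gave the downloaded files), a second pass against another server could look like this:

.\clear-values.ps1

.\get-largefiles.ps1

The first command resets $global:fdir, $global:fsize, and $global:fmz; the second then prompts for fresh values (assuming the download prompts as sketched earlier), such as \\FS1\D$ for files over 500 MB. The server name and threshold here are examples only.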

The day-to-day process of ensuring responsible disk consumption ends up on the Windows admin's shoulders. Do you use PowerShell to manage disk consumption? Share your scripts, tricks, and strategies below.


About

Rick Vanover is a software strategy specialist for Veeam Software, based in Columbus, Ohio. Rick has years of IT experience and focuses on virtualization, Windows-based server administration, and system hardware.

Comments
chris.van.engelen

Why use that extremely complicated parameter processing, when there is perfectly reasonable parameter processing built into PowerShell? Why use the test on "FileInfo" when there is a Test-Path cmdlet with a PathType parameter? So I adapted the script as follows, leaving out the trap:

function Get-LargeFiles {
    Param (
        [Parameter( Position = 0, HelpMessage = "Enter the full path to a directory to examine for large files (default C:\)" )]
        [string]$Path = "C:\",
        [Parameter( Position = 1, HelpMessage = "Enter size (in units as given, default is 1)" )]
        [int]$Size = 1,
        [Parameter( Position = 2, HelpMessage = "Enter file size units: KB, MB or GB (default is KB)" )]
        [string]$Units = "KB"
    )
    switch ( $Units ) {
        "MB" { $Size = $Size * 1MB }
        "GB" { $Size = $Size * 1GB }
        default { $Size = $Size * 1KB }
    }
    Get-ChildItem -Path $Path -Recurse |
        Where-Object { Test-Path -Path $_.PSPath -PathType Leaf } |
        Where-Object { $_.Length -gt $Size } |
        Sort-Object -Property Length -Descending
}

Always keep it as simple as possible.

rgodbey

What can be entered for "Measurement"?

KiloWatt1975

As someone who makes uncompressed HD files, this is a sweet chance to help locate duplicates, etc., for 100 GB files. I try to remember to delete uncompressed files after encoding, but I can slip and forget while a Blu-ray file is being encoded for 8-10 hours. Thanks for the info.

Wave_Sailor

I have been looking for something like this, but I use Windows XP. I have PowerShell installed, but I am unable to get this to run. Any ideas?

netwacker

It's probably not ideal for humongous systems, but for examples like the one above, it's easier and more comprehensive. Zillions of small files can be the problem, and that's easier to see with WinDirStat's presentation.

ReynoldsJr

It also helps if you have a text file version of the screen output. Add this line after the $largefiles assignment: $largefiles | Out-File c:\largefiles.txt

rgodbey

Sorry, I found it; it was just locking up or sitting there.

bendict101

OverDisk has a better visual representation of disk space than all of the ones mentioned here. An oldie but a goodie. P.S. It very rarely crashes.

b4real

Save the text file download with a .PS1 extension. You probably also need the execution policy set; in PowerShell, run "Set-ExecutionPolicy Unrestricted". Then run the .PS1 file as .\filename.ps1

ScienceMikey

I've tried them both. It looks like WinDirStat is based on SequoiaView, but with one BIG advantage--color--and other advantages, including more flexibility and better progress reporting. ~~ ScienceMikey

iansavell

Provides a nice sorted bar chart of folder sizes which you can then drill down into - shows space hogs in no time, even if they are folders of millions of small files. Works for me on a server with around 1 TB of data, but it does take a couple of minutes to scan.

elomega

This program provides a great visual if you can map a drive to the directory.

sosummy

I've used SequoiaView for years and it's been a lifesaver... and it's free. It makes for a quick and easy way to find large files, or even clusters of little files where log files have gone crazy.
