Cleaning up old files automatically (IIS logs, Temp files, etc)

This is a simple utility script I wrote a while ago, after several instances of finding IIS servers that had been configured to use the system drive for logging and were almost out of storage as logs accumulated. I thought it would be useful to have a generalised solution to the problem of working directories which fill up with old files, and particularly when things are busy it’s good to have something that consistently manages this so that servers require less babysitting. (Regular monitoring of storage capacity is also a good idea, of course!)

The script is intended to be run from a scheduled task, with the task action set to run powershell.exe and the arguments to the action being in the form “-File Remove-OldFiles.ps1 -directorypath C:\Temp -maxagedays 30”.
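If you'd rather create the task from PowerShell than click through the Task Scheduler GUI, something along these lines should do it (the script path, schedule and run-as account here are examples, not part of the script itself):

```powershell
# Sketch: register a daily cleanup task (path, time and account are examples)
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-NoProfile -File C:\Scripts\Remove-OldFiles.ps1 -directorypath C:\Temp -maxagedays 30'
$trigger = New-ScheduledTaskTrigger -Daily -At 3am
Register-ScheduledTask -TaskName 'Remove-OldFiles' -Action $action -Trigger $trigger `
    -User 'SYSTEM' -RunLevel Highest
```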

The script itself is as follows:

param(
	[Parameter(Mandatory=$true)]  
	[String]$directorypath,
	[Parameter(Mandatory=$true)]
	[int]$maxagedays
)

# Define function
function Remove-OldFiles {
	param(
		[Parameter(Mandatory=$true)]  
		[String]$directorypath,
		[Parameter(Mandatory=$true)]
		[int]$maxagedays
	)
	# Verify that tmp directory exists, create if not
	if (!(Test-Path "C:\tmp")) {
		New-Item -ItemType Directory -Path "C:\tmp" | Out-Null
	}
	$logfile = "C:\tmp\$(Get-Date -Format 'yyyy-MM-dd')_Remove_OldFiles.log"
	[string](Get-Date -Format "yyyy-MM-dd HH:mm")+": Starting old file cleanup" | Out-File -filepath $logfile
	$cutoffdate = (Get-Date).AddDays(-$maxagedays)
	if (Test-Path $directorypath) {
		# Gather deletion candidates, recursing so that files in subdirectories are included
		$delcands = Get-ChildItem -Path $directorypath -Recurse | Where-Object {$_.LastWriteTime -lt $cutoffdate}
		[string](Get-Date -Format "yyyy-MM-dd HH:mm")+": Deletion candidates are as follows:" | Out-File -filepath $logfile -append
		$delcands | Out-File -filepath $logfile -append
		# Sort longest paths first so files are removed before their parent directories
		$delcands = $delcands | Sort-Object -Property { $_.FullName.Length } -Descending
		foreach ($file in $delcands) {
			try {
				Remove-Item -Path $file.FullName -Force -ErrorAction Stop
			} catch {
				"Unable to delete file $($file.Fullname). Reason was $($_.Exception.Message)" | Out-File -filepath $logfile -append
			}
		}
	} else {
		"Specified target directory $($directorypath) could not be found!" | Out-File -filepath $logfile -append
	}
	[string](Get-Date -Format "yyyy-MM-dd HH:mm")+": Old file cleanup complete." | Out-File -filepath $logfile -append
}

# Invoke function

Remove-OldFiles -directorypath $directorypath -maxagedays $maxagedays

I’ve written the code as a function and implemented the script as a parametrised definition and invocation of the function, because this means it’ll be easier to re-use elsewhere if I need it.

The named parameters are passed into the script and then into the function, which does the following:

  1. Checks for the existence of “C:\tmp” and creates it if necessary.
  2. Creates a logfile in C:\tmp named with the datestamp (in an alphabetically sortable format) and a suffix indicating the name of the script generating the log.
  3. Sets a cutoff date based on the current timestamp and the specified maximum file age.
  4. Tests for the existence of the target directory.
  5. If found, iterates through the target directory and populates an array with deletion candidates.
  6. Outputs the filenames of the deletion candidates to the logfile.
  7. Iterates through the deletion candidates, starting with the longest paths first so that files within directories are deleted before the directories themselves. (This may have been fixed by now, but Remove-Item’s handling of non-empty directories has been a long-standing bug for the entirety of the time I have been using PowerShell.)
  8. Notes in the logfile any file that cannot be deleted.

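To illustrate the depth-first ordering in step 7: a file’s full path is always longer than the path of the directory that contains it, so sorting paths by length in descending order guarantees children come before their parents. A quick demonstration with some example paths:

```powershell
# Illustration only: child paths are longer than their parents' paths,
# so a descending length sort yields files before their directories
$paths = 'C:\Temp\Logs', 'C:\Temp\Logs\old.log', 'C:\Temp'
$paths | Sort-Object -Property { $_.Length } -Descending
# C:\Temp\Logs\old.log, then C:\Temp\Logs, then C:\Temp
```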
There’s nothing more that this script really needs to do. If I were particularly concerned about the success or failure of each run, it would be easy enough to make it email our service desk mailbox and generate a ticket.
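For instance, a final step along these lines could mail the day’s logfile to the service desk (the SMTP server and addresses here are placeholders, and Send-MailMessage assumes a mail relay that accepts unauthenticated internal mail):

```powershell
# Sketch: notify the service desk on completion (server and addresses are placeholders)
Send-MailMessage -SmtpServer 'smtp.example.com' `
    -From 'cleanup@example.com' -To 'servicedesk@example.com' `
    -Subject "Old file cleanup on $env:COMPUTERNAME" `
    -Body (Get-Content -Path $logfile -Raw)
```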
