Delete all but the most recent files in Bash
I’ve been reviewing a few things I do and decided I need to be smarter about managing backups. I currently purge by date only, which is fine as long as everything is working and checked regularly. But I wouldn’t want to return from a two-week holiday to find my backups had been failing, nobody had noticed, and the purge job had carried on running happily.
Here’s what I came up with to try to solve the problem…
cd /path/to/backup/location && f="backup_pattern*.sql.gz" && [ $(find ${f} -type f | wc -l) -gt 14 ] && find ${f} -type f -mtime +14 -delete 2>/dev/null
Breaking this down…
cd /path/to/backup/location - change into the backup directory.
f="backup_pattern*.sql.gz" - store the pattern that matches the backup files in a variable.
[ $(find ${f} -type f | wc -l) -gt 14 ] - returns true if more than 14 backups are found; otherwise it returns false and the chain stops here.
find ${f} -type f -mtime +14 -delete 2>/dev/null - delete files older than 14 days and send any error output to /dev/null.
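For readability, the same logic can also be laid out as a small script. This is just a sketch of the one-liner above, assuming the same placeholder path and backup_pattern*.sql.gz naming; adjust both to suit:

#!/usr/bin/env bash
# Purge old backups, but only if we actually have a healthy number of them.
cd /path/to/backup/location || exit 1

# Pattern left unquoted below on purpose so the shell expands the glob.
f="backup_pattern*.sql.gz"

# Count matching backup files; errors (e.g. no matches) are discarded.
count=$(find ${f} -type f 2>/dev/null | wc -l)

# Only purge when more than 14 backups exist, so a silently failing
# backup job can't leave the directory empty.
if [ "$count" -gt 14 ]; then
    find ${f} -type f -mtime +14 -delete 2>/dev/null
fi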
This approach relies on the && (AND) operator: each command only runs if the one before it succeeded, so the delete never happens unless more than 14 backups are present. There’s a lot of good discussion on the web about tackling this problem.
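One alternative that often comes up in those discussions is to keep the newest N files regardless of age, rather than purging by date. A rough sketch of that idea, assuming plain file names (no spaces or newlines) and GNU xargs (-r skips the rm when there is nothing to delete):

# Keep only the 14 most recent backups, whatever their age.
# ls -t sorts newest first; tail -n +15 lists everything after the first 14.
cd /path/to/backup/location && ls -t backup_pattern*.sql.gz | tail -n +15 | xargs -r rm --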