The days of limited hard drive space may seem to be over, since drives that hold huge amounts of data are available today at affordable prices. Still, if you feel you do not need a higher-capacity drive and yet your hard drive keeps filling up, it is probably a sign that you have lots of junk data stored on your system. This is exactly my case, and I do need to clean up regularly. I prefer to delete data by size, so finding the biggest directory under my home directory is the first step.
Here are a couple of bash commands I use for my data clean-up routine:
This command lists the size of each file and directory in your current working directory. The -s option summarizes each entry on a single line; remove it to get recursive output for every subdirectory. The -h option prints sizes in human-readable units.
$ du -sh *
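If you only care about directories, note that a shell glob ending in a slash matches directories only. A minimal sketch:

```shell
# The trailing slash in */ makes the glob match directories only,
# so plain files in the current directory are skipped.
du -sh */
```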
The previous command is not very useful on its own: it does not sort, and it prints many files we are not concerned about. The following command prints every directory in the current working directory, includes its size in MB, and sorts from biggest to smallest.
$ du -m --max-depth 1 | sort -rn
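If you prefer human-readable units instead of raw megabytes, GNU sort can order du -h output with its own -h flag, which understands the K, M, and G suffixes that du -h produces:

```shell
# -h on du prints sizes like 1.2G or 340M; -h on sort
# compares those human-readable numbers correctly.
du -h --max-depth 1 | sort -rh
```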
Our data clean-up day may end after cleaning the first 10 biggest directories we find, so we may only be interested in those. To find them, let's reuse the previous command and add head. Note head -11 rather than -10: after sorting, the first line is du's grand total for the current directory itself.
$ du -m --max-depth 1 | sort -rn | head -11
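The grand-total line for the directory itself always sorts to the top. If you want exactly the 10 biggest subdirectories without it, one option is to skip the first line with tail:

```shell
# tail -n +2 starts printing from the second line, dropping
# the grand-total entry that sort -rn places first.
du -m --max-depth 1 | sort -rn | tail -n +2 | head -10
```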
Using the commands above we can create a bash script to make our life easier. The following bash script accepts 2 arguments: the first argument is the directory in which to start the search, and the second argument is the number of directories the script should output.
#!/bin/bash

if [ $# != 2 ]; then
    echo "Incorrect number of arguments !" >&2
    echo "USAGE: sort-dir-by-size.sh [DIRECTORY] <first n directories>" >&2
    exit 1
fi

du --block-size=1M --max-depth 1 "$1" | sort -rn | head -"$2"
$ ./sort-dir-by-size.sh /home/linux 15
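If the script is not yet executable, a chmod step is needed before invoking it directly (this assumes you saved it as sort-dir-by-size.sh):

```shell
# Mark the script executable so it can be run as ./sort-dir-by-size.sh
chmod +x sort-dir-by-size.sh
```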