# find greatest savings after fdupes run
 
Using fdupes to report on duplicate files, or even remove them automatically, is excellent. However, working on just a subset of the files can often be done in 10% of the time and yield 90% of the cleanup.
 
fdupesGreatestSavings is a Perl script that takes the output of fdupes (on stdin) and determines which entries will result in the greatest savings, whether that is 100 copies of a 1 MB file or 2 copies of a 20 GB file. The output is sorted from greatest savings to least.
 
fdupesGreatestSavings takes one parameter, the number of entries to display. It accepts input from stdin and sends output to stdout, so it is a filter.
 
> fdupes must be run with only the flags --recurse and --size, though
> --recurse is optional.
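
The script itself is Perl, but its core logic can be sketched in Python. This is a rough sketch, not the actual script: it assumes the output format produced by `fdupes --size`, in which each duplicate group begins with a header such as `1048576 bytes each:`, followed by one file path per line, with groups separated by blank lines. The function name and sample paths are illustrative.

```python
def greatest_savings(lines, top_n):
    """Parse fdupes --size output and rank duplicate groups by recoverable space.

    Savings for a group = size * (copies - 1), since one copy must be kept.
    Returns up to top_n (savings, size, files) tuples, greatest savings first.
    """
    groups = []
    size, files = 0, []
    for raw in lines:
        line = raw.rstrip("\n")
        if line.endswith("bytes each:"):   # group header carrying the file size
            size = int(line.split()[0])
            files = []
        elif line:                         # a path belonging to the current group
            files.append(line)
        elif files:                        # blank line closes the group
            groups.append((size * (len(files) - 1), size, files))
            files = []
    if files:                              # final group without a trailing blank line
        groups.append((size * (len(files) - 1), size, files))
    groups.sort(key=lambda g: g[0], reverse=True)
    return groups[:top_n]


# Two duplicate groups: 2 copies of a 5-byte file, 3 copies of a 1000-byte file.
sample = """5 bytes each:
/tmp/a
/tmp/b

1000 bytes each:
/srv/x
/srv/y
/srv/z
"""
for savings, size, files in greatest_savings(sample.splitlines(), 10):
    print(f"{savings} bytes recoverable: {len(files)} copies of {size} bytes")
```

Note that the 1000-byte group ranks first even though the 5-byte group appears first in the input: ranking is by total recoverable space, not file size or position.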
 
The following command will look through the entire file system on a Unix machine and report the 10 duplicate sets with the greatest potential savings.
 
    fdupes --recurse --size / | ./fdupesGreatestSavings 10
 
If you want to save the results (for other processing), you could do something like:
 
    fdupes --recurse --size  /path/to/be/checked > /tmp/duplicate_files
    fdupesGreatestSavings 100 < /tmp/duplicate_files > /tmp/fdupe.savings
 
#### Downloading
 
The script is available via Subversion:
    svn co http://svn.dailydata.net/svn/sysadmin_scripts/trunk/fdupes
 
#### Bugs
 
The only bug I have found so far is that the file count reported for each entry is incorrect (always 2), and I haven't tracked it down. The total space used by an entry is correct, however.