deleting files and folders in a specified directory


#!/usr/bin/perl

# purge backups older than 210 days

use strict;
use warnings;

use File::Find::Rule;
use File::Path 'rmtree'; # listed directory has files and folders

# to delete files and folders in the specified directory age > 210 days

my $dir = '/volume1/Backup01/*/Archived_files/';
my $days = 210;

# Do i need to input something like @folder = File::Path *** ??

my @files = File::Find::Rule->file()
                            ->maxdepth(1) # maxdepth(0) will allow me to delete files in subdirectories as well?
                            ->in($dir);

# How can I make a for loop to look for folders whose -M > 210 and allow me to delete?
                            
for my $file (@files){
    if (-M $file > 210){
        unlink $file or warn $!;
    }
}

I've included comments for what I need... the background is purging old files on a NAS server, and I'm quite lost at the moment as to how to safely purge a few thousand files and folders >.<

CodePudding user response:

Why use Perl for such an easy task?
You can use a simple UNIX find instead, as you see here:

find ./ -mtime +210 -delete

In case this does not work (some find versions don't have the -delete switch), you can still use the following:

find ./ -mtime +210 -exec rm -rf {} \;

In case you don't want to go into subdirectories, there's the -maxdepth option:

find ./ -maxdepth 1 -mtime +210 -delete

It might also be that you only want to delete files, not directories, which you can specify with -type f:

find ./ -type f -mtime +210 -delete
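
If you would rather keep everything inside the Perl script (for logging or a dry run), one option is simply to shell out to find from Perl. A minimal sketch, assuming GNU find and reusing the glob path from the question:

#!/usr/bin/perl
use strict;
use warnings;

# Sketch only: drive find(1) from Perl instead of reimplementing it.
# Assumes GNU find (-mindepth/-delete); the glob is the path from the
# question and may need adjusting.
my $days = 210;

for my $dir (glob '/volume1/Backup01/*/Archived_files') {
    # -mtime +N matches entries last modified more than N days ago
    system('find', $dir, '-mindepth', '1', '-type', 'f',
           '-mtime', "+$days", '-delete') == 0
        or warn "find failed for $dir: $?";
}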

CodePudding user response:

use warnings;
use strict;
use feature 'say';

use File::Find::Rule;    
use FindBin qw($RealBin);

my $dir = shift // $RealBin;   # start from this directory

my @entries = File::Find::Rule -> new 
    -> exec( sub { -M $_[2] > 210 } ) 
    -> in($dir); 

say for @entries; 

This finds all entries (files and directories), recursively, that are older than 210 days.

Now go through the list and delete. Directories can again be identified with the -X filetests (-d here), and it may be a good idea to first delete non-directories and then the (now empty) directories. Or use rmtree as intended on the directories and skip the files that would have been inside them.


For example, something like

if ($entries[0] eq $dir) { 
    say "Remove from this list the top-level dir itself, $entries[0]";
    shift @entries;
}

my $del_dir;

for my $entry (@entries) {
    if (-d $entry) {
        $del_dir = $entry;
        say "remove $entry";  # rmtree
    }    
    # Skip files other than the ones at the top level
    elsif ($del_dir and $entry =~ /^$del_dir/) {
        say "\tskip $entry";  # its directory is removed
    }
    else { 
        say "Remove top-level file: $entry";  # unlink
    }
}

Note -- this has not been fully tested.

Note -- make sure not to remove the top-level directory in which you start the search!
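
Putting the pieces together, a minimal sketch of the actual deletion pass, using @entries and $dir from the snippet above and File::Path's rmtree. It assumes the list comes back parent-before-child, as File::Find::Rule returns it, and like the example it is untested, so run the say version first on real data:

use File::Path 'rmtree';

my $del_dir;

for my $entry (@entries) {
    next if $entry eq $dir;   # never remove the starting directory itself

    if (-d $entry) {
        $del_dir = $entry;
        rmtree($entry, { error => \my $err });   # remove directory and its contents
        warn "Problems removing $entry\n" if $err && @$err;
    }
    elsif ($del_dir and $entry =~ /^\Q$del_dir\E/) {
        next;   # this file was inside a directory that rmtree already removed
    }
    else {
        unlink $entry or warn "Could not unlink $entry: $!";
    }
}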
