I have an SFTP service running on an Ubuntu server. It now holds many files and takes up too much space, for example:
-rwxrwxrwx 1 systemd-network systemd-journal 7190 Jul 1 2020 'document_1.xlsx'
-rw-r--r-- 1 systemd-network systemd-journal 1606 Jul 1 2021 'document_1.csv'
-rw-r--r-- 1 systemd-network systemd-journal 7191 Jul 1 2021 'document_2.xlsx'
-rw-r--r-- 1 systemd-network systemd-journal 1606 Jul 1 03:10 'document_2.csv'
-rw-r--r-- 1 systemd-network systemd-journal 7191 Jul 1 03:10 'document_3.xlsx'
-rwxrwxrwx 1 systemd-network systemd-journal 1606 Aug 1 2020 'document_3.csv'
-rwxrwxrwx 1 systemd-network systemd-journal 7190 Aug 1 2020 'document_4.xlsx'
-rw-r--r-- 1 systemd-network systemd-journal 1606 Aug 1 2021 'document_4.csv'
Now I want to optimize the server's disk space, but I can't simply delete those files. If possible, can I compress them by file type and/or by month/year and/or by modification/creation date, and then delete the originals?
Example:
document_2021.gzip
document_2021_csv.gzip
CodePudding user response:
You gave multiple options for a solution, so here's a simple one for the first option: compressing files by file type. Of course, please test any commands on sample files before running them on your important ones, and always have a backup before doing anything like this. But you already knew that. You can compress each file of a given type using the following command, run from the directory you want to work on:
find ./ -type f -name "*.xlsx" -exec gzip {} \;
Depending on how your shell is set up, you may need to escape the terminator as \; so it is passed to find rather than interpreted by the shell. The result compresses each matching file in place, producing document_1.xlsx.gz, document_2.xlsx.gz, and so on.
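For the month/year option you also mentioned (e.g. document_2021_csv.gzip), note that gzip alone only compresses single files; to bundle several files of one year into a single archive you need tar plus gzip. Below is a rough sketch, assuming GNU find and GNU tar, that "year" means the modification year, and that the archive name document_<year>_<ext>.tar.gz is acceptable in place of your example name. Test it on copies first.

for ext in csv xlsx; do
    # List the distinct modification years present for this extension
    for year in $(find . -maxdepth 1 -type f -name "*.$ext" -printf '%TY\n' | sort -u); do
        # Bundle that year's files into one archive, e.g. document_2021_csv.tar.gz
        find . -maxdepth 1 -type f -name "*.$ext" \
            -newermt "${year}-01-01" ! -newermt "$(( year + 1 ))-01-01" -print0 |
            tar --null -czf "document_${year}_${ext}.tar.gz" --files-from=-
    done
done

Once you have verified an archive's contents with, for example, tar -tzf document_2021_csv.tar.gz, you can remove the originals for that year by re-running the same find expression with -delete in place of -print0.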