Store output of "ls" command to file while ignoring any errors

Time:09-17

So I'm running a command via ssh on a remote server, and I need to store all of its output, but it stops as soon as it encounters an error like ls: cannot access '/downloads/something.txt': No such file or directory. Is there a way to keep the process going while ignoring any errors? It doesn't matter if I'm not even notified that errors occurred, because I know for a fact that these are only a handful of files (less than 100 MB out of a total of 16 TB).

Here is my command:

find /downloads/ -type d ! -writable -print0 | xargs -0 -I{.} find {.} -maxdepth 1 -type f | xargs -I% ls -l % > myfile.txt

What I'm trying to achieve with this command is:

  1. Find (recursively) all directories that are NOT writable by me
  2. List the files which are the immediate children of these non-writable directories
  3. Get a long-listing format for these files using ls -l <filename>
  4. Store this output in a file so that later I can run a python script on it to get the total size of all identified files
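For step 4, the size-totaling step could equally be done in the shell with awk rather than a separate python script (a sketch, assuming the default ls -l column layout, where the size in bytes is the fifth field; myfile.txt is the output file produced by the command above):

```shell
# Sum the size column (field 5) of ls -l output stored in myfile.txt.
# Assumes the default ls -l layout; unusual locales can shift the columns.
awk '{ total += $5 } END { printf "%d bytes total\n", total }' myfile.txt
```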

The reason I'm doing this is to identify all files on a shared server that would NOT be deletable by me if I tried. Please also let me know if there's a better way to achieve this goal...

Thanks a lot!

CodePudding user response:

You should not be getting errors like ls: cannot access '/downloads/something.txt': No such file or directory, since you're using find to generate the input to ls. The reason you're getting them is that your command is missing the -print0 and -0 options needed to pass filenames containing spaces or other special characters safely between find and xargs. That said, you should probably also redirect the stderr of the first two piped commands to /dev/null in case you hit any Permission denied errors. After those changes, your command should look like this:

find /downloads -type d ! -writable -print0 2>/dev/null | xargs -0 -I{.} find {.} -maxdepth 1 -type f -print0 2>/dev/null | xargs -0 -I % ls -l % > myfile.txt

This command should generate no errors, but it may take a while if it has to scan 16 TB.
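To see why the missing -print0/-0 caused those "No such file or directory" messages, here is a small reproduction (a sketch using a throwaway /tmp directory): without -0, xargs splits a filename containing a space into two bogus arguments.

```shell
# Create a file whose name contains a space
mkdir -p /tmp/xargs-demo && touch '/tmp/xargs-demo/my file.txt'

# Broken: xargs splits on the space, so ls looks for two nonexistent files
# (on GNU ls: cannot access '/tmp/xargs-demo/my': No such file or directory)
find /tmp/xargs-demo -type f | xargs ls -l

# Fixed: NUL-delimited names survive intact
find /tmp/xargs-demo -type f -print0 | xargs -0 ls -l
```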

You may also want to try the command below to see if it is faster:

find /downloads -type d ! -writable -exec find {} -maxdepth 1 -type f -print0 \; 2>/dev/null | xargs -0 -I % ls -l % > myfile.txt
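If the server has GNU findutils (an assumption worth checking with find --version), you can skip ls entirely and have find print each file's size and path itself, which avoids spawning ls per file and sidesteps the quoting problem altogether:

```shell
# GNU find only: %s is the file size in bytes, %p the path
find /downloads -type d ! -writable \
    -exec find {} -maxdepth 1 -type f -printf '%s\t%p\n' \; 2>/dev/null \
    > myfile.txt

# Total the first (size) column
awk -F'\t' '{ total += $1 } END { print total }' myfile.txt
```

With sizes already in the first column, the later totaling step no longer needs to parse ls -l output at all.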