How to move files using the result as condition after grep command

Time:10-27

I have two files whose names I need to grep for in a separate file.

The two files are in this directory /var/list

TB.1234.txt
TB.135325.txt

I have to grep for them in another file in a different directory, /var/sup/. I used the command below:

for i in TB.*; do grep "$i" /var/sup/logs.txt; done

What I want to do is: if the result of the grep command contains the word "ERROR", the matching file found in /var/list should be moved to another directory, /var/last.

For example, I grep for the file name TB.1234.txt in /var/sup/logs.txt and the result looks like this:

ERROR: TB.1234.txt

Then TB.1234.txt should be moved to /var/last.

Please help. I don't know how to construct the logic for moving the files; I'm stuck with what I have above. I also tried using two greps in a for loop, but I ran into an error.

I am new to coding and really appreciate any help and suggestions. Thank you so much.

CodePudding user response:

If you are asking how to move files which contain "ERROR", this should be extremely straightforward.

for file in TB.*; do
    grep -q 'ERROR' "$file" &&
    mv "$file" /var/last/
done

The notation this && that is a convenient shorthand for

if this; then
    that
fi

The -q option to grep tells it not to print the matches, and to quit as soon as it finds one. Like all well-defined commands, grep sets its exit code to reflect whether it succeeded. (The status is visible in $?, but usually you would not examine it directly; perhaps see also Why is testing "$?" to see if a command succeeded or not, an anti-pattern?)
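As a self-contained illustration of the pattern above, here is a sketch that builds throwaway files in a temporary directory (TB.1.txt and TB.2.txt are stand-ins, not your real files) and moves only the one containing "ERROR":

```shell
# Illustration of `grep -q … && mv …` in a throwaway directory.
tmp=$(mktemp -d)
mkdir "$tmp/last"
printf 'ERROR: something failed\n' > "$tmp/TB.1.txt"
printf 'all good\n' > "$tmp/TB.2.txt"

cd "$tmp"
for file in TB.*; do
    grep -q 'ERROR' "$file" &&   # exit status 0 only if a match was found
    mv "$file" last/
done

ls last/   # only TB.1.txt, the file containing "ERROR", was moved
```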


Your question is rather unclear, but if you want to find either of the matching files in a third file, perhaps something like

awk 'FNR==1 && (++n < ARGC-1) { a[n] = FILENAME; nextfile }
  /ERROR/ { for(j=1; j<=n; ++j) if ($0 ~ a[j]) b[a[j]]++ }
  END { for(f in b) print f }' TB*.txt /var/sup/logs.txt |
xargs -r mv -t /var/last/

This is somewhat inefficient in that it will read all the lines in the log file, and brittle in that it will only handle file names which do not contain newlines. (The latter restriction is probably unimportant here, as you are looking for file names which occur on the same line as the string "ERROR" in the first place.)

In some more detail, the Awk script collects the wildcard matches into the array a, then processes all lines in the last file, looking for ones with "ERROR" in them. On those lines, it checks whether any of the file names in a also occur, and if so, records them in b. When all lines have been processed, it prints the entries in b, which are then piped to a simple shell command that moves them.

xargs is a neat command to read some arguments from standard input, and run another command with those arguments added to its command line. The -r option says to not run the other command if there are no arguments.

(mv -t is a GNU extension; it's convenient, but not crucial to have here. If you need portable code, you could replace xargs with a simple while read -r loop.)
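A sketch of that portable replacement, with `echo` in front of `mv` so the commands are printed rather than executed (drop the `echo` for real use; the printf here merely stands in for the awk output):

```shell
# Portable stand-in for `xargs -r mv -t /var/last/`:
# read one file name per line and move each file individually.
moved=$(printf '%s\n' TB.1234.txt TB.135325.txt |   # stand-in for the awk output
    while read -r f; do
        echo mv "$f" /var/last/   # drop `echo` to actually move the file
    done)
printf '%s\n' "$moved"
```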

The FNR==1 condition requires that the input files are non-empty.

If the text file is small, or you expect a match near its beginning most of the time, perhaps just live with grepping it multiple times:

for file in TB.*; do
    grep -Eq "ERROR.*$file|$file.*ERROR" /var/sup/logs.txt &&
    mv "$file" /var/last/
done

Notice how we now need double quotes, not single, around the regular expression so that the variable $file gets substituted in the string.
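A quick demonstration of the difference between the two quoting styles:

```shell
# Double quotes expand $file inside the string; single quotes keep it literal.
file=TB.1234.txt
dq="ERROR.*$file|$file.*ERROR"
sq='ERROR.*$file|$file.*ERROR'
printf '%s\n' "$dq"   # ERROR.*TB.1234.txt|TB.1234.txt.*ERROR
printf '%s\n' "$sq"   # ERROR.*$file|$file.*ERROR
```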

CodePudding user response:

grep has an -l switch, which prints only the names of the files that contain a pattern. Using it, something like the following should work (assuming the file names contain no whitespace or newlines):

grep -l 'ERROR' /var/list/TB.* |
while read -r f; do
    cp "$f" /var/last/
done

If no file contains the word "ERROR", grep prints nothing, the loop body never runs, and nothing needs to be done.
