How to filter output lines from a bash command based on directories

Time:08-13

I have a file called sftp_output containing the output of commands from an sftp session that lists the contents of some folders. It looks like this:

sftp> ls -l dir1/
-rw-------   1 200      100          1352 Jul 01 14:20 file1
-rw-------   1 200      100          1352 Jul 10 14:20 file2
sftp> ls -l dir2/
-rw-------   1 200      100          1352 Jul 01 14:20 file1
-rw-------   1 200      100          1352 Jul 10 14:20 file2
sftp> bye

What I need to do is write all the filenames from dir1 to a single file called "dir1_contents" and the filenames from dir2 to a file called "dir2_contents". What is the best approach for something like that?

I tried something like this:

cat sftp_output | grep -v 'sftp' | awk '{print $9}' | sed '/^$/d'

but it isn't working, and after thinking more about the command I realized it would only handle a single directory.

The expected result should look like this:

File: dir1_contents

file1
file2

Thanks !

CodePudding user response:

This single awk script can handle it (it anchors on the prompt so that "sftp> bye" is not mistaken for data, and prints only the filename column):

awk '/^sftp> ls/ {          # a listing header starts a new output file
   sub(/\/$/, "")           # drop the trailing slash from the directory name
   close(fn)                # close the previous output file, if any
   fn = $NF "_contents"
   next
}
/^sftp>/ { next }           # skip any other prompt, such as "sftp> bye"
fn {
   print $NF > fn           # keep only the filename column
}' sftp_output
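To sanity-check, you can recreate the sample log from the question and run the awk against it (the awk is inlined here so the snippet is self-contained; sftp_output and the *_contents names are just the ones from the question):

```shell
# Recreate the sample sftp log from the question
cat > sftp_output <<'EOF'
sftp> ls -l dir1/
-rw-------   1 200      100          1352 Jul 01 14:20 file1
-rw-------   1 200      100          1352 Jul 10 14:20 file2
sftp> ls -l dir2/
-rw-------   1 200      100          1352 Jul 01 14:20 file1
-rw-------   1 200      100          1352 Jul 10 14:20 file2
sftp> bye
EOF

awk '/^sftp> ls/ {          # a listing header starts a new output file
   sub(/\/$/, "")           # drop the trailing slash from the directory name
   close(fn)                # close the previous output file, if any
   fn = $NF "_contents"
   next
}
/^sftp>/ { next }           # skip any other prompt, such as "sftp> bye"
fn { print $NF > fn }       # keep only the filename column
' sftp_output

cat dir1_contents           # prints file1 then file2, one per line
```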

CodePudding user response:

In pure bash (this version resets dir on any other prompt so "sftp> bye" is not appended to dir2_contents, and writes only the last field of each listing line; note that it breaks on filenames containing spaces):

#!/bin/bash

dir=""
while read -r line; do
  if [ "${line#sftp> ls -l }" != "${line}" ]; then
    dir="${line#sftp> ls -l }"              # directory name from the header line
    dir="${dir%/}"
  elif [ "${line#sftp> }" != "${line}" ]; then
    dir=""                                  # any other prompt ends the current listing
  elif [ -n "${dir}" ]; then
    echo "${line##* }" >> "${dir}_contents" # last field is the filename
  fi
done < sftp_output