Write the outcome of a for loop in multiple files in a single file in shell


I am new to shell scripting (bash/awk, etc.), so please excuse me if this is a basic question; I am aware that to many of you it will seem easy.

I have multiple files that look like this:

file1.bam, ...., file1000.bam

and for each of these files I run the following commands in the terminal; each one prints a number as its output:

samtools view -c -F 4 file1.bam 
9
# and 
samtools view -c -f 4 file1.bam 
2
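
(For context: -c makes samtools view print only a count, -F 4 skips reads carrying the unmapped flag, and -f 4 keeps only those, so each command prints a single number. In a script that number can be captured with command substitution; a minimal sketch, using file1.bam purely as an illustration:

mapped=$(samtools view -c -F 4 file1.bam)     # count of mapped reads
unmapped=$(samtools view -c -f 4 file1.bam)   # count of unmapped reads
echo "${mapped} ${unmapped}"
)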

Now I am struggling to write the output for all of these files into one text file that looks like this:

file1         9      2
...        ...      ...
file1000    100     50

So far I have written:

for each in .bam
do
 echo ${each}
 samtools view -c -F 4 ${each}.bam 
 samtools view -c -f 4 ${each}.bam 
done

Edit: output of

samtools view -c -F 4 file1.bam | hexdump -C

00000000  39 0a                                             |9.|
00000002

CodePudding user response:

With bash, I suggest:

for each in *.bam; do
  data1=$(samtools view -c -F 4 "${each}")
  data2=$(samtools view -c -f 4 "${each}")
  echo -e "${each}\t${data1}\t${data2}"
done
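
If the table should end up in a text file rather than on the terminal, the whole loop can simply be redirected; a minimal variant of the loop above, assuming an output name of counts.tsv and using printf for the tab-separated row (the %.bam strips the extension to match the desired first column):

for each in *.bam; do
  data1=$(samtools view -c -F 4 "${each}")
  data2=$(samtools view -c -f 4 "${each}")
  printf '%s\t%s\t%s\n' "${each%.bam}" "${data1}" "${data2}"
done > counts.tsv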

CodePudding user response:

Consider doing the collection and production of the final output outside of the loop:

for each in *.bam; do
    printf '%s\n' "${each%.*}"
    samtools view -c -F 4 "$each"
    samtools view -c -f 4 "$each"
done |
awk -v OFS='\t' '{a[NR%3]=$0} NR%3==0{print a[1], a[2], a[0]}'
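
Here awk buffers three consecutive lines at a time: NR%3 is 1 for the file name, 2 for the first count, and 0 for the second, so once the third line of a triple arrives it prints all three as one tab-separated row. If awk feels unfamiliar, paste can do the same grouping by joining every three input lines into one tab-separated line; a sketch using the standard paste - - - idiom:

for each in *.bam; do
    printf '%s\n' "${each%.*}"
    samtools view -c -F 4 "$each"
    samtools view -c -f 4 "$each"
done | paste - - -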