Saved files in aws using bash have an unrecognized name


I have a number of files (in /c/Users/Roy/DataReceived) that I want to grep for some information, storing the results as .txt files (in /c/Users/Roy/Documents/Result).

For example purposes: imagine I have 20 files with different information about cities, and I want to grep the information for the cities listed in a txt file. All of this information will then be stored in another txt file named after the given city (NewYork.txt, Rome.txt, etc.).

The following code is working:

#!/bin/bash

declare INPUT_DIRECTORY=/c/Users/Roy/DataReceived
declare OUTPUT_DIRECTORY=/c/Users/Roy/Documents/Result

while read -r city; do
  echo $city
  zgrep -Hwi "$city" "${INPUT_DIRECTORY}/"*.vcf.gz > "${OUTPUT_DIRECTORY}/${city}.txt"
done < list_of_cities.txt

However, the generated .txt files have names that are not recognized:

ls: cannot access 'NewYork'$'\n''.txt': No such file or directory
'NewYork'$'\n''.txt'

EDIT: I know what it is. This line of code, > "${OUTPUT_DIRECTORY}/${city}.txt", is not working properly: it is storing the files as .txt/c/Users/Roy/Documents/Result/NewYork. I'm not sure how to solve it.

CodePudding user response:

From the error, it looks like your file contains CRLF (Windows) line endings, while Bash expects LF, even on Windows (which was not the case some years ago...). The stray carriage return at the end of each line stays in $city after read, so the output file ends up named "NewYork" followed by a carriage return and ".txt" instead of plain NewYork.txt.
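
To confirm that this is the issue, a quick check (a minimal sketch; it assumes the file and od utilities are available in your Git Bash/MSYS environment) is to inspect the raw bytes of the list file:

# "file" typically reports "ASCII text, with CRLF line terminators" for such a file
file list_of_cities.txt

# "od -c" prints each byte; CRLF files show \r \n at the end of every line
head -n 1 list_of_cities.txt | od -c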

You must either save the file with LF line endings in your editor or convert it with dos2unix first:

while read -r city; do
  ...
done < <(dos2unix < list_of_cities.txt)
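
If dos2unix is not installed, an alternative is to strip the trailing carriage return inside the loop with a parameter expansion. This is a minimal sketch based on the script from the question; the paths and file names are the ones assumed there:

#!/bin/bash

declare INPUT_DIRECTORY=/c/Users/Roy/DataReceived
declare OUTPUT_DIRECTORY=/c/Users/Roy/Documents/Result

while read -r city; do
  city=${city%$'\r'}             # drop a trailing carriage return left over from CRLF endings
  [ -n "$city" ] || continue     # skip empty lines so we never create ".txt"
  echo "$city"
  zgrep -Hwi "$city" "${INPUT_DIRECTORY}/"*.vcf.gz > "${OUTPUT_DIRECTORY}/${city}.txt"
done < list_of_cities.txt

With either approach the output files are created as NewYork.txt, Rome.txt, etc.
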
Tags: bash