I have hundreds of files in the following format:
Jan 20 21:42:36 123UH9887934 [process name] text text text text
The fields are tab delimited. All of the files are in one directory, with many entries per file, all in the same format.
I'd like to use a bash, PowerShell, or even Mac Automator script to convert them to Excel format.
Importing into Excel just puts all the data into one cell, and it's a manual process. The problem is that most scripts assume a comma delimiter, so they also put all the data into one cell.
If I open the files one by one in Numbers, it understands the tab delimiters and formats everything correctly; it just takes forever.
I've tried PowerShell:
Import-Csv "VQ18072500JPJ.csv" | Format-Table
but I can't quite figure out how to convert and save the output to a new file.
I've tried bash:
for file in ./*.csv; do ssconvert -O "separator=' '" "$file" "${file%.csv}.xls"; done
but it doesn't understand the delimiters either.
Any ideas on how to get hundreds of tab-delimited CSVs converted with a script would be very much appreciated.
Thanks
CodePudding user response:
Assuming the sole objective is to convert tabs to commas:
$ tr '\t' ',' < file.csv
Jan,20,21:42:36,123UH9887934,[process name],text,text,text,text
$ awk -v OFS=, '{$1=$1}1' file.csv
Jan,20,21:42:36,123UH9887934,[process,name],text,text,text,text
$ sed 's/\t/,/g' file.csv
Jan,20,21:42:36,123UH9887934,[process name],text,text,text,text
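Note that the awk version also splits on the space inside [process name], because awk's default field separator is any run of whitespace; tr and sed leave that field intact. If you want to stay in PowerShell, as in the question, the same tab-to-comma conversion over every file in the directory could look roughly like this (just a sketch; the -comma.csv output suffix is only an illustration):

Get-ChildItem *.csv | ForEach-Object {
    # replace each tab with a comma and write the result to a new file
    (Get-Content $_.FullName) -replace "`t", "," |
        Set-Content ($_.FullName -replace '\.csv$', '-comma.csv')
}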
CodePudding user response:
When your CSV file starts with a line SEP=x, Excel will use the character x as the field separator. Replace the x with a real TAB:
for f in *.csv; do
sed -i '1s/^/SEP=\t\n/' "$f"
done
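The sed -i call above assumes GNU sed; the BSD sed that ships with macOS needs a suffix argument for -i and does not expand \t in the replacement. A PowerShell equivalent that prepends the SEP= line to every file in place might look like this (a sketch, assuming the files can simply be rewritten):

Get-ChildItem *.csv | ForEach-Object {
    # prepend "SEP=<TAB>" so Excel picks the tab as the field separator
    $body = Get-Content $_.FullName -Raw
    Set-Content $_.FullName ("SEP=`t`r`n" + $body)
}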
Perhaps you can use PowerShell with the code given in https://stackoverflow.com/a/21169406/3220113, changing one line to
$query.TextFileOtherDelimiter = "`t"
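For reference, a rough sketch of that QueryTable approach adapted for tab-delimited files (this is not the linked code verbatim; the C:\logs folder, the output naming, and running on Windows with Excel installed are all assumptions):

$excel = New-Object -ComObject Excel.Application
$excel.Visible = $false
$excel.DisplayAlerts = $false
foreach ($file in Get-ChildItem "C:\logs\*.csv") {
    $wb = $excel.Workbooks.Add()
    $ws = $wb.Worksheets.Item(1)
    # import the text file starting at A1, splitting on tabs
    $qt = $ws.QueryTables.Add("TEXT;" + $file.FullName, $ws.Range("A1"))
    $qt.TextFileParseType = 1              # xlDelimited
    $qt.TextFileOtherDelimiter = "`t"      # the line suggested above
    [void]$qt.Refresh()
    # 51 = xlOpenXMLWorkbook, i.e. .xlsx
    $wb.SaveAs([System.IO.Path]::ChangeExtension($file.FullName, ".xlsx"), 51)
    $wb.Close($false)
}
$excel.Quit()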