Exporting data from Teradata to HDFS using TDCH


I'm trying to export a table from Teradata into a file in my hdfs using TDCH.

I'm using the following parameters:

hadoop jar $TDCH_JAR com.teradata.connector.common.tool.ConnectorImportTool \
    -libjars $LIB_JARS \
    -Dmapred.job.queue.name=default \
    -Dtez.queue.name=default \
    -Dmapred.job.name=TDCH \
    -classname com.teradata.jdbc.TeraDriver \
    -url jdbc:teradata://$ipServer/logmech=ldap,database=$database,charset=UTF16 \
    -jobtype hdfs \
    -fileformat textfile \
    -separator ',' \
    -enclosedby '"' \
    -targettable ${targetTable} \
    -username ${userName} \
    -password ${password} \
    -sourcequery "select * from ${database}.${targetTable}" \
    -nummappers 1 \
    -sourcefieldnames "" \
    -targetpaths ${targetPaths}

It works, but I need a header row in the file, and when I add the parameter:

-targetfieldnames "ID","JOB","DESC","DT","REG" \

it stops working: the output file is no longer generated at all.

Can anyone help me?

CodePudding user response:

The -targetfieldnames option is only valid for -jobtype hive. It specifies Hive column names; it does not put a header record in the HDFS file. (TDCH has no option to prefix a CSV output file with a header record.)

Also, the value supplied to -targetfieldnames should be a single comma-separated string, e.g. "ID,JOB,DESC,DT,REG", rather than a list of quoted strings.
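Since TDCH itself cannot emit a header record, one common workaround is to prepend the header after the export, then write the result back. A minimal sketch (the HDFS paths and the `hdfs dfs` round-trip are assumptions; the part-file here is a local stand-in so the prepend step can be shown end-to-end):

```shell
#!/bin/sh
# Stand-in for: hdfs dfs -cat ${targetPaths}/part-* > part-00000
printf '1,ETL,load job,2023-01-19,EU\n2,QA,check job,2023-01-19,US\n' > part-00000

# Prepend the header record to the exported data.
echo 'ID,JOB,DESC,DT,REG' | cat - part-00000 > with_header.csv

# Then push the file back, e.g.:
#   hdfs dfs -put -f with_header.csv ${targetPaths}/
head -1 with_header.csv
```

This keeps the TDCH job unchanged and adds the header as a cheap post-processing step; for large exports you would stream through `hdfs dfs -cat` / `-put` instead of staging locally.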
