"
MySQL table:
mysql> desc s_site_pv_uv;
+---------+--------------+------+-----+---------+-------+
| Field   | Type         | Null | Key | Default | Extra |
+---------+--------------+------+-----+---------+-------+
| thatday | varchar(100) | NO   |     | NULL    |       |
| site    | varchar(500) | YES  |     | NULL    |       |
| site1   | varchar(500) | YES  |     | NULL    |       |
| pv      | int(11)      | YES  |     | NULL    |       |
| uv      | int(11)      | YES  |     | NULL    |       |
+---------+--------------+------+-----+---------+-------+
5 rows in set (0.00 sec)
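For reference, a table matching the desc output above can be recreated with roughly the following; this is a sketch reconstructed from the column list, with the storage engine and character set left at server defaults since the post does not show them:

mysql -h 172.16.164.162 -u root -p jrj_log -e "
CREATE TABLE s_site_pv_uv (
  thatday VARCHAR(100) NOT NULL,
  site    VARCHAR(500),
  site1   VARCHAR(500),
  pv      INT(11),
  uv      INT(11)
);"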
HDFS data:
hadoop fs -cat /user/hdfs/test
345 aaa BBBB 12 2015-02-09
2015-02-23 edadfsdf eddd 23 34
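Because the sqoop export below declares --input-fields-terminated-by '\t', it can help to first confirm that the file really is tab-delimited and that every line carries five fields; a quick check along these lines (not part of the original post) would be:

hadoop fs -cat /user/hdfs/test | cat -A | head -n 5
# cat -A prints tabs as ^I and line ends as $; each line should carry the
# 5 fields of s_site_pv_uv (thatday, site, site1, pv, uv by default,
# unless --columns reorders them)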
In the shell (the first command below is missing the sqoop prefix, so bash's own export builtin rejects the options):
[hdfs@cdh162 log]$ export --connect jdbc:mysql://172.16.164.162:3306/jrj_log --username root --password jrj123456 --table s_site_pv_uv --export-dir /user/hdfs/test --input-fields-terminated-by '\t' --lines-terminated-by '\n'
bash: export: --: invalid option
export: usage: export [-fn] [name[=value] ...] or export -p
[hdfs@cdh162 log]$ sqoop export --connect jdbc:mysql://172.16.164.162:3306/jrj_log --username root --password jrj123456 --table s_site_pv_uv --export-dir /user/hdfs/test --input-fields-terminated-by '\t' --lines-terminated-by '\n'
Warning: /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
15/02/16 17:18:53 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.3.0
15/02/16 17:18:53 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/02/16 17:18:53 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
15/02/16 17:18:53 INFO tool.CodeGenTool: Beginning code generation
15/02/16 17:18:54 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `s_site_pv_uv` AS t LIMIT 1
15/02/16 17:18:54 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `s_site_pv_uv` AS t LIMIT 1
15/02/16 17:18:54 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce
Note: /tmp/sqoop-hdfs/compile/ed405314b1f07f087740b4675f341edc/s_site_pv_uv.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
15/02/16 17:18:55 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hdfs/compile/ed405314b1f07f087740b4675f341edc/s_site_pv_uv.jar
15/02/16 17:18:55 INFO mapreduce.ExportJobBase: Beginning export of s_site_pv_uv
15/02/16 17:18:55 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
15/02/16 17:18:56 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
15/02/16 17:18:56 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
15/02/16 17:18:56 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
15/02/16 17:18:56 INFO client.RMProxy: Connecting to ResourceManager at cdh162.com/172.16.164.162:8032
15/02/16 17:18:58 INFO input.FileInputFormat: Total input paths to process : 1
15/02/16 17:18:58 INFO input.FileInputFormat: Total input paths to process : 1
15/02/16 17:18:58 INFO mapreduce.JobSubmitter: number of splits:4
15/02/16 17:18:58 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
15/02/16 17:18:58 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1423964363172_0047
15/02/16 17:18:58 INFO impl.YarnClientImpl: Submitted application application_1423964363172_0047
15/02/16 17:18:58 INFO mapreduce.Job: The url to track the job: http://cdh162.com:8088/proxy/application_1423964363172_0047/
15/02/16 17:18:58 INFO mapreduce.Job: Running job: job_1423964363172_0047
15/02/16 17:19:11 INFO mapreduce.Job: Job job_1423964363172_0047 running in uber mode : false
15/02/16 17:19:11 INFO mapreduce.Job:  map 0% reduce 0%
15/02/16 17:19:20 INFO mapreduce.Job:  map 100% reduce 0%
15/02/16 17:19:20 INFO mapreduce.Job: Job job_1423964363172_0047 completed successfully
15/02/16 17:19:20 INFO mapreduce.Job: Counters: 30
File System Counters
FILE: Number of bytes read=0
FILE: Number of bytes written=527604
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=697
HDFS: Number of bytes written=0
HDFS: Number of read operations=19
HDFS: Number of large read operations=0
HDFS: Number of write operations=0
Job Counters
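To confirm whether any rows actually reached MySQL once the job reports success, the target table can be counted directly (this step is not part of the original output):

mysql -h 172.16.164.162 -u root -p jrj_log -e "SELECT COUNT(*) FROM s_site_pv_uv;"
# -p with no value prompts for the password instead of putting it on the command line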