Hive "insert into" doesn't add values

Time:11-26

I'm new to Hadoop and its ecosystem. I connect via Beeline to HiveServer2, then create a table:

create table test02(id int, name string); 

The table is created, and I try to insert values:

insert into test02(id, name) values (1, "user1");

And nothing happens. test02 and values__tmp__table__1 are created, but they are both empty. The Hadoop directory "/user/$username/warehouse/test01" is empty too.

0: jdbc:hive2://localhost:10000> insert into test02 values (1,"user1");  
No rows affected (2.284 seconds)
0: jdbc:hive2://localhost:10000> select * from test02;
+------------+--------------+
| test02.id  | test02.name  |
+------------+--------------+
+------------+--------------+
No rows selected (0.326 seconds)
0: jdbc:hive2://localhost:10000> show tables;
+------------------------+
|        tab_name        |
+------------------------+
| test02                 |
| values__tmp__table__1  |
+------------------------+
2 rows selected (0.137 seconds)

CodePudding user response:

Temp tables like these are created when Hive needs to manage intermediate data during an operation. Hive automatically drops all temporary tables at the end of the session in which they were created. If you close the session and open a new one, you won't find the temp table.

https://docs.cloudera.com/HDPDocuments/HDP2/HDP-2.5.0/bk_data-access/content/temp-tables.html
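The same lifetime rule applies to temporary tables you create explicitly. A minimal HiveQL sketch (the table name `tmp_demo` is illustrative, not from the original post):

```sql
-- Create an explicit temporary table; it is visible only in this session.
CREATE TEMPORARY TABLE tmp_demo (id INT, name STRING);

INSERT INTO tmp_demo VALUES (1, 'user1');

-- Works in the current session:
SELECT * FROM tmp_demo;

-- After closing Beeline and reconnecting, tmp_demo no longer exists.
```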

Insert data like this:

insert into test02 values (999, "user_new");

The data is inserted into test02, and a temp table like values__tmp__table__1 is created as a side effect (the temp table will be gone after the Hive session ends).

CodePudding user response:

I actually found the answer myself. I'm new to Hadoop and friends, so it was not obvious.

First, I set Hive logging to level ERROR to see the problem:

  1. Find hive-exec-log4j2.properties in {your hive directory}/conf/
  2. Find the property property.hive.log.level and set its value to ERROR (property.hive.log.level = ERROR)
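For reference, the edited line in hive-exec-log4j2.properties ends up looking like this (the surrounding properties vary by Hive version and install):

```properties
# {your hive directory}/conf/hive-exec-log4j2.properties
property.hive.log.level = ERROR
```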

Then, while executing the insert into command via Beeline, I could see the errors. The main one was "There are 0 datanode(s) running and no node(s) are excluded in this operation".

I found the same question asked elsewhere. The first answer, which suggests deleting all /tmp/* files (that is where all my local HDFS data was stored), helped me. Then, as on first setup, I re-initialized the namenode (-format) and Hive (ran my metahive script).
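The recovery steps above can be sketched as shell commands. This is a sketch assuming a default single-node setup where HDFS keeps its data under /tmp and the Hadoop sbin scripts are on the PATH; adjust paths to your install, and note that it destroys all local HDFS data:

```shell
# Stop HDFS before touching its storage directories.
stop-dfs.sh

# Remove the local HDFS data stored under /tmp (wipes existing HDFS contents).
rm -rf /tmp/hadoop-*

# Re-format the namenode, as on first setup.
hdfs namenode -format

# Start HDFS again; the datanode should now register with the namenode.
start-dfs.sh
```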

The problem was solved, but another one occurred: insert into now takes 25 seconds to execute.
