1. The Spark process to create a table with SQL is as follows:
./bin/spark-shell --master yarn
scala> sqlContext.sql("CREATE TABLE IF NOT EXISTS sparksql (key INT, value STRING)")
2. In Hive, create a table with the usual SQL statement:
hive> CREATE TABLE IF NOT EXISTS hivetest (key INT, value STRING);
3. In the Hadoop management page, or with "/usr/local/hadoop-2.5.2/bin/hadoop fs -ls /user/hive/warehouse", you can see the tables created by both methods.
But the running Hive shell only finds the table created with the Hive SQL statement:
hive> show tables;
hivetest
And the Scala program only sees the table created from Scala:
scala> hql("show tables").collect()
res2: Array[org.apache.spark.sql.Row] = Array([sparksql])
Spark is 1.2.0, which the official documentation says supports Hive 0.13.1; Hive is 0.13.1.
CodePudding user response:
Has nobody encountered a similar problem?
CodePudding user response:
To add: sqlContext was created as a HiveContext:
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
CodePudding user response:
Found the cause myself. When Hive runs, it looks for a metastore_db directory under its current working directory, and that directory stores the metadata for every Hive table. Whichever metastore_db directory Hive or Spark picks up determines which Hive tables it can see; the location can also be changed through configuration.
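A minimal sketch of that workaround, assuming the default embedded Derby metastore and made-up install paths: start both shells from the same working directory so each resolves the identical ./metastore_db (embedded Derby allows only one open connection at a time, so exit one shell before starting the other):

cd /usr/local/hive-0.13.1                        # any shared directory; this path is an assumption
bin/hive                                         # the Hive CLI creates/opens ./metastore_db here
/usr/local/spark-1.2.0/bin/spark-shell --master yarn
                                                 # HiveContext now opens the same ./metastore_db

The more permanent route is to pin the metastore location in hive-site.xml (for example via javax.jdo.option.ConnectionURL) and put a copy of that file in Spark's conf directory, so spark-shell stops falling back to a throwaway Derby database in its launch directory.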
CodePudding user response:
I ran into the same problem too; could you tell me specifically how you solved it?
CodePudding user response:
The table created under spark-shell didn't really write its metadata.
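A quick way to check that claim, assuming the default embedded Derby setup: each shell writes its metadata into a metastore_db folder under whatever directory it was launched from, so the spark-shell table lives in spark-shell's copy rather than in Hive's:

ls -d metastore_db        # run in the directory spark-shell was started from
ls -d metastore_db        # run again in the directory the Hive CLI was started from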
CodePudding user response:
OP, can you say in detail how you solved it?
Ditto, I also ran into this problem. How exactly do you solve it, OP? Please explain the solution carefully!
CodePudding user response:
Any detailed solution? Have the posters above solved it yet?