How can I show Hive tables using PySpark?

Time:02-23

Hello, I created a Spark HDInsight cluster on Azure and I'm trying to read Hive tables with PySpark, but the problem is that it only shows me the default database.

Does anyone have an idea?

CodePudding user response:

If you have created tables in other databases, try show tables from database_name, replacing database_name with the actual name.

CodePudding user response:

You are missing the Hive server details in your SparkSession. If you haven't added any, Spark will create and use the default database when running Spark SQL.

If you've added configuration details for spark.sql.warehouse.dir and spark.hadoop.hive.metastore.uris in the Spark defaults conf file, then add enableHiveSupport() when creating the SparkSession.

Otherwise, add the configuration details while creating the SparkSession:

.config("spark.sql.warehouse.dir", "/user/hive/warehouse")
.config("hive.metastore.uris", "thrift://localhost:9083")
.enableHiveSupport()