I was creating a table over HDFS using a CTAS command, like this:
CREATE TABLE <catalog>.<schema>.<table_name> WITH ( external_location = 'hdfs:///path' ) AS SELECT ... ;
This throws the following error:
Failed checking path: hdfs:/path
However, if I write the query as follows, it works fine:
CREATE TABLE <catalog>.<schema>.<table_name> WITH ( external_location = 'hdfs://[master_ip]/path' ) AS SELECT ... ;
CodePudding user response:
The behaviour is correct: hdfs:///path simply isn't a complete location. The hdfs:// prefix only specifies the scheme, i.e. that the path lives on a Hadoop Distributed File System, and nothing more. After the scheme you must say where that filesystem is located, using the NameNode host/IP or the HDFS nameservice ID, and only then give the path.
In other words, hdfs:// names the filesystem type, the NameNode address or nameservice ID identifies which HDFS cluster to talk to, and the rest is the path inside the filesystem located or identified by that address or nameservice ID.
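For illustration, here is a sketch of both complete forms. The names example_catalog.example_schema.example_table, the NameNode host master-node.example.com, and the nameservice ID mycluster are placeholders; 8020 is a common default NameNode RPC port, but yours may differ:

```sql
-- Fully qualified by NameNode host (or IP) and port:
CREATE TABLE example_catalog.example_schema.example_table
WITH ( external_location = 'hdfs://master-node.example.com:8020/path' )
AS SELECT ... ;

-- Or by HDFS nameservice ID, if HA nameservices are configured:
CREATE TABLE example_catalog.example_schema.example_table
WITH ( external_location = 'hdfs://mycluster/path' )
AS SELECT ... ;
```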
CodePudding user response:
If you'd like to set a default filesystem for your tables, edit core-site.xml to set fs.defaultFS to hdfs://namenode.address:port and restart the Trino server(s). Only then can you omit the host/port from the URI.
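A minimal core-site.xml sketch of that setting (hostname and port are placeholders; adjust them to your NameNode):

```xml
<configuration>
  <!-- URIs without an authority, e.g. hdfs:///path, resolve against this filesystem -->
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://namenode.example.com:8020</value>
  </property>
</configuration>
```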