Is it possible in Spark, to query by table properties?
Databricks has a documentation page showing an example of setting custom keys as TBLPROPERTIES:
> CREATE TABLE T(c1 INT) TBLPROPERTIES('this.is.my.key' = 12, this.is.my.key2 = true);
> SHOW TBLPROPERTIES T;
>
> option.serialization.format 1
> this.is.my.key 12
> this.is.my.key2 true
I would now like to somehow get a list of all table names where "this.is.my.key" = 12. Is this possible with e.g. Spark SQL?
My current solution is to simply loop over all tables and check each one, but I feel like there should be a "nicer" solution.
Thanks in advance!
CodePudding user response:
This is not possible in an "elegant" way. Spark SQL has no built-in query that filters tables by property value, so the loop you describe is the practical approach: list the tables, run `SHOW TBLPROPERTIES` on each, and keep the ones that match.
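A minimal PySpark sketch of that loop, assuming a running `SparkSession` and a database name of your choosing (the function name and parameters are hypothetical, not a built-in API). Note that `SHOW TBLPROPERTIES` returns all values as strings, so the comparison is done on the string form:

```python
def tables_with_property(spark, key, expected, database="default"):
    """Return names of tables in `database` whose TBLPROPERTIES
    contain `key` with the value `expected` (compared as strings).

    `spark` is assumed to be an active SparkSession.
    """
    matches = []
    # SHOW TABLES yields rows with a `tableName` column.
    for row in spark.sql(f"SHOW TABLES IN {database}").collect():
        name = row["tableName"]
        # SHOW TBLPROPERTIES yields (key, value) rows; collect them into a dict.
        props = {
            r["key"]: r["value"]
            for r in spark.sql(f"SHOW TBLPROPERTIES {database}.{name}").collect()
        }
        if props.get(key) == str(expected):
            matches.append(name)
    return matches
```

Usage would look like `tables_with_property(spark, "this.is.my.key", 12)`. It issues one extra query per table, so on a catalog with many tables it can be slow; caching the result or restricting the database scope helps.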