I don't understand how the ADD COLUMNS clause works in Spark SQL. Here is my code, but it throws a ParseException. What's wrong with it?
spark.sql("ALTER TABLE deltaTable ADD COLUMNS (abc LongType, dea LongType AFTER ttt)")
Error is:
ParseException
/databricks/spark/python/pyspark/sql/session.py in sql(self, sqlQuery)
775 [Row(f1=1, f2='row1'), Row(f1=2, f2='row2'), Row(f1=3, f2='row3')]
776 """
--> 777 return DataFrame(self._jsparkSession.sql(sqlQuery), self._wrapped)
778
779 def table(self, tableName):
/databricks/spark/python/lib/py4j-0.10.9.1-src.zip/py4j/java_gateway.py in __call__(self, *args)
1302
1303 answer = self.gateway_client.send_command(command)
-> 1304 return_value = get_return_value(
1305 answer, self.gateway_client, self.target_id, self.name)
CodePudding user response:
LongType is a DataFrame API data type (from pyspark.sql.types), not a Spark SQL data type. If you scroll down in your error message, you will see the list of type names Spark SQL accepts.
So in Spark SQL, use the query below:
spark.sql("ALTER TABLE deltaTable ADD COLUMNS (abc LONG, dea LONG AFTER ttt)")
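For reference, here is a minimal sketch of the same DDL with the SQL type names spelled out (table and column names taken from the question; LONG is an alias of BIGINT in Spark SQL, so either spelling works):

```sql
-- Spark SQL DDL uses SQL type names (BIGINT / LONG),
-- not the pyspark.sql.types classes like LongType
ALTER TABLE deltaTable ADD COLUMNS (
  abc BIGINT,            -- same as LONG
  dea BIGINT AFTER ttt   -- AFTER positions the new column after column ttt
);
```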