I am trying to bind data to a map-type column in Cassandra from my Spark Scala application.
val updateTemplate = s"""UPDATE test_ks.test_table_ttl USING TTL 5
|SET ttl_col = ttl_col + {:mapKey : (:value1, :value2)}
|WHERE consumer_id=:consumer_id""".stripMargin
val prep_statement: PreparedStatement = cqlSession.prepare(updateTemplate)
Preparing this statement throws the following error:
Invalid map literal for ttl_col: bind variables are not supported inside collection literals
The DDL for the Cassandra table is:
CREATE KEYSPACE test_ks
WITH replication = {'class':'SimpleStrategy', 'replication_factor' : 3};
CREATE TABLE test_ks.test_table_ttl (
    consumer_id TEXT PRIMARY KEY,
    ttl_col     map<text, frozen<tuple<text, text>>>
);
CodePudding user response:
As the error message says, bind variables are not supported inside collection literals. The workaround is to bind the whole map as a single value, which means changing the updateTemplate itself. Note that ttl_col + :mapData merges the bound entries into the existing map, and the TTL applies only to the entries written by this update.
val updateTemplate = s"""UPDATE test_ks.test_table_ttl USING TTL 30
|SET ttl_col = ttl_col + :mapData
|WHERE consumer_id=:consumer_id""".stripMargin
import com.datastax.oss.driver.api.core.`type`.{DataTypes, TupleType}
import com.google.common.collect.ImmutableMap

val tupleType: TupleType = DataTypes.tupleOf(DataTypes.TEXT, DataTypes.TEXT)
val rowKey = "consumer-1" // some row key value (placeholder)
val mapKey = "key-1"      // some map key value (placeholder)
val mapValue = tupleType.newValue("value1", "value2")
// Bind the whole map as one value; the driver serializes it for :mapData.
val mapData = ImmutableMap.of(mapKey, mapValue)
val prep_statement = cqlSession.prepare(updateTemplate)
// Positional bind follows marker order in the query: :mapData, then :consumer_id.
cqlSession.execute(prep_statement.bind(mapData, rowKey))
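If you would rather not depend on positional order, here is a minimal sketch of the same bind using the driver 4.x named setters (setMap/setString); the marker names and placeholder values are carried over from the snippet above.

import com.datastax.oss.driver.api.core.data.TupleValue

// Sketch: bind each named marker explicitly. BoundStatement is immutable,
// so each setter returns a new statement that can be chained.
val bound = prep_statement
  .bind()
  .setMap("mapData", ImmutableMap.of(mapKey, mapValue), classOf[String], classOf[TupleValue])
  .setString("consumer_id", rowKey)
cqlSession.execute(bound)

Binding by name keeps the code correct even if the order of the bind markers in updateTemplate changes later.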