I followed the solutions here, but I am still getting the "cannot resolve symbol SQLContext" error, and ".implicits._" cannot be resolved either. What could be the reason for this?
Spark/Scala versions I use:
- Scala 2.12.13
- Spark 3.0.1 (without bundled Hadoop)
Here is my related code part:
import org.apache.log4j.LogManager
import org.apache.spark.{SparkConf, SparkContext}
object Count {
  def main(args: Array[String]) {
    ...
    ...
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._
  }
}
CodePudding user response:
You didn't import SQLContext at all:
import org.apache.spark.sql.SQLContext
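With that import added, the snippet from your question compiles. A minimal sketch (the SparkConf setup is assumed here, since it was elided in the question):

import org.apache.log4j.LogManager
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object Count {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Count")  // assumed; your conf was elided
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)  // compiles, but deprecated since Spark 2.0
    import sqlContext.implicits._
    // ...
  }
}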
You should probably not use SQLContext anymore in the first place, though:
As of Spark 2.0, this is replaced by SparkSession. However, we are keeping the class here for backward compatibility.
https://spark.apache.org/docs/latest/api/scala/org/apache/spark/sql/SQLContext.html
See how to use a SparkSession from a SparkContext at "How to create SparkSession from existing SparkContext and then import sparkSession.implicits._".
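For completeness, a minimal sketch of the SparkSession approach, assuming the same SparkConf/SparkContext setup as in your code; the session reuses the existing context via getOrCreate():

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SparkSession

object Count {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Count")  // assumed SparkConf setup
    val sc = new SparkContext(conf)

    // Build a SparkSession on top of the existing SparkContext's configuration
    val spark = SparkSession.builder().config(sc.getConf).getOrCreate()
    import spark.implicits._

    // The implicits now work, e.g. converting a local collection to a Dataset
    val ds = Seq(1, 2, 3).toDS()
    ds.show()
  }
}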