[Solution] The spark context has stopped and the driver is restarting. Your notebook will be automatically reattached


If your notebook or driver program is getting killed with the error below:

Code

from pyspark.sql import SparkSession

def main():
    # Build a session with the Delta Lake extension and catalog enabled.
    spark_sess = (SparkSession.builder.appName('app_name')
                  .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
                  .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
                  .getOrCreate())
    print("hello")
    spark_sess.stop()  # stopping the session is what triggers the error on Databricks


if __name__ == '__main__':
    main()

Error:

The spark context has stopped and the driver is restarting. Your notebook will be automatically reattached

CodePudding user response:

Please check whether either of the following applies:

  1. Your code calls spark_context.stop() (or spark_session.stop())
  2. The cluster is running low on memory

You should not stop the Spark context when running code in Databricks: the notebook shares a Spark context managed by the cluster, and stopping it kills the driver, which then restarts and reattaches the notebook. See the sketch below.
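
A minimal sketch of the same program with the stop() call removed. This is safe to run as a Databricks notebook cell, since getOrCreate() simply returns the session the cluster already manages:

from pyspark.sql import SparkSession

def main():
    # On Databricks, getOrCreate() returns the cluster-managed session
    # rather than building a new one.
    spark_sess = SparkSession.builder.appName('app_name').getOrCreate()
    print("hello")
    # No spark_sess.stop() here: stopping the shared context kills the
    # driver and forces the notebook to detach and reattach.

if __name__ == '__main__':
    main()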

CodePudding user response:

The driver does not have enough memory to handle your data on the driver side.
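
A common cause is pulling a large distributed DataFrame onto the driver with collect() or toPandas(). A minimal sketch of a driver-friendly alternative (the dataset size and output path here are hypothetical):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.range(100_000_000)  # hypothetical large DataFrame

# Risky: df.collect() or df.toPandas() materializes every row in
# driver memory and can kill the driver on a large dataset.

# Safer: keep the heavy work distributed and bring back only a
# bounded amount of data for inspection.
df.write.mode("overwrite").parquet("/tmp/large_output")  # hypothetical path
preview = df.limit(10).collect()  # small, bounded result
print(preview)

If you genuinely need more driver memory, increase it in the cluster configuration (for example, by choosing a larger driver node type in Databricks) rather than from inside the notebook.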
