Local run of a Spark program fails with "ERROR SparkContext: 91 - Error initializing SparkContext"

Time: 09-17

I have just started learning Spark and have confirmed the environment is set up: Scala 2.11.12 and Spark 2.4, using Eclipse with the Scala plugin installed. I created a Scala project, imported the local Scala jars plus all of the jars in Spark's jars directory, and ran the following WordCount program:
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

object HelloWorld {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local").setAppName("WC")
    val sc = new SparkContext(conf)
    val rdd: RDD[String] = sc.textFile("hello.txt", 1)
    rdd.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _).foreach(println)
  }
}
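Importing every jar from Spark's jars directory by hand into Eclipse is easy to get wrong (a mismatched Scala version, a missed jar). As an alternative, not part of the original setup, a build tool can fetch a matching pair of Scala and Spark; a minimal build.sbt sketch, assuming sbt is used:

```scala
// Hypothetical build.sbt (not from the post).
// Pinning scalaVersion to 2.11.12 keeps the project's Scala in sync with the
// Scala that Spark 2.4.0 ships with; %% appends the Scala binary version
// to the artifact name, resolving to spark-core_2.11.
name := "spark-wordcount"
scalaVersion := "2.11.12"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.0"
```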


The error message is as follows:
2019-02-03 00:05:46 WARN  Utils:66 - Your hostname, josonlee-PC resolves to a loopback address: 127.0.1.1; using 192.168.0.106 instead (on interface wlp2s0)
2019-02-03 00:05:46 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2019-02-03 00:05:46 INFO  SparkContext:54 - Running Spark version 2.4.0
2019-02-03 00:05:46 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-02-03 00:05:46 INFO  SparkContext:54 - Submitted application: WC
2019-02-03 00:05:46 INFO  SecurityManager:54 - Changing view acls to: josonlee
2019-02-03 00:05:46 INFO  SecurityManager:54 - Changing modify acls to: josonlee
2019-02-03 00:05:46 INFO  SecurityManager:54 - Changing view acls groups to:
2019-02-03 00:05:46 INFO  SecurityManager:54 - Changing modify acls groups to:
2019-02-03 00:05:46 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(josonlee); groups with view permissions: Set(); users with modify permissions: Set(josonlee); groups with modify permissions: Set()
2019-02-03 00:05:47 INFO  Utils:54 - Successfully started service 'sparkDriver' on port 40621.
2019-02-03 00:05:47 INFO  SparkEnv:54 - Registering MapOutputTracker
2019-02-03 00:05:47 INFO  SparkEnv:54 - Registering BlockManagerMaster
2019-02-03 00:05:47 INFO  BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2019-02-03 00:05:47 INFO  BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2019-02-03 00:05:47 INFO  DiskBlockManager:54 - Created local directory at /tmp/blockmgr-c8dd57f7-cf15-4823-a44e-b9a17fa86443
2019-02-03 00:05:47 INFO  MemoryStore:54 - MemoryStore started with capacity 1403.1 MB
2019-02-03 00:05:47 ERROR MetricsConfig:91 - Error loading configuration file metrics.properties
java.lang.NullPointerException
    at org.apache.spark.metrics.MetricsConfig.loadPropertiesFromFile(MetricsConfig.scala:133)
    at org.apache.spark.metrics.MetricsConfig.initialize(MetricsConfig.scala:55)
    at org.apache.spark.metrics.MetricsSystem.<init>(MetricsSystem.scala:95)
    at org.apache.spark.metrics.MetricsSystem$.createMetricsSystem(MetricsSystem.scala:233)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:357)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
    at HelloWorld$.main(HelloWorld.scala:9)
    at HelloWorld.main(HelloWorld.scala)
2019-02-03 00:05:47 INFO  SparkEnv:54 - Registering OutputCommitCoordinator
2019-02-03 00:05:47 INFO  log:192 - Logging initialized @1049ms
2019-02-03 00:05:47 ERROR SparkContext:91 - Error initializing SparkContext.
java.lang.NullPointerException
    at org.apache.spark.ui.JettyUtils$.createStaticHandler(JettyUtils.scala:193)
    at org.apache.spark.ui.WebUI.addStaticHandler(WebUI.scala:121)
    at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:68)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:80)
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:175)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:444)
    at HelloWorld$.main(HelloWorld.scala:9)
    at HelloWorld.main(HelloWorld.scala)
2019-02-03 00:05:47 INFO  MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2019-02-03 00:05:47 INFO  MemoryStore:54 - MemoryStore cleared
2019-02-03 00:05:47 INFO  BlockManager:54 - BlockManager stopped
2019-02-03 00:05:47 INFO  BlockManagerMaster:54 - BlockManagerMaster stopped
2019-02-03 00:05:47 WARN  MetricsSystem:66 - Stopping a MetricsSystem that is not running
2019-02-03 00:05:47 INFO  OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2019-02-03 00:05:47 INFO  SparkContext:54 - Successfully stopped SparkContext
Exception in thread "main" java.lang.NullPointerException
    at org.apache.spark.ui.JettyUtils$.createStaticHandler(JettyUtils.scala:193)
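Both stack traces end in Spark resolving a bundled resource through its own classloader (for the default metrics.properties and for the Web UI's static files). In Spark's source that lookup goes through getClass.getClassLoader, and the JVM returns null from getClassLoader for classes loaded by the bootstrap classloader, which would turn both lookups into exactly these NullPointerExceptions, for instance if the Spark jars ended up attached to Eclipse's JRE System Library instead of the project build path. A minimal diagnostic sketch under that assumption (ClasspathCheck and whoLoads are made-up names, not from the post):

```scala
object ClasspathCheck {
  // Describe which classloader owns the named class.
  def whoLoads(className: String): String =
    try {
      Option(Class.forName(className).getClassLoader)
        .map(_.toString)
        .getOrElse("bootstrap (getClassLoader returned null)")
    } catch {
      case _: ClassNotFoundException => "not on classpath"
    }

  def main(args: Array[String]): Unit = {
    // JDK core classes are always bootstrap-loaded, so this shows the null case:
    println("java.lang.String            -> " + whoLoads("java.lang.String"))
    // If this also reports the bootstrap loader (instead of an application
    // classloader), the Spark jars are on the boot classpath and both NPEs
    // above would follow.
    println("org.apache.spark.util.Utils -> " + whoLoads("org.apache.spark.util.Utils"))
  }
}
```

Run inside the failing Eclipse project; the Spark class should be reported as loaded by an application classloader, not the bootstrap loader.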