15/11/18 12:34:30 INFO repl.SparkILoop: Created spark context..
Spark context available as sc.
15/11/18 12:34:31 INFO hive.HiveContext: Initializing execution hive, version 1.2.1
15/11/18 12:34:31 INFO client.ClientWrapper: Inspected Hadoop version: server
15/11/18 12:34:31 INFO client.ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for hadoop version server
15/11/18 12:34:32 INFO metastore.HiveMetaStore: 0: Opening raw store with implementation class: org.apache.hadoop.hive.metastore.ObjectStore
15/11/18 12:34:32 INFO metastore.ObjectStore: ObjectStore, initialize called
15/11/18 12:34:33 INFO DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
15/11/18 12:34:33 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
15/11/18 12:34:33 WARN DataNucleus.Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/11/18 12:34:33 WARN DataNucleus.Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/11/18 12:34:40 INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
15/11/18 12:34:42 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
15/11/18 12:34:42 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
15/11/18 12:34:47 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
15/11/18 12:34:47 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
15/11/18 12:34:48 INFO metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
15/11/18 12:34:48 INFO metastore.ObjectStore: Initialized ObjectStore
15/11/18 12:34:48 WARN metastore.ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
15/11/18 12:34:49 WARN metastore.ObjectStore: Failed to get database default, returning NoSuchObjectException
15/11/18 12:34:50 INFO metastore.HiveMetaStore: Added admin role in metastore
15/11/18 12:34:50 INFO metastore.HiveMetaStore: Added public role in metastore
15/11/18 12:34:50 INFO metastore.HiveMetaStore: No user is added in admin role, since config is empty
15/11/18 12:34:50 INFO metastore.HiveMetaStore: 0: get_all_databases
15/11/18 12:34:50 INFO HiveMetaStore.audit: ugi=spark	ip=unknown-ip-addr	cmd=get_all_databases
15/11/18 12:34:50 INFO metastore.HiveMetaStore: 0: get_functions: db=default pat=*
15/11/18 12:34:50 INFO HiveMetaStore.audit: ugi=spark	ip=unknown-ip-addr	cmd=get_functions: db=default pat=*
15/11/18 12:34:50 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
java.lang.RuntimeException: java.net.ConnectException: Call From master/127.0.0.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
    at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
    at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:162)
    at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:160)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:167)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
    at $iwC$$iwC.<init>(
    at $iwC.<init>(
    at <init>(
    at <init>(
    at <clinit>(
    at <init>(
    at <clinit>(
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
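The root cause is the java.net.ConnectException at the top of the trace: nothing is accepting connections on localhost:9000, the default HDFS NameNode RPC port, so HiveContext initialization fails when spark-shell starts. As a minimal sketch (not part of the original log; the helper name and ports are illustrative), a plain TCP probe distinguishes a refused connection from an accepted one:

```python
import socket

def port_is_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP service is accepting connections on host:port."""
    try:
        # create_connection performs the full TCP handshake, so a refused
        # or timed-out connection raises an OSError subclass.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # ConnectionRefusedError, timeouts, unresolvable host, ...
        return False

# In the failing setup above, a probe of ("localhost", 9000) would report
# the port closed, matching the "Connection refused" in the stack trace.
```

If the probe reports the port closed, the usual remedies are starting HDFS (for example with `start-dfs.sh`) or making sure `fs.defaultFS` in core-site.xml points at the host and port where the NameNode actually listens, then relaunching spark-shell.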