I recently started learning Spark, and at the stage of reading HBase data from Spark I hit an exception. I searched Baidu for a long time without finding a fix. My environment: Eclipse IDE, Scala 2.11.6, HBase 1.2.6, Spark 2.1.0, Hadoop 2.7.3 (test setup). I am not using HBase's bundled ZooKeeper but a standalone ZooKeeper 3.4.6. Everything is pseudo-distributed on a single machine (both Hadoop and Spark). Reading and deleting data directly with hbase shell works fine, and I have added the relevant HBase jars to my Spark project. The code is as follows:
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.hadoop.hbase.util.Bytes
import org.apache.spark.{SparkConf, SparkContext}

object Test {
  def main(args: Array[String]) {
    val conf = HBaseConfiguration.create()
    conf.set("hbase.zookeeper.quorum", "192.168.0.102")
    conf.set("hbase.zookeeper.property.clientPort", "2181")
    val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("Hbase-test"))
    conf.set(TableInputFormat.INPUT_TABLE, "student")
    val stuRDD = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat],
      classOf[org.apache.hadoop.hbase.io.ImmutableBytesWritable],
      classOf[org.apache.hadoop.hbase.client.Result])
    val count = stuRDD.count()
    println("Students RDD Count: " + count)
    stuRDD.cache()
    stuRDD.foreach({ case (_, result) =>
      val key = Bytes.toString(result.getRow)
      val name = Bytes.toString(result.getValue("info".getBytes, "name".getBytes))
      val gender = Bytes.toString(result.getValue("info".getBytes, "gender".getBytes))
      val age = Bytes.toString(result.getValue("info".getBytes, "age".getBytes))
      println("Row Key: " + key + " Name: " + name + " Gender: " + gender + " Age: " + age)
    })
  }
}
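For reference, the two ZooKeeper settings my code sets programmatically would correspond to the following client-side hbase-site.xml fragment (a sketch only; the address and port are the values from the code above, not taken from my actual config file):

```xml
<!-- Client-side ZooKeeper settings, equivalent to the conf.set(...) calls above -->
<configuration>
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>192.168.0.102</value>
  </property>
  <property>
    <name>hbase.zookeeper.property.clientPort</name>
    <value>2181</value>
  </property>
</configuration>
```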
The exception is as follows (unrelated parts omitted). At first I thought my ZooKeeper configuration was the problem, so I tried switching to HBase's bundled ZooKeeper, but the error was exactly the same:
17/11/03 14:30:29 INFO ZooKeeper: Initiating client connection, connectString=192.168.0.102:2181 sessionTimeout=90000 watcher=hconnection-0x5c089b2f0x0, quorum=192.168.0.102:2181, baseZNode=/hbase
17/11/03 14:30:29 INFO ClientCnxn: Opening socket connection to server bigdata3/192.168.0.102:2181. Will not attempt to authenticate using SASL (unknown error)
17/11/03 14:30:29 INFO ClientCnxn: Socket connection established to bigdata3/192.168.0.102:2181, initiating session
17/11/03 14:30:29 INFO ClientCnxn: Session establishment complete on server bigdata3/192.168.0.102:2181, sessionid = 0x15f807724d60007, negotiated timeout = 40000
17/11/03 14:30:29 INFO RegionSizeCalculator: Calculating region sizes for table "student".
17/11/03 14:31:07 INFO RpcRetryingCaller: Call exception, tries=10, retries=35, started=38458 ms ago, cancelled=false, msg=row 'student,,00000000000000' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=bigdata3,16201150688469,97, seqNum=0
17/11/03 14:31:17 INFO RpcRetryingCaller: Call exception, tries=11, retries=35, started=48531 ms ago, cancelled=false, msg=row 'student,,00000000000000' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=bigdata3,16201150688469,97, seqNum=0
17/11/03 14:31:17 INFO ConnectionManager$HConnectionImplementation: Closing zookeeper sessionid=0x15f807724d60007
17/11/03 14:31:17 INFO ZooKeeper: Session: 0x15f807724d60007 closed
17/11/03 14:31:17 INFO ClientCnxn: EventThread shut down
Exception in thread "main" org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Fri Nov 03 14:31:17 CST 2017, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=68712: row 'student,,00000000000000' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=bigdata3,16201150688469,97, seqNum=0
at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.throwEnrichedException(RpcRetryingCallerWithReadReplicas.java:276)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:210)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:210)
at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:327)
at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:302)
at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:167)
at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:162)
at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:797)
at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:193)
at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:89)
at org.apache.hadoop.hbase.client.MetaScanner.allTableRegions(MetaScanner.java:324)
at org.apache.hadoop.hbase.client.HRegionLocator.getAllRegionLocations(HRegionLocator.java:89)
at org.apache.hadoop.hbase.util.RegionSizeCalculator.init(RegionSizeCalculator.java:94)
at org.apache.hadoop.hbase.util.RegionSizeCalculator.<init>(RegionSizeCalculator.java:81)
at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.getSplits(TableInputFormatBase.java:256)