S3 Spark connection in Beijing area

Time: 09-27

Hello, everyone,

I'm a Spark beginner and want to have Spark read data from S3.

Spark is deployed on EC2.

I run the following in spark-shell:
sc.hadoopConfiguration.set("fs.s3a.access.key", "XXXX")
sc.hadoopConfiguration.set("fs.s3a.secret.key", "yyyy")
val textFile = sc.textFile("s3a://...")
textFile.count()

Exception:
15/12/21 13:29:58 INFO S3AFileSystem: Caught an AmazonServiceException, which means your request made it to Amazon S3, but was rejected with an error response for some reason.
15/12/21 13:29:58 INFO S3AFileSystem: Error Message: the Status Code: 403, AWS Service: Amazon S3, AWS Request ID: A5C5253A63B271A6, AWS Error Code: null, AWS Error Message: who
15/12/21 13:29:58 INFO S3AFileSystem: HTTP Status Code: 403
15/12/21 13:29:58 INFO S3AFileSystem: AWS Error Code: null
15/12/21 13:29:58 INFO S3AFileSystem: Error Type: Client
15/12/21 13:29:58 INFO S3AFileSystem: Request ID: A5C5253A63B271A6
15/12/21 13:29:58 INFO S3AFileSystem: Class Name: com.cloudera.com.amazonaws.services.s3.model.AmazonS3Exception
com.cloudera.com.amazonaws.services.s3.model.AmazonS3Exception: Status Code: 403, AWS Service: Amazon S3, AWS Request ID: A5C5253A63B271A6, AWS Error Code: null, AWS Error Message: who, S3 Extended Request ID: 7zaafp60a1u8pt02jaywuwtf1onvd1g5n6xAbutkxtU/M1ZvtSV0pjVzPN2aFOpANTj5R7Cikg=
	at com.cloudera.com.amazonaws.http.AmazonHttpClient.handleErrorResponse(AmazonHttpClient.java:798)
	at com.cloudera.com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:421)
	at com.cloudera.com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:232)
	at com.cloudera.com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:3528)

It looks like the server rejected the request (403).

Am I doing something wrong?
Is this related to S3 in the Beijing region? Does s3a go to the global endpoint by default, so I need to set the Endpoint/Region myself (see my guess below)?
Also, are there any links where I can learn the differences between s3://, s3n://, and s3a://? Which one should I use?
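If it really is an endpoint problem, I imagine the configuration would look roughly like this (just my guess: I'm assuming the fs.s3a.endpoint property and the Beijing endpoint s3.cn-north-1.amazonaws.com.cn are what's needed; I haven't verified it):

// Guess only: point s3a at the Beijing (cn-north-1) endpoint instead of the default global one
sc.hadoopConfiguration.set("fs.s3a.access.key", "XXXX")
sc.hadoopConfiguration.set("fs.s3a.secret.key", "yyyy")
sc.hadoopConfiguration.set("fs.s3a.endpoint", "s3.cn-north-1.amazonaws.com.cn")  // assumed Beijing-region endpoint

val textFile = sc.textFile("s3a://...")  // bucket path elided, same as above
textFile.count()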

Thank you very much!