This command in Snowflake works perfectly with the new S3 bucket our DevOps team created:
CREATE STAGE IF NOT EXISTS test.business_tools
  URL = 's3://new_bucket/'
  CREDENTIALS = (aws_role = 'arn:aws:iam::xxxxxxx:role/s3-snowflake-blah-dev')
but when I try to run this command:
COPY INTO test_table.testing1
FROM (
  SELECT CURRENT_TIMESTAMP::TIMESTAMP_LTZ, METADATA$FILENAME, METADATA$FILE_ROW_NUMBER,
         $1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12
  FROM @test.business_tools/test_file.csv
)
I see this error:
Error assuming AWS_ROLE. Please verify the role and externalId are configured correctly in your AWS policy.
Our DevOps team said the trust relationship config is OK, and they've updated the IAM policy for the S3 bucket, so I'm confused about what else this could be. See below for the policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetBucketLocation"
      ],
      "Resource": [
        "arn:aws:s3:::new_bucket"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:GetObjectVersion",
        "s3:DeleteObject",
        "s3:DeleteObjectVersion"
      ],
      "Resource": "arn:aws:s3:::new_bucket/*"
    }
  ]
}
CodePudding user response:
The error is caused by an incorrect trust policy configuration. Every time a stage is created or re-created, Snowflake generates a new AWS_EXTERNAL_ID, which must match the "sts:ExternalId" condition in the role's trust policy.
Run:
DESCRIBE STAGE <stage_name>;
It will print the actual AWS_EXTERNAL_ID; make sure that value is the one configured as sts:ExternalId in the trust policy.
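For reference, the trust policy on the IAM role should look roughly like the sketch below. The principal ARN and external ID are placeholders here; the real values come from the DESCRIBE STAGE output (SNOWFLAKE_IAM_USER and AWS_EXTERNAL_ID).
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "<SNOWFLAKE_IAM_USER from DESCRIBE STAGE>"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "sts:ExternalId": "<AWS_EXTERNAL_ID from DESCRIBE STAGE>"
        }
      }
    }
  ]
}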
Doc reference: Steps 3 and 4
Alternatively, Snowflake recommends using a cloud storage integration for external stages. It lets you store the credentials once in Snowflake and reuse them when creating multiple stages.
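A minimal sketch of that approach, reusing the role ARN and bucket from the question (the integration name s3_int is a placeholder):
CREATE STORAGE INTEGRATION s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::xxxxxxx:role/s3-snowflake-blah-dev'
  STORAGE_ALLOWED_LOCATIONS = ('s3://new_bucket/');

-- Returns STORAGE_AWS_IAM_USER_ARN and STORAGE_AWS_EXTERNAL_ID,
-- which go into the role's trust policy instead of the per-stage values.
DESC INTEGRATION s3_int;

-- Re-create the stage to authenticate via the integration rather than CREDENTIALS.
CREATE OR REPLACE STAGE test.business_tools
  URL = 's3://new_bucket/'
  STORAGE_INTEGRATION = s3_int;
Note that creating a storage integration typically requires ACCOUNTADMIN (or a role with the CREATE INTEGRATION privilege), and the external ID tied to the integration stays stable even if stages are dropped and re-created, so the trust policy doesn't need to be updated each time.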