I am trying to list the buckets located in another GCP project, essentially using the code from here: https://cloud.google.com/docs/authentication/production#auth-cloud-explicit-python
To do this I need to authenticate with a JSON key file.
Unfortunately I can't resolve the error caused by how I reference the JSON file in the function. This is the code I use:
def explicit(argument):
    from google.cloud import storage

    # Explicitly use service account credentials by specifying the
    # private key file.
    storage_client = storage.Client.from_service_account_json(
        'gs://PROJECT/PATH/service_account.json')

    # Make an authenticated API request
    buckets = list(storage_client.list_buckets())
    print(buckets)
The error I am getting:
Traceback (most recent call last):
  File "/layers/google.python.pip/pip/lib/python3.9/site-packages/flask/app.py", line 2073, in wsgi_app
    response = self.full_dispatch_request()
  File "/layers/google.python.pip/pip/lib/python3.9/site-packages/flask/app.py", line 1518, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/layers/google.python.pip/pip/lib/python3.9/site-packages/flask/app.py", line 1516, in full_dispatch_request
    rv = self.dispatch_request()
  File "/layers/google.python.pip/pip/lib/python3.9/site-packages/flask/app.py", line 1502, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)
  File "/layers/google.python.pip/pip/lib/python3.9/site-packages/functions_framework/__init__.py", line 99, in view_func
    return function(request._get_current_object())
  File "/workspace/main.py", line 6, in explicit
    storage_client = storage.Client.from_service_account_json(
  File "/layers/google.python.pip/pip/lib/python3.9/site-packages/google/cloud/client.py", line 106, in from_service_account_json
    with io.open(json_credentials_path, "r", encoding="utf-8") as json_fi:
FileNotFoundError: [Errno 2] No such file or directory: 'gs://PROJECT/PATH/service_account.json'
How can I properly reference the file so the function executes?
CodePudding user response:
Google's client libraries all support Application Default Credentials (ADCs), which, very helpfully, includes finding credentials automatically. You should take advantage of this.
When your code runs on the Cloud Functions service, it runs as a Service Account identity. You're probably running the Cloud Function with the default Cloud Functions Service Account, but you can specify a user-defined Service Account when you deploy or update the Cloud Function.
NOTE It's better to run your Cloud Functions using user-defined, non-default Service Accounts.
When the Cloud Function tries to create a Cloud Storage client, it does so using its identity (the Service Account).
A solution is to:
- Utilize ADCs and have the Cloud Storage client be created using these, i.e.
storage_client = storage.Client()
(see the fuller sketch after this list)
- Adjust the Cloud Function's Service Account to have the correct IAM roles/permissions for Cloud Storage.
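For reference, here's a minimal sketch of the function rewritten along these lines. OTHER_PROJECT is a placeholder for the project that owns the buckets; note that list_buckets accepts a project argument, and listing buckets (as opposed to objects) requires the storage.buckets.list permission on that project.

from google.cloud import storage

def explicit(request):
    # No key file is referenced anywhere: with ADCs, the client picks
    # up the Cloud Function's runtime Service Account automatically.
    storage_client = storage.Client()

    # list_buckets targets the client's own project by default; pass
    # project= to list buckets in the other project instead.
    buckets = list(storage_client.list_buckets(project="OTHER_PROJECT"))
    print(buckets)
    return str(buckets)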
NOTE You should probably adjust the Bucket's IAM Policy rather than a Project's. Because the Bucket lives in a different project, if you do adjust a Project's IAM Policy instead, make sure it is the Bucket's (!) Project's IAM Policy and not the Cloud Function's Project's IAM Policy.
See IAM roles for Cloud Storage: predefined roles, and perhaps roles/storage.objectViewer, as this includes the storage.objects.list permission that you need.
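For illustration, a hedged sketch of adding such a binding to the Bucket's IAM Policy using the same Python client (BUCKET_NAME and SA_EMAIL are placeholders; the console or gsutil iam ch work just as well):

from google.cloud import storage

# Sketch only: grants the Function's runtime SA roles/storage.objectViewer
# on the bucket itself. Run this as an identity that is allowed to set
# the bucket's IAM policy.
client = storage.Client()
bucket = client.bucket("BUCKET_NAME")

policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({
    "role": "roles/storage.objectViewer",
    "members": {"serviceAccount:SA_EMAIL"},
})
bucket.set_iam_policy(policy)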
CodePudding user response:
You're trying to list buckets located in another GCP project while authorizing your client with a Service Account (SA) key file that you also pull from Google Cloud Storage (GCS).
I'd recommend a different security pattern to resolve this. Essentially, use a single SA that has permission to invoke your Cloud Function and permission to list the contents of your GCS bucket. This circumvents the need to pull a key file from GCS while still maintaining security, since a bad actor would still require access to your GCP account. The following steps show you how:
- Create a Service Account (SA) with the role Cloud Functions Invoker in your first project (the Cloud Function's project)
- Grant your user account or user group the Service Account User role on this new SA
- Change your Cloud Function to use this newly created SA as its runtime SA (the runtime SA is set when you deploy or update the function)
- Grant the Cloud Function's runtime SA a GCS role in the second project, or on the GCS bucket itself
In this pattern, you "actAs" the Cloud Function runtime SA, allowing you to invoke the Cloud Function. Since the runtime SA already has adequate permissions on your GCS bucket in the other project, there's no need for an SA key file at all.
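If you want to sanity-check which SA your function actually runs as, here's a small hedged sketch: the Cloud Functions runtime exposes a metadata server you can query for the runtime SA's email (assumes the requests package is in your function's dependencies).

import requests

def whoami(request):
    # Sketch only: asks the runtime's metadata server for the email of
    # the Service Account the function is currently executing as.
    resp = requests.get(
        "http://metadata.google.internal/computeMetadata/v1/"
        "instance/service-accounts/default/email",
        headers={"Metadata-Flavor": "Google"},
    )
    return resp.text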