
Storage Object Get Access Error #105

Open
rgreinho opened this issue May 15, 2020 · 17 comments
@rgreinho

rgreinho commented May 15, 2020

TL;DR

When using the cloudbuild workflow, the action job reports as failing due to a storage access error. However the job is correctly triggered in cloudbuild and completes successfully.

My problem seems similar to the one described in #49, but as I needed some clarification I opened this issue.

Expected behavior

The workflow would complete successfully

Observed behavior

The error message:

Build and push image to Google Container Registry (4s)

Run gcloud builds submit \
  gcloud builds submit \
    --quiet \
    --tag "gcr.io/$PROJECT_ID/$REPOSITORY_NAME:$GITHUB_SHA"
  shell: /bin/bash -e {0}
  env:
    PROJECT_ID: ***
    CLOUDSDK_CORE_PROJECT: ***
    REPOSITORY_NAME: ***
    CLOUDSDK_METRICS_ENVIRONMENT: github-actions-setup-gcloud
Creating temporary tarball archive of 148 file(s) totalling 8.9 MiB before compression.
Some files were not included in the source upload.

Check the gcloud log [/home/runner/.config/gcloud/logs/2020.05.15/21.07.45.085431.log] to see which files and the contents of the
default gcloudignore file used (see `$ gcloud topic gcloudignore` to learn
more).

Uploading tarball of [.] to [gs://***_cloudbuild/source/1589576865.25-e65b89df2a91419fbff076630958d5ee.tgz]
Created [https://cloudbuild.googleapis.com/v1/projects/***/builds/59a1f2ff-beee-4f1a-8147-504efe4014fd].
Logs are available at [https://console.cloud.google.com/cloud-build/builds/59a1f2ff-beee-4f1a-8147-504efe4014fd?project=192068846044].
ERROR: (gcloud.builds.submit) HTTPError 403: <?xml version='1.0' encoding='UTF-8'?><Error><Code>AccessDenied</Code><Message>Access denied.</Message><Details>*****@*****.iam.gserviceaccount.com does not have storage.objects.get access to the Google Cloud Storage object.</Details></Error>
##[error]Process completed with exit code 1.

Following the logs link I can see that everything ran fine in spite of the error:

[screenshot: Cloud Build console showing the build completed successfully]

Reproduction

Action YAML

name: ci

on:
  pull_request:
    types:
      - opened
      - synchronize
      - reopened
  push:
    branches:
      - master
    tags:
      - "[0-9]+.[0-9]+.[0-9]+"
env:
  PROJECT_ID: ${{ secrets.PROJECT_ID }}
  CLOUDSDK_CORE_PROJECT: ${{ secrets.PROJECT_ID }}

jobs:
  check:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v2
      - name: Retrieve the repository name
        run: echo ::set-env name=REPOSITORY_NAME::$(echo "$GITHUB_REPOSITORY" | awk -F / '{print $2}')
        shell: bash
      - name: setup gcloud CLI
        uses: GoogleCloudPlatform/github-actions/setup-gcloud@master
        with:
          service_account_key: ${{ secrets.GCP_SA_KEY }}
          project_id: ${{ secrets.PROJECT_ID }}
      - name: Build and push image to Google Container Registry
        run: |-
          gcloud builds submit \
            --quiet \
            --tag "gcr.io/$PROJECT_ID/$REPOSITORY_NAME:$GITHUB_SHA"
  • I have a dedicated service account for my project
  • I use a JSON key to authenticate
  • I assigned the following roles to this service account:
    • roles/cloudbuild.builds.builder
    • roles/cloudbuild.serviceAgent
    • roles/compute.serviceAgent
    • roles/container.clusterAdmin
    • roles/container.serviceAgent
    • roles/storage.admin

Additional information

As a workaround, I added a JSON key to the service account that GCP created automatically (ID-compute@developer.gserviceaccount.com), used it to authenticate this action, and it worked like a charm.

EDIT(June 21st 2020):

  • My service account did not have the role roles/viewer.
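The fix described in this edit can be applied from the CLI; a minimal sketch, assuming a hypothetical project ID and service-account email:

```shell
# Hypothetical identifiers -- substitute your own values.
PROJECT_ID="my-project"
SA_EMAIL="my-builder@my-project.iam.gserviceaccount.com"

# Grant the project-level Viewer role that was missing.
# Note: this is broad; later comments in this thread discuss
# narrower alternatives (e.g. a dedicated logs bucket).
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="serviceAccount:$SA_EMAIL" \
  --role="roles/viewer"
```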
@rgreinho rgreinho added the bug label May 15, 2020
@agray-gfs

agray-gfs commented May 16, 2020

This page might help explain the error or at least point towards workarounds.

The error stopped happening when I tried either of that page's options for viewing logs.

I'm still surprised I ran into this issue. It feels like the Cloud Build documentation is missing some required grants.

@bandwiches

bandwiches commented Jun 9, 2020

I ran into this today as well, and as a beginner with Google Cloud this was a huge setback. I sat here troubleshooting redeploy after redeploy until this morning, when I found the images in my Container Registry and realized my builds were not failing after all.

> I'm still surprised I ran into this issue. It feels like the Cloud Build documentation is missing some required grants.

The documentation for GCP and any of its services is 50/50 at best. I've been scouring GCP documentation for the last month for various things, and most of their docs either reference one small piece of the overall workflow very vaguely or are missing the relevant, helpful documentation needed to actually get something working. Service Accounts and IAM are the worst offenders. I've been leaving a trail of feedback.

@rgreinho
Author

rgreinho commented Jun 21, 2020

Thanks to @agray-22's comment, I can confirm that explicitly adding the roles/viewer role to my custom service account solved the issue.

It is not clear to me why roles/storage.admin is not enough, but at least I can now use custom service accounts for my pipelines.

@alexfoxy

alexfoxy commented Jun 22, 2020

I can also confirm adding the Viewer role to my service account fixed the issue:

[screenshot: IAM page showing the Viewer role added to the service account]

@morphalus

morphalus commented Jul 9, 2020

You should not grant the Viewer role, which is far too broad (it is project-wide). As @agray-22 explained, setting logsBucket in cloudbuild.yaml is a better solution (then you only have to set the right permissions on that bucket).
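For reference, a sketch of that narrower approach, with hypothetical bucket and service-account names:

```shell
# Hypothetical names -- substitute your own.
BUCKET="gs://my-build-logs"
SA="my-builder@my-project.iam.gserviceaccount.com"

# 1. Point Cloud Build at a logs bucket you control by adding a
#    top-level field to cloudbuild.yaml:
#
#      logsBucket: 'gs://my-build-logs'
#
# 2. Grant the build service account access to that bucket only,
#    instead of a project-wide role (gsutil accepts the shorthand
#    role name "admin" for roles/storage.admin).
gsutil iam ch "serviceAccount:${SA}:admin" "$BUCKET"
```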

@alcortesm

alcortesm commented Sep 18, 2020

I had a similar problem to the OP's, but in my case it was because I gave my service account the Storage Object Admin role instead of the Storage Admin role. Please double-check that; the names of the roles are very similar.

@averikitsch averikitsch self-assigned this Oct 30, 2020
@antoinfive

antoinfive commented Jan 14, 2021

> I can also confirm adding the Viewer role to my service account fixed the issue:

Can also confirm this fixed our issue.

@mirzawaseembaig

mirzawaseembaig commented Jan 15, 2021

Thanks

@bourliam

bourliam commented Mar 2, 2021

Hello, I can also confirm that the Viewer role solved this issue for us! Thanks a lot!

@spothala

spothala commented Jun 1, 2021

Can also confirm that the Viewer role solved this issue for us, but it's a bit strange, as we already have the Storage Admin role.

@setyven

setyven commented Aug 25, 2021

I can get this to work without the Viewer role by granting storage.objects.get at the project level, on top of adding the Storage Admin role at the bucket level. While this is still project-wide, at least it's a lot less broad than the Viewer role, as some have pointed out.

That said, this looks like a documentation gap on GCP's side rather than an issue with this action.
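One way to sketch that narrower grant is a custom role carrying only storage.objects.get; the role ID, project, and account names below are hypothetical:

```shell
# Hypothetical identifiers -- substitute your own.
PROJECT_ID="my-project"
SA_EMAIL="my-builder@my-project.iam.gserviceaccount.com"

# Custom role with only the permission the error message names.
gcloud iam roles create buildLogReader \
  --project="$PROJECT_ID" \
  --title="Build Log Reader" \
  --permissions=storage.objects.get

# Bind it to the service account at the project level.
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="serviceAccount:$SA_EMAIL" \
  --role="projects/$PROJECT_ID/roles/buildLogReader"
```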

@obahareth

obahareth commented Sep 15, 2021

Adding the viewer role on the service account did not fix this issue for me.

@jos-

jos- commented Dec 7, 2021

Giving the Cloud Build service account a Storage Object Viewer role worked for me.

The Viewer role gives more permissions than required.

@sethvargo sethvargo added docs and removed bug labels Dec 23, 2021
@ferraricharles

ferraricharles commented Feb 4, 2022

Giving a service account the Viewer privilege on the whole project cannot be taken as a solution for a production environment; that defeats the whole purpose of having service accounts.

@jpaik

jpaik commented May 27, 2022

If anyone's still having this issue (the service account does not have access to the Google Cloud Storage object) and doesn't want to grant the project-wide Viewer role in production environments, I was able to resolve it with the following steps:

  1. In Cloud Storage (object storage), create a new bucket with its Access Control set to Fine-grained (object-level ACLs).
  2. Get the bucket ID (located in the Configuration tab under gsutil URI); it'll look like gs://BUCKET_ID.
  3. Give your service account the Storage Object Viewer role on that new bucket.
  4. In your GitHub Actions YAML file, add this flag to the gcloud builds script: --gcs-log-dir "gs://$BUCKET_ID". This specifies the log bucket for the Cloud Build action as referenced here; when using the CLI, the flag is --gcs-log-dir as referenced here.

Add a new secret for BUCKET_ID containing the bucket ID of the Cloud Storage bucket to your GitHub secrets, and then your YAML should look something like this:

...
- name: Build and Push
  run: |-
    gcloud builds submit \
      --quiet \
      --gcs-log-dir "gs://$BUCKET_ID" \
      --tag "gcr.io/$PROJECT_ID/$REPOSITORY_NAME:$GITHUB_SHA"
...
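Steps 1–3 above can be sketched from the CLI as well; the bucket and account names here are hypothetical:

```shell
# Hypothetical names -- substitute your own.
BUCKET="gs://my-build-logs-bucket"
SA="my-builder@my-project.iam.gserviceaccount.com"

# 1. Create the bucket with fine-grained (object-level) access
#    control, i.e. uniform bucket-level access turned off.
gsutil mb -b off "$BUCKET"

# 3. Grant the service account Storage Object Viewer on that
#    bucket only (gsutil shorthand for roles/storage.objectViewer).
gsutil iam ch "serviceAccount:${SA}:objectViewer" "$BUCKET"
```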
