
My ~/.aws/credentials file looks like this:

[default]
aws_access_key_id = XYZ
aws_secret_access_key = ABC

[testing]
source_profile = default
role_arn = arn:aws:iam::54:role/ad
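
As an aside, a role-assuming profile like [testing] can equivalently live in ~/.aws/config under a [profile testing] heading; the AWS CLI and boto3 read role_arn/source_profile from either file. A sketch of that alternative form, using the same values as above:

[profile testing]
source_profile = default
role_arn = arn:aws:iam::54:role/ad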

I add my remote like this:

dvc remote add --local -v myremote s3://bib-ds-models-testing/data/dvc-test
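
For reference, the same remote settings can also be written with dvc remote modify instead of editing the config file by hand; a sketch, assuming the profile name testing from my credentials file:

dvc remote modify --local myremote profile testing
dvc remote modify --local myremote credentialpath /Users/nyt21/.aws/credentials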

I have made my .dvc/config.local look like this:

['remote "myremote"']
url = s3://bib-ds-models-testing/data/dvc-test
access_key_id = XYZ
secret_access_key = ABC/h2hOsRcCIFqwYWV7eZaUq3gNmS
profile='testing'
credentialpath = /Users/nyt21/.aws/credentials
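
For comparison, a minimal config.local that relies only on the named profile (no hard-coded keys) would presumably look like this, with plain straight quotes in the section name and the profile value unquoted:

['remote "myremote"']
    url = s3://bib-ds-models-testing/data/dvc-test
    profile = testing
    credentialpath = /Users/nyt21/.aws/credentials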

But still, after running dvc push -r myremote I get:

ERROR: unexpected error - Forbidden: An error occurred (403) when calling the HeadObject operation: Forbidden

Update: here is the output of dvc push -v:

2021-07-25 22:40:38,887 DEBUG: Check for update is enabled.
2021-07-25 22:40:39,022 DEBUG: Preparing to upload data to 's3://bib-ds-models-testing/data/dvc-test'
2021-07-25 22:40:39,022 DEBUG: Preparing to collect status from s3://bib-ds-models-testing/data/dvc-test
2021-07-25 22:40:39,022 DEBUG: Collecting information from local cache...
2021-07-25 22:40:39,022 DEBUG: Collecting information from remote cache...                                                                                                                     
2021-07-25 22:40:39,022 DEBUG: Matched '0' indexed hashes
2021-07-25 22:40:39,022 DEBUG: Querying 1 hashes via object_exists
2021-07-25 22:40:39,644 ERROR: unexpected error - Forbidden: An error occurred (403) when calling the HeadObject operation: Forbidden                                                          
------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/s3fs/core.py", line 246, in _call_s3
    out = await method(**additional_kwargs)
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/aiobotocore/client.py", line 154, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/s3fs/core.py", line 1057, in _info
    out = await self._simple_info(path)
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/s3fs/core.py", line 970, in _simple_info
    out = await self._call_s3(
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/s3fs/core.py", line 265, in _call_s3
    raise translate_boto_error(err)
PermissionError: Access Denied

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/s3fs/core.py", line 246, in _call_s3
    out = await method(**additional_kwargs)
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/aiobotocore/client.py", line 154, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/main.py", line 55, in main
    ret = cmd.do_run()
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/command/base.py", line 50, in do_run
    return self.run()
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/command/data_sync.py", line 57, in run
    processed_files_count = self.repo.push(
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/repo/__init__.py", line 51, in wrapper
    return f(repo, *args, **kwargs)
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/repo/push.py", line 44, in push
    pushed += self.cloud.push(objs, jobs, remote=remote)
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/data_cloud.py", line 79, in push
    return remote_obj.push(
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/remote/base.py", line 57, in wrapper
    return f(obj, *args, **kwargs)
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/remote/base.py", line 494, in push
    ret = self._process(
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/remote/base.py", line 351, in _process
    dir_status, file_status, dir_contents = self._status(
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/remote/base.py", line 195, in _status
    self.hashes_exist(
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/remote/base.py", line 145, in hashes_exist
    return indexed_hashes + self.odb.hashes_exist(list(hashes), **kwargs)
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/objects/db/base.py", line 438, in hashes_exist
    remote_hashes = self.list_hashes_exists(hashes, jobs, name)
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/objects/db/base.py", line 389, in list_hashes_exists
    ret = list(itertools.compress(hashes, in_remote))
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/concurrent/futures/_base.py", line 619, in result_iterator
    yield fs.pop().result()
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/concurrent/futures/_base.py", line 444, in result
    return self.__get_result()
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/concurrent/futures/_base.py", line 389, in __get_result
    raise self._exception
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/concurrent/futures/thread.py", line 57, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/objects/db/base.py", line 380, in exists_with_progress
    ret = self.fs.exists(path_info)
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/dvc/fs/fsspec_wrapper.py", line 92, in exists
    return self.fs.exists(self._with_bucket(path_info))
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/fsspec/asyn.py", line 87, in wrapper
    return sync(self.loop, func, *args, **kwargs)
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/fsspec/asyn.py", line 68, in sync
    raise result[0]
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/fsspec/asyn.py", line 24, in _runner
    result[0] = await coro
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/s3fs/core.py", line 802, in _exists
    await self._info(path, bucket, key, version_id=version_id)
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/s3fs/core.py", line 1061, in _info
    out = await self._version_aware_info(path, version_id)
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/s3fs/core.py", line 1004, in _version_aware_info
    out = await self._call_s3(
  File "/Users/nyt21/opt/miniconda3/envs/dvc/lib/python3.8/site-packages/s3fs/core.py", line 265, in _call_s3
    raise translate_boto_error(err)
PermissionError: Forbidden
------------------------------------------------------------
2021-07-25 22:40:39,712 DEBUG: Version info for developers:
DVC version: 2.5.4 (pip)
---------------------------------
Platform: Python 3.8.10 on macOS-10.16-x86_64-i386-64bit
Supports:
        http (requests = 2.26.0),
        https (requests = 2.26.0),
        s3 (s3fs = 2021.6.1, boto3 = 1.18.6)
Cache types: reflink, hardlink, symlink
Cache directory: apfs on /dev/disk3s1s1
Caches: local
Remotes: s3
Workspace directory: apfs on /dev/disk3s1s1
Repo: dvc, git

Having any troubles? Hit us up at https://dvc.org/support, we are always happy to help!
2021-07-25 22:40:39,713 DEBUG: Analytics is enabled.
2021-07-25 22:40:39,765 DEBUG: Trying to spawn '['daemon', '-q', 'analytics', '/var/folders/4x/xhm22wt16gl6m9nvkl9gllkc0000gn/T/tmpo86jdns5']'
2021-07-25 22:40:39,769 DEBUG: Spawned '['daemon', '-q', 'analytics', '/var/folders/4x/xhm22wt16gl6m9nvkl9gllkc0000gn/T/tmpo86jdns5']'

I can upload through Python, though:

import boto3
import os

bucket_name = 'bib-ds-models-testing'

# make boto3 pick up the role-assuming 'testing' profile
os.environ["AWS_PROFILE"] = "testing"
session = boto3.Session()
s3_client = session.client('s3')

# upload a local file into the same prefix the DVC remote points at
s3_client.upload_file('/Users/nyt21/Devel/DVC/test/data/iris.csv',
                      bucket_name,
                      'data/dvc-test/my_iris.csv')
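
Since dvc push fails on HeadObject/ListObjectsV2 rather than on upload, here is a small check I would run to see whether those specific calls are allowed for the testing profile (a sketch; the key below is the my_iris.csv object uploaded above, not something DVC created):

import boto3
import botocore

# use the role-assuming profile explicitly instead of the AWS_PROFILE env var
session = boto3.Session(profile_name='testing')
s3 = session.client('s3')

bucket = 'bib-ds-models-testing'
prefix = 'data/dvc-test/'

# ListObjectsV2 on the remote prefix (the call that returns AccessDenied above)
resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix, MaxKeys=5)
print([obj['Key'] for obj in resp.get('Contents', [])])

# HeadObject on a file known to exist under the prefix (the call that returns 403)
try:
    s3.head_object(Bucket=bucket, Key=prefix + 'my_iris.csv')
    print('head-object OK')
except botocore.exceptions.ClientError as err:
    print('head-object failed:', err.response['Error']['Code'])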

I don't normally use the AWS CLI, but the following also gives an access denied error:

aws s3 ls s3://bib-ds-models-testing/data/dvc-test

An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied

But it works if I add --profile=testing:

aws s3 ls s3://bib-ds-models-testing/data/dvc-test --profile=testing
                       

PRE dvc-test/

Just so you know, the environment variable AWS_PROFILE is already set to 'testing'.
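
For completeness, this is what I mean by the variable being set; a sketch assuming a bash/zsh shell and the same remote as above:

export AWS_PROFILE=testing
dvc push -r myremote -v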

UPDATE

I have tried both AWS_PROFILE='testing' and AWS_PROFILE=testing; neither of them worked.


  • Could you add `-v` to the command you are running and post full log, please? Also, are you able to use awscli (e.g. `aws s3 ls s3://bib-ds-models-testing/data`) with those credentials? – Ruslan Kuprieiev Jul 25 '21 at 18:11
  • Could you try `profile=testing` instead of `profile=‘testing’`? – Ruslan Kuprieiev Jul 26 '21 at 13:04
  • @RuslanKuprieiev thanks for your reply - I updated the question - I tried AWS_PROFILE and profile - neither worked – Areza Jul 27 '21 at 10:55
  • Just making sure: did you remove the `profile=` from the config when testing? The config one has priority over the env one. – Ruslan Kuprieiev Jul 27 '21 at 17:36
  • yes (i removed from config) and set env to testing/'testing' - neither of them worked. just to remind you - in .dvc/config I have url, access_key_id, secret_access_key and credentialpath. – Areza Jul 27 '21 at 22:26
  • I'm not able to reproduce, it might be that you have some problems with permissions (maybe GetObject is missing?). To narrow it down, you could try `aws s3api head-object --bucket my-bucket --key mykey` with your creds and if it fails with 403 - it will likely mean that you don't have GetObject permissions for that bucket/prefix. – Ruslan Kuprieiev Jul 29 '21 at 10:16
  • that's it - "An error occurred (404) when calling the HeadObject operation: Not Found" thanks for helping me out. perhaps worth adding a line to the documentation ? (i hope it is not already there) – Areza Jul 30 '21 at 22:23
