
Title might be a little confusing.

Say I have an APIView with a post method. Inside the post method, I instantiate a class that has its own methods. In this case, it's a class that handles uploading to S3, which is something I want to skip when running unit tests.

class SomeView(APIView):
    def post(self, request):
        # do something here
        input1 = some_process(payload_arg1)
        input2 = some_other_process(payload_arg2)
        uploader = S3Uploader()
        s3_response = uploader.upload_with_aux_fxn(input1, input2)
        if s3_response['status_code'] == 200:
            # do something else
            return Response('Good job I did it!', status=200)
        else:
            return Response("noooo you're horrible!", status=400)

Real code has different function calls and responses, obviously.

Now I need to mock `uploader` and `uploader.upload_with_aux_fxn` so I don't actually call S3. How do I mock it?

I tried in my test script

from some_place import S3Uploader

class SomeViewTestCase(TestCase):
    def setUp(self):
        self.client = APIClient()
        uploader_mock = S3Uploader()
        uploader_mock.upload_with_aux_fxn = MagicMock(return_value={'status_code': 200, 'message': 'asdasdad'})
        response = self.client.post(url, payload, format='multipart')

But it still triggered the S3 upload (the file shows up in S3). How do I mock this correctly?
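For reference, here is a minimal, framework-free sketch (hypothetical names, no Django) of why the setUp approach above cannot take effect: the view constructs its own uploader instance, so mocking a method on a different instance created in the test changes nothing. Patching the class itself (or the name the view looks up) is what works.

```python
from unittest import mock  # on Python 2, use `import mock` from the mock package

class Uploader(object):
    def upload(self):
        # stands in for the real S3 call
        return {'status_code': 500}

def view():
    # like the APIView's post method, this builds its own Uploader internally
    return Uploader().upload()

# Mocking a method on a *separate* instance changes nothing for view():
u = Uploader()
u.upload = mock.MagicMock(return_value={'status_code': 200})
unpatched = view()          # still {'status_code': 500}

# Patching the class, so every new instance gets the fake, does work:
with mock.patch.object(Uploader, 'upload', return_value={'status_code': 200}):
    patched = view()        # {'status_code': 200}
```

The patch is reverted automatically when the `with` block exits, so later tests see the real method again.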

EDIT1:

My attempt to patch

def setUp(self):
    self.factory = APIRequestFactory()
    self.view = ViewToTest.as_view()
    self.url = reverse('some_url')


@patch('some_place.S3Uploader', FakeUploader)
def test_upload(self):
    payload = {'some': 'data', 'other': 'stuff'}
    request = self.factory.post(self.url, payload, format='json')
    force_authenticate(request, user=self.user)
    response = self.view(request)

where FakeUploader is

class FakeUploader(object):
    def __init__(self):
        pass

    def upload_something(self, data, arg1, arg2, arg3):
        return {'status_code': 200, 'message': 'unit test', 's3_path': 'unit/test/path.pdf'}

    def download_something(self, s3_path):
        return {'status_code': 200, 'message': '', 'body': 'some base64 stuff'}

Unfortunately this is not successful; it still hits the actual class.
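The usual culprit with `@patch('some_place.S3Uploader', ...)` is the patch target: you have to patch the name in the module where the view *looks it up*, not in the module where the class is defined. A self-contained sketch (stand-in module names built at runtime, no Django) showing the difference:

```python
import sys
import types
from unittest import mock

# Two throwaway modules mimic some_place.py (defines the class) and the
# views module (imports it with "from some_place import S3Uploader").
some_place = types.ModuleType('some_place')

class S3Uploader(object):
    def upload_with_aux_fxn(self, input1, input2):
        return {'status_code': 500}   # stands in for the real S3 call

some_place.S3Uploader = S3Uploader
sys.modules['some_place'] = some_place

views = types.ModuleType('app_views')
views.S3Uploader = some_place.S3Uploader  # effect of the from-import
views.post = lambda: views.S3Uploader().upload_with_aux_fxn(1, 2)
sys.modules['app_views'] = views

# Patching where the class is *defined* does not help -- the views
# module already holds its own reference to the original class:
with mock.patch('some_place.S3Uploader') as fake:
    fake.return_value.upload_with_aux_fxn.return_value = {'status_code': 200}
    r1 = views.post()   # still {'status_code': 500}

# Patching where the class is *used* is what takes effect:
with mock.patch('app_views.S3Uploader') as fake:
    fake.return_value.upload_with_aux_fxn.return_value = {'status_code': 200}
    r2 = views.post()   # {'status_code': 200}
```

Applied to the question, that means patching `'path.to.views_module.S3Uploader'` rather than `'some_place.S3Uploader'`.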

EDIT 2:

I'm using Django 1.11 and Python 2.7, in case people need this info

JChao

4 Answers


I guess the correct approach would be to save the file in a model with a FileField, and then connect Boto to handle the upload in the production scenario.
Take a good look at:
https://docs.djangoproject.com/en/2.2/ref/models/fields/#filefield
and
https://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html#model
This approach preserves Django's default behavior, making things more testable with Django's default test client.
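A minimal sketch of that setup (model, field, and bucket names are placeholders; the storage backend path follows the django-storages docs linked above):

```python
# models.py -- store the upload on a FileField instead of calling S3 directly
from django.db import models

class Document(models.Model):
    file = models.FileField(upload_to='uploads/')

# settings.py -- in production, route FileField storage to S3 via django-storages
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
AWS_STORAGE_BUCKET_NAME = 'my-bucket'  # hypothetical bucket name
```

In tests, Django's default file storage (local filesystem) is used automatically unless the S3 setting is active, so nothing hits S3.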

  • doesn't FileField save locally? What if I have multiple servers to process production data, what's going to happen? Can django ensure all files are accessible? (I've run into issues like this before I believe. Sometimes data saved on the server cannot be accessed because, well, it's on a different server) – JChao Jul 10 '19 at 15:36
  • No. You can set a variable to conditionally load production environment configurations. To upload your media files to S3 set: DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage' . More details at: https://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html – João Victor Monte Jul 10 '19 at 16:32
  • but how can I do it without going through this? All I want is to mock that one class dealing with S3Upload and fake the response. It's a separate class with its own functions. It resides inside a django `APIView`. I just need to mock it. That's all – JChao Jul 10 '19 at 19:15
  • I actually can't think about a maintainable way of doing that without the Django default behavior. – João Victor Monte Jul 10 '19 at 20:30

Take a look at vcrpy. It records requests to an external API once and then replays the answers every time you run your tests. No need to manually mock anything.
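A rough sketch of what that looks like (cassette path and test names are placeholders; `vcr.use_cassette` is vcrpy's standard decorator):

```python
import vcr

class UploadTestCase(TestCase):
    @vcr.use_cassette('fixtures/cassettes/s3_upload.yaml')
    def test_upload(self):
        # First run: performs the real request and records it to the
        # cassette file. Subsequent runs: replays the recorded response
        # without touching the network.
        response = self.client.post(url, payload, format='multipart')
        self.assertEqual(response.status_code, 200)
```

Note that vcrpy intercepts HTTP traffic, so it records the underlying boto/S3 HTTP calls rather than mocking the `S3Uploader` class itself.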

r_black
  • Is there a way to do it with just unittest? I am not allowed to add a new library whenever I want. – JChao Jul 10 '19 at 19:25
  • You should stick with mocking then. IMHO, Recreating this behavior would require too much patching of boto internals, it's not worth it. – r_black Jul 15 '19 at 13:23

Here's an example of how I would mock that S3Uploader in an APITestCase.

from rest_framework import status
from rest_framework.test import APITestCase
from unittest import mock
from unittest.mock import MagicMock

class SomeViewTestCase(APITestCase):

    @mock.patch("path.to.view_file.S3Uploader")
    def test_upload(self, s3_uploader_mock):
        """Test with mocked S3Uploader"""
        concrete_uploader_mock = MagicMock(**{
            "upload_with_aux_fxn.return_value": {"status_code": 200}
        })
        s3_uploader_mock.return_value = concrete_uploader_mock
        response = self.client.post(url, payload, format='multipart')
        self.assertEqual(response.status_code, status.HTTP_200_OK)
        s3_uploader_mock.assert_called_once()
        concrete_uploader_mock.upload_with_aux_fxn.assert_called_once()
A. J. Parr
  • do I need to define `s3_uploader_mock` before using it in `def test_upload`? – JChao Jul 12 '19 at 15:45
  • I tried it, with change in mock.patch with actual path. Also changed the return_value to the value I want. It still hits S3 unfortunately – JChao Jul 12 '19 at 15:56
  • Nah you don't need to define it prior, after you add the `mock.patch` decorator it should include an argument in your test function which is the mocked class/object. I'm a little surprised to hear it's still hitting S3 with this mock in place, that indicates that the `@mock.patch` is not targeting the correct object. Would need to see more view code to diagnose why that would be happening. – A. J. Parr Jul 15 '19 at 00:04
  • You also may want to try targeting the module where you import `S3Uploader` from, e.g. `@mock.patch("path.to.s3_module.S3Uploader")`, instead of the views module. – A. J. Parr Jul 15 '19 at 06:26

Try using MagicMock as below:

from unittest import mock
from unittest.mock import MagicMock
from storages.backends.s3boto3 import S3Boto3Storage


class SomeTestCase(TestCase):
    def setUp(self):
        self.factory = APIRequestFactory()
        self.view = ViewToTest.as_view()
        self.url = reverse('some_url')

    @mock.patch.object(S3Boto3Storage, '_save', MagicMock(return_value='/tmp/somefile.png'))
    def test_upload(self):
        payload = {'some': 'data', 'other': 'stuff'}
        request = self.factory.post(self.url, payload, format='json')
        force_authenticate(request, user=self.user)
        response = self.view(request)
anjaneyulubatta505
  • this doesn't work. it still hits S3 and returns `Unexpected Error: Parameter validation failed: Invalid type for parameter Body` – JChao Jul 12 '19 at 15:43
  • This error is probably due to the fact that I'm using `json` instead of `multipart`, still, the point is not to hit S3, so I shouldn't get this error at all – JChao Jul 12 '19 at 15:57