
I have implemented a test function in pytest which loads data from files, casts it into Python objects and provides a new object for each test.

Each of these objects contains a request I need to make to the server and the expected response; the function looks like this:

@pytest.mark.asyncio
@pytest.mark.parametrize('test', TestLoader.load(JSONTest, 'json_tests'))
async def test_json(test: JSONTest, groups: Set[TestGroup], client: httpx.AsyncClient):
    skip_if_not_in_groups(test, groups)

    request = Request(url=test.url, body=test.body.dict())
    response = await client.post(request.url, json=request.body)

    # Assertions down here...

I often send several requests to the same HTTP endpoint with the same body, so the response is the same, but each test asserts different things about it.

Because of that, I thought of implementing an in-memory cache so that within a single test run the same request won't be sent twice.

What I've tried: create a Request object with its own __hash__ implementation and decorate the function with @asyncstdlib.lru_cache, but it didn't seem to work.

# Does not work...

@asyncstdlib.lru_cache
async def send_request(request: Request, client: httpx.AsyncClient):
    return await client.post(request.url, json=request.body)


@pytest.mark.asyncio
@pytest.mark.parametrize('test', TestLoader.load(JSONTest, 'json_tests'))
async def test_json(test: JSONTest, groups: Set[TestGroup], client: httpx.AsyncClient):
    skip_if_not_in_groups(test, groups)

    request = Request(url=test.url, body=test.body.dict())
    response = await send_request(request, client)
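One likely culprit (an assumption on my part, since the Request class isn't shown): for asyncstdlib.lru_cache to recognize two calls as identical, every argument must be hashable, and Request objects that compare equal must also hash alike, i.e. the class needs a consistent __eq__ alongside __hash__. A minimal sketch of such a hashable Request, with a hypothetical from_body constructor that normalizes the body dict into a hashable tuple:

```python
from dataclasses import dataclass

# Sketch of a Request whose equality and hash are derived from its data.
# A frozen dataclass auto-generates consistent __eq__ and __hash__.
@dataclass(frozen=True)
class Request:
    url: str
    body_items: tuple  # the body dict flattened into a hashable form

    @classmethod
    def from_body(cls, url: str, body: dict) -> "Request":
        # Sort the items so two logically equal dicts normalize identically.
        return cls(url=url, body_items=tuple(sorted(body.items())))

r1 = Request.from_body("https://api.example/json", {"a": 1, "b": 2})
r2 = Request.from_body("https://api.example/json", {"b": 2, "a": 1})
print(r1 == r2, hash(r1) == hash(r2))  # True True
```

Note this assumes flat, hashable body values; nested dicts would need recursive normalization. Also, the cache is keyed on (request, client), so the client must be the same object across calls, which it is with the session-scoped fixture.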

The client I'm using, httpx.AsyncClient, also implements __hash__. It comes from a pytest.fixture in conftest.py with a scope of 'session':

# conftest.py

@pytest.fixture(scope='session')
def event_loop(request):
    loop = asyncio.get_event_loop_policy().new_event_loop()
    yield loop
    loop.close()

@pytest.fixture(scope='session')
async def client() -> httpx.AsyncClient:
    async with httpx.AsyncClient() as client:
        yield client
  • What do you mean by it doesn't work? It throws errors or you don't get what you want? From what you've posted I see that you don't return anything from send_request but just do the post request await client.post(request.url, json=request.body) – Simon Hawe Jan 12 '22 at 11:33
  • I'm sorry, it's a typo from uploading the code here. By "not working" I mean I'm sending 7 identical requests and the server gets 7 requests instead of 1; if the response were cached, the `response` object would be available without sending another request to the server – orie Jan 12 '22 at 12:00

1 Answer


Just let go of the opaque third-party cache and cache it yourself. Since you don't need to clean up the cache during a single run, a plain dictionary will work:

_cache = {}


async def send_request(request: Request, client: httpx.AsyncClient):
    if request.url not in _cache:
        _cache[request.url] = await client.post(request.url, json=request.body)
    return _cache[request.url]
  • works great, thank you! – orie Jan 13 '22 at 07:24
  • But isn't it possible that multiple execution flows end up at "await client.post(...)" before the first response is cached, so the benefit of the caching is somewhat reduced? – Akshya11235 May 04 '23 at 21:11
  • This solution uses the URL as the cache key. Ultimately it depends on the API being called whether any caching makes sense at all; in general, `post` requests should not be cached, actually, as each post, in REST terms, creates a new, distinct resource on the server. Other times it might make sense to key not only on the URL but on some fields of request.body as well; one should just pay attention to what they are doing and not copy this blindly. – jsbueno May 05 '23 at 01:39
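Following up on that last comment, here is a sketch (my addition, not part of the original answer) of a composite cache key that includes the body, assuming the body is JSON-serializable; guarding against the concurrent-callers concern raised above would additionally need something like a per-key asyncio.Lock:

```python
import json

_cache = {}

def cache_key(url, body):
    # Serialize with sorted keys so logically equal bodies produce the
    # same key. Assumes the body is JSON-serializable.
    return (url, json.dumps(body, sort_keys=True))

async def send_request(request, client):
    key = cache_key(request.url, request.body)
    if key not in _cache:
        _cache[key] = await client.post(request.url, json=request.body)
    return _cache[key]
```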