
The problem is that when I run the code, it doesn't finish.

It just prints the indexes and then stops at the end.

What am I doing wrong?

import asyncio

import httpx
from bs4 import BeautifulSoup

limits = httpx.Limits(max_keepalive_connections=5, max_connections=10)

finalList = []

async def getVouchDetails(link, client):
    ## getting params from link
    response = await client.get('https://egramswaraj.gov.in/paymentVoucherDetail.do',
                                params=params, headers=headers)
    soup = BeautifulSoup(response.content, 'lxml')
    return soup

async def poolVouch(link, client):
    print(links.index(link))
    soup = await getVouchDetails(link, client)
    ### extract data from soup
    finalList.append([])  ## add required data

async def main(links):
    async with httpx.AsyncClient(limits=limits) as client:
        tasks = []
        for link in links:
            tasks.append(asyncio.ensure_future(poolVouch(link, client)))
        await asyncio.gather(*tasks)

## links contains 100k links
asyncio.run(main(links))

Is it still running and I just have to wait, or is something else wrong?


1 Answer


1) Replace

tasks.append(asyncio.ensure_future(poolVouch(link, client)))

with

tasks.append(asyncio.create_task(poolVouch(link, client)))
await asyncio.sleep(0)
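
Applied to the main() from the question, the loop would look roughly like this (a sketch of the suggestion above, not tested against the original code; links, poolVouch, and limits are as defined in the question):

async def main(links):
    async with httpx.AsyncClient(limits=limits) as client:
        tasks = []
        for link in links:
            tasks.append(asyncio.create_task(poolVouch(link, client)))
            # yield control to the event loop so the tasks created
            # so far can start running before the next iteration
            await asyncio.sleep(0)
        await asyncio.gather(*tasks)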
  • Can you explain it? – Coding Dog Jun 18 '23 at 14:25
  • If you're asking about ensure_future -> create_task, it mostly depends on the version of Python you're using...(I haven't used ensure_future for a while) Regarding await asyncio.sleep(0), the point is that if there is no await instruction between tasks.append(asyncio.create_task(poolVouch(link, client))) and await asyncio.gather(*tasks), the execution context won't switch to any of the created tasks until await asyncio.gather(*tasks) is reached. I also agree with the comment that aiohttp is a great tool. – LtGenFlower Jun 18 '23 at 15:58
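
To make the scheduling point from that comment concrete, here is a small self-contained sketch (the worker/demo names are made up for illustration): without an await between task creation and gather, every "created" line prints before any task starts; adding await asyncio.sleep(0) lets each task start as soon as it is created.

import asyncio

async def worker(i):
    print(f"task {i} started")
    await asyncio.sleep(0.1)

async def demo(yield_between):
    tasks = []
    for i in range(3):
        tasks.append(asyncio.create_task(worker(i)))
        print(f"created task {i}")
        if yield_between:
            # give the event loop a chance to start the task just created
            await asyncio.sleep(0)
    await asyncio.gather(*tasks)

# Without sleep(0): created 0, created 1, created 2, then all tasks start.
# With sleep(0): each task starts immediately after it is created.
asyncio.run(demo(yield_between=False))
asyncio.run(demo(yield_between=True))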