Every example and use case of pyppeteer opens the browser and closes it immediately, e.g.:

    import asyncio
    from pyppeteer import launch

    async def main():
        browser = await launch()
        page = await browser.newPage()
        await page.goto('http://someurl')
        content = await page.content()
        cookieslist = await page.cookies()
        cookiejar = createCookieJar(cookieslist)  # createCookieJar is my own helper
        await browser.close()

    asyncio.get_event_loop().run_until_complete(main())
What happens if you want to keep the browser open and continuously scrape data? That's easy with Selenium, but pyppeteer doesn't seem to work without asyncio. The other way to make it work would be to save the session, then re-open the browser on a schedule and scrape again, but that feels very inefficient. Has anyone tried this?
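Pyppeteer does require an asyncio event loop, but that doesn't force the browser to close between scrapes: you can launch once and loop inside a single long-running coroutine. A minimal sketch of that pattern (the URL, the `interval` parameter, and the function-local import are my assumptions, not from the question):

```python
import asyncio

async def scrape_forever(url='http://someurl', interval=60):
    # pyppeteer imported here so the snippet can be loaded without it installed
    from pyppeteer import launch

    browser = await launch()          # launch Chromium once
    page = await browser.newPage()    # reuse the same page on every iteration
    try:
        while True:
            await page.goto(url)
            content = await page.content()
            # ... parse/store `content` here ...
            await asyncio.sleep(interval)  # yield to the event loop until the next scrape
    finally:
        await browser.close()          # only closes on error or cancellation

# To run until interrupted:
# asyncio.get_event_loop().run_until_complete(scrape_forever())
```

Because `asyncio.sleep` yields control instead of blocking, the same event loop can also drive other coroutines (more pages, a web server, etc.) while the browser stays open.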