I am trying to build a web crawler in Python using gRPC. The crawling functions live in the server file, and the client collects a list of URLs from the user and sends it to the server for the scraping part. Each URL takes about 25-30 seconds to scrape, so I want to use multiprocessing to speed things up, i.e. extract information from N URLs on N cores in parallel. How should I proceed? Say I have 4 cores; which of the following makes sense (a stripped-down sketch of my current setup is below)?

1. Implement 4 calls from the client to the same server, one on each of the 4 cores.
2. Create a separate server-client pair on each core.
3. Create 4 server instances on different channel ports and run them on the 4 cores.
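For reference, here is a stripped-down sketch of what my setup roughly looks like. The service and message names (`Crawler`, `ScrapeRequest`, etc.) are placeholders rather than my exact definitions, and `crawler_pb2`/`crawler_pb2_grpc` stand in for the modules generated from my .proto file:

```python
# server.py -- assumes crawler_pb2 / crawler_pb2_grpc were generated from a
# hypothetical crawler.proto along these lines (not my exact definitions):
#
#   service Crawler {
#     rpc Scrape (ScrapeRequest) returns (ScrapeReply);
#   }
#   message ScrapeRequest { string url = 1; }
#   message ScrapeReply   { string content = 1; }

from concurrent import futures
import urllib.request

import grpc
import crawler_pb2
import crawler_pb2_grpc


def scrape(url):
    # stand-in for my real scraping logic, which takes ~25-30 s per URL
    return urllib.request.urlopen(url).read().decode("utf-8", "ignore")


class CrawlerServicer(crawler_pb2_grpc.CrawlerServicer):
    def Scrape(self, request, context):
        return crawler_pb2.ScrapeReply(content=scrape(request.url))


def serve():
    # the thread pool size caps how many RPCs the server handles at once
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=4))
    crawler_pb2_grpc.add_CrawlerServicer_to_server(CrawlerServicer(), server)
    server.add_insecure_port("[::]:50051")
    server.start()
    server.wait_for_termination()


if __name__ == "__main__":
    serve()
```

```python
# client.py -- takes URLs from the user and sends them to the server one
# at a time, which is why the whole run is so slow right now
import grpc
import crawler_pb2
import crawler_pb2_grpc


def scrape_one(url):
    with grpc.insecure_channel("localhost:50051") as channel:
        stub = crawler_pb2_grpc.CrawlerStub(channel)
        reply = stub.Scrape(crawler_pb2.ScrapeRequest(url=url))
        return reply.content


if __name__ == "__main__":
    urls = input("Enter URLs separated by spaces: ").split()
    for url in urls:  # sequential: ~25-30 s per URL
        print(len(scrape_one(url)), "bytes from", url)
```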
I am new to all of this, so I could use any kind of help.
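To make option 1 concrete, here is roughly what I had in mind: the client uses a `multiprocessing.Pool` to issue 4 RPCs at once, and each worker process opens its own channel, since (as far as I understand) gRPC channels should not be shared across a fork. I am not sure this is the right pattern:

```python
# parallel_client.py -- rough sketch of option 1: N client calls in N processes
import multiprocessing as mp

import grpc
import crawler_pb2
import crawler_pb2_grpc


def scrape_one(url):
    # each worker creates its own channel instead of inheriting one
    with grpc.insecure_channel("localhost:50051") as channel:
        stub = crawler_pb2_grpc.CrawlerStub(channel)
        return stub.Scrape(crawler_pb2.ScrapeRequest(url=url)).content


if __name__ == "__main__":
    urls = input("Enter URLs separated by spaces: ").split()
    with mp.Pool(processes=4) as pool:  # one worker per core
        results = pool.map(scrape_one, urls)
    for url, content in zip(urls, results):
        print(len(content), "bytes from", url)
```

With this, though, I don't know whether a single server process would actually use 4 cores for the scraping work, which is why I am also considering options 2 and 3.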