I have an idea and want to see whether it is possible to implement. I want to parse a website (copart.com) that shows a different, large list of cars each day, with a description for each car. Daily, I am tasked with going over each list (each containing hundreds of cars) and selecting every car that meets certain requirements (brand, year, etc.). I want to know whether it is possible to create a tool that parses these lists automatically and, in doing so, selects the cars that meet my criteria. I was thinking of something like a website scraper such as ParseHub, but I am not trying to extract data. I simply want a tool that goes over the website and automatically clicks the "select" button on each car that meets my criteria. This would save me enormous amounts of time daily. Thanks.
- In this case, Selenium is what you need; it simulates browser behaviour. – K.Andy Wang Feb 02 '18 at 05:20
- Well, if you want to select cars from the website based on criteria, you first have to extract the relevant data from the page so you can compare it against your criteria... How you want to proceed next is another question. If you want a list of URLs for the selected cars, Scrapy is the right tool to use. – Tomáš Linhart Feb 02 '18 at 16:28
2 Answers
I think you can use Selenium for this task. It opens the web browser automatically, and you can locate each element with an XPath expression and click its select button. I've done this before for a home-utility website.
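A minimal sketch of what that could look like. The XPath selectors, field names, and criteria below are placeholders invented for illustration — you would have to adapt them to Copart's actual page structure. The WebDriver is passed in as an argument so the filtering logic stays separate and testable on its own:

```python
def matches_criteria(car, criteria):
    """Return True if a car dict satisfies every requirement in criteria."""
    return (car.get("brand") in criteria["brands"]
            and criteria["min_year"] <= car.get("year", 0) <= criteria["max_year"])

def select_matching_cars(driver, criteria):
    """Walk every listing row and click its select button when the car matches.

    `driver` is a Selenium WebDriver; the XPath expressions here are
    hypothetical and must be adjusted to the real listing markup.
    """
    rows = driver.find_elements("xpath", "//tr[contains(@class, 'lot-row')]")
    clicked = 0
    for row in rows:
        car = {
            "brand": row.find_element("xpath", ".//td[@class='brand']").text,
            "year": int(row.find_element("xpath", ".//td[@class='year']").text),
        }
        if matches_criteria(car, criteria):
            row.find_element("xpath", ".//button[contains(., 'Select')]").click()
            clicked += 1
    return clicked
```

You would create the driver with `webdriver.Chrome()` (or Firefox) from the `selenium` package, log in, navigate to the daily list, and then call `select_matching_cars(driver, criteria)`.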

Omka Bambara
Scrapy is a good tool designed for this. Depending on how the pages are rendered, you may or may not need an additional tool like Selenium. Submit or "select" buttons are often just links that can be followed with plain HTTP requests, without a browser-emulation tool. If you could post some sample HTML, we could give you more specifics.
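To illustrate the point about select buttons often being plain links: a small standard-library sketch that pulls the `href` out of each hypothetical "Select" anchor. The markup here is invented for illustration — once you have these URLs, Scrapy (or even plain `urllib`) could request them directly instead of clicking anything in a browser:

```python
from html.parser import HTMLParser

class SelectLinkParser(HTMLParser):
    """Collect the href of every <a> tag whose class contains 'select'."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "select" in attrs.get("class", ""):
            self.links.append(attrs.get("href"))

# Hypothetical sample of what two listing rows might contain.
sample = """
<tr><td>Toyota Camry 2015</td>
    <td><a class="btn select" href="/select?lot=123">Select</a></td></tr>
<tr><td>Ford Focus 2009</td>
    <td><a class="btn select" href="/select?lot=456">Select</a></td></tr>
"""

parser = SelectLinkParser()
parser.feed(sample)
print(parser.links)  # the URLs a crawler could follow to "select" each lot
```

In a Scrapy spider you would express the same idea with a selector like `response.css("a.select::attr(href)")` and `response.follow(...)` on each result.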

NFB