It's probably not the best idea to have one robot do everything: setting filters, parsing the result list, clicking on each result, extracting the data, going back, and so on. Instead, divide and conquer: create multiple sequences/workflows, each one with a single specific task in mind.
Here's how I would tackle it:

1. A dispatcher workflow that sets the filters, walks through the result pages, and adds each result (e.g. its URL) as a queue item in Orchestrator.
2. A performer workflow that pulls one queue item at a time, opens the result, extracts the data, and marks the transaction as completed (or failed, so it can be retried).
This has the added benefit that you can run multiple robots extracting data at the same time, potentially speeding up scraping significantly. The Queues and Transactions feature ensures that each result is visited only once and that multiple robots never process the same item.
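To illustrate the idea outside of UiPath, here is a minimal Python sketch of the same dispatcher/performer pattern (this is not the Orchestrator API; the URLs and the scraping step are placeholders). A thread-safe queue hands each item to exactly one worker, so several workers can run in parallel without duplicating work:

```python
import queue
import threading

# Thread-safe queue: each item is handed to exactly one worker,
# mirroring how Orchestrator queue items become transactions.
work_queue = queue.Queue()

def dispatcher(result_urls):
    """Dispatcher: collect the result links once and enqueue them."""
    for url in result_urls:
        work_queue.put(url)

def performer(worker_id):
    """Performer: take one transaction at a time and process it."""
    while True:
        try:
            url = work_queue.get(timeout=1)  # each item is dequeued exactly once
        except queue.Empty:
            return  # no transactions left
        print(f"robot {worker_id} scraping {url}")  # placeholder for the real extraction
        work_queue.task_done()

# Hypothetical result links produced by the filtering/search workflow.
dispatcher([f"https://example.com/result/{i}" for i in range(10)])

# Several "robots" processing the same queue in parallel.
workers = [threading.Thread(target=performer, args=(i,)) for i in range(3)]
for w in workers:
    w.start()
for w in workers:
    w.join()
```

Adding more robots is then just a matter of starting more performers against the same queue; the queue, not the workflow logic, guarantees uniqueness.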
Edit: I'd recommend starting from the ReFramework (Robotic Enterprise Framework) template, as it already implements this transaction-based processing pattern, including logging and retry handling.