It seems that every request I send goes out without cookies.
When I listen to the onResourceRequested event like this:
this.page.onResourceRequested = function (requestData, networkRequest) {
    // requestData.headers is the array of headers PhantomJS reports
    utils.dump(requestData);
};
Every request has headers of the same form:
"headers": [
    {
        "name": "User-Agent",
        "value": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36"
    },
    {
        "name": "Accept",
        "value": "text/css,*/*;q=0.1"
    },
    {
        "name": "Referer",
        "value": "https://some_site.com/page"
    }
],
I never see a Cookie header, even though the cookies are supposed to be there.
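To double-check that, I use a small plain-JavaScript helper (getCookieHeader is a name I made up) that scans a PhantomJS-style headers array for a Cookie entry:

```javascript
// Hypothetical helper: returns the value of the Cookie header from a
// PhantomJS-style headers array ([{name, value}, ...]), or null if absent.
function getCookieHeader(headers) {
    var match = headers.filter(function (h) {
        return h.name.toLowerCase() === "cookie";
    })[0];
    return match ? match.value : null;
}

// Example with headers like the ones dumped above:
var headers = [
    { name: "User-Agent", value: "Mozilla/5.0 ..." },
    { name: "Accept", value: "text/css,*/*;q=0.1" },
    { name: "Referer", value: "https://some_site.com/page" }
];
console.log(getCookieHeader(headers)); // null -> no Cookie header sent
```

Running this inside the onResourceRequested handler always prints null for me.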
When I inspect the cookies by running:
utils.dump(this.page.cookies);
I get a list with many cookie entries.
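To rule out a domain mismatch (cookies existing but never applying to the requested host), I also check the entries with a small plain-JavaScript sketch (cookiesForHost is a hypothetical helper, and the domain-matching logic is my own simplification of the cookie rules):

```javascript
// Hypothetical helper: lists which cookies would apply to a given host,
// given a PhantomJS-style cookies array ([{name, value, domain, ...}]).
// Simplified matching: exact host match, or host is a subdomain of the
// cookie's domain (leading dot on the domain is ignored).
function cookiesForHost(cookies, host) {
    return cookies.filter(function (c) {
        var domain = c.domain.replace(/^\./, "");
        return host === domain ||
               host.slice(-(domain.length + 1)) === "." + domain;
    });
}

var cookies = [
    { name: "sessionid", value: "abc", domain: ".some_site.com" },
    { name: "other", value: "x", domain: "cdn.example.com" }
];
console.log(cookiesForHost(cookies, "some_site.com").length); // 1
```

The session cookies do match the site's domain, so that does not seem to be the problem.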
I think this is causing errors in my scraping script.
Any thoughts? Thanks.
EDIT
I am trying to make a POST request to download a file. I can log into the site, browse a few pages, and reach the download page, but when I send the request I get this message: "Error: To register with the site you have to enable your browser to accept cookies."
This is what makes it confusing: I can log in and browse the site (so some cookies must be passed around), but I can't download (so a cookie may be missing from that one request).
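If the cookie really is missing from that one request, I am considering attaching it by hand. A minimal sketch (buildCookieHeader is a made-up name) that serializes PhantomJS-style cookie entries into a single Cookie header value I could add to the POST request's headers:

```javascript
// Hypothetical workaround: turn cookie entries ([{name, value, ...}])
// into one "name=value; name=value" string for a Cookie request header.
function buildCookieHeader(cookies) {
    return cookies.map(function (c) {
        return c.name + "=" + c.value;
    }).join("; ");
}

var cookies = [
    { name: "sessionid", value: "abc123", domain: ".some_site.com" },
    { name: "csrftoken", value: "xyz", domain: ".some_site.com" }
];
console.log(buildCookieHeader(cookies)); // "sessionid=abc123; csrftoken=xyz"
```

I don't know yet whether sending the header manually is the right fix, or whether the cookies are in fact being sent and just not shown by the dump.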