I need to automate scraping some data from our partner's site (the German research institute "Gesellschaft für Konsumforschung", https://startrack.gfkrt.com/).
They don't want to create a clean API for us because "the site was written about ten years ago, it's very, very complicated, the original programmer was fired about ten years ago as well, and the new programmer is too scared to break the whole thing by changing or adding anything." They also won't provide the sources because "it's their intellectual property."
The site is written in ASP.NET Web Forms using Telerik controls.
So, using Fiddler, I watched which headers and form data the browser sends to the server. Then, using the standard HttpWebRequest class, I wrote an application that sends the same headers and form data. It worked perfectly. But while debugging, I was using Fiddler to watch the data my app sends and receives over the SSL tunnel.
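For reference, here is a stripped-down sketch of roughly what my code does. The URL, field names, and values are placeholders; the hidden Web Forms fields such as __VIEWSTATE and __EVENTVALIDATION are copied from the previous response:

```csharp
using System;
using System.IO;
using System.Net;
using System.Text;

class Scraper
{
    static void Main()
    {
        // Placeholder URL -- the real one (and all headers) were copied from Fiddler.
        var request = (HttpWebRequest)WebRequest.Create("https://startrack.gfkrt.com/SomePage.aspx");
        request.Method = "POST";
        request.ContentType = "application/x-www-form-urlencoded";
        request.UserAgent = "Mozilla/5.0 ...";              // same User-Agent the browser sent
        request.CookieContainer = new CookieContainer();    // keeps the ASP.NET session cookies

        // Web Forms postbacks need the hidden state fields captured from the prior GET.
        string body = "__VIEWSTATE=" + Uri.EscapeDataString("...")
                    + "&__EVENTVALIDATION=" + Uri.EscapeDataString("...")
                    + "&SomeField=SomeValue";

        byte[] bytes = Encoding.UTF8.GetBytes(body);
        request.ContentLength = bytes.Length;
        using (Stream s = request.GetRequestStream())
            s.Write(bytes, 0, bytes.Length);

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
            Console.WriteLine(reader.ReadToEnd());
    }
}
```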
But when I stopped using Fiddler, my app broke completely: running without it, the site always returns a page saying a "critical error happened" (not the standard IIS response for a 5xx error, just the site's own standard page with that text).
While Fiddler is running, my app is fine and the site returns no errors; when it isn't, I get the "critical error" page. I tried the same thing with Charles, with the same result: when it runs in the background, everything is OK; when it doesn't, there are errors. In the browser everything is fine too, even without Fiddler/Charles running.
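As I understand it, when Fiddler is running it registers itself as the system proxy, and HttpWebRequest picks that up by default, so my requests are silently routed through it. A sketch like this (127.0.0.1:8888 is the default address for both Fiddler and Charles) reproduces the two cases explicitly:

```csharp
using System.Net;

static class ProxyToggle
{
    // Hypothetical helper to isolate the difference: send the same request either
    // directly or through the local debugging proxy.
    public static HttpWebRequest Create(string url, bool throughProxy)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Proxy = throughProxy
            ? new WebProxy("127.0.0.1", 8888) // works: the proxy relays the traffic
            : null;                           // fails: the request goes straight to the server
        return request;
    }
}
```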
So what can be different between an HttpWebRequest sent directly and an HttpWebRequest sent through Fiddler/Charles? What can I change in my code to make them behave the same?