I have an ASP.NET website mixed with classic ASP pages (we are working on converting it fully to .NET). I recently upgraded from .NET 1.1 to .NET 4.0 and switched to the integrated pipeline in IIS 7.
Since these changes, ELMAH has been reporting errors from classic ASP pages with practically no detail (and status code 404):
System.Web.HttpException (0x80004005)
at System.Web.CachedPathData.ValidatePath(String physicalPath)
at System.Web.HttpApplication.PipelineStepManager.ValidateHelper(HttpContext context)
But when I request the page myself, no error occurs. Judging by the user agent string, all of these errors showing up in ELMAH are triggered by the Googlebot crawler.
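For completeness, this is roughly how I'd replay the crawler's request from code, spoofing Googlebot's user agent, to rule out UA-dependent behavior (the URL is just a placeholder for one of the affected .asp pages):

using System;
using System.Net;

class ReplayGooglebotRequest
{
    static void Main()
    {
        // Placeholder URL: substitute one of the classic ASP pages ELMAH flags.
        var request = (HttpWebRequest)WebRequest.Create("http://example.com/legacy/page.asp");

        // Googlebot's user agent string, matching what shows up in the ELMAH entries.
        request.UserAgent = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";

        try
        {
            using (var response = (HttpWebResponse)request.GetResponse())
            {
                Console.WriteLine("Status: {0}", response.StatusCode);
            }
        }
        catch (WebException ex)
        {
            // A 404 surfaces here as a WebException with the response attached.
            var response = ex.Response as HttpWebResponse;
            Console.WriteLine("Failed: {0}",
                response != null ? response.StatusCode.ToString() : ex.Message);
        }
    }
}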
Why does .NET pick up errors for classic ASP pages at all? Is this related to the integrated pipeline?
Any ideas why the error only occurs when Google crawls the page, or how I can get more detail to find the underlying fault?
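In case it matters, this is the kind of hook I'm considering to capture more per-request detail when ELMAH logs one of these errors (it assumes ELMAH's ErrorFilterModule is registered in web.config; the Trace call is just a placeholder for whatever logging target I end up using):

using System.Diagnostics;
using System.Web;
using Elmah;

public partial class Global : HttpApplication
{
    // ELMAH raises this before logging an error, provided ErrorFilterModule
    // is registered. I'm not dismissing anything here, only dumping the
    // request details that the bare HttpException lacks.
    protected void ErrorLog_Filtering(object sender, ExceptionFilterEventArgs e)
    {
        var context = e.Context as HttpContext;
        if (context == null)
            return;

        // RawUrl and the user agent should let me correlate the 404s
        // with the crawler's exact request.
        Trace.WriteLine(string.Format(
            "ELMAH 404 candidate: RawUrl={0}, UserAgent={1}, Exception={2}",
            context.Request.RawUrl,
            context.Request.UserAgent,
            e.Exception));
    }
}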