
I can't seem to find an answer to this question.

I had an old subdomain, let's say asdasd.example.com.

The site at this subdomain no longer exists. However, I keep getting error emails from Django about an invalid HTTP_HOST header:

SuspiciousOperation: Invalid HTTP_HOST header (you may need to set ALLOWED_HOSTS): asdasd.example.com

Since the subsite no longer exists, I cannot use robots.txt.

So how can I stop the crawler from trying to index this page that no longer exists? It doesn't just try to index asdasd.example.com, but also asdasd.example.com/frontpage and other URLs that used to be valid.
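One way to at least stop the error emails (it won't stop the crawler itself, which still receives a 400 response) is to silence the `django.security.DisallowedHost` logger that Django uses for these requests, which by default mails `ADMINS`. A minimal sketch of the relevant `LOGGING` settings, assuming a reasonably recent Django version where invalid-host requests are routed to that logger:

```python
# Sketch: stop Django from emailing ADMINS about DisallowedHost /
# SuspiciousOperation errors caused by stale Host headers.
# Requests with an invalid HTTP_HOST are logged to the
# 'django.security.DisallowedHost' logger; routing it to a null
# handler discards those records while leaving other logging intact.
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'null': {
            'class': 'logging.NullHandler',
        },
    },
    'loggers': {
        'django.security.DisallowedHost': {
            'handlers': ['null'],
            'propagate': False,
        },
    },
}
```

Crawlers that keep getting 400s for the dead subdomain should eventually drop those URLs on their own.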

asked by rulzart
  • It seems your DNS is still configured for this domain, otherwise Google would not reach your site. Does the old domain still resolve? – het.oosten Sep 15 '14 at 13:31
  • Or do you catch all subdomains? In that case this page is worth reading: http://www.wellfireinteractive.com/blog/multi-tennancy-in-django-using-subdomains/ – het.oosten Sep 15 '14 at 13:52
  • Thanks, het.oosten - will take a look at that. And yes, we catch all subdomains (it's a kind of CMS, so subdomains are created constantly). – rulzart Sep 16 '14 at 05:48
  • I am sure you could also do this in your server config (Apache or similar). I would advise you to ask a question on Server Fault in that case. – het.oosten Sep 16 '14 at 13:53
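Following the server-config suggestion in the comments, one approach (assuming nginx is in front of Django; Apache has an equivalent with a catch-all VirtualHost) is a default server block that answers any Host header not matched by another block, so requests for dead subdomains never reach Django at all. A sketch:

```nginx
# Hypothetical nginx catch-all: any Host not matched by a named
# server block (e.g. asdasd.example.com) lands here and gets
# 410 Gone, which signals crawlers to drop the URL permanently.
server {
    listen 80 default_server;
    server_name _;
    return 410;
}
```

Since the app catches all subdomains as a CMS, the real sites would each need (or share) an explicit `server_name` entry so only truly dead hosts fall through to this block.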
