I can't seem to find an answer to this question.
I had an old subdomain, let's say asdasd.example.com.
This subdomain site no longer exists. However, I keep getting error emails from Django about an invalid HTTP_HOST:
SuspiciousOperation: Invalid HTTP_HOST header (you may need to set ALLOWED_HOSTS): asdasd.example.com
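For context, here is roughly what my ALLOWED_HOSTS looks like (the domain names are placeholders for my actual setup). As I understand it, Django raises this error for any request whose Host header doesn't match an entry:

```python
# settings.py -- a minimal sketch; "example.com" stands in for my real domain.
# Django raises DisallowedHost (a SuspiciousOperation subclass) for any
# Host header not matched below, which is what triggers the error emails.
ALLOWED_HOSTS = [
    "example.com",
    "www.example.com",
    # "asdasd.example.com" was removed when the subdomain site was shut down
]
```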
Since the subsite no longer exists, I cannot use a robots.txt file to block crawlers.
So how can I stop crawlers from trying to index this site that no longer exists? They don't just request asdasd.example.com but also asdasd.example.com/frontpage and other URLs that used to be valid.