On my site I need to record some data about visitors, but it's important that I record only human visitors and not bots, especially bad bots, and that it happens automatically (a CAPTCHA and the like are not an option).
The way it's done now is a JS snippet that dynamically adds another script, which does the data recording (it actually requests a PHP server-side script that does some work and returns JS code). The problem with this solution is that all images are loaded after the JS, which slows the already slow site by 0.5s to 1.5s (depending on site and server load).
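For reference, the current setup is roughly this (the path here is just illustrative, not my real code):

<script>
// Dynamically inject a second script tag; the browser then requests
// the PHP endpoint, which does the server-side work and returns JS
// that records the visit.
var s = document.createElement('script');
s.src = '/track/record.php'; // illustrative path
document.getElementsByTagName('head')[0].appendChild(s);
</script>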
Would it be good practice to use a tracking pixel loaded as a background image (set in a CSS file)?
This would accomplish:
- Faster page load
- Bots ignored (assuming bots don't load CSS, or if they do, they don't load background images; legit bots like Google are blocked via robots.txt)
Other solutions are very welcome.
It would look something like this:
<div id="pixel"></div>
CSS:
#pixel {background: url(/nobot/somephp.php); width: 1px; height: 1px;}
robots.txt
User-agent: *
Disallow: /nobot/
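And for completeness, somephp.php could be something along these lines (a minimal sketch; the log path and format are placeholders, the real recording logic would go there):

<?php
// Record the visit (placeholder: append one line to a log file).
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
$line = date('c') . ' ' . $_SERVER['REMOTE_ADDR'] . ' ' . $ua . "\n";
file_put_contents('/var/log/visitors.log', $line, FILE_APPEND | LOCK_EX);

// Return a 1x1 transparent GIF so the background-image request resolves.
header('Content-Type: image/gif');
header('Cache-Control: no-store'); // don't cache, so repeat views are recorded
echo base64_decode('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7');

The Cache-Control header matters here: without it the browser caches the pixel and only the first page view per visitor would be recorded.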
Another question: can I rely on the assumption that bad bots will not load CSS background images?
I know there are many threads about bots, but I couldn't find one that is up to date and covers this.