Attack
To me this looks like defense. If the site asks you not to scrape and you do it anyway, you are the attacker and deserve the garbage.
I can save you a lot of trouble, actually. You don't need all of this!
Just make a custom 404 page that returns 13 MBs of junk along with status code 200 and has a few dead links (404, so it just goes to itself)
There are no bots on the domain I do this on anymore. From swarming to zero in under a week.
You don't need tar pits or heuristics or anything else fancy. Just make your website so expensive to crawl that it's not worth it, and they filter themselves out.
Just make a custom 404 page that returns 13 MBs of junk along with status code 200
How would you go about doing this part? Asking for a friend who’s an idiot, totally not for me.
I use Apache2 and PHP, here's what I did:
in .htaccess you can set ErrorDocument 404 /error-hole.php
https://httpd.apache.org/docs/2.4/custom-error.html
in error-hole.php,
<p>*paste a string that is 13 megabytes long*</p>
For the string, I used dd to generate 13 MB of noise from /dev/urandom, then converted that to base64 so it would paste into error-hole.php
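A sketch of that pipeline, assuming GNU coreutils on Linux (the file names and the inline `http_response_code(200)` wrapper are my own illustration, not the commenter's exact files):

```shell
# 1) 13 MB of noise from /dev/urandom, base64-encoded so it works as page text.
dd if=/dev/urandom bs=1M count=13 2>/dev/null | base64 | tr -d '\n' > noise.b64

# 2) Wrap it in a page that reports success: http_response_code(200) makes
#    Apache send 200 instead of the 404 it would normally use for an
#    ErrorDocument, which is the status trick described above.
{
  printf '<?php http_response_code(200); ?>\n<p>'
  cat noise.b64
  printf '</p>\n'
} > error-hole.php
```

The base64 step inflates the 13 MB of raw noise to roughly 18 MB of ASCII, so the served page ends up a bit larger than the raw junk.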
You should probably hide some invisible dead links around your website as honeypots for the bots that normal users can't see.
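One common way to hide such a honeypot link (my own illustrative markup and path, not from the thread) is an anchor that humans never see or focus, but that naive crawlers still parse and follow:

```html
<!-- Invisible honeypot: hidden from sighted users, screen readers,
     and keyboard focus, but present in the HTML crawlers fetch. -->
<a href="/trap-page-xyzzy" style="display:none" aria-hidden="true" tabindex="-1">do not follow</a>
```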
How does this affect a genuine user who experiences a 404 on your site?
I don't know a lot about this, but I would guess a normal user would like a message that says something along the lines of "404, couldn't find what you were looking for." The status code, the links back to itself, and the 13 MB of noise probably won't bother them. Hidden links shouldn't bother normal users either.
I also "don't know a lot about this", but I do know that your browser receiving a 200 means everything worked properly. From what I can tell, this technique replaces any and every 404 response with a 200, tricking the browser (and therefore the user) into thinking the site is working as expected every time they run into a missing webpage on this site.
They will see a long string of base64 that takes a quarter of a second longer to load than a regular page. If it's important to you, you can make the base64 string invisible and add some HTML to make it appear as a normal 404 page.
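That variant could look something like this (assumed markup, not from the thread): a human-readable message up front, with the noise tucked into a hidden element so only crawlers that ingest raw HTML pay for it:

```html
<p>404 — couldn't find what you were looking for.</p>
<a href="/">Back to the homepage</a>
<!-- Humans never render this; crawlers still download and parse it. -->
<div style="display:none">...13 MB base64 blob goes here...</div>
```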
This will be as effective against LLM trainers as Nightshade has been against generative image AI trainers.