Private project, not really security related: crawling robots.txt files to gather some statistics on which bots people most often exclude - weirdly, I couldn't find any recent or regularly updated stats on this.
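For anyone curious what the tallying side of a project like this might look like: a minimal sketch using Python's stdlib `urllib.robotparser`. The sample robots.txt content and the list of bots to check are made-up illustrations, not real survey data.

```python
# Hypothetical sketch: parse one site's robots.txt and count which
# crawlers are excluded from the site root. Sample data is invented.
from urllib.robotparser import RobotFileParser
from collections import Counter

SAMPLE_ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: *
Disallow: /admin/
"""

# Illustrative list of user-agent tokens one might tally.
BOTS_OF_INTEREST = ["GPTBot", "CCBot", "Googlebot"]

def blocked_bots(robots_txt: str, bots) -> Counter:
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    # Count a bot as "excluded" if it may not fetch the site root.
    return Counter(bot for bot in bots if not parser.can_fetch(bot, "/"))

counts = blocked_bots(SAMPLE_ROBOTS_TXT, BOTS_OF_INTEREST)
print(counts)  # GPTBot and CCBot blocked; Googlebot only hits the * rules
```

Summing these counters across many crawled sites would give exactly the kind of "which bots get blocked most" statistics the project is after.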
cybersecurity
An umbrella community for all things cybersecurity / infosec. News, research, questions, are all welcome!
That’s a neat project. Are you looking for trends, or something specific?
It started with a popular Mastodon post on how to block OpenAI crawlers, I think, and I'd like to know whether people are actually implementing it.
That’s neat. I’m curious about this now. With “normal” search engines having generally gone to shit, AI chatbots are on trend to give better results. If a robots.txt file blocks OpenAI, can I assume it hits other chatbots too? And would that extend to Google/Bing?
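Worth noting on that question: robots.txt rules are matched per `User-agent` token, so a block aimed at OpenAI's crawler (`GPTBot`) says nothing about other bots; Google even uses a separate `Google-Extended` token for AI-training opt-out, distinct from `Googlebot` search crawling. A quick sketch with invented sample rules:

```python
# robots.txt rules apply only to the User-agent tokens they name.
# The sample rules below are illustrative, not from any real site.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# OpenAI's crawler and Google's AI-training token are blocked...
print(parser.can_fetch("GPTBot", "/"))           # False
print(parser.can_fetch("Google-Extended", "/"))  # False

# ...but ordinary search crawlers are untouched, since no rule block
# (and no catch-all "User-agent: *") names them.
print(parser.can_fetch("Googlebot", "/"))        # True
print(parser.can_fetch("Bingbot", "/"))          # True
```

So blocking one AI crawler has no effect on the others, and none on regular search indexing, unless the site adds a rule block for each token.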
Project Management crap. It’s the money season in the government, so I get to ask for lots of money to try to do cool things.
Not strictly teh ciberrz, but upgrading to OpenBSD 7.5. Might rebuild a mail server this weekend.