As someone who has servers on the net, this anti-bot check is a very legitimate need, and a growing one. A very high percentage of the traffic I am seeing is bots. They come from a distributed network: per the logs I have reviewed, the requests are rotated round-robin across the many networks under their control. The stats I have collected show unique networks numbering in the thousands. It's very hard to distinguish bots from real people(*). The bot activity is growing, and there could be multiple players involved. If it keeps growing, it will essentially become a 24/7 DDoS for small sites -- at least as far as staying available to legitimate, real people goes.
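The kind of tally I mean is nothing fancier than grouping access-log client addresses by network prefix. A rough sketch of the idea (the log path, the combined log format, and the /24 grouping are assumptions about a typical setup, not a prescription):

    # Count distinct source networks in a web server access log.
    # Assumes the client IP is the first field on each line; adjust the
    # path and the prefix sizes for your own setup.
    from collections import Counter
    import ipaddress

    networks = Counter()
    with open("/var/log/nginx/access.log") as log:
        for line in log:
            fields = line.split()
            if not fields:
                continue
            try:
                addr = ipaddress.ip_address(fields[0])
            except ValueError:
                continue
            # Group by /24 (IPv4) or /64 (IPv6) to approximate "networks"
            # rather than individual hosts.
            prefix = 24 if addr.version == 4 else 64
            net = ipaddress.ip_network(f"{addr}/{prefix}", strict=False)
            networks[net] += 1

    print(f"{len(networks)} distinct networks seen")
    for net, hits in networks.most_common(20):
        print(f"{net}\t{hits}")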
The culprit? It's likely companies, worldwide, trying to seed their LLMs. It wasn't like this more than a year ago. These are certainly not the classic search engine bots. The activity resembles what search engines do, but these are clearly operators who evade the normal rules of conduct and common filtering schemes, and engage in outright scraping.
I have not deployed an anti-bot checker, but I am seriously considering it. The alternative is to make the sites require an account login to do anything at all. It's not fair that I am spending money so that LLM companies can seed their systems with useless content, in the most stupid way possible: scraping everything and never asking permission.
As far as the internet dying goes, yes, I think it is starting to die. But it isn't our fault (we little people who don't have much choice). We don't have many options left other than throwing up expensive client-side JavaScript, or other strange things that confuse the heck out of most applications, or going fully exclusive.
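To be concrete about the "expensive client javascript" option: it usually boils down to a proof-of-work challenge, where the visitor's browser has to burn some CPU before the server will answer. A minimal sketch of the idea (the hash choice, difficulty, and function names here are illustrative assumptions, not any particular product):

    import hashlib
    import itertools
    import os

    DIFFICULTY_BITS = 18  # ~260k hash attempts on average; tune for your traffic

    def make_challenge() -> str:
        # Server side: hand out a random challenge with the page or an interstitial.
        return os.urandom(16).hex()

    def verify(challenge: str, answer: int) -> bool:
        # Server side: accept only answers whose hash starts with enough zero bits.
        digest = hashlib.sha256(f"{challenge}:{answer}".encode()).digest()
        return int.from_bytes(digest, "big") >> (256 - DIFFICULTY_BITS) == 0

    def solve(challenge: str) -> int:
        # Client side: the work the visitor's browser would normally do in JavaScript.
        for answer in itertools.count():
            if verify(challenge, answer):
                return answer

    challenge = make_challenge()
    answer = solve(challenge)
    assert verify(challenge, answer)
    print(f"solved challenge {challenge} with nonce {answer}")

Verification costs the server one hash, while a scraper hammering thousands of pages has to pay the solving cost on every request; the downside is that real visitors pay a small tax too, which is part of why I resent being pushed into it.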
I agree with the sentiment here, actually. If the small web hosts can't keep themselves free of bots, they'll just give up, and I think the old way of the internet will go away. What's left of the internet will mostly be the largest tech companies' chatbots conveying whatever information exists, plus a few bloated applications that are a sorry pretend version of what things were before, and that is dystopian.
(*) There is a recent article saying there is malware out there, probably on millions of machines, that seems to exist to let these entities scrape the web while avoiding the filters.