There are mods for the forum software that can require guests to log in once certain conditions are met. But they have the potential to break the site if poorly implemented, so you have to be careful about which ones you use.
In the last 48 hours I’ve turned on a mod that caps guests at 15 page views a day. When a guest clicks on a 16th page, they are shown a screen prompting them to log in.
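For illustration only, here’s a minimal Python sketch of how a per-day page cap like this might work. The actual mod is forum code, not this; the session keying, the window length, and every name below are assumptions of mine.

```python
import time

DAILY_CAP = 15           # pages a guest may view per day (assumed threshold)
SECONDS_PER_DAY = 86_400

# session_id -> (views_so_far, start_of_current_window)
_guest_views: dict[str, tuple[int, float]] = {}

def allow_guest_view(session_id: str, now: float | None = None) -> bool:
    """Return True if the guest may view another page,
    False if they should see the login prompt instead."""
    now = time.time() if now is None else now
    count, start = _guest_views.get(session_id, (0, now))
    if now - start >= SECONDS_PER_DAY:
        count, start = 0, now          # new day: reset the counter
    if count >= DAILY_CAP:
        return False                   # 16th request: show the login screen
    _guest_views[session_id] = (count + 1, start)
    return True
```

The key point is that the check runs on every request, so a capped guest still hits the server; the mod only stops them from getting content.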
The immediate effect has not been any noticeable reduction in traffic. More “guests” are now sitting at the index page instead of scraping data. The fact that they’re not scraping is good, but they’re still here taking up space. Perhaps over the long term this will frustrate some bots, and they’ll decide to use their computing power on more “productive” sites. Perhaps the bots are dumb and will just error out forever. Only time will tell.
Can you restrict the number of guests allowed 'in' at any point in time? Say, limit it to 50 guests. If guest 51 tries to access the site, they get a 'sorry, guest limit reached' message.
There is also a mod available that “caps” guest access at a certain number of guests at any one time. Based on the results of the “15 page cap” experiment, I do not think it would be a good idea to turn it on.
If we have 1,000 bots attempting to scrape the site and we set a “cap” at 50 guests, what will actually happen is that 50 bots will scrape the site and 950 bots will attempt to read a page and get an error message. We’d still be dealing with 1,000 bots’ worth of traffic, and we probably wouldn’t see any reduction at all. But this would frustrate *actual* human guests trying to read the site in a much more obnoxious way than the “15 page cap” does. Downside with no upside.
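To make that concrete, here is a hedged sketch of what a guest cap like that presumably has to do (assumed logic, not the mod’s actual code): count “currently online” guests inside some activity window and turn away the rest. MAX_GUESTS, the five-minute window, and the session keying are all invented for illustration.

```python
import time

MAX_GUESTS = 50      # concurrent guest cap (assumed)
ACTIVE_WINDOW = 300  # seconds a guest still counts as "online" (assumed)

_last_seen: dict[str, float] = {}  # session_id -> time of last request

def admit_guest(session_id: str, now: float | None = None) -> bool:
    """Return True to serve the page, False to serve the
    'sorry, guest limit reached' message."""
    now = time.time() if now is None else now
    # Drop sessions that have gone quiet, freeing their slots.
    for sid, seen in list(_last_seen.items()):
        if now - seen > ACTIVE_WINDOW:
            del _last_seen[sid]
    if session_id in _last_seen or len(_last_seen) < MAX_GUESTS:
        _last_seen[session_id] = now
        return True
    return False
```

Note that the rejected 950 bots still reach admit_guest on every single request, which is exactly why the server load wouldn’t drop.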
The solution is probably “stop the bots from accessing the site altogether”, which would require CloudFlare protection or something similar. I’ll bring it up.