I mean, it’s been a while since I worked in backend. But one of the basic tools was to limit requests per second per IP, so they can’t DDoS you. If a bot crawls a webpage you host with the intention to share, what’s the harm? And if one particular bot/crawler misbehaves, block it. And if you don’t intend to share, put it behind a VPN.
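For context, the kind of limiter I mean is just a per-IP token bucket. A minimal Python sketch, with made-up numbers (5 req/s, bursts of 10):

```python
import time
from collections import defaultdict

RATE = 5.0    # tokens refilled per second (made-up limit)
BURST = 10.0  # bucket capacity, i.e. max burst size

# ip -> (tokens remaining, timestamp of last refill)
buckets = defaultdict(lambda: (BURST, time.monotonic()))

def allow_request(ip: str) -> bool:
    """Refill the bucket based on elapsed time, then try to spend one token."""
    tokens, last = buckets[ip]
    now = time.monotonic()
    tokens = min(BURST, tokens + (now - last) * RATE)
    if tokens >= 1.0:
        buckets[ip] = (tokens - 1.0, now)
        return True
    buckets[ip] = (tokens, now)
    return False
```

In practice you’d usually do this at the reverse proxy (e.g. nginx’s limit_req) rather than in app code, but the idea is the same.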
Is that out of date?