

Right? We have standards for this, and the reasonable assumption is that if something doesn’t respect robots.txt and otherwise looks like a user, then it’s a user. It can’t be the responsibility of every single server admin to perfectly distinguish a real user from a bot run by a billion-dollar company that does a decent job of pretending to be one.
You could probably do something by getting into the weeds of browser versions, at least for web traffic. Like, if they’re presenting themselves as an older version of Chrome, send a badly formatted cookie to crash it? Redirect to /%%30%30?
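
A minimal sketch of the redirect idea, assuming a plain Go net/http server (the version cutoff, handler names, and route are all made up for illustration, not a real deployment): parse the claimed Chrome major version out of the User-Agent and bounce anything implausibly old toward the /%%30%30 crash URL.

```go
package main

import (
	"net/http"
	"regexp"
	"strconv"
)

// chromeRe pulls the major version out of a "Chrome/NN." token in the User-Agent.
var chromeRe = regexp.MustCompile(`Chrome/(\d+)\.`)

// chromeMajor returns the claimed Chrome major version, or 0 if none is present.
func chromeMajor(ua string) int {
	m := chromeRe.FindStringSubmatch(ua)
	if m == nil {
		return 0
	}
	v, _ := strconv.Atoi(m[1])
	return v
}

func handler(w http.ResponseWriter, r *http.Request) {
	// Hypothetical cutoff: Chrome 45 and earlier were affected by the
	// %%30%30 null-byte URL crash, and nothing that old should be
	// browsing the modern web anyway.
	if v := chromeMajor(r.UserAgent()); v > 0 && v <= 45 {
		http.Redirect(w, r, "/%%30%30", http.StatusFound)
		return
	}
	w.Write([]byte("hello\n"))
}

func main() {
	http.HandleFunc("/", handler)
	http.ListenAndServe(":8080", nil)
}
```

Anything claiming a current browser sails through, and a crawler that honestly identifies itself never hits the check at all; only the bots lying about being an ancient Chrome get the redirect.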