> My understanding of reCAPTCHA v3 is that it just gives you a score for how well Google can track you, and then leaves it up to the site operator to block users who aren't transparent enough.
Yeah, that's the one where you're supposed to put it on every page of your website so that Google can collect more information on your users. If they can't, they'll return a low score that you'll use to mark users as "bots".
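For anyone who hasn't wired it up: the operator's side of v3 is roughly a sketch like this (assuming Node 18+ for the global fetch). The 0.5 cutoff is an arbitrary example, since Google only hands back the score and leaves the blocking policy to you.

```ts
// Minimal reCAPTCHA v3 server-side check (Node 18+, global fetch).
// The secret key source and the 0.5 cutoff are placeholder assumptions;
// Google only returns the score, the operator decides what to do with it.
const SECRET_KEY = process.env.RECAPTCHA_SECRET ?? "";

interface SiteverifyResponse {
  success: boolean;
  score?: number; // v3 only: 1.0 = "very likely human", 0.0 = "very likely bot"
  action?: string;
  "error-codes"?: string[];
}

async function isProbablyHuman(token: string): Promise<boolean> {
  const res = await fetch("https://www.google.com/recaptcha/api/siteverify", {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: new URLSearchParams({ secret: SECRET_KEY, response: token }),
  });
  const data = (await res.json()) as SiteverifyResponse;
  // If Google couldn't track the user, the score comes back low and the
  // operator's threshold decides whether they get marked as a "bot".
  return data.success && (data.score ?? 0) >= 0.5;
}
```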
> What I really hate about reCAPTCHA v2 are those artificial delays before loading the next image (which can happen several times on a single card). And then in the end you frequently fail despite answering everything correctly.
I think this is just what Google does when it thinks you're a "bot": i.e. they don't know who you are.
I do wonder why, though. For a long time I assumed it was a rate limiter, but then another HN commenter pointed out to me that time is more valuable to humans than to bots: bots can work on multiple captchas in parallel.
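To make the parallelism point concrete, a toy sketch; `solveOne` and the 5-second delay are made up, but they show why a per-challenge delay barely costs a concurrent bot any wall-clock time:

```ts
// Toy model: an artificial fade/reload delay hurts a serial human far more
// than a concurrent bot. solveOne() is a stand-in for whatever actually
// answers a challenge; the 5 s delay is an assumed figure.
const DELAY_MS = 5_000;

const sleep = (ms: number) => new Promise<void>((r) => setTimeout(r, ms));

async function solveOne(id: number): Promise<number> {
  await sleep(DELAY_MS); // imposed per-challenge delay
  return id;             // pretend we solved it
}

async function main() {
  const start = Date.now();
  // A bot can keep 100 challenges in flight at once...
  await Promise.all(Array.from({ length: 100 }, (_, i) => solveOne(i)));
  // ...so 100 solves still take ~5 s of wall clock, ~50 ms per captcha.
  console.log(`100 captchas in ${Date.now() - start} ms`);
}

main();
```

A human pays the full delay on every single challenge; the bot amortizes it to almost nothing.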
I've become accustomed to just closing any page that presents me with a v2 reCAPTCHA.
I'm waiting for the day it pops up "find the humans" and there's someone clearly wearing CV-camouflage. For anyone who doesn't get it: you let that human hide, because it could be you in twenty years.
Edit: unless, it occurs to me now, Google is monitoring how the user responds to the fade-in. When in the fade process do they click the square, for instance?
Still seems like it wouldn't be all that helpful, though.
Maybe this is to prevent adversarial learning? If the images reload immediately, the bot can learn (via a neural network) whether its solution was good or not. With a delay the bot still gets the same feedback, but its learning loop is slowed down by the same factor as the delay.
No idea if this is true, it just popped into my head.
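If that theory holds, the mechanism is just a throttled feedback loop. A back-of-the-envelope sketch, with every number invented:

```ts
// Back-of-the-envelope: if each guess only gets its pass/fail signal after an
// artificial delay, the attacker's labeled-examples-per-second drops by
// roughly (solve + delay) / solve. All figures here are invented.
const SOLVE_MS = 500;   // assumed time for the model to pick squares
const DELAY_MS = 4_000; // assumed artificial fade/reload delay

const withoutDelay = 1000 / SOLVE_MS;           // labels per second, no delay
const withDelay = 1000 / (SOLVE_MS + DELAY_MS); // labels per second, delayed

console.log(`no delay:   ${withoutDelay.toFixed(2)} labels/s`);
console.log(`with delay: ${withDelay.toFixed(2)} labels/s`);
console.log(`slowdown:   ${((SOLVE_MS + DELAY_MS) / SOLVE_MS).toFixed(1)}x`);
```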
There's more saleable data to be collected by tracking people's interactions on a website, under the guise of predicting who's a bot.