I do all my mobile browsing on FF, yet on some websites I always get a "reCAPTCHA failed" error, while the same sites work flawlessly on Chrome, which I rarely use. Try it; maybe it will happen for you too.
The same happens on most sites that show the Cloudflare "checking your browser" page.
Antics like these make the web nearly unusable unless you're on Chrome.
If only it were Google services alone. Cloudflare loves serving up a reCAPTCHA to Tor users before they can even passively read site content. That hugely expands the damage done.
The plugin requires "privacy passes". Those passes can be obtained by solving captchas, but when trying to do so, one is greeted with this message about being blocked: https://i.imgur.com/qXJfl6J.png
If it were developed in conjunction with Tor, how come it doesn't come bundled with the Tor Browser or Tails?
Until software developers care -- nothing will happen.
Monopolies need to be broken up because they threaten the free market and consequently our way of life - not because employees revolt.
Juniper patented saying "No" to a client.
Yes I can and do. It's bad enough that some websites won't let you do certain things over Tor, but preventing access to the website entirely is unacceptable. I made this account and comment entirely over Tor.
I don't see how it's okay to block Tor. That generic claim about abuse is always made, but how good are your spam measures if they can't handle Tor spam?
>You might not be using it for abuse but a large volume of abuse originates from it.
There is infinitely more "abuse" coming from Google, and yet it seems almost every page I visit contains Google malware.
On principle, I hold that Tor should be a first-class citizen and not disadvantaged in any way. Notice that Google's HTTP/3 runs over UDP, which Tor can't transport; I don't find that a coincidence.
> like all IP addresses that connect to our network, we check the requests that they make and assign a threat score to the IP. Unfortunately, since such a high percentage of requests that are coming from the Tor network are malicious, the IPs of the Tor exit nodes often have a very high threat score.
Initially it was slow, yes. But it's been totally fine the last few years for normal browsing and reasonable downloads. Speedtest.net, speedtest.googlefiber & fast.com just now gave me 5, 6 & 10Mbps from whatever server in Ghana I got. Only the high ping still makes loading times a bit annoying.
But right now the biggest reason not to use Tor for anything "legit" is the many services blocking you. Indeed, most current Tor users are not what those services want, so the race to the bottom will continue, if we haven't reached it already.
My own connection doesn't go over 1.6MB/s download speed, and only if the weather is clear and I have the wind at my back.
You can now get 500KB/s or more on most Tor connections, which is enough for a comfortable browsing experience, imo.
The real downside is the Google captcha, which sometimes won't even let you attempt to solve one in the first place, even on pages with no user input.
Given that Tor is a tiny percentage of Internet traffic, most of the abusive volume out there has little to do with Tor.
edit: also didn't try it over tor
The walled garden approach worked for a while for Microsoft, and it's working for now for Google, but eventually, it stops working. Once people leave, walled gardens keep them away.
You can't just opt out of using half the Internet because you value privacy, nor should you have to. This requires legislation to stop.
This of course doesn’t help explain why Firefox is so heavily targeted by what’s supposed to be a neutral utility like Google Analytics...
Tracking your history across multiple reCAPTCHA loads across multiple domains to build a user profile is what sounds like a giant privacy red flag, and it's entirely possible given the current implementation.
Additionally, asking hosts to include JS directly on their domain, JS which sets third-party cookies/data across every page on top of tracking referring domains, is equally a bad idea. reCAPTCHA v2/v3 does require loading third-party JS directly on the page, which I'd imagine is necessary to create frontend callbacks upon verification (as iframe content messaging is very awkward):
Ideally the JS would simply load an iframe with the captcha HTML and handle the callbacks from events in the iframe. That's it. It shouldn't be touching anything else on your website. I'd be curious to see a reverse-engineering effort showing how much the JS really does...
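To be concrete, here's a minimal sketch of that model, with made-up names throughout (CAPTCHA_ORIGIN, the message shape, etc. are all assumptions, not reCAPTCHA's actual protocol): the widget lives entirely in an iframe and hands a token back via postMessage, so the host-page script never needs access to anything else.

```javascript
// Hypothetical widget origin; the host page only trusts messages from here.
const CAPTCHA_ORIGIN = "https://captcha.example.com";

// Pure handler so the origin/shape checks are easy to reason about.
// `event` mirrors a MessageEvent: { origin, data }.
// Returns true if the message was accepted and the token handed off.
function handleCaptchaMessage(event, onVerified) {
  if (event.origin !== CAPTCHA_ORIGIN) return false; // ignore other frames
  if (!event.data || event.data.type !== "captcha-verified") return false;
  onVerified(event.data.token); // e.g. copy token into a hidden form field
  return true;
}

// In a browser, wiring it up would look like:
//   window.addEventListener("message", (e) =>
//     handleCaptchaMessage(e, (token) => { form.token.value = token; }));
```

Under this design the host page's script is a dumb courier: no cookie access, no DOM scraping, just a token relay.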
Yeah, no. It certainly can read non-Google cookies on the page (not httpOnly cookies, though).
That said, I've no evidence one way or the other!
My understanding is that it comes down to information it can read about your browser (does this look like a bot environment?), plus heuristics on how the user has behaved since the JS loaded (mouse movements, time between actions, etc.).
One trick that seems to help fool that awful piece of tech: click slowly on the images, as if you were thinking for a second or two before each click. Maybe click a wrong image and deselect it again. In other words, behave like a slow human; it seems to work better than solving it as quickly as possible.
Again, being slower and more error prone seems to be rewarded.
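That would be consistent with a timing heuristic along these lines. This is pure speculation, not reCAPTCHA's actual logic, and every threshold below is invented: humans click with slow-ish, irregular gaps, while naive scripts click fast and metronomically.

```javascript
// Speculative sketch of a click-timing heuristic. All thresholds invented.
// Takes click timestamps in ms and judges whether the rhythm looks human:
// flags both very fast average gaps and suspiciously low jitter.
function looksHumanlike(clickTimestampsMs) {
  if (clickTimestampsMs.length < 2) return true; // too little signal to flag
  const gaps = [];
  for (let i = 1; i < clickTimestampsMs.length; i++) {
    gaps.push(clickTimestampsMs[i] - clickTimestampsMs[i - 1]);
  }
  const mean = gaps.reduce((a, b) => a + b, 0) / gaps.length;
  const variance =
    gaps.reduce((a, g) => a + (g - mean) ** 2, 0) / gaps.length;
  // Invented cutoffs: sub-200ms average gaps, or near-zero variation
  // between gaps, both read as bot-like.
  return mean >= 200 && Math.sqrt(variance) >= 50;
}
```

Deliberately pausing and varying your pace, as described above, is exactly what would push both numbers into the "human" range of a scorer like this.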
If this reduces the world Google allows me to access, my own world isn't diminished because of it.
> "If you have a Google account it’s more likely you are human"
So, in the future, if we don't stay signed into our Google account (and let Google know every article we read and every website we browse), we'll be cut off from half of the internet or even more.
The amount of control a handful of companies have over the internet is suffocating to contemplate!