So many companies make stupid, gratuitous changes that break perfectly acceptable browsers. For what? They're not making browsing any faster - no, instead, they're adding things that make the newest, fastest computers we can buy take a long, jerky pause when we click on something, or try to scroll too soon...
I think it's time to call out shitty web developers who are probably contributing to this because they want job security. The web doesn't need generation after generation of significantly different browsers every few years, nor do we need web sites that very literally take more RAM and more CPU resources than entire OSes.
It's all bullshit, and there's never a good reason for any web site to take hundreds of megabytes of memory and take seconds to do things on a machine that can do tens of billions of instructions per second.
I feel just as disgusted by stuff like this, but more at the industry incentives than at developers as people.
There is a type of developer that adds more technical garbage to the pile, but their ranks grow when their peers see they are rewarded for that behavior.
It is the game that is messed up; many are just playing by the rules to 'win'.
I feel like this kind of thing happened as SWEs became commodified. Just imagine if we were still a tight-knit community of people in basements: we could simply work our way through the social network and yell at whoever made such bad choices.
Apparently there's an extension called 'Hide Private Mode' which gets around this. Browsing in containers also reportedly works. But these workarounds shouldn't be necessary. This is creepy behavior.
If you were trying to get something done you probably just didn't pay enough attention to what you were doing, and you forgot about it. It could have been years ago. I don't think it's all that strange: can you remember everything you ate in April?
Their site has been annoying to use for a while now, and it's slowly getting worse. I think it was last year when I found myself unable to pay bills with Firefox, since it blocks certain cookies by default (and, really, I make those settings even more restrictive). So I have to manually allow everything, pay my bill, then restrict it all again.
More and more sites are breaking when certain privacy measures are used. A few months ago I started having similar issues with my local water utility. I could always pay my bill just fine; then it got more restrictive.
Of all the things to make difficult, you'd think payment would be the one aspect they'd want to work smoothly.
Coles supermarket in Australia has a site that just fails to load when connected via a VPN. And it's not even an "access denied"; it's the JavaScript CDN rejecting requests, so the site loads broken.
Reading the thread, I have to assume whatever rep was in charge of that Twitter account had ~no idea what they were talking about. They're just too oblivious to the basic concepts involved.
Fuck T-mobile. I have stopped using plenty of sites that break under certain conditions, such as private mode or when blocking third party cookies. I guess it's a little more difficult to not use the T-mobile website if you're one of their customers though.
I'm a t-mobile customer and their website sucks, even when it works. On the plus side, I only rarely need to use it: only when I need to adjust the auto pay, or fiddle with the plan if I go over the data allowance.
You are right; for more context, I should have mentioned that they only block login. I found it interesting that they would block Firefox, but allow Chrome's incognito. Well, looks like I can't edit the title now.
Instead of restricting access to APIs, "private modes" should just return junk data. This should be on the browser to fix. I would love it if my browser would return random numbers for things that web sites have no business querying, like my screen size or the number of cores in my cpu.
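To make that concrete, here's a minimal sketch of the idea, assuming it runs as an extension content script injected before the page's own scripts. This isn't an existing browser setting, just an illustration of answering probes with junk instead of blocking them:

    // Pick one fake value per property per page load so repeated reads agree.
    function randomInt(min: number, max: number): number {
      return Math.floor(Math.random() * (max - min + 1)) + min;
    }

    const fakeCores = randomInt(2, 16);
    const fakeWidth = randomInt(1280, 2560);
    const fakeHeight = randomInt(720, 1440);

    // Lie about the CPU core count instead of hiding the property.
    Object.defineProperty(Navigator.prototype, "hardwareConcurrency", {
      get: () => fakeCores,
      configurable: true,
    });

    // Lie about the screen size as well.
    Object.defineProperty(Screen.prototype, "width", {
      get: () => fakeWidth,
      configurable: true,
    });
    Object.defineProperty(Screen.prototype, "height", {
      get: () => fakeHeight,
      configurable: true,
    });

The page still gets an answer to every query; it just isn't a truthful one.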
The rabbit hole goes a long way down. You end up with things like timing API calls: if the response comes back too fast, you know it must be junk from the private mode's fake API.
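A sketch of that counter-move from the detector's side might look like this; the storage.estimate() probe and the 0.05 ms cutoff are made up for illustration, and real scripts pick their own probes and calibrate the threshold:

    // A genuine implementation has to consult real storage bookkeeping;
    // a canned fake can answer almost instantly.
    async function answerCameBackSuspiciouslyFast(): Promise<boolean> {
      const start = performance.now();
      await navigator.storage.estimate();
      const elapsed = performance.now() - start;
      return elapsed < 0.05; // threshold is illustrative only
    }

    answerCameBackSuspiciouslyFast().then((spoofed) => {
      if (spoofed) {
        console.log("Values may be junk from a faked private-mode API.");
      }
    });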
Sure, but browsers are complex enough that it becomes an endless cat-and-mouse game: adversaries can always find more things to differentiate private browsing with. The browser makers have been tightening things up, but the people making these scripts have a long backlog of tricks, so they just roll the next one out immediately.
I have no idea what you're trying to say here. T-mobile wanting to know how many cores my cpu has is equivalent to my bank telling me how much money I have?
No, I sincerely think that I should have a setting to tell my bank to lie to ATMs, especially useful for those that print out receipts automatically.
OK, so the next step is to spoof those APIs. That is to say, the browser should report itself as something 'respectable', leave said APIs unrestricted, and just send back garbage.
This is getting beyond a joke. Governments should demand common carriers be just that: a license to operate should dictate no interference with the data stream and no monitoring of users' habits.
Why the hell is it so difficult for governments to legislate these requirements?
I think the developers at T-Mobile are equally pissed off about this. But with Firefox market share sitting at 3.5% in the US, it no longer makes business sense to spend the extra money required to make their site work in what is now a niche browser.
I get that this is a shitty move on their part, but what about using Firefox containers + cookie auto-delete? You can put T-Mobile into a container all by itself and automatically nuke the cookies. Is there anything else that Private Mode gives you?
I have Firefox containers, cookie auto-delete, uBlock Origin, and Privacy Badger.
The site is unavailable to me even in a regular Firefox window. I am guessing it's the tracker blocking, or maybe I have some config setting that is doing this.
that's a very good question.
the one thing that private mode adds is not contaminating your own history.
there is a nice extension for temporary containers that can create new containers on demand and will automatically delete them if they are not used for a while.
i don't know if firefox is working on per-container history, but i think that, in combination with that extension, it would be a useful alternative to private mode.
I am guessing Local Storage or something like that. That mode just breaks a bunch of APIs; it is not as simple as "just delete all cookies on start and after the tab is closed".
Depending on which browser you are using, there are slight differences in the APIs accessible to web sites depending on whether you are in private mode or not. For example, in Firefox's private mode service workers are not available, so a simple check to see if navigator.serviceWorker is undefined is all you need (see the sketch below). Other browsers have other tells, such as which types of storage are available, or the size of the storage quota (if available).
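A rough sketch of that check, assuming the older Firefox behavior where navigator.serviceWorker is simply absent in private windows (newer versions have changed this, so treat it as illustrative):

    // Firefox tell: in older private windows, the service worker API is missing.
    function looksLikeFirefoxPrivateMode(): boolean {
      return typeof navigator.serviceWorker === "undefined";
    }

    if (looksLikeFirefoxPrivateMode()) {
      console.log("Probably a private window; some sites refuse to log you in here.");
    }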
Firefox gradually adds protections against fingerprinting methods that go beyond cookies. Of course, sometimes this means stepping on borderline cases of legitimate API usage. Also, these protections tend to land in private mode first.
The T-Mobile account management website is absolute trash. I was just blowing my stack about this affront to all things decent the other day. Why does showing me my bill and letting me update my account details require so much garbage JavaScript all over the place? I'm not running a command center that needs up-to-the-second "dynamic" billing info, I'm just a simple plebe who wanted to update his billing info. So just render the stupid page server side and let me get to it. But no, instead I have to sit around watching 4 different spinners "load" the "frontend" components of this bloated, over-engineered pile of steaming crap.
The site, last time I checked, doesn't work when you try to log in, even outside of private browsing. It just constantly tries to load. My bill is on auto pay, so I haven't had to bother in a year, but I remember having to open Chrome when I was changing credit cards.
I had to resort to logging in via my phone browser the other day. Got endless spinning on desktop, on both Firefox and Chrome. Absolute trash of a website.