Former Mozilla exec: Google has sabotaged Firefox for years (zdnet.com)
187 points by y2kenny 34 days ago | 35 comments



"oh our precious search deal"

Where was the free and open source distributed search engine?

"oh we cant support any protocol or any other technology! We don't want to be promoting anything! Unless we get paid of course!"

Where was p2p video? Why no BitTorrent? Not too long ago, hosting video on a server was crazy expensive, but at that time people still visited small websites.

It should have been as simple as:

<video src="magnet:?xt=urn:btih:0b239d7ccff9c46e816ba4f4b3ff813bea3691f5"/>
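A magnet URI like that is nothing exotic: just a scheme plus query parameters carrying the info-hash. A quick Python sketch of pulling one apart (stdlib only; an actual browser would of course still need a BitTorrent stack behind it):

```python
from urllib.parse import urlparse, parse_qs

def parse_magnet(uri: str) -> str:
    """Extract the BitTorrent info-hash from a magnet URI."""
    parsed = urlparse(uri)
    assert parsed.scheme == "magnet"
    params = parse_qs(parsed.query)
    xt = params["xt"][0]  # e.g. "urn:btih:<40 hex chars>"
    assert xt.startswith("urn:btih:")
    return xt[len("urn:btih:"):]

info_hash = parse_magnet(
    "magnet:?xt=urn:btih:0b239d7ccff9c46e816ba4f4b3ff813bea3691f5")
# info_hash == "0b239d7ccff9c46e816ba4f4b3ff813bea3691f5"
```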

But instead we got YouTube, which is short for: adware, censorship, spying, complete unreliability, and a crazy resource hog to the point of not working on hardware older than 5 years.

And then there is privatized law making & privatized law enforcement.

If I put stuff out there from my own computer from the comfort of my house I'm subject to plenty of laws and regulations created by my democratically elected governments.

There are no multibillion-dollar robots shooting from the hip; there is no false-positive punishment without dialog. If I break the law, the situation is carefully examined, and I get fined or even imprisoned. It's not a perfect system, but it's better than persecution by robot.

The Google constitution (TOS) is something like: We may terminate you at any time without explanation.

That's not someone you want to enter into a relationship with.

Something modeled after YaCy should be the default search engine. It's wonderful: you have it crawl the pages you frequent and it finds everything; other users crawl other things. That's something Google could never match.
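The core idea can be sketched as a toy local index (hypothetical illustration, not YaCy's actual implementation): each peer indexes the pages it visits and answers queries from its own index.

```python
from collections import defaultdict

class LocalIndex:
    """A toy per-peer search index: crawl what you visit, query locally."""

    def __init__(self):
        self.postings = defaultdict(set)  # word -> set of URLs

    def crawl(self, url: str, text: str) -> None:
        for word in text.lower().split():
            self.postings[word].add(url)

    def search(self, query: str) -> set:
        """Return URLs containing every query word."""
        words = query.lower().split()
        results = [self.postings[w] for w in words]
        return set.intersection(*results) if results else set()

idx = LocalIndex()
idx.crawl("https://example.org/a", "open source search engine")
idx.crawl("https://example.org/b", "video hosting on the open web")
idx.search("open search")  # -> {"https://example.org/a"}
```

In a YaCy-style network, queries your own index can't answer get forwarded to other peers, each holding its own slice of the crawl.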

Instead, if I type "1+1" in the search box, you send me to Google to solve the problem.


There are two problems with the decentralized/federated p2p approach.

(1) Everyone pays for the infrastructure. That is, you pay for storing and serving the videos in a p2p video service. You pay for crawling and forwarding the search results to users of a p2p search engine. It may be unnoticeable if you run a beefy desktop / server on a 100 Mbit connection from a major metropolis. It becomes worse on your laptop or mobile on a metered connection, or on a rural DSL.

(2) The speed and reliability of a distributed system depends on all its nodes. SREs running a bunch of servers at many nines is one thing. You and I running goodwill p2p nodes is another. To achieve comparable reliability, many more nodes would be needed. Also, with enough nodes, p75 latency may be okay, but p99 is going to be poor. Compare with torrents, where last month's content may arrive in seconds, and obscure content from 5 years ago may take days.
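A back-of-the-envelope simulation of that tail effect (assumed numbers: each replica answers in ~50 ms when online, and each offline replica costs a ~2 s timeout before the next attempt):

```python
import random

def fetch_latency(n_replicas, p_online, base_ms=50, retry_ms=2000, rng=random):
    """Latency to fetch a block: try replicas in turn until one is online."""
    latency = 0
    for _ in range(n_replicas):
        latency += base_ms
        if rng.random() < p_online:
            return latency
        latency += retry_ms  # timeout before trying the next node
    return latency

random.seed(0)
samples = sorted(fetch_latency(5, 0.7) for _ in range(10_000))
p75 = samples[int(0.75 * len(samples))]  # typically one retry: ~2.1 s
p99 = samples[int(0.99 * len(samples))]  # typically three retries: ~6.2 s
```

With 70%-available goodwill nodes, the median fetch is instant but the p99 path burns through several timeouts, which is exactly the torrent experience described above.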

The sad thing is that most people are not ready to shell out the amount that marketing companies pay for the ads they see. Still fewer are ready to make their experience slower and more technically demanding.

For the few that care, there's PeerTube, IPFS, distributed search, etc.

If Mozilla only tries to pursue goals that have a chance of global success, I'm fine with that. Let them keep Firefox and Rust in good shape; that's more than enough for me.


You're talking like we're still back in 2010 or something.

(1) Digital Ocean, Vultr, and Linode boxes have a 300 Mbps port size, under polite use. That goes for every instance type. Search engines like Manticore, or anything besides memory-eating Java things like Elasticsearch, can run in constrained environments in a cluster, and when used with SSDs they are really damn fast. Then there is Cloudflare, whose services, as long as you don't abuse them, can help with massive distribution. Not to mention all of the storage backends that exist with really, really cheap storage: Backblaze, Wasabi; hell, with some ingenuity you could probably use Dropbox APIs for long-term unlimited storage. This is your backbone.

(2) For extra redundancy and better distribution you can add p2p, which, used correctly, can propagate new data to users within reasonable time frames and make data storage more resilient. Easily less than a minute in streaming environments.

Also, while IPFS is nice, it doesn't have the resiliency necessary for the future. Mozilla won't do this, but neither will any other browser-making company.


Sure, the infrastructure is there for the tech-savvy.

But people who have at least a vague idea of how to configure their own server are a small minority.


FWIW, NI URIs[0] were a stab at this. So you'd have something protocol-independent, like <video src="ni:///sha-256;UyaQV-Ev4rdLoHyJJWCi11OHfrYv9E1aGQAlMO2X_-Q" />.

I hoped they'd be part of the subresource integrity[1] spec, but a simpler system was chosen. That said, you could still piggyback on <img src="whocares" integrity="sha384-oqVuAfXRKap7fdgcCY5uykM6+R9GqQ8K/uxy9rx7HNQlGYl1kPzQho1wx4JwY8wC" /> to deliver protocol-agnostic content.

[0] https://tools.ietf.org/html/rfc6920

[1] https://www.w3.org/TR/SRI/
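For reference, an SRI value like the one in the img example is just the algorithm name plus the base64 of the raw digest; a quick Python sketch of producing one (stdlib only):

```python
import base64
import hashlib

def sri_hash(data: bytes, alg: str = "sha384") -> str:
    """Compute a Subresource Integrity value: '<alg>-<base64 raw digest>'."""
    digest = hashlib.new(alg, data).digest()
    return f"{alg}-{base64.b64encode(digest).decode()}"

value = sri_hash(b"alert('Hello, world.');")
# value starts with "sha384-" followed by a 64-char base64 digest
```

Nothing in the value names a protocol or a host, which is what makes content-addressed schemes like ni: URIs appealing: any transport that returns bytes matching the digest will do.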


Hopefully one day we will get there.


Using Firefox on Android, I would only get the 'old' style Google search results. I installed a popular user-agent switcher extension (featured by Mozilla) that made me appear to be Chrome on Google sites, and I got the same rich results page as a Chrome user; it worked perfectly.

I had to uninstall it as I started seeing captchas everywhere. The recaptcha v3 test page gave me a score of 0.1. After uninstalling this jumped to 0.9.


> I had to uninstall it as I started seeing captchas everywhere. The recaptcha v3 test page gave me a score of 0.1. After uninstalling this jumped to 0.9.

Wow, I had no idea this was the cause. I thought it was just Google straight up ranking Firefox lower (whether intentionally or due to bugs) because they don't care about fixing the experience for Firefox users.


Yeah, I noticed that too. Instead of changing my user-agent string I switched to DuckDuckGo.


Anyone at Google want to make an anonymous account and confirm/deny? :)

Stuff like this really disgusts me, if it's true. The saddest part is that non-technical consumers will never care about walled gardens or the gradual centralization of the internet, because the truth is, the more centralized we get, the more convenient everything becomes.


Stuff like this is never an outright order to break competitors. Most of the "features" requested for Google sites would be Chrome-only "features" that break on the rest of the browsers.

Google also did the same thing with Internet Explorer/Edge, where they put a transparent div on top of YouTube videos that was "designed to give focus to videos" but broke YouTube videos for all IE users.


>> Google also did the same thing with Internet Explorer/Edge where they put a transparent div on top of youtube videos that was "designed to give focus to videos" but broke youtube videos for all IE users.

In real life they did it accidentally, and then kept it when they realised it worked against Microsoft. And kept the plausible deniability too? :)


The sad part is that people are naive enough to think that companies that need/want money... are doing things for the better evolution of mankind, or for a better web, or for a better future. Companies are here to make money. Competitive companies will always try to destroy one another.

Google does not give a f... about users or the web... They care, like any other company, about making money, because without money they cannot survive. It's as simple as that.

If the roles were reversed... Mozilla would have done exactly the same thing...


Mozilla is a non-profit.



I'm getting kinda tired of Google's bullshit. It's a small step, but as of today I'm ditching Chrome and moving back to Firefox.


Every step matters!


Was it also Google who decided that Mozilla had to spend a majority of their money on things other than Firefox development?


Yeah, Mozilla is wasting a ton of money, but they won't admit it unless Google stops funding them and they run out of ad cash. Even then, I fear their marketdroids will start selling out their users (see the Mr. Robot debacle) in a bid to avoid losing their parasitic jobs before giving up (and it won't be they who lose).

All they need is programmers to write the code for the browser and someone to manage the servers where the code, bug tracker, wiki, and website are hosted; anything else is an unnecessary and frivolous waste of resources. Get rid of the parasites.


Thank you. From the Pocket stuff to the Cliqz debacle to the Mr. Robot thing that went really wrong, I'd like to see Mozilla focus more on their core product and less on any sort of auxiliary thing. I would love for somebody to correct me, though, if that's an unfair statement.


I second this; I don't think it's unfair in the least.


Er... Mozilla does spend the majority of its money on Firefox development. What makes you say it does not?

(Disclaimer: I work on Firefox; I've seen the public financial statements.)


It does not matter what they are spending that money on. In fact, spending it on development only made things worse.

As soon as Mozilla started relying on Google's money, they were doomed. Now they are staffed by lots of well-paid US developers, and have to continue taking Google's deals to pay the wages.


'Chrome as the only option' is a hugely significant strategic issue for everyone.


Google was also blocking Microsoft Edge at every turn. I still suspect Microsoft countered this by intentionally blocking Adsense/Doubleclick ads like 50% of the time.


I've been on Firefox since 2014. Never looked back.


I've been using Firefox since around 2003 or 2004 - whenever I bought my first MacBook Pro. Firefox was way better than Safari. If I ever do need something that would "require" Chrome, such as a Chrome app, I use Chromium rather than Chrome, since I run only free/libre software on my Libreboot laptop.


Stuff like this seemed to happen a lot back when webOS was a semi-serious contender in the mobile OS race. I remember that many times the downgraded functionality and bugs could be “patched” by changing the user agent. At the time I didn’t think much about it, but it may have very well been a strategy to make android more “competitive”


Gmail's memory usage seems to unreasonably balloon with time on Firefox for me. I wonder if it's intentional...


Fwiw this happens to me on Chrome too, for pretty much every Google app (Drive is probably the worst offender, especially because it's functionally just a static page when I'm not using it!)


The chief way by which Google and the rest of the adweb companies sabotage Mozilla's stated mission is through executive incentive alignment; everything else is mostly irrelevant detail (he said, momentarily forgetting who the audience on this site is).


The same could be argued about Mozilla. They've gone out of their way to oppose Google at the W3C, at the expense of developers and users of the web.

List of good Web tech killed in W3C by Mozilla:

1. NaCl/PNaCl

2. WebSQL (in favor of the far worse and more cumbersome IndexedDB)

3. HTML5 storage / FileSystem API

...

The grudge seems equally bad on Mozilla's side


> 1. NaCl/PNaCl

In favor of WebAssembly, which is a much better tech in terms of actually being able to achieve interop, no?


No, not really.

I contributed to Chrome's implementations of PNaCl and wasm a few years ago. Wasm is still nowhere close to the capabilities of PNaCl and has all the signs of a design-by-committee specification. I will be (pleasantly) surprised if wasm gains more traction.


A major problem with PNaCl, as far as I can tell, was the Pepper dependency. And, again as far as I can tell, Google never came up with a credible plan for solving that problem (e.g. by creating an actual implementable spec for Pepper).



