I'd love to use ClearURLs, though last I checked it had a major flaw: it allows arbitrary code execution by the provider of the filter list. Among other things, it can redirect script URLs to arbitrary sources, and the filter list is periodically updated from a GitLab page, which enables the filter list provider to perform a targeted attack by serving a malicious filter list to a specific device.
The only filter list provider is the extension maintainer, so this information should be safe to share. I have not had the time to set up a PoC, but I'm confident that the filter rules are way too powerful.
At the very minimum, the current filter list should be included in the extension package rather than periodically updated from a remote URL. That way the filter list can be audited and must pass a review, without having a negative impact on the effectiveness of the extension, since the filter list does not appear to frequently change.
I agree with you there. For my stealth browser I decided to go with a different JSON-based format [1] that can rewrite URL parameters via wildcards (with * supported at both the start and end of both key and value).
The idea is that you can audit a website and list only the allowed parameters for it, so that a site's search, sorting order, or filters can still work.
I built my browser on an allowlist-based concept because it seemed impossible to maintain lists of all the bad URLs, domains, and parameters on the web. Most websites have more tracking than content in them, so I decided to maintain lists that select the content rather than the ads and trackers.
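The allowlist approach described above can be sketched in a few lines. This is a hypothetical illustration, not the actual rule format or code of the browser in question; the `RULES` table, hostnames, and parameter names are made up for the example.

```python
# Hypothetical sketch of an allowlist-based URL parameter filter:
# only query parameters matching a per-host allowlist survive.
from fnmatch import fnmatch
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Per-host allowlist (illustrative). "*" at the start or end of a
# pattern acts as a wildcard, as in the format described above.
RULES = {
    "example.com": ["q", "page", "sort*"],
}

def clean_url(url: str) -> str:
    parts = urlsplit(url)
    allowed = RULES.get(parts.hostname, [])
    # Keep only parameters that match an allowlisted pattern;
    # everything else (e.g. utm_source) is dropped.
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if any(fnmatch(k, pattern) for pattern in allowed)]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(clean_url("https://example.com/search?q=cats&utm_source=mail&sort_by=date"))
# -> https://example.com/search?q=cats&sort_by=date
```

The advantage over a blocklist is exactly what the comment says: you audit a site once, list what its search and sorting genuinely need, and every tracking parameter added later is stripped by default.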
Check out Neat URL - it's more basic, uses a comma-separated list of rules, and comes with some hard-coded presets you can override. I maintain my list in a text file and just update that and copy/paste it in when I want to make a change.
Of the defaults, I only override "cid, mbid" as blocking those on every site has ended up breaking some.
The last time I used it, it also disabled ETags by default. I lost many hours trying to figure out why those 10MB Kibana JS bundles were re-downloaded on every page load, and only in my Firefox, checking about:config, etc... I know ETags can be used for tracking and that Expires should be used instead, but I did not expect ClearURLs to do anything more than just clean URLs...
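For context, this is the caching mechanism ETags provide: the browser echoes the validator back on the next request, and the server answers 304 so the bundle is not re-downloaded (a minimal sketch; paths and values are illustrative):

```http
GET /bundles/kibana.js HTTP/1.1
Host: kibana.example
If-None-Match: "abc123"

HTTP/1.1 304 Not Modified
ETag: "abc123"
```

With ETags stripped by the extension, the If-None-Match header is never sent, so the server has no choice but to return the full 200 response, and the bundle downloads again, on every page load.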
I can no longer edit my comment. If someone has the time, please verify this vulnerability and follow up with the maintainer and Firefox reviewers; remote code execution is against add-on guidelines. My impression is that the maintainer is not malicious, but someone could compromise them or the filter list service and hack the entire userbase of the extension.
I mean, you say paranoia, but I think back to the time I had to spend hours and hours unliking instagram posts made by a bot that had harvested our cookies by buying Nano Adblocker.
In this case, we know that extensions are sometimes sold and updated maliciously. Having external arbitrary code is a legitimately concerning vector because it bypasses Google's verification of the extension.
Not that Google are great at their jobs in that case, but it's something.
So it's not paranoia in this case, it's "we can't have nice things" because of real bad actors.
I remember back when adware, spyware, and viruses ran amok on PCs thanks to lax Windows XP security design and an open internet without any effort to protect users. It was bad.
We do need to decentralize the decision making but the progress toward making the web safer for average folks is good.
Freedom isn't free. An open internet where users take responsibility for the risks they take is preferable to a safe but locked-down and centralized internet.
What is even scarier is one controlled by Google because it's the only browser in town: a company that wants to know everything about you and sell it to the highest bidder in order to maximize profits for its stockholders. Everything else is noise to them, except for the occasional public outrage, which they fix with a slight course correction.
If we put everybody in jail we don't have to worry about crime anymore! That's a lot easier than trying to have an informed public who can exercise caution and learn to assess risk in their lives. Besides, only a very small market segment of hardcore freedom enthusiasts really cares about freedom. There's not enough money in that segment to be worth the investment. Most everybody will be happy watching television in their cells. Anybody who doesn't like it is welcome to go to the jail run by our one competitor.
Thanks for mentioning this. While I did install it upon seeing the news on removal, I'll go without it for now and hope for a similar project from the EFF.
I'm seeing downvotes for this, and I'm here to learn: where am I misguided? Is there a convincing argument for installing this program? Let me know; I just want to understand what I may be unaware of, take in the new information, and then correct my decision if it makes sense.
If you don't like the risk this poses, don't use the extension. Your ability to make informed decisions about risk vs reward keeps getting chipped away when Google pulls this kind of stuff. Google should warn you about the security risks (edit: or just remove it from the public-facing store and only keep the hard-to-guess URL active), but it shouldn't tell me what extensions I'm allowed to use. Even local extensions I make myself are treated like a security threat, with a popup every time I open Chrome.
Stop the helicopter computing. People keep saying they want the old Internet back, this is why.
I disagree with this stance. Pulling extensions that have a large potential for abuse is absolutely Google's prerogative, in my opinion.
Suppose our single maintainer decided to finally sell the extension, and the person who bought it made it so that all those links hijacked information or exposed you to malware. This could happen in a single day, without warning. How many people would be saying it was Google's fault for allowing this to happen?
You say people should determine for themselves based on risk, but most users of Chrome extensions are naive when it comes to understanding risk.
I agree; maybe removing the extension from the public-facing web store is a better solution. But at the very least, allow the extension to be installed if you have the "hard to guess" URL. I do this with my own extension, since it requires a desktop app to be installed for native messaging.
If you really want to use the extension, you can clone the extension's repo, enable Developer Mode on the extensions page in Chrome, and then load the extension.
You can probably do this in under 30 seconds, but it's enough of a barrier to keep naive users from doing it.
With Google it's especially true, since their entire existence is centered around gathering intelligence on people and then selling ads with it (and also hoarding it for future use).
You're getting downvoted, but I agree. It's one thing if the maintainer abuses their power as an extension provider; quite another when they have a history of putting out a perfectly good extension and Google acts like they're guilty until proven innocent.
I don't think you understand the issue. There is an accidental backdoor in the extension. The maintainer can manipulate and access the pages you visit at will, without needing to release a malicious update. All these features can be implemented without the maintainer being able to hack you without a trace, there is no loss of functionality if the security issue is patched.
So you're saying Apple should pull Chrome's permission to run on macOS anytime there's an accidental zero day or vulnerability?
If Google wants to act like a platform, it should have some form of escalation with the developer to fix issues instead of complete removal without warning.
It looks like the developer may be in the EU. If they offer the add-on as a business, it may be worth looking at if any of the internet rights legal groups will help take the case up under the 2019 EU platform rules.
These make various requirements around how Google act, and include requirements around removing products from platforms.
As an aside, it seems crazy that we allow platforms to take action when in positions of such clear conflict of interest, but that seems to be the way of the tech sector.
Except remote code execution wasn’t the reason Google claims to have taken the add-on down... come on, it’s a clear conflict of interest, and they could have easily asked for the remote-code-execution possibility to be removed instead.
I really struggle to believe that "Google" thinks this itty-bitty extension, used by a rounding-error percentage of their users, would have an impact on their business model, which was the grounds for kicking it off the store.
This "just" sounds like the typical story we hear so often of an overzealous "app" reviewer waking up on the wrong side of the bed and just decided to delete someone's product and/or business (which is a huge problem itself!)
One extension does not have an impact, but in aggregate across many extensions these things can make a huge difference.
The primary reason Google has Chrome and Android is to maintain ecosystem control and to continue tracking users to support its ad business (or to reduce the threat that other browsers will diminish its business in these areas).
This reminded me of a library I released in January 2005. Part of the embedded docs from the 2011 release:
;;; The @b{urlskip} Racket library provides a function that translates some of
;;; the Web URLs that might be used to track a user across sites, by removing
;;; intermediate HTTP redirectors or information that might identify the user.
;;; Such a function might be used as part of a privacy-enhancing Web browser,
;;; or to canonicalize or un-obfuscate URLs for Web analysis projects.
;;;
;;; Note that @b{urlskip} is not intended to remove information used by
;;; ``affiliate'' referral programs to identify site operators that have sent
;;; users to a site. However, in some cases this affiliate ID information
;;; might be lost in the process of removing an intermediary URL that is used by
;;; a third party to track and profile users.
It had special-case handlers for various URL server authorities and the paths under them.
A lot of the cases it handled were redirectors, which usually meant extracting only the target URL, which was usually in a query parameter but might be in the path, and might or might not be URL-escaped.
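The redirector-unwrapping idea described above can be sketched as follows. This is not the original urlskip code (which is no longer available); the host/parameter table is hypothetical, covering only the simplest case where the target sits in a known query parameter.

```python
# Illustrative sketch of redirector unwrapping: look up the host in a
# table of known redirectors and pull the real target out of the
# query parameter that carries it.
from urllib.parse import urlsplit, parse_qs

# Hypothetical table: redirector host -> query parameter holding the target.
REDIRECTORS = {
    "example-tracker.com": "dest",
}

def unwrap(url: str) -> str:
    parts = urlsplit(url)
    param = REDIRECTORS.get(parts.hostname)
    if param:
        # parse_qs also handles the URL-escaping of the embedded target.
        values = parse_qs(parts.query).get(param)
        if values:
            return values[0]
    return url  # not a known redirector; leave the URL untouched

print(unwrap("https://example-tracker.com/r?dest=https%3A%2F%2Fexample.com%2Fpage"))
# -> https://example.com/page
```

The harder cases the library handled, targets embedded in the path, or not URL-escaped, would each need their own per-authority handler on top of this.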
I was going to link here to the code of `urlskip`, but it's no longer in the package repository where it used to be. (I added a lot of libraries to that repository, and don't recall whether there was some reason to remove this particular one.) It was a pretty niche library, in a fringe language, so its impact might only have been as an example, showing that this could be done, and as a set of rules for it.
Sounds like a great idea! I had a similar service in mind to resolve shortlinks like t.co, bit.ly, etc. This could also be a nice feature for web archiving sites like archive.org.
I was under the impression that the new Firefox for Android has not implemented all the extension APIs yet, and thus can't run most extensions. Of course there will be extensions that do work and are not whitelisted right now, and they need to get on that, but it's not just whitelisting for the sake of whitelisting.
1. Create a collection of addons you'd like to use on AMO (e.g. via desktop mode)
2. Use Firefox Nightly
3. Click several times on the Firefox icon in Settings > About until developer mode is enabled
4. Click "change addon collection" and enter your AMO userid as well as the name of the collection you created in step 1
5. Firefox is restarted, and you can install all addons from your collection (if this doesn't work, clear the cache via the android app settings for firefox)
6. Have fun :)
I hope you see why I still don't consider Mozilla behavior acceptable here.
Many addons work already, but they're locked behind Mozilla's vetting. Not only do you need to jump through additional hoops, you actually need an account on Mozilla's website as well.
This is like saying that Chrome is fine too, just use Chromium, get the addon from a github release and there you go!
Definitely. But I can understand why not everything is added yet: Addon configuration pages are mostly broken, downloads are broken, and some other APIs.
Once those are implemented, which is expected soon (they only implemented the majority of the APIs in October), all addons should work, and it'll be open again.
The big issue with just opening it now is that it'd lead to dozens of complaints due to broken addons, so I can easily understand limiting it to people with enough understanding to go through these steps.
Besides, the Chromium workaround used to be the official installation instructions for AdNauseam for Chrome, until Google marked it as malware. And Chromium doesn't even support addons at all on mobile devices.
That's currently a technical limitation because they haven't built all the infrastructure into their "new" browser to handle all the available plugins and don't want to receive thousands of bug reports they can't do anything about.
Pretty far from user-friendly, though - not a lot of users will do that, which reduces the chance there will be actual demand to fix up the APIs and addons to actually work on mobile.
I tried last year, and honestly it isn't bad. But I use the Chrome "install this site as an App" functionality a lot and Firefox's "app tabs" didn't work nearly as well. Plus, I hear they've removed or are removing said single site browser functionality.
This might be a somewhat niche case but it makes it really hard to switch as much as FF does have some nice features (picture in picture for all video content is very nice).
I think GP means on the desktop. I also use the same function so I can run the web version of Slack instead of the Electron one, on a separate window with the Slack icon. Firefox killed their efforts to support this (single site browser). https://news.ycombinator.com/item?id=25589177
I was trying to help a family member set up a new tablet for their kids, and we literally could not find any Add to Home Screen feature in Firefox anywhere. It's there on my phone, but like 80% of the menu items I have were just not there on this tablet.
There's a billion dollar niche waiting for the right company:
- make a search engine that works
- show text ads clearly distinguishable from results
- play nice, and maybe even use a cool slogan like "we're not evil" or something (it used to be someone else's, but it seems they don't use it anymore ;-)
- Our search engine works and has been doing so for >15 years. Our search quality needs improving, but it improves gradually. And it's independent: our own crawler, index, and infrastructure.
It's possible that at this point I (and others) have trained ourselves to know/do the kind of queries that work on Google, without even realizing it, which would be another thing making it hard to switch. Although in this case... I'm actually a bit surprised mojeek doesn't manage it. Just `github hanami` doesn't get it either. Is it just not matching on sitename at all?
Hello, mojeek dev here. Cheers for the feedback, appreciated.
In this case we simply don't have the page in our index, though we do have others mentioning hanami. Our bot is permitted to crawl github.com and has a good number of pages in from that host, we'll evaluate whether we can increase crawling for github and similar large sites and hopefully before long that page will enter the index.
I tried some development-related queries that I have recently done in Google.
Queries related to Go seem to mainly work only if using Go and not Golang (unless "go" + term is popular outside of Go as well). Usually people use Golang in search (to avoid confusion with the verb and the game), but pages generally refer to Go. "go package XXXX" seems to work better in many cases.
With a bit of lesser known technologies, it was hard to find a query that would get me to the actual site. e.g. Python SDK for OCI. Lots of links to examples with various queries (python oci, python oci sdk, python oci api), but not really any direct link to GitHub or the official documentation.
Hello, mojeek dev here, thanks for the feedback it is always appreciated.
I think there are two ideas that come up from your feedback; one is index size. Our index is small but growing. A larger index increases the chance of having pages that satisfy your query.
The other aspect is boolean search versus something more akin to the vector space model. We've found that a lot of people who are dissatisfied with Bing/Google searches tend to be unhappy that the search they actually enter is somehow modified to include what the search engine believes are relevant synonyms. In some cases those synonyms may help produce a better result, when use of language is split between two terms used interchangeably, like go and golang. It's something we're looking into. We value searches being based on what is actually searched for, but we also accept there are cases where assuming synonyms may be advantageous to the end results.
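As a toy illustration of the two behaviors being discussed (this is made up for the example and in no way reflects Mojeek's implementation; the synonym table and documents are invented):

```python
# Toy contrast between strict boolean matching (every entered term must
# appear) and synonym expansion (equivalents like go/golang also count).
SYNONYMS = {"golang": {"go"}, "go": {"golang"}}

DOCS = [
    "writing a go package for parsing urls",
    "golang http client examples",
    "board game strategies",
]

def matches(doc: str, query: str, expand: bool = False) -> bool:
    words = set(doc.split())
    for term in query.lower().split():
        # With expansion enabled, a synonym of the term also satisfies it.
        candidates = {term} | (SYNONYMS.get(term, set()) if expand else set())
        if not candidates & words:
            return False
    return True

print([d for d in DOCS if matches(d, "golang package")])
# strict: no document contains both "golang" and "package" -> []
print([d for d in DOCS if matches(d, "golang package", expand=True)])
# expanded: "go" satisfies "golang" -> the first document matches
```

The trade-off the comment describes falls straight out of this: expansion rescues the go/golang split, but it also means results can stop containing the literal terms the user typed.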
If you do this, it'd be great to have a way to select the 'mode' of search (exact query vs 'smart' terms). I'm not sure what the user interface ought to be, but "literally" "anything" "would" "be" "better" than the contortions you have to go through to force search engines to search for the terms you've actually provided.
Great job so far, by the way, keep up the good work!
But IMO it is a far cry from 2007 Google and only holds a candle because Google has nerfed itself.
And sadly, much of it isn't because they don't have resources for AI or even larger index but because of the same QA issues that Google has struggled with:
- including results that don't contain my search terms / too much fuzzing
- ignoring double quotes
Both probably in order to please the mythical ordinary/average user, I guess.
Guess what: techies got me to switch to Google when it was a startup, and I guess we will get people to switch to another search engine as soon as one is ready.
In the meantime I use DDG. The difference is mostly negligible now and when it isn't it is 20 times faster (I don't think this is hyperbole) to mash in a !g at the end of my DDG search than the other way around.
FTR: the same goes for browsers. For me, Firefox has always been the best, but they have nerfed themselves and keep ignoring us techies to such a degree that I will, if necessary, pay monthly to get a safe, supported version of the same browser with my old extensions working; but not to the Mozilla Foundation, only to the Corporation (the ones who create the browser) or someone else.
Do you realize how expensive and hard it is to build a search engine from scratch? It's not a coincidence that only state-size corporations have been able to keep a high-quality web index going. Cliqz tried and struggled hard, until they decided to shut down. It's not a "market ripe for disruption". It's maybe the market with the highest barrier to entry in tech.
Independence on its own would not be a competitive advantage. In reality, for the most part, people do not care enough about minor privacy violations. The outrage you see online rarely bleeds into the average user's day-to-day decisions.
I wonder how the next disruptive innovation is going to look in the search engine market?
>I wonder how the next disruptive innovation is going to look in the search engine market?
I think the search engine market will always be very closely linked to content distribution platforms. If the decentralised web continues to degenerate into an oligopoly of walled gardens then there will be no search engine market in the current sense. We will just use the search function provided by each platform.
I believe the question we have to ask is what the next disruptive content distribution platform is going to look like, and whether that disruption can be anything more than yet another oligopolist stealing some share from the incumbents before getting bought by one of them (or not).
For me, it only broke websites when I had the extra options enabled in settings (they're enabled by default). I’d recommend turning them all off and trying to use it again.
If you, like me, were wondering what that meant (or whether it is an automatic distinction like "Amazon's Choice"), this is what they say gets a "Recommended" badge.
>Recommended extensions are editorially curated extensions that meet the highest standards of security, functionality, and user experience. Firefox staff, along with community participation, selects each extension and manually reviews them for security and policy compliance before they receive Recommended status. These extensions may also qualify for promotions on the AMO homepage and other prominent locations. Developers cannot pay to have their extensions included in this program.
Just to add my own data point: I have some extensions in the Chrome Web Store and, from time to time, Google sends me a notice that they violate privacy policies (but they don't collect any data) or that some permissions are not used (but yes, they are).
So, after explaining and linking to the source code, they usually reply with another canned response:
Thank you for reaching out to us. We took a closer look at your item again and found it to be compliant with our policies. Your item has been reinstated and will be available in the store shortly. We apologise for the inconvenience caused to you in this matter. We value your contributions to the Chrome Web Store and look forward to working with you.
So maybe there is some Hanlon's razor at play here, too.
Sufficient incompetence is indistinguishable from malice, and should be treated similarly (at least when dealing with companies). In my opinion, a company that regularly flags things that are compliant with the rules is Bad regardless of motives.
> The reasons for this are ridiculous and probably only pretended because ClearURLs damages Google's business model. […]
> Among other things, it was claimed that the description of the addon is too detailed and thus violates the Chrome Web Store rules. The mention of all the people who helped to develop and translate ClearURLs is against Google's rules because it could "confuse" the user. Ridiculous.
> Also, Google criticized that the description of the addon did not mention that there is a badge, an export/import function for the settings, a logging function for debugging, and a donation button. This would be "misleading".
> Last but not least, it was criticized that the "clipboardWrite" permission would not be necessary. But that's not true, and I've had a description for each permission in the Chrome Web Store Developer Dashboard for well over a year now. So the "clipboardWrite" permission is needed for writing clean links via the context menu into the clipboard.
> it was claimed that the description of the addon is too detailed and thus violates the Chrome Web Store rules.
This one does make me laugh more than the rest, coming from Google that names their apps in the Play store as follows:
"Android Auto - Google Maps, Media & Messaging"
"Files by Google: Clean up space on your phone"
"Google Chrome: Fast & Secure"
"Google Duo - High quality video calls"
"Phone by Google - Caller ID and spam protection"
If this is a policy, perhaps they'll delist their own apps from the store?
Google rejects a plugin, saying the description is too long and wordy, and doesn't cover points that it should.
So either Google is trying to block a plugin for evil business reasons, or Google is trying to improve the Chrome Web Store, making it less confusing for users and easier to search.
Improving the description will probably make more people install it, not less.
It needed different details. In the reviewer's opinion the description included irrelevant details while omitting information which would be relevant to someone trying to decide whether or not to install the addon. Which is a perfectly reasonable assessment IMHO. A small amount of "flavor text" is fine, but the main purpose of the description is to ensure that prospective users can make an informed decision. Anything else can go in the app's "about" page or on a separate website.
Now if we could just get app stores to mandate useful changelogs… No, Google, "Bug fixes and performance improvements" doesn't cut it. Describe the bugs that were fixed and where and by how much the performance was improved. Justify spending the effort and risk of updating the software to the new version. There is no point in a changelog message that could be applied equally well to every release of every software product ever made.
More fuel for the Google antitrust / breakup fire.
Google should not be allowed to develop Chrome or have any say in web standards. Every play they make favors themselves: unsemantic HTML5, AMP, crippled and removed extensions, progressive removal of the URL bar, HTTPS everywhere (no more self-hosted blogs unless you understand cert signing and automated renewal - why did the web stop being easy?), cookie standards that favor their moat, "acceptable ads" policies, reCAPTCHA, etc.
I don't disagree with what you're saying, but I feel HTTPS everywhere does not belong in that list.
Secure by default doesn't sound evil to me, and Let's Encrypt made it easy enough to get free HTTPS certificates (and for non technical people, almost all hosting services I've seen offer it out of the box)
A static blog that takes no user input/data will still leak the pages on that blog you visit, and the times you visited them. Knowing that you went to a particular page on a particular blog is a lot more information than knowing if you went to a domain. If I know you read about Conan the Barbarian on three different blogs, I know to send you ads about Conan the Barbarian (as a trivial example.)
I think authors want to ensure their writings are not edited, censored or otherwise tampered with while in transit to the reader. HTTPS isn't perfect, but this is one of the benefits it provides. It isn't only about encryption or privacy.
The NSA (and who knows who else) has the ability to tamper with TLS encrypted traffic, so that’s a moot point. Also easily defeated in the client by adding a rogue proxy and CA.
I don't think there's anywhere near a consensus that your statement here is true. In fact, I think it would be a surprise to the majority of the tech community if that were the case.
In a perfect world, sure, static sites don't need HTTPs. However, ISPs and other malevolent middle-parties have demonstrated why HTTPS is a must.
We've seen everything from injecting tracking javascript, to injecting their own ads, to outright replacing content with unrelated content that the ISP wants to push.
In a perfect world, we would not secure the transport but the content itself, and everyone would be able to build their own web of trust. Why do I as a (web) publisher and my readers have to rely on the grace of just a few root CAs? I know it is technically possible to import my home-made CA cert into browsers, but it's not made easy: my server cert cannot be signed by more than one party, and Android requires an unlock code in order to add custom CA certs. When I first saw this I thought "why the hell?"; I can imagine it is a safety feature for simple users, but come on!
That link does not load for me. The redirect to the captcha is broken. Sincere question - is that the point? In other words, does Google block the captcha from loading since the site isn't using HTTPS?
Let's Encrypt has a de facto monopoly. I think HTTPS everywhere would be fine if we had dozens of projects like Let's Encrypt; otherwise this is just handing too much control to one organisation.
Let's Encrypt doesn't owe anyone free certificates, either. The point is that AWS isn't an alternative unless you're spending money with AWS. Nobody is wondering whether you can get a certificate by paying someone.
> Let's Encrypt made it easy enough to get free HTTPS certificates
Just checked.
My hosting provider asks 2x more money for the SSL addon (which includes a unique IP, unlimited subdomains, and a free certificate). They wrote on the support forum that I need that addon regardless of which certificate I use, whether the included free one or any other like Let's Encrypt.
Not gonna switch hosting nor pay 2x more for it just to please Google.
My web site has no comments or other user-generated content, runs no CMS, uses no cookies, collects no data except standard web server logs, hosts no executables, and has no secret nor security sensitive content.
At Starbucks I can inject arbitrary content into the browser of anyone who visits your site over HTTP and take control of their browser.
Furthermore, congrats on your site but you’re 0.01% of sites like that. Should we keep an insecure web because your hosting provider is ripping you off? TLS is easy and free in 2021.
> Furthermore, congrats on your site but you’re 0.01% of sites like that.
Thanks to the rise of the almighty platforms we've lost the will and know-how to do it ourselves.
> TLS is easy and free in 2021.
Only if you're relying on complicated cloud infra or (non-free) managed providers that do everything for you. It's a lot of work to set this up on your own.
It's impossible to be simple at this point. It's like the automotive industry which collectively decided to use computers for everything. You can't repair things yourself now. It's ironic, too, because now the industry finds itself with a chip shortage. I can imagine lots of scenarios where our complicated infrastructure requirements bite us.
There should always be the option of not using TLS. It should be first-class and not require expertise to access or use.
It's actually very easy to set up a TLS server using certificates from Let's Encrypt or any other ACME-compliant certificate provider. If you're using Apache, mod_md[0] will manage all the details for you. After enabling mod_md and mod_ssl, a simple TLS server only requires a few lines of extra configuration compared to a basic non-TLS site:
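To give a sense of scale, a minimal mod_md setup might look something like the following (domain names, paths, and the email address are placeholders, and exact directives may vary by Apache version; this is a sketch, not a drop-in config):

```apache
# mod_md and mod_ssl must be enabled first (e.g. `a2enmod md ssl`
# on Debian/Ubuntu). mod_md then obtains and renews the certificate
# automatically via ACME (Let's Encrypt by default).

MDomain example.com www.example.com
MDCertificateAgreement accepted
ServerAdmin admin@example.com

<VirtualHost *:443>
    ServerName example.com
    SSLEngine on
    DocumentRoot /var/www/example
</VirtualHost>
```

Everything else, the ACME account, the HTTP-01 challenge, renewal, and reloading the certificate, is handled by mod_md without external scripts or cron jobs.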
If you're using Nginx rather than Apache I believe it still requires an external script to handle certificate renewal, but the process remains fairly simple. The same scripts will also work with Apache if you don't want to use mod_md.
Users can decide: find a browser which doesn’t put importance on cert usage. You’ll struggle to find one, because every browser manufacturer realizes that 99.9% of users cannot make sound security decisions, so they shouldn’t have to. Things should default to secure.
There’s a trade off between protecting users and having a 100% free and open internet. An insecure internet is untrustworthy and therefore not useful, IMO.
This is far more common than you think. ISPs, hotels, cafes, and mobile providers do this en masse. Have you forgotten the NSA’s “SSL added and removed here”? That was a highly targeted attack against infrastructure; what we’re discussing here is 10x easier to achieve.
> And even if it was the risk is just crap injected into someone’s blog.
That “crap injected” has full control over the DOM, any authentication, and everything displayed. How many of your users would happily put their creds into a fake login modal that popped up claiming to be SSO for a popular identity provider?
Without encryption, an active attacker could redirect users to a different website, which would collect more data than your website normally does. They could also inject ads and JavaScript into users' sessions through your website.
Redirecting an unencrypted webpage could be the first step a hacker uses to take over a user's computer. It's best to minimize attack vectors as much as possible
It doesn’t matter much what your web site has today. If it’s available over HTTP an attacker can inject whatever it wants into the page without too much trouble at all.
I think you're mixing different problems here. You can't blame Google nor HTTPS if your hosting provider is trying to rip you off. You don't need unique IPs or unlimited subdomains to get an SSL certificate, these are just forced requirements from your provider.
Your hosting provider sounds incompetent. There is no need for unique IP to host TLS encrypted website, SNI support nowadays is ubiquitous. Let's Encrypt issues wildcard certificates for free as well.
There is no technical reason for asking 2x more money for encryption in 2021.
I agree with you except for the HTTPS part. In some nations it's not unheard of for ISPs to inject ads and tracking into webpages. This also opens the door for malware.
Let's Encrypt with its certbot made it easy enough to get a cert, and every major webserver supports HTTPS out of the box with good documentation.
The certbot client is pretty awful (it does not cooperate well with automation) but otherwise I agree 100%. HTTPS everywhere and Let's Encrypt have been huge boons to security.
Some of its commands modify the config files so if you ever need to fix something manually you can very easily end up changing the config and breaking something.
Certbot seemed pretty easily automated to me as a guy who just set it up for the first time a few days ago on a new Arch install. The wiki is a pretty great resource if you're still having problems.
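For what it's worth, certbot can be run non-interactively without letting it touch the webserver config at all; a rough sketch (domain, webroot path, and email are placeholders):

```
# --webroot answers the ACME challenge from a directory, so certbot
# never edits the Apache/Nginx config files.
certbot certonly --webroot -w /var/www/html -d example.com \
    --non-interactive --agree-tos -m webmaster@example.com

# Renewal is then just a cron job or systemd timer:
certbot renew --quiet
```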
I second that. The whole problem is that Google is supplying a trojan horse that they use to prevent people from developing solutions for protecting their privacy, and in fact they use it to gather even more data about people -- something that ensures an unfair advantage over the competition.
The layoffs last summer tell us all we need to know about the direction Mozilla's headed. I think they're toast. Brave and -- dare I say it? -- even Edge have brighter prospects.
They're all based on chromium though. That gives Google a lot of leverage because every time these projects decide to do something different from upstream it adds to the maintenance burden.
If Google decides to make some fundamental changes to their core engine that would make, say, ad blocking a lot more difficult, would the other chromium-based browsers deep-fork the entire codebase to keep ad blockers working while at the same time integrating the new features as fast as possible in order to remain competitive with Chrome?
Microsoft has the resources to do it, but they may not care. Brave and Vivaldi would almost certainly care, but they may not have the resources to do it.
I could spend all day criticizing Mozilla but I'll use Firefox to the bitter end because of this. In the end there are only three engines in widespread use these days: Chrom(e|ium), WebKit/Safari and Gecko. As far as I know Safari is irrelevant outside of Mac world, so losing Gecko would be terrible for the open web.
Gecko is already effectively lost -- the groups doing cutting-edge work were cut back in August. We're just watching the implosion.
I switched from Firefox to Brave in October and suggest others do the same. Personally, I believe Brave would indeed "deep-fork" the Chromium codebase if necessary, and I'd guess there's a significant chance that other Chromium-based browsers would use it in preference to a submarined upstream.
It seems as if Google or its moles are monitoring this site, because it's not the first time a reasonable anti-Google comment gets treated like this, with no replies explaining why.
Bear in mind that many people are subconsciously unhappy with their workplace's policies and effects on the world, but at the same time have linked their personal esteem and identity with their work.
By threatening their self-esteem, their very identity, by validating what they know deep inside them, that they work to do evil without wanting to, such claims force them to strongly reject and defy them, to reassert and restore their veiled view of themselves.
Even if socialism happens and the employees come to own the company, they would have even more of an incentive to act like this. Blaming this on capitalism seems like a baseless claim in this specific situation.
I did find that the codebase does not seem to use any `browser.clipboard` API, so the `clipboardWrite` permission seems to be unnecessary. According to [MDN](https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/Web...), the `browser.clipboard` API mainly exists to enable extensions to write image contents to the clipboard, and all ClearURLs needs is writing text. Also, [another MDN page on clipboard interactions](https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/Web...) states that extensions do not need the `clipboardWrite` permission to write to the clipboard, e.g. by using `document.execCommand('copy')`.
Hopefully, removing this permission could help get the extension restored.
Confusingly, according to the MDN document I linked, that permission is not needed.
Clipboard APIs are in general pretty permissive about writing: any website can write anything to the clipboard without requesting any permission, as long as it happens within ~1000 ms of a user interaction. So you don't even need an extension to write to the clipboard.
It's a shame that Firefox Android still hasn't got its shit together wrt add-ons. The current selection is abysmal. It would be nice to have the same addon setup on desktop and mobile and have settings sync across devices.
Chrome has many convenience features, especially for users of other Google services, that effectively lock you into it. On top of that, some Google websites don't work all that well in other browsers (except where they are forced to play along - e.g. Safari on iOS), either because they use features that Chrome implements first, or because they specifically check which user agent you're using.
(It should also be noted that this latter bit is not something new to Google, either. I remember how Opera complained about them doing exactly that with GMail back in... 2009, I think?)
Claiming that something has "better" security is meaningless without first defining the threat model. When the threat model includes protecting the user's privacy, Chrome is not merely insecure, it is an actively hostile threat.
Pretty bad, and of course private company etc. etc., but unfortunately I mostly browse on mobile, and:
Safari: no addons. Chrome: no addons. Firefox Android killed almost all addons. I had found here on HN that by using the Kiwi browser I could import Chrome desktop addons, but now even this is adding more friction, for my own good apparently, because of obscure, legitimate-looking, dark-pattern-coloured reasons.
All those browser engine speedups are interesting, but I prefer the mobile web experience customizable, as it was a few years back.
Btw if anyone here forks FF and enables customization also plz add bookmarklets! Would pay for that...
You can install any add-ons (including ClearURLs) from addons.mozilla.org onto Firefox Nightly for Android or Fennec F-Droid (a fork of the stable version of Firefox for Android) by following these instructions:
I use Firefox Focus on mobile which lets you reset your entire browser and cookie history with one click. It also has some built-in tracking protection and content blocking.
While it doesn’t solve the issue of add-ons, it has enough of a privacy focus to make me feel better about reducing my footprint on mobile.
What I don't get is why Google is allowed to provide a browser and be a major content provider at the same time. If this is allowed to continue then it seems inevitable that an unhealthy monopoly will form where Chrome and its derivatives are the only browsers that can consume the modern web, and where opting out of tracking is not a possibility for anyone but the most technically inclined (although that's almost already the situation now).
Come on EU, step up. Google doesn't need protection here, they'll survive even if they cannot retain control over the development of Chrome, and in particular the authority over the extension store.
> What I don't get is why Google is allowed to provide a browser and be a major content provider at the same time.
For the same reason you are allowed to make this comment or I am allowed to reply. There is no explicit law prohibiting them from providing a free browser while also being dominant in the content space.
There is a fair argument to be made that Google's vertical integration of internet advertising could be running afoul of antitrust laws, if those were still being enforced in the US.
Well whenever the governments of the world get their heads out of their collective asses and learn what a computer even is, then we can have competent legislation.
The general solution is to do a little of both. The job of the regulators isn't to just kill off specific companies that got too powerful - it's to create conditions where disruption is possible. An important way to do this is to prevent big players from abusing their power to make themselves nearly-impossible to disrupt.
But that's the whole point of the SV business model! Strangle any competition by initially offering your products for free (and funded by huge amounts of VC), and then raise prices at your leisure once you're the last one standing.
I suppose this is tolerated because there's usually more than one party trying it at the same time, so there's technically a competition going on - but it doesn't take into account the collateral damage done to the entire market. VC subsidies run out, eventually, and if the upstarts haven't found a path to profitability by that time, society is left with a damaged market segment.
> The job of the regulators isn't to just kill off specific companies that got too powerful - it's to create conditions where disruption is possible.
Broadly speaking, big regulations tend to create conditions where disruption is harder. Regulatory compliance is often a substantial investment, and thus a barrier to entry; it's a lot easier for big, profitable Google to hire a team of regulatory compliance officers and GDPR architects and the like than it is for a small startup.
That depends. Often enough, regulatory burden tends to scale with size as well! But I agree that quite a lot of compliance requirements apply to everyone, and that they're much harder for small companies to meet.
However, the reason for those requirements isn't, ostensibly, to make disruption difficult - it's to protect society from problems that can be caused by companies of all sizes equally easily. For example, safety regulations about medical equipment. Or data protection regulations. To the extent these laws make disruption difficult, it's often an unfortunate cost of safety.
(I understand though that the real world isn't perfect and big players tend to influence these laws to their benefit. But perfect being the enemy of good, and all that.)
That's exactly why we don't need "regulatory compliance" but should straight up break up monopolies into multiple independent companies, like what should be done in this case here.
We need a regulation that will automatically deal with such monopolies by ordering companies to split after reaching certain thresholds. This way we would never get companies too big to fail.
We'll need a second bit of regulation then (and I'd say, possibly we need it already) to counter the obvious workaround - creation of a thousand of small companies doing roughly the same thing, controlled by the same group of investors.
Then what is the incentive to start a company and grow it within the legal and regulatory framework, if when it gets to an arbitrary, non-defined size the legal and regulatory bodies systematically dismantle it by forcing you to sell it off?
There should be no incentives to grow a company past a certain limit. Ideally, that limit should be established well in advance.
And in US, it was back in the day - but then, thanks to Reagan and Robert Bork, we effectively threw away all the anti-trust legislation that was functioning quite effectively for 80 years. The large businesses, of course, heavily lobbied in favor of that.
So, they have created the present environment - and now that we have seen (yet again) why anti-trust enforcement has to be proactive, they're complaining that it would be unfair to revert to old rules?
It's not selling off but splitting. The goal should not be to create 100 billion super company, but to provide value to consumers. Companies that would be at a stage where they would be required to split have different priority - to extract value from a consumer and maximise profits for shareholders. That will be a target, not your small or medium business.
The same incentive to start a family or a restaurant.
I don't know why people open restaurants - do they know the profit margins? They must, and they do it anyway.
So it must be for the love of it.
This idea that starting a business just to make money is a good idea, is tenuous at best. It's a sign of the times that people even ask the question of 'well, if I don't get rich, why would I do it at all?' It's a sign of how dysfunctional work has universally become, that people view it as suffering, to reach a promised land, rather than a calling to do your part as part of a larger community.
If it's suffering, maybe something is wrong. Taking away the idea of heaven enables you to say no to resistance you're experiencing now. That's a good thing - don't believe anyone promising you a heaven later on - how would they know and if they're in it, is mentoring you to suffer really a part of their heavenly experience?
Suppose we take this incorrect argument: you start a company, but at an arbitrary size you have to sell off chunks. Okay, you made a profit; that's the point of capitalism.
Are you suggesting we should instead cater to megalomaniacs who wish to run maximally large companies and rule industries?
Because the company is still intact, just split in two. It's obvious that there need to be limits on capitalism when companies get so large that they can dictate what the internet, or any market, does on a whim because they're the only game in town. Amazon and Google are quickly approaching that, if not already there. They can crush competitors on a whim.
Money is actually power (via buying other people's time and effort to spend on one's problems). Money also behaves like mass in the universe: it sticks together in large clots. So power does too.
At some point some companies have become too big to disrupt. I think the best way is to partition companies that grew too big, that could be called disruption if you will. I'd like to see Alphabet holding to be dismantled and all products moved into separate independent companies.
> But that's the whole point of the SV business model! Strangle any competition by initially offering your products for free (and funded by huge amounts of VC), and then raise prices at your leisure once you're the last one standing.
How is that different from a supermarket offering its own brand? Costco offers Kirkland; Whole Foods offers its 365 brand.
Apple offers various Apple apps even though they provide the platform: bundled ones like Pages and separately sold ones like Final Cut Pro, etc...
Vertical integration is fine; the problem is the part where one or more pieces of your vertical give you too much control over the markets around the other parts of your vertical.
Well you don't have to use their browser to consume content on the internet and I never have used it to consume content from their sites.
Now if they start to force it where you have to use their browser to access their content it may be an issue depending on how much competition is in that space.
Or worse, they use their influence on standards committees to effectively lock out other browsers. That would be worth taking them to town over.
Google is no more the internet than any other company. You never have to use Google or their products for anything, but many do because it's just there and it just works.
What I don’t get is how many howl and cry foul at Apple for its Store “monopoly” while anyone can opt-out of it by switching to any other of the myriad competitors.
While in this case, Google is very likely abusing its dominance to coerce the whole ecosystem beyond their own platform in their favor: if the web becomes “works with Chrome” everyone is (probably negatively) affected.
Also, Apple isn't even close to a monopoly in the phone business. It's pretty easy to buy a phone from a different company, last I checked. Chrome is quickly becoming the only game in town for internet access.
I don't own an Apple product and probably never will, so I'm not that familiar with the situation there. In general though, I'm not that fond of private entities being regulators of markets. This is what democratic institutions are for, but unfortunately we don't have strong institutions in this space.
For what it's worth, this argument (that users are actually able to switch to Android if they don't like the iOS App Store, and that this constitutes competition) will be tested by the courts in the Epic vs Apple case.
Epic says this is not a real factor for consumers, despite it still harming them.
Now that you mention it, this is a pretty evil strategy. It may seem that developers and businesses have all the freedom to create and run their sites and apps as they wish, but if they don't comply with Google's view of how the web and apps should be, they...
... rank very low in search results and are therefore displaced by competitors that play by Google's rules
... don't get into the Play Store or get kicked out
... can't express an opinion on any Google platform that goes against their (vaguely formulated) terms of use, otherwise their account gets blocked in no time.
While Apple does not have a monopoly over the smartphone market by any meaningful definition, what it does have is a vastly outsized mindshare.
Not only did Apple invent the modern smartphone, they created the first modern App Store. This, combined with their high cultural cachet, allows them to define by their actions what an App Store "should be" in many people's minds.
Personally, as a lifelong Apple user, I'm both concerned about Apple's singular control over the iPhone app market, and concerned what would happen if that singular control were broken.
I don't get it. Most non-tech people I know are entrenched in the Google ecosystem and belong to one of these groups:
a) don't know the extent of Google's digital invasiveness
b) know but don't care
c) care but either not enough to leave it or don't have a choice really
I don't know anyone from these groups who is aware of ClearURLs or what it solves; they probably would never have installed it.
On the other hand, tech folks who know and care about what Google is doing, are perfectly capable of installing Firefox/Opera etc and get around it. Removing this extension doesn't do anything to affect these people.
What do such dick moves actually achieve? They just make most people either indifferent or hate them more.
It doesn't remove all trackers from the URLs yet, but at least you can copy what is displayed in the status bar when you hover over a link. That's useful, since a lot of tracking sites (like FB) swap that status-bar link for the tracking one just before you click.
I think I'll add some cleaning process in the future.
You're making the massive assumption that Google cares and wants to change the status quo they've established. There is no evidence of this being the case.
I use Chrome on a slower PC. Although Chrome consumes more RAM than Firefox does, it actually feels noticeably faster on any PC. On old slow PCs (and mobiles) this makes a hell of a difference. Firefox became usable again when it abandoned XUL (for some years before that it was pure torment to use, so I had no choice but to switch to Chrome) but still isn't as fast as Chrome. I only use Firefox happily on faster PCs.
My daughter had a problem on one of our computers with Firefox being unable to launch Roblox (frankly, Roblox's launching mechanism seems buggy; I assume it's just a URI scheme doing the launching), as a consequence of which she now tends to use Chrome. Which has the benefit that when I get on that computer, she uses Chrome and I use Firefox, so my browsing experience doesn't get littered with an 11-year-old's behaviors.
And 11-year-olds grab the computer when you're not there and use it; actually from the age of 8, I think.
But yes, browser profiles and OS-level multi-account support are tools that would solve this problem if they worked for this particular scenario; fair to say they weren't really designed for it, so there's no reason to expect they would.
You can have your child use the default profile, and have a shortcut whose target is `firefox -P <adultprofilename> --no-remote` somewhere out of her way. This way, she can just open the browser and use it (on the default profile), and you can click this shortcut to open your profile easily.
Just an option to keep in mind in case you decide to get rid of Chrome in the future.
yeah but the scenario is really, dad is on computer, dad goes in kitchen to make food, child goes gets computer, opens new tab in running browser.
It's really a parenting issue, I could stop her doing it but I don't care enough to do so, and if I don't stop her via parenting there will always be ways she slips through the profile guards.
> No difference with real sites. A developer may notice some difference in latest features support (not always in favor of Chrome).
I use Firefox (desktop and mobile/Android). You sound like a Firefox evangelist. There are real sites where Firefox is buggy or slow. Youtube is one of them. I know it's not technically their fault, but the average user can't fix the site. They can (and do) change their browser. So they use Chrome.
I found it unbearably slow on my Galaxy Note 3 with uBlock. The only reason I tried it was uBlock. So I rarely do. I've switched to Brave.
> You sound like a Firefox evangelist
Kind of, but an honest (if not say cruel) one - I always say Firefox is slow. I also use Chrome on slower machines.
> There are real sites where Firefox is buggy or slow. Youtube is one of them
I don't know a single site where Firefox would be buggy, but it indeed is slow - on all of them. On YouTube in particular, yet still pretty much usable. On a modern PC the difference is easy to ignore. I watch a lot of YouTube in Firefox every day and I enjoy the picture-in-picture mode. I have hardware decoding enabled, and high-FPS and high-res (above 720p) modes disabled, as I don't appreciate them.
Firefox's default scrolling behaviour didn't feel as responsive as Chrome's. I changed a few scrolling-related settings in about:config and now Firefox is a pleasure to use again. With the uBlock Origin, Cookie Autodelete and ClearURLs add-ons, it's better for privacy too.
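For anyone hunting for those settings, the commonly adjusted scrolling prefs look roughly like this (the pref names are real about:config entries; the values shown are only an illustrative starting point, not a recommendation):

```
// user.js sketch - values are a matter of taste
user_pref("general.smoothScroll", true);
user_pref("general.smoothScroll.msdPhysics.enabled", true);
user_pref("mousewheel.default.delta_multiplier_y", 100);
```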
I really wish someone at Mozilla would take the time to document the settings related to scrolling or make a GUI to simulate the behavior you're setting up. Getting it to work "close enough" to how well it works on chrome took a bit of time and was basically looking through forums where people mentioned what had worked for them.
It's not the most central thing a browser has to do, but every user does it so often and it's a constant annoyance when it doesn't work the way you like/expect it to.
As a developer, Chrome's tools are pretty good. As a user, the sync options for history and bookmarks are what keep me in. It's definitely not perfect but it's fine for now. I like Firefox but not enough to switch
I was first a Firefox user, switched to Chrome, back to FF and finally to Chrome once I got a smart phone and they added sync
For web development, it's kinda mandatory as nobody wants a site that doesn't work in the most popular browser. That doesn't mean you have to use it for browsing as well though- I found that having Chrome for testing / debugging and Firefox for browsing was quite a productive setup last time I did web dev.
Chrome's dev tools just work better for me, too. For instance, when there's an error in the console, clicking on the filename/line does nothing most of the time in Firefox now. I once fixed it by resetting Firefox (per some instructions on the web) but it broke again within days.
Chrome's just works, every time.
Oddly, it sorta worked for me in FF the other day, but it took me to the minified source, which wasn't terribly helpful in an app that I'm writing.
I still use Firefox 99% of the time, but it really pains me that that one feature is so broken.
Firefox's tools are still competitive with Chrome and there are even a few things I miss about them when debugging in Brave. It gives a little joy to know that Mozilla is still competitive in something, but not holding my breath for it to stay that way.
I left Chrome for Firefox about 2 years ago because Chrome was becoming slow (I was more likely the reason) but Firefox was always “just okay” in many regards. No automatic omnibox search that I enjoyed since probably 2010 in Chrome, for example. That feature alone makes navigation a breeze.
Then last summer probably I switched to Safari. Nice browser, but very barebones if not for the beautiful Reader view, reading list and iOS Keychain integration. Generally smooth but extremely sluggish on the new Facebook site.
I don’t use Chrome because my dislike for Google grows by the day, but I recognize that Chrome is still the best at what it does, and that’s why many still use it.
If Edge supported the keychain I’d probably consider it.
I enjoy Edge and Firefox a lot and never use Chrome. I certainly think that Chrome being Google's browser is grounds enough to look critically at it aside from everything else.
I recently moved back to a Chromium-based browser (Edge). Even on a fairly powerful system (16 logical cores / 4 GHz) it's just noticeably faster. SPAs are actually usable again. Video decoding is hardware accelerated and I can now play 4k videos without every interaction with the browser lagging like crazy.
Firefox is great, theoretically. But even then, Mozilla keep making inane decisions - lately their plans to remove compact mode [0] were in the news. Why?!
I don't like how mouse scrolling works in Firefox. I can't determine exact details what's wrong (jitter? inertia? delay?), but it feels subtly broken, and after a while it makes me nervous.
I use it on some older machines, where video is choppy in Firefox. I also use it for videoconferencing websites on my work laptop, because with Firefox the fans (sometimes?) go out of control.
I missed that too, but it turns out I need it very rarely. For when I do, I have a userscript that injects the GTranslate widget into the page. It's basically the same, albeit with an uglier button. I don't think it's available as an extension, since it by definition injects external scripts into a page, but it's available as a userscript for *-Monkey.
I actually much prefer Firefox devtools. The one thing I occasionally go to Chrome for is when stepping through minified JS code - Firefox doesn't fully resolve the variable names in the map files.
Glad I am using Firefox for this; issues like this make my blood boil.
One thing I hate is the backward-compatibility issues most websites tend to introduce from time to time, breaking pre-existing functionality. Almost all websites have them. E.g. copy-pasting photos through the clipboard no longer works on Gmail in Firefox, which forces me to use Chrome.
In order to have Google Play Services on an Android device, the vendor has to sign a contract with Google. Part of that contract states that the vendor must pre-install Chrome and a bunch of other Google apps as system apps, making them non-removable. Best you can do is disable it, which should be where the uninstall button usually is.
1. sudo apt-get install android-tools-adb
2. Connect the phone over USB and choose file transfer
3. In a terminal, type: adb shell
4. On the phone, allow the dialog that appears
5. In the adb shell, type: pm list packages
6. Find which package it is and uninstall it, e.g.: pm uninstall -k --user 0 com.facebook.services
Do you get the option to disable it when you look it up in the apps list?
I don't know about how it is in the latest phones.
But at least for older phones apps which are bundled as part of the OS have a version of the program in the read-only system partition which is used to do factory resets.
So disabling deletes the latest version of the software, and stops it from running on the phone.
You can't realistically remove it without going into devtools/root, but you can use Firefox and set it as the default browser. For years I have not touched Chrome.
The system WebView is Chrome-based and cannot be uninstalled as far as I can see. The Chrome app may be removable, though? Though I can only disable it on my phone.
On Android he has a free choice to use and support Firefox (with Addons), which on iOS he can't do by the decree of the platform holder.
(The platform holder which told us, developers, that writing the complaint that's linked in this very article would cause us to be removed from the AppStore.)
I tried, but I find each and every Google Search alternative lacking. I frequently have to hunt down some bug descriptions or other stuff related to coding and Google Search is really hard to beat.
- if I can't find it, attach !g to my query and be instantly on Google
- either confirm Google can't find it either
- or find it
This way DDG gets important data to improve their technology, I avoid being tracked most of the time, and I get the same results for the cost of a few seconds here and there.
Yes, DDG isn't exactly amazing for anyone who used Google before 2009, but neither is the "Google" lookalike we see today ;-)
If alternatives to Google were better than Google, we wouldn’t have to tell people to use them. The whole point is to use alternatives even though they are not as good, because to aid Google is not acceptable.
I think you're observing personalized search. Next time it happens, try asking DDG a better question, e.g. include "programming" in the query.
Even ignoring privacy and data collection, I find personalized search problematic. Unreliable across computers / VMs, broken across browsers / OS reinstalls, and most importantly not portable across users, you can't tell other people how exactly you found stuff.
Have you noticed that the quality of Google Search results deteriorated dramatically lately?
Sometimes when I google something I don't get any results at all, as if they no longer indexed the more obscure sites.
On the other hand, I know some sites have completely disallowed robots and are not being crawled, to degoogle themselves.
I don't know about the parent post, but for me Google search results for bug reports / general admin stuff have degraded dramatically in the past 5 years. As in, you search for a stack trace and you get beginner tutorials on how to set up something only vaguely related.
It’s not like it’s difficult. This isn’t the world of IE6, where a bloated Netscape 4.7 which failed to work on many sites was the only alternative. (Can't remember when KHTML - later Konqueror - became more usable than Phoenix/Firebird/Firefox.)
Oh, I loved Konqueror in the olden days. A single tabbed browser for both the file system and the web. Loved that thing. khtmlview was also an awesome browser component, which I made use of in a few Qt projects.
I'd respectfully disagree: Netscape (Gecko) had better, more up-to-date standards support. The issue was always folks not testing on anything other than Internet Exploder.
It didn't when Netscape 4 (which was pre-Gecko) was a thing. The IE vs NN browser war saw both adopting various incompatible non-standard features - e.g. layers in NN, innerHTML in IE - but the set that IE ended up with was closer to what was eventually standardized as HTML5/CSS/DOM.
I actually found that the best approach in that (horrible) era was to develop Opera-first. That way the number of fixes needed for IE and Mozilla was much smaller than if you chose either of those two as the first target.
There are lots of other keyboards to choose from. Unless you want Swype-like functionality, in which case only GBoard has anything like acceptable performance.
Prevent short link services from tracking you. Unshort.link is an open-source web service that removes tracking parameters from short links and shows where they point.
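As a rough illustration of the parameter-stripping half of what such a service does (a minimal sketch, not unshort.link's actual implementation; the blocklist here is a tiny assumed subset of the real lists these tools ship):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative subset only; real cleaners ship far larger lists.
TRACKING_PARAMS = {
    "utm_source", "utm_medium", "utm_campaign",
    "utm_term", "utm_content", "fbclid", "gclid",
}

def strip_tracking(url: str) -> str:
    """Drop known tracking parameters, keep everything else intact."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKING_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_tracking("https://example.com/item?id=42&utm_source=news&fbclid=abc"))
# → https://example.com/item?id=42
```

The other half, following the short link's redirect to see where it actually points, would need an HTTP request; a denylist like this also shows why the allowlist approach mentioned above scales better.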
Raymond Hill has stopped developing uMatrix and also stopped accepting PRs because it's currently too much work for him [1]. So be aware that bugs will probably not be fixed.
That being said, I'm a very happy uMatrix user on Firefox.
I think this suggests it is high time we set up a third-party Chrome extension library, not run by Google, for use in non-Google Chromium-based browsers (Vivaldi, Brave, Chromium, etc.)
If you search in Google, the links in the results don't point directly to the result page but go through a Google redirect. Really annoying if you want to copy/paste a result without visiting it. ClearURLs fixes this, so that might be what was referred to here.
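Unwrapping such a redirect mostly amounts to reading the destination back out of the redirect's query string. A minimal sketch, assuming the destination is carried in a "q" or "url" parameter (the parameter names and redirect path Google uses have varied over time):

```python
from urllib.parse import urlparse, parse_qs

def unwrap_google_redirect(link: str) -> str:
    """Return the destination of a Google result redirect, or the link unchanged."""
    parsed = urlparse(link)
    if parsed.netloc.endswith("google.com") and parsed.path == "/url":
        params = parse_qs(parsed.query)
        # Assumed: destination lives in "q" or "url".
        for key in ("q", "url"):
            if key in params:
                return params[key][0]
    return link  # not a recognized redirect; leave untouched

print(unwrap_google_redirect(
    "https://www.google.com/url?q=https://example.com/page&sa=U&ved=xyz"
))
# → https://example.com/page
```

An extension doing this in the page itself would rewrite each result's href attribute the same way, so copy/paste grabs the real URL.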
Currently, the Clean version has 13,750 users while the original version has 9,864 users on addons.mozilla.org, so the fork might turn out to be more popular.
I'm very worried that this extension can limit the ability of small site owners to earn money from affiliate marketing.
Many of these sites rely on links to Amazon and other big stores.
They are losing on ads with the ad filters, and now this?
How can they be compensated for their work? I'm all in for privacy, but shouldn't there be a more balanced way?
That train has unfortunately left the station long ago.
I'd like to see more small sites offer a donation option instead (not a subscription, unless a decent micropayment solution shows up and they give up on the "it's just the price of a Starbucks coffee" crap).
I don't have the numbers, but unless there is a specific cause, I think content sites don't see donations as a good revenue stream.
Maybe Substack and such tools will be able to be a more sustainable solution.
Donations are an unpredictable revenue stream, and sites are right not to like them.
But from the reader side, how many subscriptions can you have? It may feel good now because there are only a few options and any one person only likes a small percentage of those few options enough to give them money. But what happens when every site you click on on HN asks for a subscription? Will you pay for all of them?
If the author is here, please expand the rules database. The rules file seems too humble. Many search engines are not there. Not even Yandex or Yahoo (which are fairly well-known) let alone less-popular search engines.
I think it’s important to be specific. Probably it was not who we think of when we say “googlers”, but rather it was upper management within Google.
Frankly, I see two googles: the people who make a lot of decent or great stuff, and the upper decision makers who act against the interests of the other group and the consumers.
"Googlers" are happy to take a stand on politically-relevant topics that can earn them Virtue Points, while at the same time staying perfectly quiet on stuff like this that actually harms the people using their product.
It's not accurate to blame this on upper management. Someone, somewhere, pressed a button - either a human reviewer (yes, takedowns are often issued by a human) or someone who set up whatever botched ML algorithm decided to nuke this extension. It sounds to me like this was definitely humans.
Humans also handle disputes, and Google's staff that handle them are borderline incompetent and don't respect developers or end-users.
You can say "upper management sets the policies" but I think that's giving people with some degree of control a pass. A highly-compensated SWE definitely has the option to not just go with the flow.
No problem as long as it's available elsewhere, is it? As to popularity - this mere news will probably bring more new users than Google store did in quite a period of time.
I do not understand what you mean with "No problem as long as it's available elsewhere".
Are you implying that we should just treat Google's decisions as inscrutable facts of nature that we must accept with resignation, and just be glad that the extension is still available somewhere?
Perhaps. Because Google is not Apple and does not prevent us from side-loading anything, still gives us a great browser and a great OS for free.
In fact we, the advanced privacy-concerned users, have no interest in such extensions being on the store front and becoming too popular. Let Google use simple methods to track the people who don't care enough about being tracked to find a workaround. (Many people I tell about tracking really don't care; they say I'm a paranoid weirdo complicating things instead of just using them the way everybody does.)
When AdBlock was only used by a bunch of superusers, nobody cared to counter it, and everybody was happy.
The Google store contains a lot of bullshit and spyware extensions. The fact that an extension is there doesn't add a dime of trust.
Discoverability also isn't a thing such extensions should seek. As soon as too many people discover it Google and others will invent a smarter way to spy.
Flow 1: user wants extension, clicks the browser addons button, searches for it, clicks install. Updates are automatic.
Flow 2: user wants extension, searches in Google, finds a third-party website, clicks download, downloads file, runs file, gets a warning that the extension is untrusted, clicks install anyway. Repeat for every update.
Do you still think the average person wouldn't trust Flow 2 less? Or, to look at it from the other side: wouldn't the fact it was removed by Google make people suspicious of the addon? What was it doing that was so bad that even a garbage pile like the Chrome Store dropped it?
https://github.com/ClearURLs/Addon/wiki/Rules
https://gitlab.com/anti-tracking/ClearURLs/rules
https://kevinroebert.gitlab.io/ClearUrls/data/data.minify.js...