ianlevesque's comments | Hacker News

I think it's relevant that Transmit is a local native app. There's no hosted app exposed to the internet to hack here. Google made one lengthy process that doesn't fit this use case.

Panic runs a cloud-hosted sync service that syncs your credentials and connection info between different instances of Transmit you may have.

No idea if that's what Google is targeting here, but that is a cloud service that presumably gets a copy of people's Google Drive OAuth keys if they use Google Drive with Transmit and the sync service.


If they are connecting to Google Drive, is that not connected to the internet?

There’s no way for someone on the internet to reach into your Transmit app and make it do something.

How can you be so sure? Even after reading all the source code, there still can be bugs, attacks, demanding letters from different agencies, misconfigurations, vulnerabilities in code and in libraries, etc. etc. etc.

If your threat model is the NSA leaning on a developer to ship a compromised build, KPMG is not going to catch that. If it’s that you’re going to use Transmit to connect to a server which is compromised and exploits your client to exfiltrate your Drive files, guess what else they’re not going to prevent?

It’d be one thing if Project Zero were running serious audits, but this policy is designed to let them tick audit checkboxes so that when you lose data, it’s hard to sue Google.


"Exposed to the internet" and "connected to the internet" are different things. "Exposed" implies that traffic originating from the internet reaches the app. You still have to worry about things like parsing malicious files, but the class of relevant attacks is much smaller and generally easier to defend against.

Everything's connected to the internet. What the OP was talking about was attack vectors, and since Transmit is a local app it really isn't one unless your whole machine is compromised, in which case you're screwed anyway.

There are lots of ways a local app can be compromised. It can read a local config value unsafely which can be influenced by some other app that does talk to the Internet, for example.
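The unsafe-config risk described above can be sketched in a few lines. This is a generic illustration (the function name and file format are invented for the example, not taken from Transmit or any real app): a config file loaded with `pickle` lets any process that can write the file execute code, while a plain-data format cannot.

```python
import json

# pickle.load on an attacker-writable file would execute arbitrary code;
# json.load can only ever produce plain data (dicts, lists, strings,
# numbers), so a tampered config can at worst corrupt settings.
def load_config_safely(path: str) -> dict:
    """Hypothetical example of reading a local config defensively."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)
```

The point is that even a "local only" app has an input surface: any file another (internet-connected) process can influence.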

There's a reason why airgapping is the only way to secure important systems (and of course that can also have a number of vulnerabilities).

And besides, how do you know it's a local only app if you haven't audited it?

"Just trust me bro" -- some dev


This is absolutely a thing in 2024. Or, less drastically, the many examples of a development team being cut, further development becoming sporadic and bug-prone, and a detached, patronizing ticket system being instituted for support.


> now it's common to see people inviting government control of the internet for adults. I don't get it.

Sockpuppets and useful idiots in equal measure.


Yet another desktop environment https://github.com/pop-os/cosmic


> everyone else had access to an SMB share where dropbox actually ran

What the...? It honestly does sound like you were holding it wrong.


Why shouldn't that work? Dropbox and SMB both just need to read/write/watch files like any normal process. I do the same thing at home with Syncthing/SMB and it works fine.
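The "just read/write/watch files" claim can be made concrete with a toy mtime poll, which is roughly what a sync client's change detection amounts to. This is a sketch under stated assumptions (the function name is invented; Dropbox and Syncthing use OS-level file-watching APIs rather than polling):

```python
import os

def changed_files(root: str, since: float) -> list[str]:
    """Return paths under `root` modified after `since` (epoch seconds).

    A polling sketch of sync-client change detection: walk the tree
    and compare modification times. Nothing here cares whether the
    files live on a local disk or an SMB mount.
    """
    out = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            p = os.path.join(dirpath, name)
            if os.path.getmtime(p) > since:
                out.append(p)
    return sorted(out)
```

Since the filesystem abstraction hides the transport, an SMB share looks like any other directory to code like this, which is why the setup described upthread can work in practice.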


From a theoretical perspective, sure, but the product Dropbox sells is either a website or an app each end user gets that adds local sync and some other useful sharing features. They certainly don’t intend it as a centralized system that people expose over the network in a bespoke way, and I can fully sympathize with why they’d tell this user that’s not something they want to support.


Layoff. They were costing the company money for irrelevant problems.


Congratulations: you've just created a culture where people are afraid to report potential issues for fear of losing their jobs. Maybe there are some cases where you find that specific individuals end up acting irrationally more often than not, but on the whole, it is better to treat these acts as if they were good faith until proven otherwise.


It sounds like the person you're replying to has a future career in QA at Boeing.


I’m saddened the sarcasm flew over the heads here. A disappointing reflection of the number of companies that really do act like that now, which was my point.


Don't be sad. I've found out several times myself that irony without emotional hints misses its goal. OTOH, when you add the /s hint, it's more like explaining a joke to the listener.


Agreed. Also often the gap between what people will pay for a hobby project and what money is being made at a tech company by the people who have the hobby is vast. Sometimes there are contractual restrictions on taking money from other jobs simultaneously that complicate it.


I like my iPhone, and want to be able to use Kagi as my search engine. Why can't I?


That seems like something they'd be willing to fix. They allow users to select Ecosia, an extremely niche search engine. Kagi should be on that list too.


It's not. And you can't add any more.


The funny thing about legislation is that you're responsible for the unintended consequences of your laws too.


In this case, it is just showing that most companies are collecting more data than they need.

You don’t need a banner for data that is necessary for the service to work at a minimum level. There is no role for consent there, since the site won’t work otherwise.
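The distinction can be sketched as a simple gate: strictly necessary cookies are always allowed, while tracking cookies require an explicit opt-in. The cookie names and function name here are invented for illustration, not taken from any real consent-management library:

```python
# Strictly necessary for login/security: no consent banner required.
ESSENTIAL = {"session_id", "csrf_token"}
# Optional tracking: may only be set after explicit consent.
TRACKING = {"analytics_id", "ad_id"}

def cookies_to_set(requested: set[str], consent: bool) -> set[str]:
    """Filter requested cookies down to the ones we may actually set."""
    allowed = ESSENTIAL | (TRACKING if consent else set())
    return requested & allowed
```

A site that only ever requests the essential set never hits the consent path at all, which is the argument being made here.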


It also shows that most people don't care and just want to get on with their day. We know that companies are collecting more data. Now what?


How does it show that? Most people I know are annoyed by this and click "reject" (if they can find it), but for a lot of non-technical people these banners are just a given because they don't even understand the problem. That doesn't mean they don't care.


The close to a million users now on https://www.stilldontcareaboutcookies.com/ suggest that there's a pretty sizable number of people who care less about the philosophy of European data laws and more about just getting on with their day.


>pretty sizable amount of people that care less about the philosophy

How does it show that?

It shows that they prefer to get on with their day over clicking cookie banners. It says nothing about whether they agree with the philosophy of the GDPR.


How many "normies" do you know that stopped visiting websites that track them? I don't know anybody who isn't in my tech bubble who cares, and very few normies who would rather pay money than to give access to their data.


None. That doesn't mean they don't care. As I said, most people I know are annoyed by this but take these banners and tracking as a given, because they don't understand enough about technology and see them everywhere. And let's be honest here: if you were to stop visiting sites that track you, you'd have to stop using more or less the whole internet. It's not about boycotting these sites, it's about stopping those sites from tracking you, which almost everyone I talk to is fine with. The only people I see defending the amount of tracking happening on the web are commenters online (here, on Reddit, etc.). That leads me to believe it's mostly corporate accounts.

To the point: not using the site at all misses the point. Insert the "yet you participate in society" meme.


Apple's do-not-track alert resulted in many people saying they don't want tracking, and of course it had an impact on Meta's business. So if websites presented cookie banners in a neutral way, without dark patterns to make "Reject" difficult, "normies" would reject them too, I'm sure.


> if they can find it

If it even exists!


This is something a lot of people seem to misunderstand about GDPR. At its core, it says you should only process people’s personal data under a lawful basis. There are six, and consent is only one of them.

(a) Consent: the individual has given clear consent for you to process their personal data for a specific purpose.

(b) Contract: the processing is necessary for a contract you have with the individual, or because they have asked you to take specific steps before entering into a contract.

(c) Legal obligation: the processing is necessary for you to comply with the law (not including contractual obligations).

(d) Vital interests: the processing is necessary to protect someone’s life.

(e) Public task: the processing is necessary for you to perform a task in the public interest or for your official functions, and the task or function has a clear basis in law.

(f) Legitimate interests: the processing is necessary for your legitimate interests or the legitimate interests of a third party, unless there is a good reason to protect the individual’s personal data which overrides those legitimate interests.


The thing is, if you have any of (b)-(f), why shouldn't you also get (a)?

The maximum fine is 20 million euros or 4% of revenue, whichever is higher. Sure, it probably won't be imposed on a first time violation, but why take the chance?

Could you imagine any lawyer advising a company against requiring consent, even if they have some cover because of a legal obligation? Isn't it much safer to deny service to those that refuse to consent?

Sure, it'll annoy the customer, but right now the customer is used to minor annoyances.
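The fine ceiling mentioned above is a simple maximum of two quantities, which a one-liner makes explicit. This is a sketch of the Article 83(5) cap only; an actual fine depends on many factors and is almost always far below the cap:

```python
def gdpr_fine_cap(annual_turnover_eur: float) -> float:
    """Upper bound on a GDPR fine under Art. 83(5):
    EUR 20 million or 4% of worldwide annual turnover,
    whichever is higher."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)
```

For example, a company with EUR 1B in turnover faces a cap of EUR 40M, while anyone under EUR 500M in turnover hits the flat EUR 20M floor.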


This is true, but the comment you replied to was about the cookie law, not about GDPR. They are separate issues, even if they are obviously related. The cookie law is about not using other people's storage for purposes that aren't needed; GDPR is about personal information. You can use cookies to save information that is not personal, and that would still need a banner.


> it is just showing that most companies are collecting more data than they need.

Thankfully we have EU institutions to protect us from these evil companies. But somehow the EU institution websites all have cookie banners too.


> You don’t need a banner for the data that is necessary for the service to work at minimum level.

We were advised by our lawyers (a top SV tech law firm) that we should include a cookie banner in the EU even if we're only using cookies for functions like login. After eventually switching legal counsel (for unrelated reasons), we were told the same thing by our new counsel.

Either EU law requires banners even for cookies used for routine functionality, or it's so (deliberately) vague that even top tech law firms would rather everyone add a cookie banner than risk running afoul of the law. Either case validates PG's argument here.


It is indeed quite complex. I would argue that login alone does not need a banner.

1. There are users who come to your website with a specific purpose or expectation of your service.

2. Then there are users who came to the website by accident and might just try things out without understanding what is happening.

The banner recommendation from the lawyers is likely for the second case. Those users haven't subscribed to the service with any particular expectation, or with knowledge of what the service needs from them in order to provide what they want. Or they have no expectation that the service will provide anything for their needs at all.

For example, in the login case, group 1 probably wants to stay logged in, since they came to the service expecting a personalized experience, which cannot be linked to a person without an account.

Or the lawyers just did not understand your service well enough and said to put up the banner and be done with it. For group 2, it is unlikely that someone expected or wanted to stay logged in all the time, but that is a minority, and it's arguable whether it's fair to assume that.


If the lawyers don't recommend you add the banner, and you somehow run into trouble because of it, the lawyers will be blamed. However, if they do recommend that you add a banner and you follow their advice, then they can get some more billable hours by recommending some verbiage for the banner, checking your website to make sure the banner is displayed in a compliant way, etc. And even if you don't follow their advice - people rarely fire their lawyer for recommending caution.

So, how did you ever expect the lawyers not to recommend adding the banner? That's like going to a plumber and asking whether you should DIY some installation. Of course they're going to recommend you get a professional...


I would put it another way. Any legislation against doing something is almost always motivated by someone's desire to do that very thing. Legislation is usually a battle of interests where the legislator, ideally, wants to protect the overall interests of the public when they conflict with narrower private interests. When the narrower interests belong to powerful groups, you often expect to see some struggle, and if the private interests have a way of making the regulation seem more intrusive and annoying than the harm it's intended to prevent, they will take advantage of that to sway the public in their favour.

So legislators do expect such a struggle, and the shape it takes may be partly their fault, but it's clearly not all their fault. The more power the private interests have, the more likely they are to find some way to fight the regulation. They will certainly do everything they can to convince the public that the legislators are bad at regulation.

In this particular case, however, websites showing banners are also harming themselves as their competitors now have an interest in not showing banners and offering a better experience -- i.e. the regulation makes it worthwhile not to display banners in competitive situations. So we'll see how this all turns out.


No, people cannot escape responsibility by saying "the law made me be belligerent toward my users". It is an intentional choice to use cookies and to make the experience unpleasant for people.


> No, people cannot escape responsibility by saying "the law made me be belligerent toward my users".

Correction: people should not be able to escape responsibility by saying this.

The problem is that right now people do escape responsibility by saying this, because the EU is not properly enforcing these new laws.

Introducing a law and then not enforcing it has consequences, and those consequences should have been foreseen. Either the law is unenforceable due to practical constraints, in which case it's a bad law, or the EU is failing to enforce it due to inability.

Hopefully the EU starts putting more focus on enforcing its existing laws rather than creating new ones.


And the use of cookies itself doesn't demand these banners, nor that they be so obstructive. Just don't collect unnecessary cookies or PII, or put up a prominent banner that doesn't get in the way of the site's purpose.


I guess PG's original tweet assumes that cookie banners are a) bad, b) the fault of the EU, and c) unanticipated and unintended by the EU, thereby demonstrating its incompetence.

I can't really comment on what the lawmakers foresaw or intended, but I'd argue that cookie banners are actually a) good, and b) the fault of companies who can't imagine a better way to treat their users.

The reason I think they're good is that they cause a psychological nuisance to users of software whose makers don't go out of their way to do banners well or avoid needing them. Over time I hope this will tend to create an association in users' minds that sites with cookie banners are somehow seedy or unscrupulous, like pop-up ads.


It's impossible to foresee everything, including the amount of malicious compliance.

In the end we are better off with this legislation and its future iterations and additions than we are without it. The extent to which people's data is misused is simply ridiculous.


So perhaps that suggests caution be taken in meddling…


Meddling in what way?

'Meddling' causing citizens to lose visibility and corporations to gain more power over data?


Though this was 100% predictable


Yes and no? To some extent, sure. But if companies or people go out of their way to comply with only the letter of a law while clearly violating its spirit, are you really responsible for that? Or are they, because they're doing everything they can not to comply?

Let's say you make a law to reduce working hours from 40 to 37 except in "emergency situations". Now a company forces employees to sign off on an "emergency situation" every week or be fired. They're clearly not complying with the spirit of the law. Is it really your fault when you make a law like that? I'd say only to some degree; the people trying to abuse every loophole are much more responsible in this case.

Companies using dark patterns, hiding the "reject all" option behind an additional click (which is even illegal), and trying to collect every bit of data possible are much more responsible than the EU's law. Oftentimes they are collecting data without even thinking about it: they'll add GA to their WordPress site without even looking at it, or whatever. That cookie banners have become the standard around the web is sad, because it just shows how much everyone is trying to track you.


Bonus points if you can convince folks to call it the annoying Sign Off law.


You mean the EU should have foreseen that people in tech have no conscience and sensitivity for right or wrong?


It's not enough to write a law on principles alone. It must be clear and practical to comply with, and clear how it will be enforced. The EU should not have created a situation where the most practical solution for thousands of companies is a cookie banner.


Eh, I think people have the wrong take-away from all of this.

Imagine if the banner said "This website is known to the state of California to cause cancer". Would you keep visiting the site?

Or if every time you went to the bar, the bouncer asked, "Hey, can I punch you in the face?", would you keep going to that bar?

As annoying as the banners are, they actually aren't annoying enough to change mass-behavior.


To be fair, completely foreseeable.


Yeah, I’m not sure legislators should be on the hook for malicious compliance, though.


Of course not, I was just having a laff at tech's expense.


As a reminder, it's the same people in tech we're trusting to build things like chat bots, "AI", cars that try to drive themselves and rockets that try to land themselves... etc.

You have to admit that if these same people can't be trusted to follow a simple "do not track" directive, humanity is in big trouble.


Sadly, we are. I didn't ever think of myself as an optimist until I realised just how pessimistic it is possible to become, as I've learned in the last 5 or so years. Now I realise I was quite an optimistic person, at least by comparison with my present self.


I mean, humanity is and always has been in big trouble. That's why history is full of disasters, mass death etc.

But I don't think it's that developers "can't" follow a DNT header. It's that they won't, because it doesn't benefit their employer's financial interests.

Making a rocket that lands, on the other hand, does directly correspond to SpaceX's financial interests.


The thing is, the consequences seem to be very much intended: forcing companies to be transparent about tracking, and hopefully letting users start voting with their wallets as they get annoyed by the omnipresent "We Value Your Privacy" popups (which is very ironic considering all the dark patterns used to keep users tracked).

If nothing else, at least now people know just how much they've been tracked. One can only hope that this increased consciousness would help people to choose services that don't track people. For example Hacker News doesn't need tracking cookies nor a cookie popup, and it seems to be doing just fine, even in terms of the law ;)


I would expect that most companies would be ashamed to publicly state that they sell your data to hundreds(!) of data providers and they would fix this before they had to disclose it. But nope, apparently the money is too good. And blaming the government is more convenient.


On the one hand, when murders go up because you made using a gun in a crime an automatic 5-year prison term, you should have foreseen that possibility; on the other hand, the real bad guy is the one shooting the witnesses.


Folding@Home predates the ML approaches to protein folding. Just because it has been eclipsed now doesn’t mean it wasn’t worthwhile.


Well, it predates the deep-learning approaches to protein folding like AlphaFold. But people have been applying ML methods to protein folding for decades -- that has long been the split between the two camps in the protein-folding bioinformatics world: whether the best way to get structures is to simulate what the protein is actually doing, or whether there is a computational shortcut.


The goal of protein folding simulations like Folding@Home is not to predict 3D structures - it's to understand how folding actually works, and why it sometimes doesn't work. When FAH came out it was already very obvious that there were good computational shortcuts to predicting the end state (the Rosetta approach), but those don't tell you very much about the physical process. Different questions call for different approaches.

