Hacker News

Same problem Microsoft faced when it added "UAC" in Vista. Admittedly the implementation might not have been the best from a usability perspective, but I think any attempt at implementing proper privilege management in Windows would have had many users complaining and not seeing the point.

I guess the lesson here is not to give your users bad habits for the sake of convenience otherwise it'll backfire if you ever want to do things right later. MS had everybody run as root for decades before they finally decided that it might not be such a great idea after all, and then they had to face annoyed users and bad publicity.

That being said I can't really imagine how having a non-intrusive "do you want to start the call" dialog before initiating the call can be considered a deal breaker. I assume you could even reduce that annoyance further by adding a "don't ask me again for this website/user/whatever" checkbox. Do you really think that would hurt Zoom significantly? I've never used their product so I can't really form an educated opinion.

This is especially stupid because I have no doubt that now that it's been made public some people will abuse the vulnerability, if only for fun.

It wasn't bad habits. Up to Windows XP, which introduced user separation on consumer-oriented Windows (NT and 2K were meant for businesses, and businesses with networked PCs were really meant to use those), all personal computers were fully controlled by their users without any notion of privilege separation - behavior that traces its lineage back to the original Altair 8800. Computers weren't networked, and those that were were either running a different OS (NT, Unix, whatever) and/or controlled entirely by a single entity (a company). Or they just didn't care and used Windows 9x.

And honestly I do not think it is a bad habit even today. UAC is intrusive; the main reason you do not see it as much as in the past is that applications nowadays work around it: see how Chrome or even VS Code saves the executable files for their updates to your %APPDATA% folder (where regular data normally goes) to avoid the UAC annoyance of going through Program Files (which makes the UAC protection pointless), or how app stores like Steam change the permissions to "everything allowed" to be able to modify the folder contents.

People are using computers to do specific tasks they want to do, anything else is an annoyance and something they'll want to avoid.

Today's security issues come from things a lot of developers and companies simply do not want to acknowledge: putting everything online, connecting all computers together, having everything controlled by whoever writes the applications users use (putting everything online is a way to do that), coming up with monetization schemes where users pay nothing out of their own pockets, and making users pay subscriptions instead of one-off fees (the excuse is often that they have to somehow keep their servers going, willfully ignoring that the developers/companies are the ones who decided to make something run on a server in the first place, and that by doing so they are the ones in control).

A lot of security issues would be gone if computers weren't so connected to each other. Sadly I do not see that happening any time soon, since no developer wants to give up that sort of control (some developers nowadays do not even know what it is like not to have it) and no company wants to get rid of the biggest excuse they have for asking for continuous payments.

Personal computers back in the 80s and 90s were very insecure, but that didn't matter because they weren't as connected as they are today. It isn't surprising that pretty much all the famous security issues of the time (like the ILOVEYOU worm) happened exactly as that connectivity started becoming widespread.

I think the only hope is that the IoT craze will blow up in everyone's collective faces and make people realize that it might not be such a good idea to connect everything after all. Sadly, the more cynical side of me thinks that what will happen instead is the introduction of more draconian, user-hostile measures, with users losing ever more control to the big companies that control their devices and OSes in the name of security and usability (more like dumbability), and any voice against that being marginalized as "you are a power user, you do not matter" (ok princess, then what are power users supposed to use after you lock everything down? - I guess the answer is somewhere between "expensive licensed workstations" and "nothing, now piss off").

I’ve had viruses and antiviruses years before I had internet. Getting a virus was trivial in the '90s, when Windows had no security and any program could do anything.

Any program can do anything in modern Windows too; only special places like C:\Windows\System[32] are protected. I'm not against such protections, since they can be easily overridden if needed, and in day-to-day use they do not harm anyone nor negatively affect the usability of the system.

I'm not saying that we should go back to the 90s entirely; we've had a lot of good improvements over the years. I'm just hoping we'll tone down the "connect all the things" mentality a bit, since that is the main source of a lot of security issues.

I agree that less connectivity is better for security, which is why I think rushing to IoT-everything is premature.

However, unless a computer physically cannot be connected to the internet, it must implement all the protections it can. Merely having WiFi disabled or the cable unplugged is a false sense of security.

The question is about the "all the protections it can" part - what does that imply? Because "all the protections" can include user hostile (not just in terms of usability) misfeatures that give control to OS vendors in the name of security even though the real purpose is controlling what the users can do with their own devices (for a variety of reasons, with stuff like market segregation and forced obsolescence being among the more benign ones).

All the protections that help the machine survive in a non-compromised state in a hostile environment. I'm thinking of things like not letting random users write over system files, or not giving processes access to peripherals (camera, microphone) without explicit user consent.

Your comment is a bit ambiguous. Are you saying that even retail software could be considered a virus just because of what it can do on the system? Or was virus software making it onto the machine in other ways?

When I was a kid it was quite normal to pass around floppies, and later CDs, full of warez. These contained viruses more often than not, especially since an infected machine would auto-infect any writable media it got hold of.

In our computer lab we got viruses spread by disks.

I think you make good points, but to sum it up: privilege separation wasn't needed pre-internet because vulnerabilities and computer viruses weren't that big of a problem back then.

>A lot of security issues would be gone if computers weren't so connected to each other.

I mean, sure, but having computers connected together is pretty damn amazing.

I'm actually drawing the opposite conclusion compared to yours: I think UAC doesn't go far enough. You need more finely grained permissions. That seems to be the trend too: Android, SELinux, OpenBSD's pledge... It's all about giving every process only the privileges it needs and nothing more.

Finely grained permissions mean bad UX, and as Android has shown, you gain nothing practical from them, since people learn to ignore the prompts much like they learned to ignore the UAC warning, while on the other hand you lose the flexibility, functionality and openness of the entire system (all significant pillars of user control).

Note that I'm not saying to disconnect computers entirely; I'm saying to rely less on connected computers. Simple stuff: use LibreOffice or MS Office instead of Google Docs, use a desktop calendar and other local tools instead of relying on "web apps", and instead of using a "cloud-based solution" for syncing data with your mobile phone, just connect it directly to your computer (via WiFi, Bluetooth, whatever - this is mainly a UX issue - it doesn't have to round-trip through someone else's server). Stuff that makes you and your computer less reliant on the network.

Not everything can work like that of course, but then instead of trying to isolate applications from each other using fine-grained separation, we can simply treat the network itself as hostile and try to defend from it (e.g. applications that can access the network cannot access outside of a designated folder - the OpenBSD pledge approach but forced on all applications that access the network). I think it is a much easier, more flexible, user-controllable and understandable approach than UAC on steroids or any other approach that relies on application segregation.
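To make the proposal concrete, here's a toy model of that policy in Python (all names here are hypothetical illustration, not a real OS mechanism): the first time a process touches the network, any file access outside its designated folder starts being denied.

```python
import os

class ConfinedApp:
    """Toy model: an app that touches the network loses access
    to everything outside its designated data folder."""

    def __init__(self, data_dir):
        self.data_dir = os.path.abspath(data_dir)
        self.uses_network = False

    def open_socket(self):
        # The first network access flips the app into confined mode.
        self.uses_network = True

    def open_file(self, path):
        p = os.path.abspath(path)
        inside = p == self.data_dir or p.startswith(self.data_dir + os.sep)
        if self.uses_network and not inside:
            raise PermissionError("network app denied access to " + p)
        return p  # a real kernel would return a file handle here

app = ConfinedApp("/tmp/appdata")
app.open_file("/etc/hosts")        # allowed: no network used yet
app.open_socket()
app.open_file("/tmp/appdata/db")   # allowed: inside the designated folder
try:
    app.open_file("/etc/hosts")    # now denied
except PermissionError as e:
    print(e)
```

The point of the sketch is that the user only has to understand one rule ("talks to the network = stays in its folder") instead of a per-permission matrix.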

It does require a massive shift in developers' mindsets and profit incentives for companies though, which is why i do not see such a thing happening.

> we can simply treat the network itself as hostile and try to defend from it (e.g. applications that can access the network cannot access outside of a designated folder - the OpenBSD pledge approach but forced on all applications that access the network)

Won't work. Malicious actors (both malware developers and companies with user-hostile business models) will start working around it, for instance by giving you two applications, one connected to the Internet and one not. The first application will be the C&C server, the second one the executor, and they'll talk to each other over e.g. files in the first application's folder.

Trying to block that would pretty much hose all utility in having a general-purpose computer. You'll be back to the crappy UX of a smartphone.

I honestly don't know how to solve this conundrum. You can't solve it technologically, as you quickly hit the Halting Problem. You can't solve it socially, because for any power user benefiting from the modicum of interoperability you leave in, you get 10 regular people who can be trivially social-engineered into selfpwning their device. It seems that in the end, you'll either have to lock down computers to near uselessness, or live with the risk of bad actors exploiting them.

My comment isn't an ideal solution; it is what I consider a better solution given how things are treated nowadays.

Ideally users would be wary of what they do with their computers, but considering how the world devolved from "you should never use your real name and address online" to modern social media, this is yet another case where I do not see such an ideal happening.

Have you looked into the object capability model of permissions? https://en.wikipedia.org/wiki/Capability-based_security

This is exactly the type of problem it solves, usability with security.

I don't see how it solves the selfpwn problem - that is, for any capability I can explicitly grant if I know what I'm doing, someone else can grant it because a malicious actor nicely asked them to do it. If you take away the ability to grant the capability, you're reducing usability.

Yeah, that's really an unsolvable problem I guess. But you could at least make it clear to the user what some app is requesting. If it's requesting the root capability / ambient authority (basically access to everything) then that should be a big red flag.

This is what things like https://sandstorm.io and Google's Fuchsia OS are trying to solve. Of course it requires a huge shift in how applications are designed, but it does not really impose any burden on the user's side. They just allow $APP access to some data or resource, and from then on it has access to only what it needs, with no need to allow it every time (unless you revoke it). This can be done when the app gets installed, so there's no UX problem really.
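A minimal sketch of the object-capability idea (the classes and paths below are hypothetical, not Sandstorm's or Fuchsia's actual API): instead of asking "may this app read path X?" on every access, the user hands the app an unforgeable handle to exactly one directory subtree at install time, and the app has no way to even name anything outside it.

```python
import os

class DirCapability:
    """An unforgeable handle to one directory subtree. Holding
    the object *is* the permission; there is no ambient authority
    or global identity to check against."""

    def __init__(self, root):
        self._root = os.path.abspath(root)

    def resolve(self, name):
        # Resolve a name strictly relative to the granted subtree.
        p = os.path.abspath(os.path.join(self._root, name))
        if not p.startswith(self._root + os.sep):
            raise PermissionError("capability does not cover " + p)
        return p

def photo_app(photos_cap):
    # The app only ever sees the capability it was handed,
    # never a global filesystem namespace.
    return photos_cap.resolve("holiday.jpg")

cap = DirCapability("/home/user/Photos")  # granted once, at install time
print(photo_app(cap))                     # /home/user/Photos/holiday.jpg
try:
    cap.resolve("../.ssh/id_rsa")         # escape attempt is rejected
except PermissionError as e:
    print(e)
```

The design choice worth noting: the security question is answered by object reachability rather than by prompting, which is why this model avoids the prompt-fatigue problem UAC and Android permissions suffer from.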

My original comment was about having computers be less connected, because a large share of today's security issues and their implications arise from that connectivity, so I do not see what sandstorm.io is solving there.

I'm not familiar enough with Google's Fuchsia OS to judge, though I do remember reading some months (year?) ago about a clash between its developers and Google's advertising team that ended up with the developers compromising Fuchsia's design. Which brings me back to "let's not rely too much on connected stuff and prefer stuff we have control over, shall we?"

> Do you really think that would hurt Zoom significantly?

Zoom is a publicly-traded company now, so I am sure that adoption through convenience trumps a lot of other concerns.

