Hacker News

> This is clearly more complicated than ideal, but it should work.

Exactly.

Imagine you're someone who just wants to play around with cool web technologies. Maybe you're fairly new to web dev; maybe you're fairly new to the world of programming in general and you're using the web to learn it, which has historically been one of the huge strengths of the web. You suddenly encounter a brick wall, where you figure out that programming isn't enough; you have to fork over money for a domain and learn how SSL works, how to set up Let's Encrypt, how to make root certs and how to install them on your phone, just because you wanted to play with something you found interesting.

The web looks like it's moving away from being a good platform to learn and play with programming, in the name of security. It will be annoying but workable for most professional programmers, who can just do whatever hacks they need to get by, but we're erecting some monumental barriers for people trying to learn this stuff. You already can't even include a fucking javascript module file from an html file without learning how to set up and configure a web server, because Chrome blocks modules when using file://.
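For what it's worth, the workaround is close to a one-liner: any static file server makes module scripts load. A minimal sketch using Python's standard library (port 0 just lets the OS pick a free port; in practice you'd choose a fixed one like 8000):

```python
# Minimal static file server: serves the current directory over http://,
# which lets ES module scripts load where file:// would be blocked.
from http.server import ThreadingHTTPServer, SimpleHTTPRequestHandler
import threading

# Port 0 asks the OS for any free port; use a fixed one like 8000 in practice.
server = ThreadingHTTPServer(("127.0.0.1", 0), SimpleHTTPRequestHandler)
# In real use you'd just call server.serve_forever() on the main thread.
threading.Thread(target=server.serve_forever, daemon=True).start()
print(f"Serving on http://127.0.0.1:{server.server_address[1]}/")
```

That's still one more thing to learn than double-clicking an .html file, which is the point being made above.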




I fundamentally agree, but I think the solution is to continue making HTTPS easier to use rather than giving up on security.

For anything on the public Internet, things are already incredibly better than they used to be: HTTPS is free, and there's a wide range of easy ways to set it up on your site, ranging from Caddy (a webserver with built-in Let's Encrypt support) to CloudFlare (who will proxy your site for free, including SSL termination). There are still problems – e.g. for all that certbot (the official Let's Encrypt client) tries to be easy to use, it's more fiddly than ideal. But the goal of "HTTPS just works" seems clearly within reach, and things can only continue to improve from here.
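As a concrete illustration of the Caddy route, a hypothetical Caddyfile (example.com and /srv/site stand in for your own domain and web root) can be the entire configuration; Caddy then obtains and renews the Let's Encrypt certificate automatically:

```
example.com {
    root * /srv/site
    file_server
}
```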

On the other hand, the situation with local network servers is a complete mess. This includes not just development environments, where "just don't bother with security" is a viable option, but also things like home routers and IoT devices which do want to be secure. Currently, routers tend to just use HTTP, which is insecure if you have anyone untrusted on your Wi-Fi network. IoT devices, of course, tend to route everything through centralized cloud services; there are a lot of reasons for that, and it's easy to blame invidious motives, but I suspect a significant part of the reason is just that it's really hard to make an easy-to-use device without a centralized service. At a bare minimum, you need to be able to:

1. Securely pair with the device;

2. Connect the device to the network; and

3. Access the device's services over the network, using the existing pairing record to establish identity and prevent a MitM.

(Ideally you would also be able to expose the device's services to the wider Internet, but that's another story.)

You can do this already with a custom protocol, but not with a browser. The closest browsers have to a "pairing record" is asking you to trust a self-signed cert for some random domain, but that's nowhere near the semantics you actually want. After all, it doesn't really matter whether the device controls such-and-such domain; what matters is whether it's the same device you paired with. Meanwhile, trusting random self-signed certs is fundamentally insecure, and (intentionally) difficult to do.
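The "same device you paired with" semantics are essentially trust-on-first-use key pinning. A hypothetical sketch in Python (the pair/verify functions and the JSON store are invented for illustration; this is not any real browser or OS API):

```python
# Trust-on-first-use pinning: remember a device's public-key fingerprint
# at pairing time, then require the same key on every later connection.
import hashlib
import json
import pathlib

STORE = pathlib.Path("pairings.json")  # invented on-disk pairing record

def _fingerprint(pubkey: bytes) -> str:
    return hashlib.sha256(pubkey).hexdigest()

def _load() -> dict:
    return json.loads(STORE.read_text()) if STORE.exists() else {}

def pair(device_id: str, pubkey: bytes) -> None:
    # Step 1 from the list above: record the device's key at pairing time.
    records = _load()
    records[device_id] = _fingerprint(pubkey)
    STORE.write_text(json.dumps(records))

def verify(device_id: str, pubkey: bytes) -> bool:
    # Identity is "same key as at pairing time", regardless of any domain.
    return _load().get(device_id) == _fingerprint(pubkey)
```

Whether the device "controls such-and-such domain" never enters into it; a key change after pairing simply fails verification, which is exactly the MitM signal you want.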

What we need IMO is an entirely new protocol to address this use case, and I think such a protocol might also work for local dev servers.

In the meantime... well, there are always plenty of workarounds.


Fully agreed on the last point (and the hypothesis that this is one of the factors driving the ridiculous "talk to the cloud for everything" design pattern in IoT).

It seems the current push is making certain legitimate use cases not just hard but pretty much impossible, such as providing a local web server that is independent of the public internet.

Devices used to provide embedded web pages as an easy way to show config options. This seems to have become completely impossible: even if you'd jump through all the hoops of generating a unique subdomain and certificate for each device, you'd also somehow need to update that certificate on that device. So your (possibly fully local) device now needs internet access for the sole reason that the browser would otherwise refuse to display the local configuration page.


This feels pessimistic to me: most people didn't learn the web that way, instead using shared servers — and there were plenty of similar complaints that it was too hard to learn Unix/Windows admin stuff, too. Today, you can use Glitch, GitHub Pages, JSBin & a million friends, Zeit, etc., or the same cheap Dreamhost account people used 20 years ago, and start practicing with HTTPS and many other amenities at minimal cost. JavaScript CDNs make it pretty easy to use a ton of stuff without even needing to learn how to install it, too, and increasingly you can do that as native modules.

I’d worry a lot more about how many people are being told they need a J2EE-scale toolchain to run hello world even though the native environment has never been richer.


I have to imagine the "just open a .html file and start playing" route is a huge vector for getting people into proper programming. I know it's what both I and my brother did. Maybe you don't agree, but I think it's a horrible shame that we're making that route less and less possible by disabling features for anything other than HTTPS.

Glitch honestly looks really good, but I'm a bit worried about telling people that the way to learn programming is to rely on a random for-profit corporation's computers rather than letting people realize the actual power they have over their own machine.


> I have to imagine the "just open a .html file and start playing" route is a huge vector for getting people into proper programming.

It may be, but "just open an HTML file and write an app using your camera and microphone" is not something that is typically the result of doing just that (nor should it be).

Putting some effort into figuring out how the pieces fit together is not a bad thing. You can still set up HTTPS if need be without having to rely on a for-profit corporation. It's trivial to install a self-signed cert in iOS and OSX the last I checked (and I seem to remember it wasn't so hard in Windows either). It was excruciatingly bad in Android (well, mostly missing IIRC) around Gingerbread — but that too is a good example of why using products built by people with no comprehension of how to secure things is bad.

A learning curve is not inherently bad, and especially with something that has such huge security implications, some understanding of WHY you should be encrypting camera feeds is something I'd want any dev working on a camera/audio recording app to have. There's a reason that while CB radio is easily accessible, there are barriers to entry for hams. With power comes responsibility.


"just open a .html file and start playing" is a route to making web pages, not to getting people into programming.

If they do want to get into programming, Scratch and other learning environments and/or Node.js are much better paths than dealing with the layers of barnacles that have accrued over HTML to get to the SPA / WebAssembly / TS/JS etc. "web programming" environments.

.NET and VSCode are free downloads, provide a managed environment, and C# is a good imperative language to start with. It also supports F# if you want to get into functional programming.


I don't find "it's ok if the platform becomes worse because there are so many third-party services you can use" a compelling argument.

(Not to mention, you need to get to know the services in the first place, while a browser is directly accessible to you)


How did the platform become worse when this is a new capability which didn’t used to be part of the platform? Similarly, while it’s true that you need to know services exist, that’s never been easier just as the documentation, available guides, and especially the developer tools have never been better for someone learning.

I’m not entirely in love with the needs driving this decision but I think it’s reasonable to make security and privacy decisions which benefit a billion people at the expense of making certain tasks slightly harder for a much smaller group.


> Imagine you're someone who just wants to play around with cool web technologies. Maybe you're fairly new to web dev;

Then... you can do it on localhost. I can't really imagine the new web dev who is using a separate computer on their local network as a dev server but can't figure out how to get a Let's Encrypt cert to use.


False. Usually it’s just as easy if not easier to configure your dev server to listen on 0.0.0.0 instead of 127.0.0.1, and these days most sites are mobile first, so it’s very natural to develop on your computer and test on a phone in the same LAN. Figuring out accessing your computer via 192.168.0.x is way easier than figuring out how to issue a LE cert (the easy part) and use it on your dev server (the hard part).
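To make the 0.0.0.0 point concrete, here is a stdlib-only sketch (192.168.0.x is a placeholder; the LAN-address probe is a common best-effort trick, not guaranteed on every network):

```python
# Dev server reachable from a phone on the same LAN: bind 0.0.0.0
# instead of 127.0.0.1 so other machines can connect.
from http.server import ThreadingHTTPServer, SimpleHTTPRequestHandler
import socket
import threading

# Port 0 picks any free port; in practice you'd use a fixed one like 8000.
server = ThreadingHTTPServer(("0.0.0.0", 0), SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Best-effort guess at this machine's LAN address (the one to type into
# the phone's browser). A UDP connect sends no packets on the wire.
try:
    probe = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    probe.connect(("192.0.2.1", 80))
    lan_ip = probe.getsockname()[0]
    probe.close()
except OSError:
    lan_ip = "127.0.0.1"  # no route available; fall back to loopback

print(f"On the phone, visit http://{lan_ip}:{server.server_address[1]}/")
```

That is the entirety of the HTTP path; there is no equivalently short path that ends with a cert your phone's browser will trust for a bare LAN IP.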

Before you mention mobile device simulation in desktop browsers, I’ll point out that mobile browsers often have quirks that are not present in desktop browser simulations. For instance, mobile Safari is subtly different from desktop Safari responsive design mode in many ways, and the only way that I know to actually simulate mobile Safari is with full-blown Simulator.app.


> Figuring out accessing your computer via 192.168.0.x is way easier than figuring out how to issue a LE cert (the easy part) and use it on your dev server (the hard part).

It really isn't that hard. I still can't imagine this hypothetical "new dev" who is doing cross-browser testing of this specific feature but can't install a simple SSL cert or get help doing so.

This isn't a barrier for entry to new developers, this is a specific use case that requires learning a minor, otherwise useful skill to get around. I think that is a totally reasonable trade-off.


The burden is a little higher but not insurmountable and it obligates new folks into things they need to know. I think it's an acceptable forcing function. Especially given the security concerns.


What security concerns though? It's not like accessing the camera on a random attacker controlled HTTP page is less secure than on a random attacker controlled HTTPS page. If the user lets a malicious web page access the camera, that's game over regardless.

I'm all for doing stuff to tell users that the HTTP page they're visiting is insecure, but telling people who are new to web dev to get a domain and learn how the world of SSL and domains work is actually a pretty fucking big hurdle. They'll have to get into that if they want to get serious about it, sure, but there's no reason to unnecessarily front-load the frustrating and complicated and unrelated parts. You may think it's acceptable; I value the web's accessibility to new developers.

Surely there are better ways to convince sites to use HTTPS than to say they can't use getUserMedia on HTTP.


> What security concerns though? It's not like accessing the camera on a random attacker controlled HTTP page is less secure than on a random attacker controlled HTTPS page. If the user lets a malicious web page access the camera, that's game over regardless.

No. But accessing the camera on a non-attacker-controlled HTTP page is less secure than doing so on a non-attacker-controlled HTTPS page, because an attacker could MitM the former but not the latter. (Even if the camera data itself is sent securely, the attacker could just change the host page's JavaScript to send it to a different server instead.)



