
This also happens on iOS with spotlight. As far as I can tell there's no way to turn it off.

Source: MITM your iOS traffic.

Sidenote -- a possibly unforeseen side effect of end-to-end encryption everywhere is that it makes it far more difficult to man-in-the-middle your traffic and hold companies accountable for their privacy policies.




> This also happens on iOS with spotlight. As far as I can tell there's no way to turn it off.

You can turn it off from Settings > General > Spotlight Search. Uncheck "Spotlight Suggestions" and "<Default search engine> Web Results".

The "About Spotlight Suggesstions & Privacy" link at the bottom of that settings page gives you instructions on how to do it too.


> a possibly unforeseen side effect of end-to-end encryption everywhere is that it makes it far more difficult to man-in-the-middle your traffic and hold companies accountable for their privacy policies

I don't think it's an unforeseen effect, but one that is highly downplayed by advocates pushing the security angle. When it was revealed that smart TVs phoned home with detailed viewing information, including filenames, I remember making a similar comment - if they had used TLS, that discovery might not have occurred.

The ability to MITM your own devices is very important, if only so you can figure out exactly what they're sending out.

Another thing is the widespread use of enforced code signing, accompanied by pinning to specific (e.g. MS) CAs. If this had happened a decade or more ago, it would've been pretty easy to pinpoint the parts of the OS responsible and just patch them out. The same thing is likely still possible today (in theory it is, as long as you can change any byte on disk), but it involves bypassing plenty of other protection mechanisms along the way and can get pretty hairy if hardware is involved (e.g. secure boot/TPM). From this perspective, remote attestation and the other upcoming security technologies are immensely disturbing. The desktop PC ecosystem is gradually being locked down in the same way that mobile already is.

These security mechanisms certainly have benefits, but their goal is ensuring that your software is completely unchanged from what the author wants you to have; in situations like these, that is precisely what you don't want. Nevertheless, I hope the hackers/crackers out there find a solution so those who are forced to use Win10 can still retain some privacy.


The "feature" to worry about is the new SGX instructions. With those, the secure boot/TPM stuff is locked down at the hardware level, and we lose root access.

Unfortunately, given how many in this very thread are willing to apologize for MS's behavior and justify their power grabs, I don't expect there will be much resistance in this War On General Purpose Computing.


Yep. 100x this.

In general, as long as you have root access to a machine, you can decrypt any traffic coming out of it, either by locating the private key in the filesystem or memory, or by patching the encryption methods to skip the encryption step.

If you do not have root access to a machine, and software on it signs traffic with a certificate you do not have access to, then you simply cannot see the traffic. If you ask me, that's a huge problem, especially when coupled with the "locking down" of ecosystems that you describe.
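
For illustration, here's a minimal Python sketch of the "patch the encryption step" idea, confined to a single process you control (an in-process analogy, not a way to intercept some other application's traffic): monkey-patching ssl.SSLSocket.sendall logs outgoing data while it is still plaintext, before the TLS layer encrypts it.

    import ssl
    from urllib.request import urlopen

    _orig_sendall = ssl.SSLSocket.sendall

    def logging_sendall(self, data, *args, **kwargs):
        # Data arriving here is still plaintext; SSLSocket encrypts it
        # internally before writing to the underlying TCP socket.
        print("[plaintext out, %d bytes] %r" % (len(data), bytes(data)[:120]))
        return _orig_sendall(self, data, *args, **kwargs)

    ssl.SSLSocket.sendall = logging_sendall

    # Any stdlib HTTPS request now shows its request bytes before encryption.
    urlopen("https://example.com").read()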

The skeptic in me wonders if the same entities pushing the privacy agenda are the same ones with vested interest in encrypted traffic that phones home.


One solution could be to move encryption to OS-level libraries in such a way that you could tap into traffic before it's encrypted for debugging.
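
Something close to this already exists for TLS in the form of the NSS key-log convention: several TLS stacks (Firefox and Chrome via the SSLKEYLOGFILE environment variable, Python 3.8+ via keylog_filename) can dump per-session secrets to a file that Wireshark can then use to decrypt a packet capture of your own traffic. A small sketch (the file path is just an example), though of course it only helps with software that cooperates, which is exactly the problem being described:

    import ssl
    from urllib.request import urlopen

    ctx = ssl.create_default_context()
    # Python 3.8+: write TLS session secrets in the NSS key-log format.
    # Point Wireshark at this file to decrypt a capture of this traffic.
    ctx.keylog_filename = "/tmp/tls-keys.log"

    urlopen("https://example.com", context=ctx).read()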


I think this is worse, though. It fires even when you've disabled web search through most GUI interfaces. You might be able to stop it via Windows Firewall by blocking SearchUI.exe.

It will even ignore an entry for a-0001.a-msedge.net in your hosts file.
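
If anyone wants to script that firewall rule rather than click through wf.msc, here's a rough sketch driving netsh from Python; the SystemApps path is the usual Cortana location on Windows 10 builds of this era, but verify it on your own machine, and run it from an elevated (administrator) prompt:

    import subprocess

    # Assumed path; check where SearchUI.exe lives on your install first.
    SEARCHUI = (r"C:\Windows\SystemApps"
                r"\Microsoft.Windows.Cortana_cw5n1h2txyewy\SearchUI.exe")

    # Add an outbound block rule for SearchUI.exe (needs an admin shell).
    subprocess.run([
        "netsh", "advfirewall", "firewall", "add", "rule",
        "name=BlockSearchUIOutbound",
        "dir=out", "action=block", "enable=yes",
        "program=" + SEARCHUI,
    ], check=True)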


I just changed both Search rules to block and it appears to have worked.


It's pretty simple if the OS you're using allows you to install a trusted root CA certificate.
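
For anyone who hasn't done this before, mitmproxy is the usual tool: install its generated root CA on the device, point the device's proxy settings at the machine running it, and log whatever you care about with a small addon. A sketch (the file name and watched hosts are just examples):

    # log_phone_home.py -- a minimal mitmproxy addon.
    # Run with:  mitmdump -s log_phone_home.py
    from mitmproxy import http

    WATCHED = ("msedge.net", "bing.com")   # example hosts; adjust to taste

    def request(flow: http.HTTPFlow) -> None:
        # Called for every client request that passes through the proxy.
        if any(h in flow.request.pretty_host for h in WATCHED):
            print(flow.request.method, flow.request.pretty_url)
            if flow.request.text:
                print("  body:", flow.request.text[:200])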


Except apps like this should probably not be using the OS CA store, and instead just pin their own CA cert. Doesn't seem to be the case here but in general I think pinning is getting more adoption, isn't it?
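
For reference, pinning can be as little as comparing a fingerprint of the presented certificate against a value baked into the client. A stdlib-only Python sketch; it pins the whole leaf certificate for simplicity (real deployments usually pin the SubjectPublicKeyInfo so the pin survives renewal with the same key), and the fingerprint is a placeholder:

    import hashlib, socket, ssl

    PINNED_SHA256 = "replace-with-the-expected-certificate-fingerprint"

    def connect_pinned(host, port=443):
        ctx = ssl.create_default_context()        # normal chain/hostname checks
        sock = ctx.wrap_socket(socket.create_connection((host, port)),
                               server_hostname=host)
        der = sock.getpeercert(binary_form=True)  # DER-encoded leaf certificate
        if hashlib.sha256(der).hexdigest() != PINNED_SHA256:
            sock.close()
            raise ssl.SSLError("certificate does not match the pinned value")
        return sock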


Not really. Heck, on Android and iOS around 40% of banking apps don't even check the certificate at all.

Most people, even most developers, seem to be pretty clueless about this stuff.


The paper won't be available to everyone until Wednesday at [1], but:

> Altogether, of the 639,283 [Android] apps in our data-set, 45 implement pinning.

[1]: https://www.usenix.org/conference/usenixsecurity15/technical...


> on Android and iOS around 40% of banking apps don't even check the certificate at all.

Please name and shame, this sounds pretty surprising!


List of Android SSL MITM vulnerable apps: https://samsclass.info/128/proj/popular-ssl.htm

Highly recommend any material on the main site as well. One of the few legit infosec professors I have ever interacted with.


At least for Android: https://docs.google.com/spreadsheets/d/1t5GXwjw82SyunALVJb2w...

There are several banking-related apps listed here.


Heard that on an old Security Now episode. https://www.grc.com/sn/sn-443-notes.pdf is the best reference I have, unfortunately; there's a mention of it near the bottom.


> just pin their own CA cert.

No. No application or OS should impose its own CA on an end user without choice. I get the importance of encrypted traffic flowing over the internet, but I also have concerns about traffic leaving my own network. Neither at home nor at my business do I want an encrypted stream of traffic flowing out of my network without being able to inspect the contents and know who the recipient is.


And it would break at any corporation that does a MITM on their own employees to monitor SSL traffic.


What do you mean "apps like this" ? The OP is talking about the Windows 10 OS, which uses the OS CA store.


The app in question is the search UI/Cortana. It has no need to use the OS store and could easily pin to MS's CA.


Doesn't work with cert pinning.


Privacy is a double-edged sword. If you want perfect privacy, you also throw away a lot of potential optimizations.

For example, if you have a local proxy cache, anyone who can observe response timings may be able to infer which sites were recently visited, which updates were applied, and so on.
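
Roughly, the probe is just "fetch it and see how long it takes". A toy Python sketch, assuming a local caching proxy at localhost:3128 (a common Squid default, adjust to your setup); a warm cache answers much faster than a cold fetch, which leaks that someone on the machine or network requested the resource recently:

    import time
    import urllib.request

    # Assumed local caching proxy; change host/port to match your setup.
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": "http://localhost:3128"}))

    def timed_fetch(url):
        start = time.perf_counter()
        opener.open(url).read()
        return time.perf_counter() - start

    print("first fetch :", timed_fetch("http://example.com/"))
    print("second fetch:", timed_fetch("http://example.com/"))  # likely cached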


> If you want perfect privacy, you also throw away a lot of potential optimizations.

That depends on what you mean by privacy. We share information, sometimes sensitive information, with other parties all the time when we interact with them. I believe the essence of privacy is more about being able to choose when and how and for what purposes information is collected and shared and used.

I can't make a purchase using a credit card without the card company at least knowing who I'm paying and how much money I'm giving them. There's not much point going to see a doctor if you're not going to discuss your medical situation with them. If I go out to visit friends, someone passing me in the street is going to know where I am at that moment in time. That doesn't mean anyone else needs to know any of those things, or that they need to be used for other purposes or correlated with other data.

In any case, with all the information sharing going on in software and networked systems these days, it is far from clear that many of those "potential optimizations" are actually in users' interests at all. Obviously some facilities do need to analyse relevant data sets to make useful predictions -- personal assistants like Siri and Cortana, say, or recommending new material that is similar to what you've accessed before on Amazon or Netflix. But even there, the limitation is often that the technology isn't powerful enough to do the same things locally yet, not that the organisations running these services inherently need to know lots of data about you.


> the essence of privacy is more about being able to choose when and how and for what purposes information is collected and shared and used

This is the core of the EU data protection principles. It's a very concise way of expressing things. There are two limitations:

- figleafing: the "cookie law" problem, where everyone is made to agree to a useless dialog box, supposedly signing away their privacy in order to look at any web page with ads on it

- it conflicts with the very strong American free speech principles, in which you can say anything you like about anyone on any basis. Privacy enforcement necessarily means silencing people talking about other people. The bad end of this is UK libel law. It's still present in the US "product libel" laws, although fortunately "ag-gag" was recently struck down.


The last time I saw a proxy was in the '90s. Who still uses proxies, outside of corporate proxies, which are there more to watch traffic than to reduce it?


Is it even possible to get perfect privacy with a smartphone?


The biggest problem with mobile phone privacy (assuming you seek it) is that no matter how trustworthy and privacy respecting the operating system and its software is, the baseband modem is usually at the behest of the network provider and also generally has full access to the main CPU and memory. There is one project I am aware of that is looking to address this problem - the Neo900 [1].

The platform should hopefully be 100% trustworthy (from an "it's free software so I can inspect it" point of view), as long as you do not choose to use a non-free graphics driver.

[1] https://neo900.org/


A WiFi-only phone running Ubuntu, CyanogenMod, Firefox OS, or Sailfish, paired with a separate mobile hotspot device that provides a WiFi access point, might be able to get around this issue of the baseband modem having access to your CPU and memory [1].

[1] https://blog.torproject.org/blog/mission-impossible-hardenin...


Correct - by using two separate devices you can cheaply get around the issue today. This gets you a setup that I could agree to be trustworthy and privacy respecting.

The convenience of one device is a big sell though, plus I think a device with a built-in cellular modem is more fairly called a "phone".


Nobody cares about perfect privacy, but people do care about Total Information Awareness. This is going in the TIA direction; in fact, it's already there. It's sickening. So I am happy to be a Linux user, and it's the right time to switch. Say goodbye to your corporate overlords and come on down here. The privacy is fine.


The parent comment was in the context of smartphones though. Do you have a smartphone & how happy are you with the privacy it gives you?


There is Jolla's Sailfish OS that has been built on Redhat Linux. However, the phone isn't exactly great hardware-wise. A better phone with the same OS is about to be released later this year by an Indian company (Intex) but I don't know whether it will be available globally.


Let's not talk about "perfect privacy" when we could still be happy with "reasonable privacy". Just because not everything is private by default doesn't mean we should be okay with the ever more invasive privacy policies of these companies.

It reminds me of the argument that "of course the NSA spies, that's what it does", which completely merges the spying on dangerous targets for national security with the spying on every single person on Earth for economic, blackmail, and other purposes. Reality is more nuanced than that.


Let's aim for perfect privacy and see where that gets us, how about that...



iOS does tell you how to turn it off, but the user has to notice a small notice on the Spotlight settings panel. Same with OS X. I believe this kind of thing is unethical, as Apple knows that most people won't change the default settings.


Yes, but the privacy implications are also exaggerated. This kind of "phoning home", at least for Apple Siri/Spotlight, goes through a session ID which is random, regenerated every 30 minutes, and not bound to your Apple ID or any other user profile. Moreover, IP addresses are not used, are not communicated to third parties (e.g. Bing for Siri), and are anonymized in logs. So yes, it is "phoning home", but with different implications compared to, say, entering a WiFi password on a logged-in Android phone.


There is nothing unethical about it. It would be if there were no way to turn it off.


The default setting should be to respect the privacy of the user first. Since they know most users will blindly go with the default, it's an asshole play.



