
At least with Linux I can do a code review. Or hire someone I trust to do it for me. With a closed system that is not an option.



I mean, yeah, but this is about a processor and its microcode. It sucks, but there is literally no mainstream, open-source, auditable option for the consumer market right now.


There's this:

https://www.sifive.com/products/risc-v-core-ip/

Not quite ready for prime time, but it's a big step in the right direction. At least there's hope for the future.


So you can do a code review on the entire Linux codebase and you are skilled enough to find any security vulnerability? Can the person you hire do it? Would they have found the Heartbleed issue that was in open source code for at least two years?


> So you can do a code review on the entire Linux codebase and you are skilled enough to find any security vulnerability?

My odds there are much better than they would be trying to do a similar audit on macOS.

> Can the person you hire do it?

There are people out there who are extraordinarily skilled at this sort of thing. Security flaws in, say, iOS are found regularly. That's how jailbreaks work. (And, BTW, the fact that Apple doesn't give me the option of opening up the jail on my own phone means that if I want this freedom I have no choice but to run some black-box binary from some unknown Chinese hackers. That is by far the biggest security risk I face when using Apple products and it is a direct result of their policies.)

> Would they have found the Heartbleed issue that was in open source code for at least two years?

They did find it, and I didn't even have to pay them. If I had paid someone to look, they might have found it sooner.

No system will ever be perfect. But I like my odds on Linux much better than on Apple.


> My odds there are much better than they would be trying to do a similar audit on macOS.

Based on what metrics? Someone skilled in assembly can find, and has found, many vulnerabilities in closed source software. The chances of you as an individual auditing the entire Linux codebase and finding a new vulnerability are nil unless you have some special skills you haven't mentioned that make you better than some of the top hackers and researchers.

> There are people out there who are extraordinarily skilled at this sort of thing. Security flaws in, say, iOS are found regularly.

Despite the fact that it's closed. How does that support your point that it's easier to find vulnerabilities in "open" software? You said that you could hire someone to audit the code; could you afford them?

> That's how jailbreaks work

Without the code being open...

> (And, BTW, the fact that Apple doesn't give me the option of opening up the jail on my own phone means that if I want this freedom I have no choice but to run some black-box binary from some unknown Chinese hackers. That is by far the biggest security risk I face when using Apple products and it is a direct result of their policies.)

Apple has the source code for all of the chips they use, and they had a hand in designing most of them. Contrast that with the random Android manufacturer that outsources its chips and gets only binary-blob drivers from the chip vendors.

Even Google said they couldn't upgrade one of their devices because they couldn't get drivers for one of their chips.

Do you really run your Android phone with all open source software that you compiled and installed yourself? Including the various drivers? Have you audited all of it?

> They did find it, and I didn't even have to pay them. If I had paid someone to look, they might have found it sooner.

And they've also found bugs in closed software, and if you pay someone enough, they could find stuff too.

> No system will ever be perfect. But I like my odds on Linux much better than on Apple.

Based on what objective metrics?


> Based on what metrics? Someone skilled in assembly can find, and has found, many vulnerabilities in closed source software. The chances of you as an individual auditing the entire Linux codebase and finding a new vulnerability are nil unless you have some special skills you haven't mentioned that make you better than some of the top hackers and researchers.

Auditing C source code is orders of magnitude simpler than auditing a binary. With the same resources, you can get much more done in the former case than in the latter.
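
A toy illustration (hypothetical code, not from any real project): the missing bounds check below is a one-line find when reading the C, but in a stripped binary it is just the absence of a compare instruction somewhere among thousands of others.

    #include <string.h>

    /* Hypothetical example: copy a caller-supplied string through a
       fixed-size stack buffer. */
    void copy_name(char *dst, const char *src, size_t src_len) {
        char buf[64];
        /* BUG: nothing checks src_len against sizeof(buf), so a long
           input smashes the stack. Trivial to spot in source; in a
           disassembly it shows up only as a missing cmp before the
           memcpy, which a reverse engineer has to notice by absence. */
        memcpy(buf, src, src_len);
        memcpy(dst, buf, src_len);
    }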


Then why do you hear more about Android vulnerabilities than iOS vulnerabilities?


Why do you keep insisting it comes down to him to audit the entire Linux codebase? The advantage of it being open is that there are millions of potential eyeballs out there, all with different areas and levels of expertise.


As time goes on, that argument looks increasingly debunked. Heartbleed, already mentioned in this thread, is a good example. OpenSSL is one of the most visible pieces of open source software, with millions of actual (not potential) eyeballs on it, and a complete security defeat sat in the code for years. I suspect that having "all bugs are shallow" programmed into our brains means we don't take the time to review the code we use, because we assume other people did. It's the bystander effect, just like at accident scenes: nobody helps, because everyone is waiting to see what everybody else does. OpenSSL is popular and open source, so surely people would have found all the issues by now. Same thought.
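
For reference, the flaw itself was tiny. Here's a simplified sketch of the pattern (my own illustrative code and names, not OpenSSL's actual source): the heartbeat handler trusted a 16-bit length field supplied by the peer and never compared it against the number of bytes actually received.

    #include <stdlib.h>
    #include <string.h>

    /* Simplified sketch of the Heartbleed pattern; NOT OpenSSL's actual
       code. msg is an attacker-supplied heartbeat message of msg_len
       bytes: a 2-byte payload length followed by the payload itself. */
    unsigned char *build_heartbeat_response(const unsigned char *msg,
                                            size_t msg_len) {
        unsigned int payload_len = (msg[0] << 8) | msg[1]; /* attacker-controlled */
        (void)msg_len; /* <-- the bug in one line: the real length is never consulted */

        unsigned char *response = malloc(3 + payload_len);
        if (response == NULL)
            return NULL;
        /* Because payload_len was never checked against msg_len, this
           memcpy reads up to ~64 KB past the end of the message, and the
           leaked heap bytes are sent back to the attacker. The fix was
           essentially: if (payload_len + 2 > msg_len) drop the message. */
        memcpy(response + 3, msg + 2, payload_len);
        return response;
    }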

OpenSSL is also extremely arcane. I tried to work on it once and spent days simply understanding the data structures; it was, when I was working with it, entirely undocumented. Out of those millions of eyeballs, maybe a few dozen completely understand the library, and only a fraction of those have the capability to review the exact algorithms being implemented. Simply publishing source code is not a silver bullet that gains you competent reviewers and contributors; otherwise Linux would be bug-free and have bug-free drivers for every device in existence.

Linus's Law has compelling arguments against it. esr may have been wrong about the bazaar.


Because he said he could do it. Those millions of potential eyeballs didn't find the Heartbleed bug. And out of the "millions" of potential eyeballs, how many are skilled enough to find anything?

There are people looking for and finding security vulnerabilities in closed software. Why do you assume it's any harder for a skilled assembly-language coder to find vulnerabilities in closed source software? Heck, back in the day I knew assembly language well enough to read and understand disassembled code, and I wasn't that great at it.


"Those millions of potential eyeballs didn't find the Heartbleed bug."

Sorry, how would we be talking about it if they didn't find it?


After it sat in open source code for at least two years?

How is that any better of a track record than vulnerabilities found in closed source code?


We don't really know when it was first found, iirc, only when it was publicized.


That's not helping the case for open source software...


Exactly. Open source is awesome, don't get me wrong, but it's not safer by definition. Sure, people can look for problems, openly discuss them, and fix them, but that assumes they're whitehats. Blackhats are also looking, all day every day, for exploits in open-source code. And they can find them before whitehats do.


> Yes. Not necessarily with the knowledge I currently possess or with the effort I'm currently willing to put in, but if I wanted to put in the necessary effort and study, I could. Having that option open to me has value in and of itself even if I choose not to exercise it.

And the option is also open to you to become really good and find security vulnerabilities in closed source software, just like the researchers at Google do and all of the people who found jailbreaks.

But even then, that wouldn't have helped with the latest bug found in x86 microcode...


> he said he could do it

Yes. Not necessarily with the knowledge I currently possess or with the effort I'm currently willing to put in, but if I wanted to put in the necessary effort and study, I could. Having that option open to me has value in and of itself even if I choose not to exercise it.


This is a super important point.

A complete layperson can use open-source software knowing that it has been looked at by at least some part of the community, conferring all kinds of security benefits.


And that same layperson can assume some experts (including some at Google) have looked at closed source software.

But all of the people running Linux on x86 had no way of knowing that there was a bug in the x86 microcode.


These are all terrible arguments, sir. There is absolutely no way to make an argument that closed-source software is just as auditable or secure as open-source software. There will always be methods of trust required when you use things other people made, but when software is developed in a black box, with NDAs you didn't know existed, or 0-days known about for years, you simply CANNOT make an analogy to Heartbleed. Heartbleed was one vulnerability in a poorly maintained OSS library. It is foolish to use it as your primary example of the failure of OSS security when there are tens of thousands of closed tickets on GitHub, Bitbucket, and GitLab that would argue OSS is easier to audit and makes it easier to openly discuss infosec issues.

By coming up with the same examples over and over again in lower threads, you are coming close to trolling this thread and ought to stop and come up with a better argument for why a corporate system-on-a-chip is inherently just as safe as hardware developed with open source firmware.


So where is the "secure software running on open source firmware" that you can point to as being super secure? In your x86-based PCs running Linux? In the Android devices using chips shipped with binary blobs that even the phone manufacturers don't have access to?

How has all of the "openness" of the Android ecosystem helped customers get patches for security vulnerabilities compared to iOS users?

And no one has pointed to a real world metric showing open source software being less vulnerable and getting patches faster than closed software.

And that one poorly maintained library was the basis of a massive, industry-wide security problem. It wasn't some obscure backwater piece of code.


I really wish the community would deal with the reality that open source software really isn't outstandingly secure - and that in fact there are exactly zero reliable processes in place that can guarantee its security.

Suggesting that "the community will find bugs, because eyeballs" is wishful hand-wavey exceptionalism that has been disproven over and over.

It's an unprofessional and unrealistic belief system that has no chance at all of creating truly secure systems.


That's all perfectly fair criticism of a specific type of OSS advocate. But the discussion here is not actually "it's more secure because it's open"; it's really "it has a higher likelihood of being secure because all of the code is fully auditable".

Again, Heartbleed is a terrible example, because many, many people complained for a long time about the cruft building up around OpenSSL. Do you think I don't read the code of an open source Python module before I roll it into a project? Would I ever consider including a Python module in my code that came to me as a binary blob (not that this is possible ... yet)? Not on your life.

The reality is it's shitty that I have to trust corporations with closed source firmware and I wish it were otherwise.


> it has a higher likelihood of being secure because all of the code is fully auditable

If that's true, there should be a way to compare the number of security vulnerabilities in iOS vs. Android, the time it takes to get them patched, and the time it takes to get the patch distributed.

It should also be possible to have the same metric for Windows vs. Linux, the various open source databases vs. the closed source databases, etc.

On the other hand, there is an existence proof that closed source doesn't prevent smart people from finding security vulnerabilities: the number of vulnerabilities that researchers have found in iOS.

But why is everyone acting as if reading assembly code is some sort of black magic that only a few anointed people can do?

And if open source, easily readable code were such a panacea, the WordPress vulnerability of the week wouldn't exist.


How did you audit your Intel CPU over the last 10 years?


I didn't, because I can't (obviously), and this is a real problem, which I am trying to solve by building my own hardware to act as a root of trust:

https://sc4.us/hsm/


The website mentions fully open hw+sw; is this public?


Depends on what you mean. The code is all open source, and the hardware is all off-the-shelf, but the board design has not been released. If you really want it, drop me a line and we can probably work something out.


You can reverse engineer the firmware files though. I do this very often when new versions of iOS are released.


Even with Linux on an open system, you're going to be hard pressed to get access to the source code for all of the controllers in your system.



