
Like any other network solutions vendor?



I heard https://nvd.nist.gov/vuln/detail/CVE-2019-1804 is Cisco's ninth backdoor so far this year. Not ninth security problem total, ninth backdoor. The ninth security problem Cisco shipped intentionally.

Meanwhile, the router that serves my office is from a company that's had fewer than nine security problems in the past ten years. Two, I think, but I confess I don't really keep count (ditto the nine above). The precise number doesn't matter, because

1. If you want to be a cynic, you can point out that 9>0 and 2>0 and really they all suck.

2. And if you don't want to be a cynic, then Cisco's recent record is in a league of its own. It steals the show; it makes other vendors' CVE counts look like rounding errors.


> I heard https://nvd.nist.gov/vuln/detail/CVE-2019-1804 is Cisco's ninth backdoor so far this year.

Also the 9th they have fixed.

> Not ninth security problem total, ninth backdoor. The ninth security problem Cisco shipped intentionally.

How can you be sure it was intentional?

> Meanwhile, the router that serves my office is from a company that's had fewer than nine security problems in the past ten years.

How can you be sure? Did you audit all the source code yourself? Did you compile the source code yourself and are you running only binaries you compiled? Are you sure you can trust the compiler you used?

Or, are you assuming that, because there isn't a CVE, there isn't a vulnerability or security problem?

Fewer CVEs doesn't necessarily mean more secure; it may just mean less validation, testing, etc.

But sure, it could mean more secure, it's just not a guarantee.


Someone at Cisco intentionally created a keypair and intentionally put it in the image build process. They may or may not have intended to put it in production builds, but they clearly intended to set it up in some form, when they could have just ... not. If you take the easy but risky approach, you have certainly intentionally put yourself at risk.

I've worked for a company that built OS images for distribution to customers. Putting my SSH key in development image builds would have been convenient, but there was too much of a risk of exactly this problem; instead we just made it easy enough to download an SSH key on a development build (and start up an sshd) once you've booted it and have physical access to a terminal.

Also, a practical concern with disclosed vulnerabilities is that non-nation-state attackers (which are most of the attackers most people care about) are very unlikely to find and exploit a vulnerability that neither has a public CVE issued now nor will have one issued for years. So even if the alternative vendor has difficult-to-discover vulnerabilities, there is, in a very real sense, reduced exposure from those vulnerabilities compared to things that are disclosed and fixed. And especially if Cisco's disclosed-and-fixed vulnerabilities originate from outside vulnerability reports, there's a definite correlation between whether a vulnerability can be found by someone who would report it and whether a vulnerability can be found by someone who would exploit it.


Backdoors aren't bugs like most others. Buffer overflows happen because someone mistypes or forgets a length check, etc.

Backdoors are unusual: They happen because someone writes code of the form addAccount("s3kr3t", "s3kr3t"), and that's code that's written deliberately. You can typo and accidentally omit a bounds check, but you can't typo and accidentally end up with a valid SSH key pair and code that installs it.

It's possible to ship that SSH key pair and the installing code to customers as a bug: e.g. someone writes the code on purpose, intending to add and deploy s3kr3t/s3kr3t but never intending that code to reach the branch that's deployed to regular customers, and then someone else mismerges. In that case serving it to customer X is due to a bug; it should only have gone to customer Y or test environment Z. What I'm saying is that creating those backdoors at all must have been intentional.
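A hypothetical sketch of that mismerge failure mode (the flag and function names are invented for illustration, not taken from any vendor's code): the backdoor install is deliberately written code behind a build flag, and one bad merge of the flag's default is all it takes to ship it to everyone.

```python
# Hypothetical illustration -- none of these names come from real firmware.
# The backdoor line itself is deliberate code; only the flag's value
# decides which builds receive it.

INSTALL_TEST_ACCOUNT = True  # meant to be False on release branches;
                             # a mismerge can silently flip this


def build_image(accounts):
    """Assemble the account list baked into a firmware image."""
    image = {"accounts": list(accounts)}
    if INSTALL_TEST_ACCOUNT:
        # You can't typo your way into this line: someone typed it on purpose.
        image["accounts"].append(("s3kr3t", "s3kr3t"))
    return image
```

With the flag mismerged to True, every image built from this branch carries the test account, exactly the "bug in shipping, not in writing" distinction above.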

(Personally I think shipping backdoors to test environments is fine, including test environments at customers. Risky, but fine.)


> Backdoors aren't bugs like most others. Buffer overflows happen because someone mistypes or forgets a length check, etc.

> Backdoors are unusual: They happen because someone writes code of the form addAccount("s3kr3t", "s3kr3t"), and that's code that's written. You can typo and accidentally omit a bounds check, but you can't typo and accidentally end up with a valid SSH key pair and code that installs it.

Not sure if you're intentionally trolling, but a backdoor is simply code that bypasses security and that a particular person knows about. It does not have to be obvious. The ones with plausible deniability are the better ones; the deniability is considered a feature. That way the company can say "oops, we made a mistake".

To say that a backdoor must be obvious is absolute nonsense, particularly for closed-source binaries, where disassembly or even simple tools like https://en.wikipedia.org/wiki/Strings_(Unix) would reveal the presence of such a backdoor.
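As a rough sketch of the kind of check strings(1) performs (pure Python, toy data, no real firmware involved): extract the printable-ASCII runs from a binary blob and eyeball them for credential-looking material. An obvious hardcoded account jumps out immediately; a deniable backdoor would not.

```python
import re


def ascii_strings(blob, min_len=6):
    """Return printable-ASCII runs of at least min_len bytes,
    roughly what the Unix strings(1) tool prints."""
    return re.findall(rb"[\x20-\x7e]{%d,}" % min_len, blob)


# Toy "firmware" blob with a hardcoded credential buried in binary noise.
firmware = b"\x00\x01login:\xffaddAccount(\"s3kr3t\")\x00\x02"
for s in ascii_strings(firmware):
    print(s.decode())
```

This is why the obvious addAccount-style backdoor is the easy case; the ones worth worrying about don't leave a string behind.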

> (Personally I think shipping backdoors to test environments is fine. Including test environments at customers. Risky.)

It depends. Unless that "test build" specifically has an option that disables all test-related backdoors, the answer is no. You cannot risk having something slip through to a production build.

As the previous poster said:

> I've worked for a company that built OS images for distribution to customers. Putting my SSH key in development image builds would have been convenient, but there was too much of a risk of exactly this problem; instead we just made it easy enough to download an SSH key on a development build (and start up an sshd) once you've booted it and have physical access to a terminal.

Another very common solution is a template where the keys are generated or imported at build time by whatever build system is being used.

That way if something unintended happens or is "forgotten about", the build simply won't have a key at all and therefore will not work, rather than having a set of keys that are the same across all production builds.
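A minimal sketch of that fail-closed approach (the function name and build-type strings are made up, not any particular build system's API): dev builds get a fresh key generated per build, and a production build with nothing imported aborts instead of falling back to a shared default.

```python
import secrets


def provision_key(build_type, imported_key=None):
    """Return key material for one build, failing closed.

    - An explicitly imported key is used as-is.
    - Dev builds get a fresh key generated per build, never reused.
    - A production build with nothing imported aborts: the one thing
      that must never exist is a default key shared across all images.
    """
    if imported_key is not None:
        return imported_key
    if build_type == "dev":
        return secrets.token_hex(32)
    raise RuntimeError("no key provisioned for production build; aborting")
```

Two dev builds never share a key, and a "forgotten" production key stops the build outright rather than silently shipping the same key in every image.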


Can you tell us your vendor?

We are moving offices and it's time to change equipment. I've been reading around but still haven't put together a good list.

The only thing I've found is that MikroTik is great for wifi routing/APs.


Mikrotik is okay for small routers, and so is Ubiquiti. If you get Mikrotiks, look for the ones in angular metal boxes, not the curvy plastic ones. And avoid the GUI stuff; use the CLI. If you're looking for larger routers I'd look at Juniper first.

All of those will give you hardware that does the job and stays up, and provide uncomplicated upgrades for many years.


Palo Alto seems to have a good reputation, but that could be security through obscurity, as no one can actually afford to use it.


Palo Alto (like every vendor) has had similar vulnerabilities in its web management in the past. Typically, though, management of a switch/firewall isn't exposed directly to the Internet.



