
IOHIDeous OS X Local Kernel Vulnerability - tptacek
https://siguza.github.io/IOHIDeous/
======
stevemk14ebr
Oh my, I feel so sorry for Apple security engineers right now. I'm curious
about the motivation for releasing such a serious, and complex, 0day on New
Year's Eve!? Makes me wonder who rubbed this guy the wrong way, or if he just
wants to watch the world burn.

That said, seriously impressive work and I give him props.

~~~
Godel_unicode
> I'm curious the motivations to release such a serious, and complex, 0day on
> new years eve!?

CVE-2018-0001, it's all about namespaces.

~~~
floatingatoll
Bonus points if they issue it as CVE-2018-65536 (pen-test the world, so to
speak)

EDIT: "No one would ever store the CVE incrementing fragment as a 16-bit
unsigned int!"

~~~
jacquesm
Or in the pattern CVE-dddd-dddd ...

For additional fun, find a buffer overrun based on the CVE ID.

------
st3fan
There are two pieces here that I find really impressive:

First, the skills and persistence to get all these moving parts going. This
must have been weeks of tiring work and exploration.

Second, the fact that the author wrote an incredibly detailed post with a lot
of background information.

Wonderful work.

------
lopatin
To all the kernel programmers out there, can we get a HN-level ELI5 for this?

It looks like a total system compromise is possible. Under what conditions?
Any ways to ensure we don't get pwned?

~~~
Siguza
Needs to be running on the host already (nothing remote), achieves full system
compromise by itself, but logs you out in the process. Can wait for logout
though and is fast enough to run on shutdown/reboot until 10.13.1. On 10.13.2
it takes a fair bit longer (maybe half a minute) after logging out, so if your
OS logs you out unexpectedly... maybe pull the plug? And maybe don't download
& run untrusted software until the bug is patched (or, you know, ever)? Also,
any decent antivirus shouldn't take long to add this to their malware
definitions.

Not sure if this is HN-level, but... I hope it's understandable.

~~~
geekamongus
> Also, any decent antivirus shouldn't take long to add this to their malware
> definitions.

Have Mac users finally started running antivirus?

~~~
j0hnml
Apple actually creates some signatures in house with “XProtect”, but I’m not
sure they do the same for raw privesc exploits. I’m also not sure how thorough
they are with their signature creation...

------
userbinator
I'm actually not too surprised. I briefly delved into Mac kernel programming
in the early days of Hackintoshing (in an attempt to write some missing device
drivers), and the amount of complexity there seemed excessive --- to someone
who had done previous kernel work in Windows and Linux. The overall impression
I had was "far too many moving pieces". And as everyone should know,
complexity hides bugs.

------
sprite
Amazing writeup.

------
ngcc_hk
The article is really eye-opening for a non-OS guy. I love the Mac and worry.
But sometimes you do, in a sense, appreciate Darth xxxxx.

------
mappu
Responsible disclosure would have been to product-security@apple.com. Does
Apple have a bug-bounty program?

~~~
tptacek
"Responsible Disclosure" is an Orwellian term concocted by vendors to control
the actions of independent vulnerability researchers who work without real
compensation, using information freely available to consumers, in competition
with malicious attackers.

The term you're looking for is "Coordinated Disclosure". Yes, Coordinated
Disclosure would involve sending the bug to Apple and waiting for them to
publish it.

If you'd like to complain that this disclosure is irresponsible, fine. But try
not to do it using the vendor's marketing term, because it's not up to them to
decide what is and isn't "responsible". Other reasonable people --- myself
included --- will probably disagree with you, and say that getting information
out to people as comprehensively as possible is usually the most responsible
thing you can do with a security bug.

~~~
tensor
I don't like the term you made up. But fine, I think this is "Irresponsible
Disclosure." Is that better? Did anything change?

Vendors and non-vendors alike are all responsible for good security, and that
includes working together to make this happen. If you are working against
vendors because of some preconceived notion that they are "evil," that's not a
good thing.

If it turns out that the author did submit to the vendor and worked together
to minimize damage then I'll retract my statement. Until then I think it's
irresponsible, not just "uncoordinated".

~~~
dpwm
The problem is that allowing the vendor to define what is responsible, which
seems these days to be expanding into giving them unlimited time to fix it, is
to allow them to take unlimited time to fix it.

Cooperation or even coordination takes willingness from both parties. Let's
look at the actual page Apple has on reporting security issues [0]:

"When we receive your email, we send an automatic email as acknowledgment. If
you do not get this email, please check the email address and send again. We
will respond with additional emails if we need further information to
investigate a security issue."

Something seems a bit off here. I would have expected a human to get back
within a few working days for a serious security problem. That might be in the
auto response email, but I wouldn't be surprised if it wasn't.

"For the protection of our customers, Apple generally does not disclose,
discuss, or confirm security issues until a full investigation is complete and
any necessary patches or releases are available."

Does this extend to the security researcher that reports the vulnerability? If
so, that's probably why there was no coordination.

[0] https://support.apple.com/en-us/HT201220

Edit: removed spelling mistake.

~~~
an_account
As a Mac user, I feel it’s irresponsible. I don’t want zero days published
before Apple has a chance to fix.

I also think that the vendor has a responsibility to fix the exploit quickly,
and if not the researcher should publish and shame the vendor.

~~~
Digital-Citizen
Is 3 years considered quickly enough? How about 3 years for a remotely-
exploitable problem? According to The Telegraph
[http://www.telegraph.co.uk/technology/apple/8912714/Apple-iTunes-flaw-allowed-government-spying-for-3-years.html],
"Apple was informed about the relevant flaw in iTunes in 2008, according to
Brian Krebs, a security writer, but did not patch the software until earlier
this month [Nov 2011], a delay of more than three years."

It seems to me that nobody but Apple has a responsibility to its users. The
public at large certainly doesn't owe Apple (or any other software proprietor)
specific performance regardless of whether they report what they've found
publicly or when.

Apple is also not being nice to its users by denying them software freedom:
most of MacOS is proprietary and the aforementioned bug concerned iTunes, a
proprietary media player. So no matter how technically savvy and willing the
user is, they're not allowed to diagnose and fix the problem, prepare a fixed
copy of the changed files, and help their community by sharing copies of the
improved code.

"Responsible disclosure" is indeed propaganda that benefits the proprietor in
a clumsy attempt to divert blame for a product people paid for with their
software freedom as well as their money.

