
How to unc0ver a 0-day in 4 hours or less - GuardLlama
https://googleprojectzero.blogspot.com/2020/07/how-to-unc0ver-0-day-in-4-hours-or-less.html
======
akersten
> So, to summarize: the LightSpeed bug was fixed in iOS 12 with a patch that
> didn't address the root cause and instead just turned the race condition
> double-free into a memory leak. Then, in iOS 13, this memory leak was
> identified as a bug and "fixed" by reintroducing the original bug, again
> without addressing the root cause of the issue. And this security regression
> could have been found trivially by running the original POC from the blog
> post.

Yikes. Especially looking at the diff of the original problematic fix, it
seems like they slapped a quick patch on there and called it a day, instead of
investigating to find the underlying architectural issue. Doesn't really
inspire a lot of confidence that the resolution for unc0ver is any more
thought-through. I wonder if they've identified the root cause? That'd be the
really interesting piece to me.

~~~
Jaxkr
What’s wrong with Apple? Why is modern iOS so buggy?

~~~
xvector
A friend at Apple told me that the testing story for iOS is complete shit, and
they actually rely on hundreds of humans to test their software to make up for
poor automated testing.

Apple takes the approach of throwing humans instead of automation at a problem
quite frequently [1]:

> The press release mentions RMSI, an India-based, geospatial data firm that
> creates vegetation and 3D building datasets. And the office’s large
> headcount (now near 5,000) [used to create Apple Maps]

The lack of automated testing is something Apple is working on fixing, but
they're a ways away from having anything substantial. The terrible iOS 13
release quite significantly bumped up the internal priority of stability and
testing. iOS 14 is likely to be far less buggy than iOS 13 because of this
culture change.

[1]: [https://www.justinobeirne.com/new-apple-maps](https://www.justinobeirne.com/new-apple-maps)

------
devenblake
> By 1 AM, I had sent Apple a POC and my analysis.

> Still, I'm very happy that Apple patched this issue in a timely manner once
> the exploit became public.

Sh- should we be happy Apple fixed this so quickly? unc0ver allows consumers
to get more out of their Apple devices, and Apple's fix isn't really optional
(unless you disable auto-updates and tap "Later" on every update
notification). Is this exploit even an issue? Apple's probably not going to
let an app exploiting this zero-day into its App Store, and sideloading is
difficult; it's very unlikely someone malicious is going to trick people into
installing malware that uses this exploit. It sounds to me like Apple is
purposefully limiting consumer freedom by actively trying to prevent
jailbreaking.

~~~
olliej
Look at the attacks on various human rights activists -- those are using the
same exploits that jailbreaks use.

Fixing bugs used to attack people means fixing bugs used for jailbreaks. There
isn’t some magical mechanism by which a jailbreak exploit isn’t exploitable
by anyone else.

~~~
saagarjha
Of course, removing the need to jailbreak for such control would mean that
this dichotomy would not have to exist…

~~~
djrogers
If you remove the need for a jailbreak in order to allow arbitrary code to run
on any device, you're allowing arbitrary malware to run on any device.

~~~
saagarjha
The problem is not running arbitrary code, but running arbitrary code without
informed consent. Malware runs without consent. Apple's solution for iOS is
removing the ability to run anything completely, bypassing the need to figure
out how to obtain consent.

~~~
jsjohnst
> bypassing the need to figure out how to obtain consent

How do you propose getting “informed” consent from an audience who doesn’t
care and willingly expose everything about themselves and everyone they know
to find out which Star Wars character or 80s pop song they are most like?
Genuine question, as this doesn’t seem the least bit a solved problem
anywhere.

~~~
swiley
I don’t understand why people think that because some people are clumsy the
rest of us have to live in a straitjacket.

~~~
jsjohnst
> why people think that because some people are clumsy

Nobody said anything about clumsy people. As one of many examples, look at all
the guides that tell folks to disable SIP without explaining the risk -- when
really the app should just be fixed properly and SIP wouldn’t need to be
disabled at all.

There are exceptions of course, and good reasons to disable it, so I’m glad
Apple has the option, but I’d venture to say 85% of the time it’s done by a
person who isn’t really giving “informed consent”.

------
PragmaticPulp
Fantastic write-up. It's great to see this level of information sharing,
complete with a walkthrough of the author's thought process and strategy for
confirming the exploit. It's also interesting that this was a regression of a
previously-fixed bug rather than a new exploit.

As a side note, it's disappointing to see so much unfounded criticism here in
the comments. Apple was going to find and fix this bug quickly, regardless of
the author's efforts. In this case we get a peek into the inner workings of
the exploit discovery process that would otherwise remain secret. The author
and Apple both clearly noted that unc0ver was the source of the exploit, and
the author made no attempts to hide that fact. Calling the author of this blog
post "lazy" or an "informant" is out of touch and uncalled for.

------
curiousgal
TL;DR: reverse engineer a jailbreak exploit.

> By 7 PM, I had identified the vulnerability and informed Apple

I don't know why this rubbed me the wrong way. Like, it feels "lazy" (for lack
of a better word) to disassemble an exploit and run off to tell the vendor. If
anything, the exploit writer should get the credit. I don't know.

~~~
umvi
All this has taught me is that if I find an exploit to unlock <insert DRM'd
device> I need to obfuscate the heck out of it to make it as onerous as
possible for low-effort bug bounty do-gooders to scoop up a reward from it.

~~~
snazz
Project Zero researchers don’t take bounties, to my knowledge.

~~~
saagarjha
Nor have they been ever offered one, to my knowledge:
[https://twitter.com/i41nbeer/status/1027339893335154688](https://twitter.com/i41nbeer/status/1027339893335154688).
I'm actually not sure Apple has ever paid a bounty for anything that wasn't a
web issue…

~~~
jlgaddis
If memory serves, they've been offered bounties, but they have always been
given to charity.

I'm guessing that's a policy/requirement of Project Zero as, presumably, the
P0 folks are making "enough" already.

------
MaxLeiter
Checkra1n, another iOS exploit (although it's more impressively a bootrom
exploit), is mentioned. You can see slides on it from 2019 here:
[https://iokit.racing/oneweirdtrick.pdf](https://iokit.racing/oneweirdtrick.pdf)
(The One Weird Trick SecureROM Hates)

~~~
doublerabbit
Interesting; from those slides, I should always null out my pointers after I'm
finished with them.

~~~
sfink
If they're globals, then yes you should. Having dangling pointers anywhere,
even in supposedly unused areas, tends to come back and bite you.

For locals, why bother? The optimizer will probably discard the writes, and
worrying about stack addresses being reused is a waste of mental space and
clutters the code.

------
albntomat0
Since this always comes up, here's an overview I made several weeks ago about
where Project Zero focuses their efforts:

All counts are rough numbers. Project Zero posts:

Google: 24

Apple: 28

Microsoft: 36

I was curious, so I poked around the Project Zero bug tracker to try to find
ground truth about their bug reporting:
[https://bugs.chromium.org/p/project-zero/issues/list](https://bugs.chromium.org/p/project-zero/issues/list)
For all issues, including closed:

product=Android returns 81 results

product=iOS returns 58

vendor=Apple returns 380

vendor=Google returns 145 (bugs in Samsung's Android kernel, etc. are tracked
separately)

vendor=Linux returns 54

To be fair, a huge number of things make this an uneven comparison, including
the underlying bug rates, the different products involved, and downstream
Android vendors being tracked separately. Also, the number of bugs found !=
which ones they choose to write about.

~~~
londons_explore
Project Zero has uncovered 2033 issues... The majority of those could be used
alone to ruin your life. The rest might require two (e.g. one for the sandbox,
one for the kernel).

That's a team of ~10 security researchers over many years...

Considering how many are being discovered each day/month/year, chances are
that there are at least hundreds undiscovered...

If it only takes _one_ to ruin your life, and a good security researcher can
find one in a few weeks, or months at most, the barrier to someone evil is
really really low...

~~~
Avamander
> good security researcher can find one in a few weeks

s/good/extremely good/

This doesn't change the fact that someone evil will still probably find one.

------
etaioinshrdlu
I have nothing to add but the author of this was my best friend in elementary
school. Interests included robots, crazy science experiments, dinosaurs,
general mischief, and Perl programming.

------
saagarjha
TL;DR background for this one: there existed a zero day bug in iOS 11 related
to how the kernel processed the lio_listio call. Apple fixed it then but
introduced a memory leak. In iOS 13 Apple fixed the memory leak but
reintroduced the vulnerability. The regression was found and packaged in an
obfuscated jailbreaking tool (unc0ver); this post explains how the tool was
deobfuscated. This resulted in an "emergency" iOS 13.5.1 update to fix the
issue. Interestingly, this fix still does not fully address the memory leak:
[https://www.synacktiv.com/posts/exploit/the-fix-for-cve-2020-9859-and-the-lightspeed-vulnerability.html](https://www.synacktiv.com/posts/exploit/the-fix-for-cve-2020-9859-and-the-lightspeed-vulnerability.html)

------
Jyaif
Why is he doing that work? Doesn't Apple fix every jailbreak exploit
themselves?

~~~
jchw
In this case, it looks like there is a point to it:

> My goal in trying to identify the bug used by unc0ver was to demonstrate
> that obfuscation does not block attackers from quickly weaponizing the
> exploited vulnerability.

~~~
saurik
We all know obfuscation isn't some magic "no one knows how this works now"
trick: the goal is to buy time while people are forced to work through your
defenses and to slow down the proliferation. Now, the "problem" with this is
that some people are just really good at pulling things apart, and so one
person can spend four hours attacking it and then tell the world how it
worked. But then it is more a matter of incentives, and it still isn't the
case that there is much universal incentive for it to both be reverse
engineered and then documented for others so quickly (even in the world of
piracy; the incentives there are fascinating, but still selfish).

And in fact, I will argue that this looks like it worked great: yes, someone--
and of course, likely many people working in shadowy areas of organized crime,
arms dealers, and government contractors--figured it out in hours, and they
could have been malicious and used it to attack others. But the real question
is then how _many_ such attackers you enable and what their goals are. If you
publish an exploit as open source code along with the tool (which some people
have done in the past :/), you allow almost any idiot "end" developer to
become an attacker: millions of people at low effort instead of thousands or
hopefully even only hundreds (when combined with incentives, not just
ability).

If you publish a closed source binary with obfuscation--one which is
restricted to a limited usage profile (like if nothing else it isn't in the
right UI form to "trick" someone into triggering it, or where what it
ostensibly "does" is too blatantly noticeable) you limit the number of people
who both have the time and incentives to work out the vulnerability and then
rebuild a stable exploit for it (which is _hard_) down to a small number of
people, almost none of whom (including the attackers) are then
incentivized to publish a blog post (or certainly code) until at least months
after it gets fixed (as was the case here).

And so, as someone who had been sitting in the core of this community--where
everyone is wearing a grey hat, the vendors are the "bad guys", and
"responsible disclosure" is being complicit in a dystopia--and dealing with
these ethical challenges for a decade, my personal opinion is "please never
ever drop a zero day on the world without it being a closed source obfuscated
binary" unless you want to drop the barrier to entry so low that you have
creepy software engineers quickly using the exploit against their ex-spouse as
opposed to "merely" advanced attackers using the vulnerability for corporate
or government espionage.

~~~
jchw
> And so, as someone who had been sitting in the core of this community--where
> everyone is wearing a grey hat, the vendors are the "bad guys", and
> "responsible disclosure" is being complicit in a dystopia--and dealing with
> these ethical challenges for a decade, my personal opinion is "please never
> ever drop a zero day on the world without it being a closed source
> obfuscated binary" unless you want to drop the barrier to entry so low that
> you have creepy software engineers quickly using the exploit against their
> ex-spouse as opposed to "merely" advanced attackers using the vulnerability
> for corporate or government espionage.

Obviously you have a better understanding of the iOS jailbreak scene than I
ever will, but I still have to say I disagree with this ethical viewpoint.
Personally, I'd rather run an open source exploit chain than obfuscated
binaries from parties I do not know, which are difficult to verify as safe.
Thankfully, in the case of unc0ver that is not an issue anymore, but in the
past it has been an issue for longer periods. OTOH, if there is really a moral
dilemma in releasing 0-days as open source specifically because of small-time
abusers rather than nation-state adversaries, I don't understand how that
quandary doesn't mean you can never ethically release an iBoot exploit (or,
more generally, any bootrom exploit), for example.

I'm genuinely curious how many abusive people are motivated enough to come up
with a creepy use for a tethered jailbreak. I know it's possible, but short of
rolling your own stalkerware, it really doesn't seem too straightforward?

------
Dolores12
There is nothing to brag about. I want to own my device. I want to install on
it whatever i like.

------
appybois
Working for the wrong side, snitchy snitch.

------
staycoolboy
FTA: "...the LightSpeed bug was fixed in iOS 12 with a patch that didn't
address the root cause and instead just turned the race condition double-free
into a memory leak. Then, in iOS 13, this memory leak was identified as a bug
and "fixed" by reintroducing the original bug, again without addressing the
root cause of the issue..."

Ooof. Talk about running in circles. Either this was someone who was swamped
with work and spaced out, or a new programmer who wasn't familiar with the
original fix. Oddly, I feel bad for both of them!

~~~
ehsankia
Regardless of how bad the original fix was, this is why testing is important.
The original person should've added tests to make sure that specific issue
doesn't come up again, and they would've caught the regression.

> Thus, this is another case of a reintroduced bug that could have been
> identified by simple regression tests.

------
thierryzoller
So proud to have reverse engineered a 0-day. Ok, move on. Nothing to see.

