I still remember the miracle of geohot's limera1n exploit, an unpatchable bootrom exploit for the iPhone 4 and earlier. And now we have its successor, axi0mX's checkm8: still a bootrom exploit, still unpatchable. It seems another golden age for iOS jailbreaking has arrived!
Also, just like limera1n, it requires total physical control over the device to run the exploit. A complete, untethered jailbreak still requires additional kernel/userspace exploits, so I don't see it as a major security problem, but it does make the job of an evil maid a bit easier.
First of all, congrats to axi0mX! It must have taken quite the exploitation effort.
> It seems another golden age for iOS jailbreaking has arrived!
There is much, much less reason to jailbreak these days than in the iPhone 1 - 5 days. Unlocked iPhones are easy to get. Apple has copied a tremendous number of features, and with iOS 13 adding both dark mode and a fixed volume HUD, even more reasons (Noctis/Eclipse and SmartVolumeControl2) are gone. And those, along with CallBarXS (a call bar instead of fullscreen calls) and Jellyfish (weather on your lock screen), are by far the most popular tweaks. I suspect at least the fullscreen calling will be redone in iOS 14.
Theming and game emulators will probably never come to the App Store, but those are even more niche. Terminal emulators and Python environments are already in the store. That leaves... SSHing into your phone, I guess, which is mostly a gimmick.
That's not to say the wild wild west wasn't fun back then. I remember both the Yellowsn0w and Redsn0w periods of jailbreaking vividly. Icy (a Cydia alternative) dying, being revived, and dying again. Down the nostalgia rabbit hole we go...
There are a plethora of reasons I still Jailbreak my phone, but here are some off the top of my head:
• Put an extra column of icons per page on my homescreen.
• Rename apps on the homescreen, and change their icons.
• Enable the iPad's "grid" app switcher on my phone.
• Remove the stupid app bar in Messages.
• Hide various UI elements around the OS and in apps, to make everything cleaner. (Cool note: you can use PermaFlex to hide nearly any arbitrary element.)
• Downgrade apps that release updates I dislike.
• Stop Apple News from creating "personalized" recommendations that would keep me in a filter bubble.
• Play audio from YouTube videos (in Safari) while the screen is locked.
• Install a Userscript that gets rid of AMP in Google search results.
I realize a lot of these things are nitpicks—some maybe even personal eccentricities—but taken together they just make my phone a lot more pleasant to use. I would not buy an iPhone I could not Jailbreak, full stop.
If Google tried to charge for background playback on desktop, the extensions would arrive within days. iOS allows Google to get away with it, but that's precisely the problem with iOS's model.
Besides, I don't want to install a separate app. It's a website, and I should be able to use my web browser.
Edit: Also, this patch doesn't just work on YouTube, it works for any video in Safari.
> If Google tried to charge for background playback on desktop
To be fair, Premium includes lots of things: unlimited music streaming à la Spotify, ad-free YouTube, etc.
You’re not paying only for the background playback. Literally no one would be willing to pay for that. It’s just a minor perk, that’s all.
A perk you said you’d be willing to risk JBing your (expensive) phone over. Most people would probably be more willing to just pay the fee for what is a fairly decent music streaming service.
You COULD play YouTube videos in the background by default in Safari up until about summer 2016. Then they sneakily added a hook that kills playback if the page loses focus. But sometimes there's a race condition or something and it doesn't take, so videos will still play audio in the background occasionally, and for a while you could fool it by requesting the desktop version of the YouTube page.
Point is, this was there by default, and they intentionally sabotaged it to force you to use their paid service. This is shitty malware-like behavior and I do not want to encourage them. I would not give them a single cent and I happily spent time to defeat it, because fuck them.
In your outrage, you are completely missing the point.
I don’t care about anything you’re saying here about how things are or were. Really. Not one bit.
I was merely saying that for that -one- particular reason for jailbreaking, a much simpler alternative exists (from a layman's perspective).
That's really just factual information, contributing to the discussion if you like, and downvoting me because you have a grudge against Google and Apple is utterly misguided.
Or write your own small app to do that :) There are a bunch of ways and premade libraries to get YouTube to deliver videos straight from its CDN. Then just play them using AVFoundation and add whatever features you want.
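To make that last step concrete, here's a minimal sketch of background-capable playback with AVFoundation, assuming you've already obtained a direct media URL by some other means (the URL below, and everything around fetching it, is just a placeholder):

    import AVFoundation

    // Placeholder: a direct media URL resolved by whatever library/approach
    // you use. Nothing YouTube-specific here.
    let mediaURL = URL(string: "https://example.com/video.mp4")!

    do {
        // The "playback" category lets audio continue when the app is
        // backgrounded or the screen locks (the app also needs the "audio"
        // background mode enabled in its capabilities).
        try AVAudioSession.sharedInstance().setCategory(.playback, mode: .moviePlayback)
        try AVAudioSession.sharedInstance().setActive(true)
    } catch {
        print("Audio session setup failed: \(error)")
    }

    let player = AVPlayer(url: mediaURL)
    player.play()

From there it's mostly UI and whatever extra features you want on top of AVPlayer.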
We hear this “Apple has done everything you could only get with a jailbreak, so jailbreaking is now obsolete” every year, yet we still see neat new system-level functionality ideas show up all the time. Plenty of new ideas came about from jailbreakers in the “post-limera1n” era (2014 - present), including ones you’ve mentioned. I’m fairly certain this will open the floodgates to a plethora of new ideas, simply because it’ll soon be far less risky to develop jailbreak things (no need to worry about soft-bricking your phone and being forced to restore to the latest, probably non-jailbreakable, iOS version). The workarounds employed by the latest jailbreaks to avoid triggering security checks also make things just that little bit more annoying to work with. Developers being able to run in a “fully jailbroken, anything goes” state like the jailbreaks of yesteryear, as well as being able to develop on top of iOS beta releases, would encourage development once more.
And no terminal app in the App Store compares to actually running your code on bare metal with no clever workarounds required to make Apple happy. I don’t want to add network delays (mosh is great but still just isn’t the same) and worry about transferring files around, or force myself to run inside an emulated, fenced-off world (iSH). It’s just easier to run programs like, well, actual programs. This is why I develop NewTerm, a local terminal app for iOS, and am a huge fan of Termux on Android which neatly packages an entire self-contained ecosystem that still runs on bare metal.
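For what it's worth, "running programs like actual programs" really does just mean the ordinary POSIX process model once the sandbox allows it. Here's a minimal Swift sketch of the kind of thing a local terminal does; the binary path is illustrative, and this won't get past the sandbox on stock iOS:

    import Darwin

    // Spawn a real on-device process and wait for it: no emulator, no remote
    // host. The path here is illustrative.
    var pid: pid_t = 0
    var argv: [UnsafeMutablePointer<CChar>?] = [strdup("/usr/bin/uname"), strdup("-a"), nil]
    var envp: [UnsafeMutablePointer<CChar>?] = [nil]   // empty environment, for simplicity

    if posix_spawn(&pid, "/usr/bin/uname", nil, nil, &argv, &envp) == 0 {
        var status: Int32 = 0
        waitpid(pid, &status, 0)
    }

    argv.forEach { free($0) }

(A real terminal app does something along these lines with a pty and a shell, but the core idea is the same.)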
> Theming and game emulators will probably never come to the App Store, but those are even more niche.
Apple thankfully enabled sideloading for "free" developer accounts, and someone released AltStore (a distribution method for the Delta game emulator) yesterday [1].
But you usually have a three-app limit and need to reinstall every seven days. AltStore uses some very clever tricks to get around that, and while I applaud the developer’s creativity, I don’t expect it to last.
I’m considering jailbreaking to be able to force my phone to stay on 4G.
It’s a stupid omission by Apple to not have a “4G only” toggle. 3G connectivity really sucks and it’s annoying to randomly be downgraded when you have perfectly fine 4G coverage.
What do you propose a "stay on 4G" option would do when the device loses 4G coverage?
Drop signal altogether?
IIRC, falling back to 3G from 4G, or to 2G from 3G, is meant to cover temporary coverage black spots and allow data/voice communication to continue, albeit less optimally.
I have had similar problems. The algorithm they use is not perfect.
My Nexus One used to fall back to 2G whenever my 3G got below like two bars, it was so annoying. And also there would be a temporary outage while it switched. I eventually learned how to hard disable 2G, and service improved greatly, because even one bar of 3G was better than 2G.
Firstly, losing 4G coverage isn’t an issue where I use my phone. I don’t live in a third world country like the US. Despite this my iPhone will randomly decide to downgrade to 3G.
As to what the phone should do when it leaves 4G coverage while in “4G only” mode: nothing. It should just aggressively try to reconnect. A discreet icon indicating 3G network availability should be shown; pressing it would enable temporary 3G roaming.
The underlying problem is that Apple handles roaming between 2G/3G/4G networks poorly. I could wish Apple would do it better, but that’s like wishing for a unicorn. Hence my strong leanings toward jailbreaking.
I'm very confused. On one hand, the tweet claims to have a bootrom exploit. On the other hand, the fifth tweet in the chain talks about an iBoot vulnerability that got patched in the iOS 12 betas [1].
Maybe the vulnerable codepath has some code sharing between iBoot and SecureROM?
If I've understood this correctly, it was an iBoot vulnerability enabling the exploitation of the BootROM vulnerability untethered (without connecting to a computer again). Since the iBoot vulnerability is patched, the phone has to be connected to a computer every time to boot if there has been any tinkering (custom FW or any change in boot sequence).
So pre-patch you could exploit the BootROM vulnerability untethered via the iBoot vulnerability, but post-patch you have to connect to a computer every time you boot if you have done any tinkering, which is why it is currently only advised for security researchers.
Tinkering with the BootROM also leads to invalidations of APTickets (so a future restore may be impossible without special gear).
That's pretty darn great compared to the current situation. Perhaps not quite a return to the redsn0w days, but a major boon all the same.
A (large?) majority of the iPhones currently in circulation will soon be Jailbreakable—not just for one brief moment in time (as with iOS 12.4), but on every future version of iOS. I didn't think that was ever going to happen again.
As an A12 owner, I'm really happy my device is now reasonably more secure if taken into a government back room during customs or via an evil maid attack. Different perspective I suppose.
How is this like a safe? You possess the keys to a safe you own. You do not possess the keys to iPhones with A12 chips in the sense that this exploit delivers.
Everybody seems so happy about this, here on HN and on Twitter. But wouldn't this allow any law enforcement or bad actor to circumvent any device protections? Get the phone, do whatever you want with it without anything blocking you.
Am I missing something or are all these device secure enclaves and fingerprint protection or key protection now moot?
As far as I understand it, user data is still encrypted and the key is protected by the Secure Enclave, which is not affected.
This exploit allows flashing unsigned firmware, so by stealing the phone the attackers won’t be able to decrypt your data, but an evil maid attack is now (or will be) feasible.
Also, stolen iPhones are now more valuable, as you will be able to bypass iCloud Lock.
So if I understand, just losing your phone is safe, but if you find it again after losing it, you basically shouldn't keep using the phone before completely reflashing the device?
Well, it won’t come back on in that case (the modified firmware will fail the signature check). But as you say, you are still safe; just don’t unlock the device before rebooting.
I think that depends on how it's set up, right? I remember on my old iPod Touch with a tethered bootrom exploit, you could reboot without a computer, but it would start up in non-Jailbroken mode. If you wanted to boot Jailbroken, you had to find a computer. (This was the origin of the term "semi-tethered Jailbreak".)
I don’t have deep knowledge of the security architecture here, but in general the key doesn’t need to be compromised to retrieve data in secure systems.
What prevents unauthorized firmware from requesting that the Secure Enclave decrypt all data? Similar to having control over an HSM - you can’t extract the key but you can perform cryptographic operations.
The SE will still require a password or other authentication data (e.g. iris/fingerprint or other biometric measurements) before it will use its key to decrypt the data.
The only ways around this are:
* physical extraction of the embedded memory in the SE (I'm not sure if this is actually feasible, it's certainly a destructive attack)
* "updating" the SE firmware - this is what the FBI wanted Apple to do in that terrorism case, that Apple develop a SE firmware that leaks the secret key
* exploiting bugs in the SE firmware - this is what the FBI ended up doing by hiring either Cellebrite or some anonymous hackers (depending on which source one believes).
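To make the HSM comparison above concrete from the app-facing side: keys generated inside the Secure Enclave can be used but never exported, and their use can be gated on user presence. A minimal Security-framework sketch (the tag and parameters are illustrative, and this is the app-level key API, not the data-protection class-key machinery discussed above):

    import Foundation
    import Security

    // Require user presence (passcode/biometrics) before the key may be used,
    // and mark it as a Secure Enclave key.
    let access = SecAccessControlCreateWithFlags(
        kCFAllocatorDefault,
        kSecAttrAccessibleWhenUnlockedThisDeviceOnly,
        [.privateKeyUsage, .userPresence],
        nil)!

    let attributes: [String: Any] = [
        kSecAttrKeyType as String: kSecAttrKeyTypeECSECPrimeRandom,
        kSecAttrKeySizeInBits as String: 256,
        kSecAttrTokenID as String: kSecAttrTokenIDSecureEnclave,
        kSecPrivateKeyAttrs as String: [
            kSecAttrIsPermanent as String: true,
            kSecAttrApplicationTag as String: Data("com.example.sep-demo".utf8),
            kSecAttrAccessControl as String: access,
        ],
    ]

    var error: Unmanaged<CFError>?
    // The private key is generated inside the SEP; only an opaque handle comes back.
    let privateKey = SecKeyCreateRandomKey(attributes as CFDictionary, &error)!

    // You can ask the SEP to sign with the key, but you can never read it out.
    let signature = SecKeyCreateSignature(
        privateKey,
        .ecdsaSignatureMessageX962SHA256,
        Data("hello".utf8) as CFData,
        &error)

Even code with full control of the application processor can only ask the SEP to use keys, subject to whatever policy (passcode, biometrics) is attached to them; it can't extract them.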
I see. Hence evil maid attack. If someone has temporary physical access they could install malware that captures data when the device is unlocked. Chargers as an attack vector seem more likely, if more mundane.
In the case of malicious chargers I believe Apple already authenticates peripherals to make data capture more difficult. If you’re unauthenticated then you will not be able to do much unless explicitly authorised.
Of course, none of that matters if you can reflash the device or exploit the boot ROM.
If Apple gave authorised owners a way to sign and boot their own code there wouldn't be as much of a necessity for people to go hunting for these exploits.
Change Apple to Google, then wait 10 years _after_ Google does it.
Android devices could allow you to do this out of the box if Google allowed you to upload your own keys and sign your own boot image by providing the tools to developers / power users. They haven't and they won't.
Sadly that means Apple is unlikely to do it either given how much more strict they are on these things.
The way Google does it seems fine to me. You have to explicitly unlock the bootloader, which wipes all the data. As an extra, if you want to use the phone after wiping, you need the previous user's credentials (to deter reselling of stolen phones). Once you unlock the bootloader you can do whatever, but the phone explicitly shows that it's unlocked on boot, so you know your bootloader is no longer safe.
And how many people would know what “an unlocked bootloader” meant if someone else unlocked it, like an overpossessive spouse or a government actor wanting to install malware?
We have decades of research showing that people ignore warnings (see Vista UAC). People always click continue. There is so much scareware out there that people have become immune.
People became immune to UAC because they saw it everywhere, and were trained that it was safe to ignore. Users don't ignore novel warnings, at least not when they're sufficiently noticeable.
If you try to visit a site in Google Chrome that Google thinks is hosting malware, they pop up a huge red message saying the site will harm your computer. There is a continue link, but... well, I don't have access to any analytics on this, but I would guess not many people visit those sites.
Exactly, if a site that you know is perfectly safe showed that red warning, then you'd probably start ignoring it. UAC would pop up on almost anything you tried to run, making it worthless. The android bootscreen should almost never happen, unless you buy a second hand phone or someone actually hacks your device.
If I receive a warning, I need to find out what it means.
If my primary device is the one displaying the warning, the only way I can find out what it means is to dismiss the warning and then 1) google it or 2) ask someone.
1) doesn't tend to happen outside of the tech bubble. 2) happens way, way later, if it happens at all, as the odds of someone you can ask being around when it happens are slim. And more importantly, you need to make a phone call / check your email / do something with social media, which is more important than the warning, since the warning can be dismissed and life can go on.
Odds are you forget about it entirely, and remember weeks later to ask a friend about somethingsomething boot warning insecure and then hand over your phone to them to have a look, at which point your friend loses their mind over what's happened, while the phone owner remains unconvinced it's really an issue since everything is still working correctly, and refuses to let their friend rebuild their phone for them as it'll take too long.
Source: this happened to my friend's Android phone. They still won't let me fix it.
And if they do see the warning? Where will they go to get more information? Will they use their computer? It’s becoming more common for people’s only “computer” to be their mobile phone. What do they do when they get the warning? Do they go to their local, nonexistent “Android store”? Do they ask the clueless CSR at their carrier? Or do they just keep clicking continue, like most people do on their computer?
How many people in other context like cars ignore warning lights?
So it's literally an education problem? The same issue exists with authentication on the web, if it weren't for normies being unable to wrap their head around it, we could be signing into websites with public key crypto instead of faffing with passwords.
Must we continue to drag things out and design for the lowest common denominator?
It’s an “education problem” that has been going on at least since the dawn of the web where people had 10 toolbars installed on their browser and adware installed on their computer.
Any kind of code signing that forces the end user to trust a third party and restricts how a person may use their own property needs to go.
The security benefits are real, but the implementations are poor. On Android devices, a locked bootloader guarantees your device will be e-waste within 2 years.
Even then, that's only for the boot chain after the application bootloader. You can't, for example, run custom UEFI apps in the Qualcomm eXtensible Boot Loader (let alone custom TrustZone applets or Hexagon DSP firmware), as those are still locked down to Google's root key hash (RKH). Similarly, Google (unlike a number of Chinese vendors) does not offer signed Firehose emergency-recovery blobs in case of a hard user brick (say, erasing the GPT on the flash).
We should all be happy this exploit was released publicly, even more than we are happy it exists. The alternative is that law enforcement and other bad actors might have knowledge of this exploit that the public doesn’t.
I don't see a claim that this circumvents secure enclave or device encryption, so I don't think it's possible to "do anything" to an already locked and encrypted device?
> What I am releasing today is not a full jailbreak with Cydia, just an exploit. Researchers and developers can use it to dump SecureROM, decrypt keybags with AES engine, and demote the device to enable JTAG. You still need additional hardware and software to use JTAG.
> During iOS 12 betas in summer 2018, Apple patched a critical use-after-free vulnerability in iBoot USB code. This vulnerability can only be triggered over USB and requires physical access. It cannot be exploited remotely. I am sure many researchers have seen that patch.
This looks like a bootrom exploit. The bootrom is the stage before iBoot, and it verifies the signatures of the firmware. Its code is hard-burned and read-only from software, so this can only be fixed with new hardware. https://www.theiphonewiki.com/wiki/Bootrom
A large part of the jailbreak user community is pretty young agewise. Lots of drama/immaturity/people quitting out of the scene due to toxicity. Some of the people crafting these released exploits into a functioning jailbreak are in college or below!
There's a pretty big piracy problem as well (not just cracked iOS apps, but also cracked paid tweaks released by devs for jailbreak devices) probably due to the younger ages without access to $.
I haven't seen news like this in years, not since Geohot (founder of comma.ai) found the limera1n BootROM exploit for the iPhone 4 and below. With this recent addition, we can have more freedom and control over our iPhones/iPads. This is indeed a glorious time, and a good time to be in the Jailbreak community.
This affects devices ranging from the iPhone 4S to the iPhone X. That is a large scope of vulnerable devices.
This is equivalent to the Nintendo Switch BootROM exploit and allows all sorts of OSes such as Linux, Android to be installed on the iDevice.
I've been poking at this already, and I'd venture a guess that the T2 will be breakable. The reason I think the XS and later are hard is pointer authentication codes, but that's more conjecture, as I don't have a SecureROM dump from an XS. I plan to examine other parts of the bootloader, like LLB, assuming they are not encrypted, for the ARM branch-with-authentication instructions...
I'm confused. The GitHub repo only refers to the iPhone 3GS. Further, might this be a vector for anybody with physical access to gain access to your encrypted data?
Do I understand this correctly: does this mean that every iOS device from iPhone 4 to iPhone 8 + iPhone X could be unlocked for data access if you had physical access to it?
Also, one of the replies to the Twitter link posted elsewhere in the comments here [0] says the following:
`During iOS 12 betas in summer 2018, Apple patched a critical use-after-free vulnerability in iBoot USB code. This vulnerability can only be triggered over USB and requires physical access. It cannot be exploited remotely. I am sure many researchers have seen that patch.`
This could explain the recent reduction in bounty prices for iOS (now lower than Android) [1].
The data on the device is encrypted. This allows you to install a different OS on the device, but it won’t be able to decrypt the user's data without the PIN code.
You could potentially flash new firmware which contains a keylogger and sends the PIN to someone, or which waits for the user to enter it (decrypting the disk) and then siphons off the data.
But with just a stolen phone and this exploit, you won’t be able to read the data.
AFAIK your second statement is sadly impossible. The system needs an "SHSH" blob signed by Apple's server with a 1024-bit RSA key to install an upgrade or a downgrade.
If this key is in the bootrom, you can't override it, can you?
But the iOS you install doesn't even need to be signed anymore. I'm not an expert on this, but I would suspect you can delete the entire authorization step wholesale.
This is true, so long as you accept that you need to use a PC to re-run the bootrom exploit every time you boot up the phone. The signature checks remain intact and the system will refuse to boot (goes to recovery mode with a “restore with iTunes” graphic) without the exploit patching out the checks.
It just needs something to boot it, not necessarily a computer. I could see a market for something like this (https://www.amazon.com/dp/B07GWLF4GR) that fits in nearly flush.
I have a question... I have an iPhone 7 Plus with an iCloud Lock that I've been holding onto for a while now (I bought it used and didn't set up the phone while I was there... I know, stupid mistake). So will this exploit PERMANENTLY bypass the iCloud Lock once I install new firmware, etc.? Or will I have to run the code (once it gets developed) every time the phone shuts off or restarts? That would really suck if that's the case.
I would be interested to know what the impact on Secure Enclave will be from this exploit. From quick googling it sounds like bootrom isn’t involved in booting SEP. One large change from the exploit though is unlocking isn’t required to jailbreak.
Aside from that though - are there any extra abilities gained that weren’t already accessible as root (i.e. jailbreak) in iOS?
Replacing the kernel is the big one. On current hardware, once the kernel is loaded, the DRAM controller has its own memory protection unit, separate from the CPUs' MMUs, that's set once and then can't be modified until the next reboot. This is used to enforce that the code segment of the kernel can't be written to even if you have access to physical memory or the page tables.
It might make the Secure Enclave easier to hack, just by providing nicer, democratized access to the application processor's kernel space. But AFAIK none of this directly affects the Secure Enclave, as it has its own bootrom that's way smaller and mainly just cryptographically verifies and executes a blob loaded by the main kernel.
I don’t think they can avoid having a USB device stack in the bootrom. I haven’t read much about this yet, but without bootrom USB device-mode support, the BL can’t be flashed on a bricked device. So devices that are still functional but somehow corrupted couldn’t be fixed in an Apple Store (or at home?). They’d likely be trash.
I’m sure they have multiple redundant BLs, so I don’t know how often this actually happens.
That USB code was surely the riskiest thing in the bootrom by far though. They will be re-evaluating if it is necessary in new chips.
They also probably provision the devices over USB in the factory. But if that’s all the USB was for, I would suspect it would be disabled as the last step.
Apple actually used to do this on MacBooks - one of the USB ports has mux circuitry that can be switched to an SMC programming mode that merely uses USB as the physical connection for their SMC programmer tool, bypassing the USB host controller and not using the USB protocol. It was removed in the 2016 onwards MacBooks as the T1/T2 supersedes the SMC, so it follows the same design patterns as iPhone hardware.
Speed is a concern in the factory, and that would slow them down even if the alternate format was as fast as USB, which it probably wouldn't be. USB pads could be special and require additional work to mux with whatever the other format is. There are things they could do at the software level that could make sanitization easier too.
But if they are really upset about this, they will consider it.
It’s largely down to security economics. Boot ROMs can eventually be bypassed although difficult. But the real goal is actually to make it difficult enough such that you require physical access, specialised equipment and knowledge in order to pull it off - essentially making sure attacks are difficult to scale.
This should in theory increase the number of iOS users, because who doesn't want a jailbroken iPhone? I know I'm going to switch from Android once a proper method is available. They should make it work only for the new iPhones in the future, to increase sales.
I doubt it'll be enough to make a dent in adoption rates. But I do have some 32-bit apps I wouldn't mind playing around with for a bit: some synths and a game or two.
Just for nostalgia, here's the original release text of limera1n.
> limera1n, 6 months in the making
> iPhone 3GS, iPod Touch (3rd generation), iPad, iPhone 4, iPod Touch (4th generation)
> 4.0-4.1 and beyond+++
> limera1n is unpatchable
> untethered thanks to jailbreakme star comex
> brought to you by geohot
> hacktivates
> Mac coming in 7 years
> donations keep support alive
> zero pictures of my face