To confused HN commenters: Corellium works by offering VMs of iOS on all models of Apple devices, with terminal (i.e., pre-jailbroken) access to the operating system underneath.
For example, as a security researcher, I could order a copy of iOS 11.1 running on an iPhone 6 32GB. It would be spun up and accessible in about 3 or 4 minutes, and I could run commands directly against the Darwin kernel underneath.
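To make that concrete, here's a minimal sketch of the kind of session this enables, assuming plain SSH access into the hosted VM; the hostname, port, and credentials below are hypothetical placeholders, not real Corellium endpoints:

    # Hypothetical sketch: running commands against the Darwin kernel of a
    # hosted iOS VM over SSH. The host, port, and credentials below are
    # placeholders, not real Corellium endpoints.
    import paramiko

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect("vm.example.com", port=2222, username="root", password="alpine")

    # Standard Darwin commands a researcher might start with.
    for cmd in ("uname -a", "sysctl kern.version", "ps aux"):
        _, stdout, _ = client.exec_command(cmd)
        print("$ " + cmd)
        print(stdout.read().decode())

    client.close()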
Why is this illegal? Corellium DOES NOT have a physical iPhone 6 that it is screen-recording. They have actually made copies of various iOS releases and are running them on virtualization software, while making big bucks from researchers for this technology.
Will Apple win? Well, if you look at the Apple v. Psystar case (filed in 2008, in which Apple won pretty much every case, appeal, and injunction it filed for), the odds of a Corellium victory are about as good as Bill de Blasio's odds of winning the 2020 election.
There is a fair use case here. My understanding is that you're allowed to do this sort of thing for the purposes of security research.
It most definitely is not unethical. Illegal perhaps, but unethical? Please.
If they are trying to make money off selling to people doing something other than security research (say, playing games and using apps), then yeah, that'd be unethical.
To make copies of iOS (against ToS), run it on non-Apple-branded devices (against ToS), and make profit off it?
Like, I could see fair use for an individual security researcher, but a business making a profit by circumventing Apple's ToS and security in multiple ways, and encouraging others to do the same? I find it unlikely to pass.
They're not doomed to make their own tools from scratch. If researchers publish open source tools for spinning up your own VM on your own system, that's an entirely different situation.
Lots of folks do it; no one tries to make money off of it (or at least they stop trying after the initial letters from IBM legal arrive).
Hercules is an interesting case because it's widely rumored to be used inside IBM to run modern System Z releases.
In this case, however, the company in question isn't just making and selling an emulator you can run iOS on; it's selling hosted iOS as a service, which is a clear license violation. Just making the emulator and instructions for its use available for sale would be quite legal.
The Psystar case (or at least my read of Apple's position in that situation) implies that Apple (circa the late 2000s) considered that making such an emulator available would not be legal.
Instead of just Apple though, think of a bigger picture here:
Let's say someone ran a service to help you hack AirBNB. Legal?
Let's say someone ran a service which spun up Windows Virtual Machines, of any model, which were not genuine licenses, for the sake of "security research." Legal?
Let's say someone ran a service which helped you find bugs in anti-cheat software and run those bugs on your PC. Legal?
> Let's say someone ran a service which spun up Windows Virtual Machines, of any model, which were not genuine licenses, for the sake of "security research." Legal?
> These virtual machines expire after 90 days. We recommend setting a snapshot when you first install the virtual machine which you can roll back to later.
Edit: Furthermore:
> The Microsoft Software License Terms for the Microsoft Edge and IE VMs are included in the release notes[1] and supersede any conflicting Windows license terms included in the VMs. By downloading and using this software, you agree to these license terms.
> NO ACTIVATION. To prevent its unlicensed use, the software contains activation enforcement technology. Because the software is licensed for testing use only, you are not licensed to activate the software for any purpose even if it prompts you to do so.
Not only are you encouraged to use the software for testing without paying for a license, you're outright forbidden from activating (i.e., paying for) it.
Microsoft are allowed to do so in this case because they are the owners of the software - they can license the software they own under any terms they want. In this case though, I'm sure there will be terms that prevent you from offering the VMs as a service to others for a fee.
Microsoft are definitely not 'someone' in the sense OP intended.
> Let's say someone ran a service which spun up Windows Virtual Machines, of any model, which were not genuine licenses, for the sake of "security research." Legal?
Here, I see "someone" as a security researcher, not someone who is reselling the VMs. The OP may have meant someone spinning them up to resell them to security researchers, but, given the ambiguity, I do think it's important to point out that you can spin up free Windows VMs for research purposes--you are explicitly licensed to do so.
If the analogy is taken literally, then the answer is: yes, it's legal as long as you're not reselling them.
But MS does not encourage you to sell access to those unlicensed copies. Corellium's business model isn't security research, it's taking illegal copies and "renting" them out to people for money. And people go to them because it's convenient, not because there is no other legal way.
Of course there are key differences; however, the analogy provided by the OP doesn't really fit here.
> Let's say someone ran a service which spun up Windows Virtual Machines, of any model, which were not genuine licenses, for the sake of "security research." Legal?
You can, in fact, legally spin up arbitrary Windows VMs for security research without a paid license.
If any comparison is to be made here, it should be between Apple's licensing and Microsoft's licensing:
* Microsoft will allow you to use Windows without paying for research purposes; Apple will not.
* Microsoft will allow you to run Windows on hardware purchased from a third party, or even hardware you've made yourself; Apple will not.
* Microsoft goes out of their way to facilitate research with the help of virtualization technology; Apple goes out of their way to impede it.
As a customer of both companies, Apple's approach here really rubs me the wrong way and definitely contributes to my unwillingness to completely embrace the Apple ecosystem.
To clarify, these are images that you have to download and host yourself. This isn’t a service offered by Microsoft; they’re providing the free license and tools, but you still have to have your own hardware to run the VM.
@zenexer Those expire after 90 days. You are doing it within Microsoft's boundaries. Now imagine circumventing Microsoft's boundaries and the 90 day limitation for convenience.
Microsoft explicitly encourages you to circumvent it so you can run the VM for more than 90 days.
Edit: To be clear, this part:
> These virtual machines expire after 90 days. We recommend setting a snapshot when you first install the virtual machine which you can roll back to later.
If you set a snapshot and roll back to it later, you get another 90 days. Rinse and repeat.
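As a sketch of that rinse-and-repeat loop, assuming VirtualBox and its VBoxManage CLI (the VM and snapshot names here are placeholders):

    # Sketch of the documented snapshot workflow for the free Microsoft test
    # VMs. Assumes VirtualBox; the VM and snapshot names are placeholders.
    import subprocess

    VM = "MSEdge - Win10"
    SNAPSHOT = "fresh-install"

    def take_baseline_snapshot():
        # Taken once, right after first boot, per Microsoft's own advice.
        subprocess.run(["VBoxManage", "snapshot", VM, "take", SNAPSHOT], check=True)

    def roll_back():
        # Restoring the snapshot resets the 90-day clock -- and discards
        # everything done since the snapshot was taken.
        subprocess.run(["VBoxManage", "controlvm", VM, "poweroff"], check=False)
        subprocess.run(["VBoxManage", "snapshot", VM, "restore", SNAPSHOT], check=True)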
Using snapshots to roll back and lose 89 days of work/system changes is quite an inconvenience, though. Circumventing the need for snapshots/rollbacks is still not OK.
Well, the need to take a snapshot--or rather, the activation issue--is described at one point in the license as a "bug". It's a little confusing, but from talking to the maintainer of the VMs on GitHub, it sounds as though they just don't want to dedicate time to "fixing" the activation system. The workaround is to take snapshots. There are actually instructions on the repo for extending the trial without reverting to a snapshot, if I recall correctly, but it's been a while since I've looked into it.
You are completely missing the point with your example and it's in no way similar to what's happening here. Free to use and free to sell are completely different concepts.
Just because someone gave you the right to use for free something that belongs to them doesn't mean you are allowed to sell that right or that "something", or even pass it on for free.
I'm countering the analogy, not the case of Corellium. You're right: the analogy used here misses the point; it's not the same as what Corellium was doing.
The analogy is harmful because it implies that it's unreasonable to expect that, like Microsoft, Apple would allow its software to be used for free for research purposes; it doesn't mention resale of the software. I don't see that as an unreasonable expectation; Microsoft does it without issue.
Edit: Just to be clear, Microsoft is not offering these VMs as a service. You are required to download the image and run it yourself. It’s basically just a license to use Windows for free as long as it’s for a specific kind of research.
Fair use is relevant here but one of the main tests used by the courts is whether or not the use is commercial - this kind of for-profit redistribution is unlikely to pass muster.
> It most definitely is not unethical. Illegal perhaps, but unethical? Please.
I would argue it IS unethical. When speaking of ethics, the intention matters.
If they were just offering these tools to security researchers at cost or for free, then I could agree, but they aren't. They're intentionally doing something that isn't allowed in order to make money.
Unfortunately the service does cost money to run, and it is hard to pay operating costs without some sort of income, so it's hard to fault the service for charging money for a valuable service, especially on HN.
This lawsuit (if it goes to court - many cases do not) would decide whether it is, or is not, "allowed". Still, contracts can contain all sorts of clauses, not all of them will be permitted/enforceable by law. Famously, non-compete agreements aren't enforceable in California but that doesn't stop them from appearing in California employment contracts.
Fair use doesn't apply when you're acting for profit and making complete copies. That's like arguing that setting up a print shop and renting out unauthorized copies of a book is "fair use".
That just means they're not running afoul of copyright laws. By setting up the VMs, they're running afoul of the EULA, and Apple has the right to seek legal remedy for their actions.
What if they bought a phone for every VM they offer? Say they offer 5,000 VMs; then they buy 5,000 equivalent phones. Do you think they would still have a chance?
Probably not. That would be much better legal footing, but still has problems.
The iOS copy isn't running on an actual iPhone, which is a massive violation of the iOS ToS: iOS may only run on an Apple-branded device.
There is also legal precedent upholding this in Apple v. Psystar (2008), in which a company claimed "fair use" when selling its own "hackintoshes" and a UEFI product that made building "hackintoshes" easier.
How did Psystar do? They lost. Terribly. And they had a surprisingly robust legal team, making it all the way to a petition to the Supreme Court (which was denied).
What WOULD work is buying 5,000 iPhones and doing passthrough (a rough sketch follows at the end of this comment). In this setup, you have 5,000 iPhones, all running iOS, and the screen is recorded with touch inputs replicated on the real iPhone. Even if they were jailbroken, the odds of Apple winning would be way, way lower than in this case.
I don't know if the Aereo decision would come into play here -- Aereo ultimately lost, even though each user would "rent" their own unique antenna.
There's also a bunch of details to that case that may make that decision irrelevant here.
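For what it's worth, here is a purely hypothetical sketch of the passthrough architecture described above; every function in it is an invented placeholder, since no real device-farm API is being referenced:

    # Hypothetical passthrough sketch: one real iPhone per session, with the
    # screen streamed out and touches replayed onto the physical device.
    # capture_screen and send_touch_event are invented placeholders.

    def capture_screen(device_id: str) -> bytes:
        # Placeholder: a real implementation would use capture hardware or
        # developer tooling on the physical device.
        raise NotImplementedError

    def send_touch_event(device_id: str, x: int, y: int) -> None:
        # Placeholder: a real implementation would drive an input bridge.
        raise NotImplementedError

    class PassthroughSession:
        def __init__(self, device_id: str):
            # One physical, Apple-branded iPhone per researcher session,
            # so iOS itself never runs on non-Apple hardware.
            self.device_id = device_id

        def stream_frame(self) -> bytes:
            return capture_screen(self.device_id)

        def replay_touch(self, x: int, y: int) -> None:
            send_touch_event(self.device_id, x, y)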
Does anything in copyright law entitle Apple to impose conditions on the use of software it gives away for free online (e.g., that it may be run only on Apple hardware)? It would seem analogous to me writing a book and offering it free online but licensed only to be read in a comfy chair with a nice cup of tea. Maybe I'm in the furniture business and my livelihood depends on everyone's compliance. I bring this up because I wonder if fanboyism is clouding our collective judgment where we might otherwise greet these so called terms of service with the contempt they so richly deserve.
> Does anything in copyright law entitle Apple to impose conditions on the use of software it gives away for free online (e.g., that it may be run only on Apple hardware)?
No. Nothing in copyright law allows this. However, everything in contract law allows this.
Copyright law forms the underlying background situation only. Under 17 U.S.C. § 106, the default is that only the copyright owner may make copies (including, e.g., the copy made when installing the software or the copy made into memory when running it).
However, those exclusive rights may be licensed to others. (Under 17 U.S.C. § 117, a software licensee, or the lawful owner of a copy of the software, may always copy the software to install/run or to make an archival copy.)
Licenses are governed by contract law. Contract law typically consists of an offer, acceptance, and some thing of value traded by each side. Restatement (Second) of Contracts, § 17(1).
The thing exchanged can be a promise, a forbearance (i.e., a license), a conditional promise, or any number of things. Restatement (Second) of Contracts, §§ 71-81.
In this case, the license to copy the software to your internal storage and from there into RAM is offered conditionally. In return, you promise not to run it on non-Apple systems. If you break your promise, the conditions of Apple's license to you are triggered and your license terminates. All of that is governed by contract law.
The backstop to that, though - the legal stick - is that now you're using an unlicensed copy and continuously copying it into RAM to use it. That is what opens you up to copyright violation liability.
I thought there needed to be 2 parties in agreement to enter a contract. I don't know much about how Corellium does business, but if they can get their hands on iOS images without specifically agreeing to a contract - is there a contract?
Perhaps the whole EULA thing is old news but I'm still not sure if I'm bound to every condition stated in a EULA when my cat clicks "Agree"... I thought at one point, the courts ruled that EULAs are pretty toothless. These are honest questions - I have no clue.
> Perhaps the whole EULA thing is old news but I'm still not sure if I'm bound to every condition stated in a EULA when my cat clicks "Agree"... I thought at one point, the courts ruled that EULAs are pretty toothless. These are honest questions - I have no clue.
IANAL but I would be really surprised if courts are as lenient with businesses as they are with consumers. I think it's unreasonable to expect a normal person to be able to go through the EULA of every product they use. I don't think it's unreasonable to expect a business to understand whether the way they are making money is legal.
The issue with this argument is that there can only ever be one instance of my car in existence at a given time. If you take my car, I don't have it. If you download a copy of my software that I give away for free and use it in a way I don't want you to, that doesn't stop me (or anyone else) from using it. That's not to say that there isn't some argument that using software against the terms of a license should be illegal; I just don't think the analogy to physical goods is a very good one.
What's the difference between you owning a car and you having the right to use a communal car?
The core of the legal concept of property is the right to exclude others. This is an academic philosophical lens to view it through, but it's fundamental to understanding how the law treats these things. "Property" isn't a physical thing itself, it's your right to exclude others. (This, by the way, is also a useful lens through which to view Fourth Amendment jurisprudence).
The property you're "stealing" in a copyright infringement case isn't the bits themselves. The property is the right to exclude others from copying, publicly performing, etc. By doing so without a license, you're denying the software licensor the ability to exclude you.
> Contract law typically consists of an offer, acceptance, and some thing of value traded by each side. Restatement (Second) of Contracts, § 17(1).
"some thing of value" - usually termed "consideration".
What is the "consideration" that Apple receives?
If Apple grants no license, the other party cannot do anything with the software, including those uses that would remain forbidden under the license. In granting a license, Apple allows some uses, but retains some limitations. Thus, Apple has only granted rights to the other party, and has received nothing in return that Apple did not have before.
They receive something of value - the promise not to violate the terms of the license.
"But wait!" you say. "That seems circular!" Indeed. The issue comes from a slight ambiguity in the term "license." The word is used both to mean the contract between Apple and the licensee, and the permission granted in that license. Strictly speaking, the former is a "license agreement," but referring to the agreement just as the "license" is commonplace.
Indeed, the first line of the iOS Software License Agreement reads: "PLEASE READ THIS SOFTWARE LICENSE AGREEMENT (“LICENSE”) CAREFULLY BEFORE USING YOUR iOS DEVICE"
Apple gives you permission to use (and, to some extent, copy) iOS. In return, you give Apple a promise to use the software in accordance with the terms of the license. You also give Apple other consideration, such as a waiver of liability in the event that you view indecent or offensive material on your iOS device.
Your point is an astute one. Section 73 of the Restatement (Second) of Contracts reads: "Performance of a legal duty owed to a promisor which is neither doubtful nor the subject of honest dispute is not consideration; but a similar performance is consideration if it differs from what was required by the duty in a way which reflects more than a pretense of bargain."
However, there are other things you give up, as noted above. Additionally, in all practical reality, courts are generally loath to invalidate a license agreement or any other contract for insufficient consideration.
I co-founded a company called App.io which ran from 2012 to 2015. We let people run iOS apps in the browser, and we did it by streaming the simulator from virtualised macOS instances. We were running ESXi on Mac Minis colocated in data centres around the world, and the system ultimately worked really well (though we definitely had scalability issues with such an unconventional setup).
We were quite tight with Apple. We had meetings on campus with senior executives that led to a pilot program with iAd where people could actually play games as an interstitial ad unit. We had employees at Apple who were dedicated to working with us to run this pilot program. Apple ultimately decided to shut down iAd which doomed our collaboration and possible acquisition opportunities.
So this move is really fascinating to me personally. Apple knew how we were doing it and embraced it, probably because we weren't competing against them or undermining the security of their OS.
Apple is depending on privacy and security as a key differentiator from other phones, tablets, and computers, especially as the markets for all three are slowing and new features are harder to invent.
This company undermines that by allowing anyone to find bugs while encouraging them to profit off those bugs instead of working with Apple.
It's not about security and privacy, it's about obscurity and PR.
Corellium is heavily used by security researchers, killing it will reduce the number of bugs that are found in iOS and make the platform less secure overall.
Apple wants to project the appearance they have minimal security flaws and they accomplish this largely by making security research more difficult to perform.
I'm also confused why you seem to think security researchers don't deserve to get paid for their work. Bug bounties have been around for a while now for a reason.
Not GP poster but the contradiction is Apple (wants to be) serious about security. Thus, Apple will pay for (certain kinds of) bugs. Corellium is used to find bugs. Apple is trying to shut down Corellium.
This move by Apple seems to contradict the notion that Apple is serious about security: how are researchers supposed to find bugs without this? (A successful lawsuit would also set a precedent that no US entity could run a very similar service.)
Sure, there are workarounds (e.g., buy a pile of iPhones), but why is Apple making it harder to secure their product?
>“Although Corellium paints itself as providing a research tool for those trying to discover security vulnerabilities and other flaws in Apple’s software, Corellium’s true goal is profiting off its blatant infringement,” Apple said in the complaint. “Far from assisting in fixing vulnerabilities, Corellium encourages its users to sell any discovered information on the open market to the highest bidder.”
Why doesn't Apple simply outbid whomever is outbidding them? Why is Apple entitled to security research at anything less than the current market rate?
I think it is entirely naive and optimistic to think that Apple's lawyers aren't perfectly capable of framing things in the best light, even when their goals (or even a subset of them) may be less favorable.
I highly doubt that is their concern; I think it's to paint them as evil. If the case were just about whether Corellium may emulate Apple's hardware, that would be simple, but tacking on that Corellium also indirectly enables the bad guys in this age of constant hacks and privacy invasion is an additional burden that Corellium has to bear.
For the same reason a host is entitled to prior disclosure of its unfixed vulnerabilities.
Selling vulnerabilities on an open market should be outlawed. Either disclose them publicly for free, or participate in a bounty program run by the software owner. People selling undisclosed vulnerabilities should be considered accomplices of the people who then use them to break into systems.
We went through this multiple times. The CSS algorithm for DVD encryption was famously litigated and found to be free speech when someone printed it on a t-shirt.
I think there's this weird schism at times where people perform all sorts of convoluted hoop-jumping to decide whether hacking is bad or good depending on their perspective, the target, and a host of other variables that really do little more than inject subjectivity into debates.
Witness the Apple fans who take a certain glee in another vendor's vulnerabilities and then bemoan attempts to find vulnerabilities in the Apple ecosystem. And, to be quite clear, "Apple" can be replaced with many major ecosystems.
Just the other week people were bemoaning Google's Project Zero for calling out vulnerabilities in iOS. "Not fair, I bet they don't do that for Android, Chrome, they're doing it for market advantage!" - except that Project Zero absolutely _does_ feature Android and Chrome vulnerabilities.
I don't think it is convoluted: either you disclose the vulnerability to the vendor so that they can fix it, or you publish it publicly. Selling it without disclosure to someone whose goal is just to exploit it is criminal.
The convolution is in somehow shoehorning the notion of selling a secret into the notion of free speech.
> Google's Project Zero
Do you understand that the projects that aim at improving security are fundamentally different than the ones aiming at exploiting flaws?
The problem is that the legal equation isn't that simple, and it is subjective by virtue of the extent to which it is subjected to persistent arbitration. This just leads to muddied waters and attempts to contextualize the discussion favorably for one actor over the other. Black-and-white reasoning isn't a luxury afforded to those who find themselves entangled in these situations. Some examples:
Does it stand to reason that attempting with purpose to discover exploitable flaws in and of itself makes you a bad actor? (We've sentenced minors, academics, and "white hats" using this argument.)
What if someone wrote software that had a legitimate use, but made use of an undisclosed flaw that is then sold to many consumers and reverse engineered, revealing the flaw to larger constituents? What if bad actors merely used a tool out of its original context to exploit a side effect? Does this constitute intent? (This was tried, and the individual in question was jailed.)
If an open source project collects money from a bad actor unknowingly and then discloses through a PR or official release the existence of a flaw previously unknown, should they be culpable? (Waiting to see this one play out; it hasn't yet, but I have no doubt it will. I was kind of expecting it as a result of event-stream.js.)
This all just speaks to the concept of subjectivity vs. objectivity in the litigation of this concept. The point where it is subjective rather than objective is the point where it becomes an ethical discussion, and is therefore subject to the principle of fallibility and the human uncertainty principle. tl;dr, if you can't strip motive, investment, and bias from the argument, it can't be objective by definition.
I think all your examples would work fairly well with my initial proposal: if you are going to disclose a flaw, either do it publicly or reserve it to the actor who may be able to correct it. Only these two courses of actions would provide a legal shield. Revealing a flaw or an exploit to another actor would engage your responsibility if this actor behaved criminally. I don't know the English legal term, facilitator? Accomplice? That's how we charge people who, for instance, provide otherwise legal help to people they know are terrorists or criminals.
"Does it stand to reason that attempting with purpose to discover exploitable flaws in and of itself makes you a bad actor?"
No. I think a lot of past litigation of such case were really misguided.
"What if someone wrote software that had a legitimate use, but made use of an undisclosed flaw that is then sold to many consumers and reverse engineered, revealing the flaw to larger constituents?"
Illegitimate, unless the flaw was previously disclosed in a responsible way to the vendor (which basically means giving them time to solve the issue).
"What if bad actors merely used a tool out of its original context to exploit a side effect?"
If the tool had an exploit built-in, the author's responsibility is engaged, not otherwise.
"If an open source project collects money from a bad actor unknowingly and then discloses through a PR or official release the existence of a flaw previously unknown, should they be culpable?"
Of course not, but we live in a stupid enough universe for such a thing to result in liability anyway.
No, I don't realize that at all. It may make sense in a country where money is free speech and where companies are people, but outside the US, it makes little sense to say that selling a secret is protected by free speech.
Disclosing one, yes. Selling it secretly so that it can result in exploits, certainly not. Responsible disclosure is a thing. I doubt that people selling credit card numbers and fake identities are protected by free speech.
Journalists and newspaper editors also do not work for free. More importantly, speech in a newspaper is still protected, even if a copy of the newspaper costs money.
What is the functional difference between selling a copy of a newspaper costing a lot of money per issue detailing the exploit and selling the exploit some other way?
Really, your arguments make no sense. First, security flaws are not secrets. If I put a number in a box in my house and you break in and take it, that is "stealing" a secret (e.g., a credit card number). But if I make a million boxes with the same number in them and sell them, there is no expectation that you will not look inside the box.
Moreover, selling "information that you would rather not be public but is not legally considered a secret" is completely legal in the US. There are lots of books published and sold containing information that some company or person would rather keep secret.
>Responsible disclosure is a thing.
Not a legal requirement. More of a gentleman's agreement, after companies sent the law after security researchers and researchers responded by selling zero-days or releasing them anonymously.
The 1A is about freedom to express any idea or opinion, not about literally publishing or saying whatever you want. Purposeful lying, for instance, is not covered under 1A. Finding bugs for the purpose of selling them to people with malicious intent is malicious itself, and is very likely not covered under 1A. All rights imply a commensurate duty. You don't have freedom of speech for the purposes of harming others.
You can legally sell security bugs, as the NSA buys them all the time.
However, running a platform for the explicit goal of breaking security? Are you copying copyrighted code to achieve this goal? You're on shaky ground there.
> Why is Apple entitled to security research at anything less than the current market rate?
This is not only about Apple. This is also about their customers. You are essentially advocating that people should sell exploits in the black market, legal disclosure be damned.
There's a strong difference between being in favor of the _right_ of freedom of speech but against specific instances of speech. They are completely different. I'm in favor of their right to go to the black market, but I'm completely against them actually going to the black market.
This reminds me of the “DeCSS” case, where claims were made that a specific number was not allowed to be spoken or transmitted or otherwise shared. ( https://en.m.wikipedia.org/wiki/DeCSS )
Obviously the DeCSS people lost that argument.
Banning someone from saying “by doing X you can bypass security feature Y” is going to be a difficult one to get past the Supreme Court, at least in the USA.
So if they have a right to go to the black market, then Apple has no case. Or rather Corellium might have a case against Apple for tortious interference.
Total freedom of speech would imply that you're allowed to make movies about cartoon mice and publish information about military designs. No country has total freedom.
Corellium is an amazing product and I hope they win. They fill a massive gap that Apple is not addressing. Apple is not very good at building dev tools (cough, Xcode monolith, cough), so when somebody enters the game and gets devs excited about their platform again, I think Apple should embrace it and buy them.
Can anyone explain the exact nature of the infringement here? Presumably illegally copying the software from a device to a vm? Is there a logical strategy to counter this claim by Corellium?
Interesting case. Corellium should be allowed to sell to bona fide security researchers; Apple even admits it themselves: "Corellium is not selectively limiting its customers to only those with some socially beneficial purpose."
That said, who gets to be a bona fide security researcher? I'd love to see how Apple would define that.
The fact that they encourage vulns to be sold on the open market is likely a problem. They might have to shut that down and move to a wink-wink mode.
It may, however, be that Apple thinks that pointing out they aren't selling to only security researchers or other "socially beneficial" people is easier than getting into a fight about fair use, which would also bring bad PR.
This comes hot on the heels of an announcement from Apple that they are starting up a new program to allow select security researchers access to iPhones with a majority of security features disabled. I wonder if it is somehow related.
How does this compare to Sony vs Bleem? Bleem was a commercial Playstation 1 emulator that Sony sued. Bleem won, but the company shut down due to the cost of the legal fees.
Bleem didn't violate any of Sony's patents or rules. It allowed unauthorized code to run through the use of a bug, but it didn't, you know, come with illegal copies of the games, or copy the PlayStation ROMs. It didn't copy any Sony code. It let users use the console in a way Sony didn't like, but didn't ACTUALLY "harm" Sony in any way.
This is different. In the US, bugs are actually "legal" to buy and sell, and protected by the First Amendment. However, how you USE those bugs is a different matter.
What is happening here? Corellium has copied iOS code, is running it on non-Apple hardware via virtualization, and justifies what would typically be a majorly illegal act (imagine if HTC made a phone running iOS) by invoking "security researchers."
> There is no basis for Corellium to be selling a product that allows the creation of avowedly perfect replicas of Apple’s devices to anyone willing to pay.
It's hard to be sympathetic when Apple's business model is built around preventing users from using the software they pay for in ways Apple does not approve of—sometimes you can frame this around profit, but the problems hardly stop there (e.g. they exercise political control of their platform, too). If this isn't a legitimate market, I don't see any good that comes from making this market illegal.
That said Corellium doesn't seem to be aimed at anything good, either, so this should be fun to watch.