According to IMG's CEO, Apple has never not been an IMG customer (referring to the period between 2015 and 2019). Unfortunately that quote, along with the article, has simply vanished. It was said during an interview with a British newspaper/website, if I remember correctly.
"On 2 January 2020, Imagination Technologies announced a new multi-year license agreement with Apple including access to a wider range of Imagination's IP in exchange for license fees. This deal replaced the prior deal signed on 6 February 2014." 
According to Apple's official Metal feature set document, all Apple A-series SoCs, including the latest A14, support PVRTC, which stands for PowerVR Texture Compression.
It could be a custom GPU, but it still has plenty of PowerVR tech in it, just like Apple's custom CPU is still an ARM design.
Note: I am still bitter at what Apple has done to IMG / PowerVR.
I'm unfamiliar with this; are you bitter about a lack of attribution, given that they produced much of the IP Apple's GPUs are built on?
"Apple expects that they will no longer be using Imagination’s IP for new products in 15 to 24 months."
But at no point did Apple announce they would stop supporting PVRTC, nor did they issue any deprecation notice.
That announcement caused IMG's stock price to fall by nearly 80%. IMG was later sold to a China-backed private equity firm. (Which somehow had some tiny connection to Apple, but that is conspiracy territory, so I won't go into it.)
And if you look back now, Apple was either not telling the truth or lying by omission.
Which is what got me to scrutinise every detail when they had their battle with Qualcomm. And the picture was very different from what the mainstream media tells you.
From reading this, I'd be mad at the bogus/flimsy stock market (and the restrictions on behavior it imposes on companies to satisfy ever more greedy stockholders) rather than Apple here.
Tangentially, it's far easier to be an ethical underdog than it is to be an ethical winner. But Apple isn't even trying.
Of course, when companies are large enough, shareholders' interests spill over into the regulator's domain, allowing interesting turns of events in the form of bailouts.
But there's a significant economic cost to society when large companies fail catastrophically like this. Which is why size matters. Large companies mean putting all of our economic eggs in one basket, which is exactly what systems like capitalism are supposed to avoid.
Reviving a corrupted entity to allow it to continue must have a cost that needs to be weighed against the perceived loss of its benefits.
Apple took Qualcomm to court over a 4G/5G patent dispute, and made a number of claims that were false, inaccurate, or lying by omission.
I initially sided with Apple. Quoting Tim Cook:
“Despite being just one of over a dozen companies who contributed to basic cellular standards, Qualcomm insists on charging Apple at least five times more in payments than all the other cellular patent licensors we have agreements with combined,”
Five times more than all the others combined. That is ridiculous. We knew Qualcomm was expensive, but we never had an actual number or an up-to-date figure. This allegation was repeated in every mainstream newspaper and website. There were at least six other major companies (Nokia, Samsung, Huawei, ZTE, Ericsson, Intel/LG) with broad wireless patent portfolios. That is, Qualcomm = 5x while each other company averages x/6: a 30x ratio if x were divided equally among them.
This made people furious, and me too. Despite knowing Qualcomm had the best portfolio, this kind of multiple was unheard of.
That is, until you sit down and start to run some numbers. It doesn't make any sense. You can add up the low-end and high-end estimates of the other six companies' patent licensing costs from their investor notes, round generously, and it would still be an order of magnitude off from the figure Apple was trying to paint.
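To make the arithmetic explicit, here is the back-of-the-envelope version (normalised, purely illustrative numbers, not real licensing figures):

    #include <stdio.h>

    int main(void) {
        /* Normalise the six other licensors' combined fees to 1. */
        double others_combined = 1.0;
        double qualcomm  = 5.0 * others_combined;  /* "five times ... combined" */
        double per_other = others_combined / 6.0;  /* six licensors, equal split */
        printf("Qualcomm vs one average licensor: %.0fx\n",
               qualcomm / per_other);              /* prints 30x */
        return 0;
    }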
Apple argued Qualcomm was double dipping because they were earning profits from the modem itself while still charging for patents. As if the design of the modem itself doesn't cost anything?
And numerous other claims.
In the end, instead of getting into a PR battle with Apple, Qualcomm showed immense restraint. I guess they knew they would win. And since then I have had tremendous respect for the professionalism of their CEO, Steve Mollenkopf.
The only person who bothered to report all the facts from both sides was Shara Tibken from CNET, who actually sat in on every Apple-Qualcomm trial (there were multiple) and tweeted as well as posted updates. You can read about it here to make up your own mind.
I know it's business, but they seem like a pretty bad customer by most standards.
People don't have to interact with that side of the company, plus the public image is super polished.
TL;DR: What you're saying seems perfectly in character.
If you wanted to use one QCOM patent, you'd have to pay for a suite of patents that includes IP you have no interest in using.
If you want to use a particular technology, the pricing would depend on the total price of your handset -- even if the difference in handset price has nothing to do with the Qualcomm technology.
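To illustrate the pricing model being described, a per-device royalty computed from the whole handset price, here is a hedged sketch; RATE and CAP are made-up placeholders, not Qualcomm's actual terms:

    /* Illustrative per-device royalty: a percentage of the whole handset
       price, whatever the licensed part actually contributes.
       RATE and CAP are assumed values for the sketch only. */
    double royalty(double handset_price) {
        const double RATE = 0.05;   /* assumed 5% of the device price  */
        const double CAP  = 400.0;  /* assumed cap on the royalty base */
        double base = handset_price < CAP ? handset_price : CAP;
        return RATE * base;
    }

Under a scheme like that, a $200 handset and a $350 handset containing the identical modem owe different royalties, which is exactly the complaint.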
Their business is tough because they basically spend years inventing technologies and then more years licensing them and defending their IP in court. Still, if you have to draw a moral/ethical conclusion I'd suggest that (at least in part) Qualcomm brought its 2016 Apple woes upon itself.
> Apple has already been recruiting Imagination workers for months, its most prominent get being COO John Metcalfe. Until now, hires were being sent to Apple's main offices in London, or overseas to California.
> The company now has over a dozen listings for graphics-related jobs in South Hertfordshire, the region in which St. Albans and Kings Langley are based.
Edit to add quotes.
I remember the IMG stuff being full of bugs
I'm a little confused: why couldn't Apple support PVRTC after they stopped using PowerVR GPUs in their devices?
PVRTC is a texture format. Its latest incarnation (PVRTC2) was introduced in 2013. Being able to load and render the texture format (whether efficiently or not) doesn't appear to require a licence. I used to use PVRTC assets when developing for the original iPod touch. I can imagine Apple retaining support for the format on new hardware for backwards compatibility reasons, such as being able to load and run apps with PVRTC resources
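For the curious, uploading one of these under OpenGL ES looked roughly like this (a minimal sketch from memory, using the GL_IMG_texture_compression_pvrtc extension; data and size are assumed to hold a pre-compressed PVRTC payload):

    #include <OpenGLES/ES2/gl.h>
    #include <OpenGLES/ES2/glext.h>

    /* Upload a pre-compressed PVRTC 4bpp RGBA payload. PVRTC v1 requires
       square, power-of-two textures, so dim is both width and height. */
    void upload_pvrtc(GLsizei dim, const void *data, GLsizei size) {
        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glCompressedTexImage2D(GL_TEXTURE_2D, 0,
                               GL_COMPRESSED_RGBA_PVRTC_4BPPV1_IMG,
                               dim, dim, 0, size, data);
    }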
It could very well be the case that PVRTC resources are no longer rendered using the same tech on modern Apple GPUs, and general performance improvements since 2013 make it a non-issue
I don't think support for the PVRTC format implies that a PowerVR GPU design is in use or that "plenty of PowerVR tech" is present in modern Apple GPUs
We could be at the point where the PVRTC patents are still valid, and Apple is paying those royalties, but there's no IMG RTL left in the GPU.
We could also be at the point where Apple somehow got a perpetual license to IMG's RTL, and the bones of the Apple GPUs are still very, very clearly IMG GPUs, but they feel they can essentially write the company off.
What I find interesting is that PVRTC is the very last proprietary IMG format that Apple GPUs support.
IMG introduced an improved PVRTC2 texture compression format in 2012. Yet no Apple device has ever exposed support for it.
It looks like Apple was already thinking ahead back then and wanting to avoid supporting features that would require extra royalties.
Dodgy business practices.
In the end you can't just get new customers on demand - you have to earn them.
Qualcomm bought Adreno from AMD (interestingly, an anagram of Radeon).
Samsung struck a deal to buy from AMD.
Intel moved from PowerVR to their own designs.
And just like that we're out of major chip makers other than Apple or the occasional specialized SoC that probably doesn't sell in huge volume or rake in the cash.
Making things worse, their really cool patents related to early geometry culling expire soon.
Yeah, I've heard rumors that ARM+Mali is cheaper than just ARM for the licensing fees.
To me, it sounds like it might mean 32-bit ALUs can be used as two 16-bit ones; that's how I would approach it, unless I'm missing something? The vectorization can also happen at the superscalar level, by borrowing the instruction queue concept from out-of-order designs: buffer operations for a while until you've filled a vector unit's worth, align the input data in the pipeline, and execute. A smart compiler could rearrange opcodes to avoid dependency issues, and insert "flushes" or filling operations at the right time.
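The packing idea is easiest to see with integers in plain C. This SWAR sketch is only an illustration of two 16-bit lanes sharing one 32-bit ALU; real GPUs do the FP16 version in dedicated datapaths:

    #include <stdint.h>

    /* Add two pairs of 16-bit lanes packed in 32-bit words. Masking bit 15
       of each lane stops carries crossing the lane boundary; the XOR then
       restores each lane's top bit. */
    static inline uint32_t add_2x16(uint32_t a, uint32_t b) {
        uint32_t sum = (a & 0x7FFF7FFFu) + (b & 0x7FFF7FFFu);
        return sum ^ ((a ^ b) & 0x80008000u);
    }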
Desktop Nvidia simply converts them to 32-bit floats up front, so performance is kept but memory is wasted; the Raspberry Pi, on the other hand, does the 16 -> 32 bit conversion every time, resulting in horrible performance.
I still have to test the engine on the Jetson Nano with half floats, but I'm pretty sure I will be disappointed again, and since the Raspberry Pi doesn't support it I need to backtrack the code anyway!
After some further research I heard Snapdragon has 16-bit support in hardware, and hopefully this is where we are heading! 32-bit is complete overkill for model data; it wastes memory and cycles! 16-bit for model data (vertex, normal, texture, and index) and 8-bit for bone index and weights, something like the struct below. Back to the '80s and '90s!
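A hypothetical packed layout along those lines (field names and widths are illustrative, not taken from the engine linked below):

    #include <stdint.h>

    typedef uint16_t half_t;  /* IEEE 754 binary16, written by the asset pipeline */

    typedef struct {
        half_t  position[3];    /*  6 bytes */
        half_t  normal[3];      /*  6 bytes */
        half_t  uv[2];          /*  4 bytes */
        uint8_t bone_index[4];  /*  4 bytes */
        uint8_t bone_weight[4]; /*  4 bytes, weights quantised to 0..255 */
    } PackedVertex;             /* 24 bytes, vs 64 with float32 attributes
                                   and 32-bit bone data */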
This is the last memory & performance increase we can grab now at "5"nm without adding too much complexity!
You can try the engine without half float here: http://talk.binarytask.com/task?id=5959519327505901449
Pascal has native FP16 operations and can execute 2 FP16's at once ( https://docs.nvidia.com/cuda/pascal-tuning-guide/index.html#... )
BUT, and this is where things get fucked up, Nvidia then neutered that in the GeForce lineup because of market segmentation. In fact, it's slower than FP32 operations: "GTX 1080’s FP16 instruction rate is 1/128th its FP32 instruction rate" https://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-...
You can work around this by doing most of your math in local coordinate spaces though, so it's fine for games.
The target architecture was a 33MHz 486 PC running in "real" (i.e. sixteen-bit) mode. While hardware floating point was sometimes available (DX systems), it was quite slow.
Perf? Which hardware actually has support for fixed-point integer math (e.g. sqrt, inverse sqrt, ray-triangle intersection acceleration, etc.)?
Maybe the vc4-dri.so driver hasn't implemented it properly yet, one year later? But does it still save the RAM?
So the texture UVs could be accelerated? But most of the data is in vec3... (vertex and normal)
For vertex arrays, while in theory the driver could load vec2/vec4 F16 attributes into the vertex shader as-is, AFAICT the Mesa driver does not attempt to do this; they get converted to F32 as they are loaded from main memory. There would be downsides to getting rid of this conversion... in particular the hardware does not support F32->F16 conversions when loading attributes, so the driver would have to recompile the shader if you switched the vertex arrays to F32 (or perhaps more likely recompile the shader when it realises you are using F16 vertex arrays rather than F32).
In any case, the hardware does support zero overhead F16->F32 conversions when loading attributes, so you should not see a performance drop when switching vertex arrays to F16! The performance should go up slightly as less data needs to be loaded from main memory. If you're seeing an 80% performance drop something has gone terribly wrong!
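In GL terms, feeding F16 attributes just means declaring them as half floats and letting the hardware widen them on fetch. A minimal sketch, assuming GLES 3.0 (on GLES 2 the equivalent is GL_HALF_FLOAT_OES from GL_OES_vertex_half_float; loc and the 16-byte stride are placeholders):

    #include <GLES3/gl3.h>

    /* Position as three half floats; the F16 -> F32 widening happens in
       the attribute fetch hardware, not in shader code. */
    void bind_f16_position(GLuint loc) {
        glVertexAttribPointer(loc, 3, GL_HALF_FLOAT, GL_FALSE,
                              /* stride */ 16, (const void *)0);
        glEnableVertexAttribArray(loc);
    }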
I believe this is a difference between the A14 GPU and the M1 GPU; the former's 32 bit throughput is half its 16 bit throughput, whereas on the latter they are equal.
Also, I noticed that IMGTEC is going to write an open source Mesa driver; I wonder how much code the two drivers will be able to share.
Even with a custom design, those designers are likely to think along the same lines (and Imagination Tech IP is pretty good in its own right).
In any case, I doubt Apple would pay that much money if they didn't need rights to the IP.
The only real question is why they didn't acquire the company outright.
Apple GPUs support the proprietary IMGTEC compressed texture format that nobody else uses, and they use the same tile-based deferred rendering method.
Apple's deception was so good that nobody realised they were shipping custom GPUs. Even with hindsight, nobody can really tell when Apple shipped its first custom GPU. Apple may have even shipped GPUs that were a mixture of IMG and Apple RTL.
But at this point the internals are far enough apart that I doubt much code can be shared.
Alyssa's Panfrost work is leaps and bounds ahead of anything geohot has ever personally done.
Pretty much everyone with our profile gets started at around that age (I was porting Linux to a weirdo ISP router at 14, and working on Wii hardware and homebrew, the first "well known" thing I did, starting at 16-17), but somehow geohot has marketed himself as some kind of genius when he's one of the more mediocre hackers I've known. He's not bad these days, but he has always represented his skills as being way better than they actually are, and taken solo credit for work that involved other people, his entire career.
I've seen that guy work in person and sometimes it felt like he was just streaming keystrokes into the keyboard buffer without regard for any visual feedback. That time at CCC, he was working on exploiting the Wii U. A bit later he called me over, and I was staring at a blank assembly file. He had just completed a full stack Wii U exploit, from WebKit to the kernel to the security CPU to its kernel, and expected me to dictate some assembly to him to dump the eFuses where the keys are held. I stumbled something like "Uuuuh.... I guess we need to deal with banks.... well how much memory can you dump? Uh.... yeah let's just treat it as linear.... errrr.... make a loop? I guess load a register with 0 and loop until, uhh... 0x1000? Yeah and you need to set the top bit and store that into, er.... 0xd8000...dc? I think? Let me check... uhh yeah and then read the data word from e0. I think. And put that somewhere you can dump."
Made me feel like a complete idiot :)
Something along the lines of "there's no bad publicity": out of the millions of TV watchers and his YouTube rap video watchers, these bring in a lot of people (at least teens) interested in the scene, with a higher percentage of those who remain going deeper into the technical parts of the hacks, and then learning about fail0verflow & co.
Many only then realized what the big picture actually was (insert podium meme).
So, for what it's worth, never underestimate how many people REALLY know and appreciate a lot more fail0verflow's work. Thanks again!
P.S. Hopefully PS4 will catch up to the good old days of PS3. Pepperidge Farm remembers. Also, last year there was a pseudo-consensus that held-back PS4 hacks would be released only after the PS5 appears. (Pretty please.)
Someone mentioned the person that stole his work as someone to look up to. It seems like a pretty justifiable personal grudge.
A rare talent, from what I've seen, who also gets respect from people way smarter than me. That's always a signal I look for when I'm out of my depth.
I hope this project works out for you. There's always so much BS on the people side of OSS, something I learned the hard way long ago. But talent is talent, work speaks for itself, etc.
You've certainly chosen a general enough topic/niche to draw lots of generic opinions and the whole peanut-gallery thing. So I predict this road will require quite a bit of blinders to operate without constant distraction from the non-contributing opinionated mediocrity. That's the only way to stay productive in such an environment, IMO.
Guess that's why he decided to start his own company :)
First time he appeared in the PS3 jailbreak scene I was like "wait, is this a Romanian?! Some boy-genius expat?!" because George is a common first name here and (fun fact) Hotz means "thief". Actually it is written hoț but Romanian teens sometimes write tz instead of ț. He was meant to be a key thief, I guess.
I still wonder if they got to examine the video in the courtroom...
It would seem easier to get Linux booting (by just sending the same commands Apple's software does) before worrying about 3D acceleration and shaders...
From the iPhone 7 Linux port we know that Apple is nothing special in this regard (the framebuffer works there).
(I'm more of a software kind of guy, I might be entirely wrong here tho)
Which is fine until you need to change the mode, or start doing heavy BLT/etc operations. At which point your machine will feel like something from the early 1990s.
So yes, you can get a full Linux DE running with Mesa and CPU GLES emulation, but it's not really particularly usable.
Intel has simply failed.
I used to dual-boot Windows for school, and when I first switched I kept an old laptop's backup in Boot Camp. Cross-platform software is much more ubiquitous than 5-10 years ago. For Linux I always used another box or just ran a VM. Nowadays my laptop can SSH or Remote Desktop into a more powerful machine. I have a custom-built Windows box, a custom-built NAS running FreeNAS (FreeBSD), two RPis running Raspbian, and a not-always-Linux box based on old hardware. There is a machine, big or small, for everything I do or play with. My VPN allows me to connect from anywhere.
What are you guys doing that you have to install linux instead of running a VM or remotely connecting to a linux box? If it's just for the sake of knowledge I can understand it.
Apple's touchpad experience in macOS is the best on the market, and it is always very different in Windows and Linux. The XPS, Lenovo, and smaller vendors make killer Linux/Windows laptops with many more options than Macs.
One reason is to devote an older machine to Linux. M1 models are still new, but they won't be in 3-4 years, and in 6-7 years they might not be supported by then-new macOS releases at all (only security fixes). When someone gets the M4 or MX, they might want to retire the M1 models, or use them in server/supporting roles with Linux on them.
Others just prefer Mac hardware (Linus Torvalds himself was a big fan of the early MacBook Air, which he used with Linux, and has welcomed the M1 model as well) but Linux software, for various reasons.
I, for one, occasionally wish to boot into full native Linux for some stuff. I have several Linux computers I use too, but I usually use those headless, whereas a Mac laptop would be great with good, supported Linux.
>What are you guys doing that you have to install linux instead of running a VM or remotely connecting to a linux box? If it's just for the sake of knowledge I can understand it.
Besides repurposing an old Mac, even running a desktop environment as your primary or secondary driver is much faster and more usable on bare metal (as opposed to a VM) and, of course, impossible with remote boxes (X forwarding aside, which is not workable as a raw native desktop).
If I buy a 2022 Macbook Air, I'd like to repeat this process for it in 2032. Moore's law is dead, treasure your hardware.
I don't see how it's superior. For that money you can buy a Dell XPS or a ThinkPad of the same quality or better.
> but it falls short on every other metric
So far I haven't seen any evidence of that. That may have been true in 2011-2012, when not many laptops had hi-res screens with decent panels, but it's definitely not relevant today.
I'm not sure I would say that the XPS is equal or better quality than the MBP though.
The trackpad isn't better than the Mac's and neither is the case construction. I prefer the aluminium unibody to the carbon fibre (rubberised plastic) stuck to aluminium plates of the XPS.
I really like the XPS though, and would buy another if it broke. That is, I wouldn't buy a Mac to run Linux, but when Apple finally stops supporting my MBP I see no reason not to install Linux on it and extend its life. Especially as I like Linux as a desktop OS.
It varies by year and place though.
As for Apple, you buy a 3-year warranty with AppleCare for less than the Surface's costs, and that includes accidental damage for a small excess fee per incident.
What? Maybe this is true for the M1-based MacBooks now, but I bought my Surface Book 3 with my Microsoft Business account (which was made for me for free when I bought my Surface Book 2 by a manager at the Dallas, TX Microsoft Store) and for $249 I got a three year warranty with three accidental damage incidents, also with no deductible. Yes, I had to pony up for a year of Office 365 for $69 to get the normally-$349 warranty for $100 off, but I already use Office 365 yearly sub anyway, so adding on an extra year of something I was going to buy anyway, for $100 off a 3-year, 3-incident, deductible-free warranty isn't exactly a "hard sell".
Although the only problem I've ever had with Microsoft's hardware products was my original Surface Book, and I'll admit, that was a miserable experience, but I had zero problems with my SB2 and none so far with my SB3.
Having said all that, I might actually switch over to Apple if they had a 15" 2-in-1 like my SB3. To functionally replace my Surface Book 3, I would have to buy a $1499 12.9" iPad Pro, a $129 Apple Pencil 2nd gen, and a $2299 MacBook Pro M1. Adding AppleCare+ and tax brings the total up to $4670. But I would also have the problem of lugging around two devices instead of just one, as I do with the Surface Book 3. My totally pimped-out Surface Book 3 + warranty + Office 365 1-year sub was $4350. That's the Quadro RTX 3000 Max-Q version, too.
Right now the Apple products have the performance advantage, I can't deny that. But they didn't have it when I bought my Surface Book 3, and Microsoft isn't going to sit on their ass. Their ARM-based Surface Pro X is following Microsoft trends perfectly: the first version of all their hardware ranges anywhere from a total joke to slightly underwhelming, but by the 3rd and 4th iterations Microsoft has their shit together. I fully expect that in 2-3 years an ARM-based Surface Book 5/6 will be a serious MacBook Pro contender, while also winning over users like me on the 2-in-1 aspect.
As for how many people run Linux on bare metal on a Mac: that's hard to guess. Likely not a ton. But then... not many people run Linux as their primary desktop/laptop OS, so that's not surprising.
The "fiasco keyboard" period aside (which is now 1.5+ years over, and which reliability aside, some do prefer its feel), Mac keyboards have been quite fine.
But one famous person who used to buy MacBooks to run Linux as his primary OS (and used an Air as his daily driver for years) was a certain Linus Torvalds.
(Previously he also used an Apple G5 tower for his main desktop).
I have a 10-year-old Bluetooth Magic Keyboard and a 10-year-old USB Apple keyboard, and both work perfectly with Windows.
Be warned, they are difficult to pair with Windows over Bluetooth the first time; it usually takes me 10 tries.
You also pay a hefty premium for MacOS if you are just running Linux.
You can't run Linux on the M1 yet, but will this still be true when you can? How does the price/performance of the M1 Air compare to x86 laptops? (Or the Mac mini vs other small-form-factor PCs?)
There are features unique to Linux, like cgroups and KVM, that are nice to have on my main OS. The kernel itself is incredibly configurable and extensible, whereas on macOS the kernel is locked away and Apple has deprecated third-party kext support.
Another example is Linux's native FUSE support. I can use an updated version of sshfs on Linux that isn't available on macOS and solves problems I had with sshfs on macOS. I have to use closed source software to even use FUSE on macOS.
Using Linux on a VM or over SSH to do things I couldn't do on macOS was frustrating, and it's a breath of fresh air to run Linux natively.
Also, I prefer Plasma Desktop to the macOS desktop shell. KWin is scriptable, and KDE is really nice.
> Apple's touch pad experience in MacOS is the best in the market and it is always very different in Windows and Linux.
I disagree with this. Libinput is configurable and its adaptive acceleration profile feels more natural to me.
If Linux had software support for it, it would probably be the best platform for Linux (server) software development.
Same reason you run Linux on a Dell or Other laptop. Because you prefer Linux. In Apple's case, the M1 is particularly appealing right now.
For me personally, this project is mostly interesting as insurance for if/when Apple ships a version of macOS I don't care for, or stops supporting the M1. That's likely... 10 years out, but you never know.
That said... it would be fantastic to have Linux as a backup.
For "why not Linux on not-M1," I vastly prefer reading and writing AArch64 assembly to amd64... And the battery-life+performance claims of M1 are certainly extremely exciting, coming from a Pinebook Pro (which has the battery life, but pretty wimpy performance by comparison).
I assume it won't be as long under Linux, but it'll still be longer than a lightweight, high-performance x86 laptop's.
Because OSX, besides being a user prison, is certainly far from being as good as their hardware.
I've got a 2020 iMac here; I log into iTerm2, and MacPorts has everything I want, installed cleanly and cooperatively alongside the Apple frameworks and kernel.
I'm running Firefox, Thunderbird, and Office 365, all native. I also have supplier-supported clients for iMessage, WhatsApp, Telegram, Skype, and Spotify (whether Electron or not).
I have developer utilities like Dash, which is one of the best documentation browsers ever; I have native-UI Neovim with VimR; I have menu bar utilities for monitoring, and cloud sync with Google and Microsoft (and iCloud if I wanted).
How exactly am I in a user prison? I'm running the Beta Big Sur and the only bug so far is that the Karabiner kext doesn't load.
Nope, fact check: true. Unsigned software will not run on a Mac without having to jump through hoops to enable each app.
I predominantly use open source software, and a lot is not signed.