It's e-waste, or at least an interesting artifact in computer design history.
The frameworks and GUI Apple built on top of Darwin are not open source.
Also, releasing their software as open source goes against their business model, since they wouldn't be able to restrict its use to their own hardware.
People want Apple software, but that only comes with their hardware.
Systems on Chip are called System on Chip because they don't expose I/O ports (like PORTA through PORTD on an AVR) to the pins and thus can't interface RAM, ROM, or other peripherals natively, but only over narrower buses through internal interface chips/logic.
As a rule of thumb, every surface-mounted "CPU" between 80 and 200 MHz, across industries, was called an SoC from 2003 through the 2010s.
Those CPUs/MPUs that do expose I/O directly to interface with SDRAM, an LCD, or the equivalent of the north bridge chip on a PC were usually called either MPU or CPU, often MPU because they integrate some peripherals. Those were roughly 200 MHz and above, including PowerPC for NAS boxes or x86 for PCs, where heat or footprint were not concerns.
Those under 80-120 MHz were often called microcontrollers (uC). They were also often self-contained like an SoC but didn't count; they have also gotten much higher clocks recently, but still don't.
Intel and AMD have referred to their systems as SoCs since Apple started doing custom SoCs, and that might give you the impression that it's new and better. That's not true.
SoC means the CPU's main I/O ports are not exposed to pins.
Intel and AMD refer to their systems as SoCs since they integrated the north bridge (which includes the memory controller) into the chip; it has nothing to do with Apple.
> CPU’s main I/O ports
That's a weird definition. Most architectures don't have "I/O ports" and use memory-mapped I/O exclusively. And even if we read that as "any kind of I/O pins" and go back to the
> can't interface RAM, ROM, or other peripherals natively
part: the vast majority of mobile SoCs in the 2010s use external RAM (and storage). The memory controller is onboard and the external connection is directly to DDR3/4/.. chips.
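To make the distinction concrete, here is a minimal C sketch contrasting x86-style port I/O with the memory-mapped register access that most other architectures use. The addresses and register names are made up for illustration; a real MCU's vendor header defines the actual ones.

    #include <stdint.h>

    /* x86-style port I/O: a separate address space reached with IN/OUT
       instructions (x86 + GCC/Clang only; needs ring 0 or ioperm()). */
    static inline void outb(uint16_t port, uint8_t value) {
        __asm__ volatile ("outb %0, %1" : : "a"(value), "Nd"(port));
    }

    /* Memory-mapped I/O: peripheral registers simply live somewhere in the
       normal address space.  GPIO_BASE and GPIO_ODR are hypothetical values
       for illustration; a real device header provides the actual ones. */
    #define GPIO_BASE 0x40020000u
    #define GPIO_ODR  (*(volatile uint32_t *)(GPIO_BASE + 0x14))

    void toggle_pin(void) {
        GPIO_ODR ^= (1u << 5);      /* flip output pin 5; only meaningful on the MCU itself */
    }

    void legacy_uart_write(void) {
        outb(0x3F8, 'A');           /* write a byte to the legacy PC serial port */
    }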
> Systems on Chip are called System on Chip because they don't expose I/O ports (like PORTA through PORTD on an AVR) to the pins and thus can't interface RAM, ROM, or other peripherals natively, but only over narrower buses through internal interface chips/logic.
What does IO port multiplexing of the AVR have to do with the definition of SoC?
> As a rule of thumb, every surface-mounted "CPU" between 80 and 200 MHz, across industries, was called an SoC from 2003 through the 2010s.
No. I don't see how the physical mounting of the package changes anything. Most processors are surface-mount, even some BGA x86 chips.
> Those CPUs/MPUs that do expose I/O directly to interface with SDRAM, an LCD, or the equivalent of the north bridge chip on a PC were usually called either MPU or CPU, often MPU because they integrate some peripherals.
No. MPU = MicroProcessor Unit.
> Those under 80-120 MHz were often called microcontrollers (uC). They were also often self-contained like an SoC but didn't count; they have also gotten much higher clocks recently, but still don't.
So modern highly integrated microcontrollers, effectively a system on a chip with a 300+ MHz core, SRAM, flash, EEPROM, CAN, UARTs, Ethernet, PWM, counters, DACs, and ADCs, aren't systems on a chip?
> SoC means the CPU's main I/O ports are not exposed to pins.
Not sure how to purge that psychology.
I think that's something most of us struggle with. That doesn't mean you're not a great person :-) I appreciate the humble reply.
And Apple didn't use ARM reference designs in early iPhones; they were using Samsung SoCs. Pretty much nobody uses the ARM reference designs except as an early prototyping target.
This isn't true. The majority of ARM-based SoCs today use the Cortex or Neoverse reference designs from ARM. Today there are only a handful of vendors who fully develop their own cores, such as Cavium, Apple, and Ampere.
In the context of reference designs, you see stuff like the ARM Juno designs. I've seen people (very very rarely) just drop those whole designs into their boards. But, like I said, almost nobody uses them except as a prototyping target.
Samsung also fabbed their SoCs.
PowerVR designed the GPUs in Apple SoCs.
welcome to 2020.
If people want to sideload apps and such, which explicitly hacks around the system designed by Apple, why are they using Apple devices (which are notoriously locked down and restricted to "Apple's way") in the first place? Why not use Android or something else where that sort of thing is designed for?
Same kind of question for jailbreakers. Isn't the desire to jail break indicative that you want a more flexible platform? Why wouldn't you use something else besides Apple?
I ended up "jailbreaking" it because there were just so many things I wanted it to do that weren't possible with the stock OS and approved applications (adblocking, better keyboard layout, file system access, and mouse support among other things). Many of these have eventually been implemented to some extent or another, but at the time they were unavailable via the App Store but easily done via the Cydia store.
Since then, the iPad 2 aged out of usefulness and my need for a tablet has mostly dried up. Likewise, options for other higher-powered, well built, and lightweight laptops/tablets have expanded so I've not purchased any more Apple stuff for the reasons you point out. I basically "jailbroke" because it was the only way to get my device to do what I wanted it to do. Now that they exist, I just use other devices that don't require me to jump through those hoops.
Still, I wish there was a way to install some sort of lightweight OS on that iPad 2 (which is now sitting in a box somewhere).
It's 2020, more than a decade after the iPhone was released, and 99.9%+ of users can still only use Apple's App Store.
I don't want to boot up a Mac.
At least on Android devices, changing that parameter made them much snappier, although some apps wouldn't work because they absolutely required more than one background process.
I'm quite unconvinced that the bootloader exploit demonstrates that the platform is so insecure that you shouldn't use it. If you want to run a different operating system on a typical Android device, you would either not need to circumvent such security measures (which doesn't mean that they exist or work) or you would use some exploit that didn't get such wide press.
An (old) phone/tablet running Linux, connected via USB/ADB/network, can be used as:
* (Additional) Webcam for your computer
* (Additional) microphone/Touch-pad for your computer
* (Additional) (status) screen for your computer/whatever_with_USB
* Security-cam/Digital-picture-frame for your whatever_with_USB
* (Additional) Network-available environmental sensors (light, temp, vibration).
* (Additional) Wifi/xG Network (scanning) device for your PC
If this doesn't make dozens of useful workflows available to you... then I don't know what you have been doing with computers before.
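For the sensors item above, here is a minimal sketch of the PC side. It assumes a hypothetical app on the phone streaming readings as plain text lines over TCP port 5000 (reachable directly over Wi-Fi or via `adb forward tcp:5000 tcp:5000`); the app, the port, and the line format are assumptions made purely for illustration.

    /* Minimal PC-side reader for a phone acting as a network sensor. */
    #include <stdio.h>
    #include <unistd.h>
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>

    int main(int argc, char **argv) {
        const char *host = argc > 1 ? argv[1] : "127.0.0.1";   /* phone IP, or localhost when using adb forward */
        int fd = socket(AF_INET, SOCK_STREAM, 0);
        if (fd < 0) { perror("socket"); return 1; }

        struct sockaddr_in addr = {0};
        addr.sin_family = AF_INET;
        addr.sin_port   = htons(5000);                          /* hypothetical sensor-stream port */
        if (inet_pton(AF_INET, host, &addr.sin_addr) != 1) { fprintf(stderr, "bad address\n"); return 1; }

        if (connect(fd, (struct sockaddr *)&addr, sizeof addr) < 0) { perror("connect"); return 1; }

        char buf[256];
        ssize_t n;
        while ((n = read(fd, buf, sizeof buf - 1)) > 0) {
            buf[n] = '\0';
            fputs(buf, stdout);                                 /* e.g. "LIGHT 312\nTEMP 21.4\n" */
        }
        close(fd);
        return 0;
    }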
Which was a good thing.
The commenter unironically using Orwellian phrasing like "be sure of the integrity of the device" shows how far we've come.
You don't want to know that your phone hasn't had a rootkit installed on it by someone who borrowed it off your desk while you were in the office kitchen?
Not, say, disable flashing if the passcode protected lock screen is up?
I had a much more snarky answer before this one about how realistic this scenario is for most people, but I will keep it to myself.
But your claim is verifiably false: https://computerhistory.org/blog/microsoft-word-for-windows-...
I guess an OS that's not really supposed to be running might be less stable, but I would hope most OSes are stable enough to essentially never require reboots (aside from power loss and updates).
And I've used unlocked bootloader devices before. My threat model allows for that exploit to be performed on me should someone desire it. The utility I gain exceeds the risk = f(likelihood, loss).
It's all software.
Apple didn't learn from its own mistakes, which almost made it go bankrupt.
While it is doing very well now, iOS devices competing with Android devices is very similar to Apple competing with the PC.
People like choice, competition and not being locked down.
Apple computers have been able to run Windows for a long time.
Still less than 20% of the market. And I feel like the main reason they're even usable as a desktop OS is that 99% of most people's computer activity takes place in a web browser, not because OSX has a particularly special native app collection. Hell, it seems like the only time I fire up my Mac is because I need to build an iDevice app, and nearly every "native" Mac app I run is really Electron-based.
What does this have to do with the platform? You choosing to use bloated electron apps certainly isn't Apple's fault.
> 1997: Microsoft rescues one-time and future nemesis Apple with a $150 million investment that breathes new life into a struggling Silicon Valley icon.