
>If the Apple II was the 70s and the Macintosh was the 80s and NeXTSTEP was the 90s and the iPhone was the 2000s, name the thing of this caliber they did in the 2010s.

Arguably, the iPod was the 2000s and the iPhone was the 2010s. The iPhone didn't really get going until the iPhone 3GS in mid-2009, so it started slowly but went on to define the 2010s for Apple.


That's just as true of the others. NeXTSTEP was the 90s but "Mac OS X" wasn't released until 2001. But if you asked someone in 2004 or 2014 if Apple had done anything interesting recently, you'd have been able to name what it was.


That was on a 700 MHz Raspberry Pi 1. On an 1800 MHz Raspberry Pi 400 with NEON SIMD, the difference was another order of magnitude.

> Comparison - The three 700 MHz Pi 1 main measurements (Loops, Linpack and Whetstone) were 55, 42 and 94 MFLOPS, with the four gains over Cray 1 being 8.8 times for MHz and 4.6, 1.6 and 15.7 times for MFLOPS.

> The 2020 1800 MHz Pi 400 provided 819, 1147 and 498 MFLOPS, with MHz speed gains of 23 times and 69, 42 and 83 times for MFLOPS. With more advanced SIMD options, the 64 bit compilation produced Cray 1 MFLOPS gains of 78.8, 49.5 and 95.5 times.
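Working backwards from those ratios (a rough sketch; the Cray 1 baseline figures here are inferred from the quoted gains, roughly 80 MHz and about 12, 26 and 6 MFLOPS for Loops, Linpack and Whetstone, not taken from the article):

    // Sanity check of the quoted Cray 1 speedups. The baselines are an
    // assumption inferred from the article's own ratios.
    let cray = (mhz: 80.0, mflops: [12.0, 26.0, 6.0])  // Loops, Linpack, Whetstone

    let pi1   = (mhz: 700.0,  mflops: [55.0, 42.0, 94.0])     // Raspberry Pi 1
    let pi400 = (mhz: 1800.0, mflops: [819.0, 1147.0, 498.0]) // Raspberry Pi 400

    func gains(_ m: (mhz: Double, mflops: [Double])) -> (Double, [Double]) {
        (m.mhz / cray.mhz, zip(m.mflops, cray.mflops).map { $0.0 / $0.1 })
    }

    print(gains(pi1))    // ~(8.75, [4.6, 1.6, 15.7])
    print(gains(pi400))  // ~(22.5, [68, 44, 83])

The 42x Linpack figure in the quote suggests their Cray 1 Linpack baseline was a bit over 27 MFLOPS rather than 26, but the arithmetic checks out either way.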


You can download LiViable from the article author's download page https://eclecticlight.co/virtualisation-on-apple-silicon/

You can run in full-screen mode with the usual macOS click on the green traffic-light button at the top of the window. The current versions of Apple's virtualization libraries para-virtualize the GPU, which runs at nearly full native speed.
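For reference, this is roughly what a paravirtualized display looks like through Apple's Virtualization framework for a Linux guest (a minimal sketch, not the app's actual code; it assumes macOS 13 or later, and the makeLinuxVM helper, the CPU/memory sizes and the scanout resolution are just illustrative):

    import Foundation
    import Virtualization

    // Sketch: attach a paravirtualized (Virtio) display to a Linux guest.
    // Needs the com.apple.security.virtualization entitlement.
    func makeLinuxVM(kernelURL: URL) throws -> VZVirtualMachine {
        let config = VZVirtualMachineConfiguration()
        config.cpuCount = 4
        config.memorySize = 4 * 1024 * 1024 * 1024  // 4 GiB

        config.bootLoader = VZLinuxBootLoader(kernelURL: kernelURL)

        // The paravirtualized GPU: one 1920x1080 scanout.
        let gpu = VZVirtioGraphicsDeviceConfiguration()
        gpu.scanouts = [
            VZVirtioGraphicsScanoutConfiguration(widthInPixels: 1920, heightInPixels: 1080)
        ]
        config.graphicsDevices = [gpu]

        // Storage, network and console devices omitted for brevity.
        try config.validate()
        return VZVirtualMachine(configuration: config)
    }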

It sounds like you want a type 1 hypervisor. Unfortunately none exists for macOS that I'm aware of.


Yep.

With an application-level VM (type 2, I think?) I haven't found a way to lock the cursor and keyboard shortcuts to the VM - so my three-finger swipe switches my macOS workspace rather than my GNOME one, Spotlight overrides the VM's shortcuts, etc.

Also, moving my cursor to the top left/right of the screen reveals the macOS window controls, breaking the fourth wall and getting in the way of the VM's DE controls (volume, hot corners, power, etc.).

I haven't been able to enable GPU paravirtualization within a Linux VM. I think it's only for macOS VMs running on macOS.


Won't work anyway. You can't virtualize Intel macOS in a VM on Apple silicon. Unless you were just looking to transfer your data files; in that case you can share a directory that points at the external Time Machine disk. (Not sure if you can do that with Parallels, but the article author's Viable works fine.)
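For what it's worth, directory sharing through Apple's Virtualization framework looks roughly like this (a sketch, not how Viable actually implements it; the path, the "backup" tag and the mount point are placeholders):

    import Foundation
    import Virtualization

    // Sketch: expose a host directory (e.g. the external Time Machine disk)
    // to the guest as a read-only virtio-fs share.
    let sharedDir = VZSharedDirectory(
        url: URL(fileURLWithPath: "/Volumes/TimeMachineBackup"),
        readOnly: true
    )
    let fsDevice = VZVirtioFileSystemDeviceConfiguration(tag: "backup")
    fsDevice.share = VZSingleDirectoryShare(directory: sharedDir)

    let config = VZVirtualMachineConfiguration()
    config.directorySharingDevices = [fsDevice]
    // In a Linux guest you'd then: mount -t virtiofs backup /mnt/backup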


Not even QEMU can manage it? I imagine it's slow of course, but I'm a little surprised that there's simply no solution. Is this a driver issue or something?


Generally if you say "virtualization" people will assume you mean "use the hardware support to run a VM on the host CPU". For that you must have the same host and guest CPU architecture. Anything else is "emulation", which is perfectly doable regardless of host OS and architecture but often slower than you might like. (QEMU's emulation is not particularly fast, certainly.)


I don't understand why these are worth casually distinguishing, but good to know.


From a user perspective the difference is important, because it's often the difference between "can use this for real work without worrying about exactly what's going on under the hood" and "you probably shouldn't use this unless you know you need to use it and that you can live with the performance aspects". (e.g.: booting a Windows guest under virtualization is a very popular thing to do; booting a Windows guest under emulation is probably going to be a lot of work and not actually achieve the end-goal you were after when you do eventually get it running.) They are also vastly different from a technical perspective.


It's a very important distinction, and the two are completely different.

Virtualization abstracts the actual underlying hardware, creating a virtual instance. You are executing instructions directly on the host CPU with little or no overhead, so it’s fast.

Emulation mimics one system on a different system by converting the instructions of the mimicked system into instructions the host platform can understand. It is generally a very slow process.
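To make that concrete, here's a toy sketch of my own (nothing like QEMU's real internals): every guest instruction costs a fetch, a decode and a dispatch on the host, which is where the slowdown comes from, whereas under virtualization the guest's instructions simply execute natively.

    // Toy "guest CPU" with two registers and three opcodes, purely illustrative.
    enum Op: UInt8 { case loadImm = 0, add = 1, halt = 2 }

    struct ToyEmulator {
        var regs = [Int](repeating: 0, count: 2)
        var pc = 0

        // Each guest instruction becomes a fetch, a decode and a host-level
        // dispatch; that per-instruction overhead is the cost of emulation.
        mutating func run(_ program: [UInt8]) {
            while true {
                guard let op = Op(rawValue: program[pc]) else { return }
                switch op {
                case .loadImm:   // loadImm reg, value
                    regs[Int(program[pc + 1])] = Int(program[pc + 2])
                    pc += 3
                case .add:       // add regA, regB  (regA += regB)
                    regs[Int(program[pc + 1])] += regs[Int(program[pc + 2])]
                    pc += 3
                case .halt:
                    return
                }
            }
        }
    }

    var emu = ToyEmulator()
    emu.run([0, 0, 2,   // loadImm r0, 2
             0, 1, 3,   // loadImm r1, 3
             1, 0, 1,   // add r0, r1
             2])        // halt
    print(emu.regs[0])  // 5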


It's possible and I've done it; it's just unusably slow. If you only want it for terminal usage, it's bearable.


This supposed experience reminds me of running OS X on PearPC [1] on Windows.

[1] https://pearpc.sourceforge.net/


QEMU can emulate it.


There has to be a less-than-pleasant way to hackintosh a VM on the MacBook.

A non-VM alternative in the meantime is buying and keeping a used iMac and using target disk mode to boot a backup off the hard drive. But who wants to do that?


It also seems to be on by default on iOS 17.1. It doesn't seem to be on by default in macOS Sonoma (14.1).


Scroll up. Still there. We all need to get used to many more scroll pages.


He never mentions Siri at all. Weird. On the newest watches it's all on-device. Maybe he doesn't know that?


There was one brief mention of it being difficult to use in a loud kitchen. From the way they framed it, I think they might be coming from some sort of commercial kitchen background?

> A kitchen also tends to be a noisy place, with multiple conversations and background music. If Siri doesn’t understand what you’ve said, you’re going to end up with burnt onions.

Doesn't apply to my own home cooking situation, but if you're a professional of some sort...


> Doesn't apply to my own home cooking situation

Tell me you don't have annoying little kids, without telling me.

I can personally attest that in any room, Siri's accuracy, and even willingness to activate reliably at all, is utterly abysmal, especially on the Watch. Raise to speak? Just tried it 4 times. It activated and answered once. The other three, it just ignored me.

I'm grateful that I never have to use anything Siri-based for cooking, since I use the Google Nest Hub thing (the one with a screen), which, though also imperfect in its speech parsing, can at least execute "Set a 5 minute onion timer" accurately most of the time. Though he makes a good point that if you wander around a big house you might miss your timer alert. Thankfully my house isn't big enough for that.


My experience with kids has always been that they respected the boundaries I set around where they could yell, but I'll admit that's probably very situational.

I was complaining about it elsewhere on this page, but watchOS 10 seems to have done something awful to raise-to-speak's accuracy -- hopefully it's a bug that they'll promptly fix.

Most of my timer usage for cooking is actually with a HomePod mini that I have in my kitchen, which I've found to be very reliable for it. That said, in part it's because until this most recent round of OS updates it was the only Apple device that let you run multiple simultaneous timers, so I haven't had any chance to rebuild habits away from it yet. :D


Perhaps you didn't mean to be snide, but if your children are capable of just learning appropriate behavior, please recognize what a blessing this is. Different neurotypes exist, and are not so simple to teach or control.


That's why I said it's very situational.


It's not "all on device". Any Siri request that hits the network (which is many of them) send your entire address book to Apple.

Siri is a privacy nightmare and everyone sane turns it off immediately.


It was always a stupid rumor. Can you imagine if Apple required a different USB-C cable for your new iPhone rather than what already works with iPads and MacBooks? They would never do that.


> Can you imagine if Apple required a different [... snip ...] cable for your new iPhone rather than what already works with iPads and MacBooks?

My alteration of your comment brings it in line with the status quo prior to USB-C, which, keep in mind, Apple fought tooth and nail against. It is a stupid idea, but Apple did it anyway. The world runs on stupid ideas daily; either because they are more profitable, or too entrenched, or both.


Let's recall that Lightning came out before USB-C, because vendors within the USB-IF didn't want to upgrade their devices to replace micro-B with something more expensive and were stalling acceptance of the standard.

For everything you can say about Apple, they have at least been consistent on connectors, some would say to a fault. When they deprecated the 30-pin connector they promised that the iPhone (not iPad, not laptop) would remain on Lightning for 10 years.


I'm not buying it. It's just a cable, and regular wear and tear (user error in Apple parlance) pretty much dictates a lifetime far less than 10 years. The reason is profit, no more, no less.


Them: “Apple promised to ship their iPhones with Lightning for at least 10 years”

You: “I don’t buy it, cables wouldn’t last 10 years”

It’s a non-sequitur.

As for profit motives, if that were the case then they wouldn’t voluntarily switch their iPad line over to USB-C.


> It’s a non-sequitur.

This isn't a formal debate, I am allowed to have different reasoning from the rest of the group.


You’re allowed to say whatever you want and I’m allowed to point out that it makes no sense within the context of the discussion at hand.


Most of the software I’ve written has gone into hardware products. Currently working on an embedded Linux system and writing a UI in ReactJS. We are late but only by a couple of months and I have every expectation that the software will be completed. The product has pre-sold about $1M already.

If you want to be working on shipping products, work on embedded stuff.

I’ve done customer-facing business software too. Still mostly successful, but that is where I’ve seen a few failures. One project I did, writing automated functional/integration tests, didn’t see release, but I take no blame for that one. The tests worked. The product didn’t.


Yeah, but a 1 GHz N200? Yeesh. Everything else looks great.


It's a tablet, not a workstation PC. I have a Surface Go w/ 4GB of memory running Fedora 37 that's plenty fast for what I use it for.

Apparently this also supports hardware-accelerated AV1 decoding and H.265 encode/decode.


I haven't used one myself, but the N200 looks pretty OK for a tablet that's supposed to run a long time on battery. Four Skylake-ish cores that turbo to 3.7 GHz?


Notebookcheck says it's equivalent to a Core i5-8250U. That is not good in 2023.

https://www.notebookcheck.net/Intel-Processor-N200-CPU-Bench...


It depends on what you're doing. My 4th-gen low-power i5 in my laptop and my i3-7100 in my main desktop are just fine for web browsing and development.


Yes, and in this case 6W TDP and cheap enough for a $500 tablet seem like key drivers of the N200's design. Of course the Apple M1 is probably twice as fast at similar cost and power but compared to everything outside Apple the N200 looks pretty decent.


If they make a Ryzen 7040 APU version, I'll take it as my daily driver.


The 7040 in its lowest TDP configuration is 15 W; the Intel N200 is 6 W. Even accounting for some differences in how the two companies measure TDP, only one of these can be passively cooled in that sort of form factor.

The lack of low-TDP products on AMD's side was also one of the reasons PC Engines gave for discontinuing their embedded line. AMD's last 6 W APU was the 2-core R1102G, which is now a couple of generations old.


Where did you get 1 GHz from? The N200 has a max clock of 3.7 GHz.


From the Specification section of the website:

> 1.00GHz quad-core Intel Alder Lake N200

> Turbo Boost up to 3.70GHz, with 6MB Smart Cache

