Catalina VM (bit-101.com)
75 points by chmaynard on Sept 27, 2020 | 25 comments



There's two articles [1] now on the front page of HN from this bit-101.com domain and they both seem pretty basic for HN. The other one is just about some guy installing a third party CPU cooler. No new tech, nothing interesting about the cooler, he just found it works better than the stock one. This article walks through using a shell script he found on GitHub to virtualize MacOS and his install and his very unscientific opinions on performance. Where's the beef?

1. https://news.ycombinator.com/item?id=24599832


> pretty basic for HN

HN is not a homogeneous mass anymore and not everyone has experience with everything already.

> Where's the beef?

The usefulness of this article probably depends on what you take away from it. That you can viably run OSX in a VM was certainly news to me.


Yeah, I'm the author of those articles. Sorry they are not beefy enough for you, but I'm not the one submitting them here. In fact, HN is not a site I usually visit at all. This is just my personal blog where I post stuff that I find interesting.


From the HN Guidelines: "Be kind. Don't be snarky. Have curious conversation; don't cross-examine. Please don't fulminate. Please don't sneer, including at the rest of the community."

Oh, and I think you meant "There are two articles".


Running macOS in a VM that is not itself running on a Mac is a violation of your contract with Apple. However, there are no restrictions on running a macOS VM on a Mac, which can be very useful if you are a developer who needs to test on older macOS versions. I use VirtualBox running on a Mac for this, and I believe KVM/QEMU also works on a Mac (on Intel Macs only -- this won't work in the future with Apple Silicon). As usual with advice on the internet: caveat lector. I'm not a lawyer.


I have used this setup in the past. With GPU, NVMe and USB passthrough it feels pretty much native.

There are definitely some caveats. You can't just pass through any GPU. Post-High Sierra, AMD has the best support. Select Nvidia GPUs like the GT 710 still work, but they don't do hardware video decoding. I could never get sleep to work either.

With Apple moving towards ARM in the future, I thought of giving Linux another shot. It's not as polished but so far has been adequate.
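For context on the passthrough setup described above: in a KVM/libvirt configuration, handing a host PCI device such as a GPU to the guest is typically done with a hostdev entry in the domain XML. A minimal sketch (the PCI address below is a placeholder; the real one comes from `lspci -nn` on the host):

```xml
<!-- Sketch of a libvirt PCI passthrough entry; the bus/slot address
     is a placeholder -- look up your GPU's address with `lspci -nn`. -->
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <!-- host PCI address of the device being passed through -->
    <address domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
  </source>
</hostdev>
```

With `managed='yes'`, libvirt detaches the device from its host driver and rebinds it to vfio-pci automatically when the guest starts. NVMe and USB controllers are passed through the same way, one hostdev entry per device.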


GitHub Actions provides another option for testing against macOS: see action-tmate [1] and fastmac [2] for reference, as well as previous discussion on HN [3].

[1]: https://github.com/mxschmitt/action-tmate

[2]: https://github.com/fastai/fastmac

[3]: https://news.ycombinator.com/item?id=24452384
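As a sketch of the approach above, a minimal workflow running a test suite on one of GitHub's hosted macOS runners might look like this (the workflow name and `make test` command are illustrative placeholders; action-tmate is the SSH-debugging action from [1]):

```yaml
# Illustrative workflow: run tests on GitHub's hosted macOS runner.
name: macos-tests
on: [push]
jobs:
  test:
    runs-on: macos-latest   # hosted macOS virtual machine
    steps:
      - uses: actions/checkout@v4
      - run: make test      # placeholder for your actual test command
      # On failure, open an SSH session into the runner for debugging
      - uses: mxschmitt/action-tmate@v3
        if: failure()
```

The tmate step is what makes this useful for interactive poking around: when a step fails, the job pauses and prints SSH connection details into the log instead of just tearing the runner down.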



In 2006 or 2007 I played around with OS X Tiger in a VM from Windows, and it was like a portal into a much more advanced operating system. It was the beginning of me moving entirely to OS X over time.

At the time, almost everyone was on XP and Tiger was very impressive. Looking back, most of the advantages of Tiger were audiovisual in nature. But the vastly better command line was also a big deal.


If you’re not using it to run Xcode what’s the point?


Also, you can't run anything Metal-based by default (which is a lot of things on macOS these days). You can theoretically do PCI passthrough of a GPU, but we're not at the point where you can use GVT-g-based Intel passthrough to get something functioning. Such magic has seemed "very close" for a number of years now [1].

1. https://gist.github.com/artizirk/28aa4c28b252bd679a4daf84d91...


It's a hack. The best hacks are intrinsically interesting. Their only purpose is to satisfy one's curiosity. They don't need to have a point. :)


I do this because my family is locked in to iMessage and FaceTime, and to test programs that I write to ensure they work on Macs.


Is there really no way to reverse engineer the iMessage protocol? Or is it just difficult, and no one is motivated?

I'm in a similar situation where my family and some of my friends only want to talk over iMessage. Last time my iPhone wasn't available for a while, I had to spend a couple of hours pirating OS X and fiddling with a bunch of Hackintosh tools just to talk to the girl I was dating at the time.


> just to talk to the girl I was dating at the time

You mean she wouldn't talk to you if it wasn't on iMessage? That's not love (or friendship for that matter).

I would use smoke signals, learn sign language, use Morse or whatever is necessary to communicate with my love.


It's not that she would have refused; we just got so used to using iMessage that we didn't have a working backup (we had sent emails, but that's not always reliable).

We lived hours apart, and I needed the address of the place we planned on meeting the next day (she had already sent it, but over iMessage).


Audio Units... the most annoying vendor lock-in ever.


I just installed macOS in a VM a few hours ago to test cross-platform builds.


Audio software, video software, graphics software, plus pretty decent UX.


> I’ve been considering building a “Hackintosh” system. But with Apple’s plans of going to ARM possibly by the end of this year, I don’t want to invest a lot in hardware that’s going to be obsolete soon.

This argument doesn't make much sense to me. While not all hardware works for building a Hackintosh, plenty of top-of-the-line hardware does, and it will run Windows just fine if ARM becomes an issue.


You generally build a Hackintosh machine with parts that are most compatible with macOS, not necessarily the best bang for the buck. So I agree with the argument from the article here. If ARM makes the build obsolete, all you have left is a Windows machine that wasn't the wisest use of your money.


Have you built a Hackintosh? I have; the only two parts that require any consideration are the motherboard and the GPU, and even then enough motherboards are fine that really the only consideration is the GPU. I admit you pretty much have to use a stock AMD GPU. If you're OK with essentially this one requirement, then you're set; if not, then yeah, you didn't spend your money wisely.


Well, I did build a Hackintosh, and the parts you say require consideration are exactly the parts I claim the money isn't spent wisely on. You're paying for compatibility, not performance. The AMD offerings are nowhere near Nvidia's, particularly after the release of the 3000 series, but you have to settle for AMD for compatibility.


A lot of AIB cards work too; for example, the Sapphire 5700 XT Nitro+ is popular for Hackintoshing. There are a handful that don't (usually because the GPU vendor decided to make some funky design change), but those surface pretty quickly after the release of a new model and are easy to avoid.


It's an argument that has not been given much careful thought. Hardware bought today will likely become obsolete on its own before Apple obsoletes it.

Apple will likely support Intel hardware with updates for at least five years, and probably much longer. Application devs will likely provide even longer support, say ten years plus.

There is a massive installed base of Intel machines and ARM is a downgrade for anyone who values compatibility (such as Bootcamp, VMs, etc) so there will not be a rush to the new platform like previous transitions.

Most of the attraction of ARM hardware will be in laptops because of improved power efficiency. People with desktops will have much less incentive to upgrade early, or at all.



