It's the reason I went back to a MacBook. What drew me in was the desktop-class i7. At the time the Mini outperformed all other Macs in benchmarks, and the speed was noticeable during compiles and the like.
Once you connect a 4K screen (or more than one), an eGPU becomes a must-have. The integrated GPU just can't render the UI smoothly. For example, typing in IntelliJ always felt laggy, presumably because redraws take so long. Things like Google Maps in satellite view in a big browser window stutter a lot, all the subtle macOS animations aren't smooth, and so on.
It bothered me so much I had to sell it. I thought about keeping it as a media center for the TV, but again, no GPU makes it less than ideal, and it's way too powerful for the job otherwise.
My kid spilled a glass of water on my 2018 MacBook last year (sob). At that time, word was out about the 2019 MacBooks with the new keyboards, so I decided to hold out, took out an old 2012 Mac Mini and have been doing my (Java/Swift) development on it for the past few months. It works so well, I've been in no hurry to pick up a new MacBook. Definitely don't feel animations or anything is laggy.
A good GPU doesn't help. I've just upgraded to a new 16" MacBook Pro with the maxed-out GPU hoping it could handle CLion on a 4K screen, but it's still worse than anything that I've ever seen on Linux. (Not to mention that this laptop generally suffers from input lag.)
One of the many bug tracker threads about this issue: https://youtrack.jetbrains.com/issue/JBR-526?p=JRE-526
There's a linked ticket about rewriting the renderer to use Metal on macOS; hopefully that will solve the issue once and for all.
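In the meantime you can experiment: newer JetBrains Runtime builds are supposed to gate the Metal pipeline behind a JVM property. A minimal sketch, assuming the property name from the ticket discussion (double-check it against your JBR version):

    # Help > Edit Custom VM Options... opens your idea.vmoptions
    # Assumed experimental switch for the Metal pipeline:
    -Dsun.java2d.metal=true

If the pipeline isn't available in your build, the IDE should just fall back to the default renderer.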
Anyway, I actually used Eclipse and didn't find it particularly responsive. It's been a while, so maybe it's faster now.
IntelliJ is quite fast for me both on Windows and on Linux, no typing lag.
But is it a fact that the macOS Java implementation is bad? Do you have benchmarks or something else to support that?
It's the only non-Pro designated Mac that has four Thunderbolt 3 ports. I don't think you'd find this level of I/O in any other machine in this price range.
That incredibly powerful I/O makes for tons of expansion possibilities. Storage, GPU, etc.
(Edited to say "non-Pro designated")
The point being that a "Pro" system might have 4 individual Thunderbolt controllers, each getting their own 4x PCIe 3.0 lanes.
I think that's why the specs say "up to 40Gb/s" on the page. It sadly doesn't say which controller it has... But I guess my whole point is moot in practice. Good luck saturating that much bandwidth.
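For rough context, some back-of-the-envelope math (published figures, nothing measured; the ~22Gb/s PCIe ceiling is the commonly cited Thunderbolt 3 number, so treat it as an assumption):

    # Thunderbolt 3 vs. the PCIe uplink behind it, in Gb/s
    pcie3_lane = 8 * 128 / 130      # 8 GT/s per lane with 128b/130b encoding, ~7.88 Gb/s
    pcie3_x4 = 4 * pcie3_lane       # ~31.5 Gb/s feeding one controller
    tb3_link = 40                   # headline link rate, shared with DisplayPort tunnels
    tb3_pcie_cap = 22               # commonly cited per-port ceiling for PCIe payload
    print(f"x4 uplink: {pcie3_x4:.1f} Gb/s, per-port PCIe ceiling: ~{tb3_pcie_cap} Gb/s")

So a single fast NVMe enclosure tops out well before the headline 40Gb/s figure anyway.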
The Mac Pro starts with two controllers but can be configured up to six.
The I/O is actually pretty good and provides for a ton of headroom for expansion down the line (I'm thinking primarily storage and GPU for my use-case).
I can see a pretty high-powered eGPU saturating the lanes.
* Mac Pro
* Mac Mini
* iMac Pro
* MacBook Pro 16"
* MacBook Pro 13"
I must have mixed it up in my head.
If you "want" a mid-level "box with slots", that's going to be bigger anyway.
Minis are compact bricks, with much less surface area.
So if you need real graphics power, a mobile GPU doesn't help much, and for plain display output, integrated graphics is enough for most people.
In some ways that eGPU approach (if it works) is nice.
Really hoping that the next major redesign of this looks less like an Intel NUC and more like a slightly smaller Mac Pro.
An actual mini tower! What a concept.
Is there a mini PC that would be similar to this now?
But the Mac desktop gaming ecosystem is fairly anemic, so I doubt most people get much from a better GPU. IMO, a Mac Mini + KVM + $500+ gaming PC is probably the best all-around option.
I tend to agree, but then I also see 4x Thunderbolt 3 ports and 1x HDMI 2.0 port and think how nice it would be to drive all four of my monitors from that one tiny system...
However, the specifications say that only two 4K monitors can be driven simultaneously (at 60Hz?)...?
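Rough pixel math, for the curious (my numbers, not Apple's, and they ignore the iGPU's own display-pipe limits):

    def stream_gbps(w, h, hz, bpp=24):
        # Uncompressed 8-bit RGB, ignoring blanking/protocol overhead.
        return w * h * hz * bpp / 1e9

    one_4k60 = stream_gbps(3840, 2160, 60)   # ~11.9 Gb/s per 4K60 stream
    print(f"one 4K60: {one_4k60:.1f} Gb/s, two: {2 * one_4k60:.1f} Gb/s")

Two uncompressed streams plus overhead already claim a big slice of a 40Gb/s link, which is roughly consistent with the two-display spec.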
Which brings up the same old, tired question:
Just who is it that works as an engineer at Apple and has such boring, inexact, low-power-user use-cases that they, the actual creators of this kit, are happy with it?
How do you do your jobs? Why don't you need these things?
Remember, there were years where multi-monitor support was completely, totally broken - in OSX, for all models - which suggests that nobody inside Apple uses multiple monitors.
That's weird... who are these people?
I use my MBP more than my Razer, but I'm disappointed Apple no longer makes the powerhouse laptops they used to.
If you're using a non-integer scaled resolution (i.e. not exactly 1x or 2x rendering) it's not gonna be fantastic. Running at the 'default' 2x, my 2018 drives two 24" 4Ks without issue - the problem most people have, I think, is that they buy much larger displays and then want a higher rendered resolution, which is where the iGPU will struggle.
I'm not going to buy a 4k screen with the intention of running it at 1080p. There are plenty of large 1080p screens out there.
If I buy a 4k screen it's because I want to run it at 4k.
He's not talking about not running 4K. He's talking about rendering a larger-than-4K framebuffer scaled down to 4K (the "More Space" option in the Displays preference pane).
This doesn't mean I'm running it at a lower resolution - behind the scenes, the Mac is actually rendering the display at 6016x3384 and then smoothly scaling it down to the native 4K of 3840x2160.
Because it has to draw everything at a higher resolution than the actual screen, performance does take a hit, but I find everything still works perfectly smoothly and responsively (just don't try playing games at this resolution on the Mac mini!)
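The arithmetic behind that, as a quick sketch (it mirrors the description above; it is not Apple's actual code):

    def scaled_mode(looks_like, native, scale=2):
        # Render offscreen at looks_like * scale, then downsample to the panel.
        backing = (looks_like[0] * scale, looks_like[1] * scale)
        downsample = backing[0] / native[0]
        return backing, downsample

    backing, ratio = scaled_mode((3008, 1692), (3840, 2160))
    print(backing, round(ratio, 2))   # (6016, 3384), squeezed ~1.57x onto the 4K panel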
I have a second monitor which is not high-DPI but 2560x1440, connected with a USB-C to DisplayPort cable. Surprisingly, the mix of high-DPI and normal-DPI monitors works really well, with no problems, even when moving windows from one display to the other.
It isn't running at 1080p. It's doing "@2x" pixel rendering (commonly called "Retina"), so it uses 4 physical pixels to render one "screen" pixel, giving you much crisper... everything.
Though I get why you might think it's pixelated from stephenr's description.
Edit: I have a 5k screen connected
What is a 'real' GPU? Why is this one not 'real'? If it can be used to run a rendering API and it outputs graphics fast enough to use the computer with a realistic number of monitors then it's a real GPU.
But there are no options for CUDA applications, and no options for similar applications on the AMD side. Depending on your use case, the iGPU might not be powerful enough for your needs, even without ML programs, if you have multiple monitors (and one is 4K).
Would love iMac Pro firepower without being tied to a set display.
I’ve been alive 30 years. A good portion of that was growing up as a PC user, overclocker, teenage hacker, etc. My professional career, however, has been on Unix and macOS, and as much as I try to get back to desktop Linux and open-source OSes, I just find them to be so brittle and hacked together.
I tried to daily drive Linux and just couldn't do it, and I am not a newb to Linux either. I got my feet wet on SuSE (kernel 2.4!), and then saw the light of Debian and Ubuntu and have used those primarily since. Remember when you could order free install CDs for Ubuntu? There were copies of Warty Warthog all over my high school. I have installed Arch by hand, btw. I’ve used Fedora and RHEL and CentOS and Slackware! GNOME and KDE and XFCE. Remember Compiz? It isn’t the desktop environments. It isn’t the distributions. It’s just the unavoidable consequence of an entire ecosystem of desktop Linux being designed by committee.
So I’ve accepted I’ll be on macOS for a long time to come (not unlike DHH - https://m.signalvnoise.com/back-to-windows-after-twenty-year...) and have started to seriously consider building a hackintosh. But when your time is billed hourly and you have a half dozen clients, time is money and the novelty of a Ryzen-powered hate tank hackintosh for my professional work is starting to look less realistic.
So how about a Diet® Pro for those of us who want something that isn’t a laptop in a big AppleTV chassis and isn’t a desktop in a monitor chassis.
Keeping a hackintosh updated (for even stuff like graphics card drivers) turns out to be a lot of time doing research, scrolling through endless forum posts, and bricking your install quarterly because you "did the wrong thing" (like absent-mindedly accepting a security update notification).
Yes, you can get it running. But macOS is buggy enough as it is, and it is decidedly not pleasant if you're relying on it as a daily driver and you just can't fix the Bluetooth mouse disconnecting after sleep unless you reboot, or the USB bridge not enabling after suspend, or the DisplayPorts deciding not to send signal when there are multiple monitors (fixed in the new GPU driver, but that driver requires the OS upgrade you can't apply due to other show-stopping bugs). You shouldn't have to feel relieved every time you step up to your desk and your computer turns on.
(FWIW: this was ~1 year ago, on a machine built from the "best hackintosh compatibility" parts list. The landscape with T2 is most likely less comfortable.)
That was my experience with a hackintosh. It works great, but you're always worried about it not working. The "T2" security chip is a good point; it will likely make Hackintoshes a thing of the past.
There are some good machines you can buy with Linux preinstalled. I have a System76; it updates all the time without issue. I feel the extra cost over an "install it yourself" machine is worth it, especially for a laptop. Dell even sells machines with Ubuntu.
I really do not understand this sentiment at all. Mind you, I am decidedly on Mojave. I am not discounting the experience of others, but macOS has been rock stable for me since about Leopard. I had kernel panics back on Tiger, but that was their first OS for x86 so I can forgive that.
(I only run windows to do cross-platform testing)
Yes, the /mnt/c situation is bad.
People have been waiting eons for this, and it seems like you'll just have to keep waiting.
From what I can remember, Apple haven't done anything like this since the G4 Cube, and they've never given indications they'll do it again. Everyone was praying the new Mac Pro would be the magic xMac, but it just isn't.
Personally I don't think there's anything wrong with being tied to the iMac display, given it's one of the best you can get.
Because the current Mini serves a lot of Pro needs with the crazy powerful I/O it has.
With eGPUs in a fairly mature state, the Mac Mini can serve pretty powerful video/graphics workloads.
I want more cores, faster cores, and more thermal headroom. A Mac Mini would be a slight performance bump from where I am right now, but the ROI isn't really there.
My workload varies, but I am a freelancer with 3-4 contracts at any given time, which vary from WordPress sites to Rails apps, Vue apps, Python apps, up to 'big data' processing pipelines. I usually have a handful of services running for any client (Redis, ES, Postgres, MySQL, etc.), and the Docker-on-Mac story needs work.
I essentially don't want to wait on I/O ever again. That is another reason I would like a full desktop chassis: I want PCIe slots so that I can outfit them with something like an Optane SSD.
Attempting to spec out something that would perform better than my current MBP:
* 3.2GHz 6‑core 8th‑generation Intel Core i7 (Turbo Boost up to 4.6GHz)
* 64GB 2666MHz DDR4
* Intel UHD Graphics 630
* 1TB SSD storage
* 10 Gigabit Ethernet (Nbase-T Ethernet with support for 1Gb, 2.5Gb, 5Gb, and 10Gb Ethernet using an RJ‑45 connector)
The whole process takes about 20 minutes, and there are professional YouTube videos that take you through every step of the way.
This is exactly what I did and it was a breeze (I didn't have much prior experience messing with computer internals). It might look complicated, but it's really not.
If you do go this route, I'd recommend OWC RAM since it's guaranteed to work with Macs, and they also sell a little $7 toolkit that has the Torx screwdrivers you need to do the RAM upgrade -> https://eshop.macsales.com/item/OWC/TOOLKITMM18/
Here's a video of the process: https://www.youtube.com/watch?v=qKyv0QP4XPQ
My current test bench is an i7-3770K in a pretty popular Asus Z77 board, so if I find myself an AMD card I could probably prototype it here first.
They wouldn't need the Mini, and the mid-tier would be too far from the Mac Pro.
Apple does this quite frequently.
If you are a gamer, use Windows. If you are working in a .net shop, use Windows. If you are using special software that demands Windows, use it. For everyone else, macOS is the right way to go.
I still use Windows frequently as a consultant - either in testing or in onsite infrastructure deployments on Windows Server hosts. It sucks to use. You will rarely run into a smooth scenario, and due to the proprietary nature of everything you are stuck with whatever is available in the docs or whatever the support team says, assuming you purchased the right support license.
Their greatest contribution to the world is RDP, which is the only saving grace on Windows as far as I am concerned.
Thanks, but I would skip this Mac Mini and go for the latest maxed out NUC instead.
Having just moved to a Windows 10 shop running on cheapest-possible Dell hotdesking workstations, I’m starting to miss macOS and all its command-spacebar glory!
In macOS you'll find Disk Utility by typing "disk utility" into CMD+SPACE, even on a different-language system where it has its own name. I couldn't find anything in Windows even when I typed the literal name of the panel. And hunting for formerly easy-to-find settings is one of the things that already drives me crazy in Windows 10.
Also, when I type "alert sound" I still find the Sound and Notifications control panels, even though they're not full matches, because they do deal with alerts.
There's a lot about Windows I like, though. It's just not the UX, apart from Explorer once I set it up exactly as I like it (well, it mostly looks the same as Windows 95).
I admit that this search sometimes works strangely, but it adapts to me: when I launch some program often, it remembers that and will always put that program at the top.
Realistically, they both get the job done.
Search "alert" brings up the audio settings.
I'm not sure what you're doing differently.
One less button press than on a mac.
Of course Windows 10 is also in bad shape.
I have 3 Hades Canyon NUCs (homelab) and also a fully maxed Mini. The great thing about the Mini is its versatility for expansion via TB3, as well as letting me triple-boot macOS, Windows, and Linux with minimal effort (no hackintosh workarounds needed).
Neat. Of course, for the actual base of buyers for this, the fact that it runs macOS is a critical requirement. Running an outdated, hacked version is not a viable consideration for most (I'm fairly certain the "I'll run a Hackintosh" comments outnumber people actually running a Hackintosh by at least 1000:1).
I don't think there has ever been an Apple release without a barrage of "Lame. No wireless. Less space than a Nomad" kind of responses. As if we are all unaware that it's a competitive space with lots of options.
Yes, there are alternatives. Lots of people buy those alternatives. But it is absolutely not an orange-orange situation.
As one fun aside: I had a Mac Mini a few years ago and loved it. The power utilization of the thing was absolutely bonkers tiny. It just worked 100% of the time. Bought it for like $799 and sold it after about five years for $500. The kind of bizarre ability of Apple devices to hold value is really a financial factor that many don't consider.
I have used and installed Linux and BSDs for a long time, and yet I had a look at the suggested procedures and I went "nope". There is basically a 50/50 chance that it will work, some features will likely not work at all (particularly cloud-related services), and every little update might doom your machine. Fun project for the hardcore Apple hacker, but imho it would be foolish to base your day-to-day production needs on such a build.
I don't know how well the modern macOS versions run as Hackintoshes, but I suspect the community is still strong. The current challenge is probably all the specialized hardware security and encryption chips on modern Macs, as well as the newer restrictions on signed/unsigned kexts and all the recent kernel/system API lockdowns.
The caveat here is: I'm still on 10.12.6. Updating, especially in regards to audio apps, is generally done very slowly. So, I'm in no hurry to run the latest and greatest. But, upgrading to this point was pretty painless.
The biggest troubles are WiFi and Bluetooth (and therefore Continuity; it needs one of a few selected cards), graphics (no Nvidia), keyboard and trackpad (for desktops, use used Apple peripherals), and sound in some configurations (you might use a Bluetooth speaker if BT works).
It can always happen that you mess something up (or you need access to a real Mac to download the OS in the first place), so keep a backup, wait a few days after system updates appear, and read the forums for possible showstoppers and necessary precautions. That said, Catalina is still not perfect, and troubles may be caused by the OS rather than by the hackintosh.
Agreed. But this is where "I'll pay an extra $200 to not have to worry about that" comes into the equation.
No matter the distro, under Linux it feels like I’m always fighting weirdness with X11/Wayland/GPU drivers and there’s more rough edges than I can count. It’s death by a thousand cuts. It has massive potential to be great, but it’s just not.
By contrast once your hackintosh is working properly, that’s pretty much it… the fight is over and you can just use your machine. If you set it up right, minor updates are uneventful and the only real maintenance that’s required comes with major OS releases every year or so, but that can be delayed for a long time if you’re not the type to keep pace with OS updates.
That said it is a bit of a technical endeavor, so be prepared for that.
I've been considering getting a Mac mini but haven't committed yet. Can you tell me which NUCs run macOS so I can comparison shop?
If you want to run Linux or Windows, buy a NUC. If you want to run macOS, buy a Mac. Hardware-wise, the Mac Mini is comparatively expensive, but you also fund macOS development and will have a much better experience than on any other hardware.
Source: I use NUCs (currently NUC8i5BEH for Linux, plus two other around the house), a MacBook Pro, and have had Mac Minis in the past.
It used to be that Apple charged a lot for RAM and Storage, but you could save a bundle by bringing your own. For example, my 2012 i7 Mac mini has maxed memory and a 2TB SSD. When Apple began soldering stuff in, they failed to reduce their punitive prices on expansion, which really hurts reusability and resale value. In other words, they are far less green than they could be since they must now be disposed of earlier.
At the same time, the only reason those slots were there in the first place was so that a manufacturer could build one main board and then offer different configurations by just swapping out some elements. That benefit no longer holds since you can do the same on a single PCB and still offer the same options. The side-effect of users being able to change it after the fact is lost, but when it wasn't a selling point for the manufacturer anyway it doesn't seem to matter to them.
While not ideal for every user, the mass market doesn't really care. They will probably never upgrade a machine in their lifetime and simply 'upgrade' to a different machine instead.
People often claim a ton of things but then actually don't have any background in the industry (either the engineering side or the high level planning side). Apply Hanlon's Razor instead of making everything that doesn't suit your context automatically be some evil plan.
Of course a company that stands to make a ton of money, given a choice between two options that only marginally affect users, will choose the ton-of-money option. It's just that in this case there is no ton of money.
Ask any user if they would prefer an upgradeable device to one that cannot be, they will opt for the upgradeable one.
That is why they were forced to bring back the RAM slots after the failure of Mac Mini 2014 (which had soldered RAM).
Apple does this purely to make more money - they have to spend less on parts if everything is soldered, they can charge an obnoxious amount for more memory and it helps with planned obsolescence.
The current Mac mini would sell more if it had a user-upgradeable SSD. But it is obviously more profitable for Apple to offer soldered ones, even if it sells less.
Also, if you try to match the boxes spec for spec, the Minis are usually not that much more expensive.
Though I will admit that I typically use Brave, Brew, VSCode / Visual Studio, and iTerm. It might just be me not getting any of the real benefit, but I tend to use Windows and Linux as just a shell.
I've been tempted to buy a Mac for that alone. If there's any similar options for Linux, I would be very interested.
The Intel NUCs are excellent little machines. Some of them support Intel AMT, so you can run them completely headless in some cabinet.
Then they soldered the SSD onto the motherboard.
I've used whatever wireless mouse/keyboard I bought from Best Buy since. I prefer them separate, as I don't type; I just click through my bookmarks to read and watch content.
It’s such a shame that’s the best Intel can do in that thermal envelope.
AMD is about to ship Ryzen 4000 mobile chips with lower TDP, great integrated graphics, higher clock speeds and more cores. It would have worked fine in the Mac mini. Imagine how good the desktop APUs will be when those come out.
I'm highly doubtful most people here have workloads that it wouldn't be able to handle.
I'm starting to think there won't be any more desktop APUs. The next logical step would be an 8 core CPU with better GPU than the 3400G. That would be good enough desktop for a huge swath of the market and might kill sales of higher end chips.
I use a 2400G for software development. It's the fastest CPU I've ever owned. It runs a 4K desktop, 4K video, or 4K gaming (old stuff; I'm not a "gamer") without even running the fan loudly.
The main thing I want is AV1 decode on the GPU.
It will more likely have only 8 CUs like the laptop chips but with a higher TDP (65W?) and clock. OTOH probably only $200ish. Still a killer desktop value.
Maybe the i9-9900T would have worked?
Business as usual for Apple...
In recent years they've become an Intel without an AMD in sight; they do it because they can.
Looks like Apple didn't learn its lesson with the 2014 Mac Mini, which had everything soldered and was a flop. Later versions reverted to upgradeable RAM, but kept the soldered SSD.
And that is why these Mac Minis aren't selling as expected: users want upgradeable computers, and they aren't as foolish as Apple seems to believe, willing to pay the insanely high prices Apple charges for the soldered SSDs in these devices.
I think if you're using it for a specialized application with eGPU support, game development, or gaming in Windows via bootcamp it's a passable solution (albeit with lower performance than a native card).
On the other hand, for something like web development it's finicky - Chrome won't pick up the external GPU unless you have an output directly from the Mac Mini to the graphics card along with a separate output from your Mac Mini to your external monitor (which means a lot of plugging and unplugging cables whenever I boot up the machine).
If I could afford a Mac Pro or had space for an iMac, or if Apple actually made the xMac I'd buy those in a heartbeat.
The mini itself is compact, fast, and generally great. Handles VM based dev workloads well. It’s small and quiet (not always silent though).
I used the machine with just the built in integrated GPU for a few months. For gaming it was bad. For lots of 4K screens it was... doable, in the sense that you could actually plug them all in, but UI effects would lag and it felt weirdly disjointed. Some tasks would be very fast then hit a GPU related thing (scrolling, OS animations, etc) and lag, stutter, making the system feel slower than it was.
Got the eGPU and it was 100% plug-and-play. It works beautifully, completely fixes all the UI issues, and games generally run great (as well as or better than my previous hackintosh with the same GPU in a direct PCIe slot).
I’ve run into a few very small issues. USB-C connector tolerances are not always tight enough to reliably move equipment while it is all on, so sometimes if you grab the mini and drag it around (maybe to position it to plug in another cable) the GPU can disconnect. This generally falls back “gracefully” to the internal GPU but can be a bit annoying. You also have to think a bit more about heat management in your space. The GPU and mini all get hot when under load, you can’t stuff them in an unventilated hole. Overheat behavior is generally that the GPU shuts off and needs to be power cycled.
Compared to an iMac... The iMac is probably a better standalone deal and experience. However, in my case I have many machines and many displays. Mini+GPU lets me share generic displays across multiple machines in a way that an iMac simply can’t. Especially if you have existing windows gaming hardware where you can “recycle” the previous gen GPU in the Mac it can be way more cost effective. The GPU I am using is more powerful than you can get in any mac other than the Mac Pro.
Vs the Mac Pro... well, the Mac Pro is better, but it costs 4 to a zillion times as much. Kind of a different ballpark. If you want a headless desktop and don't need the Mac Pro crazy, a mini+GPU is a good option.
As a budget option? Mini+enclosure+GPU isn’t particularly cheap. My setup was ~2k without displays and I have a fairly simple spec setup (512GB, 16GB ram, slowest 6 core).
Where it shines is in modularity with other equipment. I can share the eGPU with laptops, share the GPUs with other Windows machines, and share displays with other machines; in that environment it is cheaper and more flexible.
This is probably because the integrated GPU dips into your system RAM to drive those displays. I think people who've upgraded to 32GB of system RAM have far fewer issues running the stock GPU with 4K/5K screens.
I myself am using the stock GPU with a 2018 Mac Mini and an LG UltraFine 5K + another 2K screen, with no issues at all. Amazing machine.
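For a sense of scale, some back-of-the-envelope framebuffer math (4 bytes per pixel and triple buffering are my assumptions, not measured figures):

    def framebuffer_mb(w, h, bytes_per_px=4, buffers=3):
        return w * h * bytes_per_px * buffers / 2**20

    print(f"5K native (5120x2880): {framebuffer_mb(5120, 2880):.0f} MB")   # ~169 MB
    print(f"scaled 4K (6016x3384): {framebuffer_mb(6016, 3384):.0f} MB")   # ~233 MB

Add window backing stores and compositor surfaces on top of that, and it's easy to see why 8GB machines feel it.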
Thanks for the insight. That's where I see it shining: sharing the GPU between a work desktop/laptop and a personal desktop/laptop for gaming, all while still being able to take advantage of the benefits a slim, light device has when moving around.
Been waiting to hear about an iPhone SE 2.
"Up to 7.8X faster than 16GB"
The UK site is showing an 8th gen CPU
(I jest, I know what you mean. I made one with pretty much just an iPhone although it's definitely not fantastic.)
For a CI or compiling machine, neither is good enough.
Well, of course Apple recommends this. It's like asking a barber if you need a premium haircut. The price is high for the performance you get, but it will do just fine for most people.
A less expensive CI option with better CPU performance would be a quick Hackintosh setup on a powerful Intel NUC, at half the price of a maxed-out Mac mini. This isn't for everyone, but it saves me money, and would save even more if it were built from an existing computer.
The expectation here for an in-house hackintosh CI is that if you can install and set up a Linux machine and configure X11/Wayland, etc., it's pretty much the same with a Hackintosh these days.
For $100, you are in dual port cards territory.
While the RAM technically was replaceable on the previous model, it's a major pain that requires you to basically tear the entire thing down. I was hoping for something as easy as it used to be.
Here's a "professional" video of it being done; note that the person even breaks the cable for the LED indicator in the process.
It's pretty slow on my recent MBA too.
Anyone have suggestions on the best option going forward? Looks like ~$700 to have Apple fix it; less to DIY, but that's a risk in itself. Apple trade-in would give me $310 with the damage or $890 without (so basically the difference is the price of the repair).
The cheapest option is of course to not repair at all and just continue using the laptop with an external monitor.
Definitely want to stick with Apple. I only use this computer for DAW / home music studio.
Anyone been in a similar situation and what did you do?