Mac mini (apple.com)
283 points by tosh 12 days ago | 248 comments





Still no real GPU.

It's the reason I went back to a MacBook. What drew me in was the desktop-class i7. At the time the Mini outperformed all other Macs in benchmarks, and it was noticeably fast during compiles, etc.

Once you connect a 4K screen (or even more screens), an eGPU becomes a must-have. The integrated one just can't do smooth UI rendering. For example, typing in IntelliJ always felt laggy, I guess because it takes so long to redraw. Things like Google Maps in satellite view in a big browser window stutter a lot, all the subtle macOS animations aren't smooth, and so on.

It bothered me so much I had to sell it. I was thinking about keeping it as a media center for the TV, but again, no GPU makes it less than ideal, and it's way too powerful for the job otherwise.


If IntelliJ can't handle realtime rendering of text on modern hardware, perhaps the problem is with the software?

My kid spilled a glass of water on my 2018 MacBook last year (sob). At that time, word was out about the 2019 MacBooks with the new keyboards, so I decided to hold out, took out an old 2012 Mac Mini and have been doing my (Java/Swift) development on it for the past few months. It works so well, I've been in no hurry to pick up a new MacBook. Definitely don't feel animations or anything is laggy.


Indeed it is a software problem. In my experience, JetBrains IDEs perform terribly on macOS as soon as you use a 5K/scaled 4K screen.

A good GPU doesn't help. I've just upgraded to a new 16" MacBook Pro with the maxed-out GPU hoping it could handle CLion on a 4K screen, but it's still worse than anything that I've ever seen on Linux. (Not to mention that this laptop generally suffers from input lag.)

One of the many bug tracker threads about this issue: https://youtrack.jetbrains.com/issue/JBR-526?p=JRE-526

There's a linked ticket about rewriting the renderer to use Metal on macOS, hopefully that will solve the issue once and for all.


IntelliJ tools always felt like molasses to me. Maybe it has something to do with them being written in Java. Most of my dev tools are native and the feeling is totally different.

Well, my two primary tools are Xcode and Eclipse. Xcode is native but Eclipse is much, much faster. I doubt application performance has much to do with the language they're written in.

Not the language but what it compiles to and of course some sorcery that few people possess.

Anyway, I had actually used Eclipse and did not find it particularly responsive. It's been a while, so maybe it is faster now.


Yeah, they have a bunch of open bugs about performance making it unusable; they are even rewriting their renderer to use Metal underneath.

Yes, the problem must be in software, but not that software.

IntelliJ is quite fast for me both on Windows and on Linux, no typing lag.


IntelliJ is unusable on my Mac Mini hardware.

I'm afraid it's partly the problem of macOS and the Java implementation for it. That's what I meant speaking about "other" software.

But Eclipse and all other java software is amazingly fast. It's just IntelliJ that's really slow. I can't see how that can be blamed on "the java implementation"?

But is it a fact that the macOS Java implementation is bad? Do you have benchmarks or something else to support that?


The Mac Mini doesn't need a real GPU in my opinion -- what's needed is a mid-level desktop that accepts a real GPU: $1000 - $1500 with decent specs. But we know they won't move into that space.

You can get a new Mac Mini and a Thunderbolt 3 eGPU for $1500 or less.

In fact, you can get the new entry-level Mac mini plus the Sonnet Breakaway Puck RX 560 [0] for US$1100, now that the eGPU has dropped in price to $300. Whether any given person would be satisfied with that configuration is, of course, open to debate.

[0] https://www.apple.com/shop/product/HMT22ZM/A/sonnet-egfx-bre...


Does anyone make a "case" that holds a Mac mini and eGPU in an attractive desktop-sized package?

While this is probably too niche to make economic sense as a product, this seems like a perfect use case for a vertically-mounted PCIe slot, creating a tall GPU pedestal for the Mini in a similar vein to the G4 Cube.


Thanks, I had no idea eGPUs were a thing!

That's the whole premise of the Mac Mini.

It's the only non-Pro designated Mac that has four Thunderbolt 3 ports. I don't think you'd find this level of I/O in any other machine in this price range.

That incredibly powerful I/O makes for tons of expansion possibilities. Storage, GPU, etc.

(Edited to say "non-Pro designated")


Not all Thunderbolt 3 ports are the same. Good chance that two ports are sharing a single PCIe 3.0 x4 controller. Since PCIe 3.0 is good for ~7.8Gb/sec/lane, you're looking at two 40Gb/sec Thunderbolt ports sharing about 31Gb/sec of bandwidth on the PCIe bus. Okay, probably not that big of a deal really.

The point being that a "Pro" system might have 4 individual Thunderbolt controllers, each getting its own x4 PCIe 3.0 link.

I think that's why the specs say "up to 40Gb/sec" on the page. It sadly doesn't say which controller it has... But I guess my whole point is really moot in practice. Good luck saturating that much bandwidth.
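
A rough back-of-the-envelope check of those figures (Python; the two-ports-per-x4-controller layout is the assumption from the comment above, not something Apple documents):

    # PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding,
    # so usable bandwidth is just under 8 Gb/s per lane.
    pcie3_lane_gbps = 8 * 128 / 130       # ~7.88 Gb/s per lane
    uplink_gbps = 4 * pcie3_lane_gbps     # x4 link: ~31.5 Gb/s

    # Two 40 Gb/s Thunderbolt 3 ports hanging off that single x4 controller
    # would share roughly 31 Gb/s of host bandwidth between them.
    print(round(uplink_gbps, 1))          # 31.5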


Mac mini, iMac Pro, and Macbook Pro all have two Thunderbolt 3 controllers.

The Mac Pro starts with two controllers but can be configured up to six.

The I/O is actually pretty good and provides for a ton of headroom for expansion down the line (I'm thinking primarily storage and GPU for my use-case).

I can see a pretty high powered eGPU saturating the lanes.


4K 60Hz is roughly 8Gbps at 8 bits/channel 4:2:0 color. If you go to 16 bits and 4:4:4, each of those doubles the data rate.
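
A quick sketch of the arithmetic behind those figures (payload bits only; blanking and link overhead push the actual cable rate higher, which is roughly where the 8Gbps number comes from):

    def payload_gbps(w, h, fps, bits_per_channel, samples_per_pixel):
        # samples_per_pixel: 3.0 for 4:4:4, 1.5 for 4:2:0 (chroma shared across 4 pixels)
        return w * h * fps * bits_per_channel * samples_per_pixel / 1e9

    print(payload_gbps(3840, 2160, 60, 8, 1.5))   # ~6.0 Gbps  (8-bit 4:2:0)
    print(payload_gbps(3840, 2160, 60, 16, 3.0))  # ~23.9 Gbps (16-bit 4:4:4)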

Currently shipping Macs with four Thunderbolt 3 ports:

* Mac Pro

* Mac Mini

* iMac Pro

* MacBook Pro 16"

* MacBook Pro 13"


You're totally right, I should have said non-Pro designated Mac.

I must have mixed it up in my head.


Head to https://egpu.io/ then

kinda negates the whole mini part.

I never understand this argument.

If you "want" a mid-level "box with slots", that's going to be bigger anyway.


I don't think that's the point: if you can ship a 16" MacBook Pro with a dedicated GPU, you could do the same with the Mac Mini; why not give me such an option? ("you" stands for "Apple" here, btw)

Desktop-grade CPUs and GPUs have very different thermal characteristics. It's pretty much impossible to have silent + desktop CPU + desktop GPU + very small space.

Yes, which is why it would be good if they had a Mac mini with a laptop-grade CPU and GPU instead.

They do. It also includes a screen, and a keyboard and trackpad. It's called.. a laptop.

Putting the CPU/GPU of the 16" MB Pro in the mini would enhance it greatly. The thermals would be much better than in a laptop, resulting in better performance, yet you wouldn't have to pay for an unused screen/battery. Also, while you can use the MB Pro in clamshell mode, that makes the thermals even worse. So why should one get a laptop if one wants a compact desktop?

Apple's hardware business model is driving margins by bundling unnecessary features in the "trim levels". They don't want you picking and choosing only the parts you need.

Well we are talking about the Mini vs. a laptop, so entirely different devices. There is no trim level of the Mini with a reasonable graphics chip.

MacBooks are not desktops, they are laptops

Still, they have much more surface area per unit of volume, so a better way to dissipate heat.

Minis are compact bricks with much less surface area.


So you're saying that the MacBook shape is more thermally efficient than the MacMini shape? Why don't they do MacMinis with the MacBook shape?

I am not sure what you don't understand. There is no way you can cram a desktop-grade CPU and GPU into such a small space. You could put the laptop line of CPUs + GPUs in there, which are slower and probably shouldn't be run 24/7 (Apple's use case is NAS). Or you just put in a desktop CPU, which already has OK integrated graphics.

Can't you use a laptop GPU with a desktop CPU? Logically I'd say "yes", but I'm not a hardware expert, just asking.

You probably can, but maybe it still just doesn't make sense to add one because of the heat/money cost. The integrated GPUs nowadays are pretty good; you can connect 3x 4K screens to that Mac mini.

So if you need real graphics power, a mobile GPU doesn't help much, and for display output alone, integrated is enough for most people.

In some ways that eGPU approach (if it works) is nice.


It probably doesn't, and in some sense, that's the problem.

Really hoping that the next major redesign of this looks less like an Intel NUC and more like a slightly smaller Mac Pro.

An actual mini tower! What a concept.


A "Mini Pro" is what I've been wanting from Apple forever (in tech years). Just a small tower with room for a graphics card and two or three SSD/NVMe drives. I'd be fine with paying Apple's customary 100+% hardware premium for something like that.

That's exactly what I want too.

Is there a mini PC that would be similar to this now?


You can do all that now with the Mini; it has 4 Thunderbolt ports.

I thought it was pretty clear I was talking about something much different from hanging multiple external devices off the back of a Mini.

Thunderbolt isn't anywhere near built-in PCI Express slots in performance.

I guess it depends on the width you need. Indeed you won't be able to run x16 or x8, but x4 is possible, which is enough for a full-speed NVMe drive or dual 10GbE.

The iMac covers that for most people. eGPUs also let you extend the Mac Mini at the cost of more wires and boxes.

But the Mac desktop gaming ecosystem is fairly anemic, so I doubt most people get much from a better GPU. IMO, a Mac Mini + KVM + $500+ gaming PC is probably the best all-around option.


"Mac Mini doesn't need a real GPU in my opinion ..."

I tend to agree, but then I also see 4x Thunderbolt 3 ports and 1x HDMI 2.0 port and think how nice it would be to drive all four of my monitors from that one tiny system ...

However, the specifications dictate that only 2x 4K monitors can be driven simultaneously (at 60Hz?) ... ?

Which brings up the same old, tired question:

Just who is it that works as an engineer, at Apple, and has such boring, inexact, low-power-user use-cases that they, the actual creators of this kit, are happy with it?

How do you do your jobs? Why don't you need these things?

Remember, there were years where multi-monitors were completely, totally broken - in OSX, for all models - which suggests that nobody inside Apple uses multiple monitors.

That's weird ... who are these people?


Thunderbolt 3 ports are multipurpose: disks, external GPUs, etc. Just because you have 3 of them does not mean you should plug a monitor into each one. And if you really need 4 screens, that eGPU will probably handle it.

MacBooks don't have real GPUs either (I assume you mean MacBook Pros). The default HDRP Unity scene runs at around 30fps on a MacBook Pro; on a 3-year-old Razer with a 1060, it runs at 180fps.

I use my MBP more than my Razer, but I'm disappointed Apple no longer makes the powerhouse laptops they used to.


the 15" macbook pro has a dedicated gpu, the mac mini only has integrated gpu.

The performance depends greatly on the resolution you run the 4K screen at.

If you're using a non-integer scaled resolution (i.e. not exactly 1x or 2x rendering), it's not going to be fantastic. Running at the 'default' 2x, my 2018 runs 2x 24" 4Ks without issue. The problem most people have, I think, is that they buy much larger displays and then want to use a higher rendered resolution, which is where the iGPU will struggle.


> the problem most people have I think is that they buy much larger displays and then want to use a higher rendered resolution

I'm not going to buy a 4k screen with the intention of running it at 1080p. There are plenty of large 1080p screens out there.

If I buy a 4k screen it's because I want to run it at 4k.


> I'm not going to buy a 4k screen with the intention of running it at 1080p.

He is not talking about not running 4k. He is talking about running more than 4k framebuffer scaled down to 4k (the "more space" option in Display control panel).


I'm using a 4K display on a Mac mini, but I've got it set halfway between 'larger text' (1080p x2) and 'more space' (4K with no pixel doubling).

This doesn't mean I'm running it at a lower resolution - behind the scenes, the Mac is actually rendering the display at 6016x3384 and then smoothly scaling it down to the native 4K of 3840x2160.

Because it's having to draw everything at a higher resolution than the actual screen, performance does take a hit, but I find everything still works perfectly smoothly and responsively (just don't try playing games at this resolution on the Mac mini!)
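
For a sense of how much extra work that scaled mode is, here's the pixel math from the numbers above (just a sketch; the exact backing-store size depends on which scaling option you pick):

    backing_store = 6016 * 3384   # internal render target for the scaled mode
    native_panel  = 3840 * 2160   # physical 4K panel it is downsampled to

    # ~2.45x the pixels of native 4K have to be drawn every frame,
    # plus the downscaling pass itself.
    print(backing_store / native_panel)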


Mac Mini user here also: What display are you using?

It's a BenQ EW3270U, 4k @ 60hz, connected via USB-C. It's been working just fine, I'd recommend it.

I have a second monitor which is not high-DPI but 2560x1440, connected with a USB-C => displayport cable. Surprisingly, the mix of high-DPI and normal DPI monitors works really well, with no problems, even when moving windows from one display to the other.


... I think you've missed the point.

It isn't running at 1080p. It's doing "@2x" pixel rendering (commonly called "Retina") - so it uses 4 physical pixels to render one "screen" pixel, giving you much crisper.. everything.


..crisper as in more visibly pixelated? What is the point in that?

It is not more pixelated. It's like scaling up all fonts and widgets by 2x, but still rendering them at 4k. So, the fonts and widgets are retina-sharp. Imagine using 4k as the resolution, but rendering all 14pt fonts as 28pt.

Though I get why you might think it's pixelated from stephenr's description.


Crisper as in everything has 4x antialiasing.

I run my 2 27" 4Ks (off a Hackintosh) at 1080p HiRes. It's for me the best balance of aesthetics and resolution.

Isn't "4K" the resolution??

Macs have the option to run a 4K screen at different "effective DPIs" via scaling. The default ~200% scaling factor gives you a 1920x1080 work area that is twice as sharp. A lot of folks (myself included) choose to draw larger work surfaces (I use a 5120x2880 canvas with a 2560x1440 work area on my 24-inch 4Ks). A 4K screen is high enough DPI that, when scaled, text still looks very nice (nicer than a native-res 1440p display). Doing this (especially on multiple displays) requires a lot of GPU power.

As others have said, it is the physical resolution, which is why the term used in macOS is "Looks like X by Y". The default of "looks like 1920x1080" is straight "@2x" - the rendering is pixel-doubled in both axes, giving a crisper image than a same-size "real" 1920x1080 display.

Yes, but since you can get 4K on everything from 24" to 84", you'll want to choose between more space or bigger fonts depending on the setting. macOS can do 'fractional scaling' when the integer scaling isn't to your preference, but the implementation works by rendering to a much higher resolution internally and then scaling down, which needs a beefy GPU to run without noticeable overhead.

Anything bigger than 24" or so and the blurriness gets bad. The pixel density just isn't quite there.

Depends how close you are to it, though, I would think

I am reading this on 32" 4K monitor (2 of them actually are hooked to a gaming laptop). I do not notice any blurriness.

Are you running a non-default resolution?

I am running those at native 4K resolution. Frankly, I do not get the point of buying a 4K monitor to run it at a lower resolution.

Yep, this definitely deserves a downvote. I guess 4K must be blurry by definition.

Yes, but I guess just like I don't know anyone who runs their retina macbook at the native resolution, you don't need to run a 4K screen at 4K. For instance 1080p gives you perfect 2x2 scaling with no blurriness at all since every pixel perfectly translates into 4. I have no idea why you'd do that, but I can see why that's a valid consideration for some people.

Just chiming in here and endorsing the eGPU add-on to resolve 4k monitor performance for non-standard resolutions / pixel density. Without it things get pretty chunky. With it, smooth as butter.

Are you already running Catalina? I observed a huge UI performance increase with the current Mac Mini on Catalina. The lagginess and stuttering you described bothered me to no end, but it feels (almost) completely fine on Catalina.

Edit: I have a 5k screen connected


> Still no real GPU.

What is a 'real' GPU? Why is this one not 'real'? If it can be used to run a rendering API and it outputs graphics fast enough to use the computer with a realistic number of monitors then it's a real GPU.


Well there's an integrated GPU sure - it does 'technically have a real gpu'.

But there are no options for CUDA applications, and no options for similar applications on the AMD side. Depending on your use case, the iGPU might not be powerful enough for your needs even without ML programs, if you have multiple monitors (and one is 4K).


Intel GPUs generate artefacts in FCPX and Motion 5 under heavier load.

Still got my fingers crossed for a desktop machine that isn’t a Mac Pro and isn’t a Mac Mini.

Would love iMac Pro firepower without being tied to a set display.

I’ve been alive 30 years. A good portion of that was growing up as a PC user, overclocker, teenage hacker, etc. My professional career, however, has been on Unix and macOS, and as much as I try to get back to desktop Linux and open-source OSes, I just find them to be so brittle and hacked together.

I tried to daily drive Linux and just couldn't do it, and I am not a newb to Linux either. I got my feet wet on SuSE (kernel 2.4!), and then saw the light of Debian and Ubuntu and have used those primarily since. Remember when you could order free install CDs for Ubuntu? There were copies of Warty Warthog all over my high school. I have installed Arch by hand, btw. I’ve used Fedora and RHEL and CentOS and Slackware! GNOME and KDE and XFCE. Remember Compiz? It isn’t the desktop environments. It isn’t the distributions. It’s just the unavoidable consequence of an entire ecosystem of desktop Linux being designed by committee.

So I’ve accepted I’ll be on macOS for a long time to come (not unlike DHH - https://m.signalvnoise.com/back-to-windows-after-twenty-year...) and have started to seriously consider building a hackintosh. But when your time is billed hourly and you have a half dozen clients, time is money and the novelty of a Ryzen-powered hate tank hackintosh for my professional work is starting to look less realistic.

So how about a Diet® Pro for those of us who want something that isn’t a laptop in a big AppleTV chassis and isn’t a desktop in a monitor chassis?


Installing and applying security updates on a Linux desktop is fairly automatic and uneventful.

Keeping a hackintosh updated (for even stuff like graphics card drivers) turns out to be a lot of time doing research, scrolling through endless forum posts, and bricking your install quarterly because you "did the wrong thing" (like absent-mindedly accepting a security update notification).

Yes, you can get it running. But macOS is buggy enough as it is, and is decidedly not pleasant if you're relying on it as a daily driver and you just can't fix the Bluetooth mouse disconnecting after sleep unless you reboot, or the USB bridge not enabling after suspend, or the DisplayPorts deciding not to send a signal if there are multiple monitors (fixed in the new GPU driver, but that driver requires the OS upgrade you can't apply due to other show-stopping bugs). You shouldn't be relieved every time you step up to your desk that your computer can turn on.

(FWIW: this was over a year ago, on a machine built with the "best compatibility with hackintosh" parts list. The landscape with T2 is most likely less comfortable.)


>Keeping a hackintosh updated (for even stuff like graphics card drivers) turns out to be a lot of time doing research, scrolling through endless forum posts, and bricking your install quarterly because you "did the wrong thing" (like absent-mindedly accepting a security update notification).

That was my experience with a hackintosh. It works great, but you're always worried about it not working. The "T2" security chip is a good point. It will likely make Hackintoshes a thing of the past.

There are some good machines you can buy with Linux installed. I have a System76; it updates all the time without issue. I feel the extra cost over an "install it yourself" is worth it, especially for a laptop. Dell even sells machines with Ubuntu.


> But macOS is buggy enough as it is

I really do not understand this sentiment at all. Mind you, I am decidedly on Mojave. I am not discounting the experience of others, but macOS has been rock stable for me since about Leopard. I had kernel panics back on Tiger, but that was their first OS for x86 so I can forgive that.


You are fortunate. That being said, I've never had any issues with Linux myself, so why take the risk if all you get in return is a bubbly GUI and some bloatware?

Is there a Linux distribution that fully supports the desktop version of Office 365 (Word, Excel, PowerPoint)? I would love to migrate to Linux, but need Office to work on papers and presentations with the rest of my lab. LibreOffice doesn't cut it. I tried Wine in the past and it was too buggy.

They work pretty well in Wine from what I've heard. I don't have any direct experience running them that way though.

I’m pretty shocked you haven’t moved to either Office 365 or Google Docs/Sheets.

Popular reference managers, e.g., Mendeley, EndNote, don't support them.

Worth remembering: the current non-pro iMac doesn’t have a T2 chip. Non-T2 Macs will be supported by macOS for years to come.

I'm in a similar boat to you and am keeping an eye on WSL2 for Windows. It SEEMS like it could cover my entire dev workflow and have similar performance to MacOS/*nix, but then again I thought that the original WSL was going to do that as well ...

I started down that path. But when you install Windows 10 Pro and your login screen is covered in advertisements and news updates, it leaves a pretty shitty taste in your mouth.

Agreed, shitty, but those aren't hard to disable, and it's one of the first things I do on new installs.

(I only run windows to do cross-platform testing)


For most users who get that nasty aftertaste, it's not so much that you can't remove or disable those ads/unexpected feeds, but that the manufacturer or core team that built it had the mentality to put them there in the first place.

I do not have ads at all and did not have to disable them. This may be because when I install Windows I do it using a local account, not an online account with MS. And of course I answer NOOOOOOO to every question they ask during install.

A year ago I did some contract work building Docker-based software and had to run many containers. What I did was install VMware on my Windows laptop and then create a Linux VM with 16GB RAM and 4 cores. All Docker-based development was then done in that VM, so I did not have to launch a separate VM for each container. All fast and snappy.

What are you currently missing with the current WSL?

filesystem performance is absolutely horrible, which affects lots of workflows.

If you create the files on the Linux side rather than in a shared Windows folder, the performance seems good to me. Before the VS Code WSL plugin, the only way to work was to open files from Windows and use the Linux CLI, but the plugin made it so that you can stream files from the Linux side, which removed the need for any shared drive for me. The perf bonus of switching to a Linux-only workflow was significant; you should give it a try.
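
A tiny, illustrative way to see the difference yourself (run inside the WSL distro; the paths are just examples, adjust them to directories that exist on your machine):

    import pathlib, tempfile, time

    def touch_many(base_dir, n=1000):
        # Create n small files under base_dir and return elapsed seconds.
        base = pathlib.Path(tempfile.mkdtemp(dir=base_dir))
        start = time.time()
        for i in range(n):
            (base / f"f{i}.txt").write_text("x")
        return time.time() - start

    # Cross-OS path (DrvFs mount of the Windows drive) vs. Linux-native path.
    print("windows mount:", touch_many("/mnt/c/Users/Public"))
    print("linux native :", touch_many(pathlib.Path.home()))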

That is only for cross OS FS work, the Linux FS speed is as good as native.

Yes, the /mnt/c situation is bad.


For backend development I am writing native C++ servers that run on Linux, but all the development and debugging is done on Windows using Visual Studio. After I am ready to publish, I check out the source and rebuild it on Linux. Since I use multi-platform libraries, it all works like a charm and I have yet to encounter any platform-related problem. I can still edit/debug the project on Linux (using CodeLite), but the need is very rare. This way I get to either build my hardware any way I like or buy a laptop (the kind I can customize) with the exact configuration I need.

> Still got my fingers crossed for a desktop machine that isn’t a Mac Pro and isn’t a Mac Mini

People have been waiting eons for this, and it seems like you'll just have to keep waiting.

From what I can remember, Apple haven't done anything like this since the G4 Cube, and they've never given indications they'll do it again. Everyone was praying the new Mac Pro would be the magic xMac, but it just isn't.

Personally I don't think there's anything wrong with being tied to the iMac display, given it's one of the best you can get.


The iMac display is great, I am using it right now :). But I don't get why it is still locked at 27". Apple once used to make 30" Cinema Displays. Not only is the iMac overdue for a redesign, it really should have an option for a larger display. Also, it is absolutely offensive that the iMac, as a desktop, is glued together, preventing even the simplest maintenance like fan cleaning.

Is it glued together? The glass is no longer removable with suction cups?

Yes, it is glued together; suction cups are no longer sufficient to remove the display. To service the current-gen iMacs, you have to cut away the glue and afterwards re-glue the machine. My dealer charged me about $300 in labor for an HD replacement because of this.

The iMac Pro price tag is hard to swallow when I already have a 4k display that I really like (HP Z27)

What are your needs?

Because the current Mini serves a lot of Pro needs with the crazy powerful I/O it has.

With eGPUs in a fairly mature state, the Mac Mini can serve pretty powerful video/graphics workloads.


I have a slightly upgraded 2018 Macbook Pro right now that does most of my work really well. It has 32gb of RAM, an upgraded GPU and the i7-8750H chip.

I want more cores, faster cores, and more thermal headroom. A Mac Mini would be a slight performance bump from where I am right now, but the ROI isn't really there.

My workload varies, but I am a freelancer with 3-4 contracts at any given time, which range from WordPress sites, Rails apps, Vue apps, and Python apps up to 'big data' processing pipelines. I usually have a handful of services running for any client (Redis, ES, Postgres, MySQL, etc.), and the Docker/Mac story needs work.

I essentially don't want to wait on I/O ever again. That is another reason I would like a full desktop chassis: I want PCIe slots so that I can outfit them with something like an Optane SSD.

Attempting to spec out something that would perform better than my current MBP:

    3.2GHz 6‑core 8th‑generation Intel Core i7 (Turbo Boost up to 4.6GHz)
    64GB 2666MHz DDR4
    Intel UHD Graphics 630
    1TB SSD storage
    10 Gigabit Ethernet (Nbase-T Ethernet with support for 1Gb, 2.5Gb, 5Gb, and 10Gb Ethernet using RJ‑45 connector)
Same generation chip that I have now. Same core count. Higher boost clock. $2600 is the final price and that is certainly not worth it.

One thing you can consider -> the RAM costs are very high if you upgrade through Apple. You can spec out a Mac Mini with the lowest possible RAM and then do the RAM upgrade yourself to save $600.

The whole process takes about 20mins and there's professional YouTube videos that take you every step of the way.

This is exactly what I did and it was a breeze (I didn't have much prior experience messing with computer internals). It might look complicated, but it's really not.

If you do go this route, I'd recommend OWC RAM since it's guaranteed to work with Macs, and they also sell a little $7 toolkit that has the Torx screwdrivers you need to do the RAM upgrade -> https://eshop.macsales.com/item/OWC/TOOLKITMM18/

Here's a video of the process: https://www.youtube.com/watch?v=qKyv0QP4XPQ


Ryzen would probably not be worth it if your time is money, but if you build a basic Intel Hackintosh with known compatible parts, you could probably do it in a day (assuming first time; much faster if you have experience).

I built a Hackintosh back when Tiger was hot (on an AMD socket 754 chip!) so I have a little familiarity - but I am sure things have changed a lot.

My current test bench is an i7-3770K in a pretty popular Z77 Asus board, so if I find myself an AMD card I could probably prototype it here first.


Such a product would cannibalise their line, so they'll never make it.

They wouldn't need the Mini, and the mid-tier would be too far from the Mac Pro.


> Such product would cannibalise their line, so they'll never make it.

Apple does this quite frequently.


What about a Mac Mini Pro?

I would be happy if they just reused the Mac Pro design with fewer expansion ports. Maybe only two PCIe slots and a consumer-grade CPU. I would buy one even if it was $2999.

They could call it a "Mac".

Why did you leave Windows?

Unix is a better environment for everything but gaming, and I am not a gamer. Everything is unequivocally better on a Mac, again, with the exception of gaming. Despite what you will hear around the HN echo chamber these days, macOS is a very legitimate Unix and a good one at that. People complain when they run into userland issues, but their blame on Apple is misplaced.

If you are a gamer, use Windows. If you are working in a .net shop, use Windows. If you are using special software that demands Windows, use it. For everyone else, macOS is the right way to go.

I still use Windows frequently as a consultant - either in testing or in onsite infrastructure deployments on Windows Server hosts. It sucks to use. You will rarely run into a smooth scenario, and due to the proprietary nature of everything you are stuck with whatever is available in the docs or whatever the support team says, assuming you purchased the right support license.

Their greatest contribution to the world is RDP, which is the only saving grace on Windows as far as I am concerned.


The Mac Minis are still expensive and underwhelming (even when maxed out) when compared with the top-of-the-range NUCs, even ones with AMD GPUs. The Skull and Ghost Canyon NUCs are great for Linux CI or building large projects, or I could even go as far as Hackintoshing a few NUCs, removing the need to buy a Mac Mini for the next 5 years.

Thanks, but I would skip this Mac Mini and go for the latest maxed out NUC instead.


True, but a more charitable take is that the high price goes to subsidize macOS development.

Having just moved to a Windows 10 shop running on cheapest-possible Dell hotdesking workstations, I’m starting to miss macOS and all its command-spacebar glory!


Perhaps you should buy something more in the price range of the mac-mini to make a real comparison rather than the cheapest dell.

I don't think they have much choice in what their employer deems sufficient for getting work done.

And I suspect their employer is right that they can get their job done on a cheap Dell. Excel works well on a cheap Dell!

Have you tried http://keypirinha.com/ ?

Try pressing the windows key (granted, it's not as pretty as Spotlight).

This is honestly the thing I miss most while using macOS. The windows key search always seemed to work better than Spotlight.

I don't know what you like about it? When I use Windows I try to use it to find Control Panels to set up stuff and it does a horrible job at that.

In macOS you'll find Disk Utility by typing "disk utility" into CMD+SPACE, even when you're on a different-language system where it has its own name. I couldn't find anything in Windows even when that was the literal name of the panel. And hunting for formerly easy-to-find settings is one of the things that already drives me crazy in Windows 10.

Also when I type "alert sound" I still find the Sound and Notifications Control Panels even if they're not full matches, but they do deal with alerts.

There's a lot about Windows I like though. It's just not the UX, apart from the Explorer once I exactly set it up as I like it (well it looks the same as Windows 95 mostly).


I pressed "Win" and typed "Control". Control panel was the first and default choice. And my Windows is not even using English locale.

I admit that this search sometimes works strangely, but it adapts to me, so when I'm launching some program often, it remembers that and will always put that program at the top.


I generally have the opposite experience you've described--I can find exactly what I need in settings/Control Panel on Windows, but rarely on macOS. Could be that I'm just used to the names of things in Windows?

Realistically, they both get the job done.


Spotlight is better at understanding multiple languages (accidentally typing the English name on a German system often yields the right result), synonyms, and results inside windows. I think this is mainly due to Apple's accessibility efforts. Windows is much more picky. But most of the time you just want to start an app anyway.

That is entirely possible. I'm in an English bubble.

Hit the windows key and type "disk man" and the disk manager pops up. Maybe you're just used to the OSX names for things?

Search "alert" brings up the audio settings.

I'm not sure what you're doing differently.


It's hard to give exact examples now I'm not behind a Windows machine. But I didn't have a great time finding stuff on a non-English machine even when I typed in the name exactly as it appeared in the top bar when I finally found it and opened it.

In Windows, just press the Windows key and then type the name of the app you want (same as Command-Space in macOS).

One less button press than on a mac.


Only if it works. Which often it doesn't. And by doesn't I don't mean it doesn't do it 100% correctly but I mean it doesn't find a program or file even if you type the whole name completely. It just gives no result at all for some terms.

But people often speculate that macOS development is on the back burner these days. Release to release, they have not seemed to add much of use in a long time. They keep introducing new limitations on apps and deprecating stuff. That has questionable utility if your goal is to get work done.

Of course Windows 10 is also in bad shape.


Have used mac for many years. Never miss spotlight on Linux or Windows. First, you do have that functionality if you need it. Second, the spotlight indexing takes a non-trivial amount of CPU time and is honestly annoying when I need the juice to finish a render or compile code. My guess is you're just suffering from "unfamiliarity" syndrome.

Which NUC has 6 cores and 4 thunderbolt ports with integrated 10GbE?

I have 3 Hades Canyon NUCs (homelab) and also a fully maxed mini. The great thing about the mini is its versatility for expansion via TB3, as well as allowing me to triple-boot macOS, Windows, and Linux with minimal effort (no hackintosh workarounds needed).


"Thanks, but I would skip this Mac Mini and go for the latest maxed out NUC instead."

Neat. Of course, for the actual base of buyers for this, the fact that it runs macOS is a critical requirement. Running an outdated, hacked version is not a viable consideration for most (I'm fairly certain that the "I'll run a Hackintosh" comments outnumber people actually running a Hackintosh by at least 1000:1).

I don't think there has ever been an Apple release without a barrage of "Lame. No wireless. Less space than a Nomad" kind of responses. As if we are all unaware that it's a competitive space with lots of options.

Yes, there are alternatives. Lots of people buy those alternatives. But it is absolutely not an orange-orange situation.

As one fun aside -- I had a Mac Mini a few years ago and loved it. The power utilization of the thing was absolutely bonkers tiny. It just worked 100% of the time. Bought it for like $799. Sold it after having it for, I think, five years for $500. The kind of bizarre ability of Apple devices to hold value is really a financial factor that many don't consider.


Are Hackintosh a practical reality? (genuine question)

Define "practical".

I have used and installed Linux and BSDs for a long time, and yet I had a look at the suggested procedures and I went "nope". There is basically a 50/50 chance that it will work, some features will likely not work at all (particularly cloud-related services), and every little update might doom your machine. Fun project for the hardcore Apple hacker, but imho it would be foolish to base your day-to-day production needs on such a build.


I ran a hackintosh for years. It used the retail OS plus some custom kexts. My last OS was Snow Leopard before I sold it. I got a real MacBook Pro with Lion and hated hated hated it. Hated it so much I started running Linux in a VM again and eventually switched to a Linux desktop again.

I don't know how well the modern macOS versions run as Hackintoshes, but I suspect the community is still strong. The current challenge is probably around all the specialized hardware security and encryption chips on modern Macs, as well as the newer restrictions on signed/unsigned kexts and all the recent kernel/system API lockdowns.


Depends on your goals, I think. I run mine for audio production, exclusively. And in that context, it's been great. I have an older firewire audio interface, multiple midi controllers, and a Nvidia GTX-1060 graphics card (I game on a different W10 drive). I run ethernet, so I can't comment on WIFI. But, the rest works flawlessly.

The caveat here is: I'm still on 10.12.6. Updating, especially in regards to audio apps, is generally done very slowly. So, I'm in no hurry to run the latest and greatest. But, upgrading to this point was pretty painless.


They are, but I would only use one as a second device besides a real Mac. Setup and updates are relatively painless if you choose the right hardware and store additional kexts not on the system OS partition but the EFI boot partition.

The biggest troubles are WiFi and Bluetooth (and therefore Continuity; you need one of a few select cards), graphics (no Nvidia), keyboard and trackpad (for desktops, use used Apple peripherals), as well as sound in some configurations (you might use a Bluetooth speaker if BT works).

It always can happen that you mess up (or need access to a real Mac for downloading the OS in the first place), so have a Backup and wait a few days after system updates appear, read forums about possible showstoppers and necessary precautions. That said, Catalina is still not perfect and troubles may not be caused by the hackintosh but the OS.


>Setup and updates are relatively painless if you choose the right hardware

Agreed. But this is where "I'll pay an extra $200 to not have to worry about that" comes into the equation


I wish I could've bought my desktop tower from Apple for mere $200 extra.

YMMV, but for my tower, after getting over the initial hump of getting macOS installed, it’s less trouble than Linux is.

No matter the distro, under Linux it feels like I’m always fighting weirdness with X11/Wayland/GPU drivers and there’s more rough edges than I can count. It’s death by a thousand cuts. It has massive potential to be great, but it’s just not.

By contrast once your hackintosh is working properly, that’s pretty much it… the fight is over and you can just use your machine. If you set it up right, minor updates are uneventful and the only real maintenance that’s required comes with major OS releases every year or so, but that can be delayed for a long time if you’re not the type to keep pace with OS updates.

That said it is a bit of a technical endeavor, so be prepared for that.


I built a Hackintosh since my old Mini died before the refresh. Glad I did, since the $440 (tax in) components I bought trump every one of the $799+tax base model Mini specs

Depends on what you mean by practical. The software and hardware works (for now, until T2s become required). It allows you to build configurations that Apple does not (and probably will not) ever sell. They are not cost effective if you have a way to turn time into money though. I would not recommend it unless you have lax security & availability requirements and genuinely enjoy low level tinkering with the basic functions of your computer.

In the last couple years, yes. But Apple has been playing the long game and tightening the noose in order to pull the rug on all Hackintoshes one day.

People forget that the original hackintoshes had to work around the hardware security module. Where there's a will there's a way.

They're not worth the money they save.

If you don't mind painful updates, then yes. I've had mine for six years now.

The Mac Minis are still expensive and underwhelming (even when maxed out) when compared with the top of the range NUCs with even AMD GPUs

I've been considering getting a Mac mini, but haven't committed yet. Can you tell me which NUCs run macOS so I can comparison shop?


Maintaining a hackintosh is a drag. You never know if an update is going to break things. You have to jump through a lot of hoops to get things like iMessage working. Moreover, you have to be prepared to use random kexts from the net. You also miss out on many security features of the Mac (T2 and their secure boot mechanism).

If you want to run Linux or Windows, buy a NUC. If you want to run macOS, buy a Mac. Hardware-wise, the Mac Mini is comparatively expensive, but you also fund macOS development and will have a much better experience than any other hardware.

Source: I use NUCs (currently NUC8i5BEH for Linux, plus two other around the house), a MacBook Pro, and have had Mac Minis in the past.


Don't. If you want macOS, just bite the bullet and buy yourself a pretty carefree experience. If you don't care about the OS, obviously get any of the countless affordable PC configurations.

True, if you ignore the difference in terms of UI/UX. For me, macOS is the killer feature. It's hard to put a price on, but still worth buying a Mac Mini over a NUC for. A Hackintosh is not something I would like to waste time on.

I've been looking for Mac Mini alternatives recently, but both the Skull Canyon NUC and the HP Z2 Mini G4 had plenty of reviews complaining about fan noise, plus they require large external PSUs. The Ghost Canyon NUC isn't super cheap either. The Mac Mini really has a lot going for it.

I don't understand Apple's aversion to accessible DIMM slots and off-the-shelf M.2 SSDs.

It used to be that Apple charged a lot for RAM and Storage, but you could save a bundle by bringing your own. For example, my 2012 i7 Mac mini has maxed memory and a 2TB SSD. When Apple began soldering stuff in, they failed to reduce their punitive prices on expansion, which really hurts reusability and resale value. In other words, they are far less green than they could be since they must now be disposed of earlier.


Like any manufacturer, they are trying to bring down the amount of breakage, and removing electrical connectors and mechanical connections is one way to do that. And when you want things to be smaller you don't really have much of a choice anyway.

At the same time, the only reason those slots were there in the first place was so that a manufacturer could build one main board and then offer different configurations by just swapping out some elements. That benefit no longer holds since you can do the same on a single PCB and still offer the same options. The side-effect of users being able to change it after the fact is lost, but when it wasn't a selling point for the manufacturer anyway it doesn't seem to matter to them.

While not ideal for every user, the mass market doesn't really care. They will probably never upgrade a machine in their lifetime and simply 'upgrade' to a different machine instead.


If we're being really honest, it is for profit margins and planned obsolescence.

The electrical and structural design do account for marginal BOM costs but are hardly useful for some evil obsolescence plan. If only because nobody actually 'upgrades' their systems anymore. The mass market hasn't done that for 10+ years.

People often claim a ton of things but then actually don't have any background in the industry (either the engineering side or the high level planning side). Apply Hanlon's Razor instead of making everything that doesn't suit your context automatically be some evil plan.

Of course a company that stands to make a ton of money with a choice between two options that only marginally affect users will choose the ton-of-money option. It's just that in this case there is no ton of money.


I call bullshit on this.

Ask any user if they would prefer an upgradeable device to one that cannot be upgraded, and they will opt for the upgradeable one.

That is why they were forced to bring back the RAM slots after the failure of Mac Mini 2014 (which had soldered RAM).

Apple does this purely to make more money - they have to spend less on parts if everything is soldered, they can charge an obnoxious amount for more memory and it helps with planned obsolescence.

The current Mac mini would sell more if it had a user-upgradeable SSD. But it is obviously more profitable for Apple to offer soldered ones, even if it sells less.


I'll take industry-standard lifetimes on DIMM and M.2 slots if this is the trade-off.

So much naysaying here. The Mac Minis are excellent for certain use cases (like a home server) and they last forever. Great that Apple is updating them regularly.

Intel NUCs are even better for home server use cases. Similar form factor, much cheaper, and you can put your own memory and storage in them. Just built my own last week and it idles at 5W, it's great.

There is no macOS and no Apple service/warranty.

Also, if you try to match the boxes spec by spec, the Minis are usually not that much more.


I really don't understand the fascination with macOS. I have two Macs and tbh it feels like a polished Linux distro that doesn't run that well after a few OS versions (my 2015 MacBook Pro is so slow it is useless now and I've bought a ThinkPad).

Though I will admit that I typically use Brave, Brew, VS Code / Visual Studio, and iTerm. It might just be me not getting any of the real benefit, but I tend to use Windows and Linux as just a shell.


What sort of home server use cases do people need OS X for? The only thing I could think of is running an Xcode build server but that seems quite niche.

It appears that an OS X server can take any compatible printer and expose it to the network as a generic PostScript printer. Clients on the network only need to support PostScript, and the server handles everything device-specific.

I've been tempted to buy a Mac for that alone. If there's any similar options for Linux, I would be very interested.


Why would you want to run macOS on a server (no matter that it is just home server)?

The Intel NUCs are excellent little machines. Some of them support Intel AMT, so you can have them completely headless in some cabinet.


Out of curiosity, how do you measure its idle power?


My Mac Mini home server cost was zero. I retire my old Minis for these functions, I run an IRC server on my 2009 Core 2 Duo Mini and idle power is about the same there.

Well we don't all get our computers off the back of trucks so I'm not sure this is relevant.

They are not lasting forever anymore since the SSD is soldered; one of our two just died a little after a year.

They used to last forever, because you could replace the failing parts that wear out, like the hard drive.

Then they soldered the SSD onto the motherboard.


I wouldn't say "updating them regularly". The last three updates were in 2018, 2014, and 2012. That 4 year stretch without an update had many thinking they would just discontinue the Mac Mini.

They are the best entertainment center machines.... the full Internet experience from your couch using a wireless mouse and keyboard.


I'm sure it isn't, but I have used various Mac minis as an entertainment center since 2008. It's compact and fits nicely into the TV cabinet or dresser.

I've used whatever wireless mouse/keyboard I bought from Best Buy since. I prefer them separate, as I don't type; I just click through my bookmarks to read and watch content.


Only up to 6-core 8th generation CPUs, and it looks like the base 6-core SKU is the i5-8500, which doesn’t have Hyperthreading. Ouch.

It’s such a shame that’s the best Intel can do in that thermal envelope.

AMD is about to ship Ryzen 4000 mobile chips with lower TDP, great integrated graphics, higher clock speeds and more cores. It would have worked fine in the Mac mini. Imagine how good the desktop APUs will be when those come out.


That i5 is a 65W desktop part, not a mobile part. It is faster than the i7 9750H of the 16 inch macbook pro.

I'm highly doubtful most people here have workloads that it wouldn't be able to handle.


>> Imagine how good the desktop APUs will be when those come out.

I'm starting to think there won't be any more desktop APUs. The next logical step would be an 8 core CPU with better GPU than the 3400G. That would be good enough desktop for a huge swath of the market and might kill sales of higher end chips.

I use a 2400G for software development. It's the fastest cpu I've ever owned. It runs a 4k desktop, 4k video, or 4k gaming (old stuff I'm not a "gamer") without even running the fan loudly.

The main thing I want is AV1 decode on the GPU.


That’s a fair assessment and is probably why they’ve launched their high core count CPUs and $400 range GPUs in this order. I’m more than happy with my 2700X but I wish I didn’t jump on the 5700 XT. It’s too loud and hot.

I'd love to see a 4800G chip with 8 cores, 20-30 CUs, and 16/32GB onboard HBM2 unified memory in the $400/600 range (depending on RAM).

>> I'd love to see a 4800G chip with 8 cores, 20-30 CUs, and 16/32GB onboard HBM2 unified memory in the $400/600 range (depending on RAM).

It will more likely have only 8 CUs like the laptop chips but with a higher TDP (65W?) and clock. OTOH probably only $200ish. Still a killer desktop value.


Oh wow that’d be really cool. You could pretty much consider that to be L4 cache.

That's not the best Intel can do. There's 9th generation CPUs which Apple decided not to use.

Yes, they could have gone with the i7-9700 trading 6c/12t for 8c/8t. There's also the question of power, which despite both being marketed as 65W parts do not have the same thermals.

Maybe the i9-9900T would have worked?


$800 for a quad-core i3 with 8GB, 256GB, and the lovely T2 chip.

Business as usual for Apple...

In recent years they became an Intel but without an AMD in sight, they do it because they can.


This was originally posted with an (updated) suffix in the title. The Mac Mini has not really been updated. All they did was adjust the storage tiering. Minimum is now 256G. So this amounts to a price drop.

> Minimum is now 256G. So this amounts to a price drop.

Looks like Apple didn't learn its lesson with the Mac Mini 2014 that had soldered everything and was a FLOP. The later versions reverted to upgradeable RAM again, but soldered SSD.

And that is why these Mac Minis aren't selling as expected - because users want upgradeable computers, and they aren't as much of a fool as Apple believes, ready to pay the insanely high prices Apple charges for the soldered SSDs in these devices.


8th gen intel? You gotta be kidding me with this deprecated hardware. How much is intel paying Apple to subvert AMD?

I'm not sure why this was posted on HN today, but the Mac mini hasn't been updated; it just has slightly different storage configurations. This is the exact same computer that debuted in late 2018.

Looks nice. I wish they would use numbers on these things, though. It's difficult to find support when searching "mac mini" could bring back results from multiple generations of different hardware and software.

They use the year and a model number, but not in the marketing. Everymac is a good resource: https://everymac.com/systems/apple/mac_mini/index-macmini.ht...

Does anyone have experience with the Mac Mini + eGPU combo and can share a bit about it (e.g. as an alternative to a Mac Pro or iMac)?

I've got a maxed out Mac Mini with the 8th Gen hexa core i5, 64 GB ram and 1TB of storage along with a Radeon Rx 580 8GB in a Razer Core X.

I think if you're using it for a specialized application with eGPU support, game development, or gaming in Windows via bootcamp it's a passable solution (albeit with lower performance than a native card).

On the other hand, for something like web development, it's finicky - Chrome won't pick up on the external GPU unless you have an output directly from the Mac Mini to the graphics card, along with a separate output from your Mac Mini to your external monitor (which means a lot of plugging and unplugging cables whenever I boot up the machine).

If I could afford a Mac Pro or had space for an iMac, or if Apple actually made the xMac I'd buy those in a heartbeat.


what is the xMac?

In theory it would be a headless, modular, user-upgradeable mid-range mac - essentially a low-end Mac Pro or prosumer machine targeted towards gamers, enthusiast developers, DIY artists, etc.

I have a 2015 15" MacBook Pro with an eGPU containing a Vega 64. I haven't really had any issues with it other than some OS hacks to get it working with TB2 instead of the fully supported TB3.

I use a 2018 Mac mini with an OWC Helios FX enclosure with a stock AMD Vega 56 GPU. My workload is some light gaming (Mac only no windows/boot camp) and 3x4K displays in scaled modes for web/software dev.

The mini itself is compact, fast, and generally great. Handles VM based dev workloads well. It’s small and quiet (not always silent though).

I used the machine with just the built in integrated GPU for a few months. For gaming it was bad. For lots of 4K screens it was... doable, in the sense that you could actually plug them all in, but UI effects would lag and it felt weirdly disjointed. Some tasks would be very fast then hit a GPU related thing (scrolling, OS animations, etc) and lag, stutter, making the system feel slower than it was.

Got the eGPU and it was 100% plug and play. Works beautifully, completely fixes all the UI issues, games generally run great (as good or better than my previous hackintosh with the same GPU in a direct PCI slot)

I’ve run into a few very small issues. USB-C connector tolerances are not always tight enough to reliably move equipment while it is all on, so sometimes if you grab the mini and drag it around (maybe to position it to plug in another cable) the GPU can disconnect. This generally falls back “gracefully” to the internal GPU but can be a bit annoying. You also have to think a bit more about heat management in your space. The GPU and mini all get hot when under load, you can’t stuff them in an unventilated hole. Overheat behavior is generally that the GPU shuts off and needs to be power cycled.

Compared to an iMac... The iMac is probably a better standalone deal and experience. However, in my case I have many machines and many displays. Mini+GPU lets me share generic displays across multiple machines in a way that an iMac simply can’t. Especially if you have existing windows gaming hardware where you can “recycle” the previous gen GPU in the Mac it can be way more cost effective. The GPU I am using is more powerful than you can get in any mac other than the Mac Pro.

Vs the Mac Pro... well, the Mac Pro is better, but costs 4 to a zillion times as much. Kind of different ballparks. If you want a headless desktop and don’t need the Mac Pro crazy, a mini+GPU is a good option.

As a budget option? Mini+enclosure+GPU isn’t particularly cheap. My setup was ~2k without displays and I have a fairly simple spec setup (512GB, 16GB ram, slowest 6 core).

Where it shines is in modularity with other equipment. I can share the eGPU with laptops, share the GPUs with other windows machines, and share displays with other machines, in that environment it is cheaper and more flexible.


> For lots of 4K screens it was... doable, in the sense that you could actually plug them all in, but UI effects would lag and it felt weirdly disjointed.

This is probably because the integrated GPU is dipping into your system RAM to drive those displays. I think people who've upgraded to 32GB of system RAM have far less issues running the stock GPU and 4k/5k screens.

I myself am using a stock GPU with a 2018 Mac Mini and an LG 5k Ultrafine + another 2k screen with no issues at all. Amazing machine.
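
If you want to see this for yourself, Metal will report each GPU in the system and the amount of memory it is expected to work with; on an integrated GPU that working set comes straight out of system RAM. A minimal Swift sketch (the labels and formatting are just illustrative):

    import Metal

    // List every Metal-capable GPU (integrated, discrete, eGPU) and the memory
    // Metal recommends as its working set. For an integrated GPU this is shared
    // system RAM, which is why several 4K displays eat into what's left for apps.
    for device in MTLCopyAllDevices() {
        let gib = Double(device.recommendedMaxWorkingSetSize) / Double(1 << 30)
        let kind = device.isLowPower ? "integrated" : (device.isRemovable ? "eGPU" : "discrete")
        print("\(device.name) [\(kind)] ~\(String(format: "%.1f", gib)) GiB working set")
    }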


> Where it shines is in modularity with other equipment

Thanks for the insight. That's where I see it shining: sharing the GPU between a work desktop/laptop and a personal desktop/laptop for gaming, all while still getting the benefits of a slim and light device when moving around.


great overview, thanks!

Is Apple announcing new products at present?

Been waiting to hear about an iPhone SE 2.


iPhones are always announced in September

The original SE was announced March 21st, 2016.

Looks like both RAM and SSD are upgradable? I thought they had gone to hardwired everything in the Mini.

RAM yes, SSD no

They're using a custom T2 chip, which is known to control flash storage, so I'm not sure about the SSD.

What the heck does this mean?

"Up to 7.8X faster than 16GB"


It means that some typical processes are run entirely in RAM rather than having to swap to disk. Specifically, the tests they used were common image manipulations (rotations and so forth) in Photoshop.

LOL at the row of 10 of them, about 2/3 down the page. Those things would catch fire in about 5 seconds. You'd have to be blowing high-pressure, ice-cold air at the intake side to even hope to keep them running at that density.

I'm using the 2018 Mac mini with 32GB RAM, the i7, and a 4k monitor; it's great, however compiling Go's Kubernetes client still stresses its CPU. What are the main changes here?

The UK site is showing an 8th gen CPU


It's just a bump in the default storage at each "level".

Any music production geeks have a recommendation on whether a relatively maxed-out new Mac mini could be a worthy DAW machine? Keep in mind I need macOS to use Logic, and don't want the hassle of a Hackintosh.

More than easily. Fantastic albums have been made with way less CPU/memory grunt than these new Mac minis have.

Some of them without any CPU horsepower at all!

(I jest, I know what you mean. I made one with pretty much just an iPhone although it's definitely not fantastic.)


Steve Lacy recorded his entire debut album on his iPhone into Garageband, and it's definitely fantastic.

It should be more than enough for your needs. If it were meant to be a gaming machine with macOS, then a 16-inch MacBook Pro + AMD GPU would also be fine.

For a CI or compiling machine, neither is good enough.


It's funny that you mention this; Apple recommends Mac minis for Xcode build farms right on their page! What could be better? A Mac Pro? An iMac Pro would clearly be a waste given the cost of the screen (although the CI for a Windows app I worked on in the past required a monitor to be plugged in).

> It's funny that you mention this; Apple recommends Mac minis for Xcode build farms right on their page! What could be better?

Well, of course Apple recommends this; it's like asking a barber if you need a premium haircut. The price is high and the performance relatively poor, but it will do just fine for most people.

A less expensive CI option with better CPU performance would be a quick Hackintosh setup on a powerful Intel NUC, for half the price of a maxed-out Mac mini. This isn't for everyone, but it saves me money, and even more if it's built from an existing computer.

The expectation for an in-house Hackintosh CI is that if you can install and set up a Linux machine and configure X11/Wayland, etc., setting up a Hackintosh is pretty much the same amount of work these days.


It's really convenient that I was able to use my scroll wheel to adjust the text opacity! Sometimes I prefer a 50% fade just to add a little extra challenge to my read.

Considering how Apple prices its products, $100 for 10 Gigabit Ethernet seems very cheap. Actually, it is cheaper than most PCIe cards.

Yes, I saw that too. It's using an AQC107 (because it supports 2.5 and 5 Gbit), and that card sits pretty steadily at 100 bucks (https://camelcamelcamel.com/Gigabit-Ethernet-Express-Network...), so Apple only asking for the same price is indeed surprising.

Not really - you can get a single-port Intel X520-based PCIe card for $40, or bundled with a 3m DAC cable for $60.

For $100, you are in dual-port card territory.


ECC would be cool for using these for remote builds. Don't know if that's ever going to happen.

Absolutely, ECC would be great. However, I'm not sure Apple is going to offer ECC memory here, since they already do with the Mac Pro and they might want people to just buy a Mac Pro instead of a Mac mini.

Curious, why does ECC matter for remote builds? Any explanation is appreciated!

If there is a bit flipped in your build, you then propagate the bad bit to potentially millions of machines. Bit flips in non-ECC memory are common enough that this is a concern in practice.
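
ECC catches the flip at the memory level; short of that, about all you can do is detect corruption after the fact, e.g. by hashing each artifact on the build machine and re-checking the hash before anything is distributed. A rough Swift sketch of that idea (the path is made up, and it obviously only catches flips that happen after the first hash):

    import Foundation
    import CryptoKit

    // Hash a build artifact so a later re-check can spot silent corruption
    // before the file is published. This only detects a bad bit; ECC is what
    // prevents it from getting into the artifact in the first place.
    func sha256Hex(of url: URL) throws -> String {
        let data = try Data(contentsOf: url)
        return SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
    }

    let artifact = URL(fileURLWithPath: "/builds/MyApp.ipa")  // hypothetical path
    let recorded = try! sha256Hex(of: artifact)               // store next to the build
    // ...later, on the machine that distributes the artifact:
    let current = try! sha256Hex(of: artifact)
    print(current == recorded ? "artifact intact" : "possible corruption")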

Intel only allows ECC with Xeons, meaning a big price jump if nothing else.

This used to be a good-value Mac model years ago, but nowadays, if you configure it well, the pricing is on the kooky side for what you get. IMO they should consider discontinuing it and instead introducing something like a Mac Pro mini with a real GPU at this point.

Found this concept for a modular Mac mini "Pro"; the technology is not there yet but could be developed in the future:

https://www.youtube.com/watch?v=xuZlMROiSXQ


I'm still hoping, against all odds, for a Mac mini Pro or a Mac Pro mini. I don't need a Xeon, but I don't want a thermally constrained chip either.

I hope Wikipedia just hasn't been updated yet... the specs are the same as the 2018 model except for the storage! No reason to buy the new one.

So is the RAM upgradeable in this? Only offering soldered RAM/SSD is a reason not to buy these...

I assume it still is. The only thing that changed here was a bump in the default storage.

> still is.

While the RAM technically was replaceable on the previous model, it's a major pain that requires you to basically tear the entire thing down. I was hoping for something as easy as it used to be.

Here's a "professional" video of it being done; note the person even breaks the cable for the LED indicator in the process:

https://www.youtube.com/watch?v=gQq4hLKv1Cc&feature=emb_logo


Having never taken a Mac apart before, it took me less than half an hour to upgrade mine, and I had zero issues. It was definitely more work than upgrading a normal computer, but I think it's worth doing if you're a student or otherwise unable or unwilling to pay Apple's exorbitant RAM prices.

Apple: they have a Pro phone without a 3.5mm headphone jack, and a Pro iPad too, but they're quick to highlight the 3.5mm headphone jack on the new Mac mini. The new Mac mini looks like a wonderful machine though; glad they're doing the long-needed upgrades.

You need something more powerful than this Mac Mini just to be able to scroll that webpage.

This is not true - even a 10-year-old MBA handles it fine. Try disabling your extensions – especially any ad blockers that do more than simple domain blocking.

Well, that last statement is ridiculous: expecting people to toggle an ad blocker for web page performance...

It's pretty slow on my recent MBA too.


Read the whole sentence: ad blockers that do simple host/path filtering are fast, but there are many that inject huge amounts of code into the target pages, with unknown impact. There were years when a big fraction of the people complaining about Firefox performance were ABP users.

That couldn't be more wrong.

I bet it could be

I broke my 2016 15" MacBook Pro display while attempting to replace the MacBook logo bezel (I know, stupid of me...).

Anyone have suggestions on the best option going forward? It looks like ~$700 to have Apple fix it, less to DIY (but that's a risk in itself); an Apple trade-in would give me $310 with the damage or $890 without (so the difference is basically the price of the repair).

The cheapest option is of course to not repair at all and just continue using the laptop with an external monitor.

Definitely want to stick with Apple. I only use this computer for DAW / home music studio.

Anyone been in a similar situation and what did you do?



