Completely Silent Computer (tp69.wordpress.com)
1087 points by signa11 61 days ago | 489 comments

Am I the only one who thinks of smartphones and tablets as silent computers?

Granted, they can't do what this person's high-spec workstation can do, but they handle most of the computing tasks that most people used noisy, fan-cooled computers with clacking disks for, and in many cases do those tasks better.

And unless I'm just losing my hearing, my smartphone is completely silent as long as I don't accidentally press the Golem Invoker, er, Siri button.

There are plenty of silent/fanless Coffee Lake based industrial NUCs, too.


I was kind of excited about this build but then I saw your comment and remembered that was a much better option.

4-6 months ago I built a new workstation for work. I had used one of those Corsair closed-loop water coolers with the prior build, so I set one up on this guy. A month or so later my workstation was running really sluggishly, and I realized it was drastically throttling the CPU because of heat. I installed some software to spin up the fans on the cooler to keep it under 100C, but now it gets kind of loud whenever I run anything CPU-heavy.

Now, this is a pretty heavy duty workstation, 64GB of RAM and 3 displays. But, if I were doing a new machine for home to be quiet, I think it'd be a NUC. Then I'll put the box that has the 6 drive ZFS array in a closet and call it good.
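(Aside on the throttling story above: on Linux you can confirm heat is the culprit by reading the kernel's thermal zones out of sysfs. A minimal sketch; which zone corresponds to the CPU package varies by machine, and the 95 C cutoff here is just an illustrative number, not an official threshold.)

```python
# Read the kernel's thermal zones from sysfs and print each one in
# degrees C. sysfs reports temperatures in millidegrees.
from pathlib import Path

def read_zones(base="/sys/class/thermal"):
    zones = []
    for zone in sorted(Path(base).glob("thermal_zone*")):
        try:
            kind = (zone / "type").read_text().strip()
            millic = int((zone / "temp").read_text().strip())
        except (OSError, ValueError):
            continue  # skip zones we can't read
        zones.append((zone.name, kind, millic / 1000.0))
    return zones

for name, kind, celsius in read_zones():
    flag = "  <-- throttling territory" if celsius >= 95 else ""
    print(f"{name} ({kind}): {celsius:.1f} C{flag}")
```

Run it while the machine is sluggish; if the package zone is pinned near 100 C, the cooler (or its pump) is the problem.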

If you want something slightly more modifiable, Compulab makes passively-cooled, screwdriver-openable boxes at basically every level where it makes sense.

Their highest end is available with ECC ram and a discrete graphics card: https://fit-iot.com/web/product/airtop2-build-to-order/

And the lowest end fitlet2 can be configured with an atom in a reasonable configuration for around $300 (it's $130-$200 for case/cpu/motherboard depending on which CPU you configure).

I bet the water circulation is broken. Check the pump. You may have some air in the pipes; turn your PC upside down and back (no joke).

I had looked at the NUCs a few months ago, and just remembered how hard it is in their product page to find something that will support 3 monitors. Looks like only the highest end models will do. I'd be tempted to build my own again, especially for my home office where I can put it in a closet next to my desk. The NUC product page lets you select requirements and then seems to give you a list of products that don't match.

Gigabyte also has their BRIX products, which are similar to the NUC.

I was at a hotel this weekend and at check-in they had a monitor with a "ThinkVantage" slotted in the back of the monitor; that might be a nice setup.

Ha, the entire team at my previous job were using NUCs. Nifty little buggers. In the summer it would heat up quite a lot (not yet to the point of damage I suppose) and I couldn't even turn it off without burning myself if the PC froze.

Eventually I just bought a cheap USB desktop fan and ran it facing the NUC.

I'm using a NUC for a quiet home PC that I use mostly for browsing but also some light web development. Although it's mostly silent, I haven't been too impressed with its performance. It was one of the top-of-the-line models in 2015, an i7-5557U. I was surprised to find it's quite a bit slower than my 2013 MacBook Pro 2.4 GHz i5. On the Geekbench profile the MBP got almost 3x the single-core score of the NUC and 1.7x for multi-core. Real-world performance of the older MacBook is noticeably faster.

It has made me think that instead of getting a NUC, for a quiet desktop system I should have just gotten a second used MBP and ran it permanently docked in clamshell mode with the monitors and keyboard attached. (with the added benefit of being able to go portable when I want to)

Neither the NUC nor my MBP is completely silent, but for my purposes I find that I seldom tax either of them to the point where the fans become audible enough to be annoying. Still, the difference in performance between them is apparent in things like iteration time on web development and IDE responsiveness.

Without ECC, that isn't a very industrial PC.

FWIW, most industrial PCs I've seen do not use ECC. In many cases they use CPUs that don't support it anyway.

My iPhone 7 Plus has some crazy coil whine. In a quiet room you can hear it. I can hear incoming notifications being processed before the alert goes off!

Is this only when it’s plugged in? Could be a bad cable or charger. I had an iPhone 5 that had a super-touchy digitizer when plugged in, and I finally narrowed it down to cheap third-party cables. Otherwise I think I would be doing a sit-in at the nearest Apple Store until they replaced it; that would drive me nuts.

The transceiver powering up is actually noisy. I had a Nokia 2100 series in the late 90s that generated enough emi to distort a CRT if it was sitting next to it.

You could hear the whine from across the room a few seconds before a call would ring through.

Heh. Back in the 90’s in college I would keep my phone on top of the monitor I was sitting at exactly so I would see the monitor juke just before the phone connected.

People always asked how I managed to answer the phone so fast. Electromagnetic Supplementary Perception, of course.

Heh - same sort of timeframe - probably a Nokia 8210 though - I could reliably have my Apple hockey puck mouse "crash" if my phone got an incoming call while it was sitting on my desk in a loop of the usb cable for the mouse. It'd just stop working, and need unplugging/replugging to get it working again.

It was clearly electromagnetically "noisy", but I don't recall ever having heard any of my phones make any unexpected audio noise... (My old-and-abused rock-concert- and motorcycle-weary ears probably can't get up as high as inverter whine any more, though...)

We had four computers at a LAN party that didn't have their shielding installed. Whenever a phone call came in, all four computers got bluescreens.

Pretty much all cellphones would do that to CRTs. They would also go directly into the audio circuits of cheap amplifiers, to the point where you could "hear" a text or call incoming before the phone made any kind of notification.

You know, that reminds me of how we don't really hear speakers making odd noises when there's an incoming call anymore. Probably because phones operate on different frequencies nowadays.

I presume transmit power has been lowered significantly as coverage has improved, too; your cell always transmits at the lowest level that gives it a reliable connection, to preserve battery life. This should reduce interference considerably.

Also, GSM phones used TDMA (keying the transmitter on and off to occupy one of -hm- eight, I believe - time slots on a given channel.)

This is practically asking for EMC issues.

LTE, on the other hand, transmits continuously (I believe - I do not work in RF engineering anymore, but try to read up on new tech every now and then :) - much less interference-causing than the constant on/off of TDMA.

I thought LTE worked on a timeslice schedule as well. I remember hearing that was one of the problems with carrier plans to start running LTE on unlicensed spectrum, because it doesn't play nice with listen-before-talk wifi.

I still hear speakers make noises when there is an incoming call, I assume it varies by country.

In college I bought one of those antennas that lights up whenever I got a call or a text. Didn't those lights work on the same principle?

I'd say a good half of the smartphones I've owned over the years have had audible something, even when not plugged in and supposedly silent.

Some were barely noticeable, while one in particular (a Droid Turbo) was so loud I could hear it getting ready to receive a call from another room. This was regardless of whether they were plugged in or not, although charger whine was its own separate issue.

Thankfully it does seem to be getting better over time- my current S8 is, as far as I can tell, genuinely silent.

This happened on TDMA systems like T-Mobile's and AT&T's. (Verizon & Sprint were CDMA.)

There's a TDMA modulation frequency at 217Hz and this interferes with all sorts of nearby audio devices. CDMA and WCDMA phones have a much broader interference spectrum, which is why you don't hear it much anymore.
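For the curious, the 217 Hz figure drops straight out of GSM's frame timing. A back-of-the-envelope check (the timeslot duration is taken from the GSM spec, not from the thread):

```python
# GSM keys its transmitter once per TDMA frame: 8 timeslots of
# 15000/26 microseconds each (the exact value in the GSM spec).
TIMESLOT_US = 15000 / 26      # ~576.9 us per timeslot
SLOTS_PER_FRAME = 8

frame_period_s = SLOTS_PER_FRAME * TIMESLOT_US / 1e6
burst_rate_hz = 1 / frame_period_s  # how often the transmitter keys on

print(round(burst_rate_hz, 1))  # -> 216.7, the audible ~217 Hz buzz
```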

No, this is all the time. It is only quiet when the phone is idle.

I considered returning it, but I find it charming. I miss the days when you could tell exactly what your PC was doing by all the sounds it was making, and I find dead silent electronics to be elegant but a little sad.

It could also be the earpiece/speaker turning on and having static noise right before the notification sound is made.

The easiest way to experience this is to plug in headphones and hear the clicking before/after a sound is made.

The sound is loudest if I hold it up to my ear right behind where the SoC is. It generates a unique pattern of sound based on activity, such as running your finger over the touch screen. It's not just when sound events are about to play.

The MacBook is (at least in theory) noiseless as well. It has an SSD and no fans.

A few Chromebooks fit this description as well, at least they used to.

My Samsung Chromebook from 2012 has an ARM processor, solid state storage, and no fans. It is pretty slow by today's standards though.

I think several companies make cases for the Intel NUC boards that radiate the heat away and have no fans, too.

Many of them are still fanless: https://chromebookdb.com/search.php?fanless

My Samsung Chromebook 3 gets a touch warm but never uncomfortably so like my 2012 Retina MacBook, which lets you really feel it when your code is inefficient. (Granted, the Chromebook is a lot less powerful)

Funnily enough, my MacBook Air 2013 produces a buzzing sound on SSD access. It's barely audible, but it's there.

My MacBook Air 2013 is routinely the loudest machine in the room when it's compiling or similar.

Does it have fans? If not, what kind of noise is this ?

MBAs have fans, the 12" MacBook doesn't.

Possibly inductors vibrating.

"Coil whine".

No, the sound comes mostly from the fans, which when at their maximum start to get loud.

Oh... it’s not just me then. Every couple of months I check my 2015 MacBook Pro system info because I’m utterly convinced that it has a hard drive, because of the SSD noise. It’s quite frustrating, actually.

Imagine being a kid in the 80s or 90s at school and hearing the distinctive 20k tone of a CRT Television humming and wondering if you were going to be watching TV in one of your classes that day. It was like a dog whistle for kids.

If you're in the US or anywhere else with NTSC, the horizontal scan rate (and thus the whine of the flyback transformer) is 15.75 kHz :)

525 lines / 2 for interlacing * 60 fields per second = 15750

times 1000/1001 ever since color was introduced, so about 15734 Hz
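The arithmetic above, spelled out (nothing here beyond the numbers already in the thread):

```python
# NTSC horizontal scan rate: 525 lines per frame, two interlaced
# fields per frame, 60 fields per second nominal.
mono_rate = 525 / 2 * 60              # original black-and-white rate
color_rate = mono_rate * 1000 / 1001  # slowed slightly for color

print(mono_rate)          # -> 15750.0
print(round(color_rate))  # -> 15734
```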

For me that was a thing until the late 2000's.

They were still using CRT TVs in 2012 when I finished high school. I wouldn't be surprised if there were still plenty of schools with CRT TVs and VCRs for educational material.

I still have a 19" CRT[0] as 2nd monitor at work because... why not?? It still works most of the time and supports a decent resolution (1400x1050 - 1600x1200 flickers). And it's a nice nostalgic conversation piece :)

[0] https://www.cnet.com/products/compaq-s910-crt-monitor-19-ser...

Back around 2001 I took my desktop build (Celeron 300A overclocked to 450MHz), installed a giant passive heat sink on the CPU and PSU, put in an 8MB IDE flash drive, and network booted off a server in my laundry room. I thought I was finally noiseless. But the end result was worse. I had coil whine out of my power supply any time CPU usage ramped up. If I was in the same room, I could tell whenever my email pinged the server or a cron job ran. It was both instructional (why is my CPU ramping when I'm not even logged in?) and really annoying.

I had one of those, I think it may have overclocked to 550MHz, I can't remember for sure. The cooling fan was audible outside of my house!

I’m routinely annoyed by what I am reasonably sure is the sound of my MacBook Pro’s heat pipes and other warm components expanding and contracting, making creaking noises shortly after starting it up on a winter day. Then there’s the gentle gaseous hissing I would swear is the heat pipes condensing and evaporating (based on workload and laptop temperature concurrent with the noise) if I could think of a practical reason for them to be audible.

No, it's not just you. I can hear the SSDs in two of my laptops.

So does my Kingston SSD. And the switching of my switching power supply sometimes (rarely) is the loudest thing in my desktop pc.

Can confirm the 12" MacBook is totally noiseless. It's a neat little thing.

The Dell XPS 13 should also be noiseless when the fan is not on, but many models also suffer from coil whine.

I find coil whine a worse background sound than the lower frequency fan hum.

When I go to editfight.com on my rMBP the fans go crazy and it gets super hot. When I visit it on my iPhone it stays the same temperature and is silent. Kind of an extreme example but the principle is the same. Phones are better for sites like this.

The rMBP is very different from the MacBook.

Oh I didn't know that. I thought Apple recently shifted the MacBook so that it was more "pro" and the MacBook Pro so that it was less "pro", making them a lot closer to each other, almost identical. I thought I remembered a lot of criticism over that move too, here on HN.

I bought an Asus "Zenbook" UX305 for this reason. It uses an Intel Core M processor, which idles around 800 MHz but turbo boosts to around 2 GHz; I believe the line has been discontinued.

I was worried about performance, but it has been very acceptable. It depends what you need it for, but I can run 2 monitors, a Linux VM and atom all while streaming HD video. Or I can do light web browsing for 10 hours on battery[1]. I love it.

On the rare occasions I need more power, I spin up a spot instance.

[1] If you use Linux on a laptop, install "tlp". It optimizes battery life without a noticeable reduction of performance.

MacBook Air and Pro both have fans. I don’t know the entire Apple line of all time by heart, so there might have been another MacBook that had no fans.

Its the super thin 12 inch MacBook they currently sell, which is just called "MacBook".

My bad. Thanks for the info!

The Pro better not go fanless! Oh my CPU melted while compiling and running a few VMs.

The Macbook? My Macbook's fans are incredibly noisy. I've already replaced them once, hoping that would fix the issue, but it didn't. Macbook fans are just noisy. At least the 2011 unibody ones.

It sounds like you’re talking about a MacBook Air as a generic MacBook, whereas in fact the MacBook is its own distinct line of computer, debuted in 2015. It indeed does not contain fans.

Macbook debuted in 2015? Macbooks were first released in 2006. Mine is a Macbook Pro.

Obviously newer models are different from older models, and Air models probably don't have the fans that Pro models do, but the claim that Macbooks don't have fans is a bit too broad to be true.

You don't understand. They mean the laptop literally called just the Macbook. It's a 12 inch fanless laptop.


This is why these 'simple' naming schemes are confusing.

The Macbook didn't exist in 2011. You're thinking of the Macbook Air or Macbook Pro. The Macbook with no fans was released in mid-2015.

First line of the pertinent Wikipedia article: "The MacBook is a brand of notebook computers manufactured by Apple Inc. from May 2006 to February 2012, and relaunched in 2015."

In Jobs' 2x2 matrix, the portable half was initially populated by iBook and PowerBook, later by MacBook and MacBook Pro.

And yet, in a post referencing a fan-less "Macbook" it's almost certain that the "Macbook" in question is the only fan-less laptop Apple produces, which is coincidentally called simply "Macbook."

It's unfortunate that Apple has confusing brand names, but the fact remains that the Macbook indeed has no fans so the original comment who hears fan noise is obviously using a different model of laptop.

Of course there are Macbooks without fans, but the claim that "the macbook has no fans" is a bit too broad and generic to be true. Macbook Pros are also called Macbooks. Older Macbooks are still Macbooks (they seem to be pretty durable).

So if you say that recent Macbooks have no fans, that may well be true. But it's not true for all Macbooks.

How would you prefer we refer to this specific product [0] in the plural form if not “MacBooks”? Please note, once again, that this is not the same product as the MacBook Pro [1] or MacBook Air [2].

[0] https://www.apple.com/macbook/

[1] https://www.apple.com/macbook-pro/

[2] https://www.apple.com/macbook-air/

Am I the only one who wonders how anyone can think of comparing a phone to a high-end desktop system and claim they fulfill the same need?

I mean these days having a powerful computer on my desk is less of a benefit - that can be a machine in the closet or sitting on a rack. The machine on my desk just needs to drive a couple displays and run a browser.

I mean it depends on what you do - but for many people it's a realistic solution.

The iPhone X is getting close to MBP performance, and it's just a matter of time before it surpasses it.

Mere physics says it won’t. It’s always possible to pack more performance in a larger package, even if it’s just because you can more easily dissipate heat on a larger surface. iPhones are amazingly powerful and might just be sufficiently powerful for everyday computing soon or even right now, but they’ll never surpass anything that can accommodate a larger die.

Physics doesn't drive CPU development, it just sets the absolute limits.

Apple have put faster chips in their smartphones than in their laptops.


Geekbench scores are interesting, but they don't necessarily translate straight into real-world performance. Intel CPUs have a much richer instruction set, for example. Peak performance vs. sustained performance is another issue. GPU performance, disk size and speed, available RAM, battery life at a certain usage, etc., are other performance factors. Apple could decide to put faster CPUs into its MacBooks at the cost of less battery life.

This is all alluded to in the article and the macrumors post that the article is based upon, here are some quotes:

> "Sure, that doesn’t mean the A11 Bionic can do all the things a desktop CPU does."

> "Though the iPhone X and the iPhone 8 offer impressive Geekbench scores, how that translates to real world performance remains to be seen."

There's no question that the iPhone chips deliver amazing performance, but there's a reason people still lug Macbooks around.

It does seem like something is missing in the comparison. High end x86 CPUs draw something like 30W idle, which would drain an iPhone X's battery in minutes. Do Apple/Arm really have some magic technology that makes their CPUs orders of magnitude more power efficient?

Geekbench always seems like an odd benchmark - the variability between runs alone is kind of odd. If I could run a compiler on an iPhone, for example, would I really see similar performance to my MBP?

The CPUs in MacBooks are mobile CPUs that certainly don’t draw 30W idle. You’d drain the battery faster than you can charge it :)

One thing to note, x86 is known for being spectacularly inefficient for mobile workloads. It's not magic that makes an ARM CPU more efficient, it's just different design considerations. For an example of this, check out the Intel Atom line of processors[0] which were mostly x86 processors but designed to be mobile and power sipping. Whether they were successful at that, or in terms of performance, I'm not sure. But they get down to single digit TDP, which is how many watts of power you can expect one to use while under load.


It's not about physics but about economics. Improvements go where revenue goes. For consumers that outcome is known.

Phones have a real physical advantage in lower latency connection to RAM. You can have more processing power in larger form factors, but it's not a net win for all workloads.

I don't think you can trust those Geekbench results for real-world performance. For starters, they largely ignore the much richer set of high-throughput opcodes on desktop PCs.

It also doesn't pass the smell test. Even Atom CPUs are preferred over high-end ARM for netbooks. But a Xeon is way more powerful than any Atom.

I think the Atoms are used there more for compatibility than anything - honestly, in terms of performance I think a new iPad Pro will smoke most netbooks.

I'm sometimes nostalgic for clacking disks. It's sometimes useful to know when the machine is swapping because some process has gone awry and used up all the memory...

Same, some feedback that something is happening is great. This is what annoys me the most about MacBooks: they give zero feedback when they're off. It's always a bit of a sphincter-squeezing moment on Monday mornings when I open the MacBook; sometimes it ran out of power and shut down completely, but there's no indication whether it's on or off or working or whatever. Black screen, and you kind of have to push the button for an arbitrary amount of time and wait an arbitrary amount of time until there's any feedback.

Whereas on a normal computer, a light goes on and the fans start spinning, not because it's useful but just to indicate that it's on now. It's a miracle.

That's funny. My mother sometimes laments over the fact that modern washing machines are too quiet and she doesn't know when the laundry is done. This is like the IT guy equivalent!

I could tell what phase of booting/loading Windows my 486 was at, from the sounds of the HDD.

Fun story: A long time ago I was on the phone with someone, and typed some commands. For some reason tab completion made ticking noises in the phone.

Then it dawned on me: I was currently logged in on the machine on his desk.

Sure, but most mobile OSes don't support a full desktop experience well, and you're limited to bluetooth peripherals only. Unless you have a rare phone that comes with a docking station.

I have an OTG USB hub with Ethernet. Mouse, keyboard, and external drive can plug in all at once, and if I have a place to plug the other end of the cable I get wired network too. It even has power input to drive the peripherals. Add something like GNURoot with the Debian chroot and it's nearly desktop Linux on a tiny screen. It's more of a pain to set up during a meeting than a dedicated docking station would be. It also doesn't currently charge the phone (although there are ways to modify it to do that). It's nice for novelty but a little unwieldy. Using an OTG cable with a keyboard with a built-in hub reduces some of the cable clutter.

Some of the Chromebooks and such have no moving parts. I'm probably taking my Pinebook to the next conference I attend.

>Unless you have a rare phone that comes with a docking station.

Can't you just use any USB-C docking station?

Not all phones support all the alternate modes of USB-C, especially display port or HDMI, so display output is often a hurdle.

Good point. I guess few people use those since phone OSes don't scale well for desktop use.

The latest MacBook is essentially mobile hardware with desktop peripherals and software. Seems like the right way to go rather than bloating a smartphone with desktop software given that the amount of people who want their smartphone to be their only interface to everything is likely a very small minority.

> Sure, but most mobile OSes don't support a full desktop experience well

This is due to the form factor, not the capability of the devices. A high end smart phone is more than capable of producing a good desktop experience.

It's purely a software issue. Microsoft, Ubuntu, and others have come close to building a responsive desktop GUI.

A better title for the article would be "Completely Silent Gaming PC"

> ...this system is not meant for gaming...

It still kinda is. They didn’t skimp any more than they had to

> Even though this system is not meant to be a gaming rig, there’s no harm in putting in the best GPU you can without blowing the thermals.

These days with GPGPU (and crypto) becoming increasingly popular having a good GPU does not automatically imply gaming.

0% of people do (non crypto) GPGPU at home, and the post is lamenting crypto miners for driving prices up.

3D artists/motion graphics/video editors do. Check out Octane Render, Redshift, RealFlow, After Effects, Premiere, etc. for use cases.

Do those reach 1% of home users with GPUs?

I don't think there's a way to gather that statistic. Some people might suddenly get a project that involves software that leverages GPGPU, some might use it professionally, then there might be kids experimenting. Plus as the job market involving AIs grows, so does the GPGPU market.

My point was it's not "0%"... Users exist.

0% is different from none. 0% means that it's a very small amount, not even enough to be 1%. The implication is that while it could be true, it's such a rarity that it is not in fact a counterargument to "GPU implies gaming".

> my smartphone is completely silent as long as I don't accidentally press the Golem Invoker, er, Siri button

You can get rid of that sound as well: just flip the switch on the left side of the phone.

I use a Dell Latitude 7370 which is completely silent. It's not the fastest machine, but works fine for running several VMs, Visual Studio, etc.

correct. which makes the top comment pointless (or missing the point?)

Under-performant computers can and have been silent for a while. A phone falls into that category.

The trouble is making a good performance computer silent.

And even the case the article advertises is pure garbage. I had the smaller ones (and the author should really have bought the black anodized one!). It works fine while underpowered, but as soon as you hit the 5-hour compile/rendering levels of workload, that thing cannot move heat away without airflow. Period.

My desktop is a terminal. A Mac mini makes a great silent terminal.

The noisiest computer is the person constantly telling you how silent their machine is.

I can hear a high pitched whine emitted from most electronics. Granted for smartphones the display or transceiver has to be powered up and it has to be a quiet room with no white noise. It's a lot better than it used to be, I knew I was getting a call or text from my old Nokia 5160 before it did just from the whine.

Interesting.. I hear that from CRT televisions, but not from other electronics.

Yes, you are right. Cellphones and tablets are totally silent. It is the desktops and laptops that make noise, ummm, waste too much energy (heat).

MacBook 12" have no fans and are completely silent.

The 12" MacBook is a silent computer as well.

silent yes, computers, no.

No, but a smartphone is not a computer.

Smartphones are capable of computing and are sometimes very powerful, but the analogy is totally out of whack in my opinion.

If you can live without using an actual computer, it means you don't really need computers. You can check your email and browse the web on a Kindle, on your TV, or even in your car.

Everything is a computer, then.

I think a computer is a productivity tool. Smartphones (and I'd definitely say tablets, too) are to consume content, not produce it. Some companies (most notably Apple I think) believe otherwise, but I think they'll have to come to realize that smartphones and tablets are horrible to produce most kinds of content.

A smartphone absolutely is a computer. It's maybe not a "productive" computer for a lot of workloads, but it still carries out operations and produces output based on the input it is fed.

You're just making up your own definition of "computer" and then claiming a smartphone isn't one because it doesn't match your made up definition.

If the definition is so broad, my oven and fridge are computers.

Your oven and fridge (probably) has a computer within them, but they're not computers themselves.

According to that definition, yes.

You have programs, you have a UI to control them, they have a CPU and memory.

And "that definition" is the official dictionary definition and it works. There is no need to redefine it like you tried to.

Why not? They have buttons and speakers (I/O), I can program a timer on them, and there’s a little processor in there somewhere. How is that different from an iPad?

The computer that has those things is part of the oven, but it is not the oven itself.

Certainly in the rough sense of 'Turing machine' a smartphone is clearly a computer. And, even in other senses, I can connect a keyboard to my S5, open up Termux, run ffmpeg, open up Emacs --- all on the smartphone itself, not through ssh or anything. But it's hardly as productive as a regular laptop or desktop, of course, but that's largely due to screen size, lack of keyboard (or lacklustre keyboard), and weakness of processing capacity compared to a desktop/laptop.

Totally agree. That's where the line is drawn I think.

Otherwise everything is a computer.

I can’t check my email on my Kindle or my car.

Maybe cut to the chase: what specific capability is an iPhone lacking that every “real computer” has?

Do you run the SDK on the device itself, or do you have to develop somewhere else and then transfer the build output to the device?

On Android, you can definitely develop and compile on the device itself. See for example: http://www.android-ide.com/

What a nightmare, though.

Have you tried programming on the original Eee laptops? I'd take a 10" tablet with a bluetooth keyboard over that 7" screen and tiny keys any day.

I've tried touch based UIs to do actual work, and I just can't handle it.

Android can be quite well driven by the keyboard. Even alt-tabbing between applications works.

Personally, even a touchpad compared to a mouse will slow me down significantly.

It's possible that I could be somewhat productive on a tablet in an emergency, but not as my main machine like Apple suggest people should do.

Why can you not check your email on your Kindle? There's a browser, you can absolutely check your email.

If a keyboard (and a decent-sized screen) can be added, definitely a mouse and a mouse-centric interface.

I have tried using an iPad Pro for productivity, and it's living hell.

> definitely a mouse and a mouse-centric interface.

Wouldn’t that mean most servers aren’t computers? Not to mention the DOS machines I grew up with, or everything made before the Xerox Alto?

> I have tried using an iPad Pro for productivity, and it's living hell.

Not going to disagree with you there. But that doesn’t make it not-a-computer.

Those are servers.

If we want to classify devices, we need to group them somehow. Otherwise we call them devices and call it a day.

His old DOS machines certainly weren't servers. Not having a GUI does not make a machine a server and having a GUI does not make a computer not-a-server (especially since we have graphics chips inside CPUs now).

Servers are still computers.

We already classify them: smartphones, tablets, laptops, desktops, servers are all groups of computers.

I don't know about iDevices, but this is very much supported on a lot of Android phones. Samsung's lineup even has it as a bullet point feature, and they sell docks which let you connect your phone to a screen, peripherals, and switch to a mouse-oriented UI without having to do _anything_ weird.

At that point, you're running a mouse-centric, multi-window OS with a wide array of software, that can run basically whatever you want.

So that's a computer, definitely, right? When does it stop being a computer? If you disconnect the display? Is it using a stylus instead of a mouse? Maybe the software keyboard instead of a hardware keyboard? (But then is the MS Surface not a computer when you detach the keyboard case?)

Even though you're technically wrong (the best kind of wrong), I can see where you're coming from. Even microcontroller ICs without enough memory to store this comment are real computers and are silent, but if you say you've built a silent computer and then unveil some Arduino contraption, expect eyes to roll.

That said, I still think smartphones qualify as computers even by your productivity definition. Newer smartphones would sit somewhere above older netbooks on a ranking of overall utility.

>I can’t check my email on my Kindle

I can on mine…

In the past, when I tried to build a silent PC, I found that even after removing/stopping all the fans, there was often still an electronic hum or buzz left over. That's when I gave up.

Later I changed desks to one with a built-in computer cabinet made of thick particle board. That did as much to silence a PC as all the tens of hours of effort I had put into meticulously researching and speccing the build before.

This type of noise is arguably worse than that produced by a well-managed fan setup. If you have fans that spin at a constant RPM, you can fairly easily tune out the noise. Coil whine will vary depending on the load of the system (e.g., when you start/stop scrolling a web page), making it much more difficult to tune out.

Never heard coil whine in years and years, decades in fact, of PC building. Until last year, when I got my shiny new fancy-pants blast furnace (aka GTX 1080 Ti), which has near-dead-silent fans at idle/light loads. The minute that bad boy starts working, an obnoxious screeching/whine starts.

Super annoying, given that the rest of the build is a beast of a machine, water-cooled and so quiet that I'm more likely to hear the noise floor on my speakers than the PC (which is on the desk, next to said speakers).

My 1080TI also has an obnoxious whine that precisely reflects activity on the card. On the upside, when I’m working in CUDA, I can listen to my algorithms and get a hint at how they are working :)

Yeah I can tell by the various fans whether an application has crashed or the whole system.

Coil whine with graphics cards is such a frustrating experience, because there is no informed consensus as to what the underlying cause truly is. Everyone has their own anecdotal reason:

a) Maybe coil whine is an intrinsic factor in the manufacture of graphics cards, similar to dead pixels on displays. "Luck of the draw" when obtaining one is the only way to win. Cycle through RMAs until you get one with little to no coil whine.

b) Or, it depends which company you buy from: each of Asus, EVGA, Gigabyte, MSI, Zotac et al are supposedly better or worse than the others.

c) Or, it's not a problem with the GPU at all; rather, it's an indication of a poor quality power supply (PSU).

I've never seen an informed analysis from an industry engineer who has a goddamn clue what they are talking about. NVIDIA could probably enlighten us all with an exact-science explanation, but that seems unlikely. My uneducated guess is that the situation is closest to option 'a' above, and that rejecting units for coil whine during quality control would drastically reduce production yield.

Yeah, I don’t see NVIDIA or AMD drawing any attention to product flaws, particularly if the end-user solution is “RMA until you get a good one”.

Instead, ping a few review sites and see if any of them are willing to take a crack at it.

My GTX 970 does the same thing. It is incredibly annoying.

Fortunately it only happens at high load, which is while playing games. That doesn't help during quiet scenes, though.

Is there a way to fix this sound in video cards? I'll have to investigate.

Coils emit a relatively high-frequency sound, so your best bet is a case with thick panels, that’s fairly well enclosed, and has some acoustic dampening.

Most cases made today don’t have any significant dampening material. It’s pretty trivial to add some to the panels without significantly affecting cooling capacity.

Alternatively, if you wanted to really dig into this, you could try to find every switch-mode power supply and replace them with (much less efficient) linear power supplies. You'd need a larger water-cooling system though.

Is this a joke?

That would require making your own GPU PCB, and maybe a year's worth of studying on electronics and power supply design.

Not really. More likely just desoldering a couple of key components and soldering in a few wires to a daughterboard with a linear regulator.

A year’s worth of power supply design? Have you looked at the LM7805 datasheet? The circuit is one regulator and two capacitors...

Have you looked at the current requirements for CPUs and GPUs? We're talking 100A or more. Even if the input was 3.3V, which it isn't, a linear regulator down to 1.1V would have to dissipate 220W or more. You'll need a bigger heatsink for the linear regulator than for the CPU...

And designing a 250W-capable linear regulator is not as simple as just hooking up a LM7805.

So with a modern CPU and GPU, you're talking ~400W of power to the actual components and nearly 500W wasted in the linear regulators. That of course also means you have to get a 1000W PSU as a bare minimum.
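The arithmetic in this sub-thread is easy to check with a couple of lines (the 3.3 V / 1.1 V / 100 A figures are the hypothetical numbers quoted above, not measurements):

```python
# Back-of-the-envelope: an ideal linear regulator drops the full input-output
# voltage difference across its pass element, so waste heat is (Vin - Vout) * I.

def linear_reg_dissipation(v_in, v_out, current_a):
    """Watts dissipated in the pass element of an ideal linear regulator."""
    return (v_in - v_out) * current_a

waste = linear_reg_dissipation(3.3, 1.1, 100)  # 220 W burned as heat
delivered = 1.1 * 100                          # 110 W actually reaching the chip
efficiency = delivered / (delivered + waste)   # ~0.33, i.e. two thirds wasted
```

A switch-mode converter sidesteps this because its pass element is either fully on or fully off, so the formula above simply doesn't apply. That's exactly why every modern VRM is a switcher, and why the coils are there to whine in the first place.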

Here's the problem: not only do you have to dissipate 200W+ for your pass element, you need to drive it ultra fast with an extremely fast analog circuit that can withstand the massive magnetic fields (which pretty much means you need a PCB).

Yeah, and good luck driving a 200W linear element (if one even exists, lol) with a few op amps. The driver has to deliver a few amps into the gate/base of the pass element, which is in and of itself a pretty difficult challenge.

old thread but

LMFAO you can't be any more wrong. You need /much/ more careful design to get GPU-compliant performance. The dI/dt on modern ASICs are insane, and you need an insane regulator to deal with it.

I was going to say, I can hear coil whine if I stick my ear into the machine, but it doesn't make it out past the acoustic dampening panels in my case.

I've had a GTX 460 and a GTX 660 Ti before, and once I upgraded to a GTX 970 I was disappointed by the coil whine. I could even see the effect of the power draw on my secondary screen when I booted up GTA 5. Guess we'll have to live with the fact that the tech can't keep up with demand without sacrificing quality, resulting in more 'unstable' electronics.

My fiancee has a mouse (Logitech MX Master) with a very high-pitched coil whine, but only when you move it. It's the most infuriating thing in the universe, but apparently she can't hear it. So yes, random coil whine is about 10000x worse than fans.

So her mouse squeaks? That’s adorable!

If you google "Logitech mouse coil whine" you will see that this has affected multiple Logitech models for many, many years. They apparently have no interest in fixing this issue.

I find it funny that this post has become the second result on google when searching for this phrase now :P


Same with the Dell XPS 9560 but talk to their customer service department and they'll deny all knowledge...

I've had my top-of-the-line Dell XPS serviced (motherboard fully replaced) three times because of the coil whine issue; it's not going away, and Dell said they will not replace it any more. It's one of their most expensive laptops and they can't get such a basic thing right.

Last year when I was deciding which laptop to buy for personal use I ruled out the XPS15 (partially) on coil whine (I'm 37 but thanks to never going to concerts and rarely listening to music beyond half volume I still have decent high frequency hearing).

In the end I went with a Thinkpad and having seen the issues people I know have had with the XPS15 I'm pretty glad I did.

That's not the mouse, but the CPU causing coil whine when the mouse causes the kernel to up its CPU usage as you fly across various GUI elements.

Unless you've tested OP's fiancee's system yourself and found that to be the case, I wouldn't be so sure.

I had a Logitech G500 with awful coil whine and the opposite problem: it would stop whining when moving and start when idle. I suspect it had to do with the power saving mode that lots of mice have, where the laser power supply ramps down to dim the laser illumination after a period of no detected motion.

A lot of times I've noticed it's the USB controller. It's quite evident when you have headphone ports near USB ports.

I can be tabbed out and hear when I get into the plane in PUBG from the increase in whine. It's quite nice actually.

Not necessarily. Possible, but I had a cheap HP mouse that whined briefly when it first came out of sleep mode. Couldn't have been the computer, since it would make the same noise when the PC was turned off.

That is not the case - the mouse itself is whining, it does it even when the receiver is not plugged into anything.

Are you running the latest firmware?

Some noise, depending on its texture, can even be pleasant.

coil whine is highly unnerving while low fan sound is relaxing.

we like stimulus: the clicky keys of my old HP 48 are neat, the insertion sequence of a Pioneer 32x slot-in CD drive was amazingly subtle; not long ago I revived an old HP tape drive, and the tape rolling and the head gear were also beautiful.

Also, it was as cute as it was informative; it's a clear state-change side channel. Often software notifications about hardware are so decoupled that you don't trust them; plus they're invasive, unlike a tiny LED, a click, or a tiny motor ramping up.

I still miss hard drive chatter as a proxy for machine being up to something. Sometimes the machine isn’t supposed to be doing anything and it tells me to ask questions.

Now it’s just when the fan on my laptop starts taxiing for takeoff, which can take a lot longer.

If you miss that then you are going to love jwz's latest xscreensaver work.


I can’t believe I used to sleep next to a computer that sounded exactly like that. It’s soothing.

> Some noise, depending on the texture can be even pleasuring.

I like the sound HDD make when grinding (except when I don't know the reason for the grinding... looking at you svchost.exe).

> coil whine is highly unnerving while low fan sound is relaxing.

Which is why I have been putting off getting a new laptop for years now. Most seem to suffer from coil whine and I can't stand it (to the point that I ended up using an old Eee PC 1000HE rather than a brand-new 16-inch VAIO some years ago).

> (except when I don't know the reason for the grinding... looking at you svchost.exe

This might be "SuperFetch".


yes, I'll even feel angry when a new, faster drive has a worse sound. Some old HDDs had a very nice grating that wasn't threatening or annoying.

I've switched to Fractal Design cases with their heavy sound-proofing and been thoroughly impressed. I use a Corsair RM750, which never powers its fan on. I used to have an all-in-one water cooler for the CPU, but I moved to a Noctua design with a large 140mm fan. The water cooler had a 120mm fan that I replaced with a 120mm Noctua, but it was too close to the rear vent of the case, and thus noisy.

However my graphics card (RX 480) is quite loud, and one bearing is making noises.

But the Fractal Design case has really dampened the sound. My home server uses an RM500 (which also never turns its fan on) and a low-profile Noctua CPU cooler. No other fans, but I do hear the six HDDs spinning and seeking when it's real quiet in the room.

I don't hear any coil whine, except when using headphones plugged into my desktop's speakers--probably the result of the speaker system's power supply. Klipsch ProMedia, if you're curious.

I have one of the Corsair budget quiet cases under my desk. I can attest that it has excellent bang for the buck, for being a basic black case with useful soundproofing/absorption.

As far as noise reduction for CPU cooling goes, I'd suggest buying more air cooling than you need for a modest-TDP CPU. Between that, my fanless Seasonic power supply, and an SSD, the only noise I can ever hear from my machines is from the GPU.

How did temperatures compare between the 140mm Noctua and the AIO water cooler?

I'm looking at buying a new machine soon, and was actually looking at a Fractal Design case with either one of those 2 cooling systems.

Temps are higher, and spike more, but they never surpass 70°C, whereas when water-cooled they would never surpass 45°C or so, IIRC.

This is an Ivy Bridge (IIRC) Core i5 at 4.0GHz.

Wow, that's a big difference. I would like it to be as quiet as possible, but also plan on running it nearly 24/7 for 5+ years, so I'll probably go for the AIO water cooler.

I had a Corsair H50; it lasted about 5 years or so before the pump died. It was actually quite annoying to track down: at idle the machine would run for an hour or two before shutting off, but start a game and it would only last a few minutes.

Seems obvious in hindsight, but I had no fan (pump) speed warning or anything.

The only time I achieved silence was when I moved the computer case outside the room and used a 2m VGA cable and USB cord extenders. That silence was weird, though. No audio feedback at all from the computer, just the clicking of the keyboard and the mouse.

I did exactly that too!

I was tired of the never-ending quest for silence, so I bought three 50-ft DVI cables and a couple of USB 3 cables of the same length and put the PC in the attic.

It worked great, except any hardware issues resulted in a trip to the attic.

Did you leave the machine on all the time?

Pretty much. I've never really shut down my main machines when i'm not using them.

I work from home, so I'm on it several hours every workday; combined with the fact that I tend to have multiple things in progress all the time, that means it would be a giant pain in the ass to shut it down fully.

Hehe, you get used to it after a while. Sometimes I wonder where the heat is coming from when my PC is under higher load and my hand moves over it. Those are the moments when I remember the days when I couldn't hear the vacuum cleaner while the PC was compiling ;-)

Hm, almost a dumb terminal!

Same setup, with sound over hdmi. love it!

A long time ago, about the time someone tried to coin the term invisible computing, I figured out how to hook a monitor arm to my couch, and I put the tower behind it. 3 inches of padding can absorb a lot of sound.

Still kinda miss that couch.

The case matters a lot. I've got a Fractal Design Define C, which dampens a lot of noise. Supposedly the R5 is even better.

This was my second silent PC. The first one still had moving disks, but I went for as few fans as possible, and had passive coolers on the internals. This time I did the opposite: lots of fans, but have them spin as slow as possible. This works very well.

But coil whine and electronic hums are easy to overlook when you're choosing parts. It's worth looking at not just the fans and the power use (more power needs more cooling), but also the quality of the electronics.

That's called coil whine, often a result of shoddy power components or construction (and often in either PSU or GPU).

Capacitors in PSUs and GPUs are often, but not always, the parts to blame.

Inductors, actually, are most often to blame (hence - coil whine, also why they're covered in goop in most devices).

I've a completely silent PC too [1] and I was lucky, as mine doesn't have any relevant electronic buzzes.

But when I walk around to the back side of my desk I can hear some electronic buzz from one of my monitors. What's funny about it: I've had that monitor for a few years now, and before I built that silent PC and turned my desk in another direction, I never noticed the buzz from the monitor :D

[1]: https://news.ycombinator.com/item?id=16224202

It is possible to get rid of coil whine. You can use a long paper tube, like one from a kitchen roll, to locate it. I once saw a guy in a forum potting a whole PSU in epoxy to get rid of it.

I used to do all this in the past but now as I get older I have hearing loss/tinnitus and suddenly no longer have to worry about fan noise anymore...

yeah, similar for me. deaf people have their own advantages!

Oh gosh, coil whine.

I had a Geforce 280 that would scream like hell whenever it was at full power and its framerate went below about 10 or above about 100 FPS. I was glad when it broke some other way and got replaced under warranty.

Same thing with my 280! Unfortunately it was a BFG, and the company folded about 2 days after I submitted my warranty claim.

Mine was an EVGA.

But doesn't locking up your PC in a cabinet hurt its cooling performance? I doubt there is adequate airflow in there.

I keep my CPU running at a comfortable 50°C to 60°C.

There are 5 fans in my tower, two of them on the CPU cooler, and only one of those two runs constantly, at only 200rpm. All the others are off most of the time.

I don't need my computer to run at a cool 30°C all the time. The hardware can run very hot without any issues. And when all the fans eventually kick in under load, it will always keep under 70°C anyway.

It does mean things run warmer, but moderately specced PCs generally don't generate enough heat, and have enough reliability margin, for it to be a problem. If you were some gaming enthusiast or crypto miner running multiple flagship GPUs on an 850W power supply, then I probably would not recommend this approach.

I put mine in this: https://www.ikea.com/gb/en/products/desks/table-tops-legs/al...

And it's doing fine. Its cooling is slightly over-sized since I want to keep fans spinning at very low speed, but I haven't seen much difference compared to when it was outside.

Yes, depending on how well it seals. Usually there's a hole in the back for HID cables, and the front has rubber feet to prevent door slamming that also offsets the door. This allows heat to flow out.

My computer is in the closet, and two cables, a USB Type-C and a Mini DisplayPort, come out under my desk through the wall. Absolutely quiet.

Coil whine can be absorbed relatively easily, since high-frequency noise is easy to dampen. Just seal the entire thing and let only the copper heat pipes with the heat sink stick out.

> (Astute readers will notice they are all AMD (Socket AM4) motherboards. The whole Meltdown/Spectre debacle rendered my previous Intel system insecure and unsecurable so that was the final straw for me — no more Intel CPUs.)

That's a silly and extremist position to take. "insecure" is relative, not absolute. It's a certainty that the software he's running has far more quantity of vulnerabilities and a much longer history of them. I don't know his exact use case, but arguably his use case isn't one where Spectre is particularly more severe than even a userland, non-priv-escalation vuln. (eg ransomware doesn't require root access to hold all your files hostage.)

> Eliminate the moving parts (e.g. fans, HDDs) and you eliminate the noise — it’s not that complicated.

Ha! And yet it deserved a detailed blog post. I'm surprised he would say this even after the amount of effort he spent.

After having purchased an AMD GPU, they would have to pay me for me to buy anything from AMD ever again.

Their CPUs are pretty good and their GPUs are pretty good; however, their GPU drivers are terrible, especially under Linux.

> however their GPU drivers are terrible, especially under linux.

Imo the AMD drivers are way better than Nvidia's. They're included in the kernel and therefore open source, compared to Nvidia's proprietary drivers, which have horrible support for a lot of compositors and lack support for DDC/CI over DisplayPort. The Nouveau drivers are better (slower performance but better compatibility), but are unable to change the clock speed (it's stuck at the minimum).

The AMD drivers "just worked". Selling my RX480 for a GTX 1070 was the worst decision I made when it came to compatibility. Now I can't even get Vsync to work with this Nvidia crap.

Since Linux 4.15, AMD GPU support is in kernel, can't say the same for Nvidia. That said, Nvidia's proprietary drivers still do better than AMD's drivers.

Well it's a good thing nvidia is catching up on being horrible. The 367.xx driver's dkms module doesn't even compile for me on Ubuntu 16.04 anymore, and later drivers make some gstreamer-based apps stutter a bit.

And Windows has been behaving weirdly as well since I installed the latest drivers: a black bar on top of full-screen programs after waking from hibernation, the HD audio driver not letting PulseAudio start (I need it to get sound from WSL), and crashes when multiple 3D-accelerated VMs are open. Restarting the GPU driver (with either the shortcut or through Device Manager) solves all the issues, and they only occur with the latest driver.

And the crappiest part is that there's nobody who can help. Getting someone from Nvidia to respond on their forums is basically luck, and I'm not a huge company that can get their reps to find someone to help me.

The open source drivers of slightly old GPUs are ok. Not that good, but not bad either. And they are open, so they will come well integrated with your distro.

Which puts AMD GPUs in a weird situation where you can expect them not to work very well when they are new, but to improve until you can forget about them. (The inverse of Nvidia GPUs, which work OK when new but slowly lose compatibility over time.)

People literally buy AMD GPUs specifically for the Linux driver

With a recent kernel, their hybrid driver is fantastic.

what? It's the exact opposite, their GPU drivers are great on Linux. I've literally never had a problem with AMDGPU.

Also, you can't even compare NVidia's drivers to them, since they don't even support Wayland properly!

you're probably getting downvoted because of your 'never'.

AMD Linux support was downright abysmal pre-2015 or so.

One can make a passive build much more powerful.

NSG S0, once out, will most likely be the go-to case for such setups. Until then, an HDPLEX H5 is cool.

My desk has an H5 on it, housing an i7 8700 (non-K) and a GTX 1060. The TIM under the heatspreader is replaced with Thermal Grizzly Conductonaut, and Thermal Grizzly Kryonaut is used as every other TIM the case setup needs. The CPU is on stock clocks with a voltage offset of -30 mV. The GPU has its power target reduced to 90% and clocks increased by 130 MHz, so it is effectively undervolted as well. The PSU is a Seasonic Prime Ultra Titanium 650.

Prime95 with AVX throttles really, really fast, under a minute perhaps, but is a very unrealistic load. Non-AVX stress tests and FurMark take a while to start throttling (20 minutes?), as the thermal capacity of the aluminium case is quite big. After hours of gaming, the GPU and CPU float around 80°C while providing full stock performance. I don't do 3D rendering (other than in-game) or video en/decoding, so I haven't had long, real-world, full loads to see how temperatures behave with those.

From the discussion I've had and forums I've read, I think that people are afraid of putting more power in passive cases and having their components at "high" temperatures, despite those being rated for them.

>Prime95 with AVX throttles really, really fast, under a minute, perhaps, but is a very unrealistic load

I suppose Blender would thermally throttle the CPU as well. If you run any non-Xeon/non-laptop Intel chip (newer than the 2000 series) and care about temperatures, delid the bugger. (Xeons are soldered; laptop chips don't have an IHS.) Intel uses something that's worse than toothpaste, plus tons of glue, between the die and the IHS. If you see temperature deltas of more than 9-10°C between cores under full load, the thermal paste between the die and the IHS might have missing spots or have dried out. In your case, removing the IHS altogether would provide decent results.

You might wish to check the VRMs, they are rated at 125C but if the case is hellishly hot inside, they might not be able to dissipate the heat.

Somewhat unrelated, but maybe you can shoot me down since you seem to have some experience?

Metal is an incredibly good conductor on its own, and the properties of thermal paste (typically) are just barely better than air's. So long as your CPU and heatsink are fairly flat surfaces mashed together physically, it seems like either forgoing paste or using the absolute minimum amount is ideal. I used a razor to leave an absolutely minimal layer of paste (e.g. filling in sub-millimeter surface structure) on my latest build, and CPU temperatures are well within a reasonable range. But I'm also not trying to OC the CPU or anything.


>...and the properties of thermal paste (typically) are just barely better than air.

I am not certain how you managed to come to that conclusion. The thermal conductivity of air is around 0.03 W/(m·K)[0]. Good electrically non-conductive thermal paste is around 12.5 W/(m·K)[1] (or 400 times better than air). Conductive ones are in the region of ~40-80 W/(m·K), and aluminium is 237 W/(m·K). Air also expands when heated, pushing the cooler and CPU apart.

Normally, if you have to choose between "too much" and "too little" paste, you pick the former. The mounting pressure pushes out the excess.

[0]: https://www.engineeringtoolbox.com/thermal-conductivity-d_42... [1]: http://www.thermal-grizzly.com/en/products/16-kryonaut-en
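To put the conductivity figures in the context of an actual joint, here's a sketch with assumed dimensions (the 30 mm square contact patch and 50 µm gap are illustrative guesses; real gaps vary):

```python
# Thermal resistance of a thin planar layer: R = thickness / (k * area),
# in kelvin per watt. The filler's conductivity k is what the paste changes.

def layer_resistance(thickness_m, k_w_per_mk, area_m2):
    return thickness_m / (k_w_per_mk * area_m2)

area = 0.03 * 0.03   # assumed 30 mm x 30 mm IHS contact patch
gap = 50e-6          # assumed 50 micron gap, identical for both fillers

r_air = layer_resistance(gap, 0.03, area)    # gap filled with air
r_paste = layer_resistance(gap, 12.5, area)  # gap filled with good paste

# Same geometry, so the ratio collapses to the conductivity ratio: ~417x.
```

At, say, 100 W through that joint, the air-filled gap alone would add roughly 185 K of temperature rise, versus under half a degree for the paste layer, which is why "barely better than air" can't be right.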

Don't you need to divide the raw conductivity by the thickness of the layer it fills? I presume the distance occupied by the thermal paste will be at least two orders of magnitude larger than that occupied by air in a metal-contact-only setup.

I would be extremely surprised if increased pressure due to air at higher temperature played any role whatsoever unless the bolts connecting the heatsink and cpu were very loose. If anything, I'd expect the increased conductivity of air at higher temperatures to dominate.

I'd also expect there to be effects at the metal-paste and paste-metal interfaces which reduce the effective system conductivity (i.e. phonons are much more likely to reflect in this scenario than in a metal-metal interface).

It's very impractical/expensive for mass-market products to make the surfaces in question so flat that no thermal paste is needed. Many tests and reviews have been done. Even if top-of-the-line coolers came with perfectly flat surfaces, Intel's heatspreader is not flat; making it so would cost much more. Also, heatsinks can be applied with a lot of force, which usually pushes out the "unneeded" part of the thermal paste. In a pinch, even lipstick, toothpaste, chocolate and other silly compounds work better than nothing, so I'm not surprised you're getting OK results even with a touch of thermal paste.

A fun thing to try is using a modern low-end CPU (latest i3s, Pentiums, Celerons) without its cooler. Not advised by Intel, of course, but you might get into your OS of choice before it even starts throttling. I'm somewhat comforted by the fact that a CPU automatically powers off once it reaches something above 100°C (103 maybe?) and throttles a few degrees before that. Those temperatures shouldn't leave the silicon damaged.

In practice, thermal paste is a must. If you don't like pastes (I personally don't; they get everywhere by accident and can be tough to remove), try an IC Graphite Thermal Pad, which is reusable and rivals really good, if not the best, thermal pastes, according to the limited number of reviews I've seen. I think its practicality beats better results in non-highest-end applications.

It's really not that hard to create flat surfaces: anyone with a lathe, a hard cutting instrument, and a bit of fine grinding material can probably get a contact bond between two pieces of metal.


Smallest amount of TIM, spread all over. No "pea" method! All over!

The Cool Laboratory Liquid Metal stuff is the best but hard to work with.

Blender might throttle. Does it come with benchmarks/tests that I could use to gauge thermal performance?

The CPU is delidded! I've got another i3 4300 delidded as well, running under a NoFan CR-80EH. Delid + Conductonaut + Kryonaut made the difference between throttling and hovering around 90°C in FurMark + Prime95. When integrated graphics aren't used, the CPU runs cooler, of course, and didn't throttle with MX-4 thermal paste and no delid.

I do fear that the VRMs are running too hot. When selecting components, I picked ones that come with at least some heatsinks on the VRMs. The motherboard is an ASRock Fatal1ty Z370 Gaming-ITX/ac (a non-ITX motherboard wouldn't fit in the case anyway with an ATX power supply). The graphics card is Gigabyte's cheapest offering and has a small sink across the VRMs. I'm hoping that undervolting will help keep the VRMs in check.

> Blender might throttle. Does it come with benchmarks/tests that I could use to gauge thermal performance?

There are multiple scenes to render as benchmarks (I guess the BMW one is the shortest/most popular). https://www.blender.org/download/demo-files/

Clearly a work of passion and I appreciate OP sharing.

I enjoyed the earlier days of "silent PC" building, ten or fifteen years ago. For example, building a silent tower or desktop for a DAW or softsynth back then, in a recording/studio environment, required some ingenuity. SSD? Not on a hobbyist budget. I recall one build, not mine, fully immersed in a bucket of (mineral?) oil for passive heat dissipation.

Today, as a new reference point, any MacBook Pro within the last few years may qualify as truly silent for many people's everyday usage. It does for me. And when I do heat up the CPU/GPU with heavy tasks, the fans spin up but then they go away completely as soon as the hard work is done. Back to silent.

No more spinning platters, crappy fan bearings, or poorly engineered airflow nowadays. :-)

There's no hacker pride in buying off-the-shelf, so the performance bar for DIY is higher. Progress!

I'm glad you enjoyed the post. My first silencing attempt (in the early 1990s) was to seal a hard disk in what amounted to little more than a couple of oven bags, custom-make a long IDE cable, and then — literally — dangle the drive in a bucket of water that was under the desk the computer sat on. That ended poorly, with thermal cycling inducing microfractures in the bags, which resulted in leaks, a dead drive and a blown controller on the motherboard. Ah well, we live and learn. Take it easy.

It used to be fun trying to silence a computer. Figuring out which HDDs were single-platter, undervolting processors, finding low-rpm fans, etc. used to be a challenge, and have great rewards for those of us who care. Now, check out silentpcreview.com, hasn’t been updated this calendar year. In some ways I miss the old days, but most modern laptops and even desktops are reasonably quiet, and full silence is easy to achieve.

I've had the same experience with my MBP 2017. So many people don't realise that the aluminium chassis isn't just attractive, it's actually a massive heatsink, and it's the reason the machine needs barely any internal airflow.

man, my mbp is on fire and fans spinning all the time just from xcode / ios dev.. backend dev on a remote server was the life! :D

This says that a completely silent computer would be 0dB. But it's a logarithmic scale, so I think it should say -Infinity dB.

This is admittedly pointless pedantry.
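For the record, sound pressure level is defined as 20·log10(p/p₀) against a 20 µPa reference, so zero pressure really does come out as minus infinity (a quick sketch):

```python
import math

P_REF = 20e-6  # Pa, the nominal threshold of human hearing at 1 kHz

def spl_db(pressure_pa):
    """Sound pressure level in dB SPL; true silence maps to -infinity."""
    if pressure_pa == 0:
        return float("-inf")  # log10(0) is undefined; the limit is -infinity
    return 20 * math.log10(pressure_pa / P_REF)

print(spl_db(P_REF))  # 0.0  -> the hearing threshold, not absolute silence
print(spl_db(0.0))    # -inf -> what "completely silent" would actually mean
```

Every factor of 10 in pressure adds 20 dB, which is why "0 dB" marks a reference point on the scale rather than an absence of sound.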

Silent means that it has no sound, but "sound" either means the sensation of perceiving vibrations, or the actual vibrations that cause the sensation. 0 dB SPL is the threshold of hearing, so it is "silent" in the first sense.

Strictly speaking.

He’s using the term incorrectly to begin with so corrections aren’t really possible.

Of course it can’t be completely silent. Heat generates air movement which is “sound”.

By 0dB he means 0dB SPL, which is more or less correct.

If the air moves at a constant speed I don't think you'd get sound. Any fluctuations probably won't be in the audible spectrum, although whether that still counts as 'silent' is more of a philosophical question.

Yep. 0dB is the limit of human hearing, not absolute silence. There are anechoic chambers with noise levels below 0dB.

Yet if you sit in one, you hear the blood pumping through your ears, so it's reasonable to use human perception as a baseline when that's what they care about.

I've been in one. Back in HS, we took a trip to the GIANT anechoic chamber at Georgia Tech. It was incredible. If someone was facing away from you and yelling as loud as they could, you could barely hear it from behind them. You're standing on basically a giant net on top of a pit of foam blocks, and every wall is wildly shaped undulating foam.

It's really surreal and a number of people actually became nauseated at the sensation.

Pretty sure there's going to be some coil whine. Probably above 0dB.

I'm pretty sure that there's some measurable amount of coil whine or other noise coming from capacitors and other small components, making it more than -infinity dB, and probably more than 0dB. The sound might not even be within human detectable frequencies.

To be more pedantic, it's probably safe to assume this is not even 0dB. Most people would probably consider a computer under 25-30 dB as "silent", since most rooms' background noise is around there.

Linus did a silent PC build which even in a sound proofed case and at idle was about 14dB and broke 20dB under load: https://www.youtube.com/watch?v=RXZrWqCT7R0

Even the high-end microphone used to record the sound level in this video produced its own 7dB of noise.

It may register below 0dB SPL (in an anechoic chamber, anyway) depending on the manner in which it's measured, which is a complex discipline in itself. Sound pressure dissipates over distance, for example, so the distance at which the measurement is made has bearing, as does the angle at which it's measured.

(This is unabashed pedantry. But I'm on my lunch break, so...)

20 to 30dBA is probably the base noise level of most environments.

No point in making your PC less noisy than the noise floor.

Also, coil whine and convection wind have potential to be more obvious once you take away fans and introduce more efficient passive components.
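The "no point going below the noise floor" claim can be checked with a bit of arithmetic: incoherent sources combine by summing their powers, not their dB values. This is a sketch with made-up example levels (a 14 dB PC in a 25 dB room):

```python
import math

def combine_db(*levels_db: float) -> float:
    """Total level of independent noise sources: sum the powers, convert back to dB."""
    return 10 * math.log10(sum(10 ** (level / 10) for level in levels_db))

# A 14 dB PC in a room with a 25 dB background barely moves the needle:
print(combine_db(25, 14))  # roughly 25.3 dB -- the room dominates
# Two equal 60 dB sources together give about 63 dB, not 120 dB:
print(combine_db(60, 60))  # roughly 63 dB
```

Once the PC is 10 dB or more below the room, its contribution to the total is well under 1 dB.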

I was just going to post this with a nitpicking warning. Anyway, it may be a joke.

