Granted, they can't do what this person's high-spec workstation can do, but they handle most of the computing tasks that most people once used noisy, fan-cooled computers with clacking disks for, and in many cases they do those tasks better.
And unless I'm just losing my hearing, my smartphone is completely silent as long as I don't accidentally press the Golem Invoker, er, Siri button.
4-6 months ago I built a new workstation for work. I had used one of those Corsair closed-loop water coolers with the prior build, so I set one up on this guy. A month or so later my workstation was running really sluggishly, and I realized it was drastically throttling the CPU because of heat. I installed some software to spin up the fans on the cooler to keep it under 100C, but now it gets kind of loud when I run anything CPU-heavy.
Now, this is a pretty heavy duty workstation, 64GB of RAM and 3 displays. But, if I were doing a new machine for home to be quiet, I think it'd be a NUC. Then I'll put the box that has the 6 drive ZFS array in a closet and call it good.
Their highest end is available with ECC ram and a discrete graphics card: https://fit-iot.com/web/product/airtop2-build-to-order/
And the lowest end fitlet2 can be configured with an atom in a reasonable configuration for around $300 (it's $130-$200 for case/cpu/motherboard depending on which CPU you configure).
Gigabyte also has their BRIX products, which are similar to the NUC.
I was at a hotel this weekend and at check-in they had a monitor with a "ThinkVantage" slotted into the back of the monitor; that might be a nice setup.
Eventually I just bought a cheap USB desktop fan and ran it facing the NUC.
It has made me think that instead of getting a NUC, for a quiet desktop system I should have just gotten a second used MBP and run it permanently docked in clamshell mode with the monitors and keyboard attached (with the added benefit of being able to go portable when I want to).
Neither the NUC nor my MBP is completely silent but for my purposes I find that I seldom tax either of them enough to where the fans become audible enough to be annoying. Still, I do find the difference in performance between them to be apparent just in things like iteration time on web development and IDE responsiveness.
You could hear the whine from across the room a few seconds before a call would ring through.
People always asked how I managed to answer the phone so fast. Electromagnetic Supplementary Perception, of course.
It was clearly electromagnetically "noisy", but I don't recall ever having heard any of my phones make any unexpected audio noise... (My old-and-abused rock-concert- and motorcycle-weary ears probably can't get up as high as inverter whine any more, though...)
Also, GSM phones used TDMA (keying the transmitter on and off to occupy one of -hm- eight, I believe - time slots on a given channel.)
This is practically asking for EMC issues.
LTE, on the other hand, transmits continuously (I believe - I do not work in RF engineering anymore, but try to read up on new tech every now and then :) - much less interference-causing than the constant on/off of TDMA.
Some were barely noticeable, while one in particular (a Droid Turbo) was so loud I could hear it getting ready to receive a call from another room. This was regardless of whether they were plugged in or not, although charger whine was its own separate issue.
Thankfully it does seem to be getting better over time- my current S8 is, as far as I can tell, genuinely silent.
There's a TDMA modulation frequency at 217Hz and this interferes with all sorts of nearby audio devices. CDMA and WCDMA phones have a much broader interference spectrum, which is why you don't hear it much anymore.
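The 217 Hz figure falls straight out of GSM's frame timing. As a sanity check (timeslot duration taken from the GSM spec, not from the comment above):

```python
# A GSM TDMA frame is 8 timeslots of ~576.9 microseconds each;
# a handset keys its transmitter on for one burst per frame.
timeslot_us = 576.9
frame_s = 8 * timeslot_us / 1e6   # ~4.615 ms per frame

burst_rate_hz = 1 / frame_s
print(round(burst_rate_hz, 1))    # ~216.7 Hz -- the familiar "GSM buzz"
```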
I considered returning it, but I find it charming. I miss the days when you could tell exactly what your PC was doing by all the sounds it was making, and I find dead silent electronics to be elegant but a little sad.
The easiest way to experience this is to plug in headphones and hear the clicking before/after a sound is made.
My Samsung Chromebook from 2012 has an ARM processor, solid state storage, and no fans. It is pretty slow by today's standards though.
I think several companies make cases for the Intel NUC boards that radiate the heat away and have no fans, too.
My Samsung Chromebook 3 gets a touch warm but never uncomfortably so like my 2012 Retina MacBook, which lets you really feel it when your code is inefficient. (Granted, the Chromebook is a lot less powerful)
525 lines / 2 for interlacing * 60 fields per second = 15750
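That line rate is easy to check; the ~15.7 kHz result is why CRT flyback whine sits right at the edge of adult hearing (color NTSC later shifted it slightly, to about 15734 Hz):

```python
lines_per_frame = 525
fields_per_second = 60

# Interlacing: each field carries half of the frame's lines
line_rate_hz = lines_per_frame / 2 * fields_per_second
print(line_rate_hz)  # 15750.0
```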
They were still using CRT TVs in 2012 when I finished high school. I wouldn't be surprised if there were still plenty of schools with CRT TVs and VCRs for educational material.
I find coil whine a worse background sound than the lower frequency fan hum.
I was worried about performance, but it has been very acceptable. It depends what you need it for, but I can run 2 monitors, a Linux VM and atom all while streaming HD video. Or I can do light web browsing for 10 hours on battery. I love it.
On the rare occasions I need more power, I spin up a spot instance.
If you use Linux on a laptop, install "tlp". It optimizes battery life without a noticeable reduction in performance.
Obviously newer models are different from older models, and Air models probably don't have the fans that Pro models do, but the claim that Macbooks don't have fans is a bit too broad to be true.
This is why these 'simple' naming schemes are confusing.
In Jobs' 2x2 matrix, the portable half was initially populated by iBook and PowerBook, later by MacBook and MacBook Pro.
It's unfortunate that Apple has confusing brand names, but the fact remains that the Macbook indeed has no fans, so the original commenter who hears fan noise is obviously using a different model of laptop.
So if you say that recent Macbooks have no fans, that may well be true. But it's not true for all Macbooks.
I mean it depends on what you do - but for many people it's a realistic solution.
Apple have put faster chips in their smartphones than in their laptops.
This is all alluded to in the article and the macrumors post that the article is based upon, here are some quotes:
> "Sure, that doesn’t mean the A11 Bionic can do all the things a desktop CPU does."
> "Though the iPhone X and the iPhone 8 offer impressive Geekbench scores, how that translates to real world performance remains to be seen."
There's no question that the iPhone chips deliver amazing performance, but there's a reason people still lug Macbooks around.
Geekbench always seems like an odd benchmark - the variability between runs alone is kind of odd. If I could run a compiler on an iPhone, for example, would I really see similar performance to my MBP?
It also doesn't pass the smell test. Even Atom CPUs are preferred over high-end ARM for netbooks. But a Xeon is way more powerful than any Atom.
Whereas on a normal computer, a light goes on and the fans start spinning, not because it's useful but just to indicate that it's on now. It's a miracle.
Then it dawned on me: I was currently logged in on the machine on his desk.
Some of the Chromebooks and such have no moving parts. I'm probably taking my Pinebook to the next conference I attend.
Can't you just use any USB-C docking station?
This is due to the form factor, not the capability of the devices. A high end smart phone is more than capable of producing a good desktop experience.
> Even though this system is not meant to be a gaming rig, there’s no harm in putting in the best GPU you can without blowing the thermals.
You can get rid of that sound as well: just flip the switch on the left side of the phone.
Low-performance computers can be, and have been, silent for a while. A phone falls into that category.
The trouble is making a good performance computer silent.
And even the case the article advertises is pure garbage. I had the smaller ones (and the author should really have bought the black anodized one!). It works fine at low power draw. But as soon as you hit the 5h compile/rendering levels of workload, that thing cannot move heat away without airflow. Period.
Smartphones are capable of computing and are sometimes very powerful, but the analogy is totally out of whack in my opinion.
If you can live without using an actual computer, it means you don't really need computers. You can check your email and browse the web on a Kindle, on your TV, or even in your car.
Everything is a computer, then.
I think a computer is a productivity tool. Smartphones (and I'd definitely say tablets, too) are to consume content, not produce it. Some companies (most notably Apple I think) believe otherwise, but I think they'll have to come to realize that smartphones and tablets are horrible to produce most kinds of content.
You're just making up your own definition of "computer" and then claiming a smartphone isn't one because it doesn't match your made up definition.
You have programs, you have a UI to control them, they have a CPU and memory.
Otherwise everything is a computer.
Maybe cut to the chase: what specific capability is an iPhone lacking that every “real computer” has?
It's possible that I could be somewhat productive on a tablet in an emergency, but not as my main machine, as Apple suggests people should do.
I have tried using an iPad Pro for productivity, and it's living hell.
Wouldn’t that mean most servers aren’t computers? Not to mention the DOS machines I grew up with, or everything made before the Xerox Alto?
> I have tried using an iPad Pro for productivity, and it's living hell.
Not going to disagree with you there. But that doesn’t make it not-a-computer.
If we want to classify devices, we need to group them somehow. Otherwise we call them devices and call it a day.
We already classify them: smartphones, tablets, laptops, desktops, servers are all groups of computers.
At that point, you're running a mouse-centric, multi-window OS with a wide array of software, that can run basically whatever you want.
So that's a computer, definitely, right? When does it stop being a computer? If you disconnect the display? Is it using a stylus instead of a mouse? Maybe the software keyboard instead of a hardware keyboard? (But then is the MS Surface not a computer when you detach the keyboard case?)
That said, I still think smartphones qualify as computers even by your productivity definition. Newer smartphones would sit somewhere above older netbooks on a ranking of overall utility.
I can on mine…
Later I changed desks to one that had one of those built-in computer cabinets made of thick particle board. That did as much to silence a PC as all the tens of hours of effort I had put into meticulously researching and spec'ing the build before.
Super annoying compared to the rest of the build being a beast of a machine and watercooled that's so quiet I'm more likely to hear the noise floor on speakers than the PC (which is on the desk, next to said speakers).
a) Maybe coil whine is an intrinsic factor in the manufacture of graphics cards, similar to dead pixels on displays. "Luck of the draw" when obtaining one is the only way to win. Cycle through RMAs until you get one with little to no coil whine.
b) Or, it depends which company you buy from: each of Asus, EVGA, Gigabyte, MSI, Zotac et al are supposedly better or worse than the others.
c) Or, it's not a problem with the GPU at all; rather, it's an indication of a poor quality power supply (PSU).
I've never seen an informed analysis from an industry engineer who has a goddamn clue what they are talking about. NVIDIA could probably enlighten us all with an exact-science explanation, but that seems unlikely. My uneducated guess is that the situation is closest to option 'a' above, and that rejecting units for coil whine during quality control would drastically reduce production yield.
Instead, ping a few review sites and see if any of them are willing to take a crack at it.
Fortunately it happens when at high load, which is while playing games. This doesn't help for quiet scenes, though.
Is there a way to fix this sound in video cards? I'll have to investigate.
Most cases made today don’t have any significant dampening material. It’s pretty trivial to add some to the panels without significantly affecting cooling capacity.
That would require making your own GPU PCB, and maybe a year's worth of studying on electronics and power supply design.
A year’s worth of power supply design? Have you looked at the LM7805 datasheet? The circuit is one regulator and two capacitors...
And designing a 250W-capable linear regulator is not as simple as just hooking up a LM7805.
So with a modern CPU and GPU, you're talking ~400W of power to the actual components, and nearly 500W wasted in the linear regulators. This of course also means you have to get a 1000W PSU as a bare minimum.
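The exact wattage wasted depends on the rail voltages you assume, but the formula behind the point is simple: a linear regulator passes the full load current and burns the voltage drop as heat, so its efficiency is just Vout/Vin. A sketch with purely illustrative numbers (not from any real board):

```python
def linear_dissipation(v_in, v_out, p_load):
    """Heat dissipated in an ideal linear regulator.

    The regulator carries the full load current i = p_load / v_out
    and drops (v_in - v_out) volts across itself.
    """
    i_load = p_load / v_out
    return (v_in - v_out) * i_load

# Hypothetical example: a 1.0 V, 200 W GPU core fed from a 2.2 V pre-rail
waste = linear_dissipation(2.2, 1.0, 200)
print(round(waste))  # 240 W burned just to deliver 200 W (~45% efficient)
```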
Yeah and good luck driving a 200W linear element (if it even exists, lol) with a few op amps--the driver which should deliver a few amps into the gate/base of the pass element, which in and of itself is a pretty difficult challenge.
LMFAO you can't be any more wrong. You need /much/ more careful design to get GPU-compliant performance. The dI/dt on modern ASICs is insane, and you need an insane regulator to deal with it.
In the end I went with a Thinkpad and having seen the issues people I know have had with the XPS15 I'm pretty glad I did.
I had a Logitech G500 with awful coil whine and the opposite problem: it would stop whining when moving and start when idle. I suspect it had to do with the power saving mode that lots of mice have, where the laser power supply ramps down to dim the laser illumination after a period of no detected motion.
coil whine is highly unnerving while low fan sound is relaxing.
we like stimulus: the clicky keys of my old hp48 are neat, the insertion sequence of a pioneer 32x slot-in cd drive was amazingly subtle; not long ago I revived an old HP tape drive, and the tape rolling and the head gear were also beautiful.
Also, it was as cute as it was informative: a clear state-change side channel. Software notifications about hardware are often so decoupled that you don't trust them; plus they're invasive, unlike a tiny led, a click, or a tiny motor ramping up.
Now it’s just when the fan on my laptop starts taxiing for takeoff, which can take a lot longer.
I like the sound HDD make when grinding (except when I don't know the reason for the grinding... looking at you svchost.exe).
> coil whine is highly unnerving while low fan sound is relaxing.
Which is why I have been putting off getting a new laptop for years now. Most seem to suffer from coil whine and I can't stand it (to the point that I ended up using an old eeepc 1000he rather than a brand new 16-inch VAIO some years ago).
This might be "SuperFetch".
However my graphics card (RX 480) is quite loud, and one bearing is making noises.
But the Fractal Design case has really dampened the sound. For my home server, it's using a RM500 (which also never turns the fan on), and a low-profile Noctua CPU cooler. No other fans, but I do hear the 6 HDDs spinning and seeking when it's real quiet in the room.
I don't hear any coil whine, except when using headphones plugged into my desktop's speakers--probably the result of the speaker system's power supply. Klipsch ProMedia, if you're curious.
As far as noise reduction for CPU cooling, I'd suggest buying more air cooling than you need for a modest-TDP CPU. Between that and my fanless Seasonic power supply, and SSD, the only noise I can ever hear from my machines is from the GPU.
I'm looking at buying a new machine soon, and was actually looking at a Fractal Design case with either one of those 2 cooling systems.
This is an Ivy Bridge (IIRC) Core i5 at 4.0GHz.
Seems obvious in hindsight, but I had no fan (pump) speed warning or anything.
I was tired of the never-ending quest for silence, so I bought three 50-ft DVI cables and a couple of USB 3 cables of the same length and put the PC in the attic.
It worked great, except any hardware issues resulted in a trip to the attic.
I work from home, so i'm on it several hours every workday, combined with the fact that I tend to have multiple things in-progress all the time means it would be a giant pain in the ass to shut it down fully.
Still kinda miss that couch.
This was my second silent PC. The first one still had moving disks, but I went for as few fans as possible, and had passive coolers on the internals. This time I did the opposite: lots of fans, but have them spin as slow as possible. This works very well.
But coil whine and electronic hums are easy to overlook when you're choosing parts. It's worth looking at not just the fans and the power use (more power needs more cooling), but also the quality of the electronics.
But when I walk to the backside of my desk I can hear some electronic buzz from one of my monitors. What's funny about it: I've had that monitor for a few years now, and before I built that silent PC and turned my desk in another direction, I never noticed the buzz from the monitor :D
I had a Geforce 280 that would scream like hell whenever it was at full power and its framerate went below about 10 or above about 100 FPS. I was glad when it broke some other way and got replaced under warranty.
There are 5 fans in my tower, two on the CPU cooler, only one of those two fans is running constantly, at only 200rpm. All the others aren't running most of the time.
I don't need my computer to run at a cool 30°C all the time. The hardware can run very hot without any issues. And when all the fans eventually kick in under load, it will always keep under 70°C anyway.
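A minimal sketch of the kind of fan curve being described - the thresholds here are illustrative, not taken from any particular motherboard or fan-control tool:

```python
def fan_duty(temp_c, t_off=45, t_max=70):
    """Map a temperature reading to a fan duty cycle (0-100%).

    Below t_off the fans stay stopped (silence); at t_max and above
    they run flat out; in between, the duty cycle ramps linearly.
    """
    if temp_c < t_off:
        return 0
    if temp_c >= t_max:
        return 100
    return round((temp_c - t_off) / (t_max - t_off) * 100)

print(fan_duty(30), fan_duty(57.5), fan_duty(80))  # 0 50 100
```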
And it's doing fine. Its cooling is slightly over-sized since I want to keep fans spinning at very low speed, but I haven't seen much difference compared to when it was outside.
That's a silly and extremist position to take. "insecure" is relative, not absolute. It's a certainty that the software he's running has far more quantity of vulnerabilities and a much longer history of them. I don't know his exact use case, but arguably his use case isn't one where Spectre is particularly more severe than even a userland, non-priv-escalation vuln. (eg ransomware doesn't require root access to hold all your files hostage.)
> Eliminate the moving parts (e.g. fans, HDDs) and you eliminate the noise — it’s not that complicated.
Ha! And yet it deserved a detailed blog post. I'm surprised he would say this even after the amount of effort he spent.
Imo the AMD drivers are way better than Nvidia's. They're included in the kernel and therefore open source, unlike Nvidia's proprietary drivers, which have horrible support with a lot of compositors and lack support for DDC/CI over DisplayPort. The Nouveau drivers are better (slower performance but better compatibility), but are unable to change the clock speed (and are set to the minimum).
The AMD drivers "just worked". Selling my RX480 for a GTX 1070 was the worst decision I made when it came to compatibility. Now I can't even get Vsync to work with this Nvidia crap.
And Windows is behaving weirdly as well since I installed the latest drivers: a black bar on top of full-screen programs after waking from hibernation, the HD audio driver not letting pulseaudio start (I need it to get sound from WSL), and crashes when multiple 3D-accelerated VMs are open. Restarting the GPU driver (with either the shortcut or through device manager) is what solves all the issues, and they only occur with the latest driver.
And the crappiest part is that there's nobody that can help. Getting someone from nvidia to respond on their forums is basically luck, and I'm not a huge company that can get their reps to get someone to help me.
That puts AMD GPUs in a weird situation where you can expect them not to work very well when they are new, but to improve until you can forget about them. (The inverse of NVidia GPUs, which work ok when new but slowly lose compatibility with time.)
Also, you can't even compare NVidia's drivers to them, since they don't even support Wayland properly!
AMD linux support was downright abysmal pre ~2015.
NSG S0, once out, will most likely be the go-to case for such setups. Until then, an HDPLEX H5 is cool.
My desk has an H5 on it, housing an i7 8700 (non-K) and a GTX 1060. The TIM under the heatspreader is replaced with Thermal Grizzly Conductonaut, and Thermal Grizzly Kryonaut is used as every other TIM that the case setup needs. The CPU is on stock clocks with a voltage offset of -30 mV. The GPU has the power target reduced to 90% and clocks increased by 130 MHz, so that it is effectively undervolted as well. The PSU is a Seasonic Ultra Prime Titanium 650.

Prime95 with AVX throttles really, really fast - under a minute, perhaps - but is a very unrealistic load. Non-AVX stress tests and FurMark take a while to start throttling (20 minutes?), as the thermal capacity of the aluminum case is quite big. After hours of gaming, the GPU and CPU float around 80 C while providing full stock performance. I don't do 3D rendering (other than in-game) or video en/decoding, so I have not had long, real-world, full loads to see how temperatures behave with those.
From the discussion I've had and forums I've read, I think that people are afraid of putting more power in passive cases and having their components at "high" temperatures, despite those being rated for them.
I suppose blender would thermal throttle the cpu as well. If you run any non-Xeon/non-Laptop Intel chip (greater than 2k series) and care about temperatures - delid the bugger. (Xeons are soldered, laptop chips don't have IHS). Intel uses something that's worse than toothpaste, plus tons of glue between the die and the IHS. If you see temperature deltas under full load more than 9-10C between the cores, the thermal paste between the die and the IHS might have missing spots or have dried out.
In your case removal of the IHS altogether would provide decent results.
You might wish to check the VRMs, they are rated at 125C but if the case is hellishly hot inside, they might not be able to dissipate the heat.
Metal is an incredibly good conductor on its own, and the properties of thermal paste (typically) are just barely better than air. So long as your cpu and heatsink are fairly flat surfaces and mashed together physically, it seems like either forgoing or having the absolute minimum amount of paste is ideal. I've used a razor to leave an absolutely minimal layer of paste (e.g. filling in sub-millimeter surface structure) on my latest build, and cpu temperatures are well within a reasonable range. But I'm also not trying to OC the cpu or anything.
I am not certain how you have managed to come to such a conclusion. The thermal conductivity of air is around 0.03 W/(m·K). Good non-conductive thermal paste is like 12.5 W/(m·K) (or 400 times better than air). Conductive ones are in the region of ~40-80 W/(m·K), and aluminium is 237 W/(m·K). Air also expands when heated, pushing the cooler and CPU apart.
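Turning those quoted conductivities into ratios makes the point concrete:

```python
# Thermal conductivities in W/(m*K), as quoted above
k_air = 0.03
k_paste = 12.5        # good non-conductive thermal paste
k_aluminium = 237.0

print(round(k_paste / k_air))        # ~417x: paste is far from "barely better than air"
print(round(k_aluminium / k_paste))  # ~19x: bulk metal still beats any paste
```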
Normally, if you choose between "too much" and "too little" paste, you pick the former. The pressure pushes out the unneeded amount.
I would be extremely surprised if increased pressure due to air at higher temperature played any role whatsoever unless the bolts connecting the heatsink and cpu were very loose. If anything, I'd expect the increased conductivity of air at higher temperatures to dominate.
I'd also expect there to be effects at the metal-paste and paste-metal interfaces which reduce the effective system conductivity (i.e. phonons are much more likely to reflect in this scenario than in a metal-metal interface).
A fun thing to try is using a modern low-end CPU (latest i3s, Pentiums, Celerons) without its cooler. Not advised by Intel, of course, but you might get into your OS of choice even before it starts throttling. I'm somewhat comforted by the fact that a CPU automatically powers off once it reaches something above 100 C (103 maybe?) and throttles a few degrees before that. Those temperatures shouldn't leave the silicon damaged.
In practice, thermal paste is a must. If you don't like those (I personally don't, they get everywhere by accident and can be tough to remove), try getting an IC Graphite Thermal Pad which is reusable and rivals really good, if not the best thermals pastes, according to the limited number of reviews I've seen. I think that its practicality beats better results in non-highest-end applications.
Smallest amount of TIM, spread all over. NO "PEA" METHOD! All over!
The Cool Laboratory Liquid Metal stuff is the best but hard to work with.
The CPU is delidded! I've got another i3 4300 delidded as well running under a NoFan CR-80EH. Delid + Conductonaut + Kryonaut made the difference between throttling vs hovering around 90 C in FurMark + Prime 95. When integrated graphics aren't used, the CPU runs cooler, of course, and didn't throttle with MX4 thermal paste and no delid.
I do fear that VRMs are running too hot. When selecting components, I picked those that come with some heatsinks on VRMs at least. The motherboard is an AsRock Fatal1ty Z370 Gaming-ITX/ac (non ITX motherboard wouldn't fit in the case anyway with an ATX power supply). The graphics card is Gigabyte's cheapest offering and has a small sink across the VRMs. I'm hoping that undervolting will help keep the VRMs in check.
There are multiple scenes to render as benchmark (I guess BMW one is the shortest/most popular). https://www.blender.org/download/demo-files/
I enjoyed the earlier days of "Silent PC" building, ten or fifteen years ago. For example, building a silent tower or desktop for a DAW or softsynth back then in a recording/studio environment required some ingenuity. SSD? Not on a hobbyist budget. I recall one build, not mine, fully immersed in a bucket of oil (mineral?) for passive heat dispersion.
Today, as a new reference point, any MacBook Pro within the last few years may qualify as truly silent for many people's everyday usage. It does for me. And when I do heat up the CPU/GPU with heavy tasks, the fans spin up but then they go away completely as soon as the hard work is done. Back to silent.
No more spinning platters or crappy fan bearings or poorly engineered airflow nowadays. :-)
There's no hacker pride in buying off-the-shelf, so the performance bar for DIY is higher. Progress!
This is admittedly pointless pedantry.
Of course it can’t be completely silent. Heat generates air movement which is “sound”.
By 0dB he means 0dB SPL which is give or take correct.
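For reference, dB SPL is defined against a fixed pressure of 20 µPa - roughly the threshold of human hearing - so "0 dB" is a meaningful (if practically unreachable) target:

```python
import math

P_REF = 20e-6  # reference pressure for dB SPL, in pascals

def spl_db(pressure_pa):
    """Sound pressure level relative to the 20 uPa hearing threshold."""
    return 20 * math.log10(pressure_pa / P_REF)

print(spl_db(20e-6))        # 0.0 -- the reference pressure itself
print(round(spl_db(2e-3)))  # 40 -- ballpark for a quiet room
```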
It's really surreal and a number of people actually became nauseated at the sensation.
Linus did a silent PC build which even in a sound proofed case and at idle was about 14dB and broke 20dB under load:
Even the high-end microphone used to record the sound level in this video produced its own 7dB of noise.
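Decibel levels from independent noise sources don't add linearly; they combine as a power sum, which is why the mic's 7dB self-noise barely moves a 14dB reading:

```python
import math

def combine_db(*levels_db):
    """Combine incoherent noise sources given in dB (power sum)."""
    total = sum(10 ** (level / 10) for level in levels_db)
    return 10 * math.log10(total)

# A 14 dB PC measured with a mic that has a 7 dB self-noise floor:
print(round(combine_db(14, 7), 1))  # ~14.8 dB -- under 1 dB of measurement error
```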
(This is unabashed pedantry. But I'm on my lunch break, so...)
No point in making your PC less noisy than the noise floor.