1. I like big, big monitors.
2. I prefer a full size keyboard.
3. I prefer a separate mouse.
4. I prefer big freaking disk drives installed.
5. I put the desktop under my desk, and with a wireless keyboard and wireless mouse, there is much less of a snarl on my desk.
6. The desktop has an optical drive I still use.
7. The desktop has lots of USB ports and they're all in use.
8. I can replace/alter parts of the machine without buying a new one.
9. Desktops are cheap.
10. I can build what I want with parts from newegg. Premade powerful computers are always "gaming machines" and I don't want a gaming machine that comes with a graphics adapter that sounds like a 747 taking off.
11. I want an all-metal case because a machine caught fire once.
Edit: 12. My desktop doesn't have a microphone or camera, so they cannot be surreptitiously turned on remotely.
Only for consumer level ones. Higher end ones are "workstations" and can generally be specced from mid-range-consumer-level to almost-a-friggin-supercomputer-node. ;)
As a bonus, no RGB. :)
Pick anything over the $800 range that isn't a MacBook and you're way more likely to hit clocks than not.
Note that workstation-class (H) laptop CPUs also make compromises on performance - the Ryzen 9 4900H is 8C16T but only has 4MB L2$, 8MB L3$, and a max TDP of 54W. A desktop Ryzen 9 3950X by comparison is 16C32T and has 8MB L2$, 64MB L3$, and a 105W default TDP (and will go much higher with even basic PBO if your cooling allows). The differences on the Intel side are even starker.
Yes. And as a bonus, I can use my Alienware 17 R4 as a throwing weapon. Or for workouts. And the power brick is a perfect cup warmer.
But I like it, nonetheless.
It’s obvious that it’s impossible at the moment to get 3950X performance in a laptop format, but you can get laptops able to keep temps reasonable with 35-50W, and that’s what a lot of laptop SoCs target as total power.
Those SoCs hit (and sustain) their top clocks, whatever those are for that specific SKU.
What I understood from OP is a common complaint about MacBooks: they consistently fail to sustain their specified clocks, because Apple deliberately under-specifies their cooling solutions for better ergonomics (and design reasons).
This is actually completely wrong. Almost no laptops sustain their top (boost) clock on heavy workloads. Most usually only sustain max performance for minutes (or seconds!) before throttling. Here's an example chart that shows how various premium Athena/Evo U laptops perform: https://www.notebookcheck.net/Asus-Zenbook-S-UX393JA-Laptop-...
On the workstation side, people complain about Macbooks, but recent MBPs actually throttle their Intel H processors less than a comparable XPS 15 for example: https://www.notebookcheck.net/Apple-MacBook-Pro-15-2019-in-r...
If you are interested in how modern Intel laptop chips throttle and what base and boost clocks mean, you want to do a search for PL1, PL2, and Tau. For AMD chips, you will want to look up STAPM, Fast and Slow PPT.
Note that while an i7-10875H's top "boost" clock is 5.1GHz, the sustained "base" clock is only 2.3GHz, which is so low as to be meaningless as a rated speed. In practice, unless your laptop's cooling is absolutely terrible, you'll probably end up mostly running in the 3-3.5GHz range under full load. In comparison, a properly cooled same-gen i9-10900K desktop should be able to maintain a sustained (all-the-time) clock of about 5GHz (very close to its 5.3GHz boost). AMD chips scale a little better thanks to 7nm's better power efficiency and how PPT works, but roughly the same ratio applies.
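To make the PL1/PL2/Tau mechanics concrete, here's a toy simulation of a running-average power limit, loosely modeled on public descriptions of Intel's behavior. All the numbers (45W PL1, 107W PL2, 28s Tau) are illustrative, not pulled from any particular SKU:

```python
# Toy model of Intel-style power limits (PL1/PL2/Tau): the CPU may burst
# up to PL2, but an exponentially-weighted moving average of power draw
# must stay under PL1. Numbers are illustrative, not from a real chip.

def simulate(pl1=45.0, pl2=107.0, tau=28.0, demand=107.0, seconds=120, dt=1.0):
    """Return the power actually granted each second under constant demand."""
    ewma = 0.0            # running average of delivered power
    granted = []
    for _ in range(int(seconds / dt)):
        # Bursting is allowed while the moving average is still under PL1.
        power = min(demand, pl2) if ewma < pl1 else min(demand, pl1)
        # EWMA update with time constant tau (discretized).
        alpha = dt / tau
        ewma = (1 - alpha) * ewma + alpha * power
        granted.append(power)
    return granted

g = simulate()
print(g[0], g[-1])   # bursts near PL2 at first, settles at PL1
```

The shape is the point: a burst near PL2 for a Tau-ish window, then a hard settle at PL1 - which is why short benchmarks flatter laptops and sustained loads don't.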
This is the problem with all the marketing BS. When I talked about "top clocks", I wasn't referring to "boost" clocks. I'm talking about the clocks the SoC is designed for (which won't appear on the box), i.e. a lot of laptops are not leaving "performance gaps" due to bad cooling; those SoCs are designed for that level of performance, and attaching a fat copper heatsink won't make much difference.
I had a lot of trouble with a client that complained that our board was not properly designed because the performance they were seeing was not “as advertised”. In the end we had to ship the whole thing to AMD, and have them test the system with a thermal sink. Everything was as expected.
If anyone is interested in this kind of stuff, your explanation is really good, so I won’t add anything because I’d probably do a terrible job :)
Back in the day, most CPUs had a fixed clock, but modern Intel and AMD chips simply don't - they all clock opportunistically depending on power, thermals, and also workload (try running an AVX-512 load, for example). How do you characterize "clock" in this context? Base (minimum) and Boost (hard limit, now split into Max Turbo <2C and All-Core Boost) seem like reasonably sensible numbers.
Now we can argue semantics all day, but to bring it back around: if you're just going to say "top clocks" means whatever the SoC was designed for at a specific workload/power envelope (in AMD's PB, that'd be PPT, TDC, and EDC), then every laptop will "hit its top clocks with no issues," but I'd say that argument (statement?) is a bit circular/pointless. ;P
How often is that a problem, really?
Multithreaded builds (make -j 24) can really hammer the drive. Read and write interleaved, which uses up cache in both directions.
A well proven way to move heat out of a silicon package to prolong its ability to perform at its highest potential. And in a desktop, you've got room for 'em.
I installed an EK spreader sandwich on my Samsung NVMe drive, and it made a massive (20 °C) difference over stock. It was previously a bare stick with no surface area/thermal sink to pull heat away.
You seem to assume it's the case that heats up to 80 °C or more; that's not true, and it's also not necessary for NVMe SSDs to go over their limit. Your CPU/GPU has fans moving the heat away - that's a better position to be in...
You will just have to accept that you've been wrong on this topic. Move on, it happens.
Read and learn! And then accept when an initial assumption turns out to be wrong.
Go to https://www.computerbase.de/2020-09/samsung-980-pro-ssd-test.... and look at the graph. You see that a bunch of them reach the 80 °C line or hover above it. All of them throttle (which you said does not happen). Shown going above the limit in that graph:
1. FireCuda 520 1TB
2. Patriot Viper VP4100 1TB
3. Samsung 970 Evo 1TB
4. WD Black SN750 1TB (+ the same one with a cooler)
This is only a small part of the market of course, but it goes to show that the throttling is a real thing that happens with multiple models.
You also moved the goalposts there, to claiming that those SSDs do not throttle under realistic workloads. However, this is a sequential read that's only 5 minutes long - hardly unrealistic. The hour-long constant-load benchmark is a different graph; but constant load is also realistic if it lasts longer than 5 minutes.
If you activate the other chart modes you see the measured performance, which shows the drops linked to the too high temperature, and that they did the same thing for write performance.
You can counteract this with a lot of targeted airflow and/or a heatsink; the heatsink will at least delay the throttling. Gamersnexus had a very impressive demonstration of this, one where they initially got it wrong: they had an article about an MSI SSD heatsink claiming it did not help (so the SSD did throttle - again, something you said never happens), but it turned out their measurement was off because of how they applied the temperature sensors (they glued them to the heatsink), and IIRC they also missed the higher performance they got regardless. GN usually gets it right, but stuff like that happens; it made this one memorable and highlighted the positive effect of these heatsink coolers.
I've been into this topic professionally for years now. I'm not wrong here. If you can't take my word for it, look at professional SSD reviews; they have covered this for years as well.
And sure: There are scenarios where this does not matter. Gaming. Browsing. But: In those workloads there is no significant difference to a SATA SSD anyway. These NVMe SSDs are only interesting if you have large (and thus: long) file transfers. This is what they have to get right (and some do, but not all of them).
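For anyone who wants to check their own drive rather than take either of our words for it: smartmontools reports the NVMe composite temperature and a counter for time spent above the warning threshold. A minimal parsing sketch - the sample text below is made up, and real `smartctl -a` output varies by drive and version:

```python
import re

# Parse the temperature lines from `smartctl -a /dev/nvme0` output.
# SAMPLE is illustrative; substitute your drive's actual output.
SAMPLE = """\
Temperature:                        78 Celsius
Warning  Comp. Temperature Time:    12
Critical Comp. Temperature Time:    0
"""

def nvme_temp_report(text):
    temp = int(re.search(r"Temperature:\s+(\d+)\s+Celsius", text).group(1))
    warn = int(re.search(r"Warning\s+Comp\. Temperature Time:\s+(\d+)", text).group(1))
    return temp, warn

temp, warn_minutes = nvme_temp_report(SAMPLE)
print(f"{temp} C, {warn_minutes} min above warning threshold")
```

A nonzero warning-time counter means the drive has spent real minutes above its thermal threshold, i.e. in throttling territory.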
By the way, by repeating that I'm wrong and by starting with a straight "No", by always commenting without reasoning and politeness, you made sure that I will correct you - and that I'm not buying into your strange attempts to correct your statements to something that is correctish. They don't work anyway, these SSDs throttle.
You should change your tone around here.
But you don't know what causes the throttling. It's stupid to shove a heatsink on NAND. It does not help. Not a single bit. Period. Under any normal daily usage, or even on a workstation, you're not going to be reading/writing so constantly that you'll ever cause the controller to heat up and throttle. If you experience any excessive heat, you have bigger issues in your case. Period. Your only proof of throttling is benchmarks running constant reads/writes over a period of time. That is not real-world usage and doesn't make it necessary to go out and start shoving heatsinks on every single NVMe drive. If that were the case, then all the laptops which leave space between the NVMe drive and the case, or motherboards which lack a 'heatshield' like the Gigabyte board you linked to, would have throttling issues. Which they don't.
> You should change your tone around here.
So now you're threatening me?
Anyway, I'm done; not gonna sit here and argue anymore.
There are all kinds of vendors / models, who knows what kind of throttling they use?
Having a fast machine that can sustain throughput and I/O is a perfectly rational desire, and for some of us, need.
To test that, one can try it with ramdisk first, before getting an expensive ssd.
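A minimal sketch of that idea: measure sequential write throughput against a directory, pointing it first at a tmpfs mount (e.g. /dev/shm on Linux) and then at the real disk. The function name and sizes here are my own, purely illustrative:

```python
import os
import tempfile
import time

def write_throughput(directory, total_mb=64, chunk_mb=4):
    """Sequentially write total_mb to a file in `directory`, return MB/s."""
    chunk = os.urandom(chunk_mb * 1024 * 1024)
    path = os.path.join(directory, "throughput_test.bin")
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(total_mb // chunk_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())   # make sure it actually hit the backing store
    elapsed = time.perf_counter() - start
    os.remove(path)
    return total_mb / elapsed

# Point it at a tmpfs mount to see what "infinitely fast" storage looks
# like for your workload, then at your real disk to see the gap.
print(f"{write_throughput(tempfile.gettempdir()):.0f} MB/s")
```

If the ramdisk number barely beats your SATA SSD for your actual workload, an expensive NVMe drive won't buy you much.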
I mean, hardware wise they are not much faster, but cooling is a different story.
The highest-spec MacBook comes with a 9980HK and starts at $2,800. A 3900X has approximately twice the performance in multi-threaded workloads, and you can easily build an entire quiet workstation around it for less than $1,000. Half the performance, thrice the price. Great deal.
Yes, there are also "laptops" with a 3900X in them. But even those still have lower performance than a desktop with a 3900X because of thermals.
Specs can never tell the true story, but it's clear that the mobile processor is going to be much slower for anything remotely processor intensive, and probably much more than twice as slow for anything making good use of multithreading.
Having had to go back and forth between a laptop and a desktop for a processor intensive application (AutoCAD) the difference was painful.
These are short tests too, so would be a best case. In real life, laptop performance is probably significantly worse due to thermal throttling.
While we're mentioning other minor gotchas, another one is memory latency and bandwidth. While desktop systems commonly have XMP and 1.35V support, very few laptops do (typically running at JEDEC timings at 1.2V). While there's diminishing returns, the difference between JEDEC 2933 CL21 or 3200 CL22 and say 3800 CL16 can actually be noticeable in certain workloads and is often effectively "free" (one-click in the BIOS) extra performance on the desktops.
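For a feel of those numbers: first-word latency is CL cycles of the memory clock, and the memory clock is half the DDR data rate, so latency in ns works out to 2000 × CL / data rate. A quick sketch using the kits mentioned above:

```python
def cas_latency_ns(data_rate_mts, cl):
    """First-word latency in ns: CL cycles at the memory clock,
    which runs at half the DDR data rate (in MT/s)."""
    return 2000 * cl / data_rate_mts

# JEDEC vs typical XMP kits from the comment above:
for rate, cl in [(2933, 21), (3200, 22), (3800, 16)]:
    print(f"DDR4-{rate} CL{cl}: {cas_latency_ns(rate, cl):.2f} ns")
```

So the "free" jump from JEDEC DDR4-2933 CL21 (~14.3 ns) to an XMP 3800 CL16 kit (~8.4 ns) cuts first-word latency by roughly 40%, on top of the bandwidth gain.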
Also, laptops generally have CPUs that pull 15 watts. Whereas desktops have CPUs that can pull 95 watts or higher (sometimes up to 150). The difference is astounding.
40, 100, 29 in pmset -g thermlog instantly.
CAD, for example, barring new geometry work in progress now, is single threaded.
People working on high surface count models want these things:
Big, fat, fast cache
Sustained sequential compute performance
GPU that focuses on geometry and precision. This is not generally an issue today, but can be on laptops.
Desktop machines with active cooling are where it is at.
Desktops are great.
Though with water cooling you might also have a lot of pump noise, especially if the radiator is mounted incorrectly.
The larger the diameter of a fan, the lower the RPM it can spin at to move the same amount of air (same cooling capacity) as a small fan. Provided you can put the air where it's needed (e.g. a 1-foot fan can't "focus" air onto a 6-inch radiator), a larger fan will just about always be quieter and more efficient.
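The fan affinity laws make this concrete: for geometrically similar fans, airflow scales roughly with rpm × diameter³ (while noise rises steeply with rpm), so a modest size bump buys a big rpm drop. An idealized sketch - real fans with different blade designs will deviate:

```python
def rpm_for_same_airflow(rpm_small, d_small_mm, d_large_mm):
    """Fan affinity laws: volumetric airflow scales roughly with
    rpm * diameter^3, so a bigger fan needs proportionally fewer rpm
    to move the same air (idealized; real fans differ)."""
    return rpm_small * (d_small_mm / d_large_mm) ** 3

# A 140 mm fan matching a 120 mm fan spinning at 1500 rpm:
print(round(rpm_for_same_airflow(1500, 120, 140)))  # -> 945
```

Under these idealized assumptions, the 140 mm fan only needs ~945 rpm to match the 120 mm fan at 1500 rpm - a large chunk of the noise gone for free.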
Oh, and having a water cooled PC with the radiator and fan inside the PC itself is silly. If you run the pipes outside or into your basement, your PC is almost completely silent, plus the cooling capacity will usually be much, much higher, because not having to cram fans and a radiator into a small enclosure lets you make them bigger and more efficient.
And historically, bigger case fans have been problematic - like those 200mm fans they tried to introduce some years ago. Bad static pressure, if I recall correctly?
Also, I wouldn't call having a radiator and fan inside the PC silly. The case eats a lot of the noise already, it's the easy and the common setup, and good AIOs are quiet and cool well. But maybe you just wanted to share a cool big water cooling setup with everything noisy routed into the basement ;)
The hoses aren't welded steel, you know.
In my case I have a DDC-style pump, and it's very quiet after I got a car wash sponge, cut a square hole in it, and put the pump inside. It looks ghetto, but it's inside a case, so who cares.
At first I was worried about heat, but it's lasted 10 years so far.
I heard a story about a DIY PC that did nothing but circulate water through the cpu cooler from a fish tank; the tank was big enough to dissipate heat through evaporation and other means, only occasionally needing a top off.
I actually mounted an automotive transmission cooler to the outside of my PC and made it part of the watercooling loop, because it was inexpensive and large. There are no fans on it, but it still reduces the work the standard pc cooling radiator with the fans on it needs to do.
The most limiting thing is TDP, which in the highest performance laptop processors is still capped at 45W, whereas a maxed out desktop processor can draw 100W or more.
See these tables for i9, for example, compare Coffee-Lake-S (Desktop) with Coffee-Lake-H (Laptop)
Maybe a small desktop. My desktop processor is 180W TDP (Threadripper 1950x), while some others are 250W TDP. You can also get a dual-socket workstation, for 2x CPUs (both pulling 200W each).
Thermals and power are significantly higher on desktops, it ain't even funny. Laptops win in power-efficiency, but absolute performance is always going to be a Desktop.
My MacBook, on the other hand, sounds like a jet every time I run yarn install.
So yes, desktops have higher thermals. But it handles it so much better than a laptop that it almost becomes irrelevant.
As a result, your 10 year old desktop is probably 50-100% faster than the most expensive Macbook Pro or Thinkpad. It can be quite astonishing swapping to even an old desktop after using a laptop for a long time.
I'm planning to drop in a 5950X upgrade in the workstation in a couple weeks as well. Looking forward to both the huge multi-threaded and IPC gains.
I have mine set to 80% and only set it to 100% if I know I'm going to be away from AC for a while.
Time after time, as someone who's been working on his 10-year-old desktop (with a replacement SSD + graphics card when the old ones died), I meet devs and analysts using laptops who reason more or less: "well, it says i7, it says x GHz, it says DDR3/4, and it's got a GPU with the same marketing number, so laptops perform the same as desktops, 'cause they have the same hardware in them, don't they?"
Clearly, what they really mean is "I've never worked with and compared against a desktop". I suppose one of the problems is that SOME of them who have "used desktops" were actually using neutered VMs in a shared corporate environment that run really poorly and are pretty under-spec'd.
But every time, it's actually been the case that not only is the desktop faster and cheaper, but things usually remained faster on an X year old desktop hardware vs more modern laptops for any serious workload.
Edit: and in case it needs to be said, I have both desktops and multiple portable devices in my household because the downside of desktops is clearly portability.
More expensive, less performant, and less serviceable with a suite of proprietary bloatware on top.
I might break the pattern this year though, I got a 3900x last year around launch and moved my 4670k to a home server. I've been pretty impressed with the 3900x while the 4670k has been maxing out all cores in the server for some tasks so was considering buying a second 3900X(T) to replace the 4670k again. But with the rumours about the 5900X, I might end up putting the 5900X in my desktop at the same time as a 3000 series GPU and moving the 3900x to my home server.
Most laptop cooling is awful, but you can certainly find laptops that are well built if you look for it.
Not sure you should weigh bloatware, either. You can trivially install a fresh Windows or Linux and you have to on a custom built PC anyway. If you buy a premade PC, it probably comes with the same crap.
For what it's worth, I switched to a desktop once the core race heated up - now I've got a 12 core 3900X.
I can't believe how much faster it is when it's using all the cores. Night and day. Highly recommended. And 12 cores is barely scratching the surface of the crazy workstations you can build these days.
For tasks 4 threads and less, it would be a bit faster than a current gen Intel laptop chip, but I don't think it would have been worth the portability penalty to me personally if that's all I did with it.
The problem is that this base clock speed is given by the CPU manufacturer, not the laptop maker. And Intel wouldn't know what kind of laptop it's getting crammed into. So careful definitions don't really help. Desktops are already big clunky things that have to be kept plugged in; I can trust them a lot more to deliver the CPU's promised performance. Whereas laptops notoriously make compromises, because customers tend to be very unrealistic about noise, battery life, etc.
Laptops being power-conscious, they're usually much closer to the point of maximum efficiency on the power curve. In that sense you get better performance per watt. But that's negated by the higher up-front cost.
At full load, it depends on the heat dissipation; Dell's Precision 7XX0 dissipates heat well enough to keep the CPUs from throttling, but the 5XX0 does not.
Advantages (and tradeoffs) of using a laptop:
- portability - I work as a consultant and I need to work on premises occasionally - if I had to switch between desktop/laptop it would be too cumbersome
- standalone when you need it - when I travel or am on vacation I usually need to do a few hours of work - I won't lug my full setup but I'm 100% ready on the go
- can develop for OSX/iOS with MBP
- thermals - I use a fully loaded 2018 i9 MBP and I have to undervolt/disable Turbo Boost in the office because the laptop hits full fan speed with a VM + IDE running, and people start turning heads
- lower performance compared to desktop equivalent and especially compared to best available workstation
I'm hoping VS code and remote development setups (either running VSCode in the browser hosted on my desktop or remote tools) get sufficiently good that I can get a lightweight ARM Mac and then I SSH to my home workstation - feels like the ideal solution if the tooling gets there
My laptop only runs Office 365 for email, calendar, and the occasional Word doc. It also runs Slack, and a browser for Zoom meetings and access to the Jira/Bugzilla web interface.
Oh, and RDP into XFCE on Ubuntu on the desktop that runs as a server.
Plus the server is cabled up to development platforms (serial ports, remote power, GPIO, JTAG, USB). So it cannot move, and even if it were a laptop, it could not move.
I don't require the laptop to be upgradeable or powerful, it is just the UI into the rest of the system.
It currently sits here with the lid closed cabled to a 27 inch display.
In Windows land the only hope seems to be AMD, while Apple has me hoping for A14-level performance.
Ideally Apple would push out a 2-in-1 with touch, but they are set on pushing iOS for touch - which is just too limited for all intents and purposes. I would gladly buy a premium Lenovo 2-in-1 or some Ryzen 5xxx series ultraportable Windows laptop (they are much better on thermals from what I've seen), but I still need an OSX client from time to time, unfortunately. There's just no flexibility with the Apple ecosystem - you either fit into their intended use cases or you're stuck with suboptimal tradeoffs.
I'd like to be more client-server these days too, using whatever device is convenient at the time but storing my data centrally so all my devices can access it, it's all systematically secured and backed up, I can also access it remotely via VPN, etc.
The key thing, though, is that I want it to be my server, not someone else's that I don't control and have to keep paying for.
Check out iStat; it gives you the ability to set a fan curve, and that has helped quite a bit with my laptop's thermals. I found the highest rpm I could run without hearing the fans, and set the two lowest points in the curve to keep it at or under that point, and I almost never hear my fans anymore. The only time I really hear them now is when something is compiling, and even then it's much more bearable since I cap the highest rpm at 80/85%.
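A fan curve like that is just piecewise-linear interpolation between (temperature, fan-speed) points. A small sketch of the idea - the curve points below are made up, and this is not iStat's actual format:

```python
def fan_rpm(temp_c, curve):
    """Piecewise-linear fan curve: `curve` is a sorted list of
    (temperature_C, rpm_percent) points."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, r0), (t1, r1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # linear interpolation between the two surrounding points
            return r0 + (r1 - r0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]   # clamp at the top of the curve

# Made-up curve: quiet below 60 C, ramping to an 85% cap.
CURVE = [(40, 30), (60, 30), (75, 60), (85, 85)]
print(fan_rpm(50, CURVE), fan_rpm(70, CURVE), fan_rpm(95, CURVE))
```

The flat segment at the low end is what keeps the fans inaudible at idle; capping the top point is what makes compiles bearable.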
You can have these with a laptop too. At home I use my laptop with an external screen, keyboard, mouse (the latter two are wireless), because it's much more comfortable.
It's not just the monitor being big, I want lots and lots of pixels. I'm currently running 3840×2160, and have a second monitor attached set up in portrait mode (so I can display a manual page while I work on the other one).
Of course, I'd get an even bigger monitor with more pixels if they didn't cost so dang much :-) How big, you ask? A wall size retina display! I've wanted one for 40 years.
Microsoft make at least two: the Surface Keyboard and the Ergonomic version. I own several of the standard ones, and they're the best-made keyboards I've ever owned (and the only ones my joints can currently cope with), but the fact that they're Bluetooth instead of USB drives me insane. I'd pay extra for a wired version that just instantly sends the keystroke every time instead of going to sleep after a while (while it's awake it's instant, it just wants to save battery after being idle).
If you really want a dongle, the previous Sculpt Ergonomic has one, but the num pad is separate :-/ It doesn't have the delay you talk about, and has much better battery life than the Surface version (had it for three years and haven't changed the batteries yet).
edit: two monitors is still a pain, though.
I’ve got a USB-C dock hooked up to my TV too with a wireless KB/mouse combo connected into that, if I want to plug my laptop in on the couch.
Even works with your phone, if you've got one that supports it. Recent flagship Samsung phones with DeX do, but I'm really holding out for more external monitor support in Phosh so I can plug in a Librem 5 like this.
Once the laptop is festooned with hubs, external drives, cables everywhere, might as well just buy a desktop.
I used an external monitor, keyboard and mouse when at the desk, even when I only owned a laptop. But I also use the laptop a lot on the couch, not just for traveling - so I wondered if I would find it annoying to have a separate desktop machine.
So far it's been well worth it. The desktop performance is much better (especially for gaming) and Dropbox, git, Chrome sync and VS Code sync make having multiple machines relatively painless.
Edit: I see in previous comments you were looking for a full-sized Bluetooth keyboard. I recently got a Logitech G915 and I'm pretty happy with it. Expensive though.
 - https://www.logitechg.com/en-eu/products/gaming-keyboards/g9...
The couch is fine for reading a tablet.
For me the trick when moving back and forth between the office and home was to have an external drive with everything on it and carry it back and forth. A better approach is probably remote desktop or remote login.
I travel with a cheap laptop loaded with only what I anticipate needing on the trip. It eases my mind to not worry too much about having it stolen, lost, or smashed.
Sure if I'm on a high latency link I might have to adjust to clicking taking a wee bit to register, but normally I don't have to adjust my usage.
Windows to Windows.
It's a dealbreaker for me.
not only is that hard, it's hard to find a bluetooth keyboard that isn't laggy as heck and doesn't occasionally freeze for a second or five. Or just give up until you re-pair it.
Maybe one of the next 20 that I try...
> How big, you ask? A wall size retina display! I've wanted one for 40 years.
I have always thought, since Digital Research GEM, that the "desktop" in GUIs should be your desk's top. We're nearly there...
Never had any issue like that with my full-size Apple Magic Keyboard. It Just Works™
With Thunderbolt/USB-C hubs, this is easily fixed.
At home, I have a similar setup but with a 27" colour-calibrated screen in place of the 34" curved.
When travelling, I use an unpowered four port dock which means I can connect five external drives without needing the powered dock and its power brick. The unpowered dock only struggles if I have more than a couple of platter drives (instead of SSD) connected.
Prior to that I used Apple keyboards which work fine but have low-travel chiclet-style keys which aren't as nice to type on (they are much quieter for office environments though.)
My setup now is a laptop with the USB-C docking station. So I plug in one USB-C cable and get:
30" external display, ErgoDox Ez, USB mouse, bigger speakers, 120W power, Mic for zoom calls.
Obviously there's a tradeoff, but the point is you can get many of the advantages of both fairly easily.
I know that purchase was effective, because a week after it was installed there was a big storm and power went out everywhere except my neighborhood. Just having a generator successfully wards off power failure, you never have to actually turn it on.
I submit this is objective proof that I am living in a simulation and none of you exist, you're just artifacts of the simulation.
Beware, you need to regularly maintain it too (hopefully it does a weekly starter test, and it probably needs yearly oil changes). I didn't check mine, and found the battery charger had failed at about 1 am when my wife had a flight out that morning. That was fun.
(We don't have everything on the generator, so it's not as effective as yours; our utility wiring is actually pretty fragile, and our well pump is one of the things not on the generator. Buying a portable generator for that seems to have helped, but I did have to roll it out a week ago.)
It would have cost twice as much to run everything with the generator, and that isn't really necessary, so it's a smaller one.
I worried about the battery being dead and no way to hand crank the generator, so I made sure that the generator could be started from a car battery or one of those zap-o-matic car jumpstarters. (I bought one of those last year, and had occasion to try it out on my stone dead car battery last month - it worked great!)
Fortunately, after about 6 weeks or so, the market forgets I did a transaction, and things start moving my way. I'm forced to be a long term investor.
Vijay, is that you?
Hmm. Is this something I can buy at Home Depot or something?
The UPSes we have all our key equipment running on have been some of the best tech investments we've ever made, even if they did cumulatively cost over £1,000.
Just reporting facts.
I have some servers in Copenhagen, and from their logs they've lost mains power about every 2-3 years. They have a UPS, but I certainly don't bother for a desktop computer.
I'm not sure what you mean by "most of these Tweets"; I'm reporting on my experience living in Silicon Valley, California, where power goes out a couple of times a month on average.
I have all servers etc on UPSes but wish I could find bigger UPSes as most can only hold up about 1 hour which often isn't enough.
until you switch to models with external batteries, they keep increasing the inverter size along with capacity, which is unnecessary for your use case.
you could also just buy chargers, batteries and inverters separately and wire them together.
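for sizing those parts, the back-of-envelope runtime math is simple. a sketch with made-up efficiency and depth-of-discharge numbers (real lead-acid batteries also fall off faster at high loads):

```python
def ups_runtime_hours(battery_wh, load_w, inverter_efficiency=0.85,
                      usable_fraction=0.8):
    """Back-of-envelope UPS runtime estimate. Real lead-acid batteries
    deliver less at high discharge rates (Peukert effect), so treat
    this as an upper bound. Efficiency numbers are illustrative."""
    return battery_wh * usable_fraction * inverter_efficiency / load_w

# Two 12 V 9 Ah bricks (typical small UPS) running a 100 W desktop:
print(round(ups_runtime_hours(2 * 12 * 9, 100), 2))  # -> 1.47
```

which is why a stock small UPS gives you an hour-ish at best: for multi-hour run times you need more battery Wh, not a bigger inverter.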
Of note, if various online resources are to be believed, you can't just hack and extend the battery capacity for longer run-times as they likely aren't rated for the corresponding higher thermal load.
they're not cheap, but i've bought cheaper and that stuff costs even more in the end.
power electronics that are safe and reliable costs.
> you can't just hack and extend the battery capacity for longer run-times as they likely aren't rated for the corresponding higher thermal load.
thermal load... of batteries?
i'd expect you'd confuse the microcontroller in the UPS. more batteries will require longer to bring up to voltage than it expects, which implies something is wrong. further all of the time remaining estimations it presents will be wrong.
Thermals referred to the power electronics which do get hot.
Yes certainly better to get equipment designed for it rather than risk burning down the house.
You’re right about the internet, though. It’s a monopolized system here, and the anticompetitive nature of it means it sucks across the entire country.
I guess YMMV... but it's not something I've personally had to worry about, and won't do anything about it unless the electricity supply gets significantly less reliable. They are saying that the move to renewables might make the grid less stable over the next few years though, so worth thinking about perhaps.
Frankly, I find nowadays I can't do much without an internet connection anyway... Maybe we should put ~5 second batteries into desktops just to sync disks and power off safely.
In a 240V system, that feeder will typically feed one or more pad-mounted distribution substations which feed properties and in very rural areas a number of pole-mounted smaller distribution transformers.
In a 110V system, many more properties are fed from pole-mounted transformers (because you want to minimise the length of 110V runs due to resistance losses, and you therefore do not want the extensive LV mains used in a 240V system).
In general, the US has longer lengths of vulnerable HV lines but fewer of them are radials (in other words, more US HV runs between two HV/EHV subs and can therefore be sectionalised and run from either end). Therefore an HV fault is more common in the US but it is less likely to take out as many people for as long.
In this case though, it scarcely matters since either system is likely to have properties connected to only a single HV line which comes off as a spur from an EHV/HV substation. These HV lines are often on poles in rural areas and therefore vulnerable to damage in heavy weather. This happens less in urban areas because HV lines tend to be buried there. In The Netherlands which is one extreme, everything under 50kV is buried but NL is a very dense country.
There is no point comparing your experience of power cuts in Amsterdam, London, Copenhagen, vs Chicago or NYC because power cuts are quite rare in all of these places. If you live in the English Lake District in a small village or in a small town in a rural area of the US, you are likely to have experienced power cuts. Performance on continuity of service measures like TIEPI varies much more within countries than between them.
I am from Germany. The latest wide-scale power failure I can think of was 2007. I remember another shorter one in Bremen which must have been around ten years ago. They happen really rarely here, in spite of all that FUD that wind and solar energy makes the power supply unreliable.
I do agree with the other items, especially number 10. It's the one big thing I miss from having a desktop.
I also bought a chromebook just for fun for $150 from the pawn shop. I have a "build farm" of various computers with different operating systems in the basement for use when there's a problem with one of the D targets, accessing them remotely with PuTTY.
Laptops are for shallow work, not much else.
For you. I get all my work done on a laptop and have for a decade.
This keeps most of the advantages (though the processor is weaker in the smaller form factor thanks to the cooling requirements), while also keeping some size, mobility, and power-saving benefits.
I got into this because I got tired of opening up the large cases, moving to a smaller form factor and external devices was much easier. I don't care for working on the move, and every place I'm likely to ever want to use the computer at will have spare monitors/keyboards/mice.
Also every laptop has a different - stupid - keyboard layout.
Also when pressing Ctrl+F4 I use the outside of my hand to press Ctrl, then my index finger for F4. That means that the F4 key needs to be over or to the left of the #5 key, or else I can't reach.
This makes the whole keyboard lopsided, rendering the rightmost quarter completely useless. Don't get why they would do that.
Let the 15 people in the world who _need_ the numpad for data entry use an external one.
Also just gave away the remaining one of two 24" NEC flat panels I bought 17 years ago. It still works.
1) I have 2 "big big" 4K monitors hooked up to my laptop so no problems here. The laptop's monitor is not used as the lid is closed.
2/3) I use an external keyboard and mouse. No problems here.
4) I have 2TB worth of SSD in laptop and I also have huge external drives array.
5) Said laptop is sitting on my shelf, I do not even see it. On my desk are 2 huge monitors on arms with VESA mounts and wireless keyboard/mouse. Said laptop is also running NoMachine so I can also access my few workstations and servers without lifting my butt.
6) I have an external optical drive but frankly I do not recall a single time in the last 3 years when I had to actually use it.
7) I have 2 external 10 port USB 3.0 hubs hooked up to 40Gbps Thunderbolt 3 port of said laptop. Again no problem in this department.
8) Yep. Desktop is much better in this department.
9) Not my desktops ;) They're server/workstation type.
10) This is how I build my "desktops". No argument here.
11) I do not know what to say about it.
BTW, it sounds to me that your laptop setup is indistinguishable from a desktop, and your setup is even less portable than a desktop, so why not go the cheap desktop route?
Not every laptop, but some do. Mine even does it while driving its own built-in monitor, but I do not really use it.
>"your laptop setup is indistinguishable from a desktop"
you got that part right
>"your setup is even less portable than a desktop"
Nope. When I am out of my main office for whatever reason, I unhook said laptop from all the cables and it works off-site, including all my development environment. If my stay away is extended (working for a month in an ocean-side cottage, for example), I would also take one of those monitors. I just do not run production servers/databases/etc. on my laptop ;). Btw, my laptop is soon transitioning from 32GB RAM to 128GB RAM to give me more flexibility with the databases.
Worst case, I can still access my workstations/servers remotely using SSH, or NoMachine if SSH does not cut it. Since I have a very fat internet pipe, it works just fine.
But the key for me is I do that because I have a work laptop and a personal desktop and I want no question about the ownership of things I do on the desktop.
I organized my windows and tabs and splits in such a way that it degrades gracefully when I switch from the external monitor and back.
How often do I really need that? Not often, but when I do it's really useful.
1. Not thaaat big, but I prefer sub-4K anyway: WQXGA support, and with a docking station more than one - check
2. External keyboard - check
4. Two M2 1TB Samsung + one 500GB SATA + optional external drives of similar sizes. More than enough
5. Admitted, docking station for some cases and connections.
6. Not anymore, USB bootable devices for the rare occasion. Other former reasons are more or less obsolete for my causes.
7. Docking station again
8. Lenovos are not so bad in this respect too
9. This one was > 2000 EUR - but it will last several years, as its ancestors did too
12. A piece of duct tape solves all camera problems
I am a consultant; the necessity of traveling around was partly responsible for the original choice, but even today I'm not missing much. I still have NASes with even bigger drives, and DIY desktops and rack-based servers for image processing (requiring multiple GPUs). But that's me and my area of work. For most things, the laptop is quite sufficient.
I would love elaboration on what happened.
I had to wash the floor, walls, everything in my office to get the stink out.
Replaced the mobo and graphics card and I was back in business, but this time I bought some sheet steel and set the computer on that, which hopefully would buy some time to get the fire out.
The machine had a metal case, which I'm sure greatly slowed down the spread of the fire. So for me from now on, it's metal cases all the way, baby.
This reminds me of an old joke: https://www.reddit.com/r/ProgrammerHumor/comments/46sere/a_p...
Desktop computing FTW.
Upgrading stuff and costs are both really significant for personal use (which is why I am typing this from my home desktop), but if money is no object, I just get a desktop-replacement laptop.
I hate the whole "let's just make laptops like tablets" thing, where "like tablets" means "no ports."
I need ports for all kinds of stuff, including driving robotics and interfacing equipment. Which is also why having a laptop is nicer than a desktop. It's really great to have a built-in UPS and be able to move my office easily.
EDIT: Just checked, and it's easy to upgrade the RAM or replace the battery, and I have room for a 2.5" hard drive and 2 NVMe SSDs (they make 8TB ones), plus an optional SIM card, a PIV card, and an SD card; and the built-in camera and microphone array are really nice to have nowadays. But I do love desktops. You wouldn't think 10 USB ports is that important, but nowadays everything uses USB, so it's nice to have for pure convenience.
I am pretty sure your computer has a microphone, because it most likely has a speaker.
I have only ever seen the hybrid mic/headphone ports on laptops and not desktop motherboards. They may exist, but I have yet to see one. Although I guess what matters is how the ports are actually wired.
... You can put a big disk on a laptop, USB3 hub, etc etc
I think main thing missing for me is a beefy GPU and maybe a bit more RAM.
In my case a mouse slows me down. I have to take my hands off the keyboard and reach for the mouse. The touchpad is right there where my hands are. My touchpad also has three physical buttons, which are very useful instead of tapping the touchpad (I disabled that.)
I really need a mouse only for playing games (which I stopped doing on my laptop many years ago) or to try out some very rare sites linked from HN. It doesn't happen every year.
This is probably the only ergonomics pro of a laptop (except that I carry it with me, of course). Just in case I'll need a separate keyboard again, does anybody know of a good full size one with a touchpad under the space bar and physical buttons? I googled and found many keyboards with a touchpad in place of the number pad or further to the right. By the way, the number pad is not important for me. I'd love not to have it on my 15" laptop. I can keep it there on an external keyboard, but it's extra travel for my right hand if I really have to use a mouse and I never use it anyway.
Just as a PSA for others: Laser printers in particular draw too much inrush current through their fuser heater for surge protectors/UPS devices to handle. They (the printers) should always be connected directly to the wall.
I migrated entirely off desktops and went laptop / mobile.
Got happy, as my current work benefits from me being highly mobile. Super glad I did it. Ended up surprised at just how much I can do with a Note-type phone and an optional keyboard / trackpad.
But, I never did replace fast and responsive. Just kind of coped and the other benefits made the whole thing worth it.
This pandemic has me rethinking some things and yeah, I want to build a nice machine. Want that workstation type feel and performance.
The closed computing argument holds more water every quarter too.
What we need is a reasonable battery pack, and software to throttle the machine down on an interruption. Make this package a couple hundred bucks, or something a person can just load their own cells into and it's bound to be a winner.
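The throttle-on-interruption software half of that package can be sketched in a few lines. This is a Linux-only sketch, and both the sysfs path and the supply name "AC" are assumptions (on many machines it's ADP1 or similar):

```python
# Sketch: watch AC status via sysfs and pick a cpufreq governor, so the
# machine throttles down to sip the battery pack when wall power drops.
# Path and supply name are assumptions; check /sys/class/power_supply/.
from pathlib import Path

AC_ONLINE = Path("/sys/class/power_supply/AC/online")

def on_ac_power() -> bool:
    """True if the box is on wall power; defaults to True if unknown."""
    try:
        return AC_ONLINE.read_text().strip() == "1"
    except OSError:
        return True  # no such sysfs node: assume mains power

def pick_governor(on_ac: bool) -> str:
    """Full clocks on mains; minimum clocks while draining the pack."""
    return "performance" if on_ac else "powersave"
```

A daemon would poll `on_ac_power()` every few seconds, write the chosen governor to each CPU's `scaling_governor`, and start a clean shutdown once the pack runs low.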
13. I'd like to work on some spreadsheets from my couch.
Damn...guess I'll buy a laptop again.
4. The size of the drive isn't an issue; you can fit big drives just as easily in a laptop. The only difference is that a desktop PC can fit more than two drives.
I've come to really like the convenience of being able to pick up my workstation and take it with me.
To me, there's only one really, really big disadvantage to laptops: the noise. If I put together a desktop, it's as quiet as possible. Powerful laptops apparently can't be quiet.
You can buy fairly powerful desktops configured with just the on chip graphics or inexpensive business class graphics cards.
I'll add a '13' to your list.
13. Bluetooth is evil, and any desktop I have seen with Bluetooth had it on a mini-pcie card, and could be yanked.
My desktop is always cabled via 1Gb Ethernet, so if yanking the BT also loses the WiFi, I don't really care.
If you are using wireless keyboards/mice, I don't know if the logitech adapters are more or less secure than BT. That could be an entire separate thread.
1. M.2 cards
2. Full-size PCIe cards (WiFi+Bluetooth combo cards, the Bluetooth part may actually connect to an internal USB header)
3. USB dongles
Probably less of a concern, but it also moves the heat output.
They have some really cool fanless builds; if you were using a CPU like the AMD 4750G, you would have zero noise, and it'd make for a very decent workstation.
You can connect an external display to your laptop. That's what I'm doing right now. Similarly for 2 through 6.
> 7. The desktop has lots of USB ports and they're all in use.
The external monitor will give you more.
I think reasons 8, 9, and 10 are all legit. I don't think there are any intrinsic benefits to going one way or the other. It all boils down to whether you feel the time you spend building your rig is worth the money you save.
It takes less than an hour. I enjoy it, and it keeps me familiar with the guts.
Most of the hour is just being careful I'm hooking up the wires correctly. In college I worked as an electronics technician, and one job was putting together a stack of Heathkit serial interface cards. I don't recall the exact times, but the first one took me an hour and the 5th one maybe 10 minutes.
It's also like taking the cylinder heads off my old Mustang V8. The first time took me 2 hours. The third time - 20 minutes. It's all in being familiar with just what to do, and having all the right tools ready.
Your points are all good but this one gave me pause for thought. We're all working from home and using our microphones and cameras more than ever. I'm using my regular laptop with plug-in microphone and webcam. Are laptop makers thinking in terms of upping the specs on the microphone and webcams in their upcoming models?
Maybe I'm nitpicking, but I feel like this is a bit much of an exaggeration. I just measured my Pixel 3 (at its thickest point, the camera lens) and a 2011 Macbook Pro (including the rubber bumper, admittedly, but those were super slim at the time), and the MBP is only ~2mm thinner than the Pixel. The race for ultra-thin laptops seems like a relatively recent thing to me.
In any case, because of all my remote collaboration these days, I bought a decent mike on a boom to use, and my colleagues say it is much better. Still have to find a decent camera.
Advice from my wife, a photographer:
- if you're still having picture problems (after setting the camera's anti-flicker to 50Hz/60Hz as appropriate), put diffuse illumination on your face e.g. a full-screen browser window on http://blank.org; or for more control--colour temperature as well as brightness--a cheap LED panel and desktop tripod (from a photo supplies shop).
- turn off any "smoothing" or "beauty" features in the cam's software.
- likewise, don't use any fake background features. Pin up a plain bedsheet behind yourself if you think your place is too messy to be the background.
Anyway, once there is a picture of any kind, in my experience most people care more about the audio being unclear or laggy, so you're well ahead.
Ideally you want a camera that does clean HDMI out (no ISO indicators etc), but even that can be worked around if you run OBS studio and crop it out.
It was able to be angled for comfort, etc. So, you'd be able to put a desktop somewhere (above? below?) and use it from bed.
No idea if they're still sold, but it wouldn't be too surprising. :)
Basically you can work on a laptop, unless it's a special extremely short term situation.
Out of curiosity, what do you use instead? I'm wondering what the best options are for workstation-type desktops.
If you do not need a strong graphics card you can always just get a weak one, with less energy usage. GT 1030 for example, though that's really weak, but it's also just a 30W card.
Or depending on the processor you need, you can use the integrated graphics. The AMD APUs are pretty great - Ryzen 3 3200G and Ryzen 5 3400G - and there are new 4000 versions that can be bought in kits or OEM machines. Intel has its own weaker integrated graphics on most of its processors, even the stronger units; strong enough for just office work.
If I'm running a big multicore job, the fan spins up and annoys me.
in case someone thinks I dunno how to write
Or are you just doing ssh work?
When I'm at my cabin with 10/2 Mbps internet it's like I'm at home, except when I forget myself and start watching YouTube, then I notice the reduced quality of the video (RDP selectively encodes fast-updating regions to lowish bitrate h264 or similar).
I even do my Teams stuff over RDP, bidir sound just works without fiddling.
- https://www.velkase.com/products/velka-3
That 'extreme example' weighs as much as many laptops and it's just a case. Even if you carried a small display, and everything else you need, I doubt most people would take that out if they found themselves with extra time at the airport.
Edit: Found the machine I was thinking of: https://en.wikipedia.org/wiki/Compaq_Portable
Edit: Especially with the rise of the Small form factor PC culture.
Small cases have gotten pretty nice but if you have to carry around a separate screen to go with it then you might not end up any better off than if you had an all-in-one luggable computer.
You can, until Apple integrates all the suppliers into their supply chain.
> 1. I like big, big monitors.
I've realized that I find it uncomfortable to use more than one screen, so I just use one 27″ 4K screen. But occasionally I open my laptop and put it on a stand to show some extra windows.
> 2. I prefer a full size keyboard.
My favorite keyboard is the older version of the Apple Wireless Keyboard powered by AA batteries. It feels really nice to type on it (maybe because I'm used to it.) I recently bought a brand new Apple keyboard because I thought it would be better (and I could recharge it over USB), but it's much worse.
> 3. I prefer a separate mouse.
I use an Apple trackpad, and I love it. I only go back to a mouse if I'm playing a game.
> 4. I prefer big freaking disk drives installed.
I have a 2 TB SSD, which is more than enough for me.
> 5. I put the desktop under my desk, and with a wireless keyboard and wireless mouse, there is much less of a snarl on my desk.
I like to put my laptop on the desk, and I only plug in a single Thunderbolt 3 cable for charging, gigabit ethernet, and my monitor. I really love my CalDigit TS3 Plus dock.
It's really nice that I can unplug a single cable and take my laptop to any cafe or co-working space. (Before/after the pandemic.)
> 6. The desktop has an optical drive I still use.
I haven't used an optical drive for about 5 years, and I don't own any optical disks.
My dock has a lot of USB-A and USB-C ports, and it's in a very convenient location.
> 8. I can replace/alter parts of the machine without buying a new one.
That's a great point, and it's one of the major downsides of using an Apple laptop. I do have AppleCare, and Apple's support is really amazing.
> 9. Desktops are cheap.
That's very true! But I was using a 2012 MacBook for 8 years before I upgraded it, and I think I'll probably continue using my current laptop for another 8-10 years. It's expensive, but it's a very high quality machine and I think it will last for a very long time.
> 10. I can build what I want with parts from newegg. Premade powerful computers are always "gaming machines" and I don't want a gaming machine that comes with a graphics adapter that sounds like a 747 taking off.
I certainly can't do that for any kind of laptop (either Apple or other brands.)
> 11. I want an all-metal case because a machine caught fire once.
Nothing to worry about there
> Edit: 12. My desktop doesn't have a microphone or camera, so they cannot be surreptitiously turned on remotely.
MacBooks have a hardware light that turns on when the camera is active, and I haven't heard any reports about hackers being able to disable it. Although one disadvantage with the newest MacBook is that you can't attach a sliding camera cover anymore, because it will crack the screen when you close it.
The microphone can be surreptitiously turned on, but it's not something I'm too worried about.
>6. The desktop has an optical drive I still use.
For what? You're probably better off (in terms of both convenience and IO performance) storing it on an HDD/SSD as an ISO and mounting it.
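For anyone who wants to try the archive-and-mount route, here's a minimal sketch. The device path (`/dev/sr0` is common on Linux), image name, and mountpoint are all assumptions for illustration, and the mount step needs root:

```python
# Sketch: copy an optical disc to an ISO image, then loop-mount the image
# so the drive can stay in a drawer. Device/mountpoint are assumptions.
import subprocess

def rip_cmd(device="/dev/sr0", image="disc.iso"):
    """Build the dd command that copies a data disc to an ISO image."""
    return ["dd", f"if={device}", f"of={image}", "bs=2048", "status=progress"]

def mount_cmd(image="disc.iso", mountpoint="/mnt/disc"):
    """Build the loop-mount command; read-only, so the image can't be altered."""
    return ["mount", "-o", "loop,ro", image, mountpoint]

def run(cmd):
    # dd needs read access to the drive; mount needs root privileges.
    subprocess.run(cmd, check=True)
```

After that, the disc's contents are browsable under the mountpoint with SSD-class latency instead of waiting for the drive to spin up.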
2. Burning a bluray for backup. (I do backups on different media types just for insurance, and I like that blurays are write-once and incompetence and ransomware cannot fk with them.)
No need. You still buy CDs, you're old enough that we can wait.
(I still buy them, too.)
2. Failing that, just archive your existing discs and ditch the drive for good. It's the year 2020, and most (all?) software is distributed digitally, so you shouldn't need to get out your drive in the future.