Building a Lightroom PC (paulstamatiou.com)
272 points by chambo622 on Jan 24, 2018 | 225 comments



I can't imagine spending $6000 on a computer for editing photos and then saying "It also means I don't plan to buy and meticulously use a color calibration device". My computer might not be half as fast, but at least I know my blues are the shade of blue I think they are.


Fair point :) I should reconsider. The UltraSharp line is factory calibrated but yeah, that doesn't mean much as it varies between displays.


It is factory calibrated for general desktop usage, but not necessarily for photograph reproduction.

Buy the cheapest Datacolor Spyder5 (S5X100). The hardware is the same between all the versions and only the software differs. Then get the open source DisplayCAL (https://displaycal.net/), which works with Datacolor sensors as well as those from other manufacturers. I calibrate my display once or twice a year. I also use it to calibrate my gaming computer and laptop.


I use the Open Source Hardware colorhug and it works with DisplayCAL

  http://www.hughski.com/
If you are also printing then you need to spend more money on a calibrator that works with both screens and printers.

I can't tell you how huge the difference can be even on high end monitors. I had a friend that swore all Macs were calibrated and he got mad at me when I laughed. After we did the calibration he realized how wrong he was.

Also buy this: SpyderCHECKR 24 and use it as the first picture at any new site. Color correction within 10 seconds.

https://www.amazon.com/dp/B00LPS46TW/ref=wl_it_dp_o_pC_nS_tt...


I used to work at Datacolor, and I specifically worked on Spyder and SpyderCHECKR. It's nice to see that those get lots of love when it comes to color calibration :) but it still amuses me that they haven't figured out how to deal with the fact that their high-end Spyder unit sales are getting supplanted by the open source color calibration libraries..


Shhhh don't let them know. Funny thing is that Professional Shops and video companies are really starting to embrace these open source libraries.

So what is your thought on the Color Hug 2? http://www.hughski.com/colorhug2.html

I bought the first color hug and it is still serving me well but I really like the new color hug 2 especially the latency tool.


> Shhhh don't let them know. Funny thing is that Professional Shops and video companies are really starting to embrace these open source libraries.

Oh, I haven't worked there for a few years, but someone I know already brought it to the attention of upper management (pointing out the fact that Amazon reviews for Spyder products literally tell people to use the open source libs) and they've basically just ignored it as a whole. Their products are cool, and there were a few great people, but the company itself was... I won't really go further into it. Alas, I haven't been involved in imaging in quite some time so I can't really comment on ColorHug :/


Or you can do it manually the old way with a colour reference sheet. More laborious and requires good lighting (preferably daylight).


Well, it's calibrated to a specific delta-E so the amount of variation is limited. In practice, unless you're in a room with controlled lighting and are doing professional work (neither of which seems to be the case), calibration doesn't really matter.
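For anyone wondering what "calibrated to a specific delta-E" actually measures: it's just the distance between the color a patch was supposed to be and the color the panel produced, computed in L*a*b* space. A minimal sketch of the classic CIE76 definition (the example values below are made up; newer formulas like CIEDE2000 weight the terms differently):

  import math

  def delta_e_76(lab1, lab2):
      """CIE76 color difference: Euclidean distance in L*a*b* space.
      Around 1 is often quoted as a just-noticeable difference; factory
      calibration reports typically promise an average somewhere under 2-3."""
      dL = lab1[0] - lab2[0]
      da = lab1[1] - lab2[1]
      db = lab1[2] - lab2[2]
      return math.sqrt(dL * dL + da * da + db * db)

  # target patch vs. what the panel actually produced (made-up values)
  print(delta_e_76((54.0, 78.5, 66.0), (53.2, 80.1, 67.2)))  # ~2.15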

(I went with an LG 32UD99 w/ some similar criteria of decent content creation performance and some gaming - it was reasonably priced, performs well enough, and more importantly, the thin bezels and lack of logos is quite schmexy - will have a hard time going back to anything branded. The LG comes w/ Freesync but I have since swapped out my Vega 64 for a 1080Ti, and as you mentioned, HDR is pretty pointless atm on PCs so ¯\_(ツ)_/¯. LG showed a slightly improved 32UK950 at CES.)

Anyway, nice build writeup. I didn't spot anything too out of whack on the build side, and the inclusion of the CL timing chart was a nice touch since I've often seen people get confused/mess that up.

As a fellow long-time Mac user forced to migrate (mostly in Linux, but Windows for VR and LR) I liked the part of the writeup on environment/keyboard tweaks. One thing I didn't see was about privacy. Windows 10 is very invasive. I used O&O ShutUp10[1] to go through all that stuff but there are a lot of tools [2] and other considerations [3].

Besides WSL, I've still found Cygwin to be indispensable as there's still system-level stuff that doesn't work outside the WSL (interacting w/ PowerShell scripts and such).

[1] https://www.oo-software.com/en/shutup10

[2] https://www.ghacks.net/2015/08/14/comparison-of-windows-10-p...

[3] https://senk9.wordpress.com/checklists/windows-10-privacy-ch...


Not only between displays, but also between lighting conditions in the environment where the display is used. In principle, you should recalibrate every time you switch bulbs – even more important any time the monitor moves to a different room!


My understanding is that calibrating the monitor only takes the light emitted from the monitor into account, so the environment doesn't matter for calibration. That said, the way we perceive colour varies depending on environmental lighting, so if you want to be particular you should use D50 or D65 lights in the room, especially if you're printing.


It depends on the calibration device you are using - some also take ambient light measurements to compensate.


In what sense are they calibrated if things vary between displays?


They do a base calibration. Do a Google search for Dell Calibration report. It shows what they calibrate. Keep in mind, it's still a pretty loose calibration. I've never once gotten two of the same models that look the same out of the box.


This has driven me nuts more than once. I’ve very intentionally ordered two “matching” monitors at the same time, supposedly from the same lot, and they were vastly different. It may be an overreaction but since a number of years ago I’ve gone to just one big and nice monitor (even if it’s not technically as nice as the others) because those details get in my head so much.

As the saying goes: “comparison is the thief of joy” - so I remove the second thing to directly compare to.


Factory calibration is a thing in the UltraSharp line; it can't be compared to hardware calibration but works pretty well out of the box.

http://www.tftcentral.co.uk/reviews/dell_up2718q.htm#hardwar...


I am under the impression they do some basic tests that don't entirely account for uniformity of illumination and consistency of color across the display. In other words, I think maybe they only test one part of the display.


An ICC profile with the screen's unique characteristics baked into the EDID? Just spitballing.


> My computer might not be half as fast, but at least I know my blues are the shade of blue I think they are.

I've wondered if one could do color calibration using any ordinary digital camera and some clever software, but I don't know enough about how color calibration actually works on a computer to figure out if it would actually work.

The idea is that you would have some reference photos of assorted physical objects of known colors. The physical objects are chosen to be items that are commonly found around the house or office, or are cheaply and easily obtained.

You pick such a physical object, place it next to your monitor, and tell the software which object you are using.

The software then displays several images of that item on the monitor. One is the reference photo, and the others are tweaked versions of that reference photo that tweak the colors.

You then take a photo with your camera, showing the images on the monitor and the object itself. You upload that to the software.

The software then compares the object in the photo to the images on the monitor, and figures out which best matches. From this, it should be able to deduce some information about how accurately the monitor is displaying color, and what adjustments could be made to improve it.

Note that this does not depend on the camera being accurately calibrated. It just depends on the camera being consistent, and having enough color resolution to see the tweaks that the software applies to the images on the monitor.

Note also that this should work for more than just calibrating monitors. It should also be able to figure out a color profile for your camera along the way. Then it should be able to print some test images, have you take photos of them, and from those and the camera color profile figure out a color profile for your printer.
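A minimal sketch of the comparison step described above, assuming the single photo of monitor-plus-object has already been cropped into per-patch arrays (the function names and the simple mean-RGB distance are mine, purely for illustration; a real tool would work in a perceptual color space and deal with glare, exposure and geometry):

  import numpy as np

  def mean_rgb(patch):
      """Average color of a cropped patch (H x W x 3 array from the camera photo)."""
      return patch.reshape(-1, 3).mean(axis=0)

  def best_matching_tweak(object_patch, screen_patches):
      """Index of the on-screen candidate whose photographed color is closest
      to the photographed physical object. Both measurements come from the
      same camera under the same light, so the camera's own bias largely
      cancels out -- consistency, not absolute accuracy, is what matters."""
      target = mean_rgb(object_patch)
      distances = [np.linalg.norm(mean_rgb(p) - target) for p in screen_patches]
      return int(np.argmin(distances))

  # hypothetical usage, with crops taken out of one photo of the monitor + object:
  # idx = best_matching_tweak(object_crop, [reference_crop, warmer_crop, cooler_crop])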


Color calibration is a chain: you need to know if your camera is calibrated too, so you need at least standardized targets. Finding random objects of known color that cover the whole color space sufficiently is not trivial; only then could you use your calibrated camera to measure a displayed target.

But I wouldn't cut corners there; just get a calibration device for about $100 plus a standard color calibration target. Certainly a more sensible investment than the difference between a 2k and a 6k rig.


Interesting idea, but I think there's a variable you're missing, which is the light by which the physical object is illuminated. The colour temperature and spectral power distribution of the light make a difference. If you used a custom white balance in your camera you should be able to get the colour temperature close to right (depending on how neutral the camera is), but to calibrate colours you'll need a high quality light source (real daylight is quite good, but varies a lot). At this point, a cheap calibrator is probably easier and far more accurate.


Having purchased the i1Display 2, the Huey Pro and then the Spyder 4 Elite, I've found these devices have a pretty finite lifespan of a couple of years at most.

Even if the manufacturer doesn't discontinue software updates for your latest OS, the sensors in the devices themselves seem to use fairly cheap plastic which is susceptible to discolouration over time, causing the calibrations to drift quite significantly.

It might be more cost effective to just have someone calibrate your equipment for you rather than buying another plastic bauble with questionable longevity.


Serious question: If you're not selling work to be printed or resold as stock, how important is calibration these days?

Given the proliferation of the screen as a photo consumption device, and considering how just about every screen out there has slightly different properties (color temp & gamut, brightness and contrast, OLED vs LCD), what will calibration necessarily accomplish?


Calibration puts you (hopefully) in the middle of the distribution of errors, so your viewer is more likely to see pretty much what you intended.

I publish a lot of black and white images in part to further reduce the uncertainty of the end-viewer's calibration. Can't get a perceived color cast if there's no color.


> Calibration puts you (hopefully) in the middle of the distribution of errors

That would be an interesting research topic! Perhaps the average color of un-calibrated displays is not the same as that of a calibrated display.

For example, most TVs are put in some very colorful mode at the factory and most users don't bother to switch it. The same is true for old Samsung mobile phones, which had a lot of popping colors; you could switch them to more realistic tones but nobody ever did.


>> Can't get a perceived color cast if there's no color.

But you can still have other problems depending on the dynamic range of your photos.

I often edit my photos so that stuff in the shadows is not perceptible. The problem is that I edit on my laptop's LCD screen, but on my phone's OLED at default settings I see clearly everything that was supposed to be hidden in the shadows.


I just use f.lux and mess up all my calibration :-)

Another way to mess it up is to buy glasses with "digital lenses" that filter out the blue spectrum.


there's also this gem: "I'm not a professional photographer. I'm merely a hobbyist."

o_O


Why should that tip off your BS detector? There's a long, long, long, long history of amateurs lavishing far far more attention on their rigs than pros ever could. It became a trope in Renaissance literature.

Basically, people get rich doing other things and then dive into their hobbies buying all top-of-the-line gear. Whereas the pro started with nothing and learned the basics, so they are able to get much much better results from substandard gear.

Holds across all creative disciplines.


I'm pretty sure a professional would rather buy a dual-socket Xeon based machine (say 2x E5-1650 v4) that has 4x the memory performance for closer to half the price.


A lot of professional photographers don't know much about computers and CPUs. Remember that "professional photographer" crosses a very broad spectrum of photographers, since the bar is basically "I charge for my work".

For better or worse, many professional photographers often rely on the advice of friends and salespeople for "something fast enough for what I need to do".


Why so negative? Some people spend a lot more than that on what I'd classify as less important stuff.

I'm not in a position to travel the world with high end photo gear. (I just recently replaced my 10 y.o. desktop pc case)

...but I am in a position to be happy together with those who can.

(And I also have rewarding and less expensive hobbies like learning to fix my stuff and coding side projects :-)


For $6000 he could have got something like:

CPU: 2 x Intel Xeon E5-2637v4, 3.5GHz (4-Core, dual threaded, HT, 15MB Cache, 135W) 14nm

RAM: 128GB (8 x 16GB DDR4-2400 ECC Registered 2R 1.2V DIMMs)

Graphics: NVIDIA Quadro P4000 GPU, 8GB GDDR5, 105W, Single-Width, PCIe 3.0 x16,

Storage: Seagate 1TB Exos 7E2 HDD (6 Gb/s, 7.2K RPM, 128MB Cache, 512n) 3.5-in SATA

just a quick build I did over at Silicon Mechanics


There's no real benefit of that much RAM in Lightroom (Premiere Pro is a different story). I also don't need spinning disk storage as I have a 12TB NAS (which is 3 years old, I can probably replace all the disks for 10TB models) for when I archive my shots, so faster M.2 SSDs were a priority.

The dual CPU setup doesn't help Lightroom either -- it's not terribly efficient with multiple cores, so I can't imagine it would do any better with dual CPUs. I'm trying to find the Adobe FAQ on the topic; I thought I'd read about that.

Quadro has some merit if I wanted a 10-bit workflow since I already have a 10-bit panel.. but I did kinda want to dabble in some gaming and VR with this build as a side-benefit so I went with a GTX card.


> There's no real benefit of that much RAM in Lightroom

This really depends on what you are editing. 13MP snaps from your ancient DSLR? Yeah, no point. 52MP snaps from your digital medium-format camera? You'll probably want more than you'd think. 4x5 or 8x10 negative scans? Open the floodgates.

(and remember that by default Lightroom does not parallelize across multiple photos within a batch job, so multiply your measurements by 4x or whatever if you're going to be launching multiple batch jobs at once to fully occupy a high-end processor.)

128 GB is overkill, but there's a solid justification for at least 32 GB or 64 GB for power-user situations.
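Some rough back-of-the-envelope numbers for a single decoded image, just to show how quickly the working set grows with resolution (the scan dpi below is an assumption, and Lightroom's real per-photo overhead is considerably more than one bare bitmap):

  def decoded_megabytes(megapixels, channels=3, bytes_per_channel=2):
      """Size of one uncompressed 16-bit RGB bitmap, in MB (1 MP = 1e6 pixels)."""
      return megapixels * 1e6 * channels * bytes_per_channel / 1e6

  print(decoded_megabytes(13))   # old DSLR                    -> ~78 MB
  print(decoded_megabytes(52))   # digital medium format       -> ~312 MB
  print(decoded_megabytes(88))   # 4x5" negative at ~2100 dpi  -> ~528 MB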


Running the latest Lightroom on a 1950X with a 1080 Ti, and CPU usage doesn't go beyond 20%. It's just unfortunate that all this computing power is wasted.


Oddly, Lightroom does not parallelize across multiple photos in a job. Try cutting your photos into smaller batches and launching multiple batches in parallel.

You would figure that's an obvious and simple feature to offer, at least as an option.
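Not a Lightroom feature, but the manual workaround is easy to script. A rough sketch that round-robins a folder of RAWs into N batch folders so separate export jobs can be launched side by side (the paths and the Sony .ARW extension are just placeholders):

  import shutil
  from pathlib import Path

  def split_into_batches(src, dst, n_batches=4, pattern="*.ARW"):
      """Copy RAW files round-robin into n_batches subfolders,
      one folder per batch job you plan to launch in parallel."""
      files = sorted(Path(src).glob(pattern))
      for i, f in enumerate(files):
          batch_dir = Path(dst) / f"batch_{i % n_batches}"
          batch_dir.mkdir(parents=True, exist_ok=True)
          shutil.copy2(f, batch_dir / f.name)

  # split_into_batches("D:/import/2018-01-shoot", "D:/import/batches", n_batches=4)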


I'm clueless when it comes to graphics cards. Yeah, in your case, you might really want to just massively overclock a few cores. :)


I work with huge 3D images myself (600x600x600 pixels/voxels); they can become quite demanding when calculating warps, etc.


That dual CPU setup gets you 4x the memory performance though... Which is likely your main bottleneck.


They included a $1499 monitor in the list as well, along with a $400 VESA arm mount, and hundreds more dollars in accessories.

Overall, with your Silicon Mechanics build they'd probably be closer to an $8000 build. On top of that, a P4000 is pretty much a beefed-up GTX 1060, which gets smoked by a GTX 1080 Ti if one is going for GPU performance. They're not really comparable builds.


I missed the monitors. Fair point.


I use both Lightroom and Darktable regularly.

While DT has a somewhat bumpier UX, it makes up for that by being more flexible than LR and really fast on Linux with a good GPU with OpenCL support. And there is nothing that LR can do that DT can't.

I.e. you don't need to shell out 6k for a new box to make LR run fast. Just bite the bullet and learn another RAW editing app.

If you have thousands (or tens of thousands, in my case) of RAWs processed in LR, just keep editing them in LR if you still have to. I often hear this as a reason why people don't want to switch.

I switched to DT in 2011. Everything before is LR. I use LR when I get images from friends who use LR and want me to do some advanced stuff for them or when I have to touch one of my pre-2011 RAWs. It is always slow on the same machine, compared to DT, but it's quite ok if you use it occasionally. :)


Lightroom is slow as fuck in my experience. Just bought a new laptop with a top of the line i7 CPU, SSD, 32GB RAM, 1050 GPU, etc. and it tears through video editing, solidworks CAD work, etc.

But even still LR lags constantly working with a small collection of photos from my M4/3 camera. Like I can't scroll through a collection without it hanging.

I assume this is caused by poor GPU utilization, but it's very frustrating whatever the cause.


Yeah, it was really weird to read about all the in-depth research and the extra mile the author went to in order to get an optimal build for Lightroom, only for the benchmarks to come out slower than a MacBook Pro in some areas, and in others to shave only 2 minutes off of 15. That's just ridiculous.

I shoot with a D3100, whose 14MP is a third of the author's 42MP, but with Darktable, editing RAWs on a Mid-2012 Macbook Pro has been a breeze.


Any reqs for a GPU?

I have not gotten a handle on my photo archive/workflow since switching to Linux, but want to get familiar with DT.


N.B.: the "switch to Windows" is from macOS and not from Linux, just in case anyone was wondering. I think it's very interesting that the availability of Linux on Windows helped the author choose Windows.


The switch stories are always either “from macOS” or “to Linux”, the former being an “oh no you didn’t” narrative and the latter an “I’m with the cool birds now” one.

It’s basically non-stories because your OS matters as little as what programming language you use.

Pick the one that works for you, but don’t become a fucking missionary or tie your personality to your brand of choice.


> but don’t become a fucking missionary

That might work for you and your choices but when you talk to developers whose software doesn't support your choice because "nobody uses it", and there are no native alternatives, you quickly see the fight is real.

The odd bit of platform evangelism is the reason we don't all have to use Windows. It's the reason we have a choice at all.


Yeah, I made the jump from OS X to Linux more than a year ago because of the hardware situation. I also dual boot Windows just for Lightroom. It took a bit of tinkering just because I decided to go for a somewhat esoteric distribution (NixOS) and also switch to tiling (i3 and sway). It’s different; I think it’s much better and tiling really keeps the clutter at bay. I am very happy with the setup, and when I have to use OS X (I still have the old MacBook Pro for time on the road) I don’t enjoy it any more.


>> Pick the one that works for you, but don’t become a fucking missionary or tie your personality to your brand of choice.

I find that people naturally tend to want others to be like them.

It doesn't matter whether it's politics, religion, OS, camera brand, camera sensor size, programming language, diet fad or whatnot.

Maybe it makes them feel better about the choices they made?


More like people want external confirmation that they made the right choice.


>It’s basically non-stories because your OS matters as little as what programming language you use.

>Pick the one that works for you, but don’t become a fucking missionary or tie your personality to your brand of choice.

I do what I want and you can't tell me what to do!

I think RMS was and is right: computing should not be viewed through a purely pragmatic lens, and I'm going to keep mentioning the importance of FOSS and copyleft to the future of computing, and how it enables freedom for the user, until I see fit to stop.

I think OS choice greatly matters. I say this as a senior sysadmin who has had to support all 3 major OSes for over a decade. Windows 10 was the final straw for me; I went completely GNU/Linux and haven't looked back since, except to feel sorry for all the Stockholm syndrome I see in those chained to those ecosystems.


The RMS message is different though.

It is not about any specific OS or platform, it is about making sure the owner/admin is the final arbiter of what can or can't be done on that computer (now and into the infinite future).


>it is about making sure the owner/admin is the final arbiter of what can or can't be done on that computer

Which you can't do with Windows, and only to a certain degree with osx.


As someone who made the same jump, Windows wasn't really a viable platform for non-dotNet developers until that happened. The state of Linux on the desktop is, frankly, a disaster — I spent months on trying to make it work, but wow, it's not worth the time.


Windows has always been a viable platform for software development.

I have been developing Windows software since Windows 3.1 and my first UNIX was Xenix, followed by DG/UX, Aix and many other variants.

Windows is a perfectly viable developer OS for C++, Delphi, Tcl/Tk, Perl, Python, Java and .NET developers.

Microsoft made a mistake by not following up on Windows NT POSIX support, because what many care about is the POSIX shell utilities and C APIs; the actual kernel is irrelevant.

However now GNU/Windows fixes that problem.


Exactly. I spent many years with a Windows desktop as my PC, developing for Solaris, and before that Primos (an ITS-derived OS) and Unisys A Series mainframes.


> GNU/Windows

I LOL'ed.


+1

That's crazy.


I must be doing something wrong. I've been using Linux for my primary desktop since 1995 (started with TurboLinux 2.0, then jumped to RH v5.2).

Was it a cakewalk over the years? No. Even today there are times where I'm in WTF mode about something or other (my favorite is when I do an update to the NVidia drivers and it borks my system hard enough that I have to restore my X config in some manner or another at the command line because I have custom crap in it).

But I don't regret anything about my 2+ decade decision to ditch Windows.


Linux on Windows is still a disaster? It's been something I want to play around with but haven't had the chance yet.


I read it as a standalone Linux on Desktop is a disaster. Not Linux on Windows.


Hardly a disaster. OK, OpenOffice isn't as good as Office, but it's a perfectly workable solution.


The fact that there is a wall between the Linux stuff and Windows stuff (“You can't edit a file that originates from the Linux userland inside Windows.”) sounds very hacky to me.


Just do a samba share from the Linux vm to windows


It's not a VM, it's like the inverse of WINE... it's a compatibility layer... and that one piece (not being able to edit files in the WSL from the Windows GUI) is what keeps me off of it.


Oh, why not use Windows Pro and Hyper-V then and run a Linux VM? That's assuming you don't run a small whitebox server to do your dev on.


I do.. and samba to use a GUI editor against files in the VM. That said, there's the overhead of a full VM... some of the same reasons one would use WINE instead of a full Windows VM.


Paul, did you consider buying an existing workstation like the HP Z840 ( http://store.hp.com/us/en/mdp/business-solutions/z840-workst... ) or similar?

I did this a number of years ago and never regretted it and dual-Xeon has really helped with DxO PhotoLab and Adobe Lightroom processing time (compared to all the other computers I had access to).

Even years later I still believe that this is performing better than something I could've built myself. I think of computers like cars, in that if you upgrade 1 part significantly for performance (i.e. the engine) that it forces upgrades to everything else (i.e. brakes, chassis, cooling). A pre-built workstation balances all of these things to give one package where all of the potential is achievable.


Following on from this, if you don't need the latest hardware (i.e. you are currently using a laptop as your main device), you can buy used workstations on eBay rather cheap.

At the end of 2016 I bought a three year old Dell T3600 with 32GB ECC RAM and an 8c/16t Xeon E5-2670 (Sandy Bridge) processor. I just threw in a modern graphics card (GTX 1060) and an SSD, and it's perfect for my needs. Total cost was under €600.

I still have a laptop, so if I'm out I will just SSH in to work.


+1 here. Buying used professional-grade gear is the way to go. I just built a fileserver with a better PassMark score than a Threadripper for 700 dollarydoos (plans for lots of concurrent transcodes), and got ECC RAM to boot.


Looking at that, you can only get workstation graphics, which is something he did not want since he does occasionally play games.


Low spec the graphics card and replace just that one component.

You'd still get more RAM support (and ECC at that), heavier workload CPU support, and most importantly a motherboard designed to shove that much data back and forth between the CPU, RAM and other devices.

What I realised with my gaming builds before is that the weakest link was just shifting the data around. The CPU was seldom maxed out, the RAM seldom straining under the workload, the storage not fully saturated... it was all constrained by how these bits fitted together and how the motherboard set the constraints on those components.

e.g. comparing the Core i7-8700K to a Xeon E5-2637 v4 (https://ark.intel.com/compare/126684,92983), the increased CPU cache, extra bus speed and more memory channels make a difference.

I basically doubted my ability to build a gaming rig that balanced all of the components to give the best performance for the money spent (for the same use-case as Paul)... that could rival one of the big company workstations. And when I looked at the money I was spending, I saw that HP were delivering more for each $ I spent.

In my case I purchased a standard HPZ800 (it's a few years ago!) and replaced the graphics card. It's great for gaming, and really strong for photo work (also have a Sony a7rii) and for video encoding.

Both routes (self-build rig vs workstation) are valid, just curious whether Paul went through those considerations too.


I think by the time you build up that Dell system to match the custom rig, it will be much more expensive. ECC has no benefit here (for anyone, really). And while you get more cores and cache with a Xeon, the 6-core overclocked i7 with double the clock speed would surely outperform it except in the most parallel of processing applications.

As for picking all the perfect components that work together, there are myriad people and places online crowdsourcing and benchmarking the perfect configurations. I doubt Dell is putting much thought and testing into their configs beyond reliability and cost.


There would be no difference at all for him with ECC ram. Except for slightly lower performance.


Hey - ah, that did not come to mind, no. I was a bit over-enthusiastic about the idea of making a project out of this and building one.


I'm leaning in the same direction myself. My 2017 MBP has been a disappointment. The touch bar is a nuisance. The excessively large trackpad leads to a lot of phantom taps, and I'm still using adapters to plug things into it.

Since I'm wrapping up the last of my iOS client work there's nothing really tying me to macOS any more and a PC seems to provide way more bang for the buck for the things I care about - Android Studio, VS Code, Ableton Live, ZBrush, and a bit of gaming here and there. Apple's hardware playbook for the last few years seems to consist entirely of making things thinner for thinness' sake and making piecewise upgrades harder and harder.


2016 MBP with touchbar reporting in. I have the 13” model without a dedicated graphics card, and I made the mistake of getting an external 4K display. The laptop can use the display, but everything is remarkably sluggish for a computer I paid $3000aud for. Especially safari with a few tabs open. The most embarrassing part is that I sort of got used to it and stopped noticing, until I spent time on my Windows gaming pc. The difference is night and day.

I’ve recently ordered an external thunderbolt GPU enclosure, and I’m hoping I can solve the performance issue that way. But in the long run ... I’m not really sure what I want to run on my next computer. To be honest all the answers seem a bit bad. Macs are overpriced and underpowered. Linux on a laptop still seems like an endless stupid time hole - I had the Ubuntu installer reliably kernel panic on me the other day. And windows ... does windows support smooth scroll yet? Can you turn off the telemetary and pre-installed games in the start menu? Will I be able to install and try out the database or exotic programming language of the week on Windows, or will it be more fighting?

Is it just me or did computers stop feeling better with each generation? When did we lose our way?


My work laptop is a 2017 15" MBP. I was able to get a 4K external monitor and it definitely felt sluggish. That's when I realized the $70 multiport dongle that Apple sells cannot do 4K at 60Hz, so I was seeing 4K at 30Hz. I had to get a dedicated DisplayPort to USB-C cable and now it seems to be fine.

I haven't had too many issues with performance. Few crashes here and there. The keyboard is more of an issue for me when dust gets under the keys. I don't mind the giant trackpad as I feel it gets palm rejection right most of the time.

By far the worst part of this computer is the touchbar. It's useless. I can't fathom what Apple was thinking making this required for all higher end Macbooks. The only thing I use it for is the volume slider, and occasionally buffering songs on Spotify. Other than that the ESC key is horrendous and it has no utility for me over standard keyboard hotkeys/fn buttons.


A guy at work just got his 2016 13" MBP replaced under warranty due to the slow graphics with an external display. There seems to be some known issue.

He loves his new one.


"Linux on a laptop still seems like an endless stupid time hole - I had the Ubuntu installer reliably kernel panic on me the other day."

Reading comments like this, I can't help feeling that somebody is missing a trick here: I don't doubt your experience, and I've seen similar comments, but they don't match my experience at all, which is that Linux works with no tinkering. It would be really interesting to collect experience reports from folks like you, to see why there is such a divergence, and figure out what could be done about it. Back in the day, the worst cause of Windows crashes was basically a single problem: the quality of the ATI drivers was bad, and once that was clear, it got fixed.


That's a great idea. For what it's worth, in my case I was using the 16.04.3 LTS installer (17.10 had been pulled at the time due to a BIOS issue). The kernel was panicking on boot because of issues using the open source Nouveau driver for my NVIDIA GTX 1080 graphics card. I needed to blacklist the driver to get the installer to boot. (And once I did that it ran fine.)

I ran into exactly the same bug installing Ubuntu on my friend's PC a couple of weeks ago. In his case he has an NVIDIA graphics card from a few generations ago. (8xx series? 7xx? I can't remember.)

Whatever the bug is, presumably it's been an unfixed problem in the 'LTS' release branch for 2+ years.


I think your problem is more related to the fact that your MBP is using a U-class CPU than not having dedicated graphics.

I used the built-in GPU on my desktop with a 4K monitor and outside of gaming, it did not affect the UI snappiness at all. I would suspect that if you had the 15" MBP with the quad core CPU and turned off the discrete graphics that you would notice that the Intel HD graphics 4k performance is pretty good.


I bought and returned a 2017 MBP just last week. It is a pretty awful experience when compared to the old model. The keyboard is truly awful. As you said the TouchBar is annoying and adds nothing positive to the experience. I think Apple could have left at least one USB Type-A port just to make life a little bit easier but nope. I feel the touchpad has got a little worse as well although maybe that is just due to the size of it. It is annoyingly large.

However the biggest issue was that when playing a video on Netflix or YouTube (both in Chrome) I would get dropped frames when switching around programs. That doesn't even happen on my 2011 ThinkPad so it sure as hell shouldn't happen on a 2017 high-end MBP!

So yeah £3k on a MBP and it went back after a week as it was just a pain in the ass to use.


> The touch bar is a nuisance. The excessively large trackpad leads to a lot of phantom taps, and I'm still using adapters to plug things into it.

Agree on the touch bar. Got myself a USB-C "dock", which does power delivery, USB, and display all in one connector. Truly magic, and made me resent USB-C a lot less.


Those are nice, but I haven't found one that would:

1. Pass through enough power to sufficiently charge my 15" MBP (not an issue on 13" I think)

2. Have 4k/60Hz DisplayPort port

3. Not get extremely hot when plugged in (I had three, including the official Apple dongle and they all had this issue O.o )


This seems like massive overkill, but it doesn't look like OP actually cared about money, given the $400 VESA mount and the $159 (wat) custom PSU cables. Also surprising that one would spend that much money only to get a little Bose speaker.

But if ya can afford it, more power to you.


In regards to speakers, I found over time I rarely use speakers and am thus happy with a rather basic set (maybe not quite as basic as OP's). Headphones are where I prefer to put the majority of my audio budget.


Totally fair, just surprising that one would splurge on literally everything else but not a decent pair of speakers.

Although it looks like OP does know what he is talking about and gave options for higher quality speakers if one wanted them.


> PCPartPicker.com is one relatively new resource that I have found to be invaluable.

Just the greatest site ever, if you're into building your own PCs. Fun to browse other builds, and then it makes it so simple to put together a system.


I second that - Anyone reading this thinking about building a PC, make sure you use this site. Can't stress that enough.


Second that, it pulls data from a number of retailers and has localized sites with local retailers which is a nice touch.

Something I noticed is that if the item price can't be displayed you can search around the item options and enter your own pricing, to get a feel for the price of the system if the parts were in stock.


One thing to keep in mind: Disabling UAC is a VERY BAD IDEA. No matter how you look at it. Look at it this way, if you do that, then any moderately smart ransomware will be able to do wonders on your PC.

That's reminiscent of a 4chan post where the user downloads a program, tries to execute it, is warned by Firefox but ignores it, warned by the AV software but disables it, warned by Defender and disables that, and finally warned by UAC and disables it too, only to get infected and flame away because Windows sucks.
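If you ever need to check whether a machine has had UAC switched off entirely, the EnableLUA value is the registry switch that a full disable flips (the notification slider's levels live in sibling values such as ConsentPromptBehaviorAdmin). A read-only sketch using only the Python standard library:

  import winreg

  def uac_enabled():
      """True if UAC (EnableLUA) is still turned on for this machine."""
      key = winreg.OpenKey(
          winreg.HKEY_LOCAL_MACHINE,
          r"SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System")
      try:
          value, _ = winreg.QueryValueEx(key, "EnableLUA")
      finally:
          winreg.CloseKey(key)
      return value == 1

  print("UAC enabled:", uac_enabled())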


Agreed - I mention that in the post as well. It can allow for any automatic software to take over. Known risk, especially for when I was setting everything up and installing 100 different things and getting the UAC full screen takeover so often.


I've had a similar experience doing Android development. I much prefer OS X over Windows 10 for doing work, but performance can be a real issue.

Over time, and with every new yearly incremental software refresh, compile times on my MacBook kept getting slower and slower, and it started to hurt productivity. So looking at options for new hardware, I realized I could either spend $3000 on a new Mac that on paper might not even be that much faster than my current machine, or just throw together a PC from components for a bit over $1000 and get a much faster machine.

There is just no good Mac option if you're looking for a simple i7, 32GB RAM, 256GB SSD for a reasonable price.


Some of the Thinkpad range fit your criteria, and can be tricked into thinking they are Macs...


So, the main reason he switched to a Windows machine was to use the latest/fastest hardware with Adobe Lightroom and Premiere, but then he goes on to say that Lightroom and Premiere don't take advantage of many-core CPUs, the fastest GPUs, fastest SSDs, etc...

And Windows is fine after tweaking and adjusting dozens of settings.

I’ll just stick with a Mac.


Premiere Pro definitely takes advantage of the GPU and CPU very efficiently. Lightroom does not -- which is why instead of going for some 18-core i9 7980XE build with more cores but a much lower clock per core, I opted for just 6 cores but with a very high clock.


I don't think that Premiere really takes advantage of video cards. I saw a video where they swapped a really old GTX 660 for a 1080 Ti and the render time did not change between them.


I did a bench of the Premiere Pro h264 benchmark ("PPBM") with CUDA acceleration and without (software): the difference was 7m 4sec without GPU and 48 seconds with GPU.


Sorry, I should have phrased it better, you don't have to have the latest and greatest video card for Premiere. But, yes, you are correct, having GPU acceleration does work.


It's possible the operation is bandwidth limited, in which case at some point a faster card won't speed it up any more. But GPU acceleration vs software is an enormous difference in Premiere.


In our workload this is definitely the case. When on the GPU the processing is almost infinitely quick. It's the transfer to and from the GPU memory which takes the time. This might be the case in the parent post.

There was a guy who managed to go directly from the SATA controller to GPU memory by modifying the Linux kernel for some postgres database queries and got massive speedup. Would love it if that sort of optimization could be more readily available.


Seems a bit short-sighted of the Lightroom developers; high-end Mac users have been worried about Apple ignoring the high end for years now.

Ryzen and Threadripper are supposed to be beasts for content creation that can use a lot of cores.

I recall reading in SOS (Sound on Sound) years ago that the big name recording studios were worried about this back then.


It doesn't feel like it uses my AMD card at all. Whenever I check Task Manager it's always using the CPU, even when rendering. I don't know if it just doesn't support AMD cards, but I'm not sure what would help increase its performance aside from RAM now.


Actual encoding for renders is all CPU based: https://forums.adobe.com/thread/2122549

Personally, I use DaVinci Resolve, but the results are largely the same, with exports being largely entirely CPU dependent: https://www.pugetsystems.com/labs/articles/DaVinci-Resolve-1...

Looks like Puget Systems does a lot of performance shootouts, including some Premiere ones: https://www.pugetsystems.com/all_articles.php

Philip Bloom recently did a writeup of his switch from a Mac to a PC for his editing machine (yes, it's definitely a trend, unless you are a FCP diehard or doing audio production, the Mac is sadly done as a content production platform): http://philipbloom.net/blog/makingtheswitch/

Will be interesting if they revisit some of the shootouts post-meltdown/spectre patching...


And Windows is fine after tweaking and adjusting dozens of settings.

MacOS or Linux also require tweaking to be useful for most people.


I probably flip several settings to make macOS liveable for myself. I’d have to do all the adjustments Paul discussed and probably more to make Windows liveable.


I on the other hand need about a day or two (interspersed with real work) to tweak a Linux system, a few more than that to tweak a Windows system, and a week or two for a Mac. And newer versions of all of them keep breaking the essential tweaks, so I am constantly frustrated by the OS getting in my way.

You can prove anything with anecdotes :)


to be useful? For most people? wow, that's quite a claim.


I am still shocked Adobe software handles 4k systems so incredibly poorly.


For image and photo editing - after many, many years I ditched Adobe Photoshop for Pixelmator Pro. Couldn't be happier; there are a few workflows that they're still improving with very regular releases, but god damn it's fast!


You can do whatever you want with Windows.

Even after tweaking dozens of settings on my Mac, I still don’t like it.


Can you see the time with a fullscreen video without having to minimize the video yet?


I didn't downvote you, but there are dozens of utilities to do this.

Here's one way to do it with AutoHotKey - https://autohotkey.com/board/topic/23879-always-on-top-clock...
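If you'd rather not pull in AutoHotkey, the same always-on-top trick is only a few lines in any GUI toolkit. A rough sketch with Python/Tkinter (placement, font and colors are arbitrary; like any topmost window it shows over borderless-fullscreen video, not exclusive-fullscreen games):

  import time
  import tkinter as tk

  root = tk.Tk()
  root.overrideredirect(True)        # no title bar or borders
  root.attributes("-topmost", True)  # keep it above other windows
  root.geometry("+20+20")            # park it near the top-left corner

  label = tk.Label(root, font=("Segoe UI", 14), fg="white", bg="black")
  label.pack()

  def tick():
      label.config(text=time.strftime("%H:%M:%S"))
      root.after(1000, tick)         # refresh once a second

  tick()
  root.mainloop()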


Sucks that the best approximation is an always-on-top clock.

It's in the taskbar, they have functionality to show / hide the taskbar, they should combine the two.


Not really because I can make it look like whatever I want. And I can place it wherever I want because Windows is flexible like that.

Meanwhile Apple doesn’t even let you change the color of your mouse pointer. You couldn’t even replace the dock if you wanted to because Apple hides the API that lets a replacement change the usable display size. They don’t even let you soft-disable an external display! LMAO

Windows is so much better. I can run Windows on any hardware that I want to and I can make it do exactly whatever I want it to. And it’s got way more software. And better software.


I'll take the downvote as a "no".


Mac isn't really an option if you want a powerful desktop though, unless you want to spend $5k I guess...


But a $5k Mac will lag a $5k Pc in performance.


Yet OP spent that and more...


For a whole lot more performance than you'll get out of your comparably priced Mac. I guess he did have to... change settings though, so the thousands spent on mac aesthetics are completely worth it.


I guess if your time is free or you enjoy all the labor that goes into research, assembly and maintenance, sure, this is cheaper.


It's really not that hard, and it's a myth that your free time is worth money. I have a child, a wife, and a demanding job. I still somehow find the hour or two it takes to order parts and the hour or two it takes to piece the PC together. I don't know about you, maybe you can just open a laptop and make $500 / hour on demand whenever you like, but I can't. Not sure what maintenance you're referring to either.


Yeah, but nobody is saying that OP made smart decisions here...


This is a brilliant post. The ability to navigate around the article easily and then parachute into a section and revel in detail is beautifully executed. Using links to add asides is another excellent touch.

When scolds complain that the Web is crap, this is where you take them to see what the Web could in fact be.


Can anyone shed some light on how some blog posts of his are done (coded, styled, built)? (For example, https://paulstamatiou.com/photos/new-zealand/ and https://paulstamatiou.com/photos/new-zealand/auckland/. They're simply marvelous!)


https://paulstamatiou.com/responsive-retina-blog-development...

Jekyll, and a whole lot of attention to detail.


I love the detail that the author goes into, it's a nice change of pace from the normal articles on the web. His travel photography is also stunning, almost makes me want to do something with my travel photos as well, but once I see how much time he puts in, I think I'm not going to do it until I'm well retired.

At any rate, I'm glad he enjoys what he's doing and shares it with the world; the photography really gives an effect of how he saw it, which is all too rare with people's vacation photos.


Word of caution 4K@27" is still pretty terrible in Windows. It's fine if you stay in specific apps for 99% of the time (such as a cad workstation or a Lightroom machine as in this case) but 27 requires scaling and many apps simply don't scale well.

The "simple" solution is to run 4K at 32" or larger where you can run att 100% scaling. Unfortunately, if you want a screen with decent color you are adding serious money to get to 4k@32".


Why is this said in every discussion of high-dpi displays? I've been using high-dpi displays on Windows for longer than Retina was a thing and it's totally fine with only the extremely rare app/dialog not supporting it. My main display right now is 4k@27" with scaling on. Most apps are built properly and Windows now even supports fringe features like multi-monitor setups with different scaling ratios (a feature I was eagerly awaiting).

It's not perfect and I'll occasionally come across some old utility that was built 10 years ago and renders poorly, but never a commonly used or modern one. The level of inconvenience doesn't even register.

What do you want to use that doesn't work? Please be specific!


I definitely use lots of apps that aren't rebuilt in the last decade which is pretty common in the enterprise (in house stuff).

But among the "new" and "widely used" apps on my desktop right now that don't scale well (i.e. bitmap scale) are e.g. Skype, McAfee, Cisco AnyConnect (it seems about 50% of the apps on my desktop are not properly DPI-aware, but simply bitmap scale).

That's not the bad part though, I can learn to live with a few blurry bitmap scaling apps (and the odd miniature one that doesn't scale at all for some reason), but the bigger problem is apps that partially scale so they become unusable. Here is a screenshot I just took, of JetBrains DotTrace (a profiler) at 2x scaling (At a laptop with 15" @ 4K): http://prntscr.com/i4tpmb It not only looks terrible, but some things are actually unusable.

The bad thing about that kind of bug is that when an app is unusable, even if it's only one app of 100 that you use, you have no choice but to change scaling or resolution just to do that task which is a terrible interruption, especially for scaling that might require a logout to take effect.

Edit: I don't consider multi-monitor with different DPIs a fringe feature, since high DPI is very common on laptops, so any setup with a laptop plus an external screen will often be using different scaling for the laptop screen and the external screen.
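For context on why the blurry bitmap scaling happens at all: it's Windows' fallback for any process that never declares itself DPI-aware. Properly built apps opt in via their manifest or one early API call; a sketch of how a Python script would do the latter through ctypes (constants per the Win32 shcore API, Windows 8.1+, with the older system-wide call as a fallback):

  import ctypes

  def make_dpi_aware():
      """Ask Windows not to bitmap-scale this process on high-DPI displays."""
      try:
          # 2 = PROCESS_PER_MONITOR_DPI_AWARE (shcore.dll, Windows 8.1+)
          ctypes.windll.shcore.SetProcessDpiAwareness(2)
      except (AttributeError, OSError):
          # Fallback for older Windows: system-wide DPI awareness
          ctypes.windll.user32.SetProcessDPIAware()

  make_dpi_aware()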


Wow that screenshot of DotTrace is pretty bad, thanks for sharing. Skype supports high dpi just fine though (I use it regularly).


You may be able to fix this issue by overriding the DPI scaling for this application to System in its program properties dialog.


A co-worker is seriously considering switching back to Mac because so many apps are misbehaving in a mixed-DPI environment in Windows. Maybe if both your monitors are 4k you're okay, but it really sucks if 1 is 4k and the other isn't, such as when you have a laptop with an additional monitor.


I actually use two monitors with different scaling and it works just fine for me. That’s why I mentioned it. What apps is your friend having trouble with?


>> Word of caution 4K@27" is still pretty terrible in Windows.

Well, YMMV. I have been running 2 x 4K@27 with 100% scaling for a couple of years now, and I am very happy. I don't adjust the default font sizes in my apps either.

FWIW, my screens are between 21" and 27" away from my eyes at their closest and furthest points.


Yes, I’d also opt for 100% and just ever so slightly too small text, over scaling.

You can always zoom text in web browsers, editors, etc., so most reading is fine anyway.

If you have good vision and the screen not too far away it’s certainly doable - but 32 is a pretty big improvement.


[offtopic] show me a 1080ti for $799 and I'll buy it on the spot.

Effing miners devoured everything even for twice the price...


Unfortunately yea that was the price when I purchased it - I got it shortly after launch and built the pc later


Well, if you think water cooling is cool, try passive cooling.

A few years back I built my current PC:

- Core i7-4770S (Haswell, so you see it's not that new anymore)

- 32GB RAM

- SSD 840 EVO 1TB (not that new either)

- some passive PSU

Last year I added an

- AMD Radeon 460

So the whole thing is completely passively cooled.

So no noise at all and I am pretty happy with the performance. I don't know what 'instant' performance in Lightroom means, but so far my experience with Darktable was just fine (actually, I was wondering why some options have a 'slow' or 'fast' suffix). That said, I am just a casual Darktable user, so I take all my photos in RAW and JPG and view them most of the time in the JPG version, but when I would like to create a photo calendar of some sort I use Darktable to get the most out of the pictures.

The only downside is that hardware doesn't age that well :-/

  $ grep -m1 bugs /proc/cpuinfo 
  bugs            : cpu_meltdown spectre_v1 spectre_v2


Even the newest CPUs have those bugs (well, AMD doesn't have meltdown iirc). That said, I think the old 47xx cpus have aged extremely well. Running a 4790K myself, and will probably hold out for 2nd or 3rd gen Ryzen (unless Intel has something competitive in price/performance). The latest CPUs only do much better for many-core chores, and I just don't do enough of it to spend $2k+ on a new setup (migrating some bits) over keeping what I have for a 10-20% boost in performance.


Passively cooling 65W TDP?


Apparently ;-)

Actually, I bought the CPU, mainboard and cooler assembled together from mifcom, so they did the testing that everything runs stable and smooth.

Just looked up some other details:

- Mainboard: ASUS Z87-Deluxe

- Case: Xigmatek - Asgard Pro

- PSU: 500W - FSP Aurum Xilenser

- CPU Cooler: Thermalright Macho HR-02 passive [1]

As you can see, that cooler is pretty big. The case isn't too fancy but it does the job and has venting slots on the upper side. Interestingly, dust doesn't seem to be a problem; in fact I've never had so little dust in any previous PC.

[1]: http://media.bestofmicro.com/N/G/460924/gallery/Thermalright...

[2]: https://duckduckgo.com/?q=Xigmatek+-+Asgard+Pro&t=ffab&iax=i...


Now that is a brick of a cooler.


Both my wife and I recently realized something: the JPEGs produced by the Pentax cameras we own are, 90% of the time, better than what we achieve with hours of raw conversion tweaking.

Given that I'm not doing retouching most of the time, dealing with raw files is becoming more and more of a pain, without an actual gain.

Has anyone else thought about this?


I usually shoot JPEG and RAW simultaneously. Most times I never touch the RAW files, but in case I ever want to, they're there. Disk space is cheap nowadays so redundancy isn't a real problem for me anymore.

I haven't used a Pentax except for my trusty K1000, so I don't know if they support that.


The higher end ones can - they even have an option to choose between PEF and DNG for RAW format.

This is what I'm doing as well, but the DNG files take 20-30MB of space each, even at a pathetically low 16MP resolution (/s, high quality 2MP was enough for A4 prints...), and disk space is not THAT cheap once you want local and offsite backups as well.
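To put rough numbers on that (the shots-per-year figures are assumptions; the 25 MB average comes from the 20-30 MB DNGs mentioned above, and "copies" counts the working copy plus a local and an offsite backup):

  def archive_gb_per_year(shots_per_year, mb_per_file=25, copies=3):
      """Yearly archive growth across working copy + local + offsite backups."""
      return shots_per_year * mb_per_file * copies / 1000

  print(archive_gb_per_year(5000))    # fairly active hobbyist -> ~375 GB/year
  print(archive_gb_per_year(20000))   # heavy shooter          -> ~1500 GB/year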


>> the JPEGs produced by the Pentax cameras we own are, 90% of the time, better than what we achieve with hours of raw conversion tweaking

Canon, Fuji and Olympus OOC JPEGs also have great reputations.

>> Given that I'm not doing retouching most of the time, dealing with raw files is becoming more and more of a pain, without an actual gain.

If you're not retouching, there probably is no point for you to edit RAW. I use RAW myself mainly to correct color temp and high iso noise reduction. The noise reduction in particular is useful for me because I prefer to use smaller sensor cameras.


A lot of the photos are great! And a lot of them... meh. Which is fine! I'm no star shooter, and far be it from me to judge someone wanting to show off their holiday. But he's spent an inordinate amount of time editing quite a few photos that surely could be left on the hard drive, or simply given a batch auto-touch-up by Lightroom in a matter of hours. And it's even stranger that the hobby would drive him to such a monumental and costly upgrade.


Consider the possibility that he simply has a (relative) lot of money.


And time!


It's a project :) I get some joy from going in depth with everything I'm passionate about. I do it with my photosets (I spent a year editing the 11k+ photos and then designing/developing the 9 photosets): https://paulstamatiou.com/photos/new-zealand/

and other projects like my RasPi Frame: https://paulstamatiou.com/getting-started-raspberry-pi/

or documenting one of my design projects at work: https://paulstamatiou.com/twitter-video/


Looks like he built a really fancy and nice gaming machine. Makes sense to switch to Windows if you are a gamer.


Good gaming PCs are good development and creative machines. All of these activities excel with more cores, ram and gpu cores.


The setup is nice but I'm long over the tower on the floor. I actually wanted to go back to the start of my computing journey (Commodore128, then 286, etc) and get a desktop[0] again.

It's very unfortunate that while Intel was still probably the right call for Adobe software, the I/O impact from Intel's Meltdown patches is going to be significant on that machine. Once it all settles down and is finalized in a few months, that is.

As well, the watercooling thing was pretty neat back at the turn of the millennium. That's when the Celeron 300A and custom-machined watercooling were big. I lost interest once CPUs became mostly no longer thermally limited (exception for Intel's current CPUs with the +~18C IHS issues). In general I prefer air cooling because I tend to always go for simplicity. A fan on a chunk of metal is pretty easy to troubleshoot and repair upon failure, and reliable.

I did get my 80's form factor desktop again. I built a Ryzen system[1] back in March 2017, the first new computer I put together since 2008 and I love it. I get by at least.

[0]Back in the day, horizontal desktops were referred to as desktops and towers were towers.

[1]https://i.imgur.com/JpgeAje.jpg


I can tell you are a computer wizard with the perfectly aligned windows and task manager open. How long did it take you to prepare for that photo?


There is a lot of advice in this article you should PRECISELY NOT FOLLOW when you build an editing PC.

> Watercooling

Watercooling addicts will try to argue that you can achieve better cooling performance than with aircooling. The truth is that watercooling is at best only 1 or 2 degrees cooler than a proper aircooling system, or even worse in some tests. Then they will argue that watercooling provides a better performance/silence ratio, which also falls short when you consider that top aircooling systems are basically dead silent. You really don't want to deal with water in your PC (even with all-in-one systems) for nothing to gain.

> Not using a calibration device

I don't even understand how you can come up with the idea of writing an article about building a PC for photography/video editing when you don't already use, and don't even plan to use, a calibration device. The point is not even whether you plan to publish, print or whatever; what is at stake is the way you view your own images. And there is ABSOLUTELY NO SENSE in buying a top of the line monitor if you don't calibrate it.

> Buying the best performing components

For 60-70% of the price of the top of the line product, you will get 95-99% of its performance. The same best-performing product will anyway be "obsolete" (compared to the new best-performing product) in a couple of months and its price will drop 20-30%. This has always been true and will always be true for any PC build. This is even more true for an editing station, since the top of the line will not even give you the 1-5% performance benefit you would expect.

> Buying a gaming video card

Particularly the top of the line. Your editing software will never use all the processing power and memory that come with such a gaming video card. And only pro cards will give you a 10-bit workflow, which is what you need if you bought an editing monitor with a 10-bit panel and 99-100% Adobe RGB coverage. You can buy an entry-level pro card; it will be more than enough for Lightroom/Capture One/Photoshop/Premiere/etc. If you don't buy a pro card, then why buy such a monitor?

> Bothering with huge overclocking or with RAM latencies (!)

I agree it's quite easy now to do some overclocking, but you should not aim for the extreme. It's your work machine, you want stability.

> Delidding your CPU

What?! Just don't do that. Let's be serious for a minute.


I think you are misinformed about the current state of watercooling (even AIO). I'm running a very hot overclocked machine at higher voltage than normal. A 280mm radiator is an improvement over even the largest copper HSF.

As for the gaming card - it's not top of the line (that would be Titan Xp and Titan V) - but yes it's very high-end. So why did I opt for that? Why not? I do a ton of 4K gaming and VR as well. Definitely not the primary goal but a nice side benefit. I mention some of the games I play in the setup section.

As for the pricing - obviously monetary concerns were not much of an object with this build but the price of my graphics card has doubled since I purchased it. And RAM has gone up another $100.

But yes - it will get outdated in just a few weeks. The benefit of the PC is that I can just swap in an upgrade to the next high-end thing over a weekend whenever I want. I already did that once - this build started out with a 7700K CPU/mobo and I swapped them out for an 8700K before publishing this.


> I think you are misinformed about the current state of watercooling (even AIO). I'm running a very hot overclocked machine at higher voltage than normal. A 280mm radiator is an improvement over even the largest copper HSF.

I'm not saying it's not an improvement, I'm saying the improvement is marginal, meaning it's not worth it.

> I do a ton of 4K gaming and VR as well. Definitely not the primary goal but a nice side benefit. I mention some of the games I play in the setup section.

I think that's my point, it's more a gaming/overclocking station than an editing workstation. The article is fine and well written from this perspective.

Oh, and a small bonus for you: you should move your CPU watercooling radiator from the front to the top. In your configuration, your radiator is basically heating your whole computer, including your graphics card. Since the graphics card is warmer and usually louder, this is not what you want.


Absolutely wonderful article. Your blog in general is incredible - nice to see folks maintaining such great blog content outside of Medium. Also gotta say - the quality of your photo sets, in particular NZ, is just jaw dropping.

Regarding your LR benches any ideas why your 7700K setup was so much slower than your iMac and MBP in Lightroom? Seems odd how much faster the 8700K OC is...

On a related note, I ran a similar test comparing a 7700K build with a GTX 1070 vs a maxed-out MacBook Pro 2015 15". The MacBook Pro was 25% faster, which is bizarre given that the 7700K is ~30% faster in single- and multi-threaded performance.


The guy went to great pains to share a very detailed article. Perhaps you could be a bit more helpful with your opinions rather than dismissive of his.

As for delidding and water cooling: while, yes, water cooling doesn't normally offer much benefit, that is because you are still cooling with ambient air. If your processor is overclocked, you will likely see larger improvements with water cooling. And for the i7 8700K, there seems to be quite a lot of anecdotal evidence so far that keeping it cool at 5GHz+ is difficult without delidding.

I am in the process of designing a new dev rig, and in the past I have always stuck with stock speeds. But the ability to overclock 6 cores from 3.9GHz to 5GHz seems worth it.

And you can purchase pre-delidded CPUs, guaranteed and tested for stability, from various reputable vendors online for an upcharge.


> I wanted to place the more active drive that I would install the OS on there, and leave the less active Lightroom scratch drive in the standalone M.2 slot.

> I couldn't quickly ascertain how each slot was identified in the UEFI and I didn't want to mistake installing Windows on the wrong drive. To solve this I only installed the SSD under the heatsink first and would install the other one after I had Windows up and running.

I've found this to be the best way to install Windows whatever the circumstances, as it will often install the bootloader on whatever disk the BIOS says is attached to slot 0 and the rest of the operating system on the disk you specified.

By only having one disk installed you save the hassle of sorting it out later.
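If you do end up installing with both drives present, one way to sanity-check which physical disk the boot partition actually landed on is to query it after the fact. A rough Python sketch (my assumption: Windows with the classic wmic tool on the PATH, run from an elevated prompt):

  # Rough sketch: list physical disks, then show which disk index owns the
  # boot partition, so you can confirm the OS went where you intended.
  import subprocess

  def run(cmd):
      # shell=True so the wmic WHERE clause quoting passes through unchanged
      return subprocess.run(cmd, shell=True, capture_output=True, text=True).stdout

  print(run("wmic diskdrive get Index,Model,Size"))
  print(run('wmic partition where "BootPartition=TRUE" get DiskIndex,Name'))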


It was also a good investment. After the recent spike in RAM and GPU prices, he can sell it and make good money.


Yeah no kidding - price of the graphics card doubled since I got it.


I use Lightroom in VirtualBox which seems to work ok for me

(I don't really do that much fiddling with the photos though).

The photos are actually stored on my Linux filesystem and provided to Windows via VirtualBox's file sharing feature.

And then I just rsync the photos to a backup computer.
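Something along these lines, with placeholder paths and hostname rather than my actual setup, wrapped in Python only for the sake of a runnable sketch:

  # Illustrative sketch only: mirror the shared photo tree to a backup box
  # over SSH. SRC and DEST are placeholders, not real paths.
  import subprocess

  SRC = "/home/me/photos/"            # local photo library (placeholder)
  DEST = "backup-box:/srv/photos/"    # remote host:path (placeholder)

  # -a preserves permissions/timestamps, -v is verbose, -h human-readable sizes
  subprocess.run(["rsync", "-avh", SRC, DEST], check=True)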


How's performance on that, have you compared with plain Windows or wine?


I think I tried and failed to get lightroom to run in wine quite a while ago. Maybe the situation with that has improved now? I haven't used it on plain Windows I'm afraid.

I don't do anything that intensive though, just brightening images and tagging which seems to work fine for me.


I believe Capture One Pro will take better advantage of all that hardware than Lightroom.


This is really interesting. Reminds me to a degree of John Siracusa's OS X reviews. I am struck by how customizable Windows is. Is it possible to customize OS X to such a degree?


I'm not sure, but much of this was to get Windows to function like baseline macOS anyway.


> Reminds me to a degree of John Siracusa's OS X reviews

Lacks the pixel counting...


Great write up, and incredibly impressed with your post on New Zealand.

I recently upgraded from a 5D2 to a Mark IV, and the integration with the iPhone made for an interesting workflow. As I shoot, I take mental notes about what type of adjustments I'll make in post-processing, and with Lightroom Mobile I was able to take a handful of shots and share them via social media.

I'm considering switching my workflow from my heavy-duty gaming rig to something like a Surface Pro.


Great article. I have almost the exact same components and I'll definitely get some use out of the overclocking segment when I get around to it.

Why did you go with a water cooler?


Is it really worth spending $6K on a non-work PC, even if you have the extra cash?

I think, personally, I would prefer a $3K machine that I replaced twice as often.


What I want to know about is that lovely desk


It's a walnut Humanscale Float sit/stand desk: https://paulstamatiou.com/stuff-i-use/#desk


Question for Paul, if you could use a Windows machine for work, would you? If not, why?


For actual work I would miss two things: Sketch (a design app), which I need to use because our Twitter design team has a lot of shared assets for it. There is Figma, but I can't use that in isolation. Then I also need Framer for prototyping, which is currently Mac-only. Those would be some very significant hurdles to jump over if I were to use it for 100% of my work too.


I had the same problem, but Affinity Designer helped me move to Windows.


Anyone can be an amazing photographer - but there is a distinction between a camera nut and a photographer. To this end, Ken Rockwell makes a really interesting point:

"You need to learn to see and compose. The more time you waste worrying about your equipment the less time you'll have to put into creating great images. Worry about your images, not your equipment."

http://www.kenrockwell.com/tech/notcamera.htm

You can guess which category I feel this fellow falls into.

I hope he got a great deal of pleasure building his system - it does look very impressive (if not downright pretty), but it doesn't make me think any more, or less of his photography.


I am definitely of the same mindset - but it's also my hobby and I do get some fun from building a bit of an overkill system (and documenting it in detail) :) I also viewed the build as its own project - I've tended to do these very long blog posts about such projects over the 12 years I've been running my website.

I started out with just the cheapest camera I could get my hands on, including working on my high school's yearbook staff in ~2000. Now that I've been working for over a decade and have some resources I pick up a few gadgets here and there for my hobbies.


Awesome!

I'm still a film guy when I want to go make pretty pictures - though I've taken some killer stuff on my iPhone of all things. I don't have a good digital camera, so all of my really good cameras are still film bodies - nothing to me beats the ergonomics of an EOS-1.

At the risk of giving myself the hug of death - https://leho.blastpuppy.com/~aloha/photos/


It's ironic that you quote Ken there, because he is definitely in the camera nut camp.

His entire site is dedicated to cameras and lenses, virtually nothing on actually composing or photographic technique. Most of his photos are generic family holiday snaps, particularly his newer photos.


Camera geeks have money - and he's a businessman first, photographer second. ;-)


Interesting point! I admit that spending a lot of time worrying about equipment that has little or no direct effect on the work, or on how you produce it, is over the top - but worrying about equipment makes a lot of sense for any creator, and especially in photography. Different lenses and different cameras will have a huge impact on your final photograph. Changing your tools affects your workflow and, as a result, your final photograph. Swapping cameras or lenses will obviously affect your final image much more than changing PCs, but the effect is still there and it is worth thinking about.


Isn't this a good use case for the iMac Pro?


I guess at the time of building this PC, the iMac Pro still wasn't announced. It would be great to hear what the author thinks about the iMac Pro now.


It was - but the iMac Pro can't be upgraded. Want a new motherboard/CPU for the next gen that comes out in 9 months? You can't just go out and swap the parts over a weekend. Also, the Vega GPU in it is not as performant.


Then I installed my essentials:

Atom text editor (give VS Code a try too). Also look at FileSearchEX, the proper way to find files.


I have little use for any of the features included in Windows 10 Pro

Except securing it with Software Restriction Policies.


Very nice and informative... I also think you like to write a lot :D


Step 1. Be wealthy :P

I am always enamored with how the simplicity of one's living space is directly proportional to wealth. I try hard not to have much stuff, but my living options are seemingly always going to be working-class-style places - pretty much the antithesis of what is apparent in this photo.

Of course the camera lies, and I shouldn't compare my insides to someone else's outsides. There's probably a new puppy being paper-trained just behind the camera, or a bunch of pocket change and random pens in a pile usually on that desk, but I do lament that I may never have such a simple, clean, and uncluttered living space. Any space I'm likely to have will have things like unvaulted ceilings, windowsills, trim, standard-size doorways between rooms, probably carpet, heh...

I do wonder if such things prevent clarity of thought.


"I have too much stuff. Most people in America do. In fact, the poorer people are, the more stuff they seem to have. Hardly anyone is so poor that they can't afford a front yard full of old cars."

http://www.paulgraham.com/stuff.html


There's another dead comment that explains a lot of it.

I cannot see a reason why that particular comment is dead so I guess it's based on the posting history.


One of the most liberating things I ever did was move country with only a single checked bag.

The sheer amount of useless shit that I threw away was astounding.


That is probably because they are buying non-running examples for parts and also do DIY, as they can't afford the >£200/hr for a BMW mechanic.


It's easy to get rid of stuff if it's no big deal to just buy it again. Less well-off people don't have that luxury.


Just stash crap in bins or drawers rather than on top of your desk.


Have you given up on making some money?


I guess no? But I did for sure notice a change in my ambition / perspective a while ago after I crossed the theoretical threshold mentioned quite a bit a few years ago https://news.ycombinator.com/item?id=1381927

I guess I value time more now? Honestly I may have to sit on your question a while and maybe it was one that was lingering when I posted my comment.


Oh, 60 000 USD/year? My first reaction is that this is tons of money, and that you should easily be able to afford fancy cameras, a high-powered computer, and a well-kept place to live, as we see in the link. Congratulations! Then again, maybe the cost of living where you are is higher than what I'm used to, so I guess I can't draw any certain conclusions.


Not related to the blog post, but I think Paul has one of the best looking personal websites on the web. Simple and clear, but very delightful to read. Also he has amazing travel posts with nice photos. For example, his Japan post triggered me to travel to Japan three years ago. Keep up the good work Paul.


Thank you!! It's the very rare comments like this that bring me joy. While I mainly do the site and photosets just for myself (so so many times when I'm out to dinner with folks and am telling a story about some trip I can instantly pull up the set and show the journey) it's nice to see when others get some value from it, especially for something that has consumed countless nights and weekends over the years.


I really like the blog as well, it's one of the nicest I've seen.

Are you using your own custom-made Jekyll site?

Just curious about how you designed your blog and what components you use.


I agree, the design is great, and the animations on the maps are really nice.


Agreed — never been to the site before, but had to add it to my "web design inspiration" bookmarks 5 minutes into reading. Delightful!


Paul (and anyone else who uses Sony cameras) I would like to suggest that you test out Capture One for processing your RAW files.

I have found it does a better job of handling the Sony files than just about anything else. It is also GPU accelerated.

My experience is based on the Sony RX10 files, and a friend of mine who uses it for handling his A7Rii files.

A free version is available for use with Sony RAWs; or you can upgrade to the full version of the software at a low price. Just a happy user, I have no financial interest in the company.


Lightroom is plain bad with Fujifilm X-Trans raw files so I tried to find an alternative but I couldn't. The Lightroom workflow really is what makes the difference. I feel at home with Lightroom.

So I keep exporting jpegs that look like garbage until Adobe decides to have decent support for the Fuji X line.


Try Capture One, since they have had specific support for X-Trans since version 10. Or if you want to stick with Lightroom, you can batch-convert the raw files with Iridient X-Transformer and then import them into Lightroom. Only worth it if you really want the best quality.


The first item in the ‘Tweaking Windows’ section is Disable User Account Control. Can someone share a clear explanation of what this means and the nature of the risks involved?


UAC limits software to user privileges, even if you're running an administrator account. If you're running an administrator account, you just click YES in a popup to grant elevated privileges when they're requested. If you're running a user account, you enter an administrator password. Disabling UAC lets anything you run use administrator privileges without alerting you. Similar to running a Linux box as root. It's a really stupid thing to disable.
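If you want to check what state UAC is actually in on a given box, rather than trusting the Control Panel slider, the relevant values live in the registry. A minimal sketch, assuming Python is installed on the machine:

  # Minimal sketch: read the registry values behind UAC.
  # EnableLUA: 1 = UAC on, 0 = UAC off (takes effect after a reboot).
  # ConsentPromptBehaviorAdmin controls how administrators get prompted.
  import winreg

  KEY = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System"

  with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY) as k:
      for name in ("EnableLUA", "ConsentPromptBehaviorAdmin"):
          value, _ = winreg.QueryValueEx(k, name)
          print(name, "=", value)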


It may be stupid depending on what you are doing, and how savvy a user you are.

The problem with UAC is that 90% of users have no idea when it would be necessary to click "No" when that dialog box shows up. For them, it's the box that always annoys you and that you just click "Yes" on to make it go away.

I understand what it's supposed to do, but I have had it disabled since it was released, and have saved hours of task interruption in exchange for no other problems.


> how savvy a user you are.

I strongly disagree with that caveat. As a savvy user, UAC behaves like a burglar alarm for me. I am not savvy enough to open a 7z, PDF, JPG or DOCX in a hex editor and determine whether it contains an exploit. Even if I were the only person on the planet with the ability to do so, I wouldn't have the time to do anything else. Because I have UAC enabled, if I open a zip file and get a UAC prompt, I know that something is fishy.

There are known unknowns and your savvy is perfectly suitable for that; however, your savvy won't help at all for the unknown unknowns. Double-clicking an .exe isn't the only way to get pwnd.

> you have to just click "yes" to make it go away.

Exactly, UAC doesn't really work for non-savvy users. In which case, who is the target audience?


My gut reaction was disabling it is rather silly but then again I can't remember ever clicking No for security reasons.


It's also a popup that comes up several times every week, or even every day, on Windows. I really wish it were easier for apps on Windows to run in user mode only and stick to it, but it seems like everything needs UAC on a regular basis for one reason or another.


That reason is developer incompetence.

MSDN is full of posts and articles on how to be a nice guy and run as a standard user, unless of course the application needs to make some low-level system calls.

And even then, there are ways to split the architecture between privilege levels.


It disables the administrative confirmation popup, e.g. [0]. It's kinda like Windows' weaker version of sudo.

When UAC is disabled, anything can run as administrator, but without confirmation or any real sign that it is, so it's a bit of a risk. Though not quite on the level of running a Mac or Linux box as root (Windows has another, higher privilege level: SYSTEM), it is close to it.

IIRC it also disables some security features of Edge, if you use that. It might also disable some security features in Office, which is a bit more of a concern.

[0] http://images.ntwind.com/winsnap3/uac-unsigned.png


The only issue I have with this glorious build is the totally inadequate Apple keyboard. It doesn't have many of the keys I need to be productive, not to mention it's simply too small to use. Also, it fits in with the rest of his setup like an apple in a bucket of oranges.

Also, OP should take a look at VS Code. It integrates very well with GNU/Windows.


[flagged]


The performance difference was negligible for the cost, and I will likely upgrade to faster 3D XPoint SSDs in a year anyways.


Really expected more when I read the title. It's a stretch to call this a "water-cooled machine" when it's just one all-in-one CPU cooler.

A water-cooled machine should have a custom loop. It's a lot more hassle, but that's exactly why it's interesting. All-in-one systems are just like traditional air systems with big radiators; the only big difference is that you can move the radiator around.


Custom loops are such a hassle between leak testing, constantly changing the liquid, and adding anti-algae additive. I did many of them back when I was really into PC building in the early 2000s (I was even on the Futuremark 3DMark Hall of Fame for a while for some extreme overclocks). The new AIO coolers are extremely performant and a no-brainer cost-wise. No harm in trying out the new tech :)


You don't have to constantly change the liquid if you don't go for the looks and use a good coolant instead of the shampoo-colored stuff.

If you have built custom loops, you should really understand the difference. AIOs are much closer to traditional air coolers than to custom-loop water cooling.

Keep an eye on the VRMs, btw. They're a point of failure in your system right now.


I don't agree, it clearly fits the definition.



