iPad Air 9.7". The Notification Centre on the lock screen used to have two columns. I used to fill the screen with widgets and use it as a dashboard. Now, after various updates, I'm stuck with one column and half the screen wasted. It's even worse on the iPad Pro 12.9".
There's no apparent way to fix this bad UI design.
So now I have a Raspberry Pi doing the same task with no issues. Bonus feature: no update can remove this functionality again.
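For the curious, the whole thing is just a fetch-and-redraw loop. A minimal sketch (Python; the feed URL and refresh interval are placeholders, not my actual setup):

    #!/usr/bin/env python3
    # Minimal always-on dashboard; run full-screen in a terminal on the Pi.
    # FEED_URL is a placeholder -- point it at whatever status feed you use.
    import json
    import time
    import urllib.request

    FEED_URL = "http://localhost:8080/status.json"  # hypothetical local feed

    while True:
        try:
            with urllib.request.urlopen(FEED_URL, timeout=5) as resp:
                data = json.load(resp)
        except OSError:
            data = {"error": "feed unreachable"}
        print("\033[2J\033[H", end="")      # ANSI: clear screen, cursor home
        print(time.strftime("%H:%M  %A %d %B"))
        for key, value in data.items():
            print(f"{key:>12}: {value}")
        time.sleep(60)                      # refresh once a minute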
Status signalling is what a particular kind of "rich" person does to indicate they are rich or high value. It's often also used to bolster social capital. You see it when women buy extraordinarily expensive designer handbags, or when men buy sports cars but have no clue what's under the hood. The key element is not the actual product but the visible cost involved in purchasing it - expense as a feature.
I work in devops, so I can easily have quiet developer time and operational issues appearing out of nowhere. We're used to this mixture. "Crunch time" is only for short-term thinking.
So, on a perfect day, do you want to be an employee mashing the keyboard all day with no break? Or one just sitting there staring out the window? Logically, both extremes are unrealistic.
Now choose a reasonable mixture. Does that look like 75% mashing keys / coding / meetings / etc. and 25% staring out the window? Maybe it's 90/10? Or 60/40? No perfect answer exists; the only answer is "it depends".
A work week is not a sprint; it's more like a marathon. Each checkpoint has to be met, and you slow down or speed up as necessary. This is how I experience devops. Some hours of a week are "get it done NOW". Others are "do nothing - all is calm". A few are "we'll need this next month, so make some progress today". Plenty are "end of week" or "end of day" style tasks.
Interesting response. My takeaway is to optimize for more than just a given day, because not all days/weeks are created equal in their demands. Thank you.
When you sum up the redshifts of a large pile of galaxies, they are receding from us at speeds faster than can easily be explained by the Hubble model. One theory for this is a repulsive energy of a form we have not yet observed, dubbed "dark energy", which is purported to have been causing accelerating expansion of the universe since the Big Bang. More here:
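For scale, the baseline relation is Hubble's law, v = H0 * d. A quick sketch (Python; the galaxy distances are invented for illustration):

    # Hubble's law: recession velocity v = H0 * d.
    # H0 ~ 70 km/s/Mpc; the distances below are made up for illustration.
    H0 = 70.0  # km/s per megaparsec

    distances_mpc = [50, 120, 400, 900]  # hypothetical galaxy distances
    for d in distances_mpc:
        v = H0 * d
        print(f"{d:>4} Mpc -> receding at ~{v:,.0f} km/s")

Dark energy gets invoked when the measured speeds of the most distant objects come out systematically above that linear relation, i.e. the expansion appears to be accelerating.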
Creepy. But then again, if you make analogies for most web sites/services, they get creepy real fast. This is how far things have moved since the previous "analog" days.
Protobuffers always seemed like an interesting approach, but every time I've tried to use them in a prototype I've ended up deciding against the added overhead of including them. And then I use JSON, because all I really wanted was versioned serialisation when messaging or persisting data.
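By "versioned serialisation" I just mean tagging every payload with a schema version and migrating old ones on read. A rough sketch (Python; the v1-to-v2 migration is invented for illustration):

    import json

    SCHEMA_VERSION = 2  # bump whenever the payload shape changes

    def dump(payload: dict) -> str:
        # Tag every message/record with the schema version it was written under.
        return json.dumps({"v": SCHEMA_VERSION, "data": payload})

    def load(raw: str) -> dict:
        msg = json.loads(raw)
        data = msg["data"]
        if msg["v"] < 2:
            # Invented v1 -> v2 migration: v1 stored a single "name" field.
            first, _, last = data.pop("name", "").partition(" ")
            data["first_name"], data["last_name"] = first, last
        return data

    print(load(dump({"first_name": "Ada", "last_name": "Lovelace"})))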
Does Intel really think this approach is good for them? As a technical person, all I see is a company in trouble, with products it needs to lie about. This goes beyond marketing speak - it's deceptive.
The people who won't be fooled by this are precisely the customers who care about the actual 10% difference at the high end - and they probably want this chip anyway.
At $499, the i9-9900K is almost competing against the Threadripper 2920X ($649, 12 cores/24 threads, 4.3GHz boost clock, 60 PCIe lanes, quad-channel memory).
I think most people will find more use for the +4 cores (granted, on a NUMA platform) than for the higher clocks: cores for compiling code, rendering, video editing, etc.
Pretty much only gamers want the higher clock speeds, and more and more games actually use all cores these days (Doom, Battlefield, etc.).
-----------
That's the thing. The i9-9900K isn't even a "high-end chip" anymore. It's at best "highest of the mid-range" now that the HEDT category (AMD Threadripper, or Intel-X) has been invented.
Once you start getting into 8 cores/16 threads, I start to worry about dual-channel memory and 16x PCIe lanes + 4GB/s DMI to the southbridge. It's getting harder and harder to "feed the beast". A more balanced HEDT system (like Threadripper's quad-channel memory + 60 PCIe lanes) just makes more sense.
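Back-of-the-envelope numbers on the memory side (theoretical peaks; DDR4-2666 assumed on both platforms, and sustained real-world figures are lower):

    # Theoretical peak DRAM bandwidth = channels * transfer rate (MT/s) * 8 bytes.
    def peak_gb_s(channels: int, mt_s: int = 2666) -> float:
        return channels * mt_s * 8 / 1000

    for name, channels, cores in [("i9-9900K, dual-channel", 2, 8),
                                  ("TR 2920X, quad-channel", 4, 12)]:
        bw = peak_gb_s(channels)
        print(f"{name}: ~{bw:.0f} GB/s total, ~{bw / cores:.1f} GB/s per core")

That works out to roughly 43 GB/s (~5.3 GB/s per core) against roughly 85 GB/s (~7.1 GB/s per core): even with 50% more cores, the quad-channel part has more bandwidth per core to feed them.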
I wish. We use a commercial path-tracer that scales very well to many cores, GPUs and entire clusters when it's chewing away at a single fixed scene or animation.
But in interactive mode, many scene modifications are bottlenecked on one or a few threads and locks until execution gets back into the highly optimized rendering code paths. So a lot of work goes into quickly shutting down as many background threads as possible - letting the high turbo-boost clocks on the Xeon Gold processors kick in so the user doesn't have to wait long - and then ramping them back up once it's just rendering the fixed scene.
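Stripped of the renderer-specific detail, the pattern is roughly this (a sketch, not our actual code; in the real thing these are native threads, not Python ones):

    import threading
    import time

    pause = threading.Event()  # set = workers should park
    stop = threading.Event()

    def render_worker() -> None:
        while not stop.is_set():
            if pause.is_set():
                time.sleep(0.01)  # parked: frees power/thermal headroom so
                continue          # the few active threads can hit max turbo
            time.sleep(0.05)      # stand-in for tracing one bucket of the scene

    workers = [threading.Thread(target=render_worker) for _ in range(8)]
    for w in workers:
        w.start()

    pause.set()      # user edits the scene: park the pool; the (mostly)
    time.sleep(0.2)  # single-threaded scene update runs here at full turbo
    pause.clear()    # edit applied: ramp back up to full-width rendering

    stop.set()
    for w in workers:
        w.join()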
Agreed. Games aren't the only thing people do with lots of cores / HEDT. Give me a 128-core machine and I'll happily keep it busy all day with work. No need for a heater, either.