In this particular case, it's mostly that Bitnami doesn't understand its own users and potential customers. There are so many other ways to increase revenue without alienating the core developer base. Enterprises want stability; breaking changes are a poor way to convince someone to pay you.
My last company was a pretty heavy free user of Bitnami charts for various things, the biggest being Redis clusters. I can't imagine they could convert everything into clusters using their own charts before this kicks in. Very possibly they'll end up tossing at least a year's worth of licensing toward Bitnami.
Yes. Whatever money you get with this is going to be small and short-lived. The big money is in compliance. That is a GTM problem, not a technical one.
That could be a total gamechanger. With their new ARM chip's performance, if they are really prioritizing upstream contributions, I could absolutely see a world where my next laptop has a Qualcomm chip and runs Linux. I have generally viewed Qualcomm as little more than patent trolls for... pretty much the entire time I have been aware of their existence. It seems like they are intent on changing that, and I hope they do!
If anything, people have a much, much higher opinion of Microsoft now than they did even just 5 years ago. The arc can change when you have a good CEO like Satya.
I have had coworkers who were 10x more valuable than the other coworkers. If you haven't, then your company probably only employs people who are either incredibly talented or incredibly mundane.
You may have high performers (and low performers), including people who perform very well in a particular context or even help the whole team perform better. There is no question about this.
Putting an actual numeric factor on an individual, and especially on engineer productivity, as if we were on a factory line producing everything in a repeatable and measurable way, is more questionable. 1x, 10x, 1000x? People here are discussing 10x becoming 2x; it sounds very mathematical. Now, if there is an actual scale with an industry baseline and an individual measurement, I'm more than interested to learn about it, and I think employers should fairly compensate engineers based on the scale factor you would put on your CV.
> N3 is actually much worse than expected
> ~30% higher density in real-world products
I'm a software guy, so excuse my ignorance, but isn't 30% higher density a pretty big deal? Why is N3 worse than expected? Were expectations sky high and simply not met, or is a 30% density increase not as big a deal as it sounds to me?
I think a lot of people may not understand how density increases relate to performance improvements
It's the combination of density and cost increase that's the problem. I don't have the actual N3 numbers, but taking the original example, if you get 30% increased transistor density, but your cost per area goes up 40%, then as a customer you're not in a great position - you're still paying more per transistor.
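A quick back-of-the-envelope with those illustrative numbers (hypothetical figures, not actual N3 pricing):

    # Relative cost per transistor on the new node vs. the old one.
    # Both inputs are illustrative, not real N3 figures.
    density_gain = 1.30            # 30% more transistors per unit area
    cost_per_area_increase = 1.40  # 40% higher wafer cost per unit area

    cost_per_transistor = cost_per_area_increase / density_gain
    print(f"Relative cost per transistor: {cost_per_transistor:.2f}x")
    # ~1.08x: roughly 8% more per transistor despite the density gain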
While there are still other benefits to gain from a new node and increased density (despite the cost increase), if your cost per transistor goes up, it limits where you might want to use the new node (particularly in value-sensitive parts of the market).
There's been a long-term trend towards this point - the cost of a new node (the blend of developing it, implementing a design in it, and cost per transistor) has been spiralling up for like a decade+. These are the same pressures that have caused the consolidation around Intel/TSMC/Samsung(ish) at the bleeding edge.
I just don't know what is left to achieve, really. I have an M1 Pro and I just don't really hit any bottlenecks in my workflow as is (unless I am building some obscenely large legacy codebase).
It just seems to me that we are hitting a point of diminishing returns in terms of CPU performance because honestly, the speed of my laptop could triple and it would not noticeably affect my experience in any way.
The main areas of improvement that I would actually notice are better battery life, and faster RAM and SSDs (faster networking as well)
I am a YouTuber, and I spend a significant amount of time editing and rendering video. My main laptop is a full-spec M1 Max MacBook Pro, and when I'm home, I work on a full-spec M1 Ultra Mac Studio.
Both computers are extraordinarily fast, but I still spend a lot of time waiting.
I would be willing to spend a lot of money:
(1) to reduce that time,
(2) to significantly increase my laptop's battery life, and/or
(3) to significantly increase the size of my laptop's already-rather-gargantuan 8TB SSD.
Maybe I should become a programmer. Sounds like there's less waiting :P
This is more GPU than CPU, but I want to infer 3D models from my security cameras in real time so I can do some CSI "turn left and look behind it" shit. And use the overlapping textures for superresolution so I can shout "enhance!" and read the license plate reflected in the perp's eyeball.
As for reading e-mails and so on, yeah, we've pretty much reached peak e-mail.
I haven't described anything that can't already be done (well, reading a license plate in an eyeball was mostly an exaggeration). It just can't be done affordably in real time on a home computer. And it's just the first few out of many examples to come to mind.
I think people fall into the trap of conflating "this is what I do with my computer" with "this is what my computer is for." Obviously if computers are only for doing the things you can already do with them, then they won't benefit much from improvements.
> I just don't know what is left to achieve really.
Just use your imagination a little bit.
Unless you think your current workflow and the tasks you use your machines for are the pinnacle of what an individual will ever be able to accomplish?
Currently there are so many tasks so computationally intensive that they can only be processed on server farms that only the Googles and Amazons of the world can afford.
Seconding this. I'm wondering if the person you are replying to might be having a bit of a Dunning-Kruger issue.
I only recommend people I have worked with whom I would want to work with again. Not because of loyalty, but because they are good, and will make me look good for recommending them.
To clarify for other users, "Dunning-Kruger issue" means:
> The Dunning–Kruger effect is a cognitive bias whereby people with low ability, expertise, or experience regarding a certain type of task or area of knowledge tend to overestimate their ability or knowledge. Some researchers also include in their definition the opposite effect for high performers: their tendency to underestimate their skills.