In China, engineers hold the most power, yet the country prospers. I don't think the problem is giving engineers power; rather, it's a cultural thing. In China there is a general feeling of contributing to society; in the US, everyone is trying to screw over each other, for political or monetary reasons.
I believe this work is a continuation of the work the Asahi Linux people did to get games working on M-series Macs. It seems Alyssa Rosenzweig works at Valve as a contractor. Super cool work. Some seriously talented folks.
What a jump. I'd be curious to hear, first, why anyone would prefer Intel over pretty much anything else, but also, secondly, what the actual experience difference between the two is after working at both; it must be a very strong contrast.
On her website it says she is working on GPU drivers there. I wouldn't be surprised if that's something she greatly enjoys, and Intel gave her the opportunity to work on official, production-shipping drivers instead of reverse-engineered third-party drivers.
Maybe she was given a huge signing bonus to keep her from working on making x86 irrelevant? Combined with perhaps some interesting project to work on for real.
I wouldn't have thought so 5-10 years ago, but with Microsoft offering Windows on ARM, there is really no OS that specifically targets x86 (legacy MS products will keep it alive if the emulation isn't perfect).
The thing is, x86 dominance on servers, etc. has been tied to what developers use as work machines; if everyone is on ARM machines, they'll probably be more inclined to use ARM on servers as well.
Microsoft has tried Windows on ARM, like, 5 times in the past 15 years and it's failed every time. They tried again recently with Qualcomm, but Qualcomm barely supports their own chips, so, predictably, it failed.
The main reason x86 still has relevance and will continue to do so is because x86 manufacturers actually care about the platform and their chips. x86 is somewhat open and standardized. ARM is the wild, wild west - each manufacturer makes bespoke motherboards, and sockets, and firmware. Many manufacturers, like Qualcomm, abandon their products remarkably quickly.
Huh? Qualcomm announced the X2 chips just 2 months ago, with shipments starting early next year. I looked at a local dealer site and there are MS, Dell, Asus, and Lenovo WinArm machines (with current-gen Elite X chips).
Yes, Windows on desktop hardware will probably continue mainly on x86 for a while more, but how many people outside of games, workstation scenarios, and secure scenarios still use desktops compared to laptops (where SoCs are fine for the most part)?
1: It's not meant to be cute, but rather incredulity at declaring something to have failed that still very much seems to be in the process of being rolled out (and thus indicating that it'd be nice to have some more information if you know something the rest of the world doesn't).
2: Again, how are they failures? Yes, sales have been so-so, but if you go onto Microsoft's site you mostly get Surface devices with Snapdragon chips, and most reports seem to be from about a year ago (it would be interesting to see numbers from this year, though).
3: Yes, I got a new x86 machine myself a month back that has quite nice battery life. Intel no longer being stuck as far behind on process seems to have helped a fair bit (the X Elites don't seem entirely power-efficient compared to Apple, however).
4: Yes, _I_ got an x86 machine since I knew that I'd probably be installing quirky enterprise dependencies from the early 00s (possibly even 90s) that a client requires.
However, I was actually considering something other than Wintel, mainly an Apple laptop. If I'm considering options and am mostly held back by enterprise customers with old software I'd need to maintain, the moat is quite weak.
My older kid's previous school used ARM Chromebooks (they're currently on x86 HP laptops in upper high school, but they run things like AutoCAD); the younger one has used iPads for most of junior high.
Games could be one moat, but is that due more to the CPU or to the GPUs lagging behind Nvidia and AMD? Someone was running Cyberpunk 2077 on a DGX Spark at 175 fps (with the x86-64 binary being emulated!).
But besides games and enterprise...
So many people who use their computers for web interfaces, spreadsheets, writing, graphics (Photoshop has ARM support), and so on won't notice much difference with ARM machines (which is why my kids have mostly used non-x86 so far). It's true that such people are using PCs less overall (phones and/or tablets being enough for most of their computing), but tell an Excel-jockey salesman that he can get 10-20% more battery life and he might just take it.
Now, if Qualcomm exits the market by failing to introduce another yearly/bi-yearly update, then I'll be inclined to agree that Win-Arm has failed again... but so far that's not really in sight.
I imagine there's also some challenging work that would be fun to dig into. Being the person who can clean up Intel's problems would be quite a reputation to have.
There’s a real limit on what level of problem one engineer can fix, regardless of how strong they are. Carmack at Meta is an example of this, but there are many. Woz couldn’t fix Apple’s issues, etc.
A company sufficiently scaled can largely only be fixed by the CEO, and often not even then.
I'm sure most would stay at Valve if they could. They just do so much contract work, and I'm sure a stable job at Intel means better pay, benefits, and stability.
Would it shock you to hear that famous engineers with their own personal brand power have different opportunities and motivations than many/most engineers?
Their point is made even stronger by your comment. Engineers of this type don't experience megacorps like regular engineers do. They usually have a non-standard setup, more leeway, and less bureaucratic overhead. Which means brand isn't the biggest thing; the specific projects and end-user impact are.
The mathematically correct way to distribute the winnings is 50-50. In a situation where value is created only if 2 entities come together, the only fair way to distribute the winnings is 50-50. If Alice provides $1M of startup capital but can only achieve her goal with Bob on the team, then mathematically Bob is entitled to half the profits. In your game, Bob is clearly being disadvantaged.

Real life doesn't typically have the constraint "in addition, they have no ability to communicate or negotiate the offer; it's a one and done thing". Without this constraint, Bob can act rationally by threatening, and following through with, spitefulness in order to negotiate better terms. If Alice is not willing to negotiate until the fairness mark is reached, she is just as liable for the net loss in value.
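For what it's worth, the 50-50 intuition can be made precise with the Shapley value: average each party's marginal contribution over every order in which the team could have assembled. Here is a minimal sketch in Python (the names and the $1M total are illustrative assumptions, not figures from this thread):

    # Shapley value for a hypothetical two-player venture where value
    # exists only when both Alice and Bob participate.
    from itertools import permutations

    def shapley(players, v):
        # Average each player's marginal contribution over all join orders.
        shares = {p: 0.0 for p in players}
        orders = list(permutations(players))
        for order in orders:
            coalition = frozenset()
            for p in order:
                grown = coalition | {p}
                shares[p] += v(grown) - v(coalition)
                coalition = grown
        return {p: s / len(orders) for p, s in shares.items()}

    V = 1_000_000  # hypothetical total winnings

    def value(coalition):
        # Value is created only if the two entities come together.
        return V if coalition == frozenset({"Alice", "Bob"}) else 0

    print(shapley(["Alice", "Bob"], value))
    # {'Alice': 500000.0, 'Bob': 500000.0} -- an even split

Any characteristic function where neither side can create value alone produces the same even split, regardless of who put in the capital; that's the formal version of "value is created only if 2 entities come together".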
There is a difference between learning about things that you find interesting at your own pace, and learning about things that interest other people with tight deadlines. Even if I enjoy learning, there were absolutely courses that were just a waste of time.
> Even if I enjoy learning, there were absolutely courses that were just a waste of time.
My university experience is somewhat different, and I believe whether this holds true or not depends a lot on the degree course:
- In mathematics, there are barely any "filler courses". Basically all of them were interesting in their own right (even though because of your own interests, you will likely find some more exciting than others).
- On the other hand, computer science felt more like every professor had their own opinion on how the syllabus should look, and the hodgepodge that came out of it was adopted as the syllabus (design by committee). Thus, there were quite a lot of interesting things to learn, but also "filler courses". Additionally, the syllabus did not feel like a "consistent whole" with a clear vision, but rather like lots of isolated courses that you had to pass.
I suspect even many math majors probably believe "general ed" required classes to be a waste of time. Point well taken, though: there are some subjects that do not lend themselves to editorializing or opinion. I majored in history & comparative religion as an undergrad and most of my lower level courses I'd consider to be "fact retention" efforts. Lots of reading, but not a lot of analysis or synthesis. I took mostly graduate level courses because of this for most of my last two years (and this was at a top 5 public university).
> I suspect even many math majors probably believe "general ed" required classes to be a waste of time.
In the German university system, there are in general no required "general ed" classes. :-)
(It is typically only required that you take some prescribed classes in a minor subject, which you commonly choose from a pre-defined list set by the faculty. If you hate all of the suggestions on this list, it is sometimes possible to choose another minor subject or other classes, but this will typically involve more bureaucracy. For example, when studying mathematics, it is common to choose physics, computer science, economics, or some engineering science as the minor.)
Any further general education classes (in particular foreign language courses) are completely optional - and it is not an uncommon complaint of students who have very broad interests that during a typical degree course, you have barely any time to attend classes outside of the prescribed syllabus.
> ...there were absolutely courses that were just a waste of time.
How? Surely over 15 weeks each course taught you something about either the world or yourself.
I just looked back over my undergrad transcript to double check my experience. I took something away from every single class. It wasn't always the material itself.
> Azure is #2, behind AWS, because of Satya's effective and strategic decisions
I am going to have to disagree with this. Azure is number 2 because MS is number 1 in business software. Cloud is a very natural expansion for that market. They just had to build something that isn't horrible, and the customers would come crawling to MS.
You could just as easily make the argument that cloud is a very natural expansion for Google given their expertise in datacenters and cloud software infrastructure, but they are still behind. Satya absolutely deserves credit for Microsoft's success here.
Unfortunately, money plays a big role. The US is different, with well-paying jobs for engineers being common, but believe it or not, that is not the norm everywhere in the world. Sometimes people need to follow the money. I think this plays a big role on the Russian side of the war: their economy isn't very diversified, and the state owns the means of production for a large portion of the economy. If you need cash, a good way to get it is to do Putin's bidding.