I ended up with Roboto Mono, a font I used to use before I switched from Linux to Mac for my daily drivers. I'm not the kind of person who can identify a font by name simply by looking at it, so I was pleasantly surprised! I chose it more or less at random back in 2012.
Microsoft, which has owned LinkedIn since 2016, has recently been making headlines because they fired a lot of their engineers and QA staff and are now essentially vibe-coding huge chunks of their enterprise.
What's more, Microsoft never paid the really big bucks like the FAANG companies, and so it's more or less an open secret that at the height of the tech hiring frenzy Microsoft had to fight for B-tier engineers who weren't good enough to work at e.g. Apple.
So, it's been 10 years, which is long enough for that trademark Microsoft mediocrity to seep into LinkedIn. And they're probably vibe coding everything. That's how you get to gigs.
"What's more, Microsoft never paid the really big bucks like the FAANG companies"
I never knew this open secret. In my day msft was very glamorous and I guess something like Oracle played the role you're ascribing to msft now. I wonder what their strategy was? (I tend to doubt this was a careless/unexamined decision.) Maybe they figured that paying extra for individuals doesn't get you much if you have enough structure in place? A Bill Belichick approach to hiring. Is the relationship you're asserting (FAANG salaries == better products) accepted as true?
I only have my own observations of their products and secondhand info but my understanding is Microsoft simply doesn’t care about engineering. They have a sales pitch (product idea), then they build and ship the MVP that can earn money. If something sells, they figure they can solve scaling by throwing enough money at it. Classic b-tier tech company (and startup) garbage. They never work out the unit economics, etc.
FAANG (at least the few I’m familiar with) tend to be engineering companies. They hire talented engineers who can work from first principles and build products with profitable unit economics that solve interesting new problems. I don’t think Microsoft even knows what software engineering would mean.
Good question. For a long time I think the justification was location: Microsoft is in Seattle, and it’s only the Bay Area that is getting inflated salaries.
Ada used to be mandated in the US defense industry, but lots of developers and companies preferred C++ and other languages, and for a variety of reasons, the mandate ended, and Ada faded from the spotlight.
On their own merits, people choose SMS-based 2FA, "2FA" which lets you into an account without a password, perf-critical CLI tools written in Python, externalizing the cost of hacks to random people who aren't even your own customers, eating an extra 100 calories per day, and a whole host of other problematic behaviors.
Maybe Ada's bad, but programmer preference isn't a strong enough argument. It's just as likely that newer software is buggier and more unsafe or that this otherwise isn't an apples-to-apples comparison.
I made no judgement about whether Ada is subjectively "bad" or not. I used it for a single side project many years ago, and didn't like it.
But my anecdotal experience aside, it is plain to see that developers had the opportunity to continue with Ada and largely did not once they were no longer required to use it.
So, it is exceedingly unlikely that some conspiracy against C++, motivated by mustache-twirling Ada gurus, is afoot. And even if that were true, knocking C++ down several pegs will not make people go back to Ada.
C#, Rust, and Go all exist and are all immensely more popular than Ada. If there were to be a sudden exodus of C++ developers, these languages would likely be the main beneficiaries.
My original point, that C++ isn't what's standing in the way of Ada being popular, still stands.
Build everything from source within a single unified workspace, cache whatever artifacts were already built with content-addressable storage so that you don't need to build them again.
You should also avoid libraries, as they reduce granularity and needlessly complicate the logic.
I'd also argue you shouldn't have any kind of declaration of dependencies and simply deduce them transparently based on what the code includes, with some logic to map header to implementation files.
The problem is doing this requires a team to support it that is realistically as large as your average product team. I know Bazel is the solution here but as someone who has used C++, modified build systems and maintained CI for teams for years, I have never gotten it to work for anything more than a toy project.
It usually ends up somewhat non-generic, with project-specific decisions hardcoded rather than specified in a config file.
I usually make it so that it's fully integrated with wherever we store artifacts (for CAS), source (to download specific revisions as needed), remote running (which depending on the shop can be local, docker, ssh, kubernetes, ...), GDB, IDEs... All that stuff takes more work for a truly generic solution, and it's generally more valuable to have tight integration for the one workflow you actually use.
Since I also control the build image and toolchain (that I build from source) it also ends up specifically tied to that too.
In practice, I find that regardless of what generic tool you use like cmake or bazel, you end up layering your own build system and workflow scripts on top of those tools anyway. At some point I decided the complexity and overhead of building on top of bazel was more trouble than it was worth, while building it from scratch is actually quite easy and gives you all the control you could possibly need.
>Build everything from source within a single unified workspace, cache whatever artifacts were already built with content-addressable storage so that you don't need to build them again.
Which tool do you use for content-addressable storage in your builds?
>You should also avoid libraries, as they reduce granularity and needlessly complexify the logic.
This isn't always feasible though.
What's the best practice when one cannot avoid a library?
You can use S3 or equivalent; a normal filesystem (networked or not) also works well.
You hash all the inputs that go into building foo.cpp, and then that gives you /objs/<hash>.o. If it exists, you use it; if not, you build it first. Then if any other .cpp file ever includes foo.hpp (directly or indirectly), you mark that it needs to link /objs/<hash>.o.
You expand the link requirements transitively, and you have a build system. 200 lines of code. Your code is self-describing and you never need to write any build logic again, and your build system is reliable, strictly builds only what it needs while sharing artifacts across the team, and never leads to ODR violations.
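The hashing-and-caching step above can be sketched in a few lines of Python. This is a minimal illustration, not the poster's actual tool: the helper names (`deps`, `content_hash`, `build`) are my own, and a real version would also fold compiler flags and the toolchain version into the hash.

```python
import hashlib
import pathlib
import re

# Matches local includes like: #include "foo.hpp"
INCLUDE_RE = re.compile(r'#include\s+"([^"]+)"')

def deps(src_dir, path, seen=None):
    """Transitively collect the local headers a source file includes."""
    if seen is None:
        seen = []
    text = (src_dir / path).read_text()
    for header in INCLUDE_RE.findall(text):
        if header not in seen:
            seen.append(header)
            deps(src_dir, header, seen)
    return seen

def content_hash(src_dir, cpp):
    """Hash the .cpp plus every header it transitively includes."""
    h = hashlib.sha256()
    for f in [cpp] + deps(src_dir, cpp):
        h.update((src_dir / f).read_bytes())
    return h.hexdigest()[:16]

def build(src_dir, cas_dir, cpp, compile_fn):
    """Return the CAS path /objs/<hash>.o, compiling only on a cache miss."""
    obj = cas_dir / f"{content_hash(src_dir, cpp)}.o"
    if not obj.exists():
        obj.write_bytes(compile_fn(src_dir / cpp))
    return obj
```

Because the object's name is derived purely from its inputs, the same directory works unchanged whether it's a local folder, an NFS mount, or an S3 bucket: anyone on the team who computes the same hash gets the already-built artifact for free.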
I feel like the only dude on the planet who uses fullscreen workspaces on Mac.
The number of times I have noticed the corner of my windows is precisely zero, because each important application gets its own workspace, so the window frame doesn't get rendered. Sometimes I'll tile two windows side by side on my external monitor, but even then this is a complete non-issue for me.
Are you guys just running everything on the one desktop workspace in windowed mode? That seems like madness.
Oh man those first few generations of CD burners were rough. We had this old Pentium 2 that had so little memory you had to close everything but the burner software (Easy CD Creator or something, IIRC) otherwise the memory exhaustion would cause a buffer underrun and the disc would be ruined.
A few years later my mom finally let us get one with buffer underrun protection (and some multiplier on the write speed) so I could make mix CDs with music off Napster for my girlfriend and life was good.
Last year's birthday present from me to my wife was a mix CD. I attempted to recreate Cereal Killer's Greatest Zukes Album, briefly mentioned in Hackers (1995): "All great artists that asphyxiated on their own vomit!" My criterion was that the artists featured had to have died of a drug or alcohol overdose before September of 1995 (when Hackers was released), and four of the tracks had to be by Jimi Hendrix, Janis Joplin, Cass Eliot, and the Blues Brothers (satisfying the Belushi requirement), who are named in the film.
I went as far as sourcing a SCSI drive with a dedicated card just to get results. Fond memories of clicking Burn and slowly backing away from the desk to let it do its thing.
I'm 40. IBM hasn't been even remotely relevant to me since I was in elementary school, and only then because we had IBM-brand machines in the computer lab where we pretended to work while surreptitiously playing Oregon Trail.
I honestly don't even know what they do any more.
Whatever happened to IBM happened eons ago in tech years.
They haven't been relevant to anything user-facing at all since they sold the Thinkpad line, and their "golden age" was over by ~2000. Their hardware was awesome in the '80s and '90s, truly great stuff, but they've been mostly out of that game for about 25 years, and totally out for what, almost 20? Quite a few people on this site were born after the last piece of IBM hardware they might have appreciated was manufactured, and most of those folks may never have touched a single item of IBM's.
Microsoft's best work is also pretty damn far in the past, at this point. All my fond memories of them are pre-2010. I loved a lot of what they were doing with various little software projects in the '90s (Encarta! All kinds of weird experiments and little programs and games!) but that seems mostly gone now. I expect any list like this for them would be a handful of old pieces of software and HID items from the '90s and '00s (remember when they made really good mice and pretty good keyboards?), but dominated by a complete inventory of everything the Xbox division has built, to the point that it'd look more like some kind of gaming-focused list.
I'm not sure Amazon has built enough non-terrible user-facing stuff to make a top-50 list. Or a top-25. Or a top-5 that's not just five different Kindle models. Their entire Fire line sucks, which just leaves Alexa. Not enough meat to make a meal like this out of, I think.
Google's list would be hilarious because it'd mostly look like a copy of that Google Graveyard site.
> Microsoft's best work is also pretty damn far in the past, at this point.
I would almost reflexively agree, except for VSCode. It's about the most perfect piece of software I've ever used (to the point where now there aren't any user-facing feature updates), and I use it all day every day. It's absurdly flexible, with excellent defaults. It really, actually improved my view of MS from somewhere subterranean to a height of respect.
I'd include WSL, but that's kind of backhanded praise, and it is frustratingly slow a lot of the time.
I'm with you that TFA comes off as mean-spirited, and needlessly so.
But having worked in large orgs in highly regulated and bureaucratic sectors (aerospace), sometimes things don't change until the process fails spectacularly.
Policy like "we can't accept email for security purposes" comes from total fucking morons in sub-C level upper management who have no insight into how the business actually works, for whom it's easier to say "no" than it is to say "yes".
It's entirely plausible that this episode (which I bet blew through a lot of PPNS budget in toner) caused some mid level manager to report the process breakage, kicking off a review of whether they really need fax.