This is a symptom of the "bullshit" going on in big tech companies. "Bullshit" here is an economic term defined in the book "Bullshit Jobs": https://www.amazon.com/Bullshit-Jobs-Theory-David-Graeber/dp...
Reading through the post, I noticed:
So much corporate jargon that doesn't really mean anything important.
Dehumanizing language when describing the people interviewing and being interviewed, and the process itself.
Too much obfuscation of ideas that could be explained very simply.
Glorification of simple problems as heroic challenges.
Delusions of grandeur.
Jobs like these today are tomorrow's layoffs.
I think I will stop here. I have crossed my negativity threshold for the day.
A perfect example of this is the trend, in most places I've seen, where data scientists strive to increase the complexity of their models (so they can prove how "smart" they are). A huge part of a software engineering education (whether in the classroom or in a dev shop) is learning that complexity is the enemy. No engineer would choose a 3-layer MLP over a simple linear regression for an imperceptible improvement in performance.
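To make the point concrete, here's a minimal sketch (my own toy example, with made-up numbers) of the kind of baseline worth trying before reaching for an MLP: ordinary least squares for a straight-line fit needs no framework at all.

```python
# Closed-form simple linear regression: fit y = a*x + b by
# ordinary least squares, using nothing but the standard library.
def fit_line(xs, ys):
    n = len(xs)
    mx = sum(xs) / n                      # mean of x
    my = sum(ys) / n                      # mean of y
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)  # variance term (unnormalized)
    a = cov / var                         # slope
    b = my - a * mx                       # intercept
    return a, b

# Toy data that lies exactly on y = 2x + 1.
a, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
```

If a baseline like this already gets you within a hair of the fancy model, the extra layers are pure maintenance cost.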
The additional irony of all this is that a decade+ ago, a software engineer with strong quantitative and numeric programming skills was a rare and elite find. You would have thought that the data science boom would have dramatically increased the number of these people, but I find them even rarer.
I think some of this is exacerbated by modern pillars of machine learning and data science. Competition sites like Kaggle are entirely based on maximizing test set accuracy, and so winning submissions these days are huge morasses of ensemble methods that are trained for days and weeks on GPUs, but in the end they are often only marginally better than some of the fairly basic standard approaches. And when companies like Google are building their bots for Go or Starcraft, they are using cutting-edge techniques. When people see that and get inspired to get into data science, that's what they want to do, even though the majority of problems are more rooted in data quality, thoughtful understanding of the problem, and more rudimentary methods.
It's also the result of some of the rhetoric of important figures in the field. Yann LeCun has pushed back strongly in the past on criticisms of modern-day machine learning's occasional lack of concern with introspection and model understanding. Judea Pearl, a Turing Award winner for his work in artificial intelligence, devotes large portions of his pop-sci book The Book of Why to attacking the field of statistics as a whole, as well as engaging in multiple attacks on historical influencers in the field with such ferocity it borders on character assassination. He has even rebuffed modern critics, such as the very widely respected Andrew Gelman, by saying they are "lacking courage" for failing to accept his "revolutionary" causal inference methods over the traditional ones used in statistics.
The attitude is driven a lot by the people and institutions at the top, and as someone in the field, I unfortunately encounter this kind of thinking way too often.
Due to the hype, however, machine learning becomes a goal in itself in some organizations. "We need to do machine learning because we have big data," or some such. It doesn't matter if the problem could've been solved in 5% of the time and cost with 20 lines of code; thou shalt use machine learning.
It doesn't help that data scientists (creating and training the ML model) and software developers (creating and maintaining the software) usually come from different backgrounds, requiring a "data engineer" as an additional intermediary.
It's always a problem with hype. Blockchain (or Merkle trees) has the same problem but worse, because the problems it solves well are rarer and narrower.
To put this statement into context, I'm speaking as someone who has been writing code in C since the era of the PC XT. Perhaps NIPS 2010 was a rite of passage to ML for me. There is a screen full of industry-grade C++ and PyTorch in front of me right now...
Yes, I know that there are folks that deal with vast amounts of data with inscrutable relationships where you need fancy algorithms to make progress. But seriously, most problems just don't need it, and many folks would be better off with mastering basic statistics and data analysis.
It's fascinating how far you can get with basic stuff. My favorite? Statistics for Experimenters, by George E. Box. It's like a secret weapon! https://www.amazon.com/Statistics-Experimenters-Design-Innov...
Were you around during the dotcom era?
Although I'm not old enough, I've heard that OR in the 80s was the same crap.
The problem could have been tackled with greater accuracy using machine learning, but it would have taken a long time for the system to generate enough data points for a sound model and would have required more storage space. This was also complicated by the fact that the model had to be regenerated whenever the physical system being modeled was changed.
The linear programming solution was a lot cheaper and was "close enough" to serve as a useful approximation.
Basically a mathematical approach to problems of logistics and scheduling developed first in WW2. Very powerful in the domains for which it was developed but less generally applicable than enthusiasts hoped, leading to the usual “hype cycle”.
If you have a problem OR could solve, or just want to fool around with it, PuLP is very easy to use: https://pythonhosted.org/PuLP/ Of course, the ease of use means that it is a commodity skill now.
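To show how small the core idea is, here's a toy two-variable LP (all coefficients made up for illustration; this is plain Python, not PuLP) solved by brute-force vertex enumeration, which works because the optimum of an LP lies at a vertex of the feasible polygon:

```python
from itertools import combinations

# Maximize 3x + 2y subject to 2x + y <= 100, x + 3y <= 90, x >= 0, y >= 0.
# Each constraint is stored as (a, b, c), meaning a*x + b*y <= c.
cons = [(2, 1, 100), (1, 3, 90), (-1, 0, 0), (0, -1, 0)]

def intersect(c1, c2):
    """Intersection point of the two constraint boundary lines, or None."""
    a1, b1, r1 = c1
    a2, b2, r2 = c2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None  # parallel boundaries
    return ((r1 * b2 - r2 * b1) / det, (a1 * r2 - a2 * r1) / det)

def feasible(p):
    return all(a * p[0] + b * p[1] <= c + 1e-9 for a, b, c in cons)

# Candidate vertices: pairwise boundary intersections that satisfy all constraints.
vertices = [p for c1, c2 in combinations(cons, 2)
            if (p := intersect(c1, c2)) and feasible(p)]
best = max(vertices, key=lambda p: 3 * p[0] + 2 * p[1])
```

This brute-force approach is only sensible for a handful of variables; real solvers like the one behind PuLP use the simplex method or interior-point methods to scale, but the modeling idea is the same.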
That said, SpaceX's interview process is even more ridiculous. The first step is a phone call with a non-engineer recruiter who has to ask you highly technical questions but doesn't understand a word of your response, and you know it. They then sort of have to correlate what you're saying with the answers they have and decide whether you know anything or not. The most uncomfortable interview situation I've ever been in. Or at least that's how it was a few years ago; maybe they've changed it. I was so thrown off by this, I totally fucked it up and never got to the second step, in spite of nominally having all the right experience. To relate: imagine trying to explain low-level assembly to a five-year-old, over the phone.
And, these guys aren't even Waymo.
Bullshit is neither an economic term nor an anthropological one. David Graeber is an anthropologist, not an economist, though he has written inexplicably popular books on economic topics that betray his lack of understanding of economics.
Bullshit is actually used as a technical term in philosophy occasionally.
> One of the most salient features of our culture is that there is so much bullshit. Everyone knows this. Each of us contributes his share. But we tend to take the situation for granted. Most people are rather confident of their ability to recognize bullshit and to avoid being taken in by it. So the phenomenon has not aroused much deliberate concern, or attracted much sustained inquiry. In consequence, we have no clear understanding of what bullshit is, why there is so much of it, or what functions it serves. And we lack a conscientiously developed appreciation of what it means to us. In other words, we have no theory. I propose to begin the development of a theoretical understanding of bullshit, mainly by providing some tentative and exploratory philosophical analysis. I shall not consider the rhetorical uses and misuses of bullshit. My aim is simply to give a rough account of what bullshit is and how it differs from what it is not, or (putting it somewhat differently) to articulate, more or less sketchily, the structure of its concept.
"Debt" I think shows a deep understanding of the relationships economics has with history, philosophy, and society. Graeber knows he's not an economist but he's got a point to make and he's not shy about making it even though it says less than flattering things about some aspects of economics.
Your link is broken for me, btw.
> Now, this may sound a little silly - if someone wrote a book called "Metal: The First 5,000 Years," and then filled that book with stories of war and bloodshed, never failing to remind us after each anecdote that metal was involved in some way, we might be left scratching our heads as to why the author was so fixated on metal instead of on war itself. And in fact, that is indeed how I felt for much of the time I was reading Graeber's book. The problem was exacerbated by the fact that Graeber continually talks around the idea of debt in other ways, mentioning debt crises (without reflecting deeply on why these happen), the periodic use and disuse of coinage (which apparently is just as bad as debt in terms of enabling the capitalism monster), and any other phenomenon related to debt, without weaving these observations into a coherent whole.
> In other words, I am now angry at myself for paraphrasing the book, and trying to put theses into Graeber's mouth, because this is such a rambling, confused, scattershot book that I am doing you a disservice by making it seem more coherent than it really is.
> The problem of extreme disorganization is dramatically worsened by the way that Graeber skips merrily back and forth from things he appears to know quite a lot about to things he obviously knows nothing about. One sentence he'll be talking about blood debts and "human economies" in African tribes (cool!), and the next he'll be telling us that Apple Computer was started by dropouts from IBM (false!). There are a number of glaring instances of this. The worst is not when Graeber delivers incorrect facts (who cares where Apple's founders had worked?), it's when he uncritically and blithely makes assertions that one could only accept if one has extremely strong leftist mood affiliation
Have you read the book? The book is an exploration and an interrogation, with so much to learn from, that saying that about it seems pretty philistine.
Maybe you were just summing up the review you linked from Noah Smith. I read most of it, it's a bit meh but Noah doesn't really seem to be trying too much in it. This though: "leftist mood affiliation". That's cheap 'preaching to the choir' language.
If you have a link to a more serious review I'd genuinely like to read it.
I don't think there have been any fundamental changes since 2000 that would incentivize large public corporate entities to communicate more honestly or logically rigorously.