
This blog post was so painful for me to read.

This is a symptom of the "bullshit" going on in big tech companies. "Bullshit" here is an economic term defined in the book "Bullshit Jobs": https://www.amazon.com/Bullshit-Jobs-Theory-David-Graeber/dp...

Reading through the post, I noticed:

So much corporate jargon that doesn't really mean anything important.

Dehumanizing language when describing the interviewers, the interviewees, and the process itself.

Too much obfuscation of ideas that could be explained very simply.

Glorification of simple problems into heroic challenges.

Delusions of grandeur.

Such jobs today are tomorrow's layoffs.

I think I will stop here. I have crossed my negativity threshold for the day.




"Obfuscation" and "delusions of grandeur" are practically synonyms for ML and Data "Science" in this industry. I've been around for a while and I've never quite seen something as over-hyped and hyper-glamorized as these two specializations.


Calm down. Machine learning is a part of software engineering, like multiprocessing, computer graphics, or network protocols. It is here to stay. It is part of a palette of algorithms with which one can build software.


Your comment is absolutely correct, but it further points out just how far data science has strayed from meaningful work. The issue is that a huge number of "data scientists" have limited programming ability and nearly zero engineering sense.

A perfect example of this is the trend, in most places I've seen, of data scientists striving to increase the complexity of their models (so they can prove how "smart" they are). A huge part of a software engineering education (whether in the classroom or in a dev shop) is learning that complexity is the enemy. No engineer would choose a 3-layer MLP over a simple linear regression for an imperceptible improvement in performance.
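To make that concrete, here's a rough sketch (scikit-learn on made-up, roughly linear data, nothing from any real project): the MLP costs far more to train and tune and buys essentially nothing over ordinary least squares.

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split

    # Synthetic, mostly-linear data: 5 features, small noise.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 5))
    y = X @ np.array([1.5, -2.0, 0.5, 0.0, 3.0]) + rng.normal(scale=0.1, size=1000)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    linreg = LinearRegression().fit(X_tr, y_tr)
    mlp = MLPRegressor(hidden_layer_sizes=(64, 64, 64), max_iter=2000,
                       random_state=0).fit(X_tr, y_tr)

    print("linear R^2:", linreg.score(X_te, y_te))  # essentially perfect
    print("MLP R^2:   ", mlp.score(X_te, y_te))     # about the same, at far higher cost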

The additional irony of all this is that a decade-plus ago, a software engineer with strong quantitative and numerical programming skills was a rare, elite find. You would have thought the data science boom would have dramatically increased the number of these people, but I find them even rarer.


Who are these data scientists? Most statisticians I know would just use the linear regression unless they needed a neural network for marketing purposes. Statisticians spend years studying linear regression and its variations in graduate school. I thought it would be the CS folks who are more fascinated with neural networks.


Well, there are some fairly distinct camps forming in data science. You are correct that those coming from a statistics background generally prefer simpler, more parsimonious models. But there is a not-insignificant group coming into the field via other channels (CS, boot camps, self-teaching, etc.) who view statistics as a bit of a dinosaur and the statistician mindset as backwards. To them, simpler models aren't a good thing, they are a bad thing, and any amount of increased complexity is worth even a small improvement in performance.

I think some of this is exacerbated by the modern pillars of machine learning and data science. Competition sites like Kaggle are entirely based on maximizing test set accuracy, so winning submissions these days are huge morasses of ensemble methods that are trained for days and weeks on GPUs, yet in the end they are often only marginally better than fairly basic standard approaches. And when companies like Google build their bots for Go or Starcraft, they use cutting-edge techniques. When people see that and get inspired to get into data science, that's what they want to do, even though the majority of problems are more rooted in data quality, thoughtful understanding of the problem, and more rudimentary methods.

It's also the result of the rhetoric of important figures in the field. Yann LeCun has pushed back strongly in the past on criticism of modern machine learning's occasional lack of concern with introspection and model understanding. Judea Pearl, a Turing Award winner for his work in machine learning, devotes large portions of his pop-sci book The Book of Why to attacking the field of statistics as a whole, as well as to multiple attacks on historical influencers in the field with a ferocity that borders on character assassination. He has even rebuffed modern critics, such as the very widely respected Andrew Gelman, by saying they are "lacking courage" for failing to accept his "revolutionary" causal inference methods over the traditional ones used in statistics.

The attitude is driven a lot by the people and institutions at the top, and as someone in the field, I unfortunately encounter this kind of thinking way too often.


Thanks for sharing your expertise. It was very interesting to hear your perspective.


Yes! It's one tool in the software engineering toolbox! It's a great tool for some problems!

Due to the hype, however, it becomes a goal in itself in some organizations. "We need to do machine learning because we have big data" or some such. It doesn't matter if the problem could've been solved in 5% of the time and cost with 20 lines of code; thou shalt use machine learning.

It doesn't help that data scientists (creating and training the ML model) and software developers (creating and maintaining the software) usually come from different backgrounds, requiring a "data engineer" as an additional intermediary.

It's always a problem with hype. Blockchain (or Merkle trees) has the same problem but worse, because the problems it solves well are rarer and narrower.


To me, it seems to be larger than one tool. I think of it as a color in a palette with which one can paint software. Octarine.

To put this statement into context, I'm speaking as someone who has been writing code in C since the era of the PC XT. Perhaps NIPS 2010 was a rite of passage into ML for me. There is a screen full of industry-grade C++ and PyTorch in front of me right now...


ML can be useful, but it is getting too much attention. Far more hype than the value it actually provides in many domains, IMHO.

Yes, I know that there are folks that deal with vast amounts of data with inscrutable relationships where you need fancy algorithms to make progress. But seriously, most problems just don't need it, and many folks would be better off with mastering basic statistics and data analysis.

It's fascinating how far you can get with basic stuff. My favorite? Statistics for Experimenters, by George E. Box. It's like a secret weapon! https://www.amazon.com/Statistics-Experimenters-Design-Innov...


Heh, given that I am starting to see more and more companies offering ML engineers $2-6k/month (before tax), it's starting to resemble the gaming industry in all its negative characteristics instead.


I cannot tell from your comment whether 2-6k/month before tax should be considered a lot or a little. I think in the major tech centers that 2-6k/month is quite low for anyone with significant experience (>5 yrs). Do you disagree?


They used the word negative.


Since he compared it to gaming, I think he's saying it's low.


Does "blockchain" get an honourable mention?


Definitely.


Really?

Were you around during the dotcom era?

Although I'm not old enough, I've heard that OR in the 80s was the same crap.


Nobody talks about operations research today. But techniques that fell under that umbrella, like ARIMA and linear programming, are still widely used and aren't going anywhere. (And it's not without some irony that automated bulk time series forecasting is now sold as AI.)
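For what it's worth, that "old" tooling is a pip install away. A minimal sketch (statsmodels ARIMA on a made-up random-walk series, purely illustrative):

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    # A random walk stands in for a real time series here.
    rng = np.random.default_rng(0)
    series = np.cumsum(rng.normal(size=200))

    model = ARIMA(series, order=(1, 1, 1))  # AR(1), one difference, MA(1)
    fit = model.fit()
    print(fit.forecast(steps=5))            # next five forecast values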


It's funny but at my last company, one of our systems used some linear programming to generate a model of physical processes.

The problem could have been tackled with greater accuracy using machine learning, but it would have taken a long time for the system to generate enough data points for a sound model and would have required more storage space. This was also complicated by the fact that the model had to be regenerated whenever the physical system being modeled was changed.

The linear programming solution was a lot cheaper and was "close enough" to serve as a useful approximation.


Linear and quadratic programming are amazing and totally underappreciated. Often they are the fastest way to get useful answers for problems (the solvers got really good over the past few decades).


What's OR?


> What's OR?

https://en.wikipedia.org/wiki/Operations_research

Basically a mathematical approach to problems of logistics and scheduling, first developed in WW2. Very powerful in the domains for which it was developed, but less generally applicable than enthusiasts hoped, leading to the usual "hype cycle".

If you have a problem OR could solve, or you just want to fool around with it, PuLP is very easy to use: https://pythonhosted.org/PuLP/ Of course, the ease of use means that it is a commodity skill now.
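For instance, a toy production-planning LP in PuLP takes about ten lines (the "widgets"/"gadgets" products, coefficients, and limits below are invented purely for illustration):

    import pulp

    prob = pulp.LpProblem("toy_production_plan", pulp.LpMaximize)

    x = pulp.LpVariable("widgets", lowBound=0)
    y = pulp.LpVariable("gadgets", lowBound=0)

    prob += 3 * x + 5 * y        # objective: maximize profit
    prob += 2 * x + 4 * y <= 40  # machine hours available
    prob += 3 * x + 2 * y <= 30  # labor hours available

    prob.solve()
    print(pulp.LpStatus[prob.status],
          pulp.value(x), pulp.value(y), pulp.value(prob.objective))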


There is also Google OR tools.

https://developers.google.com/optimization
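For comparison, here's the same toy LP as the PuLP sketch above, written against OR-Tools' linear solver wrapper (the numbers are still made up):

    from ortools.linear_solver import pywraplp

    solver = pywraplp.Solver.CreateSolver("GLOP")

    x = solver.NumVar(0, solver.infinity(), "widgets")
    y = solver.NumVar(0, solver.infinity(), "gadgets")

    solver.Add(2 * x + 4 * y <= 40)  # machine hours
    solver.Add(3 * x + 2 * y <= 30)  # labor hours
    solver.Maximize(3 * x + 5 * y)   # profit

    status = solver.Solve()
    if status == pywraplp.Solver.OPTIMAL:
        print(x.solution_value(), y.solution_value(), solver.Objective().Value())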


Yep. It's a taxi routing service, and not even the best one, yet you'd think they're launching those taxis to Mars.

That said, SpaceX's interview process is even more ridiculous. The first step is to talk on the phone with a non-engineer recruiter who has to ask you highly technical questions but doesn't understand a word of your response, and you know it. They then sort of have to correlate what you're saying with the answers they have and decide whether you know anything or not. The most uncomfortable interview situation I've ever been in. Or at least that's how it was a few years ago; maybe they've changed it. I was so thrown off by this, I totally fucked it up and never got to the second step, in spite of nominally having all the right experience. To relate, imagine trying to explain low-level assembly to a five-year-old, over the phone.


Not saying this is the case for SpaceX, but in my field (totally not space-, engineering-, or software-related, but very much "tech": physics/chemistry), these types of interviews are for weeding out the non-standard folks (of which there are many, including me, though many of us, including me, have become good at hiding it). A person with high E would presumably (but not always, because it is indeed a difficult task; casualties are regrettable but expected) "grok" the task and begin feeding the right keywords to the recruiter. Once you realize the game, it becomes fairly easy. Just read the job description and sprinkle in the keywords provided therein.


Thank you so very much for saying this. When will these people realise that nobody gives a toss about their overly long and overcomplicated selection process?

And these guys aren't even Waymo.



mission critical!


It seemed like a lot of words to say very little.


> This is a symptom of the "bullshit" going on in big tech companies. "Bullshit" here is an economic term defined in the book "Bullshit Jobs".

Bullshit is neither an economic term nor an anthropological one. David Graeber is an anthropologist, not an economist, though he has written inexplicably popular books on economic topics that betray his lack of understanding of economics.

Bullshit is actually used as a technical term in philosophy occasionally.

http://www2.csudh.edu/ccauthen/576f12/frankfurt__harry_-_on_...

> One of the most salient features of our culture is that there is so much bullshit. Everyone knows this. Each of us contributes his share. But we tend to take the situation for granted. Most people are rather confident of their ability to recognize bullshit and to avoid being taken in by it. So the phenomenon has not aroused much deliberate concern, or attracted much sustained inquiry. In consequence, we have no clear understanding of what bullshit is, why there is so much of it, or what functions it serves. And we lack a conscientiously developed appreciation of what it means to us. In other words, we have no theory. I propose to begin the development of a theoretical understanding of bullshit, mainly by providing some tentative and exploratory philosophical analysis. I shall not consider the rhetorical uses and misuses of bullshit. My aim is simply to give a rough account of what bullshit is and how it differs from what it is not, or (putting it somewhat differently) to articulate, more or less sketchily, the structure of its concept.


> not an economist, though he has written inexplicably popular books on economic topics that betray his lack of understanding of economics.

"Debt" I think shows a deep understanding of the relationships economics has with history, philosophy, and society. Graeber knows he's not an economist but he's got a point to make and he's not shy about making it even though it says less than flattering things about some aspects of economics.

Your link is broken for me, btw.


What point is Graeber trying to make in Debt? It seems to be “capitalism bad” but that may be too kind to the book’s coherence.

On Bullshit

https://en.wikipedia.org/wiki/On_Bullshit

https://noahpinionblog.blogspot.com/2014/11/book-review-debt...

> Now, this may sound a little silly - if someone wrote a book called "Metal: The First 5,000 Years," and then filled that book with stories of war and bloodshed, never failing to remind us after each anecdote that metal was involved in some way, we might be left scratching our heads as to why the author was so fixated on metal instead of on war itself. And in fact, that is indeed how I felt for much of the time I was reading Graeber's book. The problem was exacerbated by the fact that Graeber continually talks around the idea of debt in other ways, mentioning debt crises (without reflecting deeply on why these happen), the periodic use and disuse of coinage (which apparently is just as bad as debt in terms of enabling the capitalism monster), and any other phenomenon related to debt, without weaving these observations into a coherent whole.

> In other words, I am now angry at myself for paraphrasing the book, and trying to put theses into Graeber's mouth, because this is such a rambling, confused, scattershot book that I am doing you a disservice by making it seem more coherent than it really is.

> The problem of extreme disorganization is dramatically worsened by the way that Graeber skips merrily back and forth from things he appears to know quite a lot about to things he obviously knows nothing about. One sentence he'll be talking about blood debts and "human economies" in African tribes (cool!), and the next he'll be telling us that Apple Computer was started by dropouts from IBM (false!). There are a number of glaring instances of this. The worst is not when Graeber delivers incorrect facts (who cares where Apple's founders had worked?), it's when he uncritically and blithely makes assertions that one could only accept if one has extremely strong leftist mood affiliation


> It seems to be “capitalism bad” but that may be too kind to the book’s coherence.

Have you read the book? It is an exploration and an interrogation, with so much to learn from it that saying that about it seems pretty philistine.

Maybe you were just summing up the review you linked from Noah Smith. I read most of it; it's a bit meh, and Noah doesn't really seem to be trying very hard in it. This, though: "leftist mood affiliation". That's cheap 'preaching to the choir' language.

If you have a link to a more serious review I'd genuinely like to read it.


Shit, I remember most of the "bad stuff" in Debt predating capitalism by somewhere between centuries and millennia. Seems like a weird way to write it if its Secret Purpose was to be a long-winded hit piece on capitalism.


Sometimes, lessons from the past can help remove the rose-tinted glasses through which people seem to view these newer companies. For example, it's worth reading Enron's Vision and Values statement (http://www.agsm.edu.au/bobm/teaching/BE/Cases_pdf/enron-code...) from 2000.

I don't think there have been any fundamental changes since 2000 that would incentivize communication from large public corporate entities to be more honest or logically rigorous.



