Dear Academia, I loved you, but I’m leaving you (2014) (anothersb.blogspot.com)
129 points by tacon on Aug 18, 2018 | 87 comments



Regarding "literally everyone does it" (fishing for results, removing negative ones, and so on): it's very true, and much more prevalent than what people think. Even at top-level universities with big shots. Especially at top-level universities with big shots. I know someone, upon entering a lab to do electronic microscopy, who found out that the image data her lab had written multiple papers on very high-impact journals about (we're talking Nature/Science here) could have easily been generated by dead cells instead of living ones, i.e. they had been staring at artifacts for years to draw conclusions on protein content, mechanisms, and so on. Of course, upon learning this, her boss quickly set upon contacting the journals in question to issue an apology and retract the relevant papers, thanked her for noticing the mistake, and everything went back in order. Just kidding, he told her to shut up if she wanted to keep her job.

Interesting fact: the majority of the impact factor of big-name journals (Nature, Science, Cell and co.) comes from a few select high-impact papers. Most of the rest goes relatively uncited. The reason is that most of these papers are not astounding but simply make it through the editorial system because the big shots in charge of the labs know the editors. Not that these papers are devoid of merit (though I know my fair share of Nat/Sci/Cell papers that happen to be 80-100% bunk), but they wouldn't have made it into such highly selective journals otherwise. These journals also happen to have an unusually high retraction rate. I have friends working in the aforementioned big-shot labs who are jaded to the point that they just dismiss most things coming out of Science.

And that's just my area and that of people I know. Things go very awry in medicine or anything related to cancer, where the pressure is so high, the field so competitive, the scientific questions so complicated, the experiments so hard to reproduce, and the incentives to publish so exacerbated (note that medical journals generally have the highest impact factors). And I can't even begin to imagine what goes on in psychology.

Plus, the pay is lousy.


Related story:

When I was in grad school I was doing graph theoretic analysis of human metabolic networks. I had some really interesting and promising preliminary results about the structure of various metabolic pathways and had been invited to present my unpublished work in a multi-departmental lecture. I was making some new visualizations for my talk and spotted something odd, which led me to a bug in my source code that had reversed the direction of several steps in many of the pathways. I was mortified . . . I ran some preliminary tests on the corrected code and saw that many of my results were completely in error. I wouldn't know for sure until I could get several days of cluster time, but I had proof that my last year of work was completely bogus.
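
To make the failure mode concrete, here is a minimal sketch of how a direction-reversal bug can silently flip conclusions in a directed pathway graph. The metabolite names and the use of networkx are purely illustrative assumptions on my part, not the original code:

    # Illustrative only: a toy directed pathway where one flag silently
    # reverses edge direction, the kind of bug described above.
    import networkx as nx

    reactions = [("glucose", "g6p"), ("g6p", "f6p"), ("f6p", "pyruvate")]

    def build_pathway(reaction_list, buggy=False):
        g = nx.DiGraph()
        for substrate, product in reaction_list:
            if buggy:
                g.add_edge(product, substrate)  # reversed edge: the bug
            else:
                g.add_edge(substrate, product)
        return g

    print(nx.has_path(build_pathway(reactions), "glucose", "pyruvate"))              # True
    print(nx.has_path(build_pathway(reactions, buggy=True), "glucose", "pyruvate"))  # False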

I stayed up all night confirming the new code and the extent of my invalidated results and brought this to my advisor the morning before the lecture. He told me to shut up and present it anyways. Being a young idealist I argued with him about the nature of science and the search for truth . . . I refused to go along with it and present something I knew was false. He made one of the research scientists present on their topic at the last second so our lab wouldn't "lose face" and I was pretty much in the dog house forever after.


From someone with no connection to analysis of human metabolic networks (besides having one, I guess): Thank you!


You might not know it, sir, but you are a hero. Definitely my hero! Not all heroes wear capes.

This is how Science should be done.


Thank you.


Graph-theoretic analysis of human metabolic networks sounds very interesting; can you elaborate on that and/or point to a few papers?


Papers, not really, since I was out on the edge at the time and I haven't caught up with the field since. Nothing I did ever made it into publication, but it was pretty interesting. I was mostly focusing on the redundant connectivity of various metabolic pathways in healthy human cell lines and comparing them to the reduced pathways in cancer cells, where mutations and large-scale chromosomal loss restricted the ability to process and regulate several key compounds. The general idea was to look for pathways that were nearly always preserved even when cancer cells were extremely evolved/mutated from their original lines, and map those to drugs that would target the enzymes mediating those reactions. Basically you'd try to knock out a weak link in the cancer metabolic network for which a healthy cell had significant redundancies.

A major conceptual difficulty was the somewhat nebulous definition of a metabolic pathway, beyond what reactions were drawn together in a textbook diagram. Also somewhat thorny was the concept of building a network with any sort of transitive property with a notion of mass conservation built in. I wrote about 30k lines of terrible Perl code exploring this stuff, if I had independent backing I would definitely dust it off and finish the project but it's pretty far in the rearview mirror at this point.
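
A toy sketch of that weak-link idea (my own reconstruction under simplifying assumptions, not the commenter's Perl): flag a reaction as a candidate target if removing it disconnects a precursor from a product in the reduced "cancer" network but not in the redundant "healthy" one.

    # Toy illustration: edges whose removal breaks A->D only in the reduced
    # "cancer" network are candidate drug targets. Node/edge names are made up.
    import networkx as nx

    healthy = nx.DiGraph([("A", "B"), ("B", "D"), ("A", "C"), ("C", "D")])  # two routes A->D
    cancer = nx.DiGraph([("A", "B"), ("B", "D")])                           # redundancy lost

    def critical_edges(g, source, target):
        crit = set()
        for edge in list(g.edges()):
            h = g.copy()
            h.remove_edge(*edge)
            if not nx.has_path(h, source, target):
                crit.add(edge)
        return crit

    targets = critical_edges(cancer, "A", "D") - critical_edges(healthy, "A", "D")
    print(targets)  # {('A', 'B'), ('B', 'D')}: essential in cancer, redundant in healthy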


I had a similar experience. I worked in a neuroimaging lab during grad school. I found a mistake in our code that basically meant we had been overestimating the p-value from MRI data. It potentially invalidated many of the previous papers (and there were a lot; this was a "paper mill").

Being naive, I just assumed they'd be retracting them, but my labmates explained there was no way our PI would let that happen. It never bubbled up to the PI, so he never even knew. A year later (due to this as well as other issues), I quit the PhD and joined a company doing something "real".


Coming from neuroscience also, I can really relate to this. I'm just glad that people like Neuroskeptic are casting light on the problem. I was really surprised at the degree to which neuroscience is proof by brain graph with lit up blobs :/.
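
For readers outside the field, a generic illustration of why those blob figures invite skepticism (this simulates the well-known uncorrected multiple-comparisons problem, not the specific bug in the parent comment; the voxel and subject counts are arbitrary):

    # Pure-noise "fMRI": with ~50k voxels and no correction, thousands of
    # voxels come out "significant" by chance alone.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_voxels, n_subjects = 50_000, 20
    data = rng.normal(size=(n_voxels, n_subjects))  # no real effect anywhere

    t = data.mean(axis=1) / (data.std(axis=1, ddof=1) / np.sqrt(n_subjects))
    p = 2 * stats.t.sf(np.abs(t), df=n_subjects - 1)

    print((p < 0.05).sum())             # ~2,500 falsely "active" voxels
    print((p < 0.05 / n_voxels).sum())  # ~0 after a Bonferroni correction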


But the real question is: how else would you do it?

Having journals/conferences with things like double-blind peer-review processes in place to me is probably the only way to do this. How else would you filter all the noise?

We don't have this in industry and it's a bit sad. I mean we do, in the form of some blogs, conferences, etc. But in no way does it compare to the system academia has. Have you ever tried publishing at an academic conference compared to an "industry conference"? All you need for most industry conferences is an abstract, an idea, not even evidence of any sort; actually you need reputation, but that's a different issue.

There are flaws in every system for sure, and academia has its hands full of them, especially since it has to deal with quite important things like medicine.

But I find it funny that people complain about academia, and yet there's no other system that comes close to producing empirical truths.


You're right, of course. I mostly complain because it feels cathartic to rant, but I am well aware that it's probably the same in the private sector, the government, etc. That's just human structures I guess.

As for designing a better system that's more robust to these flaws, it is a hard problem. There are structural issues when trying to apply traditional scientific virtues to the real world (how do you reasonably ensure reproducibility with a ridiculously expensive experiment, or one spanning decades, or one involving cohorts of tens of thousands of people? How do you ensure peer review is honest and comprehensive when there are exactly three labs specialized in a subject, and they're all competitors? Or even collaborators?), even if you assumed that editorial bias (toward big names, against negative results, toward trendy subjects, etc.), dishonesty and nepotism didn't exist. Plus there's a wide system of power-wielding stakeholders with huge incentives to maintain the status quo: established big shots, publishers (cough cough Elsevier cough cough), and so on.

So yeah, things are broken but we're all trudging along trying to enjoy what we do and make do with what we have. Who knows, things may even get fixed one day.


Science is not when you publish something. Science is when at least two independent labs can reproduce your published result.


While I have a similar disillusionment with academia, I think people sometimes misinterpret the difficulty of reproducibility, especially in a field they are not familiar with. It seems to me it's imprudent and a bit disrespectful to jump the gun when a piece of research fails to get reproduced, especially when it comes to biology, because the experimental procedure can be incredibly complex and prone to all kinds of errors. I know of cases where even within a group only one person can successfully perform a certain protocol with a good enough success rate, and it takes a lot of learning to get to their level. Without expertise in the field and knowledge of how the reproduction is done, it's really hard to examine all the data and judge the integrity of the science behind it. To me the much bigger problem is how much of the research is warped by what gets funding, fast.


A wrong result or no result is a result too. With the help of those who tried to reproduce a result and failed, it's easier to find the missing gaps in knowledge and fill them. Otherwise this knowledge will be lost.


The issue I see with this is that it implies anyone who isn't doing work high-profile enough to get independent labs to try to reproduce it isn't doing science.


IMHO, it's correct. They are just standing on the shoulders of giants.


And you're not a developer/engineer until 2 companies independently use your work, right?


So, build two more LHC's. ;-)

In my view it should be a judgment call whether a particular result needs to be replicated. I'm aware of the so-called replication crisis. But adding more arbitrary rules will simply invite people to figure out how to game those rules, if they were gaming the original rules (deliberately or unwittingly).


LHC is not the first collider in the world, so results from previous experiments are used to calibrate LHC.


Yes, this seems like a much less wasteful, and possibly more fruitful approach. Rather than running identical experiments, one can perform experiments that test different but overlapping aspects of the underlying theory. I suspect this kind of "replication" is done all the time in physics, which has led to a robust knowledge base in spite of less than ideal rigor within any individual experiment.


Why two and not three or one?


The more the better. You'd really need a statistically significant number of labs repeating it to be sure. If the underlying study is scientifically sound then a single repeat is good, two repeats is better. 20 repeats is even better. Good luck getting 1 or 2, unless the study is an important work that introduces a useful new technique or is the basis for additional studies. Also good luck knowing whether 2 labs failed for every 1 published. Science is hard. Reporting needs to be improved and there should be grants available to fund repeat studies.


It's easy to keep a secret when you're the only one who knows the truth. It's also easy for two, because you immediately know who to blame if the secret is revealed, so you can keep up the pressure. But the situation changes with three, because you cannot know who revealed the secret, so it's much easier to crack.

Imagine, I report today that I'm able to determine the temperature of the physical vacuum (e.g. using a double-slit experiment), and then able to drop it, i.e. cool the vacuum below absolute zero. Will you trust me? Of course not. Then when will you trust me?


If two independent labs can reproduce the result it is extremely unlikely to be the result of false positives. The difference between having two versus three independent verifications is minuscule. A standard has to be set and 2 is a good enough standard.
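
Rough numbers behind that intuition, under the idealized assumptions that the studies are independent and each has a 5% false-positive rate:

    # Back-of-the-envelope: probability a nonexistent effect comes out
    # "significant" in the original study and in every replication.
    alpha = 0.05
    print(alpha ** 2)  # original + one replication: 0.0025
    print(alpha ** 3)  # original + two replications: 0.000125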


Looks like this poster was in STEM. The situation's even more bleak in the humanities--see for instance this Reddit thread[1]:

> The humanities PhD is still a vocational degree to prepare students for a career teaching in academia, and there are no jobs. Do not get a PhD in history.

[1]: https://www.reddit.com/r/AskHistorians/comments/96yf9h/monda...


I got a master's degree in history by bailing on a PhD program after the second year. The Reddit post you linked is spot on.

I left the degree program long enough ago that some of the people who started with me have finished their PhDs, although many of them still have not, after nearly a decade. Of the ones with PhDs, the only ones with decent jobs are the ones who work outside of academia. The others are adjuncts, and some of them qualify for food stamps.

The thing about getting a PhD is that you defer all of the income and career opportunities many people have in their 20s. By the time you graduate, other people your age have already paid their dues in entry level jobs, got promoted, bought cars and homes, been able to afford travel, and saved money for retirement. When you graduate you will be a decade behind everyone else in that aspect of life.

The only reason any rational-thinking person would make such a sacrifice is if the payoff at the end could make up for it.

A STEM PhD opens doors to a lot of jobs in the private sector. The only door opened by a humanities PhD is a chance to compete in one of the worst job markets in the world: academia, where there are several times as many new PhD graduates as jobs every year.


Wow, that's hard to read. I thought people who went into Literature had it bad. It sometimes seems like the education sector is an experiment in how to create a society with almost unlimited power to do things, and almost total ignorance about what's a good idea.


Academia lives in a world where people are willing to spend hundreds of thousands to make themselves more knowledgeable in subjects which might have no economic value in the world. Just look at how much of our undergrad education consists of "general education" type courses. While these courses are interesting, most of them are pretty useless. I took Chinese history, Asian American film, music history, art history and so on. All of these courses were pretty much useless to me from an economic standpoint.

I think we need to create two tiers of higher-education: one where people focus purely on vocational/career skills while another where you can focus on non-vocational subjects. Let students decide what their general education will consist of instead of making it some top-down approach. People don't like authoritarianism.


I think the problem is, that leads to a society that's a bit like a sports car being driven by a nine-year-old. STEM and vocational stuff is great for getting to places fast. It's awful for making a society that can talk critically about where it's going.

I say talk, and not 'think', because I don't think STEM people are bad at thinking about problems critically. I just think a lot of humanities is about developing a common set of reference points, a common vocabulary, so you can work on social problems together.

If you don't have that, you can't really stop malicious actors from drowning out their opposition, who are isolated by the fact they don't share a common conceptual toolset. And if that happens for long enough, you end up with a society that's basically a bunch of amazingly powerful machines operated by Fox News.


Do the humanities really succeed at accomplishing those goals? In my own college experience, the humanities courses I took mostly taught me how to parrot the professor's opinions, how to write long-winded dissertations in overly complicated language with the goal of appearing intelligent rather than being understood, and how to insert meaning into literary works that the author likely never intended.

And while I was very successful in those courses, I fail to see how they prepared me to be a better citizen, or to think critically about the direction society is heading. For my daily life, I would probably have been better off taking a course in plumbing than literary analysis.


I don't think the humanities departments are particularly successful - but they've been underfunded and horribly hamstrung by political and administrative interference for about half a century now.

It's also not that Shakespeare's going to make anybody a better thinker. It's more that stuff like this sets the tone for the culture in general. If you don't really teach hard, interesting culture, then all the people who might have been interested just get the feeling that culture is kinda garbage and go on to engage with hard interesting ideas in other fields. Which leads to a public culture that's not in any way healthy - one that's really just people passing the time, that doesn't really push us or make us grow.

History shows this kind of thing is just incredibly dangerous. If everybody likes culture that's enjoyable and easy, then easy, self-satisfied thinking predominates. Then suddenly everybody is marching, waving big flags and wearing funny hats, because it's a lot of fun, and makes you feel great to be part of something.


High / difficult culture like Wagner and Nietzsche?


> History shows this kind of thing is just incredibly dangerous. If everybody likes culture that's enjoyable and easy, then easy, self-satisfied thinking predominates. Then suddenly everybody is marching, waving big flags and wearing funny hats, because it's a lot of fun, and makes you feel great to be part of something.

Please provide a reference for this. It sounds too much like a just-so story for me to take that at face value.


Well, the classic reference is Adorno, "The Culture Industry". But you're right, it is a just-so story. I don't think it's possible to do a fair treatment of the relation between culture and politics in a hundred-word post, and my own limitations have made it still less rigorous.

To add some more unrigorous, allusive examples to the mix: the typical Al-Qaeda member is not an imam, but rather an engineer. Technical capability by no means implies ethical capability - so if you imagine a whole society of highly developed technical abilities and highly under-developed critical capacities, it's a scary thought. And you don't really have to look far into history to find lots of examples.

Consider the V2 rocket, for instance. One of the most amazing technical innovations of the century, but equally, utterly pointless on every other level, even the military one. That's the kind of thing that nations with deeply stupid official cultures tied to deeply brilliant technical cultures make.


I'm not sure if a citation would make it any better. That line of logic makes me think of the opposite pitfall - that it leads to needless hair-shirted puritanism because anything enjoyable can't possibly be a good thing. "We can't use anesthesia in surgery because the lack of suffering is detrimental to character!" By definition it leads to needless suffering.

While knowledge and culture may seem boring or useless in themselves (the fine details of long-dead Roman politics), it is possible that there are hidden great applications for them in the right situation. Just to give two surprising examples of relevance: Roman gladiatorial archetypes experiencing a kind of political correctness after the annexation of former foes, along with gladiatorial endorsements; and, after the extended wars had resulted in the consolidation of farms in the hands of slave-holding elites past legal limits, the idea of land reform or even enforcing the existing laws being called 'un-Roman'.

Science alone has a history of producing bizarre peripheral finds that nobody would have expected. Take an incident that resulted in cattle bleeding to death after routine mild procedures - the cause was eventually traced back to a fungus affecting their feed. Useful in itself, but good only for agronomy. That fungus was later used to produce the first blood thinner on the market.


>anything enjoyable can't possibly be a good thing.

I think you're conflating enjoyment, in the sense you enjoy an apple, and enjoyment in the sense you enjoy a painting. It's a really common conflation, mainly because in our society, the dominant culture is enjoyable in the sense an apple is enjoyable. That, for me, is part of the problem. Nobody ever got inspired by eating an apple. Nobody ever learned new forms of empathy, or developed, through appetitive enjoyment.


Personally I think of it in different frameworks than enjoyment - novelty exposure and exploration vs. sticking with the comfortable and familiar - which is somewhat peripheral to how one receives it (although attitude would influence what and how much is learned).

Theoretically, appreciative enjoyment could be empathic in a "rags to riches" sense, but that could arguably be making the bitterness more palatable while undermining fundamental aspects, say the sense of helplessness - even if it shows both ends of the cycle of grudges and oppression. I guess it is like the Shakespearean "write to please both the groundlings and the noble patrons" - hard to do, let alone do properly, but if done right it offers the potential to create masterpieces.

Anyway, I can see some point to the argument personally, even if I would phrase it radically differently - and it may not be entirely what you wanted.


If that's how you feel then why didn't you prioritize clear and cogent communication of your own thoughts when engaging with the course?

It sounds like you prioritized maxing your GPA per unit of effort instead of engaging with the material in the way that's most beneficial to you.

Why on earth would you ever write an essay that's just meaningless jargon to you? What on earth were you expecting to learn?

The same thing happens in STEM courses. Three examples:

1. Projects. If your project incorporates the professor's research priorities/interests you can get away with way less work, but will also get less out of the course.

2. Programming assignments in CS. Professors rarely assign open problems; they're almost always assigning variants of important solved problems. Students will often look up the paper or wiki article where the original solution was described and then copy the algorithm and/or proof. This often results in high marks for low levels of effort.

3. There's an even better analogy in Mathematics, where students sometimes learn how to parrot the structure/rhetoric in proofs instead of actually learning the mathematics. Professors often don't do a careful enough job grading and give these proofs high marks because they sound correct and are mostly correct. However, it's clear from a second glance that the student has no idea wtf they're saying.

Students prioritizing low-effort parroting instead of seriously engaging with the material is pervasive in CS and Mathematics.

Some of that blame lies with the professor, and when students never learn that parroting/memorization is bad, I guess that's a huge failure of the entire education system (K-12 through higher ed). But IMO students who know that's the game they are playing (e.g., writing jargon-laden essays instead of writing in a way that is true to their own voice) don't have a lot of room to then complain that all they learned was how to play the "max GPA for min effort" game...


Clear and cogent communication of my own thoughts led to me having lower grades. Lower grades led to fewer opportunities - fewer chances of getting a scholarship, a recommendation, access to certain programs, etc.

It depends on what you seek from university. Unfortunately, at the undergraduate level, its role has largely shifted from a place of learning into a provider of credentials that act as a signal to employers. With the high cost of tuition (especially in the USA), you want a good return on your investment, not to waste your time engaging with the material to the detriment of your chances of success.

It seems a far better strategy to maximize your GPA while spending the least possible amount of time on the assigned material, in order to have time to pursue your own interests and be able to more effectively learn the things you want to learn.


This post only reinforces my point: GPA hacking is an intentional choice on the part of students, and GPA hacking is not specific to the humanities.

You got nothing out of your humanities courses. The fault was not in your courses but in yourself. You certainly could have chosen to improve your prose, refine your theses, and strengthen the presentation of your arguments. Instead, you chose to GPA hack.

One of the great things about university is that you get what you put into it. Students who choose to invest in credentialism get credentials. Students who choose to invest in an education get an education. You got what you very explicitly chose to get!

The sad irony, by the way, is that your perceived downsides are rare and GPA hacking backfires. I have never heard of a faculty member refusing to write a letter for an engaged student just because they received a B or even a C; however, I've heard plenty of stories about faculty writing lackluster letters or even refusing to write letters for students who were very clearly GPA hacking.

Perhaps most importantly, your original post was written as an indictment of humanities courses. In my post, I pointed out that GPA hacking is also possible in STEM fields. Your latest post justifies GPA hacking by appealing to the credential-granting role of higher education. Your argument justifies GPA hacking not only in the humanities but also in STEM fields.

A common saying in education is that cheaters mostly rob from themselves. IMO the same is true for GPA hackers.

One final tangential note about this excerpt from your post:

> With the high cost of tuition (especially in the USA), you want a good return on your investment, not to waste your time engaging with the material to the detriment of your chances of success... It seems a far better strategy to maximize your GPA while spending the least possible amount of time on the assigned material, in order to have time to pursue your own interests and be able to more effectively learn the things you want to learn.

Doesn't this place you in a bit of a double bind? If you know better than the faculty, then GPA hacking isn't required because you already know the material. If you don't know better than the faculty, then it's very likely that you are frittering away your time on fads instead of focusing on fundamentals.


It all depends on your goals, I agree. You can certainly use university as a means to come out a better-educated, well-rounded citizen, or simply as a method of acquiring the credentials necessary to get hired. If you can afford the former, great, but for many, it may come across as a luxury.


> It all depends on your goals, I agree... but for many, it may come across as a luxury.

As I've already stated,

>> The sad irony, by the way, is that your perceived downsides are rare and GPA hacking backfires

Choosing to learn employable skills is important, but there's no upside to viewing education as "simply a method of acquiring the credentials necessary to get hired".

You seem to have missed the fundamental point of my last two posts: your view of how to approach education is self-defeating even if you view education purely as a way of increasing lifetime earnings!


> I took Chinese history, Asian American film, music history, art history and so on. All of these courses were pretty much useless to me from an economic standpoint.

I majored in Math and CS and took a smattering of other courses (one course away from a minor in each of Religion, Philosophy, and Economics).

The CS courses were by far the most useless. Perhaps 50% of the material I had taught myself before starting undergraduate: everything up to and including data structures as well as the pragmatic aspects of databases, OOP, and Networking courses. Another 40% I could've taught myself without much guidance (non-theoretical PL, non-optimizing compilers, and a non-proof-based algorithms course). The other 10% was project-based or independent study where the forcing function to work with people who were both above and below my skill level was the most valuable aspect of the coursework. (Pre-college internships and freelancing gave me lots of experience working with others, but almost all of them were far above my skill level or else were completely non-technical.)

The Mathematics coursework was by far the most useful. The lower-level content courses (Calculus and Linear Algebra) are essential for accessing most of the interesting parts of the software industry (self-driving, robotics, research software engineer, etc.). Also, the proof-based coursework is full of useful thought patterns. Most of the difficult/economically important problems I've solved in my life were all relatively trivial extensions of proof techniques or theorems from my undergraduate Analysis, Algebra, and Logic coursework.

The humanities courses sit right in between CS and Mathematics in terms of usefulness. The regular practice writing and communicating, with a professional feedback mechanism, was extremely valuable. I don't often use the content knowledge at work, but the writing skills are extremely useful. Learning to write well without regular professional feedback is difficult.

The content knowledge from my humanities courses was not immediately useful in an economic sense. However, that well-roundedness did get me an amazing spouse and an extremely interesting friend group. I'm not sure how to put a dollar value on either of those things.

So economically, Math > humanities > CS. CS is the more economically valuable career, but the CS education isn't more valuable, because the core content knowledge that's necessary to get a well-paying job doesn't require formal coursework. Basic programming, SQL, and web frameworks are definitely accessible to passionate middle schoolers.

In career dimensions beyond net worth (i.e., interesting work), Math > CS > humanities.

In non-career dimensions, humanities >= Math > CS. My marriage and friends are mostly a product of the humanities aspect of my education and are each easily worth the $20k I paid for undergraduate.

YMMV.

> I think we need to create two tiers of higher-education

FWIW there are at least 4 tiers:

1. Vocational: Community College and most for-profit universities/educational programs (including bootcamps)

2. Advanced Vocational BA/BS: "industry-tracked" degrees at traditional universities. E.g., many universities have both a CS degree and an SE degree. The physics/engineering split is comparable. Theoretical technical courses are replaced with pragmatic technical courses and the more pure humanities courses are replaced with business/econ/industry-specific communications coursework.

3. Traditional BA/BS

4. Traditional BA/BS at elite academic institutions (e.g., top 10 CS programs tend to have bachelors degrees that are different in kind from small branch campuses of most state university systems)


It truly is a shame. I was seriously considering pursuing a PhD in history and one of my professors warned me against it. I ended up getting a degree in accounting and have not regretted anything.


Same here, except I ended up in CS. Solid job prospects, and I have found other ways to engage with my love of history.


This is a recurring topic on HN.

My (European) experience is that academia provides a great work environment. If you have tenure, you're free to work on whatever you want, you have very low pressure (beside the pressure you decide to inflict on yourself), you're basically paid to learn new things and teach them to others. You can easily visit other institutes abroad. You get to meet interesting people. Age discrimination isn't an issue. The salary isn't on par with what you could make working for a company but it's enough. If you're ambitious and talented you can easily earn money on the side (consulting, starting your own company, taking a leave) and you can have the best of both worlds.


Yes, I believe we have it easier in Europe. I realize my other post sounds rantish but the reason I'm still in academia, beyond the very valid points you've mentioned, is that I don't believe the nepotism and dishonesty issues are specific to academia.

>You get to meet interesting people.

I want to stress that part because I don't feel it does the environment justice. You get to meet fantastic and very intelligent people from all countries in the world. During gathering events, meetings, parties, etc. I've been to, it's not rare to see eight different nationalities out of eight people around a dining table. You talk about science at first but very quickly discussions derail into comparisons between such and such country's perspective, differences in culture, jokes about the language, and so on. And it happens every time. Not all environments provide this.

Of course, you may argue that this diversity of culture and insights is counterbalanced by the community's homogeneity in many other aspects (e.g. the overwhelming majority of scientists are very liberal, even by European standards), and it's pretty obvious that your view of, say, Iranian or Russian people is going to be skewed if the only ones you've talked to are very educated and liberal 20- or 30-somethings working as expats. Scientists are known for their insularity, and while they're a demographic that's among the most susceptible to date outside their country, they're also very prone to date inside the community. The 'ivory tower' stereotype is not completely wrong.


This is also one of the great things about working in Silicon Valley.


What is special about "Iranian or Russian people"?


The sensibilities of expats from these countries tend to be quite misaligned with that of their governments and a large chunk of their fellow countrymen. It is of course the case with almost every country but the divide can be markedly stronger in some countries.


Do you have any evidence allowing you to single out these two countries?


Well no, I don't have 'evidence'. I just report what the expats in question tell me. I'm not singling them out, they just came to my mind due to recent conversations. I'm not trying to make a grander point or anything, and not every remark has to be completely exhaustive and evidenced to be 'allowed' into a casual conversation. If you want me to source every statement I make, you're going to have to pay me, just like my boss does when I write papers.


>I'm not singling them out

But you did it and you did it with the "it's pretty obvious" qualifier.

This casual bias needs to be corrected.


Being around really smart people doing interesting things is pretty great.


Life is also pretty great if you win the lottery. But it's not wise to plan on it.

One of the main points brought up here and elsewhere is that getting a tenure-track position is hard: you're basically in a giant tournament. And that tournament continues to actually get tenure. So I don't disagree with you, but you're starting from the assumption that one has tenure, which is very much not a given for the vast majority of grad students.

(Yes, yes, I'm aware this is not strictly true and that many lottery winners end up declaring bankruptcy and often report being miserable. But it works as an illustration.)


Getting a job in academia is not entirely random :) Just like for any competitive job, it's important to objectively assess chances of success and reconsider them along the way.

There are many things to consider. How bright you are, what is your research topic and will it be popular in a few years, who is your PhD advisor, are you willing to work abroad and so on...

But I agree that there is uncertainty for most people and graduate students should have an exit plan, e.g. keep their research or teaching work relevant to the job market.


The problem is getting a tenured position - with all the nepotism, protectionism, and elitism of the academic world.


Meh, as a baby academic I am not convinced that the nepotism, protectionism, and elitism of academia are markedly higher than in the 'real world'.


The big difference is the workplace variety. You can work at startups, small businesses, mid-sized companies, mega-corporations, or even government jobs. It isn't that these problems don't exist there, but it is easier to escape certain types of them.


Quit my professorship about four months ago, and I've never been happier. Academia works for some people, but it is systemically twisted and has become terribly corrupt in most places.

There's a life outside, people. Not just life but intellectual life. I'm so much happier not contributing to a system I hate.


obligatory "Dont become a Scientist" http://katz.fastmail.us/scientist.html


I had this same problem when I did projects in different disciplines at the university. As an outsider with the same skill-set, you can see through some of the wrong ideas of other departments.

And it is hard to discuss some of those ideas, as they are often established within the specialization. The reasons why those wrong ideas became established have more to do with tradition than actual science. And the reason they were accepted is that they matched some observations. Afterwards those matches turned out to be coincidence or cherry-picking, but were often not recognized as such. And alternative theories were neither needed nor supported, due to the internal culture.

It is easy to compare it with programming problems. You can claim that your program works when you tested it for certain problems. But if no-one else is allowed to test your code, your program probably still has a lot of bugs.

In some cases there is even a fundamental problem. They started with the wrong models in the first place, and/or misinterpreted certain observations. And from there they started growing new models on top of it. It is like programming code that started with the wrong abstractions.

Most problems occur from theories that are developed without much practical experience. Sometimes even without any practical experience. A friend of mine is a computer-scientist who has almost no computer experience. I have heard many wrong ideas from him.

If I want to talk about the problems, I will probably step on someone's toes. Causing anger and the belief that I am just misinformed or under educated.

We like to project these problems onto psychology and such, but I can also find them in hard sciences.

Let me pick one in hard physics. There are some theories in physics that model field lines as real, while in reality fields are always continuous. There are clearly some mistakes in one (or more) of the underlying theories.


With apologies for the snark, don't they teach the use of paragraphs in academia?


It's indicative of how far gone he is.


Can anyone speak to what it's like doing a post doc in CS?


You should be more specific - what part of CS?

All the CS postdocs I know seem a) pretty happy and b) to have a really easy backup of a high-paying industry gig that they can pivot to, because of the currently insanely inflated demand for academics to do deep learning on the blockchain on the cloud on embedded devices doing quantum backprop

-these are ML/theory postdocs


I am primarily interested in the experience of ML post-docs that forgo going into industry. Can you elaborate a bit more about their experience? Salaries look a bit lower for post docs vs industry work. Some I've seen start in the mid-40k to 70k range. Could be different based on geography.


are you already a grad student or are you considering it?

In my department [we are probably an outlier] postdoc wages are actually pretty comparable to industry, i.e. >= 100k. 40k to 70k seems low to me.

I would expect the range to be more like 60-100k.

I haven't been on the real job market yet - but the key to getting a TT or postdoc position seems to be a) collaborators who want to hire you and b) people who have heard you give a talk in person and are familiar with your research.


I am considering going into a CS PhD focusing on ML. The mid-40k to 70k range was from a quick Google search I did for CS postdocs in ML in areas where the cost of living is much lower than on the West Coast. I am trying to look at career prospects and weigh whether it makes sense to stay in academia or jump to industry (after I complete a PhD). If wages are closer to 60k-100k for postdocs, then I may consider staying in academia for some time after completing a PhD, depending on whether my career interests shift.


Well, I would be happy to provide some context. I just finished my first year of a CS PhD in ML (more on the theory side) and I really like it. I think most of the places you would want to do a postdoc in CS are probably going to be moderately high CoL. My PhD is in a place with pretty low CoL (but still a top 10 to top 20 school, depending on who you ask), so the graduate stipend goes reasonably far.

The other thing to note in ML is that it seems like a few people go to industry research labs for a few years, i.e. MSR/FAIR/Google Brain, and then come back to the academy, since there are industry roles that involve research and publication - for instance Moritz Hardt.

My personal plan for the first 3 years of grad school is to work really hard and try to keep both academia and industry open, and after year 3 evaluate the number of publications I have and my current skill set to see whether I can make it in the academy or should shift more towards industry.

I think the biggest factor I would comment on is to look very closely at what jobs the graduated students from the department you matriculate at AND, more importantly, from the professor you want to work with go on to do post-PhD. There are a lot of naysayers in this thread about the risks of an academic career and I share those concerns, but I felt a lot more comfortable taking the plunge after I looked at the career record of my advisor's graduated students. They were all either tenure track or had good industry positions.

Edit: if your advisor has collaborators in industry groups, I think it is pretty straightforward to get an industry gig.


When evaluating the warnings from naysayers you have to keep in mind that CS is quite an outlier as far as backup career prospects go. I made it all the way to CS postdoc and every step of the way I had to keep swatting away industry recruiters waving wads of cash at me. I finally made the leap for other reasons but it was effortless. I think this is absolutely not how it works in other disciplines.

One exception I can think of is what I call "closet programmers," which are folks who work in various areas that rely on software, such as experimental physics, astronomy, or molecular biology, and end up mostly doing programming because they love it. We have a bunch of engineers like that and they are all excellent :-)


100% correct. I think we are both very lucky in that we are able to do fun science and chase our intellectual interests with a realistic and still fun safety net.

Also, those closet programmers are always really fun to talk to, since their problems and culture are a breath of fresh air.


Thanks for the context. That sounds like a good plan to me regarding post-doc locations. I am also interested in the theory side of ML. What areas of mathematics should one learn really well that apply to the theory side? What blogs, papers, books would you point one to to learn the theory side more? To your knowledge, are there applications of abstract algebra to ML? If so, what areas of algebra apply & what problems do they solve?


I could rant about this for hours. I actually just went to a defense for a deep learning paper that had a ton of abstract algebra. I am honestly not really a fan of deep learning plus algebra, because all the papers seem, to me, to stop at describing some really basic feedforward network as some really specific mathematical structure, but these theories a) provide very little explanation of empirical phenomena and b) provide no new directions of research in terms of useful network architectures.

I haven't really come across algebra in machine learning other than people applying it to deep learning.

i.e. https://arxiv.org/pdf/1802.03690.pdf

ie. https://icml.cc/Conferences/2018/Schedule?showEvent=2048

I don't personally find papers like this valuable but idk I have never really enjoyed abstract algebra.

For areas of mathematics to do theory in ML (and to do ML more generally!)

-probability/concentration/hoeffding bounds [the PAC model] [Key]

-linear algebra [key]

-optimization [key]

for books

-Understanding Machine Learning by Shai Ben-David

This book is nice since it really balances theory with a more practical understanding.

-An Introduction to Computational Learning Theory by Kearns is a classic [low priority]. This is fun since the proofs are simple and deep, but it is very, very far away from practical algorithms.

-Convex Optimization by Boyd

Course Notes:

[I think a good alternative to blogs is stalking course notes for other schools-they are very often public.]

- http://ttic.uchicago.edu/~avrim/MLT18/index.html

A good learning theory course by Avrim Blum, who is a big deal in learning theory and theory generally.

- Tim Roughgarden's notes are a blessing for algorithms and theory [seriously, he should have a Patreon or something]

https://theory.stanford.edu/~tim/notes.html

Blogs:

-http://www.argmin.net/

This is Ben Recht's blog and it is filled with ML wisdom.

-https://blogs.princeton.edu/imabandit/ not quite learning theory but a lot of ML adjacent stuff

I don't read as many blogs as I should, tbh, so other people can give better advice.

VIDEOS https://www.youtube.com/channel/UCW1C2xOfXsIzPgjXyuhkw9g

This is the Simons Institute YouTube channel - probably the best single location for recordings of talks in computer science, with a good number of ML talks.

https://simons.berkeley.edu/videos


I don't see a way to respond to your latest reply, but thank you for the recommendations! I'll take a look at them.


For the OP, sure. Of course. He expected something else? After all that time as a student, he didn't see what he saw as a post-doc?

When I was in K-12, I got strong messages that education was generally good and, in particular, good preparation for a good career, e.g., financially secure, enough to be a good family provider and more, and a Ph.D. was the best degree.

Okay, around DC, with only a BS in math, I quickly had a good career going in applied math and computing. Often I could have done better if I'd known more about math and physics; so I did a lot of study on/off the job and learned a lot. Then I went for a Ph.D. in applied math and did well with it.

But, no way, not a chance, never, not even for a milli, micro, nano, pico second did I ever want to be a college prof. Instead I wanted to return to the career I had before grad school and, then, just do better at THAT career.

But my wife's Ph.D. program nearly killed her. So, to try to help her get well, I took a slot as a prof in an MBA program in a B-school near her family farm.

She didn't get well; that academic slot was as in the OP: The B-school wasn't much about business. The students were putting in time, money, and effort and learning next to nothing useful for career in business. For me, as in the OP, I was being financially irresponsible.

So, I left for business, an AI project at IBM's Watson lab.

Publishing? The OP claims that some of the publishing was unethical. Actually, in the papers I wrote as sole author, I didn't encounter that. My papers were fully honest, honorable, ethical, etc. Some co-authored papers were nearly all about hype and PR (public relations, getting known, pretending to have some good, leading edge research); the papers weren't actually wrong or unethical, and hype and PR are more common in business than in those co-authored papers!

Looking back, here's what's wrong generally with B-school: It has physics envy, wants to see itself as doing high end, pure research. E.g., too many B-schools are much more interested in the question P versus NP than anything having to do with being successful in business. B-schools are not clinical like medicine, law, dentistry, pharmacy. Except for some courses in accounting and maybe business law, B-schools are not vocational training or much help for a business career.

More generally in US research university education, the big push is for research that can get research grants, and helping the students do well outside academics is a low priority. Really, at a good research university, the students will be able to learn from the best sources the best theoretical foundations of some field; that foundation might, maybe a long shot but might, help for some work in the field, even in business. Yes, one really good application might be a strong pillar for a whole career. Fine. But day by day those research university foundations are not very good information about how to be successful in business, might be like teaching about the details of photosynthesis and the chemistry and thermodynamics of internal combustion engines as a foundation for running a lawn mowing service!

Since the OP is about psychology, my brother tried that. He got his Masters. He concluded that psychology knew a lot about rats that was next to useless for people and otherwise next to nothing useful about people, was good at what was not of interest and not much good at what was of interest. He changed to political science and got his Ph.D. there. He has never used the degree for his career!

What happened? At the high end US research universities, ballpark 60% of the university budgets is from the university's 60% or so off the top of research grants to the profs. The research grants are heavily for (A) the STEM fields, e.g., from NSF, and (B) bio-medical science, e.g., from NIH. (A) is mostly for US national security and got started during the Cold War after The Bomb and more in the STEM fields were so important in WWII. As a secondary effect, the research funding AND the DoD funding for systems has helped US progress in information technology, e.g, the early days of Silicon Valley. For (B), that is, bluntly, because Congress has to vote the money and a lot of members of Congress are old enough to care about progress in medicine. As a secondary effect, the technology in US medicine is likely the best in the world. Still, as in the OP, commonly trying to be a college prof is financially irresponsible and possibly objectionable ethically, etc. But, again, there are some really big bucks involved, both to run a university and to get research grants.

Some good news, for the taxpayers, about the research grants, especially from NSF and NIH, (A) the grant applications commonly get expert reviews and (B) generally the grants are quite competitive.

For the students, learning is MUCH easier now than ever before: We are awash in books, often in PDF and for free, video lectures, e.g., quantum mechanics at MIT, etc. And, especially in practical computing, the US workforce does a LOT of independent study and self teaching, e.g., commonly knows MUCH more about practical computing than research university computer science profs.

So, in short, we can leave the research universities to do far-out research, and, for the STEM fields, be largely self-taught - even for theoretical purposes (e.g., do research with a day job reviewing patents in Switzerland! There's no law against going to sleep at night after an hour spent thinking about how to resolve P versus NP), and especially for practical purposes.

For how to run a successful lawn mowing service, bath and kitchen renovation service, auto repair and body repair shop, ..., building supply company, fast food restaurant, Web site, software house, etc., people get to learn from their parents, early jobs, what they can read, and especially what they can figure out for themselves. For using some algorithms for the traveling salesman problem to find a route over some ground to minimize mowing time, that might, but I doubt it, be okay for some farms but not for mowing suburban lawns!


The cause of the problems in academia is hinted at near the end:

> the truth is I understand why they have to do things that way, if they don't get published, they don't get money to continue work that should theoretically be beneficial down the road!

Scientists are optimizing for what is likely to be published (read: what is likely to earn money), sacrificing elsewhere. Maybe if we paid them enough so that they don't have to do this to survive, then we wouldn't have this issue.


> Scientists are optimizing for what is likely to be published (read: what is likely to earn money), sacrificing elsewhere. Maybe if we paid them enough so that they don't have to do this to survive, then we wouldn't have this issue.

How do you do this in a society with scarce resources? With scarcity, you need to have a measure of success, otherwise everyone could just say they were a scientist and get a six figure salary to do nothing. Any metric you choose will be gamed.

This worked hundreds of years ago because the only people that could get educated were the wealthy, so we didn't need to worry about supporting scientists with public funds, they supported themselves.


> With scarcity, you need to have a measure of success, otherwise everyone could just say they were a scientist and get a six figure salary to do nothing. Any metric you choose will be gamed.

As a current grad student, I'm calling BS. We (meaning academics, even the ones gaming the system) _know_ what strong, reproducible, well-designed research projects look like. It doesn't take a genius to develop an experiment, run it, document what happened, and publish _all_ of the results. Hell, you learned the basics in middle school -- pick an independent variable, tweak it, then see what happens to the dependent variables.

Now it's obviously not that simple -- isolating just one independent variable is practically impossible much of the time -- but the concept of documenting EVERYTHING that you tried should be obvious. There's absolutely zero impetus to share all of that information, however, especially when you only have 12 pages in a conference paper to write up everything. So instead you go fishing, and you run a boatload of tests, and you see which ones "solve" the problem, and you write up just those tests, ignoring all the failures and the false starts.

It pisses me off, to be honest, and it's just one of the many reasons I'm trying to get out of here as soon as I can.


I'm also a grad student.

We do know what good research looks like, that's not the problem, at least not directly. We're both told "go do good research" and then we go and each write one paper. The government then says "okay this is fine, but next year, I can only fund one of you, so, now whoever brings me the most papers next year will keep their job, and the other will hit the bricks." This is where the problem starts.


It has been suggested that a lottery system for grant proposals would actually both work better and be more fair than our current system.


Tracking what actually happens in scientific studies and how many don't get published would be a good idea. And a blockchain would be a way to do it. Knowing the whole truth could make science better.


Can you explain why it cannot be done with Web2.0 and Object Oriented Programming in ACID Cloud?


Well, it’s well-known [citation needed] that to be taken seriously, any vague business idea proposal must exhibit Proof of Buzzword. Just like in the Bitcoin protocol, when a Buzzword has successfully been mined enough times its reward value halves and one must add new, previously unexploited buzzwords to the proposal to maintain its credibility.

By this token web2.0, ACID, Cloud et al. have almost no buzzword value left. Even Bitcoin is very 2017. Which is why the only logical solution is nano-engineered quantum genetic algorithms with dark matter entanglement in P-space.


> And a blockchain would be a way to do it

I can't even.


Money corrupts everything including academics.


It's just how it goes in those fields...remove all of the negative results, don't actually report the ridiculous number of fishing expeditions you went on (especially in fMRI research), make it sound like you mostly knew what you were going to find in the first place, make it a nice clean story.

This has always bugged me about our implementation of the scientific method. There's no reason it has to be this way; it's just convention.



