Interesting fact: the majority of the impact factor of big-name journals (Nature, Science, Cell and co.) comes from a few select high-impact papers. Most of the rest goes relatively uncited. The reason is, most of these papers are not astounding but simply make it through the editorial system because the big shots in charge of the labs know the editors. Not that these papers themselves are devoid of merit (though I know my fair share of Nat/Sci/Cell papers that happen to be 80-100% bunk), but they wouldn't have made it into such highly selective journals otherwise. These journals also happen to have an unusually high retraction rate. I have friends working in the aforementioned big-shot labs who are jaded to the point that they just dismiss most things coming out of Science.
And that's just my area and that of people I know. Things go very awry in medicine or anything related to cancer where the pressure is so high, the field so competitive, the scientific questions so complicated, the experiments so hard to reproduce and the incentives to publish so exacerbated (note that medical journals generally have the highest impact factors). And I can't even begin to imagine what goes on in psychology.
Plus, the pay is lousy.
When I was in grad school I was doing graph theoretic analysis of human metabolic networks. I had some really interesting and promising preliminary results about the structure of various metabolic pathways and had been invited to present my unpublished work in a multi-departmental lecture. I was making some new visualizations for my talk and spotted something odd, which led me to a bug in my source code that had reversed the direction of several steps in many of the pathways. I was mortified . . . I ran some preliminary tests on the corrected code and saw that many of my results were completely in error. I wouldn't know for sure until I could get several days of cluster time, but I had proof that my last year of work was completely bogus.
I stayed up all night confirming the new code and the extent of my invalidated results and brought this to my advisor the morning before the lecture. He told me to shut up and present it anyways. Being a young idealist I argued with him about the nature of science and the search for truth . . . I refused to go along with it and present something I knew was false. He made one of the research scientists present on their topic at the last second so our lab wouldn't "lose face" and I was pretty much in the dog house forever after.
This is how Science should be done.
A major conceptual difficulty was the somewhat nebulous definition of a metabolic pathway, beyond what reactions were drawn together in a textbook diagram. Also somewhat thorny was the concept of building a network with any sort of transitive property that had a notion of mass conservation built in. I wrote about 30k lines of terrible Perl code exploring this stuff; if I had independent backing I would definitely dust it off and finish the project, but it's pretty far in the rearview mirror at this point.
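For anyone curious what a reversed-edge bug like the one in the story above looks like, here's a minimal, hypothetical sketch in Python (the original code was Perl; the metabolite names and reactions are placeholders, not real biochemistry). The point is that a cheap directed-reachability check can catch edges pointing the wrong way:

```python
from collections import defaultdict

def build_graph(reactions):
    """Each reaction is a (substrate, product) pair; edges point
    substrate -> product, so reachability follows metabolic flow."""
    graph = defaultdict(set)
    for substrate, product in reactions:
        graph[substrate].add(product)
    return graph

def reachable(graph, start, target):
    """Iterative depth-first search: can `target` be produced from `start`?"""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node == target:
            return True
        if node in seen:
            continue
        seen.add(node)
        stack.extend(graph[node])
    return False

# Toy glycolysis-like fragment (placeholder names, no real stoichiometry)
reactions = [("glucose", "g6p"), ("g6p", "f6p"), ("f6p", "pyruvate")]
graph = build_graph(reactions)

# Sanity checks: a bug that flipped (substrate, product) pairs would
# make the first assertion fail and the second one pass.
assert reachable(graph, "glucose", "pyruvate")
assert not reachable(graph, "pyruvate", "glucose")
```

A handful of known-direction assertions like these, run on every build, would have flagged the reversed steps long before a year of cluster results depended on them.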
Being naive, I just assumed they'd be retracting them, but my labmates explained there was no way our PI would let that happen. It was never boiled up to the PI, so he never even knew. A year later (due to this as well as other issues), I quit the PhD and joined a company doing something "real".
Having journals/conferences with things like double-blind peer-review processes in place to me is probably the only way to do this. How else would you filter all the noise?
We don't have this in industry and it's a bit sad. I mean, we do in the form of some blogs, conferences, etc. But it in no way compares to the system academia has. Have you ever tried publishing at an academic conference compared to an "industry conference"? All you need for most industry conferences is an abstract, an idea, not even evidence of anything new; actually, you also need reputation, but that's a different issue.
There are flaws in every system for sure, and academia has its hands full of them, especially since it has to deal with quite important things like medicine.
But I find it funny that people complain about academia, and yet there's no other system that comes close to producing empirical truths.
As for designing a better system that's more robust to these flaws, it is a hard problem. There are structural issues when trying to apply traditional scientific virtues to the real world (how do you reasonably ensure reproducibility with a ridiculously expensive experiment, or one spanning decades, or one involving cohorts of tens of thousands of people? How do you ensure peer review is honest and comprehensive when there are exactly three labs specialized in a subject, and they're all competitors? Or even collaborators?), even if you assumed that editorial bias (toward big names, against negative results, toward trendy subjects, etc.), dishonesty and nepotism didn't exist. Plus there's a wide system of power-wielding stakeholders with huge incentives to maintain the status quo: established big shots, publishers (cough cough Elsevier cough cough), and so on.
So yeah, things are broken but we're all trudging along trying to enjoy what we do and make do with what we have. Who knows, things may even get fixed one day.
In my view it should be a judgment call whether a particular result needs to be replicated. I'm aware of the so-called replication crisis. But adding more arbitrary rules will simply invite people to figure out how to game those rules, if they were gaming the original rules (deliberately or unwittingly).
Imagine I report today that I'm able to determine the temperature of the physical vacuum (e.g., using a double-slit experiment), and then able to drop it, i.e., cool the vacuum below absolute zero. Will you trust me? Of course not. Then when will you trust me?
> The humanities PhD is still a vocational degree to prepare students for a career teaching in academia, and there are no jobs. Do not get a PhD in history.
I left the degree program long enough ago that some of the people who started with me have finished their PhDs, although many of them still have not, after nearly a decade. Of the ones with PhDs, the only ones with decent jobs are the ones who work outside of academia. The others are adjuncts, and some of them qualify for food stamps.
The thing about getting a PhD is that you defer all of the income and career opportunities many people have in their 20s. By the time you graduate, other people your age have already paid their dues in entry level jobs, got promoted, bought cars and homes, been able to afford travel, and saved money for retirement. When you graduate you will be a decade behind everyone else in that aspect of life.
The only reason any rational-thinking person would make such a sacrifice is if the payoff at the end could make up for it.
A STEM PhD opens doors to a lot of jobs in the private sector. The only door opened by a humanities PhD is a chance to compete in one of the worst job markets in the world: academia, where there are several times as many new PhD graduates as jobs every year.
I think we need to create two tiers of higher-education: one where people focus purely on vocational/career skills while another where you can focus on non-vocational subjects. Let students decide what their general education will consist of instead of making it some top-down approach. People don't like authoritarianism.
I say talk, and not 'think', because I don't think STEM people are bad at thinking about problems critically. I just think a lot of humanities is about developing a common set of reference points, a common vocabulary, so you can work on social problems together.
If you don't have that, you can't really stop malicious actors from drowning out their opposition, who are isolated by the fact they don't share a common conceptual toolset. And if that happens for long enough, you end up with a society that's basically a bunch of amazingly powerful machines operated by Fox News.
And while I was very successful in those courses, I fail to see how they prepared me to be a better citizen, or to think critically about the direction society is going towards. For my daily life, I would probably have been better off taking a course in plumbing than literary analysis.
It's also not that Shakespeare's going to make anybody a better thinker. It's more that stuff like this sets the tone for the culture in general. If you don't really teach hard, interesting culture, then all the people who might have been interested just get the feeling that culture is kinda garbage and go on to engage with hard interesting ideas in other fields. Which leads to a public culture that's not in any way healthy - one that's really just people passing the time, that doesn't really push us or make us grow.
History shows this kind of thing is just incredibly dangerous. If everybody likes culture that's enjoyable and easy, then easy, self-satisfied thinking predominates. Then suddenly everybody is marching, waving big flags and wearing funny hats, because it's a lot of fun and makes you feel great to be part of something.
Please provide a reference for this. It sounds too much like a just-so story for me to take that at face value.
To add some more unrigorous, allusive examples to the mix: the typical Al-Qaeda member is not an imam, but rather an engineer. Technical capability by no means implies ethical capability, so if you imagine a whole society with highly developed technical abilities and highly under-developed critical capacities, it's a scary thought. And you don't really have to look far into history to find lots of examples.
Consider the V2 rocket, for instance. One of the most amazing technical innovations of the century, but equally, utterly pointless on every other level, even the military one. That's the kind of thing that nations with deeply stupid official cultures tied to deeply brilliant technical cultures make.
While knowledge and culture may seem boring or useless in themselves (say, the fine details of long-dead Roman politics), there may be hidden great applications in the right situation. Two surprising examples of relevance: Roman gladiatorial archetypes undergoing a kind of political correctness after Rome annexed former foes (along with gladiatorial endorsements), and, after an extended war had concentrated farms in the hands of slave-holding elites beyond legal limits, the idea of land reform, or even of enforcing the existing laws, being denounced as 'un-Roman'.
Science alone has a history of bizarre peripheral finds that nobody would have expected. Take the incident in which cattle bled to death after routine mild procedures: the cause was eventually traced back to a fungus affecting their feed. Useful in itself, but seemingly good only for agronomy. That fungus was then used to produce the first blood thinner on the market.
I think you're conflating enjoyment in the sense you enjoy an apple with enjoyment in the sense you enjoy a painting. It's a really common conflation, mainly because in our society the dominant culture is enjoyable in the sense an apple is enjoyable. That, for me, is part of the problem. Nobody ever got inspired by eating an apple. Nobody ever learned new forms of empathy, or grew as a person, through appetitive enjoyment.
Theoretically, appreciative enjoyment could be empathic in a "rags to riches" sense, but that could arguably be making the bitterness more palatable while undermining fundamental aspects of it, say the sense of helplessness, even if it shows both ends and the cycle of grudges and oppression. I guess it's like the Shakespearean "write to please both the groundlings and the noble patrons": hard to do, let alone do properly, but done right it offers the potential to create masterpieces.
Anyway, I can see some point to the argument personally, even if I would phrase it radically differently, and it may not be entirely what you wanted.
It sounds like you prioritized maxing your GPA per unit of effort instead of engaging with the material in the way that's most beneficial to you.
Why on earth would you ever write an essay that's just meaningless jargon to you? What on earth were you expecting to learn?
The same thing happens in STEM courses. Three examples:
1. Projects. If your project incorporates the professor's research priorities/interests you can get away with way less work, but will also get less out of the course.
2. Programming assignments in CS. Professors rarely assign open problems; they're almost always assigning variants of important solved problems. Students will often look up the paper or wiki article where the original solution was described and then copy the algorithm and/or proof. This often results in high marks for low levels of effort.
3. There's an even better analogy in Mathematics, where students sometimes learn how to parrot the structure/rhetoric in proofs instead of actually learning the mathematics. Professors often don't do a careful enough job grading and give these proofs high marks because they sound correct and are mostly correct. However, it's clear from a second glance that the student has no idea wtf they're saying.
Students prioritizing low-effort parroting instead of seriously engaging with the material is pervasive in CS and Mathematics.
Some of that blame lies with the professor, and when students never learn that parroting/memorization is bad, I guess that's a huge failure of the entire education system (K-12 through higher ed). But IMO students who know that's the game they are playing (e.g., writing jargon-laden essays instead of writing in a way that is true to their own voice) don't have a lot of room to then complain that all they learned was how to play the "max GPA for min effort" game...
It depends on what you seek from university. Unfortunately, at the undergraduate level, its role has largely shifted from a place of learning into a provider of credentials that act as a signal to employers. With the high cost of tuition (especially in the USA), you want a good return on your investment, not to waste your time engaging with the material to the detriment of your chances of success.
It seems a far better strategy to maximize your GPA while spending the least possible amount of time on the assigned material, in order to have time to pursue your own interests and be able to more effectively learn the things you want to learn.
You got nothing out of your humanities courses. The fault was not in your courses but in yourself. You certainly could have chosen to improve your prose, refine your theses, and strengthen the presentation of your arguments. Instead, you chose to GPA hack.
One of the great things about university is that you get what you put into it. Students who choose to invest in credentialism get credentials. Students who choose to invest in an education get an education. You got what you very explicitly chose to get!
The sad irony, by the way, is that your perceived downsides are rare and GPA hacking backfires. I have never heard of a faculty member refusing to write a letter for an engaged student just because they received a B or even a C; however, I've heard plenty of stories about faculty writing lackluster letters or even refusing to write letters for students who were very clearly GPA hacking.
Perhaps most importantly, your original post was written as an indictment of humanities courses. In my post, I pointed out that GPA hacking is also possible in STEM fields. Your latest post justifies GPA hacking by appealing to the credential-granting role of higher education. Your argument justifies GPA hacking not only in the humanities but also in STEM fields.
A common saying in education is that cheaters mostly rob from themselves. IMO the same is true for GPA hackers.
One final tangential note about this excerpt from your post:
> With the high cost of tuition (especially in the USA), you want a good return on your investment, not waste your time engaging with the material at the detriment of your chances of success... It seems a far better strategy to maximize your GPA while spending the least possible amount of time on the assigned material, in order to have time to pursue your own interests and be able to more effectively learn the things you want to learn.
Doesn't this place you in a bit of a double bind? If you know better than the faculty, then GPA hacking isn't required because you already know the material. If you don't know better than the faculty, then it's very likely that you are frittering away your time on fads instead of focusing on fundamentals.
As I've already stated,
>> The sad irony, by the way, is that your perceived downsides are rare and GPA hacking backfires
Choosing to learn employable skills is important, but there's no upside to viewing education as "simply a method of acquiring the credentials necessary to get hired".
You seem to have missed the fundamental point of my last two posts: your view of how to approach education is self-defeating even if you view education purely as a way of increasing lifetime earnings!
I majored in Math and CS and took a smattering of other courses (one course away from a minor in each of Religion, Philosophy, and Economics).
The CS courses were by far the most useless. Perhaps 50% of the material I had taught myself before starting undergraduate: everything up to and including data structures, as well as the pragmatic aspects of the databases, OOP, and networking courses. Another 40% I could've taught myself without much guidance (non-theoretical PL, non-optimizing compilers, and a non-proof-based algorithms course). The other 10% was project-based or independent study, where the forcing function of working with people who were both above and below my skill level was the most valuable aspect of the coursework. (Pre-college internships and freelancing gave me lots of experience working with others, but almost all of them were far above my skill level or else were completely non-technical.)
The Mathematics coursework was by far the most useful. The lower-level content courses (Calculus and Linear Algebra) are essential for accessing most of the interesting parts of the software industry (self-driving, robotics, research software engineer, etc.). Also, the proof-based coursework is full of useful thought patterns. Most of the difficult/economically important problems I've solved in my life were all relatively trivial extensions of proof techniques or theorems from my undergraduate Analysis, Algebra, and Logic coursework.
The content knowledge from my humanities courses was not immediately useful in an economic sense. However, that well-roundedness did get me an amazing spouse and an extremely interesting friend group. I'm not sure how to put a dollar value on either of those things.
So economically, Math > humanities > CS. A CS career is more economically valuable, but the CS education isn't, because the core content knowledge necessary to get a well-paying job doesn't require formal coursework. Basic programming, SQL, and web frameworks are definitely accessible to passionate middle schoolers.
In career dimensions beyond net worth (i.e., interesting work), Math > CS > humanities.
In non-career dimensions, humanities >= Math > CS. My marriage and friends are mostly a product of the humanities aspect of my education and are each easily worth the $20k I paid for undergraduate.
> I think we need to create two tiers of higher-education
FWIW there are at least 4 tiers:
1. Vocational: Community College and most for-profit universities/educational programs (including bootcamps)
2. Advanced Vocational BA/BS: "industry-tracked" degrees at traditional universities. E.g., many universities have both a CS degree and an SE degree. The physics/engineering split is comparable. Theoretical technical courses are replaced with pragmatic technical courses and the more pure humanities courses are replaced with business/econ/industry-specific communications coursework.
3. Traditional BA/BS
4. Traditional BA/BS at elite academic institutions (e.g., top 10 CS programs tend to have bachelors degrees that are different in kind from small branch campuses of most state university systems)
My (European) experience is that academia provides a great work environment. If you have tenure, you're free to work on whatever you want, you have very low pressure (besides the pressure you decide to inflict on yourself), and you're basically paid to learn new things and teach them to others. You can easily visit other institutes abroad. You get to meet interesting people. Age discrimination isn't an issue. The salary isn't on par with what you could make working for a company, but it's enough. If you're ambitious and talented you can easily earn money on the side (consulting, starting your own company, taking a leave) and have the best of both worlds.
>You get to meet interesting people.
I want to stress that part because I don't feel it does the environment justice. You get to meet fantastic and very intelligent people from all countries in the world. During gathering events, meetings, parties, etc. I've been to, it's not rare to see eight different nationalities out of eight people around a dining table. You talk about science at first but very quickly discussions derail into comparisons between such and such country's perspective, differences in culture, jokes about the language, and so on. And it happens every time. Not all environments provide this.
Of course, you may argue that this diversity of culture and insights is counterbalanced by the community's homogeneity in many other respects (e.g. the overwhelming majority of scientists are very liberal, even by European standards), and it's pretty obvious that your view of, say, Iranian or Russian people is going to be skewed if the only ones you've talked to are very educated and liberal 20- or 30-somethings working as expats. Scientists are known for their insularity, and while they're a demographic among the most likely to date outside their country, they're also very prone to date inside the community. The 'ivory tower' stereotype is not completely wrong.
But you did it and you did it with the "it's pretty obvious" qualifier.
This casual bias needs to be corrected.
One of the main points brought up here and elsewhere is that getting a tenure-track position is hard: you're basically in a giant tournament. And that tournament continues to actually get tenure. So I don't disagree with you, but you're starting from the assumption that one has tenure, which is very much not a given for the vast majority of grad students.
(Yes, yes, I'm aware this is not strictly true and that many lottery winners end up declaring bankruptcy and often report being miserable. But it works as an illustration.)
There are many things to consider. How bright you are, what is your research topic and will it be popular in a few years, who is your PhD advisor, are you willing to work abroad and so on...
But I agree that there is uncertainty for most people and graduate students should have an exit plan, e.g. keep their research or teaching work relevant to the job market.
There's a life outside academia, people. Not just life but intellectual life. I'm so much happier not contributing to a system I hate.
And it is hard to discuss some of those ideas, as they are often established within the specialization. The reasons why those wrong ideas have become established have more to do with tradition than with actual science. They were accepted because they matched some observations; afterwards those matches turned out to be coincidental or cherry-picked, but this often went unrecognized. And alternative theories were neither needed nor supported, due to the internal culture.
It is easy to compare this with programming problems. You can claim that your program works when you have tested it on certain inputs. But if no one else is allowed to test your code, your program probably still has a lot of bugs.
In some cases there is even a fundamental problem. They started with the wrong models in the first place, and/or misinterpreted certain observations. And from there they started growing new models on top of it. It is like programming code that started with the wrong abstractions.
Most problems arise from theories that are developed without much practical experience. Sometimes even without any practical experience. A friend of mine is a computer scientist who has almost no computer experience. I have heard many wrong ideas from him.
If I want to talk about the problems, I will probably step on someone's toes. Causing anger and the belief that I am just misinformed or under educated.
We like to project these problems onto psychology and such, but I can also find them in hard sciences.
Let me pick one in hard physics. There are some theories in physics that model field lines as real objects, while in reality fields are always continuous. There are clearly some mistakes in one (or more) of the underlying theories.
all the CS postdocs I know seem a) pretty happy, b) to have a really easy backup of a high-paying industry gig that they can pivot to, because of the currently insanely inflated demand for academics to make deep learning on the blockchain on the cloud on embedded devices doing quantum backprop
-these are ML/theory postdocs
in my department [we are probably an outlier] postdoc wages are actually pretty comparable to industry, i.e. >= $100k. $40k to $70k seems low to me.
I would expect the range to be more like $60-100k.
I haven't been on the real job market yet, but the key to getting a TT or postdoc position seems to be a) collaborators who want to hire you, b) people who have heard you give a talk in person and are familiar with your research.
The other thing to note in ML is that it seems like a few people go to industry research labs for a few years, e.g. MSR/FAIR/Google Brain, and then come back to the academy, since there are industry roles that involve research and publication. For instance, Moritz Hardt.
my personal plan for the first 3 years of grad school is to work really hard and try to keep both academia and industry open, and after year 3 evaluate the number of publications I have and my current skill set to see whether I can make it in the academy or should shift more towards industry.
I think the biggest factor I would comment on is: look very closely at what jobs the graduated students from the department you matriculate at AND, more importantly, from the professor you want to work with go on to do post-PhD. There are a lot of naysayers in this thread about the risks of an academic career, and I share those concerns, but I felt a lot more comfortable taking the plunge after I looked at the career record of my advisor's graduated students. They were all either tenure track or in good industry positions.
edit: if your advisor has collaborators in industry groups, I think it is pretty straightforward to get an industry gig.
One exception I can think of is what I call "closet programmers": folks who work in various areas that rely on software, such as experimental physics, astronomy, or molecular biology, and end up mostly doing programming because they love it. We have a bunch of engineers like that and they are all excellent :-)
Also, those closet programmers are always really fun to talk to, since their problems and culture are a breath of fresh air.
I haven't really come across algebra in machine learning other than people applying it to deep learning.
I don't personally find papers like this valuable but idk I have never really enjoyed abstract algebra.
Areas of mathematics you need to do theory in ML (and to do ML more generally!):
- probability / concentration / Hoeffding bounds [the PAC model] [key]
- linear algebra [key]
Books:
- Understanding Machine Learning by Shai Ben-David. This book is nice since it really balances theory with a more practical understanding.
- An Introduction to Computational Learning Theory by Kearns is a classic [low priority]. This one is fun since the proofs are simple and deep, but it is very, very far away from practical algorithms.
- Convex Optimization by Boyd.
[I think a good alternative to blogs is stalking course notes from other schools; they are very often good.]
- A learning theory course by Avrim Blum, who is a big deal in learning theory and theory generally.
- Tim Roughgarden's notes are a blessing for algorithms and theory [seriously, he should have a Patreon or something].
- Ben Recht's blog, which is filled with ML wisdom.
- https://blogs.princeton.edu/imabandit/ is not quite learning theory, but has a lot of ML-adjacent stuff.
I don't read as many blogs as I should, tbh, so other people can give better advice.
The Simons Institute YouTube channel is probably the best single location for recordings of TALKS in computer science, with a good amount of ML talks.
When I was in K-12, I got strong messages that education was generally good and, in particular, good preparation for a good career, e.g., financially secure, enough to be a good family provider and more, and a Ph.D. was the best degree.
Okay, around DC, with only a BS in math, I quickly had a good career going in applied math and computing. Often I could have done better if I'd known more about math and physics; so I did a lot of study on/off the job and learned a lot. Then I went for a Ph.D. in applied math and did well with it.
But, no way, not a chance, never, not even for a milli, micro, nano, pico second did I ever want to be a college prof. Instead I wanted to return to the career I had before grad school and, then, just do better at THAT career.
But my wife's Ph.D. program nearly killed her. So, to try to help her get well, I took a slot as a prof in an MBA program in a B-school near her family farm.
She didn't get well, and that academic slot was as in the OP: the B-school wasn't much about business. The students were putting in time, money, and effort and learning next to nothing useful for a career in business. For me, as in the OP, I was being financially irresponsible.
So, I left for business, an AI project at IBM's Watson lab.
Publishing? The OP claims that some of the publishing was unethical. Actually, in the papers I wrote as sole author, I didn't encounter that. My papers were fully honest, honorable, ethical, etc. Some co-authored papers were nearly all about hype and PR (public relations, getting known, pretending to have some good, leading edge research); the papers weren't actually wrong or unethical, and hype and PR are more common in business than in those co-authored papers!
Looking back, here's what's wrong generally with B-school: It has physics envy, wants to see itself as doing high end, pure research. E.g., too many B-schools are much more interested in the question P versus NP than anything having to do with being successful in business. B-schools are not clinical like medicine, law, dentistry, pharmacy. Except for some courses in accounting and maybe business law, B-schools are not vocational training or much help for a business career.
More generally in US research university education, the big push is for research that can get research grants, and helping the students do well outside academics is a low priority. Really, at a good research university, the students will be able to learn from the best sources the best theoretical foundations of some field; that foundation might, maybe a long shot but might, help for some work in the field, even in business. Yes, one really good application might be a strong pillar for a whole career. Fine. But day by day those research university foundations are not very good information about how to be successful in business, might be like teaching about the details of photosynthesis and the chemistry and thermodynamics of internal combustion engines as a foundation for running a lawn mowing service!
Since the OP is about psychology: my brother tried that. He got his master's. He concluded that psychology knew a lot about rats that was next to useless for people, and otherwise next to nothing useful about people; it was good at what was not of interest and not much good at what was of interest. He changed to political science and got his Ph.D. there. He has never used the degree for his career!
What happened? At the high-end US research universities, ballpark 60% of the university budget comes from the university's cut, 60% or so off the top, of the research grants to the profs. The research grants are heavily for (A) the STEM fields, e.g., from NSF, and (B) bio-medical science, e.g., from NIH. (A) is mostly for US national security and got started during the Cold War, after The Bomb and more from the STEM fields proved so important in WWII. As a secondary effect, the research funding AND the DoD funding for systems have helped US progress in information technology, e.g., the early days of Silicon Valley. (B) exists, bluntly, because Congress has to vote the money, and a lot of members of Congress are old enough to care about progress in medicine. As a secondary effect, the technology in US medicine is likely the best in the world. Still, as in the OP, trying to be a college prof is commonly financially irresponsible and possibly objectionable ethically, etc. But, again, there are some really big bucks involved, both to run a university and to get research grants.
Some good news, for the taxpayers, about the research grants, especially from NSF and NIH: (A) the grant applications commonly get expert reviews, and (B) generally the grants are quite competitive.
For the students, learning is MUCH easier now than ever before: we are awash in books, often in PDF and for free, video lectures, e.g., quantum mechanics at MIT, etc. And, especially in practical computing, the US workforce does a LOT of independent study and self-teaching, and commonly knows MUCH more about practical computing than research-university computer science profs.
So, in short, we can leave the research universities to do far-out research in the STEM fields, even for theoretical purposes (e.g., do research with a day job reviewing patents in Switzerland!); i.e., there's no law against spending an hour at night before sleep thinking about how to resolve P versus NP. And, especially for practical purposes, we can be largely self-taught.
For how to run a successful lawn mowing service, bath and kitchen renovation service, auto repair and body repair shop, ..., building supply company, fast food restaurant, Web site, software house, etc., people get to learn from their parents, early jobs, what they can read, and especially what they can figure out for themselves. As for using some algorithms for the traveling salesman problem to find a route over some ground to minimize mowing time: that might, though I doubt it, be okay for some farms, but not for mowing suburban lawns!
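To make that traveling-salesman remark concrete, here is a minimal sketch of the simplest common heuristic, nearest-neighbor routing. All the waypoints, names, and distances below are invented for illustration; a real mower-routing problem would also have to model swath width, turning cost, and obstacles, which is partly why the idea is overkill for a suburban lawn.

```python
# Nearest-neighbor heuristic for a TSP-style routing problem.
# Hypothetical example only: waypoints and units are made up.
import math

def nearest_neighbor_route(points, start=0):
    """Greedy tour: from each point, always visit the closest
    unvisited point next. Fast, but not optimal in general."""
    unvisited = set(range(len(points))) - {start}
    route = [start]
    while unvisited:
        last = points[route[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        route.append(nxt)
        unvisited.remove(nxt)
    return route

def route_length(points, route):
    """Total Euclidean length of the tour (open path, no return leg)."""
    return sum(math.dist(points[a], points[b])
               for a, b in zip(route, route[1:]))

if __name__ == "__main__":
    # Hypothetical field waypoints (x, y) in meters.
    pts = [(0, 0), (50, 10), (10, 40), (60, 60), (5, 70)]
    tour = nearest_neighbor_route(pts)
    print(tour, round(route_length(pts, tour), 1))
```

Nearest-neighbor can be arbitrarily worse than the optimal tour on adversarial inputs, which rather supports the point above: for a handful of suburban lawns, eyeballing the route does just as well.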
> the truth is I understand why they have to do things that way, if they don't get published, they don't get money to continue work that should theoretically be beneficial down the road!
Scientists are optimizing for what is likely to be published (read: what is likely to earn money) and sacrificing elsewhere. Maybe if we paid them enough that they didn't have to do this to survive, we wouldn't have this issue.
How do you do this in a society with scarce resources? With scarcity, you need to have a measure of success, otherwise everyone could just say they were a scientist and get a six figure salary to do nothing. Any metric you choose will be gamed.
This worked hundreds of years ago because the only people who could get educated were the wealthy, so we didn't need to worry about supporting scientists with public funds; they supported themselves.
As a current grad student, I'm calling BS. We (meaning academics, even the ones gaming the system) _know_ what strong, reproducible, well-designed research projects look like. It doesn't take a genius to develop an experiment, run it, document what happened, and publish _all_ of the results. Hell, you learned the basics in middle school -- pick an independent variable, tweak it, then see what happens to the dependent variables.
Now it's obviously not that simple -- isolating just one independent variable is practically impossible much of the time -- but the concept of documenting EVERYTHING that you tried should be obvious. There's absolutely zero impetus to share all of that information, however, especially when you only have 12 pages in a conference paper to write up everything. So instead you go fishing, and you run a boatload of tests, and you see which ones "solve" the problem, and you write up just those tests, ignoring all the failures and the false starts.
It pisses me off, to be honest, and it's just one of the many reasons I'm trying to get out of here as soon as I can.
We do know what good research looks like; that's not the problem, at least not directly. We're both told "go do good research," and then we go and each write one paper. The government then says "okay, this is fine, but next year I can only fund one of you, so whoever brings me the most papers next year will keep their job, and the other will hit the bricks." This is where the problem starts.
By this token web2.0, ACID, Cloud et al. have almost no buzzword value left. Even Bitcoin is very 2017. Which is why the only logical solution is nano-engineered quantum genetic algorithms with dark matter entanglement in P-space.
I can't even.
This has always bugged me about our implementation of the scientific method. There's no reason it has to be this way; it's just convention.