Peter Higgs: I wouldn't be productive enough for today's academic system (2013) (theguardian.com)
378 points by ramgorur on Oct 16, 2016 | 141 comments



When I was first exposed to the research "environment" during my Diplom studies (undergraduate - graduate, early to mid '90s), I immediately recognised that if you actually love research and knowledge, academia is the last thing you'd ever want to get into. No surer way to kill the spark.

Now that I've gone back to do my PhD, the only reason I can do something I consider meaningful is because I am not a regular PhD student. Interestingly, that's also the feedback I get, though framed as "helpful" advice: while what I am doing may be both good and important, it is unlikely to lead to success in academia. With the implication that I should stop doing it and concentrate on something more reasonable. Fortunately, I am not particularly interested in success in modern academia, so I get to do something I consider both good and important.

A related issue is that there really is no such thing as a senior researcher. Instead, professors are turned into research managers, responsible for helping their charges' careers, who then also turn into research managers. Actual research appears to be mostly a still not entirely avoidable side-effect. (And this seems similar to the way the only real path to advancement in industry is to switch to management, all dual-track equivalence rhetoric aside.)


> Now that I've gone back to do my PhD, the only reason I can do something I consider meaningful is because I am not a regular PhD student.

Interesting, your experience matches mine very closely! I once quit a doctoral program in disgust but once I had enough money to fund myself, I could go back to become, perhaps similar to you, a "not regular PhD student" and create a more suitable situation for myself. Not only did I become free in my choice of research topic and collaborators/supervisors. More importantly, I was able to reject proposed, "faster" topics that would have led more immediately to bite-sized, publishable results. As a result, I was able to find and work on something I believe could be fundamental in my field.

On a sidenote, I just started re-reading Kuhn's Structure of Scientific Revolutions. I am growing more and more convinced that the right way to do something important in today's academic environment (that's the link to OP) is the sort of half-half position that seems to be our shared experience. If you are too much of an insider, you might end up doing "normal science". If you are too much of an outsider, the risk is high that you are/become just another crank. Half-half seems like a good position. Maybe a few decades ago, it was still possible to hold an academic position and maintain one's independence. Today, a full-fledged, conventional academic career seems hard to square with a free spirit, what with all the dross of admin and the drag of teaching duties.


> you might end up doing "normal science"

Note that Kuhn doesn't think that normal science is bad in general. Normal science - incremental, systematic research within a scientific community with clear, well-defined rules and solution methods - is necessary and worthwhile, IF you have a good paradigm. When Scott Aaronson publishes a new proof about some obscure or exotic quantum computational complexity class, that is valuable normal science. When the Intel researchers in the 80s or 90s came up with the umpteenth optimization trick that enabled next year's scheduled Moore's Law microprocessor speed doubling, that is valuable normal science.

The problem comes when researchers try to do normal science without a good paradigm, either because they never had one to begin with, or because their old paradigm broke down. For example, it looks like the statistics-based paradigm that guided research in psychology is breaking down, because of the replication crisis. Kuhn's advice to people in this situation would be to re-examine the philosophical basis of the field, NOT to continue on doing business as usual.


> Note that Kuhn doesn't think that normal science is bad in general.

Yeah, I did not write that well. I was thinking too much about myself. From a population point of view, I agree with you (or Kuhn) that "normal science" is a necessary and valuable part of the enterprise. It would be inefficient to have everybody constantly questioning everything. You need the "normals" to milk a given paradigm for all it's worth and the rebels to keep the paradigm, if you will, "on its toes" and replace it when the time has come.

It seems to me commerce is not dissimilar. Not everybody can or should be a wide-eyed entrepreneur. You need people who compete within a business model (in the process wringing all the efficiencies to be had out of it) as well as entrepreneurs who create and thus compete between business models.

I wonder if there is an optimal ratio of entrepreneurial/rebellious "predators" to "normals".



> Kuhn's advice to people in this situation would be to re-examine the philosophical basis of the field, NOT to continue on doing business as usual.

I am not familiar with Kuhn, but the statements you echo resonate deeply with me. I hope that in some places academics do shift their paradigm (when necessary), but from what I've seen at uni the outlook is bleak. Researchers do X for years, build up their reputation by doing so, and at some point their ego is so big that they don't dare ask themselves whether they've wasted years doing something very stupid. Even worse, these people favor PhD candidates who are willing to follow in their footsteps.


> Interesting, your experience matches mine very closely! I once quit a doctoral program in disgust but once I had enough money to fund myself, I could go back to become, perhaps similar to you, a "not regular PhD student" and create a more suitable situation for myself. Not only did I become free in my choice of research topic and collaborators/supervisors. More importantly, I was able to reject proposed, "faster" topics that would have led more immediately to bite-sized, publishable results. As a result, I was able to find and work on something I believe could be fundamental in my field.

This is really the way to go. You have to love your field so much that you're willing to save up enough to support yourself doing independent research in it. I've been thinking of doing that: saving up enough to support myself to work on a research-esque project (i.e. not necessarily a startup with any revenue-generating potential). Your love for your field (in my case, computer science) has to be so great that you're willing to forego the currently ridiculously high developer salaries for a number of years to try to advance the field as a whole (which ideally benefits society on the whole, but doesn't benefit you financially anywhere near as much).


It is of course more exciting to be working on revolutionary science than normal science—but I wonder if normal science seems worse than it needs to be just because it's more susceptible to a kind of bureaucratic watering down—e.g. pushing participants to sacrifice quality in their work for publishability, as you mention.


Same experience - I did my PhD after building my business, which allowed me to be an academic "tourist". Academia is largely a bureaucratic job, with short breaks in which you do research. Just the amount of time spent on formatting papers, writing grants, catching up on/organizing conferences, and handling students is a full-time job for most PIs. What's worse, funding agencies and grants sometimes explicitly promote this constant "career building" track instead of, you know, plain simple curiosity-driven research.

I also have the sense that academics live in a sort of bubble, prioritizing concerns that are secondary, such as ego contests, impact factors, building CVs etc. (to me at least, this bubble is reminiscent of another microcosmic bubble sector - the army). Of course there are true scientists among them, with intact childlike curiosity and passion, but they are increasingly hard to spot in today's overcrowded academia.

One needs to be a part-time scientist so that one has enough time to concentrate on a subject.


   prioritizing concerns ... impact factors
Rest assured that researchers don't prioritise this because we want to; it's because society decided (by voting) that we have to if we want any sort of career progression, funding for students, etc.


Who is this society you refer to? When did it vote anything regarding impact factors?

To a very good first approximation, academics are assessed by other academics.


   When did it vote 
It's a shorthand for saying the government that was voted in by the people. Although in reality that is also a major simplification, because funding requirements are typically written by administrators in concert with lobbyists, and then waved through by uninterested parliamentarians.

   academics are assessed by 
   other academics
Yes, but the narrative those other academics have to provide has changed. It is now important to show non-academic impact. It is also highly helpful to have industry funding.

There are additional requirements that didn't exist in the past, like gender, ethnic and national quotas.

There is Goodhart's law, and scientists develop ways of hacking this system, but there is no free lunch, and the effort that goes into successful bids for funding is major now.

I close with a link to a successful request for research funding from 1921:

https://dirnagl.files.wordpress.com/2014/01/warburg-dfg.jpg

https://dirnagl.com/2014/01/14/otto-warburgs-research-grant/


I'm with the parent on this one. The journal situation is ridiculous. Scientists have complete control over it - the uber-vast majority of voters know nothing about journals. And it's also ridiculously easy for each and every scientist to effect change: just publish your results in open journals.


> every scientist to effect change: just publish your results in open journals.

This is a fundamental misunderstanding of the problem being discussed.

The problem is the pressure to publish frequently, which leads to low-quality science. Many of the Nobel laureates had at most two, often zero, papers in graduate school. Now, they might require a graduate student to have seven or eight papers to get out. Some of this is a shift to sharing more data, but a lot of it is just science of minimal value being packaged up and pushed out.

And the drive for this does come from generic metrics that rely not on scientific understanding, but on the ability to count. The government sometimes prioritizes funding by papers per dollar of grant funding, papers per year, etc.


You are correct, and I indeed did not complete my thinking. If we can't avoid over-publishing, then at least publishing should be open and fast as long as the work is technically correct. Personally I find high-IF journals to be a very bad 'filter' for science. Just publish the damn paper, as long as it's methodologically correct, and let the entire community assess its significance, not just three overworked reviewers and an editor.

I think government officials don't have a direct interest in supporting the current inefficient publishing model, but it's a model that scientists themselves have accepted and reinforced since the 60s. If scientists put forward an alternative model that works better, I don't think governments or the wider public would object.


Well, the senior scientists who sit in evaluation committees and on editorial boards have some control over it. In many fields, only publications in "top" journals matter. If two persons then apply for the same job or grant and one has a paper in Nature while the other has a paper in Bob's Open Journal, the committee will go with the Nature paper every time.


The senior scientists on evaluation committees are quite happy with the status quo and don't want to lose their power in their fiefdoms. Search Bing for [PNAS Fiske].


I agree, it's a social construct. In my corner of science, publication in open journals and conferences is the norm. That's because a couple years ago senior researchers got together and said "let everything be open". This is orthogonal to what I'm whining about.


> Scientists have complete control over it

Social science has shown that a group of people can follow some rule even if all members privately disagree. This happens because of social status, peer pressure and incomplete information, so people tend to think that the others agree with the status quo.

I think you're ignoring this empirical observation because it seems so counter-intuitive. Scientists simply, believe it or not, don't have "complete control" over the situation.


You are referring to the academic microcosm, but the OP said that scientists act this way because "the wider public" voted for them to do so. It's true that the insiders of the academic bubble are living out a tragedy of their commons.


I am currently a PhD student. In my experience, I've had to navigate my way in unusual directions in order to work on things I'm interested in, rather than just go with the flow to collect publications.

I talk with several of my peers who are unhappy with the choices academia offers, and would be open to the thought of pursuing research without being a career academic in the conventional sense. The biggest hurdle that comes up is the feeling of being alone -- not knowing other people who are trying to hack the system and do things differently.

Since there seems to be some traction along the lines of independent research pursuits here, I'd like to pass out a quick survey I just created, for people currently trying to pursue academic research on their own path (or interested in doing so): https://goo.gl/forms/wLvkGPstpCCBJY513

It's very preliminary, but I thought it might be a good start to get some discussion going. I invite responses/comments, either here or on the survey form.


Also a current PhD student, and I've definitely found myself frustrated with the way that academia seems to be at odds with my desire to work on the things that I think are not only the most interesting but the most important. I look around and see the profs working on so many projects that just seem so small and limited. I'm in psych and we have a problem where dozens of people are running competing/overlapping experiments that are all critically underpowered and/or divided into least publishable chunks with little thought to broader importance or applicability.

The whole situation really makes me want to ultimately do my research outside of the establishment, but I recognize that collaborations and publications are a key part of credible science. In order to do that I figure that I will need a platform for recruiting participants and conducting experiments for the kind of research that I would want to do. Ideally the platform would be compelling enough to bring in researchers in academia (or elsewhere). So I'm working on that platform. That's the programming project that brought me to HN.

I filled out your questionnaire. Not sure if my project is related to your frustrations, but it seems somewhat close. Either way I share your pain.


I think almost every graduate student has dreamt at some point about doing academia start-up style. The question is a) can you get funding, and b) can you get published if you do and you don't have superstar scientists on board?


Well, I'm personally funded for the next 3 years, so as a one-man shop, I'm fine for a while. In terms of actually getting funding, that's another question, but my university has a fair amount of startup development, including a network of alumni who act as angel investors for in-house projects. There's also a local startup scene that I could conceivably look into.

As far as publishing goes, I've got papers that I'm working on now, and I've got several collaborations going that extend beyond my current program to faculty at other universities. So I'm building a network of people who are interested in the kind of research that I am doing and could potentially facilitate.

The bottom line is that if I'm sitting on data or access to data, I'm going to have no shortage of academics who would like to get involved. They don't need to be superstars, just researchers who publish, and I'm doing pretty well in that department already. My immediate well-funded faculty supervisor's eyes light up when I talk to him about the kind of datasets I could potentially get for him. So there's potential funding on the academic side as well, at least on small to moderate scales.


Can you elaborate on the platform you're building? What is it, how does it work, what problems do you think it solves, etc.?


I can go into it a little bit. I don't want to totally out myself when I'm still months away from a product that I could demonstrate.

Bottom line is that the online experimentation tools for large swaths of psychological research (questionnaires mainly, but also cognitive tasks) aren't well handled by the big current players in that market. We're just not a big enough value proposition; the tools aren't designed for us. We really need participants to be able to just log in to the experiment and be taken through the whole process. We need tools to track their participation, send reminder emails, etc. that are built into the experiment rather than cobbled together around it. And what most of us would like is for it to spit out a cohesive dataset at the end, ideally with at least some of the data-cleaning done (and documented) for us. A lot of research is cobbled together from a bunch of different tools that spit out separate chunks of data that need to be painstakingly re-constituted at the end. This may not be true for all academic research, but it is for the research that I'm targeting. So there's a value proposition there, with a market of tens of thousands of grant-funded researchers.

In addition, I have access to several local sources of potential data collection (outside of academia) that are extremely appealing to an important subset of the researchers who might use my platform. I'm currently working with three such groups, but if things worked well there, there are hundreds if not thousands of such places in North America and Europe (and elsewhere). So I'd potentially be able to offer researchers not only a fairly comprehensive suite of tools for collecting data, but I'd have a growing network of recruitment sites so researchers could really just design it and click 'go' (and pay me).

This isn't all of it, just the research side. I'm pretty sure that the same set of tools can be used to go after a couple other non-academic markets as well. I've got at least 2 other non-academic revenue opportunities that I think are pretty solid and can be explored in tandem.

Then there are the long-term plans that change everything... (cartoonish evil laugh)


Thanks for the reply. I understand that you may need to hold your cards a little close to your chest, but it's cool to get an idea of what you're working on. What you're proposing sounds really useful. I'm excited about the prospects for what another commenter called the startup approach to science; so many of the ways we're doing research and science could be improved drastically with newer, smarter platforms and methods. Good luck with yours, I hope you're wildly successful.


I appreciate the interest. Thanks for the well-wishes!


On the other hand, I know many happy and successful professors doing excellent work inside the system. Most of the well-known people in my field are really very good at what they do, and the work is progressing quickly.

Deep learning is a big thing, yes? Hinton, Bengio, LeCun, etc. All that work originated in universities under completely conventional structures. Companies snatched it up once it was clear it would be the next big thing.

Self driving cars might have some impact? Thrun, Fox & Burgard were key players in probabilistic state estimation, wrote the Book, and trained the car people at $(BigCo). At universities.

Just to balance out some of the 'academia has no idea what it's doing' statements in this thread. It's not perfect but it's not ineffective.


Well computer science seems to be in decent shape right now. Psychology though...


> A related issue is that there really is no such thing as a senior researcher. Instead, professors are turned into research managers, responsible for helping their charges' careers, who then also turn into research managers. Actual research appears to be mostly a still not entirely avoidable side-effect.

Although this is a bit depressing, and one of the reasons I left academia, is this not partly a reaction to the changing nature of science? With many fields now well established on strong foundations, there are arguably fewer revolutionary ideas required. It looks like science is going to be more evolutionary and less revolutionary, with emphasis shifting from the individual researcher to large teams.

Take for example the LHC where managing the huge teams and technology is (I imagine) significantly more of a challenge than interpreting the data or deciding what experiments to run. A bit of an extreme example perhaps but large projects are probably going to become more prevalent as most of the lower-hanging fruit is claimed.


But we've seen time and time again that great ICs are not great managers, and what I understand of tenure track positions is that they are based on IC performance.

This seems largely a byproduct of the funding model, which seems unlikely to change in academia.


I think you're probably right that there are fewer revolutions per capita and more bureaucracy and politics, compared to before WW2 -- but how much causality there was in each direction, I don't know.


Sorry, but your argument loses serious credibility beginning with the first sentence. The last thing, really? How about, say, making cardboard boxes for a living? Flipping burgers? Don't get me wrong, I understand the point you are making, in part because it's been made more articulately by others many times already. But you are simply wrong when you say that "there is no such thing as a senior researcher". I personally know at least four -- late-career academics who still engage deeply with research, have their own ideas, write single-author papers, and all the rest. It's true that many others do go the route you have described, possibly more than is optimal, though there is some value in a field having a few caretakers (or managers, as you call them), which I rarely see mentioned. But the situation is not the cartoon that you make it out to be. Not by a long shot.


> The last thing, really? How about, say, making cardboard boxes for a living? Flipping burgers?

Or working as a patent clerk, maybe?

Yes, pretty much anything outside the academic process was what I meant, so that your livelihood does not depend on the outcome of your research, because then you have to tailor your research directions towards the safe and accepted. And so your straw men actually would work, though suboptimally because the income is probably too low to allow you free time.

Also, misquoting me by leaving out the "really" makes your argument disingenuous, because the "really" made clear that the statement is not absolute but a trend (if a somewhat overwhelming one), and 4 exceptions (names?) don't exactly disprove a generalization.

It would also be interesting to know whether you see your four examples as holdouts from a different era, or as general exceptions. I'd wager they're more the former than the latter, and that their number is dwindling.


> Or working as a patent clerk, maybe?

Does it still work though? In the days of Einstein, I imagine you could slack off easier in such a position than nowadays.

The problem these days is that we're inundated with busywork not only in academia, but in most occupations.


And wouldn't the smartest fraction of the population (or at least the fraction researchers believe themselves to be) have the best skills at shamming, avoiding individual busywork, and finding careers where they can ruminate in relative peace, compared to working in a busy call center, for example?

Even in general, at very large scale: in a world of limited and shrinking economic opportunity but unlimited and expanding education (well, until the future student loan crash arrives), you can try to pile on busywork, but the people shamming and pencil-whipping their way through it are better at shamming and pencil-whipping (and everything else) than any other cohort in the history of humanity...

As a cultural touchstone, look at the movie "Office Space", still relevant today. He stares at the walls for an hour or two every day trying to look busy. Sure, you might have to pass three interviews with ten people while proving P=NP and writing a javascript compiler to GET the job, but to DO the job you have to update TPS report headers every couple of weeks and that's about it, other than loading letter-size paper into the laser printer and staring at walls looking busy. Not to spoil the plot of the movie, but the lead character is not a theoretical physicist, although if he were, it would be a dream job for him: work about an hour a week, then look really busy the other 39+ hours while working on physics.


Reddit (and to some extent HN) are proof that there are plenty of white collar jobs with oodles of free time.


It's not the right kind of time, though. Reading and occasionally posting usually don't require sustained concentration -- they can mostly be done as a break from that. (Of course there are times when an interesting and difficult topic shows up, or a debate arises that one wants to participate in.) So for the most part, it's a break from work. Switching from one nontrivial technical task to squeeze bits of a much more difficult task into one's day is much harder.


Are you seriously claiming that making posts on Hacker News and Reddit requires anywhere near the intellectual rigour and deep thought required for "paradigm-shifting" changes?


I've personally found jobs with high levels of physical tedium good for contemplation, and they don't impinge on time outside work with stress. I can see making cardboard boxes being pretty rewarding for a budding philosopher with access to a good library and some peers to converse with.


I love contemplation, and create physical busywork for myself so that I can do it. But that's slacking. To do real intellectual work I need to stop and use pen and paper, or maybe a computer.

Even if contemplation is enough, most "manual" jobs require you to use your brain. Not necessarily in an interesting way, but enough to distract you from contemplation. It can't all be moving heavy objects from one pile to another.


Arguing against unreasonably literal interpretations of broad statements only serves to water down all conversation to an unending series of disclaimers.


One of my professors recently reached the legal retirement age, but applied for a senior professorship so he could keep doing research - or, as he put it, so that he "would be allowed to keep working". I found it very encouraging to hear that positive attitude in someone who has been at the institute for the past 30-40 years.


Are you retired and doing a PhD for fun? Or planning to return to (unrelated?) industry after your PhD?


I feel for him. It is not just in science, but in business as well. People want results, not to understand their problems. And they want them yesterday, despite only telling you about it today. In some regards it's just a trick to keep you working hard for them. But still, it's neither fun nor actually productive.

It's great to work with freelancers though, since from the good ones you can learn how to handle that situation: don't give users, customers, or management enough information to really control you; understand that you are the one who provides the results (or doesn't); and don't talk about your work, only about the part of the results that they actually care about. This way they don't consider you unproductive or incompetent, but realise that they need you. They still hate you, but they hate you because they need you. And that you can use to actually solve problems, take the time you need, and thereby provide the results they really need.


> but in business as well

I think business is more ruthless. It won't let you waste too much time in a false belief that your product or science will become successful. That leads to faster iteration.


Science isn't meant to be done in order to become "successful" short-term. As for businesses, I'd argue that iterating too fast "because markets" can lead to your product doing nothing useful but exploiting market microstructure to make money - i.e. you start to build throwaway shit that wastes people's time, but they figure it out only after they pay you.


In the pharmaceutical business at least, huge sums of money are frequently spent on strategies that even graduate students could point out as being unlikely to succeed. The reasoning behind these unproductive decisions is almost always "markets".


Can you elaborate on how this works? Do you mean there is (in effect) some kind of market for vapourware, so companies work on things that sound good even though they predictably can't work? Who benefits?


The people who benefit are those who can raise funds/support for the next round.

It is more difficult to obtain objective evidence about biological sciences and its subset, medicine, than other fields such as maths or chemistry. This is due to the inherent complexity and poorly understood processes by which living organisms thrive, or not.

So, there is abundant interest from non-experts (i.e. potential investors or market analysts) in gaining a slice of attractive markets, such as conditions with currently unmet needs, e.g. lung cancer or Alzheimer's disease. Such people are sometimes more easily swayed by the proclamations of "famous professors" than graduate students might be.

So, yes, there is in effect a lot of "vapourware" in the drug business. And much of it succeeds precisely because there is insufficient expertise on the part of investors, who get their knowledge of the underlying science third hand, or worse.


The OP topic is complaining that academia is too reluctant to fund things that are unlikely to succeed.


Maybe, yeah. I haven't worked in science yet, so I have zero experience with that. In any case the feeling may be the same though: if "super productive" business is faster than "super productive" science, then old-school business was probably also faster than old-school science.


> People want results, not to understand their problems.

Software managers, more commonly known as PMs.


Peter Higgs published about 5 scientific papers between his Nobel-winning work in 1964 and his retirement in 1996, none of which were particularly impressive. I think this is below any reasonable standard, not just below contemporary academic standards. Therefore, barring special circumstances like an exemplary teaching record, in my opinion Edinburgh University would have been right to sack him and replace him with a more productive person. In short: I don't think that Higgs nearly getting sacked is an accurate indication that academia has too much of a 'publish or perish' culture.


This is a truly idiotic thing to say.

"none of which were particularly impressive"

First, who are you to judge the quality of his work? That guy has a Nobel. Imagine if they did follow your utterly idiotic suggestion and did kick him out: other universities that recognized the importance of his work would instantly hire him. Years later, when he actually won the prize, Edinburgh University would look crazy for having kicked out a Nobel-prize-winning physicist.

So no, Peter Higgs is a genius; he knew the importance of what he had achieved and took a leisurely path, and there's nothing wrong with that. Edinburgh University knew the importance of his work and correctly decided that keeping him was a great investment.

The fact that you think that a researcher who won a Nobel prize somehow did not work "hard enough" in later years and should have been fired is a great indicator of dysfunctional academic culture.


Beyond his Higgs boson papers, the rest of his work hasn't been cited particularly much. Not indicative of a genius. Sure, no one is calling him stupid, but there are potentially many other people who have changed the field as much as, or more than, him. Just because he won a Nobel Prize for one work doesn't make him immune to criticism.


There is valid criticism, and then there's "this guy did not do anything after making a Nobel worthy discovery, should have been kicked out".

Also "haven't been cited particularly much" is utterly bullshit. Unless the goal is to optimize for mediocrity (3 papers each year with 20-50 citations each) being better than one break-through Nobel worthy work. Frankly citations are very very easy to game if you are a professor with reasonable means at a good university, and are a really really bad indicator of success.

Since he already had a very successful paper, maybe he wanted to write risky papers. In any case, that's not worse than other researchers who write cookie-cutter papers adding extra terms to equations that are guaranteed to be cited by the next guy adding even more terms. By finding these minor faults with a Nobel-award-winning researcher, you are displaying the same dysfunctional thinking that has plagued academia.


I'm not sure why you have this idea that academia is resistant to paradigm shifts; they happen all the time. Any papers that are "risky" enough to start such paradigm shifts end up getting cited tons.


It's not me but rather you who has the wrong idea that all papers with 50 citations are good, or rather that more citations == better research.

Frankly, almost 95% of papers are crap and would have been better off not being written; they exist only because of the publish-or-perish culture, or the "let's count papers/cites to shame a Nobel-award-winning researcher" culture that you espouse.

Citations are a self-reinforcing metric. Once a community starts counting them, the only way to succeed is to publish more, which in turn leads to higher counts.

There is nothing wrong with publishing one good, thorough paper over four years, perhaps slowly updating it as a working paper, as is done in economics.


I am not saying that all papers with citations are good; rather, that most good papers get tons of citations.


I wish papers were "running", as in a wiki page with developments being incrementally added. It'd certainly help reduce the amount of redundant reading, and provide much greater coherency.


Except it wasn't a great investment. Unless he was doing something great besides research.

If you put in amazing, groundbreaking work in the first years of a startup and then started slacking off, how long before the company that you helped build has a right to kick you to the curb? A year? Five? More? How about 30?

Five papers, even good ones, is pitiful for 30 years. Sure, it's possible that they were the culmination of long, brilliant research projects. If that's the case, then great. If it wasn't, then the university has every right to ask what's up, which they did, and they decided to keep him on anyway.


Hahhahah, if you think having a Nobel-award-winning professor is not a great investment, you frankly don't understand how the world works. There are universities in some parts of the world that would gladly pay more than Edinburgh U. could, just to have him listed on their website.

Actually, I have met people who did put in hard work in the initial years of a unicorn startup. Guess what: the value of their equity greatly exceeds their salary. And even if the startup were to kick them to the curb, they would live happily on their enormous earnings. Doing great research is, to an extent, similar.


Let's stick to the facts. The university employs a Nobel-award-winning scientist. He spends the next 30 years doing very little. The university says "hey, that's not cool; we thought you were going to keep doing research." Everyone says they're being too demanding.

I didn't say Nobel Award winners AREN'T a good investment. I said this one WASN'T. The University clearly thought he was going to produce. He didn't, and they were unhappy... with the return on their investment in him.

Right. The equity of early employees is high. My example wasn't about compensation, clearly. It was how long you continue to keep an underperforming employee in a position just because they did good work in the past.


>> I said this one WASN'T.

HAHAHAHHAHHAHAHAHAHHAHAHH, ROFL

Come on dude, I am sorry, but you are wrong. Not just wrong: it seems you fundamentally misunderstand how the world works.

See, Peter Higgs is a genius. The moment he published that paper, he knew that he could essentially do nothing, and the university would never "risk" losing him and wasting the potential payoff, or, even worse, being ridiculed for firing a potential Nobel laureate. Also, as Wikipedia shows, during all those years his research on bosons kept winning him awards.

The university got far, far more than they could have expected when he eventually won the Nobel prize. Keeping him employed was equivalent to holding an option with an enormous expected pay-off at a small yearly recurring cost.

>> My example wasn't about compensation, clearly. It was how long you continue to keep an underperforming employee in a position just because they did good work in the past.

Except that in academia, having made a seminal, Nobel-worthy discovery is equivalent to holding large equity in that field. And from the point of view of the university, losing such a person is equivalent to losing its stake in the delayed recognition of that work.


What am I wrong about? Seriously, I'm not sure what you think I'm arguing for. I'm saying that a university with an employee who does nothing for decades has the right to be a little pissed about that.

I'm not saying that having a Nobel Laureate on staff isn't worth something. I'm not saying that they should have fired him. I'm simply saying that I can understand the view in the earlier post where it was said that Higgs isn't a great example of why today's publishing climate is a bad thing.

The original article was about a Nobel Laureate who "wouldn't be productive enough for today's publishing climate". The implication is that this means that there is a problem with today's publishing climate. There may be, but Higgs isn't a good example of why. He was arguably not productive enough even for his earlier lower pressure time. He just put out one very brilliant piece of work that made up for it.

But that is in no way an indictment of the current academic focus on publications. There are other reasons and examples of why the current climate is a problem, but Higgs isn't one. The Higgs lesson in this context is "get a Nobel and you can do whatever you want". If you don't have a Nobel you're going to have to consistently produce research, and while the pressure to do so wasn't as high in the 70s and 80s, one paper every 3-5 years is pretty awful in an environment where publishing research is the goal.


You seem to have a problem with the concept of tenure. Why not just come out and say professors don't deserve tenure and they must keep pushing the rock up the hill grinding out paper after paper of incremental drivel?


I just tried to clarify my position in a post close to this one. Let me add this about tenure. Tenure is meant to protect researchers so they can do the research that they want and find most interesting. The idea is to insulate them from the vagaries of others' opinions, as it is believed that this type of freedom is good for research. What it is NOT designed to do, and never was, is put tenured professors in a position where they can simply not do research. While pressuring researchers to put out a half-dozen mediocre papers a year is probably not a good idea, as it lowers the quality of the field and turns science into a commodity, "less" is not necessarily the solution. In this case the amount that was being produced clearly indicated that research just wasn't getting done. If a researcher's job is to do research, then expecting a certain amount of time and effort spent doing that is not unreasonable.

For that reason Higgs just isn't a good example of why the system is broken. The university was upset about his lack of productivity in the 70s, long before the current publishing environment became an issue.


> This is a truly idiotic thing to say.

We ask that you please leave these out of comments on HN. They're not OK and luckily not necessary.


Sure, but it won't let me edit to remove that line now.


I would go even further. Look at the history of art - most artists were just alcoholic bums, such as Verlaine. We would be better off as a society if we made those people do something more productive than writing poetry; for example, they could be soldiers.


I'm definitely not in the humanities camp, but I don't see how someone who produced work that presumably thousands of people found value in would be less beneficial to society than yet another person to die at war.


If you are being sarcastic, it isn't working. Did Verlaine have a lifetime paid appointment to not create any poetry? While other artists were soldiers because they needed income?


The point is that it's about the time horizon. The parent would decide, in the early days of Higgs's career, that he wasn't productive enough, and would thus prevent him from making the discovery for which he is famous. Likewise, in Verlaine's time, it would have seemed more productive to just send him to war.

But over a longer time horizon, nobody really cares that people in the 1950s had to support Peter Higgs's living, or that the army went without Verlaine. We care about their results differently now.

I wish timescale were the one thing that people who use the words "productivity" or "efficiency" would understand. There is no universal optimum; it depends on the time horizon you're looking at.

And while I am at it, let me make another comment on the article. I think that today we are so obsessed with the efficiency of other people's work because there simply aren't enough jobs for everybody (due to automation). Since having a job (in a general sense) is customarily a requirement for being fed, most humans optimize not for actual productivity but for the appearance of productivity. All these attempts to measure productivity are just a symptom of this problem - we desperately need something with which we can bang the people in control of resources over the head, so that we can eat.

In other words, most jobs are changing from doing actual work into proving to other people in society that you did some, ever diminishing, amount of work. It's a shift in the focus of the competition, and unless we collectively realize that we can just lie back and don't need to actually compete, it won't get any better.


> The parent would decide, in the early days of Higgs's career, that he wasn't productive enough, and would thus prevent him from making the discovery for which he is famous.

You are stretching the meaning of my comment. Did you know that he published three papers in the three years before getting hired in Edinburgh in 1960? I only pointed out that (on any 'time horizon') his scientific productivity after 1964 was basically non-existent.

In fact I agree on some level with a lot of the comments here, including yours. I just think that Peter Higgs is not the right example to justify the cause.

> and unless we collectively realize that we can just lay back and don't need to actually compete, it won't get any better.

And therefore, sadly, it won't.


Ah, OK. But I am not sure what cause we are justifying here - tenure? I guess the idea of tenure is predicated on beating somebody up enough during graduate and postgraduate studies that only people who really want to work in the field will remain, and they will then continue working on their own.

It seems to me that there is a trade-off. We can look at Peter Higgs today and say, whoa, what a failure he was after 1964. But could that have been said in 1974? I am not sure. What if in 1975 he had come up with another breakthrough?

So the trade-off is in the timescale on which we judge the scientist's output. If you shorten the timescale, you decrease your accepted risk, but you can miss some rare wins (and I think that's where the Higgs example shines, because it is an example of such a rare event). If you make the timescale longer, you accept a greater risk of people turning out badly. The idea of tenure advocates the maximal practical timescale of such trust - one human lifespan - because with tenure the wins are arguably worth more than the accrued losses.


To balance your view, I think js8's sarcastic point worked very well. There is much more to a person's achievement than 'raw productivity' measured by some arbitrary scale.


But the scale isn't arbitrary; even mathematicians, who are notorious for low output, publish more than one paper every two years.


I think your grasp of art history is a little off, but I'm also assuming that was sarcasm for the sake of it.

Verlaine? Not exactly a good example.


Yeah, I know, I am sure somebody can find a better example.


I think you highlight an important issue. I had the impression from the interview that he talks about going years without publishing anything with a considerable amount of braggadocio. The question is, what did he do all those years? If he pursued ambitious and risky avenues of research that didn't pan out, then he has every right to brag, as that's what tenured academics are supposed to do in theory (though he would still do everyone a favor by writing up some of that stuff). If, OTOH, he just decided to coast after doing some excellent work, then I am much less sympathetic.

Contemporary academia may be obsessed with publishing, but there is some middle ground between that and publishing a paper every 6 years. A certain webcomic comes to mind: http://www.smbc-comics.com/?id=2495


That is just anecdotal evidence. You need many more samples to be convincing.


We expect senior professors to train students. How do you train students? By getting them to do work and write papers. If they haven't published by graduation time, they can't prove they can get the job done. This is the main reason that senior people produce a lot of papers - they have a lot of students.

I've written a bunch of papers - I don't care any more about quantity. But I want each of my graduating students to have a publication or two so they can get a job. So we turn out a bunch of papers every year, and no, not every one is earth-shaking, partly because roughly every second paper is a student's first one and that's what they were capable of at the time.

It's easy to miss this kind of dynamic from the outside.


This is a great point. Although I would wager that the large quantity of mediocre papers your students have produced has not exactly harmed your own career. So, while your personal motives may be beyond reproach, there will be many academics who think "this is the main reason senior people have a lot of students - they produce a lot of papers." Which is one of the dynamics that is troublesome. The marginal value of another PhD student to a senior professor is almost always positive, which means senior professors have an incentive to maximize the number of PhD students. This has two problems:

1) The value of the senior professor's mentorship to each PhD student diminishes (maybe almost linearly) with each PhD student added.

2) If each senior professor trains on average X PhD students over a career, only about 1/X of those students can themselves become senior professors, assuming academia's growth per generation is negligible compared to growing X times larger. (And right now X must be at least 20, so at most around 5% of PhD students can expect such a position.)


Don't forget that a prof has to raise the money, in ferocious competition, to fund every student. Only successful profs can get enough money for a lot of students.


Success defined how? By publication record. Which leads to the rich get richer dynamic I think is troublesome. The more students you have, the easier it is to get funding for more students.


Rich-get-richer sounds unfair, but if you controlled studio time, would you give it to Radiohead or Vanilla Ice?

Some people have proven themselves to be really good at things, and it makes sense to give them the resources to do it well.

Most countries also have special funding contests only for young researchers to bootstrap themselves into the main contests.


Not if they make the students teach.


Jean-Pierre Sauvage (Chemistry Nobel prize 2016) said exactly the same thing recently. Working without pressure for 30 years in a state-funded facility was central to his achievements.


Also brings to mind John Sulston (Physiology and Medicine 2002), working for many years in a relatively low-key "staff scientist" job before emerging to lead some of the early genome projects. Quotes from [1]:

>>> JG: You never entered the professorial track, but remained a staff scientist at the LMB. Did that suit your temperament?

>>> JS: Absolutely. It was a very good fit for me. If there was something that I thought was important and worth doing, I could just focus on that, and the only pressure was from family life. It meant you had time to sort all this out and not feel the pressure that you had to cut corners or guess.

[1] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1756915/


Max Perutz, as quoted in his obituary:

Impishly, whenever he was asked whether there are simple guidelines along which to organise research so that it will be highly creative, he would say: no politics, no committees, no reports, no referees, no interviews; just gifted, highly motivated people picked by a few men of good judgment.

http://www.theguardian.com/news/2002/feb/07/guardianobituari...

Quoted by Geoffrey West, of the Santa Fe Institute. The presentation below describes the SFI's own approach to interdisciplinary research:

http://fixyt.com/watch?v=w-8sbSPf4ko


Reminds me of the real history of Silicon Valley:

https://www.youtube.com/watch?v=ZTC_RxWN_xo

TL;DW: The seeds were sown by unrestrained government spending of huge amounts of non-existent money during WWII.


I don't think you understand how money works.


How can he know whether that's actually true? Everyone is troubled by the current state of science. How can we know that this isn't just the standard old man's wish for the good old days?


This is a really good question.

As an old man in science who feels strongly that it was much better in the past (less admin, less evaluation BS, less pressure for funding etc), I do worry sometimes that the past wasn't better, and the undivided attention I could give to research as a PhD/postdoc was simply a tranquil niche my supervisor had carved out for his students (as I do for mine today), but that he faced similar pressures.


I don't believe that's true of Higgs, since he said it of a period in which he was still 'in the system':

    > Higgs said he became "an embarrassment to the department
    > when they did research assessment exercises". A message
    > would go around the department saying: "Please give a
    > list of your recent publications." Higgs said: "I would
    > send back a statement: 'None.' "
    >
    > By the time he retired in 1996, he was uncomfortable
    > with the new academic culture.


He retired in 1996, but judging from his CV it looks like he stopped doing research in the 1960s or 70s. [1] His last research paper was published in 1979, and the paper before that was published in 1966. Perhaps he was an excellent teacher, but it doesn't seem like he needed to be in a research university.

[1] http://www.ph.ed.ac.uk/higgs/peter-higgs


I read yours as being exactly the view he disliked: that his lack of published research papers did not correspond to any lack of actually doing research.


I don't know if he was doing research or not. (And I'm not trying to judge him either way, since I know almost nothing about him.) But when he was doing his Nobel-winning work, he was publishing on average about one paper per year. Then he published one paper in 13 years, and then zero papers in 17 years (and never again). He could have been doing research but unsuccessfully.

He didn't lose his job. (One might argue that he should have, but I won't.) He just said that he would have a hard time finding a different job. Isn't that obvious? If you are looking for a new job at a research university with zero publications for 17 years, you can hardly expect departments to be dying to hire you.

I think most people in that situation are not doing any research. Most researchers with ambitious research projects still manage to publish occasionally, not just for "the system" but for themselves. You want to know that you are making at least a little progress on something. Andrew Wiles, for example, published every year or two throughout his work, with the exception of one four-year gap. [1] No 17-year gap.

[1] http://web.math.princeton.edu/WebCV/WilesBIB.pdf


This may be a symptom of the hyper-connective world we live in today. You observe this phenomenon elsewhere outside of academia as well. There is an intense pressure to distinguish oneself from the greater body of people occupying the same industry as yourself.

Consider that a vibrant Github profile is essentially a prerequisite to be considered as an engaged professional in software engineering these days. It doesn't matter if you have ethical qualms about Github, or spend most of your day doing... your day job.

A 22-year old shouldn't have to fret about not having enough public repositories or not having contributed enough to open-source. I'd expect any craftsman to start producing their best work well into their career, and not at the start of it.


Today's business requirement that we look productive and emphasize (re)action over observation, thought and planning is the major reason why I often consider leaving the work force entirely (at almost 40).

Although it's not often reflected upon these days, innovative, useful software was actually being created before, say, 2005. And, that software wasn't being built at 48-hour hackathons. And, it wasn't being bolted on to an MVP. It was acceptable then (and usually required) that a knowledgeable and skilled team spend a lot of time contemplating requirements and design.

I'm proud that I have created important, useful software over the years. Some of it was in use for over a decade. Some of it is still in use today. But I often joke with friends that I never would have been hired today, and I never would have been accepted into the same university program.


When I was growing up it seemed to be common knowledge that substantial pieces of work take time, and therefore it isn't likely for one to produce many substantial pieces of work during a lifetime.

It remains to be seen whether this unmanageable pace that we as an industry have adopted will make us crash and burn hard enough to not repeat our mistakes, but as a species we seem doomed to repeat ourselves, so I am slightly skeptical.

I'm glad to hear that you have works of pride.


Ethical qualms about Github? Could you explain? Should I worry about hosting my code there?


Beyond uptime concerns (which GitHub manages very well), there is the fact that it is closed source, and whenever you use a closed system that provides very good services for free, you yourself are the commodity.

Beyond that, dozzie's remark is very much on point: the host could become tainted in the same way as, for instance, SourceForge. Fortunately, the distributed nature of Git means that you are unlikely to "lose" your source code.

But really you have no control over your code, it is their liberty to censor it if they please.

The primary issue, as far as I am concerned, is that while I can clone and go elsewhere, GitHub is the centralized hub for (FL)OSS today. Everything lives there. If you are not on there, you are de facto invisible.

A better alternative, as far as I am concerned, would be a distributed net of "self-hosted" GitLab instances.

I am not comfortable with all the bad press that Github has suffered regarding gender equality, but I feel that I am suffering from vendor lock-in due to the success of the platform.


If GitHub censors your code, or even if they don't, you can publish it elsewhere.


Absolutely, but that is part of the ethical qualm. You are perpetuating the use of Github as the dominant (code) social network by using it.

If you continue to use the service after Github has performed an act of censorship (or anything else that violates your values), that act goes unpunished: they keep their market share and may repeat it. After all, there weren't enough negative consequences to deter them from doing so in the future.


Always. It's a third-party place that can sink at any time (if you don't believe this, ask why GitHub would fare any differently than SourceForge or Google Code did).

Your code should be hosted under your very own domain first (the so-called "canonical repository", or as I call it, the "source of truth"), and GitHub or BitBucket should be a replica. Thanks to git's distributed nature, this replica can be two-way, but it is still a replica.
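For anyone wanting to try that arrangement, here is a minimal sketch; the remote names and the example.org URL are placeholders of my own, not anything the parent prescribes:

    # canonical repository on a server you control (the "source of truth")
    git remote add origin git@git.example.org:me/project.git
    # GitHub (or BitBucket) as a secondary remote acting as the replica
    git remote add github git@github.com:me/project.git

    # push to the canonical repository first, then refresh the replica
    git push origin master
    git push github master

On the canonical (bare) server, a post-receive hook that runs `git push --mirror github` can keep the replica in sync without any manual step; you can still pull from either side, which is the "two-way" part.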


Google Code shut down, and everything of interest moved. There was no problem.


Despite the altruistic acts of certain people, not every project was migrated. Those that weren't migrated are now lost and a part of our digital heritage died. I consider that a problem.



Changing the de facto social network is so hard though - no matter the bad press on Github's awful diversity policy ("not for whites to lead"; "white women are the problem") or recent research take-down (Gitlab followed suit, then retracted and apologised) or even that it's closed-source and Gitlab is (can be) open.

If they both started from nothing today, there's no doubt in my mind that we'd all quickly settle on Gitlab, but effecting that shift when Github is such an established default is so much harder.


> If they both started from nothing today, there's no doubt in my mind that we'd all quickly settle on Gitlab, but effecting that shift when Github is such an established default is so much harder.

Absolutely. Just yesterday I wanted to make a small change to a popular open source package, about 10 lines of C. Forked on Github, then sent a Pull Request. Because that's where the project lives, that's where everyone goes. And Github has no interest in federating PRs in a way that would let them be sent from Gitlab.


To be fair, Gitlab doesn't really have an interest in federating PRs either. But "we" can make that a public utility that exists.


Aye, the https://en.wikipedia.org/wiki/Network_effect is a powerful force in dictating human behaviour.


The core problem is grant procedures. Universities get a 20%+ cut of the grants a professor receives, so they want professors who receive lots of grants. Grant awarders are judged on their ability to choose applicants who create value with those funds. An easy metric is papers produced, and having published lots of papers in the past is a good predictor of future production. So those are the applicants who get funded.

Another problem is the lack of funds for professor positions relative to the number of trained applicants. With so many qualified academics, departments have to use some metric on which to base decisions, and publications (and grant $$$) are more measurable than scientific value.


20+% is optimistic. Overhead rates (at R1s in the US) are closer to 50-70%. I've heard tell of places with extremely specialized facilities (like deep-sea research vessels) that charge nearly 100% overhead (i.e., to spend $50,000 on "your" research, you need to bring in at least $100,000).


Yes. But there is variability from institution to institution, so I was trying to be conservative with the lower threshold of the percentage cut.

The university cut does make some sense. There are a lot of services on which PIs rely: administration to handle hiring processes, building maintenance and plumbing. But as per usual, instead of just covering costs, it is viewed as revenue on which to grow the university (and admin pay).

Actually, I am not sure whether the university cut factors into funding decisions at the NIH and the like.


Sure, I just wanted to put the actual numbers out there because 20% or so actually seems pretty fair for keeping the lights on, taking out the trash, etc., while 70+% is... a lot. On top of this, a lot of the... infrastructure is still fee-for-service. The university may fund the initial purchase of an MRI scanner (or whatever) out of overhead, but individual labs still pay $400+ an hour to use it.

I don't know if the NIH or NSF takes overhead into account, but many of the most successful grantees also have some of the highest overhead so probably not.

On the flip side, I've heard someone claim that their department chair strongly encouraged applying for NIH grants, because the NIH-negotiated overhead rate is much higher than other funders' (also, allegedly more prestigious, but you'd think $500,000 spends about the same, regardless of where it comes from).


He is probably right. The current academic environment is terrible and drives talented young researchers out of basic research. But still, it delivers. The question for me is: does it deliver more, less, or the same amount of knowledge as it did in Higgs's time?


I'd say it produces enough. At the cost of the health of the researchers. Many people (including me) leave not because they do not like research, but because they have grown to hate the environment.


I can relate to that. I wouldn't go as far as hate, but I'm definitely disappointed. As a PhD in CS I've seen my share of pressure, and I agree completely that it's not so much a way to do meaningful research as it is mass production of small incremental steps. It's debatable whether big breakthroughs can happen this way.


I totally agree with the 'mass production' observation. I have just submitted my thesis; on day one I was told, almost verbatim, "do not expect to contribute anything of note". It feels like they do not even want you to try.


Big breakthroughs too are really just a collection of small incremental steps -- but the observer isn't aware of the intermediate steps.


Well, this isn't always the case. Most of the time they are not really hidden; they are just a huge number of new results, known to everyone but not yet connected to each other. Then someone comes along and finds the thing that ties them all together. Think about special relativity, and about what will probably happen when we finally solve the general relativity/quantum theory riddle.


Same here.


> At the cost of the health of the researchers.

Absolutely. We burn through our scientists (and those who want to become ones) at an incredible rate.


'The question for me is, does it deliver more, less or the same amount of knowledge than during Higgs's time?'

I don't think that is the right question. Say it produces more knowledge, but the fields that are studied are selected, not by how intrinsically interesting they are, but instead by how much low-hanging fruit they have. I don't think that would necessarily be an improvement.


There are more people in the world, so probably more schools, which means more teachers. So, there is probably more research volume.


To test that theory, consider something unrelated like fine arts. Surely a ten-million-person city has more people with the job title "artist" than a million-person city, probably ten times as many. For the sake of argument, say that when the western world measured its population in many tens of millions we got a legendary-level playwright every millennium. Now that world culture has a hundred times the participants, we should see a legendary-level playwright appear every decade or so. Just not seeing it. We have good playwrights, sure, but not legendary level every decade.

The upper end of mental scales, IQ or whatever, does not seem to follow the usual statistics you would apply to QA/QC of machine-screw production or some other simple industrial process. On a purely numerical basis, the worldwide English literature academic community should have maybe ten Shakespeare-level professors active right now. If we do have those people, they don't have the job title "English literature professor". So where are they?

Part of the problem is "shoulders of giants". Perhaps I personally could have invented the calculus, but Newton beat me to it, so I'll never amount to much there. Under that theory, unless we lose information as a culture, academia will eventually become perfectly sterile and actual advance will cease, although we can trust the numerical metrics to keep increasing in perpetuity, as they always do unless confronted with reality, which in the case of this metric is impossible.


> So where are they?

There are two questions here, I think: are there a proportionately greater number of people in the world today with Shakespeare-level talent for language; and do those who exist now have the same chance of being recognized and lionized as Shakespeare did?

I don't know how we could answer the first question, but I assume it's roughly correct, just on faith in the statistical likelihood of the normal distribution of IQ/talent holding true. The second question maybe gives us some way to account for why we don't seem to see them, though. For one thing, people with that set of talents have a much wider set of fields to go into, now: in Shakespeare's time, people with literary talent could write poetry, or plays (or, as he did, both), and not much else. Today, both of those are sub-divided into lots of sub-genres, and prose fiction exists (and is also highly sub-divided), and prose non-fiction, and song-writing, and journalism, and writing for tv and movies, etc. We wouldn't expect to see every current Shakespeare working solely as a playwright: we should be looking across all the fields that need literary talent, if we want to accurately count today's Shakespeares.

And we need to consider whether a Shakespeare-level talent today has any reasonable chance of getting Shakespeare's level of recognition and admiration. There is so much more content today that any one person's output can't get nearly the attention and focus that the smaller world of Elizabethan England could give to Shakespeare. Achieving 'legendary' status probably depends less on absolute talent level than on how much one stands out from one's competitors - that's much harder to do in a larger, more sophisticated world. You would probably have to be significantly better than Shakespeare today to stand out as much as he did in his time.

Going back to the sciences, the situation is a little different because it probably is the case, as you suggested, that the field is finite and there will probably come a time when there is nothing significant for a new Newton to discover, and thereby be recognizable as a Newton-level mind. That's less the case for the arts - the working space there is much larger, possibly infinite. But the increasing sophistication and the growing number of fields to work in is the same in science as in literature, and so is the consequent dilution of attention and recognition. I'm not sure that conditions now allow us to reliably recognize our geniuses.


That "shoulders of giants" problem sounds awfully like the thinking behind the Unibomber, no?

On why we don't have more Shakespeares alive today - what if we do? Would we revere Shakespeare as much as we do now if there were 9 others writing at his level at the same time? In any case, I'm not so sure that Shakespeare would have the job title "English literature professor" either. Maybe they're TV and film script writers that we don't hear about?


I think it's producing a lot of technology; knowledge, I'm not so sure.


I ditched my academic career and went into public service. I expected to do actual science; instead I ended up studying bad papers that cost huge piles of money. I feel sorry for the naive and the burnt-out young academics. Potential wasted all the way.


A strong will is needed to do research in the dark. You are going to sacrifice a lot, and as an outsider you will almost always be looked down upon, very often rightly so (because you miss the day-to-day peer review that, at the very least, keeps you from methodological or execution mistakes).


It was cruel and unusual punishment by whoever recommended that a scientist who doesn't watch television should watch that awful, scientist-insulting sitcom.


The problem is that universities and journals are ranked.

Rankings are based, in part, on metrics related to publications and citations and that's the utility function they try to optimize.

https://en.wikipedia.org/wiki/Publish_or_perish


"He has never been tempted to buy a television, but was persuaded to watch The Big Bang Theory last year, and said he wasn't impressed."

... [choppy panted laughter]


"...to this day he owns neither a TV nor mobile phone, and only acquired his first computer on his 80th birthday."

https://www.theguardian.com/science/2013/dec/06/peter-higgs-...

Perhaps he used his computer to watch it.


Eventually, the biggest consequence of a focus on incremental advancements may be that there will be no more 'Einsteins' or Higgses in a role-model sense.


A sign of the higher frequency / lower amplitude of today's era?



