Now that I've gone back to do my PhD, the only reason I can do something I consider meaningful is that I am not a regular PhD student. Interestingly, that's also the feedback I get, though as "helpful" advice: while what I am doing may be both good and important, it is unlikely to lead to success in academia, with the implication that I should stop doing it and concentrate on something more reasonable. Fortunately, I am not particularly interested in success in modern academia, so I get to do something I consider both good and important.
A related issue is that there really is no such thing as a senior researcher. Instead, professors are turned into research managers, responsible for advancing their charges' careers, and those charges then turn into research managers in their turn. Actual research appears to be mostly a still-not-entirely-avoidable side effect. (And this seems similar to the way the only real path to advancement in industry is a switch to management, all dual-track equivalence rhetoric aside.)
Interesting: your experience matches mine very closely! I once quit a doctoral program in disgust, but when I had enough money to fund myself, I could go back and become, perhaps like you, a "not regular PhD student" and create a more suitable situation for myself. Not only did I become free in my choice of research topic and collaborators/supervisors; more importantly, I was able to reject proposed "faster" topics that would have led more immediately to bite-sized, publishable results. As a result, I was able to find and work on something I believe could be fundamental in my field.
On a side note, I just started re-reading Kuhn's Structure of Scientific Revolutions. I am growing more and more convinced that the right way to do something important in today's academic environment (that's the link to OP) is the sort of half-in, half-out position that seems to be our shared experience. If you are too much of an insider, you might end up doing "normal science". If you are too much of an outsider, the risk is high that you are, or become, just another crank. Half-in, half-out seems like a good position. Maybe a few decades ago it was still possible to hold an academic position and maintain one's independence. Today, a full-fledged, conventional academic career seems hard to square with a free spirit, what with all the dross of admin and the drag of teaching duties.
Note that Kuhn doesn't think that normal science is bad in general. Normal science - incremental, systematic research within a scientific community with clear, well-defined rules and solution methods - is necessary and worthwhile, IF you have a good paradigm. When Scott Aaronson publishes a new proof about some obscure or exotic quantum computational complexity class, that is valuable normal science. When the Intel researchers in the 80s or 90s came up with the umpteenth optimization trick that enabled next year's scheduled Moore's Law microprocessor speed doubling, that is valuable normal science.
The problem comes when researchers try to do normal science without a good paradigm, either because they never had one to begin with, or because their old paradigm broke down. For example, it looks like the statistics-based paradigm that guided research in psychology is breaking down because of the replication crisis. Kuhn's advice to people in this situation would be to re-examine the philosophical basis of the field, NOT to continue doing business as usual.
Yeah, I did not write that well; I was thinking too much about myself. From a population point of view, I agree with you (or Kuhn) that "normal science" is a necessary and valuable part of the enterprise. It would be inefficient to have everybody constantly questioning everything. You need the "normals" to milk a given paradigm for all it's worth, and the rebels to keep the paradigm, if you will, "on its toes" and replace it when the time has come.
It seems to me commerce is not dissimilar. Not everybody can or should be a wide-eyed entrepreneur. You need people who compete within a business model (in the process wringing all the efficiencies to be had out of it) as well as entrepreneurs who create and thus compete between business models.
I wonder if there is an optimal proportion between entrepreneurial/rebellious "predators" and "normals".
I am not familiar with Kuhn, but the statements you echo resonate deeply with me. I hope that in some places academics do shift their paradigm (when necessary), but from what I've seen at uni the outlook is bleak. Researchers do X for years, build up their reputation by doing so, and at some point their ego is so big that they don't dare ask themselves whether they've wasted years doing something very stupid. Even worse, these people favor PhD candidates who are willing to follow in their footsteps.
This is really the way to go. You have to love your field so much that you're willing to save up enough to support yourself doing independent research in it. I've been thinking of doing that: saving up enough to support myself while working on a research-esque project (i.e., not a startup, and without necessarily any revenue-generating potential). Your love for your field (in my case, computer science) has to be so great that you're willing to forego the currently ridiculously high developer salaries for a number of years to try to advance the field as a whole (which ideally benefits society on the whole, but doesn't benefit you financially anywhere near as much).
I also have the sense that academics live in a sort of bubble, prioritizing concerns that are secondary, such as ego contests, impact factors, building CVs, etc. (to me at least, this bubble is reminiscent of another microcosmic bubble sector: the army). Of course there are true scientists among them, with intact childlike curiosity and passion, but they are increasingly hard to spot in overcrowded academia.
One needs to be a part-time scientist in order to have enough time to concentrate on a subject.
> prioritizing concerns ... impact factors
To a very good first approximation, academics are assessed by other academics.
> academics are assessed by
There are additional requirements that didn't exist in the past, like gender, ethnic, and national quotas.
There is Goodhart's law, and scientists develop ways of hacking this system, but there is no free lunch, and the effort that goes into successful bids for funding is now substantial.
I close with a link to a successful request for research funding from 1921:
This is a fundamental misunderstanding of the problem being discussed.
The problem is the pressure to publish frequently, which leads to low-quality science. Many Nobel laureates had at most two, often zero, papers in graduate school. Now a graduate student might be required to have seven or eight papers to get out. Some of this is a shift toward sharing more data, but a lot of it is just science of minimal value being packaged up and pushed out.
And the drive for this does come from generic metrics that rely not on scientific understanding but on the ability to count. The government sometimes prioritizes funding by papers per grant dollar, papers per year, etc.
I think government officials don't have a direct interest in supporting the current inefficient publishing model; it's a model that scientists themselves have accepted and reinforced since the 60s. If scientists put forward an alternative model that works better, I don't think governments or the wider public would object.
Social science has shown that a group of people can follow some rule even if all members disagree privately (the phenomenon is known as pluralistic ignorance). This happens because there is social status, peer pressure, and incomplete information, and so people tend to think that the others agree with the status quo.
I think you're ignoring this empirical observation because it seems so counter-intuitive. Scientists simply, believe it or not, don't have "complete control" over the situation.
I talk with several of my peers who are unhappy with the choices academia offers, and would be open to the thought of pursuing research without being a career academic in the conventional sense. The biggest hurdle that comes up is the feeling of being alone -- not knowing other people who are trying to hack the system and do things differently.
Since there seems to be some traction along the lines of independent research pursuits here, I'd like to pass out a quick survey I just created, for people currently trying to pursue academic research on their own path (or interested in doing so): https://goo.gl/forms/wLvkGPstpCCBJY513
It's very preliminary, but I thought it might be a good start to get some discussion going. I invite responses/comments, either here or on the survey form.
The whole situation really makes me want to ultimately do my research outside of the establishment, but I recognize that collaborations and publications are a key part of credible science. In order to do that I figure that I will need a platform for recruiting participants and conducting experiments for the kind of research that I would want to do. Ideally the platform would be compelling enough to bring in researchers in academia (or elsewhere). So I'm working on that platform. That's the programming project that brought me to HN.
I filled out your questionnaire. Not sure if my project is related to your frustrations, but it seems somewhat close. Either way I share your pain.
As far as publishing goes, I've got papers that I'm working on now, and I've got several collaborations going that extend beyond my current program to faculty at other universities. So I'm building a network of people who are interested in the kind of research I am doing and whom I could potentially facilitate.
The bottom line is that if I'm sitting on data or access to data, I'm going to have no shortage of academics who would like to get involved. They don't need to be superstars, just researchers who publish, and I'm doing pretty well in that department already. My immediate well-funded faculty supervisor's eyes light up when I talk to him about the kind of datasets I could potentially get for him. So there's potential funding on the academic side as well, at least on small to moderate scales.
Bottom line is that the online experimentation tools for large swaths of psychological research (questionnaires mainly, but also cognitive tasks) aren't well handled by the big current players in that market. We're just not a big enough value proposition; the tools aren't designed for us. We really need participants to be able to just log in to the experiment and be taken through the whole process. We need tools to track their participation, send reminder emails, etc., that are built into the experiment rather than cobbled together around it. And what most of us would like is for it to spit out a cohesive dataset at the end, ideally with at least some of the data-cleaning done (and documented) for us. A lot of research is cobbled together from a bunch of different tools that spit out separate chunks of data that need to be painstakingly re-constituted at the end. This may not be true for all academic research, but it is for the research I'm targeting. So there's a value proposition there, with a market of tens of thousands of grant-funded researchers.
In addition, I have access to several local sources of potential data collection (outside of academia) that are extremely appealing to an important subset of the researchers who might use my platform. I'm currently working with three such groups, but if things worked well there, there are hundreds if not thousands of such places in North America and Europe (and elsewhere). So I'd potentially be able to offer researchers not only a fairly comprehensive suite of tools for collecting data, but I'd have a growing network of recruitment sites so researchers could really just design it and click 'go' (and pay me).
This isn't all of it, just the research side. I'm pretty sure that the same set of tools can be used to go after a couple other non-academic markets as well. I've got at least 2 other non-academic revenue opportunities that I think are pretty solid and can be explored in tandem.
Then there are the long-term plans that change everything... (cartoonish evil laugh)
Deep learning is a big thing, yes? Hinton, Bengio, LeCun, etc. All that work originated in universities under completely conventional structures. Companies snatched it up once it was clear it would be the next big thing.
Self driving cars might have some impact? Thrun, Fox & Burgard were key players in probabilistic state estimation, wrote the Book, and trained the car people at $(BigCo). At universities.
Just to balance out some of the 'academia has no idea what it's doing' statements in this thread. It's not perfect but it's not ineffective.
Although this is a bit depressing and one of the reasons I left academia, is this not partly a reaction to the changing nature of science? With many fields well established with strong foundations there are arguably fewer revolutionary ideas required. It looks like science is going to be more evolutionary and less revolutionary, with emphasis shifting from the individual researcher to large teams.
Take for example the LHC where managing the huge teams and technology is (I imagine) significantly more of a challenge than interpreting the data or deciding what experiments to run. A bit of an extreme example perhaps but large projects are probably going to become more prevalent as most of the lower-hanging fruit is claimed.
This seems largely a byproduct of the funding model, which seems unlikely to change in academia.
> cardboard boxes for a living? Flipping burgers?
Or working as a patent clerk, maybe?
Yes, pretty much anything outside the academic process was what I meant, so that your livelihood does not depend on the outcome of your research, because then you have to tailor your research directions towards the safe and accepted. And so your straw men actually would work, though suboptimally, because the income is probably too low to allow you free time.
Also, misquoting me by leaving out the "really" makes your argument disingenuous, because the "really" made clear that the statement is not 100% but a trend (if a somewhat overwhelming one), and four exceptions (names?) don't exactly disprove a generalization.
It would also be interesting to know whether you see your four examples as holdouts from a different era or as general exceptions. I'd wager they're more the former than the latter, and that their number is dwindling.
Does it still work, though? In Einstein's day, I imagine you could slack off in such a position more easily than nowadays.
The problem these days is that we're inundated with busywork not only in academia, but in most occupations.
Even in general, at very large scale: a world of limited and shrinking economic opportunity but unlimited and expanding education (well, until the future student-loan crash arrives) means you can try to pile on busywork, but the people shamming and pencil-whipping it are better at shamming and pencil-whipping, and everything else, than any other cohort in the history of humanity...
Even if contemplation is enough, most "manual" jobs require you to use your brain. Not necessarily in an interesting way, but enough to distract you from contemplation. It can't all be moving heavy objects from one pile to another.
It's great to work with freelancers, though, since from the good ones you can learn how to handle that situation: don't give users, customers, or management enough information to really control you; understand that you are the one providing the results or not; and don't talk about your work, only about the part of the results that they actually care about. This way they don't consider you unproductive or incompetent, but someone they need. They still hate you, but they hate you because they need you. And that you can use to actually solve problems, take the time you need, and thereby provide the results they really need.
I think business is more ruthless. It won't let you waste too much time in a false belief that your product or science will become successful. That leads to faster iteration.
It is more difficult to obtain objective evidence in the biological sciences, and their subset medicine, than in other fields such as maths or chemistry. This is due to the inherent complexity and poorly understood processes by which living organisms thrive, or not.
So there is abundant interest from non-experts (i.e., potential investors or market analysts) in gaining a slice of attractive markets, such as conditions with currently unmet needs, e.g. lung cancer or Alzheimer's disease. Such people are sometimes more easily swayed by the proclamations of "famous professors" than graduate students might be.
So, yes, there is in effect a lot of "vapourware" in the drug business. And much of it succeeds precisely because there is insufficient expertise on the part of investors, who get their knowledge of the underlying science third-hand, or worse.
Software managers, more commonly known as PMs.
"none of which were particularly impressive"
First, who are you to judge the quality of his work? That guy has a Nobel. Imagine if they had followed your utterly idiotic suggestion and kicked him out: other universities that recognized the importance of his work would have instantly hired him. Years later, when he actually won the prize, Edinburgh University would have looked crazy for kicking out a Nobel-prize-winning physicist.
So no, Peter Higgs is a genius: he knew the importance of what he had achieved and took the leisurely path, and there is nothing wrong in that. Edinburgh University knew the importance of his work and correctly decided that keeping him was a great investment.
The fact that you think that a researcher who won a Nobel prize somehow did not work "hard enough" in later years and should have been fired is a great indicator of dysfunctional academic culture.
Also "haven't been cited particularly much" is utterly bullshit. Unless the goal is to optimize for mediocrity (3 papers each year with 20-50 citations each) being better than one break-through Nobel worthy work. Frankly citations are very very easy to game if you are a professor with reasonable means at a good university, and are a really really bad indicator of success.
Since he already had a very successful paper, maybe he wanted to write risky papers. In any case, that's no worse than other researchers who write cookie-cutter papers adding extra terms to equations that are guaranteed to be cited by the next guy adding even more terms. By finding these minor faults with a Nobel-prize-winning researcher, you are displaying the same dysfunctional thinking that has plagued academia.
Frankly, almost 95% of papers are crap and would have been better off not being written; they exist only because of publish-or-perish culture, or the "let's count papers and citations to shame a Nobel-prize-winning researcher" culture that you espouse.
Citations are a self-reinforcing metric. Once a community starts counting them, the only way to succeed is to publish more, which in turn leads to higher counts.
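A toy simulation of that feedback loop (my own sketch with invented parameters, not anything from this thread): if each new paper cites earlier papers in proportion to their existing citation counts (preferential attachment), citations concentrate on a small elite regardless of quality.

    import random

    # Preferential attachment: new papers cite old ones in proportion to
    # how often those are already cited. All parameters are invented.
    def simulate(n_papers=10_000, cites_per_paper=5, seed=0):
        rng = random.Random(seed)
        citations = [1] * 10  # seed papers; baseline weight 1 so anyone can be cited
        for _ in range(n_papers):
            refs = rng.choices(range(len(citations)), weights=citations,
                               k=cites_per_paper)
            for r in refs:
                citations[r] += 1
            citations.append(1)  # the new paper enters with baseline weight
        return sorted(citations, reverse=True)

    counts = simulate()
    top_share = sum(counts[:len(counts) // 100]) / sum(counts)
    print(f"Top 1% of papers hold {top_share:.0%} of the citation weight")

Run it and the top 1% of papers ends up holding a large share of all the citation weight, which is exactly the distribution a citation-counting community then rewards.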
There is nothing wrong with publishing one good, thorough paper over four years, perhaps slowly updating it as a working paper, as is done in economics.
If you put in amazing, groundbreaking work in the first years of a startup and then started slacking off, how long before the company that you helped build has the right to kick you to the curb? A year? Five? More? How about 30?
Five papers, even good ones, is pitiful for 30 years. Sure, it's possible that they were the culmination of long, brilliant research projects. If that's the case, then great. If it wasn't, then the University has every right to ask what's up, which they did, and they decided to keep him on anyway.
Actually, I have met people who did put in hard work in the initial years of a unicorn startup. Guess what: the value of their equity greatly exceeds their salary. And if the startup were to kick them to the curb, they would live happily on their enormous earnings. Doing great research is, to an extent, similar.
I didn't say Nobel Award winners AREN'T a good investment. I said this one WASN'T. The University clearly thought he was going to produce. He didn't, and they were unhappy... with the return on their investment in him.
Right. The equity of early employees is high. My example wasn't about compensation, clearly. It was about how long you continue to keep an underperforming employee in a position just because they did good work in the past.
Come on, dude, I am sorry, but you are wrong. Not just wrong: it seems you fundamentally misunderstand how the world works.
See, Peter Higgs is a genius. The moment he published that paper, he knew that he could essentially do nothing, and the University would never "risk" losing him and wasting the potential payoff, or, even worse, being ridiculed for firing a potential Nobel laureate. Also, as Wikipedia shows, during all those years his research on bosons kept winning him awards.
The University got far, far more than it could have expected when he eventually won the Nobel prize. Keeping him employed was equivalent to holding an option with an enormous expected payoff at a small recurring yearly cost.
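To make the option framing concrete, here is a back-of-the-envelope calculation; every figure below is invented for illustration, none comes from this thread:

    # Toy expected-value sketch of the "cheap option" argument.
    # All numbers are assumptions made up for illustration.
    p_recognition = 0.25     # assumed odds the 1964 work is eventually recognized
    payoff = 500_000_000     # assumed value to the university (prestige, funding)
    annual_cost = 150_000    # assumed salary + overhead of keeping him on staff
    years = 30               # how long the "option" is held

    expected_net = p_recognition * payoff - annual_cost * years
    print(f"Expected net value of keeping him: ${expected_net:,.0f}")

The carrying cost over 30 years is a few million, tiny next to any plausible payoff, so the option is worth holding even at modest odds of recognition.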
>> My example wasn't about compensation, clearly. It was how long you continue to keep an underperforming employee in a position just because they did good work in the past.
Except that in academia, having made a seminal discovery worthy of the Nobel prize is equivalent to holding large equity in that field. And from the point of view of the University, losing such a person is equivalent to losing its stake in the delayed recognition of that work.
I'm not saying that having a Nobel Laureate on staff isn't worth something. I'm not saying that they should have fired him. I'm simply saying that I can understand the view in the earlier post that Higgs isn't a great example of why today's publishing climate is a bad thing.
The original article was about a Nobel Laureate who "wouldn't be productive enough for today's publishing climate". The implication is that this means there is a problem with today's publishing climate. There may be, but Higgs isn't a good example of why. He was arguably not productive enough even for his earlier, lower-pressure era. He just put out one very brilliant piece of work that made up for it.
But that is in no way an indictment of the current academic focus on publications. There are other reasons and examples of why the current climate is a problem, but Higgs isn't one. The Higgs lesson in this context is "get a Nobel and you can do whatever you want". If you don't have a Nobel you're going to have to consistently produce research, and while the pressure to do so wasn't as high in the 70s and 80s, one paper every 3-5 years is pretty awful in an environment where publishing research is the goal.
For that reason Higgs just isn't a good example of why the system is broken. The university was upset about his lack of productivity in the 70s, long before the current publishing environment became an issue.
We ask that you please leave these out of comments on HN. They're not OK and luckily not necessary.
But from a longer time horizon, nobody really cares about those people in the 1950s who had to support Peter Higgs's living, or about the army not getting Verlaine. We care about their results differently now.
I wish that timescale were the one thing that people who use the words "productivity" or "efficiency" would understand. There is no universal optimum; it depends on the time horizon you're looking at.
And while I am at it, let me make another comment on the article. I think we are so obsessed today with the efficiency of other people's work because there simply aren't enough jobs for everybody (due to automation). Since having a job (in a general sense) is customarily a requirement for being fed, most humans optimize not for actual productivity but for an appearance of productivity. All these attempts to measure productivity are just a symptom of this problem: we desperately need something with which we can bang the people in control of resources over their heads, so that we can eat.
In other words, most jobs are changing from doing actual work into proving to other people in society that you did an ever-diminishing amount of work. It's a shift in the focus of the competition, and unless we collectively realize that we can just lay back and don't need to actually compete, it won't get any better.
You are stretching the meaning of my comment. Did you know that he published three papers in the three years before getting hired in Edinburgh in 1960? I only pointed out that (on any 'time horizon') his scientific productivity after 1964 was basically non-existent.
In fact I agree on some level with a lot of the comments here, including yours. I just think that Peter Higgs is not the right example to justify the cause.
> and unless we collectively realize that we can just lay back and don't need to actually compete, it won't get any better.
And therefore, sadly, it won't.
It seems to me that there is a tradeoff. We can look at Peter Higgs today and say: whoah, what a failure he was after 1964. But could that have been said in 1974? I am not sure. What if he had come up with another breakthrough in 1975?
So the trade-off is in the timescale on which we judge the scientist's output. If you shorten the timescale, you decrease your accepted risk, but you can miss some rare wins (and I think that's where the Higgs example shines, because it is an example of such a rare event). If you make the timescale longer, you accept a greater risk of people turning out badly. The idea of tenure advocates the maximal practical timescale of such trust, one human lifespan, because with tenure the wins are arguably worth more than the accrued losses.
Verlaine? Not exactly a good example.
Contemporary academia may be obsessed with publishing, but there is some middle ground between that and publishing a paper every 6 years. A certain webcomic comes to mind: http://www.smbc-comics.com/?id=2495
I've written a bunch of papers - I don't care any more about quantity. But I want each of my graduating students to have a publication or two so they can get a job. So we turn out a bunch of papers every year, and no, not every one is earth-shaking, partly because roughly every second paper is a student's first one and that's what they were capable of at the time.
It's easy to miss this kind of dynamic from the outside.
1) The value of the senior professor's mentorship to each PhD student diminishes (maybe almost linearly) with each PhD student added.
2) If each senior professor hires on average X PhD students per generation, approximately 1/X of those PhD students can themselves become senior professors, assuming academia's growth is negligible compared to growing X times larger per generation. (And right now, X must be at least 20.)
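A quick back-of-the-envelope check of that 1/X claim (my own sketch; the numbers below are assumptions, not from the comment above):

    # In steady state the number of professorships is roughly constant, so
    # each retiring professor opens about one position for the X students
    # they trained over a career. Both numbers below are assumptions.
    X = 20                        # PhD students trained per professor per generation
    growth_per_generation = 1.05  # academia grows 5% per generation

    openings_per_professor = 1 * growth_per_generation
    fraction = openings_per_professor / X
    print(f"Fraction of PhD students who can become professors: {fraction:.1%}")
    # -> roughly 5%, i.e. close to 1/X, unless academia itself grows
    #    by a factor of X every generation.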
Some people have proven themselves to be really good at things, and it makes sense to give them the resources to do it well.
Most countries also have special funding contests only for young researchers to bootstrap themselves into the main contests.
>>> JG: You never entered the professorial track, but remained a staff scientist at the LMB. Did that suit your temperament?
>>> JS: Absolutely. It was a very good fit for me. If there was something that I thought was important and worth doing, I could just focus on that, and the only pressure was from family life. It meant you had time to sort all this out and not feel the pressure that you had to cut corners or guess.
Impishly, whenever he was asked whether there are simple guidelines along which to organise research so that it will be highly creative, he would say: no politics, no committees, no reports, no referees, no interviews; just gifted, highly motivated people picked by a few men of good judgment.
Quoted by Geoffrey West, of the Santa Fe Institute. The presentation below describes the SFI's own approach to interdisciplinary research:
TL;DW: The seeds were planted by unrestrained spending of huge amounts of non-existent money by the government during WWII.
As an old man in science who feels strongly that it was much better in the past (less admin, less evaluation BS, less pressure for funding etc), I do worry sometimes that the past wasn't better, and the undivided attention I could give to research as a PhD/postdoc was simply a tranquil niche my supervisor had carved out for his students (as I do for mine today), but that he faced similar pressures.
> Higgs said he became "an embarrassment to the department
> when they did research assessment exercises". A message
> would go around the department saying: "Please give a
> list of your recent publications." Higgs said: "I would
> send back a statement: 'None.' "
> By the time he retired in 1996, he was uncomfortable
> with the new academic culture.
He didn't lose his job. (One might argue that he should have, but I won't.) He just said that he would have a hard time finding a different job. Isn't that obvious? If you are looking for a new job at a research university with zero publications for 17 years, you can hardly expect departments to be dying to hire you.
I think most people in that situation are not doing any research. Most researchers with ambitious research projects still manage to publish occasionally, not just for "the system" but for themselves. You want to know that you are making at least a little progress on something. Andrew Wiles, for example, published every year or two throughout his work, with the exception of one four-year gap.  No 17-year gap.
Consider that a vibrant Github profile is essentially a prerequisite to be considered as an engaged professional in software engineering these days. It doesn't matter if you have ethical qualms about Github, or spend most of your day doing... your day job.
A 22-year old shouldn't have to fret about not having enough public repositories or not having contributed enough to open-source. I'd expect any craftsman to start producing their best work well into their career, and not at the start of it.
Although it's not often reflected upon these days, innovative, useful software was actually being created before, say, 2005. And, that software wasn't being built at 48-hour hackathons. And, it wasn't being bolted on to an MVP. It was acceptable then (and usually required) that a knowledgeable and skilled team spend a lot of time contemplating requirements and design.
I'm proud to have created important, useful software over the years. Some of it was in use for over a decade. Some of it is still in use today. But I often joke with friends that I would never have been hired today, and I would never have been accepted into the same university program.
It remains to be seen whether this unmanageable pace that we as an industry have adopted will make us crash and burn hard enough that we don't repeat our mistakes, but as a species we seem doomed to repeat ourselves, so I am slightly skeptical.
I'm glad to hear that you have works of pride.
Beyond that, dozzie's remark is very much on point, the host could become tainted in the same way as for instance SourceForge. Fortunately, the distributed nature of Git means that you are unlikely to "lose" your source-code.
But really, you have no control over your code; they are at liberty to censor it if they please.
The primary issue, as far as I am concerned, is that while I can clone and go elsewhere, Github is the centralized hub for (FL)OSS today. Everything lives there. If you are not on there, you are de facto invisible.
A better alternative, as far as I am concerned, would be a distributed net of "self-hosted" Gitlab instances.
I am not comfortable with all the bad press that Github has suffered regarding gender equality, but I feel that I am suffering from vendor lock-in due to the success of the platform.
If Github performs an act of censorship (or anything else that violates your values) and you continue to use the service, then that act goes unpunished: they keep their market share and may repeat the act. After all, there weren't enough negative consequences to avoid doing so in the future.
Your code should be hosted under your very own domain first (the so-called "canonical repository", or as I call it, the "source of truth"), and GitHub or BitBucket should be a replica. Thanks to git's distributed nature, this replica can be two-way, but it is still a replica.
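One minimal way to set that up with stock git (a sketch; the remote names and URLs are placeholders, not anything from the comment above):

    # Canonical repository on your own domain; GitHub kept as a push replica.
    git remote add origin git@git.example.com:me/project.git
    # --add --push replaces the default push URL, so re-add the canonical
    # URL explicitly before adding the replica:
    git remote set-url --add --push origin git@git.example.com:me/project.git
    git remote set-url --add --push origin git@github.com:me/project.git
    git push origin master   # one push now updates both copies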
If they both started from nothing today, there's no doubt in my mind that we'd all quickly settle on Gitlab, but effecting that shift when Github is such an established default is so much harder.
Absolutely. Just yesterday I wanted to make a small change to a popular open source package, about 10 lines of C. Forked on Github, then sent a Pull Request. Because that's where the project lives, so that's where everyone goes. And Github has no interest in federating PRs in a way that they could be sent from Gitlab.
Another problem is the lack of funds for professor positions relative to the number of trained applicants. With so many qualified academics, departments have to base decisions on some metric, and publications (like grant $$$) are more measurable than scientific value.
The university cut does make some sense. There are a lot of services on which PIs rely: administration to handle hiring processes, building maintenance and plumbing. But as usual, instead of just covering costs, it is viewed as revenue on which to grow the university (and admin pay).
Actually, I am not sure whether the university's cut factors into funding decisions at the NIH and the like.
I don't know if the NIH or NSF takes overhead into account, but many of the most successful grantees also have some of the highest overhead, so probably not.
On the flip side, I've heard someone claim that their department chair strongly encouraged applying for NIH grants, because the NIH-negotiated overhead rate is much higher than other funders' (also, allegedly, more prestigious, but you'd think $500,000 spends about the same regardless of where it comes from).
Absolutely. We burn through our scientists (and those who want to become ones) at an incredible rate.
I don't think that is the right question. Say it produces more knowledge, but the fields that are studied are selected not by how intrinsically interesting they are but by how much low-hanging fruit they have. I don't think that would necessarily be an improvement.
The upper end of mental scales, IQ or whatever, does not seem to follow the usual statistics for QA/QC of machine screw production or whatever simple industrial theory. On a purely numerical basis the world wide English Literature academic community should have maybe ten Shakespeare level professors active right now. If we do have those people, they don't have the job title "English literature professor". So where are they?
Part of the problem is "shoulders of giants". Perhaps I personally could have invented the calculus, but Newton beat me to it, so I'll never amount to much there. Under that theory unless we lose information as a culture, eventually academia will become perfectly sterile and actual advance will cease, although we can trust numerical metrics will increase forever into perpetuity as they always do unless confronted with reality, which in the case of that metric, is impossible.
There are two questions here, I think: are there a proportionately greater number of people in the world today with Shakespeare-level talent for language; and do those who exist now have the same chance of being recognized and lionized as Shakespeare did?
I don't know how we could answer the first question, but I assume it's roughly correct, just on faith in the statistical likelihood of the normal distribution of IQ/talent holding true. The second question maybe gives us some way to account for why we don't seem to see them, though. For one thing, people with that set of talents have a much wider set of fields to go into, now: in Shakespeare's time, people with literary talent could write poetry, or plays (or, as he did, both), and not much else. Today, both of those are sub-divided into lots of sub-genres, and prose fiction exists (and is also highly sub-divided), and prose non-fiction, and song-writing, and journalism, and writing for tv and movies, etc. We wouldn't expect to see every current Shakespeare working solely as a playwright: we should be looking across all the fields that need literary talent, if we want to accurately count today's Shakespeares.
And we need to consider whether a Shakespeare-level talent today has any reasonable chance of getting Shakespeare's level of recognition and admiration. There is so much more content today that any one person's output can't get nearly the attention and focus that the smaller world of Elizabethan England could give to Shakespeare. Achieving 'legendary' status probably depends less on absolute talent level than on how much one stands out from one's competitors - that's much harder to do in a larger, more sophisticated world. You would probably have to be significantly better than Shakespeare today to stand out as much as he did in his time.
Going back to the sciences, the situation is a little different because it probably is the case, as you suggested, that the field is finite and there will probably come a time when there is nothing significant for a new Newton to discover, and thereby be recognizable as a Newton-level mind. That's less the case for the arts - the working space there is much larger, possibly infinite. But the increasing sophistication and the growing number of fields to work in is the same in science as in literature, and so is the consequent dilution of attention and recognition. I'm not sure that conditions now allow us to reliably recognize our geniuses.
On why we don't have more Shakespeares alive today: what if we do? Would we revere Shakespeare as much as we do now if there were nine others who wrote at the same time? In any case, I'm not so sure that Shakespeare would have the job title "English literature professor" either. Maybe they're TV and film scriptwriters that we don't hear about?
Rankings are based, in part, on metrics related to publications and citations and that's the utility function they try to optimize.
... [choppy panted laughter]
Perhaps he used his computer to watch it.