What I think mostly misses the mark is treating this as an expertise you learn in college: straight out of high school you go to college, then do a PhD, and you interact in a bubble of people who came up the same way. And you try to comment on the greats of literature without any real-world experience.
I place much higher value on reading critiques by different authors and artists, in a kind of Viennese coffeehouse gossip-culture way.
It's the equivalent of wanting to become an expert on the philosophy of ethics without ever having to resolve a real ethical conundrum in real life, like pulling the plug on someone's medical support or advising about authorizing an artillery strike or whatever other thing may arise with difficult tradeoffs outside neat thought experiments. It's being clever from the sidelines.
So, I don't think it's a field of expertise, I think it's a teaching job. And teaching about art and literature and helping the new generation process the message therein is good. But that doesn't make it a research field. Indeed, the idea that a humanities teacher at a university should have regular novel thoughts and innovations is a very new one, dating from the 19th century and originating in the Humboldtian reform of German universities. Before that, teachers would read the classics to students and comment on them, but they mainly passed down the same kind of commentary they had received in their own education, with some of their own flavor of course. It wasn't really seen as producing new knowledge, just as making the existing high-prestige works of literature easier to digest.
I more or less agree with that, with the proviso that I think academia in general (not just the humanities) would benefit from easing up a bit on the insistence on "producing new knowledge". It's good to produce new knowledge, sure, but I think the way that's been pushed has led to a situation where people just publish a lot of papers without necessarily creating a lot of new knowledge. In part this is due to Goodhart's law and people optimizing for publications. In part though it's due to the two-tiered (tenure/non-tenure) academic job system.
Even in fields quite remote from humanities, we have, for instance, a bunch of people who need to be taught calculus and so on. And it would be fine for them to be taught calculus by someone who isn't "creating new knowledge" in mathematics. But you can get paid a lot more to create new knowledge while begrudgingly teaching calculus now and then than you can to just teach calculus with gusto.
Likewise in the humanities, I think your argument leaves open the possibility that there could be new knowledge produced there, but that we just shouldn't expect everyone who's teaching Intro to American Literature or whatever to be producing such knowledge.
In my view a good step would just be to significantly reduce the pay gap (and gaps in benefits, job security, etc.) between teaching jobs and research jobs. There are many people who love Moby Dick or basic calculus and could ably and happily teach it for years without feeling any need to write a novel or prove a novel theorem themselves. We'd all benefit if such people could get a steady job doing that.
Yes, simple lecturing jobs are fine, and they do exist, but as you said they are paid less. In truth this is already the reality; we just don't admit it.
The intention behind it is understandable, though. Someone who has produced new knowledge tends to have a more flexible mind; they have felt that the walls of knowledge are soft and malleable rather than a concrete slab. They work with the math even outside class and have a real grasp of why things are defined in certain ways, having themselves defined new concepts, written new theorems and proofs, and faced the dilemmas of making a construction as elegant, compact, and logical as possible.
Now, of course, today the research and the teaching are often on quite distant topics: teaching basic computer science, like data structures and algorithms, while you actually research computer graphics or speech recognition.
I'm not so sure that having produced new knowledge is so vital to teaching old knowledge, at least not at the scale that's required for a tenure-track job. It's good for a teacher to not be locked into a static view of old knowledge, but I don't think that requires anything like the breakneck amount of publishing that's expected in many fields today.
idk man, I'm a humanities major who's spent all of my career, which now spans about a decade and a half, around people who have had little to no exposure to humanities courses, and my take is that more people need to take them.
The quality of analysis and opinion outside of academia is, I'll be blunt, incredibly poor. I think claiming that literary analysis courses are just a bunch of people spouting opinions is an unfair reduction. You learn analytical techniques: how to identify theme and structure, how to perform a historical analysis versus a contemporary reading, close reading, logical analysis, etc. There's not just depth at the level of an individual work; there's tons of technical, analytical, and procedural depth to uncover in the practice of interpretation itself.
Inadequacies in this practice lead directly to bad societal outcomes imo. People who are unable to critically dissect narratives are also easy to manipulate. Worse, a lot of people who lack exposure to these ideas never ask the important questions in the first place, even basic ones like "how might this tech actually impact society?", because they have simply never had the training to learn that asking these kinds of questions matters.
Also I do think it is an actual research field, which, just like any other field, changes as available tech changes. For example digital humanities is a relatively new approach that was mostly enabled by the advent of statistics and computers. This unlocked a whole new suite of literary analysis techniques and perspectives, and these new techniques have actually furnished novel interpretations and second looks at previous works (a really concrete example, these techniques have been used to resolve questions of disputed or unclear authorship) just like technological innovations do the same in other sciences and research fields.
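To make the authorship example concrete: here is a toy sketch of the stylometric idea behind computational authorship attribution (famously applied to the disputed Federalist Papers by Mosteller and Wallace). This is a deliberate caricature of my own devising, not any particular study's method; real work uses large corpora and measures like Burrows' Delta. It compares relative frequencies of common function words, since those tend to reflect an author's unconscious habits:

```python
# Toy stylometric authorship attribution: each author gets a "profile"
# of function-word frequencies, and a disputed text is attributed to
# the author whose profile is closest. All texts here are tiny invented
# samples; real studies need thousands of words per author.
from collections import Counter
import math

FUNCTION_WORDS = ["the", "of", "and", "to", "in", "a", "that", "it"]

def profile(text):
    """Relative frequency of each function word in the text."""
    words = text.lower().split()
    counts = Counter(words)
    return [counts[w] / len(words) for w in FUNCTION_WORDS]

def distance(p, q):
    """Euclidean distance between two frequency profiles."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def attribute(disputed_text, candidate_texts):
    """Name of the candidate whose style profile is closest to the disputed text."""
    dp = profile(disputed_text)
    return min(candidate_texts,
               key=lambda name: distance(dp, profile(candidate_texts[name])))
```

The point is just that the question "who wrote this?" has a ground truth, so statistical evidence can actually settle it, which is what distinguishes this corner of digital humanities.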
This is a symptom of a much deeper issue. The education system is completely broken: quite often, if the student memorizes the answers, that already counts as a huge success. Very little attention is paid to teaching students new ways of thinking. Not to mention that even in prestigious schools, teachers are often dropouts who failed to secure a more lucrative career in the private sector, which means they themselves aren't competent. And even besides this, many people have too much shit going on in their lives to dedicate attention to education.
I don't disagree with what you said about analytical techniques and so on, but those are about learning "how", not learning "what". It's not about learning the result of someone else's analysis; it's about learning to do the analysis. No doubt you'll need to study other people's analyses to see how they did them, but the point is not to learn their result but their process. This is reversed from something like learning physics, where you are primarily learning the results of other people's discoveries (e.g., Newton's) and understanding their process is secondary.
> For example digital humanities is a relatively new approach that was mostly enabled by the advent of statistics and computers. This unlocked a whole new suite of literary analysis techniques and perspectives, and these new techniques have actually furnished novel interpretations and second looks at previous works (a really concrete example, these techniques have been used to resolve questions of disputed or unclear authorship) just like technological innovations do the same in other sciences and research fields.
I'd say that concrete stuff like resolving authorship questions is not really the lion's share of digital humanities. And the key thing there is that an answer was found. The mere fact that the field changes because of new techniques and "novel interpretations" doesn't get us very far. The question is whether such changes are an advance over previous research or simply a change. In scientific fields, if a new theory is accepted, the old theory is either enlarged or discarded: we either decide "we knew X was true, and now we know Y is also true" or "we thought X was true, but now we know it is false and actually Y is true". But "new interpretations" can just mean something like "some people think X and now some other people think Y", and without reference to any ground truth that doesn't represent forward progress.
I'll add that I agree that humanities courses are valuable and that society would benefit from more people taking them. I just don't think that focusing on the aspects of the humanities that are slightly more scientific is a good way to justify that. Insofar as something like resolving authorship questions is concrete and measurable, it's because it's using scientific methodology. The humanities cannot beat science at its own game. I see the humanities as more valuable in how they provide a broader context and motivation for scientific and technical work. I don't think this is incompatible with what you said; it's just a matter of what gets the emphasis.
I see the humanities (I mean literature here and not so much history etc) as a continuation of cautionary tales and parables and fables and legends and fairy tales around the campfire. Its "output" is that you're better able to feel on a visceral level how others might feel or what the social outcome can be, or what the deep values of your society are etc. Literary analysis is twice removed from this. Ground level is how you act in a new situation. Once removed is the concrete stories you hear at the campfire. Twice removed is the dissecting analysis of the story and more meta levels.
In physics the ground level is solving a new engineering problem in industry. You use the recipes and methods you learned in physics class to formulate your problem in the language of physics and solve it. Once removed is teaching the equations and methods themselves. Twice removed is discussing how the physics knowledge was produced historically and how research is done today.
I do believe, though, that learning about the history of science is immensely enlightening. In school it seems like these things just fell from the sky fully formed, like Athena from Zeus's forehead. Kuhn's book on the Copernican revolution is super interesting; I found it better than the more famous Structure of Scientific Revolutions. The latter is more abstract and tries to build a big theory, while the former mostly just narrates how it happened in the concrete case of Copernicus, Kepler, et al.