

The Limits of the Digital Humanities - RougeFemme
http://www.newrepublic.com/article/117428/limits-digital-humanities-adam-kirsch

======
tsunamifury
I spend my days trying to structure books into relational databases, in an
attempt to map them onto more interesting non-linear behaviors (UX and UI).

Most of the humanities has been relegated to linear data storage (books).
Unlocking that with new interfaces is the most obvious way for the humanities
to become digital.

I don't really see it as a very heavy question -- there is still a need for
writers, painters, photographers, etc. -- there are just more spots for workers
to design the frames for that content in new and exciting ways. The professors
won't get a handle on this until it's already defined and part of the past,
because the humanities aren't really research for the future as much as a
study of the past.
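
A rough sketch of the kind of structure I mean -- the table names and fields
here are purely illustrative, not any real schema:

    import sqlite3

    # Purely illustrative toy schema, not a real production system.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE books    (id INTEGER PRIMARY KEY, title TEXT,
                               author TEXT, year INTEGER);
        CREATE TABLE passages (id INTEGER PRIMARY KEY,
                               book_id INTEGER REFERENCES books(id),
                               ordinal INTEGER,   -- position in the linear text
                               body TEXT);
        -- links let an interface jump between related passages
        -- instead of forcing a front-to-back reading order
        CREATE TABLE links    (src INTEGER REFERENCES passages(id),
                               dst INTEGER REFERENCES passages(id),
                               relation TEXT);    -- e.g. 'quotes', 'same theme'
    """)

    # A non-linear view: everything connected to one passage,
    # regardless of where it sits in the page order.
    related = conn.execute("""
        SELECT p.ordinal, l.relation, p.body
        FROM links l JOIN passages p ON p.id = l.dst
        WHERE l.src = ?
    """, (1,)).fetchall()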

------
robin2
I can't quite make up my mind whether it's irony, or the other half of the
same phenomenon, that the humanities are feeling pressure to become more
business-like whilst businesses are being drawn towards methods from the
humanities.

An example of the latter would be the work of ReD Associates. (See, e.g.,
http://www.theatlantic.com/magazine/archive/2013/03/anthropology-inc/309218/
and http://rdn32.com/2014/04/03/the-moment-of-clarity/)

------
ppod
>digital analysis of literature tells us what we already know rather than
leading us in new directions

Isn't it sadly typical of the humanities to view data that confirms a
hypothesis as redundant?

~~~
bakhy
yes, of course. because it is completely atypical for humanities to have data
that either confirms or disproves some hypothesis. that's more of a "hard
science" thing ;) humanities simply cannot operate like that. no data can
prove why Shakespeare was good.

i think the author was simply pointing out that no significant new discovery
was, or perhaps even could be, made that way. he was just attacking the hype,
the story of new breakthroughs right around the corner, of the entire field of
humanities being shaken to the core... not by this.

and also, this Ngram discovery is a weak one. to reach some proper scientific
level they should not restrict it to checking a couple of examples with known
outcomes. it really should produce something new. otherwise, would it not be
in danger of simply being a manifestation of confirmation bias?

edit - ok, i may be using the term "humanities" wrong :) i guess some can get
a lot from data, but some really cannot.

~~~
ppod
>checking a couple of examples with known outcomes.

How do we know the outcomes are known without evidence? For example:

> The computer can tell you that titles have shrunk (and you hardly need a
> computer to tell you that: the bulky eighteenth-century title is commonplace
> and a target of jokes even today)

Now of course it's a really good hypothesis that book titles have got shorter.
I'd bet a lot that it's true. But the way science works is that we take our
hunches and make sure that we're not fooling ourselves by checking against
actual data. Of course there is a risk of bias by choosing which hypotheses we
test - but that's better than just assuming that all our hypotheses are true!
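
To make it concrete: once you have the metadata, the check itself is almost
trivial. A rough sketch in Python, assuming a hypothetical titles.csv with
'year' and 'title' columns:

    import csv
    from collections import defaultdict
    from statistics import mean

    # Hypothetical input: one row per book, with 'year' and 'title' columns.
    lengths_by_decade = defaultdict(list)
    with open("titles.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            decade = (int(row["year"]) // 10) * 10
            lengths_by_decade[decade].append(len(row["title"].split()))

    # Mean title length (in words) per decade: a steady downward trend
    # supports the hunch; a flat or noisy one means we were fooling ourselves.
    for decade in sorted(lengths_by_decade):
        print(decade, round(mean(lengths_by_decade[decade]), 1))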

~~~
bakhy
you are right, but i believe the point was in the triviality of the find. the
humanities are often in territory which cannot be quantified.

------
alialkhatib
It's interesting reading this from a background in Anthropology, having
experienced some of the challenges the article describes. In particular, the
perspective that people need to know how to code (Ramsay), and the backlash it
evoked, are things I've run into on numerous occasions. I'm interested in
digital cultures and quantitative methods, but Anthropology largely abandoned
quantitative stuff in the 70s, and now it seems like anyone who mentions
statistics and quantitative methods non-sarcastically gets shunted into other
fields (e.g. Sociology, Informatics, CS).

I didn't wither away in Anthro, but now I'm leaving the field because,
essentially, my methodology and skill set align me more closely with a CS
program than with an Anthro program. That my research interests are in line
with cultural Anthropology is secondary; methods determined my field, which,
when you think about it, is kind of bizarre.

Dan Jurafsky (at Stanford) and I had a brief conversation about this, and he
seemed to come from a similar experience. He described himself as a linguist
and seemed to feel that that's where he is intellectually, but the field of
Linguistics sees him as more of a computer scientist because of his tools, and
so he's got an appointment in CS.

I liken doing Anthropology in digital cultures to being an anthropologist of
some South American culture; there are languages, customs, and mindsets that
you need to learn - and indeed be fluent in - before you can even claim to be
an expert. Anything less and you're seriously limiting yourself. There will be
people who study digital cultures who can't code, eschew quantitative stuff
(including "Big Data" et al.), and generally stick to orthodox convention.
Their expertise will someday be limited by not being able to "speak the
language", both literally and figuratively (i.e., having an emic understanding
of the culture they're studying).

I don't think "someday" is today. Maybe that's because my undergrad advisor is
an Anthropologist with no programming background, yet unequivocally one of the
leading researchers in digital cultures. I think there's still enough basic
stuff that we need to figure out to orient ourselves with digital cultures
before people have to be able to code and bring domain-specialized skills to
the field. Or maybe we're so far along that being immersed in digital culture
has become abstracted away from being savvy to the point of being a
programmer.

The article ends on the prescription to critique technology in the spirit of
"intellectual responsibility", but I get the feeling that it doesn't see the
"identity crisis" described earlier as the very critique the article calls
for. Hacking away at each other, the intellectuals vying for control of the
digital humanities are ultimately advocating for what the field ought to be.
This is what we (social scientists and humanists) badly need.

If the digital humanities can't coalesce into something meaningful (and
perhaps soon), people like me will ultimately go into CS and related fields
where we'll learn the tools we need to become fluent in the technical skills,
rather than hone qualitative skills and study sociocultural theory. If that
continues to happen, the digital humanities could lose an entire generation
and ultimately fail to keep Anthropology and the Humanities modern and
relevant. I don't know what will happen to these disciplines if they get left
behind.

/rant.

