This remark at least partly justifies the current situation - a lot of the things people invented in the '70s were too computationally expensive to use in practice then, and are only becoming feasible now.
For example, contrast what Richard Jones's 1996 garbage collection book and the 2011 Garbage Collection Handbook have to say about (hard) real-time garbage collection. Real-time GC has seen a lot of progress since 2000.
And there have also been plenty of discoveries in other areas. For example, Monte Carlo tree search, as used by Go-playing programs, is only about a decade old.
Superpipelining interpreters and inlining polymorphic dispatch are some of my favorites from the '00s.
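For context, inlining polymorphic dispatch is typically done with inline caches: a call site remembers the receiver class from its last lookup and skips the full method search when the class matches again. Here's a minimal sketch of a monomorphic inline cache in Python (the `InlineCache` helper and the shape classes are made up for illustration; a real VM patches the call site's machine code instead of using a helper object):

```python
# Sketch of a monomorphic inline cache for method dispatch.
# The cache stores the last (class, method) pair seen at this call site.

class InlineCache:
    def __init__(self, selector):
        self.selector = selector      # method name this call site invokes
        self.cached_class = None      # receiver class seen on the last call
        self.cached_method = None     # method resolved for that class
        self.hits = 0
        self.misses = 0

    def call(self, receiver, *args):
        cls = type(receiver)
        if cls is self.cached_class:          # fast path: cache hit
            self.hits += 1
            return self.cached_method(receiver, *args)
        # Slow path: full method lookup, then remember the result.
        self.misses += 1
        self.cached_class = cls
        self.cached_method = getattr(cls, self.selector)
        return self.cached_method(receiver, *args)

class Circle:
    def __init__(self, r): self.r = r
    def area(self): return 3.14159 * self.r * self.r

class Square:
    def __init__(self, s): self.s = s
    def area(self): return self.s * self.s

site = InlineCache("area")
shapes = [Circle(1.0)] * 3 + [Square(2.0)] * 3
total = sum(site.call(s) for s in shapes)
# Only two slow lookups occur: one each time the receiver class changes.
```

Call sites that see many receiver classes are handled by the polymorphic variant (a small table of class/method pairs) from Hölzle et al.'s work on Self.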
Is this referring to graph databases, or something else?