I curate and tag about 8-10 interesting, high-quality developer-related links every weekday. I've been doing so for over a year and a half now; I believe I've curated more than 2,500 links :)
Advertising and SEO happened. There is simply too much garbage out there and even Google has run out of ideas for how to filter it out. So they've resorted to curating by favouring the largest commercial sites. Essentially, they've turned the internet into cable TV (isn't that what all the big players wanted anyway?).
Algorithmic search has its limitations: it can only really give you what you search for, and for something niche, the signal can be drowned out by the mainstream noise.
I suppose HN could be replaced with a Google search for "interesting tech(ish) related things", but I don't really think that would or could work.
I'll also check the blogroll of interesting sites I find, because I don't think those sites would necessarily show up in a search.
I don't think web search is in decline though, or if it is, it's down to the walled gardens cutting off large parts of the web.
Search engines were never just about algorithmic search. They were search+recommend (what you'd call "AI" these days), but the most successful ones always turned recommendation into advertising.
To return to the Hacker News example: if I read whatever is at number 1 on the front page of HN via a search engine, I don't think its 'you may also like' recommendations would look anything like the HN front page. They would be things relevant to that one article.
The broader trend towards curated lists does exist, true, but in the bigger picture it's no different from how any news service has "curated" the news over the years. The main difference is linking to third-party sources (something the Drudge Report arguably invented online).