The problem is that these perceptions are very much in the eye of the beholder. As the saying goes, things have always been getting worse.

If anyone can come up with an objective measure for any of this, I'd be interested. As far as I can tell, though, these perceptions are strongly affected by the fact that the things that we dislike make a much stronger impression than the rest of what we see.

From my perspective, the downvoting isn't more aggressive than it used to be, there isn't more junk on the front page than there used to be (though there might be more ideologically conflictual material, since society is moving in that direction), and the titles aren't more baity than they used to be, since we edit most of the bad ones. I'm biased, though, and conditioned by the moderation job. The interesting question is whether there's a way to get beyond such biases.




While I think "degenerating" is a strong word, I do think there is a noticeable drop in the quality of comments since a decade ago. I don't have any hard data on this, but I think it's fairly obvious if you dig around in the archive. There used to be a more academic and countercultural feeling to commentary, as well.

Maybe some metrics to investigate are: number of words per comment, vocabulary level of the average comment [1], and so on (a rough sketch follows the footnote). While neither of these is solely indicative of quality, I think they're worth considering.

[1] http://www.englishprofile.org/wordlists/text-inspector
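A minimal sketch of the first metric, with type-token ratio as a crude stand-in for the second -- Text Inspector does something far more sophisticated, so treat the TTR here as a rough proxy only:

    import re

    def comment_stats(text):
        """Return (word count, type-token ratio) for one comment.
        TTR = distinct words / total words, a crude vocabulary proxy."""
        words = re.findall(r"[a-z']+", text.lower())
        if not words:
            return 0, 0.0
        return len(words), len(set(words)) / len(words)

    for c in ["Nice post.", "I think the sample is too small for the claim the paper makes."]:
        n, ttr = comment_stats(c)
        print(f"{n:2d} words, TTR {ttr:.2f}: {c!r}")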


I dig around in the archive a lot, and that's not obvious to me at all. There was a lot of variance in the past and there's a lot of variance now.

Number of words per comment would be easy enough to measure, but I'm not sure length is a good indicator of quality—not everyone who goes on at length has much to say. Your vocabulary link is interesting though—thanks! I've made a note to look into that.

Edit: since you mentioned "academic", I was thinking of your comment when I noticed the top two comments in the current Sci-Hub discussion:

https://news.ycombinator.com/item?id=22008977

https://news.ycombinator.com/item?id=22009107

To me that seems more common than it used to be, and a good sign. But it could just be sample bias.

Edit 2: here's another, randomly run across today: https://news.ycombinator.com/item?id=22012673. Not academic, but technical in an unexpected way. There are a lot of physicians posting to HN.

Edit 3: or check out the top comment of https://news.ycombinator.com/item?id=22025961

Edit 4: and of https://news.ycombinator.com/item?id=21963509


All perceptions are in the eye of the beholder. Anyway, my unobjective measure is the number of interesting articles and discussions I want to read after scanning the front page. It has been declining over the years.

It’s not really surprising since there are two contributing trends:

- Online communities generally degrade as they increase in size.

- The attention economy brings about a "heat death" of the internet: the amount of content produced goes up while its quality goes down.


Perceptions need not be strictly subjective.

There's the notion of where good conversations are found, which sources get cited by external sources, or which links, placed where, drive more traffic (or conversions) to the linked sites/services.

There are metrics such as MAU (monthly active users) and user time-on-site.

As dang noted in another recent comment, one of HN's scarcest commodities is its front-page slots. Like attention and hours in the day, they're an absolutely rivalrous good, and one which grows comparatively scarcer as the site increases in size and volume -- more users and more submissions mean more competition for the 30 slots each user sees on the front page.

And if more stories rotate through those slots, there will be either less time per story, or a restricted exposure (e.g., stories presented to only a subset of readers), or both. (I'm not sure what HN's specific mechanics are here, though I believe it's a mix of both.)
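To make the squeeze concrete, a back-of-the-envelope calculation -- both constants below are purely illustrative, not measured HN figures:

    # Illustrative only: neither constant below is a measured HN figure.
    SLOTS = 30               # front-page slots each user sees
    STORIES_PER_DAY = 300    # hypothetical stories cycling through the front page

    avg_hours = SLOTS * 24 / STORIES_PER_DAY
    print(f"~{avg_hours:.1f} hours of front-page exposure per story")
    # Doubling the story flow halves the average exposure.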

All media tend to fall in quality with size. See Dwight Macdonald's 1953 classic "A Theory of Mass Culture" (https://is.muni.cz/el/1421/jaro2008/ESB032/um/5136660/MacDon...).

(Also apparently "Masscult and Midcult", which I've only just run across and have not yet read: http://xroads.virginia.edu/~DRBR/macdonald.pdf)


Conversational quality metrics are difficult to come by. Most tend to be expensive to calculate, especially in the absence of clearly parseable engagement information (e.g., likes, re-shares, comments). I've played with this concept in a few cases, most recently when trying to make assessments of activity / genuine community in Google+ Communities, prior to their shutdown.

Some of the more interesting research I'm aware of came out of Microsoft Research in the early-to-mid aughts and looked at thread dynamics on Usenet. I think it was Marc A. Smith's work; see for example:

https://www.microsoft.com/en-us/research/publication/picturi...

https://www.computerweekly.com/news/2240051972/MIcrosoft-Res...

(I've seen more detailed discussions of this in the past, but I don't have references handy.)

In particular, post/response dynamics seemed to fall into a number of distinct categories (a toy classifier follows the list):

- Single post, many responses, no follow-up: question with a single obvious answer provided by many.

- Single post, extended discussion, "conversation-killing" response: a hard question with no obvious answer, until one is eventually and decisively provided.

- Long descending "right-shifted" thread between two parties: personality conflict / flame thread.

- Post, much response, multiple parties: successful trolling.
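For illustration, a toy classifier over precomputed thread-shape statistics -- the thresholds are invented, not taken from the Microsoft Research work:

    def classify_thread(max_depth, n_replies, n_participants):
        """Map crude thread-shape stats to the categories above.
        All thresholds are invented for illustration."""
        if n_participants == 2 and max_depth >= 8:
            return "flame thread: two parties, long right-shifted chain"
        if n_replies >= 10 and max_depth <= 2:
            return "easy question: many independent answers, no follow-up"
        if n_replies >= 20 and n_participants >= 10:
            return "possible trolling: broad, heated response"
        return "ordinary discussion"

    print(classify_thread(max_depth=12, n_replies=14, n_participants=2))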

I've played a bit with the notion of indicators of significant vs. insignificant discussion, based on sentinel keywords: https://old.reddit.com/r/dredmorbius/comments/3hp41w/trackin...
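In miniature, that looks something like this -- the word lists below are guesses of mine, not validated markers:

    import re

    SUBSTANTIVE = {"citation", "paper", "source", "data", "measured"}
    NOISE = {"lol", "clickbait", "downvoted", "flamewar"}

    def keyword_score(text):
        """Positive for substance markers, negative for noise markers."""
        words = set(re.findall(r"[a-z]+", text.lower()))
        return len(words & SUBSTANTIVE) - len(words & NOISE)

    print(keyword_score("Here is the paper with the measured data."))  # 3
    print(keyword_score("lol, total clickbait"))                       # -2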

I know HN has an API and have considered looking at it, though I haven't explored this yet.
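If I did, the official Firebase API (items at /v0/item/<id>.json) makes a recursive walk straightforward -- a minimal, unthrottled sketch, so only suitable for small threads:

    import json
    from urllib.request import urlopen

    API = "https://hacker-news.firebaseio.com/v0/item/{}.json"

    def thread_stats(item_id, depth=0, stats=None):
        """Walk one HN thread, collecting crude shape statistics."""
        if stats is None:
            stats = {"comments": 0, "max_depth": 0, "authors": set()}
        with urlopen(API.format(item_id)) as r:
            item = json.load(r)
        if item is None:                  # deleted items come back as null
            return stats
        stats["max_depth"] = max(stats["max_depth"], depth)
        if item.get("type") == "comment":
            stats["comments"] += 1
        if "by" in item:
            stats["authors"].add(item["by"])
        for kid in item.get("kids", []):
            thread_stats(kid, depth + 1, stats)
        return stats

    # e.g. thread_stats(22008977) for one of the comments linked upthread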

Any ideas as to what would indicate desirable or undesirable trends, threads, and/or posts?



