
How would you add 'reputation' to what people post? - gw666
SF often talks about reputation systems as part of a society's culture. True, it's a big hairy problem, but how could some person or group make an initial attempt at doing this? It'd be great to have web pages, tweets, and various posts tied somehow to a measure of the author's current trustworthiness (based on previous behavior). Even a high/medium/low/unknown rating would help.

No matter how imperfect the implementation might be, it'd be great to have anything that exposes to the general public the idea that people need to consider the source of _anything_ that's posted on the Internet.

What would you try doing?
======
samkater
Most systems seem to ask others to rate what they consume - up/down,
like/dislike, etc. If we’re talking about content that competes for eyeballs,
like a news feed, you might add the concept of having a person “stake” some of
their reputation to move something to the top. If it is universally pilloried,
they lose reputation, and vice-versa. Bonus points for a system that makes it
so my reputation to you doesn’t have to be the same as my reputation to anyone
else (if you have been “Up-voting” my content for a while, my rep with you is
higher than with somebody seeing my post for the first time).

Not sure if or how this solves an echo-chamber problem though.
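A minimal sketch of that staking idea, assuming per-viewer reputation (class and method names here are made up for illustration):

```python
# Hypothetical sketch of "stake reputation to promote content".
# Reputation is tracked pairwise, so an author's rep with one
# viewer can differ from their rep with another.

from collections import defaultdict

class RepLedger:
    def __init__(self, starting_rep=100.0):
        # rep[author][viewer] -> score; unseen pairs start at the default
        self.rep = defaultdict(lambda: defaultdict(lambda: starting_rep))
        self.stakes = {}  # post_id -> (author, staked amount)

    def stake(self, author, post_id, amount):
        # author locks `amount` of reputation to boost a post's ranking
        self.stakes[post_id] = (author, amount)

    def settle(self, post_id, votes):
        # votes: list of (viewer, +1 or -1); the stake is won or lost
        # per-viewer, so rep moves only with people who actually voted
        author, amount = self.stakes.pop(post_id)
        for viewer, vote in votes:
            self.rep[author][viewer] += vote * amount
```

Settling per-viewer rather than globally is what makes "my rep with you" diverge from "my rep with a first-time reader": only your own votes move the score you see.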

Edit: I just had another thought to help tackle bias problems - the platform
itself could occasionally produce biased content on either side of an issue to
deduce people's positions from their votes. Then the reputation algorithm has
a chance to adjust for ideas that polarize versus ones with general agreement,
and scale rankings accordingly.
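One way to sketch that last adjustment (this scoring function is my own invention, not anything the platform actually does): down-weight items whose votes are split, so a polarizing post ranks below one with broad agreement even at the same net score.

```python
# Hypothetical polarization-adjusted score: net votes scaled by how
# one-sided the vote was (1.0 = unanimous, 0.0 = perfectly split).

def adjusted_score(up, down):
    total = up + down
    if total == 0:
        return 0.0
    agreement = abs(up - down) / total
    return (up - down) * agreement
```

Under this toy formula, 30 upvotes with no downvotes scores 30, while 60 up / 40 down scores only 4, and a perfect 50/50 split scores 0.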

------
ben509
The biggest problem of any moderation system is a majority downvoting things
they don't like into oblivion. You get a majority with a high reputation
taking control of your site. We complain about cancel culture, but it's
nothing new because people naturally want to kick out people they don't like.
After all, the people they don't like are heartless evil assholes.

I'd add a system to let people identify aspects of the post, and then use
reputation to verify that they're a good indicator of that aspect.

Liberals know what is "liberal" and conservatives know what's "conservative"
quite reliably (in aggregate), even though those concepts are _very_ fuzzy.
But the problem most systems run into is that, politics being politics, those
labels are frequently gamed. (If you've ever listened to C-SPAN's radio
program, you know that half the "republican" callers are democrats...)

Then you have to determine which keywords to use. I think some basic guidance
("don't label a thing 'spam' unless it's someone selling crap") would go a
long way toward incentivizing a critical mass of users to label things
honestly.

And then you let readers decide what they want to read.
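A rough sketch of scoring labelers against the aggregate, which is one way to "use reputation to verify they're a good indicator" (the majority-vote consensus and the function names are my own simplification):

```python
# Hypothetical sketch: rate each user's aspect labels ("spam",
# "liberal", ...) by how often they match the aggregate label,
# then weight their future labels by that track record.

from collections import Counter

def consensus(labels):
    # labels: list of (user, tag) for one post; majority tag wins
    return Counter(tag for _, tag in labels).most_common(1)[0][0]

def labeler_accuracy(history):
    # history: one list of (user, tag) labels per post; returns each
    # user's fraction of labels that agreed with the post's consensus
    hits, total = Counter(), Counter()
    for labels in history:
        winner = consensus(labels)
        for user, tag in labels:
            total[user] += 1
            hits[user] += (tag == winner)
    return {u: hits[u] / total[u] for u in total}
```

Majority vote as ground truth is exactly what gets gamed in the C-SPAN sense, so a real system would need something sturdier, but it shows the shape of "reputation as indicator quality."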

My other notion (building on another comment[1]) is that discussion should
include a team-based element. I think offering a way for small teams to come
together and present ideas is more useful than individual commentary. A team
reduces personal investment, because you are now motivated by a desire for
admiration from known peers.

But I've put away a good number of beers, so I might just be rambling like a
drunk idiot.

[1]: https://news.ycombinator.com/item?id=21289078

