I always imagined that the reddit/HN/Digg algorithms (if you can call them that) went something like this:
1) submit article
2) attach Unix timestamp
3) increment/decrement each timestamp by a fixed time interval of Unix seconds for each upvote/downvote
With this scheme, there is one operation: add a positive or negative increment. Each article will naturally decay as time moves on. If you don't like linear increments, you can weight the increment more heavily near the actual submission time and have it decay toward an average, which accelerates "hotter" postings to the top.
The overall idea is to project an article (in Unix timestamp) into the future by the number of upvotes; this timestamp is merely a ranking "key". The front page articles would have a Unix timestamp of one or two days into the future, depending on how many votes they have received. This would naturally place currently submitted articles somewhere a few pages back.
It more or less mimics the same thing (doesn't it?).
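The three steps above can be sketched as follows. The one-hour bonus per vote and the class/function names are assumptions for illustration, not anything reddit or HN actually uses:

```python
import time

VOTE_BONUS_SECONDS = 3600  # assumed: each upvote projects the story one hour forward

class Story:
    def __init__(self, title, submitted_at=None):
        self.title = title
        # The ranking key starts as the submission's Unix timestamp.
        self.rank_key = submitted_at if submitted_at is not None else time.time()

    def upvote(self):
        self.rank_key += VOTE_BONUS_SECONDS  # one addition, nothing recomputed

    def downvote(self):
        self.rank_key -= VOTE_BONUS_SECONDS

def front_page(stories):
    # Largest projected timestamp first. Old stories decay naturally,
    # because every new submission starts with a later (larger) key.
    return sorted(stories, key=lambda s: s.rank_key, reverse=True)
```

The appeal is that ranking needs no periodic recomputation: decay happens implicitly because the clock keeps moving while each story's key stands still between votes.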
In other words, unlike reddit, the rating of a story changes over time. Stories get one free upvote, so p-1 compensates for that (new stories have no points). Dividing by a power of time makes the number of upvotes that a story receives in its first few hours crucial to how likely the story is to stay on the front page.
In reddit's system, if you take a snapshot of the front page and then stop voting or submitting stories, the page will never change. At HN, stories might reorder themselves because a story with few points, which was given a boost from being very new, will lose rating compared to high-value stories that have been around a long time.
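The formula this comment describes is usually quoted as score = (p − 1) / (t + 2)^1.5, where p is points and t is age in hours; the 1.5 exponent and the +2 offset are commonly cited values and are assumed here, not taken from this thread:

```python
def hn_score(points, age_hours, gravity=1.5):
    # points - 1 cancels the free upvote a story gets from its submitter;
    # dividing by a power of age makes early votes count the most.
    return (points - 1) / (age_hours + 2) ** gravity
```

This makes the reordering-without-votes effect concrete: a 10-point story one hour old outranks a 100-point story a day old, but if nobody votes for four more hours the young story decays faster and the two swap places.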
It's likely that a News.YC post isn't reordered until someone actually casts a vote on it. So if all voting stopped, News.YC would remain static, just like Reddit would.
I think it would be better to normalize time for the number of submissions: right now when I submit a story late night US time, usually it goes nowhere because there is no one to upvote it.
It is simple to do that: just use the number of stories submitted as the time measure instead of hours.
I wonder what Hacker News would be like were the votes to decay in importance independently, or, alternatively, for there to be two rating systems: one for before something hits the front page (with very little time decay), and one for something after (say with the normal method).
One thing that often happens is that hackers, hunting and reading late at night (like me), submit something to YC News. But basically nobody's reading the news feed, and four or five hours later, one or two votes isn't enough to get something on the front page, which is sometimes needed to get recognized by many people.
I have a good metric for this: sometimes I'll submit something and it won't hit the front page at all. But one by one, over the next few weeks, votes trickle in. This must be from: (a) people reading my submission history, or (b) people submitting the same article. But because I submitted at a time when nobody was reading it, it never even gets a chance to hit the front page.
Edit: I did not notice gaika's similar, elegant proposal above. But if two people make the same sort of comments independently, it should say something.
One possibly better approach would be to replace the new feed with a feed composed only of items that have spent less than 15 minutes on the front page (because everything deserves 15 minutes of fame).
You then sort this new feed like the front page (though perhaps with nonzero start points) -- so there's both decay and a chance that good articles without a ton of initial votes eventually get up there (though perhaps on slow news days).
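A sketch of that replacement feed. The item schema, the field names, and the two-point head start are all assumptions; the ranking reuses the commonly cited (p − 1)/(t + 2)^1.5 form:

```python
FAME_LIMIT_SECONDS = 15 * 60  # everything deserves 15 minutes of fame

def fame_feed(items, gravity=1.5, head_start=2):
    """items: dicts with 'points', 'age_hours', and 'front_page_seconds'
    (accumulated time already spent on the front page). Assumed schema."""
    # Only items still short of their 15 minutes of fame qualify.
    eligible = [i for i in items
                if i["front_page_seconds"] < FAME_LIMIT_SECONDS]

    # Rank like the front page, but with a nonzero starting score so
    # zero-point stories still have a defined (and decaying) position.
    def rank(i):
        return (i["points"] - 1 + head_start) / (i["age_hours"] + 2) ** gravity

    return sorted(eligible, key=rank, reverse=True)
```

Items that have already had their exposure drop out entirely, while unnoticed items keep competing under the same decay rule, so a good article can still surface later on a slow news day.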