It's a little annoying since I constantly pick up tabs that are a day old and have to start from the beginning. Hopefully I'm not the only one on HN with that issue.
Incidentally, for those of you working on startups, this is a good example of why making something people want is not a simple matter of making what they say they want.
Yes, you probably haven't lost any users over expired links.
However, I think you have inconvenienced a lot of users through the expired links. Sometimes, I think it is good to spend a little bit of effort on helping a large number of users.
I'm assuming that, among readers who attempt to go past page 1, an overwhelming majority has hit this problem multiple times (and lost time because of it).
I'm not buying the argument that if HN has good stories and comments people will keep coming back and if they come back, it means they want a next page button that fails. A doesn't imply B.
If frequent expiration is to remain a feature, would it be possible to have a preference setting allowing us to put the first four pages or so on our front page? Then we could read whatever looks most interesting among the first 120 items instead of the first 30 before we get that expired next button we supposedly don't mind.
And to share everything: I also use a theme for HN. It makes the design a little flatter, with bigger letters, which is especially what I wanted.
But I always use vanilla HN because I'm fond of it, and the UI is pretty much the best for me.
That said, it would be very cool if users could implement their own custom CSS natively like you can on http://yayhooray.net
I personally detest browser extensions and I don't like how they slap custom CSS over the top of sites like make-up. Even those who aren't up for fiddling with style sheets could share in what others make.
Honestly, it gives HN a bit of charm, and it's never ever really bothered me.
'Inconvenienced' is a broad word. Plenty of people eventually get the hang of driving a motorcycle with the front brake in the "wrong" spot, and are no longer inconvenienced by it. But that is not truly dangerous or troublesome in the way it would be if a car's control pedals were put in the wrong place (e.g., swapping the throttle and brake pedal locations). So, in this case, it's more the inconvenience of a learning curve; it's not the more dangerous or nefarious kind of inconvenience: something truly troublesome, random, or wasteful.
I suspect that fixing expired links would improve HN's front-page story quality. I seem to recall patio11 or tptacek once saying if you really want to help out HN, you should sit on the New page and upvote stories. That's too much of a pain for me, but if I could get to pages 2-4 without friction, I'd see more non-front-page stories I could upvote. This would help good stories make it to the front page or stay there longer. So perhaps making things easier for users would also help HN. It may even make it easier for content-rich stories to compete against link-baity ones. What if fixing expired links kept HN from becoming like Reddit?
I read an article. I upvote it if it deserves it. I read the comments page. I then refresh the page before making any comments. Once I've finished reading, voting, and commenting, I close the pages and move on to the next article.
Fixing the new page is relatively easy, and it could free up more closures for the other pages, increasing the time that links live before expiring.
I also sometimes use the “filtered 2+ new page”: http://hnapp.com/filter/d3a308f2ac9a071c0bf174e0c1a8fd22 , i.e. stories that at least one other person thought were interesting. It's not so painful, and it's also good for days when a single news item saturates the front page.
Unfortunately, the user experience is quite unpleasant. I hit expired pagination links several times a day. You guys can do better.
1. Auto-assign upvotes to new stories using machine learning algorithms that learn from older, already-upvoted stories. This could fix a frequently encountered issue: good stories go ignored because too few people have time to screen new stories.
2. Instead of using ad hoc rules for ordering stories and comments, use machine learning engines again.
Some stories get flagged. These would also provide good data for supervised learning.
I understand some control would need to be exercised to prevent gaming of the system.
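The auto-upvote suggestion above could be prototyped with something as simple as a word-level log-odds model over past story titles. A toy sketch, with made-up function names and training data, and nothing like whatever HN actually runs:

```python
import math
from collections import Counter

def train_word_scores(upvoted_titles, ignored_titles):
    """Learn a log-odds score per word from past stories.

    Words common in upvoted titles get positive scores; words
    common in ignored titles get negative scores. Add-one
    smoothing keeps unseen words from blowing up the math.
    """
    up = Counter(w for t in upvoted_titles for w in t.lower().split())
    down = Counter(w for t in ignored_titles for w in t.lower().split())
    vocab = set(up) | set(down)
    n_up = sum(up.values()) + len(vocab)
    n_down = sum(down.values()) + len(vocab)
    return {w: math.log((up[w] + 1) / n_up) - math.log((down[w] + 1) / n_down)
            for w in vocab}

def score_title(scores, title):
    """Sum the per-word scores; positive means 'looks upvote-worthy'."""
    return sum(scores.get(w, 0.0) for w in title.lower().split())

scores = train_word_scores(
    ["Show HN: A fast Lisp compiler", "Understanding closures in Arc"],
    ["10 celebrity diet tips", "You won't believe this trick"],
)
print(score_title(scores, "A new Lisp interpreter") > 0)
```

A real version would need far more features than title words (domain, submitter history, timing), and, as noted, some defense against gaming.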
Yeah, actually, it is. Seriously.
Nothing makes "hackers" look worse than a hack that doesn't work.
I think this makes it more tempting to upvote something just after looking at the headline instead of after reading it, because one has to remember so much, or use so many tabs, or put up with going back to the front page and clicking next again.
I have no idea about everyone's behaviour, but maybe expired links end up also encouraging users to vote up fluff posts?
I suspect the fix will result in a large change to the code base and you want to always own/understand the code for some reason. So if you can't fix it yourself, nobody is allowed to, either. That's fine--just admit it's your baby and you don't want to let it go.
So, the only answer that fits all of the parameters is that he's trolling us.
The change to the code base would be easy: increase the expiration timeout or keep around more closures. It would come at the expense of using more computer hardware, which is an elegant way to scale: you free up your human hardware to work on more important things.
Is there any way to penalise the people upvoting a fluff link? Mods could have a 'fluff link' flag; using it would apply some karma loss to everyone who upvoted the post and put a notice on their user pages. (This is an ugly kludge. It's meant as a thought experiment rather than a serious implementation idea.)
I have made some comments on HN which should have been heavily downvoted but which got no downvotes or even upvotes. I have no idea how this can be fixed.
Some people on HN have an aversion to downvoting and flagging and will only use these tools for the most egregious examples of bad posts. It would be good if people accepted getting and giving downvotes as a useful tool in running the site.
There's another problem with people responding to people who are effectively trolls rather than ignoring or downvoting those posts. I know there are few moderators with very little time. More mods is not the answer, but perhaps 'big downvotes' for those mods would help. Applying a -5 or -10 downvote to people posting aggressive dumb comments would help set tone.
Perhaps having links to all the guidelines (including the links provided to new users) under the submission box would be useful?
This way I can only blame myself when pressing the button doesn't work as expected.
Great lesson on prioritization!
Starting from the HN main page, I first open the discussion pages of all the stories I potentially want to read in new tabs (till I start encountering older stories that I have already gone through). This process is fast enough to not encounter expired links.
For each story now, I click the link leading to the story from the HN discussion page, and get back to the discussion page simply using the back button. When done, I simply close the tab.
I'm often more interested in the discussions than in the original links, and bestcomments is a good way to jump right into them.
It'd be great if bestcomments could be migrated away from the fnid mechanism, but I appreciate this is a side project and it's low priority.
They don't have to fix it or make any improvements. They can run a very poorly performing site with no features and a horrible appearance, because they are entrenched, and they literally own quite a large portion of their userbase.
Unfortunately no matter how many people invent software that does the same thing as Hacker News but works much better and offer it up for free, nothing will be done.
The owner of the site has too much invested in its technology stack (Lisp) and its particular implementation and appearance. The improvements necessary probably require going back on some of those fundamentals, which is never going to happen, because they are too proud.
However, after seeing this:
> ... because they are entrenched, and they literally own quite a large portion of their userbase.
> Unfortunately no matter how many people invent software that does the same thing as Hacker News but works much better and offer it up for free, nothing will be done.
... I have to say, if you are disappointed, have you thought about testing this theory? This website has not been around forever. It isn't the first of its kind either. I'm probably not the only one here who came to this site after having frequented a bunch of others that were essentially the same thing. So I don't expect it to be the last one, either. If someone has the interest and the skill set, I say, please do make this. Worst case you waste a few weekends or something.
To that end I often feel it's refreshing when people say something that is not a majority opinion on this site, and don't hold back.
they simply do not care
If it's negligence that you're implying, consider a more valuable alternative. The expiration link bug is starting to look like the HN test of one's ability to tell what's important to care for. Any hacker would want to develop this ability.
Not at all. It's pretty trivial to replace code that uses dynamically generated links with code that doesn't.
Can they change the 'lasts' parameter so that things don't expire as quickly?
    ; To be more sophisticated, instead of killing fnids, could first
    ; replace them with fns that tell the server it's harvesting too
    ; aggressively if they start to get called. But the right thing to
    ; do is estimate what the max no of fnids can be and set the harvest
    ; limit there-- beyond that the only solution is to buy more memory.

    (def harvest-fnids ((o n 50000)) ; was 20000
Another idea: could you change it so that when fnids are harvested, the functions are serialized to disk (maybe temporarily), and make fns a function or something, so that if a closure had been removed from memory it could be deserialized? It wouldn't need to be a serialization solution that works for arbitrary closures, just something that works for those cases, like timed-aform or whatever.
I have barely ever tried to use Lisp and I don't know much about this system so this may not make any sense.
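Since arbitrary closures are hard to serialize, the spill-to-disk idea could work by storing just a template name plus the values the closure captured. A rough Python sketch of the mechanism; the `FnidTable` and `TEMPLATES` names are invented for illustration, and HN's actual fnid table works differently:

```python
import json
import os
import tempfile

# Hypothetical stand-in for an in-memory fnid table. Instead of
# discarding a harvested entry outright, we spill its reconstructable
# state (a template name plus the arguments it closed over) to disk,
# and rebuild the closure on demand if a "stale" link comes in.

TEMPLATES = {
    # template name -> factory that rebuilds the original closure
    "next_page": lambda user, page: (lambda: f"items for {user}, page {page}"),
}

class FnidTable:
    def __init__(self, spill_dir):
        self.live = {}          # fnid -> (template, args)
        self.spill_dir = spill_dir

    def register(self, fnid, template, *args):
        self.live[fnid] = (template, list(args))

    def harvest(self, fnid):
        """Evict from memory, but keep the state on disk."""
        template, args = self.live.pop(fnid)
        with open(os.path.join(self.spill_dir, fnid), "w") as f:
            json.dump([template, args], f)

    def call(self, fnid):
        if fnid not in self.live:           # expired? try the disk copy
            with open(os.path.join(self.spill_dir, fnid)) as f:
                template, args = json.load(f)
            self.live[fnid] = (template, args)
        template, args = self.live[fnid]
        return TEMPLATES[template](*args)() # rebuild and invoke

table = FnidTable(tempfile.mkdtemp())
table.register("abc123", "next_page", "pg", 2)
table.harvest("abc123")                     # simulate memory pressure
print(table.call("abc123"))                 # still works after harvest
```

The point is that only the captured values need serializing, not the code, so plain JSON suffices for the handful of closure shapes the links actually use.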
Another idea. Use one of the Hacker News clones that people have built over the years that doesn't have this problem.
"Why editors are 'literally' changing the dictionary" - http://www.bbc.co.uk/news/uk-23729570
Someone who was actually a slave would strongly disagree.
It makes it look a little slicker too, but I truly couldn't care less about that.
(Also, they let people scrape them and create improved versions!)
We should name this - how about "The Craigslist Problem"?
This has been there since Day 1 and will probably never be changed.
See also https://news.ycombinator.com/item?id=163696 https://news.ycombinator.com/item?id=2677469
If he wants to continue with that technique, I'd hack it like this: use reflection to crawl the closure, isolate the variables bound to the closure context, serialize those (the user id's, post id's, etc). Then the closure code, which will be one of only a few "templates", can be put into a global dictionary. The serialized state goes into the URL parameters. The closure is collected, and "reified" from the global dictionary and URL state once the user performs the action.
I don't know if Lisp's reflection can pull this off, but I'm pretty sure it's no biggie.
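For what it's worth, the "serialize the bound variables, reify from a template dictionary" scheme described above doesn't even need reflection if the server builds the links itself. A hypothetical Python illustration (the `SECRET`, `HANDLERS`, and URL shape are all made up), with an HMAC signature so the state carried in the URL can't be tampered with:

```python
import hashlib
import hmac
import json

SECRET = b"server-side secret"   # hypothetical signing key

HANDLERS = {
    # the few "closure templates" the site actually needs
    "more": lambda user, page: f"rendering page {page} for {user}",
}

def make_link(template, **state):
    """Encode the closure's captured state into the URL, plus a signature."""
    payload = json.dumps([template, state], sort_keys=True)
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"/x?d={payload}&sig={sig}"

def handle(payload, sig):
    """Verify the signature, then 'reify' the closure and run it."""
    expect = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expect):
        raise ValueError("tampered link")
    template, state = json.loads(payload)
    return HANDLERS[template](**state)

link = make_link("more", user="pg", page=2)
payload, sig = link[len("/x?d="):].rsplit("&sig=", 1)
print(handle(payload, sig))      # never expires: no server-side state
```

A real deployment would URL-encode the JSON, but the design point stands: such links never expire because the server keeps no per-link state at all.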
- “Bugs” (in this case, already seems to be known and talked about; see ).
- “Feature Requests” for anything that isn’t a bug.
This is intentional. I forget the specific rationale, however.
So no, to answer your question (though I'm not pg), it likely won't be fixed.
This particular problem isn't going to cause tons of people to leave, and fixing it isn't going to cause tons of people to join.
Unless someone does the work for free, or manages to get a large portion of the current users to quit over the issue, there's no reason to expect it to be changed.
I would definitely like to see this one sorted out.
People need to refresh the thread before they respond to it, especially if it is a day old. This ensures they have an up-to-date version of the thread and lets them see if anyone has already made their point.
People clicking next to read past the first page is good. Open any comment pages you want to read in new tabs, and refresh those tabs just before you comment.
I strongly agree that there are other problems with HN that need fixing.
On a desktop it doesn't save much due to bookmarks bar but on mobile it saves a lot of effort on starting over.
I don't think this is the case.
This fix would have a low priority on my list. I would rather have work done on improving the quality of comments and unfortunate high ranking of middlebrow comments.
I am not saying, "it's not a bug, it's a feature!" I'm just saying that's how it works.