
Ask HN: When did comments start being split across pages? - joshstrange
I searched for an announcement but I didn't find anything. On a recent post whose comments I was reading [0], I noticed a "more" link at the bottom to take me to the second page of comments. I had never seen that link before and wondered if it was new.

[0] https://news.ycombinator.com/item?id=13817557
======
nilkn
I think this is a less than ideal workaround because it breaks (Cmd|Ctrl)+F
search on large comment threads. I do this quite a bit just to see if some
particular topic has already been discussed somewhere. Actually I think the
current design is a little unintentionally insidious just because it doesn't
make it clear that any comments are being hidden, so if not for seeing this I
probably wouldn't have realized for quite a while that there was even a "More"
link to click.

------
dang
At least as far back as 2013, but we only turn it on if the site is getting
crushed. These days that happens when (a) there's a story on the front page
with many hundreds of comments (say 750+) and (b) there's higher than usual
traffic. That was the case yesterday with the big CIA/Wikileaks thread. We've
turned it off now though.

One of you will now point out that it would be better not to have to do this
(or, as someone gently put it downthread, "maybe it is time they actually
considered fixing HN"). You're right!

~~~
EGreg
Just out of curiosity, how do you crush a site by just posting maybe 100
text-based comments on the top 20 items or whatever?

I mean what exactly is the load that causes this?

If you are executing some algorithm for each request - stop. Just run it every
minute to rebuild a cache. HN discourages rapid fire replies anyway.

How hard is it to simply build an html fragment from a tree and cache it? You
just walk the tree and append each node's children in order.
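A minimal sketch of what that could look like (hypothetical names, assuming a simple in-memory comment tree):

```python
import html

class Comment:
    def __init__(self, author, text, children=None):
        self.author = author
        self.text = text
        self.children = children or []

def render(comment, depth=0):
    """Recursively build an HTML fragment for a comment subtree."""
    pad = depth * 40  # indent replies, HN-style
    out = ['<div style="margin-left:%dpx"><b>%s</b>: %s</div>'
           % (pad, html.escape(comment.author), html.escape(comment.text))]
    for child in comment.children:
        out.append(render(child, depth + 1))
    return "".join(out)

# Rebuild the cached fragment on a timer instead of per request.
cache = {}

def rebuild(thread_id, root):
    cache[thread_id] = render(root)

root = Comment("alice", "First!", [Comment("bob", "Reply")])
rebuild(42, root)
print(cache[42])
```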

OK, I can imagine MAYBE the most intensive thing is the votes and re-ordering
of replies.

First of all I hope you are using this algorithm: [https://medium.com/hacking-and-gonzo/how-reddit-ranking-algo...](https://medium.com/hacking-and-gonzo/how-reddit-ranking-algorithms-work-ef111e33d0d9?source=linkShare-a7703696fe94-1489002238)
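For reference, the "hot" formula that article describes looks roughly like this (a sketch from memory; double-check the constants against the article):

```python
from datetime import datetime, timezone
from math import log10

# Reddit's reference epoch, per the linked article.
EPOCH = datetime(2005, 12, 8, 7, 46, 43, tzinfo=timezone.utc)

def hot(ups, downs, date):
    """Reddit-style 'hot' score: log-scaled votes plus a time bonus."""
    s = ups - downs
    order = log10(max(abs(s), 1))
    sign = 1 if s > 0 else -1 if s < 0 else 0
    seconds = (date - EPOCH).total_seconds()
    return round(sign * order + seconds / 45000, 7)
```

The key property is that scores decay only relatively: newer items get a larger time bonus, so a story needs exponentially more votes to outrank a fresher one.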

Secondly, seriously once again you can batch the upvotes by thread and then
process them during the recalculation phase. You only need to recalculate
weight totals for the threads where there were upvotes - and this can be
parallelized.
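The batching could be as simple as this (a hypothetical sketch, not HN's actual code):

```python
from collections import defaultdict

# Votes accumulate here between recalculation passes.
pending = defaultdict(int)  # thread_id -> net new votes
totals = defaultdict(int)   # thread_id -> cached weight total

def record_vote(thread_id, delta=1):
    """Cheap per-request work: just increment a counter."""
    pending[thread_id] += delta

def recalculate():
    """Periodic pass: only threads with new votes get touched."""
    for thread_id, delta in pending.items():
        totals[thread_id] += delta  # real code would re-rank replies here
    pending.clear()

record_vote(1)
record_vote(1)
record_vote(2)
recalculate()
print(totals[1], totals[2])  # 2 1
```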

~~~
pvg
_some algorithm for each request - stop. Just_ [...] _How hard is it to_ [...]
_OK, I can imagine MAYBE_

It can't be any harder than not coming off blithely condescending.

~~~
EGreg
Not condescending at all. Incredulous.

That's because the solution is straightforward, the people who built it are
smart, and they've had years to do it.

If I can build the basic scalable submission and comment system in one 8 hour
day I am sure they can fix it in less than that.

Then of course you need to iterate and get the bugs out. But come on. I am
incredulous that HN is buckling under a load of 20 stories with, on average,
100 comments each. Seriously?

~~~
pvg
_If I can build the basic scalable submission and comment system in one 8 hour
day_

...

[unless this is deliberate parody in which case, you got me]

~~~
EGreg
of what

~~~
pvg
It feels like joke-explaining because it is but "I can implement this in a
weekend" is one of the central self-unaware brogrammer cliches. You asked.

~~~
EGreg
Everything I say is self-aware and the result of reflection before posting.

And btw I said 8 hours, not a weekend.

Look up Poe's Law.

------
Raphmedia
If anything they should make it so that when you click "show more" it adds the
next page to the current page instead of redirecting you to another page.

~~~
jasonkostempski
That would require JavaScript. Don't they try to avoid that?

~~~
dustinhayes
Not really, you could just load the same page with something like
?comments=all from the server.
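Something like this on the server side (a hypothetical sketch; the parameter name is just an example):

```python
def render_comments(comments, query, page_size=200):
    """Return all comments when ?comments=all is set, else one page."""
    if query.get("comments") == "all":
        return comments
    page = int(query.get("p", 1))
    start = (page - 1) * page_size
    return comments[start:start + page_size]

all_comments = list(range(500))
print(len(render_comments(all_comments, {})))                   # 200
print(len(render_comments(all_comments, {"comments": "all"})))  # 500
```

No JavaScript needed: "more" links to page 2, and a "show all" link sets the query parameter.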

~~~
jasonkostempski
That seems better actually, considering "more" doesn't really make sense when
the things you've already gotten may have shifted in the meantime.

------
1123581321
I don't mind pagination if it's for mobile performance reasons, but I'd really
like an option to disable it in my profile settings.

------
midgetjones
Maybe it's a workaround for the issue where HN hugs itself to death if
comments get over a certain number.

~~~
noir_lord
If that's the case, and given all the recent issues (including the somewhat
hilarious "Can you all log out because S3 falling over is killing us EDIT: for
clarity - Because of the level of auth'd user activity not because HN depended
on S3" stuff), maybe it is time they actually considered fixing HN.

~~~
positr0n
I've seen the mods make the "please log out to save load on our server"
announcement a few times on really busy threads.

My understanding is it wasn't because HN depends on S3 but because everyone
was on HN at the same time talking about it!

Edit: my Google/HN search fu is failing to find a reference, but they can't
easily horizontally scale HN because it is running on a single Arc process.
Comments and submissions are stored in text files, and the "More" links are
actually stored in memory in closures. When the closures get garbage collected
you get the "Unknown or expired link." error.
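The closure trick can be illustrated like this (a sketch of the general technique in Python, not HN's actual Arc code):

```python
import secrets

# Server-side table of pending continuations. In Arc these are real
# closures; old entries get pruned, hence "Unknown or expired link."
fnids = {}

def make_more_link(comments, offset, page_size=30):
    fnid = secrets.token_hex(8)
    # The closure captures where this reader left off.
    fnids[fnid] = lambda: comments[offset:offset + page_size]
    return "/x?fnid=" + fnid

def follow(url):
    fnid = url.split("fnid=")[1]
    fn = fnids.get(fnid)
    if fn is None:
        return "Unknown or expired link."
    return fn()

comments = list(range(100))
url = make_more_link(comments, 30)
print(follow(url))   # comments 30..59
fnids.clear()        # simulate the table being pruned
print(follow(url))   # "Unknown or expired link."
```

The upside is that pagination state never has to be encoded in the URL; the downside is exactly what this thread describes: the state lives in one process's memory, which makes horizontal scaling hard.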

~~~
foxylion
I'm really curious why the backend has been in that state for so long. These
days everything is scalable, yet HN's performance is limited to a single
process? And the complexity shouldn't be too high to migrate it to another
framework when the current one doesn't support a multi-process architecture
by design. Am I missing something?

------
alva
It seems to have been implemented after the 2016 election threads. I remember
dang being frustrated at the page loading times on threads with thousands of
comments.

------
krapp
It could also trim comments after a certain depth, like Reddit does.

HN may do that, but I've only seen it paginate by the number of subthreads.
The bulk of new traffic in a thread is likely to be replies within existing
subthreads. This would also make thread folding faster, because there would be
a hard limit to the depth of any thread on a single page.

------
dmbtwy4dmbcmmnt
It's been this way for years. It's trivial to use the site search to pull up
threads that have hundreds of comments. That doesn't prove whether those
comments were spilling onto a second page back then, or whether you're seeing
pagination applied retroactively, but IIRC this has been going on since at
least as far back as 2013.

------
chippy
I'm sure some smart developer will write a browser extension to grab the next
page and append it to the current one.

------
drwl
I saw this too. So far I've encountered it on Who's Hiring threads.

~~~
ryandrake
It's extra painful on Who's Hiring threads because, think: what do we all do?
Full-page keyword search, right? Now you have to check whether there's another
page to search.

------
toephu2
I noticed it 2 days ago, and there's no option to change it back. It really
sucks when doing Ctrl+F to search for stuff. Please give us an option to turn
pagination off.

------
snowpanda
I prefer it the old way, but I understand why they might have changed it.

