Tell HN: Clickable domains and other new features for story quality
306 points by dang on Sept 15, 2015 | 155 comments
Here are some experimental new features to help improve story quality on HN.

We've adjusted the dupe detector to reject fewer URLs. If a story hasn't had significant attention in about the last year, reposts are ok. That's been the policy for a while, but we've brought the software closer to it. It will still reject reposts for a few hours, though, to avoid stampedes. Allowing reposts is a way of giving high-quality stories multiple chances at making the front page. Please do this tastefully and don't overdo it.
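A minimal sketch of that policy as code (the window lengths and parameter names here are illustrative guesses, not HN's actual thresholds or implementation):

```python
from datetime import datetime, timedelta

# Illustrative windows: "about the last year" and "a few hours".
ATTENTION_WINDOW = timedelta(days=365)
STAMPEDE_WINDOW = timedelta(hours=8)

def allow_repost(now, last_posted, had_significant_attention):
    """Return True if a repost of this URL should be accepted."""
    if last_posted is None:
        return True  # never submitted before
    age = now - last_posted
    if age < STAMPEDE_WINDOW:
        return False  # too soon after the last post; avoid stampedes
    if had_significant_attention and age < ATTENTION_WINDOW:
        return False  # still counts as a dupe
    return True  # story got no real attention, or it's been long enough
```
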

When reposting, please don't delete the earlier post. Deletion is for things that shouldn't have been posted in the first place, such as if you regret having said something publicly.

When a story is a duplicate—that is, has had significant attention on HN in the last year or so—it's helpful to post a comment linking to the previous major thread, so users and/or moderators can flag the dupe. In addition, when a URL isn't the best source for a given story, it's helpful to post a better URL in the thread. We often see those and change the posts to use them.

Both these practices are common in the HN community and make a big difference to story quality here. Thank you all! The following features are intended to make them quicker to do. We built them to make moderation easier for ourselves, but hope they'll be helpful for community moderation too.

First, you can click on a story's domain to see the previous HN submissions from that site.

Second, when you're logged in, stories on /newest and on /item pages have 'past' and 'web' links. Click on 'past' to search HN for previous stories with that title. This helps with finding duplicates. Click on 'web' to go to a Google search for the story title. This helps with finding better sources and catching spam.

Finally, when a story is the first post from a site, logged-in users will see the site name in green, by analogy with the green usernames of noob accounts.

These really are experimental and if any proves unhelpful, we'll toss it. We want HN to stay simple and coherent and not just be an agglutination of features. Your feedback will mostly decide what we do, so feed away!

Edit: ok, we tossed the green sites. More people disliked than liked them, and the same information is available just by clicking on the site name anyway.

The "past" link is a nice feature, but what I'd really like to see is a way of weeding out duplicate stories before they're posted. For example, today there were 9 similar stories posted (so far) about Facebook's new "dislike" button[1], causing none of them to receive a significant number of upvotes. Could we have a feature in the "submit" dialog that displays similar stories in the last 24 hours and then asks the user if they really still want to submit theirs?

[1] https://new-hn.algolia.com/?experimental&sort=byDate&prefix=...

When a story gets posted in many versions, most are knockoffs. If a knockoff has made it to HN first, I wouldn't want someone to hesitate to post a more solid article. I'm also not sure how to define "similar stories". Your link picks out the word "dislike" to search for, but that essentially encodes the answer to the hard part in the question.

I do agree that it would be good to weed out variants of the same story (or better, merge them), and we're open to working on that. But today's features are more modest—nothing more than a way of easing the manual work that many of us are already doing. Not least yourself!

How about grouping stories covering the same current event/topic and ranking them by an aggregate score? This would keep the front page clean while also giving the event/topic a more appropriate rank (i.e. one higher listing rather than 2 or 3 separate lower ones).

  6. SOPA making a comeback  	
    217 points | guardian.com 46 comments | eff.org 27 comments | motherboard.vice.com 13 comments
or perhaps:

  6. SOPA making a comeback  	
    217 points 76 comments
      SOPA Lives on for the International Anti-Counterfeiting Coalition (www.eff.org)
      113 points by DiabloD3 2 hours ago | flag | 41 comments 
      Google, Facebook, Twitter and Yahoo Claim MPAA Is Trying to Resurrect SOPA (www.theguardian.com)
      89 points by runesoerensen 3 hours ago | flag | 23 comments 
      An Undead SOPA Is Hiding Inside an Extremely Boring Case About Invisible Braces (motherboard.vice.com)  
      12 points by lizzard 15 minutes ago | flag | 12 comments  
Groupings would be created on-demand by the editors in response to user flags.

Within the group you could list the links in order of their individual scores. The title of the group could come from the highest ranked or could be manually chosen by editors. You could have separate comments pages, but also a group comment page which is just the union of all the comments.

Later you could take this a step further and include historical posts in the list, thus avoiding redundant comment streams (but only use new votes and new comments for ranking the group -- it has to freshly earn any reappearance on the front page).
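One way to read "aggregate score": order the group's members by their individual points and rank the group by the sum. A sketch using the mockup's numbers (data structures are made up, not anything HN has):

```python
def group_rank(stories):
    """Order stories within a group by individual points (descending)
    and compute the group's aggregate totals for front-page ranking."""
    members = sorted(stories, key=lambda s: s["points"], reverse=True)
    return {
        "title": members[0]["title"],  # or manually chosen by editors
        "points": sum(s["points"] for s in members),
        "comments": sum(s["comments"] for s in members),
        "members": members,
    }

sopa = group_rank([
    {"title": "SOPA Lives on ...", "points": 113, "comments": 41},
    {"title": "Google, Facebook ... Resurrect SOPA", "points": 89, "comments": 23},
    {"title": "An Undead SOPA ...", "points": 12, "comments": 12},
])
```

The group then competes for front-page position on its combined score while each member keeps its own thread.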

If and when they add thread folding, folding the story links themselves might be a logical extension of that. You could have the top-ranked story and, under the fold, the last, say, five related.

The discussions could likely be put on one page as well, allowing you to reference the other articles without having to add links to other similar articles or other comment threads. That'd be pretty cool.

I think this is a great idea; you just put a "group" flag that we can click on new stories, and the context should be enough to know which submissions to group. Limiting them only to new also avoids the urge to just group all resubmissions, create ongoing topic threads, etc.

I quite like the way Stack Exchange displays a list of similar questions when you're submitting a new question. It doesn't stop you from posting, but it does give you pause if you didn't know the question already existed.

Yes, Stack Exchange really got this right. Many times I've been searching for the answer to a question on Google, even with site:stackoverflow.com, found nothing, and eventually given up and decided to post the question myself. Then, after I've written half of the question, Stack Overflow finds a previous post about this very topic which answers my question.

I also experienced this. I believe it's due to the fact that I phrase my questions differently from how I phrase my searches. :)

What about this for a sketch of an idea:

Give us the ability to cite 'other sources' for the same story (ideally it should be smart enough to dereference other HN submissions if that's passed in). Then mods can, if they so desire, merge comments from separate discussions into one pool.

Finally, for the alternate links, you could consider allowing users to vote the best source. I say 'consider' because I'm a bit iffy as to how it might work out. I don't remember any significant controversy over which story was best and mod rule might be for the best. There are also issues with how you score a poll when some options may not have existed during any given vote and HN tends to use approval voting in polls.

I think a small bibliography-type feature would be interesting. So, a story on topic X hits the front page. Instead of killing the next 5 stories on topic X, find a way to cite 2-5 of them as a bibliography on topic X.

Point is, the first story on topic X has a random element. Killing all other stories leads to monoculture.

The legitimate issue is that having 5 of 30 front-page stories be topic X drowns out other discussion. The solution is to make the bibliography deeper rather than spam the front page with the bibliography-type stories.

Some way of aggregating the stories and their votes would help. Almost certainly manually. But a way of tagging and linking similar submissions under a single link, with grouped votes and comments.

One thing I've appreciated about Ello is that its re-shares of posts aggregate activity rather than divide it. In terms of increasing apparent activity, it's a tremendously useful feature.

Second this. I feel like I've seen more dupes lately as people rush to get a hot story in. The past and web links will help for sure but it won't stop the submitter unless they're prompted.

That being said, these are all great features. More information access without being overly cluttered is a huge win for everyone.

> I feel like I've seen more dupes lately as people rush to get a hot story in

Hard to say, of course, but this might be sample bias, not counting a bug we had briefly yesterday. From what we see, the community has been doing a lot more to track down duplicates lately. That's what inspired this new work.

That's fair. I tend to browse /newest so I'm probably seeing them before they get cleaned up from on high.

What is the difference between new-hn.algolia.com/?experimental, and just regular hn.algolia.com ?

"Experimental" changes the layout a bit. You can try it by going into the settings (icon at upper right) and changing "Style" to "Experimental".

I think that at this point, "hn" gives you the same thing as "new-hn". I just have an outdated link in my bookmark.

Any chance we can have collapsible comments without a greasemonkey script, or bookmarklet?

This is an absolutely bare-bones feature for HN. It would make the site so much more usable in general. The big case for collapsible comments relative to HN is stories where the top voted comment is only semi-related to the article (usual suspects being security, software freedom, politics, ...) but spawns its own megathread and drowns out any hope of other discussions more germane to the original post.

Even if they don't collapse, it might be nice for first-level comments to have some kind of distinguishing visual context. To get at another thought: comment number 2, 3, etc. is often buried under sub-comment discussion minutiae when a top comment has 30 or 40 sub-comments.

You should be able to scroll a page and pick out the top 5 original comments, even without collapsing.

Yes. What do you think of my https://github.com/alain94040/arguably, which is an extreme take on your suggestion?

Interesting. Some of this fits in with my "non-brief moderation thoughts" essay: https://redd.it/28jfk4/

In particular: no content-ranking system is perfect. Some level of random ordering (within a range) is probably actually preferable.

Also, Aeon on randomness:

"When your reasons are worse than useless, sometimes the most rational choice is a random stab in the dark" http://aeon.co/magazine/philosophy/is-the-most-rational-choi...

Official word is that it'll be the "only major change to the Hacker News UI that we're committed to"


This seems to be the most obvious thing they're missing. I can't stand trying to read through HN comments without a greasemonkey script.

Quite this.

One of the problems with collapsible threads is that they reduce the number of people who are prepared to flag and downvote comments that shouldn't be here.

Having the ability to not just collapse threads, but to auto-collapse threads which don't contain any new comments would make it (user scripters: "and does make it") easier^H^H^H^H^H^Hpossible to find new comments on repeat visits, putting more eyes on them.

I'd like to suggest a very basic fix that will help new users.

People need to know about rules before they can reliably follow them. And Hacker News makes that surprisingly hard for new people.

Above the submit button, there is a line that says: "If you haven't already, would you mind reading about HN's approach to comments?"

I wanted to make sure I followed the rules, so I clicked the hyperlink and began reading a new page. It began, "Hacker News is a bit different from other community sites, so we'd appreciate it if you'd take a minute to read the site guidelines."

I thought that was the page that I had just clicked through to, so I continued reading. And it seemed like I was right: I learned about the rules against crap links, rudeness, etc.

Unfortunately, I was wrong. The first link took me to the 'welcome page' — which provides some guidelines for using the site, but not all of the rules. If you want to learn the rest of the rules, you have to know to click on another hyperlink to reach the site guidelines page. I found this counter-intuitive, and I doubt I'm the only one.

If you want people to learn the rules, please consider placing them beneath the existing text on the welcome page, so everyone will realize they exist. Or, at minimum, place a link to the guidelines page directly above the submit button, instead of just sending people back to the welcome page. And maybe add a "Rules" tab to the top navigation bar.

Thanks for your help!

Ok, we changed the text of both https://news.ycombinator.com/newswelcome.html and the "would you mind" blurb to make it clear that the HN guidelines are an additional thing to read.

Nice job! :) Thanks for responding to my feedback.

Very good improvements!

However, the green-URL-for-noob-domains may be problematic. The green is used to give an indication for potentially low-quality content, but public startup/project launches would all fall under the noob-domain heading.

> The green is used to give an indication for potentially low-quality content.

I wouldn't say that at all! It's just interesting to know what's new.

We've been displaying new sites in green to moderators for a while now. When a high-quality story comes from a brand new (to HN) site, that's interesting. The converse, too: it's interesting when you can see that an obscure site isn't new to HN (because it's not green). Often someone posted it like 6 years ago and it got no upvotes; or sometimes there was actually a major thread. Now you can click on the domain to find out. HN has a rich history that's fun to explore.

Edit: I'll give you, though, that green sites and green usernames have higher variance. The good ones are really good (think of the author of a story showing up in the thread to discuss it), while the bad ones are more likely to be spam. I think it's fine to call attention to it in either case.

I'll argue against the green domains on slightly different grounds: the color is being applied as a sort of caution or warning sign, and from a UX perspective green is about the worst possible color for that sort of thing. Green is positive; green means "go," not "caution" or "warning" or "stop."

When I first saw the green domains I thought HN was trying to tell me that these domains were particularly good, as in sources that have yielded a lot of high-karma stories in the past or something. It never occurred to me that it could mean they were new or iffy until I read this thread.

I'd suggest using yellow instead, as (in the US, anyway) yellow has a strong association with caution and warning signs, without the strong "do not proceed" vibe of red.

(And the same logic would probably apply to colors on usernames, come to think of it. Green should be for usernames that HN wants to call out as MVPs for some reason. New accounts could be yellow, to indicate that they don't have a track record yet.)

As an HN user, I've never thought of it as a warning sign. And "green" can of course mean "new" as well as "go".

The color choice is moot, though, because green usernames have been around for so long that we're not likely to change them.

If people really don't like the green sites, we're happy to plunge them back under the moderator-only covers. But maybe let's try them for a while? In practice I think they're pretty cool.

I agree. And even if green does signify "good", I don't think that's a bad thing here. I'd rather be drawn towards fresh sources first. Those items are likely to require more oversight, and voting (and potentially flagging) as well.

Red is more auspicious than green in China, though green is also generally favourably viewed. Except for hats.


It's actually supposed to be a pun -- while "green" often refers to the color, it also can mean "young or unripe" (fruit) or "inexperienced, naive, or gullible" (person), such as in the phrase, "a green recruit fresh from college".

I think it's meant more for regulars -- a gentle reminder to be nice in the comments because green users might not be as familiar with the norms -- but I can see how the color choice might be confusing for new users.

I would argue against green usernames as well. I think it appears to be a stronger signal of quality than it is (like karma, which if it works should be enough.) Hacker News needs fewer such filters, or at the very least more intelligent ones. If noob accounts act out, downvote and correct them, and if they continue, someone will ban them.

I found green usernames to be useful for voting -- I'll upvote anything by a green user that's a reasonable post a normal functioning human would make, and if it's something to downvote, you can check their posting history and often there's stuff worth upvoting to cancel out the downvote.

I realise this gets complicated, but showing some level of stats for domains could also be interesting. Possibly:

● Submissions

● FP/Submissions ratio

● Most recent submission

● Submissions/interval (day, month, year).

● Ratings and comments scores -- what sites typically get high discussion and/or high comments?

Might want to trial some of this internally. But I'm sure the data itself would be fascinating.

I've been assuming that green usernames indicated some kind of super user. That's because the green usernames stand out so much compared to all other usernames.

Maybe add a link to 'cached' in addition to 'web'? You could either do your own caching when the link is submitted, or use a service. The link will be useful immediately if the site goes down under load, and keeping an archive will keep the comments comprehensible for future readers if/when the link rots.

I really like this idea from a historical preservation perspective.

HN comments contain a lot of a value in aggregate, and it would be a shame to lose necessary context to exploit this value due to simple link-rot.

However, I can see issues arising with paywalled links. The HN cache would likely display a rather useless paywall for many of the most popular stories. Navigating around the paywall by technical means may present an IP issue.

What do you think about HN auto-submitting to archive.org?

Archive.org does follow some rules (e.g. robots.txt and passwords as you mentioned), so not everything would be guaranteed.

It could be a good start for a lot of the content here, though.

EDIT: Forgot to add that I'm not sure how much it'd help with sites going down from the HN attention. I could easily see archive.org moving a bit slower than HN's users.

> HN auto-submitting to archive.org

Can you describe this in more detail?

Not the same person, but I assume they're referring to the "Save Page Now" function of the Wayback Machine (https://archive.org/web/).

This tells Archive.org's crawler to immediately process a page and add it to the Wayback Machine's cache. Unfortunately there's no public API for this, but it is possible to programmatically submit a request to their endpoint and scrape out the resulting archive link (and I have code that does this, if that would help).
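For reference, a sketch of that kind of request. This assumes the observed (undocumented, as noted above) behavior of the `web.archive.org/save/` endpoint, where the archived copy's path has historically come back in the `Content-Location` response header; it may change without notice:

```python
import urllib.request

SAVE_ENDPOINT = "https://web.archive.org/save/"

def save_page_now_url(url):
    """Build the Wayback Machine 'Save Page Now' request URL."""
    return SAVE_ENDPOINT + url

def archive(url):
    """Ask the Wayback Machine to crawl `url` immediately and return
    the archived copy's URL, if the response reveals it."""
    with urllib.request.urlopen(save_page_now_url(url)) as resp:
        loc = resp.headers.get("Content-Location")
        return "https://web.archive.org" + loc if loc else None
```
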

This would be an excellent start IMO.

We worked on something like this for a while last year (code name "the archivist") with the intention of making Readability-style versions of stories with plain text, major images and no cruft. The purpose of the experiment was to see if it would speed up moderation. If we kept it, we hoped to share it with everybody (where by "hoped" I mean "would have unless we couldn't"). In the end, we didn't keep it because it didn't speed up moderation and it is one of those problems that turns out to be increasingly nontrivial the closer you get.

If we did anything like it again, I'd still hope to share it with everybody, but perhaps not by adding a third link. I already feel bad for adding two.

Sorry to see you've not kept up with this.

I access HN on a few devices, including some whose rendering of "modern" (that is: broken) site designs is at best poor. Frequently no content is visible, either due to text not appearing at all, or being completely obscured by other elements. A Readability view, stripped of cruft, would be excellent for this. I'm aware of issues with site referrals, copyright, etc., but really, it would be helpful.

Otherwise: Internet Archive and Coral Cache are both existing systems which can and do cache some content, on request. IA seems to like having hot stuff fed them, CC have been quite spotty in reliability over the past year or two (both not properly caching content, and simply not responding).

I'm open to working on it again. Qua user, I would love to be able to view Readability-style versions of stories quickly. And think of all the analytics people could do on a near-complete archive of all HN stories.

But it's a matter of priorities. Had it sped up moderation it would have both paid for itself and made certain campers happier. But it didn't turn out that way. Beyond that, technically it's a nontrivial problem to get working on the full range of content, and then there are the nontechnical obstacles. We wouldn't do it without being sure we could release it.

Sending requests to Internet Archive might be an option if they'd be ok with it, but that of course would only help with caching, not decrufting.

Totally understand on all points.

Caches on their own would be totally worthwhile, even without de-crufting. If IA are up for it, HN as signal for relevance would likely be worthwhile. Talk to Brewster.

As I said, decrufting/readability would be a really nice value-add, personally. Readability themselves have an API for this, which might be one way to approach the concept, and they've done much of the heavy lifting in terms of sorting out sites' various CSS/HTML cruft and sanitizing it. I do my own pretty significant CSS restructuring locally (we've chatted about this before w/ HN), and with some 1800+ individual sites' CSS modified to some extent or another, I've got a really good idea of just how effed up the stuff can be.

I totally agree with Nicolás Bevacqua's "Stop Breaking the Web" posted yesterday.

But on an effort/reward basis as a greenfield project, likely not worth it. Going with Readability (or Instapaper, or Pocket) themselves could well be worth investigating.

As a suggestion: another consideration would be to simply reject submissions which aren't accessible via some putative minimal client. If enough aggregators started penalising sites for inaccessible content, they might start wising up.

I guess I need to comment more so I can at least get to 30. :)

So far I like the feature that new links are posted in green. That will give new sites the opportunity to stand out against the larger, common sites posted (like bbc, wired, nautilus, etc). I just hope some users won't scurry around for new sites to take advantage of this new feature.

Thanks for the changes. I'll definitely send feedback along the way.

I'm not sure how the UI for this would work, and perhaps it's best left as an idea for a browser extension, but I like the idea of showing a selection of the five or so most recently submitted stories on the homepage, separate from the main articles. It wouldn't be enabled by default, only when an option is selected in the user profile.

I never visit the new section of HN. I really wish I did but I always forget about it. However, if there was a little section for new articles on the homepage (perhaps at the bottom or the side of the page in a box) I'd definitely check some out and upvote the interesting articles.

You wouldn't even need a browser extension in Chrome, as it supports "greasemonkey"-style user scripts natively: any `.user.js` file you drag onto the "chrome://extensions" page is converted to an anonymous Chrome extension automagically. YMMV with other browsers, of course.

I have the feeling that creating a new domain is going to be a way of growth hacking one's story toward the HN front page. Green is a big bump in a world of greys.

Fluff tends to get flagged off the front page pretty quickly, so it may not turn out to be a problem. But if that does start to happen, please let us know. We can turn it off, or maybe turn it off on the front page.

I'm surprised that the green sites turned out to be controversial because I have found them helpful as a moderator and a reader, but in retrospect I can see why a visually noticeable change would be. We're not attached to this feature.

This morning, I just felt that the green was striking and drew my attention to those articles. Admittedly, a lot of it could be my habits and expectations. But I know I wouldn't ordinarily have focused on them based on other factors.

I'm not being controversial, though now that I think about it, there is a shift in the semantics of the color: it used to be reserved exclusively for people, now it is also being used for "things". That may account for some of the undercurrent for resistance to change.

I can see why it is helpful for moderation. I guess the real question is does the green spike the level of participation in moderation activities? As a reader, it doesn't really highlight the sort of information that I make my decisions on.

We just turned it off. These were all experiments, the green sites generated more negative than positive feedback, and it's fine for an experiment to fail.

Also, the same information is available by clicking on the domain: if you see only that story in the list, it means there has only been the one submission from that site.

Was there any change in behavior...other than a new target for complaints? I'm curious about how much editorial activity users do, particularly since mine seems erratic at best.

Maybe it could be a profile option at 50 karma [aka badge]. Even at the risk of people saying HN is turning into StackOverflow, gamification for editorial engagement may be worth pursuing.

It was too soon to have observed any change in behavior. But the feedback was more negative than positive, and it did visually disrupt the front page—not something we want to do without a strong reason.

I don’t mind the use of green (as compared to some other color) to indicate 'new site', but could you tone it down a little bit, or give users the option to disable it? On an otherwise neutral page, the green stands out and is quite distracting.

Maybe use something like #6a8966 (http://www.colorpicker.com/6a8966)?

I agree that the classic noob green is a little loud, but wow, your eyes are way sharper than mine, or we're using different devices, or both.

Maybe we can tone it down a bit though.

I want to be able to notice it if I look carefully, but not be distracted by it if I don’t care.

Right now the green screams “look at me” and unbalances the whole page. http://i.imgur.com/GhUfRn5.png

Here’s my proposed color, which is still plenty noticeable if you look for it, but now subtle enough to not stand out: http://i.imgur.com/vjW5DYT.png

I was going to ask to make it more prominent!

Maybe an option like the top bar colour?

Avoiding option proliferation seems important. It might be better to just toss the feature.

Edit: we tossed the feature. I'll edit the OP.

Can this be built into HN? If not, why not?

Hoping for updates on:

> But we'll think about it.

Clicking the domain to see other submissions from that site is the best feature HN has added in years.

Agree! Thanks 'dang and gang.

Agree... a good discovery tool, somewhere between /new and the front page.

Edit: hope the software prevents PR/upvote rings from abusing it.

Very good improvements, thanks.

I think the dupe detection would be even more useful if done during submission.

The dupe detection software, of course, does run during submission. Clicking on a search link is something that humans have to do, though. That's for catching duplicates that escape simple URL matching.

Writing software to identify which URLs are really about the same thing and which URLs are not is a nontrivial problem. I'd love to work on solving that in the general-enough case to be useful for HN, but we shouldn't let that stop us from doing incremental things to make life easier in the short run.
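The "simple URL matching" half is the tractable part. A hypothetical canonicalizer (not HN's actual detector) that catches the easy near-dupes, such as scheme differences, a `www.` prefix, trailing slashes, and tracking parameters, might look like:

```python
from urllib.parse import urlsplit, parse_qsl, urlencode

# An illustrative, incomplete list of common tracking parameters.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign",
                   "utm_term", "utm_content", "ref"}

def canonicalize(url):
    """Reduce a URL to a canonical form so trivially different
    submissions of the same page compare equal."""
    parts = urlsplit(url.strip())
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    path = parts.path.rstrip("/")
    query = urlencode([(k, v) for k, v in parse_qsl(parts.query)
                       if k not in TRACKING_PARAMS])
    return host + path + ("?" + query if query else "")
```

Deciding that two *different* canonical URLs cover the same story is the genuinely hard part referred to above.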

Possibly you could split submission into multiple steps? The user first submits just the URL. This returns the results of the dupe detector (if any). They then confirm submission (or resubmission) in the next step.

The first submission could also test the URL, and pre-fill the title field with the actual title of the page. The user then edits the title (if desired), and confirms to submit the page. Or bails out when they realize the submission already exists under a duplicate URL.

This could be done within a page with AJAX, or kept as it is with multiple pages. In either case, you'd leave the final choice of title and URL with the user, and are only offering them more information (and an easy option to cancel) during the process.
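Pre-filling the title is straightforward with standard-library parsing. A sketch only; real pages would need charset detection, timeouts, and error handling:

```python
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collect the text inside the first <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title" and not self.title:
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def extract_title(html):
    """Return the page title to pre-fill the submission form."""
    p = TitleParser()
    p.feed(html)
    return p.title.strip()
```
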

I suspect what kawera meant is if the submission form notified you that the URL is a dupe via ajax, so that then you wouldn't have to bother to copy and paste the title (which can be a pain on phones and tablets).

In that case I'm not sure I get it. Care to explain in more detail?

A submission requires a title and URL. It would be nice if we could paste in the URL and see a green checkmark if the URL is not an invalid dupe or a red x if the URL is an invalid dupe. Then we don't have to copy and paste (and sometimes reformat) the title if the submission is an invalid dupe.
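That check could even be done against the public Algolia HN Search API rather than HN itself. A sketch, with a made-up "significant attention" threshold; the endpoint and `restrictSearchableAttributes` parameter are from Algolia's public HN Search API:

```python
from urllib.parse import quote

def dupe_query_url(url):
    """Build an Algolia HN Search query restricted to the url field."""
    return ("https://hn.algolia.com/api/v1/search?"
            "query=" + quote(url, safe="") +
            "&restrictSearchableAttributes=url")

def is_dupe(search_response, min_points=20):
    """Given a decoded Algolia response, decide whether any prior
    submission got 'significant attention' (threshold is made up)."""
    return any((hit.get("points") or 0) >= min_points
               for hit in search_response.get("hits", []))
```

The submit form would fetch `dupe_query_url(url)`, decode the JSON, and show the checkmark or x based on `is_dupe`.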

While I can't speak for the OP, I had been on the site for well over a year before I realized that you could search through past HN submissions at all (even though the search box is at the bottom of basically every page).

As far as I can tell, neither /submit nor the guidelines mention that you should do a search first before submitting.

Then, once you assume you should do a search before you submit, the idea is that such a search could happen on the same page as submit, as soon as you finish pasting a URL, and before you hit submit.

It's like when you go to a support site and type in a question, and it tries to give you answers by searching for other instances of that question (or by searching a knowledge base), before letting you make a new support ticket.

I was referring to the new "past" functionality running automatically on the submit form, on focus change for example.

Ok, I get it now. HN is extremely light on the Ajax, so it would be a major design change at that level. If we decide to take that step, implementing this feature would be reasonable. But we probably wouldn't do it for one feature.

That's pretty minor, how about a mobile style sheet?

Please don't make the dupe detector "too perfect". Fact is good content sometimes doesn't get noticed first time it is submitted.

It is common for re-submissions (sometimes with a better title) a few hours or days later to make the top of the front page.

> good content sometimes doesn't get noticed first time it is submitted.

Amid all the discussion about catching dupes I can see why this wasn't clear, but the point of these changes is to let more reposts through. That's the motivation for what we've released today (see "We've adjusted the dupe detector to reject fewer URLs" above). Every good story that goes unnoticed is a loss for HN.

What about a way to merge stories that are duplicates but with slightly different URLs that didn't get caught in the filter? Perhaps comments from all the articles could get merged together under the post with the most upvotes.

We do exactly that when users find the duplicates and link to them from the thread, as well as when we find the duplicates ourselves.

Making higher-octane software that can automate more of this has been on our list for a while, but it's hard to know when we'll get to that.

Minor feedback: if you look at "past" for a domain that's a YC company that often posts jobs to HN, the results are littered with those postings rather than other content from the site, which may be more relevant.

Good catch. Let me see if we can exclude job posts. Do you have a good example you could link to?

Edit: this should be fixed now. Is it?

yep yep! thanks dang :)

(But it isn't fixed for the cases where bad boys and girls submitted job ads as regular stories. That would require a job ad recognizer.)

I didn't realize people did that! I only checked my own company (vanity!) – I hope we're not guilty of this, if so, I'll raise hell :)

They do it without meaning to. People building startups don't always have time to re-learn the intricacies of HN.

Yeah, it's always good to assume good faith :)

Are those penalised (perhaps on a submitter basis)?

A separate flagging feature (jobs ad) might help? Or just plain flagging.

When job ads appear as regular stories, users should certainly flag them, whether they're by YC companies or not.

I meant more that moderation actions (upvotes, flags) seem to be disabled after a certain time. Surfacing crap/spam with improved search seems highly likely. It's another thing that happened on Ello.

Actually, their fix was pretty slick: spamming profile posts don't show up in normal search.

New feature idea: At some point, merge duplicate posts and their comments into a single master post. Create forwarding links as needed.

What does everyone think about links opening in a new tab rather than the parent tab? I'll often find myself getting off on a Wikipedia tangent (or some similar learning loop) and HN will be several back buttons back. I now open every link in a new tab manually to solve this problem.

No, please. Managing your tabs is not the site's job; it's yours or your browser's, however you choose to split that labor. In most modern browsers, manually opening a new tab simply means middle-clicking or Ctrl+left-clicking instead of regular clicking. And if you really want it to automatically apply to all HN links, that's a great use case for a userscript.

In application design, external links tend to open in a new tab (see Facebook, Twitter, etc.). It's been a standard on the web for almost a decade now.

My feelings are mixed on this.

On FB, I'm usually glad that links open in a new tab (well, I middle click them anyway, but sometimes I forget) because if they didn't, I'd have the huge problem of trying to infinite scroll my way back to where I was.

On just some random business website, OTOH, I usually take it to mean the business owner (or developer) thought their site was so important, it should make me keep it open while I go to another link.

I usually middle-click on HN anyway, but I feel like it's sort of nicer in a way, to be in control of that. Maybe HN feels like more of a page than an application to me because it doesn't use infinite scroll / a ton of custom Javascript behavior?

I'm inclined to agree that web _applications_ should open external links in a new tab, but I'm not sure it's best for HN.

Absolutely not.

Any chance of changing the table background color from #f6f6ef to something that has sharper contrast with white page background? I look at comment distance with this border to find new top level comments, but it's hard when one is white and one is light grey.

It does seem odd that the only thing people can change the color of is the top bar (and not even the text color within the top bar), and that this is somehow worth being a karma perk. Just being able to set a default text size would solve readability issues for a lot of people.

Dup detection didn't seem to be working a couple days ago. Bug or just the changeover? Details: https://news.ycombinator.com/item?id=10211499

It was a short-lived bug we introduced while working on this. https://news.ycombinator.com/item?id=10216280

Hence the 'avoid stampedes' part I mentioned above.

Pretty please fix the URL parsing so it doesn't include the angle bracket after a link, like this URL <http://google.com/> embedded in a sentence.
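(HN's parser is written in Arc, but the fix amounts to trimming trailing delimiters after the URL matcher runs. A rough Python sketch; the function name `trim_url` and the delimiter set are illustrative, not HN's actual code:)

```python
# Characters that commonly wrap URLs in prose (angle brackets,
# closing parens/brackets, sentence punctuation) and should not be
# treated as part of an auto-detected link.
TRAILING_WRAPPERS = '>)].,;:!?\'"'

def trim_url(candidate: str) -> str:
    """Strip trailing punctuation that a greedy URL matcher swallowed."""
    return candidate.rstrip(TRAILING_WRAPPERS)

# e.g. the matcher grabbed "http://google.com/>" out of "<http://google.com/>"
print(trim_url("http://google.com/>"))  # http://google.com/
```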

@Dang, I once posted a link and then someone reposted the same link but added /# to it. HN treated that as a new URL, and that link (because of karma, I think) got to the front page. Any solution to that?

That's the flip side of what I said above: "allowing reposts is a way of giving high-quality stories multiple chances at making the front page". It's pretty random which particular submission ends up making it.

We've thought about some form of karma distribution across every user who submitted the link, but that gets complex quickly and I'm not sure it's a good idea. In the meantime, the best approach is to realize that any individual submission is subject to randomness (i.e. /newest is a lottery), and the reliable way to gain karma over time is simply to enter more often, i.e. post more high-quality stories. That's best for the community too!

Nice to see some new features! I hope it's not rude to ask if there's a special reason the URLs in Ask HN submission text aren't hyperlinks?

That one's in the FAQ ("How do I make a link in a question").


I'd like to see stats of first time URLs pre and post greenification.

I suspect we may see more green links on front page than prior... just a hunch.

We'll watch for that. If people start to abuse it we'll deal with it somehow.

Low-quality posts tend to get flagged, though, so it's also possible this won't be a problem.


To be clear I didn't mean to imply that would happen. I was more interested in the psychology of it. And I thought the data would be fun to look at.

But thanks!

What I really miss is the score against comments.

This change just broke the Chrome extension Georgify for HN.

Would love the ability to fold comment threads.

Why Google and not DuckDuckGo?


Even better if we could config it in the settings page (like topcolor...)

These other features you've added are good. But years after I started using HN, I still accidentally downvote when I want to upvote because the mobile experience is...well, you know. You all probably often use HN on a mobile device.

Please, please, please, can we have a mobile-friendly layout?

I've also done this on desktop. The ability to change your vote, possibly only for a limited amount of time after casting it, would be far more valuable to me.

Agree 100%! If you click "downvote", the arrow should just turn to orange instead of disappearing. If you click "up" ditto. You can then change your vote for a limited time.

As a bonus: please don't refresh the page just because I upvoted someone. Why should I search where it is I left off reading just because I upvoted someone?

EDIT: nevermind about second point, it looks like it happens only if JS is disabled. My bad. :)

HN is gracefully degrading you ;)

Until there's an 'official' version, @cheeaun's HackerWeb is an excellent alternative for all platforms:


Best looking alternative I've seen so far.

You can't login to comment though. Or am I missing something?

Perhaps do it reddit-style - always show voting arrows, highlight current vote if there's one and allow re-voting? Cap the re-vote count if need be, but allow for at least one.

Upvote on left of title, downvote on right?

If I accidentally downvote I immediately find two other posts by the same author and upvote them.

That will help their karma score, but is still a problem for comment ranking. Downvoted comments appear lower on the page, thus get less attention.

Yes. We've been working on it for a while and are hopeful we can start alpha-testing it soon. There has been much discussion of this, including elsewhere in this thread (https://news.ycombinator.com/item?id=10224045, and follow the links from there).

The last few times this has come up I have begun to feel a gnawing fear that everyone has an incompatible idea of what "mobile-friendly" and "responsive" means, and the disagreement about how it should work will never cease. I guess we'll find out soon enough.

Someone should make a round up of HN apps. My personal favorite: Boreal

Seconded - this really should be a top priority. I know there are some good apps out there, and I use them, but it would be so much better to have a responsive design.

More information about internal policies, like how much karma you need to downvote and other things people don't even know exist, would be nice.

You currently need 501 karma to downvote. What are the other things?

When are you going to fix the mobile css? Surely that'd take about 4 minutes.

Not 5? :)

This is what happens when we try 4-minute fixes: https://news.ycombinator.com/item?id=9205733 and https://news.ycombinator.com/item?id=9206427.

We've been working on a more thorough solution since then. I've posted about it many times, e.g. https://news.ycombinator.com/item?id=10104936, and no one will be more relieved when those days are over. I just hope everyone doesn't still disagree and propose more 4-minute fixes forever.

> I just hope everyone doesn't still disagree and propose more 4-minute fixes forever.

In the infamous words of Lily Tomlin, Oh, dang, that's so cute.



I'm quite certain the world's optometrists are paying HN to keep that font tiny so my eyes are strained every time I use the site on mobile.

But seriously, I've heard in the past that they wish not to break the many apps that scrape HN, and therefore have hesitated to change the markup.

That shouldn't be HN's problem, though. Apps that scrape the HTML of a site take it entirely upon themselves to keep up with changes in the markup.

The "break HTML scraping" reason makes zero sense when an HN API exists.

The API is available but not everyone has switched to it. There is a valid argument for backwards compatibility, like Linus refusing to break userspace even when it means rejecting some improvements. However, I'm in full agreement with you here. The benefit of usability improvements far outweighs the disadvantages of breaking screen scrapers, especially when so many of those scrapers exist solely to make HN more mobile friendly.

The HN API still has some important gaps. But you have to expect the HTML to change, so cautious scrapers never hardcode XPaths!
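(For comparison, the official HN API serves items as plain JSON from hacker-news.firebaseio.com, so a consumer never touches the markup at all. A minimal Python sketch; only the documented item endpoint is used, and the fetch call is shown but not run here:)

```python
import json
from urllib.request import urlopen

# Official HN API base (documented at github.com/HackerNews/API).
API_BASE = "https://hacker-news.firebaseio.com/v0"

def item_url(item_id: int) -> str:
    """Endpoint for a single story or comment."""
    return f"{API_BASE}/item/{item_id}.json"

def fetch_item(item_id: int) -> dict:
    """Fetch an item as structured JSON -- nothing here breaks if the HTML changes."""
    with urlopen(item_url(item_id)) as resp:
        return json.load(resp)

print(item_url(8863))  # https://hacker-news.firebaseio.com/v0/item/8863.json
# fetch_item(8863)["title"] would return the title of item 8863.
```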

Not sure why the current markup couldn't support some responsive styling.
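(For illustration, responsive tweaks could be layered on with a single media query without touching the markup. The selectors below are assumptions for the sketch, not taken from HN's actual stylesheet:)

```css
/* Illustrative only: selectors and class names are assumed,
   not verified against HN's real markup. */
@media (max-width: 600px) {
  body { font-size: 16px; }                  /* readable without pinch-zoom */
  .comment { width: 100%; }                  /* stop fixed-width overflow */
  .votearrow { width: 24px; height: 24px; }  /* bigger tap targets */
}
```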

I'll look up the link, but previously dang (I think?) said that it will be slowly improved, in a minimal-impact sort of way.

Introduce the HN API gradually, get the scrapers to shift over to it, then improve the HTML generation and CSS. Would be interesting to hear updates on that.


Found the link! http://blog.ycombinator.com/hacker-news-api

Just increase the font size on the main page by a couple of sizes. On both Chrome and FF Mobile I have to zoom in like crazy just to click on the "# comments" link.

Can I ask why, when using FF on Android showing the mobile version of HN, replies appear using different font sizes? Some are tiny, some are actually readable.

I have to zoom/unzoom constantly to read just a thread, or switch to the desktop version, which doesn't have this issue.

It's so dumb, I feel dumber just asking.

All it takes is a separate mobile URL, or dynamic response to mobile user-agents.

What ungodly scraper would be broken by a change in the stylesheet?

The best solution I could find for reading was


But there is no commenting, upvoting, etc (b/c of CSRF).

I tried rendering Hacker News in an iframe on a page with zoom set, but they send X-Frame-Options: DENY, which destroys any hope of cross-origin CSS magic.

[edit] just tested zoom with a site that can be iframe-d and my css hopes were squashed - the size of the iframe is zoomed, but not its contents.

In fact, instead of the ugly hack with background-image GIFs for the tiny vote buttons, why not just use a couple of link elements with proper Unicode ▲ (&#9650;) and ▼ (&#9660;) glyphs? If for some bizarre reason one wants to keep adding these via JS in an otherwise very HTML4-ish design, they could be added with some CSS :after magic, but I don't see why one would do that.

Maybe mobile correlates with lower-quality comments; if so, the case for improving HN's mobile experience is less obvious.

Can you at least add an option to enable mobile CSS and viewport settings on a per-account basis? I know there are a lot of people on here who love tiny fonts, but there are also lots who do not.

So great!!!

Guidelines | FAQ | Support | API | Security | Lists | Bookmarklet | Legal | Apply to YC | Contact