Hacker News

It's Mathematics Stack Exchange, not Stack Overflow.



It's pretty much the same across all of the properties, though. I quit using them; they're run poorly and, especially in system administration or programming areas, the quality is uselessly low.


Years ago it was the first place I looked - but now, Stack Overflow has a currency problem. Many programming answers are obsolete, the languages or frameworks to which they relate having evolved, and this is reinforced by aggressive moderation policy and toxic attitudes.

Not sure if the curators are unaware or simply don't care as long as they're still getting fresh questions for whatever's shiny and new.

In practice, nonetheless, it is now the last place I look, not the first, and also the last place I contribute. There are plenty of other forums for relevant and current answers, and without dripping poison.


It's the same problem a lot of internet properties that crowd-source data have now: they don't have the concept of knowledge degrading over time built into them.

For the first 5 years or so, Stack Overflow was amazing. For newer subjects that haven't changed significantly in the last couple of years, it's probably still amazing, but the shelf life on this is much shorter in general, because new subjects generally change quite quickly (conversely, when Stack Overflow was first filling up, all the C/C++/Perl/Python/Bash etc. stuff was long past its high-churn stage).

When all these sites were new and shiny and still filling in the gaps, this wasn't a problem. Now that we've gotten a decade or more out of some of them, they have the problem of being filled with information that was highly relevant at one point but, having lost relevancy over time, is still treated as relevant.

As for the next major change to these crowd-sourced repositories of knowledge, I think it will be a good algorithm that degrades relevancy (points?) over time, with the ability for people to "vouch" for data as still accurate, or to "refine" data to limit the scope it applies to when the subject has expanded and the data is no longer as accurate as it was.

E.g., an answer to a Python question which works only for Python 2.x but at the time applied generally because Python 3 had not been released. After Python 3 comes out, that answer either needs to become less relevant, because it's only sometimes correct, or be refined to note where it does and does not apply, in which case it retains its relevancy and hopefully does not crowd out data for the portions of the subject it does not cover (Python 3 in this case).
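A minimal sketch of what that decay-plus-vouch scoring could look like (the function names, half-life, and weights here are made up for illustration, not anything Stack Overflow actually does):

```python
from datetime import datetime
from typing import Optional

def relevance(votes: int, answered: datetime,
              last_vouched: Optional[datetime],
              now: datetime, half_life_days: float = 730.0) -> float:
    """Exponentially decay an answer's vote score with age.

    A "vouch" (someone confirming the answer is still accurate)
    resets the clock, so actively maintained answers keep ranking
    high while unmaintained ones sink.
    """
    anchor = last_vouched or answered  # decay from the last confirmation
    age_days = (now - anchor).days
    return votes * 0.5 ** (age_days / half_life_days)  # halve every ~2 years

now = datetime(2020, 1, 1)
stale = relevance(100, datetime(2012, 1, 1), None, now)
vouched = relevance(100, datetime(2012, 1, 1), datetime(2019, 1, 1), now)
assert vouched > stale  # a recently confirmed answer outranks a stale one
```

A "refine" action could be modeled the same way: instead of resetting the clock, it would attach a scope tag that limits which queries the score applies to.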

I'm convinced this, or something like it, is the next big thing for crowd-sourced data; it becomes more and more important as our collective online data ages.

Edit: It's worth noting Google has likely already solved this, at least for their problem space. Then again, the way Google could be considered a repository of crowd-sourced data is slightly different than most sites. That said, I wouldn't be surprised to hear they have a solution, either in place or planned, for how to deal with this affecting the knowledge graph.



Are you implying that if I go look at any Python question from 8+ years ago it will be correctly tagged not just as python but also with the correct Python subversion?

Part of the problem is that at the time of the question the tags might be entirely sufficient but become less so as the subject changes over time and new tags become available.


If you think that a new version of Python is that different, just ask a new question with the corresponding tag. To avoid it being closed, you might want to specify that it's not a duplicate of the previous question because those answers are for an older version of Python.

Also, you could try retagging a question with the appropriate version tag, e.g. [python-2.x] or [python-3.5], so other people will know that when using a newer version of Python there might be other (better) answers.


> If you think that a new version of Python is that different, just ask a new question with the corresponding tag.

I use Python versions to illustrate the point. That doesn't mean it encapsulates all the nuances. For example, take a question that asks about the best way to handle HTTP client needs in Python, where no answer suggests the requests module[1] because the question predates it. It's not strictly tied to a Python version, but it does suffer for being older, and adding a new, better answer years later may take years more to be ranked high in the answers (if ever). Can that problem be solved with tags? Maybe. Can it be solved well by fairly free-form, community-decided tags? I doubt it.
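To make the drift concrete, here is roughly what that answer churn looks like; the URL is a placeholder and the old idioms are shown only as comments so the snippet runs on a stock Python 3 with no third-party dependencies:

```python
# What a pre-requests accepted answer typically looked like (Python 2 era):
#
#     import urllib2
#     body = urllib2.urlopen("http://example.com/api").read()
#
# What most answers recommend today (third-party library):
#
#     import requests
#     body = requests.get("http://example.com/api").text
#
# The modern stdlib spelling, runnable on Python 3:
from urllib.request import Request

req = Request("http://example.com/api", headers={"User-Agent": "demo"})
assert req.get_method() == "GET"
assert req.full_url == "http://example.com/api"
```

Note that nothing in the question "how do I fetch a URL in Python?" pins it to any of these eras, which is why version tags alone can't mark the old answers as outdated.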

> Also you could try retag a question with the appropriate version tag, e.g. [python-2.x] or [python-3.5], so other people will know that when using a newer version of Python there might be other (better) answers.

Yes, and people could instead just volunteer their time to accurately rate every question and answer on some absolute scale, say 1 to 1000. Unfortunately, systems like that don't scale because they require too much effort from the individual, and people tire of it.

What sites like Stack Overflow and Reddit spearheaded was instead giving users many tiny, easy, and individually insignificant decisions which, taken in aggregate, allowed the emergent behavior to show its value.
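The aggregation function is what turns those micro-decisions into a useful ordering. One well-known approach, reportedly behind Reddit's "best" comment sort, is the Wilson score lower bound; a sketch:

```python
import math

def wilson_lower_bound(ups: int, downs: int, z: float = 1.96) -> float:
    """Lower bound of the 95% confidence interval for the true
    upvote fraction.

    Penalizes small samples: 1 up / 0 down ranks below 90 up / 10
    down, even though its raw ratio (100%) is higher.
    """
    n = ups + downs
    if n == 0:
        return 0.0
    p = ups / n
    return (p + z * z / (2 * n)
            - z * math.sqrt((p * (1 - p) + z * z / (4 * n)) / n)) / (1 + z * z / n)

assert wilson_lower_bound(1, 0) < wilson_lower_bound(90, 10)
```

A curation-over-time mechanism would presumably need a similarly cheap per-user action whose aggregate feeds into a score like this one.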

I suspect we'll see a similar solution (at least in part) to this, where some small behavior is incentivized to not just generate and rate the data, but curate it over time. Tagging may be part of that solution, but I think it's fairly inadequate in its current incarnation.

1: I'm assuming requests is still popular. I'm not really a python programmer, but it's more common than my preferred languages so I figured it would convey the point easier.


My guess is, if you opened a new Python question, it would be closed as a dupe right away.


Stack Exchange has quickly turned into a garbage dump of obsolete answers. They have no means of clearing out answers that are totally out of date, which can lead to needless frustration when people need help with a framework or language.

For instance, Ember.js is still seeing lots of development and has changed drastically in some ways, but every time I've searched SO it almost always brings up questions from ~2013. I'm convinced that a lot of the so-called "learning curve" would go away if Stack Overflow was wiped out of existence, which is a weird thing to say having once been a big fan of Stack Exchange.

A better way to find solutions is to just join Slack/Discord/IRC servers for different software communities and ask questions there.

Don't even get me started on the toxic attitudes on SE.


One related thing: I frequently need to check the date of an answer to find out whether it's out of date. Then I can't find where the year is. Their date format is so bizarre...


One department is probably arguing that this increases overall traffic to SO.


I am somewhat embarrassed to admit that I thought your first sentence was about money; turns out both 'currency' and 'currentness' can refer to 'the state of being current'.


There are essentially no CMake answers that are valid for creating idiomatic, modern CMake. I know how to correct the answers but can't unless I jump through the hoops. So I don't.


>There are plenty of other forums for relevant and current answers, and without dripping poison.

Would you mind listing a few examples?


They tend to be language or framework specific. e.g. for Ruby/Rails questions I haunt the GoRails slack and a couple of Ruby/Rails specific subreddits.

In quite a few cases the official dev forums for a particular vendor or technology are actually very active, and I think that's because forums just got a lot easier to implement. For example I get a lot back from the Apple Developer forums (not the public discussion community, which is hot garbage) and Vue.js's own forum.

A fair chunk of what I previously used Stack Overflow for (finding code snippets to learn from) has been replaced by the ease of browsing public repositories on Github.


Another issue is moderators closing questions as dupes without reading them. I used to hang out in the private Slack for the cabal that runs one of the sites; I imagine other Stack Exchange sites have similar cabals that auto-close/auto-vote on things.


Fixed, thanks.



