1. For things like remote work, it only became a trend because of the pandemic. You would have needed to predict the pandemic to predict that management would permit remote work. It was not really a trend, but rather an emergency measure that morphed into a desirable benefit.
2. This only looks at successful things. We must examine unsuccessful things to see if there is any real predictive ability.
This scares me. Here in Canada, the government scrambled to put "printed" cash into motion, and inflation is already having real impacts... Particularly on poor people. I worry we're only seeing the tip of the iceberg, so to speak. The same thing happened in the USA, but I suspect our economy here is less resilient to fluctuations.
Timing is the billion-dollar question. Markets aren't predictable because they are dynamic, so any measurement and analysis affects the behavior of their agents.
Nah, I think you haven't been following the permabears.
A lot of them talk about crap like the price of gold, the use of BTC as a hedge, PE ratios, "technicals", candlesticks, simple moving averages, charts, commodity prices, QE, negative bond rates, near-zero interest rates and the like. Some get political even, talking about presidents or policies they don't like and why they'll cause a crash.
It's pretty easy to weave a story around ill-defined feelings, because none of these things determine the market trajectory. You can be 100% correct in your facts and/or analysis of the present but still make terrible predictions about the future.
The permabears always have a good story. They're really good at making stories for why the crash is going to really happen this year.
I think part of this is the media enjoying amplifying clickable stories. A lot of people (myself included) just can't help but click on news like "market crash is coming", "housing prices are about to crash", etc. Current media business models sadly are driven by sensational news.
See also: all those revolutionary stories like a cancer cure that happens to only work in mice for now and will probably never work in humans, but we still click on those.
Sure, but also, there's a generation of us finance folks who learned how the pipes break. So that when they see something like Evergrande, they go 'well it will probably get bailed out, but if it doesn't.....'
I think it’s that people realize that the underlying issues that caused the financial crisis haven’t actually been fixed, and the Fed has been experimenting with policies that have never been seen before to prop up markets. While it’s worked so far it still feels like there are structural weaknesses that could cause massive problems, and just because things have been good doesn’t guarantee they won’t eventually fail, but timing is notoriously difficult.
"Policies that have never been seen before" seem to describe a good part of the last century, which has certainly had booms and busts, but the novelty has been more of a constant. We just don't have that long a history of modern economics. We've been flying by the seat of our pants the whole time.
And then a year later there are articles saying, "Oh, it turns out that the regulation we created 80 years ago that was specifically designed to prevent exactly this situation was repealed 40 years ago."
And then a year after that there are articles saying, "Why isn't Congress fixing this obvious thing that only makes the very rich richer and always leads to economic disaster?"
And then a year after that, there are articles saying the next crash is coming....
Long-term near-zero interest rates have fueled unlimited growth for companies. Unlimited growth means no ceiling to stock market prices. A correction will come when the Fed raises interest rates. We started to see it in 2017, but the Fed backed off. It's inevitable though. But to your point, no one has called it correctly yet.
> I am constantly seeing articles about how a market crash is coming and have been seeing them since 2014.
I’ve been seeing them nonstop since the early 80s, and I suspect I would have seen them earlier except that my media consumption before elementary school was fairly constrained.
It's a lot like predicting earthquakes right? You can measure strain and know that you're due, but you still don't know precisely what the trigger will be or when it will happen.
We don't make such jokes about seismologists though. Why not?
I suppose it's the matter of media exposure: unlike economists and meteorologists, seismologists aren't constantly on TV telling us bad times are coming soon.
If the meteorologist says it will rain on Saturday and be sunny on Sunday, and in fact they happen in the opposite order, my weekend plans are ruined.
But the seismologist is just saying, "I don't know exactly when but it reeaaally looks like we're going to get a big earthquake someday soon better make sure your building is up to code and you have your emergency supplies ready". And we nod, and check our emergency supplies and upgrade our infrastructure, and when five more years go by without an earthquake, and the seismologist is saying the same thing, nobody says "silly seismologist, predicting twelve of the last three earthquakes!"
But when an economist says a crash might be coming soon, and you'd better make sure you have emergency savings set aside, people do make those jokes.
Maybe one difference is that with the economy there is an obvious benefit for almost any person in expecting (pretending?) there won't be a crash. For earthquakes, challenging the seismologists only benefits the niche group of real estate developers, while the cost of worrying about earthquakes is ostensibly relatively small.
I'd heard an argument that humans' inherent tolerance for risk is much lower than it rationally should be today, because it evolved in an environment where we were much more likely to get killed. I think human discourse is constantly grappling with that biased instinct.
It's more like predicting Y2K. There was no countrywide catastrophe from Y2K, therefore the computer scientists and engineers who predicted it were wrong. And by extension, computer science and engineering are bogus. Right?
As someone who has worked on exclusively remote teams for around 7 years, the pandemic accelerated the trend. It was already very much headed in that direction.
Why is your work situation relevant? If you meant it as a "this is what my bias is" then it further weakens your argument, which may or may not be desirable for you.
Sometimes people share their bias in the interest of just being transparent. This is mostly because not every discussion is necessarily adversarial, and having more context is helpful for their companions. However, it also makes them appear more trustworthy, adding informal weight to their subsequent statements.
My dad likes to remind me that he was 'working remote' (had a dial-up modem and two-factor auth into whatever he was doing at AT&T/Ameritech) back in the early 2000s.
I started working remotely at the end of 1994. An analog line ran to my office with two 19k2 modems on it, and I worked remotely on the server, which was connected to the internet over 10 Mbit fiber.
Worked from home before that, in '90/'91, but not online. Had an IBM cash register in my bedroom for which I wrote a POS application.
“It exists and is growing” and it being a trend are very different things. I think it is an irreversible trend now, but it wasn't before the pandemic. There are just way more people like myself who didn't even think of it seriously, for many reasons (I did know it existed), and they have all woken up now.
Remote work seemed to be declining in the Bay Area pre-pandemic. My company and many others were pushing for more in-office work over remote work as they felt that employees were able to be more creative and productive if they were able to collaborate spontaneously which in their eyes could only happen if everyone was in the office. This really seemed to take off in the area after Yahoo had implemented their no remote work policy.
Of course it declined in the Bay Area, a place that people famously move to for work. It'd be silly to move to the Bay only to then work from a shitty rental apartment.
Point 2 notwithstanding, I wonder what the threshold of "predicting" really is for point 1. I'd been reading and participating in conversations about remote work on HN for many years before the pandemic; hell, HN is how I discovered the concept in the first place, and what helped me become a remote worker some 3 years before COVID-19 hit the news. The REMOTE tag in the monthly "Who is hiring?" / "Who wants to be hired?" threads was also established before the pandemic.
Ironically, not only did nothing make me think "I have got to buy Zoom stock" in 2018, but IIRC around 2019 there were plenty of reasons to short Zoom. I wonder if they'd have survived if not for the pandemic.
Before the pandemic, Zoom was starting to gain market share over a bunch of the legacy conferencing vendors, like Cisco WebEx. They had their act together on the enterprise sales side and were getting traction via their freemium model ... just as Dropbox/Box got enterprise traction even when companies already had things like SMB/CIFS or SharePoint. The UX was better, for the same reasons many an HNer didn't understand what advantages Slack has over IRC.
As someone who has been remote, hybrid remote, and almost every other version of distributed teams and remote/asynchronous working over the past 13 years, I assure you there was a remote-working trend well before COVID, and there still is. COVID only poured gasoline on an otherwise very long-term "how we work" trend. COVID probably condensed into 18 months the 10 years it would otherwise have taken remote work to end up where it has.
To point 2: this article seems like a great example of selection bias. Hacker News discusses many new ideas in the tech world. I'd also like to see things HN didn't like and the success rate of those (the famous Dropbox comment[0] comes to mind).
It's rather short of that, of course, but after hearing about an outbreak of flu in China on Canadian television in a hotel lobby (on holiday from the UK) very early in January 2020 - I think the 2nd [*] - I heard approximately nothing about it until March except via HN, which included 'this will be bad/global'.
[*] (For a while I remembered the date, because it was notably earlier than what UK newspapers kept claiming as a start date, or 'detected in China', in their charts from March. I didn't claim to know the correct date, but obviously it was an upper bound!)
> It was not really a trend, but rather an emergency measure that morphed into a desirable benefit.
Do you have data to support this? Anecdotally (from my own experience and tech workers I know) I feel that remote work has been growing steadily over the past decade, with an increase in job ads mentioning remote (even if they went out of their way to say "office only"), but I don't have data either way to back this up. It feels like long-term changes in tech as well as management culture were making remote more common, and the pandemic accelerated an existing trend (much like the shift in the movie industry from cinemas to home streaming).
Pandemics have worked as an accelerator or even as a catalyst for technological change since at least the Black Death and this one certainly is no exception.
However, remote work didn't become a trend because of the pandemic, it became the default scenario for many jobs that don't actually require on-site presence but that due to cultural inertia nevertheless often still happened in a colocated office pre-2020.
Remote work has been a trend since long before 2020. With the pandemic the proposition in many cases now simply was "Either work remotely or see your company go out of business.", which had many companies reevaluate their ways rather quickly.
That was not difficult to do, even in late December. Most discussions were taking it really casually and believed that none of that would happen, which is exactly why it happened and how I predicted it. People are filled with unfounded optimism, overestimate their skills, and underestimate their bad luck.
Combining the three with the belief that most humans are incapable of understanding exponential and autoregressive processes, it became clear.
But all of this is anecdotal evidence and could very well be an instance of a bear predicting a depression, after all, bears predicted 10 out of the last 3 economic crises.
Not sure if it is true in general. But from the perspective of the software industry, our company has been remote-first since its inception. We do encourage coming to the offices, but it has never been mandatory. I think there is a fine balance in this. Personal time saved vs. the social aspect. And of course some problems just get solved faster in front of a whiteboard.
This would have been a better analysis if it ignored headlines and used comments, perhaps in combination with some sentiment analysis. Also, NFTs, Tesla, etc. aren't really tech trends but rather consumer and valuation trends. They should have tested for specific technologies like programming languages, paradigms, frameworks, and libraries, or specific hard tech.
Would you get the same result if HN is a popular content marketing forum that’s frequently used to promote new businesses and products? Also, I skimmed the graphs and only concluded that HN contributors have a short attention span. I must lack the sophistication to interpret the results.
For 1, I'd say it was a trend... just a much smaller one.
I've been 100% WFH in tech for 20 years, and it got easier every year even before covid. What you had to predict wasn't the pandemic, but which tech would be hot enough that employers wouldn't care where you worked.
I'm pretty sure overlaying Google Trends onto keywords in HN is more of an informal analysis than a rigorous decision process. True predictive value needs a more rigorous model, but these sorts of data explorations are a great way to motivate folks to dig deeper.
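For what it's worth, a rough sketch of that kind of informal overlay might look like the following, assuming you already have monthly HN title-mention counts and a Google Trends export as CSVs (the file and column names here are hypothetical); it's an exploratory aid, not a rigorous model:

```python
# Informal overlay of HN mentions vs. Google Trends interest (hypothetical CSVs).
# Not a rigorous model: just aligns the two monthly series and checks, at a few
# lags, whether HN mentions tend to lead search interest.
import pandas as pd

hn = pd.read_csv("hn_mentions.csv", parse_dates=["month"])        # columns: month, mentions
trends = pd.read_csv("google_trends.csv", parse_dates=["month"])  # columns: month, interest

df = hn.merge(trends, on="month", how="inner").set_index("month")
scaled = 100 * df / df.max()  # put both series on a 0-100 index for comparison

# A peak correlation at a positive lag would suggest HN chatter leads the
# mainstream by roughly that many months.
for lag in (0, 6, 12, 24):
    corr = scaled["mentions"].corr(scaled["interest"].shift(-lag))
    print(f"lag {lag:2d} months: correlation {corr:.2f}")
```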
I am actually surprised, as it seemed that, with IBM, Yahoo, and various other companies dragging everyone back to the office in 2017-2019, remote work was in retreat.
I always interpreted digital nomad to be a Tim Ferris type person working 4 hours a week and living off relatively passive income or some kind of online self employment, not employees.
3. Hacker News comments have been vehemently anti-cryptocurrency for a decade. I remember being shocked how out of touch with actual hackers this place is.
It's just another speculation scheme. "Protect your money from inflation" means "make a bet that our token's price will go up".
If you want to safeguard your savings against inflation, the way to do it is to invest it in vehicles that are backed by something, like corporate stocks, government bonds, real estate, or commodity money, not a vacuous line on a digital ledger supported by nothing but people's hope that it'll be worth more tomorrow.
Valiu's claim that they're backed by USD is dubious at best. Tether was running a similar scheme, we now know that they only have 2.9% of the money that they claim to have in reserve, the rest having been traded away to mysterious LLCs in the Caymans, and many of their coins printed out of thin air.
The only "legitimate business" anybody's done with crypto is buying lots of it and hoping the value keeps going up. Nobody is actually using BTC/ETH/XMR/etc to do business or conduct commerce, it's all just speculation.
There is no compelling reason to believe that cryptocurrencies are anything other than digital beanie-babies. At least beanie-babies had inherent value as a toy.
"actual hackers" use cryptocurrencies because they are easy to steal and launder. Despite the name, most of the people on Hacker News are not interested in the criminal part of "hacking".
Feels like a great case study in confirmation bias: you only looked at the topics that ended up being successful. How many tech trends did Hacker News hype up only to have them go nowhere? Without that info we don’t really know what HN’s success rate is.
That was my first instinct but I'm not sure it's actually true.
If my priority is to know about interesting topics so I can be informed on them when they take off then this is the right methodology. HN might be a bit inefficient in time, but this suggests that new tech trends will generally not arrive without having had some advance warning on HN. This is what I need - as a consultant it's useful to know what the client is talking about when they bring up something cutting-edge, and useful to know about cutting edge things to recommend as relevant.
> How many tech trends did Hacker News hype up only to have them go nowhere?
Do all of these graphs take into account actual "hype", or just the existence, and number, of posts on Hacker News? I would like to see the graphs adjusted for discussion. Take the number of comments into account, do some sentiment analysis on the discussion, etc. Are they actually being hyped?
For instance, the description above the Bitcoin graph gives a link to the very first mention of Bitcoin[1] but the link goes to a post with no comments.
I guess the number and timeline of posts could indicate a trend, but should we call it "hyping" those trends? I would imagine that would come from the averaged sentiment of the discussion on those posts (something like the sketch below).
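A minimal sketch of that kind of sentiment weighting, assuming NLTK's VADER analyzer is applied to comment text; the example comments are placeholders, and in practice you would pull real comments from the HN API or a data dump:

```python
# Toy "hype" score: average the sentiment of a post's comments instead of just
# counting mentions. Uses NLTK's VADER analyzer; the comments are placeholders.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
sia = SentimentIntensityAnalyzer()

comments = [
    "This is going to change everything, I'm really excited about it.",
    "Pointless speculation, it'll be forgotten in a year.",
]

# VADER's compound score runs from -1 (very negative) to +1 (very positive).
scores = [sia.polarity_scores(c)["compound"] for c in comments]
hype = sum(scores) / len(scores)
print(f"average sentiment: {hype:+.2f}  (positive = hyped, negative = dismissed)")
```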
Exactly. Or at least a comparative analysis with similar news feeds. Would Reddit tech posts not yield similar results? If not, then perhaps one has an edge in terms of early discovery of future trends.
The variety of subjects submitted by Hacker News users is very large, and mostly about the most innovative and financially rewarding domain: tech.
So sure, hacker news does predict every upcoming trend since someone will be posting something about each and every plausibly promising thing.
HN talks about Bitcoin/crypto frequently, but the overwhelming opinion for years has been that it's pointless.
That will certainly be a huge miss, similar to the dismissal of easy file syncing. DeFi/crypto/NFTs seem to be gaining real traction and HN hasn't accepted it.
Not only confirmation bias, but also they're essentially polling a very specific subset of the population.
This is a segment of the population that would be replete with model train enthusiasts, RC plane hobbyists, and HAM radio operators back in the day. So, yeah, things of interest in the tech space would be of even more interest to people on HN.
But yeah, they also picked the "winners"; they really need to filter for all topics discussed on HN and then find out how many of those became "future tech" and how many fell by the wayside. Because if it's 1 in 100, it's not a really good predictor.
And if something becomes mainstream, then of course it'll be talked about. That's kind of how it works.
Understanding what will go nowhere would be so much more useful! Following trends can be valuable, but old ways still work. Chasing wild geese is catastrophic.
This! Since people have limited time, if HN more often than not leads them on some tech goose chase, then it doesn't matter if some tech trend is mentioned early on HN and gets interest.
But ultimately what makes something successful is investment of both time and money in that thing. HN is a very VC-heavy tech crowd, and of course it should lead on trends that the VC tech crowd creates out of pure effort and money.
Tesla and 3D printing are interesting examples. If it isn't strictly a digital good, HN might not be ahead, given the audience.
HN is pretty good at identifying trends. I generally read some comment here way before I hear about it somewhere else. Does HN accurately predict where the trend goes? Not necessarily. But if something actually does trend, I will have heard low-level chatter about it on HN for a long time.
Remote work trended because of the pandemic, for example, but there was constant chatter about remote work anytime the topic of employment came up, and the monthly job match-making posts even post whether the job / worker is open to remote. Functional programming has had low-level chatter from the beginning (given that PG viewed it as one of the reasons for the success of his company), and indeed, functional programming has grown more mainstream. I started exploring Python for my Perl-type tasks because it kept getting good mentions on HN.
I don't think any of the above ever got to the point where it got so many mentions it would hit a strong 100 in his methodology compared to Google trends. I'm also not sure you could identify successful trends early on, but I think you hear chatter on HN long before other places.
* Dolphin emulator progress reports (I love it when they show up)
* Random posts on x86-64 emulation or emulating a gameboy or something like that. Fabrice Bellard is a hero.
* People who post their very first side project ever made either via no-code solutions or some React/Bootstrap type of thing. In many cases, these things are job boards for a specific niche.
* HN praising Apple for standing up to the FBI (or CIA?). HN hating Apple because of recent events with scanning pictures on iCloud and all that.
* All the things the blog post talks about
* Hardcore learning resources on many topics, one that I actually used: http://neuralnetworksanddeeplearning.com/ (tons of books on Calculus as well, I remember a 1900 textbook called Calculus Made Easy)
* The random physics or biological discovery
* And so much more
The way I see it: HN talks about tech (and interesting stuff outside tech), it's bound to cover tech that eventually becomes popular. But will the Dolphin progress reports become as popular as Bitcoin? I don't think so. It'd be interesting times if they would ;-)
Another Apple perennial is blog posts about switching away from Apple to Linux/Windows/ChromeOS. This successfully predicted that 300% of Apple's user base is gone now.
This only looked at submissions, it seems. I gather a lot more useful information lurking in the comments. I think discussions in the comments here are far ahead of the mainstream.
It's a bad habit since it encourages commenting before reading the submission. But I do the same thing as well, and it's not clear to me how someone could do otherwise without spending extraordinary time on HN. I find HN a very valuable source, so scanning some amount of comments is more important to me than reading all the submissions.
>I think discussions in the comments here are far ahead of the mainstream.
Discussions on HN and the articles they comment on can buy into the hype of the day to just as large a degree as anywhere else. Go back a few years. How many people were confidently predicting that widespread door-to-door autonomous taxis were just 2 or 3 years out?
I do not know how many, but I do not remember that being a popular position. If I recall, the simple answer to someone proposing that was “we will see when they start testing outside of pristine Arizona/California weather.”
If you are lucky, for every technical subject discussed on Hacker News you will find at least one expert commenting in the discussion. Here the issue with the pristine weather is directly linked to the difficulty of making LIDAR work in the rain, something I remember someone knowledgeable about LIDAR posting very early here.
Sadly, my feeling is that during the past few years these posts have been becoming rarer and are less upvoted than they used to be. Controversial or very assertive comments seem to have more success. I fear that Hacker News is slowly turning into Reddit.
HN is fantastic for surfacing up and coming technologies. Reading HN daily is almost like a superpower.
The HN community is...not so great...at predicting which things are exciting and which the industry should adopt. If it were, the industry would have settled on a LISP dialect long ago, the web would have adopted good readability standards (based on thousands of meta-complaints on virtually every top-rated post on any topic), and mobile app development would still be a growing and profitable business (and Apple would finally be catering to niche HN reader needs).
Hacker News, like any news aggregator, pops up new stuff all the time.
The hardest part is keeping an open mind and jumping in early. It's hard because 99% turn out to be duds anyway, not to mention our crowd is a skeptical bunch.
My biggest "regret" is ignoring Bitcoin/Ether.
Saw Bitcoin posts when it was $5.. nah it's a scam.
Oh it's gone up to $10.. nah it won't go up further.
Went up further.. nah won't buy bcoz already too expensive.
Ether announced.. lolz nobody going to buy another coin since there's already Bitcoin..
That said, unless you were a True Believer, if you bought bitcoin at $5 you probably wouldn't have any left by now. You likely would have sold at $100. Or $1,000.
I know a lot of people that got in early. But only one made millions because he thought it would be the next big thing. Everyone else jumped ship at some point because they weren't True Believers.
I picked up 0.1-and-some-change Bitcoin, largely from "faucets" and a tiny amount of collaborative mining. Basically free, a few hours of screwing-around being my only real "investment". I neglected the wallet, since it wasn't worth anything anyway, and lost the file in a format-and-reinstall at some point. Did something similar with Doge, later. Whoops.
> Remote Work [...] I thought [in contradistinction to the graph] that Hacker News would have been more ahead of the trend on this one.
> For each topic I counted how many times it had been mentioned in a post title
I suspect that's the problem. A lot of trendy discussion happens in comments; I think that's going to be particularly true for a topic like remote work.
People are going to extol its virtues when it's tangentially related to a submission, but especially pre-pandemic there were only so many stories you could submit about it. 'Company goes full-remote'? 'How we handle remote working at Company'?
It'd be interesting to search comments too I think. hn.algolia.com supports it.
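For example, here's a rough sketch of counting comment mentions month by month through the public Algolia HN Search API (hn.algolia.com/api/v1); the search term and date handling are only illustrative, and the reported totals can be approximate:

```python
# Count HN *comments* mentioning a term in a given month via the public
# Algolia HN Search API. Illustrative only: the term is arbitrary and the
# API's hit counts can be approximate for large result sets.
import datetime as dt
import requests

API = "https://hn.algolia.com/api/v1/search_by_date"

def monthly_comment_count(term: str, year: int, month: int) -> int:
    start = dt.datetime(year, month, 1)
    end = dt.datetime(year + (month == 12), month % 12 + 1, 1)
    params = {
        "query": term,
        "tags": "comment",
        "numericFilters": f"created_at_i>={int(start.timestamp())},"
                          f"created_at_i<{int(end.timestamp())}",
        "hitsPerPage": 0,  # only the total count is needed
    }
    return requests.get(API, params=params, timeout=10).json()["nbHits"]

print(monthly_comment_count("remote work", 2019, 6))
```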
Both the selection (survivorship bias) and weighting (mentions rather than votes/discussion) are problematic.
Discussion itself can be a very poor metric: when there's a great deal of general agreement there's frequently little (or at least less) of it compared with highly contentious topics.
Research methodology can often be restricted by data availability. Finding novel ways to torture^W explore data can be insightful, though raises its own risks (p-hacking, as an example).
Author here. I agree with you but the only recent HN data dump that I could find contained posts and not comments. I suspect the trend lines would look similar though.
I find that when Hacker News really likes a piece of highly technical SaaS technology it's usually pretty good. The best example I can think of is Datadog which got rave reviews here years ago and has continued to be enormously successful. Particularly for enterprise tech as there tends to be less herd behavior up-market. If I'm diligencing a technology I'll often actually look at HN to see the opinions.
The mood on Hacker News (and reddit/twitter) has become too pessimistic to be a good predictor of the future.
The long-term trend is that the world is becoming a better place. Pessimistic people are biased against that fundamental trend that other trends will develop on. Groups of pessimistic people will not be good predictors of the future.
Fewer people in extreme poverty.
Hunger is falling.
Child labor is on the decline.
People in developed countries are working less.
The share of income spent on food is down (in the US).
Life expectancy is rising.
Child mortality is down.
Death in childhood is down.
Guinea worm has almost been eradicated.
Teen births are down (in the US).
Smoking is down (in the US).
Violent crime is down.
Nuclear weapon stockpiles are down.
More people are living in democracies.
Literacy is up.
Internet access is increasing.
Solar energy is getting cheaper.
It's unfortunate that spreading such news is not as profitable as fearmongering.
It is only recently that I have begun to suspect that there are strong political reasons for suppressing this as well. Feeling pretty naive about that.
Every time that comment is referenced, people forget that the rest of the comments were mostly positive, plus constructive criticism or challenges they'd face.
Yes, and BrandonM's comment was constructive criticism. He was trying to help Drew with his YC application (that's what "app" meant back then) and clearly wanted it to succeed ("I only hope that I was able to give you a sneak preview of some of the potential criticisms you may receive. Best of luck to you").
I think that example really exemplifies HN's biggest blind spot. A lot of successful companies are built around taking an existing technology and packaging it in a way that drastically reduces the friction for the user. The typical HN user doesn't understand those pain points, because they're technically sophisticated enough that the pain points are mostly invisible to them.
So, you take something like Dropbox. Yeah, it pretty much is just mounted FTP. And the average HNer is looking at that and thinking "I could do the same thing with a bash script, what's the big deal?"
It is a blind spot of knowledge. Sure, you can write a script that does it. But most people cannot do that. Heck, just yesterday something in my hacked-up system did something odd. It spit out an error that I will now have to research (and probably find nothing), and then dig the code out. Then I'll probably spend several days reverse engineering some API I have never seen before, in a style different from what I am used to. If I had bought a product that did the same thing, more than likely 50 other people would have had the same issue and there would be a few dozen workarounds. If those failed, I could have a shot at just taking it to whoever sold it to me and saying 'fix it'. But as I hacked together something on my own, I own it, warts and all. I enjoy doing that, but most people have zero idea where to even begin fixing something like that. Buying something that 'just works' solves so many issues (and creates others).
Probably the biggest one that sticks out in my mind was when the iPad came out and all the tech boards like this were down on it. But it was what most people wanted to use a computer for. Most people did not want a computer. They wanted a media device. For many years the only way to get that was to get a computer. Now you can get that media device without most of the headaches of a computer.
I think Dropbox is only a blind spot because it took a syncing tool and combined it with running a user application with root privileges (in the 2000s era of computer security) and gamification for user acquisition.
> Jobs had been tracking a young software developer named Drew Houston, who blasted his way onto Apple's radar screen when he reverse-engineered Apple's file system so that his startup's logo, an unfolding box, appeared elegantly tucked inside. Not even an Apple SWAT team had been able to do that.
It's not a blind spot for many people. It's open source and open protocols vs closed-source spyware. SCP is far better than Dropbox if you value privacy, for instance.
A lot of resistance to things like Dropbox come from the experience of being screwed over by closed source software. If you look at the state of Dropbox today, those initial comments look pretty spot on.
I applaud the originality of the statement, but there's a good deal of trust in the cryptocurrency ecosystem compared to 2009. I understand if you're personally still a skeptic.
The comment is still valid, I think. You can't really trust crypto to hold its value in any way that a currency should. To use cryptocurrencies you need to tolerate anything from a 50% drop to a doubling, all in extremely short time scales. If that's what trust means to you, that's interesting.
Compared to 2009?! You don't trust Coinbase in 2021 any more than you did Cryptsy in 2013?
PS if you want to do the whole "volatile prices!" fear mongering you can do WAY better than 50% price drops. You also accidentally criticize the US dollar here, USD saw a 30% decline in the early 2000s and analysts predict a possible decline of 30-35% in the broad dollar index soon. Is that how a currency "should" hold its value?
I don't trust it now, and wouldn't have in 2009 either.
I trust the US dollar because it is very stable. There's a crisis now because we're looking at roughly 5% year-over-year inflation compared to a goal of 2% -- meanwhile crypto swings 10% every damn day!
The dollar crashed 30% in the 2000s (and the 80s, and the 70s, and probably the '10s). Soon it will be because of trade wars and losing value to the euro and the yuan though.
To your point, remember when the crypto market had a net loss of almost 3000 points in a single day in March 2020?! Oh wait, that was the Dow, sorry.
So you don't trust the stock market or the assets that back it either, do you? It swings 10% or more some days!
Oh you just said "crypto," you never mentioned a currency. Do we get to include the overcollateralized stablecoins or are we just cherrypicking volatile ones?
Do you have any experience buying or using cryptocurrency in the last 10 years?
We cannot predict future trends in technology; if we could, Bitcoin would be worth 0 today, because it is boosted only by FOMO (fear of missing out), not because it is useful.
In 2009, I was starting with Scala, which had functional features way beyond what was achievable in other languages at the time. Scala was a game changer, and it later rode the Big Data (Spark, Kafka) wave (circa 2015). Many academic things that Scala proved to be production-ready were later backported to Java, JS, etc. I have built systems with Scala that were actually useful for people and organisations, while Bitcoin as a technology... I am just glad that I have predicted worse in this case.
Bitcoin's value partially derives from permissionless transactions on an immutable ledger, if you'd actually been following it from 2009 you'd know that (but you don't).
> Bitcoin would have today 0 value because it is boosted only by FOMO (fear of missing out)
An incredibly tired take (but it's cute that you spelled out "FOMO" and abbreviated it). Compared to 2009, 2021 is the future, where the value of BTC is not 0. We don't have to speculate incorrectly as you do when you say "Bitcoin would have today 0 value," we know what value it is.
You completely failed to predict the effect of BTC and cryptocurrency on the global financial market, which supports your idea that "we cannot predict future trends." We can make educated guesses though.
No, that is a fairly straightforward explanation of Bitcoin's utility. Bitcoin did two fundamental things:
- it was actually useful for people transacting things
- it introduced the world to blockchain as a general technology, which at this point many new things (which people find useful in ways more complex and more varied than the above) have been built atop.
The idea that BTC would be worth $0 without speculation is just poor reasoning. It misunderstands the past, ignores the present, and doesn't even look to the future, so not surprised you didn't pick up on this particular trend.
Every "permissionless transaction" is an action of supplying and demanding, the "immutable ledger" is the proof that the action happened in the past. The price of Bitcoin, as the price of anything in life, is determined by supply and demand. Stop glorifying stuff using specialized language because people will actually believe you are the opposite of gifted. Reading "Thinking Fast and Slow" by Daniel Kahneman will do you good. The price of Bitcoin will remain forever volatile as the system is inherently flawed and nobody has faith in it.
Usually I block trolls, but in the absence of such functionality on HN, take this comment as my last interaction with you.
> Every "permissionless transaction" is an action of supplying and demanding.
Of course it's not, it's simply a transaction; it doesn't need to be between two separate parties. I don't think you understand Bitcoin as well as you believe, you have it confused with an economic theory.
Reading "Bitcoin: A Peer-to-Peer Electronic Cash System" will do you good.
Coinbase is successful because it went against the HN crowd's wisdom. On the very HN thread where Brian Armstrong announced his Coinbase idea, the participants widely expressed skepticism.
During the infamous Java leap second fiasco of 2012, which brought down JVM servers by saturating the CPU, I saved the day because I was reading about it here on HN while my colleagues were struggling to understand why all our environments had stopped working.
It took me a few minutes to realize that we were indeed hit by what I just read, look at ten of my colleagues each closely inspecting logs at different ssh sessions, and gleefully give them a precise description of the problem, an approach to verify it and a solution, adding “it’s all over the news guys, you should keep yourself updated”.
One thing worth noting is that HN didn't launch publicly until February 2007. There is some older content in the DB, but it looks mostly like very low-volume manual testing. So the data set definitely shouldn't be extended that far back, and this affects a few of the graphs.
The author didn't control for what was on the front page, how many points the article received, or the amount of discussion on the topic. Not to mention the myriad of technologies that are on Hacker News that fizzle.
They missed crypto and NFTs. For some reason HN doesn't appreciate the potential of these technologies, even when some prominent Silicon Valley investors do. I wouldn't have expected this from a technical and freedom-minded audience.
Both crypto and NFTs are struggling to find productive real world applications; predicting those is like predicting which toy will be the "one to get" this Christmas, or the Beanie Baby fad.
> are struggling to find productive real world applications
You can lend and borrow on compound.finance or Aave. You can play no-loss lotteries like PoolTogether. You can bet on your beliefs and hedge your investments on market platforms like Polymarket. You can gamble. More at: https://ethereum.org/en/dapps/
Remember, anyone can use all of this without any arbitrary restrictions imposed by governments or companies.
> Whether risk reduction as a financial instrument is productive for society can remain open to interpretation. However, it, without a doubt, has value.
That's my original point: I don't believe speculation (and tools to reduce the risk of speculation) is productive. At best, it's a hoovering up of the value delta between two actors with differential information.
They CERTAINLY have value for the financial actors using them. It's just that this isn't a productivity gain for society as a whole. Gambling certainly isn't.
Yes. And what I tried to point out, albeit not very clearly in retrospect, was that hedging is not only applicable and useful in cases that are plainly speculative in nature. E.g. I would like to hedge against the case of Forex fluctuations when doing a large international deal, or delivery delays, or bad harvests, etc.
I'm not aware of HN doing any predicting at large. If you mean 'will the average person later act like a given segment of HN people do now?' - then it depends on how good you are at selecting your segment, which makes it no better or worse than using any other sample of people for prediction.
Less abstractly, productisation is what takes things from obscurity (e.g. HN) to mainstream audiences, so future trends might grow out of stuff that you see here, but the people who can figure out whether certain tech can become a product (would-be predictors) -hopefully- also have the means to turn it into a product themselves.
Two things I learned on HN before they became mainstream:
* Rust
* Covid
For the latter I remember thinking that it was going to become a big deal soon but no one seemed to be paying attention except for those paranoid HNers.
"My conclusion: Hacker News is typically ahead of the mainstream, often by a few years, but you would need to be paying very close attention to catch the early mentions of a new tech trend. Most of the linked posts and comments have very few upvotes and probably wouldn't even make it to the front page."
Author of the article here. I would love to run an unbiased analysis on that, but it would be tricky to list and identify all of the companies that flopped.
I guess you could grab a list of every successful and defunct company listed on Crunchbase/AngelList/etc., but there would still be bias, as flopped companies may never have gotten to the point of creating a profile there, or they scrubbed the profile later.
I'd stick to public companies and run sentiment analysis on all tickers and company mentions. Then identify the correlation between positive sentiment and historical market return. I'm sure there are at least a few funds incorporating this data into some portion of a portfolio.
Doesn't need to be cool. The prevailing attitude has always been basically that JIRA sucks, but 3/4 of devs are using it and the alternatives are worse.
Hacker News skews richer, more technical, and more highly educated, and is systematically biased against pedestrian, working-class, and uncomplicated solutions. If something is too popular on HN, I'd say that's a red flag.
I've read about an indie trading algorithm that works like this. It's basically a PageRank for professional forums, which apparently did quite well for the author. A more fundamental-based form of sentiment analysis.
> My conclusion: Hacker News is typically ahead of the mainstream, often by a few years, but you would need to be paying very close attention to catch the early mentions of a new tech trend. Most of the linked posts and comments have very few upvotes and probably wouldn't even make it to the front page.
HN also tracks front-page history: news.ycombinator.com/front.
To go back to a particular date, all you need to do is give the date, e.g. news.ycombinator.com/front?day=2020-08-24. It seems to go back to 2006-10-09.
HN is the best way of learning about future tech trends before they happen. But it's not because there's any kind of prescient consensus here. The important stuff is a small minority. You have to do your own filtering.
I was ahead of the curve on Bitcoin, VR, and deep learning because I read about them early here, but they weren't the most popular things at the time. HN hates Bitcoin to this day. But I learned about it from HN and decided that HN was wrong about it. When lots of people are wrong about something, I see it as an opportunity.
It's not clear to me if this is predictive. E.g. HN has been right about AMD. Years ago when people were talking about AMD it was worth peanuts and look at it now. I don't think it's a case of a group of people being right/wrong, it has more to do with since you're fed all the perspectives, you can find the right/wrong ones yourself and take action accordingly.
Yes, I'm not saying HN is wrong about everything. HN isn't opposed to my other examples of VR or deep learning (though there are certainly people who argue strenuously that deep learning is a dead end and GOFAI is the way forward). I'm just saying that when the consensus about something is wrong, that's an opportunity to get ahead of the curve. And reading HN is a great way to gather the information you need to decide whether the consensus is right or not. But you have to figure that out for yourself.
How do you measure what is wrong? In the example of Bitcoin, if the measurement is price, I think that's a misunderstanding of what people are criticizing.
The criticisms have changed over time. But I'm pretty sure the early criticisms were mostly wrong, and I'm certain that the people who criticized it early missed big opportunities, even if Bitcoin itself ultimately fails (which is of course still a possibility).
Using total number of mentions as a way to predict future tech trends seems dubious at best. I understand coming up with a better metric would be difficult, but this one seems useless.
For example, with this metric, how can we tell the difference between people constantly dismissing Bitcoin or people constantly praising it? Just because you mention it doesn't mean you're mentioning it in a positive light.
HN commenters famously dismissed Dropbox when it was first posted. I would love to see a variant of this based on sentiment analysis and whether the conclusion would turn into "yes, but not the way you think".
A lot of people fall into the trap of equating negativity with intelligence. Your brain can spit out a reason that Dropbox will succeed or fail, but the failure case sounds smarter.
Alternative take: people didn't like Dropbox because it was the repackaging and closed-sourcing of existing solutions. It was more polished and financially successful, but that doesn't mean it was good for tech. It's corporate software; yes, it became valuable, but it definitely represents an anti-user trend that would best have been resisted. You should never give closed-source software root!
> Alternative take: people didn't like Dropbox because it was the repackaging and closed-sourcing of existing solutions
It wasn't. Even the famous comment just pointed out that it was possible to replicate with existing tools. And apart from one person, the thread was actually pretty welcoming.
That comment was just a grandfather of the very popular "Why should I use this instead of [completely different solution]?", which is also very often used against open source projects themselves.
I think the point is noticing the trends, not agreeing with them. If people were criticizing Bitcoin on HN in 2013, it means the community was aware of its importance, and they were aware before the rest of the world was.
That does not mean this is the perfect (or even a good) metric, but I do not think that predicting tech trends is the same thing as embracing them.
What is the prediction being made? Whether or not something is a "trend"? What is the definition of a trend?
> Can I convince myself that checking the HN front page multiple times a day is a useful and productive exercise?
That seems to be a different topic entirely and should probably be the headline of the article. Also, it's clearly a bit cheeky, but confirmation bias abounds nonetheless.
Of things which are "interesting," absolutely. But as to the stickiness of these things, I'd say the record is mixed. If you lurk here, chances are you'll hear about any single tech topic earlier than in most other places. But it's not clear to me at all that the ones you hear about here (or that are most upvoted here) also tend to be the ones with greater stickiness.
I think it’s more that many of the “leaders” of this community are investors and entrepreneurs who have vested interests and use this site and community for promotion or strategic purposes. This site’s contents and commentary are mostly ads, thus not good at prediction.
This is a good start, but it does not do much to answer the question of "predicting" trends. You would need to quantify false positives somehow, that is, tech that was big on HN but not elsewhere. I'm thinking of all the JS frameworks of the day, for one. For instance, Meteor JS was the bee's knees for a while.
You'd have to also somehow decide that they're 'past it' too, and it's not merely that HN is still ahead on them! Easy with a JS framework that's been discontinued say, hard in general, especially if you want some kind of objective rule for it.
As with all online communities, HN is suffering from its increased popularity. If you remember how Reddit was a good 10 years back, on every thread there would be 1-2 actual experts posting substantiated and well-sourced commentary on news relevant to their field. Usually undergraduates and young professionals passionate about their field.
At some point this was discovered by journalists and for a brief period of time major news organizations would rely on reddit users for sources. Many if not most discussions would be done in good faith or at least with solid arguments. The userbase was usually 18+. As it got more popular it got worse, mainly because of the "eternal september", altering userbase due to smartphone proliferation and changes in ranking algorithms and business model.
HN is not the same as Reddit and isn't doomed to the same faults, but as a forum gets more popular, experts shrink as a percentage of the user base, they become more reluctant and careful about what and how they comment, the "populism" of the voting system becomes harmful, and all the attention creates financial incentives for bad actors.
As for using it as some kind of blackbox betting algorithm for startups and trends, that's just a bad idea. I'm certain that you will get the same graph structure for all things that never became popular.
Obviously any forum focused on leading edge technologies will discuss advancements long before they become a subject of interest in the general populace. But they'll also discuss everything that will fail or never reach the mainstream sphere.
I would say it's more a good radar for what (software, mostly web) technologies are entering general awareness. If something might be successful one day, you'll probably find out about it. And then lots of those things will sputter out, but not all of them.
Basically, no one knows what is going to catch on until it has. As one poster mentioned, false positives could be plentiful. Also, for companies or brands such as Tesla, their PR machine will be working harder than HN to spread the word, so that's not a good data point.
I think a lot of people on HN are already working remote (I've been 100% remote since before the pandemic). It's a business process and business management issue that doesn't really come up on Hacker News except in productivity discussions.
If something looks useful, I'll sign up day one. Even if it doesn't pan out. That's how you get early adopter status. Found out about: Figma, Airtable, Stripe, Coinbase, Robinhood, Alpaca. All through HN posts ;)
Remote work is so obvious to the hacker news crowd that there’s no need for it to trend here. What’s artificial is going to an office to use a computer there. There better be food to make up for the traffic.
Love the visualizations. I would also want to compare against false positives, though. From my time here, I feel like there have been a lot of flops that never broke into the public consciousness.
Nothing pops out aside from crypto, to me, like you said.
But for anyone passing by: crypto assets are something like 20 distinct sectors and ever expanding, with many solutions and competing service providers in each sector. HN typically gets stuck on surface-level arguments such as limitations of the "currency" concept and the merchant-use sector. It also gets reliably stuck on environmental impact, as well as surface-level uses of NFTs. Solution providers in oracles, insurance, IoT, yield farming, yield farming aggregators, SaaS products (information, tax), marketplaces, and many more entire sectors are almost completely ignored. All alpha bets.
The serverless post on the Scoble blog is not even about what we think of as Serverless now, just good old cloud. As in "you don't have any physical servers?"
As backed by this post, HN isn’t precisely good at prediction. It simply has far less noise than the general internet, making it easier to notice the upcoming trends.
> For each topic I counted how many times it had been mentioned in a post title, then calculated a 0-100 index, where 100 is the month with the most mentions.
This is not really useful data, then. A time series with a big spike will "lose" against an otherwise identical one without the spike.
Maybe the number of days a certain word spent on the front page would be a better comparison metric on the HN side?
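A toy illustration of that normalization issue, with made-up counts: one viral month compresses the rest of the series toward zero, while a steadier topic keeps a visibly elevated index throughout.

```python
# The article's 0-100 index (100 = month with the most mentions), applied to
# two made-up series, shows how a single spike flattens everything else.
spiky = [2, 3, 2, 120, 4, 3]       # one viral month, otherwise quiet
steady = [20, 25, 30, 35, 40, 45]  # slow, consistent growth

def index_0_100(counts):
    peak = max(counts)
    return [round(100 * c / peak) for c in counts]

print(index_0_100(spiky))   # [2, 2, 2, 100, 3, 2]  -- everything but the spike looks flat
print(index_0_100(steady))  # [44, 56, 67, 78, 89, 100]  -- the trend stays visible
```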
On the contrary. The very nature of vote-based forums like Reddit and HN means that you must agree with the masses and uphold the status quo to even be seen at all let alone heard, so it’s very rare that you get a glimpse of upcoming disruption or anything new at all.
TL;DR: No, most people here are sticks in the mud. If you listened to them you’d think that future laptops would still have parallel ports and floppy drives and communicate via IRC.
Hacker News is where you go to learn that self-hosted rack-mount servers in your own basement running a LAMP stack provisioned with self-generated bash scripts are the future.