
From the title, and skimming over this article, I notice an irony: the analysis is "short" by Eastern standards

They're only talking about the last 100-200 years. That's significant on the American time scale, but insignificant on a Chinese one.

It's interesting to me that older cultures like the Chinese, Japanese, and the Maya view history as cyclical, while Americans tend to have more of a narrative of "linear progress", or even "manifest destiny" (a term I heard in grade school but probably didn't really understand until I was an adult)

---

I think Ray Dalio's explanation is the most concise, and it takes into account the cyclical nature of history (and wow, this video has 55M views now)

https://www.youtube.com/watch?v=xguam0TKMw8

Roughly speaking, [edited] China and India likely had most of the world GDP for more than 1000 years [1] ... you can think of it as a miracle for them to rise again, or just part of a cycle.

And there are absolutely cycles in history, just ones that are longer than any "news" article tends to talk about

I also think Americans are now waking up to a cyclical view. Everyone is talking about how this is the "first" generation to make less money than their parents, or the "first" to have a shorter life span

But it's only "first" if you don't recognize the cycles! That's understandable because of America's history, but it is a limited analysis

---

[1] edit: I looked deeper into this, and I agree it's flimsy, as mentioned below. But I think the larger point still stands -- it's only a "miracle" if you're looking at a certain time frame, and n=1


> Roughly speaking, China had the largest GDP in the world for more than 1000 years

For a continuous duration of nearly 1700 years from the year 1 CE, India was the world's largest economy, constituting 35 to 40% of the world GDP.

https://en.wikipedia.org/wiki/Economy_of_India


Yes, yes, we know that China and India are becoming extremely nationalistic and like to use their tendencies for historical revisionism to find justifications for their newfound imperialism. Still, I'm not convinced we actually need a fight of propagandists here.

Estimating economy sizes and growth before a couple hundred years ago is at best an exercise in educated guessing.

OK I tracked down that claim to Michael Cembalest of JP Morgan, in a 2012 article

https://archive.is/kQNT4#selection-1577.155-1581.525

https://archive.is/Z2qEu

Although there are problems with the analysis that are mentioned in both articles:

A major caveat: If you looked at the chart in any depth, you probably noticed a big problem with it. The time periods between data points aren't equal -- in fact they are not close at all

---

So, one way to read the graph, very broadly speaking, is that everything to the left of 1800 is an approximation of population distribution around the world and everything to the right of 1800 is a demonstration of productivity divergences around the world

But the analysis for China is equally flimsy, so I edited my comment to say "China and India" and "likely"

---

TBH now that I see this data, I think both claims are a little flimsy. I was more referring to Dalio's analysis, which covers the last 500 years only, and probably got a few claims mixed up.

For some reason Dalio does not include India much in his analysis -- it is mostly focused on China, the Netherlands, Great Britain, and the USA. That could be his bias, or perhaps a definitional issue with what counts as "India" (and similarly what counts as "China"), which is a difficult question over 500-1000 year time frames


Was India even a country before 1700? It's like saying "Asia was the largest economy"

In a pre-industrial economy, the size of your economy depended on a) having a lot of peasants, b) having a lot of reasonably fertile soil.

China and India between them accounted for at least half of humanity during that period, and probably more. No wonder they were the largest economies.


You are forgetting..

c) Having lots of ships, sailors and powerful army (many in Europe)

d) Being a trading hub (Italy)

e) Having access to lots of valuable metals, salt, gold (richest person in history was Mansa Musa, King of Mali)


Statistical outliers like very rich kings or ports do attract attention, but there was no way to avoid low productivity of agriculture overall. At least 80 per cent of the population had to tend the crops, which lowered the average GDP of entire regions / countries.

The problem with the cyclical view appears as soon as one tries to examine the underlying causes of why this movement happens. You can have rises, prosperity, overshoots, bad habits, slowdowns, and depressions coming one after another in a mostly closed system. Or you may consider a system that can be affected by externalities, ones powerful enough to cause an opium depression or a technology-copying rise. It's a little like a pendulum subject to external forces, which dictate the pendulum's actual behavior, regardless of its own internal properties.

It's true, but really it's both ... there is "progress" and there are also cycles. Dalio explicitly superimposes them on each other

I don't think I believed this until 2017 or 2020 or so, but it seems clear that the USA is on a natural downcycle and probably will be for several decades

I think the causes are pretty well laid out in the book - basically, having the global reserve currency is kind of an unfair advantage; it leads the population to become decadent and unproductive, it leads to institutional decay, etc.

---

That brought to mind the tweet/slogan "what if your whole personality depends on low interest rates?", and now I see that something similar was discussed on Hacker News too:

https://news.ycombinator.com/item?id=33946342

Basically people live in a particular part of a cycle, which lasts longer than their lifetimes

(but there are also non-cyclical changes / progress, and yeah there are big flaws with trying to go back 1000 years)


That’s not true, India had the largest GDP for the vast majority of human civilization.

Hm what’s the best way to look up the owner of a business? I share this viewpoint on dentists and doctors

The fastest and simplest way is to ask them directly by calling or emailing. Searching for information about private legal entities, while possible, is inconvenient and usually incomplete/sparse.

But why would they respond with that information?

PE already has pretty bad press, so they know that customers are calling only because PE is a red flag.

They could just serve some meaningless half-truth and try to confuse you.


PE often doesn’t care because most folks don’t know what is going on. Private practitioners are usually proud of it.

Bing is an alternative to Google Search that took billions of dollars to build

I used it, it's OK, not really any better overall. Slightly worse in some areas and slightly better in others

The problem seems to be that the incentives set up by Google flooded the web with low quality information

Also, some people probably LIKE some of the low quality information ... but I still think there was a fair amount of mismanagement


Bing was unironically fantastic at searching for adult content but they hobbled that through the years.

The memes were excellent promotion for Bing being the place where you find uncensored results.

“Split-face” would immediately return the horrible accident (don't Google it, really - and yes, pun intended), “Scientists” would actually return non-editorialized scientists… Microsoft could have gone further and marketed Google as the colourful kindergarten that you give your children access to, whereas journalists and researchers would use Bing. But they folded.


Yandex

I use Bing every day, via Ecosia. It's amazing how far Bing has come. Overall I'd say that Bing is the better search engine today. The only consistent fail for me is when I forget the address of "The True Size Of" website (www.thetruesize.com); Bing refuses to find that URL, while it's the first result on Google. Other than that, Bing will find anything that Google does.

To this day Bing fails to find Microsoft Learn content, their own stuff!

I find the idea of a home filled with books and its inhabitants talking about them pretty romantic.

Does anyone else find themselves talking about insipid “headlines” with certain relatives? Like various wars and political events

Because that stuff is saturating our media environment and seems to be what people read


The sad thing here is that it is extremely effective and permeates everything. The mainstream media has its talking points and narratives; then a counterculture arrived, not to fix this, but to provide its own talking points and narratives.

Every time a family member calls me it's either something they saw on MSM, or something they saw on Twitter/Facebook. Nobody seems to actually educate themselves and have a discussion, it's just talking points. Half the time they can't expand beyond the headlines.

It feels like we're at peak attention media. At least, I hope so.


Yeah. It's all trying to win an argument rather than discover any kind of truth.

I find myself dreading talking to my siblings these days. I call up to ask how they are and get a rant about Hunter Biden or whatever the flavour of the day is.

Thanks YouTube and Twitter.


> Yeah. It's all trying to win an argument rather than discover any kind of truth.

I suspect this has been the case for the majority of people for the majority of history, unfortunately.


I find it hard to imagine a solar event that will wipe out info on both sides of the earth evenly :-)

I think all the missing people and countries would be just as big a problem …


I think the reference is more about geomagnetic storms caused by solar flares, like the Carrington Event (https://en.wikipedia.org/wiki/Carrington_Event), which if it happened today would probably wreck most power grids across the planet.

This is wild! So much diversity in life

Yeah also heroku and the whole generation of “PaaS”

I was never quite sure why we got the name “serverless”, or where it came from, since there were many such products a few years before, and they already had a name

App engine had both batch workers and web workers too, and Heroku did too

They were both pre-docker, and maybe that makes people think they were different? But I think lambda didn’t launch with docker either


> I was never quite sure why we got the name “serverless”, or where it came from

Serverless refers to the software not being a server (usually implied to be an HTTP server), which was the common way to expose a network application throughout the 2010s; instead, some other process-based means is used to interface the application with an outside server implementation. Hence server-less.

It's not a new idea, of course. Good old CGI is serverless, but CGI defines a specific protocol whereas serverless refers to a broad category of various implementations.
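To make that concrete, here's roughly what a CGI program looks like: the application never listens on a socket; the web server in front of it accepts the HTTP request, runs the process, and relays whatever it prints. A minimal Python sketch:

    #!/usr/bin/env python3
    # Classic CGI: no listening socket anywhere in the application.
    # The web server (Apache, etc.) accepts the request, sets up the
    # environment, runs this process, and relays whatever it prints.
    import os

    print("Content-Type: text/plain")
    print()  # blank line separates headers from the body
    print("Hello from a 'serverless' process!")
    print("You asked for:", os.environ.get("PATH_INFO", "/"))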


Pedantry police here. I would define serverless to mean that all the hardware is completely abstracted away. For instance, on EC2, you have to pick an instance type. You pick how much memory and compute you need. On a managed Kubernetes cluster, you still have to think about nodes. On a serverless platform, though, you have no idea how many computers or what kinds of computers are actually running your code. It just runs when it needs to. Of course there's still an HTTP server somewhere, though.

So, you could run a CGI script on a serverless platform, or a "serverful" one. You could even run it locally.

https://en.wikipedia.org/wiki/Serverless_computing

Per wikipedia: "Serverless is a misnomer in the sense that servers are still used by cloud service providers to execute code for developers. However, developers of serverless applications are not concerned with capacity planning, configuration, management, maintenance, fault tolerance, or scaling of containers, virtual machines, or physical servers."
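To make the distinction concrete: on EC2 you pick a machine up front, but a function-as-a-service platform only asks you for a handler. A minimal AWS Lambda-style handler in Python looks something like this (the event field here is made up for illustration):

    # Minimal Lambda-style handler: note there is no instance type,
    # no node pool, and no capacity planning anywhere in the code.
    # The platform decides when, where, and on what hardware it runs.
    def lambda_handler(event, context):
        name = event.get("name", "world")  # hypothetical event field
        return {"statusCode": 200, "body": f"Hello, {name}!"}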


FWIW I agree with you -- serverless does not refer to "web server", it refers to "linux server machine" (whether it's physical or virtual)

You don't care about the specific machine, the OS kernel, the distro, the web server, or SSL certificates when you're doing "serverless"

And the SAME was true of "PaaS"

This whole subthread just proves that the cloud is a mess -- nobody knows what "serverless" is or that App Engine / Heroku already had it in 2008 :)


> it refers to "linux server machine" (whether it's physical or virtual)

No, "server" most definitely refers to software that listens for network requests. Colloquially, hardware that runs such software is often also given the server moniker ("the computer running the server" is a mouthful), but that has no applicability within the realm of discussion here. If you put the user in front of that same computer with a keyboard and mouse controlling a GUI application, it would no longer be considered a server. We'd call it something like a desktop. It is the software that drives the terminology.

> nobody knows what "serverless" is or that App Engine / Heroku already had it in 2008 :)

Hell, we were doing serverless in the 90s. You uploaded your CGI script to the provider and everything else was their problem.

The difference back then was that everyone used CGI, and FastCGI later on, so we simply called it CGI. If you are old enough to recall, you'll remember many providers popped up advertising "CGI hosting". Nowadays it is a mishmash of proprietary technologies, so while technically no different than what we were doing with CGI back in the day, it isn't always built on literal CGI. Hence why serverless was introduced as a more broad term to capture the gamut of similar technologies.


fly.io is "serverless", but there are HTTP servers inside your Docker container, so I don't agree -- in that case it refers to the lack of pinning to a physical machine

https://fly.io/blog/the-serverless-server/

Pretty sure Lambda has an option for that too -- you are responsible for the HTTP server, which is proxied, yet it is still called serverless

---

On the second point, I wrote a blog post about that - https://www.oilshell.org/blog/2024/06/cgi.html

It would make for a much more interesting conversation if you cite some definitions/sources, as others have done here, rather than merely insisting that everyone thinks of the terms as you think of them


> fly.io is "serverless"

Right, with the quotes being theirs. Meaning even they recognize that it isn't serverless-proper, just a blatant attempt at gaining SEO attention in an effort to advertise their service. It is quite telling when an advertisement that explicitly states right in it it has nothing to do with serverless is the best you could come up with.


I agree "serverless" is not a good name. But hey, it stuck :/

I also can't come up with one that's significantly better.


For all intents and purposes, when is the hardware not fully abstracted away? Even through the 2010s when running as a server was the norm, for the most part you could throw the same code onto basically any hardware without a second thought.

But pedantically, serverless is to be taken literally. It implies that there is no server in your application.


EC2 and managed Kubernetes are two examples where you still have to think about hardware.

Not really. The application doesn't care. Hell, many of these modern serverless frameworks are built so that they can run both server and serverless from the very same codebase, so it is likely you can take the same code built to run on someone's MacBook running macOS/ARM and run it on an EC2 instance running Linux/amd64 and then take it to a serverless provider on any arbitrary hardware without any code modification at all! I've been around the web since Perl was the de facto way to build web apps, and it has always been an exceptional situation to not have the hardware fully abstracted away. Typically, if it will run on one system, it will run on any system.

The move away from CGI/FastCGI/SCGI to the application being the server was a meaningful shift in how web applications were developed. Now that we've started taking the server back out of the application in favour of the process-based model again, albeit now largely through proprietary protocols instead of a standard like CGI, serverless has come into use in recognition of that. We don't want to go back to calling it CGI because CGI is no longer the protocol du jour.
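As a rough sketch of what "same codebase, both models" can look like, here's one request-handling function exposed both ways (Python just for illustration; the serverless entrypoint mimics Lambda's handler signature):

    # One request-handling function, two ways to expose it.
    def handle(path: str) -> str:
        return f"you asked for {path}\n"

    # (a) "Serverful": the application itself listens on a socket.
    def run_as_server(port: int = 8000) -> None:
        from http.server import BaseHTTPRequestHandler, HTTPServer

        class Handler(BaseHTTPRequestHandler):
            def do_GET(self):
                body = handle(self.path).encode()
                self.send_response(200)
                self.end_headers()
                self.wfile.write(body)

        HTTPServer(("", port), Handler).serve_forever()

    # (b) "Serverless": the platform calls this once per request; the
    # listening socket lives in the provider's infrastructure.
    def lambda_handler(event, context):
        return {"statusCode": 200, "body": handle(event.get("path", "/"))}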


Serverless, to me, is purely about efficiency. One way to measure that is the time for a "cold start" or "going from a state where you pay no money to one where you pay money". These gains in efficiency remove the need for over-provisioning and in many cases allow you to pass these savings onto the consumer (if you want to).

Heroku is a few seconds:

> It only takes a few seconds to start a one-off dyno process or to scale up a web or worker process.

Lambda created Firecracker to be snappier:

> The duration of a cold start varies from under 100 ms to over 1 second.

I think App Engine is in the same ballpark as Lambda (and predated it). Fly.io uses Firecracker too:

> While Fly Machine cold starts are extremely fast, it still takes a few hundred milliseconds, so it’s still worth weighing the impact it has on performance.

but WASM is yet an order of magnitude faster and cheaper:

> Cloudflare Workers has eliminated cold starts entirely, meaning they need zero spin up time. This is the case in every location in Cloudflare's global network.

WASM is currently limited in what it can do, but if all you're doing is manipulating and serving HTML, it's fantastic at that.
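(If you want to see this yourself, a crude probe is to time the first request after an idle period against an immediate second one -- the URL below is a placeholder:)

    # Crude cold-start probe: the first request after an idle period
    # pays the spin-up cost; the second should hit a warm instance.
    import time
    import urllib.request

    URL = "https://example.com/your-function"  # placeholder

    def timed_get(url):
        start = time.monotonic()
        urllib.request.urlopen(url).read()
        return (time.monotonic() - start) * 1000  # milliseconds

    print(f"cold-ish: {timed_get(URL):.0f} ms")
    print(f"warm:     {timed_get(URL):.0f} ms")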


When lambda came out and serverless started getting big, most scrappy startups hired many frontend devs.

It was the heyday of SPAs, light backends, and thick frontends.

“Serverless” is a great way to say “you don’t need to be a backend dev or even know anything about backend to deploy with us”

And it worked really really well.

Then people realized that they should know a thing or two about backend.

I always really hated that term.


PaaS, Containerization and Serverless are different concepts.

App Engine is PaaS: You provide your app to the service in a runnable form (maybe a container image, maybe not) and they spin up a dedicated server (or slice of a server) to run it continuously.

Lambda is Serverless: You provide them a bit of code and a condition under which that code should run. They charge you only when that thing happens and the code runs. How they make that happen (deploy it to a bajillion servers? Only deploy it when it’s called?) are implementation details that are abstracted from the user/developer as long as Lambda makes sure that the code runs whenever the condition happens.

So with PaaS you have to pay even if you have 0 users, and when you scale up you have to do so by spinning up more “servers” (which may result in servers not being fully utilized). With Serverless you pay for the exact amount of compute you need, and 0 if your app is idle.
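Back-of-the-envelope, with made-up rates just to show the shape of the difference (these numbers are illustrative, not any vendor's pricing):

    # Illustrative only -- both rates are invented for the comparison.
    HOURS_PER_MONTH = 730
    PAAS_RATE = 0.01       # $/hour for one always-on instance
    FN_RATE = 0.0000002    # $ per request's worth of compute

    for requests in (0, 100_000, 100_000_000):
        paas = PAAS_RATE * HOURS_PER_MONTH  # billed even when idle
        serverless = FN_RATE * requests     # $0 when nothing runs
        print(f"{requests:>11,} req/mo: "
              f"PaaS ${paas:.2f}, serverless ${serverless:.2f}")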


> They charge you only when that thing happens and the code runs.

That's how App Engine worked in 2008, and it looks like it still works that way:

https://cloud.google.com/appengine/pricing

> Apps running in the flexible environment are deployed to virtual machine types that you specify. These virtual machine resources are billed on a per-second basis with a 1 minute minimum usage cost.

This applied to both the web workers and the batch workers

It was "serverless" in 2008!

> spin up a dedicated server (or slice of a server) to run it continuously.

Absolutely NOT true of App Engine in 2008, and I'm pretty sure Heroku in 2008 too!


I recall you could configure App Engine with a maximum number of instances, but you definitely weren't charged if usage was 0. They would start instances as needed.

The fact that lambda would automatically scale to meet whatever QPS you got sounds terrifying.


Serverless is indeed a weird name if you know what you are talking about. I was dumbfounded by the term until I met people who actually thought of anything beyond pushing to git as "the server".

Backend returns 4xx/5xx? The server is down. Particular data is not available in this instance and app handles this error path poorly? The server is down. There is no API to call for this, how do I implement "the server"?

Some people still hold the worldview that application deployment is similar to mod_php, where source files are yoloed to the live filesystem. In this worldview, ignorant of the complexities of operations, serverless is a perfectly fitting marketing term, much like Autopilot, first chosen by Musk. Chef's kiss.


> Serverless is indeed a weird name if you know what you are talking about.

It is a perfectly logical name if you know what you are talking about and are familiar with the history of how these so-called serverless applications used to be developed.

Which is to say that back in the day, once CGI fell out of fashion, the applications became servers themselves. You would have a listening HTTP server right within the application, often reverse proxied through something like Apache or nginx, and that is how it would be exposed to the world. The downside of this model is that your application always needs to be resident in order to serve requests, and, from a scaling perspective, you need to predict ahead of time how many server instances are needed to handle the request load. This often resulted in poor resource utilization.

Now, with a return to the CGI-esque model, where managing servers call upon the application through a process-based execution flow (albeit no longer using CGI specifically), the application is no longer the server. This allows systems to save on resources by killing off all instances of your application when no requests are happening, and, with respect to scalability, it gives the system the freedom to launch as many instances of your application as are required to handle the load when requests start coming in.

Hence, with the end of the application being the server under the adoption of said process-based model, the application became serverless.

> I was dumbfounded by the term

The marketers have certainly tried to usurp the term for other purposes. It seems just about everything is trying to be called "serverless" nowadays. Perhaps that is the source of your dumbfoundedness? Then again, if you know what you are talking about then you know when marketers are blowing smoke, so...


I don’t really understand the question, because based on the details in the article, SOME people knew he was wealthy, but he wasn’t a public figure

Most people aren’t public figures, wealthy or not

He had a job, a bunch of kids, multiple wives, and a $250M divorce, among other things.

Obviously some people knew this. It just wasn’t in the public interest, like 99.99999% of things that have ever happened :-)


I definitely think their work is deserving of awards, but I kinda agree with other commenters in that this says more about the Nobel committee than anything

i.e. Hinton has already won a Turing Award in 2018, and there is no Nobel for computer science

And this work was already recognized to have impact ~12 years ago, when he auctioned his company of 2 grad students to Google/Microsoft/Baidu/Facebook, for over $40M, ultimately going with Google [1]

---

i.e. IMO it feels a little late / weird / irrelevant to be giving this award in physics to machine learning research – it doesn’t feel like that would have happened without the news cycle

At least IMO the scientific awards are more interesting when they're leading indicators, not trailing ones -- when they are given by peers, acknowledging impact that may happen in the future.

Because it often takes decades to have impact, and it may occur after the researcher has passed away

---

[1] https://www.amazon.com/Genius-Makers-Mavericks-Brought-Faceb... - good book if you’re interested in how technology transfer happened in the last 10-15 years


> IMO it feels a little late / weird / irrelevant

> scientific awards are more interesting when leading indicators

Peter Higgs waited 50 years, the Nobel is not a "leading indicator." If it was, it would be given out on the basis of the "hype cycle," which would not be very helpful to anybody.


Well, it's possible to wait 50 years, and still NOT have realized the full impact of your work in society

Sometimes science/engineering turns out like that

e.g. I think Claude Shannon is like that -- his impact continues to rise, and he's viewed as more important after he died

He apparently never won a Turing Award or Nobel Prize, probably because there was and is no Nobel in computer science

https://en.wikipedia.org/wiki/Claude_Shannon#Awards_and_hono...

So I guess I mean "drawing attention to something that would have not otherwise had attention", and based on the consensus of people working in the field


The Higgs boson was first detected in 2012 and he won the Nobel the following year. Saying he waited 50 years for the prize is a bit disingenuous.

Not the poster, but I don't understand the downvotes: this is exactly right. Higgs was awarded the Nobel after the mechanism he theorized was experimentally confirmed, and that is 100% the reason it took so long.

Right! Einstein didn't get the Nobel because the theory of relativity is awesome, he got it after Eddington observed gravitational lensing during an eclipse, confirming a key prediction.

Brilliant theorizing can be both brilliant and wrong.


Einstein got the Nobel Prize for his work on quantum physics, not relativity.

It's not disingenuous, the Higgs mechanism was theorized in the 60s: https://en.wikipedia.org/wiki/Search_for_the_Higgs_boson

I generally agree with this article in that PROGRAMMABILITY is the core of Unix, and it is why I've been working on https://www.oilshell.org/ for many years

However, I think the counterpoint is maybe a programming analog of Doctorow's "The Coming Civil War over General Purpose Computing"

I believe the idea there was that we would all have iPads and iPhones, with content delivered to us, but we would not have the power to create our own content, or do arbitrary things with computers

I think some of that has come to pass, at least for some fairly large portions of the population

(though people are infinitely creative -- I found this story of people writing novels on their phones with Google Docs, and selling them via WhatsApp, interesting and cool - https://theweek.com/culture-life/books/the-rise-of-the-whats... )

---

The Unix/shell version of that is that valuable and non-trivial logic/knowledge will be hidden in cloud services, often behind a YAML interface.

And your job is now to LLM the YAML that approximates what you want to do

Not actually do any programming, which can lead to adjacent thoughts that the cloud/YAML owners didn't think of

In some cases there is no such YAML, or it's been trained out of the LLM, so you can't think that thought

---

There's an economic sense to this, in some ways, but personally I don't want to live in that world :)


I agree so strongly. I'm a Vim user and from time to time discuss Vim/Emacs vs. $IDE, and there are usually good considerations of how the various differences might affect a code base: would the ease of refactoring in an IDE lead to better consistency, does IDE autocomplete scatter typos throughout, etc.

But I can't recall a discussion of how they affect us. The tools we use, the techniques those tools allow and foreclose, profoundly shape our thoughts and feelings. This applies to any creative practice (I'm not one of those "code is art" people, but you are creating something), not just software.

I think I share your worry, but in a more abstract sense: how does the act of thinking about building software shape us, and what would we lose without it? Would we be better off? Would it have been better if we applied those mental resources elsewhere? Has society benefited from a huge swell of humans thinking this way?

Something that heartens me a little is that I think the rich world is on the cusp of being able to do things broadly only because we want to. I may never write another Django app again, unless I want to experience how they did it in the early 21st century. I think this culture is emerging--I wish it were more widespread, and we were more focused on bringing it to all humanity, but its emergence gives me hope.


Agreed, I definitely like how Vim makes me feel, and how it opens up some head space

I recently wrote a comment about how I started using it 20 years ago, and even then it was viewed as OLD!! My older co-workers were using Java IDEs, wondering why I started using such an old editor

https://lobste.rs/s/20t1jj/wonderful_vi#c_rnkwas

And the original article is about the creator of Rails switching to Linux + vim, from Mac + SublimeText I think.

So it's funny how Vim is timeless, and people keep re-discovering it. I discovered it in 2005 and never looked back!


> And your job is now to LLM the YAML that approximates what you want to do

s/YAML/JCL/g

s/LLM/clone and edit/g

And you've pretty much described the mainframe world.

For this reason, one of the things that AT&T thought to do back in the 70s with its new OS, Unix, was to give it to their engineers as a more sensible interface with which to write programs for, and submit jobs to, the mainframe. The version that was built for this purpose was called PWB/Unix (for Programmer's Workbench).


Yes! And I remember "Back to the 70's with Serverless" (2020) as a great article which specifically mentioned JCL and the clunkiness of the modern cloud:

https://archive.ph/sGwuv

https://news.ycombinator.com/item?id=25482410

I referenced it on my blog -- it's a shame the link has rotted now


I see your concern, but don't think it's anything to be worried about. Is an electrician's job at risk because homeowners can purchase wiring and outlets from a big box store and tap in a new outlet at home? Are mechanics worried about people who do oil changes at home?

There will always be a demand for skilled labor, but the definition of "skilled" is going to continue changing over time. That's a good sign, it means that the field is healthy and growing.


My fear, perhaps ill founded, is that the "electricians" of the machines will just age out like COBOL devs. Highly sought after and in demand, yet work no one new is learning to take over.

A large percentage of the current software workforce, professional and open source, are people who learned these skills casually growing up rather than explicitly in school as a career. I'm not sure this demographic exists in any meaningful numbers in younger generations.

Will there be enough people to maintain our foundations when the only ones who understand them are the ones formally educated? What happens to the actual number of people even interested in a computing career path when they didn't grow up with "classical" computers?

I am happy to be totally wrong here, it's just the kind of thing that keeps me up at night.


My concern is not really about jobs! It's about the "thoughts you're able to think"

I guess whether you think this is real or not is a similar question to whether you think the iPad/iPhone thing is real

Did that happen, or not? (honest question)

The irony is that if it did happen to a certain person, that person won't notice it

Personally I do think it's very real, because thoughts are correlated with what's "ready at hand", what you can accomplish in an environment


You can think whatever you want, despite what Paul Graham says. No one is required to pay you for the thoughts you think, though

In many countries, insurance won't pay out if something goes wrong because the work wasn't done by a professional electrician or mechanic.
