I think Europe should resume drilling in the North Sea and Groningen if they have exploitable reserves there. Europe depends on energy imports and that won't change in our lifetimes (I'm in my early forties, so at least in my lifetime.) They should take advantage of whatever resources they have.
I'm guessing you think otherwise? Why? Do you think the energy transition will be faster? What makes you think that?
Because the continued survival of civilization depends on leaving fossil fuel in the ground. If the transition isn't fast enough then we will have horrible, lethal shortages, but that's still better than the worst climate scenarios.
Closing Groningen didn't leave fossil fuel in the ground. It took LNG from the US and gas from Norway out of the ground instead.
The decision to stop using fossil fuels is not tied to the decision to stop one of the sources of fossil fuels. They're divorced.
Stopping fossil fuels requires investments in alternatives, and price mechanisms that disfavour fossils. Absent those mechanisms, closing one source of fossil just shifts demand to another source of fossil, which is exactly what happened.
Meanwhile closing the gas source cost the NL a few hundred billion euros, the amount of money it needs to transition to renewables. Instead it is spending that on US LNG and Norwegian gas.
The field shouldn't have been closed in 2023, it should've remained open until e.g. 2030 and all proceeds earmarked for massive energy transition subsidies. Instead we're just importing expensive fossil now and have insufficient money to meet our green ambitions.
That may or may not be true. But it won't stay in the ground as long as there is money to be made by extracting and consuming it.
Right now all that's happening is the US is extracting that natural gas, and the middle east extracting that oil, and Europe is importing it. Which pollutes more and costs more. Just develop your domestic supplies.
The only direct thing we (the Netherlands) can do to prevent catastrophic climate change is to leave the fossil fuels on our territory in the ground. Everything else is indirect.
I know this is an unfathomable concept, but to actually "leave fossil fuels [...] in the ground" you have to stop using fossil fuels. Burning fossil fuels someone else refused to leave in the ground means--surprisingly--that fossil fuels weren't left in the ground after all.
And it turns out that we actually live on a shared planet with a common atmosphere; sourcing your fuels from abroad does nothing to prevent climate change. But it does mean that you are unable to secure some of the most fundamental inputs to your economy.
Plus you have no control over the standards for extractions (e.g. methane leaks), and shipping it causes more pollution.
They're actually worse off, and they pay more for it instead of creating jobs and keeping the money in their own economy. Meaning less money for e.g. green programs to move away from fossil fuels.
> That is because that money is allowed to be made by externalizing the cost to future generations.
I don’t understand why you wrote this in response to the comment you replied to.
No matter which way you slice it, the UK and Europe using the oil from wells physically closer to them has to be less energy intensive than shipping oil / gas from far away.
What bearing does externalising anything have on that fact?
Economics 101: if Europe taps new wells, global supply increases. Higher supply drives down prices. Lower prices induce more consumption.
We wouldn't just be cleanly swapping imported fuel for domestic fuel 1:1; we'd be making it cheaper to burn more fossil fuels globally. The marginal emissions saved on shipping are completely wiped out by the net increase in total carbon burned.
The only reason expanding that supply looks like a "win" on a balance sheet today is exactly because the long-term climate cost of burning that newly available fuel is still being passed on to the future.
> long-term climate cost of burning that newly available fuel is still being passed on to the future.
That’s not science.
That’s wishful thinking.
We can’t actually know the long term climate-costs of burning fossil fuels.
It’s unfalsifiable.
We don’t have a second identical Earth we can use as a control.
Expanding the fossil fuel supply today (a matter of months) reduces the impact of global oil / gas shocks on people suffering high prices today.
Waiting for your team to invent new battery and storage technologies, litter the countryside with wind turbines, and replace the entire existing vehicle fleet does nothing to help people now.
You’re willing to sacrifice the lives of at least some poor people who exist now, or are likely to exist in the near future, for a theory that is unfalsifiable.
That’s not science.
That’s brainwashing, and it’s not even good brainwashing.
> You’re willing to sacrifice the lives of at least some poor people who exist now, or are likely to exist in the near future, for a theory that is unfalsifiable.
What exactly do you mean with "unfalsifiable"? We actually measure atmospheric CO2, sea level and temperature; that's plenty falsifiability to me. And the greenhouse effect itself is not even in question.
Fossil emissions are sacrificing people not just from climate change in the future, but right now from air pollution, too (about 5M deaths per year actually, according to https://pubmed.ncbi.nlm.nih.gov/38030155/).
Climate science wants us to ignore the geological record and ignore geological processes.
A cubic kilometre of lava at 1200 degrees C is enough energy for thirty (30!) hurricanes.
It’s entirely possible that sea temperature rise is a result of geologic processes at or near the sea bed, and when you warm a liquid dissolved gasses are liberated.
But climate science wants us to ignore all that and place the blame entirely on human-caused CO2 emissions and cow farts, while we are literally living through an ice age.
If you think the greenhouse effect is real (CO2 contributes to warming), why would human emissions have no effect? We currently emit tons of it per person per year, for a substance that sits in only the 400 ppm range; even if you spread a single human's annual emissions through a whole cubic kilometre of air, it makes a measurable concentration difference already.
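To put rough numbers on that, here is a back-of-envelope sketch. The emission and atmosphere figures are round public estimates of my own choosing, not from this thread:

```python
# Back-of-envelope: how much do annual human CO2 emissions move a ~420 ppm
# atmosphere? All figures are round public estimates (assumptions).
emissions_kg  = 37e12        # ~37 Gt of CO2 emitted per year
atmosphere_kg = 5.15e18      # total mass of Earth's atmosphere
m_co2, m_air  = 44.01, 28.97 # molar masses in g/mol

ppm_by_mass   = emissions_kg / atmosphere_kg * 1e6
ppm_by_volume = ppm_by_mass * m_air / m_co2  # mass fraction -> mole fraction

# ~4.7 ppm/year gross; the observed rise is ~2.5 ppm/year because oceans
# and vegetation absorb roughly half of what we emit.
print(round(ppm_by_volume, 1))  # 4.7
```

That gross figure of several ppm per year, compounding over decades, lines up with the measured rise from ~280 ppm pre-industrial to ~420 ppm today.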
> It’s entirely possible that sea temperature rise is a result of geologic processes at or near the sea bed, and when you warm a liquid dissolved gasses are liberated.
No, this is not remotely plausible, because we have a pretty solid understanding of how much heat is transferred from the Earth's interior (see https://en.wikipedia.org/wiki/Earth%27s_internal_heat_budget), and it is completely negligible (off by orders of magnitude) compared to the oceanic warming we already observe: a 0.5 K increase in upper-ocean temperature would require centuries' worth of the planet's entire internal heat output.
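As an order-of-magnitude sanity check (all constants are rough round figures I picked myself, e.g. a 700 m upper-ocean layer; treat this as a sketch only):

```python
# Could geothermal heat flow warm the upper ocean by 0.5 K? Order-of-magnitude
# sketch; all constants are rough round figures (assumptions).
ocean_area = 3.6e14          # m^2, ocean surface area
depth      = 700.0           # m, layer where most observed warming is found
rho, c_p   = 1030.0, 4000.0  # seawater density (kg/m^3), specific heat (J/kg/K)
delta_t    = 0.5             # K

energy_needed = ocean_area * depth * rho * c_p * delta_t  # joules
geothermal_w  = 47e12                                     # ~47 TW total internal heat
years_needed  = energy_needed / (geothermal_w * 3.156e7)  # ~3.156e7 s per year

# Even devoting ALL of Earth's internal heat output to this one layer,
# the 0.5 K warming would take centuries, not the decades we observe.
print(round(years_needed))
```

And only a fraction of that 47 TW actually flows through the sea floor, so the real gap is even larger.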
> It’s entirely possible that sea temperature rise is a result of geologic processes at or near the sea bed, and when you warm a liquid dissolved gasses are liberated.
You're mistaking possible for probable. There's no evidence to suggest that's the case, and lots of evidence that it's from climate change. In science you follow the evidence, not your pet theory.
> But climate science wants us to ignore all that and place the blame entirely on human caused CO2 emissions and cow farts, while we are literally living through and ice age.
I don't think you understand either science or ice ages.
What exactly is your argument here? That organic chemistry is all wrong and oxidization is unfalsifiable, or that the fossil industry itself is fudging the numbers to make it look like we're oxidizing less organic matter than we think?
It would be an inefficient use of capital to support more fossil exploration considering the deployment rates and cost decline curves of renewables and storage.
> But what the past year has shown is that it’s possible to go harder and faster in deploying solar panels and batteries, reducing energy use, and permanently swapping out entrenched sources of fossil fuel. Solar installations across Europe increased by a record 40-gigawatts last year, up 35% compared with 2021, just shy of the most optimistic scenario from researchers at BloombergNEF. That jump was driven primarily by consumers who saw cheap solar panels as a way to cut their own energy bills. It essentially pushed the solar rollout ahead by a few years, hitting a level that will be sustained by EU policies.
(Europe has enough wind potential to power the world, their energy constraints are deployment rate of renewables, battery storage, and transmission)
You're talking about electricity, so I assume your answer is directed to the natural gas fields at Groningen. The EU imports a lot of natural gas. Don't you think it would be better to have a domestic supply? It's better for the environment too.
Heck right now, Europe is still burning coal (and worse yet - lignite coal) for electricity. Natural gas would be a huge improvement on that.
Note that drilling for oil in the North Sea is a completely different subject, because that's not used for electricity generation, nor is electricity a substitute. EV market share in Europe is still far too low for that to be a conversation for a long time.
Your comment is wishful thinking and ignores the current reality of how Europe imports and uses energy.
But even if your best case scenario were somehow possible (and it really isn't) there's still money to be made exporting fossil fuels to the developing world. So your assertion "inefficient use of capital to support more fossil exploration" is just flat wrong.
No, everything can move to electricity; China is doing it, and Europe can too. You are entitled to your opinion, but the facts and evidence are clear. If you would like an hour of time with an expert from Ember Energy to explain this, I'm happy to pay for that hour for you to update your priors and mental model on Europe's energy transition trajectory.
> Note that drilling for oil in the North Sea is a completely different subject, because that's not used for electricity generation, nor is electricity a substitute. EV market share in Europe is still far too low for that to be a conversation for a long time.
Europe's EV uptake will speed up as the price of oil increases and remains high for the foreseeable future.
> But even if your best case scenario were somehow possible (and it really isn't) there's still money to be made exporting fossil fuels to the developing world. So your assertion "inefficient use of capital to support more fossil exploration" is just flat wrong.
The developing world is leapfrogging fossil fuels and going straight to solar, batteries, and EVs. What will expensive LNG do to this market? It will force them to renewables and electric mobility faster. Ethiopia's uptake of EVs after they banned combustion vehicles is an example of this. Why did they ban combustion vehicles? Because they have no domestic fossil fuel supplies and the import cost was crushing them; their EVs are powered by domestic hydroelectricity production.
Doesn't the world only have about 50 years [0] worth of oil remaining in the ground? Climate change and war aside, it seems like that should be a major reason to accelerate the change to renewables.
No I don't think so. The oil industry is very good at discovering and developing resources previously thought to be out of reach.
People have been talking about peak oil for decades, as long as I can remember, and it never happened.
I think we're technologically capable of extracting more oil, coal, and gas than we would ever want to. We would cook ourselves with the damage we'd do to the climate. I think that's the real constraint - and I hope we pay attention to it.
Conventional oil actually peaked around 2005–2006, but the shale oil revolution in the U.S. and technological advances have certainly postponed peak oil itself.
Here comes the kicker, though: we obviously extracted the easy-to-access resources first. While there may be counterexamples, looking at ore grades makes it clear that this is not particular to oil.
What happens next is that the economics of the wells get worse, meaning we need a higher oil price for them to be viable. This also results in a lower energy return on energy invested (EROI), which reduces the surplus energy available to transform our environment, and consequently implies slower economic growth. I think that's pretty obvious in the West and would explain the explosion of debt.
I think your analysis is US-centric. I don't think non-shale oil has peaked yet globally.
What you say about the economics getting worse and lower EROI may be true. It certainly seems like common sense. There are some counter-examples though.
The inflation adjusted cost of extracting oil from the oil sands in Alberta, Canada has actually decreased over time, not increased.
But generally I'd expect increasing cost of extraction to be the norm.
I've done my best to educate with facts and citations. Appreciate the discussion regardless. My offer stands to pay for you to talk to a subject matter expert.
Edit (to respond to your edit):
> Global consumption of coal, oil, and natural gas all rose in 2025. We've not even peaked yet.
Do you think global LNG consumption will peak considering a material amount of production has been taken offline for the next five years as of today? If I am an LNG consumer on the global market, am I re-evaluating my options today for the next half decade of energy needs? And we are not even done yet with additional potential attacks on Middle Eastern fossil infrastructure as long as the conflict continues; there are more targets available, and more capacity that could be diminished for the foreseeable future.
> Renewable energy continues to expand rapidly, but not fast enough for a total reduction in fossil fuels. Emissions from burning oil are projected to rise by 1% in 2025, while gas emissions are set to increase by 1.3%, and coal by 0.8%.
These increases are not material in a world where 1 TW of solar PV is being deployed per year. Global solar capacity doubles every three years [!!] at current rates. If that rate holds (without even accounting for further acceleration as more PV manufacturing capacity comes online), it will replace all fossil energy globally (not just fossil electricity, all fossil energy use) in under twenty years, once you consider the efficiency gains of not burning fuel for energy.
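A quick check of that doubling arithmetic (illustrative only; it simply assumes the current growth rate holds, which is the stated premise):

```python
# If solar capacity doubles every 3 years, how much capacity after 20 years?
doubling_period_years = 3
horizon_years = 20
growth_factor = 2 ** (horizon_years / doubling_period_years)
print(round(growth_factor))  # 102, i.e. roughly 100x current capacity
```

Roughly 100x in two decades is the compounding that makes small annual fossil increases look immaterial on that view; whether the rate actually holds is of course the whole debate.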
> Solar and wind are now expanding fast enough to meet all new electricity demand, a milestone reached in the first three quarters of 2025. Ember’s analysis published in November shows that these technologies are no longer just catching up; they are outpacing demand growth itself. Together, solar and wind supplied 17.6% of global electricity in the first three quarters of 2025, up from 15.2% over the same period last year, pushing the total share of low-carbon sources to 43%.
> For the first time across a sustained period, renewables, including solar, wind, hydro and smaller sources such as geothermal, generated more electricity than coal. At the heart of this shift is solar, whose growth was more than three times larger than any other source of electricity so far in 2025, confirming its role as the dominant force reshaping the global power system. Another analysis showed that the world is set to add 793 GW of renewable capacity in 2025, up 11% from the 717 GW added in 2024. At this pace, only a modest increase in annual additions is needed for the world to stay on track to triple global renewables by 2030.
If that's true, I don't think we reach 50% of current levels by 2100. That's my very non-scientific WAG. I'll be long dead by then. Europe, if it had continued drilling in the North Sea and Groningen, would have long since exhausted those fields - a great capital expenditure and investment, to bring things back to the original subject of conversation.
What do you think? That would give me a good window into how realistic your view is.
I think where you're going wrong is perhaps not taking into account continued increases in worldwide energy usage. But of course that will happen: not just because of population growth, not just because of the rest of the world rising slowly toward Western standards of living, but because of continued technological progress, which depends on energy (or at least it has historically.)
Doomberg (the green chicken) correctly observes that when we add a new energy source to the mix, we don't tend to decrease our consumption of previous energy sources.
For example: global wood consumption for energy is at or near all-time high levels, with approximately 2 billion cubic meters (m³) of wood fuel consumed in 2023, up from 1.5 billion m³ in 1961. While the percentage of global energy provided by wood has plummeted from over 90% in the early 19th century to around 3-6% today, the total volume burned has increased, driven by population growth in developing nations and increasing bioenergy use in developed ones.
Those people, and the world in general, would be better off burning natural gas for heating and cooking, rather than wood.
But environmentalists in the west deny them that option because they don’t give a fuck about poor people, they can just freeze in the dark or choke on the fumes of whatever plant fibres / dung they can scavenge from the local environment.
I don’t know how else to frame it.
I spent, more like wasted, two decades of my life in the cult of environmentalism, and they literally just come out and say it: some people are going to die in the transition away from fossil fuels, oh well.
That’s easy to say when it’s not you who’s going to freeze in the dark.
> Why did they ban combustion vehicles? Because they have no domestic fossil fuel supplies and the import cost was crushing them
Your closing argument, about some faraway land with no nat gas / oil reserves of its own, isn't convincing anyone with nat gas / oil reserves of their own.
Europeans need inexpensive fuels to power their existing fleet of vehicles now.
Europe has no choice but to lean into low carbon generation and EVs now, their hand is forced by geopolitical energy events outside of their control. These options are objectively cheaper than attempting to develop new domestic fossil resources.
China will build every cheap EV Europe will buy if the EU cannot build them fast enough (citations on EU EV sales are in my comment you replied to), so buy them or experience economic pain and ongoing energy inflation from choosing to continue to burn fossil fuels for energy. These are straightforward choices to make. The clean energy path is the cheaper path, based on all available data as of this comment.
(from your profile, "They force us to use extremely expensive renewable energy to run our energy efficient extremely disposable appliances," so I'm unsure how effective facts and data will be in this discussion, but I am trying very hard to share the relevant facts as a shared foundation to discuss from)
I am 100% bullish on both solar energy and EVs, and I share your optimism around the technology.
But I think you're being too optimistic about what this means for global fossil fuel usage. Definitely over the next decade, but potentially over much longer periods than that.
I’ve been following you for 13 years on this site, and I really expect more intellectually honest comments from you.
You’re trying to tell us that developing new battery technology, new storage technology, deploying more wind / solar, and replacing the entire European vehicle fleet is cheaper than building new oil / gas infrastructure.
I’m not buying it.
Regarding the comment in my profile, I’ve done warranty repair work on home appliances. Some manufacturers have moved to assembly methods that render appliances uneconomical to repair, or impractical. Also, in Australia, electricity price increases have been double or triple that of general inflation.
The comment in my profile is an objective assessment of the facts, not an ideological screed.
> I’ve been following you for 13 years on this site, and I really expect more intellectually honest comments from you.
I believe the failure is on your part, not my part, as I am simply providing facts. Whether you agree with facts is beyond my control. They remain facts. You keep stating, inaccurately, that renewables and batteries are expensive, when they are the cheapest combined generation technology. This is widely proven, and again, I am sorry if for whatever reason you are ignoring that fact.
> Also, in Australia, electricity price increases have been double or triple that of general inflation.
The facts do not align with your assertion. I have provided citations below to assist you in updating your mental model on the price and carbon intensity trajectory of the Australian power markets.
> Power prices on Australia’s east coast are predicted to fall from July because of increased output from wind generation and batteries, and falling electricity contract prices, with potential savings up to $1,320 for some small businesses. In a draft decision on Thursday, the Australia Energy Regulator (AER) proposed a price reduction for customers on standing electricity plans – known as the “default market offer” – of between 1.3% to 10.1% for residential customers, and between 8.5% and 21.2% for small businesses, depending on the region. Savage said reduced wholesale prices were the “biggest driver” behind the draft decision for 2026-27. “We’ve had lots more renewables come into the market. We’ve had good wind, solar and battery performance.” The draft determination also introduces the “solar sharer” offer, an opt-in plan that includes three hours of free power in the middle of the day to take advantage of abundant solar energy. The energy minister, Chris Bowen, said the idea was designed to share the benefits of Australia’s solar success. “For households that can shift some of their usage into the free power period, this can mean real savings on bills, whether that is running the dishwasher, doing the washing, or heating hot water during the day.”
Solar is so cheap and plentiful (as there is not yet enough battery storage on the network to time shift this power), they plan to give it away for free for ~3 hours/day in parts of the Australian NEM system.
Australia's renewables boom delivers coveted power price payoff - https://www.reuters.com/markets/commodities/australias-renew... - February 10th, 2026 ("Australia's wholesale electricity prices fell to the lowest in four years in 2025, bucking the rising price trends seen elsewhere and validating claims that renewables-heavy power system overhauls can help lower consumer power costs.")
Further data can confirm this at https://openelectricity.org.au/ (which has both carbon intensity and price data for all Australian electrical grids except the Northern Territory)
I’m an Australian living in Australia connected to the eastern grid.
I have the invoices from my electricity bills to prove my assertion.
I used to pay 12 cents per kilowatt hour; now I pay 36 cents or more. That’s a tripling of electricity prices versus, if I recall correctly, 26% general inflation over the same 25-year period.
And you’re trying to tell me I’m wrong.
Why?
I’ll believe a drop in electricity prices when I see it.
Renewables do not reduce the need for grid investment between generators and you; that revenue and capital would be required regardless, unless you are able and willing to produce all of the electricity for your domestic consumption from rooftop solar and on-site battery storage.
Wholesale generation costs + distribution costs + taxes = your bill.
So you've decided it's more ethical for them never to be born or live at all. Obviously the only reason beef cattle exist is that we eat them.
It doesn't seem such a clear cut ethical decision to me. Certainly there are some forms of raising livestock that are terrible (broiler chickens come to mind), but there are other forms that actually seem quite pleasant for the animals most of the time (e.g. free-range cattle).
Free-range isn't pleasant at all; that's a fair bit of clever marketing. It's virtually impossible to find meat from an animal that lived a "pleasant" life through to slaughter. I'd eat meat myself if I could find a reliable and cost-effective source of ethical farming.
You actually could find a reliable and cost effective source of ethical ranching, it typically requires buying e.g. a whole side of beef from a specific ranch.
Not very convenient, but if you worry so much about the conditions of the animal, that gives you a way to choose one that fits your standards.
There are lots of bets on Polymarket about when certain politicians cease to be the leader of the country. Trump, Netanyahu, Putin, Mojtaba Khamenei, Zelensky, etc. If they die, those markets are resolved in a predictable way. Death pools already exist, and it's a matter of time before we see an assassination attempt motivated by it.
It is a lot simpler, and fairer to taxpayers given the high litigation costs, to simply make ad hoc SaaS betting of the kind Polymarket provides illegal. Why should taxpayers have to pay the regulation costs and, more broadly, society bear the obvious disruption vectors arising from arbitrary speculation?
We're already seeing “insider gambling” from the current administration so I'm pretty convinced we'll see assassinations motivated by polymarket gains alone soon enough.
You will also add a markdown file to the changelog directory, named with the current date and time (`date -u +"%Y-%m-%dT%H-%M-%SZ"`), recording the prompt and a brief summary of the changes you made; this should be the same summary you gave the developer in the chat.
From that I get the prompt and the summary for each change. It's not perfect but it at least adds some context around the commit.
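For what it's worth, that instruction boils down to something like this sketch (the `changelog/` directory name and the headings are my assumptions, not a fixed convention):

```shell
# Create a timestamped changelog entry; the directory name and the
# "Prompt"/"Summary" headings are illustrative assumptions.
mkdir -p changelog
ts=$(date -u +"%Y-%m-%dT%H-%M-%SZ")
cat > "changelog/${ts}.md" <<EOF
## Prompt
(the developer's prompt, verbatim)

## Summary
(the same summary the agent gave in chat)
EOF
```

The UTC timestamp in the filename keeps the directory sorted chronologically with a plain `ls`.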
Isn’t the commit message a better place to record the what and why? You might need to feed the agent some info it doesn’t have access to (“we are developing feature X; this change will do such and such”). The agent will write a pretty good commit message most of the time. Why do you need a markdown file? Are you releasing new versions of the software to third parties?
Cheaper and faster retrieval: it can be added to the context directly and is discoverable by the agent.
You need more git commands to find the right commit containing the context you want (whether it's you the human, or the LLM burning tokens and time) than you need to just include the right MD file, or grep it with the proper keywords.
Moreover, you might need multiple commits to recover the full context, whereas if you ask the LLM to keep the MD file up to date, you have everything in one place.
The problem isn't giving MORE context to an agent, it's giving the right context
These things are built for pattern matching, and if you keep their context focused on one pattern, they'll perform much better
You want to avoid dumping in a bunch of data (like a year's worth of git logs) and telling it to sort out what's relevant itself
Better to have pre-processing steps, that find (and maybe summarize) what's relevant, then only bring that into context
You can do that by running your git history through a cheap model, and asking it to extract the relevant bits for the current change. But, that can be overkill and error prone, compared to just maintaining markdown files as you make changes
"You want to avoid dumping in a bunch of data (like a year's worth of git logs) and telling it to sort out what's relevant itself"
So instead you give it a year's worth of changelog.md?
"Better to have pre-processing steps, that find (and maybe summarize) what's relevant, then only bring that into context"
So, not a list of commits that touched the relevant files or are associated with relevant issues? That kind of "preprocessing" doesn't count?
"You can do that by running your git history through a cheap model, and asking it to extract the relevant bits for the current change. But, that can be overkill and error prone, compared to just maintaining markdown files as you make changes"
And somehow extracting the same data out of a [relatively] unstructured and context-free markdown file (the changelog only has dates and descriptions, which will need to be correlated to actual changes with git anyway...) is magically less error-prone?
Hey you can try it if you like. That's one of the beauties of the current moment, nobody REALLY knows what works best, just a whole lot of people trying stuff
And no, I wouldn't ever give it a year of changelog.md. I give it a short description of the current functionality, and a well-trimmed list of 'lessons-learned' (specific pitfalls/traps from previous work, so the AI doesn't have to repeat them)
If you think git logs are a good way to give context, try it and see how it works! My instinct is that it won't work as well as a short readme, but I could be wrong. It's so easy to prototype these days, there's no reason not to give it a shot.
"a short description of the current functionality, and a well-trimmed list of 'lessons-learned'"
Where does that come from?
"And no, I wouldn't ever give it a year of changelog.md."
No, instead you'll "[run] your git history through a cheap model". Except it's "overkill and error prone". So you're writing it up yourself? You didn't do the work, how do you know what the pitfalls and traps are?
How often, in your experience, do people read those auto-generated markdown files? Do you have any empirical data on how useful people find reading other people's agents' auto-generated files?
While I have written elsewhere[1] that I think AI is causing a bubble right now, AI is also the biggest technological change to the world since the Internet.
I'm a software engineer, and I don't write code anymore. I'm still coming to terms with that, grieving the loss of my old career and getting used to the new career which is more like a technical lead and product person than a computer programmer.
Stop calling LLMs AI. We have LLMs as a product, now, but not AI. AI does exist as a research field, and so do "flying cars" and "nuclear fusion" (with arguably those two being much closer to materiality than AI).
And no, that doesn't make me some kind of "AI hater", or someone unable to see value in LLMs.
It’s funny to see the treadmill on the term “AI” moved again.
There’s a reason the term AGI is used a lot now. What are LLMs if not intelligence that is artificial? Just today I used one to debug code, write a shader (which, to be fair, it’s only slightly better than me at), and tell my daughter and me what hedgehogs and foxes eat. Seems pretty intelligent to me.
Of course, not long ago we were using the term largely for things that were basically big chains of conditional statements.
"the treadmill on the term “AI”" hasn't moved much or at all, and that's essentially my point. Only 3-5 tech giants want us to think it has.
> Seems pretty intelligent to me.
Convenient? Yes. Intelligent? You can redefine AI to be whatever you want by lowering the bar however far you like. LLMs gave us slightly more convincing chatbots than the previous generation, and nobody back then would have called those intelligent. The only reason we do now is marketing. It's laughable to me that we still call "intelligent" something so obviously unable to push back on trivially impossible requests.
> Of course, not long ago we were using the term largely for things that were basically big chains of conditional statements.
No, we weren't? What are you even talking about? We had already built artificial neural networks large enough that their outcomes couldn't be trivially explained back in the 70s. Nobody with a sane mind and an understanding of what they were would have called them oracles or intelligent devices. That's where the true genius of Altman lies.
The only brand new definition of AI is the one that came in full marketing speed shortly after ChatGPT, to have us believe that "AI has been solved and is a commodity, now" while all we got were more chatbots.
In the academic fields where this taxonomy matters, nothing much has changed with LLMs, or not more than with DNNs, SVMs, etc. Nobody who's been involved in ML research for more than 5 years seriously thinks "job's done, pals, we get to pack it up; after 70 or so years of effort we've finally figured it out, and that AI we were looking for, we got it".
In 1968 Marvin Minsky defined "artificial intelligence" as "the science of making machines do things that would require intelligence if done by men".
I suspect Alan Turing would also have disagreed with you, but to my knowledge he didn't actually use the precise words "artificial intelligence" so maybe we can disregard him.
But hey, your definition is much better established, right?
What you're proposing is in fact a brand new definition of AI. There are terms in use to describe what you want, AGI and ASI for example are more in that direction.
> But hey, your definition is much better established, right?
I don't like this manner of baselessly singling me out. Once again, there is no "my definition" of AI. Here is the one from Wikipedia, which matches the academic definition I'm aligned with: "It is a field of research in computer science that develops and studies methods and software that enable machines to perceive their environment and use learning and intelligence to take actions that maximize their chances of achieving defined goals."
My whole point is that the current AI craze boils down to one single highly specific method that got added to the AI toolbox. It has clear shortcomings and limitations that major tech corporations knowingly hide or disguise, for their profit and our deception. My take is that we collectively benefit from calling the current breed of AI products what they are: LLMs. For the sake of transparency, correctness, and honesty toward consumers. The marketing around those products needs to be regulated accordingly.
Now I want to elaborate on why the commonly used taxonomy of AI, as formalised by scientists, or what you call "my definition", is the only one that matters (with the practical consequence that nobody in the field uses the term "AI" without more qualifiers). Say you develop a program to play tic-tac-toe. Its complexity is low enough that it has an analytical solution, with all the states of the game stored and looked up. It could also be learned by a convolutional neural network, a family within machine learning, within AI, trained on lots of games. Would you call the resulting capability of playing tic-tac-toe "AI"? Theoretically yes, you must (and so must I). Where we differ is in the importance of specifying how this ability is accomplished. The analytical implementation is fully explainable, while the neural network isn't. In other words, we can't tell whether the latter's answer is faithful and desired, or subject to what we nowadays call "hallucination", in a move by the tech giants to further anthropomorphise LLMs.
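To make the tic-tac-toe contrast concrete, here is a minimal sketch of the analytical side (my own illustrative code, assuming standard exhaustive minimax): every move it returns can be explained by walking the search tree, whereas a network trained on game transcripts offers no such trace for its output.

```python
# Analytical tic-tac-toe via exhaustive minimax: fully explainable,
# unlike a trained network's output. Board is a 9-char string,
# indices 0-8 row by row, ' ' for empty.

LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
         (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
         (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    """Return 'X' or 'O' if that player has three in a row, else None."""
    for a, b, c in LINES:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def best_move(board, player):
    """Exhaustive minimax. Returns (score, move): +1 forced win for
    `player`, 0 forced draw, -1 forced loss; move is a board index."""
    w = winner(board)
    if w is not None:
        return (1 if w == player else -1), None
    moves = [i for i, cell in enumerate(board) if cell == ' ']
    if not moves:
        return 0, None  # board full: draw
    other = 'O' if player == 'X' else 'X'
    best = (-2, None)
    for m in moves:
        child = board[:m] + player + board[m + 1:]
        opp_score, _ = best_move(child, other)
        if -opp_score > best[0]:        # opponent's loss is our gain
            best = (-opp_score, m)
    return best

# X to move with two in the top row: minimax finds the winning square,
# and the recursion itself is the explanation of why.
score, move = best_move('XX OO    ', 'X')
print(score, move)  # → 1 2
```

A lookup table built once from this search would be the "analytical solution with all states stored" from the comment; the point is that either way, each answer is auditable.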
Now, maybe more about where I'm coming from. I researched computer vision algorithms for self-driving vehicles for several years. The trolley dilemma and other considerations about what it means for machines to behave "morally" have been ingrained through my formative and academic years. As engineers and machine learning scientists, we have been evolving our field from predictable analytical methods to new ones with better results but less introspectability. My standpoint is that this is only fine as long as society is educated on these matters and willingly commits to using those methods with their trade-offs, for much the same reasons we put labels on food products and list known side-effects and their rates of occurrence for our medications. Because these computational methods now affect us in the real world in very material and tangible ways.
It's not as easy to build a business as just copying someone (otherwise we'd have all been doing that long before LLMs).
I expect the software market will change from lots of big kitchen sink included systems and services to many smaller more specialized solutions with small agile teams behind them.
Some engineers that lose their jobs are going to create new businesses and new jobs.
The question in my mind: is there enough feature and software demand out there to keep all of the engineers employed at 3x the productivity? Maybe. Software has been limited on the supply side by how expensive it was to produce. Now it may bump into limits on the demand side instead.
Meanwhile LLMs are better than junior devs, so nobody wants to hire a junior dev. No idea how we get senior devs then. How many people will be scared away from entering this career path?
The job has changed. How many software engineers will leave the career now that the job is more of a technically minded product person and code reviewer?
I can't predict how it all plays out, but I'm along for the ride. Grieving the loss of programming and trying to get used to this new world.
Provided one can go through the silliness of fitting golf balls into planes, or going through the algorithms and data structures books from 30 years ago.
At work we do both. If you pass the somewhat artificial interviews we invite you for a trial week of 3-5 days working with the actual team on actual real work (a somewhat contained feature or problem).
Lets us evaluate people in real conditions and vice versa.
In Germany, for example, that isn't allowed, and there is the whole insurance question of what happens if something happens to someone who shouldn't have been there as an employee in the first place.
It also works for people willing to take time off (PTO or otherwise). We do lose some people who aren't willing to do the trial, but that's considered acceptable. People who do the work trial and accept an offer are much more likely to stick around (much lower rate of mistakes on their part and our part.)
I'm laughing a bit at Germany, but Europe in general has lost the plot when it comes to innovation.
I bet the same thing happens when the AI bubble pops.
"but this time is different, it's not a bubble, there's real value there"
Economists use the term “bubble” to describe an asset price that has risen above the level justified by economic fundamentals, as measured by the discounted stream of expected future cash flows that will accrue to the owner of the asset.
I think there's little argument that this is happening; the question is more about to what extent it is a bubble.
The entire global software industry is worth less than $1 trillion. Or in other words, smaller than the current valuation of just OpenAI + Anthropic.
Planned capital investment this year by the Magnificent 7 alone is $600B. More than 2/3 of the total global software industry. In one year. Good luck buying any computer hardware this year, there will be a shortage of everything, including electricity.
That's not how you should measure "worth". In that world, you'd have a P/E ratio of 1. Comparing to a bond, it would be like expecting to get paid the face amount in a single year. Many people are quite happy with 5-10% interest as a risky benchmark, so 10-20 P/E isn't wild. That puts the market cap for tech itself at 10-20T as a reasonable baseline.
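Spelling out the arithmetic in the comment above (treating the ~$1T figure as annual earnings, as the comment implicitly does; the P/E band is its own rough assumption):

```python
# A market cap equal to one year's earnings implies P/E = 1 -- like a
# bond repaying its full face value in a single year. More typical
# risky-asset yields of 5-10% correspond to P/E of roughly 10-20.
earnings = 1.0  # trillions of dollars per year (rough assumption)

for pe in (1, 10, 20):
    implied_cap = earnings * pe      # market cap implied by that multiple
    implied_yield = 1 / pe           # earnings yield, the inverse of P/E
    print(f"P/E {pe:>2}: market cap ${implied_cap:.0f}T, "
          f"earnings yield {implied_yield:.0%}")
```

Which is how the comment gets from "$1T of earnings" to a $10-20T baseline market cap for the sector.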
> The entire global software industry is worth less than $1 trillion. Or in other words smaller than the current valuation of just OpenAI + Anthropic.
Apple, Microsoft, Google are all worth 3-4x the global software industry just for some context.
Is Microsoft 3x more important than OpenAI and Anthropic combined? Personally no. I think the value generated by OpenAI and Anthropic will surpass Microsoft.
Going off what I could find easily from Google + ChatGPT [1]:
But arguing about the details is kind of missing the point. Microsoft's value is also inflated by the AI bubble and can't be used as a point of reference.
I don't think MS's value has been inflated by AI; if anything, its value has decreased from its AI investments. MS has mostly been on its same growth path for the last decade.
I think you haven’t been paying attention to the market then over the last couple years. Despite getting hammered recently, it is still up 100% from 2023 lows.
Am I crazy? MSFT was up about 100% in the three years before 2023 as well, and again in the three years before that.
AI has had very little to do with MSFT growth. Pretty sure the 2023 lows were a response to the massive AI spending, and the recovery mostly due to continued Azure services growth.
Two years. Basically the period when people were stuck at home during COVID restrictions and were willing to spend extra money to make that experience more comfortable. Prices fell precipitously after restrictions were lifted and people had desires outside of the home again.
1) the only reason any part of the economy is growing at all
2) the only reason US banks aren't bankrupt due to the commercial real estate debacle they got themselves into
In other words, if this is a bubble, if this pops, we're back in the 2008 situation. Where banks will go bankrupt one after the other like dominoes (in the sense that this amount is large enough that large banks will fail their financial obligations). And you can argue as much as you want based on "real" valuation metrics but none of your investments, not even cash dollars or even gold, will come out of that one intact.
Fortunately, there's the counterargument: you know what else is higher than ever? The revenue produced by the software industry. To the point that at the moment you can say, as crazy as it sounds: if revenue of the big software firms keeps growing the way it IS currently growing, this is not enough investment.
In case you're wondering what "not enough investment" means, think of it like this: you're selling shoes. If you invest too little in new shoes (or whatever resources you need to sell shoes), you'll have to tell customers coming in "sorry, all out of shoes, take your money elsewhere". That's the current situation. If this growth rate keeps up for 1.5 years, Amazon will have to close the store to anyone who wants more machines; in fact Amazon, Google and Microsoft are turning away large customers right now. That's where the "spend more now" madness is coming from. Is it unjustified?
The problem is "markets can stay irrational longer than you can stay solvent". It doesn't matter when the bubble pops if the governments (especially the US') bail those companies out.
The damage is already being done. Maybe you're a 401k/IRA holder whose S&P 500 position is way overweighted by the Mag7 & co. and their circular dealings. Maybe you're buying computer parts way over their market value because some over-leveraged companies outcompete you for that hardware (or electricity). Or, at a smaller scale, you're paying more for software because everything is "AI-powered" now, and of course you wouldn't want only "deterministic" software that just works and doesn't have a slop machine integrated.