I don't really get how this stops captcha solving as a service, which is how scaled reCAPTCHA solving is actually done. Those services are incredibly cheap and staffed by humans anyway. Instead of selecting grainy buses, they will just scan the image with their phones.
Perhaps for some very specific capabilities such as TTS, translation, voice recognition and so on. But for general intelligence models, better hardware just directly allows better models and that doesn't seem to be changing any time soon.
I'm pretty sure that's not linear, so I personally expect the benefits of larger models to diminish. The question is at what point that happens. A lot of variables play into it, but it is possible that running larger models will become too expensive for the little benefit they provide.
I don't understand why you are downvoted. Placing the economic interests of an entertainment megacorp over the rest of the internet is one of the things that's wrong with society these days.
> Believing we're in a climate crisis and also being anti-nuclear are mutually exclusive positions
I also used to believe that, but now I'm not so sure. Nuclear carries massive and unpredictable risks on failure. We can fairly well predict what will happen when a wind turbine fails catastrophically, but with nuclear it is much more difficult. And what is arguably worse is that catastrophic nuclear failures are very infrequent, so we have a very hard time estimating and reasoning about the probability of them happening.
Personally I think that keeping existing reactors running is better than the alternatives, but I'm not so sure about building new reactors compared to building more predictable green energy sources.
Burning coal in power plants causes more deaths each year in Europe than the total deaths caused by the Chernobyl accident (4,000-8,000).
"The health burden of European CPP emission-induced PM2.5, estimated with the Global Exposure Mortality Model, amounts to at least 16 800 (CI95 14 800–18 700) excess deaths per year over the European domain"
But only nuclear accidents get the media attention, because they are big and infrequent. Similar to deaths caused by aircraft crashes vs deaths caused by car crashes.
While your statement is true, it leaves out relevant details:
There is a threshold of radiation exposure above which an animal isn't deemed safe for consumption anymore. The vast majority of these cases nowadays are boars in certain areas of Germany, and they affect less than 1% of all killed boars [1] [2].
The question is simple: if you are afraid of nuclear, are you willing to get rid of every technology that is more dangerous? I'm not even mentioning cars, which cause the same number of fatalities in Germany every two years as Chernobyl is expected to cause over its whole lifetime according to the UN (or even more if we look at the updated UNSCEAR reports).
Serious question: when has there been a serious nuclear accident? Fukushima was caused by a natural disaster that killed far more people than the nuclear failure did. Chernobyl was pure communist stupidity; that level of incompetence would never happen in a well-functioning country. So that leaves Three Mile Island?
Meanwhile coal kills millions each year (mostly the old and children).
And what are these predictable green alternatives? Only hydro is reliable, and it is heavily restricted by geography. We'd need massive breakthroughs in battery technology to make solar and wind reliable in most of the world (by population).
Look up historical weather patterns: for days with no sun and no wind, you need massive, massive amounts of energy storage.
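To get a rough sense of scale, here is a back-of-the-envelope sketch; the figures are my own rough assumptions (roughly Germany-sized annual consumption and a three-day lull), not measured data:

```python
# Back-of-the-envelope sketch; all figures are rough assumptions, not measured data.
annual_demand_twh = 500                      # assumed: roughly Germany's yearly electricity use
daily_demand_twh = annual_demand_twh / 365   # ~1.4 TWh per day
dunkelflaute_days = 3                        # assumed length of a no-sun, no-wind spell

storage_needed_twh = daily_demand_twh * dunkelflaute_days
print(f"~{storage_needed_twh:.1f} TWh of storage needed")  # ~4.1 TWh
```

For comparison, today's largest grid battery installations are in the single-digit GWh range, so this is on the order of a thousand times more storage than the biggest batteries built so far.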
My point is that since we have had so few nuclear incidents, but they have done massive damage, it is very possible that we don't actually know how much worse it could get. We have only seen a few points from a distribution that could be much wider than we think. Compare that to renewable failures, for which we have a pretty good idea.
Renewable generation is not the hard part; renewable transmission and storage is. It's so hard, in fact, that building a very expensive nuclear plant is still much cheaper and more attainable.
> So nuclear plants, by and large, get the market price whenever they produce (which is most of the time) and this does not equal the average price as they will be producing a higher share of total production at times of low demand (and low prices), and a smaller share of total production at times of high demand (and high prices).
The assumption here is that the price is set only by demand rather than by the combination of supply and demand. Under that false assumption, generating power when demand is lower (i.e. at night) is bad. But how much solar generation is there at night, and what does that change in supply do to prices as solar becomes a higher percentage of the grid?
It does the opposite of this:
> whilst the capture price for solar is often higher than the average price (thanks to power demand generally being higher during the day)
Because solar generates only during the day, in order to supply power with solar at night, you would need it to oversupply power during the day and then pay extra for storage to resolve the undersupply it leaves at night. So once you have a certain amount of solar, you end up with lower prices during the day, when solar is generating a higher proportion of the power, and higher prices after sunset.
And solar is doubly screwed by this. Not only does it get the soon-to-be-lower daytime prices for all of its output rather than half, its output is also regionally correlated: on sunny days when its output is highest, even the daytime price is lower than it is on cloudy days, because higher or lower solar output is a cause of lower or higher prices. In other words, the daytime price anti-correlates with its output.
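To make the capture-price effect concrete, here is a small illustrative sketch; the prices and volumes are made-up numbers, not market data:

```python
# Illustrative-only numbers: a "day" hour and a "night" hour on a solar-heavy grid.
# Abundant solar supply pushes the daytime price below the nighttime price.
day_price, night_price = 30.0, 60.0          # EUR/MWh, hypothetical
avg_price = (day_price + night_price) / 2    # time-averaged price = 45 EUR/MWh

solar_output = {"day": 10.0, "night": 0.0}   # MWh, hypothetical: solar only runs in daylight
capture_price = (
    solar_output["day"] * day_price + solar_output["night"] * night_price
) / sum(solar_output.values())               # = 30 EUR/MWh

print(avg_price, capture_price)              # 45.0 vs 30.0: capture price below average
```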
Your post is completely orthogonal to the point I made, and it also fails to account for the fact that distributed renewables have enormous capex costs in the form of transmission and storage that we just disregard. Building a nuclear plant on a major transmission route doesn't require any of that.
"The average capacity factor for the French nuclear fleet has been, due to the optimised management of production," - lol, lmao even. Optimized management in france? Check out how much their refuel outages take vs in US and say that again
Open multi-model tools will start dominating as soon as frontier labs stop massively subsidising their models inside their own tools and align them with API pricing. Personally I think the inflection point is near, considering the slew of recent drama around Claude Code.
Claude Code and Codex are solid, but the real reason people use them is that their overall cost is dramatically lower than that of the open alternatives.
This could probably be a generic MITM HTTP proxy as well: keep OPENAI_API_KEY=OPENAI_API_KEY in your .env and then replace it with the real key inside the proxy. It wouldn't need to know anything about endpoints or services.
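For example, a minimal sketch of that idea as a mitmproxy addon (assuming mitmproxy is the proxy in use, the client trusts its CA for TLS interception, and REAL_OPENAI_API_KEY is a hypothetical env var holding the actual secret):

```python
# Sketch of a mitmproxy addon that swaps a placeholder token for the real key.
# Run with something like: mitmdump -s swap_key.py
# Assumes the client trusts mitmproxy's CA so HTTPS traffic can be rewritten.
import os

from mitmproxy import http

PLACEHOLDER = "OPENAI_API_KEY"                 # dummy value the client sends
REAL_KEY = os.environ["REAL_OPENAI_API_KEY"]   # real secret lives only in the proxy


def request(flow: http.HTTPFlow) -> None:
    auth = flow.request.headers.get("Authorization", "")
    if PLACEHOLDER in auth:
        # Replace the dummy bearer token with the real key before forwarding.
        flow.request.headers["Authorization"] = auth.replace(PLACEHOLDER, REAL_KEY)
```

The proxy never needs to know which endpoint or service is being called; it only rewrites the Authorization header whenever it sees the placeholder.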
It used to be worse. These days you can at least link between Storybook and Figma and have similar component naming, and Figma mostly uses a CSS mental model. Before, we had InVision and Sketch, and designers and developers lived in their own, completely disjoint worlds.
"hard cap it's technically impossible" is really funny because every other provider manages it just fine. Even wrappers like OpenRouter enable you to set a hard cap on proxied Google resources, but Google themselves are unable to manage that inside gcloud.