Just because this article isn't well formed or sourced doesn't make its claim incorrect.
I program daily and I use AI for it daily.
For short and simple programs, AI can do it 100x faster, but it is fundamentally limited by its context size. As a program grows in complexity, AI is not currently able to create or modify the codebase successfully (around 2,000 lines is where I've found it hits a barrier). I suspect it's due to the exponential complexity associated with input size.
Show me an AI that can do this for a 10,000-line complex program and I'll eat my own shorts.
Doesn't even take that much. Today I did some basic Terraform with the help of GenAI - it can certainly print out the fundamentals (VPC, subnets) faster than I can type them myself, but the wheels came off quickly. It hallucinated two or three things (non-existent provider features, invalid TF syntax, etc.).
When you take into account prompt writing time and the effort of fixing its mistakes, I would have been better off doing the whole thing by hand. To make matters worse, I find that my mental model of what was created is nowhere near as strong as it would have been if I did things myself - meaning that when I go back to the code tomorrow, I might be better off just starting from scratch.
Here's a thought experiment: think back on how that statement would have sounded to past-you, three years ago. You would probably have dismissed it as bullshit, right? We've come a long way since then, both in terms of better, faster, and cheaper models, and in how they're being intertwined with developer tooling.
You could have said the same for crypto/blockchain 3-4 years ago (or whenever it was at peak hype).
Eventually we realized what is and isn't possible or practical to use blockchain for. It didn't really live up to all the original hype years ago, but it's still a good technology to have around.
It's possible LLMs could follow a similar pattern, but who knows.
It created a speculative asset that some people are passionate about.
However, if you saw the homepage of HN during blockchain's peak hype, being a speculative asset / digital currency was seen almost as a side effect of the underlying technology - but that's pretty much all it turned out to be useful for.
As you inadvertently pointed out, AI improvements are not linear. They depend on new discoveries more than they do on iteration. We could either be out of jobs or lamenting the stagnation of AI (again).
After an innovation phase there is an implementation phase. Depending on the usefulness of the innovation, integration with existing systems takes time - it is measured in years, even decades. Think back to the '80s and '90s, when it took years to integrate PCs into offices and workspaces.
From your comment, it sounds like you think that the implementation phase of LLMs is already over? And if so, how do you come to this conclusion?
It's not as if we have no idea how to make use of AI in programming. We've been working on AI in one form or another since the '70s, and have integrated it into our programming workflows for almost as long (more recently in the form of autocomplete using natural language processing and machine learning models). It's already completely integrated into our IDEs, often with options to collate the output of multiple LLMs.
What further integrations of AI into programming workflows have LLMs been shown to be missing?
You can imagine all sorts of things, and then something else might happen. You can’t rely on “proof by imagination” or “proof by lack of imagination.”
We shouldn’t be highly confident in any claims about where AI will be in three years, because it depends on how successful the research is. Figuring out how to apply the technology to create successful products takes time, too.
Ten years ago, when Siri/Google/Alexa were launching, I really wouldn't have expected that 2024 voice assistants would be mere egg timers - and frustrating ones at that, requiring careful phrasing and regular repeating/cancelling/yelling to trick them into doing what you want.
A 10x near future isn't inconceivable, but neither is one where we look back and laugh at how hyped we got over that early-'20s version of language models.
It also might be that the language everyone uses 20 years from now - the one that gives a 50x over today - is just being worked on right now, or won't come along for another 5 years.
In the same way, people who thought humans could never fly were not completely wrong before the airplane. After the airplane, though, we're really talking about two different versions of a "human that can fly".
In my very uninformed opinion, all we need is more clever indexing, prompting, and agents that can iteratively load parts of the codebase into their context and make modifications.
Real engineers aren’t expected to hold 10,000 lines of exact code in their head, they know the overall structure and general patterns used throughout the codebase, then look up the parts they need to make a modification.
- (1) It shall be unlawful for a person to advertise or offer for sale a digital good with the terms “buy,” “purchase,” or any other term which a reasonable person would understand to confer an unrestricted ownership interest
(B) The affirmative acknowledgment from the purchaser pursuant to subparagraph (A) shall be distinct and separate from any other terms and conditions of the transaction that the purchaser acknowledges or agrees to.
(b)(1) says that "buy" is not permitted for these goods... EXCEPT
(b)(2)(A) says that it IS permitted, if you follow the rules in subsections i through iii.
> (2) (A) Notwithstanding paragraph (1), a person may advertise or offer for sale a digital good with the terms “buy,” “purchase,” or any other term which a reasonable person would understand to confer an unrestricted ownership interest in the digital good, or alongside an option for a time-limited rental, if the seller receives at the time of each transaction an affirmative acknowledgment from the purchaser of all of the following:
My read on that is that either (b)(1) controls and you cannot use the words "buy" and friends, OR you do the things in (b)(2) and you CAN use "buy" & etc.
My read on subsection (ii) when combined with (i) is that simply "providing" the EULA for a digital software download and making the customer tick a box saying that they've "received" the EULA would be sufficient. If it's not (and it might not be), then having them scroll through the whole EULA to "prove" that they read it would clearly be sufficient, as it's common practice.
> (B) The affirmative acknowledgment from the purchaser pursuant to subparagraph (A) shall be distinct and separate from any other terms and conditions of the transaction that the purchaser acknowledges or agrees to.
Yes, but I think that this just means that this acknowledgement is a thing that's separate from the EULA, and separate from extended warranties, and such. The language that says that the customer must acknowledge that they received the license for the thing they're "purchasing" indicates that they must be -at minimum- given a chance to read the EULA... and I'm pretty sure common practice is to either provide a link to the EULA, or force you to scroll through it.
That's interesting. I don't for a second think this will actually curtail the harmful business practices, but what do you reckon they'll write on their buttons? Maybe just dance around any meaningful verbiage with a button that has only a dollar sign or shopping cart on it? Just "Proceed" or "Confirm"?
“Get” sounds good to me. I’ll know not to get any games that have a “Get” button. Hopefully this law spreads to Steam across the board so that people outside of California can also benefit from it.
I’d argue that a reasonable person would understand these terms to confer an unrestricted ownership interest.
I’m putting this good into a metaphorical container and taking it to a metaphorical till. This implies a sort of tangibility, a property of physical goods that I’d walk out of the metaphorical store to own.
That’s a good point. The real world experience they’re analogizing is me putting a bottle of ketchup in a shopping cart at a grocery store and checking out at the cashier. Afterward, I own that bottle of ketchup, not a license to ketchup, but that instance of it. “Shopping cart” and “checkout” imply “buying”, and I can’t think of a counterexample.
Replacing the terminology is the first step to this methinks. You'll always be able to buy a bagel, but not a video game. It's still shitty, but it's not deceptively shitty.
This struck me from two angles: First, it's a beautiful, well-produced, clear, and concise representation of the federal budget, including key areas like the deficit and spending breakdowns. However, it also struck me as "useless", in the sense of "I found it difficult to take away any useful, new, or actionable information". I'm not sure what Ballmer's intended audience or result was, but it must not have been me ...
Personally I would have benefited from less "moving flashy graphs" / "Steve explaining each node in moderate detail", and more "Here's one clean boring way to look at the data", "here's what X means for us / here's what people are considering because of X."
That said I respect what's being done and hope he continues to produce more informational content!
It gives people a baseline for where to start in any discussion of the Federal budget. Whether or not that's immediately actionable, it's useful to have the vocabulary to discuss the budget if you're going to talk about it at all.
I’m occasionally having to correct people who believe, and insist, that our largest-ticket spending item is the military. It’s certainly the largest part of our discretionary budget, but if you’re going to talk about the Federal budget, it is unhelpful to disregard the non-discretionary budget, which goes mostly to Social Security, Medicare, and Medicaid. Giving people an education on discretionary vs. non-discretionary spending drags out conversations; then it’s a coin toss whether that person retains any of it the next time you talk politics, or whether it got muddled in their mind by poor news reporting in between - the news they follow to “stay informed”.
It’s because he presented what could have been a single chart, and did so by basically just reading the data. There was no analysis other than the tidbits about debt-to-GDP being at a high point. That said, it's great production, and I hope it reaches people less familiar with these budgetary line items. If you’re aware of what they are, like I am, it landed pretty flat, but if you’re not familiar with them I could see it being quite informative. It’s good to build a baseline of financial literacy, which I think we have a general deficit of too.
It's using relative line numbering - at least, that's how I've had vim set up for the last 10 years. It makes commands like 10j easier, because you can just glance to the side and see how many lines away the target is.
That’s so funny, I love relative line numbers, but I had no idea about half the tips in here.
I use relative numbers while in normal mode, and absolute numbers in insert mode. That way you get easy vertical jumps, but you still have a sense of your line numbers.
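For anyone who wants that hybrid setup, one common way to wire it up (assumes Vim 7.4+ or Neovim, where 'number' and 'relativenumber' can be combined):

```vim
" Absolute number on the current line, relative numbers elsewhere
set number relativenumber

" Switch to absolute numbers in insert mode, back to relative in normal mode
augroup numbertoggle
  autocmd!
  autocmd InsertEnter * set norelativenumber
  autocmd InsertLeave * set relativenumber
augroup END
```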
My intuitive assumption, then, is that on Mars they would have come up with a different meter such that π² ≈ 10 "mars meters" / s².
Or alternatively stated, that the Mars meter would be much shorter than Earth's meter if they used the same approach to defining it (pendulums and seconds).
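The arithmetic behind that: a "seconds pendulum" (half-period of 1 s, so T = 2 s) satisfies T = 2π√(L/g), which gives L = g/π². Since π² ≈ 9.87 ≈ g on Earth in m/s², the seconds-pendulum metre comes out near our metre, while Mars' weaker gravity (g ≈ 3.71 m/s²) gives a much shorter one:

```python
import math

def seconds_pendulum_length(g: float) -> float:
    """Length of a pendulum with period T = 2 s: L = g * T**2 / (4 * pi**2) = g / pi**2."""
    return g / math.pi ** 2

earth = seconds_pendulum_length(9.81)  # ~0.994 m: within 1% of the metre
mars = seconds_pendulum_length(3.71)   # ~0.376 m: the hypothetical "Mars metre"
print(f"Earth: {earth:.3f} m, Mars: {mars:.3f} m")
```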
A Martian meter defined by Martians should relate to their average size, the number of fingers they have on their hands, and some basic measure of the planet.
I mean, one meter was originally defined as 1/10^7 of the distance between the equator and the poles, which leads to a round number in base 10.
A unit system is not just something that matches objective reality but something that has some cognitive ergonomy.
> A unit system is not just something that matches objective reality but something that has some cognitive ergonomy.
Beautifully stated!
And that's one reason why I like the US units of measurement better than SI. I mean, the divide-by-ten thing is nice and all. But _within a project_, how often are you converting between units of the same measurement (e.g., meters to centimeters)? You pick the right "size" unit for your work and then tend to stay there. So you don't get much benefit from the easy conversion in practice.
But if you're doing real hands-on work, you often need to divide by 2, 3, 4, and so on. So, for example, having a foot easily divisible by those numbers works well. And even the silly fractional stuff make sense when you're subdividing while working and measuring.
Of course it all finally breaks down when you get to super high precision (and that's probably why machinists go back to thousandths of an inch and no longer fractions).
I think there's a little bit of academic snobbery with the SI units (though it is a good idea for cross-country collaboration), but for everyday hands-on work the US system works really well. I always love the meme: there are two kinds of countries in the world, those who use the metric system and those who've gone to the moon.
I'm an AMO physicist by training, and my units of choice are the "atomic units", where hbar, the mass of the electron, the charge of the electron, and the permittivity are all 1. That makes writing many of the formulae really simple. Which is what you say: it has cognitive ergonomy (and keeps all of the floating point calculations around the same magnitude). Then when we're all done we convert back to SI for reporting.
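A standard textbook illustration of how much that simplifies things (not from the comment itself): the hydrogen Schrödinger equation in SI versus atomic units, where $\hbar = m_e = e = 4\pi\varepsilon_0 = 1$.

```latex
% SI form, carrying every constant:
\left(-\frac{\hbar^2}{2 m_e}\nabla^2 - \frac{e^2}{4\pi\varepsilon_0 r}\right)\psi = E\psi
% Atomic units: all constants drop out,
\left(-\tfrac{1}{2}\nabla^2 - \tfrac{1}{r}\right)\psi = E\psi,
\qquad E_1 = -\tfrac{1}{2}\ \text{hartree}
```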
One example where picking units within a project still doesn't save you from cognitive load is woodworking. YMMV, but I can add decimals way faster than I can add 7 9/16" + 13 23/32" (numbers picked arbitrarily, but close to a precision of 1 mm - so if you're OK with that precision, you don't even need fractions in SI).
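For what it's worth, even in software that sum calls for exact rational arithmetic. A quick sketch with Python's standard `fractions` module, using the comment's arbitrary numbers:

```python
from fractions import Fraction

# The two imperial measurements from the comment above
a = 7 + Fraction(9, 16)    # 7 9/16"
b = 13 + Fraction(23, 32)  # 13 23/32"
total = a + b              # exact: 21 9/32"

# The same total in millimetres (1 inch = 25.4 mm exactly)
total_mm = float(total) * 25.4
print(total, f"inches = {total_mm} mm")
```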
Would 100% try and buy a tool like this, as I am the target market, but there's no way I'm going to go out of my way to sign up and pay for something whose value I can't see behind a paywall. "You and everybody else." Try a freemium model or something? Good luck.
Addendum: It's not about the money, it's about the complexity. It isn't worth my mental overhead if I can't see the transparent value of the thing I'm buying. (Then I have to cancel, etc.)
Photoshop hasn't added a feature I need in 10 years, yet I can't use the 10-year-old single license I paid for, so now I pay yearly to subscribe. It's still the best on the market for me, though.
Seconding this: adding "-pix_fmt yuv420p" to your conversion will ensure compatibility. For example, if you try attaching a converted MP4 to a WhatsApp chat, it will not show a preview or play unless you include this flag.
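A typical invocation might look like this (filenames are placeholders, and libx264 is just one common encoder choice):

```shell
# -pix_fmt yuv420p forces 4:2:0 chroma subsampling, which most hardware
# decoders and messaging apps require; without it, ffmpeg may pick a
# less widely supported pixel format such as yuv444p.
ffmpeg -i input.mov -c:v libx264 -pix_fmt yuv420p output.mp4
```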