
A few days back, Sam Altman tweeted this:

"Prediction: AI will cause the price of work that can happen in front of a computer to decrease much faster than the price of work that happens in the physical world. This is the opposite of what most people (including me) expected, and will have strange effects"

And I was like, yeah, I gotta start preparing for the next decade.



>Prediction: AI will cause the price of work that can happen in front of a computer to decrease much faster than the price of work that happens in the physical world.

I'm skeptical.

The envelope of "programming" will continue to shift as things get more and more complex. Your mother-in-law is not going to install Copilot and start knocking out web apps. Tools like this allow programmers to become more productive, which increases demand for the skills.


I strongly agree with you.

Reminds me of something I read that claimed when drum machines came out, the music industry thought it was the end of drummers. Until people realized that drummers tended to be the best people at programming cool beats on the drum machine.

Every single technological advancement meant to make technology more accessible and eliminate expertise has instead only redefined what expertise means. And the overall trend has been a lot more work opportunities created, not fewer.


I had a front row seat to the technological changes in the music industry. I opened a recording studio when you had to "edit" using a razor blade and tape to cut tape together. I remember my first time doing digital editing on a Macintosh IIfx. What happened to drummers is that technology advanced to the point where studio magic could make decent drummers sound great. But it's still cheaper and faster (and arguably better) to get a great session drummer who doesn't need it. Those pros are still in high demand.


Yeah, but fewer drummers are being hired than before drum machines came out. What you describe sounds like work has become more concentrated into fewer hands. Perhaps this will happen with software as well.


What happened is what typically happens: Concentration of expertise. The lower expertise jobs (just mechanically play what someone else wrote/arranged) went away and there was increased demand for higher expertise (be an actual expert in beats _and_ drum machines).

So the winners were those that adapted earlier and the losers were those that didn't/couldn't adapt.

This translates to: If you're mindlessly doing the same thing over and over again, then it's a low value prop and is at risk. But if you're solving actual problems that require thought/expertise then the value prop is high and probably going to get higher.


But there's also the subtext that if you find yourself at the lower-skill portion of your particular industry, then you should probably have a contingency plan to avoid being automated out of a job, such as retiring, learning more, or switching to an adjacent field.


Exactly, and AI only means that this adage now applies to programming as well.


But this was true anyway -- the lower your skill, the more competition you have. At the lowest skill levels, you had better damn well have a contingency plan, because any slight downward shift in market demand is a sword coming straight for your neck.


I think you have another thing coming. Think about what really got abstracted away. The super hard parts like scaling and infrastructure (AWS), the rendering engines in React, all the networking stuff that’s hidden in your server (dare you to deal with TCP packets), that’s the stuff that goes away.

We can automate the mundane but that’s usually the stuff that requires creativity, so the automated stuff becomes uninteresting in that realm. People will seek crafted experiences.


It would be funny if after the AI automates away "all the boring stuff" we're left with the godawful job of writing tests to make sure the AI got it right.


I think it'll be much more likely that the AI writes the tests (the boring stuff) for the buggy code I write.


I can see the AI suggesting and fixing syntax in the tests. Determining their semantics? Not without true AGI.


I'm not sure that all of that has really gone away.

It's just concentrated into the hands of a very few super specialists, it's much harder to get to their level but their work is much much more important.


True, and if the specialists retire, there may be some parts that no one understands properly anymore.

See: https://www.youtube.com/watch?v=ZSRHeXYDLko / Preventing the Collapse of Civilization / Jonathan Blow


Better yet -- the jobs of those specialists got better, and the people who would have done similar work did not end up unemployed, they just do some other kind of programming.


Do you have any actual data for that? Last I saw, most bands are still using live drummers, and studio recordings for small to mid-sized bands still use actual drummers as well - unless it's a mostly studio band trying to save cost.

I think the analogy to programming is a bit more direct in this sense; most companies aren't going to go with something like Copilot unless it's supplemental or they're on an entirely shoestring budget; it'll be the bigger companies wanting to squeeze out that extra 10% productivity that are betting hard on this - just as larger bands would do this to get an extremely clean studio track for an album.


Source? I would actually expect there to be around the same number of drummers, but more people making music.


Based on these very unreliable sources, the number of drummers in the US may have increased from ~1 million in 2006 to ~2.5 million in 2018. That's during a time when the population increased from 298 million to 327 million.

So, during this period, a ~10% increase in population saw a 150% increase in drummers (2.5x as many).

It does not appear that the drum kit killed the drummer.

Big caveats about what these surveys defined as "drummer" and that this doesn't reflect professional drummer gigs, just the number of drummers.

[1] https://m.facebook.com/Bumwrapdrums/posts/how-many-drummers-...

[2] https://www.quora.com/How-many-people-play-drums-in-the-US


Are we in a drummer bubble?


If you could get by with a drum machine, did you really need a real drummer in the first place? Maybe a lot of drummers were used for lack of any automated alternative in the early days?

By the same line of thinking, if you can get by with AI-generated code, did you really require a seasoned, experienced developer in the first place? If your product/company/service can get by with copy pasta to run your CRUD app (which has been happening for some time now sans the AI aspect), did you ever really need a high end dev?

I think it's like anything else: 80% is easy and 20% is not easy. AI will handle the 80% with increasing effectiveness, but the 20% will remain the domain of humans for the foreseeable future.

Worth considering maybe.


The counter argument comes from photography. 20 years ago digital photography was in its infancy, and if you were a photographer it was a lot easier to make a living.

Nowadays everyone can make professional-looking photos, so the demand for photographers has shrunk as the supply has increased.


Drummers do tend to be good drum programmers, but I believe they're a small fraction of the pool of pros. The drum machine made percussion feasible for lots of people living in dense housing for whom real drums were out of the question. (Also drummers tend to dislike using drum machines, because the real thing is more expressive.)

AI will be similar -- it will not just give more tools to people already in a given field (programming, writing, whatever), but also bring new people in, and also create new fields. (I personally can't wait for the gardening of AI art to catch on. It'll be so weird[1].)

[1] https://www.youtube.com/watch?v=MwtVkPKx3RA


Those drum machines didn't use machine learning though.


Exactly. I'm currently reading The Mythical Man-Month. 90% of what the book discusses in terms of programming work that actually has to be done is completely irrelevant today. Still, the software industry is bigger than ever. The book also mentions that programmers spend about 50% of their time on non-programming tasks. In my experience this is also true today. So no matter the tools we've got, the profession has stayed the same since the early '70s.


What are notable books nowadays? It seems all the books I can cite are from 2005-2010 (Clean Code, JCIP, even The Lean Startup or Tribal Leadership…) but did the market for legendary books vanish in favor of YouTube tutorials? I’m running out of materials I can give to my interns to gobble knowledge into them in bulk.


[prioritization] The Effective Engineer - Lau

[systems] Designing Data-Intensive Applications - Kleppmann

[programming] SICP - Sussman & Abelson

The last one is an old Scheme book. No other book (that I read) can even hold a candle to this one in terms of actually developing my thought process around abstraction & composition of ideas in code. Things that library authors often need to deal with.

For example in React - what are the right concepts that are powerful enough to represent a dynamic website, & how should they compose together?


I have the same question when it comes to a modern version of the Mythical Man Month. I know some computer history, so I can understand most examples. But still it would be great to have a comparable modern book.


The amount of "programming time" I spend actually writing code is also quite low compared to the amount of time I spend figuring out what needs to be done, the best way for it to be done, how it fits together with other things, the best way to present or make it available to the user, etc. I.e., most of my time is still spent on figuring out and refining requirements, architecture, and interfaces.


Same, the actual solution is pretty darn easy once I know what needs to be done.


Tools like email, instant messenger, and online calendars made secretaries much more productive which increased demand for the skills. Wait...

Replacement of programmers will follow these lines. New tools like Copilot (haven't tried, but will soon), new languages, libraries, better IDEs, Stack Overflow, Google, etc. will make programming easier and more productive. One programmer will do the work that ten did. That a hundred did. You'll learn to become an effective programmer from a bootcamp (already possible - I know someone who went from bootcamp to Google), then eventually from just a few tutorials.

Just like the secretary's role in the office was replaced by everyone managing their own calendars and communications, the programmer will be replaced by one or two tremendously productive folks and your average business person being able to generate enough code to get the job done.


Secretaries became admin assistants who are much more productive and valuable since they switched their job to things like helping with the content of written communications, preparing presentations, and managing relationships. I saw my mother go through this transition and it wasn't really rough (though she's a smart and hard-working person).


> Secretaries became admin assistants

That doesn't mean anything. The last 20 years have seen an absurd inflation of job titles to make people feel they are "executive assistants" instead of secretaries, "vice presidents" instead of whatever managerial role, etc, etc.


but secretaries are still a thing, they are just usually shared by a whole team / department these days


I had a corporate gig as a coder reporting to at most three economists at any one time. I spent at least two hours of every day getting them to explain what they wanted, explaining the implications of what they were asking for, explaining the results of the code to them, etc. So even if I didn't need to code at all my efficiency would have expanded by at most a factor of 4.


The future as I see it is that coding will become a relatively trivial skill. The economists would each know how to code and that would remove you from the equation. They would implement whatever thing they were trying to do themselves.

This would scale to support any number of economists. This would also be a simpler model and that simplicity might lead to a better product. In your model, the economists must explain to you, then you must write the code. That adds a layer where errors could happen - you misunderstand the economists or they explain poorly or you forget or whatever. If the economists could implement things themselves - less room for "telephone" type errors. This would also allow the economists to prototype, experiment, and iterate faster.


That game of telephone is certainly an enormous pain point, and I can imagine a future where I'm out of a job -- but it's extremely hard for me to see them learning to code.


And if what the economists needed to do could be programmed trivially with the help of AI then their job is probably also replaceable by AI.


That would be harder. I shuffled data into a new form; they wrote papers. All I had to understand was what they wanted, and how to get there; they had to understand enough of the world to argue that a given natural experiment showed drug X was an effective treatment for condition Y.


That was the point, I think.


>Tools like email, instant messenger, and online calendars made secretaries much more productive which increased demand for the skills. Wait...

There are more "secretaries" than ever, and they get to do far more productive things than delivering phone messages.


It may reduce the demand for the rank-and-file grunts, though.

Why would an architect bother with sending some work overseas if tools like this would enable them to crank out the code faster than it would take to do a code review?


I think the thought process is from the perspective of the employer, if you assume these two statements are true:

1) AI tools increase developer productivity, allowing projects to get completed faster; and

2) AI tools offset a nonzero amount of skill prerequisites, allowing developers to write "better" code, regardless of their skill level

With those in mind, it seems reasonable to conclude that the price to e.g. build an app or website will decrease, because it'll require fewer man-hours 'til completion and/or less skill from the hired developers doing said work.

You do make a good point that "building an app" or "building a website" will likely shift in meaning to something more complex, wherein we get "better" outputs for the same amount of work/price though.


Now replace “AI” in your points 1 & 2 with “GitHub” (and the trend of open-sourcing libraries, making them available for all). Everything you said still works, and it did not harm programmer jobs in any way (quite the opposite).

And actually, I really don't see AI in the next decade making more of a difference than GitHub did (making thousands of man-hours of work available for free). Around 2040 or 2050, maybe. But not soon; AI is still really far off.


>that the price to e.g. build an app or website will decrease

Yes, and this in turn increases demand as more people/companies/etc.. can afford it.


If there are diminishing returns to expanding a given piece of software, an employer could end up making a better product, but not that much better, and employing fewer resources (esp. people) to do so.

And even that could still be fine for programmers, as other firms will be enticed into buying the creation of software -- firms that didn't want to build software when programming was less efficient/more expensive.


Decreasing the price of programming work doesn't necessarily mean decreasing the wages of programmers, any more than decreasing the price of food implies decreasing the wages of farmers.

But on the other hand, it also can mean that.


here's the difference: the farmers are large industrial companies that lobby the government for subsidies, such that they can continue to produce food that can, by law, never be eaten.

programmers, on the other hand, are wage laborers, individually selling their labor to employers who profit by paying them less.

industry is sitting on the opposite side of the equation here. I wonder what will replace "learn to code". whatever it is, the irony will be almost as rich as the businesses that profit from all this.


It's hard to use history to predict the implications of weird new technologies. But history is clear on at least two happy facts. Technology generates new jobs -- no technology has led to widespread unemployment. (True general AI could be different; niche "AI" like this probably won't be.) Technology raises standards of living generally, and making a specific sector of the economy more productive increases the wealth accrued to that sector (although it can change who is in it too).

There are exceptions -- fancy weapons don't widely raise standards of living -- but the trends are strong.


what does unemployment have to do with it? if you're 22 and paying off loans and suddenly find yourself unable to work as anything but a barista, continued employment isn't exactly a perk. meanwhile, the benefits of the increased productivity brought by new technology do accrue somewhere - it's just not with workers. the trends on that are also quite clear. productivity growth has been high over the past 40 years but real wages have at best been stagnant. and on the wheel turns, finding souls to grind beneath its weight, anew.


On your second point I agree -- the distribution within a sector matters, and most of them at the moment are disturbingly top-heavy.

On the first though, we have little reason to think tech will systematically diminish the roles people can fill. In the broad, the opposite has tended to happen throughout history -- although the narrow exceptions, like factory workers losing their jobs to robots, are real, and deserve more of a response than almost every government in the world has provided. For political stability, let alone justice.


For every programmer who can think and reason about what they are doing, there are at least 10 who just went to a bootcamp and are not afraid to copy and paste random stuff they do not understand and cannot explain.

Initially, they will appear more productive with CoPilot. Businesses will decide they do not need anybody other than those who want to work with CoPilot. This will lead to adverse selection on the quality of programmers that interact with CoPilot ... especially those who cannot judge the quality of the suggested code.

That can lead to various outcomes, but it is difficult to envision them being uniformly good.


>Tools like this allow programmers to become more productive, which increases demand for the skills.

Well like everything in life I guess it depends? The only iron rule I can always safely assume is supply and demand.

But for programming, especially on the web, it seems everyone has a tendency to make things more difficult than they should be, and that inherent system complexity isn't going to be solved by ML.

So in terms of squeezing out absolute efficiency from system cost, I think we have a very very long way to go.


This is just shortening the time it takes developers to Google it, search through StackOverflow and then cut and paste and modify the code they are looking for. It will definitely speed up development time. Sounds like a nice quality of life improvement for developers. I don't think it will cause the price of work to decrease, if anything a developer utilizing copilot efficiently should be paid more.


You don't know my mother-in-law.


I think this will result in classic Jevons paradox: https://en.wikipedia.org/wiki/Jevons_paradox . As the price of writing any individual function/feature goes down, the demand for software will go up exponentially. Think of how many smallish projects are just never started these days because "software engineers are too expensive".

I don't think software engineers will get much cheaper, they'll just do a lot more.


I'm guessing low expertise programmers whose main contribution was googling stackoverflow will get less valuable, while high expertise programmers with real design skill will become even more valuable.


I'm both of those things, what happens to my value?


Your legs will have to move faster than your arms.


Sonic the Hedgehog's employment prospects are looking up.


It goes up/down


Googling Stack Overflow itself can sometimes be a high expertise skill, simply because sometimes you need a fairly good understanding of your issue to figure out what to search for. A recent example: we had an nginx proxy set up to cache API POST requests (don't worry - they were idempotent, but too big for a query string), and nginx sometimes returned the wrong response. I'm pretty sure I found most of the explanation on Stack Overflow, but I didn't find a question that directly addressed the issue, so Googling was a challenge. You can keep your job finding answers on Stack Overflow if you are good at it.
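
I don't know if this was the parent's exact issue, but one plausible cause: nginx's default proxy_cache_key is $scheme$proxy_host$request_uri, which ignores the request body, so two POSTs with different payloads to the same URL can collide on one cache entry. A toy Python sketch of the idea behind the fix (the endpoint and payloads are made up):

    import hashlib

    def cache_key(method: str, url: str, body: bytes) -> str:
        """Cache key that distinguishes requests by body, not just URL.

        A key built only from the URL makes two POSTs with different
        payloads collide, so one caller gets the other's cached reply.
        Hashing the body into the key removes the collision.
        """
        return f"{method}:{url}:{hashlib.sha256(body).hexdigest()}"

    # Same endpoint, different payloads -> distinct cache entries:
    assert cache_key("POST", "/api/search", b'{"q": "nginx"}') != \
           cache_key("POST", "/api/search", b'{"q": "copilot"}')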


unfortunately companies don't make interviewing for real design skills a priority. you'll get weeded out because you forgot how to do topological sort


Hopefully tools like this will finally persuade companies that being able to do leetcode from memory is not a skill they need.


Certainly, but the higher expertise isn't a requirement for most dev jobs, I would argue; if you are developing custom algorithms and advanced data structures, you are probably on the fringe of what the dev world does.

Otherwise I am struggling to explain why there is such great demand for devs that short courses (3-6 months) are successful, the same courses that fail at teaching the fundamentals of computing.


Guessing that if all you have to do is keep your metrics green, they are not selecting for the skills they are educating for.


With AI now they are on your level. It equalizes.


> Think of how many smallish projects are just never started these days because "software engineers are too expensive".

Maybe many. If the cost/benefit equation doesn't work, it makes no sense to do the project.

> I don't think software engineers will get much cheaper, they'll just do a lot more.

If they do more for the same cost, they are cheaper. You as a developer will be earning less in relation to the value you create.


> If they do more for the same cost, they are cheaper. You as a developer will be earning less in relation to the value you create.

Welcome to the definition of productivity increases, which is the only way an economy can increase standard of living without inflation.


Inflation and productivity might be correlated but neither is a function of the other. Given any hypothetical world where increased productivity leads to inflation, there's a corresponding world equal in all respects except that the money supply shrinks enough to offset that inflation.


> You as a developer will be earning less in relation to the value you create.

Doesn't matter as long as I create 5x value and earn 2x for it. I still am earning double within the same time and effort.


Oh now I see! This is how we will enter a new era of ‘code bloat’ - Moore’s law applied to software - where lines of code double every 18 months.


We went through the same hype cycle with self driving cars. We are now ~15 years out from the DARPA challenges and to date exactly 0 drivers have been replaced by AI.

It is certainly impressive to see how much the GPT models have improved. But the devil is in the last 10%. If you can create an AI that writes perfectly functional python code, but that same AI does not know how to upgrade an EC2 instance when the application starts hitting memory limits, then you haven't really replaced engineers, you have just given them more time to browse hacker news.


Driving is qualitatively different from coding: an AI that's pretty good but messes up sometimes is vastly more useful for coding than for driving. In neither case can you let the AI "drive", but that's ok in coding as software engineering is already set up for that. Testing, pair programming and code reviews are popular ways to productively collaborate with junior developers.

You're not replacing the engineer, but you're giving every engineer a tireless companion typing suggestions faster than you ever could, to be filled in when you feel it's going to add value. My experience with the alpha was eye-opening: this was the first time I've interacted with an AI and felt like it's not just a toy, but actually contributing.


Writing code is by far the easiest part of my job. I certainly welcome any tools that will increase my productivity in that domain, but until an AI can figure out how to fix obscure, intermittent, and/or silent bugs that occur somewhere in a series of daisy-chained pipelines running on a stack of a half-dozen services/applications, I am not going to get too worked up about it.


I agree. It kind of amazes me though there is so much room for obscurity. I would expect standardisation to have dealt with this a long time ago. Why are problems not more isolated and manageable in general?


It's much harder to reason about the global emergent behavior of a complex system than about the isolated behavior of a small component.


I don't think it's a function of complexity per se, but determinism. This is why Haskellers love the IO monad. Used well, it lets you quarantine IO to a thin top-level layer, below which all functions are pure and easy to unit test.
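
For readers who don't write Haskell, the same discipline ports to any language as "functional core, imperative shell". A rough Python sketch, with a made-up input file:

    def summarize(lines: list[str]) -> str:
        """Pure core: no file handles, no network, trivially unit-testable."""
        total = sum(len(line) for line in lines)
        return f"{len(lines)} lines, {total} characters"

    def main() -> None:
        """Thin imperative shell: all IO is quarantined here."""
        with open("report.txt") as f:  # hypothetical input file
            lines = f.read().splitlines()
        print(summarize(lines))

    # The pure core is deterministic, so testing it needs no filesystem:
    assert summarize(["ab", "c"]) == "2 lines, 3 characters"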


Distributed systems are anything but deterministic.


What is your definition of "replace"? Waymo operates a driverless taxi service in Phoenix. Sign ups are open to the general public. IMO this counts as replacing some drivers as there is less demand for taxi service in the operating area.

https://blog.waymo.com/2020/10/waymo-is-opening-its-fully-dr...


According to this article[1], the number of ride hailing drivers has tripled in the last decade.

I think full self driving is possible in the future, but it will likely require investments in infrastructure (smarter and safer roads), regulatory changes, and more technological progress. But for the last decade or so, we had "thought leaders" and VCs going on and on about how AI was going to put millions of drivers out of work in the next decade. I think it is safe to say that we are at least another decade away from that outcome, probably longer.

[1] https://finance.yahoo.com/news/number-american-taxi-drivers-...


Self driving is used in the mining industry, and lots of high paid drivers have been replaced.

But you are clearly more knowledgeable with your 0 drivers replaced comment.


Mining as in those big trucks or mining as in trains on tracks?



excellent if dystopian article. thank you for sharing!


> AI does not know how to upgrade an EC2 instance when the application starts hitting memory limits

That's exactly the kind of thing "serverless" hosting has done for a while now.


Yeah really bad example there.


Ahh yes, the serverless revolution! I was told that serverless was going to make my job obsolete as well. Still waiting for that one to pan out. Not going to hold my breath.


This isn't self driving for programming, it's more like GPS and lane assist.


15 years is no time at all.


I am blown away but not scared for my job... yet. I suspect the AI is only as good as the training examples from Github. If so, then this AI will never generate novel algorithms. The AI is simply performing some really amazing pattern matching to suggest code based on other pieces of code.

But over the coming decades AI could dominate coding. I now believe in my lifetime it will be possible for an AI to win almost all coding competitions!


I guess it's worth pointing out that the human brain is just an amazing pattern matcher.

They feed you all these algorithms in college and your brain suggests new algorithms based on those patterns.


Humans are more than pattern matchers because we do not passively receive and imitate information. We learn cause and effect by perturbing our environment, which is not possible by passively examining data.

An AI agent can interact with an environment and learn from its environment by reinforcement learning. It is important to remember that pattern matching is different from higher forms of learning, like reinforcement learning.

To summarize, I think there are real limitations with this AI, but these limitations are solvable problems, and I anticipate significant future progress.


Fortunately the environment for coding AI is a compiler and a CPU which is much faster and cheaper than physical robots, and doesn't require humans for evaluation like dialogue agents and GANs.


Well, you still have to assess validity and code quality, which is a difficult task, but not an unsolvable one.

Also, Generative Adversarial Networks' original implementation pits neural networks against each other to train them; they don't need human intervention.


> They feed you all these algorithms in college and your brain suggests new algorithms based on those patterns.

Some come from the other end of the process.

I want to solve that problem -> Functionally, it’d mean this and that -> How would it work? -> What algorithms / patterns are out there that could help?

Usually people with less formal education and more hands on experience, I’d wager.

More prone to end up reinventing the wheel and spend more time searching for solutions too.


What?


Most people I know who’ve been to college, or otherwise educated in the area they work in, tend to solve problems using what they know (not implying it’s a hard limit. Just an apparently well spread first instinct).

Which fits the pattern matching described by the grandparent.

A few people I know, most of whom haven’t been to college, or done much learning at all, but are used to working outside of what they know (that’s an important part), tend to solve problems with things they didn’t know at the time they set out to solve said problems.

Which doesn’t really fit the pattern matching mentioned by the grandparent. At least not in the way it was meant.


To reduce intelligence to pattern matching raises the question: How do you know which patterns to match against which? By some magic we can answer questions of why, of what something means. Purpose and meaning might be slippery things to pin down, but they are real, we navigate them (usually) effortlessly, and we still have no idea how to even begin to get AI to do those things.


I think those are the distance metrics, which is what produces inductive bias, which is the core essence of what we consider 'intelligence'. - Consider a more complicated metric like a graph distance with a bit of interesting topology. That metric is the unit by which the feature space is uniformly reduced. Things which are not linearized by the metric are considered noise, so this forms a heuristic which overlooks features which may have been in reality salient. - This makes it an inductive bias.

(Some call me heterodox, I prefer 'original thinker'.)


To generate new, generally useful algorithms, we need a different type of "AI", i.e. one that combines learning and formal verification. Because algorithm design is a cycle: come up with an algorithm, prove what it can or can't do, and repeat until you are happy with the formal properties. Software can help, but we can't automate the math, yet.


I see a different path forward based on the success of AlphaGo.

This looks like a clever example of supervised learning. But supervised learning doesn't get you cause and effect, it is just pattern matching.

To get at cause and effect, you need reinforcement learning, like AlphaGo. You can imagine an AI writing code that is then scored for performing correctly. Over time the AI will learn to write code that performs as intended. I think coding can be used as a "playground" for AI to rapidly improve itself, like how AlphaGo could play Go over and over again.
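
A toy sketch of that loop, with random search standing in for the learned policy a real AlphaGo-style system would use - the point is just that "tests passed" makes a cheap, fully automatic reward signal:

    import random

    TESTS = [((2, 3), 5), ((0, 0), 0), ((-1, 4), 3)]  # target behavior: addition

    def propose() -> str:
        # A real system would sample from a trained model, not at random.
        op = random.choice(["+", "-", "*"])
        return f"lambda a, b: a {op} b"

    def reward(src: str) -> int:
        fn = eval(src)  # fine for a toy; never eval untrusted code
        return sum(fn(*args) == want for args, want in TESTS)

    # Propose candidates, score them against the tests, keep the best.
    best = max((propose() for _ in range(100)), key=reward)
    print(best, reward(best))  # almost surely "lambda a, b: a + b", score 3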


Imparting a sense of objective to the AI is surely important, and an architecture like AlphaGo might be useful for the general problem of helping a coder. I'm not seeing it, however, for this particular autocomplete-flavored idiom.

AlphaGo learns a game with fixed, well-defined, measurable objectives, by trying it a bazillion times. In this autocomplete idiom the AI's objective is constantly shifting, and conveyed by extremely partial information.

But you could imagine a different arrangement, where the coder expresses the problem in a more structured way -- hopefully involving dependent types, probably involving tests. That deeper encoding would enable a deeper AI understanding (if I can responsibly use that word). The human-provided spec would have to be extremely good, because AlphaGo needs to run a bazillion times, so you can't go the autocomplete route of expecting the human to actually read the code and determine what works.


> we can't automate the math, yet

This exists: https://en.wikipedia.org/wiki/Automated_theorem_proving


This is moreso automation-assisted theorem proving. It takes a lot of human work to get a problem to the point where automation can be useful.

It's like saying that calculators can solve complex math problems; it's true in a sense, but it's not strictly true. We solve the complex math problems using calculators.


and there's already GPT-f [0], which is a GPT-based automated theorem prover for the Metamath language, which apparently submitted novel short proofs which were accepted into Metamath's archive.

I would very much like GPT-f for something like SMT, then it could actually make Dafny efficient to check (and probably avoid needing to help it out when it gets stuck!)

0. https://analyticsindiamag.com/what-is-gpt-f/


Someone tell Gödel


You mean like AlphaGo where the neural net is combined with MCTS?


> If so, then this AI will never generate novel algorithms.

This is true, but most programmers don't need to generate novel algorithms themselves anyway.


> I now believe in my lifetime it will be possible for an AI to win almost all coding competitions!

Then we shall be reaching singularity.


We will only reach a singularity with respect to coding. There are many important problems beyond computer coding, like engineering and biology and so on.


Coding isn't chess playing; it's likely about as general as math or thinking. If you can write novel code you can ultimately do biology or engineering or anything else.


Reading this thread it seems to me that AI is a threat for "boilerplate-heavy" programming like website frontends, I can't really imagine pre-singularity AI being able to replace a programmer in the general case.

Helping devs go through "boring", repetitive code faster seems like a good way to increase our productivity and make us more valuable, not less.

Sure, if AI evolves to the point where it reaches human-level coding abilities we're in trouble, but that's the case this is going to revolutionize humanity as a whole (for better or worse), not merely our little niche.


C’mon guys, your standard backend schema with endpoints is like way easier to automate away.


I mean, we already have? Use Django Rest Framework and simple business models and you're pretty much declaratively writing API endpoints by composing behavior. Almost nothing in a DRF endpoint definition is boilerplate.

The hard part has always been writing an API that models external behavior correctly.
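
For anyone who hasn't used DRF, a minimal sketch of what that declarative composition looks like (the Invoice model and its field names are hypothetical):

    from rest_framework import routers, serializers, viewsets
    from myapp.models import Invoice  # hypothetical app and model

    class InvoiceSerializer(serializers.ModelSerializer):
        class Meta:
            model = Invoice
            fields = ["id", "customer", "total", "issued_at"]

    class InvoiceViewSet(viewsets.ModelViewSet):
        queryset = Invoice.objects.all()
        serializer_class = InvoiceSerializer

    router = routers.DefaultRouter()
    router.register(r"invoices", InvoiceViewSet)
    urlpatterns = router.urls  # full CRUD endpoints, no hand-written views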


Generating the code with help from AI: 25 cents

Knowing what code to generate with the AI: 200k/yr


Will this tend to blur the distinction between coder and manager? In the end a manager is just a coder who commands more resources, and relies more on natural language to do it.

Or maybe I'm thinking of tech leads. I don't know, my org is flat.


The issue is not writing code. It's changing, evolving, or maintaining it.

This is the problem with things like spreadsheets, drag-and-drop programming, and code generators.

It's not easy to tell a program what to change and where to change it.


I feel like you might be moving the goalposts. Maybe they're different problems, but it's not at all clear to me that mutation is harder than creation.


"Prediction: AI will cause the price of work that can happen in front of a computer to decrease much faster than the price of work that happens in the physical world. This is the opposite of what most people (including me) expected, and will have strange effects"

I've been saying something like that for a while, but my form was "If everything you do goes in and out over a wire, you can be replaced." By a computer, a computer with AI, or some kind of outsourcing.

A question I've been asking for a few years, pre-pandemic, is, when do we reach "peak office"? Post-pandemic, we probably already have. This has huge implications for commercial real estate, and, indeed, cities.


I just don't believe it. Having experienced terrible cheap outsourced support and things like Microsoft's troubleshooting assistant (also terrible), I'm willing to pay for quality human professionals. They have a long way to go before I change my mind.


huh... I've found that people who tend to describe their occupation as "knowledge work" are the most blind to the fact that white collar jobs are the first to get optimized away. Lawyers are going to have a really bad time after somebody manages to bridge NLP and formal logic. No, it won't result in a Lawyerbot-2000 - it will result in software that enables lawyers to do orders of magnitude more work of a higher quality. What do you think that does to a labor market? It shrinks it. That or people fill the labor glut with new, cheaper, lawsuits...


i don't think that people will ever fully trust an AI lawyer, given all the possible legal consequences of a misunderstanding between the AI and the client. You could literally go to jail because of a bug/misunderstanding due to an ambiguous term (this might make a good sci-fi story ...)

But yes, getting some kind of legal opinion will probably be cheaper with an AI.


Nor I, which is why I said so. A SaaS will pop up called "Co-chair" and that'll be that. It would definitely be a lot easier to trust than any of the black box neural networks we are all familiar with - as the field of formal logic is thousands of years old and pretty thoroughly explored. I used a SAT solver just last night to generate an exhaustive list of input values that result in a specific failure mode for some code I'm reverse engineering - I have no doubts about the answer the SAT solver provided. That definitely isn't the case with NN based solutions - which I trust to classify cat photos, but not much else.
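
For a flavor of that workflow, here's a minimal sketch using the Z3 solver's Python bindings - the failure condition below (a 4-bit accumulator overflowing) is a made-up stand-in for whatever the real reverse-engineered check was:

    from z3 import And, Int, Or, Solver, sat

    x, y = Int("x"), Int("y")
    s = Solver()
    # Hypothetical failure mode: two 4-bit inputs whose sum overflows 4 bits.
    s.add(And(0 <= x, x <= 15, 0 <= y, y <= 15, x + y > 15))

    # Enumerate every satisfying assignment by blocking each model found.
    failures = []
    while s.check() == sat:
        m = s.model()
        failures.append((m[x].as_long(), m[y].as_long()))
        s.add(Or(x != m[x], y != m[y]))

    print(len(failures), "failing input pairs")  # 120 pairs in [0,15]^2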


Legal discovery and other "menial" law tasks are already quite automated.


I wouldn't describe keyword search engines or cross reference managers as "quite automated" - so I would expect little market change from whatever LexisNexis is currently selling.


I would -- I remember my mom as a lawyer having to schlep to the UCLA law library to photocopy stuff -- but current legal automation includes NLP at the level of individual clauses.

https://www.suls.org.au/citations-blog/2020/9/25/natural-lan...


Oof, as somebody who has studied the AI winter - that article hurt, suggesting that an unsupervised NN-centric approach is going to lead somewhere other than tool-assist... it's the 1970s all over again.

> I would

Well you're going to have a problem describing actual automation when you encounter it. What would you call it when NLP results are fed into an inference engine that then actually executes actions - instead of just providing summarized search results? Super-duper automation?



I kinda believe this but I still think it hugely depends on what you're doing in front of a computer. If you're just a generic developer that gets a task and codes it by the spec, then you can probably be replaced by AI in a few years.

But I don't think AI will become capable of complex thought in the next one/two decades, so if you're training to be a software architect, project manager, data analyst I think you should be safe for some time.


People have been saying AI would end everything and anything since I was a wee baby. It still hasn't happened. How about, instead of making the same old tired boring predictions about the impending apocalypse of things we love, we start making and listening to predictions that actually talk about how life has been observed to progress? It's not impossible; science-fiction authors get it right occasionally.


As it stands, this looks like it will actually increase the productivity of existing programmers more than it will result in "now everyone is a programmer".

Over time it will certainly do more, but it's probably quite a long time before it can be completely unsupervised, and in the meantime it's increasing the output of programmers.


Honestly, the only reason I'm still doing work in front of a computer is that it pays well. I'm really starting to think I should have followed my gut instincts when I was 17 and go to trade school to become an electrician or a carpenter...


True self-improving general AI would put all labor out of business. Incomes would then be determined by only the ownership of capital and government redistribution. Could be heaven, could be hell.


I’m not sure why it’s unexpected when it’s essentially a reframing of Baumol's cost disease. Any work that does not see a productivity increase becomes comparatively more expensive over time.


So either his prediction or expectation will be correct.

I think I’ll side with his expectation, but then again, my salary depends on it.


I wonder: does Sam Altman also believe that you can measure programmer productivity by lines of code?


Don't hold your breath. (Coming soon right after the flying self-driving car.)



