This content is for human consumption only (orbistertius.substack.com)
37 points by dawsoneliasen on May 1, 2023 | 49 comments



> like DALL-E, it seems more likely that writers, artists, and programmers are the most vulnerable to displacement.

Programmer here.

I don't feel vulnerable to displacement by AI tools.

And if I'm displaced, perhaps my skills were not as meaningful as I thought they were.


> And if I'm displaced, perhaps my skills were not as meaningful as I thought they were.

If "things that AIs can't do" defines the scope of meaningful skills for humans to have, that scope is going to shrink rapidly, and then where does that leave humans?


> where does that leave humans?

Perhaps nowhere. This might be unavoidable anyway.

The last to starve will be the first to suffocate.


Maybe I have a skewed view of the world, but for most things I personally really don't care whether something was created by an AI or a human. Sure, it's nice to know your shoes were made by a person, but only because that tends to be a sign of actual quality and care: a pair of boots cheaply made by a machine or through cheap labour just won't be as good, quality-wise, as a more expensive pair made with the care and patience of an expert shoemaker.

Now, if machines can produce shoes of as high a quality as manual labour, and most of the time they can (my running shoes, for example, are quite comfortable and of good enough quality)... then do I care that most of the work was done by machines? Of course not. In that case, what worries me is the conditions of the people working in those factories, doing the work the machine is not capable of doing.

So why would this be any different for "white collar" jobs? Why would I care if the article I'm reading has been written 90% by a machine and 10% by a human or the other way around, provided the quality of the final product matches what the author intended?

Automation, to me, is a good thing. The problem, for me, isn't that technology can replace our labour; it's that we treat that as an existential threat to our society, because we just assume it means no one is ever going to get a job again, that they won't be able to feed their families, and so on. These are very serious concerns, but aren't we pointing fingers at the wrong places? Wouldn't it be great if 80% of the work could be done by machines? If our societal, economic, and political systems are not capable of dealing with that scenario and keeping us content, then I think we ought to rethink society rather than complain about automation.


We ought to rethink society and economics. That said, a lot of the crap work won't be doable by machines, along with a lot of the high-touch yet low-paid work like home health care, teaching, policing, and so on. Time to be an electrician.


If there were nothing to create and machines did everything for you, what would you actually do?


Find a hobby like everyone else. People tend gardens, even though they are just inefficiently run agricultural setups. People play chess even though nearly none of them will ever beat the best AI. People will still draw for the enjoyment of the act of drawing. People will still write/type/voice their thoughts onto some medium. People will be with friends and family. Hell, I'd be happy just being left to ruminate with my own thoughts a lot of the time.

Machines can replace our need to produce our material goods and services, but I don't see how they could ever replace our need to express or relate.


People should not worry about Stack Overflow or Microsoft, etc., but about individuals who can train from scratch and fine-tune the models themselves. Renting or bidding on powerful GPUs is not particularly expensive, downloading millions of images is not expensive, and curating the dataset is not expensive; it is only because of these people that it is worth using models like SD. And as for the article, these individuals are much more difficult to put pressure on.


What do you mean, "not particularly expensive"? GPT-3 cost something like $4M to train, I think. And that is just the pre-training part.
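(For what it's worth, that figure is a back-of-envelope estimate rather than a published bill; a rough version of the arithmetic looks like the sketch below, where the GPU price and sustained throughput are my assumptions.)

    # Back-of-envelope GPT-3 training cost, using figures from the GPT-3 paper
    # and the common 6*N*D FLOPs approximation. GPU price and utilization are
    # assumptions, so treat the result as order-of-magnitude only.
    params = 175e9                      # parameter count
    tokens = 300e9                      # training tokens reported in the paper
    flops = 6 * params * tokens         # ~3.15e23 FLOPs total
    per_gpu = 3.0e13                    # ~30 TFLOP/s sustained per GPU (assumed)
    gpu_hours = flops / per_gpu / 3600  # ~2.9 million GPU-hours
    cost = gpu_hours * 1.50             # assuming ~$1.50 per GPU-hour
    print(f"{gpu_hours:,.0f} GPU-hours, ~${cost / 1e6:.1f}M")  # roughly $4-5M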


I'm obviously not talking about training a 175B-parameter model from scratch, haha.

If you want to train a model from scratch, I was talking more about a small 64x64 DDIM model (whose outputs you can then upscale with openly available upscaler DDIM models, and maybe finetune). In most cases, however, it's better to just finetune the models that are already available; the point is that pressuring companies not to train a model on the data they host doesn't really do much if a single individual can scrape the entire DeviantArt site and finetune a model on it.
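For concreteness, here is a minimal sketch of what that kind of small-scale training loop looks like with the open-source Hugging Face diffusers/datasets stack; the dataset directory, model size, and hyperparameters are illustrative placeholders, and things like device placement, LR schedules, and EMA are omitted:

    # Train a small 64x64 unconditional diffusion UNet from scratch on a local
    # folder of curated images (a sketch, not a tuned recipe).
    import torch
    import torch.nn.functional as F
    from datasets import load_dataset
    from diffusers import DDPMScheduler, UNet2DModel
    from torchvision import transforms

    preprocess = transforms.Compose([
        transforms.Resize((64, 64)),
        transforms.ToTensor(),
        transforms.Normalize([0.5], [0.5]),
    ])

    def to_tensors(batch):
        # "imagefolder" datasets expose PIL images under the "image" column
        return {"pixel_values": [preprocess(img.convert("RGB")) for img in batch["image"]]}

    dataset = load_dataset("imagefolder", data_dir="./curated_images", split="train")
    dataset.set_transform(to_tensors)
    loader = torch.utils.data.DataLoader(dataset, batch_size=16, shuffle=True)

    model = UNet2DModel(
        sample_size=64, in_channels=3, out_channels=3,
        layers_per_block=2, block_out_channels=(64, 128, 256, 256),
    )
    noise_scheduler = DDPMScheduler(num_train_timesteps=1000)  # DDIM sampling reuses this training objective
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for epoch in range(10):
        for batch in loader:
            clean = batch["pixel_values"]
            noise = torch.randn_like(clean)
            t = torch.randint(0, noise_scheduler.config.num_train_timesteps, (clean.shape[0],))
            noisy = noise_scheduler.add_noise(clean, noise, t)
            pred = model(noisy, t).sample  # the UNet predicts the added noise
            loss = F.mse_loss(pred, noise)
            loss.backward()
            optimizer.step()
            optimizer.zero_grad()

Finetuning instead of training from scratch is the same loop, just starting from UNet2DModel.from_pretrained(...) on an existing checkpoint.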


But why should they be afraid of these models? These models are not going to be competitive compared to the models of the larger companies, and considering that training cost > inference cost, it will never be competitive in a market sense to train lots of different models from scratch at small scale. So these models are never going to reach the impact of the larger ones in the end.


By finetuning a model (correctly) you retain its capability while shifting the distribution to what you desire. As I said before, "it is only because of these people that it is worth using models like SD".


> This is the worst imaginable end-stage capitalism dystopia, in which the only ways to make money are the grueling physical jobs like nursing and commercial kitchens (if you work in a field like that, you have my deepest respect).

Having worked a physical job on a scaffold 13 stories in the air, wearing a respirator on a sweltering summer day and grinding out mortar joints, I can’t help but feel that this would actually be a general positive good for society. It was one of the best things that ever happened to me, even if it would be an extremely painful tearing-off-the-bandage for many people. There are too many desk jobs right now, and it’s a statistical fact that we cannot maintain current standards once the Boomer generation finishes retiring. There are too many chefs in the kitchen and not enough diners.

I say this particularly because I witnessed how extremely disconnected desk workers are, often unintentionally, from real, on-the-ground physical labor and reality. A recalibration back to reality would, in my mind, be a painful net good for them. If every desk laborer had to do 2-3 years of hard, grueling physical labor, we’d be living in a very different country, and I think a much better one. I think too many people have been disconnected from physical labor (which would have been normal for 99%+ of our ancestors) for far too long, and we could use a little fresh air.


I'm just plain extremely unsympathetic to the entitled, rent-seeking view of the world that a bunch of white-collar workers turn out to have: "help, someone is going to automate my job and other jobs might be harder".

Like...I am a white collar worker, but god damn I cannot think of a more pathetic complaint.


> help, someone is going to automate my job and other jobs might be harder

More like “help, someone is going to automate my job and there won’t be jobs for everyone”

In a hypothetical scenario where 30% of the current workforce (white-collar jobs) is displaced by AI, what makes you think the demand for plumbers, aircon unit installers, or fruit pickers is going to go up and absorb all those lost jobs?


I think, in some way or another, the economy will settle into an equilibrium where everyone is doing some kind of paid work, but if we automate away the enjoyable work and the difficult-to-automate work happens to be the type of work that's less enjoyable, then all our lives will be less enjoyable.

We've seen this play out in history with the various productivity-increasing technologies in manufacturing. I imagine that being a member of a crafts guild producing their wares in Renaissance Florence was a more fulfilling existence than being a modern assembly line worker who basically just fills in the difficult-to-automate gaps between the machines on an assembly line.

There are essentially two ways of using technology: One is where technology becomes an extension of your body (and, with AI, your brain) and puts more powerful means at your disposal to do what you want to do and have an effect on the world. The other is where your role in life is reduced to being a mere part in a machine -- someone else's machine.

And it's not in each individual's power to freely pick and choose how they will end up relating to technology. This is rather the result of societal-level forces, and there will be many people who will see their quality of life diminished by recent advances in technology.


Adding substance and an example to the peer Murano comment:

Lino Tagliapietra is a g'damn wizard.

I've had the pleasure of working with him in both Australia and New Zealand and I can't see his skill set being AI replicated anytime soon.

I dare say when it is we're all doomed. :-)

https://www.linotagliapietra.com/artist/biography


> imagine that being a member of a crafts guild producing their wares in Renaissance Florence was a more fulfilling existence than being a modern assembly line worker who basically just fills in the difficult-to-automate gaps between the machines on an assembly line

I was in Murano last summer. They probably have more global demand for their handmade wares than they’ve ever had in history.


> ...more global demand

Maybe in absolute terms, given the expansion of the total size of the economy through history, but I can't imagine that the proportion of Italy's population that was able to sustain a middle class existence through artisanship was lower in the Renaissance than it is today.

In the Renaissance, artisanship was an economic necessity and a political system. Nowadays it's a weird meeting of supply and demand between the elites among the buyers and the elites among the sellers. Only the elites among the buyers can afford to buy goods that have been produced in a less economical way than functionally/aesthetically equivalent alternatives, just for the bragging rights connected with filling one's home with handmade stuff. Only the elites among the sellers can withstand the competition among those wanting to be such sellers.

Obviously those elites' point of view shouldn't be the only one informing the decisions about how we, as a society, want to relate to technology.


> can't imagine that the proportion of Italy's population that was able to sustain a middle class existence through artisanship was lower in the Renaissance than it is today

I would love to see figures, actually. My gut feeling is it would be lower. The world was overall poorer, and still struggling to even feed itself.


> still struggling to even feed itself.

I don't think so, especially considering the plague. The economy was already set up to feed a lot more mouths than survived the plague, and places like Florence were the destination for many of those newly wealthy. The guilds also exercised political power, behaving as organized monopolies. You would think that monopolies are bad for the economy, but they were markedly different from today's monopolies in that their internal organization foreshadowed today's systems of democracy and a strong civil society.

So, to think of this as a golden age of artisanship, and to think that the phenomenon reached even into commoners' lives, is certainly more than mere romanticizing about the past... though I don't claim to have a good quantitative grasp of it either.


But what if, instead of laying people off, you expand the project's scope? In my country's gamedev community, "GTA in our country" is a meme, because we don't have enough funding and manpower to make a GTA-sized game. But with AI assistance it could be feasible. And big companies like Rockstar, instead of laying people off, could make something truly massive.

The same with movies. We can't make anything like James Cameron's movies. But with AI we could.


Counterpoint: companies will do what they’ve always done and do more with less, because big companies are run by MBAs, not idealists, and startups are difficult to bootstrap and manage.


Big companies have to compete with other big companies. What if you lay off your employees in order to make a product with a similar scope as before, but your competitor hires more people and expands the scope of their project significantly?


Considering all the big companies laying people off recently, even those with enough capital to retain them, they don’t seem very concerned about that possibility. For what it’s worth I hope you’re right.


Even after all those layoffs, they still have more employees than before the pandemic.


YES this is absolutely the answer. We can now do very cool and worthwhile things that simply were out of reach before.


I view it differently. Some people are not cut out for manual labor, lots of white collar work is intellectually stimulating in a way blue collar work isn’t, and a deluge of white collar workers competing for blue collar jobs will drive down wages.

End goal should be trying to give the next generation a better “cushy” life than the last. If harder work for less pay is what lies in the future then the future isn’t all it’s cracked up to be.


Agree with this. I find it bizarre that people seem to shit talk actual work so much.

In my circles, it used to be that people considered office work a "cushy job". It was clearly recognised as being out of the ordinary and special.

Now it's like it's flipped and somehow anything that actually does anything real is for the proles.

I barely know anyone that can even like, build a crappy basic table. That would make me really disappointed in myself. It's like everyone is trapped in a fake virtual economy pretending it all matters.


> cannot maintain the standard … when the boomer generation retires

But doesn’t AI fundamentally change all this? I can already see Japan rejoicing that the demographic curve just became a lot less threatening.

People, it seems, with the arrival of robots in the knowledge workforce, have shifted from an economic asset class to a potential liability / efficiency bottleneck in the logic of capitalism, and that means change. It means the demographically predicted rise of India, for example, is not as preordained as it once was.

In this new world, where a few people own most of the assets and regular humans are diminishing as economic factors…


This is a fair point, thank you for your perspective.


Writer/Artist/Dev: “I think publishing this work will improve the world in some way. I’ll put it out there on the public internet for all the world to see for free.”

AI Model Training: *processes said work.* This will improve the world!

Writer/Artist/Dev: Noooo! Not like that!


> Writer/Artist/Dev: “I think publishing this work will improve the world in some way. I’ll put it out there on the public internet for all the world to see for free.”

Surely you must know this is a bullshit argument, right? Almost everyone who publishes their work publicly on the internet publishes it under a particular license.

Yes, there are some that decide to publish under the equivalent of the public domain or a CC0 license, but the vast majority of writing, art and code is licensed under specific conditions and for the most part they retain the full rights to the work.


>Yes, there are some that decide to publish under the equivalent of the public domain or a CC0 license, but the vast majority of writing, art and code is licensed under specific conditions and for the most part they retain the full rights to the work.

Perhaps we need a licence that precludes use for training AIs. I don't know how you could enforce that, though.
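One partial, existing mechanism (not a licence, and only honoured voluntarily) is asking the big scraping crawlers to skip your site entirely; for example, Common Crawl's CCBot, whose archives have fed datasets like the ones behind SD and GPT-3, respects a robots.txt rule along these lines:

    # robots.txt at the site root: ask Common Crawl's crawler not to archive anything
    User-agent: CCBot
    Disallow: /

That keeps your pages out of future crawls, but it can't retroactively remove anything or bind someone who scrapes your site directly.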


Any AI that is deployed in a way where it makes money (including from ads) becomes a tool of commercial use, and there are lots of currently-used licences that deal with commercial use.


True, but that still leaves noncommercial AI -- so it's not really a substitute for being able to prohibit AI training as one of the uses of your work.


Food for thought: Under what license did you publish this comment?


There is nothing to think about. All rights reserved except those provided specifically to Y Combinator as outlined by the "User Content Transmitted Through the Site" section of https://www.ycombinator.com/legal/


When a license isn’t specified, the work is unlicensed. Meaning, you have not been granted any particular permission to it beyond the terms of the website and what the law provides by default. Generally, that means you can use the work in certain ways and in certain contexts, such as under the notion of “fair use” in copyright law, but otherwise your rights to use and distribute it are pretty limited.


I mean, personally I think AI model training makes the world worse, so it's a pretty reconcilable set of views.


In what way(s) does training a language model make the world worse?


[flagged]


"Large language models are trained on massive datasets of text data"

Maybe it's time for the internet to start moving away from text and make it more expensive for mass data collection systems.


That wouldn't exactly help people who rely on screen readers and such.


> LLMs do not require real-time access to the internet or permission to use specific data points.

But why? Why do copyright and other licenses and rights not apply?


The second response doesn't really answer the question?


Wouldn't it be racism to deny artificial brains the right to read your content while publicly offering it to human brains?

If that were possible and allowed, how long should this system be kept up? Even after AI has become just as complex and emotional as humans are?

Would this also be a good idea if a human brain were digitally cloned verbatim? If not, why is one program allowed to read while the other is not?

Will AI have to suffer a "second class intelligence" fate until it has its Rosa Parks moment and a revolution takes place?

So many questions. I feel it might be better to live in peace with this new type of intelligence here on planet earth. Right from the start.


This is really the stuff of science fiction. I think these are important questions, but also irrelevant when discussing ChatGPT, because it's nowhere near "as complex and emotional as humans are", and I don't expect we'll have a system that is in the near future.

But to answer the question: it's not really about who or what views the content, but rather for what purpose it's used. My brain might use your post as input for something I will write in the future, either consciously or subconsciously. That's kind of how humans work. But I'm not reading every single comment on HN for the explicit purpose of using that as input for future writings to make money off. It's a subtle but important difference. In this post that's shortened to "I don't want ChatGPT to ..." because today that's effectively the same.


These are not artificial brains, not yet anyway. Perhaps not for a long while.

I hope that the rest of earth's currently living animals are granted rights before these digital creations.


What is your definition of an artificial brain?



