
Use of AI in the fashion industry - mcone
https://www.nytimes.com/2018/07/07/business/economy/algorithm-fashion-jobs.html
======
popinman322
The problem I have with this idea is that ML systems usually only repeat
patterns they've seen before. Sure, you can get quirky output by going to
unused portions of the latent space, but that's also more likely to yield
degenerate results (things that look 40% shirt, 40% pants, and 20% "other").
While these types of results are usually the most interesting, they're also
the most likely to require professional "cleaning"; removing artifacts in a
way that produces good results still requires skilled labor. I would expect an
ML system to be able to do fine adjustments like spacing and repetition of
minor motifs, coarse adjustments like combining motifs in novel ways, or even
simple block patterns like the one in the article, but it's unlikely you'll
find novel motifs solely via the latent space. Even for the most interesting
potential use case, novel combinations of existing motifs, you're likely going
to need human discretion as a final pass.
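To make the latent-space point concrete, here is a toy sketch with a purely hypothetical linear "decoder" (the weights, dimensions, and motif names are all invented for illustration, not any real fashion model): interpolating between seen designs stays well-behaved, while codes far outside the training region saturate into artifact-heavy output.

```python
import numpy as np

# Hypothetical frozen "decoder": maps a 2-D latent code to an 8-value pattern.
rng = np.random.default_rng(42)
W = rng.normal(size=(2, 8))

def decode(z):
    # tanh squashes output into a pixel-like [-1, 1] range
    return np.tanh(z @ W)

# Latent codes for two known-good designs (invented "shirt"/"pants" motifs).
z_shirt = np.array([1.0, 0.0])
z_pants = np.array([0.0, 1.0])

# Interpolating between seen designs: the "novel combination" case.
blend = decode(0.5 * z_shirt + 0.5 * z_pants)

# A code far outside the training region: tanh saturates, yielding a
# near-binary, artifact-heavy pattern (the "degenerate output" case).
weird = decode(np.array([8.0, -8.0]))
```

The sketch only illustrates the geometry of the argument: interpolation between training examples tends to yield plausible blends, while extrapolation deep into unused latent space tends to produce saturated, degenerate output.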

When a human designs something, there's often intent involved; there are
design constraints and social context involved. I don't expect statistical ML
(which is good at interpolation) to cross these gaps without integration with
symbolic ML (which is good at extrapolation).

Though maybe I'm biased since I work in a symbolic AI lab.

~~~
freehunter
Humans often only repeat patterns they've seen before as well, and quirky
output comes from wild guesses and requires significant skilled labor to clean
up.

For every new design you see, there may be 100 or 1000 that were thrown away.
All of those discarded designs cost money whether or not they were used.

~~~
sincerely
I mean, it's going to be a long long time before we get an AI on the level of
Iris van Herpen or Chalayan. It seems to be mostly about colour and minor
variations on extremely basic articles of clothing, barely even "fashion" in
the sense of innovation.

~~~
extralego
That surely happens, but popular approval and celebration are the product of
social structures rather than creative ones.

------
pgodzin
A more in-depth view of the StitchFix model that I found really interesting:
[https://algorithms-tour.stitchfix.com/](https://algorithms-tour.stitchfix.com/)

~~~
jcims
Someone put a lot of thought and work into that page. Very cool.

~~~
Rumudiez
Yeah, the “rotate your device to landscape” definitely shows a lot of polish.
/s

A simple fix would just be to inline all the content, like all the little ol’
two-column responsive sites do. In light of that obvious solution, they put a
modal on it and compromised their content.

------
knuththetruth
Perhaps the risk isn’t that technology will actually take over white collar
jobs, but that managers will use minimally viable examples of technology to
lay off massive numbers of people for short-term gains.

Of course, when those companies then implode because the technology can in no
way make up for the resulting skill drain, said managers will have moved on to
a new position, having sold the layoffs/technology as evidence of their
“superior business acumen.”

------
dawhizkid
There’s plenty of uninspiring (not sure about high skilled) white collar
work...I would not be surprised if a lot of what early career lawyers,
investment bankers, accountants, consultants do could be automated. I’m also
bullish on radiology being the first area of medical practice that can be
largely automated by machines.

In the future (and even now) careers are going to be defined by how well you
can form relationships (and therefore sell) i.e. the things that will be
hardest for machines to do.

~~~
l8rpeace
Do we end up with less skilled late-stage doctors, investment bankers,
radiologists, etc. if we automate away the grind?

~~~
falcor84
If these years of grind are replaced with a higher-level and more intentional
early-stage practice of monitoring and managing the automation, then I think
we could end up with better-skilled people.

EDIT: Although I suppose that these professionals would still need at least
some practice at the manual work, to be able to monitor the machines' output.

------
3pt14159
We're a long, long, long way from machines taking over white collar work to
any reasonable degree. Even extremely basic commands are routinely
misunderstood by voice assistants.

What is going to continue to happen is what is already happening now: less
work wasted on bullshit, and more potentially interesting findings surfaced
for humans to examine. Same type of thing I was working on in 2011 (helping
lawyers with discovery; edit: no, I confused things. Back then I was helping
companies scan internal communication for automatic skill mapping), only
better.

~~~
ggg9990
We’re a long way away in technology time but not in society time. We might be
as far away from this world as we are from the creation of the Web — but
society isn’t ready to have 10x the number of unemployed people in 25 years.
Political/economic/social systems will not survive — the hope is that they get
replaced nonviolently rather than in a bloodbath.

~~~
nradov
There is no hard evidence that unemployment will increase by 10×. This is pure
speculation. Meanwhile we have real non-imaginary problems to focus on.

~~~
3pt14159
I agree. Mass unemployment is not a concern we can start to address because we
do not know the nature of the coming unemployment. We're talking about 20 or
40 years away.

Meanwhile, we have real fucking problems right now:

1. Sky-high housing prices.

2. Extreme wealth disparity.

3. Environmental degradation.

4. Cybersecurity failures everywhere, just as everything goes autonomous or
otherwise cyberphysical.

5. Information operations threatening both democracy and the openness of the
internet.

You can still make $100k a year _editing English_. The demand for talent is
sky high, it's investments in education and society that will help us, not
fretting over how AI will take away jobs long before we can do anything about
it.

~~~
Mediterraneo10
> You can still make $100k a year editing English.

The only employers paying that much are old-school publishers that are a bit
of a hold-out but will eventually give way. As a translator often doing full-
length books, I have witnessed some big publishing names severely slashing the
amount they pay for editing (by doing things like outsourcing the job to
India), and they don’t worry about any drop in quality because it is felt that
in a web-heavy world, the public no longer cares too much.

It is depressing as fuck for me as a translator, because I put a lot of effort
into my translations and I wish they would get the same level of love at the
next stage of the publishing process, but this is the trend of the future.

Also, I feel lucky to still have work as a translator because many clients
today are running the text through Google Translate and then just paying a
native speaker to clean up the text for less money than they would have to pay
a translator. Obviously that is not common in the mainstream publishing world,
but it is increasingly happening with the marketing materials and technical
manuals that are a translator’s daily bread and butter. “AI” is already
hitting businesses based on human language hard.

~~~
3pt14159
My editor is fully booked and she charges $80–100/hour. She does have a
technical background, but she took a twenty-year break to have kids.

As for translation quality: I care. Others do too, though you're right that
many don't. In the long run, quality wins.

------
cix_pkez
Designing patterns on shirts is high-skilled work?

~~~
ddebernardy
If it didn't require above-average skill, it would mean the average Joe on
the street could come up with patterns that sell above average. (Which might
be plausible at the end of the day, but professional cloth designers will
probably beg to differ and/or argue that it's a recipe for a lack of
creativity.)

~~~
pizza234
There's a wide spectrum between _average_ and _high_ skills; the parent
referred specifically to _high_.

Besides, the article, which is very poor and click-baity, starts with a
factoid (algorithms designing shirts) and then steers into something
completely different: big data.

------
mtgx
Anyone else annoyed with these disguised advertorials in the NYT lately?

~~~
forapurpose
Why do you think this article is an advertorial? What is it advertising? Is
there evidence you can point to?

~~~
taurine
> "Suits make a corporate comeback," says the New York Times. Why does this
> sound familiar? Maybe because the suit was also back in February, September
> 2004, June 2004, March 2004, September 2003, November 2002, April 2002, and
> February 2002.

> Why do the media keep running stories saying suits are back? Because PR
> firms tell them to. One of the most surprising things I discovered during my
> brief business career was the existence of the PR industry, lurking like a
> huge, quiet submarine beneath the news. Of the stories you read in
> traditional media that aren't about politics, crimes, or disasters, more
> than half probably come from PR firms.

> I know because I spent years hunting such "press hits." Our startup spent
> its entire marketing budget on PR: at a time when we were assembling our own
> computers to save money, we were paying a PR firm $16,000 a month. And they
> were worth it. PR is the news equivalent of search engine optimization;
> instead of buying ads, which readers ignore, you get yourself inserted
> directly into the stories.

[http://www.paulgraham.com/submarine.html](http://www.paulgraham.com/submarine.html)

~~~
forapurpose
I was asking about this particular article. I have no doubt that advertorial
exists.

------
woadwarrior01
_The first algorithm generated random images that it tried to pass off as
clothing. The second had to distinguish between those images and clothes..._

Sounds like a long-winded way of describing a GAN.
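For reference, the quoted sentences map directly onto the standard GAN objective: a generator tries to pass off fakes, a discriminator tries to tell real from fake. A minimal numpy sketch of that two-player loss on made-up 1-D "data" rather than images (everything here is illustrative, not the setup from the article):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "real" data: 1-D samples the generator should learn to imitate.
real = rng.normal(loc=4.0, scale=0.5, size=(64, 1))

# Generator: affine map from latent noise to data space.
g_w, g_b = np.array([[1.0]]), np.array([0.0])

def generate(z):
    return z @ g_w + g_b

# Discriminator: logistic regression, outputs P(sample is real).
d_w, d_b = np.array([[0.1]]), np.array([0.0])

def discriminate(x):
    return 1.0 / (1.0 + np.exp(-(x @ d_w + d_b)))

z = rng.normal(size=(64, 1))
fake = generate(z)

# Discriminator loss: classify real as 1, fake as 0 (binary cross-entropy).
d_loss = (-np.mean(np.log(discriminate(real) + 1e-9))
          - np.mean(np.log(1.0 - discriminate(fake) + 1e-9)))

# Generator loss: fool the discriminator into calling fakes real.
g_loss = -np.mean(np.log(discriminate(fake) + 1e-9))
```

Training alternates gradient steps on these two losses; the article's description ("the first algorithm generated random images... the second had to distinguish") is exactly this adversarial loop.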

~~~
zeroxfe
Sounds like an accessible way to describe a GAN to me. It's the New York
Times, not Dr. Dobbs Journal :-)

~~~
backpropaganda
I wish they could have cited the paper though, for credit assignment, and for
those who want to learn more.

~~~
woadwarrior01
Precisely! The least they could've done is mention what it's called for people
who might want to learn more, rather than allude to it being some proprietary
AI technique that the company came up with.

