
Ironies of Automation (1983) [pdf] - kiyanwang
https://www.ise.ncsu.edu/wp-content/uploads/2017/02/Bainbridge_1983_Automatica.pdf
======
detaro
Some newer papers look back at this and its continuing relevance:

2017: _Ironies of Automation: Still Unresolved After All These Years_
[https://doi.org/10.1109/THMS.2017.2732506](https://doi.org/10.1109/THMS.2017.2732506)
(couldn't find an openly accessible link :/)

2012: _The ironies of automation... still going strong at 30?_
[http://johnrooksby.org/papers/ECCE2012_baxter_ironies.pdf](http://johnrooksby.org/papers/ECCE2012_baxter_ironies.pdf)

~~~
tw1010
[http://sci-hub.tw/https://ieeexplore.ieee.org/document/80130...](http://sci-hub.tw/https://ieeexplore.ieee.org/document/8013079)

------
thewhitetulip
I used to work in ETL some time back, using a popular tool daily.

Code reviews used to take a LOT of time: about 30 minutes per object, and a
decent project has around 300 such objects.

It turns out the tool can export objects as XML, so I wrote a Python script to
do the review.

What happened?

I could run the basic manual checks, like coding standards, in about 15
milliseconds, regardless of the number of objects.
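A script of that shape might look something like the sketch below. The XML layout (an `<objects>` root with `<object name="...">` children) and the naming convention are assumptions for illustration, not the real tool's export format:

```python
# Minimal sketch of an automated review over a tool's XML export.
# The element names, attributes, and naming rule are hypothetical.
import re
import xml.etree.ElementTree as ET

# Hypothetical coding standard: lowercase names with a type prefix.
NAMING_RULE = re.compile(r"^(m|exp|agg|lkp)_[a-z0-9_]+$")

def review_export(xml_text):
    """Return a list of (object_name, issue) pairs for one export file."""
    root = ET.fromstring(xml_text)
    issues = []
    for obj in root.iter("object"):
        name = obj.get("name", "")
        if not NAMING_RULE.match(name):
            issues.append((name, "name violates coding standard"))
        if not obj.get("description"):
            issues.append((name, "missing description"))
    return issues

sample = """<objects>
  <object name="m_load_orders" description="loads orders"/>
  <object name="LoadCustomers"/>
</objects>"""

for name, issue in review_export(sample):
    print(f"{name}: {issue}")
```

Each new standard becomes one more check in the loop, so the per-object cost stays effectively constant no matter how many objects the project has.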

That's automation. The script is usable in other projects, yet nobody outside
my team ever used it for reviews. Why? Because it takes effort to understand,
plus a basic grasp of XML and Python, and they don't have that because it's
not their field. I am a nerd, so I kept exploring.

So the script is now ancient history.

And coming to my point: people say automation is the demise of everything,
that AI is taking over all jobs. That won't happen for a long time.

For starters, during the Cold War scientists said, "We will have live voice
translation from Russian to English in five years."

How many years has it been since the Cold War ended? Only in the 2010s was MS
finally able to ship that technology. Imagine the time gap.

Secondly, if such an 'automation' product is sold by a company, then who will
create and sell it? And will that firm have a billion such small scripts for
automation? Highly unlikely. Whatever automation I did in the past five years
was tailored to relieving my own workload.

It is highly unlikely that companies will buy such automation packages, and
the much harder part is for automation companies to _sell_ such tools.

Companies like mine have spent millions on that tool; will they really spend
more money on a bunch of automation tools on top of it?

Note that the core aspect of the tool's usage can't be automated. You still
need a living, thinking human being to use it and to build objects from
non-existent requirement documents. Scripts need 100% perfect inputs, which is
not possible in real life.

------
fizixer
Yeah, unfortunately it's fairly easy to rebut this paper. I only had to read
the abstract and the first paragraph. The last line of the first paragraph:

> ... the more advanced a control system is, so the more crucial may be the
> contribution of the human operator.

This is absolutely true. The human operator working with an automation system
needs to be way more skilled, more educated, and smarter compared to a human
working as a cog in a sweatshop.

But this still doesn't support the thesis that automation expands the problems
associated with the human operator, for a very simple reason:

You are replacing 1000 low-skilled sweatshop workers with 5-10 (or fewer)
highly skilled engineers and experienced technicians, while at the same time
increasing the throughput of your production system by an order of magnitude
or more. It's a complete win.

And I didn't even need to invoke the role and/or potential of modern AI (i.e.,
one based on ML/DL/DRL/CV etc, etc), which is going to take the automation to
the next level.

Highly recommended watch: CGPGrey 'Humans Need Not Apply'

[https://www.youtube.com/watch?v=7Pq-S557XQU](https://www.youtube.com/watch?v=7Pq-S557XQU)

(In the video, look at the chart of human agricultural labor at the 15-second
mark, and you'll understand what I'm saying.)

~~~
logifail
> this still doesn't support the thesis that automation expands the problems
> associated with the human operator

In some circumstances it might. Look at Air France Flight 447 and the role of
the co-pilot in stalling the aircraft.

_"Neither weather nor malfunction doomed AF447, nor a complex chain of error,
but a simple but persistent mistake on the part of one of the pilots"_ [0]

An aircraft like that has so many highly automated systems it is impossible
for the human operators to fully understand them.

[0] [https://www.popularmechanics.com/flight/a3115/what-really-ha...](https://www.popularmechanics.com/flight/a3115/what-really-happened-aboard-air-france-447-6611877/)

~~~
lmm
> An aircraft like that has so many highly automated systems it is impossible
> for the human operators to fully understand them.

The aircraft was stalling because the pilot kept the flight stick pulled all
the way back. Nothing automated or complex about it - a wood-and-canvas
biplane would have had exactly the same problem.

~~~
detaro
If I remember correctly: Due to sensor issues the plane dropped out of
autopilot and switched into a different control mode ("Alternate law").

The pilot had to suddenly take over and had a wrong mental model of the state
and configuration the plane was in, and likely didn't realize the consequences
of the different control mode, among which is that it doesn't provide the same
level of automated stall protection as the normal case. Since automation
normally works, the pilot wasn't trained or experienced in flying in this
mode.

While in the end the operator made the wrong decision, these automation
ironies go a long way toward explaining _why_ he made those decisions, or
wasn't prepared to make the right ones.

~~~
lmm
> While in the end the operator made the wrong decision, these automation
> ironies go a long way toward explaining why he made those decisions, or
> wasn't prepared to make the right ones.

Not at all convinced. Hauling back on the stick and just keeping it there
would never have been the right decision under any circumstances. At the same
time it's a reasonably common panic reaction, and was long before any kind of
cockpit automation.

A pilot who had been in tense situations before would have been more likely to
make the right decision, sure, and modern automated systems may well mean that
the first truly deadly situation occurs later in a pilot's career than it
might otherwise. But by definition there's no safe way to test how a pilot
will handle genuine danger.

~~~
logifail
>> Hauling back on the stick and just keeping it there would never have been
the right decision under any circumstances.

From the BEA report [0]:

"The horizontal bar then indicated a slight nose-up order compared with the
aeroplane symbol."

and

"Nevertheless, the PF was also confronted with the stall warning, which
conflicted with his impression of an overspeed. The transient activations of
the warning after the autopilot disconnection may have caused the crew to
doubt its credibility.

Furthermore, _the fact that the flight director was advising a nose-up
attitude_ may have confirmed the PF’s belief that the stall warning was not
relevant"

(my highlight)

[0]
[https://www.bea.aero/docspa/2009/f-cp090601.en/pdf/f-cp09060...](https://www.bea.aero/docspa/2009/f-cp090601.en/pdf/f-cp090601.en.pdf)

------
vesinisa
Should add (1983).

------
v8engine
Is there some LaTeX library to make your PDF look like it's been through the
printer/scanner at least twice?

On a serious note, why do academics do this? It doesn't help with

~~~
detaro
Is it really that surprising that they don't have a crisp source PDF for a
paper from 1983?

~~~
v8engine
My mistake. Didn't notice that.

