
Do You Trust This Computer? [video] - armansu
http://doyoutrustthiscomputer.org/watch
======
transpute
Regulation comes up later in the film. Ross Anderson, professor at Cambridge
University, recently wrote:
[https://m.cacm.acm.org/magazines/2018/3/225467-making-securi...](https://m.cacm.acm.org/magazines/2018/3/225467-making-security-sustainable/fulltext)

 _"Once software becomes pervasive in devices that surround us, that are
online, and that can kill us, the software industry will have to come of age.
As security becomes ever more about safety rather than just privacy, we will
have sharper policy debates about surveillance, competition, and consumer
protection. The notion that software engineers are not responsible for things
that go wrong will be put to rest for good, and we will have to work out how
to develop and maintain code that will go on working dependably for decades in
environments that change and evolve."_

~~~
terminado
The Ivory Tower has a way of phrasing concepts such that they are framed by
finality and totality. The mentality comes with having made it through the
admissions process and passed your final exams with a good grade point
average, conferring some lofty degree of distinction into one's possession.

So, to look at the words:

    
    
      The notion that software engineers are not 
      responsible for things that go wrong will be 
      put to rest for good.
    

I'd have to say that this sort of high-minded platonic concept needs some
revision.

    
    
      The notion that *some* software engineers *cannot*
      be found as responsible (in part or in whole)
      for *some* things that go wrong will be 
      put to rest in *some* situations.
    

There needs to be a degree of responsibility ascribed to some classes of
systems development.

Meanwhile, there is very obviously a line to be drawn between the programmer
that programs their VCR clock to time a recording, the programmer that
programmed the VCR as a consumer-grade product intended for purchase by
unlicensed individuals, the TV network that broadcast the television show at
the time the individual programmed their VCR to record 60 minutes of broadcast
on a given channel, and the programmer who locked me out of the firmware on my
smart phone.

~~~
goalieca
> software engineers

I've had the idea for a while that most of us practice software development
rather than software engineering. I have a degree in computer engineering, but
I consider myself a software developer now rather than a software engineer. The
reason: I don't practice engineering in the legal and professional sense.

In engineering school we learn about engineering as a formal process and
professional responsibility. Both of these things are largely absent in most
shops now. I get that not all projects need to be professionally engineered
with all the costs and timelines associated with it. I think this is why agile
came along. Sometimes it's just good enough to hack something together and
demo it until a manager says it's time to release.

But there are many other projects which are extremely important to society and
should follow more traditional engineering practices. There shall be external
and internal engineers who must formally approve any product before release.
There shall be specific and testable formal requirements. There shall be a
formal design and documentation for engineers to review and people to develop
from. etc. etc.

~~~
terminado
There's no legal framework to distinguish devices that matter from devices
that don't.

There's no clear demarcation between devising a convenient contraption, versus
implementing an inadvisable hack that leads to a hazardous outcome, especially
within the scope of web based systems, since no portion of the internet is to
be regarded as reliable life-saving infrastructure.

I think the mistake is to trust packet-switched networks and peer-oriented
protocols as reliable systems at all.

If you cannot control the whole system, end-to-end, and any unwitting peer can
over-consume bandwidth (jamming traffic and communication with interference),
effectively cutting you off from a necessity, why would you bet your life on
the availability of that system?

~~~
transpute
Work is underway to support Time-Sensitive Networking in modern operating
systems: [https://schd.ws/hosted_files/elciotna18/b6/ELC-2018-USA-TSNo...](https://schd.ws/hosted_files/elciotna18/b6/ELC-2018-USA-TSNonLinux.pdf)

------
bdefore
I enjoy the... irony? of visiting the site with a set of ad block and privacy
related extensions and seeing a set of 'Sorry' messages that I can't see the
trailer because of my privacy settings.

[https://imgur.com/a/4IEsx](https://imgur.com/a/4IEsx)

~~~
qntty
direct link: [https://vimeo.com/263108265](https://vimeo.com/263108265)

------
cryodesign
I was hoping for a bit more in-depth material - allowing the experts to
explore their topics a bit more and perhaps talking about potential
solutions. Where is the call to action for the viewer? What now?

It's good for non-technical folks to watch, but nothing really new since the
'Humans need not apply' 15 min documentary [0]

[0]
[https://www.youtube.com/watch?v=7Pq-S557XQU](https://www.youtube.com/watch?v=7Pq-S557XQU)
(2014)

Edit: added link to humans need not apply

------
neals
I watched the first 20 minutes or so. What I really miss here is some
narration. A documentary that is tied together by quotes from interviews and
flashy stock footage is hard for me to follow.

~~~
jsharf
This is exactly how I felt. The whole thing is like a stream of consciousness
experience without any central story or narration. I couldn't take it. It's
like one big infomercial or something

~~~
syntaxgoonoo
I agree. 10 minutes was all I could bear. It's just a big mix of sound bites.

~~~
jochung
I find it eerie that a documentary that isn't telling people what to feel or
think or do is criticized for being incomprehensible and boring.

Maybe we're already far more doomed than people realize.

~~~
carlosdp
A narrator doesn't tell you "what to feel or think", they piece together what
you're looking at and go deeper into concepts, whereas this quote after quote
stream doesn't go very deep into anything.

In fact, I'd say it _is_ currently telling you what to feel and nothing more,
because most of the quotes in the first 10 minutes are derivations of
"technology is scary, you should be scared".

~~~
creep
This doc definitely gave me the heebie jeebies. But I don't think the point is
necessarily to go deeper into these problems. Most will watch this and finally
gain the intuitive sense of how intertwined intelligent machines are in our
lives. It isn't just smartphones and laptops, but everything from the military
to health, and in between. It gives a good light overview of what
professionals are thinking. Most people are not afraid of AI, but I think we
should be, to the point that we start making changes to how we develop it.

------
substandard
Half an hour in I'm left thinking: "this is a great trailer, but where's the
documentary?"

------
dredmorbius
Very strongly recommended. The interviewed subjects are largely experts in AI,
and many are concerned.

The film is likely a bit long to trigger much by way of discussion here,
though that's not always bad.

Word is that free play / download is this weekend only. Grab a copy via
yt-download if you can't watch immediately.

~~~
gooseus
That's weird, because I just got to the part in Pinker's _Enlightenment Now_
where he says the large majority of experts in both artificial and human
intelligence are not concerned.

Who is misrepresenting the expert consensus? Or are they both misrepresenting
the fact that there is a consensus?

~~~
ollin
"""Things AI researchers agree on:

- that documentary was pretty unhelpful

- Terminator images are usually inappropriate

- AlphaGo Zero (if not other Alpha•s) was pretty cool"""

([https://twitter.com/Miles_Brundage/status/983063456424308736](https://twitter.com/Miles_Brundage/status/983063456424308736))

I don't have any survey results to point to, but my impression from following
AI researchers from industry/academia is:

* Modern methods are many leaps of understanding behind anything resembling AGI, so any concern about research groups developing a sentient computer program behind closed doors with no warning is probably misplaced.

* AI/ML causing large-scale unemployment will be a serious issue eventually, but it's difficult to make a strong case that it's happening right now.

* The ability to monitor and manipulate individuals using ML/AI is dangerous, doesn't depend on particularly advanced technology, and is already being used by corporations and governments right now. It's a lot easier to get the public worried about terminator-style robots than about (what appear to be) simple advances in advertising or law enforcement.

* There's a strong incentive for those selling "AI technology" to oversell its ability to generalize/improve automatically. To quote Elon (of all people), "It's a mistake to think that technology automatically improves. It does not automatically improve. It only improves if a lot of people work hard to make it better." This applies to "deep neural networks" as much as anything else.

------
eugf_
Honestly, I personally don't believe that regulations would solve or diminish
the potential issues related to AI. We will be constantly challenged by
people/machines trying to dominate others, and for that reason, I don't see
regulations being the solution, but education instead. I think that people
should better understand the trade-offs that AI can bring to our lives and act
based on that. Therefore, democratizing AI and educating people about it
should be a good starting point for this problem.

Also, I tend to agree with Mark Cuban[1] about the importance of a philosophy
degree in the near future. There will be so many issues to be assessed that
such a degree would bring much value to society.

[1] [https://www.cnbc.com/2018/02/20/mark-cuban-philosophy-degree...](https://www.cnbc.com/2018/02/20/mark-cuban-philosophy-degree-will-be-worth-more-than-computer-science.html)

------
greggman
I don't really get the argument for regulation. This is not nuclear material
(something relatively easy to control). This is computers, something most
8-year-olds in the western world, at least, have access to. If you regulate it
in the USA, or the USA plus Europe, will it be regulated in Russia? China? Can
you even regulate it in the West if you want to, without confiscating all
computers?

I'm not saying we shouldn't try to make friendly AI (One of Musk's
initiatives), rather I'm just saying I don't see how it's possible to remotely
regulate this.

------
ptr_void
It looks good, with editing, graphics, stock footage and such, but it's not
really for the HN demographic. Some of the AI commentary is also very
exaggerated in places.

------
TheWoodsy
So I bailed after the first X minutes. Why can't we just unplug the power
and/or the network cable and move on from this FUD?

~~~
goshx
Finish watching and you might find out.

~~~
ribs
Could you tell us?

------
iandanforth
Don't waste your time. It's fear mongering. This would be fine if it had any
suggestions at all about how to avoid the horrors it imagines, but its
argument boils down to completely unspecified 'regulation.'

------
cortexio
It's just a sensationalized docu. Take it with a grain of salt. I enjoyed it
though.

