I've seen so many ridiculous "hacking" sequences in films that I really wasn't expecting this series to go into as much realistic detail as it does. Also, most hacking stuff in films is at its worst when it comes to the dialogue, which somehow always has to have something to do with shutting down a "mainframe".
But this series does a great job both from a tech and a dialogue perspective. There's even a scene where the main character and the CTO of another corp (trying not to spoil anything) have a conversation about GNOME vs. KDE.
What I love about that movie is that Google tried to use it to re-recruit people from my internship (I mean, we all interned at Google together, so in reality it was more of a reunion, but still...)
I particularly hate unrealism which is hopelessly unnecessary, such as showing people using ridiculously bad faux search engines and email clients, or results displays that rub your face in the result: SUSPECT IDENTIFIED with one result on the screen, vs. a list of results the person has to interpret.
Gnome vs. KDE is probably the least technical thing they've gotten right in the show. There were a couple of minor howlers in the most recent episode, unfortunately (up until then, no significant lapses I've noticed -- staggering for a TV show).
As the Gnome-KDE bit unfolded, my first reaction was, "Guffaw! What a silly detail! That's not proving anything. And no one would expect to bond with a stranger over that."
But that is not a flaw or limitation in the writing -- it's helping to characterize Tyrell as a bit of a douche. This goes beyond the typical 'ssh' or 'wget' screengrab (which only provides some mechanistic plausibility in achieving a plot point). The Gnome-KDE bit defines character. It's brilliant.
>or results displays that rub your face in the result: SUSPECT IDENTIFIED with one result on the screen, vs. a list of results the person has to interpret.
That's the entire point, though - telling the viewer who the villain is quickly (and, most importantly, cost effectively) to get to the part where the hero busts down their door. Designing the UIs realistically and eating up valuable screen time with analysts poring over the data would be a waste of money in most cases.
Also, realistic design increases the chances of someone actually suing for copyright infringement or something.
Realistic design doesn't cost more; using actual sites may get you fired, but altering them sufficiently is cheaper than building whole b.s. displays from scratch.
I think "tell the viewer who the villain is quickly" is correct in terms of the director/writer's intention, but in practice the operative word is "badly".
Realistic web design means hiring someone who knows how to design websites, or redesign them, and budgeting that. It might also mean translating the results into whatever digital effects workflow they use, because I doubt there are any actual applications being run or sites being shown.
Also bear in mind the relationship between camera distance and screen resolution. The design requirements for a UI meant to be used by someone physically sitting inches from the screen are necessarily different from the composition requirements of a scene for television or film, which may place the viewer halfway across the room in terms of perceived distance. Text that would be normal sized on the web might not even be legible when the viewer sees it, much less be capable of leading the viewer's attention.
FWIW I mostly agree with you, it's absurd for anyone who's technically competent, but still not as simple a matter as getting it realistic versus getting it wrong, rather of not boring the viewer with tedium and visual clutter.
Just to add to your comments: technically, shooting screens close enough for more realistic computer use was actually a problem ten years ago, but today any decent camera can do it.
The showrunners don't want to get in trouble when armies of fans type in stuff from screencaps. It's like how cars in movies have fake license plate numbers and telephone numbers all start with 555.
Using a real phone number, license plate number, IP address, or physical address in a show/movie/song has the potential to provoke throngs of fans into malicious, or merely inconsiderate, actions -- prank calls, ping bombs, or simply visiting and not cleaning up after yourself (which is what led to the Goonies house closing: http://www.independent.co.uk/arts-entertainment/films/news/g... )
There were a few subtle slips in the show but overall they're doing a good job at keeping it as real as possible (within television's scope). Someone, for once, thought about hiring consultants I suppose.
While the technical stuff mostly manages to avoid degrading into gibberish, I find the philosophizing about "bugs in the system" cringe-worthy. The show would have the layman believe that bugs are some kind of metaphysical agents of chaos that crack the facade of reality (or something), as opposed to reality where most bugs are caused by sloppy copy-pasting while reading HN with one eye.
In a sense, this is true. The view that all bugs are static, determinable, and caused by a lack of care and attention is old-fashioned and demonstrably wrong.
In mature code bases (e.g. the core Linux kernel), (almost) all the low-hanging bugs have been picked clean. There remain only faults triggered by untested configurations, unconsidered usages, and ordering non-determinism. The last one is particularly chaotic -- you can have a system which stays up for years, but falls over due to an extremely unlikely combination of events.
Think about it this way: every statically compiled program with an unbounded loop, in which there are calls to malloc() with runtime-determined sizes and differently ordered but balanced calls to free() at later times, may crash. This is possible despite there being no leaks, and despite the worst-case total size fitting in memory. The issue is that the heap, under some combinations of allocation sizes and orderings of malloc vs. free, will fragment and effectively "leak", eventually exhausting resources. It is generally not decidable whether this will happen to a given program. They all survive at the whim of probability, although the odds are usually very much in their favor.
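The fragmentation argument can be made concrete with a toy simulation -- a hypothetical first-fit allocator over a tiny fixed heap, nothing like a real malloc, but enough to show balanced alloc/free still failing a request:

```python
# Toy first-fit allocator: external fragmentation demo.
# All sizes and the heap layout here are illustrative assumptions.

HEAP_SIZE = 10

class ToyHeap:
    def __init__(self, size):
        self.size = size
        self.allocs = {}  # offset -> length of live blocks

    def _free_chunks(self):
        # Gaps between live allocations (adjacent free space coalesces
        # automatically in this representation).
        chunks, pos = [], 0
        for off in sorted(self.allocs):
            if off > pos:
                chunks.append((pos, off - pos))
            pos = off + self.allocs[off]
        if pos < self.size:
            chunks.append((pos, self.size - pos))
        return chunks

    def malloc(self, n):
        for off, length in self._free_chunks():
            if length >= n:
                self.allocs[off] = n
                return off
        return None  # "out of memory", possibly from fragmentation alone

    def free(self, off):
        del self.allocs[off]

h = ToyHeap(HEAP_SIZE)
blocks = [h.malloc(2) for _ in range(5)]  # fill the heap: offsets 0,2,4,6,8
for b in blocks[::2]:                     # free every other block: 6 units free
    h.free(b)
print(h.malloc(4))  # None: 6 units are free, but no 4-unit chunk is contiguous
```

No block leaked and total free space exceeds the request, yet the allocation fails -- the "leak" is purely in the arrangement of the free space.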
And yet the meat-and-potatoes of guys like Elliot and his adversaries are things like Heartbleed or goto fail (a missing bounds check; improper braces plus copy-paste). The bugs you speak of are IMO mostly unexploitable, don't make the news, and are generally about as existentially terrifying as the possibility of being hit by lightning.
If I was searching for some deeper meaning in bugs, I'd rather think about how the most elaborate and secure of digital structures are still vulnerable to a moment of distraction or laziness in just one person, and how software seems to be resistant to the sorts of engineering best-practices that make aerospace as safe and consistent as it is.
> (almost) all the low hanging bugs have been picked clean
This has never ever been true. Every new framework has to (hopefully) design mass-assignment protection. Every new language needs to handle file access. Every new authentication library has to rediscover reams of textbooks/whitepapers.
Actually, I'm pretty sure a lot of security professionals don't use Kali Linux because 1) they only need a subset of the pentest tools, 2) many companies' internal monitoring systems trigger alerts if they detect someone is using Kali Linux/BT (via OS fingerprinting).
This is only my anecdotal experience though - thoughts?
I think some of the entries could serve as inspiration. Think about the effect Minority Report's UX has had. It was an idea that wasn't new, but seeing it (I think well done) in a movie gives it a more concrete goal for a lot of people (tech) and a reference point for other people (non-tech).
Just as an FYI, the designer of the Minority Report UI/UX created a company (http://www.oblong.com) and actually built the system. Boeing (my former company) actually has a patent using the system to control a swarm of UAVs.
That itself ain't bad. Star Trek: The Next Generation pretty much introduced the world to the concept of touchscreens, back in the time when they pretty much didn't exist.
Also, somehow the "internally incoherent" movie UIs still feel an order of magnitude better than what we use in the real world, especially on mobile, which is mostly a mess of crappy and incoherent design.
Actually the TNG interfaces were very coherent and based on technology that could be replicated at that time. The designer of LCARS, Michael Okuda, needed to make a set of interfaces that could display readable information, be used for over a decade worth of shows, and be believably interacted with in a variety of modalities. He likely can be credited with making the most inspirationally accurate SciFi UI ever.
Star Trek interfaces set the gold standard for realistic UI, while Minority Report was not a UI anyone would ever want nor even should aspire to create.
Footnote: Oddly enough, the consultant on the interface for Minority Report was Jaron Lanier, whom I respect immensely for his technology theory -- just not his designs.
As far as I know, we didn't have many touchscreens at the time TNG was shot; while the interface was indeed believable, it was also inspiring. I'm very happy to see touchscreens on the newest Dragon, and I would really love it if they were running LCARS :).
I wholeheartedly agree with your evaluation of LCARS. That's why I still keep sketching out LCARS-equivalents of UIs for existing devices (it's actually quite hard to do right; you can't go there half-way - LCARS mixed with non-LCARS elements looks like crap) and want all my home automation to be run by one :).
Another thing Okuda was a master at was logos and insignia. They set a quality standard that is very rarely met in the real world.
Was any technical minded person really influenced by that, other than cringing in front of the TV?
Anyone who has ever worked with any kind of media editor knows that "waving your arms in the air" is about the least desirable imaginable interface for such.
Minority Report is exactly what I think of when I'm trying to guess what bizarre combinations of gestures and swipes and pinches and shakes I need to perform to get this bit of smartphone software to do whatever trivial thing it's supposed to.
The only difference is that John Anderton seems to be able to get his software to do the stuff he wants it to do.
I totally disliked the series... It's poorly written and disconnected, and the lead character isn't convincing. Even the storyline is totally poor, IMHO.
So the only good thing is the realistic hacking scenes...
I feel sorry for the lead character and he obviously has problems, but the wannabe CTO of EvilCorp reminds me more of Dexter. This is the first TV show I've enjoyed enough to schedule for it in a long time but that character (Tyrell Wellick[1]) is a bit too dark for comfort.
By the way, what language do they speak? I get some "hits" in my rudimentary Norwegian language skills. So either a Norwegian dialect or one of the Norse family...
Tyrell speaks Swedish, and his wife speaks Danish.
It's no surprise that you get some "hits" (even with "rudimentary Norwegian language skills"), as the Scandinavian languages are so similar that we generally understand each other – Norwegian speakers even more so, according to Wikipedia [0].
I was really confused by this headline -- having not seen the show, but knowing an indie video game by the same name (http://moonpod.com/English/about_mr.robot.php ) that used the word "hacking" to describe RPG-ish battles set inside the circuitry of other robots.
It is refreshing to see on-screen depictions of hacking that use legitimate tools in ways they're actually used in, instead of "GUI interface using visual basic to track the killers IP address" ( https://www.youtube.com/watch?v=hkDD03yeLnU ).
I'd just like to say that Mr. Robot is not a show that is enjoyable only for the HN-like crowd. The cinematography is fantastic, the plot is engaging, the aspect of Elliot being an unreliable narrator is unique, and the realistic hacking is just a bonus.
Just started watching this show tonight - so far I find myself wanting to like it, but man, the socialist bullshit is laid on so heavy it's ridiculous. :-(
Still, at least it is a show about hackers that shows "computer stuff" in a semi-realistic light. No crazy 3D "inside the computer" visuals like in Hackers or most other movies and television programs about this stuff.
The Gnome vs. KDE bit was a nice touch. As soon as that scene was over, I was already thinking "I'm probably going to like this show". Anyway, I'm impressed enough to keep watching and see how it all develops from here.
One thing that bugged me: why is he using DVD-Rs? I've (almost) not used such ancient technology in years. Many modern computers don't even have DVD drives. And then with hand-written labels on them. WTF? Those can be found and read(!) by laymen. No way! He should just have a few (encrypted) external HDDs or use cloud storage or something.
Otherwise: great show getting so much stuff right without boring the hell out of people who don't recognise what's on the screens. That's special. I love it.
Actually, he uses software called DeepSound to hide (and encrypt) the data inside audio files (music); the hand-written labels are the album/artist/band names of the audio files.
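For the curious: that's audio steganography. A minimal toy sketch of the general least-significant-bit idea -- not DeepSound's actual format, and operating on raw bytes standing in for PCM samples -- looks like this:

```python
# LSB steganography sketch: hide secret bytes in the lowest bit of each
# "sample" byte. A real tool works on actual audio frames and adds
# encryption; this is only the core hiding trick.

def hide(samples: bytes, secret: bytes) -> bytes:
    bits = [(byte >> i) & 1 for byte in secret for i in range(8)]
    assert len(bits) <= len(samples), "cover too small for the secret"
    out = bytearray(samples)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # overwrite the sample's lowest bit
    return bytes(out)

def reveal(samples: bytes, n: int) -> bytes:
    bits = [s & 1 for s in samples[: n * 8]]
    return bytes(
        sum(bits[b * 8 + i] << i for i in range(8)) for b in range(n)
    )

cover = bytes(range(256)) * 4      # stand-in for audio sample data
stego = hide(cover, b"fsociety")
print(reveal(stego, 8))            # b'fsociety'
```

Flipping only the lowest bit of each sample changes the audio inaudibly, which is why a hand-labeled "album" can pass a casual inspection.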
In that whole GNOME vs. KDE scene, what's bugging me is that Elliot doesn't seem to be using GNOME. If he's really using it, GNOME was modified quite a bit.
I know this is only a show -- a good show -- but in episode 3 Elliot talks about bugs. Could he be right, in some cases?
Elliot: A bug is never just a mistake. It represents something bigger. An error of thinking.
http://transcripts.foreverdreaming.org/viewtopic.php?f=303&t...
The screen stuff has been legit but two things were funny:
1. Elliot, an elite hacker, has an Excel 2014 book in his apartment. Why does he need to use Excel so much? Why does he need Excel at all if he's using Linux? And why would he need a user guide to figure it out?
2. When Elliot has to fix a hack on E Corp at Allsafe, his boss gives him a folder of paper logs and says "look through these".
1. is probably a mistake by the set designers. Although hypothetically one could argue that (A) he actually needed to reference Microsoft Excel for one of his hacks and kept it around, or (B) it is a disguise: when people enter his apartment and see the MS Excel book cover, they won't think he is a hacker (and who knows what is actually contained inside the cover... could be some secret hacking stuff).
2. Sometimes, if you're extra paranoid about being hacked, you may want to do things on paper as much as possible. Also, were they getting on an airplane? That might explain the need for paper copies, so he'd have something to look over during the silly takeoff and landing procedures. Also, to be honest, paper can be much more usable.
> 1. Elliot, an elite hacker, has an Excel 2014 book in his apartment. Why does he need to use Excel so much? Why does he need Excel at all if he's using Linux? And why would he need a user guide to figure it out?
While I agree that it's more likely the prop team got this when they were tasked with buying some "computer books", I have some similarly useless books on my shelves that I got from relatives when I was younger because they thought "I was into computers".
EDIT: after doing some reading, it looks like the writer/creator is responsible for a lot of this realism; see: http://www.slate.com/blogs/outward/2015/06/24/mr_robot_gay_c.... Here's an interesting Reddit AMA thread from him: https://www.reddit.com/r/IAmA/comments/3bp1zz/i_am_sam_esmai...