This movie is the reason why I became a programmer.
I watched it when I was starting high school and got inspired by it. Then I literally googled "How to become a hacker" and found this incredible page by Eric Raymond [0], which I used as a mentor throughout my high school and college years. I ended up developing the recommended hard skills (learning programming, UNIX, open source culture...) but also the "points of style" (martial arts, science fiction, meditation, music). In fact, when I was trying to search for a hacker community online, I stumbled upon Hacker News and haven't left ever since!
As expected, I didn't become a black hat hacker as in the movie but, to this day, I still believe this movie changed the course of my life.
Same. This film made me believe that it was possible to be cool and find a social group that was all into tech. I grew up in a really rural area, so this revelation of possibility, whether actually practical or not, heavily influenced my trajectory in life and in tech. It's still one of my sentimental favorite movies of all time.
You and I share the same exact journey as to how we got there. I've had random conversations over the years with other developers in my age bracket, and many of them cite Hackers as what piqued their interest in computing. What this movie lacks in authenticity it makes up for with an amazing cast, a good score, and lots of fun.
I can't stand ESR's view on what media, music, or martial arts hackers are supposed to be into. And I say this as a graded alumnus of a Shotokan karate branch. On music, I can shift from psychedelic rock to Tangerine Dream to hard rock to electronic avant-garde jazz.
On sci-fi, of course I like it, but the mystery genre is really fine too, as it's pretty close to debugging a problem, just with real-life issues.
On tech, the Unix philosophy and Emacs are polar opposites. Unix is focused on parallel tools doing the work for you (preferably scripted), where the less code you run, the better; Emacs wants you to reuse Elisp functions everywhere, with an always-loaded philosophy that is not lightweight at all if you have tons of modules in memory. Also, Emacs is single-threaded, so it is very prone to getting I/O-locked, while Unix will spawn background subprocesses like nothing, without affecting your main task at all.
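If you want to see that last contrast concretely, here's a throwaway Node/TypeScript sketch (grep is just standing in for any long-running Unix tool, and the timer for your "main task"; none of this is from ESR, obviously), showing a child process doing the heavy I/O while the parent loop stays responsive:

    import { spawn } from "node:child_process";

    // Hand the heavy I/O to a separate process, Unix-style...
    const child = spawn("grep", ["-r", "TODO", "."]);
    child.stdout.on("data", (chunk) => process.stdout.write(chunk));

    // ...while the "main task" (a timer standing in for an editor UI) keeps ticking.
    const tick = setInterval(() => console.log("main loop still responsive"), 250);
    child.on("close", () => clearInterval(tick));

The same is of course doable from any shell with `&`; the point is just that the main loop never blocks on the child's I/O.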
Good old Neal Stephenson books, but I really like Cryptonomicon and Reamde. I read Cryptonomicon in one day, it was that good, and years later read it again and got more from it. Neal is a great writer, and I love his "wordiness", how he dedicates pages to the experience of eating Cap'n Crunch cereal.
I wrote about Hackers too (http://www.dusted.dk/pages/phlog/2023-12-02.txt); the gist of that rant is that what you see in Hackers is not what's on their screens, but what's in their minds, through a poetic lens.
One of my favorite scenes is when the protagonist is in the airplane looking down at the city, and the buildings warp into computer chips and circuitry. This fits well with the idea that what we're seeing is more about the feeling that the characters are experiencing rather than what's actually on their screen.
Well said. It's amazing how often people dismiss computer visualization in media (< ~2005) as "unrealistic."
That's not the goal!
It's intended to be representative of the process of interacting with a computer, not the interface itself. At a time when most viewers had little or no first-hand experience with a computer.
If you showed a CLI then, maybe 1 in 20 people would understand it, and they'd be bored because they used it every day at work.
That's exactly right! I didn't see Hackers until I was in my mid-30s, but when they had the initial flyover of the city representing tables in a database ... I had a moment where I said "my goodness, this is the closest representation of how I think of data".
It's like when you're reading a book - you don't think about the words, the concepts and people come to life in your brain. I feel like this is the only movie to get that the same thing happens with computers, too.
"I tried to picture clusters of information as they moved through the computer. What did they look like? Ships, motorcycles? Were the circuits like freeways? I kept dreaming of a world I thought I'd never see. And then, one day, I got in..."
Tron Legacy. Such a good film. My kids and I watch it a few times a year. I tried watching the original from 1982 and it just felt so... dated? Perhaps I'm spoiled like most people from all the great CGI and effects.
I always figured it was the closest they could do to the descriptions of hacking from Neuromancer. Especially since the computer was named the "Gibson", I figured the whole thing was a great big nod to it.
That was my interpretation, too, but this article shows me The Plague skateboarding through physical towers, etc. Not sure such scenes can be written off as happening in Joey's mind.
Interview with the guy who made the perspex "buildings"(?) props that represented the central computer database, with a photo of one of them if you scroll down a bit:
Cereal Killer, Lord Nikon, and Phantom Phreak, real names not given.
Disappointing because Cereal Killer's given name was Emmanuel Goldstein, an alias made famous by Eric Corley, editor of 2600: The Hacker Quarterly. Eric consulted on the movie, and it probably would have sucked without him.
That and more of the names are given in the film, so I dunno if this person even watched it. Lord Nikon aka Paul Cook. Phantom Phreak aka Ramon Sanchez! And the aforementioned nod to 2600 for Cereal.
And “Emmanuel Goldstein” was the name of a character in the novel 1984. He was the widely despised “enemy of the state” cooked up by the government to justify their autocracy.
Get me arrest warrants on Kate Libby, alias Acid Burn, Emmanuel Goldstein, alias Cereal Killer, Dade Murphy, alias Crash Override, also known as Zero Cool, and Paul Cook, alias Lord Nikon. We pick them up tomorrow morning at nine o'clock.
This is a nice website, with an interesting purpose. Though I suppose reviewing sci-fi movies' notions of interfaces will soon clash with the dryness of reality.
The 90's were wild in that sense, you could imagine that the internet superhighway would be a superhighway you could literally drive on with your Avatar, and countless movies and tv-series presented things thus.
The noughties were way more grounded in reality, even the Matrix had Trinity hacking into a server using an OpenSSH exploit on a black and white terminal.
> The 90's were wild in that sense, you could imagine that the internet superhighway would be a superhighway you could literally drive on with your Avatar, and countless movies and tv-series presented things thus.
> The noughties were way more grounded in reality, even the Matrix had Trinity hacking into a server using an OpenSSH exploit on a black and white terminal.
That's a wild contrast to try to draw, since the actual direct experience of the network in the Matrix (which is the focus of the film) was an immersive virtual reality of exactly the type you are trying to contrast the portrayal in the Matrix with, and the "hacking into a server with an OpenSSH exploit" occurred as a simulation within that virtual reality.
True enough, though if Trinity could jump between skyscrapers and run against a wall before disarming 5 machine-gun-toting henchmen, I imagine she could have enabled transparency in her Xterm in Enlightenment.
But in more general terms, the representation of computer interfaces, with avatars walking rigidly in 3D pastel-coloured surroundings (as presented in a variety of examples, my favorite being the corporate network in the TV series Profit), sort of fizzled out post-2000, when everybody ended up owning a desktop or laptop and realised that the internet was, in fact, mostly text, and that 3D environments were not so easy to navigate (remember Second Life).
That is, of course, if your online life did not involve being an elf in World of Warcraft, or something.
~1999, you would have had Pentium III @ ~500 MHz max in a laptop, likely less. Pentium M wouldn't release for several more years.
So transparency was arguably still something a true hacker wouldn't waste cycles on.
To your general point, it felt like there was a shift around 2000, when computers needed to be "serious business" and the whimsy of the 80s and 90s was scrubbed out of software.
Honestly, I think we all would have been better off if we'd turned the web into something more approachable for common people (especially if it inspired them to be creators).
Instead, we built a brutalist efficient system where most expression is limited to setting your background.
In the 90's even hackers loved some bling-bling. Maybe not with E16, but some WMaker with a nice backdrop and a transparent UXTerm was really nice and much faster than the emerging KDE in the late 90's.
Also, heck, it was cool and pretty to have some city-at-night backdrop while you spent day and night in URxvt with links or lynx, reading media and posting to fora.
Yup. Circa 1997, I switched from AfterStep to (then-new) Enlightenment briefly before settling on WindowMaker as a good balance between bling and performance.
My PC at the time was a hand-me-down white box AMD 386 (40 MHz) running FreeBSD 2.x.
I think the change you mentioned (2000) has to do with the maturity of the tech, as well as the maturity of acceptance of this tech as part of everyday life.
As far as the state of the web today, you can thank commercial entities. Yes, they did/do contribute to the web's existence, but at the same time, they make it much worse.
"~1999, you would have had Pentium III @ ~500 MHz max in a laptop, likely less. Pentium M wouldn't release for several more years."
and, boy, did I waste cycles enabling every bit of eye candy possible on Linux/BSD on the only 150 MHz Pentium Pro with 32 MB of RAM I could afford in 2001-2002...
"Instead, we built a brutalist efficient system where most expression is limited to setting your background."
Common people being creative was MySpace, remember the eyesore?
Just being sarcastic; I think the "ugly" web of early HTML was actually wonderful, and we should encourage people to go back to it, but let's not delude ourselves too much, FaceStagram will always win the appeal of the masses.
Minority Report came out in 2002, and I can tell you, actually using a computer interface like that for an extended period of time would be pretty exhausting.
What it did get right, though, was swiping and pinch/zoom, five years before the iPhone was released.
Those gestures were completely unknown in commercial UIs of the time; in fact you were lucky to find a touchscreen that didn't take three forceful stabs of your finger before it registered.
I remember seeing this movie when it came out and, while I enjoyed it, I chuckled at how silly the UI graphics looked. However, after rewatching it years later I enjoyed it even more, because I realized that, yes, the UI graphics weren't realistic, but the movie did a great job of capturing what tech felt like at the time.
I just rewatched it again recently and find it a thoroughly enjoyable film.
It doesn't look like this site has done a review of Sneakers yet but I recommend they do. The interfaces are much more realistic for the time (even if the cryptography mathematics do suffer a bit)
Sneakers is one of my favourite hacker movies. I have always loved the scene where they trace his travel path based on sound. And it was the first time I saw how an interface for blind people worked.
I liked the movie and it reminded me of when I was a kid and my dad bought me a Commodore 64 and 5 1/4 floppy drive and a 1-year subscription to Creative Computing magazine. I spent so many hours learning programming and then wasted it all by going into the military because my parents couldn't afford to send me to university. Rural community and only my dad worked since my mother was injured.
When I got out and returned back up north, I got a "job", got married, and had a neighbor who was a Unix programmer. His home office was decked out with FreeBSD, NetBSD, and OpenBSD servers, BSD magazines and literature everywhere. He talked me into using my GI Bill to go get a degree in CS, which is what I did. My unofficial hero since has been Theo de Raadt. Still using a *nix OS, still in IT.
A bit embarrassed to say I tried to implement the 3D file browser from this movie when I was first learning C, and failed. I did get the 3D boxes to show up and display files/folders, and it was cool (to me) to move them around in 3D, but doing even slightly complex stuff required "polynomial math" which was beyond my skill level.
It still saddens me irrationally to see the state of UX today. It isn't cool, and I have many visions of doing all this stuff with WebGL and more (not 3D boxes, but futuristic yet practical UI). Modern UX feels like art majors designed it by committee and MBAs+lawyers were the target audience. I no longer even see anyone in tech thinking out of the box with radically new windowing systems and alternatives to hypertext and browsers.
I'm more saddened by the state of hardware. It's amazing and very practical and basically perfect for everyday use... But it was a lot cooler back then.
WebGL file management doesn't seem too hard to implement; I'm guessing it would be a weekend project to make a demo, now that one doesn't actually need math to draw 3D stuff.
I'm assuming you probably still need some math to make everything look actually good, but you could probably get something working just with assorted random hackery.
Most of the libraries seem to just be "Put X object on the screen at Y coordinates and rotation and size with Z texture and material", it's not like you're dealing with how rendering actually works by hand and figuring out what color each pixel should be.
But I've never done much of anything in 3D, aside from CAD work so idk.
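To make that concrete: a rough sketch of the "one box per file, placed on a grid" idea, assuming three.js as the library (file names are hard-coded here, and picking/navigation are left out), could be as little as:

    import * as THREE from "three";

    // Scene, camera, renderer boilerplate.
    const scene = new THREE.Scene();
    const camera = new THREE.PerspectiveCamera(
      60, window.innerWidth / window.innerHeight, 0.1, 100);
    camera.position.z = 8;
    const renderer = new THREE.WebGLRenderer();
    renderer.setSize(window.innerWidth, window.innerHeight);
    document.body.appendChild(renderer.domElement);

    // One box per "file", laid out on a simple grid -- no hand-rolled math needed.
    const files = ["readme.txt", "notes", "todo.md", "trash"]; // placeholder names
    files.forEach((name, i) => {
      const box = new THREE.Mesh(
        new THREE.BoxGeometry(1, 1, 1),
        new THREE.MeshNormalMaterial());
      box.position.set((i % 4) * 2 - 3, -Math.floor(i / 4) * 2, 0);
      box.userData.fileName = name; // stash the name for click/raycast handling later
      scene.add(box);
    });

    // Slow spin so the 3D layout is visible.
    function animate() {
      requestAnimationFrame(animate);
      scene.rotation.y += 0.005;
      renderer.render(scene, camera);
    }
    animate();

Making it actually look good (lighting, labels, smooth camera moves) is where the real work, and the math, would creep back in.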
Not exactly related, but I'm shocked by the number of people, even older people working in IT, who mock the girl in Jurassic Park for saying "it's a Unix system, I know this", when in fact what they showed was a Unix system, and the 3D interface they showed was real software SGI shipped with IRIX.
Ah but this was pre-systemd so you would have been screwing around trying to find whatever script was being used to lock the doors after close of business.
Well, it's IRIX, so you'd just create a little Bourne shell script in /etc/init.d, with symlinks from e.g. /etc/rc2.d to start/kill it on the respective run levels.
This was a pretty critical system though so why not just place this particular program inside /etc/inittab ?
Well, we are talking about managing critical infrastructure on a theme park riddled with dinosaurs. Potentially the definition of working under pressure.
I wouldn't think even a teenage Kevin Mitnick on Mountain Dew could bypass the terminal security (if any) himself, prompt his way to the safety mechanisms, and override their behavior, all while being assailed by a velociraptor.
Pretty simple explanation: the girl in the movie is more talented than teenage Kevin Mitnick. If that's a bridge too far for you, I'd recommend not watching the rest of the movie, because it's about people cloning dinosaurs using frog DNA.
Because it's unlikely that the girl would have had access to an IRIX box, and while it was real software, it was dog-slow on a low-end SGI without a graphics card, so she had to be using some pretty expensive hardware.
As a filesystem browser it was not useful. Someone with Unix system experience would prefer a 2D browser, which IRIX also had.
We only used the 3D browser for our demo setup for visitors.
> Because it's unlikely that the girl would have had access to an IRIX box
She's related to Hammond, who owns an island. I'm sure he could get her an old box and some manuals to play with, if not outright a state of the art system.
> As a filesystem browser it was not useful. Someone with Unix system experience would prefer a 2D browser, which IRIX also had.
But isn't that exactly the sort of thing a clever kid with access to fancy stuff would mess around with?
> We only used the 3D browser for our demo setup for visitors.
Which would perfectly explain why it popped up on the production systems. To make it look fancier for Hammond/any investors/the people visiting in the movie.
Using the 3D browser would be akin to saying "That's Linux" upon seeing a Compiz cube a few years ago. I mean, yes, it would be technically correct, but it's not a marker of deeply knowing Linux and its internals.
I don't remember her ever saying she actually has experience with the browser, just that she knows it. She could have just used UNIX in general and only read a lot about the file browser, or perhaps briefly tried the then incredibly slow version.
It's not even too much of a stretch to imagine she had never seen the file browser, but was able to figure it out, like one can with modern GUI apps.
She didn't need to know IRIX if she knew some other flavor of Unix, she was probably savvy enough to use man and access documentation if needed.
The point is, though, that most mock the scene as though the software shown was entirely fictional and made up, when it was in fact real and in fact still UNIX.
I don't know why others mock it, but as an older person working in IT, I can explain why I mock it even when knowing it was real software SGI shipped with IRIX.
I hadn't realized that was your reason for mocking the scene, I had thought it was just being kind of pedantic and making an observation about the scene.
Yeah, that's pretty bad. I don't know what I'd say the worst movie scene is; most of my bad examples come from TV. That CSI Visual Basic example is pretty infamous.
She was the granddaughter of a billionaire so it's not unreasonable to imagine she may indeed have had access to high end hardware. "No expense spared".
I'm explaining why someone in the early 1990s, knowledgeable about Unix, IRIX, and fsn, would mock the adults who created that scene about a fictional 12 year old.
If the makers of The Matrix get credit for its realistic-looking use of nmap and a fictional "sshnuke", then Jurassic Park should get jeers.
We don't credit Trinity for that scene, we credit the creators of that scene.
And you are being hard on me, and it's kind of ridiculous.
Knowing Unix is not the same as knowing how to use a program which only exists on one version of Unix, which was not distributed with the OS, and which was less helpful at file system exploration than both the 2D file manager [1] and 1970s-based Unix shell tools [2].
You specifically called out "older people even working in IT", which includes me. Just because you want to jeer at people who don't know what you know doesn't mean there aren't other reasons to jeer at the same scene.
[2] On a related note, The Great CHI ’97 Browse-Off was a non-rigorous head-to-head contest between different hierarchical browsers. The Hyperbolic Browser was the clear winner, with Windows Explorer coming in second, and the DOS command-line doing pretty well until it came to comparison questions like "Which planet is also the name of a car brand?" where both categories in the ontology needed to be compared.
> And you are being hard on me, and it's kind of ridiculous.
No, lol, I'm not. I'm just disagreeing with you, and pointing out that IMO your point doesn't have much merit. I don't think that's a ridiculous response to what you're claiming at all.
> Knowing Unix is not the same as knowing how to use a program which only exists on one version of Unix
She didn't say she knew a program which exists on one version of Unix, she just said she recognized the type of system. That's it. That's the claim she was making.
It's pretty similar to the hypothetical situation of a kid finding a mac and saying "This is a Macintosh, I know this" and using the finder to browse and look for a program to run. Same thing, with the only difference being Unix refers to a variety of operating systems not just one particular OS.
And you're making a big deal about how she probably wouldn't have known IRIX and all this, but that doesn't really make sense and it's extremely nitpicky. Others have explained why.
> You specifically called out "older people even working in IT", which includes me. Just because you want to jeer at people who don't know what you know doesn't mean there aren't other reasons to jeer at the same scene.
As I said though, most people who jeer at the scene do so on the mistaken assumption that the software seen on screen didn't exist.
I've never actually come across someone like yourself who doesn't dispute that, but is just basically being very nitpicky and, IMO, unrealistic.
- the storm evacuates most people, essential staff only
- Nedry deliberately creates the IT emergency situation to lock out the other IT people still on the island
- The book has more details about how the IT system was over budget/rushed/flawed as another example of the hubris of the whole endeavor.
The book has a detail I especially liked: there's an automatic dinosaur-counting system, but it was written such that it stops counting once it finds all of the expected dinos (because the spec said they couldn't reproduce), which delayed them realizing that the dinosaurs were actually mating and were in a bunch of places they weren't supposed to be. Classic example of a bug caused by software working correctly, exactly to the spec.
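Just to illustrate that failure mode, a toy sketch in TypeScript (species and numbers invented, not taken from the book's actual system):

    type Sighting = { species: string };

    // What the spec said should exist on the island (numbers invented).
    const EXPECTED: Record<string, number> = { raptor: 8, procompsognathus: 50 };
    const EXPECTED_TOTAL = Object.values(EXPECTED).reduce((a, b) => a + b, 0);

    // Buggy census: works exactly to spec -- it stops as soon as it has found
    // the expected number of animals, so any extra (bred) dinos are never counted.
    function censusToSpec(sightings: Sighting[]): Record<string, number> {
      const counts: Record<string, number> = {};
      let seen = 0;
      for (const s of sightings) {
        counts[s.species] = (counts[s.species] ?? 0) + 1;
        if (++seen >= EXPECTED_TOTAL) break; // "they can't breed", says the spec
      }
      return counts;
    }

    // The fix: count everything, then compare against the expectation afterwards.
    function censusAll(sightings: Sighting[]): Record<string, number> {
      const counts: Record<string, number> = {};
      for (const s of sightings) counts[s.species] = (counts[s.species] ?? 0) + 1;
      return counts;
    }

The spec-faithful version never complains, because from its point of view nothing is ever wrong.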
What if she were originally baffled by the 3D display, and then saw recognizable paths like /bin and /usr and realized the system was actually familiar?
I grew up in university computer labs (parents were grad students, then professors). All the machines in the labs were SGI Indys/Indigos running IRIX 4.x/5.x. As a 12yo, I used the 3D graphical file browser because it was fun and cool. It was installed on all the machines by default because it came bundled on a demo CD with every machine.
Hmm. Now I'm curious whether you could actually get a 3D interface with all that information dynamically rendered running with reasonable performance on period-accurate personal hardware; it's definitely within the scope of '95 demo scene stuff.
This is the vision of the cybersecurity company I'm working for: to render information systems in a 3D interface. It works fine on mid-range hardware and can render assets (hardware and linked users) in the thousands; for more, you'd have to segment your interface into secondary instances.
It's mainly for SecOps teams though, can't go full Matrix yet on your home network.
Circa 2000 I interviewed for a three-letter-acronym agency that shall remain unnamed, and they were interested in me working on a 3D visualization interface for traffic across a network, which actually sounded pretty fascinating. Sort of like "packet filter plus visualization" after being given physical location information about the computers in question; even without that information, though, it would use something like the graphviz algorithm (or one of its options) to lay out the map on a 2D plane in 3D. Different kinds of traffic were color-coded, you could zoom in and inspect any stream, roll forward or backwards in history, etc.
3D interfaces so far usually mean you navigate a 3D model of some digital space that appears to you behind a 2D screen, accessible with a keyboard and a mouse, which are themselves 3D physical objects in your 3D real world (keyboard, video, mouse).
3D interfaces are quite different without the 2D screen barrier, where either you are inside the experience (VR) or the experience is around you (MR/AR).
While I loved the 3D imagination of the 90s, 3D navigation is not the shortest way to the needed information. Imagine if your file structure were organized like a video game. It's the same reason why transparent touch screens and VR don't really work.
This is really neat! I wonder how they chose to represent the inner structure of these binaries visually - there seem to be some similarities between the .exes and the other types.
Reminds me of pointing the viewport address to a block of code on the Amiga 500, and watching it run.
Thanks! My first questions there were about the choices of the cube dimensions (where the stream folds, etc, what's best to make cyclic patterns stand out, how come we see diagonal artifacts as on a heatmap) - the video seems like it covers that and so much more.
I think the thing I was/am most disappointed about is that I can't buy the key caps that Cereal Killer uses for a Nordic keyboard layout. God damn those were cool.
iBook G4s had top and bottom panels that could have their paint stripped to be transparent. The earliest iBooks were translucent. Maybe a hunk of machined Gorilla Glass?
There are translucent lower receivers for AR-15s that are both light and durable[0]. Perhaps there is still hope.
My favorite line from the movie - they are huddled around someone's new computer and a character compliments them with: "Killer refresh rate!"
With so many great and much more accurate hacker movies like Sneakers out there, this one was just so much fun, seeing every aspect of hacking amplified to the "X-treme!!!"
As I wrote at the time:
“WOW! Just a `git clone`, download and unpack the .zip with the assets (end of the blog post), type `make` and it compiles on Linux without warnings into `wipegame`, run it and it works first time!” https://news.ycombinator.com/item?id=37086874
This is awesome, but there were a couple of great laptop interfaces from that movie too. Spent some quality time in the 90s getting AfterStep/Litestep to look like them.
This whole article is so draaaaawn out. It's not supposed to be realistic, it's there to deliver the story. Hardly any UIs in 80s and 90s films were realistic, if any at all. This level of analysis just isn't necessary. It's a classic, legendary film for both the content/representation of the subculture and the 90s "so bad it's good" style.
Has anyone written about the computers/interfaces in HBO's "Silo"? I've been trying to find information about that because they seem quite interesting.
[0] https://www.catb.org/~esr/faqs/hacker-howto.html