
Lines of code that changed everything - brycehalley
https://slate.com/technology/2019/10/consequential-computer-code-software-history.html
======
thrownaway954
personally i think the line of code that today's internet owes its thanks
to is:

$(document).ready(function(){});

If it wasn't for jQuery, no one would be programming web applications the way
we do today. jQuery was THE single advancement in creating the modern web.

~~~
sandinmyjoints
It’s significant, for sure, but I’d probably go with XMLHttpRequest as even
more significant for web apps.

~~~
sli
I'd almost say jQuery pasting over the inconsistencies is what made
XMLHttpRequest and AJAX in general the killer feature it became.

~~~
Groxx
yea. swap out roomba routing (srsly?) and replace it with "jquery made dynamic
websites economically feasible" which led to practically the entire internet
becoming dynamic in some form.

~~~
lwf
Is "roomba routing" an industry term? I couldn't find it from a Google search,
but I like it.

~~~
espionn
He's just talking about how the article lists 'Roomba Routing' as a 'Line of
Code that changed everything'.

------
AdieuToLogic
The four "lines of code that changed everything" arguably could be:

    
    
      #!/usr/bin/env perl
      use strict;
      use warnings;
      use CGI;
    

While no longer in vogue, nor applicable to many systems today, the web would
not be what we know now had those four lines of code never been typed.

~~~
mikorym
What is CGI?

~~~
nostrademons
Man I feel old. :-/

Back when the web was young (before 1998 or so), webpages were generally
static HTML files that you dumped in a directory and the webserver served up
verbatim. There was no PHP, no Rails, no Django, and certainly no Node. If you
wanted to do anything dynamic - like serving up content that you, the
webmaster, did not write yourself - you had to write a custom webserver.
Remember, this was back when C was the most popular programming language, Java
was for hardware devices, Python was for academics, and Javascript barely
existed. Not many people wanted to write their own webserver.

CGI was a specification that would allow the webserver to launch an external
program - written in any language you wanted - in response to an HTTP request,
send it the request data, run arbitrary code, and write the response back to
the web browser. The query params and other information about the request
(path, user agent, host, request method, etc.) would be passed in environment
variables. POST data would be passed on STDIN. The program would write the
output HTML to STDOUT, and the webserver would echo that directly back to the
browser.

It was slow, it was clumsy, it was insecure, and it was magical, because you
_simply couldn't do_ simple stuff like read from a database and dynamically
generate a webpage otherwise, unless you wanted to write your own webserver.

The lines that the grandparent posted would fire up a Perl script and then
import the CGI module to parse all the input data (no such thing as JSON back
then; if you wanted the actual query params, you got "/reply" in your
$ENV{'PATH_INFO'}, and "id=21654788&goto=item%3Fid%3D21648568%2321654788" in
your $ENV{'QUERY_STRING'}, and had to parse them yourself). It also enabled some
warnings and error checks for Perl to avoid some of the worst security
nightmares. This was even more magical, because Perl offered string parsing &
construction utilities and database libraries that were _much_ easier than C,
and so you could quickly write useful websites that did stuff instead of hand-
typing HTML into your editor.
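For flavor, hand-parsing a query string looked roughly like this (a simplified sketch with a made-up function name; real code also had to percent-decode the values, which is part of what CGI.pm saved you from):

```c
#include <assert.h>
#include <string.h>

/* Look up `key` in a raw query string like
   "id=21654788&goto=item%3Fid%3D21648568%2321654788" and copy its
   value into out.  Returns 1 if found, 0 otherwise.
   Percent-decoding is deliberately left out. */
int query_param(const char *qs, const char *key, char *out, size_t outsz)
{
    size_t klen = strlen(key);
    while (*qs) {
        const char *amp = strchr(qs, '&');
        size_t pairlen = amp ? (size_t)(amp - qs) : strlen(qs);
        if (pairlen > klen && strncmp(qs, key, klen) == 0 && qs[klen] == '=') {
            size_t vlen = pairlen - klen - 1;
            if (vlen >= outsz)
                vlen = outsz - 1;       /* truncate to fit */
            memcpy(out, qs + klen + 1, vlen);
            out[vlen] = '\0';
            return 1;
        }
        if (!amp)
            break;
        qs = amp + 1;
    }
    return 0;
}
```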

~~~
mikorym
I understood the first three lines :P

So... what would be an example of such an external program? Something that
accesses a SQL database and changes values on the HTML page? I ask this
specifically since you mention that database access would be prohibitively
slow.

Or would you e.g. use CGI to call some script that changes the HTML? Let's
say, post something that was written in an input box?

~~~
AdieuToLogic
> I understood the first three lines :P

> So... what would be an example of such an external program?

The external program is responsible for producing the entirety of an HTTP
response; headers and the message body (see here[0] for details). CGI allows a
web server to provide what it knows from an HTTP request to an arbitrary
program and expects it to write the response as it sees fit.

Think of it as a specialized case of "fork-exec"[1] where the "child process"
has the responsibility of writing the raw HTTP content to the "parent process"
(the web server).

In the CGI model, each request causes a "fork-exec"[1] and, as such, any
interaction with other services (such as a database) requires connecting,
using it, then disconnecting each time. This can quickly swamp server
resources, which led to the definition of FastCGI[2] to help alleviate this
type of thrashing.
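The server side of that model can be sketched like so (heavily simplified; `client_fd` stands in for the accepted connection, and error handling is omitted):

```c
#include <assert.h>
#include <stdlib.h>
#include <sys/wait.h>
#include <unistd.h>

/* Roughly what the webserver does per request in the CGI model:
   set up the environment, fork, point the child's stdout at the
   client connection, and exec the external program. */
void run_cgi(int client_fd, const char *program, const char *query)
{
    pid_t pid = fork();
    if (pid == 0) {                     /* child */
        setenv("QUERY_STRING", query, 1);
        setenv("REQUEST_METHOD", "GET", 1);
        dup2(client_fd, STDOUT_FILENO); /* child's stdout -> browser */
        execl(program, program, (char *)NULL);
        _exit(127);                     /* exec failed */
    }
    waitpid(pid, NULL, 0);              /* one whole process per request */
}
```

That one fork-exec-wait per request is exactly the cost FastCGI was designed to amortize.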

HTH

0 -
[https://tools.ietf.org/html/rfc2616#section-4](https://tools.ietf.org/html/rfc2616#section-4)

1 -
[https://en.wikipedia.org/wiki/Fork%E2%80%93exec](https://en.wikipedia.org/wiki/Fork%E2%80%93exec)

2 -
[https://en.wikipedia.org/wiki/FastCGI](https://en.wikipedia.org/wiki/FastCGI)

~~~
mikorym
I like reading old Perl code. It's much more interesting than the Javascript
code that displays stupid YouTwitFace plugins.

BTW, is this kind of specialised fork-exec code that writes back HTML code to
the parent process still in use?

~~~
AdieuToLogic
> BTW, is this kind of specialised fork-exec code that writes back HTML code
> to the parent process still in use?

Yes, but it is most likely found in intranet situations. As others have noted,
commercial offerings typically use an application server style approach, as
that greatly contributes to being able to scale.

However, not everything has to scale ;-).

------
mc3
I'd add Garbage Collection (Lisp was first I think) which transformed how we
write programs, allowing more focus on solving the real problem, and less on
memory management.

Also vms/containerization is worth a mention? The basis for the cloud.

~~~
ydb
Brilliant suggestion. All my most productive and enjoyable times while
programming have been under the purview of garbage collectors. They have
assisted my career in more ways than any conference talk ever has. In fact,
I'd say GC basically saved me from burnout.

I wish I could garbage collect my ex the same way that D collects my garbage
for me!

------
Hitton
Also "rm -rf /*". It taught generations of sysadmins and programmers not to
run code they don't understand.

~~~
0x8BADF00D
More generally, it taught us to be careful with string substitution and
interpolation.

~~~
Izkata
There are actually (at least) two readings of GP's comment - unsafe code/data,
which is what you appear to be referring to:

    
    
      rm -rf $FOO/*
    

And typos, which is what I first thought of upon reading it:

    
    
      rm -rf . /*

------
kazinator
Another clueless rant against null-terminated strings.

The flaw isn't the null termination but the lack of encapsulation.

No string representation is error-proof if the application programmer open-
codes its manipulation, taking on the responsibility for maintaining all
necessary invariants of the representation, including memory management. If
applications take on the responsibility of constructing and maintaining
length + data strings, there will be bugs.

Null-terminated strings are actually a very good choice of string
representation that is to be directly manipulated by programs, instead of just
through opaque handles via library functions.

Null-terminated strings allow for elegant tail recursion like:

    
    
       const char *my_strchr(const char *str, char c)
       {
          if (!str)
             return NULL;
          else if (*str == c)
             return str;
          else
             return my_strchr(str + 1, c);
       }
    

Compare with:

    
    
       (defun findel (list el)
         (cond ((null list) nil)
               ((eql (car list) el) list)
               (t (findel (cdr list) el))))
    

This leverages the fact that if s is a non-empty string, then s + 1 is also a
string, which is its suffix. This is super cool and has no equivalent in the
length + data representation.

Compilers can merge string literals that have a common suffix; if "pet" and
"carpet" occur in the same program, they can share storage.
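A two-line illustration of that suffix property (the `suffix` helper is made up for illustration):

```c
#include <assert.h>
#include <string.h>

/* A suffix of a null-terminated string is itself a valid string: it
   reuses the original's terminator.  The same property is what lets
   a compiler overlap the storage of "pet" and "carpet". */
const char *suffix(const char *s, size_t start)
{
    return s + start;   /* no copy, no new terminator needed */
}
```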

Null terminated strings of byte-wide characters can be transmitted with no
byte order issues. Not so when length fields are involved.

Being able to split a null-terminated string in-place according to some
delimiter characters, by overwriting them with null bytes, is also handy, and
greatly simplifies some programs which open-code this logic rather than
relying on string libraries.
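A simplified strtok-style sketch of that in-place split (illustrative only; a real tokenizer would handle runs of delimiters and more):

```c
#include <assert.h>
#include <string.h>

/* Split a mutable string in place on a single delimiter by
   overwriting each occurrence with '\0'.  parts[] ends up pointing
   into the original buffer; no allocation happens.  Returns the
   number of fields found. */
size_t split_inplace(char *s, char delim, char *parts[], size_t max)
{
    size_t n = 0;
    while (n < max) {
        parts[n++] = s;
        char *d = strchr(s, delim);
        if (!d)
            break;
        *d = '\0';          /* terminate this field in place */
        s = d + 1;
    }
    return n;
}
```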

Null termination is nice for lexical scanning. Why? Because every string has
an accessible byte that is one position past its last character, and has a
unique value which can be treated as a member of the input alphabet, and thus
integrated into the pattern matching logic. Code which scans a null-terminated
string for a lexical pattern doesn't have to do anything ugly and extraneous
like checking an index against a maximum length; it just contains cases that
handle zero as a part of the syntax.

Stream scanning code that looks for and handles EOF return from _getc_ can
easily be turned into string scanning code if it just handles zero instead of
EOF.
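For example, a scanner for identifiers needs no length check at all, because '\0' fails the same character tests as any other delimiter (a sketch; `ident_len` is a made-up name):

```c
#include <assert.h>
#include <ctype.h>
#include <stddef.h>

/* Length of the identifier ([A-Za-z_][A-Za-z0-9_]*) at the start of
   s.  The terminating '\0' fails both character tests, so it stops
   the scan exactly like a space or an operator would -- no
   index-against-maximum-length check anywhere. */
size_t ident_len(const char *s)
{
    size_t n = 0;
    if (isalpha((unsigned char)s[0]) || s[0] == '_')
        for (n = 1; isalnum((unsigned char)s[n]) || s[n] == '_'; n++)
            ;
    return n;
}
```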

~~~
talaketu
> The flaw isn't the null termination but the lack of encapsulation.

> Null-terminated strings are actually a very good choice of string
> representation that is to be directly manipulated by programs, instead of
> just through opaque handles via library functions.

You seem to be having it both ways.

As I see it, "NULL" is a valid ascii character, but not a valid element of a C
string - except as a terminator external to the string of characters being
represented. It's fundamentally broken as an abstraction for managing variable
length strings of ASCII. It is incapable of representing all strings of ASCII
characters.

~~~
kazinator
You're arguing about requirements. Of course when Thompson and Ritchie went
for null terminated strings, they knew that NUL would be excluded. That's the
specification at the requirement level; it's not an "oh crap, we unwittingly
excluded NUL from strings" bug.

There are 32 control characters, and DEL. The purpose of all those characters
is to carry some kind of meaning. The designers of the null-terminated string
decided that the meaning of NUL will be reserved for that termination. That
still leaves 31 control characters and DEL which _can_ be stored in a string,
which programmers can use for other meanings. That turns out to be plenty.

The "unreasonable effectiveness" and ubiquity of null terminated strings shows
that the world is perfectly capable of getting along without NUL in the middle
of a string. Basically, the intuition that we can steal NUL for string
termination was right (and to some extent self-fulfilling, due to the
subsequent proliferation of the convention).

The idea that all possible code points of a character set are valid characters
that must be stored in a string is outdated in the age of Unicode, which
contains codes which are not character code points.

Null-terminated strings are now the representation of a path name on every
major OS that has any non-negligible installed base. The choices leading to this
situation weren't made by complete idiots.

And, by the way, NUL is the ASCII character; NULL is the pointer constant in
C.

~~~
kazinator
I apologize for that last remark. In fact, the first ASCII document from 1963
uses NULL!

[http://worldpowersystems.com/ARCHIVE/codes/X3.4-1963/page5.J...](http://worldpowersystems.com/ARCHIVE/codes/X3.4-1963/page5.JPG)

This got shortened to NUL in 1965.

------
imglorp
The banner at the top is wrong. It should be simply

    
    
        while(*dest++ = *src++);
    

And a fork bomb isn't a virus in any sense. Malware, sure.

~~~
minitech
It’s not wrong, just not as short as it can be. Comparing things that aren’t
booleans with zero explicitly is a common style choice.

~~~
imglorp
Well, the article is about the historical context of the idiom, and they got
the context wrong. The full context (K&R second edition at least) is working
through pointers and trimming down strcpy(). The '\0' one is version 2, and
the following is version 3.

    
    
        As the final abbreviation, observe that a comparison against
        '\0' is redundant, since the question is merely whether the
        expression is zero. So the function would likely be written
        as
    
        /* strcpy:  copy t to s; pointer version 3 */
        void strcpy(char *s, char *t)
        {
            while (*s++ = *t++)
                ;
        }
    
        Although this may seem cryptic at first sight, the notational
        convenience is considerable, and the idiom should be mastered,
        because you will see it frequently in C programs.
    

[https://rkvalley.files.wordpress.com/2010/04/kernighan_ritch...](https://rkvalley.files.wordpress.com/2010/04/kernighan_ritchie_language_c.pdf)
page 88

~~~
rstuart4133
> the notational convenience is considerable,

You are right of course - but this is a moment-in-time thing. ++ and its
equivalents aren't so convenient now. One of the things that strikes me about
modern languages is the disappearance of integers, and of the associated
integer arithmetic such as the s++ in this example.

It turns out we were using integers as a poor man's iterator, and s++ simply
meant "next item please". Once we started using "for element in sequence" to
access iterators, many of the integers disappeared - they are only used to
represent integer things in the real world now. Just about every instance of
++ also disappeared, because "next item in iterator" was its main use. It
turns out that integers modelling things in the real world rarely need
incrementing. So much so that modern languages don't bother including ++ or
the Pascal equivalent, succ().

------
nayuki
As I scrolled through the article, I was surprised to see that they quoted a
piece of my published code. Section "Introduction of the JPEG", function
"NaiveDct_transform()".

------
thiscatis
In a way I feel the MP3 format and the digital music revolution, in terms of
ease of releasing, sharing, and cross-device listening, were even more
significant than what JPEG did for photos.

------
mtreis86
(loop (print (eval (read))))

~~~
tln
Iconic. A worthy addition

------
city41
I thought they explained null terminated strings pretty well. But I feel they
misled a bit on buffer overflows, which are more due to using raw pointers and
not having a runtime that does bounds checking than anything else.

~~~
rabidrat
I enjoyed the stories, many were new for me, but this was a major foul:

> But the RSA encryption algorithm—one of the basic building blocks of modern
> cryptography—is elegant enough that it can be written out in just four dense
> lines of Perl code … short enough to fit on a T-shirt.

And of course it has nothing to do with the elegance of RSA, but the largeness
of Perl.

------
polynomial
Nitpick: hyperlinks don't work w/o name resolution, arguably the prior
advancement that is oddly missing from this history.

I'd also argue for the inclusion of Leonard Kleinrock's seminal paper on
communication networks.

------
smartscience
Torrents/filesharing and ransomware would be possible additions to this list.

~~~
city41
Napster predates torrents by a little bit, and seemed to be the real gateway
for file sharing for most people IMO.

------
ddmma
The Like button counter evolved from what was once the RSS reader counter on
blog posts, via FeedBurner. Some things come from pure evolution, not
necessarily revolution.

------
rkagerer
I'm surprised some form of Comet didn't make the list (early approaches for
AJAX / XMLHttpRequest / Long Polling / etc).

------
thrownaway954
Another one I can think of:

rails new blog

Think about how many frameworks over the years have copied or looked to Rails
for inspiration. Rails pretty much changed what people expected from
frameworks.

------
est
I think the line of code that changed everything is the one where someone
changed "utf8" from 6 bytes to 3 bytes somewhere in the commit history of
[https://github.com/mysql/mysql-server](https://github.com/mysql/mysql-server).

I failed to bookmark it and can't find it now. Sorry.

~~~
lilyball
What does MySQL have to do with this? It seems it was RFC3629¹ that restricted
the domain of UTF-8 to cap at U+10FFFF and therefore remove 5- and 6-byte
encodings.

¹[https://tools.ietf.org/html/rfc3629](https://tools.ietf.org/html/rfc3629)
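For reference, the RFC 3629 length rules are small enough to sketch (`utf8_len` is a made-up name, and this ignores the surrogate range U+D800-U+DFFF, which the RFC also excludes):

```c
#include <assert.h>

/* Bytes UTF-8 uses for a code point under RFC 3629, which caps the
   range at U+10FFFF -- so at most 4 bytes, whereas the original 1993
   scheme went to 6 bytes for 31-bit values.  Returns 0 for values
   above the cap. */
int utf8_len(unsigned long cp)
{
    if (cp < 0x80)      return 1;
    if (cp < 0x800)     return 2;
    if (cp < 0x10000)   return 3;
    if (cp <= 0x10FFFF) return 4;
    return 0;
}
```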

------
30f0fn
Weird that they went back to 1755 but skipped the universal Turing machine.

------
abdulhaq
Surely the most influential line of code was written in Algol,

let there = light

------
AnimalMuppet
I'd add telnet (not the same as telenet) and ftp.

------
symplee
print("".join(sorted("everything")))

~~~
kaoD
eeghinrtvy?

(No idea what language that is, just guessing the output)

------
egfx
$scope

------
mitchtbaum
diff and patch

time and uptime

uuidgen and ping

...?

------
sxagoraris
#!/bin/sh (long-lived)

------
cryptozeus
Great article!

