
Is this a change in the usage of the term “coding”? - CarolineW
Originally posted as a comment elsewhere, but I thought I'd ask more generally...

I've been involved with the internet, and its predecessor, for a long time. Before the web existed, "coding" meant writing computer programs.

When the web was invented, and for some considerable time after, writing HTML was not called "coding". It was called "writing HTML". This, of course, preceded the time when pages contained PHP or Javascript.

Now pages *do* contain programs, or fragments of programs, intended to be executed at serve-time or render-time, and writing those parts of a page is clearly "coding" in the original sense.

Now I'm starting to see people referring to writing plain HTML+CSS as "coding" as a matter of course, even if it has no PHP, Javascript, or similar, and it feels, thereby, that usage has changed. Just as "hacker" used to mean something different and is now used by people at large to mean "breaking into or otherwise doing nefarious things to computers and computer systems", is this a usage that has changed?

Does "everyone" now think of writing HTML+CSS as "coding"?

Please note that I have no skin in the game, and to some extent I don't really care; I just want to know what the current general feeling and usage is.

Thanks.
======
mbrock
I think I've been talking casually about "coding" HTML or CSS for many years.
To me the word can mean any process of converting your ideas into "computer
codes". However, if someone says "I'm a coder", I assume they mean they work
with programming in some kind of Turing-complete-ish language.

~~~
colorint
That's always a bit of a dodgy metric, because of two issues. The simpler
issue, the one people are more likely to bring up, is that you need infinite
memory (or, if you're a member of the Church of Church, you need to be able to
represent infinitely many symbols). The trickier problem is that, since all
physical computers are analog while all theoretical formulations are discrete
and symbolic, Turing-completeness is a good-enough-ish situation. You can't
really build a finite state machine or a lambda applicator, so what it means
for something to be a computer isn't as theoretically robust as you'd hope. It
seems that, in practice, a computer is something a computer scientist regards
as a computer (i.e., relative to the things they expect to be able to do with
it, and the ways they expect to interact with it).
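
To make the finite-memory point concrete, here's a sketch in Python (entirely
illustrative; the runner, rule table, and names are all made up): any "Turing
machine" we actually run has a bound on its tape, which formally makes it a
very large finite state machine.

    # Illustrative only: a tiny Turing-machine runner with an explicit
    # tape bound. The bound is what makes real hardware a finite state
    # machine rather than a true Turing machine.
    def run(rules, tape, state="start", head=0, max_tape=1024):
        while state != "halt":
            symbol = tape.get(head, 0)
            write, move, state = rules[(state, symbol)]
            tape[head] = write
            head += move
            if not (0 <= head < max_tape):  # finite memory bites here
                raise MemoryError("ran off the (finite) tape")
        return tape

    # A made-up rule table: zero out 1s moving right, halt on the first 0.
    rules = {
        ("start", 1): (0, +1, "start"),
        ("start", 0): (0, 0, "halt"),
    }
    print(run(rules, {0: 1, 1: 1}))  # -> {0: 0, 1: 0, 2: 0}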

I say this because of the weird question of whether CSS is Turing-complete. On
the one hand, it seems that existing proofs of concept (e.g., [1]) require the
user to play tippy bird with key inputs. On the other hand, real digital
computers require a clock to play tippy bird with register commits. To what
extent does the clocking of a digital computer have to be "out of sight, out
of mind" for people to agree that it's a computer? Also, of course, CSS has to
enumerate memory positions (i.e., DOM elements), but then a digital computer
has to do the same, so the extent to which one cares about infinite memory is
put in the spotlight.

[1] [http://eli.fox-epste.in/rule110-full.html](http://eli.fox-epste.in/rule110-full.html)
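
For a feel of what the linked proof of concept is computing, here's Rule 110
itself sketched in Python (this is just the automaton, nothing from the linked
page; the CSS version in [1] encodes the cells as DOM elements and uses the
user's keypresses as the clock):

    # Rule 110: each cell's next state is a function of its left
    # neighbor, itself, and its right neighbor.
    RULE_110 = {(1,1,1): 0, (1,1,0): 1, (1,0,1): 1, (1,0,0): 0,
                (0,1,1): 1, (0,1,0): 1, (0,0,1): 1, (0,0,0): 0}

    def step(cells):
        # One "clock tick" -- the move a keypress drives in the CSS version.
        padded = [0] + cells + [0]  # fixed width, like a finite set of DOM elements
        return [RULE_110[tuple(padded[i:i+3])] for i in range(len(cells))]

    row = [0, 0, 0, 0, 0, 0, 0, 1]
    for _ in range(5):
        print("".join(".#"[c] for c in row))
        row = step(row)

The fixed row width is the same compromise as enumerating DOM elements: a
finite approximation of the infinite tape.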

~~~
mbrock
If it turns out that someone is indeed constructing logic gates out of CSS
selectors, I'll happily consider them a coder.

------
bzalasky
To someone who isn't a "coder", it may be difficult to tell the difference
between someone cranking out landing pages with HTML and CSS, and someone
building the checkout flow in another part of the same app. If you were
uninitiated and saw both colleagues working with text editors (with fancy
syntax highlighting) and the command line, you'd probably call them both
coders. Superficially, the work looks similar. The semantic change you've
noticed could plausibly be attributed to the tremendous increase in web
development over the last decade.

