
Where I started might not be useful to you - zdw
http://rachelbythebay.com/w/2018/04/04/learn/
======
tonyarkles
I frequently run into the same problem. I work with clients, and inevitably
one of their employees asks something like "how did you get to be so good at
Linux? You figured out a problem we've been chasing for weeks!"

And I hate my answer, especially when the employee is around my age or older:
I first installed Slackware from floppy disks when I was about 12 years old. I
installed it because I heard that Linux had a great C compiler, and I was
getting tired of the Power-C compiler that I'd been using for 2 years.

Like Rachel, I started on a Vic-20, around age 8. The computer was at my dad's
house though, and I lived with my mom most of the week, so I would take out
books from the library, write all of my code on loose-leaf paper, and then
furiously type it in when I got over to dad's house to see if it worked. POKE
36879 indeed.

~~~
taneq
I sometimes wonder if growing up before the internet and Wikipedia and Stack
Overflow was actually a boon for my problem solving skills. I spent so many
hours with nothing to go on but some books and my own persistent trial-and-
error. If I'd done the same stuff these days I don't think I'd have learned
half as much (although would I have done more, different things? Probably...)

~~~
caseymarquis
Ungoogleable problems still exist. Yesterday, for example, 30 Fanuc i
industrial controllers started sending back TCP RST packets and refused to
accept new TCP connections until they were restarted. This happened in a
system that had been stable for two years. Very likely an issue on one
controller spread to the others by causing an error in a set of DLLs from
Japan, triggered when another set of the same DLLs connected to one controller
at the same time. Maybe. I have to talk to the manufacturer to get any
further. Or disassemble their libraries. I have these sorts of ungoogleable
problems all the time. I doubt industrial automation is the only field like
this.

~~~
taneq
Industrial automation is my field also, and yeah - you get that interesting
mix of low-volume hardware, software written by electrical engineers, and
messy real-world situations. You think you know what's going to happen but
you're never quite sure... And when something does go wrong, it'll be a doozy.

~~~
tonyarkles
> software written by electrical engineers

I mentioned elsewhere in this thread that I did an EE/CS dual degree after
having programmed for a long time beforehand. I really don't understand
_why_, but some of the EE-only software I've worked with has been absolutely
atrocious.

One of my favourite moments in undergrad was in my Digital Communications
class. I don't remember exactly what we were doing, but it involved Matlab and
some kind of signal decoding. I handed in my assignment and got called in to
have a one-on-one with the professor; he was certain that I had cheated and
just hard-coded the answers to my assignment instead of actually writing the
decoder. Why? Because my code ran instantaneously instead of sitting and
grinding for 30+ seconds. What was my secret? I'd seen a couple of triply-
nested for loops in the code and decided they could all be accomplished with a
matrix multiply instead of doing all of that computation by hand.
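The trick is worth spelling out. A matrix product computed element by element is exactly a triply-nested loop, so the whole thing collapses into one call to a vectorized library routine. A rough sketch (Python with NumPy standing in for Matlab; the matrices are made up for illustration):

```python
import numpy as np

# The "by hand" version: the classic triply-nested loop.
def matmul_loops(A, B):
    n, m, k = len(A), len(B), len(B[0])
    C = [[0.0] * k for _ in range(n)]
    for i in range(n):          # rows of A
        for j in range(k):      # columns of B
            for p in range(m):  # the inner dot product
                C[i][j] += A[i][p] * B[p][j]
    return C

A = [[1.0, 2.0], [3.0, 4.0]]
B = [[5.0, 6.0], [7.0, 8.0]]

slow = matmul_loops(A, B)
fast = np.array(A) @ np.array(B)  # one vectorized call replaces all three loops

assert np.allclose(slow, fast)
```

In Matlab (or NumPy), the single matrix multiply dispatches to optimized BLAS code, which is why it finishes effectively instantly where the interpreted loops grind.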

In the same class, a friend (also an EE/CS dual) and I came up with the
dynamic programming solution to the Viterbi decoding problem as well, and
again, got taken to task for having a solution that ran too fast. Sigh...
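For the curious, the dynamic-programming formulation looks roughly like this. This is a toy sketch in Python, not the actual assignment; the model and its probabilities are invented for illustration:

```python
# Toy Viterbi decoder for a hidden Markov model. At each time step we keep,
# for every state, the probability of the single best path ending there,
# plus a backpointer -- so we never enumerate all paths explicitly.

def viterbi(obs, states, start_p, trans_p, emit_p):
    # V[t][s] = probability of the best path ending in state s at time t
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            prob, prev = max(
                (V[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states
            )
            V[t][s] = prob
            back[t][s] = prev
    # Walk the backpointers from the best final state.
    last = max(V[-1], key=V[-1].get)
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

# Made-up two-state model: is a patient Healthy or has a Fever,
# given observed symptoms?
states = ("Healthy", "Fever")
start_p = {"Healthy": 0.6, "Fever": 0.4}
trans_p = {
    "Healthy": {"Healthy": 0.7, "Fever": 0.3},
    "Fever": {"Healthy": 0.4, "Fever": 0.6},
}
emit_p = {
    "Healthy": {"normal": 0.5, "cold": 0.4, "dizzy": 0.1},
    "Fever": {"normal": 0.1, "cold": 0.3, "dizzy": 0.6},
}

print(viterbi(("normal", "cold", "dizzy"), states, start_p, trans_p, emit_p))
# -> ['Healthy', 'Healthy', 'Fever']
```

The naive alternative scores every possible state sequence, which is exponential in the sequence length; the DP version is linear in it, which is presumably why it looked suspiciously fast.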

------
seanwilson
This is why I hate the quotes "It is practically impossible to teach good
programming to students that have had a prior exposure to BASIC: as potential
programmers they are mentally mutilated beyond hope of regeneration" and "The
teaching of BASIC should be rated as a criminal offence: it mutilates the mind
beyond recovery."

I started coding by reading the huge instruction manual that came with the
Amstrad CPC and didn't have an issue moving on to other languages later. As
long as you're willing to keep up with what's new and are passionate about
improving your skills it doesn't matter what you started with.

~~~
smarks
That sounds like Dijkstra, from EWD498 (1975):

[http://www.cs.utexas.edu/users/EWD/transcriptions/EWD04xx/EW...](http://www.cs.utexas.edu/users/EWD/transcriptions/EWD04xx/EWD498.html)

At least, the "mentally mutilated" phrase is about BASIC. The "criminal
offence" phrase was actually about COBOL. You'll note that Dijkstra also had
choice words about FORTRAN, APL, and PL/I, which were popular programming
languages of the day.

It's tempting to say that Dijkstra was being tongue-in-cheek when he wrote
this. I never met him, but based on the attitude I picked up from his other
writings, I'd guess that if you had asked him he'd have said he was completely
serious.

And he has a point. I first learned programming in BASIC, and look how I
turned out. (Well maybe I confirm Dijkstra's point. Others will have to
judge.) But I do recall a bunch of characteristics of early BASIC systems that
potentially led to terrible programming habits: GOTO and line numbers leading
to spaghetti code; all global variables; no scope; no functions; self-
modifying code (overlays); etc. All of these had to be unlearned at some
point.

~~~
seanwilson
> But I do recall a bunch of characteristics of early BASIC systems that
> potentially led to terrible programming habits: GOTO and line numbers
> leading to spaghetti code; all global variables; no scope; no functions;
> self-modifying code (overlays); etc. All of these had to be unlearned at
> some point.

Compared to just being told these things are bad and should be avoided, I'd
like to think that being forced to code in the above ways makes you better
rounded as you have first hand experience for why those practices are a bad
idea. It's not like you have much choice if you have to code in something like
assembly as well.

I've worked with programmers before who can regurgitate the names of current
bad practices but can't describe the reasoning behind them, which isn't a good
thing.

Personally, I think it's an absurd idea that e.g. if you began coding with all
global variables and no functions you'd be unable to code in a different way.
If you're amazing at coding in an environment that makes it challenging to
write complex programs, you're going to quickly become efficient at writing
code in environments that make it easier to write complex programs.

~~~
smarks
Sure, I don't disagree. Personally, after moving away from BASIC to other
languages, I felt freed from a bunch of problems I had had with BASIC. For
example, scope eliminated problems with reuse of variable names. Perhaps lots
of other people who started with BASIC felt the same way.

On the other hand, sometimes other people never transcended those bad habits
from BASIC. Sometimes you run across programs that are one big function, with
variables reused, etc. I don't know what the thought processes are of
programmers who write such code. There's an old saw, "You can write FORTRAN in
any language." Maybe this should be applied to BASIC as well. That's probably
what Dijkstra was driving at. I don't know about "mental mutilation"
(certainly hyperbole, in my book) but only Dijkstra knows what he really
meant.

~~~
seanwilson
> On the other hand, sometimes other people never transcended those bad habits
> from BASIC

The kind of people that don't actively keep up with current best practices and
latest developments are always going to be bad programmers though. It's not
like you can pick a perfect language to start coding in that will teach you
all the current and future best ways to do things. I personally don't think it
matters what language you start coding in as long as you're always learning
and adapting.

------
FeepingCreature
I think that maybe webdev, maligned as it is, fills that role. You can make
something useful quickly. "Boot times" (page reload) are short. State can be
saved but is ephemeral by default.

~~~
RobertRoberts
I agree, webdev is the modern equivalent of a fresh new playground.

I started with computers in 3rd grade, with Logo. It was enthralling. I was
enamoured with computers right away. I learned you could stop some Apple games
and view the code. I learned you could even print it (I had reams of printed
code and didn't know what 99% of it did; what is a GOSUB? lol). Most of this
learning was by accident, as I didn't have a single book to teach me anything,
and only one other real nerd friend, who showed me how to program lowres
graphics. (Another friend got bootleg copies of games from his older brother,
though, lol. Anyone remember punching holes in the side of a floppy disk to
make it store more data?)

I tried over and over to create games, and made dozens of odd text adventures
and stupid Space Invaders games that crashed after a couple minutes of play.
In high school I found out (by watching the older kids) about hacking system
resources on Macs, and we'd mess with the menus, lol. Pascal was interesting,
but I never learned to make anything cool with it. Finally, when I got into
college, I got some real coding experience and found out you can mix art (I
was a flip-book animating nut as a kid; made hundreds of them) with
programming.

Eventually the web took over, and I found out you didn't have to "install"
anything on someone else's machine ever again, I was hooked.

As bad as the web is in so many ways, I never want to go back to supporting
other peoples computers ever, ever, ever again.

I will take JS hell over that garbage. (Imagine helping a client debug an
install issue on a computer over the phone, while he's 2,000 miles away on a
Caribbean island living it up and blaming you for him clicking the wrong
button, because he simply refuses to follow your instructions "explicitly" and
can't ever be wrong... ugh)

~~~
icebraining
_Imagine helping a client debug an install issue on a computer over the phone
(...)_

Instead, that client is using IE6, doesn't know how to tell you that, and IT
restrictions prevent them from upgrading or using another browser. Or they
have Chrome and a shitty addon that breaks your site in a way that you can't
even imagine. Or they are behind a workplace firewall that blocks some
specific domain used by the site.

I've done support for a webapp, and I've had to use remote desktop sharing
quite a few times.

~~~
RobertRoberts
> _Instead, that client is using IE6, doesn't know how to tell you that, and
> IT restrictions prevent them from upgrading or using another browser._

At least _I_ personally can do something about it, instead of grasping at
straws and begging someone else to do something right on their end. If they
want to use IE, fine, but they only get IE-supported features and everything
costs 10x as much. Win-win.

> _Or they have Chrome and a shitty addon that breaks your site in a way that
> you can't even imagine._

Dealt with this repeatedly: "Sir, try Firefox. It works there? Great, use
that instead, or remove the Chrome plugins."

I had a client that was in the Beta channel of chrome and had some obscure bug
causing her issues, but no one else. I figured that out just fine.

Just last week a client had the stupid Grammarly plugin wreck something
online, fixed in minutes.

> _Or they are behind a workplace firewall that blocks some specific domain
> used by the site._

Say, "Call your tech support guys, I am unable to fix that." I can't say that
to a guy that is holding only a CD that got mailed to him, because I am the
only support then.

> _I've done support for a webapp, and I've had to use remote desktop sharing
> quite a few times._

Yep, and in the near future this will work in the browser; we are close now.
And the beauty of the internet is that if they don't have access, we don't
have to debug until they do (that's their ISP's/intranet's problem). But in
the old days, it didn't matter what the issue was: if the computer was not
online, then the only data I could get to debug from was irate staff.

------
mrlyc
I started with a Vic-20. I wrote a BBS for it with multiple rooms (message
areas), private mail and an online game. Users could start their own rooms and
make them public or private. It was very popular with each user spending an
average of 70 minutes on it.

That's how I got into programming professionally. One of my users said "Anyone
who can write a BBS for a Vic can program!" and hired me as a programmer.
Thirty-two years later, that same guy now wants me to work with him at Google.

~~~
cbm-vic-20
This one speaks the truth.

------
yownie
Hey Rachel, the BBC micro:bit definitely fits the parameters you listed at the
end of this post. Only $12, so definitely worth picking one up to fool with.

[https://en.wikipedia.org/wiki/Micro_Bit](https://en.wikipedia.org/wiki/Micro_Bit)

~~~
Narishma
You can't do anything with that without first connecting it to a bigger
computer.

------
jannotti
JavaScript and reloading a web page are the obvious parallel. There are one-
liners for changing colors, fonts, etc. A tiny bit of canvas setup and you can
draw freely. I think sound is slightly more complicated, but not much. It
might make sense to make a little JS library (maybe call it basic.js, though
I'm not actually recommending copying the BASIC language) and then write some
blog posts that start with loading it and then fooling around in the JS
console.

~~~
neogodless
In a way, this is a big part of how I transitioned from someone who knew how
to program a Commodore 64 to someone that could help people build web pages
(and eventually be able to do other things good, too.)

I had a Geocities web site, and I loaded it down with JavaScript widgets that
I wrote. Things to flip images around. Things to annoy you with alerts. Then I
spent countless hours in MS Paint making images out of bitmaps which I then
wired up using JavaScript to emulate an Iron-man watch (complete with glowing
and a very bad battery!)

There was "no risk." It was fun and exploratory. It didn't achieve anything
useful for anyone else, but I learned and got confident in learning (and
experimenting.) When someone actually hired me and threw ColdFusion at me, it
was wildly different... from QBasic and from JavaScript... but I just
experimented and figured out how to do CRUDDY things.

------
mixmastamyk
Nice, my dad was quite prescient as well and bought me and my sister Vic-20s
with little black and white matching TVs. I have great memories of hacking the
random sentence generator in BASIC and loading/saving programs to cassette
tape over many minutes.

A teacher at school, far ahead of his time, had a TRS-80 that we would have
lessons on. It would be at least 5 years later that I would touch another
computer, a PC XT running DOS.

------
telesilla
Installing MkLinux in the late 90s on an old 6100 by transferring disks copied
over the internet, each of which took a few days while everyone was sleeping,
and sometimes the disks failed... then getting the damn thing installed... but
then it worked! It gave me a reward for my perseverance, and I've been getting
that reward since, as I continue to stubbornly chase bugs and fix genuinely
interesting problems.

I learned by trial and error, before you could just google a problem and have
it pop up in StackOverflow. It taught me one major lesson: debugging is how
you learn about everything in the environment. A bug is an opportunity to
explore the programming environment, framework etc and you'll get a grasp of
the platform through searching for solutions. StackOverflow is nice for quick
fixes but it doesn't give you the deep knowledge that debugging does. I
recommend to my team that if they find a bug, they spend half a day figuring
stuff out from the ground up (where we have that time luxury); it makes
everyone more knowledgeable in the long term.

------
jgrahamc
Reminds me of something I wrote a while back:
[http://blog.jgc.org/2009/08/just-give-me-simple-cpu-and-
few-...](http://blog.jgc.org/2009/08/just-give-me-simple-cpu-and-few-io.html)

Those of us who started programming in that era had simple, relatively cheap,
fully comprehensible computers in front of us.

------
gfo
I agree with most of the sentiments, except for one key statement:

> "Just about any computer you can buy today is going to have some kind of
> non-trivial boot time. It probably will require a whole ecosystem of crazy
> things to make it work. _It will be fragile. You can break it easily._ "

I vehemently disagree with the italicized portion, at least in the way it's
presented.

Those on here probably aren't affected, but many of the non-technical users I
know are afraid of learning how to go beyond simple tasks on their computer
(word processing, internet browsing, etc.) because they're afraid they'll
break their expensive box and have to pay someone to fix it.

Most UIs today seem to help users avoid breaking things and typically have
some sort of notification if they'll change something that would break their
machine (e.g. UAC in Windows, padlock in OS X).

Even beyond this, general application development wouldn't pose these risks.
Hardware hacking certainly could, but any time you're dealing with the
physical workings of something, that's an assumed risk.

~~~
diggan
I think "break it easily" does not refer to actually breaking the hardware,
but to ending up in a situation where you cannot get back to where you were
before. My mother frequently called for my help when she made an application
fullscreen and could not get out of it; she considered the computer broken at
that point.

There are also numerous ways you can make a computer unbootable by removing
files, installing the wrong thing that runs on boot, and so on. Maybe I
wouldn't say it's fragile and breaks easily, but it can break, and if you just
go around clicking on things, it's probably gonna break after a while.

~~~
gfo
That's a great take, didn't think of that. Thanks for your response.

------
smoyer
I've actually been pondering this problem for a couple of years: when
everything is based on flash, what do you do when something corrupts your
"base install"?

The VIC-20, C64, Timex Sinclair and other computers of their day had the OS in
ROM that couldn't be changed. So you couldn't break these computers via the
keyboard (or other peripherals), but if there was a bug somewhere in the
firmware (which included the OS and BASIC), you had to live with it for the
life of the device.

You can get back to those days via emulation -
[https://vic-20.appspot.com/emus.htm](https://vic-20.appspot.com/emus.htm),
but there's nothing (that I know of) that's a physical device with these
characteristics.

~~~
sratner
You can protect pages on most flash devices (e.g. internal microcontroller
storage). Lock the bootloader and factory reset image, so you can still update
the running firmware, but also have a way to recover from any scenario with a
hardware button or sufficiently clever watchdog.

------
parsoj
Why aren't cloud, or even local, VMs being brought up? You irrevocably blow up
your Linux VM... just spin up a new one, or revert to your snapshot from
earlier in the day. I'd actually make the argument that virtualization makes
it easier than ever to safely experiment with things. Complexity has
definitely gone up, and the time and ability to peacefully munch on problems
under a tree has definitely gone down. But I would argue that if a learner is
given the right sort of virtual sandbox, instructions on how to reset their
state to working order, and an internet connection with Google, the Linux
docs, and Stack Overflow bookmarked, there is a lot of learning to be had!

~~~
itomato
It's a chicken-and-egg problem.

Bootstrapping an 8-bit Microcomputer is so much simpler than the first order
of operations in the hosted VM model.

------
taneq
> I suppose you could come up with some simple machine which plugged into your
> TV and let you experiment with a handful of primitives to get a feel for
> starting out.

A modern equivalent might be the Micro Bit
([http://microbit.org/](http://microbit.org/)) - it doesn't plug into a TV but
it does have a rudimentary display built in, it's programmable and it has
various sensors and I/O on it.

Also the entire Arduino ecosystem fills a similar niche, of a bare-metal-ish
platform which lets you jump right in and start coding things.

~~~
Angostura
They really don't fit the same niche as the little machines that you could
plug into your TV and either load up some simple arcade games or produce your
own using BASIC.

~~~
taneq
That's not really a niche anymore, though. It's almost cheaper to just put an
LCD on something than to interface with modern TVs.

If you want something more in that spirit regardless of hardware, maybe
PICO-8? It's basically a (virtual) minimalist 8-bit computer / gaming console
for the modern era.

[https://www.lexaloffle.com/pico-8.php](https://www.lexaloffle.com/pico-8.php)

------
mannykannot
The author asks whether her principles are relevant today, and I think they
are, at least for someone like me. Firstly, starting by poking around a small
issue that interests me generates motivation: as soon as I have made some
progress, I find lots of things that I want to look into more deeply.
Secondly, it gives me confidence that I can master the topic, an antidote to
the sinking feeling I get from looking at a multi-page table of contents.

------
forrestbaer
Does anyone make a C64-ish SoC with SID-style sound limitations, HDMI out,
1/8" audio out, 8-16 GPIO pins, good documentation, and a development
environment (C/ASM)? I wish a simple (restrictive) mid-range system like this
existed to develop on as a hobbyist. The Pi is too much, and Arduino doesn't
have the "fun" (interesting sound chip, built-in outputs,
marketing/documentation). Does anything like this exist?

~~~
proppy
[https://ichigojam.net/index-en.html](https://ichigojam.net/index-en.html) ?

~~~
forrestbaer
Interesting!

------
noonespecial
Arduino is the spiritual successor she's looking for to the C64 era
experience.

It worked fantastically for my kids and brought back waves of nostalgia
watching them grok the basics like I did.

------
FearNotDaniel
Honest question: can anyone explain who this writer is, and why her posts keep
hitting the front page so frequently these days? There's no bio on the site so
I'm guessing there's some kind of Valley insider knowledge going on as to why
her opinions are considered significant. Sorry if that's a dumb question, I
just don't see the appeal of these musings and I'm wondering if I'm missing
out on some crucial context, other than the fact that she's old enough to
remember the 8-bit days.

~~~
BrissyCoder
Came here to ask the same thing. Thanks to others in the thread for providing
some context. Personally I haven't found any of the posts from this site to be
interesting/entertaining... but please link the ones people think are.

~~~
icebraining
The HN search is the easiest way to find the most popular submissions of a
certain site:

[https://hn.algolia.com/?query=rachelbythebay.com&sort=byPopu...](https://hn.algolia.com/?query=rachelbythebay.com&sort=byPopularity&prefix&page=0&dateRange=all&type=story)

~~~
JacobAldridge
Or you can click on the domain URL from the front page (easier, though it
doesn't sort by popularity, so it's less helpful in that regard for larger
sites):
[https://news.ycombinator.com/from?site=rachelbythebay.com](https://news.ycombinator.com/from?site=rachelbythebay.com)

