

ASM - steveridout
http://grumpygamer.com/8632893

======
GuiA
I had the exact same experience, only 25 or so years later.

I learned how to program in BASIC on a Casio25 calculator, and later on a TI-82
(the Casio couldn't do key polling, and my parents wouldn't buy me a new
calculator just for programming, so I broke it on purpose to get a new one.
Sorry mom! I later got a TI-83, while all my friends had overpowered TI-89s and
TI-92s; the underpowered nature of the TI-83 felt more romantic and elegant to me.)

Two friends and I got really into programming, making tons of games in BASIC. I
made a grid-based SimCity clone, a turn-by-turn clone of [1], a very slow
dungeon crawler inspired by Diablo (my parents were fairly anti-video games, so
my way of experiencing the games I wanted to play was to program clones of
them), a flexible "engine" for text adventures (I was really into pen-and-paper
RPGs but none of my friends were), etc.

Then one day we discovered assembly, and our minds were blown. We would sneakily
use the library computers to print Z80 reference sheets; it was awesome.

Middle & high school were awful times all around for me, but this definitely
made it better.

[1]:
[http://en.wikipedia.org/wiki/Star_Wars:_Galactic_Battlegroun...](http://en.wikipedia.org/wiki/Star_Wars:_Galactic_Battlegrounds)

------
moccajoghurt
>I started to realize that assembly language was real programming. BASIC was
just an imitation of programming.

That's exactly why I couldn't be happy if I only knew some high-level
languages like Java / Python / C#.

After learning an assembly language and C, I feel much more comfortable using
any other language. You know the real thing.

~~~
ajross
There is no "real thing". Now you'll never be happy until you know you're
filling as many execution units as possible, or that your loop targets align
on cache boundaries, or that you stuff as many ops as possible across the
likely-to-be-in-DRAM loads. Then you'll learn Verilog and start worrying about
starvation issues in your own pipelines. Then you'll get distracted by a
CoffeeScript REST app and worry about getting your MongoDB replication set up
robustly.

It's all good. Some people are happy in a single niche, some never are.

~~~
bm1362
Ah, yes. I'm in a parallel programming class at the moment and now I can't stop
thinking about cache boundaries, volatility, weird efficient instructions to
use. Is this false sharing, throwing out the cache line? Where can I use
rsqrtf? Am I utilizing the SM fully? Is this busy-wait killing the memory
controller?

It goes on forever.
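
Concretely, the false-sharing case looks something like this minimal C sketch
(the 64-byte line size and the loop count are assumptions, not from any
particular course): two threads each increment only their own counter, but if
the counters share a cache line, every write invalidates the other core's copy,
so padding them onto separate lines is the fix.

    #include <pthread.h>
    #include <stdint.h>

    #define CACHE_LINE 64   /* assumed cache line size */

    /* Padded so each counter lives on its own cache line. Without the padding,
       both counters share a line and every increment invalidates the other
       core's copy (false sharing). */
    struct padded_counter {
        volatile uint64_t value;
        char pad[CACHE_LINE - sizeof(uint64_t)];
    };

    static struct padded_counter counters[2];

    static void *bump(void *arg) {
        struct padded_counter *c = arg;
        for (uint64_t i = 0; i < 100000000ULL; i++)
            c->value++;          /* each thread touches only its own counter */
        return NULL;
    }

    int main(void) {
        pthread_t t[2];
        for (int i = 0; i < 2; i++)
            pthread_create(&t[i], NULL, bump, &counters[i]);
        for (int i = 0; i < 2; i++)
            pthread_join(t[i], NULL);
        return 0;
    }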

~~~
stelonix
Both of you talk as if, generally, people never worry about such things. That
seems unusual to me. Is it because the first language I learned was 65816
assembler?

~~~
ajross
People never worry about machine-level micro-optimizations in exactly the same
way that people never worry about MVC web app frameworks, which is to say "most
people almost never", for some appropriate metric of "most" and "almost". It's
a niche. My point was just that some people are happy in just one, while
others (the poster I was replying to, and myself among them) aren't.

------
RyanZAG
Sure, and this lasted right up until we got GPUs, where you can take advantage
of something even faster than general-purpose ASM: specially developed
hardware. The death knell for using ASM was the library. You only need to
develop your ASM optimization once, in GCC or a special-purpose library, and
then you can just happily re-use it from a maintainable language.

~~~
stelonix
Not really: write code in a low-level GPU shader language (I believe that's no
longer allowed, though) and compare it to HLSL. Compare the speed of intrinsics
with inline assembly.

Sadly, we live in a world where the _lie_ that compilers can optimize better
than humans is widely spread. The reality is that compilers can optimize better
than your everyday Java guy (the kind of person who believes you can just throw
more processing power at a problem while the programmer just needs to abstract
ad nauseam).

I like using frameworks, libraries, and high-level languages as much as anyone in
here, but I _know_ ASM will always be faster, just not always suitable for the
issue at hand.
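
To make that concrete, here's a rough C sketch of the kind of comparison I mean
(the function names are mine, and n is assumed to be a multiple of 4): the
scalar loop is what you write when you trust the compiler, while the SSE version
spells the vectorization out by hand; inline assembly lets you go further still,
at the cost of portability.

    #include <immintrin.h>
    #include <stddef.h>

    /* Scalar version: the compiler is left to auto-vectorize this on its own. */
    void add_scalar(float *dst, const float *a, const float *b, size_t n) {
        for (size_t i = 0; i < n; i++)
            dst[i] = a[i] + b[i];
    }

    /* Hand-vectorized version using SSE intrinsics (n must be a multiple of 4). */
    void add_sse(float *dst, const float *a, const float *b, size_t n) {
        for (size_t i = 0; i < n; i += 4) {
            __m128 va = _mm_loadu_ps(a + i);   /* unaligned 4-float load */
            __m128 vb = _mm_loadu_ps(b + i);
            _mm_storeu_ps(dst + i, _mm_add_ps(va, vb));
        }
    }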

~~~
IsTom
It rarely matters enough to justify the amount of work needed.

~~~
EliRivers
That's been true since the first compiler. There are always going to be people
who want the extra speed and are willing to work for it.

------
pgsandstrom
I have had a similar experience: compiling my little text adventure in Liberty
BASIC took almost a minute. When my older brother told me that assembler
DOESN'T NEED TO COMPILE, I was completely awestruck. I did write a text
adventure in assembler, but it didn't have the same slick "windowy" feel that
I was so in love with 15 years ago. But I returned to LB as a better
programmer.

~~~
VMG
but assembler does need to compile

~~~
stephencanon
No, it needs to be assembled.

~~~
jrajav
Translating assembly to machine code is a trivial compilation, but technically
it is still compiling.

Compiling is just translating source code in one language into code in
another.

~~~
ctdonath
Long-standing convention is assembling != compiling.

You touched on the reason: compiling is translating one language to another.
Assembly code is not changing language, just replacing symbols from human-
convenient to machine-convenient, more like encryption.

You wouldn't consider Pig Latin a different language from English; it's just a
straightforward mangling thereof.

~~~
dkersten
_just replacing symbols from human-convenient to machine-convenient_

It's not quite a one-to-one mapping, though.

For example, there are assembly instructions which map to more than one
opcode depending on how they are used, and opcodes can have various flags
(prefixes and such) that change their encoding; the x86 MOV instruction is one
example (in 32-bit code, "mov eax, ebx" can be encoded as either 89 D8 or
8B C3). Beyond that, it is also rare to find assembly code which does not make
use of macros or assembler-implemented pseudo-instructions.

[http://stackoverflow.com/questions/2546715/how-to-analysis-h...](http://stackoverflow.com/questions/2546715/how-to-analysis-how-many-bytes-each-instruction-takes-in-assembly/2761248#2761248)

~~~
CountHackulus
For nearly any other assembly language, like, say, System/390 or PowerPC, it's a
straightforward 1:1 mapping. Macros and pseudo-instructions fall more into the
realm of text pre-processing than compiling.

~~~
aidenn0
Nit-pick: Power is many-to-one, not one-to-one (there is no "shift" machine
instruction, but there is a rotate-and-mask instruction that shift operators
are translated into).

~~~
CountHackulus
I've written part of a compiler for Power and never used a shift machine
instruction; maybe it's just the setup I was using, but I use rlwinm[.] all the time.

------
bajsejohannes
I had a very similar experience. I was filling the screen with white pixels
with PSET in BASIC. Just a tight loop setting each pixel. It took a handful of
seconds.

When I rewrote it in assembly and the screen was instantly filled with white
pixels, I couldn't believe it was true. You just can't set pixels that fast! I
went back to "debug" my assembly code by setting the pixels to different
colors to convince myself that the assembler didn't somehow "cheat" by just
setting the background color to white. And sure enough, assembly just turned
out to be orders of magnitude faster.

~~~
NewAccnt
Exact same experience here. QBASIC allowed you to put ASM code in line with
BASIC, like so.

    
    
      ' assumes SCREEN 13 (320x200, 256 colours); video RAM starts at segment &HA000
      y=199                          ' bottom row
      x=319                          ' rightmost column
      c=15                           ' colour 15 = white
      def seg = &ha000 + (&h14 * y)  ' &H14 = 20 paragraphs = 320 bytes = one row
      poke x, c                      ' write the pixel straight into video memory
    

Of course, functions like CIRCLE were already much faster, since the entire
function was already in ASM instead of having to go in and out of the
interpreter for each pixel. You'd wrap the above code in a SUB routine and
replace all your PSETs with it. I even coded my own sprite maker that used the
mouse in a similar manner, and made a pretty impressive-looking Mario clone
with the appropriate functions I learned in math class. The Pythagorean theorem
was very useful when working with grids. I didn't understand why everyone
wasn't doing the same thing and why everyone hated math.

Imagine the teacher's surprise when, on the first day, everyone else is learning
FOR loops and I'm making a RND bump map with directional lighting. This was 1997,
so we were still using Yahoo to look up the ASM offset codes, which worked
just fine; you just had to be good at making your queries or you'd have to
dig through a couple of pages of results.

------
cpressey
_To this day, I find myself deleting comments or whitespace under some
misguided pavlovian notion that my code will run faster._

Ouch!

btw, I think "fill the screen with @'s" must be the "Hello, world!" of
Commodore machine-language programming. (Back when you could call it "ML" and
not get it confused with a functional programming language, too.)

~~~
spacemanaki
That's funny, I always get confused and disappointed when hearing/reading
people talk about "ML" only to realize they're talking about "machine
learning" and not Milner's ML ... I hadn't thought there might be yet another
overloaded definition.

------
raverbashing
This is very interesting.

I started with the MSX. Alas, the literature was scarce, but I could play some
tricks.

I couldn't go too far with assembly, though: 1) see the point above; 2) there
were several ways of doing a "system call", but I think one of them only worked
in "raw programming" (that is, running from BASIC) and another only if you had
DOS.

Unfortunately I couldn't make that work.

Another thing is that the VDP was not memory-mapped (maybe on MSX2 it was);
you could do 'text mapping', sure.

One nice thing I remember was reading the font bitmaps from memory and then
printing them as a sequence of (8x8) characters.
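
A minimal modern C sketch of that trick (not MSX code; the glyph bytes below are
made up rather than read out of the real character table): each byte of an 8x8
glyph is one row, and each set bit becomes a printed character.

    #include <stdio.h>
    #include <stdint.h>

    /* Render one 8x8 glyph bitmap as text: each byte is a row, MSB = leftmost pixel. */
    static void print_glyph(const uint8_t rows[8]) {
        for (int y = 0; y < 8; y++) {
            for (int x = 0; x < 8; x++)
                putchar((rows[y] >> (7 - x)) & 1 ? '#' : ' ');
            putchar('\n');
        }
    }

    int main(void) {
        /* Made-up glyph roughly shaped like an 'A'; the MSX version would copy
           these 8 bytes out of the machine's character generator table instead. */
        const uint8_t glyph_a[8] = { 0x18, 0x24, 0x42, 0x42, 0x7E, 0x42, 0x42, 0x00 };
        print_glyph(glyph_a);
        return 0;
    }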

------
zwieback
Exactly what it was like for me, except Apple ][ (6502 FTW!).

I think part of the excitement was also the limited access: there weren't that
many computers, and you had to wait for the next issue of Byte (or c't, in my
case). I spent many hours at the magazine racks reading computer magazines I
didn't have the money to buy.

Having said that, things are great now too. The low-level stuff you can do on
a $20 HW platform is fantastic.

------
aidenn0
Okay, I swear I've read the exact story of filling up the screen with @ signs
before, but it wasn't on Ron Gilbert's site.

~~~
JamesAcorn
I was thinking the same thing; my initial thoughts were Jeff Minter or James
Hague, but Google is not showing anything relevant.

It might simply be that a common first-time benchmark exercise for programmers
moving from BASIC to assembler is to fill the screen with characters or
pixels. I have a sneaking suspicion I did the same too.

~~~
aidenn0
The story I remember took place in the UK.

------
coldpie
I am so done with websites with monospace fonts for body text. Please put your
natural language content in a format that is pleasing to read, or I'm not
going to bother.

~~~
mhd
According to the CSS it's '"Lucida Grande", "Lucida Sans Unicode",
helvetica,clean,sans-serif;', so this might be a browser/setting kerfuffle. It
works fine for me (FF/Chrome/IE on Win7).

