Don’t know if that sounds like a lot or a little, but the answer is yes. If you watched the YouTube video: yes, it was hard to do that, and it took a long time.
5000 lines, but if I recall correctly there was no way to do something fancy like this using only one line of code:
x = 1000 + 1000
I don’t know whether programming courses bother teaching, or even mentioning, a line of code like this, because what’s there to say?
I’m not even sure how to put it in perspective for a person under 200 years old that I had to spend a pretty good amount of time learning enough to do that (add together two numbers of more than 8 bits each).
And that even the variable x is an oversimplification, because there were no variable names. You had a CPU, a few supporting chips, and some bytes of memory to throw instructions at directly, or to move stuff back and forth between.
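To make it concrete, here’s a minimal sketch of what that one line costs in 6502 assembly: a 16-bit add done one byte at a time through the carry flag (the labels NUM1, NUM2, and RESULT are hypothetical; values are little-endian, low byte first):

    CLC             ; clear carry before the low-byte add
    LDA NUM1        ; low byte of the first operand
    ADC NUM2        ; add low byte of the second (plus carry)
    STA RESULT      ; store low byte of the sum
    LDA NUM1+1      ; high byte of the first operand
    ADC NUM2+1      ; add high byte plus the carry from the low bytes
    STA RESULT+1    ; store high byte of the sum

And that’s with an assembler doing the label bookkeeping for you; without one, those seven instructions were raw opcode bytes placed into memory by hand.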
Why would anyone have been interested in doing that?
Not sure, somehow seemed exciting at the time. Should the interest question have a different answer for today’s software when we look back from the future?
Or are we going to be scratching our heads again: why did it take them so long to generalize AI, when they might as well have been out farming the fields?
Can’t recall how much easier it became when you had a “Macro Assembler” (which I guess most professionals used from day one, but not always the people who got these systems as a Christmas present). Just that finally having one felt like getting some kind of new luxury car.
Yes I know.
And it was uphill both ways to school, and get off my lawn, and so forth. My apologies.
Don't despair. It's not the merits of your "product" being judged; it's just the wrong audience (mostly). Colleges now don't even bother to teach two's-complement. Of course today's "coders" won't get it. This is a generation that has never had to count cycles, get timing just right, or worry about running out of memory. The idea of paying $1000 for a C compiler or $500 for an assembler (in 1985 dollars!) is almost unimaginable now. I remember the excitement of a 1200 baud modem -- my initial response was "Wow! The text is coming faster than I can read it!"
The systems were primitive, sure (though at the time they were amazing!), but the challenge was met with creativity. What the games lacked in photorealism they made up for in "fun."
Programming was learned by doing, and most of the folks I met were amazingly generous about sharing techniques. It was like jazz musicians sharing a cool lick. There were no books, really.
Coding in assembly is like juggling. It takes a little effort to get all the balls in the air and get the rhythm going, then it goes until you hit a roadblock and it all falls down. You plan, think, and start again.
>>somehow seemed exciting at the time
For you, me, and thousands more, it was absolutely magical.
Thank you for sharing this. I'm reading it now and find myself smiling.
Glad to see your reply though, as it somehow makes the nostalgia seem less like it was all just a dream.
I think you may have had a rougher go of it being on an Apple; I mostly did Commodore stuff. Didn’t the Apple have the nightmare framebuffer layout? Something like pixels packed 7 to a byte across a scanline instead of nicely aligned 8-pixel bytes, or some similar bit of graphics insanity?
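If memory serves, it was 7 pixels per byte on the hi-res screen (bit 7 of each byte picked the color group), and scanlines were interleaved in three levels rather than laid out sequentially. Roughly, for scanline y (0-191) on page 1, something like:

    addr = $2000 + $400*(y MOD 8) + $80*((y DIV 8) MOD 8) + $28*(y DIV 64)

So consecutive scanlines sat $400 bytes apart in memory, which sounds like exactly the insanity I’m remembering.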
I learned two’s-complement in college. And at the least: one’s-complement, sign-magnitude, BCD, IEEE floating point. And arithmetic on same.
Which colleges are you referring to?
This is just a guess, but my guess is you didn’t learn any of that in CS101. It sounds more EE-related. Of course, there’s often no clear dividing line between the two.
Regardless, that’s assuming a full CS/EE undergrad program in either case. There are tons of people who take individual courses, have careers without a degree, or go through boot-camp-style programs.
Any way people can learn and develop skills is good, so I’m not suggesting any of the above are lesser ways to learn.
I’m just unaware of the concepts you mention being taught in typical 100-200 level courses that have emphasis on software programming or computer science.
The BASIC language was great and easy; I loved it. However, like you, I wanted to make games. It wasn’t like you had to learn 6502 just to get a mere 50% performance increase.
Doing anything with pixels was a couple of orders of magnitude slower, so dropping to assembly wasn’t really optimizing; it was just required.
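For a rough sense of the gap, here’s a sketch (from memory, untested) of clearing the C64’s screen RAM at $0400 in assembly; the equivalent BASIC FOR/POKE loop over the same 1000 bytes takes several seconds, while this finishes in a fraction of a frame:

        LDA #$20        ; screen code for a space
        LDX #$00
    FILL:
        STA $0400,X     ; one write into each of the four screen pages
        STA $0500,X
        STA $0600,X
        STA $0700,X
        INX
        BNE FILL        ; 256 passes cover the whole 1 KB

(Strictly, that clobbers the 24 bytes past the visible 1000, including the sprite pointers, so a careful version would stop short; it still makes the point.)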
The code for this game is really readable and easy to pick up after all these years of not doing any 6502 coding. Any 6502 wasm targets?
Shameless plug: a little while ago I started writing easy-to-integrate 8-bit chip emulators as standalone C headers:
For instance used in these WASM emulators:
(And see https://news.ycombinator.com/item?id=18379387 to automate the whole process a bit.)
For example, it doesn't count document-type files by default; you need to modify `.gitattributes` files to override that.
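Assuming “it” here is GitHub’s Linguist (which generates the repo language stats), an override might look something like this in `.gitattributes` (the pattern is just an example):

    *.md -linguist-documentation

That un-marks Markdown files as documentation so their lines count toward the language breakdown.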
The Crazee Rider software seems to lack a license entirely, so no permission is granted to exercise the freedoms of free software.
Fort Apocalypse is also nonfree; it is distributed under a nonfree license -- http://creativecommons.org/licenses/by-nc-nd/2.5/ -- that even the license's author recommends against using for software, per https://creativecommons.org/faq/#can-i-apply-a-creative-comm...
Are there any collections of source code for commercial games for the C64?
I started programming with assembly, and even though I rarely use it today, any excuse I can find, I’ll dig into it.
No question, modern languages are more efficient, but knowing exactly what instructions are being run and how they interact with the hardware (a physical object) is so much more aesthetically pleasing.
Makes me happy to hear folks are still hacking bits in 2018.