Hacker News | ericssmith's comments

How to Design Programs 2E


This book is much more than the intro to programming that it appears to be. It is a foundational approach for producing robust programs, regardless of your implementation language or level of experience.

Unfortunately, it looks like the authors never completed the next book in the series, on real-time programming: How to Design Worlds. http://world.cs.brown.edu/1/

Thanks for pointing this out. I was interested to see they credit Paul Hudak's "Haskell School of Expression" for inspiration. I will have to take a fresh look at that given my newfound appreciation of HTDP (due to stumbling across Norman Ramsey's assessment of it). Also need to look deeper into Hudak's FRP now that I think about it.

"I loved the hieroglyph page; I think that functional programming failed for years as an engineering discipline because of the culture of that style of code."

I don't quite follow your observation, but if you are suggesting that this function would be better with verbose, Englishy naming, I think you are mistaken. Quite the opposite: the parametricity is better illustrated with hieroglyphs.

Regarding failing "for years as an engineering discipline", are you suggesting that imperative programming was "successful" because of the reliance on using naming to communicate intent in the midst of rampant state mutation and unbridled complexity?

I am 53 and started programming in 1977. I feel EXACTLY the same as you.


Me 3! 56, first prog job was in '87. Now I'm steadily employed in a dream job, working from home, writing mobile apps and exploring new "IoT" technologies. If this job ends, which all jobs do sooner or later, I'm hoping to start my own company, finally, and ride that into retirement. But life continues to be an adventure and you never know what lies around the next bend in the road!!! Best of luck to you and to people of all ages in this amazing field.


Let's make it an even 4!

Just turned 50 yesterday. Wrote my first contract program in 1982 when I was 17.

Today I teach teams how to rock-and-roll. I also keep coding, but mostly as a hobby. It has nothing to do with programming -- I simply have too many irons in the fire to do all of the things I love as much as I want to.


I guess I might as well pile on. Turned 50 last year. Got my first paycheck for writing code junior year of high school (writing BASIC on a Commodore PET!). Working on my fifth startup now.


I can't imagine all the technologies and skills you guys have acquired.

If I may derail the conversation a little bit, may I ask if/how you use the things you've learned today?

I mean, is it things like paradigms that have stuck with you, like OOP/functional/whatever? Or do you always structure your exception handling in a certain way, no matter the language? Are there skills/technologies you've been using since you started? For example, little BASIC scripts that you've never let go of that automate things like setting up build servers and the like?


SCHiM: "is it things like paradigms that have stuck with you..."

For me it's always been about problem solving rather than simple fascination with some particular technology. If I can solve a problem with a keyboard macro in Emacs, great. If it requires Perl or Java or JavaScript, so be it.

I try to use the tool that's appropriate. If I can solve a problem quickly and move on to something more interesting, one-and-done.

If I have to stop and learn something new (e.g., WKWebView in iOS ObjC) to get my task done, so be it, and I'll put in some late nights to get there. At my age, I'm a bit paranoid about looking bad, so I try to give the good folks at HQ no reason to doubt me. I spend a lot of time on Stack Overflow and YouTube doing concentrated learning.

But the real thing is what others have also mentioned: properly defining the problem that needs to be solved, communicating well, keeping good records, and maintaining transparency, honesty, and pleasant comportment at all times.

Honestly, the older I get, the less I notice people's age. If someone half my age knows something I don't, then the way I see it, they have something to teach me. I've been to many conferences and watched many YouTube tutorials where the teacher was very young (from my wizened perspective), but the information is why I'm there and that's all I care about.

What do I think about young people? (you didn't ask but I'm saying it anyway)

I love young people. They have so much spirit, so much energy and creativity. I keep hearing critical (snarky) things about millennials this, X-gen that. But I don't see it. The young people I've been around (for a while I was back in school full time, surrounded by 20-somethings and a few 30-somethings) were a joy. Fun, humorous, inquisitive.

Everyone has his faults, not least myself, and I believe as we get older we become more tolerant of others' faults and shortcomings. In fact that may be the single hallmark of growing older (apart from physical issues).


No more BASIC, but I still code in Common Lisp whenever I can. And I'm using some library code that I wrote when I was in grad school 25 years ago.

Once you grok Lisp, everything else is easy. You come to realize that the vast majority of what passes for "new technologies" is really just a re-discovery of something that exists (or is easily implemented) in CL. That makes it a lot easier (if a tad frustrating at times) to keep up.

EDIT: CLOS, and generic functions in particular, are a HUGE lever that no other language has co-opted yet.
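The "huge lever" here is that a CLOS generic function dispatches on the types of all of its arguments, not just the receiver. The idea can be sketched in a few lines of Python; this is a toy registry, not CLOS and not any real library's API:

```python
class Generic:
    """A generic function: the method is chosen by the types of ALL arguments,
    not just the first one (as in single-dispatch OO languages)."""

    def __init__(self, name):
        self.name = name
        self.methods = {}  # maps a tuple of argument types to an implementation

    def method(self, *types):
        """Decorator that registers an implementation for a type signature."""
        def register(fn):
            self.methods[types] = fn
            return fn
        return register

    def __call__(self, *args):
        key = tuple(type(a) for a in args)
        fn = self.methods.get(key)
        if fn is None:
            raise TypeError(f"no applicable method for {self.name}{key}")
        return fn(*args)


collide = Generic("collide")

@collide.method(int, str)
def _(n, s):
    return f"int/str: {n} {s}"

@collide.method(str, int)
def _(s, n):
    return f"str/int: {s} {n}"

# Dispatch considers both argument types, so these pick different methods:
first = collide(1, "a")
second = collide("a", 1)
```

A real CLOS generic function also handles subtype relationships and method combination (`:before`/`:after`/`:around`), which this sketch deliberately omits.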


Ha! I'm still using a piece of Lisp library code that I wrote as an undergrad 35 years ago :-)


You come to realize that the vast majority of what passes for "new technologies" is really just a re-discovery of something that exists (or is easily implemented) in CL

So much this. (From a soon-to-be geezer who's staring down 40.)


> no other language has co-opted yet.

Check out Dylan, Julia, and my old hobby language Magpie[1].

[1]: http://magpie-lang.org/multimethods.html


I guess I should have said: no other mainstream language.


Julia has generic functions, but they are not defined in Julia itself, nor can the dispatch mechanism be extended from Julia itself. CLOS is all about meta-circular semantics, so I wouldn't include Julia there (at least not fully).


Same with conditions and restarts.


GOOPS[0] in guile is definitely hugely inspired by CLOS, which also has generic functions with multi-method dispatch.

I highly recommend Guile to anyone who wants the batteries-included nature of Common Lisp. I know that Racket is also a great choice here; I just have little experience with it.


Racket's object system is largely inspired by Java. So is its statically compiled module system (in which you can't change anything once it's been compiled, unlike in a real Lisp), and its Raco build tool. Programming with objects in Racket feels a lot like programming with Java objects in Clojure.


I'm not quite in the same age range, but north of 40, and have been doing this for > 20 years.

The tech changes - I don't use BASIC or Z80 machine code day to day. The skills that I can speak to, and the skills I see from others with this level of experience, are problem solving and communication, and secondarily the confidence that comes from having made mistakes. Few projects fail miserably solely due to technical issues and skill; in my experience failures come about because of poor communication (including documentation, but also speaking, reading body language, office politics, etc).


We younger folks are incredibly curious about what you've learned. Can you elaborate on the topics you mentioned or recommend other sources? Can you give tips on best practices?



Not sure I have any specific sources to share (others might), but after working with all types of hardware, languages, business realms and what not... you start to see patterns. Human behavior patterns, mostly; the same sorts of needs were there 30 years ago as are here today. People need stuff done yesterday, don't know how to describe what they want, oversold a client on something, etc. The numbers may be bigger, the gigahertz faster, and the memory far greater than people could have imagined, but the problem of translating what people say into what they want via code is ... fundamentally still the same.

A 16k RAM module was ... $200 in my early days. Then a few years later I got 512k for only $150. Today I can get a 16-gig USB stick for $10, at a local grocery store checkout aisle no less.

No doubt technology has changed and become more ubiquitous. Peoples' expectations of what's possible and what's normal are somewhat different today than 10-20-30 years ago, but the communication problems are still largely the same. Who's in charge of a project? What are the parameters? What defines "success"?

I'm not sure there's any real silver bullet here - most of the "mythical man month" stuff (from even before my time!) is still largely true (speaking in generalities because I can't think of every single line right now).

Learn to figure out how to communicate better - emails, phone, IM, paper, face-to-face, etc. - they all require different skills and have different impacts on different teams.

Happy to answer more specific questions here or via email - mgkimsal@gmail.com - not sure if I'm just rambling now or not.


Back in the early 90s, I was doing mac68k programming. Very large Inside Mac volumes were pretty much the only documentation for anything - no stack overflow, no internet, no searchable docs. Almost everything was closed source, and many people wrote their own data structures. When programming Macs, if there was any kind of null value dereference or pointer problem the computer usually crashed and you had to reboot.

The main thing I learned was to be able to look at code and determine if it was going to actually do what I wanted it to do without having to run it through the debugger a couple times.

It's important to spend time thinking about and reviewing code when edit, compile, and test cycles start to get long. There are a lot of instances where that still applies, such as with multi-threaded programming, integration testing or long running jobs.

I also learned the value of having muscle memory for APIs. Sure you can look things up in a jiffy, but actually rote memorizing stuff that is used often can speed development up.


Interesting, the pattern in this thread - I'm 57 and started in 1973!

There are plenty of languages I've used over the years that I have no use for today, or hardly even remember: Basic, FORTRAN, assembler, Pascal. None of those languages even had exception handling, so it's hard to say they influenced my thinking on the subject, except by forcing me to be familiar with the alternatives.

Today it's C++ and C# that pay the bills, with as much Python as I can sneak in. Next up in the queue is JavaScript, for which I'm admittedly overdue, but that one's entirely on my own time.


Just curious: as you moved from language to language, do you think there were any patterns to the timing of your moves? Did you jump into new languages as they emerged, or wait for them to reach stable mindshare among your peers? Or perhaps wait until you needed to pick up a new language in order to pay the bills?


Basic was what I started with, like a lot of people from that era and after, just because it was simple and available. FORTRAN I tried on a lark, then got serious with it when I moved to the University mainframe. That's where I picked up the first of many assembly languages, after finding out it was the only way to take full advantage of the OS and achieve the best efficiencies. It's also where I learned Pascal, which was an up-and-coming teaching language, and it soon became my go-to language for many tasks. Between Pascal and assembly I was good through about 1990, when my job required me to use a Unix server, and of course the natural language there was C. That led to C++, first on my own time and then at my next job in 1997. There was a short period where I found myself applying OO concepts to assembly code! C# was a recent addition for a job-related task where it was an obviously better choice than C++.

Python is the interesting one of the bunch. I was exposed to it about 1995, when a coworker selected it as an embedded scripting language for our product. I didn't pay much attention to it at the time, as it was outside of my immediate responsibilities. It was selected again as an embedded scripting language at my next employer around 2003. This time I paid attention, and came to love it. It's the language that lets me turn my thoughts into results the fastest.

Thanks for asking the question, I've never stopped to think about my programming history in this much detail.


Thank you for a thoughtful, detailed response. It seems like a mix of job exposure and natural curiosity led you from language to language.

Throughout this thread, I've noted a distinct lack of dogma about everyone's evolution as a programmer. While I imagine there was some of it during the popular phase of many of these languages, it's nice to see mostly pragmatism in everyone's career journey.


I think the people who stick to dogma are the ones who don't make long multi-decade careers out of programming.


Here is the first program I ever entered into a computer.

1 8 + A 5 + 0 0 + 6 5 + 0 1 + 8 5 + F A + A 9 + 0 0 + 8 5 + F B + 4 C + 4 F + 1 C

I remember how much like some kind of incantation it was. I still get that feeling. And I've had the good fortune to work some fun, powerful, and interesting magic with those incantations.

I suspect that most younger people haven't yet seen their tech choices slip into decay and disuse. And they hang on tightly, hoping to expend less effort in learning as time goes by and they master that tech. It's a vain hope. The essential part of the experience is the underlying creativity, the joy of getting to the next ledge, and really understanding what a marvelous thing a computer is.


Apparently this is KIM-1 code (https://en.wikipedia.org/wiki/KIM-1).


I feel the same as you guys. I have been very excited about technology since I started. There are so many new and interesting things to play with and read about that I have no time to do it all.

I can clearly see that each year more and more interesting things appear. The pace is also increasing, to the point where I can't digest it all. It's difficult to choose; prioritizing is the key to dealing with it.

I always find myself wondering how many new things I will see in the coming years.

I'm 30 now, and I started at 17.


I'll be 52 this November. Have been coding mostly as a hobby since '83, with peaks in programming intensity at different stages of my life: sometimes to scratch an itch and sometimes doing some contract/consulting work. Still try to hit the computer whenever I can, but my health's been letting me down lately - nothing too bad, but it's been affecting the focused time I need to do programming right.

So now when I sometimes think of my mortality, the thing I regret is the programming I can't do when I'm gone. Or maybe there is programming after death? I'm hopeful. :)

In the meanwhile, the free time that's opened up from not programming as much is being wisely invested. I'm working hard on my guitar, and before I go dammit, I'm going to shred like a pro!


Happy birthday! :)


57 here, programming since 1972. Yes, it is an incredibly exciting time. It's particularly great to see AI taking off after all these years.


Almost 51 and I'm only a couple years away from my 40th programming anniversary. I can't imagine doing anything else ... I get paid but it's not work for me!


I'm 30, started 1.5 years ago, and feel the same :)


You aren't dismayed by the direction some things have gone? Are you using Java by any chance?

Edit: seriously, what is your environment like? Are you into Windows, OS X, Unix, Linux?


"If you haven't watched a 12 year old play Minecraft, you need to do so. It's the ultimate fantasy universe, and the way you combine elements to make something new is very analogous to programming in the vanilla game."

Watching my 5 yo work in Minecraft reminds me of the programmer "crunching entities" in the movie "The Zero Theorem".


I have used Scala for building business solutions since 2012. I had been maintaining PHP and Rails systems of 50k to 150k LOC prior to the switch. I still have to maintain PHP and Rails systems. The move to Scala was driven by the need for better performance, system resiliency, and to mitigate the effect of multiple developers coming in & out of a codebase. It turns out to be ideal for this.

Idiomatic Scala is not a grab bag of features or paradigms. It is firmly based on programming with expressions and using the type system to improve abstraction. Statements and mutable data structures show up when interfacing with external systems or for performance reasons. The Collections libraries are arguably "object-oriented" as that term is commonly understood, but this is not typical for application code. However, object-oriented features such as objects, classes, and traits are the basis of Scala's module capabilities.

For programmers coming from 20th century mainstream languages that are all based on programming with statements, mutable variables, and are arguably a thin abstraction over the machine itself, programming with Scala can be a challenge. If you want to do your familiar imperative style of programming, Scala is going to be painful. Use another language. But if you would benefit from a type system and good support for expressions, then Scala is possibly the best choice currently for growing business applications. We are not in 2005 any more.
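The statement-vs-expression contrast the parent describes can be illustrated with a toy example (written in Python rather than Scala, purely so the sketch stays self-contained; the distinction is the same):

```python
# Statement style: mutate an accumulator step by step.
def total_imperative(xs):
    acc = 0
    for x in xs:
        if x > 0:
            acc += x
    return acc

# Expression style: the whole body is a single expression built from
# sub-expressions, with no mutable variables to reason about.
def total_expression(xs):
    return sum(x for x in xs if x > 0)

# Both compute the sum of the positive elements.
a = total_imperative([1, -2, 3])
b = total_expression([1, -2, 3])
```

In Scala the expression style goes much further, since `if`, `match`, and blocks are themselves expressions with values.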

Scala is not ideal as an introduction to functional programming. Other languages, such as F#, SML, OCaml, and Haskell have better, simpler syntactic support for features associated with functional programming.

People who dislike Scala are probably either stuck in the past or don't have the pain that Scala would solve.


By idiomatic Scala, perhaps he meant the expressive use of language in contrast to writing an application in Scala using the last century's programming habits.


There is such a thing as idiomatic Scala? How? Care to elaborate?


It's the opposite of "Java without semicolons".


What timing. I'm currently reading "The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal" by M. Mitchell Waldrop. And when I got to the 1968 Fall Joint Computer Conference this weekend, I went and watched Engelbart's presentation. The book gives great context for the demo.

As a side note, I know quite a bit about the early history of computing, and I was amazed how carefully Waldrop navigates these murky historical waters. Highly recommended.


If you haven't, it's worth picking up "What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry" by Markoff and "Hackers: Heroes of the Computer Revolution" by Stephen Levy as well. The three combined come at roughly the same material from sufficiently different angles to complement each other quite nicely without getting too repetitive.


"A software monoculture at least gives reviewers a fighting chance of spotting issues with software that might affect the validity of results."

If I understand it correctly, this observation shows just what a short distance we've come in programming languages and software development since the 1960s. Verifying the correctness of a program shouldn't depend on good naming or being able to execute it in your head.


Why is he interested in programming? It must seem cool to him for some reason. I'd try to tap into that in order to find a direction to go in. And for a 14 year old, it would be helpful to find some heroes. Without the cool and the heroes, it may be hard to devote the time for it to be impactful.


Yes. Continuing beyond monads to learn about category theory is mostly because it is interesting. It is definitely not a prerequisite.

The historical relationship is that Eugenio Moggi wrote a paper in 1988, "Computational lambda-calculus and monads" http://bit.ly/1x9uUHQ, that used ideas from category theory to handle the semantics of programming language features. His problem was that the lambda calculus itself failed to adequately handle certain "notions of computation" such as side effects and non-determinism. This inadequacy goes way back to the origins of programming language semantics, starting with Peter Landin's 1964 paper "The mechanical evaluation of expressions" http://bit.ly/1rrBit3. He described a virtual machine (SECD) for evaluating lambda expressions, but had to add special constructs for handling assignment and jumps. In case you don't know what semantics is about, imagine that you had two small programs written in two different languages. How would you determine if they were the same? One way would be to demonstrate their equivalence in a common, mathematically sound language. That's what Landin used the lambda calculus for. But it's only good for a subset of what we understand computation to be (particularly, what a von Neumann computer is capable of doing).

That's what prompted Moggi to look to a more encompassing branch of mathematics -- category theory -- to describe semantics. In 1991, he wrote "Notions of computation and monads" http://bit.ly/1viXT6z, which is much shorter and more accessible, but essentially the same as the previous paper. Philip Wadler, one of the original authors of Haskell and long-time functional programming researcher and educator, was inspired by Moggi's work to write (several times) a tutorial showing how Moggi's "monads" could be applied to the restrictions of functional purity. For example, to simulate the incrementing of a global variable representing a counter (of some operation), one could use a "state" monad.

The monad of Wadler (and Moggi) is really pretty simple. First, you need a way of representing one of these notions of computation. In a programming language, this is done with types. Or to be more precise, a 'type constructor' since you will want the computation to be general. For instance, if your computation changes state (eg the counter above), you might want to construct a function type that given a value, takes a state (eg integer representing the counter variable) and returns a tuple consisting of the value and some new state. I should point out that my last sentence is what makes monads so confusing and challenging in languages other than Haskell. They don't have a nice way to represent what I just said. To use Wadler's notation:

type T x = S -> (x, S)

The second component of a monad is a function that takes a value and returns a "computation" (ie, the type described above). And the third component is a function that takes one of the computations and a function from a value to a computation and returns a computation. That's a mouthful, and it requires the language to support polymorphic types and higher-order functions. If your language does not or doesn't provide good syntax for them, then monads will be an elusive and difficult concept. But as you can see, a monad is just a pattern made up of these three things. That's all ... almost.

Because of the relationship to category theory, a monad (consisting of the three components) must also obey certain rules for how these operations are expected to behave when combined in certain ways.
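Putting the three components together, here is a minimal sketch of the state monad described above, in Python (the names `unit`, `bind`, and `tick` are illustrative, not from any library; Haskell's do-notation is the "good syntactical support" that makes this pleasant in practice):

```python
# A "computation" of type  T x = S -> (x, S)  is a function from a
# state to a (value, new state) pair.

def unit(x):
    """Second component: wrap a value in a computation that leaves the state untouched."""
    return lambda s: (x, s)

def bind(m, f):
    """Third component: run m, feed its value to f, and thread the state through."""
    def computation(s):
        x, s2 = m(s)
        return f(x)(s2)
    return computation

def tick():
    """A computation that returns the current counter and increments the state."""
    return lambda s: (s, s + 1)

# Chain two ticks, simulating "incrementing a global counter" purely:
prog = bind(tick(), lambda a:
       bind(tick(), lambda b:
       unit((a, b))))

value, final_state = prog(0)
print(value, final_state)  # (0, 1) 2

# One of the monad laws (left identity), checked extensionally at state 0:
f = lambda x: unit(x * 2)
assert bind(unit(5), f)(0) == f(5)(0)
```

The first component, the type constructor `T`, is implicit here: Python has no type-level machinery, which is exactly the kind of missing support that makes monads feel elusive outside of languages like Haskell.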

In fact, monads are pretty cumbersome to implement, especially when there isn't good syntactical support in a language. But they provide a general solution to certain kinds of problems which is functional in nature.


The following two books are a fascinating look at Heaviside's contributions and extraordinary life:




