Programming Quotes (cat-v.org)
265 points by chauhankiran 11 days ago | 117 comments

Missing the Ninety-ninety rule?

"The first 90 percent of the code accounts for the first 90 percent of the development time. The remaining 10 percent of the code accounts for the other 90 percent of the development time."

 Tom Cargill, Bell Labs


NB There should probably be a 90-90-90-90-<recurring> rule... it could be called the Y-90 rule!

Probably because the "and the last 10 is the other 90" quote has existed longer than programming has. It's borrowed.

"I should have run off with that hippie girl and started a homestead in Alaska."

Me, every fucking time I look at code.

In "The Soul of a New Machine," there is an engineer who spends months debugging nanosecond-level glitches in their new CPU, snaps, and runs away after leaving a note: "I am going to a commune in Vermont and will deal with no unit of time shorter than a season."

There are many days where this sounds like the next logical step in my life.

It is funny though. Look at all the physicists and mathematicians in that quote list. Positive, enlightened, a little snark maybe, but generally, moving things forward for the betterment of humanity.

All the programmers, "embrace the suck and the light at the end of the tunnel is an oncoming train".

I picked the wrong career.

As a software engineer: NO, you did not (unless you actually suck at it). Part of the problem with software development as a profession is that the people responsible for resource allocation are often out of touch, and the overall process is complex enough that you can acquire a lot of technical debt for short-term gain. This makes bad managers look good while creating a lot of problems in the long term. Since acquiring technical debt does not really break anything in the short term, it's really hard to create a budget line item that allocates resources to it. This continues until something gives, and then you have a crisis somebody can fix and take credit for.

Until there are leaders 'woke' enough to proactively do the right thing, I don't see anything changing in the computing industry.

Lol, dude, I'm 31. I've been doing this for around 10 years now. And I'm alright at it :P

Have you seen how happy homesteaders are?

On a serious note: I'm aware. That's why I started my own company with a friend. It cuts out a lot of bullshit I don't have to deal with anymore. As (essentially) a contractor, it's easy to tear people a new asshole in a meeting and get away with it. I can't imagine an employee being able to.

First off, people need to stop worshiping the likes of Jobs, Musk and Bezos. They're just money makers, like Rockefeller, Morgan and Carnegie. None of them are visionaries; all their ideas are easy to spot in history or from other people. Like the hyperloop... only about 80 years too late to call that one unique. Ooooo, you put colored pieces of plastic on a computer and made a small digital music player. The Zune was WAY better! The other guy just sold books and moved on to other shit. Congrats on the better mousetrap. But it's not like they did it themselves.

Maybe a homestead on a tropical island... I like coconuts.

I think you're trivializing their accomplishments just a tad.

And then you meet network engineers and discover what true despair looks like.

I like this one:

"Debugging a program is twice as hard as writing it in the first place. So, by definition, if you write the program as cleverly as you can, you will not be able to debug it."

- (Maybe by) Brian Kernighan.

… and/or P J Plauger.

“Everyone knows that debugging is twice as hard as writing a program in the first place. So if you're as clever as you can be when you write it, how will you ever debug it?” in The Elements of Programming Style (p10 in the 2nd edition).


I've heard that's a good book, as is Software Tools by Kernighan and Plauger.

Missing what I think is the most important quote:

"Show me your flowcharts and conceal your tables, and I shall continue to be mystified. Show me your tables, and I won't usually need your flowcharts; they'll be obvious."

- Fred Brooks

My favourite as well!

> If you’re capable of understanding `finalised virtual hyperstationary factory class', remembering the Java class hierarchy, and all the details of the Java Media Framework, you are (a) a better man than i am (b) capable of filling your mind with large chunks of complexity, so concurrent programming should be simple by comparison. go for it.

> ps. i made up the hyperstationary, but then again, it’s probably a design pattern.

        — forsyth

Reminds me of a great many programmers in the banking world, especially those who used Spring. Their software often failed, but their knowledge of Java design patterns never did!

I grouped XML quotes to share with my manager:

The essence of XML is this: the problem it solves is not hard, and it does not solve the problem well.

        — Phil Wadler, POPL 2003
XML is like violence: if it doesn’t solve your problem, you aren’t using enough of it.

        — Heard from someone working at Microsoft
XML is like violence. Sure, it seems like a quick and easy solution at first, but then it spirals out of control into utter chaos.

        — Sarkos in reddit
Most xml i’ve seen makes me think i’m dyslexic. it also looks constipated, and two health problems in one standard is just too much.

        — Charles Forsyth
Nobody who uses XML knows what they are doing.

        — Chris Wenham

That page misses my favorite XML quote, which is also on cat-v at http://harmful.cat-v.org/software/xml/:

XML is a classic political compromise: it balances the needs of man and machine by being equally unreadable to both.

  - Matthew Might

What are some valid complaints about XML? I was talking to one of the older IT guys at my company a while ago, and he was all about it because it made serializing data structures very simple. I'm not sure if that's a valid use case or an example of the kind of monstrosity that XML-haters hate.

If all you need to do is serialize data structures, we solved that problem 30 years ago—including all the schema-validation and querying stuff—with ASN.1 (and then re-solved it with Protobufs and Thrift, and solved it again in half-assed manners with JSON, YAML, etc.)

XML is a markup format, a regularization/minimization of the syntax of SGML. It's good at being a markup format—the paired named open+close tags allow for corrupted-stream repair in a way that e.g. Markdown just doesn't. XML is great in, say, DocBook.

But XML gets used for pretty much everything except actual markup. And for everything else, it's not solving those problems well.
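For the plain "serialize a data structure" use case, the later formats really are a one-liner; here's a minimal Python sketch (the record's field names and values are made up for illustration):

```python
# Round-tripping a plain data structure through JSON with the
# standard library -- no schema, no markup, no escaping rules to learn.
import json

record = {"id": 7, "tags": ["config", "demo"], "active": True}

wire = json.dumps(record)          # serialize to a string
assert json.loads(wire) == record  # deserializes back unchanged
print(wire)
```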

It's very complicated, which leads to severe security problems; you basically should avoid any untrusted, client-supplied XML if you can at all. It is a step backward from s-expressions, being harder to parse for both machines and humans whilst being less expressive. Whitespace handling is comically bad. It doesn't have a working comment syntax. The list goes on. The only excusably bad thing about XML is namespaces. The design is not a success, but it springs from a list of desiderata which look quite sensible at first sight.

XML conflates a bunch of stuff (XSLT, XPath, Schemas, namespaces, etc.), making it overly complicated.

Then it has three different kinds of data: elements, attributes, and content. In other words, <a c="foo">bla<b/>bla</a> is a mess to deal with. Now replace the "bla"s with spaces and suppose <a> is only supposed to have <b> as a child: really it will have three child nodes, the text before <b>, <b> itself, and the text after.

I think this is why JSON won over XML. It seems like all those extra features of XML would be a benefit, but in reality they just give you more places to hang yourself.
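The mixed-content mess described above is easy to demonstrate with Python's standard-library parser (the element and attribute names are just the ones from the comment):

```python
# How <a c="foo">bla<b/>bla</a> decomposes: attributes, child
# elements, and text all live in different places.
import xml.etree.ElementTree as ET

root = ET.fromstring('<a c="foo">bla<b/>bla</a>')

print(root.attrib)   # {'c': 'foo'} -- attribute data
print(len(root))     # 1            -- only <b> counts as a child element
print(root.text)     # bla          -- text before <b> hangs off <a>
print(root[0].tail)  # bla          -- text after <b> hangs off <b> itself
```

A DOM parser would instead report three child nodes of <a> (text, <b>, text), which is exactly the ambiguity the comment is pointing at.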

It does not actually solve a problem. OK, I cannot disagree with derefr's comment here regarding text markup. When it comes to serialization... it does not define how to serialize data items. But it has two different mechanisms for grouping them: attributes and nesting. Nesting is usually wrong (one reason being that it only works for 1:n relations, so you need named cross references (a.k.a. foreign keys, in a saner world) anyway). And there are multiple obligatory escaping schemes (quoted attributes, body text, and maybe we could also count tag names and attribute keys).

The hierarchy stuff is so complicated and wrong that I've personally never witnessed a colleague actually using a data schema. Which means data does not actually get validated.

On top of that it is near unreadable. Compare to something simple like http://jstimpfle.de/projects/wsl/world_x.wsl.txt

The whitespace situation is really bad and also means there is no canonical form at all.

I have to second this; I do not understand why XML receives so much hate. Sure, it is verbose, and editing it without, at the very least, good syntax highlighting can be a bit of a pain. And namespaces. But just use a good XML editor and the pain is gone. And in return you get a mature and powerful ecosystem where communicating data formats and validating data with XSDs, or transforming data with XSLTs, is done easily. And you have to write almost no code if your language has a decent XML library. Serialization and deserialization are usually easy, and there are tools for generating matching classes for your schemas. I am not really following the development of other serialization formats like JSON, but as far as I can tell, the JSON ecosystem is essentially coming up with analogous ideas like JSON Schema to solve the same problems that XML has already solved.

I find XML good for configuring pipelines since there's a clear "parent" node. Most other things I use JSON for. In other words, XML is good when a human is going to be editing it, but the ROI isn't enough to make a GUI for.

I don't think it's true that XML - in itself - makes serializing data structures very simple. An easy to use XML library might. But that library could use any structured data file format at all, and still make serializing data structures very simple.

Years ago there was a list of top 10 lists I had printed out, but I haven't been able to find them. One of the lists was "Top 10 signs you're a Microsoft programmer" and #1 or 2 was something along the lines of: "You think human teleportation will eventually be possible, and XML will be the transport."

“Walking on water and developing software from a specification are easy if both are frozen.”

Here's one I've seen around:

"Often a few hours of trial and error will save you minutes of looking through manuals."

Seems to be a corollary of Frank Westheimer's "A couple of months in the laboratory can frequently save a couple of hours in the library".

Hrm, my experience is quite the opposite. I can read a 2000-page manual, which will take weeks because there's no way I can concentrate on all of it, and where I probably still won't find the answer to my question (which is an edge case), OR I can just try the edge case and see what happens.

If I knew exactly where to look in the manual then the quote might fit but it rarely does.

Love it! Do you happen to know who said it?


Part of why I like Golang, as I interpret beauty as readability.

"Beauty is more important in computing than anywhere else in technology because software is so complicated. Beauty is the ultimate defense against complexity."

        — David Gelernter

One of my favorites:

"If you know what’s in it, you don’t know when it will ship. If you know when it will ship, you don’t know what’s in it."

There's nothing more permanent than a temporary fix.

Basically all the really useful code I've written

If you'd like to be drip fed stuff like this, we've been running a programming quotes Twitter account for several years at https://twitter.com/codewisdom

LOL, 40 years later and some things never change:

“The most effective debugging tool is still careful thought, coupled with judiciously placed print statements.”

- Brian W. Kernighan, in the paper Unix for Beginners (1979)

This is fantastic. Whenever I'm at work, I always look at deltas on commits to judge personal success. The more lines removed, the better!

> Deleted code is debugged code.

        — Jeff Sickel

> The more lines removed, the better

I programmed Perl for a while so I like to make this more about _characters_ removed rather than lines :)

Some people, when confronted with a problem, think “I know, I'll use regular expressions.” Now they have two problems.

Everyone post your favorites.

> What's wrong with perl?

It combines all the worst aspects of C and Lisp: a billion different sublanguages in one monolithic executable. It combines the power of C with the readability of PostScript.

> To me perl is the triumph of utilitarianism.

So are cockroaches. So is `sendmail'.

        — jwz [http://groups.google.com/groups?selm=33F4D777.7BF84EA3%40netscape.com]

 ~ and ~
If the designers of X Windows built cars, there would be no fewer than five steering wheels hidden about the cockpit, none of which followed the same principles – but you’d be able to shift gears with your car stereo. Useful feature that.

> but you’d be able to shift gears with your car stereo.

Many modern cars get pretty close.

"If debugging is the art of finding bugs, programming must be the art of creating them."

"Given enough thrust pigs will fly" -- I will use this for my next code review. Gold.

Missing the infamous "It compiles! Ship it!" quote.

For something like rust or haskell it's not outrageous. But yeah...

It's not outrageous to hear from a Rust or Haskell fanatic, in any case.

Given that a subtraction will easily compile where an addition should have been coded, the content of the remark is outrageous.

I'm sure if you _just_ use the right experimental language extensions and strangle your program under a mountain of indecipherable nonsense, performing arithmetic at the type-level, you can trivially prevent that though!

This is the greatest sarcastic comment I've seen all year. You nailed the Haskell zealot to a tee. Meanwhile, the Forth zealot is getting IBS just hearing about the code complexity of Haskell and the myriad language extensions meant to bring it closer to actual mathematics.

Don't get me wrong, Haskell is really, really cool and some brilliant coders use it. One day I'll actually make it through the Haskell book.

> arithmetic at the type-level

See, the whole type-level and value-level is a false dichotomy, that's the whole idea behind dependent types. There shouldn't be a difference between a type and a value.

A (data) type is a set, and a value is an element in that set. So you're arguing that set versus element is a false dichotomy, which is preposterous. Even a set which contains just one element is still distinct from that element itself.

Rather, the whole idea behind dependent types (if dependent types can be glibly summarized by a "whole idea") is that logical predicates give rise to sets. E.g. "all integers that are squares" or "prime numbers". Or all points <x, y> such that x^2 + y^2 <= r^2.

A proposition is not the same thing as the domain of that proposition's parameters; but it does denote a subset of that domain.

A dependent type can contain a large number of values and that will often be the case.

If we have a situation in the program where we must calculate a value that is a member of some dependent type containing billions of values, and only one value is correct, there is the threat that we calculate a wrong value which is still in that set. The dependent type will not save our butt in that case.

We have to isolate the interesting value to a dependent type which only contains that value. "correct value" is synonymous with "element of that set which contains just the correct value".

Still, that doesn't mean that the set is the same thing as that value.

A big problem with dependent types is that if the propositions on values which define types can make use of the full programming language, then the dependent type system is Turing complete. This means that it has its own run-time; the idea that the program can always be type checked at compile time in a fixed amount of time has gone out the window. Not to mention that the program can encode a self reference like "My type structure is not correct". If that compiles, then it must be true, but then the program should not have compiled ...

> A (data) type is a set, and a value is an element in that set.

Nope, a type is not a set. Type theories are separate formal systems designed to be (among other things) computationally aware alternatives to set theory, and they are defined by "terms", "types" and "rewrite rules". The whole point of a dependent type theory is to be able to encode things like sets and propositions in the same formal system.

The rest of your comment is wildly off base due to this misunderstanding.

What the GP was probably referring to is the perspective on dependent types from the lambda cube, where you have three different extensions to the simply typed lambda calculus's regular functions from value to value:

    1. Parametric polymorphism, which is functions from type to value
    2. Type operators, which are functions from type to type
    3. Dependent types, which are functions from *value to type*
Since dependent types allow functions from value to type, it kinda erases the artificial barrier between type level and value level. In reality it doesn't really erase it so much as replace it with an infinite hierarchy of levels to avoid Girard's paradox.
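The "function from value to type" case can be sketched concretely; here is a minimal, illustration-only Lean 4 version of the classic length-indexed vector, where the type `Vec α n` depends on the value `n`:

```lean
-- A type that depends on a value: `Vec α n` is "lists of length n".
inductive Vec (α : Type) : Nat → Type
  | nil  : Vec α 0
  | cons : α → Vec α n → Vec α (n + 1)

-- `head` only accepts vectors of length n + 1, so the empty case
-- is ruled out by the type checker rather than a runtime check.
def Vec.head : Vec α (n + 1) → α
  | cons x _ => x
```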

The fact that they can be conflated doesn't mean that they should be.

No, they really should be since (1) it gives more guarantees about your program in compile-time and (2) lets you express more assumptions about your program to your compiler (3) for free.

Perhaps you should instead produce a counter-example, because as far as I can tell nothing you've said is actually true.

Their (3) is the only debatable point (aside from the inherent subjectivity of statements involving the word "should", which applies equally to you), since dependently typed languages tend to put a lot of the burden of proof onto the programmer. The other points are practically true by definition.

If types and values were identical, there would be no need for a runtime in any program, and that is simply not possible. There are many things which can only be modeled as values and not as types. (Randomness, for instance.) Dependently typed languages allow a little bit of value-level feedback into the type system, but they certainly do not eliminate the distinction.

And certainly none of this is free. Indirection has costs, and trying to model the entire behavior of a program in a compile-time system is going have costs. Some of those costs will push solvable problems into an unsolvable state, and then things break down. (For another example of this: software can try to emulate hardware, but there are things that hardware can do that software cannot and vice versa, so it wouldn't make sense to say they are identical.)

>The other points are practically true by definition.

"More guarantees" and "more assumptions" are not definable terms. You can't count the number of theorems that need to be true for some model to be accurate.

You seem to be operating from an overly narrow perspective. Type theories were invented for mathematical logic and only co-opted for programming later. It doesn't make sense to try to prove things about type systems by referring to a runtime. The rest of your objections show similar misunderstandings of the question. While I said myself that a dependent type system is not free, its costs have nothing to do with "indirection", whether you mean type system complication or pointer chasing (the latter is more likely to be reduced). Randomness is about computational purity, which is nearly orthogonal to the expressiveness of your type system. I'm not completely sure what you're trying to say about hardware, but I'm pretty sure it's a red herring, at least from the theoretical perspective relevant to characterizing dependent types.

The Wikipedia article on dependent types is a pretty decent survey: https://en.wikipedia.org/wiki/Dependent_type

I don't know what you're trying to say. The point I am trying to refute is "the whole type-level and value-level is a false dichotomy", and the existence of dependent types do not support that claim, they only provide a narrow domain where the distinction is blurred.

> Randomness is about computational purity, which is nearly orthogonal to the expressiveness of your type system.

If types and values are to be considered identical, then random types should be just as useful as random values. They are not. That alone should indicate that values and types have a well-motivated distinction.

Additionally, dependent types offer no additional capabilities, only convenience. Having a type defined in terms of a value communicates nothing more to a compiler than would a functionally equivalent assertion, though it would certainly be easier to use and work with in some situations.

I'm not trying to say anything about dependent types here. I'm trying to say that the goal of producing a language which makes no distinction between type and value is neither useful nor possible.

To be quite blunt, what I'm saying is that you don't have enough context to have an informed opinion on the relationship between dependent types and the distinction between types and values. It's pretty clear you don't have a clear idea of what a dependent type system actually is. People who study type theory and logic are pretty much on the same page about this.

On the same page about what, exactly?

"Dependent types were created to deepen the connection between programming and logic. {clarification needed}"

"If it compiles it runs" does not mean that the result will be correct.

I don't think the person you're responding to is implying that.

Rust eliminates entire classes of bugs at compile time that plague other languages, usually out in production. "NoneType has no attribute 'enough'", "KeyError: this dictionary never has the key you want", etc. Multiple times per day I'll have a website fail to render simply b/c I blocked cookies (and they try to use localStorage, and the call fails, the failure isn't handled, and it bubbles all the way out, crashing the site).

Sure, unit tests could help here, but that's the thing about a typechecker: it's a unit test (of sorts) that you can't avoid (which prevents laziness), and I feel it is easier in the long run (writing the equivalent unit tests in Python requires more work). IME, laziness usually wins out, and sloppy code winds up in production.
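The Python failure modes mentioned above look like this; a small sketch (the key names are invented):

```python
# Nothing at "compile time" forces the missing-key case to be handled.
settings = {"theme": "dark"}

try:
    lang = settings["language"]   # raises KeyError only when this runs
except KeyError:
    lang = "en"                   # handled only because we remembered to

maybe = settings.get("language")  # absence becomes None, and using it
                                  # unchecked fails later with
                                  # "'NoneType' object has no attribute ..."
print(lang)
```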

> {ajh} I always viewed HURD development like the Special Olympics of free software.


Obligatory list of some good quotes that are not included:

Rules of Optimization: Rule 1: Don't do it. Rule 2 (for experts only): Don't do it yet. - M.A. Jackson

Never test for an error condition you don't know how to handle. - Steinbach

Choose mnemonic identifiers. If you can't remember what mnemonic means, you've got a problem. - perldoc perlstyle

Every program has (at least) two purposes: the one for which it was written, and another for which it wasn't. - Alan Perlis

Should array indices start at 0 or 1? My compromise of 0.5 was rejected without, I thought, proper consideration. - Stan Kelly-Bootle

"All code is technical debt; some code just has a higher interest rate." Paul McMahon

Stroustrup’s Rule: "For new features, people insist on LOUD explicit syntax. For established features, people want terse notation."

"Le meilleur moment du programmation, c'est quand on télécharge le compilateur." ("The best moment in programming is when you download the compiler.") Georges Clemenceau

The Clemenceau quote is not grammatically correct; it should be: "Le meilleur moment _de la_ programmation". Also, I've seen "développement" used more often than "programmation", even if both terms are valid.

Still where does it come from? The only George Clemenceau I've heard of died in 1929.

Thanks for the correction! I made it up as a joke. What he really said was: "Le meilleur moment de l'amour, c'est quand on monte l'escalier." ("The best moment of love is when you climb the stairs.")

Ok, I didn't know that quote

He was really looking forward to downloading the compiler.

>Rules of Optimization: Rule 1: Don't do it.

Well that explains the modern web, and a great deal of modern software.

There is a huge difference between not optimizing (say, not purging formatting whitespace from HTML) and putting out five megabytes of JavaScript junk for a static website.

Well, optimization could also mean to remove unnecessary JS.

I think removing unnecessary code and generalizing redundant parts should be a given. On the other hand, early optimizations such as caching or in-lining should be avoided unless needed.

I remember someone saying that simple code implementing well-selected algorithms beats optimized code implementing poor algorithms.

Although, I would have to agree with your statement.

Selecting algorithms well is not trivial. How many people think about the difference between a linear search in a small array versus just passing maps around, without calling it optimization?
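A quick way to check that intuition rather than assume it, sketched in Python (the sizes and keys here are arbitrary; real measurements will vary):

```python
# Compare a linear scan over a tiny list of pairs with a dict lookup.
import timeit

pairs = [("a", 1), ("b", 2), ("c", 3), ("d", 4)]
table = dict(pairs)

def scan(key):
    for k, v in pairs:
        if k == key:
            return v

t_scan = timeit.timeit(lambda: scan("d"), number=100_000)
t_dict = timeit.timeit(lambda: table["d"], number=100_000)
print(f"linear scan: {t_scan:.4f}s   dict lookup: {t_dict:.4f}s")
```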

It also doesn't matter as much if your design is a mess of abstraction and indirection that spends 95% of its time chasing pointers and inspecting maps.

No, it's not. Frankly, I haven't seen any tools to help, which is an amazing shame. It's something that, as a profession, we need to be better at.

> It also doesn't matter as much if your design is a mess of abstraction and indirection that spends 95% of its time chasing pointers and inspecting maps.

Well, that's the other side of the problem: part of picking decent algorithms is picking a reasonable structure for your program.

I think this rule only applies in situations where the runtime is smart and fast enough that the naive approach is fast enough for whatever you need to do.

There was a surprising lack of Alan Perlis on the original list! His "epigrams on programming" should be required reading for every programmer, even if some are a little dated now:


> Choose mnemonic identifiers. If you can't remember what mnemonic means, you've got a problem. - perldoc perlstyle

Given the source, it has to be a joke. What is the meaning of `@` again?

> What is the meaning of `@` again?

@rray (just like `$` is for $calar, and `%` is for … uhm …). Everything is mnemonic if you have a sufficient amount of context. :-)

Right. At least, the first two are right:

'@' is a stylized 'a' (for 'array') and '$' is a stylized 's' (for 'scalar').

And '%' is a stylized key-value pair, so used for a Perl hash. IIRC.

Whatever else you may think of Larry Wall, you can't accuse him of not being ingenious :)

>sufficient amount of context.

nice('perl pun') unless not intended;



> nice('perl pun') unless not intended;

Lovely catch! I wish that it had been intentional.

Heh. Thanks :)

The `@` symbol is a variable by itself, which is very clear from its name; even more so because `$` certainly isn't, and I also don't think `%` is one (that last one is for dictionaries, by the way).

Another system-defined variable is `$_`, which is as well named as the one I cited.

There are many more of those in Perl, but I can't remember them for some reason.

> The `@` symbol is a variable by itself, which is very clear from its name

I'm almost certain that this is false, at least in Perl 5. (Sigils are dramatically overhauled in Perl 6, and I haven't kept up.)

"@" is for an array variable or pulling something in array context. You can have variables declared with $, @, or % depending on whether it is a scalar, array, or hash structure.

That's a great collection! Thank you.

Downvoter, please get a life. It's hard not to feel soured by such miserable gestures as yours. Mine was the first comment on here, I thought maybe there wouldn't be any comments - it was a great collection and I wanted people to check it out, and express my gratefulness; not sure what else I should've done. I guess I should just accept that such spiteful downvotes happen, but it's shame. Maybe you wouldn't even have seen the page had I not upvoted and commented...

Pity the list isn't updated anymore, or uriel would surely have put these gems by Charles Forsyth in it:

[B]y treating "compiling C" as a subset problem of "compiling C++", gcc and especially clang are vast whales whistling at the wrong frequency for the problem.

Plan 9 C implements C by attempting to follow the programmer’s instructions, which is surprisingly useful in systems programming. The big fat compilers work hard to find grounds to interpret those instructions as ‘undefined behaviour’.

Dennis Ritchie, in his ‘noalias must go’ essay, described that proposal as “a license for the compiler to undertake aggressive optimizations that are completely legal by the committee's rules, but make hash of apparently safe programs.”

(That he did not write something similar about ‘undefined behavior’ makes it clear that no one at the time intended or even guessed at the mess it would make of the language.)

It is interesting that compilers try to be so damn smart. Is -O0 still not faithful enough to the programmer's instructions?

Often the generalized parsing/lexing/codegen/etc. algorithm that exists in service of -O1 through -O3 can be turned "down to a trivial level" with -O0, but not "off"—as that would require an entirely separate code-path.

I could not find one of my favorites: "The two hardest things in programming are: 1. Naming variables 2. Cache invalidation 3. Off-by-one errors."

Ah, quotes, the method of taking wisdom out of context and greatly increasing the chances of incorrect interpretation. The ancient form of twitter.

"Epigrams are more like vitamins than protein." - Alan Perlis

For the sinner deserves not life but death, according to the disk devices. For example, start with Plan 9, which is free of sin, the case is different from His perspective.

        — Mark V. Shaney
Hey, that one’s a bot.

I’m also trying to figure out what’s left if you take out all the “avoid complexity” and “avoid XML” quotes...

  You can't build an airplane from bricks.
    -- Steve McConnell, *Code Complete*

There's a lot of negativity here. A lot of these quotes seem to offer more heat than light.

I believe a more appropriate noun is "contempt" :-)

I found this page [1] some time ago. It's kind of long and heavy and not super quotable, but maybe some of you will find some gems in there.

Top of the page is Xah Lee, a lovable, and sometimes a bit weird, cranky master of contempt:

    > The basis of computer languages' merit lies in their mathematical properties.
    > It is this metric, that we should use as a guide for direction.
    > As an analogy, we measure the quality of a hammer by scientific principles:
    > ergonomics, material (weight, hardness. . .), construction, statistical
    > analysis of accidents/productivity/… …etc., not by vogue or lore.
    > If we go by feelings and preferences, hammer's future will deviate
    > and perhaps become dildos or maces. (Xah Lee in comp.lang.lisp, July 2000)

I think Xah is cranky but many times on to something. For instance, point 6 in his list of "Python Doc Idiocies" [2] is hard to disagree with :-).

1: http://www.schnada.de/quotes/contempt.html

2: http://xahlee.info/python/python_doc_index.html

P.S.: if you enjoy Xah's writing you should pay for his excellent emacs manual [3]. It is actually openly available but he accepts contributions.

3: http://ergoemacs.org/emacs/buy_xah_emacs_tutorial.html

UPDATE: he actually has a Patreon account too! [4]

4: https://www.patreon.com/xahlee

I don't value contempt. I don't think that contempt is strongly associated with programming or computer science. I would consider that to be a problem. I can't condone the expression of contempt in the public discourse related to computer science. I do not typically have much free money for donations, but I would on principle refuse to financially support contempt. I would wonder that anyone would consider that a positive trait. I certainly think that this attitude would discourage newcomers to the field, and I think it's worth working to promote a different image and mindset.

I'm not suggesting that you support "contempt", but rather the person behind the writings, and only then if you enjoy the writing style.

Also, Xah has all sorts of other documents where he carefully tries to avoid the mistakes he criticizes in others, and where contempt is not the main driver. I'm sure he spent countless hours writing those materials, and he's made them available for absolutely free! [1]

The best intellectual and artistic endeavors are deeply flavored by the emotions of their authors, which gives the works a measure of authenticity. I can appreciate that, the same way I can appreciate many forms of dark humor.

Also, I'm OK with others not sharing that appreciation.

1: http://xahlee.info/comp/comp_lang_tutorials_index.html

That's cat-v.org for ya, curated by the late Uriel[1]. If you want some real negativity, go read the harmful section [2].

Some of us enjoy rants and anger. Positivity is overrated.

[1] https://suckless.org/people/Uriel/

[2] http://harmful.cat-v.org/

I've heard about this guy before. Reading a few of his posts in the second link (particularly the one about children), it seems like he was a deeply unhappy individual. Rants and anger can be fun, but wallow in them too long and it begins to feel like there's nothing else.

He was clearly a very talented programmer though, and as someone who reads and writes Go regularly I have a lot to thank him for.

A lot of hard won negativity from painful experience.

Programming is tough. That doesn't necessarily mean that we need to emphasize the negative aspects. However, the more concerning element to me is that there are a fair number of gratuitous insults. I am as prone to making and/or enjoying a witty jab as the next person, but I recognize that this is not a wholly positive trait, and try to resist the impulse. I feel like the distilled genius of this profession has something finer to offer than Perl-bashing, or at least, I hope so.

"CS can be described as the post-Turing decline in the study of formal systems."

I don't know who said it originally.

> A data structure is just a stupid programming language.

> — R. Wm. Gosper

What's this supposed to mean?

One interpretation: I think there's a dictum (due to Dijkstra, maybe?) that says something like programming = data structures + algorithms. If you remove the 'algorithms' part, then you're left with data structures on the right, and, presumably, an impoverished programming language on the left.

Algorithms + Data Structures = Programs is a book by Niklaus Wirth.

Thanks! Wirth was going to be my second guess. For some reason, possibly because I had the order switched (or because the '+' itself, rather than just the words, is key to the phrase), I couldn't find the phrase by Googling.

It means that a data structure is an object which supports a very limited range of algebraic operations. E.g. for a stack, you just have push and pop. That's the whole set of actions for that data structure. You could argue that it constitutes a tiny self-contained language within any host language that supports the data structure.
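To make that interpretation concrete, here is a minimal sketch (in Python, with hypothetical names) of a stack whose entire "vocabulary" is push and pop; any program written against it is a sentence in that two-word language:

```python
# A stack as a tiny self-contained "language": its complete set of
# operations is push and pop, nothing else.
class Stack:
    def __init__(self):
        self._items = []

    def push(self, x):
        """Add an item to the top of the stack."""
        self._items.append(x)

    def pop(self):
        """Remove and return the item on top of the stack."""
        return self._items.pop()

# A "program" in this two-word language:
s = Stack()
s.push(1)
s.push(2)
assert s.pop() == 2  # last in, first out
assert s.pop() == 1
```

The point is that the interface is deliberately impoverished: you cannot index into the middle or iterate, which is exactly what makes it a "stupid" (i.e. very small) language.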

One thing it brings to mind is this old argument: a configuration language is just a scripting language where you can't do anything useful to decrease repetition. (Or: Lisp programs don't contain config parsers.)

Instead of a "stupid" fixed set of declarative configuration options, just expose your program's high level abstractions as a set of macros and functions, and then let the user "configure" the program by just... writing a program, which uses those high-level macros and functions to do what they want.
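A minimal sketch of that idea (all names here are hypothetical, not from any real project): the program exposes a plain function as its configuration surface, and the user's "config file" is just a script that calls it, so repetition can be factored out with ordinary code:

```python
# The program's high-level abstraction, exposed to user config scripts.
routes = {}

def route(path, handler):
    """Register a handler for a URL path."""
    routes[path] = handler

# The user's "configuration" is ordinary code, so repeated declarations
# collapse into a loop instead of being copy-pasted:
for name in ["alpha", "beta", "gamma"]:
    route(f"/{name}", lambda n=name: f"serving {n}")

print(routes["/beta"]())  # "serving beta"
```

The loop is the payoff: a purely declarative format would force three near-identical entries, while a config-as-program lets the user express the pattern once.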

I think this has to do with the fact that a format like XML is not Turing complete, even though it is a language in its own right.

This is an absolute gem.
