"The first 90 percent of the code accounts for the first 90 percent of the development time. The remaining 10 percent of the code accounts for the other 90 percent of the development time."
Tom Cargill, Bell Labs
NB There should probably be a 90-90-90-90-<recurring> rule... could be called the Y-90 rule!
Me, every fucking time I look at code.
All the programmers, "embrace the suck and the light at the end of the tunnel is an oncoming train".
I picked the wrong career.
Until there are leaders 'woke' enough to proactively do the right thing, I don't see anything changing in the computing industry.
Have you seen how happy homesteaders are?
On a serious note. I'm aware. That's why I did my own company with a friend. It cuts a lot of bullshit out I don't have to deal with anymore. As (essentially) a contractor, it's easy to tear people a new asshole in a meeting and get away with it. I can't imagine an employee being able to.
First off, people need to stop worshiping the likes of Jobs, Musk and Bezos. They're just money makers. Just like Rockefeller, Morgan and Carnegie. None of them are visionaries, as all their ideas are easy to spot in history or in other people. Like the hyperloop... only about 80 years too late to call that one unique. Ooooo, you put colored pieces of plastic on a computer and made a small digital music player. The Zune was WAY better! The other guy just sold books and moved on to other shit. Congrats on the better mousetrap. But it's not like they did it themselves.
Maybe a homestead on a tropical island... I like coconuts.
"Debugging a program is twice as hard as writing it in the first place. So, by definition, if you write the program as cleverly as you can, you will not be able to debug it."
- (Maybe by) Brian Kernighan.
“Everyone knows that debugging is twice as hard as writing a program in the first place. So if you're as clever as you can be when you write it, how will you ever debug it?” in The Elements of Programming Style (p10 in the 2nd edition).
I've heard that's a good book, as is Software Tools by Kernighan and Plauger.
"Show me your flowcharts and conceal your tables, and I shall continue to be mystified. Show me your tables, and I won't usually need your flowcharts; they'll be obvious."
- Fred Brooks
> ps. i made up the hyperstationary, but then again, it’s probably a design pattern.
Reminds me of a great deal of programmers in the banking world, especially those who used spring. Their software often failed, but their knowledge of Java design patterns never did!
The essence of XML is this: the problem it solves is not hard, and it does not solve the problem well.
— Phil Wadler, POPL 2003
XML is a classic political compromise: it balances the needs of man and machine by being equally unreadable to both.
- Matthew Might
XML is a markup format, a regularization/minimization of the syntax of SGML. It's good at being a markup format—the paired named open+close tags allow for corrupted-stream repair in a way that e.g. Markdown just doesn't. XML is great in, say, DocBook.
But XML gets used for pretty much everything except actual markup. And for everything else, it's not solving those problems well.
Then it has three different kinds of data: elements, attributes, and content. In other words, <a c="foo">bla<b/>bla</a> is a mess to deal with. Now replace each "bla" with spaces and assume <a> is supposed to have only <b> as a child: in reality it has three children, the text before <b>, <b> itself, and the text after.
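A quick sketch of that mess using only Python's standard `xml.etree.ElementTree`: the attribute, the child element, and the two runs of character data all live in different places, and the text *after* `<b/>` is stored on `<b/>` itself.

```python
import xml.etree.ElementTree as ET

root = ET.fromstring('<a c="foo">bla<b/>bla</a>')

print(root.get("c"))        # attribute: 'foo'
print(root.text)            # text before <b/>: 'bla'
print(root.find("b").tail)  # text after <b/> is stored on <b/>: 'bla'
print(len(list(root)))      # ElementTree reports only 1 child element here
```

A W3C DOM view of the same document instead reports three child nodes under `<a>` (text, element, text), which is exactly the surprise described above: every XML API slices the same document differently.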
I think this is why JSON won over XML. It seems like all those extra features on XML would be a benefit but in reality they just give you more places to hang yourself.
The hierarchy stuff is so complicated and wrong that I've personally never witnessed a colleague actually using a data schema. Which means the data does not actually get validated.
On top of that, it is nearly unreadable. Compare to something simple like http://jstimpfle.de/projects/wsl/world_x.wsl.txt
The whitespace situation is really bad and also means there is no canonical form at all.
"Often a few hours of trial and error will save you minutes of looking through manuals."
If I knew exactly where to look in the manual, the quote might fit, but it rarely does.
"Beauty is more important in computing than anywhere else in technology because software is so complicated. Beauty is the ultimate defense against complexity."
— David Gelernter
"If you know what’s in it, you don’t know when it will ship. If you know when it will ship, you don’t know what’s in it."
“The most effective debugging tool is still careful thought, coupled with judiciously placed print statements.”
- Brian W. Kernighan, in the paper "Unix for Beginners" (1979)
> Deleted code is debugged code.
— Jeff Sickel
I programmed Perl for a while so I like to make this more about _characters_ removed rather than lines :)
> What's wrong with perl?
It combines all the worst aspects of C and Lisp: a billion different
sublanguages in one monolithic executable. It combines the power of
C with the readability of PostScript.
> To me perl is the triumph of utilitarianism.
So are cockroaches. So is `sendmail'.
— jwz [http://groups.google.com/groups?selm=33F4D777.7BF84EA3%40netscape.com]
~ and ~
Many modern cars get pretty close.
Given that a subtraction will happily compile where an addition should have been coded, the content of the remark is outrageous.
Don't get me wrong, Haskell is really really cool and some brilliant coders use it. One day I'll actually make it through the Haskell book.
See, the whole type-level and value-level is a false dichotomy, that's the whole idea behind dependent types. There shouldn't be a difference between a type and a value.
Rather, the whole idea behind dependent types (if dependent types can be glibly summarized by a "whole idea") is that logical predicates give rise to sets. E.g. "all integers that are squares" or "prime numbers". Or all points <x, y> such that x^2 + y^2 <= r^2.
A proposition is not the same thing as the domain of that proposition's parameters; but it does denote a subset of that domain.
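As a sketch of "predicates give rise to sets", here is how the squares example might look as a subtype in Lean 4 (the names are mine, not from the thread):

```lean
-- "All naturals that are squares", as a predicate...
def IsSquare (n : Nat) : Prop := ∃ m : Nat, n = m * m

-- ...and as the subtype (the "set") that the predicate gives rise to.
def SquareNat : Type := {n : Nat // IsSquare n}

-- An inhabitant: the value 9 paired with a proof (witness m = 3).
def nine : SquareNat := ⟨9, ⟨3, rfl⟩⟩
```

This also illustrates the point above: the proposition `IsSquare` is not the same thing as `Nat`, but it does carve out a subset of it.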
A dependent type can contain a large number of values and that will often be the case.
If we have a situation in the program where we must calculate a value that is a member of some dependent type containing billions of values, and only one value is correct, there is the threat that we calculate a wrong value which is still in that set. The dependent type will not save our butt in that case.
We have to isolate the interesting value to a dependent type which only contains that value. "correct value" is synonymous with "element of that set which contains just the correct value".
Still, that doesn't mean that the set is the same thing as that value.
A big problem with dependent types is that if the propositions on values which define types can make use of the full programming language, then the dependent type system is Turing complete. This means that it has its own run-time; the idea that the program can always be type checked at compile time in a fixed amount of time has gone out the window. Not to mention that the program can encode a self reference like "My type structure is not correct". If that compiles, then it must be true, but then the program should not have compiled ...
Nope, a type is not a set. Type theories are separate formal systems designed to be (among other things) computationally aware alternatives to set theory, and they are defined by "terms", "types" and "rewrite rules". The whole point of a dependent type theory is to be able to encode things like sets and propositions in the same formal system.
The rest of your comment is wildly off base due to this misunderstanding.
What the GP was probably referring to is the perspective on dependent types from the lambda cube, where you have three different extensions to the simply typed lambda calculus's regular functions from value to value:
1. Parametric polymorphism, which are functions from type to value
2. Type operators, which are functions from type to type
3. Dependent types, which are functions from *value to type*
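The three axes (plus the base case of plain value-to-value functions) can be sketched in Lean 4; all the names here are illustrative:

```lean
-- 0. Plain functions: value → value
def double (n : Nat) : Nat := n + n

-- 1. Parametric polymorphism: type → value
def identity (α : Type) (a : α) : α := a

-- 2. Type operators: type → type
def Pair (α : Type) : Type := α × α

-- 3. Dependent types: value → type (the returned *type* depends on n)
def Tuple (α : Type) : Nat → Type
  | 0     => Unit
  | n + 1 => α × Tuple α n
```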
And certainly none of this is free. Indirection has costs, and trying to model the entire behavior of a program in a compile-time system is going to have costs. Some of those costs will push solvable problems into an unsolvable state, and then things break down. (For another example of this: software can try to emulate hardware, but there are things that hardware can do that software cannot and vice versa, so it wouldn't make sense to say they are identical.)
>The other points are practically true by definition.
"More guarantees" and "more assumptions" are not definable terms. You can't count the number of theorems that need to be true for some model to be accurate.
The Wikipedia article on dependent types is a pretty decent survey: https://en.wikipedia.org/wiki/Dependent_type
> Randomness is about computational purity, which is nearly orthogonal to the expressiveness of your type system.
If types and values are to be considered identical, then random types should be just as useful as random values. They are not. That alone should indicate that values and types have a well-motivated distinction.
Additionally, dependent types offer no additional capabilities, only convenience. Having a type defined in terms of a value communicates nothing more to a compiler than would a functionally equivalent assertion, though it would certainly be easier to use and work with in some situations.
I'm not trying to say anything about dependent types here. I'm trying to say that the goal of producing a language which makes no distinction between type and value is neither useful nor possible.
Rust eliminates entire classes of bugs at compile time that plague other languages, usually out in production. "NoneType has no attribute 'enough'", "KeyError: this dictionary never has the key you want", etc. Multiple times per day I'll have a website fail to render simply b/c I blocked cookies (and they try to use localStorage, and the call fails, the failure isn't handled, and it bubbles all the way out, crashing the site).
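Two of the runtime failures mentioned above, reproduced in plain Python (a toy sketch, nothing library-specific):

```python
settings = {"theme": "dark"}

# "KeyError: this dictionary never has the key you want"
try:
    settings["user"]
except KeyError as e:
    print("KeyError:", e)

# "'NoneType' object has no attribute ..." — .get() hides the miss until later
name = settings.get("user")   # returns None, no error yet
try:
    name.strip()
except AttributeError as e:
    print("AttributeError:", e)
```

In a language with checked optionals, both failures surface at compile time: you cannot touch the value without first handling the missing case.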
Sure, unit tests could help here, but that's the thing about a typechecker: it's a unit test (of sorts) that you can't avoid (which prevents laziness), and I feel it is easier in the long run (writing the equivalent unit tests in Python requires more work). IME, laziness usually wins out, and sloppy code winds up in production.
Rules of Optimization: Rule 1: Don't do it. Rule 2 (for experts only): Don't do it yet.
- M.A. Jackson
Never test for an error condition you don't know how to handle.
Choose mnemonic identifiers. If you can't remember what mnemonic means, you've got a problem.
- perldoc perlstyle
Every program has (at least) two purposes: the one for which it was written, and another for which it wasn't.
- Alan Perlis
Should array indices start at 0 or 1? My compromise of 0.5 was rejected without, I thought, proper consideration.
- Stan Kelly-Bootle
Stroustrup’s Rule: "For new features, people insist on LOUD explicit syntax. For established features, people want terse notation."
"The best moment in programming is when you download the compiler." Georges Clemenceau
Still where does it come from? The only George Clemenceau I've heard of died in 1929.
Well that explains the modern web, and a great deal of modern software.
Though I would have to agree with your statement.
It also doesn't matter as much if your design is a mess of abstraction and indirection that spends 95% of its time chasing pointers and inspecting maps.
> It also doesn't matter as much if your design is a mess of abstraction and indirection that spends 95% of its time chasing pointers and inspecting maps.
Well, the other side of the problem of picking decent algorithms is picking a reasonable structure for your program.
Coming from that source, it should rather be a joke. What is the meaning of `@` again?
@rray (just like `$` is for $calar, and `%` is for … uhm …). Everything is mnemonic if you have a sufficient amount of context. :-)
'@' is a stylized 'a' (for 'array') and '$' is a stylized 's' (for 'scalar').
And '%' is a stylized key-value pair, so used for a Perl hash. IIRC.
Whatever else you may think of Larry Wall, you can't accuse him of not being ingenious :)
>sufficient amount of context.
nice('perl pun') unless not intended;
Lovely catch! I wish that it had been intentional.
Another system-defined variable is `$_`, which is as well named as the one I cited.
There are many more of those in Perl, but I can't remember them for some reason.
I'm almost certain that this is false, at least in Perl 5. (Sigils are dramatically overhauled in Perl 6, and I haven't kept up.)
[B]y treating "compiling C" as a subset problem of "compiling C++", gcc and especially clang are vast whales whistling at the wrong frequency for the problem.
Plan 9 C implements C by attempting to follow the programmer’s instructions, which is surprisingly useful in systems programming. The big fat compilers work hard to find grounds to interpret those instructions as ‘undefined behaviour’.
(That he did not write something similar about ‘undefined behavior’ makes it clear that no one at the time intended or even guessed at the mess it would make of the language.)
— Mark V. Shaney
I’m also trying to figure out what’s left if you take out all the “avoid complexity” and “avoid XML” quotes...
You can't build an airplane from bricks.
-- Steve McConnell, *Code Complete*
I found this page some time ago. It's kind of long and heavy and not super quotable, but maybe some of you will find some gems in there.
Top of the page is Xah Lee, a lovable, and sometimes a bit weird, cranky master of contempt:
> The basis of computer languages' merit lies in their mathematical properties.
> It is this metric, that we should use as a guide for direction.
> As an analogy, we measure the quality of a hammer by scientific principles:
> ergonomics, material (weight, hardness. . .), construction, statistical
> analysis of accidents/productivity/… …etc., not by vogue or lore.
> If we go by feelings and preferences, hammer's future will deviate
> and perhaps become dildos or maces. (Xah Lee in comp.lang.lisp, July 2000)
P.S.: if you enjoy Xah's writing you should pay for his excellent emacs manual. It is actually openly available but he accepts contributions.
UPDATE: he actually has a Patreon account too! 
Also, Xah has all sorts of other documents where he carefully tries to avoid the mistakes he criticizes in other documents, and where contempt is not the main driver. I'm sure he spent countless hours writing those materials, and he's made them available absolutely free!
The best intellectual and artistic endeavors are deeply flavored by the emotions of their authors, which gives the works a measure of authenticity. I personally can appreciate that, the same way I can also appreciate a lot of forms of dark humor.
Also, I'm OK with others not sharing that appreciation.
Some of us enjoy rants and anger. Positivity is overrated.
He was clearly a very talented programmer though, and as someone who reads and writes Go regularly I have a lot to thank him for.
I don't know who said it originally.
> — R. Wm. Gosper
What's this supposed to mean?
Instead of a "stupid" fixed set of declarative configuration options, just expose your program's high level abstractions as a set of macros and functions, and then let the user "configure" the program by just... writing a program, which uses those high-level macros and functions to do what they want.
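A minimal sketch of that idea in Python (all names here are hypothetical): the application exposes a small high-level API, and the user's "config file" is just a program that calls it.

```python
class App:
    def __init__(self):
        self.routes = {}

    # The high-level "configuration" API exposed to the user.
    def route(self, path, handler):
        self.routes[path] = handler

def load_config(app, config_source):
    # The user's config is executed as a program with the app's API in scope.
    exec(config_source, {"app": app})

app = App()
user_config = """
# The user "configures" by programming: loops, conditionals, helpers all work.
for name in ("alice", "bob"):
    app.route(f"/users/{name}", lambda n=name: f"hello {n}")
"""
load_config(app, user_config)
print(sorted(app.routes))          # ['/users/alice', '/users/bob']
print(app.routes["/users/bob"]())  # 'hello bob'
```

This is the Emacs/Lisp configuration model in miniature: a fixed option list could never express the loop above, but a program-as-config gets it for free (at the cost of having to trust and sandbox what you execute).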