This must have come out of the ITA Software acquisition.... (Heading "Attention required" 'You must follow the ITA convention of using...')
They were apparently a big Common Lisp user.
Google is normally very specific about the languages allowed for internal projects. A product/company acquisition with large assets written in Common Lisp would necessitate this becoming the "Google Common Lisp style guide" rather than what it most likely was originally: the "ITA Software Common Lisp style guide". Speculation, of course, but it looks likely.
One area of difference is in the conditionals. I never use WHEN or UNLESS for value; only for effect. And, I never write NIL as one of the values returned by IF; I always condense the expression to AND or OR. That is, I freely use AND and OR to return non-boolean values; this is a practice some deprecate, and indeed, I'm surprised not to find it explicitly mentioned here.
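A small sketch of the style described above (function names are made up for illustration) — condensing IF-with-a-NIL-branch into AND, and using OR to return non-boolean values:

```lisp
;; IF with an explicit NIL branch...
(defun lookup-name-verbose (entry)
  (if entry
      (first entry)
      nil))

;; ...condensed to AND, which returns NIL or the value itself:
(defun lookup-name (entry)
  (and entry (first entry)))

;; OR likewise returns its first non-NIL argument, handy for defaults:
(defun port-or-default (port)
  (or port 8080))
```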
I do like to write certain kinds of iteration as tail-recursions, but I always use local functions (LABELS) when doing that; there's no implementation I use regularly that doesn't do tail-call optimization on local calls.
This further reinforces how wildly inconsistent Yegge's recent "software political axis" rant was. He characterized Clojure as "highly conservative" based in part on its best practices of avoiding macros when possible, unlike "liberal" languages such as Common Lisp.
Meanwhile, his own company's coding style guide for Common Lisp states very similar best practices regarding macros -- they should be used "sparingly and carefully", "You must never use a macro where a function will do.", etc. The whole macros section basically reads as a list of well-thought-out reasons against using macros when writing code that other people will have to maintain.
Yegge: "I trust that if you know anything about Lisp, your blood is basically boiling at this point." Really? Well, then maybe the Google CL team doesn't know Lisp, or else they are looking for novel ways to escalate their collective blood pressure.
No, Steve's rant was also about how Google is very much on the "conservative" end of the political spectrum. When a conservative company uses what is, per Steve's article, a liberal language, you can't expect them to use it in a liberal way. The story checks out.
I disagree. The Common Lisp community has long advised being sparing with macros. Dan Weinreb (of ITA and the Common Lisp standard) mentioned this about ITA pre-Google. Paraphrasing from memory: "You can use functions anytime. With macros, you need justification. With reader macros, you need a LOT of justification."
The conventional advice is not to go nuts with macros. In normal Common Lisp code, you don't need to write many macros. And when you do, they're typically with- macros.
Now, you can be a heavy consumer of macros. That's something else. Like using the LOOP macro all the time, for instance. The DEF... forms too. If you're building a powerful new abstraction, like OOP or the metaobject protocol, then it's great to have powers normally reserved for language designers.
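A typical "with-" macro of the kind mentioned above might look like this — wrapping body code in setup/teardown is exactly the conventional use case (WITH-TIMING is a hypothetical name, not from any guide):

```lisp
;; A hypothetical WITH-TIMING macro: runs BODY and returns two values,
;; the body's result and the elapsed internal run time.
(defmacro with-timing (&body body)
  (let ((start (gensym "START")))
    `(let ((,start (get-internal-run-time)))
       (values (progn ,@body)
               (- (get-internal-run-time) ,start)))))

;; Usage: (with-timing (expensive-computation))
```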
> Meanwhile, his own company's coding style guide for Common Lisp states very similar best practices regarding macros -- they should be used "sparingly and carefully", "You must never use a macro where a function will do.", etc.
This is not arguing against macros, per se. The most powerful feature of macros is controlling how / when arguments are evaluated. So, if you don't use a macro when a function (i.e. standard arg evaluation) will do, you're not missing out on the power of macros.
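A minimal illustration of that point: a function always evaluates all of its arguments before the call, so a conditional written as a function can't short-circuit, while a macro can choose which branch is ever evaluated (names here are illustrative):

```lisp
;; As a function, both branches are evaluated before the call:
(defun if-fn (test then else)
  (if test then else))

;; As a macro, only the chosen branch is ever evaluated:
(defmacro if-macro (test then else)
  `(cond (,test ,then)
         (t ,else)))

;; (if-fn t 1 (error "boom"))    ; signals an error -- ELSE ran anyway
;; (if-macro t 1 (error "boom")) ; returns 1 -- ELSE never evaluated
```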
I do this in Python-land. There's a tool called 'pep8' named after the community sanctioned style-guide of the same name. Before I push any changes I have a script that runs my tests and the pep8 script. It's more of a backup-measure in my case since I have my editor run pep8, lint, and my unit tests constantly.
I do this with perl. I use emacs' cperl mode, but I have Ctrl+x t bound to perltidy-buffer, which formats everything for me. I've modified the rules slightly (people at my work like tabs instead of spaces) but I tend not to spend too much time worrying about stylistic formatting rules.
Of course, this does not apply to some areas of vertical white-space, which I still have to manage myself.
We do this at Google, though humans are responsible for actual enforcement -- you can check in code even if it has lint errors, but your code reviewer should tell you not to. (For things like build files, though, you can't check in code with lint errors. That just saves everyone time and frustration.)
That argument doesn't make sense. My outlet for creativity is the content of my code, not where newlines happen or how I indent. On the other hand, if everything in the codebase is formatted uniformly, it's a lot easier to scan through, and you don't end up with diffs filled with reformatting cruft as each team member walks over the others' formatting style.
If you're feeling extra creative, you could always change your font, font size, and syntax highlighting scheme. ;)
He could quote a better example, but I think the document goes deeper than just formatting style.
From the article's excerpt "Principles":
* Don't be "clever" — do the simplest thing that could possibly work properly.
* Be precise.
* Be concise.
* KISS — Keep It Simple, Stupid.
* Use the smallest hammer for the job.
They are indeed asking for uniformity over creativity. That is not a bad thing for corporations (IBM has had very strong guidelines for years), but it definitely takes away that "hacker" aura we gave Google a few years ago.
I don't think Google is really arguing against creativity; rather, it's saying "don't be creative for creativity's sake".
I remember having to install Quartz Scheduler so it would hit a single servlet every night at 1am. That's _ALL_ Quartz was used for. When I asked about using cron, the response was "well, that won't scale"... this application had been around for fifteen years, and during that time there was only one batch process... clever indeed!
There always would be anecdotal cases and interpretations. That's why I asked an open question to a googler who could set the document into a context.
I also have cases of people writing a lot of ad-hoc code because it was easier (the smallest hammer for the job), which resulted in code that was a pain to read and could have been prevented with a clever system. But I agree with you that the whole thing is arguable and subjective.
I already said in another reply that it is subjective.
But when they say "KISS (Keep It Simple, Stupid)", one programmer could interpret that as "OK, for this message-management system, what is simpler than a huge list of if statements?" while another could interpret the same principle as "Hmm, OK, I will use a list of closures to reduce verbosity."
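The two readings could look like this in practice (a toy sketch; the message types and handlers are made up):

```lisp
;; Reading 1: a chain of conditionals.
(defun handle-message-v1 (type payload)
  (cond ((eq type :ping) (list :pong payload))
        ((eq type :echo) payload)
        (t (list :unknown type))))

;; Reading 2: a table of closures, extensible without
;; touching the dispatcher itself.
(defparameter *handlers* (make-hash-table))
(setf (gethash :ping *handlers*) (lambda (payload) (list :pong payload)))
(setf (gethash :echo *handlers*) (lambda (payload) payload))

(defun handle-message-v2 (type payload)
  (let ((handler (gethash type *handlers*)))
    (if handler
        (funcall handler payload)
        (list :unknown type))))
```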
That says to not be creative just to entertain yourself. Be creative to solve problems. It is a polite way of saying "you aren't as smart as you think you are at 3am on a Red Bull binge". This advice was not invented by Google; Knuth or someone has a famous quote about it: debugging is harder than the first draft, so how do you debug code that you wrote at peak cleverness?
From The Elements of Programming Style by Kernighan and Plauger: “Everyone knows that debugging is twice as hard as writing a program in the first place. So if you're as clever as you can be when you write it, how will you ever debug it?”
The key words here are "by its style". There's still plenty of room for creativity in how you structure the code, how you approach solutions, etc., which is where the creativity belongs. Mediocrity is not overcome with creative spacing.
Yes, it looks like this is simply the ITA guidelines (note the leftover "ITA" reference in the guide!). From Carl's remarks, you can tell that ITA's system is not a 'normal' Lisp program. Most Lisp programs do not preallocate ~5K of data structures and fail fast if they exceed that, for example.
ITA actually has two large Lisp systems. One of them is the insanely optimized fare search system (QPX) that does all the crazy things described in the link. The other is the airline reservation system which is a much more conventionally coded Lisp system.
I'm not sure I understand/agree with the point about "iteration over recursion". One of my favorite aspects of Lisp is the recursive approach to writing functions. It's still possible to write recursive functions that don't rely on a specific compiler's optimization:
(defun sum (numbers)
  (labels ((helper (todo ans)
             (if (null todo)
                 ans
                 (helper (cdr todo) (+ ans (car todo))))))
    (helper numbers 0)))
I hope that this is what the author meant with "iterative" approach, because it is recursive by most standards.
Nitpick: if it is guaranteed, call it Elimination, not Optimization, because it is part of the operational semantics required to reason about performance. As a compiler optimization you have an algorithm with Ω(n) space usage, but with elimination as a language feature you have O(1) space usage.
Yes, you can include compiler optimizations in your model of the language, but it helps to distinguish opportunistic speedups from guaranteed complexity-class improvements.
However, all the major implementations (SBCL, Clozure CL, Franz Allegro, LispWorks, CMUCL, and I think even ECL) do tail-call elimination on local calls. (A local call is a call to a function whose definition is lexically visible, such as the calls to 'helper' in your example.) The document says that TCE depends on optimization settings, but for local calls, I'm not sure even that is true. Anyway, the practical upshot, in my experience, is that there's no reason not to write loops recursively using LABELS when that is the most elegant expression of the particular loop in question.
(There was once a major implementation that never did TCE -- Genera -- but even for die-hard Lisp Machine lovers like myself, it has long since been relegated to a historical curiosity.)
This version quickly ran into heap exhaustion. The LABELS version does not, and handles summation over long lists much faster as well.
What does this mean? TCE only applies to local calls, therefore we should never write a "raw" recursive function? (which is a tad more elegant, imo). I haven't delved into TCO deeply enough to understand, but I had hoped that a smart compiler would be able to optimize the latter?
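The distinction, roughly, assuming the implementation behavior described upthread (elimination of global self-calls may depend on the implementation and its optimization settings, while local LABELS calls are reliably eliminated):

```lisp
;; "Raw" global recursion: whether the self-call becomes a jump
;; depends on the implementation and its optimization settings.
(defun sum-global (todo ans)
  (if (null todo)
      ans
      (sum-global (cdr todo) (+ ans (car todo)))))

;; Local recursion via LABELS: the call target is lexically visible,
;; which is the case the major implementations reliably handle.
(defun sum-local (numbers)
  (labels ((helper (todo ans)
             (if (null todo)
                 ans
                 (helper (cdr todo) (+ ans (car todo))))))
    (helper numbers 0)))
```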
...is there any place one can find a list of companies/projects using CL, and specifically what they do with it? Or of open-source projects using CL? (Or do people still treat it as "our secret sauce"?)