Hexing a bit in 2017: https://news.ycombinator.com/item?id=14050017
Typing in 2017: https://news.ycombinator.com/item?id=14078852
Aphyr is abusing Haskell’s type inference.
The abuse makes sense when you view the type system as a logic / goal-seeking engine. It does showcase the truly impressive work going on behind the scenes of Haskell.
Orders, kinds, types, sorts, classes, groups, categories, genera, flavours, colors, shapes... same thing.
Same thing, to first order.
On another note, I only wish my technical interviews were so simple. People talk about weeding out those who can’t program, but finding a cycle in a linked list is a prerequisite to the prerequisite for the Google interview questions I was expected to solve in 45 minutes at the whiteboard.
I told them I recognised it as FizzBuzz.
They told me I am overqualified.
I am not a recent college grad, and I don't encounter a lot of interesting algorithm problems in my work, so I had prepared by working through some leetcode problems. But I had worked through only "easy" and some "medium" problems because I had the impression they didn't ask the harder ones. I was able to recognize it as a BFS graph problem and gave the time complexity (and space complexity, IIRC), but I did not finish it on the whiteboard.
Live and learn. Maybe I'll have another go in six months. Can't argue with the money and the resume cred.
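(For anyone rusty on the BFS part: a minimal sketch of the shortest-path traversal such problems ask for. The adjacency-list shape and names here are illustrative, not from the actual interview question.)

```python
from collections import deque

def bfs_shortest_path(graph, start, goal):
    """Breadth-first search over an adjacency-list graph.
    Returns the shortest path from start to goal, or None.
    Time O(V + E), space O(V)."""
    if start == goal:
        return [start]
    visited = {start}
    queue = deque([[start]])
    while queue:
        path = queue.popleft()
        for neighbor in graph.get(path[-1], []):
            if neighbor in visited:
                continue
            if neighbor == goal:
                return path + [neighbor]
            visited.add(neighbor)
            queue.append(path + [neighbor])
    return None
```

Recognizing the problem shape is the whole game; the code itself is short once you see it's BFS.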
Even a lot of the "basic algorithms" taught in undergrad courses would be gnarly to invent on your own without clues in less than an hour. There's a reason they're named after famous computer scientists. If they were easy, we wouldn't bother teaching them.
Of course I can't prove I independently reinvented the tortoise-and-hare algorithm, you'll have to take my word for it. But it should be obvious that somebody did it or it wouldn't be well known today.
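(For reference, the tortoise-and-hare idea, i.e. Floyd's cycle detection, in a sketch; the `Node` class is illustrative.)

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

def has_cycle(head):
    """Floyd's tortoise-and-hare: the slow pointer advances one node
    per step, the fast pointer two; they meet iff there is a cycle.
    O(n) time, O(1) extra space."""
    slow = fast = head
    while fast is not None and fast.next is not None:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:
            return True
    return False
```

The code is trivial once you have the idea; the idea is the part that took a famous computer scientist to invent.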
What's more likely: that somebody figured out, in under an hour, a solution to a problem that originally took over a decade, and then felt the need to brag about it to strangers on the internet; or that somebody lied online to feel better about themselves? People are certainly more likely to assume the latter.
Certainly the Internet has made it more likely that you've run into it and heard the solution.
As I said, it was a very long time ago when I heard of the problem. I don't think it was in an interview, but I honestly don't remember the circumstances. I do remember being quite proud of myself, so I guess there's truly an element of bragging in my statement. Doesn't mean I lied though.
You settled for a). Typical of what geeks do (yeah, I'm guilty of it too).
Anyway, you seem to be an outlier in being able to solve the problem without having been familiar with it beforehand. Do you realize you are extraordinary?
(FYI: I'm not saying that with sarcasm; you might indeed be a "riddle" genius, or perhaps even a "real problem" problem solver)
I've always thought of myself as being pretty clever, but I'll bet many HN readers would say the same about themselves. "extraordinary" seems a little over the top, but thank you.
There are enormous implications to this seemingly harmless labeling of yourself (not blaming you). I insist you are extraordinary, not merely 'pretty clever', based not only on my own observation but also on other people's reactions to your post.
Off the top of my head, there are two implications typically attributed to such traits (and I'm not saying that you in particular cause these implications; it's just typical):
- If someone like you is an interviewee for a programming position, your average Joe interviewer who asks algorithmic questions (or often some clever question) will not like the fact that you can easily solve the problems you are presented with.
- If a person like you is an interviewer for a regular corporate programming position (one that does not involve algorithms), the person is likely to pose problems of this nature to candidates. When candidates routinely fail to solve them, the person is left wondering why the candidate pool is so bad. To the interviewer the solutions seem so obvious (or so easily figured out) to anyone who is merely 'clever'.
And the industry complains how it cannot find people for CRUD programming jobs....
But computer geeks (we are talking about the kind of people who code for 'fun') are not normal people. Their default is to take words' meanings literally, unless they have somehow (often painstakingly) managed to learn the ways normal folks communicate.
My favorite named-after-a-person one like this is Dijkstra's algorithm, which he claimed to have come up with in 20 minutes on the back of a napkin. If we suppose the average professional engineer is at most 3x slower/less brilliant than Dijkstra, it's not that unreasonable to imagine someone could reproduce the design on a whiteboard in a full hour...
Of course I don't buy that assumption, nor do I think it's a good problem or good idea to have as an interview filter even if it was true. (While I enjoy the occasional programming puzzle, I hate that they're lazily used to evaluate people in interviews so at least I avoid ever giving pure algorithm puzzles for interviews.) Nevertheless I agree with Mark that it's not "basically impossible" to come up with a good algorithm for many classes of algorithms and problems. I do wonder though how many people who could reinvent tortoise+hare without seeing it explicitly before would then be able to reinvent the teleporting turtle optimization right after.
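(For the curious, a sketch of the "teleporting turtle", i.e. Brent's variant: the hare walks one node at a time, and whenever a power-of-two step window is exhausted without a meeting, the turtle teleports to the hare. Same O(1) space as Floyd's, typically fewer pointer dereferences. The `Node` class is illustrative.)

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

def brent_has_cycle(head):
    """Brent's cycle detection: the hare advances one node at a time;
    whenever it has taken `limit` steps without meeting the turtle,
    the turtle teleports to the hare and the window doubles."""
    if head is None:
        return False
    turtle, hare = head, head.next
    steps, limit = 1, 2
    while hare is not None and hare is not turtle:
        if steps == limit:   # window exhausted:
            turtle = hare    # teleport the turtle to the hare
            limit *= 2       # and double the window
            steps = 0
        hare = hare.next
        steps += 1
    return hare is not None
```

Reinventing this on a whiteboard right after reinventing Floyd's would be quite a feat.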
One could use that answer to gauge the humour of the interviewer :-) For me the expected reaction of a peer to this would be "Cute! Now explain the problems I could encounter when using it." If they can't deal with some fun, well, their loss.
I must admit I didn't follow the bytecode too closely. So thanks for pointing that out.
What does "re-tying backwards" mean?
If you do a "change of coordinates" the original algorithm becomes trivial: If you know that any loop in the list doesn't begin after the node you're on, you can just mark where that is, run ahead, and check if you ever come back. In the general case, "change coordinates" so that with every step, the origin advances one step. Now you'll eventually be in the previous case.
Not in space complexity.
> A bit more aggressive, check System.Runtime.heapSize() calculate how many nodes can fit in the heap, and subtract 1 for each element visited.
This one will take quite a while…
> recognize that pointers are always powers of 2, so go ahead and tag that pointer
They're not always, and this also technically uses extra space.
Perhaps it unintentionally shows that these companies are more interested in you demonstrating that you care about algorithm tricks you'll realistically never use there than in you being a productive engineer (which is a pretty hard thing to measure).
Formal CS education requires good self-learning skills and is not a counter-indicator for self-learning.
(I agree with it being a mostly arbitrary criterion for evaluating a candidate.)
I wonder how much the technical question approach isn't so much wrong as it is testing for things that matter much less now. There don't seem to be many questions about concurrency and distributed systems in this kind of interview, or at least not good ones. Everything now is "it depends" and the hard solutions are about ten lines of code for ten pages of problem explanation.
Install SBCL, pick up a Lisp book from 30 years ago (e.g., Paul Graham's, Peter Seibel's, Norvig's), and start hacking.
There are of course languages that are more fun to work with, but at the end of the day we all have to eat. I would love to see Common Lisp's former glory restored, but sadly the language's use in industry seems to shrink every year.
Clojure is a modern, practical Lisp dialect that lets me use real REPL and structural editing and at the same time keep my sanity intact, plus I get paid. Please don't be so harsh about it.
I think having a Lisp as a hosted language was an awesome idea. I'd rather write (slightly crippled) Lisp that targets multiple platforms and get paid than "hack" in a (real) Lisp for free, or even worse, use no Lisp at all.
PHP is bad. If you're getting paid for it, it's still bad. So what? People also get paid to scrub toilets.
Personally, I would rather write Java than Clojure. But I wouldn't quit if I suddenly had to use Clojure.
And by the way, the Clojure code demonstrated here is very unusual. Nobody in practice writes anything even close to that. Java interop is baked into the language, but in practice it doesn't look as scary as in the blog post.
> SBCL has definitely proven itself to be a good companion to explore the generation of domain-specific machine code. ... Steel Bank Common Lisp: because sometimes C abstracts away too much ;)
From a more... civilized age
But, as far as I'm concerned, these articles could be posted a dozen more times and it wouldn't be too much.
Ultimately, the purpose of the interview is to allow the interviewer to become convinced to recommend hiring. If you're going down a path that isn't going to result in that, you're depriving the candidate of that opportunity. Not to mention wasting everyone's time.
Sometimes that means the candidate will have to adapt their answer to the interviewer. Everybody is different, and this is the interviewer who has been assigned. Occasionally, that may even mean dumbing down your answer.
Of course, for narrative reasons, the interviewer in this story has to be passive. Otherwise the story would be about 90% shorter.
For example, you get developers optimising the hell out of a block of code - making it unmaintainable in the process - when the real bottleneck is developer time, not processor time. Or someone builds a giant vector autoregression model powered by MCMC when a rolling mean would have been adequate for the task.
"Build the simplest tool that will solve the task at hand" is not a mantra that comes easily to some people.
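(To make the contrast concrete: the "adequate" tool in the example above really is a few lines. A trailing rolling mean, sketched with numpy; the window choice is illustrative.)

```python
import numpy as np

def rolling_mean(series, window):
    """Trailing rolling mean via convolution with a uniform kernel.
    Returns len(series) - window + 1 values."""
    kernel = np.ones(window) / window
    return np.convolve(series, kernel, mode="valid")
```

If this answers the business question, the MCMC-powered vector autoregression is pure developer-time overhead.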
I've worked with smartasses, and I've worked with many people smarter than me. While there can be overlap, there generally isn't, in my experience.
Smartassery is something I've most often witnessed in people having a need for self-assertion. Most really smart colleagues I've had are confident enough in their skills that they don't need to prove them by being smartasses.
Anecdotes, but quite a few of them.
Sometimes that's true, when a dumbass is involved.
At least equally often, though, that's what smartasses tell themselves either to avoid facing their social deficiencies or because they lack the self-awareness to recognize them.
It's nice if the interview can be used to figure out if the interviewer/candidate want to work together.
I think it's fair to be frustrated at the "whiteboard interview questions" that most people encounter when interviewing.
My point is that it is sometimes easier to write complicated and unmaintainable code than to write clean and clear code.
Conversely, well-written and documented code that the maintainer couldn't have invented themselves can still be maintained. Comprehension regularly exceeds the ability to create.
I wonder if there's anything else out there like this idea? If not, I'll make it happen.
And then I wrote a list of other stories in this category. (Ra, linked from a sibling comment, is brilliant, by the way. Less programming-oriented, but the concepts do underlie the story.)
You're welcome to send pull requests to the list with more reviews etc. Though it's drifting from my control - having started to read some Vernor Vinge I'm not sure he belongs on the list.
Harry Potter and the Methods of Rationality -
Aphyr is an incredible writer and just extremely smart, it is hard to read anything from him and not come away humbled.
Who are all those people who constantly whine: "Lisps are unreadable". Dyslisplexic programmers?
> People foreign to Lisp often look at it and without even slightest attempt to give it a try, immediately reject it as "hard to read".
You'll have to take my word for it, but I have put significant time and effort into trying to familiarize myself with lisp. I still found it extremely difficult to read at the end of that effort. I don't think I'm alone in this. But there are definitely people such as yourself who find it extremely readable. I'm not sure why there's such a divide.
> Lisp retains readability even on small screens, good luck trying that with literally any other programming language.
Is this a need that arises frequently? I don't think I've ever wanted to do that, but I am willing to believe it's something many people might want.
In editors that have good support for vertical splits, yes. Before Lisp I would never keep more than two or three split windows; with Lisp I sometimes do four, even six vertical splits at the same time - just because I can.
Maybe you started out with an easier way into Lisp, or had a knack for it, or just spent time in it at a point when you had more motivation or time to get into it. There's more to learn in the world than people could do in a thousand lifetimes, so I wouldn't fault anyone for turning around in the door if it's something they have no interest in at the get-go.
> Lisp retains readability even on small screens, good luck trying that with literally any other programming language
I learned from Sinclair BASIC on a 32x24 character display on a black-and-white PAL television, as did a lot of people. Readability is highly subjective.
I've seen it multiple times - it's only the initial reaction, and it doesn't usually take long for anyone to adjust to the syntax. I have never met anyone who used a Lisp for several months and still hated it and found it unreadable. I wouldn't count anecdotal encounters with people online claiming it to be mostly true.
I have seen people using one Lisp (e.g., Clojure) and having difficulty quickly parsing a different Lisp dialect (e.g., Emacs Lisp), but that's not "Lisp being unreadable"; it's just unfamiliarity with specific language idioms.
I have also seen people who learned Clojure before any other language and then tried learning a non-lispy language (Java, Python, etc.) and, surprisingly, claimed it to be harder to read (initially).
> that dismissiveness counts against Lisp adoption
I agree, but how do you fix this problem? You can't remove parens and keep it homoiconic. Tools like Parinfer do help, but they don't address the real problem: Lisp doesn't look "sexy" to those who are unfamiliar with it. I kept ignoring Lisp for many years simply because I didn't know better. I wish there had been people telling me that parentheses are not a problem; they are a solution.
Survivorship bias, maybe?
Readability is subjective.
The lower level of S-expression serialization kind of results in offloading this task to the editor, where it should lie, instead of burdening the developer with it. That said, even I will admit that without experiencing an editor handling the parens for you, one's experience in other languages is going to suggest bad things. How many of us ended up having to manually count "end" keywords in Pascal and the like?
Still, with a proper editor (not just emacs), the experience is really, really different.
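(The bookkeeping an editor does here is mechanically trivial, which is exactly why it should be the editor's job. A sketch of the stack-based matching involved; the function name is illustrative.)

```python
def balanced(text):
    """Stack-based delimiter matching: push the expected closer for
    each opener, pop and compare on each closer. This is the
    bookkeeping a structural editor automates for you."""
    pairs = {"(": ")", "[": "]", "{": "}"}
    closers = set(pairs.values())
    stack = []
    for ch in text:
        if ch in pairs:
            stack.append(pairs[ch])
        elif ch in closers:
            if not stack or stack.pop() != ch:
                return False
    return not stack
```

Once the editor does this (and re-indents structurally), counting parens by hand disappears the way counting Pascal's "end"s did.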
Readability is indeed subjective. A few years ago I myself would have strongly opposed my own words about Lisp being readable. Several years ago even plain English was pretty much unreadable to me. I guess it's a good thing I didn't dismiss it, otherwise we wouldn't be having this conversation :)
Who among you would hire this witch? Who among you would want to maintain this kind of code, should this witch move on to challenges more suited to her skills?
Who among you would dare to meddle with the work of a Real Programmer?
On the other hand, there are places where this kind of thing would fit right in: exploit writing, games development, certain kinds of HPC, cryptography, some firmware development. And it demonstrates the kind of deep understanding where being able to read this kind of thing is very useful to puzzle out some horrendous low-level mess that has been inflicted on you by a vendor. If it was John Carmack or Richard Feynman doing it, it would be celebrated.
The risk with "rockstars" is they do this stuff when you don't want them to. Codewitches are much rarer, and have a much better sense of when it's a good idea. Neither is ever really going to be a "team player" but can deliver amazing things.
A coumparisoun would be to change "o" to "ou" tou make sentences louk moure English.
>The magic item supplies the magic number identifying the class file format; it has the value 0xCAFEBABE.
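(The magic is easy to poke at directly. A sketch packing and re-reading the first eight bytes of a class-file header; the version numbers chosen here are illustrative - major version 52 happens to correspond to Java 8.)

```python
import struct

# First 8 bytes of every JVM class file: u4 magic, u2 minor, u2 major,
# all big-endian per the class file format.
header = struct.pack(">IHH", 0xCAFEBABE, 0, 52)

magic, minor, major = struct.unpack(">IHH", header)
```

Run `xxd SomeClass.class | head -1` on any compiled class and the `cafe babe` is right there.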
Is the wrist scar reference from the series and I've forgotten it or is this from something else?
(Not sure about the style of this individual story)
What's actually happening in the code: it starts off in the Lisp-on-JVM that is called Clojure, but for "performance reasons" drops down to hand-writing Java bytecode. A useful reminder that it's all just bytes and we don't have to arrive at them by the "normal" route.
The prose seems to resemble something dark and beautiful and meaningful. But if it's meaningful it's lost on me, for the most part. And dark and beautiful and meaningless is honestly a bit torturous. Like trying to stare at a Pollock and understand something.
And the bytecode just couldn't be more boring to me. Why is bytecode for this mundane algorithm interesting?
Someone could do us second-class citizens (who I have no doubt outnumber the privileged few who get it by orders of magnitude) a favor by providing a line-by-line analysis. But I guess that would ruin the feeling of privilege.
FFS, there's no shame in not understanding something. There is shame to be had in throwing around accusations of privilege at people who understand something you don't.
People won't downvote you for not understanding. Some people will applaud you for saying you don't understand, up until the point they realise you're not seeking help in understanding, that you're not trying to improve your own knowledge and wisdom; that instead, you're trying to make yourself feel better by ascribing yourself some moral high-ground for not being one of those "privileged" people who understand something you don't. They might downvote you for being a passive-aggressive child about it.
I've been around enough to know how wrong that is. Do you actually believe this? Honest questions asking for explanation are almost guaranteed to result in a negative vote sum.
> instead, you're trying to make yourself feel better by ascribing yourself some moral high-ground for not being one of those "privileged" people who understand something you don't.
Make myself feel better for what? Maybe I needed to include the /s tags...
Boo hoo hoo. Oh no, lost some internet points. Anyone whose opinion is worth a damn will applaud you asking honest questions; especially here, on defensive poser central.
You've gone for the "I wasn't being serious" defense, pretending you didn't mean it. However, your post wasn't remotely sarcastic and contained far too much detail and defensiveness, with a tone taking itself very seriously. You're convincing nobody.
What exactly do you now claim you were being sarcastic about? "The prose seems to resemble something dark and beautiful and meaningful"; that's sarcasm, is it? So you're saying it IS something dark and beautiful and meaningful? "And the bytecode just couldn't be more boring to me." Right, so you were saying the bytecode was some of the most interesting code you've ever read? If your post was sarcasm, you're truly terrible at it and should switch to irony or satire instead. Little bit of literary term privilege there for you to sarcastically pretend to object to.
FFS, this is the anonymous internet. If you can't be honest with yourself here, where will you be honest with yourself?
I would guess for performance. So you would be side-stepping some of the compilation?
To me, mastery would be using mathematics to trivialise an implementation in a counterintuitive way. I understand that this is a different kind of mastery. But does the essay's code show mastery, or does it just use the bytecode as a talisman? I don't follow the code, so I wouldn't know.
I spent many years writing Java, but never took the time to learn how to write the bytecode (that's why we have a compiler ...). I know that Clojure lets you call Java classes, but it's a completely crazy, magical, frightening, and awesome thing to see someone do it by writing their own class loader, and then writing their own class -- in bytecode! -- to load, when the _simplest_ and clearest solution would be far from that.
It's a bit like solving FizzBuzz with Tensorflow: terrifying on at least some level, as it makes it so clear that there's so much more depth I could be learning, yet exciting for almost the same reasons.
I had to wiki that, but it's a good point. The first thing about NNs that interested me, rather than images or audio or anything like that, is that a network with a single hidden layer (IIRC) can approximate any function (by adding more and more nodes to that layer).
I guess the point is also that any really deep rabbit hole is in some sense better than a selection of random 1000 rabbit holes.
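(The single-hidden-layer claim above is the universal approximation theorem. One way to see it concretely, sketched below, is to take a wide layer of random ReLU units and fit only the output weights by least squares; this "random features" shortcut is an illustration, not the standard training recipe, and the target function and sizes are arbitrary choices.)

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 400)
target = np.sin(x)                    # function to approximate

# One hidden layer of random ReLU units; only output weights are fit.
n_hidden = 300
w = rng.normal(size=n_hidden)
b = rng.uniform(-3, 3, size=n_hidden)
hidden = np.maximum(0.0, np.outer(x, w) + b)   # shape (400, n_hidden)

# Solve for the output weights by least squares.
coef, *_ = np.linalg.lstsq(hidden, target, rcond=None)
approx = hidden @ coef
max_err = np.max(np.abs(approx - target))
```

More hidden units means more kink points for the piecewise-linear approximation, which is the intuition behind "add more nodes to that layer".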
Basically when you work on creative stuff you're drawn to it because you have good taste, but for a long time your work will disappoint you because of your good taste.
I read somewhere that being able to recognize that your work is not great (yet) is a sign that you've got good taste.
And it's that taste that gets you to a place where you're able to produce amazing work.