Sleeping through the technical interview (2022) (xeiaso.net)
279 points by xena on July 7, 2023 | 108 comments



Not only is it not constant time, it's not even polynomial - it's psuedo-polynomial. Also it'll fail on negative numbers, right? You'll need something like `10000 * log(time - min(times) + 1)` to be linear in the bits used to represent the inputs.

> In computational complexity theory, a numeric algorithm runs in pseudo-polynomial time if its running time is a polynomial in the numeric value of the input (the largest integer present in the input)—but not necessarily in the length of the input (the number of bits required to represent it), which is the case for polynomial time algorithms.

https://en.m.wikipedia.org/wiki/Pseudo-polynomial_time
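
A minimal sketch of that log-scaled variant in Haskell (hypothetical: the shift by the minimum and all names here are mine, not from TFA):

    import Control.Concurrent (forkIO, threadDelay)
    import Control.Concurrent.Chan (newChan, readChan, writeChan)
    import Control.Monad (forM_, replicateM)

    -- Sleep sort with a log-scaled delay: shifting by the minimum handles
    -- negatives, and the log makes the delay grow with the bit-length of a
    -- value rather than its magnitude. Nearby values get nearly identical
    -- delays, though, so this is even more race-prone than the original.
    sleepSortLog :: [Int] -> IO [Int]
    sleepSortLog values = do
      chan <- newChan
      let lo = minimum values  -- assumes a non-empty list
          delay t = round (10000 * logBase 2 (fromIntegral (t - lo + 1) :: Double))
      forM_ values $ \t -> forkIO $ threadDelay (delay t) >> writeChan chan t
      replicateM (length values) (readChan chan)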


> Not only is it not constant time, it's not even polynomial

You understand that this is part of the joke, right?

If we really want to get down to the details and kill the joke, then you don't actually need to wait in real time. Computational complexity is concerned with steps in a computational model, not how much time passes on a clock. Sleep sort leans on OS scheduler behavior, and in a virtual-time environment, time simply advances to the next scheduled event. That's what brings you back to actual polynomial complexity, if you take that kind of environment as your computational model.

> - it's psuedo-polynomial.

If you lecture people then please at least get your spelling right.


This is really what makes the joke work IMO.

Haskell's runtime and the OS it executes on exist only as a transient implementation detail of what is, literally, a pure environment!


I still don't get it. Sleep sort still needs O(n) Haskell-steps, while for example sorting in bash is just two bash-steps: calling it and receiving the output.

I fail to see the joke, really. I only see false and nonsensical statements, which could still be funny or interesting, but I don't see how?


The "constant time" is wall clock time, where it will be at least the biggest value times the constant microseconds plus like 5% for overhead like the garbage collector.


Complexity analysis deals with asymptotic cases where N gets arbitrarily large. Sleepsort would take the same wall time to sort [1,2] and [1,2,1,2] - but it would presumably take somewhat longer to sort an array of 1e10 ones and twos, because it's not a constant time algorithm.

On the joke part, sleepsort is intrinsically an extremely funny concept and I think everyone here gets that. But "constant time" has a rigorous/pedantic definition, which sleepsort doesn't meet, so I think for some readers calling it that kills the joke (in the same sort of way that it would kill the joke if TFA's code snippets used lots of invalid syntax).


I'd never seen sleepsort before, so I thought it was funny. ;-)

I like the idea of (ab)using the scheduler to sort numbers.

Now I'm inspired to make my own "sorting" routine, maybe touching filenames and then sorting them with `ls -v`, or rendering triangles and then ripping them off in z-plane order, or stuffing frames into a video and then playing it back.


Surely any PP problem can be done in P; it's just a matter of encoding? As in, if you give me a box that computes whatever in PP, then I can give you back a box that takes a single input consisting of 1s to the length of each value delimited by 0s, turns those back into integers (linear), uses your PP box, and returns its output - but now it's polynomial in the length of my input?

(I said integers but I don't think that's significant, just a matter of encoding scheme - we can use a single 0 for decimal point and two for delimiting inputs say.)
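
A toy version of that encoding in Haskell, purely for illustration (non-negative integers only; the names are made up):

    -- Each integer n becomes n ones followed by a zero delimiter. Decoding is
    -- linear in the length of the (exponentially longer) unary string, which
    -- is what makes the box "polynomial in the length of my input".
    encodeUnary :: [Int] -> String
    encodeUnary = concatMap (\n -> replicate n '1' ++ "0")

    decodeUnary :: String -> [Int]
    decodeUnary s = case break (== '0') s of
      (ones, _ : rest) -> length ones : decodeUnary rest
      (_, [])          -> []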

But anyway isn't the joke that sleep-time doesn't count, because the computer could be doing something else? It's actually quite compelling in a 'this is stupid, I love it' sort of way.


You can indeed make the distinction between polynomial and pseudo-polynomial stop existing by requiring the inputs to be in unary. But you haven't made anything faster, you've just made almost all of the inputs exponentially longer.


Sleep sort originated in /prog/ [0]. There are quite a few HN lurkers who were active in the sleep sort thread. Perhaps xena is also one of them? :)

[0] https://www.cs.princeton.edu/courses/archive/fall13/cos226/l...


If I load gunpowder into a cannon based on the current number, such that bigger numbers get more gunpowder and are launched farther, and then I walk the distance and collect the numbers along the way, is this a physics sort?


You can just make the holes bigger and bigger until the [number] falls in. For example: https://www.youtube.com/watch?v=ZKbJ1REQ-EA


My favorite physics sort: https://youtube.com/watch?v=Nz8ssH7LiB0


Haven't thought of /prog/ for a very long while. My favourite post was someone's apprentice programmer inventing the "<=>" operator, for when you want to test for less than or equal to or greater than. Genius!


Then someone submitted <=> to the very serious people at the C++ standards committee, and they went on to ratify it in the standard.


As the author notes, this is very much in the style of aphyr's Interview series of stories like "Rewriting the Technical Interview" - [0] - which are all great reads as well.

[0] - https://aphyr.com/posts/353-rewriting-the-technical-intervie...


In a very different writing style, but also mocking the technical interview: "Fizzbuzz in Tensorflow" (2016)

https://joelgrus.com/2016/05/23/fizz-buzz-in-tensorflow/

To whet your appetite:

> interviewer: Um, you understand the problem is fizzbuzz, right?

> me: Do I ever. So, now let's talk models. I'm thinking a simple multi-layer-perceptron with one hidden layer.


It would be very easy to turn that into the kind of story that is being shown here though. Just make the interviewee seem mystical or whatever and reply in terms like "now the machine must learn, please join me in the prayers needed to ensure the safety of the journey" and then have a giant paragraph of Buddhist sutra translated into Lojban or something.


This is actually a nice tutorial


I never fail to re-read the entire series when it's brought up and will do so right now.


I was going to link to this as it immediately struck me as similar!


Computer sorts require at least linear time, as you need to read the input (if you don't know other things about the input, say that it has a uniform distribution). There are several linear-time sorts: sleepsort, postman sort, counting sort. (For a bounded set of numbers or sortable keys.)

But there is an (almost) truly constant time sort, using an abacus instead of a computer: https://en.wikipedia.org/wiki/Bead_sort
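
For reference, a counting sort sketch in Haskell, assuming non-negative Ints no larger than a known bound (that bound is exactly what buys the linearity):

    import Data.Array (accumArray, assocs)

    -- Tally each value into a bucket array, then read the buckets back in
    -- order: O(n + k) for n elements with values in [0, bound].
    countingSort :: Int -> [Int] -> [Int]
    countingSort bound xs = concat [replicate count v | (v, count) <- assocs counts]
      where counts = accumArray (+) 0 (0, bound) [(x, 1) | x <- xs]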


Sorting networks are a thing. I don't think it really changes your point, of course. :D


It’s a cute story but it’s not constant-time in any sense.

Creating ‘N’ threads and adding them all to a sorted wake list will be between O(N log N) and O(N^2), depending on the OS and/or language runtime.

There is either a sorted list, a heap, or an N^2 algorithm somewhere behind the scenes.

Similarly, the sleep sort itself is at least linear time. You have to wake N threads to output N sorted items.

Worse, in wall clock time it also scales with the values. You could compress the range by first finding the minimum and maximum values, but that’s also … linear time.


To risk killing the joke: when I said "constant time", I alluded to the words and art of time complexity analysis but actually did not mean to go there. This is a double entendre that uses two conflicting views of the word "time", because yes, it is impossible to make a sorting algorithm constant time in complexity analysis terms. The joke is that the actual intent of the statement is wall clock time. This is the time that is more relevant in job interviews (and realistically, when people pull the "write an integer sort function" thing out of their ass for me to defecate code onto the screen, they don't use numbers bigger than like 100, meaning that this program will be almost perceptually instant to run).

It's a subtle metalinguistic joke. It's supposed to poke fun at understandings of how computer science works by turning them on their head.

I'm sorry the joke didn't land for you.


It's not a "subtle metalinguistic joke". You're just redefining words. The algorithm isn't constant in complexity theory terms and it's also not constant in "wall-clock time" (I'm not sure why you think the two are unrelated?), except if you use a model of computation that is so bonkers that it would be of no value.

Maybe it works as a sort of dadaistic literature, like the ones where they redefine "chair" to mean "table" and so on, but beyond that?

Both your article and this comment of yours show that you're annoyed at algorithmic interview questions, so you're trying (well, at least in this fantasy) to one-up your interviewers by being smarter, and technically correct, but in an unexpected way. I understand the sentiment, but unfortunately, this only works when you're actually at least technically correct.


Can the wall time be considered constant if it depends on the magnitude of the numbers to sort? A list of numbers between 1 and 1000 will be much faster to sort than a list of numbers between 1 and 10^100. EDIT: I did get the joke when I read your post, and laughed to myself, so well done :)


The constant is the biggest number in the list multiplied by the sleep multiplicand. That plus overhead gets you the approximate runtime of the function.


If you just repeat the same number a million times, then the threads will all be scheduled to run at the same time, but of course, your computer doesn't have a million cores, so the threads will have to be executed partially in sequence. That means, even if you assume the largest integer is a constant (valid for some integer types, but notably not Integer in Haskell), the runtime will be proportional to the number of elements in the list.


Yeah, but technically that doesn't qualify as a constant, since it depends on the input. If you explicitly constrain the problem space so that the magnitude of the numbers in the list has a fixed upper bound, then it works out. That's fine, but then it's not a fully general sorting algorithm like e.g. quicksort.


Everything's constant time in a universe with a finite lifetime.

https://en.wikipedia.org/wiki/Ultimate_fate_of_the_universe#...


Sure, but in such a world asymptotic complexity has little meaning. For practical purposes it's useful to make the simplifying assumptions that complexity theory makes.


In theory, since the argument to the sleep must eventually resolve to an integer, it could use a radix-sort or similar to perform in linear time. There are problem spaces where that can be a win, even after introducing a dependency on the magnitude of the largest value.

In practice, of course, no system actually does this. Syscall timeouts just aren't a place where this is typically beneficial. Also, naturally, it would be better to just directly apply a radix-sort which only scales with log(max_value) rather than linearly with max_value.
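
A rough sketch of such a radix sort in Haskell, assuming non-negative Ints and 8 bits per pass (a sketch of the general technique, not of any real timeout implementation):

    import Data.Array (accumArray, elems)
    import Data.Bits (shiftR, (.&.))

    -- LSD radix sort: each pass is a stable bucket split on one byte, so total
    -- work is O(n * passes), where passes grows with log(max_value) -- linear
    -- in n and only logarithmic in the magnitude of the largest value.
    radixSort :: [Int] -> [Int]
    radixSort = go 0
      where
        go shift xs
          | all (\x -> x `shiftR` shift == 0) xs = xs  -- no bits left above this point
          | otherwise = go (shift + 8) (pass shift xs)
        pass shift xs =
          concatMap reverse . elems $
            accumArray (flip (:)) [] (0, 255)
              [ ((x `shiftR` shift) .&. 255, x) | x <- xs ]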


> Creating ‘N’ threads and adding them all to a sorted wake list will be between O(N log N) and O(N^2), depending on the OS and/or language runtime.

This is not a fundamental limitation of scheduling systems, especially if you take into account the possibility of special hardware allowing for constant-time-with-respect-to-number-of-threads scheduling. For example, it is trivial, though economically infeasible for any practical purpose, to construct a scheduler that performs scheduling by bouncing infopackets via laser beams off of a gargantuan set of mirrors of varying distances back onto a detector connected to the computer, relying on the speed of light to perform a delay of the specified duration. This shows that sleep sort does not inherently rely on the hidden algorithmic complexity of whatever thread scheduling approach is used, since that can be optimized to O(1) in theory, if not in practice.


This is kind of like the style of building a kubernetes cluster that returns "Hello World"


There's a parallel universe where someone wrote the k8s hello world story and made fun of interviewers testing candidates on the most obscure sorting algorithms, instead of the opposite as here, and the psychological profile painted of the interviewee is mostly the same. Maybe not as punchy due to all the YAML, ofc.


Hey all! If you like this, there's a sequel of sorts in Protos: https://xeiaso.net/blog/protos

I'm working on more stories in this "universe" but it takes a while for the satire juice to build up. Maybe the next one will be on spatial computing.


It was fun, I especially liked the commute and optimisation! I'm always happy to see new articles on your blog, thanks for writing and sharing it.


> Just in time for his calendar to fire a reminder that his standup meeting was about to start

Sounds like our universe . . .

I think that universe is better at naming things.


> "Yeah, I'm here. I've just never seen that before. It's a very...inventive sorting algorithm. How would you optimize it?"

> In a flash, one line of code is changed:

    -forM_ values (\time -> forkIO $ threadDelay (100000 * time) >> writeChan chan time)
    +forM_ values (\time -> forkIO $ threadDelay (10000 * time) >> writeChan chan time)
> "It is now ten times faster."

Related: https://thedailywtf.com/articles/The-Speedup-Loop


Didn't read the article. But I hate these things... I had a remote interview with Meta once, the dude was eating in the microphone the entire time. It was so distracting I forgot how to write a for-loop.


Hmm, I much prefer remote recruitment. I remember the old days. You had a brief chat with a recruiter or HR, then you had to suit up, drive/fly to some far place (often it would take an entire day - if you're working at the time, it necessitates a day off). You would stress: am I wasting my limited days off on these guys? Will there be parking there? Will I get there on time? Then you'd have a half-hour "initial interview", then you'd wait, sometimes for weeks, to either get an invite for the interview proper, or to never hear from them again...

The whole process could take a month, require at least 2 days off, and involve a significant amount of travel.

Today? A recruiter/HR person rings you and asks, can we have a video call? You have that 15~20 min call the same day, they submit your CV to whoever makes the decisions, then they schedule one or more video interviews/tech sessions, some companies ask you to do behavioral/skill tests from the comfort of your own home. All of it can be done during a lunch break if you're a remote worker... What's not to like?

Yes, the communication bandwidth face to face is much higher, but only remote allows me to interview for a company in Tel Aviv in the morning, Warsaw midday and a Californian one in the evening all on the same day.


I mean, isn't spawning 1000 threads at least linear time? You could maybe get it down to log, but I'm not sure that code does that automatically.


Depends on what you mean by "time". Going by algorithmic complexity, it's at least linear, yes. Going by wall clock time (the more important one for interview code), it's constant.


For small sizes of N, the time to spawn the threads is just inconsequential compared to the time spent waiting for the threads to exit. But by doubling the number of numbers to sort, you do double the time (actual wall clock time) that it takes to spawn those threads. O(N + 10000000) is still O(N).

Not to mention that handling N threads would likely take the OS some kind of non-linear time - at least N^2 if I had to take a shot in the dark. But I imagine it'd depend on the OS and how the hardware handles interrupts. Even if a thread is sleeping, it still takes resources to schedule it.


Linux's CFS is O(log N) to insert; O(1) to switch tasks (N being number of threads). [1] So O(N log N) it seems.

[1] https://en.wikipedia.org/wiki/Completely_Fair_Scheduler#Algo...


You're forgetting that algorithmic complexity deals with asymptotic behavior.

An algorithm that takes N inputs, does N operations, and then sleeps for 10 seconds is not constant time, because in the asymptotic case the N iterations will take longer than the sleep.


I don’t understand how wall clock time is constant.

Sorting [5, 50] is faster in wall time than [6, 60].


The constant is the biggest number in the array


If runtime depends on the input, it’s not constant time.


It's constant with regards to the length of the input array, which is the usual metric for sort algorithm performance. That is, d(runtime)/d(len(input)) can be constant, even though d(runtime)/d(max(input)) is not, if you assume that len(input) is independent from max(input) (which is inconsistent with input having uniformly distributed random members, which is probably the more common assumption).


It's constant with respect to the number of elements, just not the maximum magnitude of those elements.


I don't see how it is. Different elements, even the same number of them, give a different time. There's no constant time between [0, 1] and [0, 10], both having two elements.


[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]

is just as fast as

[9]

If you really want to make it strictly constant time, just append INT32_MAX to the end of the array before sorting, and then pop it after.


Oh yeah, because assuming maximum values in your domain doesn't move all algorithms into constant time and render complexity theory void.


It doesn't though. How would merge sort become constant time if you assume a maximum value? It's also a joke...


The main "value" in the domain of sorting problems is the number of elements in your collection.

A subproblem also considers the elements to be integers, then they become another "value" domain. (But in general, sorting problems only need their elements to be comparable, not necessarily integers.)


It’s not constant; the algorithm runs in pseudo-polynomial time.


I don't think that sleepsort is any more constant time than any other sorting algorithm. For sleepsort to be constant time, you must

   - have an upper bound on the input (largest number)
   - not count some arbitrary things, like reading and processing the input, or spawning threads

But if you are allowed to do these things, all the other sorts will become constant time too. (I believe only one of them is enough.)


It isn’t really linear. Sleeping is a heap insertion which takes O(log n).


You still have to read all the inputs at least, which is at least linear time, in the size of the input.


Each sleep is log n for the heap insertion, so the overall complexity is at least n log n


Well, they were indeed sleeping through the interview. Otherwise I don't know how one could possibly claim that a program whose first statement is a sequential loop over all values of the input has "constant time" asymptotic complexity.


As with everything in algorithm analysis, it depends on your model of computation. That is, it depends what you're counting.

For sorting (comparison sorts), one fairly typical model is to just count how many comparisons you do. Which, this does none (kind of, not really, they're really just implicit or hidden in the OS).

It's just playing around with computational models, not a serious proposal. It's either just a joke, a troll, or a parable about the need to be careful about what metric you're measuring. Or some combination of those.


> Which, this does none

Neither does bucket sort, and nobody has claimed that bucket sort is constant in time.

We only count comparisons in typical sorting algorithms because we know that the overall complexity is proportional to it.


Yeah, that's kind of the point I was trying to make. It's an inappropriate computational model, so the answers you get out aren't meaningful.


Actually, if the thread runtime maintains its own concept of time, the algorithm doesn't even need to sleep in real time. Once all the threads have been created, the thread runtime can notice that all threads are idle and that the next thread to be scheduled is the one at time N, so it can simply update the current time to N and start executing that thread. Repeat for each thread and you'll have your sorted array out without any sleeps.

After all, all the sorting work was already done when the threads started sleeping, having registered themselves with the orchestrator (timer wheel etc) that will eventually wake them up. Actually performing the sleep is not necessary.

I don't know about Haskell, but with Rust the tokio runtime lets you do this using start_paused ( https://docs.rs/tokio/latest/tokio/runtime/struct.Builder.ht... )


My understanding is that this is basically how discrete-event simulations work under the hood. You have a frontier of future events in an appropriate data structure (say, a heap) and then alternate between adding future events into the heap and pulling the next event off the heap.

So, modulo several layers of abstraction and a bunch of implementation details I'm glossing over, using a scheduler like this to sort values is just a heapsort :)
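
A toy Haskell sketch of that reduction (names invented here; a real runtime would keep a timer wheel or heap rather than sorting a list):

    import Data.List (sortOn)

    -- A "sleeping" thread is just an event registered at virtual time = value.
    -- The scheduler never actually sleeps: it advances virtual time straight
    -- to the next wake-up, and draining events in wake-up order is the sort.
    data Event a = Event { wakeAt :: Int, payload :: a }

    virtualSleepSort :: [Int] -> [Int]
    virtualSleepSort values = map payload (sortOn wakeAt [Event v v | v <- values])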


Actually, if you start working out which one is the next to run, you have reinvented selection-sort and you are no longer linear time. (Hence why this is not a sensible sorting algorithm in reality :) )


Which is why I wrote:

>After all, all the sorting work was already done when the threads started sleeping, having registered themselves with the orchestrator (timer wheel etc) that will eventually wake them up.


It occurs to me that this might actually be a practical algorithm on an FPGA. It would need a couple of refinements to handle duplicates of the same value, but for lists of constrained length it could be a useful technique.


I keep a list of these sorts of creative-writing-programming posts here [1] (forgive the bare and broken theme; I don't actually post, so work on the site has not been started, let alone finished).

Please let me know if you have any suggestions that might fit in. I'd be glad to even include less mystical folklore like the Story of Mel, but there's clearly a theme going here.

1: https://www.illucid.net/posts/homages-to-aphyrs-technical-in...


I have another one here but it's not very good: https://xeiaso.net/blog/voiding-the-interview-2017-04-16


> but it's not very good

My criteria don't involve judgement calls :p

The tough choice was to exclude projects without prose; I myself am more comfortable writing code. My own contribution is still down the road, and I have great respect for those who have put their words (and themselves) into the world.


I'm also one of those people who don't "get the joke". This could be a mildly interesting example of a sort that doesn't need to do any comparisons, but it's certainly not "constant time" given that it needs to loop over all the input values (twice, actually).

In the end, this is basically just bucket sort, except that the buckets are managed by the OS. And nobody has ever claimed that bucket sort is constant time.


I used to think all this aphyr-like "jargon" about coding as snatching stuff up from the void was a joke; nowadays I am not so sure.


Again, we etch runes into stones, and by imbuing the runes with lightning, make the stones come alive and do our bidding. Tell me again computers aren't magic!


For an earlier example of a related vibe, I recommend checking out SICP: https://web.mit.edu/6.001/6.037/sicp.pdf#page29


The best jokes are always the truth!


When the void is the norm, it is no longer the void. This is what everyone in tech aspires to.


I handed in an assignment in my C class in college that used this algorithm for the sorting part of the assignment.


That is incredible. What did the teacher say?

I actually got away with using SleepSort in an interview for my current job. It got me the job.


Meh. Smells of "rockstar developer" thinking. I'll pass. This is a job.


Sleep sort is one of the genuinely "inventive" discoveries to come out of 4chan.


Very cool, but will the threads necessarily wake up deterministically? One may wake up before another but not get the CPU before it, correct? (Forgive me if I'm misunderstanding the code.)


Yes, the algorithm suffers from a race condition. Time is not a synchronization primitive.


Yup. If we are being pedantic, and I am sure the author of this loves being pedantic, any number of factors could cause this algorithm to be incorrect. Run it on a slow enough CPU, for instance, and it will output the wrong answer.

And correctness is the single most important part of an algorithm. We can do any problem in constant time if we don't mind that the answers are wrong.


return(4)


> I am looking at joining a smaller company

> Against all odds, they wanted to hire you. For a significant amount of money

A small company not trying to cheapskate? Literally unreadable.


This was really funny. Got lost in it, and the site, for a while yesterday. Made for an enjoyable weekend. Many thanks to the author.


Sleep sort is a meme.

If you pick a small unit of time you can make it faster but if the values are sparsely distributed it gets slower.


Super interesting on the sleepsort thing, amazing stuff! But for some reason, the tone and the excessive acrobatics/"bragging" kind of take away from the article for me.

Okay, if the interviewer had properly done 5 minutes of research on this candidate, they would know that the candidate is "way over their pay grade". I found all that dance that the article author did a little bit... unnecessary? Kind of how killer whales play with their prey before the kill. Overkill.

Sorry, my answer probably isn't productive, I am merely sharing how I felt after reading it.

I am now off to educate myself on SleepSort :-)


The tone is styled off another popular series :)

https://aphyr.com/posts/341-hexing-the-technical-interview


It's the mark of a toxic person, probably. But aren't we all, to some extent? I too am fed up with candidates so shit that we need to grind them through linked-list and sorting exercises to find the one who can actually code a simple function, out of the 50 we interview who couldn't even start writing.

People have no idea how many candidates try to apply to a backend dev job at an investment bank for 100k a year in China with NO experience in banking, computer science, or life in general. Between the guys telling you their dream job is to do nothing and be paid for it, and the dude looking beyond his screen at his friend filling in the exercise for him, I've seen it all.



It's not an article though, it's a fictitious story, so of course everything is embellished for comedic effect.


Seriously? I can't think of a single instance where an interviewer has spent more than a cursory 5 seconds reading a candidate's CV before interviewing them. Many times not even that much courtesy.


God yes. In case it's funny, I have a bit that I left on the cutting room floor where it's like:

Jeff started again to speak, maybe he actually read your resume. That would be a first. "So it says here that uhhhh-".

Alas, he did not. You wonder why you make that thing if nobody is going to actually read it. You even spent the extra time putting it into normal human language too. The travesty of recruiting continues.

Or something like that. There is so much about the job seeking process that is just stupid and frankly funny.

I'm considering giving Palima an intern in the next article, just because that would be hilarious to me.


Sleep sort - invented by 4chan.

The rest of the short story also reads like it came straight from r9k


I wasn't expecting lojban! "The drink of the gods"


Esperanto, not lojban


Hahahaha I'm in love. Thanks for sharing that one lol.


This blog has nice stories, but what really does it for me are the cute interjections using random conlangs. Here it was a well-known one, but I recall having seen some toki pona another day (can't find it now).


[flagged]


The implication made here is that the interviewee is presumably running an OS such as Gentoo or similar which has the spirit of compiling everything from source. Or this person only compiles their browser from scratch which has some privacy benefits (removing unwanted telemetry, no hidden surprises in the binary, etc.) and can help performance. I have known some people who compiled their own Chromium.

While the person who wrote the article might seem odd and I would not have hired them, your condescending tone while nitpicking a perfectly valid technical statement strikes me as weird.


Some parts that are done can already be running while other parts are still compiling.

I'm seriously concerned about all the besserwissers in this thread.


I want you to know that I'm enjoying your replies as much as the linked post, if not more.


Yes, I'm aware that this is impossible in plane 432. This is part of the framing of Palima as some kind of programming/SRE god that is able to do things previously thought impossible. It's fiction, belief can be suspended just that little bit.


> Hopefully the important parts would be done in time for the call.

implies this was no mistake in writing



