
Look ma, no locks - ingve
http://lucteo.ro/2018/11/18/look-ma-no-locks/
======
barbegal
This is a confusing article. The main content is a description of how to
implement a simple operating-system scheduler, which is effectively
implementing locks using instruction-level primitives.

The author has completely neglected the most common cases where either "tasks"
require multiple resources (e.g. when copying between two files) or where the
ordering of tasks is important. It is these two conditions that lead to
deadlocks and livelocks.

~~~
sterlind
Most (though not all, see Midori) OSes use preemptive multitasking and
threads. These tasks are cooperating coroutines - a different beast. Of course
you can use tasks to implement a thread scheduler, but this machinery would
probably live in a user-mode runtime.

As for the neglected cases, you can build those out of these primitives:
canonical ordering of tasks for multiple resource acquisition; work stealing
and/or a DAG executor where ordering is important.
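The first of those, canonical lock ordering, can be sketched in a few lines. This is an illustrative Python sketch (the function names are made up): every task that needs several resources acquires their locks in one global order, here by object id, which rules out the circular wait that causes deadlock:

```python
import threading

def acquire_in_order(*locks):
    # Sort by id() so every task acquires the same locks in the same
    # global order, eliminating the circular-wait deadlock condition.
    ordered = sorted(locks, key=id)
    for lock in ordered:
        lock.acquire()
    return ordered

def release_all(locks):
    # Release in reverse acquisition order.
    for lock in reversed(locks):
        lock.release()

# Hypothetical "copy between two files" task needing two resources.
src_lock, dst_lock = threading.Lock(), threading.Lock()

def copy_task(a, b):
    held = acquire_in_order(a, b)
    try:
        return "copied"   # stand-in for the real work
    finally:
        release_all(held)
```

Two tasks calling `copy_task(src_lock, dst_lock)` and `copy_task(dst_lock, src_lock)` can no longer deadlock, because both end up locking in the same order.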

My only criticism is that it's a little redundant to implement both mutexes
and semaphores - a mutex is just a semaphore of 1. But the rest is wonderful.

~~~
plasticchris
A mutex is for resource locking; a semaphore is for signaling. That means
their implementations differ in ways that go beyond simply capping a
semaphore's count to get a mutex.

~~~
jamescostian
AFAIK, the only difference between a "semaphore of 1" and a mutex is that a
semaphore can always be signaled by a thread that isn't holding it, whereas a
mutex can sometimes only be unlocked by whoever acquired it.

As long as you trust the code running on your machine (if you don't, you have
bigger issues to worry about), you'll be fine - nobody signals a semaphore
before waiting on it if they're hoping to achieve mutual exclusion.
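A quick sketch of both halves of that, using Python's threading module for illustration: a `Semaphore(1)` gives you mutual exclusion just fine, and the ownership difference shows up in that a semaphore accepts a release from any thread, while an owner-checked lock (an `RLock` here) refuses:

```python
import threading

counter = 0

def worker(sem, n):
    global counter
    for _ in range(n):
        with sem:            # wait ... signal brackets the critical section
            counter += 1

# A "semaphore of 1" used as a mutex.
sem = threading.Semaphore(1)
threads = [threading.Thread(target=worker, args=(sem, 10_000)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)               # 40000 - mutual exclusion held

# The ownership difference: any thread may signal a semaphore...
sem.release()                # accepted; the count simply goes up

# ...but an owner-checked lock refuses a release by a non-owner.
owner_checked = threading.RLock()
try:
    owner_checked.release()
except RuntimeError:
    print("RLock refuses a release by a thread that never acquired it")
```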

------
Noumenon72
I'm expecting someone to come denigrate your approach in the comments, but the
fact you made this readable and kept me with you to the end is a real feat of
communication. Thanks for putting in all the effort to re-explain, color-code,
and consider objections.

------
kettlecorn
I believe a number of video game engines use this approach.

There’s a talk about this approach in the game Destiny:
https://www.gdcvault.com/play/1022164/Multithreading-the-Entire-Destiny

And also in the Uncharted engine:
http://twvideo01.ubm-us.net/o1/vault/gdc2015/presentations/Gyrling_Christian_Parallelizing_The_Naughty.pdf

They go a step further by allowing threads to pause and resume their state by
changing the stack pointer.

These sorts of systems are useful in games, where you basically want to be
using all of the CPU all of the time and there’s no reason to ever give
control back to the operating system’s scheduler (as blocking on mutexes
would).

------
cyphar
I'm not entirely sure that I agree with the section comparing mutexes and task
dispatch. It appears that the comparison is actually between a _spinlock_ and
task dispatch -- most other kinds of mutexes put a process into an
interruptible state that allows the scheduler to run a different process. So I
would contend that they act in a very similar fashion in practice. At least,
that's how locking works within the Linux kernel (and I think pthreads locks
work the same way on Linux).

However, I do quite like this way of thinking about synchronisation. I wonder
how many other kinds of synchronisation primitives (I'm thinking about RCU and
seqlocks) can be re-thought using this method.

------
gpderetta
Nice article.

A few high performance C++ systems use variants of the dispatcher approach.
For example ScyllaDB and I know of a few HFT systems like that. Boost.asio is
similar (the serializer is, or used to be, called strand) and so is the
dispatcher proposal in C++.

A few systems I have seen strictly map each dispatcher to a thread (which is
in turn exclusively pinned to a hardware core), so the serialization is
implicit. "Active" objects expose their home dispatcher, and to interact with
an object one posts a lambda to that object's dispatcher. In practice this
is a form of async message passing. To get the answer back either a response
is a form of async message passing. To get the answer back either a response
callback is posted to the original thread or some future/promise abstraction
is used.
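A minimal sketch of that pattern (the `Dispatcher` and `Counter` names are invented for illustration, and a plain thread draining a queue stands in for a core-pinned dispatcher): all mutation of an active object is posted as a closure to its home dispatcher, and the reply comes back through a future:

```python
import queue
import threading
from concurrent.futures import Future

class Dispatcher:
    """A serializer: one worker thread drains a queue of closures, so
    everything posted here runs one task at a time, with no locks."""
    def __init__(self):
        self._q = queue.Queue()
        self._worker = threading.Thread(target=self._run, daemon=True)
        self._worker.start()

    def _run(self):
        while True:
            fn = self._q.get()
            if fn is None:        # sentinel: shut down
                break
            fn()

    def post(self, fn):
        self._q.put(fn)

    def call(self, fn):
        """Post fn and hand back a future holding its eventual result."""
        fut = Future()
        def task():
            try:
                fut.set_result(fn())
            except Exception as exc:
                fut.set_exception(exc)
        self.post(task)
        return fut

    def stop(self):
        self._q.put(None)
        self._worker.join()

class Counter:
    """A hypothetical 'active' object: all state changes go through its
    home dispatcher, so self.value needs no lock."""
    def __init__(self, dispatcher):
        self.home = dispatcher
        self.value = 0

    def add(self, n):
        return self.home.call(lambda: self._add(n))

    def _add(self, n):            # only ever runs on the home dispatcher
        self.value += n
        return self.value
```

Usage looks like `fut = counter.add(1)` from any thread, then `fut.result()` for the answer - async message passing with a future for the reply, as described above.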

------
kahlonel
> Here, instead of blocking on the second thread when executing the blue task,
> the scheduler picks up a yellow task;

This only works if the second thread's yellow task is independent of its blue
task's results. Which is, unfortunately, only 1% of the cases.

------
proneb1rd
I think the author eventually came up with what libdispatch already does using
queues.

------
tyingq
Interesting that a 101-style explanation wouldn't tell you that "mutex" is
derived from "mutual exclusion".

Otherwise, a pretty nice overview. The bar charts are helpful.

~~~
ludston
Likely because the target audience is programmers whom use mutexes rather than
programmers whom do not write threaded code.

~~~
swift532
Please don't be mad, as I'm not trying to provoke, but I believe 'who' goes
here, not 'whom'. From what I understand, 'whom' is used when something is
being done/given to someone. Perhaps I'm simplifying, but it works just like
nouns in the accusative case in languages that have grammatical cases.

~~~
usefulcat
TL;DR: 'whom' is to 'him' as 'who' is to 'he'.

For whom does the bell toll? It tolls for him (not 'he').

Who is there? He (not 'him') is there.

~~~
rlucas
Ooh, now do the predicate nominative!

------
djcapelis
This is kinda the type of model I use my little lock-free SPSC workqueues
for: https://github.com/djcapelis/atomic-ring

You just have things do stuff and then toss pointers to threads on the other
side of the workqueue. Most things I do this with I basically keep each thread
in a different part of the pipeline and they pass the task to the next one.
Sometimes this means each thread has a warm cache for the part of the task
it's working on, assuming the majority of the working set of each task is more
correlated with the stage of the pipeline than the specific data moving
through it.
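The core of such a ring can be sketched in a few lines (a Python rendition of the idea, with the GIL standing in for the acquire/release atomics a real lock-free version would use; `SpscRing` is an illustrative name, not the actual atomic-ring API). With exactly one producer and one consumer, the head index is written only by the consumer and the tail only by the producer, so neither side needs a lock:

```python
class SpscRing:
    """Minimal single-producer/single-consumer ring buffer sketch."""
    def __init__(self, capacity):
        self._buf = [None] * (capacity + 1)  # one slot kept empty to
        self._head = 0  # next slot to read  (written by consumer only)
        self._tail = 0  # next slot to write (written by producer only)

    def push(self, item):
        nxt = (self._tail + 1) % len(self._buf)
        if nxt == self._head:
            return False              # full
        self._buf[self._tail] = item
        self._tail = nxt              # publish only after the write
        return True

    def pop(self):
        if self._head == self._tail:
            return None               # empty
        item = self._buf[self._head]
        self._head = (self._head + 1) % len(self._buf)
        return item
```

The "one slot kept empty" convention lets full and empty be distinguished from the two indices alone, which is what makes the single-writer-per-index scheme work.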

------
gok
Sounds like libdispatch

~~~
thealistra
Yep, working in iOS/OS X for 9 years now. The primitives are queues, not
mutexes.

------
joantune
This sounds a lot like going from the Lock/Mutexes/Semaphores paradigm into an
Actor system.

OP - i.e. @ingve - it might be interesting to take a look at some strategies
- like the pull-based one - in actor systems like Akka.

------
al2o3cr
The skeleton of the approach suggested reminds me of how the BEAM runs
processes: one OS thread per CPU, scheduling many lightweight tasks listening
to message queues.

------
bullen
If you want to see a practical working implementation of what this article
describes: [https://github.com/tinspin/rupy](https://github.com/tinspin/rupy)
(Java HTTP Server)

------
Gupie
What does he mean by "static world", "static task"?

~~~
lucozade
They mean that, if you know upfront how long every task takes and the order
they need to run in, it's straightforward to lay out a schedule that has no
resource contention.

I'm not sure that's true in general, but it's certainly easier than when you
don't know those things.

------
Annatar
Enqueuing tasks per resource is a very old concept in computer science; this
is nothing new and anyone with a computer science degree should have come out
of the university with having written such a task scheduler under their belt.

~~~
antoinevg
Yet, most people with a computer science degree have not come out of the
university with having written such a task scheduler under their belt.

~~~
yellowapple
Also, not all of us have computer science degrees.

~~~
Annatar
Then it's high time to get one, otherwise those people have no business being
in the IT industry.

~~~
yellowapple
Cool, so I guess my career is meaningless, then. :)

Getting a degree is something I very much want to do, but it's something I
haven't been able to do (due to the time and money required to do it). I ended
up jumping straight into the workforce in the hopes that at some point I'll
have saved up enough to actually enroll (since my high school grades left much
to be desired; it's a miracle I even got my HS diploma).

Really, though, 99.995% of IT (in my experience at least) doesn't actually
require a degree (obviously, since I'm doing it right this very moment without
a degree and have been doing so for more than half a decade). It _does_
require a lot of background knowledge that, sure, might've come to me faster
had I received a formal education in IT, but I've been able to acquire it just
fine through real-world work experience (and have been paid for it, to boot).

My interest in getting a degree has more to do with my interest in a lot of
the more theoretical concepts. I'm able to stumble through a lot of things via
self teaching, but I do feel like I'm missing various critical pieces, so I'd
like to fix that. None of those pieces are getting in the way of me doing my
current job, but I do feel like my career is prone to stagnation if I don't
have those pieces.

~~~
Annatar
_Really, though, 99.995% of IT (in my experience at least) doesn't actually
require a degree (obviously, since I'm doing it right this very moment without
a degree and have been doing so for more than half a decade). It does require
a lot of background knowledge that, sure, might've come to me faster had I
received a formal education in IT, but I've been able to acquire it just fine
through real-world work experience (and have been paid for it, to boot)._

Unfortunately, that is not true: you can _get by_ twiddling something, but the
software you'll write without formal education in computer algorithms and data
structures will always be inferior and often unmaintainable. This is from
personal experience debugging lots of code over the years written by people
without formal computer science education.

You could attempt to make the argument that not everyone needs to be a
programmer, but system administration skills of such people are also subpar:
sooner or later one needs to automate, and the shell scripts written by such
people are always either hacked together or overcomplicated because the
knowledge of algorithms, data structures and common solutions (like this
enqueuing scheduler) simply isn't there. It's exceedingly difficult to educate
oneself on something one isn't even aware exists.

If you really love IT and love what you do, get a computer science degree
which provides a quality, required education (and not just for the paper so
you can say you have one).

~~~
yellowapple
I feel like folks who are this elitist about a degree being mandatory
(regardless of field) are trying to justify their crippling student debt :)

Yes, an academic background is helpful in IT. Yes, I do see the value in
acquiring that formal education, and certainly wish to do so. No, it is not
mandatory, and no, it does not magically turn people into good programmers.

And no, I am certainly not going to stop working because some random
commentator on Hacker News thinks that having a successful IT career is
strictly dependent on having a CS degree despite such an assertion being
demonstrably false. Bill Gates and Mark Zuckerberg and Michael Dell managed
without. Larry Wall's degree was in the study of languages (of the non-
programming variety), and that's what paved the way for him to create Perl.

Besides, CS grads are no more immune to writing bad code, in my experience
debugging (and/or working around) code written by people with those degrees.
The degree itself is not a reliable indicator of whether or not someone is
competent.

~~~
Annatar
"I feel like folks who are this elitist about a degree being mandatory
(regardless of field) are trying to justify their crippling student debt :)"

However, feelings aren't facts: I paid the remaining 12,000 USD of my student
loans to the federal government within months of graduating. The rest of my
tuition was covered by scholarships because of my 4.0 grade point average, and
the other part by my then employer, minus the IRS taxable parts which I paid
fair and square. Those scholarships were free money: all I had to do was
maintain a grade point average higher than 3.0 and send in my report card at
the end of every semester. That's reality.

As far as immunity to bad code, you really, really need to watch Brian
Cantrill's videos; if he doesn't change your mind about that, nothing and no
one will.

~~~
yellowapple
That's great. Not all of us can afford to burn $12k. I certainly didn't have a
4.0 GPA, so it ain't like I would've qualified for any scholarships of
significant value. I did qualify for FAFSA, and used that for a semester of
community college before I started working full-time. Wasn't much, though.

I'm sure Brian Cantrill is some really smart person who will totally blow my
socks off with reasons why I'm a worthless piece of shit because I don't have
a college degree. Regardless, considering the sheer amount of terrible
software written by thoroughly-educated people, and the sheer number of
diploma mills out there that'll graduate anyone with a pulse (and even that's
optional), it's patently obvious that a degree is - again - not a reliable
indicator of competence.

~~~
Annatar
"Not all of us can afford to burn $12k."

Getting an education is never burning money. I could have made the minimum
payments of a couple of hundred dollars per month for the next ten years but I
wanted to live free so I mercilessly paid it off as fast as I could because
freedom is dear to me, I had my entire life ahead of me (I was 21 at the time)
and many plans fueled by megalomaniacal ambition. I paid it all off from my
salary working in IT.

If you think like you do, then it's no wonder a mere $12,000 USD is a lot of
money to you. Not only was I a full time student, but I also worked full time
and for about six months I'd rack up another 30 hours working part time every
weekend, just in case I lost my first job. I did the bulk of my homework on
the bus and then I studied when I got home. 134 miles round trip every day.

Moving on to your comment about Brian: all he is saying is that computer
science education is well worth it if one works in the computer industry, and
he is qualified to say so because not only is he a computer scientist but also
a kernel engineer and exceptionally good at both writing and debugging code.
And yes he's super smart.

Finally, I want to touch upon your remark that my comment was _elitist_. I
thought about that for quite a while, and I concluded that yes, it _is_
elitist, and there should be no apologies made for that: computer science _is_
hard, and managing to get a degree in it does make one intellectual elite.
It's elitist because it's hard and challenging to earn. I studied like crazy,
graduated at the top of my class, I applied myself because I lived for
computers and therefore I refuse to make any apologies for becoming the elite
just as I refuse to feel bad about it.

~~~
yellowapple
"If you think like you do, then it's no wonder a mere $12,000 USD is a lot of
money to you."

$12k is a decent down payment on a small house or condo (at least if you're
living outside of a place with grotesquely-inflated property values like the
SF Bay Area), or a significant amount of the way toward a new or lightly-used
car. It's more than the vast majority of Americans have in their savings
accounts. That you consider that to be trivial betrays how thoroughly
out-of-touch you seem to be with the real world.

Regardless, given my mediocre-at-best academic record in high school, my
tuition likely would have been much higher than a "mere" $12k even with my (at
the time, and probably still in effect) plan of pursuing the Associate's
degree first (unless I stop at the Associate's degree, but I have a feeling
folks like you would still be snobbish about that).

"managing to get a degree in it does make one intellectual elite"

No, it really doesn't. Not all institutions are equal in terms of academic
rigor, and not everyone graduates from CMU or MIT; some of them went to ITT
Tech or got to be a "Legacy" at Harvard. High-schoolers in America (myself
included) were/are constantly bombarded with propaganda about how "if you just
get a college degree you'll make so much more money" and then the kids flock
to whichever degree mills will get them that piece of paper. And again, I know
from direct personal experience dealing with CS-degree-holders that the degree
itself is not a reliable indicator of competence _for that reason_.

If you think that a piece of paper is the thing making you an "intellectual
elite", then you're in for a rude awakening when the rest of us manage to peer
through that veneer. What _does_ make you an "intellectual elite" is actual
merit, and that is _earned_ , be it through actually-rigorous academic study
or actually-rigorous experience in the real world (and ultimately, the elite
of the elite will more often than not have an abundance of both; most such
folks start with the academic side of that equation, but I missed the boat on
that so I guess I'll have to make do coming at it from the opposite
direction).

------
tirumaraiselvan
I would like to compare this with Haskell's STM.

------
contravariant
Looks like a reinvention of async/await.

