
It Isn’t Your Father’s Realtime Anymore: The Misuse of a Noble Term (2006) - matt_d
https://queue.acm.org/detail.cfm?id=1117409
======
luckydude
I blame all the misguided folks who try to add "soft real time" to
time-sharing systems like Unix/Linux/etc. Anyone who does that has a profound
misunderstanding of what real time is; it most definitely is not a
time-sharing system.

If you want a time-sharing system and real time, that problem has been solved
in a really elegant way. Victor Yodaiken did it years ago: he made a small
real-time kernel and ran all of Linux as the idle process in the real-time
kernel. When Linux thought it was disabling interrupts it wasn't; only the
real-time kernel could do that.

He had a demo where he was running xperf (or some fairly compute-intensive X
thing) while running

tar cf - / | rsh someotherhost 'cat > /dev/null'

while the real time kernel was gathering events. He _never_ missed a deadline
or dropped an event.

Sweetest design I've ever seen, have your cake and eat it too.
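
The interrupt trick can be sketched roughly like this (hypothetical names,
nothing like the actual RTLinux API): Linux's "disable interrupts" only sets a
software flag, while the real-time kernel keeps taking real hardware
interrupts and queues them for Linux to handle later.

```c
#include <stdbool.h>
#include <stdio.h>

/* Sketch of RTLinux-style interrupt virtualization (hypothetical names,
 * not the real API).  The RT kernel always sees hardware interrupts;
 * Linux only *thinks* it can disable them. */

bool linux_irqs_enabled = true;
unsigned pending = 0;              /* IRQs queued for Linux while "disabled" */

void linux_handle_irq(int irq) { printf("linux handles irq %d\n", irq); }

/* What Linux believes is "cli": just a flag, the hardware is untouched. */
void linux_cli(void) { linux_irqs_enabled = false; }

/* "sti": re-enable and deliver everything queued in the meantime. */
void linux_sti(void) {
    linux_irqs_enabled = true;
    for (int irq = 0; irq < 32; irq++)
        if (pending & (1u << irq)) {
            pending &= ~(1u << irq);
            linux_handle_irq(irq);
        }
}

/* Real hardware interrupt: the RT kernel runs its own handler first, with
 * bounded latency, then either passes the IRQ to Linux or defers it. */
void rt_hard_irq(int irq) {
    /* ... hard real-time handler would run here ... */
    if (linux_irqs_enabled)
        linux_handle_irq(irq);
    else
        pending |= 1u << irq;      /* Linux gets it when it does "sti" */
}
```

The real-time side never waits on Linux, so Linux's scheduling jitter can't
leak into the RT deadlines.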

~~~
AnthonyMouse
> I blame all the misguided folks who try to add "soft real time" to
> time-sharing systems like Unix/Linux/etc. Anyone who does that has a
> profound misunderstanding of what real time is; it most definitely is not
> a time-sharing system.

Part of the problem is that there are realtime systems where, if you don't
meet the deadline, _the plane will fall out of the sky_. Then there are
realtime systems like VoIP where, if you don't meet the deadline, the call
will cut out.

And there is a desire to get "almost realtime" for things like VoIP on
general-purpose time-sharing systems, where you want to deprioritize less
latency-sensitive tasks in order to hit the deadline if you can, but nobody
dies if you don't.

The trouble with using "real" realtime in those systems is that the system may
not have enough capacity _in theory_ even when it does _in practice_. In other
words, if you have three concurrent realtime processes that could each in
theory use up to 50% of the available resources at a time, but in practice
almost never use more than 10% each, then for less critical systems you may be
willing to bank on "almost never" not happening often enough to matter.
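
To put rough numbers on that (a toy admission check, not any particular
scheduler's): a hard-RT admission test has to budget for the worst case, so
three tasks that may each burst to 50% get rejected even though their typical
combined load is only 30%.

```c
/* Toy utilization-based admission check: a hard real-time system must
 * admit tasks on their worst-case CPU share, not their typical one. */

double total_utilization(const double *share, int n) {
    double u = 0.0;
    for (int i = 0; i < n; i++)
        u += share[i];
    return u;
}

/* Admit only if the worst case still fits on the CPU. */
int admit(const double *worst_case_share, int n) {
    return total_utilization(worst_case_share, n) <= 1.0;
}
```

With worst-case shares {0.5, 0.5, 0.5} the check fails (150% of the CPU),
even though typical shares {0.1, 0.1, 0.1} sum to 30%; the soft-RT gamble is
running the tasks anyway.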

~~~
qznc
Imho "soft" and "hard" real time distinguish pretty well between "must meet
deadline" and "should meet deadline". The only problem I see is that they
sound so similar, although the development process is radically different
between the two.

------
dwc
Devolution/devaluation seems very common now. I'm not sure if it's my
perspective or if it's really worse than it used to be.

Some examples:

* ROM: something you download and install on your Android phone (as opposed to socketing a new chip)

* Machine Learning: any change in program behavior based on feedback input (we used to call this programming)

* Disruptive: doing business at least slightly different than someone else (as opposed to a new paradigm that sends shockwaves through a whole industry)

* Fork: make a copy of a project (as opposed to starting a rival project based on the codebase of another project)

~~~
oldmanjay
Considering any of that "worse" is definitely your perspective. It's simple
back-in-my-day-ism to apply a value judgment to language use.

~~~
UmDieWelt
The point is that the words don't really have a useful meaning anymore. It is
especially important in technical topics to have terms with some specific
meaning.

~~~
Retra
It's important to be able to construct specific meanings. You don't need terms
to do that, and you run a major communication risk by relying on terms that
you aren't willing to define. And if you define them first, then you can use
them however you like.

~~~
kuschku
And that is why, a century ago, German was used frequently in science: A
language where every word is a literal description of its meaning.

------
Animats
Having used QNX for a robot vehicle, I'm very aware of what "real time" means.
If the computers didn't send an update to steering and throttle every 100ms,
at 125ms a hardware stall timer cut the throttle and slammed on the brakes.

A typical test of a real time OS such as QNX is to hook up a square wave
generator to an interrupt input, and run a program which is activated by the
interrupt. The program then turns on an output. The input and output signals
are sent to an oscilloscope, so you can see the delay. In a true real time OS,
there are _no_ outliers where the delay is much longer than usual, even if
there are other processes running.[1]

[1]
[http://www.qnx.com/developers/docs/6.5.0/index.jsp?topic=%2F...](http://www.qnx.com/developers/docs/6.5.0/index.jsp?topic=%2Fcom.qnx.doc.ide.userguide%2Ftopic%2Fsysprof_Interrupt_latency_.html)
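
A crude software version of that scope test (just a sketch; it needs a POSIX
system, and on a stock time-sharing kernel you should expect outliers) is to
sleep on an absolute periodic timer and record the worst wakeup delay:

```c
#define _POSIX_C_SOURCE 200809L
#include <time.h>

/* Wake on a fixed period and record the worst observed wakeup delay.
 * Software analogue of the square-wave-and-scope test: on a hard-RT
 * kernel the maximum is bounded; on a time-sharing kernel it has
 * outliers whenever other processes get in the way. */

static long long ns_of(const struct timespec *t) {
    return (long long)t->tv_sec * 1000000000LL + t->tv_nsec;
}

long long worst_wakeup_delay_ns(int iterations, long period_ns) {
    struct timespec next, now;
    long long worst = 0;
    clock_gettime(CLOCK_MONOTONIC, &next);
    for (int i = 0; i < iterations; i++) {
        next.tv_nsec += period_ns;            /* advance absolute deadline */
        while (next.tv_nsec >= 1000000000L) {
            next.tv_nsec -= 1000000000L;
            next.tv_sec++;
        }
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
        clock_gettime(CLOCK_MONOTONIC, &now);
        long long late = ns_of(&now) - ns_of(&next); /* past the deadline */
        if (late > worst)
            worst = late;
    }
    return worst;
}
```

Run it alongside a heavy compile or network load and watch the maximum; that
maximum, not the average, is what a real-time OS bounds.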

~~~
luckydude
Who owns QNX these days? I was friends with Dan Hildebrandt who was one of the
core OS guys, one of the few who could actually touch the microkernel. We had
a lot of good OS conversations/discussions/arguments back in the day. It was a
sad day in 1998 when we lost him.

~~~
Animats
Blackberry owns QNX now. QNX has gone from closed source to open source to
closed source to open source to closed source. Along the way, most of its
developer community bailed. Sad.

------
richmarr
This title is literally true for me.

During the 70s my dad & his team built a programming language & runtime
environment for realtime control systems in power stations. It was called
_Cutlass_, and it's still in (dwindling) use today.

In my dad's context, the term 'realtime' drew the most useful distinction
between types of system.

In the web days, the term is used slightly differently, but still draws
broadly the most useful distinction within the context.

And lastly, in media: the tech industry and computer subcultures have a
pretty bad track record of protecting words from misuse... troll, hacker,
repo, pull-request, etc. all have comically uninformed usage. You don't
quarrel with folk who buy ink by the barrel, nor do you get far telling them
that they're doing language wrong.

Personally this isn't an issue I choose to care about (language is
contextual, after all), but even if you do, my advice would be to save your
blood pressure for a fight you have a hope of winning.

------
11thEarlOfMar
I always enjoyed the rejoinder: "Real-fast is not real-time."

50 years ago, real-time in its true sense was not a product or service
attribute that humanity was much concerned with. Information was disseminated
largely via 'latent' methods, and the delay between an event and its report
was typically measured in hours at best, and months for people subscribing to
magazines.

Today, it does matter to an increasing number of people when an event
occurred, and their desire to know 'soon' is increasing, because the reach of
events has expanded to global extents and the time we have to react is
shrinking. Case in point: I've set a reminder to keep an eye on the Hang Seng
in Hong Kong this evening to get an early read on where the NASDAQ will head
tomorrow morning.

These trends are pushing the term real-time/realtime/real time out of computer
jargon and into mainstream where they must be oversimplified and therefore
conflated, like so many others. It's the natural evolution of language.

------
nickpsecurity
I think I have a pretty straightforward answer to his question about its
popularity: real-time became synonymous with instant results and zero delays
to mainstream audiences. I'm not sure exactly from what sources, but that's
the common denominator across all of these usages.

~~~
bbrazil
I think it's devolved beyond that into a marketing term; I've seen realtime
mean anything from milliseconds to hours. I'd mostly interpret it as "faster
than what you already have/are used to".

~~~
TeMPOraL
I'd argue that GP is right, but only partially. It may be true that "real-time
became synonymous with instant results and zero delays to mainstream
audiences". But the story continues: _then_ the sales&marketing folks and
various other bullshitters noticed that "realtime = fast" and "fast = good",
and they did what they always do - started to stretch the term so they could
steal some of the good feelings associated with it, even though their products
don't deserve them. It's what happened with the word "hacker", and is
happening with the word "geek" right now. If something is associated with good
qualities among a relevant enough audience, it will be co-opted by marketing
and then used and abused until it loses any meaning.

That's what I hate about sales&marketing - it will fuck up everything that's
cool.

And journalism, I guess, is to blame too (they are marketers too, competing
for our attention). My pet peeve: ever heard about a company "going nuclear"
or taking "the nuclear option"? That phrase should _not_ be used to mean
anything less serious than threatening to deploy _an actual ICBM_.

~~~
nickpsecurity
Your addition to my theory is probably right. I left it off just because I
can't trace the specifics. Marketing and media are where I'd point the finger,
though.

------
tormeh
Never take "realtime" seriously, unless it's prefaced with either "soft" or
"hard".

~~~
jacquesm
That's the whole point here. Soft realtime is not realtime, only hard realtime
is realtime.

So no need for prefixes.

~~~
tormeh
Both terms have varying definitions, depending on who you ask. Realtime, to
me, means that there are deadlines and that those deadlines can realistically
be broken. Soft realtime means that given some conditions, the deadlines will
be met, at least x% of the time. Hard realtime means that the deadlines will
be met, unless there's a grave hardware malfunction.

Then there's the hardness of deadlines, which is proportional to the
consequence of missing them. A hard deadline means that a computational result
is worthless if it's late.
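
One way to make those definitions concrete (a sketch; the "x%" threshold is
whatever the spec says) is to measure the miss rate over a run of observed
response times:

```c
#include <stddef.h>

/* Fraction of observed response times that blew the deadline. */
double miss_rate(const long *response_us, size_t n, long deadline_us) {
    size_t missed = 0;
    for (size_t i = 0; i < n; i++)
        if (response_us[i] > deadline_us)
            missed++;
    return n ? (double)missed / (double)n : 0.0;
}

/* Soft real time: a bounded miss rate is acceptable by spec. */
int meets_soft_rt(double rate, double tolerated) { return rate <= tolerated; }

/* Hard real time: any miss at all is a failure. */
int meets_hard_rt(double rate) { return rate == 0.0; }
```

A run of {90, 95, 110, 92} us against a 100 us deadline gives a 25% miss
rate: acceptable to a soft-RT spec that tolerates 30%, an outright failure to
a hard-RT one.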

------
jacquesm
It's all about the guarantees. No guarantees: don't bother calling it
real-time.

------
bitwize
"Realtime" in business means "on short enough timescales to significantly
shorten your OODA loop" and could imply delays ranging from seconds to hours.
Of course this is to be expected now that business has cozied up to technology
to run at-sign the speed of thought: look at the sound flogging that suits
have given to engineering terms like "interface", "bandwidth" and "offline".

------
bbcbasic
I am confused by this. So is the problem that realtime 'should' just be used
when there is a critical 'life threatening' problem if the deadline is not
met, but that using realtime for a system that is not critical is seen as
wrong?

Let's say you have an algo controlling a fighter jet. That is realtime. If
you had exactly the same algo in a 3D fighter-pilot game, should that not be
called realtime?

Is this just an ego thing?

~~~
jacquesm
No, it's a guarantees thing. If you can't guarantee your latency and
scheduling then you're not real-time. So if you're running something on a jet
that needs to be updated 5000 times per second +/- 10us, otherwise something
breaks or the plane becomes uncontrollable, then that's real-time.

If it's a game version of the same you don't care about those guarantees
because nothing would actually cause the plane to crash.

Of course controlling the plane itself may not have such tight loops but
plenty of processes active in the flying plane may very well have extremely
tight constraints (for instance: engine management, especially near maximum
power output).

In real life there would be a penalty for missing an interrupt or a scheduled
deadline, in a simulation everything would just happily hum along with a tiny
delay.

If you were simulating the real plane then you'd want your simulation to halt
at that point so that you could figure out exactly what went wrong.

One piece of hardware in our offices in Amsterdam long ago had a 'lost data'
LED latched so it would stay on if a fault condition ever happened, no matter
how briefly. If that LED ever came on it meant going back to the drawing
board, because it indicated we were not in control of the machine to the
extent that we thought we were.
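
That latched-LED idea translates directly into software (a sketch with
hypothetical names): the fault flag is sticky, so even a single, momentary
deadline miss survives to be seen.

```c
#include <stdbool.h>

/* Software version of the latched 'lost data' LED: once any deadline
 * is missed the flag sticks, no matter how briefly the fault lasted. */

bool fault_latched = false;

void deadline_check(long actual_us, long deadline_us) {
    if (actual_us > deadline_us)
        fault_latched = true;      /* set once, never cleared at runtime */
}
```

A later on-time iteration doesn't clear it; only going back to the drawing
board does.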

~~~
eric-hu
> If it's a game version of the same you don't care about those guarantees
> because nothing would actually cause the plane to crash.

> If you were simulating the real plane then you'd want your simulation to
> halt at that point so that you could figure out exactly what went wrong.

Doesn't this all depend on the point of the simulation? For instance, the
simulation could be an integration test of flight control systems. In which
case, yes, you'd want the simulation to halt for debugging, as you stated. On
the other hand, if the simulation is a networked flight sim trainer for
pilots, then you would benefit from having the simulation do exactly what a
plane would do in the real world.

I think there's some insight in the OP's article, but I think he goes a bit
too far. To me, "real time" has a domain-specific meaning: there's a greater
importance placed on time for this thing over that. That's a legitimate use of
the word _in that domain_. If the world we're living in were actually the
Matrix, our use of "real time" wouldn't be less legitimate just because a
plane crash is now bits changing instead of an actual plane crash.

------
everyone
Seems to be the fate of any cool-sounding term. Eg. 'Quantum'

[http://smbc-comics.com/index.php?db=comics&id=1841#comic](http://smbc-comics.com/index.php?db=comics&id=1841#comic)

Though I must say the software industry is particularly guilty of stealing and
corrupting terms from other fields. One particularly irksome example for me
(as a former architect) is the word 'design'. When a software engineer says
design he essentially means CSS, making it visually pretty. Whereas in
architecture / engineering / industrial design etc. 'design' means...
everything, the overall _design_ or plan of your entire system. That's just
one small example that gets my goat; there are tonnes more!

ps. The votey (red button which you click for another little panel) is
particularly good for that smbc!

pps. Also as a game developer I reckon I'm safe using the term realtime if
something's being calculated at 60fps :) e.g. realtime shadows vs. baked
ones. If for example the shadows were lagging one frame behind everything
else that would be unacceptable for a game.

~~~
CountSessine
_Though I must say the software industry is particularly guilty of stealing
and corrupting terms from other fields. One particularly irksome example for
me (as a former architect) is the word 'design'. When a software engineer says
design he essentially means CSS, making it visually pretty. Whereas in
architecture / engineering / industrial design etc. 'design' means...
everything, the overall design or plan of your entire system. That's just one
small example that gets my goat; there are tonnes more!_

I might agree with some of your other examples, but not this one so much.
_Software engineers_ , i.e. computer programmers who actually write design
documents before they write code, have never used the word _design_ for
_visually pretty_.

The word _design_ has a separate and equally legitimate history coming from
_graphic design_ and world of commercial art and drafting. There's a not-
insignificant intersection between that world and the world of front-end web
programming where the word _design_ is used to describe the a site's visual
appeal and not the overall plan of the system.

~~~
everyone
"Software engineers, i.e. computer programmers who actually write design
documents before they write code, have never used the word design for visually
pretty."

Agreed, that's the way it _should_ be, but I have rarely seen that to be the
case in my experience, though that is anecdotal.

"The word design has a separate and equally legitimate history coming from
graphic design and world of commercial art and drafting."

I think you're stumbling there logically a bit. Graphic designers also use
the word design, as do fashion designers and so on, but similarly to
architects and engineers they also use the word design to denote the
overarching scheme of their entire project, not just the aesthetic aspect of
it (as these programmers I am referring to do). Typographers had a lot of
other technical factors to worry about in their design apart from just the
visual appeal and clarity; similarly, fashion designers must design their
clothes to accommodate a certain 3d shape and withstand all sorts of
structural stresses and deformations.

~~~
CountSessine
_I think you're stumbling there logically a bit._

0_o

 _Graphic designers also use the word design, as do fashion designers and so
on, but similarly to architects and engineers they also use the word design
to denote the overarching scheme of their entire project, not just the
aesthetic aspect of it (as these programmers I am referring to do).
Typographers had a lot of other technical factors to worry about in their
design apart from just the visual appeal and clarity; similarly, fashion
designers must design their clothes to accommodate a certain 3d shape and
withstand all sorts of structural stresses and deformations._

You're reaching. And the only thing sillier than us arguing about this is you
being upset that someone is using the word _design_ as it appears in the
dictionary:

1. _a plan or drawing produced to show the look and function or workings of
a building, garment, or other object before it is built or made_

2. _an arrangement of lines or shapes created to form a pattern or
decoration_

The article is about the very real problem with the confusion between hard
real-time vs soft real-time systems, which is about the suitability of a tool
to a set of requirements. Your complaint about the word _design_ doesn't seem
to apply.

~~~
everyone
I totally agree that this issue is far, far less clearly delineated than
'real-time', and it's a relatively muddy issue of semantics, though it is
worthy of debate imo.

I stand by my assertions. When an artist talks about their design they mean
the entire creation. The dictionary definitions you presented are two
homonyms. #1 is sort of what we are discussing, though a better definition
would be the verb form.

"verb (used with object) To plan and fashion artistically or skillfully."

#2 is another meaning, which could be used in the sentence "The birthday
card had a pretty little design in the corner." You will not find many
artists referring to their work as #2.

Of course there are people whose job it is to design the pretty designs for
the corners of birthday cards. What they produce is designs (#2), but what
they _do_ is design (#1).

------
uxcn
Outside of engineering, it's just the reality that terms get co-opted a lot
to mean things like _fast_ or _asynchronous_. It's really in engineering that
the strict definitions are important, which is about the only context where I
generally try to correct people.

