
The Rise of Python for Embedded Systems - lfcerf
https://www.zerynth.com/blog/the-rise-of-python-for-embedded-systems/
======
chipsandkip
There's nothing wrong with using Python on an embedded platform in certain
circumstances, but this article is a poorly written advertisement.

> "Expert and skilled C programmers could justify this stating that “C
> generates a faster, more compact, and more reliable code”… BUT … if you
> replace C with “assembly” in that statement, you’d have exactly what an
> extinct generation of programmers said about 20 years ago!"

Yes, the differences between C and Assembly are directly comparable to the
differences between Python and C.

> "There’s an enormous crowd of professionals skilled in using Python
> “potentially” able to develop the software for the “next big thing” in IoT
> and to ship new amazing embedded applications in a short time."

Python doesn't even appear on the IEEE rankings shown at the start of the
article. It's amazing how this "enormous crowd" can stay so silent.

Are there any examples of products with an MCU running Python being
manufactured at scale? Even if Python provides benefits for rapid prototyping,
I'm really struggling to think of a business case where you wouldn't replace
it with C or C++ before going into production.

~~~
jjoonathan
> Yes, the differences between C and Assembly are directly comparable to the
> differences between Python and C.

To unpack that for PC programmers: the best documentation, API design, and
tooling for a microprocessor often happens at the memory map level. Sad, but
true. Everything above that is typically cat shit wrapped in dog shit. The C
ecosystem natively supports memory and addresses as abstractions, letting you
directly consume those APIs, documentation, and tools. Note that this is one
of those cases where "best" emphatically does not imply "good," and that it
lies in stark contrast to the situation on "real" processors where you'll
almost always find solid abstractions much further up the stack.

~~~
chipsandkip
Yeah, my biggest issue with C/C++ on embedded systems is that it's cumbersome
to build useful, safe abstractions of hardware. Almost every manufacturer-
provided MCU driver library I've encountered is filled with void* and #define
abuse that wouldn't fly in any other type of project.

I once worked on a project where some poor soul took the time to provide a
zero-cost abstraction, described in a Scott Meyers presentation[1], for every
register on an MCU. This resulted in thousands of lines of C++ template
classes, which, while hurting compile times, resulted in a really strong, safe
and performant API that drivers could be built upon. It was wonderful to use,
but all that upfront work isn't realistic for most projects.

I'd definitely be interested in a language that supports this type of
abstraction "natively" (i.e. doesn't require lots of boilerplate code) while
providing C-like performance on bare-metal platforms. I know that some people
are trying to make Rust fit this hole but I'm not sure how that's coming
along.

1\. [http://www.aristeia.com/c++-in-
embedded.html](http://www.aristeia.com/c++-in-embedded.html)
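To sketch the flavor of that abstraction for non-C++ readers: the real thing in Meyers' talk is zero-cost compile-time templates over fixed register addresses, but the same idea — a named bit-field you can't write out-of-range values into — can be mocked up in a few lines of Python. Everything here is invented for illustration (the dict stands in for the memory map, the register layout is made up):

```python
# Illustrative sketch only: a dict simulates the MCU's register file so the
# idea runs anywhere. In the real C++ version this is zero-cost and the
# addresses are real hardware registers; all names here are hypothetical.

MEMORY = {}  # address -> 32-bit word, simulated memory map

class BitField:
    """Descriptor exposing one bit-field of a register as a plain attribute."""
    def __init__(self, addr, shift, width):
        self.addr, self.shift = addr, shift
        self.mask = (1 << width) - 1

    def __get__(self, obj, owner=None):
        return (MEMORY.get(self.addr, 0) >> self.shift) & self.mask

    def __set__(self, obj, value):
        if value & ~self.mask:
            raise ValueError("value does not fit in field")  # safety check
        word = MEMORY.get(self.addr, 0)
        word &= ~(self.mask << self.shift)   # clear the field's old bits
        MEMORY[self.addr] = word | (value << self.shift)

class UartControl:
    # Layout is made up for the example, not any real MCU.
    enable   = BitField(0x4000_0000, shift=0, width=1)
    baud_div = BitField(0x4000_0000, shift=8, width=12)

uart = UartControl()
uart.baud_div = 217
uart.enable = 1
```

Writing a value that doesn't fit the field raises instead of silently clobbering neighboring bits — exactly the kind of safety the usual void*/#define driver soup can't give you.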

~~~
jjoonathan
Agreed 100%. I don't think memory maps will ever go away as the "API" between
hardware and software -- the "alternatives" always seem to involve staggering
complexity or a violent neutering on either the SW or the HW side (always the
opposite of the person doing the talking, of course) -- but I do think that
it's profitable to contain that interface beneath application code, preferably
with a language-enforced barrier. I'm cautiously optimistic about Rust's
"unsafe" as well. I'm just not cautiously optimistic about python :-)

------
nikofeyn
whhhhhhyyyyyyy?

i just can't understand why someone would want python for embedded
applications. i primarily use labview for systems development, and there is
not a single thing that would be gained by moving to python while many things
would be lost. and no one can convince me that using python over labview would
generate more robust code. i have used python before, and it was slower to
develop in and more error prone than labview for system development.

personally, i want to see more lisps/schemes and ML-based languages on
embedded systems. i know there are some, but they aren't quite there. that's
the direction i think we should move. python isn't the best at any one thing
other than having lots of libraries. there are far more productive dynamic
languages (lisps/schemes) and far better languages for developing safe code
(ML-based). i am currently learning idris, which seems like it could be
amazing for embedded code development. having provable state machines would be
a huge plus.

~~~
keenerd
> _i have used python before, and it was slower to develop in and more error
> prone than labview for system development._

Exact opposite in my experience. I've been there, tried that. Re-wrote it in
python for a more than 50x productivity gain.

At my first job out of school, I inherited maintenance for a couple of
projects. All were extremely expensive custom test equipment, written in
Labview. One was the central project, the whole point of this group's
existence. It had been in development for more than two years. It routinely
destroyed thousands of dollars worth of hand-made sensors during its power-on
self-test. It was a far cry from being finished. It couldn't even be
considered a prototype.

In a two-week period, I taught myself python, quickly cloned the existing
calibration procedures, and finished the entire thing. My boss thought I was a
genius rockstar, but no, I'm not. I just had the sense to evaluate every
available tool and came to the conclusion that Python (mostly due to the
ctypes module and a robust Windows ecosystem) was the best option.

Personally I'll use C whenever possible for embedded work. (And later I did
port the entire project to a DSP running C, because Labview & an A/D DAQ board
was too expensive.) But if latency doesn't matter, Python is fine.

edit: Side note, I was already conversational in Labview. Admittedly only for
school projects, but at that point I knew more Labview than Python.

~~~
zwieback
Agreed, at my job I'm now actively campaigning against LabView (and Matlab)
for application development. These projects are productive at first but
invariably turn into unsupportable messes.

What's the right platform for "embedded systems" is debatable. A true bare-
metal app has different requirements from something running on RPi or
Beagleboard.

~~~
lvoudour
I strongly disagree. I've used both LV and matlab for many years and they are
tremendously good for specific tasks.

Labview is great for data acquisition and GUIs to visualize this data. Drop a
ready-made UART/TCP/VISA block, some array manipulation in a loop, connect a
waveform block with zoom/cursors/etc. and you have a working real-time app in
a couple of hours.

Matlab is perfect for heavy matrix operations on large datasets, fantastic for
plotting your data and it's very good for prototyping algorithms.

I tried python as an alternative for both. Tons of boilerplate code, a pain
for GUIs and don't get me started on matplotlib/scipy/numpy...

The biggest drawback of both LV/matlab is their proprietary nature and the
expensive licenses which make them unsuitable for hobbyists (at least there's
octave as a matlab alternative). But they're great tools for the appropriate
tasks.

~~~
snovv_crash
The problem with both LV and MATLAB comes once you've done the basics and now
want to turn it into something that you can maintain and sell.

With LabVIEW you can compile and redistribute, but good luck writing an
efficient binary search tree when your data gets bigger than what can be
searched for with brute force. Also, at least last time I used it, the version
control was horrendous.

For MATLAB, just forget redistribution in general. Nobody is going to buy a
MATLAB licence to run your code.
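To make the "brute force" gap concrete: the linear-scan vs. binary-search difference the parent is gesturing at is a stdlib one-liner in Python, which is part of why people bail out of LabVIEW once data gets big. A hedged sketch over invented data:

```python
# Linear scan vs. binary search over sorted data -- trivial in Python's
# stdlib, painful to hand-build in a LabVIEW diagram. Data is made up.
import bisect

def linear_find(sorted_data, target):
    for i, v in enumerate(sorted_data):      # O(n) brute force
        if v == target:
            return i
    return -1

def binary_find(sorted_data, target):
    i = bisect.bisect_left(sorted_data, target)   # O(log n)
    if i < len(sorted_data) and sorted_data[i] == target:
        return i
    return -1

data = list(range(0, 1_000_000, 2))   # sorted even numbers
assert linear_find(data, 123_456) == binary_find(data, 123_456)
```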

~~~
lvoudour
>The problem with both LV and MATLAB comes once you've done the basics and now
want to turn it into something that you can maintain and sell.

That's true, but it's not their strong selling point. There are tons of
problems to solve at the r&d stage when dealing with custom hardware, a dozen
different instruments and algorithm exploration. That's where these tools
shine, and the reason they're ubiquitous in industrial/military/aerospace
environments.

For a full product with drivers, libraries and a fast, customer-friendly UI,
sure, there are better solutions, but there's no law that says you must use
the same tools at the development stage.

------
turbinerneiter
I'll just drop this here: [Evaluation of MicroPython as Application Layer
Programming Language on
CubeSats](http://ieeexplore.ieee.org/document/7948548/)

I think Python has _everything_ a language needs to be the future of Embedded
Systems - except a compiler.

You start writing your program using MicroPython as the interpreter. You add
type hints because it's the right thing to do anyway. Once your prototype is
done, you compile to get the speed (cutting execution time, increasing sleep
time, increasing battery life).

My background is Aerospace Engineering and maybe that's why I can't understand
why nobody builds a real, AOT Python compiler. There is Nuitka, but it's a
transpiler that extensively uses libpython and isn't fit for microcontrollers
(still an awesome project, though). Cython also uses libpython and a different
type-annotation system.
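To illustrate the "add type hints anyway" step: plain stdlib annotations like the ones below are checkable with mypy today, and they're the obvious input a hypothetical static Python compiler could specialize. Nothing here is MicroPython- or Zerynth-specific, and the sensor math is invented:

```python
# Type-hinted Python as an AOT-friendly subset: standard annotations only.
# The ADC/filter example is made up for illustration.

def scale_adc(raw: int, vref_mv: int = 3300, bits: int = 12) -> float:
    """Convert a raw ADC reading to millivolts."""
    full_scale: int = (1 << bits) - 1
    return raw * vref_mv / full_scale

def moving_average(samples: list[float], window: int) -> list[float]:
    """Simple running mean over the last `window` samples."""
    out: list[float] = []
    acc: float = 0.0
    for i, s in enumerate(samples):
        acc += s
        if i >= window:
            acc -= samples[i - window]
        out.append(acc / min(i + 1, window))
    return out
```

With every binding annotated, a compiler wouldn't need to box values or dispatch dynamically — which is where the "cutting execution time" part would come from.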

~~~
DonaldPShimoda
For something as critical as aerospace, wouldn't a statically-typed language
be a better choice anyway? I love Python, but I don't think it has a place
where rigor and validation are concerned.

Though I suppose if you were writing a compiler, you could require and enforce
type hints... but then any calls into the standard library will just have to
be trusted. I dunno. It seems tricky.

~~~
turbinerneiter
That's why I'd enforce type-hints.

The whole story with these critical systems is tricky. On the one hand, they
want formal verification. Academia really loves stuff like Haskell for that.
There is Ada, which was designed with critical systems in mind.

On the other hand, there is a lot of C and C++ everywhere. On
microcontrollers, no matter how much you love Haskell, Ada or Rust, there
might not even be a compiler and probably no HAL.

And in the end, all of those languages focus on the machine and the math. A
mathematical proof of the correctness of your software is nice, but readable
code is too. And formal verification doesn't save you from making wrong
design choices. It doesn't check your logic. No language or tool can do that.
But what I believe a language can do for you is help you express your thoughts
in such a simple, structured and readable way that the cognitive load is
reduced to a minimum, freeing up thought-power to check your logic.

A lot of the failures in space that were caused by software weren't actually
programming errors like the overflows that allow us to hack everything. There
was a Mars orbiter that was lost because the data was in the wrong units -
still, the types matched. A lander crashed because they forgot to put in logic
to handle invalid sensor data: a broken sensor in a landing leg triggered
the 'touch-down' signal, the lander believed it had landed and cut the chute.
Except it really wasn't even close to the ground.

Using a bit of hyperbole, there is formal verification in Academia on the one
side, and people forgetting to design robust logic while trying to figure out
pointers on a microcontroller that can only be programmed in C on the other.

Static Python. That's my proposal to save the next mission :)

~~~
nikofeyn
but all that seems to be silly. why try to force feed patches to make python
something it was in no way designed for? it seems like the completely wrong
approach from a design point of view.

and your dismissal of the power of type checking systems seems strange. seems
weird to say that it doesn't really help when python has nothing of the sort.
even a "static python" wouldn't reach the power of some existing systems. even
for dynamic languages, python is behind the times for things like that. for
example, look at racket versus python.

and then there's languages like idris, which go well beyond what even haskell
can do. something like idris would be wonderful for critical embedded systems.
and if you really want flexibility and a dynamic language, python seems well
below lisp/scheme. i would love to see racket and ML-based languages to move
towards embedded development over something like python.

finally, both lisps/schemes and ML-based languages are very readable. many
would argue they are more readable than a language like python. so your
implication that type systems incur a readability penalty isn't generally
accurate.

> But what I believe a language can do for you is help you express your
> thoughts in such an simple, structured and readable way that the cognitive
> load is reduced to a minimum, freeing up thought-power to check your logic.

basically what i am getting at is that languages like racket or idris fulfill
this requirement very strongly, even more so than python. and additionally,
they come with advanced language features that no version of python will ever
likely touch on.

------
ashwin67
Each has its own place. I write code in all languages including Python and C.
Yet, I have to write code in assembly even today when the particular problem
statement requires its usage. Such generalization can only be expected from a
marketing campaign that this post really is.

~~~
fhood
What specific tasks require assembly?

~~~
monocasa
Not the OP, but I've written assembly in our codebase for:

* First stage initialization

* Interrupt prologue/epilogue

* Bitbanging where you want deterministic cycle counts

* For using the FIQ for very high priority interrupts. It has its own registers partially banked, so if you stay in r8-r13 you don't have to save and restore state at all.

So we don't use assembly for magic go faster juice, but instead when there's a
coding constraint that we can't easily explain to the compiler.

~~~
elcritch
I'm curious to hear more about how/why you use FIQ. From googling it looks
like it's an ARM feature. How does that work with a general-purpose OS, or are
you using bare-metal ARM?

~~~
monocasa
We've got our own internal RTOS, not a general purpose OS.

We use it for different things depending on the board, but my favorite is a
really sweet profiler that's integrated with our watchdog logic that'll give
us ~1000 PC+SP samples in the 100msec or so before our watchdog timer resets
our board. That's pretty invaluable for debugging.

------
kronos29296
Couldn't something like micropython and Cython replace Zerynth? This reads
like the marketing post that it is: it starts off with interesting stuff and
ends with a promo. It also looks very proprietary to me. Just add more FOSS
and I will definitely be interested.

~~~
ngoldbaum
Something _like_ cython certainly could, but cython itself is far too
integrated with the CPython C API to work well with micropython.

~~~
peller
There's an old github issue[0] exploring this, but it seems to have died. One
contributor had some basic success[1], so perhaps there's an opportunity for a
sufficiently skilled and motivated dev to integrate the two projects more
closely?

[0]
[https://github.com/micropython/micropython/issues/658](https://github.com/micropython/micropython/issues/658)

[1]
[https://github.com/micropython/micropython/issues/658#issuec...](https://github.com/micropython/micropython/issues/658#issuecomment-45096909)

------
infocollector
I am using MicroPython on ESP8266 right now. What are the major differences
between Zerynth and MicroPython, and why do they exist?

~~~
dbcurtis
Agree. I've been using Micropython for almost two years on several projects.
This Spamvertising from Zerynth doesn't say why I should switch.

~~~
petra
Prototyping? Or products? If so, please expand.

------
mbaha
Cached version:
[https://webcache.googleusercontent.com/search?q=cache:-F31l-...](https://webcache.googleusercontent.com/search?q=cache:-F31l-w3tE4J:https://www.zerynth.com/blog/the-
rise-of-python-for-embedded-systems/+&cd=1&hl=fr&ct=clnk&gl=fr)

------
syntaxing
Is there a benefit to using Python rather than C/++ in terms of performance
for embedded systems? Does Python work well for lower-level applications (in
terms of hardware access)?

~~~
masklinn
> Is there a benefit on using Python in terms of performance

If you're talking about _machine_ performance, it's the opposite, even for a
dedicated dialect like micropython. The benefits are in safety, readability
and ease of development.

~~~
syntaxing
Both in terms of embedded systems application: What makes micropython safer
and easier to develop compared to C/++? In terms of safety, is there better
memory allocation and hardware control through Python? I would imagine
micropython would be more inefficient and more prone to hardware error. As for
the ease of development, wouldn't C/++ be better? Sure, it's harder to read
but C/++ seems like the right tool for this particular job.

I'm pretty curious because I'm pretty fluent in Python and I have been
learning C++ for lower-level stuff, mainly for speed reasons. I still program
most uCs in C++, so I'm curious whether I should learn micropython if it
provides substantial benefits.

~~~
masklinn
> Both in terms of embedded systems application: What makes micropython safer
> and easier to develop compared to C/++?

Python is a GC'd (reference-counted in CPython and probably µpython) and
memory-safe language so it should be less prone to memory corruption, although
it is more likely to open you up to resource exhaustion.

> In terms of safety, is there better memory allocation and hardware control
> through Python?

These are not what is usually understood by "safety" in programming.

> I would imagine micropython would be more inefficient

Definitely, the overhead of micropython would be substantial (even more so as
it can't leverage compiler optimisations) so it's a tradeoff between hardware
requirements and ease of development/time to market. IIRC micropython also
supports only a relatively small subset of all embedded systems.

See
[http://docs.micropython.org/en/v1.8.6/pyboard/reference/spee...](http://docs.micropython.org/en/v1.8.6/pyboard/reference/speed_python.html),
[http://docs.micropython.org/en/v1.8.6/pyboard/reference/cons...](http://docs.micropython.org/en/v1.8.6/pyboard/reference/constrained.html)

> As for the ease of development, wouldn't C/++ be better?

Generally speaking python is more expressive and more readable than C++ code,
which makes it easier to write (develop) Python. C++ is more finicky, thus
harder to develop.

~~~
elcritch
> > In terms of safety, is there better memory allocation and hardware control
> > through Python?

> These are not what is usually understood by "safety" in programming.

In many embedded and RTOS contexts, safety, allocation, and hardware control
are very difficult to distinguish.

> > As for the ease of development, wouldn't C/++ be better?

> Generally speaking python is more expressive and more readable than C++
> code, which makes it easier to write (develop) Python. C++ is more finicky,
> thus harder to develop.

C++ is a huge pain with loose type-casting behavior (e.g. is this a long, or
does this expression convert it to an unsigned long?). But then again, Python
has very _loose_ descriptions of its int and float types. For me that makes
doing bit manipulation for talking to peripherals more opaque rather than less
so.

~~~
icebraining
If you want C-like ints and floats, you use ctypes. You have all the typical
types, e.g. c_char, c_float, c_int8/16/32/64, c_size_t, etc, plus structs and
arrays.

------
afeezaziz
I am trying to read more, but the website went down. My question: I have been
dabbling with mbed for quite some time and I would love to use Python for uC.
The main problems (or at least the perception) are that you cannot program for
low power, and that python will not work for real-time functions. Are these
issues addressed by zerynth?

~~~
traverseda
Micropython actually has decent support for real-time operation, according to
[1]. This is via proper use of timers and interrupts. I imagine it's something
like programming for a real-time operating system.

[1]: [http://www.edn.com/electronics-blogs/embedded-
basics/4440447...](http://www.edn.com/electronics-blogs/embedded-
basics/4440447/Using-MicroPython-for-real-time-software-development)

~~~
monocasa
The article doesn't really make its case for Micropython having real-time
capabilities. The author seems to be confusing 'real-time' and
'microcontroller'. At a bare minimum you'd need a proper priority scheme.

------
th0ma5
What about memory fragmentation? I have had a lot of fun with MicroPython on
an ESP8266, and, while I was thinking about my problem all wrong, I did bump
up against reboots due to exhaustion of the address space (I think)...

------
etiene
This makes me want to poke my eyes out. I like python, I've used it before and
would use it again, but please stop using tools that are not made for a
certain job just because they're popular for other jobs. There are languages
especially made for this kind of work, such as Lua or others.

~~~
allover
Python has a far healthier ecosystem than Lua [1] [2].

[1] [https://luarocks.org/m/root](https://luarocks.org/m/root) [2]
[https://pypi.python.org/pypi](https://pypi.python.org/pypi)

~~~
catwell
The Python ecosystem is basically irrelevant to embedded programming.

What you need is things like [1] or maybe [2]. Lua has them because it's been
the dynamic language of choice for this domain for years (with Lisps and
Forths as its main competitors).

Now that we're seeing new challengers like micropython, mruby and jerryscript
/ iotjs, it will be interesting, but they're clearly not the mainstream
choices for embedded.

[1]
[http://www.eluaproject.net/doc/v0.8/en_refman_gen.html](http://www.eluaproject.net/doc/v0.8/en_refman_gen.html)

[2]
[https://wiki.openwrt.org/doc/howto/luci.essentials](https://wiki.openwrt.org/doc/howto/luci.essentials)

------
ashayh
On a side note for ruby fans, it's quite popular in the embedded world in
Japan via a dedicated project:
[https://github.com/mruby/mruby](https://github.com/mruby/mruby)

------
rb808
Do CS grads even get taught C any more? I get the feeling that the newest
generation is all Java/Python/JS.

Given that cheap embedded systems are so powerful now it makes sense.

~~~
jefft255
If they take an Operating Systems course (which, I hope, every CS grad does)
then they certainly get taught C. That was the case for me, and I only
graduated last year. Plus, most CS grads learn C++, which means they can
easily get started with C.

------
pjmlp
There are even companies writing drivers in Python.

[https://foxdeploy.com/2017/07/19/buildyourowniotmonitoringto...](https://foxdeploy.com/2017/07/19/buildyourowniotmonitoringtool/)

Search for "via a driver written in Python!".

~~~
fazkan
this looks like someone's blog, and the search returned no results related to
python drivers...

~~~
pjmlp
It is someone's blog indeed.

The author tried to do an IoT setup for C#.

If you had searched properly, you would have found the remark I quoted, where
the author explains that the Chinese OEM of a touch screen that failed to work
with his setup told him that the respective device driver was written in
Python.

------
bobjordan
Here is a link to the zerynth docs:
[https://docs.zerynth.com/latest/official/core.zerynth.docs/i...](https://docs.zerynth.com/latest/official/core.zerynth.docs/installationguide/docs/index.html)

------
asah
This whole article is predicated on the COMICALLY flawed IEEE review: see
[https://news.ycombinator.com/item?id=14811321](https://news.ycombinator.com/item?id=14811321)

(I'm a huge python fan, but not this way...)

------
linopolus
So the developers of Zerynth say Zerynth is great. Nice try... I say, Wrong
Tool for the Job. Use C.

------
rebootthesystem
Well, "Embedded Systems" represents a continuum ranging from simple non-smart
programmable thermostats to microwave ovens, televisions, engine controllers,
navigation systems, high performance servo controllers and more. I have always
found discussion about programming languages to completely miss the reality of
what it is to deliver, support and maintain these products.

I have developed embedded systems in the context of a real business
manufacturing and supporting these products using everything from assembly
language, through Forth, C, C++ and, yes, Python.

There was always a good reason to make a choice at the time. For example, an
embedded system running a sophisticated control panel with dozens of knobs and
buttons was done entirely in assembly language eons ago. It didn't take long
for that to become a very bad idea. It was very difficult to maintain, improve
and evolve the system. We ultimately decided to bite the bullet and port the
code to C. The difference in the ability to express ideas was massive.

The only two languages I would avoid today are assembly language and Forth.
The first because it is a nightmare to support, maintain and evolve outside of
trivial systems. The second because it is impossible to find good Forth
programmers these days and the ecosystem (libraries, etc.) is nonexistent.
That said, I still think everyone should learn both.

I have yet to see an embedded system that can't be done well (and by that I
mean the entire life cycle of a product) with plain-old C. Nothing wrong with
the language. No magic. It works and it can produce clean, reliable fast code.

C++. Well. If you came up through assembler and Forth you learn to despise
waste. And that's what forcing an object oriented approach tends to be when
there is, in the end, no real need to take this route. Sometimes I see OO as a
tool lazy programmers reach for. Some of the funniest code I've seen is things
like erecting a complex OO solution to implement a state machine when a simple
lookup table and minimal procedural code could do exactly the same, run
faster, and use fewer resources. I also do hardware in FPGAs, so that's a
formula for being keenly aware of resources and not wasting them.
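That lookup-table state machine is easy to show; here's a hedged Python sketch (the states and events are invented, and the real thing would just as naturally be a C array indexed by state and event):

```python
# A state machine as a plain lookup table -- the "minimal procedural code"
# alternative to a class hierarchy. States and events are hypothetical.

TRANSITIONS = {
    ("idle",    "start"): "running",
    ("running", "pause"): "paused",
    ("paused",  "start"): "running",
    ("running", "stop"):  "idle",
    ("paused",  "stop"):  "idle",
}

def step(state, event):
    # Unknown (state, event) pairs leave the state unchanged.
    return TRANSITIONS.get((state, event), state)

state = "idle"
for event in ["start", "pause", "start", "stop"]:
    state = step(state, event)
assert state == "idle"
```

The whole behaviour is one data table plus a three-line dispatcher — nothing to construct, nothing virtual to dispatch.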

As for Python. Well, frankly, it's about the libraries and nothing else. If
you need to do an advanced embedded system with, for example, vision, you
almost can't beat the Python ecosystem. Not because of the language. No. I
couldn't care less. I care about time to market and the Python ecosystem
offers a cornucopia of solutions covering just about any field one might care
to name. You can hire Python programmers and anyone who knows C-like languages
can learn it fast. That's important.

In the end we build products and all that matters is to build said products as
cost-effectively as possible and the ability to support, maintain and evolve
them just as effectively.

~~~
AnimalMuppet
> C++. Well. If you came up through assembler and Forth you learn to despise
> waste. And that's what forcing an object oriented approach tends to be when
> there is, in the end, no real need to take this route.

I like wrapping an object around a hardware resource, so that the only way it
can be modified is through a call to a public member of that object. You can
enforce consistency that way.

And C++ has the philosophy that "you don't pay for what you don't use". (Of
course, if you _use_ what you don't _need_ , you can still cause waste...)

~~~
rebootthesystem
You don't need C++ for that. Let's say we are talking about a serial port and
a motor driver, all in the same MCU. A programmer would have to be pretty
sloppy to use motor controller functions to try and talk to the serial port.

If I write a function called "set_serial_baud_rate()" and another called
"set_motor_rpm()", it would be alarming if someone tried to use
set_motor_rpm() to change the serial port's baud rate.

I understand encapsulation very well but it simply isn't necessary in a lot of
embedded cases. I can't remember a single instance in thirty years of product
development when we had an issue in development because code was NOT object
oriented. Product development never slowed down due to not using OO. More bugs
were not introduced due to not using OO. Code was just as maintainable and
extensible using C or C++. It truly does not matter at all as far as I am
concerned.

What I look for is "done for you" code. Libraries that can produce power-of-
two or order-of-magnitude gains in developing a product. Show me gains of that
magnitude and I'll code in COBOL if necessary. I truly don't care about
languages any more. Just like I don't care about code editors (counting
milliseconds in vim). Give me a decent IDE like JetBrains products and life is
good.

I know this puts me at odds with most of what they teach today. Kids come out
of school and automatically reach for objects for even the simplest of
problems. OO has its place, of course, but it should not be the first tool
one reaches for unless it is well justified.

A CPU couldn't care less if you wrote the code in machine language or Python.
The facilities provided by these languages are for the programmer, not for the
CPU.

One of my favorite interview techniques: Ask someone to solve a problem. If
they are on the younger side they'll immediately define a class, members,
properties, maybe multiple classes. When they are done, ask them to do the
same in procedural form. I've seen people get completely tangled-up by this
simple request.

EDIT: To be clear, I am NOT saying OO doesn't have its place. Of course it
does. I've written a bunch of iOS apps in Objective-C with a good mix between
OO and procedural when justified. For example, a genetic solver done in
procedural form on iOS is miles faster than the same solver implemented in OO.
A lot of it has to do with using layers of data objects rather than the raw
data types typically used in C. In other words, there's a cost associated with
using an object to store and manipulate what would otherwise be a simple
array. You might be surprised to see just how dramatic the performance
improvement can be.

~~~
AnimalMuppet
Well, let's say that set_motor_rpm() didn't try to slam the RPM to the
requested value, but instead tried to smoothly accelerate or decelerate to the
requested value at a rate that would not damage the motor. It might use a
function called set_immediate_rpm() to set the value for the next instant, but
that value needs to be not very different from the current value, or damage
can result. You can use an object to make sure that nobody from outside calls
set_immediate_rpm() by mistake.

Now, you're still right, because in C, you just make set_immediate_rpm()
static, so that nobody can call it from outside the file, and then the non-
static functions become the public API.

~~~
rebootthesystem
To go further than that, embedded projects more frequently than not have one
and only one software developer. It's not like building a website with a team
of several developers.

In this context the "nobody" in your scenario is the only person on the
project. Which, once again, brings us to the idea that it isn't the language
or procedural vs. OO at the root of the problem at all if things are done
wrong or misused.

I must also point out that hypothetical examples are rarely meant to cover all
possible scenarios, much less corner cases. They are simplified examples to
facilitate an explanation. As such it is always possible to find holes in the
hypothetical.

A colleague taught me a very important lesson about twenty-five years ago. He
was about ten years older than me and had far more experience. I had just
finished coding an electronics component management system for our
manufacturing department where we worked. It was really slick and fast at a
time when text-only apps and menus reigned. I was proud that my six month
effort had produced such a product.

I was showing it off. When it was his turn to be dazzled he let me do my "look
at this" tour of the software and said virtually nothing. When I was done he
said "looks good" and proceeded to plant his palm on the keyboard and run it
back and forth about four times, pressing as many keys as he could at random.
The program crashed and he walked away.

It took me about a month to make it robust enough to have it react gracefully
to such an attack. Learned a lot in the process about dealing with the real
world. This gave me the mindset to write code defensively.

A function such as "set_immediate_rpm()" should be bulletproof, and this has
nothing to do with OO vs. procedural. A good implementation of
"set_immediate_rpm()" would have a rate-of-change evaluation such that you
could call it a million times and yet not violate the maximum desired rate of
change (presumably set through a different function or defined as a constant
somewhere).
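A hedged sketch of that rate-of-change guard (every name and number below is invented for illustration; a real implementation would live against hardware timers, not a module-level global):

```python
# Defensive rate limiting, as described above: no matter how often or how
# wildly the caller asks, the commanded RPM never jumps by more than
# MAX_STEP_RPM per call. All names and limits are hypothetical.

MAX_STEP_RPM = 50   # maximum allowed change per call

_current_rpm = 0

def set_immediate_rpm(requested: int) -> int:
    """Clamp the change toward `requested`; return the RPM actually set."""
    global _current_rpm
    delta = requested - _current_rpm
    if delta > MAX_STEP_RPM:
        delta = MAX_STEP_RPM
    elif delta < -MAX_STEP_RPM:
        delta = -MAX_STEP_RPM
    _current_rpm += delta
    return _current_rpm

# Even a wild request only moves the motor one safe step per call:
assert set_immediate_rpm(1000) == 50
assert set_immediate_rpm(1000) == 100
assert set_immediate_rpm(0) == 50
```

Call it a million times and the commanded value still ramps at the safe rate — the palm-on-the-keyboard test passes by construction.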

