
DIY electronic leadscrew for metalworking lathe [video] - OJFord
https://www.youtube.com/playlist?list=PLDlWKv7KIIr90ZZ7Zqt-ge5nVVdS3WVgg
======
defterGoose
Yep, he's right about the timing issues. I've taken a deep dive into LinuxCNC
recently, as I'm using it as the motion controller for my custom servo-driven
3dp/Mill, and one of the most important decisions one makes up front is what
Linux kernel to run. The real-time kernels are all but necessary for doing
heavy number crunching and encoder reading, and that's even on a powerful
i5-powered desktop machine with custom IO hardware.

As you start to climb into the multiple-thousand dollar arena on a project
that was planned with just a couple, you really start to realize why the
manufacturing industry routinely pays significant fractions of a $MM for their
machines.

~~~
Blackthorn
Isn't it more that real-time kernels are necessary, period? The number
crunching itself can't be that problematic... I just upgraded an old Anilam
mill controller that ran 3 axes. It was running on a 386.

The new encoders on the servos are way better, but the replacement modern
hardware is also way better.

I think in both cases (especially the upgrade), the event handlers and drivers
are all done in separate, dedicated hardware.

~~~
defterGoose
Well, you can run the simulation and such without it, but for doing any real
work, yes, they are necessary. That was kind of my point, though: you could in
theory put one of those kernels on an RPi or something (barring compatibility
issues with the chipset), but you'd probably still be lacking the proper IO
hardware, etc.

~~~
Causality1
Absolutely agree. Modern operating systems are utter beasts when it comes to
complexity and are far less predictable than simpler solutions. There are
reasons microcontrollers are still everywhere.

------
rmu09
LinuxCNC on a Raspberry Pi 4 is fast enough to run the motion-control loop at
1 kHz; worst-case latency is below 100 µs with current rt-preempt kernels.

Generation of stepper signals, or interfacing with servo amplifiers and
encoders, is best left to an external microcontroller or FPGA card, connected
via either SPI or Ethernet.

~~~
ampdepolymerase
How do the audio and video subsystems of standard Linux systems deal with real
time requirements?

~~~
bsder
Badly.

Only macOS has an architecture to deal with low-latency audio and video
properly. Presumably iOS has some of those features as well, but it was on OS
X that Apple made some deep architectural changes for proper low-latency audio
and video.

~~~
tonyarkles
It was admittedly several years ago, but I worked on a voice chat app for iOS
and Android, and the behaviour on Android was horrifying. iOS had very
consistent and great audio latency (approx 10ms if I recall). Android devices
were all over the place; I think the worst one I tested was over 150ms.

------
justinclift
It's interesting where he's pointing out the real-time debugging support in
the chosen board.

One "weird" thing that popped out at me is where he said it's not possible to
do that with the "other platforms" he's shown.

Does anyone know if the debugging style he's mentioning is different from
attaching a segger to the debug port on the Due? Asking because I've done
exactly that before, when working through bugs in the g2core (multi-axis CNC
controller) firmware.

To me, being fairly newbie-level with it, it sounds like the same kind of
thing. Perhaps the author just isn't aware that some Arduinos do have a
debugging port plus commonly available tooling to use it?

~~~
rkagerer
I agree other ecosystems have similar debugging capabilities; I think the main
driver for hardware selection was deterministic timing and ability to keep up.

See my comment further down beginning " _Starting about 7:09 he talked about
why the Arduino wasn't a good fit_..."

------
tardo99
Not sure I agree with the bit about Linux + Raspberry Pi not being able to
handle precision timing down to the microsecond level. I'd be interested if
anyone has thoughts.

~~~
analog31
I've done a fair amount of timing-critical design, including control of
stepping motors and encoders for specialized uses. My experience has been that
"real time" programming on a mainstream computer with an operating system is a
headache, if it's possible at all. It's just a lot easier to do that kind of
stuff on a mid range microcontroller, where the processor has exactly one task
and predictable sequencing of operations.

Granted, it was a long time ago, but I remember trying to work with a CNC
milling machine that generated all of the timing on a Windows computer.
Everything worked fine unless you touched the mouse, which would cause motion
to stutter. So, there can be a lot of gotchas that have to be accounted for
when using a "big" computer for "little" things.

~~~
Animats
_I've done a fair amount of timing-critical design, including control of
stepping motors and encoders for specialized uses. My experience has been that
"real time" programming on a mainstream computer with an operating system is a
headache, if it's possible at all. It's just a lot easier to do that kind of
stuff on a mid range microcontroller, where the processor has exactly one task
and predictable sequencing of operations._

Yes. This is just a "keep motor A in sync with encoder B" problem. That should
be on something at the Arduino level, although you might need a faster CPU
version. Something with no caches, no superscalar, just dumb static memory and
fixed-time instruction execution. Linux would just get in the way. The loop
that keeps both sides in sync should probably be running around 1000Hz or so.
In one of the OP's rather long and numerous videos, he mentions 10Hz not being
enough. Right.

If it needs a user interface, that might be on a separate computer, with an
interconnect to set the ratio in the motor controller over I2C or something
equally dumb.

Also, you can get stepper motors with encoders, so you can tell if you missed
a step and correct. You don't have to use a servomotor, which seems to have
complicated this project. Shopbots and Tormach mills use stepper motors with
encoders.

Many people have converted existing milling machines to CNC, and there are
kits for that. This is a much simpler project.

~~~
rkagerer
Starting about 7:09 he talked about why the Arduino wasn't a good fit and
explained his choice of the TI board (including speed, single-cycle floating
point hardware, responsive interrupts and dedicated peripherals to latch
consistent, jitter-free reads of the quadrature encoder counts).

Note it's a 4096 step encoder, mounted on a different shaft than the one he's
driving, that can run upwards of 2500 rpm. His motor control loop (not the
complete end-to-end chain, but the bit that maintains the output at the
desired position) takes under a microsecond. His code works with either a
NEMA 24 stepper or a NEMA 23 servo, and he demonstrated both.

In the second video he maths out some projections for error/drift (factoring
in floating point representation error) and its ultimate impact on thread
accuracy (which provides some motivation for what you might initially consider
"overkill" hardware).

10 Hz is just the refresh rate for the tachometer calculation.

~~~
analog31
Indeed, a bit of confusion about "Arduino" is that it has come to mean two
things. One is the original Arduino board and its "branded" successors. The
other is the development stack. In the latter case, it's GCC with some
libraries and a beginner-friendly editor. It's bare-metal programming.

Thinking about dealing with an encoder, let's say it has 8k counts per
revolution, and the motor can do 10 revolutions per second, which is pretty
fast for a stepping motor. Then the encoder steps are coming in at an 80 kHz
rate, but a 180 MHz MCU can execute a couple thousand instructions in that
time period, so it's not even breathing hard.

------
solotronics
I have seen some other people using an FPGA board to maintain instructions at
a specific clock rate and interfacing with it via LinuxCNC.

------
thdrdt
When he talks about the interface [1] I think he touches a really important
point: you have to know the user and usage to decide what interface to use.

As he says it is very tempting to use modern technology like touchscreens. But
because he knows he will touch it with oily fingers that's not an option.

Great project!

[1] Part 1: Proof of Concept @15:10

