- Bezier curves can't perfectly describe circles. ([[ https://stackoverflow.com/questions/1734745/how-to-create-circle-with-bézier-curves | Ref ]])
- It's nontrivial to define some `B(t)` function for a Bezier curve that has constant speed. ([[ https://gamedev.stackexchange.com/questions/27056/how-to-achieve-uniform-speed-of-movement-on-a-bezier-curve | Ref ]])
- G-Code can do Bezier curves (`G5` "Cubic Spline", `G5.1` "Quadratic Spline") and other complex curves (`G5.2`/`G5.3` NURBS). ([[ http://linuxcnc.org/docs/2.6/html/gcode/gcode.html#sec:G5-Cubic-Spline | Ref ]])
- However, it sounds like use of these is somewhat rare, and some/most software that generates G-Code approximates curves with many `G2` or `G3` arc segments. GRBL does not implement `G5`/`G5.1` and has no plans to. ([[ https://github.com/grbl/grbl/issues/509#issuecomment-58808102 | Ref ]]).
- G-Code doesn't seem (?) to be able to specify these curves in planes other than X/Y, although maybe you can switch the active plane with `G18` so that your "Y" parameters become "Z" parameters. But I think you can't specify an N-axis spline (or a helical spline?). GRBL //does// support the plane select (`G17`, `G18`, and `G19`) operators.
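Tangentially, the constant-speed problem from the second bullet has a standard workaround worth sketching: sample the curve finely, build a cumulative arc-length table, and invert it to find the `t` for a given distance. This is my own Python sketch of that technique, not anything from the linked discussions:

```python
import bisect

def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a 2D cubic Bezier at parameter t in [0, 1]."""
    u = 1.0 - t
    return tuple(
        u**3 * a + 3 * u**2 * t * b + 3 * u * t**2 * c + t**3 * d
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

def arc_length_table(p0, p1, p2, p3, samples=1000):
    """Cumulative chord lengths at evenly spaced t values."""
    ts = [i / samples for i in range(samples + 1)]
    pts = [cubic_bezier(p0, p1, p2, p3, t) for t in ts]
    lengths = [0.0]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        lengths.append(lengths[-1] + ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5)
    return ts, lengths

def t_at_distance(ts, lengths, s):
    """Invert the table: find t such that arc length from 0 is ~s."""
    i = bisect.bisect_left(lengths, s)
    if i == 0:
        return ts[0]
    if i >= len(lengths):
        return ts[-1]
    # Linear interpolation between the bracketing samples.
    frac = (s - lengths[i - 1]) / (lengths[i] - lengths[i - 1])
    return ts[i - 1] + frac * (ts[i] - ts[i - 1])
```

Stepping `s` in equal increments and converting each to `t` gives approximately constant speed along the curve, at the cost of storing (or streaming) the table.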
**Computers and Boards**
General model: motors are driven by a control board. The control board is connected to a normal computer over some flavor of serial connection. I want a normal computer on one end so I can press buttons.
**Why do we need a computer?**
**Why do we need a control board? Why can't we //just// use a computer?**
- The computer can't control the motors directly because normal computers physically don't have I/O pins.
- Raspberry Pis do have I/O pins, and there are tutorials for driving stepper motors with a Pi. It's not immediately clear to me why you can't use a single Raspberry Pi in both roles.
- Guess 1: Maybe it's difficult to achieve real-time I/O control on a Pi?
- Guess 2: Everything runs GRBL, which is mostly written by one guy for free, and no one has happened to want to spend their life writing PiGRBL for free?
- Retailers sell a Pi glued to an Arduino ([[ https://www.pishop.us/product/raspberry-pi-cnc-board/ | Ref ]]), although this reseller doesn't explain why.
- In fact, the Arduino is a whole custom Arduino designed to go on a Pi ([[ https://wiki.protoneer.co.nz/Raspberry_Pi_CNC | Ref ]]). Surely this isn't just for GRBL?
- Some guy actually has written PiGRBL for free, called "raspigcd" ([[ https://github.com/pantadeusz/raspigcd | Ref ]]).
- Discussion connected to raspigcd suggests that the real-time stuff is probably the main issue ("you have to install a real-time kernel").
- There are some hybrid boards like the BeagleBone that have a chip called a PRU ("Programmable Real-Time Unit") to do the control stuff.
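As a rough illustration of the real-time guess above (my own experiment, nothing from the linked discussions): stepper pulse timing cares about tens of microseconds, and you can get a feel for how badly a general-purpose OS scheduler misses that target by measuring how far a short sleep overshoots:

```python
import time

def worst_sleep_overshoot(target_s=0.001, iterations=200):
    """Return the worst observed overshoot (seconds) past `target_s`.

    On a desktop or stock Pi kernel this is typically far more than the
    tens-of-microseconds precision that stepper pulses want, which is
    (I'd guess) why the control side is either a microcontroller or a
    real-time kernel.
    """
    worst = 0.0
    for _ in range(iterations):
        start = time.perf_counter()
        time.sleep(target_s)
        overshoot = (time.perf_counter() - start) - target_s
        worst = max(worst, overshoot)
    return worst
```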
I would guess that a significant force here is that users in the small-scale/DIY CNC space are price-sensitive but not interested enough in DIY to write their own CNC software or wire their own control board. Using an Arduino clone is cheaper than any other approach (provided the end user already has a computer), GRBL already exists for free, and the real-time stuff is probably trickier on other systems.
I don't really expect to get anywhere, but I'm not married to Arduino as a control board platform. Seems fine to start, but worth reconsidering if I hit limits (probably memory?).
**Division of Responsibilities**
The general flow for cheap/DIY CNC appears to be: some kind of fancy GUI tool on the computer generates G-Code, which is a simple text format describing machine instructions ("move to X=100, Y=200"). G-Code is sent to the control board. The control board converts G-Code to I/O pins, generally with a piece of software running on the control board called GRBL.
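To make "simple text format" concrete, here's a toy parser (mine, not GRBL's; real G-Code also has parenthesized comments, modal state, and other wrinkles): each line is a command word plus letter-prefixed numeric parameters.

```python
def parse_gcode_line(line):
    """Parse e.g. 'G1 X100 Y200 F500' into ('G1', {'X': 100.0, ...}).

    Semicolon comments are stripped; blank lines return None.
    """
    line = line.split(";")[0].strip()
    if not line:
        return None
    words = line.split()
    command = words[0].upper()
    params = {w[0].upper(): float(w[1:]) for w in words[1:]}
    return command, params
```

The simplicity is the point: the control board only has to tokenize lines like this and turn them into step pulses.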
The minimum complexity of the control board is fairly high. I think there are two general limits: real-timiness and bandwidth/latency.
Taking real-timiness first: most paths are nominally time-independent, in the sense that if the board stalls between steps, the machine just pauses in place. Even cases like hitting a limit/homing switch or a tool-height switch aren't really real-time sensitive as long as the board reports the event to the computer before processing more steps.
There's one obvious issue here: these delays always make execution slower and never make it faster, so even if the machine is time-independent, accepting these delays means accepting a slower machine.
A less obvious issue may be that paths aren't time-independent if a machine has heavy parts with momentum: an axis will generally take longer to stop if it is moving quickly than if it is moving slowly. If we run the exact same sequence of I/O operations twice as fast, momentum may give us a slightly different result.
We could also perhaps imagine cases like a tapping cycle where we want to run an axis at the same rate as a spindle? But I'm not sure you can tap without a servo/stepper as the spindle. Even stuff like running a constant-feed-rate cycle on a lathe axis shouldn't depend significantly on things being real-time.
There are also some cases where things "obviously" have real-time physics and we have to react to them in real time (like an end-effector catching a ball, or a sandblasting tool that continuously removes material under the toolhead independent of axis motion, or maybe the heating or cooling of a 3D-printed plastic part).
So I think the general issue here is that machines interact to varying degrees with physics. In traditional CNC, this interaction is mostly motor momentum. In other applications, particularly general robotics applications, there might be more interaction. This seems to line up with random people talking about this on the internet. ([[ https://forum.linuxcnc.org/27-driver-boards/11073-novice-why-realtime-is-so-important | Ref ]]).
I might be able to ignore this on small machines with light workpieces and tiny steppers: the error from momentum is almost certainly overwhelmed by the error from my plotter collet being made out of pool noodle. However: I suspect a major source of angry stepper motor sounds is not accelerating them properly; and the next limit is harder and informs design directly.
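On the acceleration point: the usual fix is a trapezoidal velocity profile, where step rate ramps up from rest, cruises, and ramps back down rather than jumping straight to full speed. This is a sketch with my own numbers and function names, not GRBL's planner:

```python
import math

def step_velocity(s, total_steps, v_max, accel):
    """Velocity (steps/s) when `s` steps into a move of `total_steps`.

    Takes the minimum of three constraints: the cruise speed, the speed
    reachable by accelerating from rest, and the speed from which we can
    still decelerate to rest by the end of the move.
    """
    v_accel = math.sqrt(2.0 * accel * (s + 1))
    v_decel = math.sqrt(2.0 * accel * (total_steps - s))
    return min(v_max, v_accel, v_decel)

def profile(total_steps, v_max, accel):
    """Per-step velocities for a full trapezoidal move."""
    return [step_velocity(s, total_steps, v_max, accel)
            for s in range(total_steps)]
```

Commanding a stepper at these ramped rates (instead of full speed from step one) is what avoids stalls and, I suspect, much of the angry noise.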
The other limit is bandwidth, I think? A short, simple tool path can easily require 2^16 steps on the I/O pins, and each axis theoretically accepts 30K pulses/second. If we want to drive three axes at full speed, we may need to change I/O state 180,000 times per second. If each instruction is primitive ("Enable pin 3 now"), we could perhaps imagine a control language composed of 1-byte instructions.
By default, Arduinos run at 9600 baud, which would limit us to 0.5% of the maximum machine speed. However, you can increase the rate to 115200 in the UI (~6% of maximum speed) and some people on the internet suggest that 1M baud (~50% of maximum speed) works fine. This is still too slow and it's likely not practical to design a useful 1-byte protocol, but maybe a relatively low-level control language isn't //entirely// ridiculous.
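The percentages above can be checked directly, assuming 8N1 serial framing (10 bits on the wire per byte) and one byte per I/O state change:

```python
def fraction_of_max_speed(baud, changes_per_second=180_000, bits_per_byte=10):
    """Fraction of the 180K changes/sec budget a serial link can carry."""
    bytes_per_second = baud / bits_per_byte
    return bytes_per_second / changes_per_second

for baud in (9600, 115200, 1_000_000):
    print(f"{baud:>9} baud -> {fraction_of_max_speed(baud):.1%} of max speed")
```

This reproduces the ~0.5%, ~6%, and ~50% figures in the text.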
**Streaming and Timing**
Memory on the Arduino is at a premium. We can't fit a whole G-Code (or whatever) plan into memory, which isn't too much of a shock.
But we often can't even fit a single path into memory. GRBL maintains a ring buffer of 16 planned linear motions, but it looks like it slices arcs into as many as 2,000 linear motions. When the motion planner attempts to insert a 17th linear motion into the queue, it just busy waits.
The actual motion is driven "in another thread" by interrupts: the motion ring buffer is produced by the main execution thread, and consumed by interrupt timers.
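A toy model of that producer/consumer pattern (names and sizes are mine; GRBL's actual planner is C, and the "buffer full" case is where its busy-wait happens):

```python
from collections import deque

class MotionBuffer:
    """Fixed-size queue of planned linear motions."""

    def __init__(self, capacity=16):
        self.capacity = capacity
        self._queue = deque()

    def try_push(self, motion):
        """Planner side: returns False when full (GRBL spins here)."""
        if len(self._queue) >= self.capacity:
            return False
        self._queue.append(motion)
        return True

    def pop(self):
        """Interrupt side: consume the next planned motion, or None."""
        return self._queue.popleft() if self._queue else None
```

An arc sliced into 2,000 linear motions would be pushed through this 16-slot window a few segments at a time, with the planner blocked whenever the interrupt side falls behind.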
From comments, it seems like some aspects of linear decomposition (particularly, the trig functions) may take ~50us, which is way above the ~10us timing sensitivity of the steppers.
However, I think this design means that you can't (or, at least, can't easily) rewind a path? I'm generally unclear on this, but it seems like there is generally no provision in this control system for operations like "the mill snapped, so: pause, pull out of the work piece, swap the bit, measure the new tool height, rewind the path by one second, resume milling".
This guy ([[ https://www.youtube.com/watch?v=WyOyNnUtGP4 | Ref ]]) doesn't even seem to be able to pause the machine to kick down the dust collection sheathing? Maybe he just isn't familiar with the system, or is running it in standalone mode and didn't wire in a pause button? G-Code has a "Feed Hold" command which works like a "Pause", but it generally seems like most implementations are fire-and-forget and don't support modifying the path at runtime to handle things like replacing the bit.
There seem to be essentially no relevant hits for "CNC rewind" on Google. I'm surprised no one is //asking// about this -- maybe it's not actually necessary in practice?
- This guy ([[ https://www.youtube.com/watch?v=J7GkSWupQRk | Ref ]]) does a bit change between sections by splitting the job into two parts and manually re-zeroing to, you know, roughly the same place.
- This guy ([[ https://www.youtube.com/watch?v=ePlyXeXYZBE | Ref ]]) can start a program partway through, but he's using a trillion dollar industrial CNC machine, and he "usually leaves [this setting] off" because replaying 700,000 lines of program state is slow (???!!!).
Conceptually, I'd like (?) to structure instructions as iterators that emit lists of motion plans? Then the board could drive multiple motion plans simultaneously and pause/rewind/resume more easily? I'm not sure how practical this is given memory and CPU constraints on the Arduino.
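To make the iterator idea slightly more concrete, here's a hypothetical sketch: each instruction is a generator of small motion segments, and the executor keeps a short history so it can rewind a few segments after an event like a broken bit. Everything here (names, structure) is made up, and the history buffer is exactly the kind of thing that would hurt on a microcontroller's memory budget.

```python
from collections import deque

def linear_move(x0, y0, x1, y1, segments):
    """An instruction: an iterator of tiny (x, y) motion targets."""
    for i in range(1, segments + 1):
        t = i / segments
        yield (x0 + (x1 - x0) * t, y0 + (y1 - y0) * t)

class Executor:
    def __init__(self, plan_iter, history=8):
        self.plan = plan_iter
        self.history = deque(maxlen=history)  # recently executed segments
        self.replay = deque()                 # segments queued to re-run

    def step(self):
        """Execute (here: just return) the next segment, or None when done."""
        if self.replay:
            seg = self.replay.popleft()
        else:
            seg = next(self.plan, None)
            if seg is None:
                return None
        self.history.append(seg)
        return seg

    def rewind(self, n):
        """Queue the last n executed segments to run again."""
        for _ in range(min(n, len(self.history))):
            self.replay.appendleft(self.history.pop())
```

A "mill snapped" recovery would then look like: stop stepping, do the tool change, call `rewind(k)`, and resume stepping.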
GRBL is also very hard-coded to a particular pinout: the computer controller can't tell the control board how it is wired at runtime. This seems sort of silly: I'd like to hand the board a software definition of an arbitrary number of axes, spindles, effectors, etc., at runtime and have the control language look more like "move stepper 1 and stepper 2 in a linear path to 100, 200 at rate X", not "move x to 100 and y to 200". This may be difficult to achieve given space constraints.