- Bezier curves can't perfectly describe circles. ([[ https://stackoverflow.com/questions/1734745/how-to-create-circle-with-bézier-curves | Ref ]])
- It's nontrivial to define a parameterization `B(t)` of a Bezier curve that moves at constant speed. ([[ https://gamedev.stackexchange.com/questions/27056/how-to-achieve-uniform-speed-of-movement-on-a-bezier-curve | Ref ]])
- G-Code can do Bezier curves (`G5` "Cubic Spline", `G5.1` "Quadratic Spline") and other complex curves (`G5.2`/`G5.3` NURBS). ([[ http://linuxcnc.org/docs/2.6/html/gcode/gcode.html#sec:G5-Cubic-Spline | Ref ]])
- However, it sounds like use of these is somewhat rare, and some/most software that generates G-Code approximates curves with many `G2` or `G3` arc segments. GRBL does not implement `G5` and has no plans to. ([[ https://github.com/grbl/grbl/issues/509#issuecomment-58808102 | Ref ]])
- G-Code doesn't seem (?) to be able to specify these curves in planes other than X/Y, although maybe you can switch the active plane with `G18` so that your "Y" parameters become "Z" parameters. But I think you can't specify an N-axis spline (or a helical spline?). GRBL //does// support the plane select (`G17`, `G18`, and `G19`) operators.
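To make the first point concrete: the usual way to fake a quarter circle with one cubic Bezier is to set the control-handle length to `k = 4/3 * (sqrt(2) - 1) ≈ 0.5523`. A quick numerical check (plain Python, no dependencies; the 1000-sample resolution is arbitrary) shows the radial error is tiny but nonzero:

```python
# Approximate the unit quarter circle from (1, 0) to (0, 1) with one cubic
# Bezier. Standard handle length: k = 4/3 * (sqrt(2) - 1).
import math

k = 4 / 3 * (math.sqrt(2) - 1)
p0, p1, p2, p3 = (1, 0), (1, k), (k, 1), (0, 1)

def bezier(t):
    """Evaluate the cubic Bezier at parameter t in [0, 1]."""
    u = 1 - t
    x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
    y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
    return x, y

# A true circle has radius exactly 1 everywhere; measure the worst deviation.
max_err = max(abs(math.hypot(*bezier(i / 1000)) - 1) for i in range(1001))
print(f"max radial error: {max_err:.6f}")  # about 0.00027
```

So the approximation is off by roughly 0.03% of the radius: invisible on a plotter, but "can't perfectly describe circles" is literally true.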
**Computers and Boards**
General model: motors are driven by a control board. The control board is connected to a normal computer over some flavor of serial connection. I want a normal computer on one end so I can press buttons.
**Why do we need a computer?**
**Why do we need a control board? Why can't we //just// use a computer?**
- The computer can't control the motors directly because normal computers physically don't have I/O pins.
- Raspberry Pis do have I/O pins, and there are tutorials for driving stepper motors with a Pi. It's not immediately clear to me why you can't use a single Raspberry Pi in both roles.
- Guess 1: Maybe it's difficult to achieve real-time I/O control on a Pi?
- Guess 2: Everything runs GRBL, which is mostly written by one guy for free, and no one has happened to want to spend their life writing PiGRBL for free?
- Retailers sell a Pi glued to an Arduino ([[ https://www.pishop.us/product/raspberry-pi-cnc-board/ | Ref ]]), although this reseller doesn't explain why.
- In fact, the Arduino is a whole custom Arduino designed to go on a Pi ([[ https://wiki.protoneer.co.nz/Raspberry_Pi_CNC | Ref ]]). Surely this isn't just for GRBL?
- Some guy has actually written a PiGRBL for free, called "raspigcd" ([[ https://github.com/pantadeusz/raspigcd | Ref ]]).
- Discussion connected to raspigcd suggests that the real-time stuff is probably the main issue ("you have to install a real-time kernel").
- There are some hybrid boards like the BeagleBone that have a chip called a PRU ("Programmable Real-Time Unit") to do the control stuff.
I would guess that a significant force here is that users in the small-scale/DIY CNC space are price sensitive but not interested enough in DIY to want to write their own CNC software or wire their own control board. Using an Arduino clone is cheaper than any other approach provided the end user already has a computer, some guy already wrote GRBL for free, and the real-time stuff is probably trickier on other systems.
I don't really expect to get anywhere, but I'm not married to Arduino as a control board platform. Seems fine to start, but worth reconsidering if I hit limits (probably memory?).
**Division of Responsibilities**
The general flow for cheap/DIY CNC appears to be: some kind of fancy GUI tool on the computer generates G-Code, which is a simple text format describing machine instructions ("move to X=100, Y=200"). G-Code is sent to the control board. The control board converts G-Code to I/O pins, generally with a piece of software running on the control board called GRBL.
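For a sense of how simple the text format is, here's a toy sketch (hypothetical Python, not how GRBL actually does it — real GRBL is C and also tracks modal state, checksums, etc.) of parsing one line of G-Code into an opcode plus parameters:

```python
# Parse a G-Code line like "G1 X100 Y200 F500" into (opcode, params).
# Toy sketch only: real parsers handle comments, modal words, and much more.
def parse_gcode_line(line):
    words = line.split(";")[0].split()  # drop trailing ";" comments
    if not words:
        return None                     # blank line or pure comment
    opcode = words[0].upper()           # e.g. "G1": linear move
    params = {w[0].upper(): float(w[1:]) for w in words[1:]}
    return opcode, params

print(parse_gcode_line("G1 X100 Y200 F500"))
# -> ('G1', {'X': 100.0, 'Y': 200.0, 'F': 500.0})
```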
The minimal complexity of the control board is fairly high. I think there are two general limits: real-timiness and bandwidth/latency.
Even cases like hitting a limit/homing switch or a tool height switch aren't really real-time sensitive, as long as the board reports the event to the computer before processing more steps: the board can just pause and wait for the computer's response, at the cost of some round-trip delay.
There's one obvious issue here: these delays always make execution slower and never make it faster, so even if the machine is time-independent, accepting these delays means accepting a slower machine.
A less obvious issue may be that paths aren't time-independent if a machine has heavy parts with momentum: an axis will generally take longer to stop if it is moving quickly than if it is moving slowly. If we run the exact same sequence of I/O operations twice as fast, momentum may give us a slightly different result.
We could also perhaps imagine cases like a tapping cycle where we want to run an axis at the same rate as a spindle? But I'm not sure you can tap without a servo/stepper as the spindle. Even stuff like running a constant-feed-rate cycle on a lathe axis shouldn't depend significantly on things being real-time.
There are also some cases where things "obviously" have real-time physics and we have to react to them in real time (like an end-effector catching a ball, or a sandblasting tool that continuously removes material under the toolhead independent of axis motion, or maybe the heating or cooling of a 3D-printed plastic part).
So I think the general issue here is that machines interact to varying degrees with physics. In traditional CNC, this interaction is mostly motor momentum. In other applications, particularly general robotics applications, there might be more interaction. This seems to line up with random people talking about this on the internet. ([[ https://forum.linuxcnc.org/27-driver-boards/11073-novice-why-realtime-is-so-important | Ref ]]).
I might be able to ignore this on small machines with light workpieces and tiny steppers: the error from momentum is almost certainly overwhelmed by the error from my plotter collet being made out of pool noodle. However: I suspect a major source of angry stepper motor sounds is failing to accelerate them properly, and the next limit is harder to ignore and informs the design directly.
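To make the acceleration point concrete, here's a sketch of the simplest version of what a motion planner does: compute per-step pulse delays for a trapezoidal (here, triangular) velocity ramp. All numbers are made up for illustration, and this is not GRBL's actual algorithm:

```python
# Step-pulse timing for a stepper axis under a constant-acceleration ramp:
# speed up, then slow down symmetrically. Driving steps at a fixed fast rate
# with no ramp is (I suspect) a classic way to stall a motor.
import math

def step_delays(n_steps, accel=1000.0, max_rate=2000.0):
    """Per-step delays in seconds: accelerate at `accel` steps/s^2 up to
    `max_rate` steps/s, decelerating symmetrically toward the end."""
    ramp = min(n_steps // 2, int(max_rate**2 / (2 * accel)))  # steps to reach max_rate
    delays = []
    for i in range(n_steps):
        d = min(i, n_steps - 1 - i, ramp)      # steps into the nearest ramp
        rate = math.sqrt(2 * accel * (d + 1))  # v = sqrt(2 * a * s)
        delays.append(1 / min(rate, max_rate))
    return delays

d = step_delays(200)
print(f"first step: {d[0]*1000:.1f} ms, midpoint: {d[100]*1000:.2f} ms")
```

The point is just that correct pulse timing depends on where you are in the move, which is awkward to do from the far side of a buffered serial link.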
The other limit is bandwidth, I think? A short, simple tool path can easily require 2^16 steps on the I/O pins, and each axis theoretically accepts 30K pulses/second. Each pulse is a rising edge plus a falling edge, so driving three axes at full speed may mean changing I/O state 180,000 times per second. If each instruction is primitive ("enable pin 3 now"), we could perhaps imagine a control language composed of 1-byte instructions.
By default, Arduinos run at 9600 baud, which would limit us to 0.5% of the maximum machine speed. However, you can increase the rate to 115200 in the UI (~6% of maximum speed) and some people on the internet suggest that 1M baud (~50% of maximum speed) works fine. This is still too slow and it's likely not practical to design a useful 1-byte protocol, but maybe a relatively low-level control language isn't //entirely// ridiculous.
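Those percentages fall out of simple arithmetic, assuming the usual ~10 bits on the wire per byte (8N1 framing) and the hypothetical one-byte-per-state-change protocol above:

```python
# Fraction of the 180,000 state-changes/second budget each baud rate covers,
# assuming a hypothetical 1-byte instruction per state change and 10 bits
# per byte on the wire (start bit + 8 data bits + stop bit).
STATE_CHANGES_PER_SEC = 180_000

for baud in (9600, 115200, 1_000_000):
    bytes_per_sec = baud / 10
    fraction = bytes_per_sec / STATE_CHANGES_PER_SEC
    print(f"{baud:>9} baud -> {bytes_per_sec:>8.0f} B/s -> {fraction:.1%} of max speed")
# 9600 -> 0.5%, 115200 -> 6.4%, 1,000,000 -> 55.6%
```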