
Notes on Arduino CNC drivers
Open, WishlistPublic

Assigned To
None
Authored By
epriestley
Apr 28 2021, 5:33 PM
Tags
None
Tokens
"Manufacturing Defect?" token, awarded by tycho.tatitscheff."Pterodactyl" token, awarded by leoluk."Cup of Joe" token, awarded by cspeckmim.

Description

Curves

  • Bezier curves can't perfectly describe circles. (Ref)
  • It's nontrivial to define some B(t) function for a Bezier curve that has constant speed. (Ref)
  • G-Code can do Bezier curves (G5 "Cubic Spline", G5.1 "Quadratic Spline") and other complex curves (G5.2/G5.3 NURBS). (Ref)
    • However, it sounds like use of these is somewhat rare, and some/most software that generates G-Code approximates curves with many G2 or G3 arc segments. GRBL does not implement G5 and has no plans to implement them. (Ref).
    • G-Code doesn't seem (?) to be able to specify these curves in planes other than X/Y, although maybe you can switch the active plane with G18 so that your "Y" parameters become "Z" parameters. But I think you can't specify an N-axis spline (or a helical spline?). GRBL does support the plane select (G17, G18, and G19) operators.

Computers and Boards

General model: motors are driven by a control board. The control board is connected to a normal computer over some flavor of serial connection. I want a normal computer on one end so I can press buttons.

Why do we need a computer?

Some component of the system needs to be able to run a GUI so the UI can be reasonably nice to use. It's possible to run CNC fully offline with no external connections (by putting a G-Code instruction file on a flash drive, for example) but this use case isn't interesting to me. Small boards like Arduinos can't run the 50MB of Javascript required to render a button in a modern GUI because they aren't fast enough. A desktop computer is, very conservatively, >200x faster than an Arduino.

Why do we need a control board? Why can't we just use a computer?

  • The computer can't control the motors directly because normal computers physically don't have I/O pins.
      • Raspberry Pis do have I/O pins, and there are tutorials for driving stepper motors with a Pi. It's not immediately clear to me why you can't use a single Raspberry Pi in both roles.
        • Guess 1: Maybe it's difficult to achieve real-time I/O control on a Pi?
        • Guess 2: Everything runs GRBL, which is mostly written by one guy for free, and no one has happened to want to spend their life writing PiGRBL for free?
      • Retailers sell a Pi glued to an Arduino (Ref), although this reseller doesn't explain why.
        • In fact, the Arduino is a whole custom Arduino painted to go on a Pi (Ref). Surely this isn't just for GRBL?
    • Some guy actually has written PiGRBL for free, called "raspigcd" (Ref).
      • Discussion connected to raspigcd suggests that the real-time stuff is probably the main issue ("you have to install a real-time kernel").
    • There are some hybrid boards like the BeagleBone that have a chip called a PRU ("Programmable Real-Time Unit") to do the control stuff.

I would guess that a significant force here is that users in the small-scale/DIY CNC space are price-sensitive but not interested enough in DIY to want to write their own CNC software or wire their own control board; that using an Arduino clone is cheaper than any other approach, provided the end user already has a computer; that some guy already wrote GRBL for free; and that the real-time stuff is probably trickier on other systems.

I don't really expect to get anywhere, but I'm not married to Arduino as a control board platform. Seems fine to start, but worth reconsidering if I hit limits (probably memory?).

Division of Responsibilities

The general flow for cheap/DIY CNC appears to be: some kind of fancy GUI tool on the computer generates G-Code, which is a simple text format describing machine instructions ("move to X=100, Y=200"). G-Code is sent to the control board. The control board converts G-Code to I/O pins, generally with a piece of software running on the control board called GRBL.

The minimal complexity of the control board is fairly high. I think there are two general limits: real-timiness and bandwidth/latency.

The real-time constraint is a bit complicated. If we imagine that the board control language is very low-level ("enable pin 3 now"), we might expect that the time between when the computer sends the command and the board receives the command may sometimes be several milliseconds, if the computer needed to pause for a moment to think about 50MB of Javascript to let you know about new deals from Microsoft. So commands may reach the board at an arbitrarily later time than they are sent from the computer.

For most simple stuff, that actually doesn't seem like a big problem? Stepper motors only move when you pulse them, so if you send "enable pin 3 now" and then wait too long to send "disable pin 3 now" because you're thinking about Javascript, the tool should move along the same path, just at a slower speed.
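The time-independence argument above can be demonstrated with a toy model: a stepper's position depends only on the sequence of pulse edges, not on when they arrive, so delays change speed but not path. The type and function names here are invented for illustration, not GRBL's:

```c
/* Toy stepper: a rising edge on the step pin advances one step. */
#include <assert.h>

typedef struct { long position; int last_level; } toy_stepper;

static void apply_level(toy_stepper *m, int level) {
    if (level && !m->last_level) m->position++;
    m->last_level = level;
}

/* Replay a pulse train; delay_ticks stands in for "the computer
 * paused to think about Javascript" between commands. */
long replay(const int *levels, int n, int delay_ticks) {
    (void)delay_ticks; /* timing does not affect the step count */
    toy_stepper m = {0, 0};
    for (int i = 0; i < n; i++) apply_level(&m, levels[i]);
    return m.position;
}
```

The same edge sequence lands on the same position whether the delays are 1 tick or 1000, which is the whole point: lateness degrades feed rate, not geometry.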

Even cases like hitting a limit/homing switch or a tool height switch aren't really real-time sensitive as long as the board reports the event to the computer before processing more steps.

There's one obvious issue here: these delays always make execution slower and never make it faster, so even if the machine is time-independent, accepting these delays means accepting a slower machine.

A less obvious issue may be that paths aren't time-independent if a machine has heavy parts with momentum: an axis will generally take longer to stop if it is moving quickly than if it is moving slowly. If we run the exact same sequence of I/O operations twice as fast, momentum may give us a slightly different result.

We could also perhaps imagine cases like a tapping cycle where we want to run an axis at the same rate as a spindle? But I'm not sure you can tap without a servo/stepper as the spindle. Even stuff like running a constant-feed-rate cycle on a lathe axis shouldn't depend significantly on things being real-time.

There are also some cases where things "obviously" have real-time physics and we have to react to them in real time (like an end-effector catching a ball, or a sandblasting tool that continuously removes material under the toolhead independent of axis motion, or maybe the heating or cooling of a 3D-printed plastic part).

So I think the general issue here is that machines interact to varying degrees with physics. In traditional CNC, this interaction is mostly motor momentum. In other applications, particularly general robotics applications, there might be more interaction. This seems to line up with random people talking about this on the internet. (Ref).

I might be able to ignore this on small machines with light workpieces and tiny steppers: the error from momentum is almost certainly overwhelmed by the error from my plotter collet being made out of pool noodle. However: I suspect a major source of angry stepper motor sounds is not accelerating them properly; and the next limit is harder and informs design directly.

The other limit is bandwidth, I think? A short, simple tool path can easily require 2^16 steps on the I/O pins, and each axis theoretically accepts 30K pulses/second. If we want to drive three axes at full speed, we may need to change I/O state 180,000 times per second. If each instruction is primitive ("enable pin 3 now"), we could perhaps imagine a control language composed of 1-byte instructions.

By default, Arduinos run at 9600 baud, which would limit us to 0.5% of the maximum machine speed. However, you can increase the rate to 115200 in the UI (~6% of maximum speed) and some people on the internet suggest that 1M baud (~50% of maximum speed) works fine. This is still too slow and it's likely not practical to design a useful 1-byte protocol, but maybe a relatively low-level control language isn't entirely ridiculous.
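As a sanity check on those percentages, here's the arithmetic: with 8N1 serial framing a byte costs 10 bits on the wire, and the hypothetical protocol above needs 180,000 one-byte instructions per second for full speed on three axes:

```c
/* Fraction of maximum machine speed achievable at a given baud rate,
 * assuming a (hypothetical) 1-byte-per-state-change protocol. */
#include <assert.h>

double fraction_of_max_speed(double baud) {
    const double bits_per_byte = 10.0;      /* 8 data + start + stop */
    const double needed_per_sec = 180000.0; /* 1-byte instructions */
    return (baud / bits_per_byte) / needed_per_sec;
}
```

This reproduces the numbers in the text: 9600 baud gives about 0.5%, 115200 about 6.4%, and 1M baud about 55% of maximum speed.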

Streaming and Timing

Memory on the Arduino is at a premium. We can't fit a whole G-Code (or whatever) plan into memory, which isn't too much of a shock.

But we often can't even fit a single path into memory. GRBL maintains a ring buffer of 16 planned linear motions, but it looks like it slices arcs into as many as 2,000 linear motions. When the motion planner attempts to insert a 17th linear motion into the queue, it just busy waits.

The actual motion is driven "in another thread" by interrupts: the motion ring buffer is produced by the main execution thread, and consumed by interrupt timers.
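A minimal sketch of that producer/consumer shape: a fixed-size ring of planned motions filled by the main loop and drained by a timer interrupt. The sizes and field names are made up; GRBL's real planner blocks carry much more state (entry/exit speeds, acceleration profiles, and so on):

```c
/* Single-producer/single-consumer motion ring. One slot is kept
 * empty to distinguish "full" from "empty", so a 16-slot ring holds
 * 15 planned motions. */
#include <assert.h>
#include <stdint.h>

#define PLAN_SLOTS 16

typedef struct { int32_t steps_x, steps_y, steps_z; } plan_block;

static plan_block ring[PLAN_SLOTS];
static volatile uint8_t head, tail; /* head: main loop, tail: ISR */

/* Producer: returns 0 if the ring is full (GRBL busy-waits here). */
int plan_push(plan_block b) {
    uint8_t next = (uint8_t)((head + 1) % PLAN_SLOTS);
    if (next == tail) return 0; /* full */
    ring[head] = b;
    head = next;
    return 1;
}

/* Consumer, called from the stepper interrupt. */
int plan_pop(plan_block *out) {
    if (tail == head) return 0; /* empty */
    *out = ring[tail];
    tail = (uint8_t)((tail + 1) % PLAN_SLOTS);
    return 1;
}
```

On an 8-bit AVR this works without locks because each index is written by only one side and fits in a single byte, so reads and writes of it are atomic.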

From comments, it seems like some aspects of linear decomposition (particularly, trig functions) may take ~50us, which is way above the ~10us sensitivity on steppers.

However, I think this design means that you can't (or, at least, can't easily) rewind a path? I'm generally unclear on this, but it seems like there is generally no provision in this control system for operations like "the mill snapped, so: pause, pull out of the work piece, swap the bit, measure the new tool height, rewind the path by one second, resume milling".

This guy (Ref) doesn't even seem to be able to pause the machine to kick down the dust collection sheathing? Maybe he just isn't familiar with the system, or is running it in standalone mode and didn't wire in a pause button? G-Code has a "Feed Hold" command which works like a "Pause", but it generally seems like most implementations are fire-and-forget and don't support modifying the path at runtime to handle things like replacing the bit.

There seem to be essentially no relevant hits for "CNC rewind" on Google. I'm surprised no one is asking about this -- maybe it's not actually necessary in practice?

  • This guy (Ref) does a bit change between sections by splitting the job into two parts and manually re-zeroing to, you know, roughly the same place.
  • This guy (Ref) can start a program partway through, but he's using a trillion dollar industrial CNC machine, and he "usually leaves [this setting] off" because replaying 700,000 lines of program state is slow (???!!!).

Iterators/Functors?

Conceptually, I'd like (?) to structure instructions as iterators that emit lists of motion plans? Then the board could drive multiple motion plans simultaneously and pause/rewind/resume more easily? I'm not sure how practical this is given memory and CPU constraints on the Ardunio.

GRBL is also very hard-coded to a particular pinout: the computer controller can't tell the control board how it is wired at runtime. This seems sort of silly: I'd like to hand the board a software definition of an arbitrary number of axes, spindles, effectors, etc., at runtime and have the control language look more like "move stepper 1 and stepper 2 in a linear path to 100, 200 at rate X", not "move x to 100 and y to 200". This may be difficult to achieve given space constraints.
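For concreteness, here's roughly what "move stepper 1 and stepper 2 in a linear path" decomposes into on the board: a Bresenham-style walk that interleaves step pulses on two axes, which is the technique GRBL's stepper interrupt uses. The axis naming and tick structure here are illustrative only:

```c
/* Walk a straight line of (dx, dy) steps, emitting at most one step
 * per axis per tick of the major axis. Returns final positions. */
#include <assert.h>
#include <stdlib.h>

typedef struct { long a, b; } step_count;

step_count run_line(long dx, long dy) {
    step_count pos = {0, 0};
    long major = labs(dx) > labs(dy) ? labs(dx) : labs(dy);
    long err_a = major / 2, err_b = major / 2;
    for (long i = 0; i < major; i++) { /* one tick per major-axis step */
        err_a += labs(dx);
        if (err_a >= major) { err_a -= major; pos.a += (dx > 0) ? 1 : -1; }
        err_b += labs(dy);
        if (err_b >= major) { err_b -= major; pos.b += (dy > 0) ? 1 : -1; }
    }
    return pos;
}
```

The appeal for a memory-starved board is that this needs only a few integers of state per axis, no trig and no floating point, which is also why arcs get sliced into many small linear segments before they reach this layer.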

Event Timeline

epriestley triaged this task as Wishlist priority. Apr 28 2021, 5:33 PM
epriestley created this task.
  • This is far afield from any application I have today, but it seems plausible to operate a small-scale DIY plastic foundry (Ref) that converts plastic waste into blanks for machining or injecting into machined molds.
    • The cost to just buy premade plastic blanks doesn't seem particularly high (roughly comparable to plywood?) although I know nothing about plastic qualities.
    • Unsurprisingly, it seems like the market for recycled plastic material doesn't have a lot of DIY buyers (unit sizes are often: 1,500 pounds; per metric ton; per 40,000 pound truckload; "*Only Quantities of 10k lbs Plus").
  • There are a handful of people doing extremely high-precision DIY EDM machining (Ref).

Now that I've realized this isn't an April Fools' joke (or if it is, it's kinda late...) -- anyway, I have a whole lot of experience in this department, having built several CNC controllers. My latest one is using a BeagleBone Black, but I also experimented quite a lot with an Arduino running GRBL and another one running on a $30 ESP32 board. If you have any unanswered questions I'd be happy to comment. I should have slightly better than ignorant responses.

The arduino based GRBL systems are essentially using 100% of the hardware capabilities with extreme levels of optimization to just barely work. The upshot is that there is very nearly zero possibility for improvements to grbl without moving to better hardware. Fortunately lots of people have ported it to various other platforms.

  • One of the best and most active ports that I'm aware of is Grbl_Esp32
  • Probably the most popular alternative to grbl is running Mach3 on a windows PC. As I refuse to run windows I cannot comment on this option and definitely cannot recommend it.

Anyway, apologies for butting in on your notes here... just thought I'd offer something that is hopefully more helpful than:

lol

I'm starting with an absolute bottom-of-the-line 3018. I've "upgraded" it with a plotter collet I made out of a pool noodle and a piece of cable gland, so I'm less likely to hurt myself for now:

179844809_10107947618735199_738255650651003118_n.jpg (935×526 px, 71 KB)

So far, I've mostly been implementing controller software, following heavily in the footsteps of GRBL. I think I'm maybe ~25% of the way toward having a similar-ish capability set? But I don't really know what I'm doing, I'm doing a few things a bit differently, I haven't written any meaningful amount of C in many years, and I suspect I may run into a wall with the CPU.

One thing I've changed is that the computer tells the board what hardware it is connected to at startup, e.g. "Linear Stepper Axis 1 is enabled by pin 8, driven by pin 2, and direction is controlled by pin 5", since I'd like to be able to rewire the board without updating the software, and generally have more control over the hardware configuration, and have an arbitrarily large number of axes and actuators and blinking lights and whatnot. My motion commands then reference axis IDs ("2D linear motion on axis 0 and axis 1 to relative position 15, 35") instead of spatial axes like "X" and "Y" directly.
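The startup configuration message described above might look something like this on the board side -- a runtime table mapping axis IDs to pins, instead of a pinout baked into a header file. The struct fields and limits here are invented for illustration:

```c
/* Runtime axis registry: the computer sends pin assignments at
 * startup; motion commands then reference axes by ID. */
#include <assert.h>
#include <stdint.h>

typedef struct {
    uint8_t axis_id;    /* referenced later by motion commands */
    uint8_t enable_pin; /* e.g. 8 */
    uint8_t step_pin;   /* e.g. 2 */
    uint8_t dir_pin;    /* e.g. 5 */
} axis_config;

#define MAX_AXES 8
static axis_config axes[MAX_AXES];
static uint8_t axis_count;

/* Register an axis; returns its ID, or -1 if the table is full. */
int configure_axis(uint8_t enable_pin, uint8_t step_pin, uint8_t dir_pin) {
    if (axis_count >= MAX_AXES) return -1;
    axes[axis_count] =
        (axis_config){axis_count, enable_pin, step_pin, dir_pin};
    return axis_count++;
}
```

A nice side effect of the table being RAM-resident is that rewiring the machine never requires re-flashing, which also sidesteps the flash-wear concern mentioned later in the thread.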

In theory, this means that the tool can (for example) draw true 3-axis splines rather than just 2-axis helical splines. This capability is almost certainly useless and I can't draw splines yet anyway, and I'm not sure if I have enough CPU to decompose a spline into constant-speed segments on the board. It also probably means that I'll need to have explicit stuff later on so the machine knows that it has to pull the tool out on the Z axis before it can move on X and Y, since it doesn't "know" that Z is Z anymore.

It also means that, in theory, the controller can control multiple tool heads simultaneously, e.g. I can put a second unique tool on the X-axis rails with a second lead screw. Both tool heads would always have the same Y position but maybe this would be useful if I have to, uh, drill and tap a long horizontal row of holes? I guess? I don't really have any use cases for this, it just seems sort of neat.

One possible (?) improvement I've made is that I think GRBL uses a "mostly"-linear clock to drive the steppers (e.g., the timer is in compare mode and it does one update every X ticks). I currently have it working with a "callback" clock (the timer is in overflow mode and I only get an interrupt when I next need to do something). I'm not sure why GRBL works the way it does -- probably for a good reason that I'll discover soon.
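The fiddly part of the "callback" clock is that a 16-bit timer counter wraps, so the compare value has to be computed with wraparound-safe arithmetic. This is a host-side model of that arithmetic, not real AVR register code -- unsigned subtraction makes "ticks until target" come out right across the wrap, as long as the delay fits in 16 bits:

```c
/* Wraparound-safe scheduling math for a 16-bit free-running timer. */
#include <assert.h>
#include <stdint.h>

/* Ticks remaining until `target`, valid even after the counter has
 * wrapped past 0xFFFF. */
uint16_t ticks_until(uint16_t now, uint16_t target) {
    return (uint16_t)(target - now);
}

/* Compare value for an event `delay` ticks from now. */
uint16_t schedule_next(uint16_t now, uint16_t delay) {
    return (uint16_t)(now + delay);
}
```

The same identity is why GRBL-style firmware can get away without ever resetting the counter: all comparisons are relative differences in modular arithmetic.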

My next steps are something like this:

  • Get the rest of the linear motion pipeline sort of working.
  • Make arcs work.
  • Make acceleration / deceleration work?
  • Install and support limit/homing switches.
  • Add a hardware emergency stop button in case the machine goes crazy and tries to draw on something important with that sharpie.
  • Figure out how to do PWM now that I'm doing the timers manually.
  • Reconnect the spindle.
  • Try to cut some materials?

In theory, I then have a CNC machine that can cut parts to upgrade itself, which means I effectively have an unlimited number of infinitely large CNC machines.

I'm not sure I'll get that far and a more likely outcome is probably something like "my knock-off GRBL is way too slow to do anything useful", but I don't really have room for infinitely large CNC machines anyway.

One of the best and most active ports that I'm aware of is Grbl_Esp32

Interesting -- looks like they made some of the changes I made, too:

  • "Control up to 6 coordinated axes (XYZABC)."
  • "Motor drivers can be dynamically assigned to axes, so a 4 motor XYZA controller could be converted to a XYYZ (dual motor Y axis) without any hardware changes."

..so perhaps I'm doomed on the current board, but it looks like I can probably port to ESP32 fairly easily, too, as a possible approach once I hit the wall.

Probably the most popular alternative to grbl is running Mach3 on a windows PC. As I refuse to run windows I cannot comment on this option and definitely cannot recommend it.

Yeah, seeing the Mach3 UI and the Haas UI on YouTube are part of what made me interested in understanding the underlying problems better. Haas has a very well made YouTube series of tips on how to use their very serious, very expensive industrial machines and it's all like "manually type this special Haas extended G-Code control, G173, into your program, and then adjust setting 92, to solve this kind of problem". This is apparently the state of the art and Haas controls all the hardware and software in that chain!


One thing I've changed is that the computer tells the board what hardware it is connected to at startup, e.g. "Linear Stepper Axis 1 is enabled by pin 8, driven by pin 2, and direction is controlled by pin 5", since I'd like to be able to rewire the board without updating the software, and generally have more control over the hardware configuration, and have an arbitrarily large number of axes and actuators and blinking lights and whatnot. My motion commands then reference axis IDs ("2D linear motion on axis 0 and axis 1 to relative position 15, 35") instead of spatial axes like "X" and "Y" directly.

Dynamic configuration does seem pretty useful. Hard coding everything in C header files and re-flashing for each change is not the most friendly configuration interface. It's also not good for the longevity of wear-limited flash memory on the microcontroller.

One possible (?) improvement I've made is that I think GRBL uses a "mostly"-linear clock to drive the steppers (e.g., the timer is in compare mode and it does one update every X ticks). I currently have it working with a "callback" clock (the timer is in overflow mode and I only get an interrupt when I next need to do something). I'm not sure why GRBL works the way it does -- probably for a good reason that I'll discover soon.

It may be doing it that way to avoid the possibility for missed timer overflows? Somewhat interesting explanation of that possibility in this stack exchange question and answer.

..so perhaps I'm doomed on the current board, but it looks like I can probably port to ESP32 fairly easily, too, as a possible approach once I hit the wall.

You can always upgrade to an Arduino Mega, but the interesting thing grbl_esp32 does is use special hardware features on the ESP32 to generate the step pulses. The ESP32 has a pulse generator that's intended for driving an IR remote control emitter. The API for the pulse generator takes a series of bytes that specify a pin state and duration, and you can queue up operations which get faithfully executed while the CPU is free to handle other things. If you're interested, the ESP32 RMT driver is pretty well documented. I haven't looked at that code in a long time, but it seemed just about ideal for running stepper motors, and grbl_esp32 makes good use of that hardware feature. The ESP32 also has nice stuff like Bluetooth, WiFi, and a lot more RAM than an 8-bit ATmega (though still quite limited).

Probably the most popular alternative to grbl is running Mach3 on a windows PC. As I refuse to run windows I cannot comment on this option and definitely cannot recommend it.

Yeah, seeing the Mach3 UI and the Haas UI on YouTube are part of what made me interested in understanding the underlying problems better. Haas has a very well made YouTube series of tips on how to use their very serious, very expensive industrial machines and it's all like "manually type this special Haas extended G-Code control, G173, into your program, and then adjust setting 92, to solve this kind of problem". This is apparently the state of the art and Haas controls all the hardware and software in that chain!

It's amazing what passes for a UI in industrial machines. I suspect there isn't any motivation to improve because the operator is often not the person paying the invoice, and the sales people work on commission + kickback, so it's all about maintaining relationships and long-term maintenance contracts rather than improving the product.

It may be doing it that way to avoid the possibility for missed timer overflows?

Ah, interesting. I think the particular issue with this "capacitor stopwatch" program doesn't directly impact either motion driver design, but the underlying "your interrupt may run arbitrarily later than the actual event which triggers it occurs" issue might.

I am currently fudging some of the timing here and hoping that I'll just get a small cumulative error (i.e., the whole path executes 0.01% slower than it's supposed to), but I'm not sure how large the error actually is and haven't really tried to measure or estimate it. It's possible that it's significant and that I won't be able to correct for it. One advantage of the "regular interval" design is probably that it's immune to cumulative error.
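One standard trick for keeping interrupt latency from accumulating (a sketch with invented numbers, not a claim about GRBL or this project's code): schedule each step relative to the previous deadline rather than relative to "now", so per-interrupt lateness doesn't compound:

```c
/* Compare cumulative drift of two scheduling policies when every
 * interrupt fires `latency` ticks late. */
#include <assert.h>

/* Returns how far the last deadline drifted from the ideal schedule
 * after n steps at a fixed interval. */
long drift_after(long n, long interval, long latency, int from_deadline) {
    long deadline = 0, now = 0, ideal = 0;
    for (long i = 0; i < n; i++) {
        ideal += interval;
        now = deadline + latency; /* the ISR actually runs late */
        deadline = (from_deadline ? deadline : now) + interval;
    }
    (void)now;
    return deadline - ideal;
}
```

Scheduling from the deadline keeps the drift at zero no matter how many steps run; scheduling from "now" accumulates the full latency on every step, which is the cumulative-error failure mode described above.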

I should probably just set up a blog or something since I don't expect to bring any microcontroller components upstream, but until then:

A bunch of woodworking issues came up interrupting this, and then a bunch of plumbing issues came up interrupting that. I did manage to solder an emergency stop button to two barrel connectors and stick them in the power supply, so I should be able to avoid any major accidents where the router goes crazy with the sharpie.


I've learned that if you have a piece of male pipe which you measure to have an OD of 0.54 inches, that's definitely 1/4" pipe per helpful sizing charts like this. Unless they're compression threads, in which case they're 3/8" threads. You can tell the difference by buying the wrong fitting several times.

Home Depot has a pipe thread sizing gauge hanging next to the bins of pipe thread adapters. It has all the common sizes of male and female thread for several thread types (NPT, compression, flare?), so you can screw your fitting onto or into the right slot and be sure it fits. It looks a lot like this:

81ltVzJUTDL._AC_SL1500_.jpg (1×431 px, 141 KB)

...except it has pipe threads instead of bolt threads. But I can't bring an actively-in-use shutoff valve fitting into Home Depot without turning off the water to my house, and I can't find this sizing chart as a product anywhere online. A couple of suppliers do sell a "necklace" style gauge for NPT:

41B4MSbv8FL._AC_.jpg (426×500 px, 22 KB)

But these cost like $50 -- I'm pretty sure I could build one more cheaply by going to Home Depot and buying one of every size fitting? -- and none of them have compression threads.

I sort of suspect everyone doing this more than once just figures out that a supply line to a dishwasher or sink or toilet is either 3/8" compression or 1/2" compression 99% of the time, and if the two bits don't fit together you need the other one. But hardware stores sell a lot more compression fitting sizes -- are these just decoy fittings to trap the unaware? If not, why are there no compression thread gauge tools?


Among other things, I'm replacing two pedestal-style vanities that were attached to the wall with great gobs of silicone sealant (the wall is tiled). They look like this:

images.jpeg (196×196 px, 2 KB)

These vanities are a bit dangerous for particular babies, since it's easy to stand up into the overhanging part and hit your head if you're the kind of baby that likes doing that sort of thing. I removed them by trying these things:

  • Removing the silicone with a putty knife, but this couldn't reach most of it.
  • Cutting through the silicone with a flush trim saw and a flexible Japanese-style pull saw, but the blade kerf was too thick to make much progress and got pinched between the vanity and the tile.
  • Cutting through the silicone with a wire/cable saw, but it didn't make much progress and seemed to be damaging the tile and vanity.
  • Dissolving the silicone with a solvent. The one I had on hand has 4.3 stars on Amazon but only seems to make sealant wet and sort of smell bad, with no apparent effect on its adhesive properties.
  • Blasting the back of the sink with a heat gun while applying a gentle convincing force to the sink basin. This sort of worked.

I got one off with no damage to the tiles. The other one took four tiles with it.

Since the damaged area will be completely hidden by the new vanity, I bought some similar-looking tile without worrying too much about getting an exact match. After applying mortar to the wall and the back of the tile and setting the first tile in place, I learned that some tile is manufactured so you don't have to include any spacers for grout, like this Daltile 3x6 subway tile:

bright-white-daltile-ceramic-tile-re1536modhd1p4-1d_1000.jpg (1×1 px, 6 KB)

Specially designed with lugs to create a grout joint, no spacers needed.

Since I expected that I'd need to add grout spacers, I didn't painstakingly clean up every bit of the old grout first, so the fit was very tight and not very flat. After grouting, the repaired tile might escape a casual glance if the viewer wasn't looking carefully and it was completely hidden behind a vanity, so that's good enough for me.


I saw this clock (LaMetric Wi-Fi Clock) in a plumbing YouTube video:

816C5vTvKiL._AC_SL1500_.jpg (1×1 px, 101 KB)

I end up checking the time a lot these days so I can keep track of baby nap times (shhh!) and bath times -- and I'm spending less time in front of a computer, so I don't always have a clock on-screen. It's also inconvenient to pull out my phone if my hands are full of tools or babies or whatever, so I have digital clocks in most of the rooms I spend a lot of time in. I'm currently using these ("DreamSky Compact Digital Alarm Clock"):

61-31FeRmGL._AC_SL1500_.jpg (998×1 px, 67 KB)

...which I think are simple and clean looking, but I like the look of the LaMetric a lot. But it's like $200 (??!!!) and I don't want any of the "smart" nonsense, so I was like "It's just a clock! Why does it cost $200?! I can probably build that, given $400-600 in materials and 12-18 months of work!".

I suspect it's $200 because it's actually a lot of work to make it look that nice, but I bought this WS2812B matrix as a starting point:

71JjdUjfZjL._AC_SL1100_.jpg (772×924 px, 126 KB)

As far as I can tell, you control these by sending a sequence of 24-bit frames to the data pin. Each LED strips the first frame off, sets its color according to the value the frame represents, and passes the rest of the frames to the next LED.
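That strip-and-forward framing can be modeled in a few lines. This only simulates the framing described above; real WS2812Bs care about sub-microsecond pulse timing that a host-side model can't capture, and the names here are invented:

```c
/* Toy model of the WS2812B daisy chain: each LED latches the first
 * 24-bit GRB frame and forwards the remainder to its neighbor. */
#include <assert.h>
#include <stdint.h>

#define NUM_LEDS 4

static uint32_t led_color[NUM_LEDS]; /* latched 24-bit GRB values */

/* LED `led` receives n frames: keep frames[0], pass the rest on. */
void led_receive(int led, const uint32_t *frames, int n) {
    if (led >= NUM_LEDS || n == 0) return;
    led_color[led] = frames[0] & 0xFFFFFF; /* strip-and-forward */
    led_receive(led + 1, frames + 1, n - 1);
}
```

A consequence of this scheme is that there's no addressing at all: to change one LED you retransmit the whole prefix of the chain up to it, which is why the libraries all keep a full framebuffer in RAM.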

There are a bunch of libraries to help with this that I obviously won't be using because I didn't write them, but I was looking at one and found an open issue along these lines (paraphrased):

Please consider rewriting this library in a more "Arduino" style, e.g. without #define and with more globals. If you don't, I won't use this library and will write my own.

Somehow, I feel quite sure I won't miss open source.


GitHub Copilot: I don't really understand how people who have enough mastery over programming to build this would think it's a useful thing to build. Perhaps GitHub (market value: ~$10B) just understands the market better than I (market value: pocket full of leftover NPT fittings) do.

There are a bunch of libraries to help with this that I obviously won't be using because I didn't write them...

I didn't ultimately have to link against anything, but I did shamelessly copy/paste the 30 lines of assembly required for the signal timing transmission loop because I am nowhere near that much of a wizard.
