I recently posted about my new stepper motor controller. I was unhappy with the implementation used in the RepRap project, and what started as a code cleanup turned into a near-complete rewrite. I’m fairly happy with the code now and have finished my preliminary testing. My original version of the rewritten code turned out to be a variant of Bresenham’s algorithm (though I only discovered this after I had written the code). The algorithm determines the axis that requires the most travel, steps that axis, then adds the required travel for each of the other axes to a per-axis error term. When an error term reaches or exceeds one, that axis is stepped and one is subtracted from its error term. The result is stepping synchronized to the major axis’s timebase. Looking at the output of the stepper motor driver, this pattern emerged.
Here the G-code command was “G1 X3 Y2 Z1” — and indeed the ratio of steps is 3:2:1, but notice that the Y steps are forced to align with the X steps (the master axis, because it moves the furthest). Because 3 is not evenly divisible by 2, Y skips every third step slot. This uneven motion is bad for the stepper motor, and it can cause all sorts of problems.
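To make the clustering concrete, here is a minimal sketch of the error-term approach described above (this is my own illustration using an integer error term, not the actual RepRap code), stepping a 3:2 move with X as the master axis:

```python
# Illustrative error-term (Bresenham-style) stepping for a 3:2 move.
# X is the master axis; Y accumulates fractional travel per X step.
# Integer arithmetic is used here to keep the sketch exact.
x_steps, y_steps = 3, 2
error = 0
pattern = []
for x in range(1, x_steps + 1):
    error += y_steps            # Y travel accumulated, in units of 1/x_steps
    if error >= x_steps:        # error term has reached one full Y step
        error -= x_steps
        pattern.append((x, 'XY'))   # both axes step together
    else:
        pattern.append((x, 'X'))    # Y skips this step slot
print(pattern)  # → [(1, 'X'), (2, 'XY'), (3, 'XY')]
```

Note how the two Y steps bunch up on X steps 2 and 3, with a gap at step 1 — exactly the uneven spacing visible on the driver output.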
It is *extremely* important that the steps be evenly spaced, unless you’ll be content with slow, unreliable operation. Jitter in the step pulse train will limit your top speed, as it *will* cause the stepper to stall. — Ray L.
So I spent some time thinking about it, and came to the conclusion that Bresenham’s algorithm only makes sense when you are dealing with pixelated output (its main purpose is fast rendering of lines on computer screens). I realized that what I needed was a time-based approach, and came up with something. I still need to do a literature search to see if someone has come up with it already (it would be surprising if no one has), but this is basically what I came up with:
1. Determine the number of steps per axis.
2. Determine the distance of travel for the command and, based on the feedrate, compute the duration of the command.
3. Compute the time between steps (TBS) for each axis (duration / steps).
4. Start the main loop.
5. If the time elapsed into the current slice equals 1/2 the TBS for a particular axis, step that axis.
6. Wait 1/2 TBS, then go to step 5.
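As a worked example of the per-axis TBS computation (the duration and step counts below are invented for illustration, not taken from a real machine):

```python
# Hypothetical move: numbers chosen only to illustrate the TBS computation.
duration_us = 100_000                       # total command duration, microseconds
steps = {'X': 300, 'Y': 200, 'Z': 100}      # steps per axis (a 3:2:1 move)

# Time between steps (TBS) for each axis: duration / steps.
# Each axis gets its own evenly spaced timebase, independent of the others.
tbs = {axis: duration_us / n for axis, n in steps.items()}
print(tbs)  # → {'X': 333.33..., 'Y': 500.0, 'Z': 1000.0}
```

Because each axis is scheduled against the common duration rather than against the master axis’s steps, no axis is forced to fire on another axis’s grid.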
I prototyped this in Python, and came up with:
maxtime = 100.0                 # duration of motion in microseconds
steps = 41.0                    # number of Y steps
timePERstep = maxtime / steps   # time per step (TBS)
y = 0
stepped = 0
oldTimeIntoSlice = 0
for time in range(0, int(maxtime)):
    timeIntoSlice = time % timePERstep
    if timeIntoSlice < oldTimeIntoSlice:
        # rolled over into a new time slice; allow another step
        stepped = 0
    oldTimeIntoSlice = timeIntoSlice
    if timeIntoSlice >= 0.5 * timePERstep and stepped == 0:
        # step at the midpoint of the slice
        stepped = 1
        y = y + 1
    print('%d,%d' % (time, y))
This generated the following stepping profile. Notice that the steps are pretty evenly spread out.
After rewriting the Arduino code, I tried the same test (with G1 X1 Y2 Z3 this time) and got much better results.
Notice how everything is nicely spread out. Looking at the timestamps, the maximum step jitter is about 150 µs. The next step is to hook this up to some stepper motors and see what it does.
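For anyone who wants to quantify jitter from their own driver output, one simple measure is the spread of the intervals between consecutive step timestamps. A quick sketch (the timestamp values below are made up for illustration):

```python
# Estimate step-pulse jitter from a list of logged step timestamps (µs).
# These timestamps are invented for illustration only.
timestamps = [0, 2440, 4880, 7310, 9760]

# Intervals between consecutive steps.
intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]

# Jitter: spread between the longest and shortest interval.
jitter = max(intervals) - min(intervals)
print(intervals, jitter)  # → [2440, 2440, 2430, 2450] 20
```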
I’ve also released the code for the Arduino here in case it might be of help to anyone. I’m thinking of making a daughter board for the Arduino with the stepper controller ICs on it, in order to simplify deployment for others — if there is interest.