
How to automatically locate a model from probe points

Wednesday, August 19th, 2015 at 6:44 pm

The limitations of the scipy.optimize.minimize() function have now become apparent. They should have called it localminimize(), since all it does is work outward from an “initial guess”.

This follows on from the mess I made out of using this same function to calculate the circumcircle of a triangle.

Here I begin with an STL file of a widget, which was then probed from 20 random directions to a distance (ball radius) of 5mm.

[Image: widgetprobe2, the probed widget with the closest-point vectors drawn in red]

This was done using my barmesh library, which I ought to start getting back into as I haven’t touched it since I got distracted by all this Arduino electronics.

The barmesh code itself was impenetrable when I looked at it recently, but using its features is still possible.

For example, to find the point along a line segment between probefrompoint and probetopoint that is exactly proberad units from the stl model, use the function Cutpos() in the following way.

# imports from the barmesh library (module paths assumed from its package layout)
from barmesh import mainfunctions, implicitareaballoffset
from barmesh.basicgeo import P3

tbm = mainfunctions.TBMfromSTL("stlmodel.stl")  # Triangle Barmesh loaded from the STL file
ibo = implicitareaballoffset.ImplicitAreaBallOffset(tbm)
probefrompoint, probetopoint = P3(100,100,100), P3(0,0,0)
proberad = 5.0
probevector = probetopoint - probefrompoint
# lam parametrizes the segment: the point probefrompoint + probevector*lam
# is exactly proberad from the model
lam = ibo.Cutpos(probefrompoint, probevector, None, proberad)
assert lam != 2, "probe vector failed to intersect"  # Cutpos() signals a miss with the out-of-range value 2
probepoint = probefrompoint + probevector*lam

You can then use DistP() to verify that the points chosen are indeed proberad units from the model. The PointZone object also carries a vector pointing to the closest point; these vectors are drawn in red in the image above.

pz = PointZone(0, proberad+1, None)  # object to manage searching; PointZone is also from barmesh, and proberad+1 bounds the search distance
ibo.DistP(pz, probepoint)            # fills in pz.r, the distance from probepoint to the model
print("Distance from point to stl model is", pz.r, "which is close to", proberad)

It all runs pretty slowly because it’s in Python and I’ve not paid much attention to the speed. The structures are prepared for a more efficient implementation, but it won’t be done until there’s a good reason to finish it.

Statement of problem
Suppose we bolt down the STL part on the bed of a coordinate measuring machine in an unknown orientation at an unknown position. The unknowns take the form of 6 numbers, 3 for the rotation and 3 for the translation, which we encode into the vector X.

The coordinate measuring machine probes the shape at N different places with a spherical probe of radius proberad. This produces a list probepoints of N points all of which are the distance proberad from the model transform(stlmodel, X).

Modulo any symmetry in the stlmodel, there is only one solution X satisfying:
sum( (distance(transform(stlmodel, X), p) - proberad)**2  for p in probepoints ) == 0

We should therefore be able to solve for X, and thus find the exact orientation of the model in the machine from the probing data. We would be able to use this knowledge to drive any hole drilling or edge trimming adjustments to the part.

As said, I tried to use the scipy.optimize.minimize() function to find the 6-dimensional vector X, and it failed completely unless the “initial guess” was close to the right answer.
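For concreteness, here is a minimal sketch of what that 6-dimensional objective could look like. The encoding of the rotation (three axis angles applied in sequence) and the trick of applying the inverse transform to the probe points instead of transforming the model are my assumptions; the post doesn’t spell them out.

import math
import scipy.optimize

def fun6(x):
    # x = (ax, ay, az, tx, ty, tz): assumed encoding of three rotation angles plus a translation.
    # Rather than transforming the model, apply the equivalent inverse rigid
    # transform to the probe points, which is much cheaper.
    def rotz(p, th):
        st, ct = math.sin(th), math.cos(th)
        return P3(p.x*ct + p.y*st, p.y*ct - p.x*st, p.z)
    def roty(p, th):
        st, ct = math.sin(th), math.cos(th)
        return P3(p.x*ct - p.z*st, p.y, p.z*ct + p.x*st)
    def rotx(p, th):
        st, ct = math.sin(th), math.cos(th)
        return P3(p.x, p.y*ct + p.z*st, p.z*ct - p.y*st)
    v = P3(x[3], x[4], x[5])
    ds = [ MeasureDistanceToModel(rotx(roty(rotz(p, x[2]), x[1]), x[0]) + v)  for p in probepoints ]
    return sum((d - proberad)**2  for d in ds)

g = scipy.optimize.minimize(fun=fun6, x0=[0,0,0,0,0,0])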

So I simplified the problem by locking down the rotation component leaving only the translation vector.

This usually gets to a solution, but it takes ages because of the sheer number of slow evaluations. I’m just doing it like this:

def fun2(x):
    # x is the candidate translation; score how far each shifted probe point
    # is from being exactly proberad from the model
    v = P3(x[0], x[1], x[2])
    ds = [ MeasureDistanceToModel(p + v)  for p in probepoints ]
    return sum((d-proberad)**2  for d in ds)
g = scipy.optimize.minimize(fun=fun2, x0=[0,0,0])

You can simulate moving the model to different places by setting different initial values of x0. It usually gets to the right answer, because points that are far away from the surface apply a disproportionate pull due to the square sum, while points inside the model that are attracted to the wrong surface make only a small difference.

There should be a very much faster way to iterate down to the correct translation by using the vectors pointing to the closest approach to direct the pull. The key is that you know you’ve got the result when g.fun approaches zero.
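Sketching that idea, and assuming the PointZone object exposes its closest-point vector as an attribute (called v here; the real attribute name in barmesh may differ):

def IterateTranslation(v, nsteps=25):
    # move the candidate translation v along the average of the closest-point
    # pull vectors until the probe points sit proberad from the surface
    for i in range(nsteps):
        dv = P3(0, 0, 0)
        for p in probepoints:
            pz = PointZone(0, proberad + 1, None)
            ibo.DistP(pz, p + v)
            if pz.r > 0:
                # pz.v: assumed attribute holding the vector to the closest point;
                # take the unit direction scaled by the distance error
                dv = dv + pz.v * ((pz.r - proberad) / pz.r)
        v = v + dv * (1.0 / len(probepoints))
    return v

Each step costs one distance evaluation per probe point, instead of the many evaluations the general-purpose minimizer burns per iteration.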

What if we allow for one degree of freedom in the rotation about the z-axis? This could be useful if you know you’ve clamped the part down onto the flat bed, but you don’t exactly know where or how it is aligned.

I’ve simulated this in the following way:

import math
import scipy.optimize

def FindClosestMatchRotDistance(ang):
    # rotate the probe points by -ang degrees about the z-axis, then find the
    # best-fit translation at that orientation; the residual measures the match
    theta = math.radians(ang)
    st, ct = math.sin(theta), math.cos(theta)
    rprobepoints = [ P3(p.x*ct+p.y*st, p.y*ct-p.x*st, p.z)  for p in probepoints ]
    def fun2(x):
        v = P3(x[0], x[1], x[2])
        ds = [ MeasureDistanceToModel(p + v)  for p in rprobepoints ]
        return sum((d-proberad)**2  for d in ds)
    g2 = scipy.optimize.minimize(fun2, [0,0,0], method='Powell')
    return float(g2.fun)

As you would expect, FindClosestMatchRotDistance(ang) is approximately equal to zero when ang=0.

Here’s what happens when I graph this against different values of ang:

[Image: widgetproberot, a graph of FindClosestMatchRotDistance() against ang from 0 to 360 degrees]
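The sweep behind the graph is just something like this (the 5 degree sampling step is my guess):

angs = list(range(0, 361, 5))
rds = [ FindClosestMatchRotDistance(ang)  for ang in angs ]
# plot rds against angs, e.g.:
#   import matplotlib.pyplot as plt
#   plt.plot(angs, rds); plt.show()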

As hoped, the only places where the answer goes to zero are at ang=0 and 360. Any starting point within ±30 degrees is likely to fall into this correct minimum if we applied scipy.optimize.minimize().

But there is a huge local minimum around the ang=180 mark, which is easy to predict from the symmetry of the example and the number of probe points on the hump in the middle.

There are a couple of other minima at around the 90 degree and 270 degree marks, because a rectangular block will fit itself at these orientations slightly better than at one of the cockeyed 45 degree orientations, especially when the two sides of the rectangle are similar in length.

(Don’t worry that the graph is not symmetric about the ang=0 line in spite of the original model being a perfect mirror image of itself. This is because the set of probepoints is random and not symmetric.)

How to solve this problem?

There are two ways.

Either you get the user to estimate the orientation of the part to within 30 degrees, or we basically run the minimize() function at 30 degree rotation intervals until it finds a minimum that’s actually zero. That is, we scan the solution space at an interval narrower than the valley surrounding the correct answer, in the expectation that one of the samples lands in that valley, as sketched below.
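A minimal sketch of that scan, reusing FindClosestMatchRotDistance() from above; the zero tolerance of 0.01 is illustrative:

import scipy.optimize

# coarse scan at a step narrower than the valley around the true answer
coarse = [ (FindClosestMatchRotDistance(ang), ang)  for ang in range(0, 360, 30) ]
r0, ang0 = min(coarse)

# polish the best candidate within its 30 degree bracket
g = scipy.optimize.minimize_scalar(FindClosestMatchRotDistance,
                                   bounds=(ang0 - 30, ang0 + 30), method='bounded')
assert g.fun < 0.01, "no zero minimum found in the scan"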

All of this can be highly optimized, parallelized and whatever. I would not expect the finished product to depend on the general-purpose scipy.optimize.minimize() function, because we can pool the evaluations of FindClosestMatchRotDistance() and terminate early once we have determined that whatever minima lie in a particular search range cannot be zero.

Also, you can operate on subsets of the probe samples to discount no-hope search zones quickly and very early on. Since every squared term is non-negative, the best-fit residual over a subset is a lower bound on the residual over the full set, so a subset that fails to reach zero rules out the whole zone.

For example, if we repeat the above analysis for just the first 3 probe points, everything computes faster and we get the graph below, which rules out about 35% of the solution space straight away.

[Image: widgetproberot2, the same graph computed from only the first 3 probe points]
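A sketch of that pre-filter: generalize FindClosestMatchRotDistance() to take an explicit point list, and keep only the angles whose 3-point lower bound could still reach zero (the threshold and step values are illustrative):

import math
import scipy.optimize

def FindRotDistanceForPoints(ang, pts):
    # FindClosestMatchRotDistance() generalized to an explicit probe point list
    theta = math.radians(ang)
    st, ct = math.sin(theta), math.cos(theta)
    rpts = [ P3(p.x*ct+p.y*st, p.y*ct-p.x*st, p.z)  for p in pts ]
    def fun2(x):
        v = P3(x[0], x[1], x[2])
        return sum((MeasureDistanceToModel(p + v) - proberad)**2  for p in rpts)
    return float(scipy.optimize.minimize(fun2, [0,0,0], method='Powell').fun)

# angles whose 3-point residual is already well above zero can never reach
# zero on the full point set, so they are ruled out for good
live = [ ang for ang in range(0, 360, 10)
           if FindRotDistanceForPoints(ang, probepoints[:3]) < 0.5 ]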

The next trick is to find out if anyone wants this geometric function and whether there is a way of supplying it were it to be written.

I would hope to distribute it self-contained and on the end of a pipe like we did with our Adaptive Clearing processes for parallelism, and have the geometry and points serialized in, and the results serialized back out so that nobody has to be concerned that it’s implemented in Python or in some crazy hybrid Cythonized derivation.

The full-on solution would be to hand live control of the coordinate measuring machine to this application so that it can start its computation as the probe positions are received, and recommend further probe positions based on what would make the biggest difference to the solution space.

It would be a shame if, due to the lack of systems thinking in the industry forced by the requirements of the suppliers to make profits from limited general-purpose software, combined with the absence of sophisticated demands by the customers, we are left with a solution that depends on sampling with the probe on a plane grid array and then batch processing the values elsewhere.

Even then it’s still poking the air blindly and it ought to be receiving still images of the part so that it can estimate its silhouette and instantly narrow down the search volume to what matters.

Paint the clamps bright green. The computer must know what shapes they are. When you finally see that happening in the industrial world you’ll know that they’ve finally gotten around to using visual object recognition technology, 20 years after digital imaging hardware became undeniably available.
