Freesteel Blog » 2006 » April
Tuesday, April 25th, 2006 at 7:17 pm - Machining
We’ve got our hack of a brochure more or less done. The word “brochure” might be an over-the-top description of this single sheet of glossy paper. The PDF version is here. Some of its material may find its way onto our web-page.
The amount of effort this stuff takes leaves me in awe of the work that must go into producing all the heaps of printed junk mail I get sent. Maybe there are tricks of the trade. The effort it takes to make even the most basic 3D diagram look good and be informative is staggering.
Admittedly, we aren’t using any proper software. Just Open Office to stick it together. The main 3D picture was made by writing Python scripts for VTK, and the rest of the images are simply screenshots from Cimco Inspect, which allows you to do everything except switch off those nasty little axis arrows on the origin.
I think we should run off about 50 down at the stationer’s.
Tuesday, April 25th, 2006 at 5:47 pm - Machining
We went to the Liverpool Inventor’s Club in the Central Library last night with a talk by John Lambert. Oh my. I kept my mouth zipped shut. It’s all very well explaining the patenting process in terms of the opportunity for the private citizen to dream up an invention, keep it secret, take out a patent, avoid the scam artists and patent sharks, and somehow negotiate with someone who has the money for a small cut of the profits should they back it as a product. It might be fine to warn us that only 2% of the patents ever become actual inventions, and that an even smaller fraction ever make money, but that’s not the whole story.
The real story is that almost all patents are taken out by corporations, not private individuals who are inventors, and any presentation that leads people to believe otherwise should be seen as a malicious act of deception. There is a strong case that the patent system in its present form is responsible for such an exceptional amount of damage to the public that it is the duty of everyone involved to raise awareness of it, rather than promulgating highly selective fantasies about how a little person managed to make a fortune out of the system against all the odds stacked against him by the system. It’s as unethical as plastering the town with the picture of the one grinning gambler who won fifty thousand pounds down at the casino last year.
Most people are going to be ripped off massively in their lifetime by just the cost of patented drugs to keep them alive, beyond any patent royalties they could ever hope to earn. The scandal behind this business, and the lies that are told to justify the status quo are quite breathtaking. It’s a disgrace that we let them get away with it, rather than prosecute them for fraud.
The whole concept of Intellectual Property is a nest of thorns. One pattern is clear: those who are keenest on it tend to be the least creative and least productive in terms of generating this “property”. This includes has-been musicians, who of course aren’t speaking out for themselves, but are speaking up for “other lesser-known singers [who] would lose out on what in some cases was effectively their pension”… if they hadn’t already sold all their recording rights to a record company back in 1959, as they tended to do, on condition of getting their single released. We don’t expect dear Cliff to know about the well-worn dirty tricks in the record industry. We also don’t expect him to be aware that when he compares copyrights of recordings to copyrights of writing, the terms of the latter are already outrageous. It’s about the continual theft from the public of what should by now be our cultural heritage. The extremist view would like to see Shakespeare’s plays still under copyright. A real genius, one who was creative in their art each and every day, would be generous with their work, knowing that there is more where that came from. For this attitude we should respect mathematicians more highly than mere artists, since they don’t pretend that their work is private property; it is given to all at the moment of invention.
Whatever. Your mind can drift during these lectures. Ostensibly we were there to ask about software patent indemnities in an OEM contract that didn’t look quite right. Apparently you can get insurance for this sort of thing. However, I don’t think our case fits into the standard mold enough to find someone who can give us a quote. But I may be surprised yet.
Meanwhile, back to some debugging of my pencil milling algorithm which I have been studiously avoiding for days.
We’ve had several people asking for the exe download of our algorithm after I rashly noted that I’d send them out to anyone who asked “nicely”. Let me rephrase this in the context so far: “nice” means you have complimented us on the lovely web interface of our algorithm, which we have put a lot of effort into. And the most effective way to compliment it is to use it… or tell us specifically what we need to change to make it more usable, wait for us to do it, and have another try.
The exe version isn’t a whole lot better, except for the animation, and we can’t go handing out too many copies of it without getting anything back (such as a specific offer of rigorous cutting trials). The contract I have seen so far for selling this algorithm to a CAM company for money has a “best price clause” — we refund them the difference if we sell it cheaper to anyone else — and I need to keep my options open.
One of the serious flaws in the interface is the post-processor, obviously. Someone has sent us a load of necessary corrections, and Martin will work out some effective way of handling them properly. If any other user finds any issues, we need to be told.
A little hint about helping with posts — we don’t need to know reasons. Just say what text needs to be converted into what, and we’ll deal with it. When I joined NCGraphics in 1992 I was put onto doing post-processors on the first day, and was having to do support calls within the first week. The system they had was so bad (the posts were written in FORTRAN) that I designed a universal post-processor after the first month, which is still used today. It wasn’t until after six months that I even saw a machine tool. All I had to get from a user was that they wanted the “X” letter and the “Z” letter swapped around, or the decimal points taken out, and I could do it; but they could spend twenty minutes explaining their machine, which I knew nothing about, before we got to this point. It’s like when you are in a city you don’t know, but you have a map, and you ask where the train station is: the road name is enough, and a full explanation of how to get there from here is going to be forgotten. But once you get people started you have to let them run on until they reach that critical piece of information at the end of the tape loop.
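To illustrate the “just say what text needs to be converted into what” idea, here is a minimal rule-table sketch in Python. It is nothing like the real universal post-processor; the two rules shown, swapping the X and Z letters and stripping decimal points, are just the examples from the paragraph above, and the function names are mine:

```python
def make_post(rules):
    """Build a line-by-line post-processor from (from_text, to_text) rules.
    Rules are applied in order; a placeholder character keeps a letter
    swap from clobbering itself."""
    def post(line):
        for a, b in rules:
            line = line.replace(a, b)
        return line
    return post

# Invented example rules: swap the X and Z letters; drop decimal points.
swap_xz = make_post([("X", "\0"), ("Z", "X"), ("\0", "Z")])
strip_points = make_post([(".", "")])

print(swap_xz("G1 X12.5 Z-3.0"))       # -> G1 Z12.5 X-3.0
print(strip_points("G1 X12.5 Z-3.0"))  # -> G1 X125 Z-30
```

The point of the placeholder in the swap rule is the usual one: replacing X with Z first and then Z with X would turn every letter into X.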
Tuesday, April 18th, 2006 at 11:01 am - Weekends
Just come back from this Science Fiction convention in Glasgow. Four days of beer, listening to talks, and sleeping in the middle of a little patch of trees in the carpark by the river (I am an extraordinary cheapskate when it comes to hotels). I think I enjoyed it, though I can’t remember much specifically. It’ll come back. Maybe what was most enjoyable was being unable to do anything with a computer for such an extended period. It gave my field of vision time to recover. At a 70 quid connection fee in the hotel, no one there was online.
I did see five other guys I knew from university 17 years ago, when we were all in the SF society and met every week. I barely had a conversation with most of them. I am regularly surprised by what a boring person I can be in certain social situations. Ask about work: it’s always programming computers, for all of us. One often has to struggle after that. Two now work on stock exchange trading/life insurance software. Given that I believe the entire financial “services” sector is a con-trick from start to finish, it can be like meeting a nice RAF man and trying not to ask how many children he’s bombed to defend British interests. Oh dear.
Science Fiction is in some ways an extraordinarily conservative genre. Its usual performance is to take some huge lie, like the idea that there are time travel machines, gods, parallel universes, thinking computers, an infinite energy supply, no global warming, and then make it the truth. It’s very rare to take something we all believe in, like nationhood, or the existence of money, and write a story which exposes it as the lie that it is.
I am still looking for an effective way to put my one sold story online so it looks good.
Now in Aberdeen visiting my ma for a couple of days. It’s a long way up here. There is a heck of a lot of Scotland, especially the coast. Hope to do some trips in the summer if we get lucky with cars. It seems impossible to hire ones that can carry sea canoes.
Someone asked how to work out the sample rate for cutter locations from a given machining tolerance. This value refers to how much a toolpath is allowed to gouge the part. Undercutting is a different and less critical problem since you can always go back and cut some more metal off; it doesn’t mean you have to throw away your part.
Looking at the problem from above in 2D (the principle is the same in 3D), you have two cutter positions A and B. The volumes in space occupied by the cutter at the two positions overlap if the distance between A and B is less than twice its radius r.
We know that the cutter at location A absolutely does not interfere with the part, because that’s how we found its position (by moving it down or from the side until its very first contact with any triangle defining the part). The same is true for location B.
Now, when the machine tool drives the cutter from point A to point B, it passes through an area we don’t know anything about. There might or might not be material of the part in the cusped area I have marked as the “Volume of possible gouge”; it has not been checked.
Assuming the worst case scenario, there could have been a piece of the model that intruded into this unchecked volume as far as possible. According to the geometry of the tool and the overlapping areas, the deepest it could come in is the distance t. That is the machining, or gouge-limiting, tolerance. It’s simple trigonometry to work backwards from the cutter radius r and a pre-set value of t to the appropriate sample step s for which the tool will never gouge the part by more than a tolerable amount: in 2D, t = r − √(r² − (s/2)²), so s = 2√(t(2r − t)). This distance is often larger than you would expect, particularly for big tools.
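That trigonometry is short enough to sketch in Python (the function names are mine):

```python
import math

def max_gouge_depth(r, s):
    """Worst-case depth t of the unchecked cusp between two cutter
    positions of radius r whose centres are a distance s apart (s < 2r),
    viewed in 2D."""
    return r - math.sqrt(r * r - (s / 2.0) ** 2)

def sample_step(r, t):
    """Largest centre-to-centre step s for which the worst-case gouge
    never exceeds the tolerance t; inverts the formula above."""
    return 2.0 * math.sqrt(t * (2.0 * r - t))
```

For a 5 mm radius cutter and a 0.01 mm tolerance this allows steps of about 0.63 mm, which is the point about big tools: the step grows with the square root of the radius, not the tolerance alone.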
As I mentioned earlier, you can also take account of the contact points made with the part corresponding to cutter locations A and B, and notice when it changes suddenly to tell when there is a discontinuity in the model (a corner for example) where you need to subdivide the samples to make a better finish. This does not affect the gouging tolerance since it’s only about reducing the undercutting in these areas.
I’ve had some lovely questions from the author of MeshCAM yesterday. I am pretty fully aware of how these machining algorithms work in all the CAM systems I have seen. The underlying concepts are simple. If there’s the will, I think we could all work together to decide what headings to put these under on Wikipedia, maybe with a few diagrams. Someone ought to start the pages.
1) In a blog post you discuss a function which determines the tool height given a triangle and an xy point. This type of function implies to me that you lay out the toolpath in a single plane and then step over it to set the z height. Am I correct, and is this the way that commercial programs do it? If so then I assume the step size is adjusted dynamically to pick up the current triangle edge positions? If so, is there a lower limit on the step size to keep from trying to step over nearly vertical triangles?
Yes. This accounts for all the toolpath strategies that seem to be made by tracing out a path shape in 2D above the model, and projecting it down into the toolspace by setting the appropriate Z values for a series of XY points along the path according to my previous post. The sample rate is set according to the toolsize and the machining tolerance. Larger tools get sampled less because the cusps between two close sample points would be smaller for them.
After the first pass of samples is done, we consider every pair of points (cutter locations) and test whether a new sample needs adding at the half-way point. This is done according to the following criteria. If the two samples are closer in XY than 0.001mm (the tolerance of the machine tool) there is no further subdivision. Otherwise there is subdivision if the Z-value changes by more than, say, 2mm. And there is subdivision if the contact normals of the two cutter locations differ by more than, say, 20 degrees. (The contact normal is calculated as a byproduct of the calculation that gives the Z-value.)
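The three subdivision criteria can be sketched as a recursive midpoint refinement. The threshold numbers are the ones quoted above; the sample() callback standing in for the drop-cutter calculation, and the point layout, are my own assumptions:

```python
import math

MIN_XY_GAP = 0.001   # machine tolerance in mm: never subdivide below this
MAX_DZ = 2.0         # subdivide if Z changes by more than this, mm
MAX_ANGLE = 20.0     # subdivide if contact normals differ by more, degrees

def normal_angle(n1, n2):
    """Angle in degrees between two unit contact normals."""
    d = max(-1.0, min(1.0, sum(a * b for a, b in zip(n1, n2))))
    return math.degrees(math.acos(d))

def refine(p1, p2, sample, out):
    """Recursively insert midpoint samples between cutter locations p1
    and p2, each a tuple (x, y, z, normal).  sample(x, y) stands in for
    the drop-cutter calculation and returns (z, normal).  Interior
    points are appended to out in path order."""
    (x1, y1, z1, n1), (x2, y2, z2, n2) = p1, p2
    if math.hypot(x2 - x1, y2 - y1) < MIN_XY_GAP:
        return  # closer than the machine tolerance: stop
    if abs(z2 - z1) <= MAX_DZ and normal_angle(n1, n2) <= MAX_ANGLE:
        return  # both criteria satisfied: no subdivision needed
    xm, ym = (x1 + x2) / 2.0, (y1 + y2) / 2.0
    zm, nm = sample(xm, ym)
    pm = (xm, ym, zm, nm)
    refine(p1, pm, sample, out)
    out.append(pm)
    refine(pm, p2, sample, out)
```

Because the XY gap halves at every level, the 0.001 mm floor bounds the recursion depth, so a vertical wall between two samples produces a dense cluster of points around the discontinuity rather than an infinite loop.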
2) In another post you mention having to remove “spikes” and “glitches.” I’ve run into these types of problems and can usually remove them by setting some minimum length or equivalent metric, coupled with a few heuristics about what should be correct. This never felt very scientific and I always wondered what the big guys did to remove this stuff. How do you decide good/no good on a section of a toolpath?
The spikes and glitches come in the implementation of the constant stepover/constant scallop algorithm. What we are discussing above is the simple 2D toolpath projection finishing algorithm, where spikes and glitches are very rare unless there is a dodgy calculation in the implementation. In theory I had expected there to be lots of spikes in the above algorithm, but in practice the numerical results are good enough that they don’t happen.
3) What is the preferred curve/polyline offset algorithm in use? Voronoi algorithms seem to be very hard to implement correctly, and PWID and Hansen/Arbab (and its derivatives) can be slow and subject to numerical accuracy problems. What do the big guys do, and how do they make them robust?
I’ve written a Voronoi implementation twice, and won’t write one again only to keep it private. It takes about six months and is very hard work. Martin Held is the expert and has come up with a third version of the algorithm, which he sells, although he should be releasing it under the GPL as well. One day I might do a final implementation and make it GPL, if there’s anyone who can use it in that form.
You’ll have to tell me what PWID and Hansen are, I’ve not heard of them. Most companies work by brute force, and solve robustness problems like this by employing a programmer to work on debugging it day after day every day for eighteen months.
I managed to get my Voronoi module for NC Graphics pretty robust, but for final reliability I would detect any failures, shift the points a few microns in random directions, and rerun it. Usually this changed their alignments so that the miscalculations happened differently, and the result would get through.
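That perturb-and-rerun trick can be sketched as a retry wrapper. The names, the failure-signalled-by-exception convention, and the micron-scale jiggle size are my assumptions; in the real module the failure detection was internal:

```python
import random

def with_perturbation_retry(points, compute, jiggle=0.003, attempts=5):
    """Run compute(points); on a detected failure, shift every input
    point a few microns (here up to `jiggle` mm) in a random direction
    and rerun, so the degenerate alignments fall out differently.
    `compute` is assumed to raise an exception when it detects a
    miscalculation."""
    pts = list(points)
    for _ in range(attempts):
        try:
            return compute(pts)
        except Exception:
            pts = [(x + random.uniform(-jiggle, jiggle),
                    y + random.uniform(-jiggle, jiggle))
                   for x, y in pts]
    raise RuntimeError("computation failed after perturbation retries")
```

It is an unscientific fix, but it works for exactly the reason given above: the failures come from near-degenerate point alignments, and a random shift almost never reproduces the same degeneracy.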
The best way to approach something like this is to recognize that it’s a very hard problem in the first place. This is good for separating the men from the boys; the boys think it’s going to be easy and get stuck in right away. The men who have more sense do everything they can to redesign the system to avoid having to solve it in the first place.
Monday, April 10th, 2006 at 11:04 am - Machining
A reader asked about the hardness of steel we know we can cut with our Adaptive Clearing method. He asked:
“Got the general gist of the method, hard steel I read, P20 should I presume?”
We have some reliable data, produced by the machine tool manufacturer OPS Ingersoll and demonstrated at Euromold in December 2005 and other trade shows. The image shows one of the samples they cut live at the show. (It was a good setup: we could talk to people, and if they were sufficiently interested we could take them to OPS Ingersoll’s stand to show ‘Adaptive Clearing’ in use):
- Cutter: toroidal 10 × 0.5 mm
- Spindle speed: 8000 min⁻¹
- Feedrate: 8000 mm/min
- Z step: 10 mm
- Side step: 0.6 mm
- Material: 1.2767, HRC54
HRC54 stands for Rockwell hardness 54, which is high on the hardness scale.
Whilst trying to find out what exactly P20 stands for I found this interesting article. So, it seems that P20 means different things in different parts of the world. But whether P20 is a standardised way of describing toolmaking steel, and how it compares to HRC (Rockwell hardness), which is a standardised way of measuring steel hardness, I don’t know. If you do, please leave a comment!
Friday, April 7th, 2006 at 9:47 am - Machining
It’s a bit dodgy, but it appears to work. (Use the obvious “Animate” tickbox on the lower left once the machining is done.) Watch your memory use. We’ve had it consume an extra megabyte for every frame, even the ones on replay. That can polish off your RAM in under two minutes at four frames a second. Web browsers were never designed for this sort of treatment. That’s the disadvantage of cutting-edge experimental programming: it might work or it might not.
By the way, we’re hoping to go to Mach2006 for a day or so and hang about. If anyone going wants to chat, let us know. We’re looking for contacts in the tooling industry where the technologists have been making all these fancy cutters but still haven’t realized that there is no generally available software to make use of them. For example, Nachi Fujikoshi Corp in Japan has a fine selection of videos showing their well-twisted cutters chewing off metal along the sides, but every example shows it skimming off the shoulder of a straight-edge rectangular block. Why? I’ve been dropping emails asking what software they recommend for this style of cutting. I’m not getting much response. It’s either because they don’t have any software to recommend, they don’t want to encourage such software to be written, or the software perfectly well exists out there but they don’t like my email enough to respond to it.
Maybe minds are so company-centred that having a person contact them as a person is too weird, and we would do better to invent some shell-company in California and contact them as agents of that imaginary entity.