Freesteel Blog » 2011 » November
Tuesday, November 29th, 2011 at 1:17 pm - Whipping
Because the political process in the UK is funded by donations, people must learn to donate money to political campaigns they believe in, or they get what's coming: a political process entirely owned by moneyed interests who find they can buy it out for chump change.
So I donated £50 to the Yes to AV referendum campaign at the beginning of the year (largely selected and funded by the Joseph Rowntree Foundation), only to find that they were the most useless bunch of wankers one could ever be stuck with. It was so bad it seriously looked like sabotage. Meanwhile, the No Campaign could pump out more and more lies to its heart's content, knowing there was no opposition.
Trying to get any idea of what the Yes "Campaign" was on about was like talking to a brick wall. There were rumours of their awesome decisions, like not taking any advantage of the free leaflet mailshot and blowing their whole roll on a telephone cold-calling system, but today, with the financial disclosures, we have the first peek into what went on.
While trying to kick-start my engagement with some CAM algorithms I am supposed to be doing stuff about, I decided to try some of these new-fangled unit test concepts people have been going on about.
(Normally I object to unit test people quite a lot because they come with the attitude that if the code is not completely pre-infested with unit tests from start to finish throughout the development process, then it cannot possibly be any good, and by implication the programmers who wrote it are stupid, ignorant and ugly. Nevertheless we can salvage something from the ideas, if we can be bothered.)
Much of my CAM algorithms are self-tested with the use of assertions, rather than unit tested. Partly because it’s easier, and partly because the example failure cases are so big it’s never the cases which you think about that are the problems. Usually I have a slow, simple algorithm running against a fast highly optimized algorithm and I make sure (in debug mode) that the two answers are the same.
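The pattern described above can be sketched in plain Python. The function names here are illustrative stand-ins, not the real CAM code: a brute-force reference implementation is run alongside the "fast" one in debug mode, and an assertion checks they agree.

```python
# Sketch of the assertion-based self-testing pattern: a slow, obviously
# correct algorithm cross-checks a fast one. Names are illustrative.

def slow_closest_point_simple(points, target):
    # Brute force reference: check every point.
    best = None
    for p in points:
        d = (p[0] - target[0]) ** 2 + (p[1] - target[1]) ** 2
        if best is None or d < best[0]:
            best = (d, p)
    return best[1]

def fast_closest_point(points, target):
    # Placeholder for a highly optimized (e.g. spatially indexed) version.
    return min(points, key=lambda p: (p[0] - target[0]) ** 2 + (p[1] - target[1]) ** 2)

def closest_point(points, target, debug=True):
    result = fast_closest_point(points, target)
    if debug:
        # In debug mode, assert the fast answer matches the slow one.
        assert result == slow_closest_point_simple(points, target)
    return result
```

The point is that the reference implementation exercises the optimized code on whatever real inputs come through, not just on cases you thought to write down.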
Generally, running big CAM algorithms in an isolated test harness is tedious and tricky, so you normally compile it into the final product and test it there. However, lately I’ve decided that maybe I ought to be trying a bit harder to take it out of that environment. Especially as we do have our C++ algorithms driven through SWIG into Python, so there is no excuse whatsoever for not trying it out.
Almost none of it is documented properly. So here is an idea of the code needed to plot a series of slices of a torus against a line.
First build the “surface” object, add one line into it (in the form of a triangle with two points the same), and “close” the object with the Build function (I’ll explain this some other day).
import hsmkernel as kernel
fssurf = kernel.FsSurf.New()
fssurf.PushTriangle(0,0,0, 0,0,0, 0.4,0.1,1)
fssurf.Build(1.0)
Now make a horizontal tool surface, which is an object that defines a tool shape in a particular Z-plane and references a surface.
fshoriztoolsurf = kernel.FsHorizontalToolSurface.New()
fshoriztoolsurf.AddSurf(fssurf)
fshoriztoolsurf.AddTipShape(0.4, 0.3, 0.5)  # (corner_radius, flat_radius, z)
Now make an implicit area, which is an object that defines a subset of the 2D plane and a set of tolerances to which the boundary contour of that subset will be computed. Such an object can reference more than one horizontal tool surface, making it possible to handle compound tools, such as you get when you have protection surfaces or large tool holders that should limit the 2D area.
fsimplicitarea = kernel.FsImplicitArea.New(0)
fsimplicitarea.AddHorizToolSurf(fshoriztoolsurf)
fsimplicitarea.SetContourConditions(0.99, -1.0, 0.002, 2, -1.0, 0.9)
# (minCNdotContour, maxZdiffContour, deltaHdiffContour,
#  maxHdiffContour, maxCPcuspContour, minBNdotContour)
Those contour conditions (boundary tolerance values) don’t really belong here, but they have to get into the algorithm somewhere, and it doesn’t matter where. (At least that’s what I tell myself to overcome the sense that I have made a design mistake here.) These values are things like maximum segment length (maxHdiffContour), minimum segment length (deltaHdiffContour), maximum change in angle between two segments (cosine of value is minCNdotContour), and so on.
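The minCNdotContour condition can be illustrated in plain Python (this helper is my own sketch, not part of the hsmkernel API): the dot product of the unit directions of consecutive segments must be at least the given cosine, which bounds the angle change at each vertex.

```python
import math

# Illustrative check of the minCNdotContour condition: consecutive
# segments of a contour may not turn by more than acos(minCNdotContour).
def segment_angles_ok(contour, minCNdotContour=0.99):
    for i in range(len(contour) - 2):
        ax = contour[i+1][0] - contour[i][0]
        ay = contour[i+1][1] - contour[i][1]
        bx = contour[i+2][0] - contour[i+1][0]
        by = contour[i+2][1] - contour[i+1][1]
        la = math.hypot(ax, ay)
        lb = math.hypot(bx, by)
        if la * lb == 0:
            continue  # skip degenerate zero-length segments
        # cos(angle) between the two segment directions
        if (ax * bx + ay * by) / (la * lb) < minCNdotContour:
            return False
    return True
```

A finely polygonized circle passes this test, while a right-angle corner fails it, which is why the contour extraction subdivides near tight curvature.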
Now we build the structure which will model the subset of the 2D plane:
fsweave = kernel.FsWeave.New()
fsweave.SetShape(-5, 5, -5, 5, 0.17)  # (xlo, xhi, ylo, yhi, approx_resolution)
Now, the contour created by the interference set of a near vertical line and a small torus (total diameter 1.4) moving in a horizontal plane is going to be something approximating a circle.
The numbers add up, because you can see it’s approximately 8.5 major cells across (about 0.17mm wide each) and it fits with the total diameter of the torus.
You can also see some subdividing of the contour to maintain an angle change between subsequent segments of less than acos(0.99), or approximately 8 degrees.
If I were a bit more rigorous with my inputs, I could compare this contour with the near-ellipse shape I would expect the answer to be, in order to unit test my torus-line slicing algorithm.
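Such a unit test could look something like the sketch below. The contour here is a hard-coded stand-in, since in a real test it would come from the FsWeave contour extraction; the helper function and tolerance are my own illustrative choices.

```python
import math
import unittest

def contour_matches_circle(contour, cx, cy, r, tol):
    # Hypothetical check: every contour point lies within tol of a
    # circle of radius r centred at (cx, cy).
    return all(abs(math.hypot(x - cx, y - cy) - r) <= tol
               for (x, y) in contour)

class TestTorusLineSlice(unittest.TestCase):
    def test_slice_is_near_circle(self):
        # Stand-in contour: a polygonized circle of radius 0.7.
        # In a real test this would be the extracted slice contour.
        contour = [(0.7 * math.cos(i * 0.1), 0.7 * math.sin(i * 0.1))
                   for i in range(63)]
        self.assertTrue(contour_matches_circle(contour, 0, 0, 0.7, 0.002))
```

The tolerance would naturally be tied to the deltaHdiffContour value fed into SetContourConditions, so the test fails exactly when the algorithm drifts outside its own stated accuracy.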
In the image below I have created 50 slices at various z-heights and plotted the normals in green so you can see the blob shape defining a sort of 3D volume made from all the slices together.
I’m wondering whether the DLL and Python bindings which allow the above algorithms to be scripted would be a more worthwhile thing for people to use than our slice.exe application. This would require people to know Python, though. Unfortunately, most Python programmers are doing web things and not doing CADCAM.
Monday, November 14th, 2011 at 11:49 am - Whipping
My attention has been drawn to a 22 March 2010 Parliamentary publication, London Regional Committee – London’s population and the 2011 Census.
From Chapter 3, Preparations for the 2011 Census in London:
The National Address Register
87. We are encouraged by the development of a national address register for the 2011 Census. Such a register is vital for a successful Census in London.
91. We understand that the address register will not be maintained in its present form after 2011, despite the substantial time and effort which has gone into establishing and updating it. Shaun Flanagan of the Cabinet Office told us that when ONS negotiated the contract with the Royal Mail, Ordnance Survey and the Local Government Information House to provide data for the register, a condition of the agreement was that the register would not be re-used, but that any improvements to the data would be fed back to the three providers.
92. The Chair of the UK Statistics Authority has already written to Ministers to make the case for the national register to be maintained beyond 2011. The Minister for London told us that negotiations on the future use of the register were ongoing: “there is no dispute about the importance and benefits of resolving this.” That view was echoed by Keith Dugmore of the Demographics User Group, who described it as a “golden opportunity to produce a definitive national address register and to keep it going”. We were nevertheless disheartened to receive no clear answer from the Government on the issue of lead responsibility for negotiating an agreement.
93. An accurate and well-maintained national address register is an invaluable tool for the 2011 Census, and will be vital for any future exercises to quantify London’s population. We find it barely credible that the address register developed for the 2011 Census at substantial effort and expense is to be abandoned following the Census for reasons connected to the ownership of the intellectual property.
94. We concur with the UK Statistics Authority in recommending that the address register prepared for the 2011 Census be maintained as a public resource. We recommend that the Government urgently seek to resolve any outstanding issues with the maintenance of the register after April 2011, and to provide sufficient resources for its continued maintenance and development.
Just turned in my Reply to the Response concerning the personal privacy of wasting finite resources and cooking the planet through the operation of unnecessarily inefficient house insulation.
Tuesday, November 8th, 2011 at 10:54 pm - Cave
The Telegraph has the scoop.
Underground cave system links Yorkshire, Lancashire and Cumbria
Last Sunday, 300ft below the surface, 83 years of exploration came to an end (what? nothing has ended!) as it became possible to enter into Cumbria, travel below Lancashire and emerge in Yorkshire.
Due to the brave work of individuals such as Geoff Yeadon, Tim Allen and Mick Nunwick, the separate cave systems of Boxhead Pot and Notts Pot now make up a continuous 70 mile route under the heartland of Britain.
For some of the cavers this brings to an end almost 26 years of planning (“planning” as in hitting it with a hammer), using modern techniques (hammers, scaff poles, and explosives) in efforts to link the Three Counties.
Harnessing the power of natural underground water sources to blast away (with dynamite, not water) the mud and rock that was hiding a linking passage between Boxhead and Notts Pot, the team have joined a network of 30 entrances underneath the M62 and A66 (yep, our caves pass right under Manchester).
In the difficult conditions 300ft below, the team experienced ten bars of atmospheric pressure (equivalent to diving 90m deep in the sea) as they channelled water through with a simple hose with great force, blasting their way through.
This seems to be a somewhat loose interpretation of the Guardian’s northern blog.
Tuesday, November 8th, 2011 at 10:59 am - Science Fiction
I’m struggling to keep up, what with all the other duties that I had planned to shuffle off. And there are the cave digging activities that tend to wipe out an entire day. No, you do not have any energy afterwards to write your 1667-word target quota. All you have time for is beer.
As promised, here is the output of Day 1. It’s all downhill from here on as I rapidly exhausted the entire plot I had dreamt up, like some cook who has run out of ingredients and only has pots and pans left in his kitchen.
As the rules state, it’s word count that matters, not quality. And this is not based on some woolly liberal inclusive everyone has something to offer concept. This is like running your first marathon; the spectators on the road might imagine there are good stylists and shambolic joggers, but the only thing that matters is the quantity of steps that take you in the right direction.
Tuesday, November 1st, 2011 at 5:15 pm - Cave
It was a dark and sunny day. Well, there was some rain. But the caves were dry. A team of 4 went to work on the dig in Duke Street 2 of Ireby Fell Cavern.
I have this old video of a trip down Ireby a few years ago, pumping the sump out in order to gain access to both ends of the whirlpool dig, which more than doubled the speed at which it could be dug out to provide a dry bypass.
Whirlpool passage is a surprisingly long crawl that had been backfilled with clay and sand up to the ceiling, and it would have taken many years if you had to keep dragging the dirt out along its entire length from the far end.