
Wednesday, January 24th, 2007 at 11:56 am - - Whipping

Last month I put in a follow-up request to Bristol Council about their educational IT procurement. In their previous response, they said they could not give me a copy of the contract they had signed with their supplier, Northgate, because:

[I]t is considered that disclosure at this time would prejudice the commercial interests of the parties and could lead to legal action against the Council, as such; it would not be in the public interest to release the information at this time.

Legal action can only take place if there has been a breach of a confidentiality clause in the contract they signed, and this is a textbook case of the council manufacturing a public interest argument by attempting to contract out of its responsibilities.

This is not allowed.

Councils are allowed to sign confidentiality clauses, but they should carefully consider the compatibility of such terms with their obligations… It is important that both the public authority and the contractor are aware of the limits placed by the Act on the enforceability of such confidentiality clauses.

So on 3 December I filed a follow-up:

Under the terms of the Freedom of Information Act, I am requesting all relevant details relating to the Council’s consideration to accept such confidentiality requirements in the contract it signed with Northgate, as well as all parts of that contract which are not considered to be commercially sensitive.

I also pointed out that, according to the framework contract which Northgate had signed with Becta dictating the terms under which they could sign contracts with councils, the information I was requesting was deemed not confidential.

On 2 January, not a minute before the legal deadline, I got this:

I write in response to your email of 3 December 2006 requesting under the Freedom of Information Act 2000 for all relevant details relating to Bristol City Council’s consideration to accept confidentiality requirements in the contract it signed with Northgate as well as all parts of that contract which are not considered to be commercially sensitive.

In responding to your request I will deal with the relevant facts, which you highlighted.

BECTA (British Educational Communications Technology Agency) is a UK agency, which supports all four UK education departments in their strategic ICT developments. It is not a body that governs Local Authorities in England and Wales; it does not have power or jurisdiction over local authorities. Decisions and frameworks by BECTA do not have to be followed by local authorities. There is nothing in BECTA’s constitution that stipulates local authorities are bound by BECTA’s decisions and frameworks.

Documents and guidance found on the BECTA website relates to contracts that BECTA enter into with suppliers such as Northgate. They do not apply to local authorities unless they have a contract with BECTA.

The DCA guidance does state that authorities should consider the compatibility of confidentiality clauses, which it enters into with contractors, with our obligations under the Freedom of Information Act. This was taken into consideration when Bristol City Council entered into the contracts.

When the Council was in the process of negotiating the ICT contract with it’s chosen development partner, the issue of freedom of information was specifically considered and addressed in the drafting of the contract. The contract was in a standard format, which was approved by the government through its agency for building schools for the future – partnerships for schools.

As a consequence of this, the final form of the contract (and its confidentiality clauses) reflects appropriate obligations under the Freedom of Information Act 2000 and sets out clearly that information for which it is felt appropriate to disclose and that information which would harm the commercial interests of the parties were it to be disclosed.

Consequently, Bristol City Council is still relying on the exemptions in S.41 and S.43 (2) of the Freedom of Information Act 2000 in relation to the charges for the provision of ICT services to each school. As stated in my previous response, the Council wishes to be transparent and accountable to the public but having considered the public interest test, disclosure at this time would prejudice the commercial interests of the parties and would place the Council in breach of contract. At the present time and in accordance with the terms of the contract, it is not in the public interest to release the information and this information will not be disclosable until 2018.

The Council will endeavour to send under a separate cover, on receipt of a postal address, details of the contract which is not deemed to be commercially sensitive. Further information on building schools is available on the partnerships for schools website www.p4s.org.uk and the building schools for the future website at www.bsf.gov.uk.

I appreciate that this response does not answer your request in full but for the reasons as set out above, the Council declines to disclose all of the information you have requested.

If you are not satisfied with this response or wish to lodge an appeal against any exemptions that may have been applied, you can do so through the council’s complaints procedure, details of which can be found at www.bristol-city.gov.uk/complaints.

If, after you have exhausted the council’s complaints procedure, you are still not satisfied with the response you have received you have the right to complain to the Information Commissioner, details of your right to complain can be found at www.ico.gov.uk/complaints.

I replied the same day to the “Trainee Solicitor” who had drafted this response:

Thank you for your reply. In your recent response to my FOI request you said that you would endeavour to send me by separate cover, on receipt of a postal address, all those parts of the contract signed between Northgate and Bristol City Council which are not deemed to be commercially sensitive. My address, which was included in my original request, is at the bottom of this message.

You have assured me that the issues of confidentiality and freedom of information were specifically considered and addressed in the drafting of the contract. My FOI request asked for the details of this consideration, which the guidelines say should have been undertaken. You have not yet informed me whether you hold any evidence of this consideration in the form of emails, minutes of meetings, or in any other written communication or record.

Finally, I am fully aware of the legal relationship between Becta and local authorities. I am, however, referring to the information which, according to the contract Northgate signed with Becta, is not confidential since it must be reported in the quarterly management information report stipulated by Schedule 7 of their agreement.

I can see no way that it would be a breach of confidence for Bristol City Council to release information that your contractor has already agreed with another public body is not confidential, irrespective of what appears to be, on the surface, an attempt to contract out of obligations under the Act.

In the last three weeks I have received nothing. So I have finally got my arse in gear and filed a complaint to the council, which appears to have no special complaints procedure for FOI requests:

On 1 November, and 3 December 2006, I filed Freedom of Information Requests to Bristol City Council for details about the £8.9million 10-year contract signed in July 2006 with Northgate to supply ICT equipment to schools. I have received unsatisfactory responses in both cases.

The persons who responded to my requests claim that the council have contracted out of its obligations under the FOI Act — in contravention of the guidelines — by signing commercial confidentiality clauses. They have furthermore failed to confirm whether there is any written evidence that there was a careful consideration of “the compatibility of such terms with their obligations under the Act”, as required by the code of practice. I have also failed to receive through the post any “details of the contract which [are] not deemed to be commercially sensitive,” as promised by Mr. K.O. in his 2 January reply.

According to the rules, I have to first exhaust the council’s complaints procedure, before I bring this matter to the attention of the Information Commissioner. As I am unable to find any pages detailing this procedure in relation to FOI requests on your website, can you outline to me the stages as well as a fixed timetable which will tell me when this procedure has been exhausted? I feel this is necessary given the performance I have experienced so far where FOI requests are replied to in the last hour of the 20th working day.

I later checked the press release again and saw that the £8.9million deal signed between Bristol Council and Northgate was for only 5 years. So the claim that the information must remain confidential until 2018 — seven years beyond the expiry of the contract — misses the point entirely; we have the right to know anything that does not prejudice a commercial interest, ie the flow of money. Once the government has signed a contract there is nothing anyone can do to stop it, which means we should automatically get to see every contract from the moment of signature.

Monday, January 22nd, 2007 at 7:32 pm - - Machining

I wrote the FAQ before checking things out more carefully. Surfcam have some nice videos of the Truemill thing in action, and it seems on one or two of the examples it clears more than one level.

To recap, our Adaptive Clearing algorithm uses a model of the stock, and is very good at the rest roughing cycle. So, by default, it cuts a layer at the greatest depth of the tool, and then clears the remaining material on a series of up-stepping layers to leave a well-roughed part. To the best of my knowledge, the Truemill algorithm is less good at this because it plans the toolpath from the shape of the pocket and is less able to take advantage of any areas that have already been cleared.

However, the final video on the page, labelled: “Stainless 420/ .750 inches deep/ 125IPM /3175MMPM/ (short version)”, shows the tool clearing on an upper level at 1 minute into the video. Unfortunately, they’ve edited out the crucial period where it would be possible to see whether it was done efficiently or whether there was a lot of cutting of air.

There’s also a nice screencast on this page explaining the Tool Engagement Angle. The first versions of our algorithm also used an “engagement angle”, but we had to change it to engagement width once we applied it to ball-nosed cutters, where the radius changes significantly towards the tip of the tool. A small stepover with a ball-nosed cutter on a series of straight cuts has a constant engagement width (equal to the stepover) at any height; however, the engagement angle gets wider as the radius of the tool starts to shrink.
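To make the difference concrete, here's a little numerical sketch (my own illustration, nothing from Surfcam or from our shipping code) of a ball-nosed cutter making parallel straight cuts at a fixed stepover. The engagement width stays equal to the stepover at every height, while the engagement angle opens up as the effective radius shrinks towards the tip:

```python
from math import sqrt, acos, degrees

def effective_radius(ball_radius, h):
    """Radius of the cutter's cross-section at height h above the tip of a ball-nose."""
    h = min(h, ball_radius)   # above the centre of the ball it's a plain cylinder
    return sqrt(ball_radius ** 2 - (ball_radius - h) ** 2)

def engagement_angle(stepover, radius):
    """Arc of cutter buried in material for a straight cut at this stepover."""
    return degrees(acos(max(-1.0, 1.0 - stepover / radius)))

ball_radius, stepover = 5.0, 1.0
for h in (5.0, 2.0, 1.0, 0.5):
    r = effective_radius(ball_radius, h)
    print("h=%.1f  width=%.1f  angle=%.1f degrees" %
          (h, stepover, engagement_angle(stepover, r)))
```

Run it and the width column never moves, while the angle climbs from about 37 degrees up past 57 as the tool thins out near the tip; that divergence is exactly why the angle stopped being a useful control value for us.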

Other observations

The other thing they’ve got now is RapidRough, which is some kind of rest roughing cycle. It’s good to see that Step Reduction Milling™ no longer gets a mention, as this was a feature so trivial it caused a psychological disconnect just knowing that someone had bothered to go to the trouble of patenting and trademarking it; you always wondered if you were missing something that made it more than the utter obviousness of inserting a plurality of smaller steps between the bigger steps.

It was no surprise that the patent office had accepted it, because they accept anything and have a serious conflict of interest on account of getting paid per patent while not being liable for compensation to the victims who have to fight against a bogus patent until it gets overturned in court. It was a marketing ploy. However, it always seemed to treat end-users as though they were idiots who couldn’t recognize detrivializing hype.

So, it’s a lot better now. I could say about how the page doesn’t get new diagrams very often (now missing the dog on the skateboard), lacks an RSS feed, blog, public user forum; but most CAM companies don’t do the internet well. Counter-intuitively, having a network of dealers (note: selection box on country doesn’t do anything) doesn’t lead to good web-pagery. Since the company doesn’t sell through the web-page, it’s not taken very seriously. You just need to have one, and you try to make it so it’s not embarrassing. Of course, if you gave editing rights to all the salesmen and told them that making good improvements to the company webpages could make up for not hitting quarterly sales targets, then things would get fixed every quarter.

No need to point out that our webpage is not all that great either, but we are just two people, and we’re supposed to be doing some programming as well. However, Martin and I spent all morning cutting wood off a fallen tree with a small saw. The tree came down in the same storm that nearly stranded me in Manchester, and closed all the trains in Germany for the first time since the war.

Friday, January 19th, 2007 at 5:18 pm - - Machining, Whipping

I’ve redone the FAQ, taking out a lot of the references to our on-line interface, thinking that no one was using it. But then Martin showed me the logs and proved that it was being prodded all the time, which is interesting. Why haven’t we been getting any feedback?

I’ve added the question “Why is your blog so unprofessional and full of ranting?” See how long it lasts.

Meanwhile, I went down to Sheffield earlier this week to write some Python scripts to help automate what goes on in an NHS cancer registry office. It gives me the incentive to make sure this business works, because I don’t want to be trapped in a soul-destroying office ever again. I managed 450 lines of code in a day, after spending the previous day working out what was necessary for it to do. Unzipping password-protected zip files — Python can’t do that! Found something called 7-zip, which got it done.
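For the record, the trick is nothing cleverer than shelling out to the 7-zip command line tool. Here's a sketch of the idea; the file names are made up, and it assumes the 7z executable is on the PATH:

```python
# Sketch only: file names are hypothetical.  7z's "x" extracts with full
# paths, -p supplies the password, -o sets the output directory, -y
# answers yes to prompts.
import subprocess

def unzip_with_password(archive, password, outdir):
    retcode = subprocess.call(["7z", "x", archive, "-p" + password, "-o" + outdir, "-y"])
    if retcode != 0:
        raise RuntimeError("7z failed with exit code %d" % retcode)

unzip_with_password("registry_extract.zip", "secret", "unzipped")
```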

Then yesterday I was in Manchester visiting a guy who’s setting up the National Open Centre and was looking for people to be on an “advisory board”. No one has ever asked me to be on a board before, so that was cool. He’s employed by the National Computing Centre, which is a semi-government derivation of something set up in 1967 during the white heat of technology, when computers were like rocket science.

Since they’re there to “champion the effective deployment of IT”, which is directly in opposition to big computer business — who do not desire the effective and efficient use of IT when there’s so much money to be made from failure — what they’re supposed to be doing is worthwhile. Anything that supplies experts who can be trusted by government, who are not walking conflicts of interest, is a good thing. Business has no place in giving the government advice, yet it’s their leaders who always have unqualified access to ministers.

All trains back from Manchester were cancelled due to gales. There was chaos. Then I heard of one going to Liverpool while I was trying to phone home about it, and was lucky to catch it. It was half empty.

Monday, January 15th, 2007 at 5:25 pm - - Machining


Here’s the before and after, after a couple of days of hacking. It’s a difficult issue — the problem of planning the directions for the Adaptive Clearing toolpath — since any plan would depend on the remaining stock, whose shape changes completely throughout the process.

The best plan is no plan at all, because there’s nothing to go wrong, and things run a bit quicker. At all stages during the calculation, the algorithm maintains a list of potential starting points. The simplest plan is always to select the most recently left potential starting point, and try to cut forward from it. If that fails (eg there’s no material there anymore), then discard it and try the second most recent potential starting point. (There’s a class in the system called PotentialStartPath.)
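In outline, the whole “plan” is just a stack. Here's a sketch with all the geometry stubbed out: PotentialStartPath is the real class mentioned above, but cut_forward_from, emit_toolpath and new_potential_starts are invented stand-ins for where the actual work happens:

```python
# Sketch of the "no plan" ordering.  PotentialStartPath is the real class
# named above; cut_forward_from, emit_toolpath and new_potential_starts
# are stand-ins for the actual geometric machinery.
def clear_area(initial_starts):
    pending = list(initial_starts)       # stack of PotentialStartPath objects
    while pending:
        start = pending.pop()            # most recently left start point first
        path = cut_forward_from(start)   # None if there's no material there anymore
        if path is None:
            continue                     # stale start point: discard, try the next
        emit_toolpath(path)
        pending.extend(new_potential_starts(path))   # cutting exposes new starts
```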

Simple though it is, it’s still way complicated, because there are over seven different types of starting points (on contour, from helix, from corner, towards contour, away from contour, double cut breakthrough, towards vanishing point on last level, and so on), all with radically different geometric requirements. This entire nightmare of confused logic is coded in Python using its high-level structures and easier hackability. There are maps, dynamic lists, and placeholder variables galore. It takes a good few hours to get into it, and then you don’t want to mess it about too much.

I was told of a small complaint about the ordering. It turns out that though this strategy is quite adequate for most of the machining operation, it looks awful at the start when the tool is hopping around at random sniffing the shape out. With a simple part, like the one pictured where the sides are coincident with the stock and do not require any machining, the first thing it would do is go once round and visit every corner before settling down to nibble one of them away. Then it would jump back counter-clockwise one corner, instead of going ahead to the closer one, and work on that.

It doesn’t give a good first impression.

It took two tweaks to fix. The first was to always go for the closest starting point to where it last broke off. (Best would be to do a linking path to each possible starting point and pick the shortest one; sometimes it’s quicker to carry on forwards to some material in front than to do a big loop to get to a point immediately behind.)
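The first tweak is easy to sketch: replace the pop-the-most-recent selection from the earlier outline with a scan for the nearest point. (distance_to here is a stand-in; as just noted, the proper measure would be the length of the actual linking path rather than the straight-line distance.)

```python
# Tweak one, sketched: pick the nearest pending start point instead of
# the most recent.  distance_to is a stand-in for the real measure.
def next_start(pending, last_position):
    if not pending:
        return None
    best = min(pending, key=lambda s: s.distance_to(last_position))
    pending.remove(best)
    return best
```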

The second tweak was more tricky. When starting with a rectangular piece of stock, the algorithm tends to aim for the corner, which is the best starting point to spiral in from. Unfortunately, in this example, aiming for the corner is the worst possible place because it lands in the middle of a disconnected fragment of stock. It’s better to aim for the middle of one side of the model, and follow it along until meeting the stock.

The algorithm is now a bit slower on account of scanning along all the contours looking for these starting points. There’s no reason it has to check all of them. It could try ten places at random and stand a good chance of hitting one of these large unnecessary-to-cut areas. But that would be much more complicated to program, so I’ll leave it at that until speed becomes an issue again. One of the worst things you can do is optimize your code too early, before it is in its final form.

Thursday, January 11th, 2007 at 1:56 pm - - Machining


The New Year blues were wearing off just in time for my laptop computer to crack up (the problem seems localized to the network circuitry). This delayed me getting to grips with this delightful adaptive roughing bug I discovered while investigating a completely different bug. How it looked at first is on the right; how it ought to be is on the left.

There is always a trade-off between speed and quality. Normally you control this with the tolerance value, which controls the sample rate of whatever is going on in your algorithm.

In any one machining algorithm, there can be dozens of distinct tolerance values to set the precision of the triangulations, the precision of the zslice contours, the rate at which the cutter is sampled, the tolerance at which the toolpaths are thinned (the process of converting nearly collinear sequences of points into a single straight line), and so on.
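As an aside, thinning is the easiest of these to sketch. Here's a toy version of mine, for illustration only; real thinning has to be cleverer about bounding the accumulated error. It drops intermediate points whenever a whole run of them stays within tolerance of a single straight segment:

```python
# Toy thinning: drop intermediate points whenever the whole run stays
# within tol of one straight segment between the kept endpoints.
def point_to_segment_distance(p, a, b):
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    seg2 = float(dx * dx + dy * dy)
    if seg2 == 0.0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg2))
    cx, cy = ax + t * dx, ay + t * dy
    return ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5

def thin(points, tol):
    out = [points[0]]
    i = 0
    while i < len(points) - 1:
        j = i + 1
        # extend the straight run while every skipped point stays within tol
        while j + 1 < len(points) and all(
                point_to_segment_distance(points[k], points[i], points[j + 1]) <= tol
                for k in range(i + 1, j + 1)):
            j += 1
        out.append(points[j])
        i = j
    return out

path = [(0, 0), (1, 0.004), (2, -0.003), (3, 0.002), (4, 1)]
print(thin(path, tol=0.01))   # the three middle points collapse to one
```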

The user isn’t going to want to set all of these independently, so we usually give them a control tolerance, and then set all the alternative tolerances proportionately.

For example, if the user tolerance is u, we might set the triangulation tolerance to u/3 and the machining tolerance to 2u/3.
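In code that's nothing more than this (a sketch; the one-third/two-thirds split is just the example ratio above, not a recommendation):

```python
# A sketch of the apportioning; the split is only the example ratio above.
def apportion(user_tolerance, triangulation_fraction=1.0 / 3.0):
    triangulation_tol = user_tolerance * triangulation_fraction
    machining_tol = user_tolerance - triangulation_tol   # the two must sum to u
    return triangulation_tol, machining_tol

tri_tol, mach_tol = apportion(0.01)   # eg a 0.01mm user tolerance
```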

The triangulation tolerance conventionally limits the deflection between each triangle in the model and the designed parametric surface. As it gets tighter (smaller) you get smaller triangles and more of them.

The machining tolerance conventionally limits the amount that the cutter will gouge the model. Since it’s too hard to simulate the swept volume of the tool against the triangulated surface, it’s sampled at discrete positions along the toolpath. As the samples get closer together, the quantity of material which can be over-cut from the model decreases. (See this post for a diagram.)
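A back-of-envelope way to see the relationship, which is my own illustration rather than the diagram from that post: between two gouge-free sample positions a distance d apart, a cutter of radius R can dip below the sampled envelope by about the sagitta of the chord, so halving the sample spacing quarters the worst-case over-cut:

```python
from math import sqrt

def max_overcut(sample_spacing, cutter_radius):
    # sagitta of a chord of length d on a circle of radius R,
    # roughly d*d/(8*R) when d is small compared to R
    d, R = sample_spacing, cutter_radius
    return R - sqrt(R * R - (d / 2.0) ** 2)

for d in (1.0, 0.5, 0.25):
    print("spacing=%.2f  worst over-cut=%.5f" % (d, max_overcut(d, 5.0)))
```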

The sum of these two tolerances gives the user tolerance; the error is shared out between the two stages in the process. There’s a trade-off as to what proportions of the user tolerance to share between the triangulation and machining tolerances. If the triangulation tolerance portion gets smaller (and its tolerance gets tighter) you get more triangles, which slows down the speed at which you can calculate a cutter position. But since the machining tolerance portion gets larger, the cutter location samples can be spaced more widely and you need fewer of them.

Because of this trade-off, no one bothers to look at what would be the optimum proportion to share out these tolerances. There’s probably a minimum, but it may be long and flat, so any value between 0.1 and 0.9 will probably do. Programmers generally pick a number without too much thought as they’re writing the code, and no one gets around to looking at it again for decades, if ever. There are almost certainly some stunning improvements in performance to be had in some CADCAM routines just by reviewing these numbers in case someone has been unlucky with their guesswork.

There’s also an art to it. Just because you set a tolerance value, doesn’t mean that the result is going to reach that level of inaccuracy. For machining, you normally sample at the rate suggested by the tolerance value, but also subsample quite a bit when there’s a corner, which is where the tool is most likely to have gouged. This means that the machining tolerance is usually much much better than the value put in, but you can’t rule out the possibility that there is some tiny feature on an otherwise flat model which the sample rate simply misses.

The triangulation tolerance, on the other hand, behaves quite differently. Since this is the surface that is being used to find the cutter locations, the tolerance limits set on it will be used. The worst accuracy generally occurs in the middle of the triangles, and it doesn’t take bad luck to hit those locations routinely.

That’s why, given the choice, it’s better to have a tighter triangulation tolerance and loosen off the machining tolerance. Unfortunately there’s a tendency in many systems to economize on the triangulator, and that’s the wrong compromise.

So, back to the adaptive clearing bug. There’s a sample rate for the cutter engagement modelling here too, which I set at 3 times the machining tolerance. Don’t know why; I just did. Since the tool is always bound by the contour, nothing it can do will ever gouge the model, making it hard to relate its meaning to what users normally refer to as tolerance. They just want something to work.

What made it interesting was that when I started tinkering with the values, like setting the sample rate to 2.5 times the machining tolerance, the problem disappeared suddenly, not gradually. It was as if some nasty resonance was getting in there from the sample rate and the cutter engagement, making the paths get worse and worse until it showed up to the naked eye.

I could have quickly fixed the bug this way, by adjusting the sample rate, slowing the whole system down slightly, and not caring whether this resonance condition could re-emerge in some other case no one yet knew about. However, this is my code and I intend to still be maintaining it in 5 years’ time, which means it really matters to me if there are thousands of people using this software by then and I’m having to work night and day to stay on top of a vast list of creaky problems.

That’s code rot. Numerous little quick fixes keep coming back to haunt you, until it’s like an over-patched roof with nothing good left to nail things to, and it has to be thoroughly replaced. You don’t want to do this with software; it’s so difficult to see how bad things are getting that what normally happens is someone else makes a totally independent set of software from scratch and it’s better than yours.

Anyway, the correct fix was easy. I added in another couple of tolerance values. When the toolpath changes direction by more than a certain amount (eg 30 degrees — a number I picked out of a hat), the minimum sample distance is allowed to get smaller by a factor of 3.
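The shape of the fix, sketched below. The 30 degrees and the factor of 3 are the numbers picked out of the hat; the function name is a stand-in for whatever the real sampling loop does:

```python
# Stand-in sketch of the corner subsampling rule.
CORNER_ANGLE = 30.0     # direction change, in degrees, that counts as a corner
CORNER_FACTOR = 3.0     # how much finer to sample around one

def min_sample_distance(base_distance, direction_change_degrees):
    if direction_change_degrees > CORNER_ANGLE:
        return base_distance / CORNER_FACTOR
    return base_distance
```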

Monday, January 8th, 2007 at 10:05 pm - - UN, Whipping

Marvel at the incredible amount of research that went into writing up wikipedia: UN Security Council Resolution 1267 (1999). A lot of this information is just barely teetering on the edge of the start of the internet age, at the point when newspapers began putting the articles on-line so that not only academic scholars have access to last year’s official lies, er, news.

The article is also due to my processing a tonne of downloaded UN documents over the past few weeks, and now not having much idea of what to do with them. If I think they can be made into a tool to help people write articles about what’s going on in that body, then obviously I have to give it a try. The subject follows from a previous blog posting about some suspicious happenings up the street from here.

The basic story is that there is now a Consolidated List maintained by the UN, onto which governments who have bought into this war on terrorism garbage can put names. Once you’re on it, you’re knackered. Even when you are in jail, your wife isn’t allowed to take your children to the swimming pool, due to the sanctions, because — according to the government — it might benefit Osama bin Laden’s terrorist network. You wonder why Muslims are getting so pissed off with it all. It’s completely outside the law. No charges. No evidence. No trial. No judge. No proof of guilt. Nothing.

Speaking of terrorism, did anyone notice that massive bomb in a Spanish airport last week? Didn’t think so. Wrong kind of terrorists who don’t fit with the official narrative, so they don’t get reported like the Wood Green no-ricin plot, or the Miami bomb plot to attack the Sears Tower, both of which don’t even pass the laugh test. It wears you out. What’s the point of these jokers in government? They do not care if we live or die so long as they can get on with spinning their lies.

Monday, January 8th, 2007 at 9:32 pm - - Cave

Expect the reach of the internet to crawl back in time as what you thought was safely rotting down in the form of ink on bits of dead tree gets scanned in. Obviously some people have too much time on their hands. Some caving clubs have even more time on their hands and actually type stuff in, including ancient pissed-off writings in expedition logbooks.

Not that anything changes. I spent New Year over at Bullpot Farm. You don’t go to Bullpot Farm for a good night’s sleep, so it was lucky I got an iPod for Xmas to listen to while the lads downstairs played crockery cricket until they ran out of china mugs at 4 in the morning. Then they set off fireworks in the kitchen until the bedrooms were completely filled with white smoke. What is the point of a smoke detector if no one will let you turn it off?