Freesteel Blog » 2008 » August

Friday, August 29th, 2008 at 3:04 pm - - UN, Whipping

As some people can tell by my funny accent, I spent some years of my childhood in America where I watched a lot of TV and soaked up plenty of that old 1970s Cold War American propaganda. It was interminable. Even my elementary school had these strange fallout-shelter symbols all around the basement cafeteria, so no eight year old with any curiosity could fail to notice that the Cold War was real and that at any time those evil Russians were going to nuke American children with their 50 Megatonne hydrogen bombs because they hated our freedom.

You had all these serious espionage programs, between the World War II documentaries in which the United States single-handedly fought and won the war, where they showed off those devious Ruskies and their bugging technology smuggled into the ambassador’s office in a carved wooden plaque of the Great Seal of the United States (that ugly bald eagle thingie) presented as a gift from Soviet schoolchildren in 1945.

Here’s the story of the device, known as The Thing, as told by the National Cryptologic Museum. As you will note, it contains lots of photos of a smiling Ambassador Henry Cabot Lodge Jr showing it off from his desk in the United Nations Security Council in 1960, like an early-day Colin Powell waving around his fake vial of anthrax. The TV shows I watched probably had movie footage of this.

So that’s something that sank into the back of my memory from childhood 30 years ago.

Today I am continuing my scatter-gun approach of trying to drum up some interest in undemocracy.com, and was referencing some old transcripts from the Security Council into a Wikipedia article about the 1989 US invasion of Panama, which involved lots of shooting and killing and liberating the country from its own self-governance.

As used to happen in those days, violations of the UN Charter (eg military invasions) were debated in the Security Council, and the American ambassador cited Article 51 of the Charter to explain that the operation in Panama was legal because it was an act of self-defence against an armed attack on his nation.

To quote:

Last Friday Noriega declared his military dictatorship to be in a state of war with the United States and publicly threatened the lives of Americans in Panama. The very next day forces under his command shot and killed an unarmed American serviceman, wounded another, arrested and brutally beat a third American serviceman and then brutally interrogated his wife, threatening her with sexual abuse. That was enough.

That’s it. That is the entire allegation that there was an “armed attack” against the United States. Noriega was right, of course. Panama was in a state of war with the United States, but it wasn’t because there were any Panamanian forces storming the beaches at Coney Island. This kind of nonsense makes you want to laugh. Or cry. I mean, at the very least we ought to require closure on these kinds of things. Whenever a country cites an event under Article 51 (the right of self-defence) there has got to be an account produced of the threat and an official ruling as to what steps a reasonable government would have taken to respond to the alleged armed attack.

But we don’t. These things get forgotten, even when it’s on the scale of the Weapons of Mass Destruction lie. The issue at the core of international law is that the five permanent members of the Security Council are judge, jury and executioner on any question brought to them, and can quite crudely veto any draft resolution that finds against them.

Security Council transcripts are scanned and on-line back to meeting 2601 of 26 July 1986. Earlier numbers return “no document”. This is unfortunate because I have always been interested in what the excuses for the Vietnam War atrocities were going to be. United States citizens don’t understand the implications and purpose of the UN Charter — at least as far as their media is concerned. The Charter says that all military aggression is illegal under international law, except when it is sanctioned by the Security Council or it is an act of self-defence. As mentioned earlier, the system doesn’t work because the same Security Council is called upon to condemn acts of military aggression as illegal when they do not conform to one of these two categories.

I just happened last night to try and scrape meeting number 1000, and discovered it was there. That was about 10pm, and I realized I was going to have a very late night pulling all these transcripts out and looking for the jokes. The set spans from meeting 687 in 1955 to meeting 1021 in 1962, thus missing a large part of the Vietnam escalation. But it does include the 1961 US invasion of Cuba at the Bay of Pigs and the 1962 Cuban Missile Crisis, when the Commies took wholly unreasonable steps towards arming their territories with nuclear missiles against future invasions. The crisis, it turned out, very nearly caused a nuclear war when a US destroyer began depth-charging a Soviet submarine that was armed with a nuclear torpedo. You have to understand that a submerged submarine has no communication with the outside world, and Vasiliy Arkhipov saved us from nuclear holocaust by being the only one of the three officers on the sub, whose unanimous agreement was required, to vote against firing the weapon.

So that was fun and games which no one would ever have known about. The Cuban Missile Crisis resulted in a secret deal to de-escalate the Cold War slightly and an agreement not to invade Cuba. All the other nations of Latin America were not so lucky during that era, and somehow wound up living under US-backed dictators who, the US media explained to the US people, were not bad chaps at all until the next one needed to be imposed.

Digging back earlier, I stumbled on the Council meeting about the U-2 crisis of 1960, when, pre-spy satellites, the US was flying these high-altitude jets all over the place for the purpose of strategic reconnaissance over Soviet territory from bases in Pakistan. I don’t know if the Russians had similar programmes spying on US territory, but their job would have been much more difficult: all the suitable sites for airstrips within flying range of the US border were either ocean or outside Russian control.

Anyways, as it turned out, and as happened several times in the Cold War, there was going to be this big summit in Paris in 1960, which everyone was looking forward to as a chance to bring an end to this ridiculous, dangerous and expensive Cold War. As part of the confidence-building measures, the US tactlessly continued to fly their spy planes all over the place, until one of these planes was captured in Soviet territory with its pilot alive. The Americans didn’t know this, and started rolling out the lies about how it was all a mistake: the plane was a weather research craft whose pilot had passed out at the controls due to problems with the oxygen equipment, and it had continued to fly in a straight line right through Soviet airspace, blah blah blah.

Then the Russians said: Fooled you. We have the guy here and all his spy photographs, etc. etc. And you know that big Paris Summit you were all looking forward to? Well, it’s cancelled, you sons of bitches.

They brought a draft resolution to the Security Council condemning violations of airspace and requesting the US government to stop them. The judge, jury and executioner voted against it, because international law at this level is just a game of votes by immoral actors.

As part of his defence case the US ambassador brought in The Thing, which had been exposed in 1952, and explained:

Well, it so happens that I have here today a concrete example of Soviet espionage so that you can see for yourselves… [The Thing]…

We submit that the Soviet Union, for reasons which remain undisclosed, has deliberately seized on the U-2 incident, magnifying it out of all proportion, and has used it as a pretext to abort the Summit Conference to which so many have looked with hope for serious discussion of international problems.

The whole meeting, and the other ones, are really interesting to read and give lots of different sides of the story. Now, from my childhood memories, the US kept showing its people footage, filmed in the serious forum of the Security Council, of incidents of the dastardly Russians and their dastardly ways, without ever explaining that this, yes this cruddy bit of wood and metal, this basic listening device that wouldn’t have got past even an airport security screening, was their best answer to the issue of U-2 flights over Russian military installations.

I mean, really.

I submit that one of the greatest problems in countering what appears to me to be propaganda of absolutely laughable quality is the lack of an objective understanding of the equivalences and magnitudes of actions.

This is just from reading the allegations, never mind whether they are true or not. A lucky attempt at getting a listening device into the ambassador’s office is not equivalent to flying uninvited spy planes over enemy territory days before an important peace conference. One American being shot in the streets of Panama after months of international provocation is not equivalent to a full-scale invasion of the capital city and the abduction of the head of state.

There seems to be a systematic pattern whereby an official can equate two incidents that are several orders of magnitude different in scale and not get laughed at. What is it with these pesky humans and their ridiculous system of nation-states? It shouldn’t be happening to any species which appears to have rational thoughts and a sense of humour.

If only the threshold for propaganda to work were a little higher than the evidence shows it is, it would be easier for politicians to do the right thing than to get away with this absolute crap. It’s like a child who discovers that they can get anything they want by starting to cry. It’s got to stop. There have to be some standards.

Friday, August 22nd, 2008 at 9:38 pm - - Adaptive 2 Comments »

“This is the boss’s son; he’ll be working his way up from the bottom over the next two weeks.”

I’ve always wanted to use that sentence. Fortunately, neither my employment with NC Graphics, nor NC Graphics itself lasted long enough for this morale-obliterating situation to come to pass.

However, as announced at Surfware on 1 July 2008:

Stephen A. Diehl has been named President and CEO of Surfware, Inc., developer of SURFCAM® CAD/CAM systems.

“I am proud and happy to announce that my eldest son Stephen will take over as Surfware President and CEO of Surfware,” says Alan Diehl, founder and former CEO of Surfware, Inc. “Stephen has been working with me behind the scenes for several years and more recently, full time at Surfware.”

But I’m not here to pick on the shouldn’t-be-admired experience gained working in the vast global swindle and misallocation of capital of which the real-time trading of stocks, bonds and swaps for Fortune 500 companies is but a part.

My attention was actually grabbed by a 19th of August announcement of a Notice of Allowance signifying that their patent application has been examined and is allowed for issuance as a [software] patent.

I blogged about this back in November 2005. The links to the patent pending pages on the US Patent Office webpage seem to pull out random patents now (for a Jet nozzle mixer and a Classification-expanded indexing and retrieval of classified documents thingie) because the Office’s website is absolutely shite and inexplicably avoids the use of the handy centuries-old unique-id system provided for these documents by the patent number. On the other hand, my European Patent Office link from three years ago does still work, because it does.

You can read the entire 38 pages of gory details on-line there. The provisional applications were filed in April 2004. Think: if all the work and expense that went into writing this patent and applying for it had instead been applied to working on the code itself, maybe they wouldn’t have had to spend the last four years “taking it to new levels of excellence”.

The announcement explains:

The origin of the patent application goes back to early 2002 — Surfware’s R&D Department. Robert (Pat) Patterson came up with the core idea for engagement milling, and he and Surfware co-founder Alan Diehl, set out to develop it into a workable product. Within one year they had developed two different versions of TrueMill, both covered in patent applications.

Over the next several years, the pair went on to supervise the project based on their core ideas, with some assistance from the SURFCAM product manager. In 2005, the initial patent application for engagement milling was filed with the co-inventors listed in alphabetical order, without regard to their actual contribution.

So that’s why when you search for “surfware” on the USPTO website (I’m not wasting time with their deeplinks) you get:

  • Application: 20050246052, Filed March 2, 2005: Coleman, Glenn; (Cave Creek, AZ) ; Diehl, Alan; (Westlake Village, CA) ; Patterson, Robert B.; (Bellevue, WA)
  • Application: 20050256604, Filed April 22, 2005: Diehi, Alan; (Westlake Village, CA) ; Patterson, Robert B.; (Bellevue, WA)

Back in 2005, Glenn Coleman was touting the benefits of Truemill in his capacity as Surfware’s Vice President of Product Design.

Also around at the time doing the same thing in his capacity as Vice President of Worldwide Sales, was Domenic Lanzillotta.

And then there was Dr Evan Sherbrooke, who was Systems Architect. And there was Terry J. Sorensen, who joined in May 2006 and became CEO of Surfware in December 2006, even though he was not Alan Diehl’s son.

Meanwhile, in Scottsdale (Phoenix), Arizona, in October 2006 Mike Coleman (probably no relation) announced the appointment of Domenic Lanzillotta as Vice President of Worldwide Sales, and Greg Dare as Director of Marketing at TekSoft. Lanzillotta was formerly Vice President of Worldwide Sales for Surfware, and Dare was formerly Director of Marketing for Surfware.

In September 2007 TekSoft announced a new toolpath strategy, called the Adaptive roughing strategy, which provides the ability to cut using the full depth of the tool while safely running machines at optimum speed, reducing machining time by up to 40% over conventional roughing, with less tool wear.

A friend who went to the EMO 2007 trade show at the time saw it in action. In an interview in the same month, Mike Coleman said:

“Rather than pretending that a couple of guys in the back room can come up with everything that we need, we buy our HSM algorithms from a third party that devotes 10–15 programmers to developing just the HSM modules,” he says. This third party can afford to invest more in the module than most other developers because selling it to companies allows it to amortize the cost over a larger user base.

The two of them, Dare and Lanzillotta, seemed happily installed at the TekSoft trade show booths in February 2007 at SolidWorks World (in the same room as HSMWorks) and in March 2007 at Westec.

Westec 2007 had breasts

In early 2008 Lanzillotta moved to Planit to sell Edgecam software from Thousand Oaks, California.

At some point around then, Sorensen introduced his new marketing department with Steve Crane as the Director of Marketing [I have his business card – he told me I was crazy], Steve Myers, Sales Engineer, and Bryan Sullivan, Media Relations Manager.

Then in April 2007, Sorensen, Sherbrooke and Glenn Coleman showed up with their new business model, attempting to market a new algorithm called VoluMill, which went on-line in October 2007 from Cave Creek (Phoenix), Arizona.

That gave them about six months to write their new algorithm and release it. I saw it in December 2007 at the Euromold trade show. It uses the neat but flawed idea of hosting the algorithm on their servers and arranging for your CAM system to transfer the model to them, generate the toolpaths, and transfer the results back. It’s a nice idea. The payment is by a monthly service plan rather than, say, per metre of toolpath calculated. We’ve made a much more sophisticated implementation of this, and could have given them the code if they’d asked. It seems that users are not quite as excited by it all as we are, so the innovators all need to work together to create the interest.

Not that any of this happens, mind you. Now I’d thought that VoluMill had basically died as so many on-line things do, but there’s a non-spam message from June 2008 on the forum:

Question: I am a hobbyist user, and although the up-front cost is zero, have you considered a plan which limits the number of tool-paths that can be generated in a month, or maybe a per-usage charge?

As a hobbyist I am not so much interested in the aspects of saving time as I am in a good quality tool path. It appears that your paths work well on machines that can not accelerate quickly since they try to maintain a constant velocity.

Answer: At this time we have not received a level of interest that would make it a high priority. However, if the level of interest in such an option increases we will address it accordingly.

That answer is from Joe McChesney, Product Manager. It’s a closed user forum, so I can’t post a message telling the questioner that we’d happily give him a free copy of the Adaptive Clearing algorithm in return for some user feedback and movies.

Given what they achieved in terms of development in their first six months, what have they been doing over the past year? Also, I’ll eat my hat if they have any customers using their service at all. You can see the client source code activity here.

Back to the evil software patents — the filing of which is as much of a waste of programmer time as technical blogging like this — I asked someone about it at Euromold 2005 and took action. I received effective confirmation about it from the General Counsel of Surfware Inc in February 2006. According to the Patent Office rules:

Each individual associated with the patent owner in a reexamination proceeding has a duty of candor and good faith in dealing with the Office, which includes a duty to disclose to the Office all information known to that individual to be material to patentability in a reexamination proceeding.

So that’s all right then.

Not that the Adaptive Clearing algorithm has anything but superficial similarity in intent with TrueMill or VoluMill. But when has that ever been an excuse to avoid grief? The real defence is that the world at large hasn’t found it particularly interesting, so there’s no money worth arguing about.

The fact is, we should all be talking and working together on this. The market has not taken to new technologies, such as constant engagement milling, as it ought to have done. There are significant savings to be made in production machining by applying something like this, even from a very temperamental and unstable release. However, I don’t see evidence at the trade shows of it ever being used by the machine tool vendors, say. The only place it gets exhibited is on the stands of the CAM companies that sell these algorithms, and nowhere else. This is an issue.

With a market that is as conservative as it is, it is very difficult to get anywhere, because the dominant CAM companies can keep flogging their ten-year-old systems at the same high prices, investing nothing in development and pocketing all the profits for as long as users fail to notice.

Ultimately the problem is with the users and their level of interest. They’ve got lots of better things they need to do than care about the software, and none of them seem to show any curiosity whatsoever as to what goes into it. The vendors spin this line about how there’s all these programmers at work in a back room you can’t talk to, and the company has all its secret special valuable algorithms that are extra good works of genius better than the science behind General Relativity, and I don’t think they actually need to bother with these fairy-tales. So few people question it. All the company needs to say is:

“Yes, we sacked all the programmers last year, and we’re down to our last guy who knows how to compile the system for new versions of the operating system. We pay him well to stay. No you won’t get your bugs fixed, because at this stage of development everyone seems able to work around the issues that remain without too much hassle. We’re certain that no one is going to come along with anything new and better because they won’t be able to afford the years of development that it took to get ours up to this stage. Back when our product was being developed in the 1990s it was possible to make money with fewer features and with something that ran slower on the machine, and at that time we were still re-investing the money and keeping lots of well-motivated programmers working on it to get it ahead. Now we don’t need to do this, because we believe that the development gap is too wide for any new start-ups to be able to bridge it with us competing against them while they are still in their early days. We know you, the customer, will not give them a second thought until they have everything we do and twice as good, and they’ll always go out of business before that happens. Today and tomorrow, we own this software. It is indeed stagnant. And if you want it, you can take it at the price we like, and I’ll be able to afford a nice car and continue filling it with gas to drive back and forth across Arizona until this economy goes completely into the ground because we’re not able to tell the difference between producing stuff and making money. Thank you very much.”

Like I said, it would be nice if the CAM software industry came clean as to which system uses what kernel and whose algorithm. I don’t think the users actually give enough of a toss for it to make a difference to sales. Some transparency would be really helpful for people like me to find out who in the world is actually still programming these specialized algorithms, in order to share tips and find out where the jobs are.

In the absence of this, all I’ve got is this blog-ranting about manager-level machinations within the industry that have nothing to do with actually getting any programming work done. Even communication that’s one-way can be useful.

Wednesday, August 20th, 2008 at 5:11 pm - - Machining

For reasons known to myself, I have decided that it would be convenient to be able to get a quick result for the distance of closest approach from a given point to a model…

I was going to outline how I have begun work on a giant 3D array of cells, each of which knows the minimum answer for all the points in its cube, but in the process of writing, after spending the last two days on the project, I have just thought of something better.

I was going to cache all the results in an array, vector< vector< vector< double > > >, subdividing the region of space, and compute the closest distance between cubes and triangles, and I began writing the code. Here’s the annoying code for doing it for a point:


// Accumulate the squared distance from point p to the box one axis at a
// time, bailing out as soon as it already exceeds the closest value so far.
void BoxClosest::MergeClosPoint(const P3& p)
{
 double pdsq = Square(xrg.Distance(p.x));
 if (pdsq > clossq) return;
 pdsq += Square(yrg.Distance(p.y));
 if (pdsq > clossq) return;
 pdsq += Square(zrg.Distance(p.z));
 if (pdsq > clossq) return;
 clossq = pdsq;  // p is the closest feature merged into this box so far
}

where

inline double Square(double d) { return d * d; }
struct P3 { double x, y, z; };
struct I1 { double lo, hi;    // Distance(v) = how far v lies outside [lo, hi], 0 if inside
            double Distance(double v) const { return v < lo ? lo - v : (v > hi ? v - hi : 0.0); } };
struct BoxClosest { I1 xrg, yrg, zrg; double clossq; };

The code for edges and triangles is very horrible. There should be some fancy algorithm to fill it in across the entire space, but I don’t know if one exists. It’s an interesting puzzle.

Anyways, I’m hoping I can throw all that code out shortly if I proceed with something a bit more interesting that is able to cache the results and do it by points instead of cubes.

Ultimately I want a function that gives me the following:


// TriangulatedModel and distance() stand in for whatever the real types are
bool IsDistanceGreater(const P3& p, double r, const TriangulatedModel& triangulated_model)
{
 if (distance(p, triangulated_model) > r)
  return true;
 // false negatives are acceptable as long as it's fast
 return false;
}

Now, if we know (because we calculated it earlier) that s = distance(q, triangulated_model), and it happens that s - distance(p, q) > r, then we can get the answer quickly: by the triangle inequality, distance(p, triangulated_model) >= s - distance(p, q) > r. That’s the power of metric spaces.
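Out of interest, here is a minimal sketch of how such a cache might look, in the style of the code above. The DistanceCache name, its seeds member, and the slow exact DistanceToModel() call (standing in for distance(p, triangulated_model)) are all made up for illustration; a real version would want a spatial index over the seeds rather than a linear scan.

#include <cmath>
#include <utility>
#include <vector>

// hypothetical exact query against the triangulated model -- the expensive call
double DistanceToModel(const P3& p);

double Dist(const P3& a, const P3& b)
{
 return std::sqrt(Square(a.x - b.x) + Square(a.y - b.y) + Square(a.z - b.z));
}

struct DistanceCache
{
 std::vector< std::pair<P3, double> > seeds;  // (q, distance(q, model)) pairs

 bool IsDistanceGreater(const P3& p, double r)
 {
  // triangle inequality: distance(p, model) >= s - Dist(p, q) for every seed q
  for (size_t i = 0; i < seeds.size(); i++)
   if (seeds[i].second - Dist(p, seeds[i].first) > r)
    return true;                         // answered from the cache alone

  double d = DistanceToModel(p);         // fall back to the slow exact query
  seeds.push_back(std::make_pair(p, d)); // and remember the answer for next time
  return d > r;
 }
};

The one-sidedness is the point: a cache hit proves the distance really is greater than r, while a miss merely costs one slow exact query that enriches the cache for the next call.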

We have the ultimate case of lazy calculation here. The function remains valid for the same set of triangles, so it can be shared among several algorithms, and they will keep getting faster the longer the computer runs on them. It could be a strange situation: the first 5-axis pass will be slow; the second will be faster, as long as it doesn’t go too far away from the first. How this optimizes for multi-core coding is another problem.

Wednesday, August 20th, 2008 at 4:22 pm - - Whipping 7 Comments »

Several of my FOI requests have matured while I was away, following my adventures with the Audit Commission Act.

Vellum

A slightly annoyed response from the House of Lords put me right about my stupid belief that there was a set of printing presses running on cowhide on the premises of Westminster. I have now requested the details in the TSO contract that stipulate that vellum must be used, and that prevent them from producing it in a 0.0001pt typeface so as not to create waste. I ranted about this topic in June following my discovery of a stupid vote in Parliament in 1999 over the issue.

Now if we could have a debate about publishing Acts and Bills in XML and funding it with the cost savings of not doing it on cow-hide, then we could move forward.

BBC secrecy

As I observed to them, the BBC operates a very detailed database of whole website commissions, creative inputs, content ingest costs, application technologies, content rights, customised software licences and contractors/freelancer/sole traders engaged on a “deliverables” basis — in order to verify its Quota Requirement of External Spend on Future Media & Technology, which was reported as being 31%.

However, following “considerable consultation” with the new media industry, it was agreed that a set of virtually useless performance metrics could be provided which would “be helpful” to the industry whilst not compromising commercial confidentiality — which they then didn’t publish.

They gave out the three-page document of what they could publish (which I didn’t find very helpful), but couldn’t be bothered to gather any details about said “considerable consultation” which resulted in this surprising level of non-disclosure. So it remains a secret as to which companies told them that their business had to be secret.

Rother District Council

The floodgates of information were finally opened by the Interim Solicitor when he got a letter from the ICO telling him to behave. This letter, along with many others, was disclosed under a request for all communications about the whatdotheyknow website.

I’d been concerned by the threat in all the correspondence that “any application for consent to re-use information will be considered under the Re-use of Public Sector Information Regulations 2005, but if consent is given a charge may be made to you” and made a request following a close reading of the Regulations.

The reply was finally satisfactory and pointed to this page detailing their Re-use of Public Sector Information policy. As I suspected, there has so far been no re-use of Rother District public sector information, and there are no plans for any in the future. I think it’s a fresh document, and it shares some words (as it also does with the Regulation) with this statement on the Audit Commission website.

You’d think there would by now be a central service where all these legal and policy issues could be shared between the local authorities.

Mouchel Parkman

I’m digging into partnership contracts between this company and local authorities. Rochdale and Knowsley have asked for extensions to their 20 statutory days to cleanse them of “commercially confidential” data.

I really need to send in my complaint to the ICO about Liverpool’s exemptions on the same contract. The delay is because I have to fill in a crappy Word Document complaints form which doesn’t render properly in Open Office. I hate this filling-in of word-processed forms. Must get on the case soon.

Liverpool continues to flatly disregard FOI requests for Liverpool Direct contracts, IPS Services contracts, and a really old one from early in the development of the webpage, the Veolia contract.

And finally

Cambridgeshire council has a contract document that’s too big to email. Interesting puzzle. This request came about from an investigation into the way PFI is imposed on local councils, who would otherwise have the common sense not to have anything to do with this expensive exercise in corporate welfare.

My related request for all the PFI credits given out by central government in order to subsidize the scam (and it’s probably a subsidy in the form of withdrawing central government grants and then giving them back with the stipulation that the money be squandered only on a PFI project) got granted.

Rather handily, the FOI officer writes:

“In the past individual sponsoring departments have often produced news releases when new allocations were made and, as you say, individual local authorities have also frequently publicised the figures. There is therefore no reason not to bring this information together in a collated form.

I have therefore arranged for the list of PFI contracts on this department’s website (at www.local.communities.gov.uk/pfi/index.htm) to be revised so that in future it includes PFI credit amounts.”

Well, that’s progress. Some day we’ll be able to turn it out on a map.

Monday, August 18th, 2008 at 8:04 pm - - UN, Whipping

And I ran out of easy places to contribute to on Wikipedia, such as United Nations African Union Mission in Darfur, United Nations Observer Mission in Georgia, United Nations Integrated Office in Sierra Leone, United Nations Integrated Peacebuilding Office in Sierra Leone, and Timeline of the 2008 South Ossetia war; and so surfed around on the UN News Centre.

Naturally, there wasn’t anything about all the pro bono work I have so far done accessibilizing the official documents in a cumulatively constructive way that makes it possible to find out what the processes are, who’s operating them, and discover what’s been going on over the past decade to get us to the way things are now. After all, I am merely a programmer.

What I did find was a press release about how an… Innovative UN awareness-raising campaign earns prestigious Cannes award:

15 July 2008 – A groundbreaking United Nations campaign that uses the latest technology to give a voice to those who normally go unheard has been recognized by one of the world’s leading international advertising festivals.

“United Nations Voices,” (Internet Explorer only) which was designed pro bono for the UN Information Centre in Canberra by Saatchi & Saatchi, Australia, was awarded a Bronze Lion in the 2008 Cannes Lions International Advertising Festival, held in France last month.

I’ve added those links myself to improve connectivity.

Although bronze is not as prestigious as gold or silver, if you’re an ad company you know how to place your free advertising into the UN news feed in the hope that people will forgive you for delivering us Margaret Thatcher in 1979, as well as tonnes of other evil.

The material is a large colour poster of a face which you photograph with your mobile phone; you email the picture to a particular phone number, and then you get phoned back with a recorded message from the featured “voiceless” person. The functionality could have been implemented by texting a key-word to the particular phone number instead of sending the digital photograph, or by frigging it with a different phone number on each poster and ignoring the image (they didn’t do this).

Meanwhile, their March press release from when they ran the ad campaign in Sydney, Australia is here, a blog about the campaign with lots of comments is here, and according to this posting the vital stats are:

Brief:
The United Nations wanted to find an engaging way of talking to modern day Australians, particularly the youth, and making them aware of the many and varied issues in today’s multi-cultural society.

Solution:
The problem is the people who really need to be heard are the ones who don’t normally have a voice. So by using revolutionary digital image recognition technology we could make a poster and press ad talk for the very first time and actually give everyone a voice.

Results:
In a small market like Australia, over a 2 week period, more than 35,000 people “listened” – making this the country’s most successful UN brand campaign to date. Due to its overwhelming success next year the UN is going to roll it out globally in all major cities.

And now, the credits:

Advertising Agency: SAATCHI & SAATCHI, Sydney, Australia
Executive Creative Directors: Steve Back, David Nobay
Copywriters / Art Directors: Steve Jackson, Vince Lagana
Photographers: Sean Izzard, Petrina Hicks, Scott Newett
Producer: Kate Whitfield
Art Buyers: Olivia Wilson, Danni Simpson, Skye Houghton
CEO: Simone Bartley
Account Supervisors: Bree Lennon, Stephen Lacy, James Tracey-Inglis
Head Of Digital & Direct: Paul Worboys
Image Technology: Hyperfactory
Image Technology: Mobot
Creative Group Head, Digital: Brian Merrifield
Director: Ralph Van Dijk, Eardrum
Photographers: Tim Gibbs, David Knight, Daniel Smith

According to the technologist’s website:

Mobot has developed a powerful, scalable, and flexible patent-pending solution which relies on image recovery, pattern recognition, and image matching capability ‘in the cloud.’ Cognitive science research has shown that the human brain uses blobs to recognize objects, that is, your brain does not use sharp edges to determine that a table is a table or a face is face. Mobot applies algorithms patterned after these methods to solve the problem of mobile visual search. Mobot has built a best in class solution through a combination of invention, innovation, and tech licensing. Mobot has strong technology partnerships with leading edge companies. For example, Imagen’s technology and Evolution Robotics’ ViPR technology are components of Mobot’s visual search engine and help Mobot deliver state of the art pattern recognition.

Miraculously, the Google Patent search engine digs out what appears to be the correct patent application for the inventor “Zvi Haim Lev” based on the term “mobot”, although this made-up word appears nowhere in the text.

It’s all pretty standard computer vision processing, done with a lot of elbow grease: efforts to handle occlusion, and Gaussian filtering to cope with the low quality resulting from the JPEG crappiness of camera-phone images.

As it’s a case of unnecessary technology used unobtrusively, it’s doing no harm. In the future the real advertising applications will be to make the cameras point outwards from the poster at the people, so it can speak messages to you depending on who you are, determining your demographic status from the branded products you choose to clothe your body in, with the eventual result being the world described in the 1954 Philip K. Dick story Sales Pitch.

Meanwhile, back in the world where things need to get done, not only sold, one can wonder what this campaign was actually trying to sell. “Raising awareness” is such a vague term if the people whose awareness is raised never stand a chance of finding out what they can do. The point of these vast PR companies is to (a) move product (make you spend money on profitable crap), (b) get votes (sabotage the democratic process), and (c) manage concern (prevent people from taking effective action).

This particular campaign is attempting to achieve (c). Thanks for all the help and encouragement, folks. Maybe I should just go home and pick the marrows.

Monday, August 18th, 2008 at 11:55 am - - Cave 1 Comment »

Have been at CUCC expo 2008 in Austria for the past three weeks. Apart from the caving, the carries up to top camp, the carries back from top camp, and the efforts to find an easy surface route down to 161 entrance h (failed), I did quite a bit on Tunnel, as half the user-base was there.


Thursday, August 14th, 2008 at 5:40 pm - - Machining 3 Comments »

Here’s a bit of easy C++ code using a template class with a template member function that compiles and executes with no problems on MS VC8:

#include <iostream>
#include <vector>
using namespace std;

template<class T>
struct A
{
 T x;

 template<class IT>
 void fun(IT& it1, IT& it2)
 {
  for(IT i = it1; i != it2; ++i)
   cout << (*i + x) << endl;
 }
};

int main(int argc, char* argv[])
{
 A<int> a;
 a.x = 1;

 vector<int> b;
 b.push_back(10);

 a.fun(b.begin(), b.end());
 return 0;
}

But try this on a linux machine with gcc (I used “gcc version 4.1.2 20061115 (prerelease) (Debian 4.1.1-21)”) and you will get into compilation trouble:

g++ main.cpp -o main
main.cpp: In function ‘int main(int, char**)’:
main.cpp:27: error: no matching function for call to ‘A<int>::fun(__gnu_cxx::__normal_iterator<int*, std::vector<int, std::allocator<int> > >, __gnu_cxx::__normal_iterator<int*, std::vector<int, std::allocator<int> > >)’
main.cpp:12: note: candidates are: void A<T>::fun(IT&, IT&) [with IT = __gnu_cxx::__normal_iterator<int*, std::vector<int, std::allocator<int> > >, T = int]
make: *** [main] Error 1

So, what’s wrong?
If there’s only one candidate, why does the compiler not instantiate that one? Why is there a matching problem, and why does VC8 not spot this?

I could wait until tomorrow to tell you the solution, but I won’t leave you racking your brains…

here it is:

replace the declaration of the template member in template struct A with this:

void fun(const IT& it1, const IT& it2)

and you get a result. But why is that? You tell me!
And what do you do if you want to code a member that needs pointers, not constant references? Is there somebody out there who knows the answer?
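My guess at the explanation, for what it’s worth: b.begin() and b.end() return temporaries, a temporary cannot be bound to a non-const reference in standard C++, and VC8 accepts the binding only as a long-standing Microsoft language extension. That also suggests an answer to the second question: take the iterators by value, as the standard library algorithms do, since copying an iterator is cheap by design. A sketch of the member with that change:

 template<class IT>
 void fun(IT it1, IT it2)  // by value: binds happily to temporaries
 {
  for(IT i = it1; i != it2; ++i)
   cout << (*i + x) << endl;
 }

If you genuinely need to modify the caller’s iterators, pass pointers to them, or return the advanced iterator instead.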