
VoluMill at large

Sunday, December 9th, 2007 at 9:22 am Written by:

A couple of my colleagues found out about VoluMill last week, but didn’t pass on the news, assuming that I’d know already, seeing as I try to keep on top of these things. So I found out about it during Euromold, and sent them an email. Haven’t heard back yet, mind you. I guess I’m too unimportant. I’ve included a mention of them in the FAQ. No mention of Adaptive Clearing appears on their web-page. Either we’re totally irrelevant, or they believe people won’t find out about it if they don’t link to it. Ever heard of google? At some stage, pretending things don’t exist when they obviously do stops looking clever.

I don’t have time right now to post on their new User Forum myself, but it remains to be seen whether a discussion about Adaptive Clearing there would get cleansed faster than erectile dysfunction spam. I mean, what else can people discuss in a User Forum? Things really don’t take off in this Web 2.0 world without a critical mass. I would advise replacing it with a blog with the comments turned on, as we have here, as the only way to get a bit of constant life.

This year at Euromold people are beginning to talk for a change. We’re getting beyond the childish stage of:

“Oh, you’re a competitor. No, we haven’t heard of you. You’re far too rubbish and insignificant for us to have wasted any of our precious time thinking about you. So, no, of course we haven’t got any questions we have been wondering about, which you might know the answer to. Now go away before you find out any answers to your questions which might give you a competitive advantage. Really, we do not care at all if the entire CAM software industry completely stagnates as a result of no one talking.”

I have noticed that the VoluMill story has an interesting relation to the software patent campaign that was ongoing in the EU a couple of years ago, where I testified as a programmer about how software patenting works to the utter detriment of everything to do with software development, citing as my example the Surfware TrueMill patent application.

I remember commenting that patents work totally against the interests of the programmers, even within the company that takes out the patent, because the patent prevents them from applying, or threatening to apply, their relevant skills under other employment conditions. Under no circumstances would a programmer ever cooperate with a software patent application if they understood that it meant selling out their own future.

The name “Glenn Coleman”, which shows up on the 2005 Surfware TrueMill patent application, now appears as the “Chief Product Officer” of “Celerative Technologies”, the company that has developed VoluMill. Obviously, the website claims it’s a brand-new strategy, but it looks similar enough at a glance for a lawyer to cause a great deal of aggravation, should he be paid to do so.

Doubtless, the Celerative folks have taken this into account, having spent long enough in the business to know that nobody in the CAM industry is in the habit of doing anything with software patents… yet. We’re all good guys, in that respect. Unfortunately, that doesn’t take account of how these things work. There’s nothing to stop some totally evil patent troll sensing an opportunity and dropping by the Surfware offices in Westlake Village with cash and a promise to help them get their own back. The troll buys the patent and then goes back to his air-conditioned offices, where nothing of any good is produced, and starts suing the ass off Celerative until they settle for some sizeable chunk of their investment budget, in a process that is practically indistinguishable from racketeering.

Meanwhile, I’ve got a lot of other stuff to do (having now entirely broken the scallop strategy in the machining kernel in a process known as “break to fix”). At some point in the future I’ll compile a detailed review of the VoluMill strategy, as well as a few notes about the areas of the business plan that are going to be hard.

6 Comments

  • 1. Roberto replies at 9th January 2008, 9:30 pm :

    First of all, I’d like to compliment this space for discussion. For a long time, people involved in a specific field have needed a place to exchange views and evolve.

    IMHO VoluMill has the right approach in proposing to add a CAM strategy to existing environments simply via a “plugin”, delegating the number crunching to an Internet server.

    This ASP paradigm has the advantage of always delivering the newest algorithms (leaving the user the choice of using previous ones for repeatability of results), while avoiding the cracking of applications and/or the cost of a hardware security dongle.
    If more and more actors on the CAM scene adopt this strategy, the competition between them will bring their monthly fees to the correct price.

    Conversely, it is wrong not to engage with the competitors, hoping that the customers don’t know about the search engines on the Web…

    Let’s go on with innovation!

  • 2. Julian replies at 10th January 2008, 11:01 am :

    Anyone who has used Debian/Linux will know how seamlessly it is possible to deliver and roll back general software installations.

    VoluMill requires a client to be installed in your CAM system. This client sends and receives geometric data remotely each time the strategy is used, when it could alternatively manage the download and local use of a single DLL far less frequently.

    Using a local DLL would give faster results for short calculations, and be more reliable at every stage because it doesn’t require a live and unbroken internet connection. I am expecting the size of the DLL not to greatly exceed the size of one or two jobs.
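
    To make the comparison concrete, here is a minimal sketch in Python of the two arrangements. It is only an illustration under assumptions of my own: the server URL, the DLL location and its calculate_toolpath entry point are invented, not anything VoluMill actually exposes. The point is simply that the cached-DLL route touches the network once per version, while the remote route needs a live connection for every job.

        # Minimal sketch (hypothetical names throughout, not VoluMill's interface):
        # (a) ship the geometry to a remote server for every calculation, versus
        # (b) download a versioned DLL once, cache it, and calculate locally.
        import ctypes
        import json
        import os
        import urllib.request

        SERVER = "https://example.com/toolpath"       # hypothetical calculation server
        DLL_URL = "https://example.com/strategy.dll"  # hypothetical downloadable engine
        DLL_PATH = "strategy_cache.dll"               # local cache of that engine

        def remote_toolpath(geometry):
            """Arrangement (a): send the job's geometry over the wire every time."""
            payload = json.dumps(geometry).encode("utf-8")
            req = urllib.request.Request(SERVER, data=payload,
                                         headers={"Content-Type": "application/json"})
            with urllib.request.urlopen(req) as resp:  # needs a live connection per job
                return json.loads(resp.read())

        def local_toolpath(geometry):
            """Arrangement (b): fetch the engine once, then calculate offline."""
            if not os.path.exists(DLL_PATH):           # one download, roughly the size
                urllib.request.urlretrieve(DLL_URL, DLL_PATH)  # of a job or two
            engine = ctypes.CDLL(DLL_PATH)             # load the cached engine
            points = json.dumps(geometry).encode("utf-8")
            return engine.calculate_toolpath(points, len(points))  # hypothetical export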

    In my opinion, the paradigm can only be justified by the power of distributed computing where expensive calculations are farmed out to other machines that can operate in parallel and get a result sooner than on the local machine. However, given that most processors are idling at any one time, it would rarely need to call on machines outside the office.

    The paradigm cannot be justified according to a pricing model, because that’s like admitting that it’s inefficient, in the way, and that the need for it is artificial. Hence the existence of arguments that don’t make any sense. Unless the computational experience is improved, there is no advantage to the user without fabricating barriers, like these hardware security dongles, i.e. unnecessary conditions on the use of the software that are nothing more than an inconvenience to the user.

    The argument about always delivering “the newest algorithms” is, in my view, the most curious justification. In my experience, new CAM algorithms are rarer than new computers (which obviously require a full software installation), and when they do arise they must be supported by a whole new interface containing all their new parameters, etc. New algorithms that just slot in place over an old algorithm barely exist.

    If by “newest algorithms” you actually mean “implementations with fewer bugs in them”, then that’s a whole different matter, because that’s like explicitly expecting users to be dissatisfied with the results almost all the time. The only reason they put faith in the “newest algorithm” is that they haven’t tried it yet, and they are hopeful. We’ve all been there. And over the years we learn better: new is normally worse. We count on other people to suffer the instability and sort out the bugs. Only then is it worth most people’s time.

  • 3. Roberto replies at 12th January 2008, 7:33 pm :

    I’m very glad to read Julian’s opinions, and I agree with all his thoughts, which derive from his deep knowledge of the CAM field.
    In fact I was just defending VoluMill’s choices because my feeling is that they are in some way innovative (referring to their deployment/business scheme).
    At present the Internet is neither available 24/7 nor speedy enough, but in the near future, when optical fibre connections become common, that could change. In the past even the steady availability of the power grid wasn’t so assured…
    By the way, I think that software protection and performance improvements could be obtained, in the meantime, in another way, for instance by developing CAM systems that delegate the number crunching to efficient coprocessor boards plugged into PCs. This way the CAM company could sell the coprocessor hardware with its software bundled, and only someone with one of these electronic boards could run the software application. Then you would be paying for a useful bit of hardware rather than hassling with an expensive protection dongle that doesn’t speed up computation in any way.
    I have to agree that a networked solution is not, at the moment, going to leave customers completely satisfied.
    As a user fascinated by the Internet, my dream is to have in the future a CAD/CAM environment in the form of Free/Libre software without any built-in CAM strategy (different clients/GUIs could be developed on SourceForge), which would then pick up computation strategies and computation power from the Web.

    This dream could lead to a different business model, with CAM actors selling algorithms, calculation power and secure repositories of user data, with backups on their server farms.

    However, for the near future, all of Julian’s objections are soundly based.

    Nice to have a part in this kind of discussion.

    I have a dream …

  • 4. Freesteel… replies at 22nd August 2008, 9:39 pm :

    […] gives them about 6 months to write their new algorithm and release it. I observed it in December 2007 while at the Euromold trade show. It uses the neat but flawed idea of hosting the algorithm on […]

  • 5. 5axes replies at 21st September 2011, 10:28 pm :

    Hello,

    I have one question concerning VoluMill. On the VoluMill web site, the company has announced the end of its cooperation with Mastercam:
    “Despite all of our attempts to continue our relationship with CNC Software, we have been unsuccessful.”

    http://www.volumill.com/downloads/mastercam

    So who develops the 2D HSM cycles for Mastercam? Do they do it themselves? In the X5 release, 2 new cycles have been updated; are they VoluMill toolpaths?

    I really appreciate your blog. If you like new ideas, you should also have a look at the latest VoluMill white paper: http://www.volumill.com/document/actc-white-paper-active-chip-thickness-control-non-concentric-arc-milling

  • 6. Julian replies at 11th October 2011, 12:04 pm :

    All very intriguing.

    I understand their Dynamic Milling algorithm (equivalent of the Adaptive Clearing) is developed by their programmers in-house.

    The 3-axis high speed machining algorithms were bought in (probably as source code) from NCGraphics Machining Strategist sometime between 2005 and 2007. This is disclosed on page 12 of this document.
    http://www.ptc.com/company/ncgraphics/briefing.pdf

    It would help if the industry was more transparent about the source and development of the algorithms, which is only going to happen when the customers begin to take an interest in these details.
