
Abolishing the differential barometer 200ms autocorrelation

Friday, March 3rd, 2017 at 11:04 am

To be clear, I haven’t got mathematical proofs here (I don’t have the time), but the experimental evidence is quick to get.

Take the differential barometer sensor (used to measure airspeed) of the hang-glider flight logger. The Arduino code which updates the reading every 200ms looks like this:

long lastpx4timestamp;
void Flylogger::FetchPX4pitot()
{
    long mstamp = millis();
    if (mstamp >= lastpx4timestamp + 200) {   // poll the pitot sensor at most every 200ms
        px4pitot->readpitot();
        sdlogger->logpitot(px4timestampset, px4pitot->rawpressure, px4pitot->rawtemp);
        lastpx4timestamp = mstamp;
    }
}

Why did I choose 200 milliseconds? It simply sounded like a reasonable number, and this is a quick way of programming a regular reading.

A better way is to synchronize with the divided-down clock, rather than simply adding 200ms onto the time of the previous reading, like so:

int mstampdivider = 20;        // reading interval in milliseconds
long prevmstampdivided = 0;
void loop()
{
    long mstampdivided = millis()/mstampdivider;
    if (mstampdivided != prevmstampdivided) {   // fires once per tick of the divided clock, ie every 20ms
        prevmstampdivided = mstampdivided;
        P(micros());  P(" ");  P(singlereading());  P("\n");
    }
}

That code now reads every 20ms rather than every 200ms, and it prints a load of output which I can cut and paste into a file and read into pandas, like so:

import pandas
rows = [ (int(s[0]), int(s[1]))  for s in (ln.split()  for ln in open("../logfiles/dmprapidtest.txt").readlines())  if len(s) == 2 ]
k = pandas.DataFrame.from_records(rows, columns=["t", "d"])

And then we can plot the autocorrelation (the covariance of the series with itself shifted in time), like so:

d = k.d   # just the measurement Series
dm = d.mean()
ss = [((d - dm)*(d.shift(i) - dm)).mean()  for i in range(400)]
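
For reference, a minimal plotting sketch, assuming matplotlib is available (each lag i corresponds to a shift of i*20ms):

import matplotlib.pyplot as plt      # assumed available; not part of the logging code
plt.plot(ss)
plt.xlabel("lag (number of 20ms readings)")
plt.ylabel("autocovariance")
plt.show()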

[Image autocov1: autocovariance of the 20ms readings over 400 lags]

Let’s zoom in on the first 50 covariances:

[Image autocov2: zoom on the first 50 covariances]

We can forgive the first 5 covariances due to filters and capacitors in the system, but what the heck is that bump which recurs every 20 lags, ie every 20*20=400ms?

It seems that the signal really is there, because when I set the stepsize to 15ms, then the autocovariance becomes:

[Image autocov15: autocovariance with a 15ms step size]

and roughly puts the peaks at 29*15=435ms intervals.

For comparison, let's show what this looks like with proper white noise truncated to integer values, with the same mean and standard deviation:

import math, random                  # used for the fake series here and below
ds = d.std()
dR = pandas.Series([ int(random.gauss(dm, ds)+0.5)  for i in range(20000) ])
ssR = [ ((dR - dm)*(dR.shift(i) - dm)).mean()  for i in range(400) ]

[Image autocov20: autocovariance of the white noise series]

With random numbers there is no autocorrelation for any shift i != 0. (The unshifted value at i=0 is obviously the variance of the series, ie the standard deviation squared.)
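
As a quick sanity check of that last point (assuming the series above are still in scope), the zero-lag value should come out near the variance:

# zero-lag autocovariance against the standard deviation squared; they agree
# only approximately, because ssR was centred on dm rather than on dR.mean()
print(ssR[0], dR.std()**2)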

After some experimenting, I was able to create a fake signal which matched the real data under this autocovariance transform, by adding a squared half sine wave:

int(random.gauss(dm, ds) + max(0, math.sin(math.pi*i/10))**2 + 0.5)

[Image autocov20s: autocovariance of white noise plus the squared half sine wave]

This is a pretty good match. If true, it indicates that there is some sort of 5Hz oscillation, leaking through a diode, that interferes with the sensor at the threshold of its precision.

It also suggests that sampling the sensor at a 200ms interval is about the worst thing I can do: my readings will resonate with this factor, slipping from being in sync to being out of sync with it (probably over a time frame of about a minute), which makes the sensor impossible to calibrate, ie it will need different scalings at different times to work properly.

In fact, here's what it looks like with fake data sampled just out of sync with the oscillation:

ts = [ 197*i  for i in range(20000) ]    # sample times in ms, just off the 200ms step
dR = pandas.Series([ int(random.gauss(dm, ds) + 0.5 + max(0, math.sin(math.pi*t/200))**2)  for t in ts ])
# then recompute the autocovariance ssR from dR as before

[Image autocov30: autocovariance of fake data sampled at 197ms intervals]

And here's the same with the real data (the signal is not very big, and it doesn't persist beyond 3 seconds).

[Image autocov35: the same autocovariance on the real data]

But just because you can't see the signal doesn't mean it's not there: any irregularity in the timer over this long time period (it takes much longer to collect the same number of samples at 200ms than at 20ms) is going to hide that wave signal.
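
A rough sketch on fake data of what that timer irregularity does; the 10ms of wobble per reading is a deliberately generous assumed figure, chosen purely for illustration. The accumulated timing error scrambles the phase more and more at longer lags, which flattens out the slow wave.

import math, random, pandas
tsj, t = [], 0.0
for i in range(20000):
    tsj.append(t)
    t += 197 + random.gauss(0, 10)    # nominal 197ms step plus an assumed 10ms of wobble
dJ = pandas.Series([ int(random.gauss(dm, ds) + 0.5 + max(0, math.sin(math.pi*t/200))**2)  for t in tsj ])
ssJ = [ ((dJ - dJ.mean())*(dJ.shift(i) - dJ.mean())).mean()  for i in range(400) ]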

How to fix this?

There is no way to compensate for this noise except on a long time series when the sensor is statically at rest (so you know the value it should be varying about).

Experiments with fake data show that it is impossible to fix this data by randomizing the timing of the readings with a uniform offset within the reading cycle, or by averaging a set of readings, unless this is done over a timescale greater than 400ms so as to cover the full waveform.
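
Here's the kind of fake-data experiment meant here; the block sizes of 8 and 20 readings (160ms and 400ms of averaging) are my choice for illustration:

# Fake 20ms readings carrying the 400ms waveform, averaged in blocks of n
# consecutive readings; only when n*20ms covers the full 400ms does the bump
# disappear from the autocovariance of the block means.
import math, random, pandas
raw = [ random.gauss(dm, ds) + max(0, math.sin(math.pi*i/10))**2  for i in range(20000) ]
def blockmeans(vals, n):
    return pandas.Series([ sum(vals[i:i+n])/n  for i in range(0, len(vals)-n+1, n) ])
d8, d20 = blockmeans(raw, 8), blockmeans(raw, 20)    # 160ms and 400ms averaging windows
ss8 = [ ((d8 - d8.mean())*(d8.shift(i) - d8.mean())).mean()  for i in range(50) ]
ss20 = [ ((d20 - d20.mean())*(d20.shift(i) - d20.mean())).mean()  for i in range(50) ]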

But the oscillation isn’t perfectly regular, so in real data sampled over a longer time-scale with a bit of noise introduced to the offsets, the signal might disappear.

To cut a long story short, I get good results by measuring at 25ms intervals, scaling down by the ((x/3)*2+(x%3)) factor, and then averaging over 16 samples, which brings the standard deviation down from 1.5 to 0.5, below the precision of the device. The autocorrelation at the resting state disappears.

[Image autocov50: autocovariance after 25ms sampling and 16-sample averaging]
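
For the record, a rough Python sketch of that recipe (raw25 here is a hypothetical list of raw readings taken 25ms apart; the scaling is the one quoted above, written with // for C-style integer division):

import pandas

def scaled(x):
    return (x//3)*2 + (x%3)              # the ((x/3)*2+(x%3)) scaling, in integer arithmetic

def averaged(raw25):
    # each averaged value spans 16*25ms = 400ms, the full length of the offending waveform
    s = [ scaled(x) for x in raw25 ]
    return pandas.Series([ sum(s[i:i+16])/16.0  for i in range(0, len(s)-15, 16) ])

Averaging 16 readings divides the noise standard deviation by roughly sqrt(16) = 4, which is the right order of magnitude for the drop from 1.5 to 0.5 reported above.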

Now I’ve got to do the same deal with the barometer, as well as fix up a synchronization with the spinning anemometer.

