Freesteel Blog » Flightlogger

Monday, August 12th, 2019 at 3:58 pm - - Flightlogger, Hang-glide

I thought I had published this long form article in the blog until I looked for it. Turned out I’d accidentally left it on github here. Text is below the fold.


Saturday, July 27th, 2019 at 8:20 pm - - Flightlogger

I’ve been trying to use the BNO055 for hang-glider experiments for a while. My current serial micropython interface is here or here. The sensor contains its own dedicated microcontroller that continually reads the gyros, accelerometers and magnetometers at a high frequency, fuses them, and provides orientation in the form of a quaternion, and acceleration separated into its kinetic and gravity components. (I’d prefer a version where you set the device working and it streamed the measurements down the wire, instead of needing to be polled every 100ms.)
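
For flavour, polling it over I2C takes only a few lines of micropython. This is a minimal sketch based on the datasheet register map, not my actual interface linked above (the I2C pins are board-specific guesses):

from machine import I2C, Pin
import struct, time

i2c = I2C(0, scl=Pin(22), sda=Pin(21))     # ESP32 pins; adjust to the board
BNO055 = 0x28                              # the chip's default I2C address

i2c.writeto_mem(BNO055, 0x3D, b'\x0c')     # OPR_MODE register = NDOF fusion mode
time.sleep_ms(20)

while True:
    # quaternion w,x,y,z as int16s scaled by 2^14, from registers 0x20..0x27
    w, x, y, z = struct.unpack("<4h", i2c.readfrom_mem(BNO055, 0x20, 8))
    print(w/16384, x/16384, y/16384, z/16384)
    time.sleep_ms(100)                     # the 100ms polling mentioned above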

After years of not really having a clue, I’ve got far enough to be able to make a movie with these frames from an overhead unit attached to the keel of the glider, with the intention of measuring the control inputs (ie the pilot’s hang position, which determines his weight shift) in relation to the aerodynamic response.

There are two objectives of this work — other than the byproduct of learning a whole load about sensors that ought to make me useful for something.

Firstly, we’d want to quantify the hidden variables of a glider (eg glide angle, control responsiveness, etc) in order to better compare between them.

Secondly, I want to quantify pilot behaviour and rig this up on a flight by a top pilot who always seems to do well, and find out what they’re doing that’s different from what I’m doing in order to enable some form of effective coaching that would save me a lot of time. (Generally the top pilots don’t know what they’re doing as they merely report doing it by feel, and that feel very luckily for them happens to coincide with doing it right.)

However, I didn’t really trust these accelerometer readings after I watched this. It seemed like it was sticky. That is, its mathematical filters sometimes hold one value until it is no longer valid, and then swing wildly to another value.

GPS readings sometimes do this and have nasty discontinuities. This is going to happen when there are bimodal probability distributions of the error, where there are two likely interpretations of the position from the same measurements.

Meanwhile, I was looking at namespaces in OpenCV and the function findChessboardCorners() caught my eye. I wondered what that was for. Turns out it’s used for camera calibration — finding lens distortion and focal length.
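
The standard recipe photographs a chessboard from many angles and lets OpenCV solve for the camera matrix and distortion coefficients. A minimal sketch of that recipe using the documented cv2 calls (the board size and filenames here are invented):

import cv2, numpy, glob

# 3D positions of the inner corners of a 9x6 chessboard, one square = 1 unit
objp = numpy.zeros((9*6, 3), numpy.float32)
objp[:, :2] = numpy.mgrid[0:9, 0:6].T.reshape(-1, 2)

objpoints, imgpoints = [], []
for fname in glob.glob("calib_*.jpg"):     # hypothetical calibration photos
    gray = cv2.cvtColor(cv2.imread(fname), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, (9, 6))
    if found:
        objpoints.append(objp)
        imgpoints.append(corners)

retval, cameraMatrix, distCoeffs, rvecs, tvecs = cv2.calibrateCamera(
    objpoints, imgpoints, gray.shape[::-1], None, None)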

Then I found out about ArUco Markers and tried to use them for measuring the pilot’s position (see above).

That didn’t work on half the frames because the light catches it badly.

This led onto the amazing all-in-one charuco board technology, which has an aruco tag in each white square of a chess board so that nothing can possibly get confused.
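
Generating one of these boards is a few lines with the same module (a sketch using the pre-4.7 cv2.aruco API; the board dimensions and output size are arbitrary):

import cv2

aruco_dict = cv2.aruco.Dictionary_get(cv2.aruco.DICT_4X4_50)
# 5x7 chess squares of 40mm with 30mm aruco markers set into the white squares
charboard = cv2.aruco.CharucoBoard_create(5, 7, 0.04, 0.03, aruco_dict)
cv2.imwrite("charucoboard.png", charboard.draw((600, 840)))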

This is great, because it gives an external means of verifying the absolute orientation sensor. If I can get the numbers to agree, then I’ll have proved I’ve understood all the orientation decoding and will know the error bounds.

My code is in this Jupyter notebook, though the bulk has been moved into the videos module of the hacktrack library for safekeeping.

Here’s the procedure, with the unit up top under my thumb containing the orientation sensor, the video camera embedded in the main unit (with the orange light visible to show that it’s working) looking down past the randomly flashing LED light.

And this is what it looks like from the camera’s point of view:

The phone is running a very crude android app I wrote called Hanglog3 for receiving the orientation sensor data via wifi over a socket from the ESP32 attached to the sensor and storing it in the phone’s copious memory. This means I don’t need a mini-SD card writer wired to the microcontroller, which causes it to stall for up to 80ms whenever the data is flushed to it.

What’s that LED light doing in the view?

That’s used to synchronize the logged orientation data with the frames from the video. I’ve made an interactive function called frameselectinteractive() that lets you slide a box around the LED in the image, like so:

Then the function extractledflashframes() measures the mean red, green and blue values in the box for each frame so you can see that there is a clear enough signal.
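
At its core that function is just averaging the pixels inside the box on every frame. Roughly like this (my reconstruction rather than the hacktrack code; box is a hypothetical (x0, y0, x1, y1) pixel rectangle):

import cv2, pandas

def ledboxbrightness(videofile, box):
    # box = (x0, y0, x1, y1) pixel rectangle drawn around the LED
    cap = cv2.VideoCapture(videofile)
    rows = []
    while True:
        ret, frame = cap.read()
        if not ret:
            break
        x0, y0, x1, y1 = box
        b, g, r = frame[y0:y1, x0:x1].mean(axis=(0, 1))   # OpenCV frames are BGR
        rows.append({"r":r, "g":g, "b":b})
    cap.release()
    return pandas.DataFrame(rows)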

In this case, a threshold of 200 in the red channel picks out a clear enough signal, which can be converted to a boolean on or off and aligned with the timestamped LED on and off commands in the flight data file that also carries the orientation sensor data. I’ve used the Dust measurement records for this to save reprogramming anything, as the Dust sensor no longer exists (it was a ridiculous thing to carry around on a hang-glider anyway; what was I thinking?).

# boolean per video frame: LED counts as on when the red channel exceeds 200
videoledonvalues = ledbrights.r>200
ledswitchtimes = (fd.pU.Dust==1)  # one timestamped record for every on and off of the LED
# align the two sequences to get a timestamp for every video frame
frametimes = videos.framestotime(videoledonvalues, ledswitchtimes)

Since the LED flashes at random intervals, there can be only one way to align them, which allows me to assign a timestamp to each video frame, and consequently to any information derived from that video frame, such as camera orientation.
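
The alignment itself amounts to a correlation search: slide one boolean sequence along the other and keep the offset where they agree best, the randomness of the flashing guaranteeing a single sharp peak. A sketch of the idea (not the actual framestotime() implementation; it assumes the logger’s LED state has been resampled at the video frame rate):

import numpy

def bestalignment(videoledon, loggerledon):
    # count agreements at every candidate shift of the video within the log
    scores = [numpy.sum(videoledon == loggerledon[shift:shift+len(videoledon)])
              for shift in range(len(loggerledon) - len(videoledon))]
    return int(numpy.argmax(scores))  # logger sample index of video frame 0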

The function which extracts camera orientation from the video frame is findtiltfromvideoframes().

The code which does this from an image frame works like this:

# extract the DICT_4X4_50 aruco markers from the image
markerCorners, markerIds, rejectedMarkers = cv2.aruco.detectMarkers(
    frame, aruco_dict, parameters=parameters,
    cameraMatrix=cameraMatrix, distCoeff=distCoeffs)

# try harder to match some of the failed markers with the knowledge of
# where they lie in this particular charucoboard
markerCorners, markerIds, rejectedMarkers, recoveredIds = \
    cv2.aruco.refineDetectedMarkers(frame, charboard, markerCorners, markerIds,
                                    rejectedMarkers, cameraMatrix, distCoeffs)

# derive accurate 2D corners of the chessboard from the marker positions we have
retval, charucoCorners, charucoIds = cv2.aruco.interpolateCornersCharuco(
    markerCorners, markerIds, frame, charboard,
    cameraMatrix=cameraMatrix, distCoeffs=distCoeffs)

# Calculate the pose of the charuco board relative to the camera as a
# Rodrigues rotation vector (rvec) and translation vector (tvec)
retval, rvec, tvec = cv2.aruco.estimatePoseCharucoBoard(
    charucoCorners, charucoIds, charboard, cameraMatrix, distCoeffs, None, None)

# The third row of the rotation matrix from rvec gives the direction of the
# vertical Z-axis kingpost in camera coordinates
r = cv2.Rodrigues(rvec)[0][2]
row = {"framenum":framenum, "tx":tvec[0][0], "ty":tvec[1][0], "tz":tvec[2][0],
       "rx":r[0], "ry":r[1], "rz":r[2]}

Notice that this cannot work without accurate values for the cameraMatrix and distCoeffs (distortion coefficients), which in the case of this camera have been calculated as:

cameraMatrix = numpy.array([[1.01048336e+03, 0.00000000e+00, 9.46630412e+02],
                            [0.00000000e+00, 1.01945395e+03, 5.71135893e+02],
                            [0.00000000e+00, 0.00000000e+00, 1.00000000e+00]])
distCoeffs = numpy.array([[-0.31967893,  0.13367133, -0.00175612,  0.00153122, -0.03052692]])

These numbers are not fully reproducible. However, they do make the picture look okay in the undistortion preview, where straight lines look straight. Maybe there are too many degrees of freedom in the solution.
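
The preview itself is a single call, given those numbers and a frame loaded as in the detection code:

# straight edges in the scene should come out straight if the numbers are good
undistorted = cv2.undistort(frame, cameraMatrix, distCoeffs)
cv2.imwrite("undistorted_preview.png", undistorted)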

Now the hard part. The camera and the orientation sensor are not precisely aligned.

First the vertical axis of the orientation sensor, whose values are given as quaternions (pZ.q0, pZ.q1, pZ.q2, pZ.q3), needs to be extracted to provide a tilt-vector (of the Z-axis):

# iqsq is the inverse of the squared quaternion norm, 1/(q0²+q1²+q2²+q3²),
# so these are elements of the normalized rotation matrix
r00 = pZ.q0*pZ.q0*2 * pZ.iqsq
r33 = pZ.q3*pZ.q3*2 * pZ.iqsq
r01 = pZ.q0*pZ.q1*2 * pZ.iqsq
r02 = pZ.q0*pZ.q2*2 * pZ.iqsq
r13 = pZ.q1*pZ.q3*2 * pZ.iqsq
r23 = pZ.q2*pZ.q3*2 * pZ.iqsq
# third column of the rotation matrix: the tilt of the Z-axis
pZ["tiltx"] = r13 + r02
pZ["tilty"] = r23 - r01
pZ["tiltz"] = r00 - 1 + r33

Then the (rx, ry, rz) vectors from the video camera images need aligning to the (tiltx, tilty, tiltz) vectors from this orientation sensor.

I have no idea how I cracked this one, but it went a bit like this:

# kingpost vertical vectors from camera
rx, ry, rz = tiltv.rx[t0:t1], tiltv.ry[t0:t1], tiltv.rz[t0:t1]

# Interpolated orientation tilt vector (so the timestamps are the same) 
ax = utils.InterpT(rx, lpZ.tiltx)
ay = utils.InterpT(ry, lpZ.tilty)
az = utils.InterpT(rz, lpZ.tiltz)

# Find the rotation between these two sets of points using SVD technology
# a[xyz] * r[xyz]^T
H = numpy.array([[sum(ax*rx), sum(ax*ry), sum(ax*rz)], 
                 [sum(ay*rx), sum(ay*ry), sum(ay*rz)], 
                 [sum(az*rx), sum(az*ry), sum(az*rz)]])
U, S, Vt = numpy.linalg.svd(H)
R = numpy.matmul(U, Vt)

print("Rotations in XYZ come to", numpy.degrees(cv2.Rodrigues(R)[0].reshape(3)), "degrees")


# Apply the rotations to the camera orientation
rrx, rry, rrz = \
(R[0][0]*rx + R[0][1]*ry + R[0][2]*rz, 
 R[1][0]*rx + R[1][1]*ry + R[1][2]*rz,
 R[2][0]*rx + R[2][1]*ry + R[2][2]*rz)
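
One caveat not in the notebook: the SVD can hand back a reflection rather than a rotation, so the standard Kabsch formulation checks the determinant and flips a row of Vt when it comes out negative:

# guard against R being a reflection (det = -1) rather than a rotation
if numpy.linalg.det(R) < 0:
    Vt[-1] = -Vt[-1]
    R = numpy.matmul(U, Vt)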

In this case, the rotations in XYZ came to [ 0.52880368 0.96020647 -80.29792978] degrees.

Thus, since I am now thoroughly running out of time, here is the comparison in XY of the unit vectors:

And this is how it looks for the individual components:

I am totally going to forget how any of this works when I get back. It is this hard to validate an orientation sensor against a video image, it seems.

Next is to make a charuco board with a flashing light in it with its own orientation sensor and pin it to the back of the pilot’s harness. Then maybe I’ll be measuring the glider relative to the pilot rather than the other way round.

All of this is so hard, and the main objectives haven’t even begun. Why is it so hard to get anywhere?

Wednesday, May 29th, 2019 at 11:57 am - - Flightlogger, Hang-glide

I think I’ve stopped blogging about ongoing projects that are not working. A long-running one that I have failed to report here is this dabbling with the RTK GPS system, which I learnt about by researching precision agriculture, having been tipped off about it by a guy from sixty-5 when I was working out of Farset Labs in Belfast earlier this year.

Anyway, in theory one can log the raw data from these ublox M8T GPS chips, use the open source RTKLIB software to process the rover GPS against a base station GPS to get a 2cm-accurate time series (with a lot of help from the rtklibexplorer blog), and then plan to put one of these rover stations in each wingtip of a glider.

And this would have all been fine if one of the wingtip ESP32 devices that receives and transmits the UBX data from the GPS to my phone through wifi didn’t keep failing. I finally found out what it was: the tiny sheet metal antenna had snapped off so cleanly that you couldn’t tell it was missing.

Here is a picture of my three devices. The 2 rovers go into pouches with their own batteries and get tied into the wingtips.

Anyway, it was a rubbish and rough flight that I did last Tuesday, never getting higher than 2600 feet. Meanwhile, Becka was doing her Welsh 3000s walk across 15 peaks, all of which were higher than I managed to fly, and got a photo of this Brocken spectre on the peak of Snowdon at 8am, having set off at 5am from the car.

I was tasked with being a few kilometres further down the road to provide the second breakfast and some sandwiches for her further journey.

No, I wasn’t going to do that walk, after my experience with the Lakeland 3000s. Walking too far in one day is annoying, especially when you are constantly being told you’re not going fast enough.

The logical consequence of having more strength and always wanting to do more than anyone else is… that other people will want to do less, and this is going to be a disappointment.

So I went flying, and RTKLIB processed my one working GPS track, like so:


(Blue is the phone GPS and orange is the RTK GPS.)

I was going to show some correspondences between the RTK GPS altitude and the barometric altitude when suitably filtered, but my interactive plotting system broke down. There are a lot of oscillations in the GPS, which I don’t understand. Will get back to it.

Sunday, September 16th, 2018 at 2:33 pm - - Flightlogger

Don’t know why, but I’ve got lazy with the blog. Probably due to use of twitter/goatchurch for quickly reporting things, which then lets me get on with what I’m doing.

Also, code gets done at github/goatchurchprime and github/Future-Hangglider.

Then I was in France and, after a couple of crashes on takeoff which took out my higher performance glider during a training week, I won the sports class on my old Sport2 that someone had driven out from Liverpool at short notice.

There continues to be a good deal of wholly ineffective data-logging from hang-gliders and stuff like that. I’ve been trying to learn to do something with FreeCAD, Blender, and Dales. It seems impossible to catch up, even if you don’t do anything new. And you’ve got to catch up or you can’t do anything new.

I still feel good watching this video Becka took in France during the competition being talked off the hill by the lovely JB on what became my best flight of the year. The eventual destination was beyond the hills on the horizon to my left. You can see my mind focussing.

Tuesday, July 3rd, 2018 at 7:50 pm - - Flightlogger, Hang-glide

We had a go, where I rigged my U2 hang-glider in the front garden with the VG full on to make it rigid, and then stood it on its nose so that JR could take lots of nice high-definition photos of it from a variety of angles with a proper camera with a big lens.

The Agisoft Photoscan thing initially got it right, with a good looking 3D image:

But then I started doing things with the point scan — in particular finding its symmetry so as to compare the left wing with the right wing.

The code is here.

Basically, I loaded the 9,653,216 points from the csv file with this one Python command:

k = pandas.read_csv("hg1a1b.txt", sep=" ", names=["x","y","z","r","g","b","nx","ny","nz"])

And then worked out that I could perform vector calculations on the columns of coordinates, like this:

# Reflect about the plane through x=2 parallel to the YZ plane
mv = pandas.Series({"x":2, "y":0, "z":0})
mvsq = sum(mv**2) # (scalar)
mvfac = (k.x*mv.x + k.y*mv.y + k.z*mv.z)*2/mvsq - 2  # 9million value column
kmirr = pandas.DataFrame({"x":k.x-mv.x*mvfac, "y":k.y-mv.y*mvfac, "z":k.z-mv.z*mvfac})

The alternative, more memory-efficient calculation method, performed row by row, runs many, many times slower:

kmirr = k.apply(lambda R:R[["x","y","z"]] - mv*((R.x*mv.x+R.y*mv.y+R.z*mv.z)*2/mvsq - 2), axis=1)
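
To put a rough number on “many, many times slower”, here is a quick benchmark sketch on random points (my own measurement setup, not from the notebook):

import time
import numpy, pandas

k = pandas.DataFrame(numpy.random.rand(1000000, 3), columns=["x","y","z"])
mv = pandas.Series({"x":2, "y":0, "z":0})
mvsq = sum(mv**2)

t0 = time.time()
mvfac = (k.x*mv.x + k.y*mv.y + k.z*mv.z)*2/mvsq - 2
kmirr = pandas.DataFrame({"x":k.x-mv.x*mvfac, "y":k.y-mv.y*mvfac, "z":k.z-mv.z*mvfac})
print("column-wise over 1m points:", time.time()-t0, "seconds")

t0 = time.time()
kmirr2 = k.head(10000).apply(
    lambda R: R - mv*((R.x*mv.x+R.y*mv.y+R.z*mv.z)*2/mvsq - 2), axis=1)
print("row-wise over just 10k points:", time.time()-t0, "seconds")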

There’s something curious about this column mathematics and how it applies to computational geometry.

In any case, I have produced an animation melting through from one wing tip to the other, like so:

It seems that one wing is much fatter in depth than the other.

I think this is a photogrammetry error in its understanding of how far apart to put both sides of the wing. The gap at the leading edge on the fatter wing gives it away.

As is my observation in freeform CAD/CAM: you can get away with a lot of deviation from the required surface because no one can tell when it’s wrong. They can measure the flatness of the square edges, but errors in the middle of the freeform surface (so long as they are smooth) pass without notice. I suspect a lot of photogrammetry works on that principle. It’s only when we scanned something with two sides that were supposed to be symmetrical that I could tell there was a big problem.

(To be fair, the Agisoft failed when we reran it to get a better fit.)

Well, so much for that. I had hoped I’d have something good enough to trace up and enter into XFLR5 as a series of contours, but it’s not quite there.

However, I should just make up a series of contours based on this anyway (since it has things like the washout/twist approaching the wingtips) so that when we get good data (eg from a laser scanner) we are all ready for it.

Thursday, April 12th, 2018 at 4:29 pm - - Flightlogger, Hang-glide

Okay, so that last flying day at Meduno wasn’t very adventurous on the scale of the top pilots, but I was extremely pleased with it; I did just as well as anyone else in our xtc-paragliding (hang-gliding week) group and felt perfectly up with it.

Often you come down disappointed, and can watch everyone else from the landing field going higher and further and having more fun, and you’re down wholly because of your lack of skill and competence. But this wasn’t one of those days.


Here is the page of everyone’s tracklogs.

I was particularly happy with the part of the flight where I maintained my altitude over the flat lands at about 700m for 11 minutes before finally the air currents strengthened enough to carry me up. I had a sense of calm and flow rather than panic and disappointment this time.


It doesn’t look particularly low in the picture, but it felt like it.

I thought it was rising air from a pig farm I could see below and towards the dry river bed (because it smelled as such), but it couldn’t be, as it was about 700m cross wind. I consistently had the wrong idea of the wind direction. It shows that even with totally mistaken ideas, I was still able to stay with the weakly rising air.

At one point I passed high over a rifle range. The pops of the guns were like tap-taps on my breastbone.

I overflew the takeoff at the end of the day and took a photo of this cute pink training glider on the ramp beside the wood pile in the car park.

Then I tried to narrate part of my glide down to landing to the camera, which doesn’t work at all with my full face helmet.

One of the folks on the hill was SashaZ whose long blogpost about surfskis is what caused me to book my Tarifa trip with Becka.

Here are some other pics from previous days.

We had some long drives there and back in someone else’s car. Becka spent the whole time at SpeleoCamp caving, and so this shouldn’t count as a hang-gliding holiday.

Oh, I might as well put down my notion of the physics of flight here, while I have it worked out. It goes like this:

A heavier-than-air object with a mass of 100kg wishes to avoid accelerating downwards to the ground under a gravitational acceleration of 10 metres per second per second.

For each second that passes, there is 100×10 = 1000 kg m/s of momentum that must be accounted for by blowing a volume of air downwards at a speed of k m/s.

Suppose the craft encloses a horizontal area of a square metres within which it blows the air downwards at k m/s. In one second this would be ak cubic metres, which, with a density of about 1 kg per cubic metre, is ak kilograms, sent downwards with a momentum of ak² kg m/s.

If the area a was circular, then you could cover it with a circular propeller like a helicopter, and maintain your altitude by blowing the air at sqrt(1000/a) metres per second downwards to counteract the gravity.

But imagine the shape of a is rectangular, and instead of a rotating blade, the blade moves horizontally along rails, sweeping a length v every second, and has a width w. This is somewhat like a wing with a span w flying at a velocity v.

My glider has a wingspan of the order of 10m, and an airspeed of 16 m/s, so the air needs to be blown downwards at a speed of sqrt(1000/(10*16)) = 10/4 = 2.5 m/s.

The kinetic energy embodied in this is ½mk² per second, where m = ak is the mass of air moved each second: 0.5 × (160×2.5) × 2.5² = 1250 Joules/second.

If I weigh 100kg I can generate 1250 Joules per second from potential energy if I sink at 1.25 m/s — which is about the rate that my glider sinks on a steady glide.
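
The whole back-of-envelope restated as one runnable lump of arithmetic:

import math

mass, g = 100, 10            # kg, m/s²
span, airspeed = 10, 16      # m, m/s
rho = 1.0                    # air density, roughly 1 kg/m³

area = span*airspeed                   # m² of air column worked on per second
k = math.sqrt(mass*g/(rho*area))       # downwash speed -> 2.5 m/s
power = 0.5*(rho*area*k)*k**2          # kinetic energy given to the air -> 1250 J/s
sinkrate = power/(mass*g)              # height spent paying for it -> 1.25 m/s
print(k, power, sinkrate)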

This is a story of what needs to happen to the air to keep you up, not how it is done with aerofoils, vortices, induced drag or any stuff like that. And it also suggests that our lovely gliders have already hit certain limits of what they could physically achieve for their size and speed.

One way to get them to go up will be to add an electric motor to give you that extra to get off the ground, or to find a thermal when you’re going down.

That ad says they have 24 Ah in their 57.8V battery, which equates to 24 × 57.8 × 3600 ≈ 5 megajoules. This can maintain horizontal flight for 27 minutes, which means it’s at the rate of about 3000 Watts. That’s about a 50% conversion rate from the battery to delivered power, which is plausible.

It also gives a “max summit height” of 750m, which is a budget of 6660 Joules per metre. I need to give it 1000 Joules per metre in potential energy, so suppose my climb rate is k m/s: then it will take me 750/k seconds to get up there, consuming 3000×750/k + 750×1000 = 5 megajoules, which computes to a climb rate of 0.53 m/s over 23 minutes.
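
And the battery budget in the same style (the advert’s numbers, my sums; using the unrounded motor power, so the climb rate lands a shade above the 0.53 quoted):

battery_J = 24*57.8*3600               # 24Ah at 57.8V, about 5 megajoules
motor_W = battery_J/(27*60)            # flat out for 27 minutes -> about 3086 W

summit, pe_per_m = 750, 1000           # m, J per metre of climb at 100kg
# motor_W*(summit/k) + summit*pe_per_m = battery_J, solved for climb rate k
k = motor_W*summit/(battery_J - summit*pe_per_m)
print(k, summit/k/60)                  # about 0.54 m/s, taking about 23 minutes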

I can’t afford this stuff. I should be happy with the massive amount that I’ve already got.

Thursday, April 5th, 2018 at 10:02 pm - - Flightlogger, Hang-glide

I’ve been deeply not keeping up with blogging on this Slovenia hang-gliding trip. Telegram and Twitter seem to take the wind out of such activities. So maybe this thing is for mainly technical reports. There are a lot of dead blogs out there that only have such things. This blog was started for technical content, and then I began putting all my own activities into it.

I’ve been working on this technical thing to do with gliding and tracklogs for so long without any breakthrough that I finally decided I had to start reporting negative results.

My latest failure was attempting to use a Hough transform to derive wind speed and direction from the 2-second-interval GPS sample points of a glider flying around in the air mass.

There are many made-up algorithms for doing this, but I wanted something mathematical. This time I based it on the assumption that the glider is mostly flying at a constant airspeed, so that changes in its GPS/ground speed were entirely due to flying with or against the wind. In particular, given three consecutive positions p0, p1, p2 with td seconds between them, the correct wind velocity w would satisfy the following equation:

|p1 - p0 - w·td| = |p2 - p1 - w·td|

There is no unique solution for w in this equation; the solutions all lie along a line, because the condition says the two air velocities (p1-p0)/td - w and (p2-p1)/td - w have equal magnitude, which puts w on the perpendicular bisector between the two ground velocity vectors. So if we add some spread and combine the probability fields of solutions for every sequence of three points in the track, then the peak probability will be the best guess at the wind vector.
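
A coarse-grid version of that accumulation goes like this (a sketch of the idea rather than the notebook code; the grid range and spread are invented parameters):

import numpy

def windestimate(p, td, wmax=20.0, n=81, sigma=1.0):
    # p: Nx2 array of (x, y) positions at td-second intervals
    v = numpy.diff(p, axis=0)/td                  # ground velocities between fixes
    ws = numpy.linspace(-wmax, wmax, n)
    WX, WY = numpy.meshgrid(ws, ws)
    score = numpy.zeros_like(WX)
    for v1, v2 in zip(v[:-1], v[1:]):
        # airspeed before and after the middle fix, for each candidate wind
        s1 = numpy.hypot(v1[0]-WX, v1[1]-WY)
        s2 = numpy.hypot(v2[0]-WX, v2[1]-WY)
        # votes for winds that make the two airspeeds equal, with some spread
        score += numpy.exp(-((s1-s2)/sigma)**2)
    iy, ix = numpy.unravel_index(numpy.argmax(score), score.shape)
    return ws[ix], ws[iy]                         # best-guess wind vector (wx, wy)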

It’s all explained here in this jupyter notebook.

After so many failures, I’m much pleased with this result. The actual wind was blowing towards the northeast, and the bad guesses are when the glider was on glide and not doing any circles.

That was from a four-hour mega flight all round the three ridges near Gorizia, where at one point I got lifted smoothly one thousand metres into the blue sky at the rate of 5m/s. I could see from the capital city inland to the container ships on the Adriatic.

Here’s a picture after landing from a lesser flight today where the clouds were pretty low on the ridge.

I need to grab some self-portraits from the other folks some point real soon of me taking off, and me landing quite properly on my feet. I’m starting to hanker after a new glider, one that’s sleeker and goes faster. This one’s beginning to feel sluggish all of a sudden. I can’t afford anything else now, and it would be quite naughty. And after my spectacular failure of an XC last week on Bradwell, I don’t deserve an upgrade.

Wednesday, June 7th, 2017 at 3:28 pm - - Flightlogger

I don’t know why I refrained from looking into hacking the XCSoar flight software that I have on the phone that’s bolted to the stick to which I’ve hot-glued my temperature and orientation sensor technology.

It is now the ugliest piece of electronic junk in flight today.

[image: sensorset]

But the fact that the Air-Where project seemed to have done something amazing in the last year, using LoRa networks and an ESP8266 to display all your flying buddies on the same flight map as the airspace without my noticing, indicated that I had some catching up to do.

Even working full time on this I can’t remotely keep up with the tech.

Here’s some of the stuff I learned in the last couple of days.

It’s hard to believe, but there’s enough vol libre hacker capacity in Europe to squander it on two completely independent open source flight computer projects, XCSoar and LK8000 which got forked acrimoniously from one another back in 2009.

And merrily they have been implementing the same things as each other over and again (see below).

I’ve downloaded and built both systems from source, following the Make instructions. (I’m terrible at OS stuff like Make; the code is in C++ and if I do anything it’ll have to be blind and without a debugger.)

The XCSoar code seems marginally more hackable at the moment, but I should check I can deploy the Android version. (Getting all this C++ stuff to run on a Java phone environment with a bunch of different sensors is an amazing achievement.) Most people go with Kobos, but I can’t cope with the lack of colour and it looks like it’s got even more difficult Operating System problems I don’t have time to learn about.

The architecture of XCSoar is given as follows:

[image: xcsoardataflow]

The key therefore is the NMEA data stream, which the XCSoar program can point to as one of its inputs.

So, in the case of Air-Where, they’ve used an ESP8266 to connect to a Kobo or notebook computer as a standard wifi hotspot (like I’ve been doing with my other ESP8266 projects), somehow delivering a stream of data composed of NMEA sentences through a port called /dev/ttymxc0.
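
The serving half of that arrangement can be sketched in a few lines of micropython (a sketch only: nmea_source() is a hypothetical generator of NMEA sentences, and I’m assuming the conventional NMEA-over-TCP port number):

import socket

server = socket.socket()
server.bind(("0.0.0.0", 10110))        # 10110 is the conventional NMEA-over-TCP port
server.listen(1)
conn, addr = server.accept()
for sentence in nmea_source():         # nmea_source() is a stand-in generator
    conn.send(sentence.encode() + b"\r\n")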

The most common source of NMEA is the GPS unit, like so:

$GPRMC,164742.682,A,5324.1915,N,00257.8000,W,10.96,098.82,190115,,,A*4e
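
The two hex characters after the * are a checksum, just an XOR over every character between the $ and the *, which is easy to verify:

from functools import reduce

def nmea_checksum(sentence):
    # XOR of every character between the leading $ and the *
    body = sentence[sentence.find("$")+1:sentence.find("*")]
    return "%02x" % reduce(lambda acc, ch: acc ^ ord(ch), body, 0)

print(nmea_checksum("$GPRMC,164742.682,A,5324.1915,N,00257.8000,W,10.96,098.82,190115,,,A*4e"))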

I did not know this was part of an extended language, but it turns out there’s an air collision avoidance FLARM protocol in NMEA form as well.

According to the manual:

$PFLAA,0,-1234,1234,220,2,DD8F12,180,-4.5,30,-1.4,1*

is read as:

There is a glider in the south-east direction, 1.7km away (1.2km south, 1.2km east), 220m higher flying on south track with a ground speed of 30m/s in a slight left turn with 4.5°/s turning rate, sinking with 1.4m/s. Its ID is a static FLARM-ID “DD8F12”. There is no danger.

The final number before the * is <AcftType> and it is chosen from the following real list:

0=unknown; 1=glider/motor-glider; 2=tow/tug plane; 3=helicopter/rotorcraft; 4=parachute; 5=drop plane for parachutes; 6=hang-glider (hard); 7=para-glider (soft); 8=powered aircraft; 9=jet aircraft; 10=flying saucer (UFO); 11=balloon; 12=airship; 13=unmanned aerial vehicle (UAV); 15=static object

Well, it’s good to know that if you want a softer bump you should go for the paraglider rather than the crunchier hang-glider.
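
Pulling the fields out of one of these sentences is a comma split; a sketch keyed to that field order (the field names are mine, not FLARM’s):

def parse_pflaa(sentence):
    fields = sentence.strip("$*").split(",")
    names = ["talker", "alarmlevel", "relnorth", "releast", "relvertical",
             "idtype", "id", "track", "turnrate", "groundspeed", "climbrate",
             "acfttype"]
    return dict(zip(names, fields))

rec = parse_pflaa("$PFLAA,0,-1234,1234,220,2,DD8F12,180,-4.5,30,-1.4,1*")
print(rec["relnorth"], rec["releast"], rec["acfttype"])   # -1234 1234 1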


Thursday, June 1st, 2017 at 4:06 pm - - Flightlogger, Hang-glide

I’ve been flying around my data logger on my hang-glider and doing my own data processing with mixed results for two years now.

During this time I’ve been on the lookout for someone else’s work that I can copy.

Just yesterday I discovered the existence of Dropsondes and then Radiosondes (devices that get lifted by a weather balloon with a radio link; nothing to do with sound-waves).

The fact of their existence has been staring me in the face for years.

[image: raspsoundings]

Those little red *S* symbols in the rasp forecast are not weather symbols for sunshine, but in fact the locations of half a dozen atmospheric “sounding” stations.

Until now I’d believed they were something involving fancy radar beams shining up through the clouds, but it turns out it’s a freaking weather balloon with humidity, temperature and GPS sensors (they don’t bother with the barometer anymore and just use the GPS altitude) that radios back data for an hour and a half till the latex balloon bursts at 25,000m and the device falls under a biodegradable parachute with a 95% chance of never being seen again.

The US government has a complete tour of the procedure, but the MetOffice has some automated stations which assemble and let off a new balloon every 12 hours from a robot building.

Tuesday, May 30th, 2017 at 6:11 pm - - Flightlogger, Hang-glide

First, here’s a picture of me and my cheesy grin high up in wave over Wether Fell with about a dozen other gliders last Sunday.

[image: wavegrin]

I was able to generate my incomplete Tephigram as before to illustrate the warm dry air encountered way up there.

[image: tephi]

Unfortunately, the fancy Python tephigram software released by the MetOffice doesn’t work for me, as it’s designed to plot graphs that go ten times higher in the atmosphere.

Tephigram is short for “Temperature” and “Entropy/phi” plot and was invented in 1915. I don’t understand all of it yet. But this science goes back a long way.
