Freesteel Blog » 2019 » July

Saturday, July 27th, 2019 at 8:20 pm - - Flightlogger

I’ve been trying to use the BNO055 for hang-glider experiments for a while. My current serial micropython interface is here or here. The sensor contains its own dedicated microcontroller that continually reads the gyros, accelerometers and magnetometers at a high frequency, fuses them, and provides orientation in the form of a quaternion, plus acceleration separated into its kinetic and gravity components. (I’d prefer a version where you set the device working and it streamed the measurements down the wire, instead of needing to be polled every 100ms.)
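
For what it’s worth, the quaternion comes off the chip as four little-endian signed 16-bit words scaled by 2^-14. Here is a minimal decoding sketch; the register address is from the BNO055 datasheet, and `decode_quaternion` is a hypothetical helper standing in for whatever serial or I2C read you have:

```python
import struct

QUA_DATA_START = 0x20        # quaternion w LSB register (BNO055 datasheet)
QUA_SCALE = 1.0 / (1 << 14)  # 1 unit = 2^-14

def decode_quaternion(raw):
    """Unpack the 8-byte quaternion block: four little-endian int16s (w, x, y, z)."""
    w, x, y, z = struct.unpack("<hhhh", raw)
    return (w * QUA_SCALE, x * QUA_SCALE, y * QUA_SCALE, z * QUA_SCALE)

# an identity orientation reads back as (1.0, 0.0, 0.0, 0.0)
print(decode_quaternion(struct.pack("<hhhh", 1 << 14, 0, 0, 0)))
```

The linear-acceleration and gravity vectors come out of adjacent register blocks in the same int16 format, just with a different scale.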

After years of not really having a clue, I’ve got far enough to be able to make a movie with these frames from an overhead unit attached to the keel of the glider, with the intention of measuring the control inputs (ie the pilot’s hang position, which determines his weight shift) in relation to the aerodynamic response.

There are two objectives of this work, other than the byproduct of learning a whole load about sensors that ought to make me useful for something.

Firstly, we’d want to quantify the hidden variables of a glider (eg glide angle, control responsiveness, etc) in order to better compare between them.

Secondly, I want to quantify pilot behaviour and rig this up on a flight by a top pilot who always seems to do well, and find out what they’re doing that’s different from what I’m doing in order to enable some form of effective coaching that would save me a lot of time. (Generally the top pilots don’t know what they’re doing as they merely report doing it by feel, and that feel very luckily for them happens to coincide with doing it right.)

However, I didn’t really trust these accelerometer readings when I watched this. It seemed sticky. That is, its mathematical filters sometimes hold one value until it is no longer valid, and then swing wildly to another value.

GPS readings sometimes do this and have nasty discontinuities. This is going to happen when there are bimodal probability distributions of the error, where there are two likely interpretations of the position from the same measurements.

Meanwhile, I was looking at namespaces in OpenCV and the function findChessboardCorners() caught my eye. I wondered what that was for. Turns out it’s used for Camera calibration — finding lens distortion and focal length.

Then I found out about ArUco Markers and tried to use them for measuring the pilot’s position (see above).

That didn’t work on half the frames because the light catches it badly.

This led onto the amazing all-in-one charuco board technology, which has an aruco tag in each white square of a chess board so that nothing can possibly get confused.

This is great, because it gives an external means of verifying the absolute orientation sensor. If I can get the numbers to agree, I’ll have proved I’ve understood all the orientation decoding and will know the error bounds.

My code is in this Jupyter notebook, though the bulk has been moved into the videos module of the hacktrack library for safekeeping.

Here’s the procedure, with the unit up top under my thumb containing the orientation sensor, the video camera embedded in the main unit (with the orange light visible to show that it’s working) looking down past the randomly flashing LED light.

And this is what it looks like from the camera’s point of view:

The phone is running a very crude android app I wrote called Hanglog3, which receives the orientation sensor data over wifi (via a socket from the ESP32 attached to the sensor) and stores it in the phone’s copious memory. This means I don’t need a mini-SD card writer wired to the microcontroller, which stalls it for up to 80ms whenever the data is flushed.

What’s that LED light doing in the view?

That’s used to synchronize the logged orientation data with the frames from the video. I’ve made an interactive function called frameselectinteractive() that lets you slide a box around the LED in the image, like so:

Then the function extractledflashframes() measures the mean red, green and blue values in the box for each frame, so you can see whether there is a clear enough signal.
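
A minimal sketch of that measurement, assuming the frames arrive as HxWx3 numpy arrays; `led_box_means` is a hypothetical stand-in for the real function:

```python
import numpy as np

def led_box_means(frames, x0, x1, y0, y1):
    """Mean (R, G, B) inside the selection box for each frame; returns an Nx3 array."""
    return np.array([f[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)
                     for f in frames])

# a frame whose box region is pure red gives means of (255, 0, 0)
frame = np.zeros((8, 8, 3), dtype=np.uint8)
frame[2:5, 2:5, 0] = 255
print(led_box_means([frame], 2, 5, 2, 5))
```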

In this case, red-channel values above 200 give a clear enough signal, which can be converted to a boolean on/off and aligned with the timestamped LED on and off commands in the flight data file that also carries the orientation sensor data. I’ve reused the Dust measurement records for this to save reprogramming anything, as the Dust sensor no longer exists (it was a ridiculous thing to carry around on a hang-glider anyway; what was I thinking?).

videoledonvalues = ledbrights.r>200
ledswitchtimes = (fd.pU.Dust==1)  # one timestamped record for every on and off of the LED
frametimes = videos.framestotime(videoledonvalues, ledswitchtimes)

Since the LED flashes at random intervals, there can be only one way to align them, which allows me to assign a timestamp to each video frame, and consequently to any information derived from that video frame, such as camera orientation.
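
That uniqueness claim can be sketched as a brute-force correlation: slide the logged flash pattern over the per-frame on/off booleans and take the shift that agrees best. `best_offset` is a hypothetical stand-in; the real resampling and timestamping lives in framestotime():

```python
import numpy as np

def best_offset(frame_onoff, logged_onoff):
    """Return the shift of logged_onoff within frame_onoff with maximum agreement."""
    n = len(frame_onoff) - len(logged_onoff)
    scores = [int(np.sum(frame_onoff[i:i + len(logged_onoff)] == logged_onoff))
              for i in range(n + 1)]
    return int(np.argmax(scores))

# a random 50-flash pattern buried at offset 30 is recovered exactly,
# because a random sequence only lines up perfectly with itself
rng = np.random.default_rng(0)
pattern = rng.random(50) > 0.5
frames = np.concatenate([rng.random(30) > 0.5, pattern, rng.random(20) > 0.5])
print(best_offset(frames, pattern))  # 30
```

A regular blink would match at many shifts; the randomness of the intervals is what makes the alignment unambiguous.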

The function which extracts camera orientation from the video frame is findtiltfromvideoframes().

The code which does this from an image frame works like this:

# extract the DICT_4X4_50 aruco markers from the image
markerCorners, markerIds, rejectedMarkers = cv2.aruco.detectMarkers(
    frame, aruco_dict, parameters=parameters,
    cameraMatrix=cameraMatrix, distCoeff=distCoeffs)

# try harder to match some of the failed markers with the knowledge of 
# where they lie in this particular charucoboard
cv2.aruco.refineDetectedMarkers(frame, charboard, markerCorners, markerIds, 
                                rejectedMarkers, cameraMatrix, distCoeffs)

# derive accurate 2D corners of the chessboard from the marker positions we have
retval, charucoCorners, charucoIds = cv2.aruco.interpolateCornersCharuco(markerCorners, 
     markerIds, frame, charboard, cameraMatrix, distCoeffs)

# Calculate relative camera to charuco board as a Rodrigues rotation vector (rvec) 
# and translation vector (tvec)
retval, rvec, tvec = cv2.aruco.estimatePoseCharucoBoard(charucoCorners, charucoIds, 
                                            charboard, cameraMatrix, distCoeffs)

# Convert rvec to single vector of the vertical Z-axis kingpost
r = cv2.Rodrigues(rvec)[0][2]
row = {"framenum":framenum, "tx":tvec[0][0], "ty":tvec[1][0], "tz":tvec[2][0], 
                            "rx":r[0], "ry":r[1], "rz":r[2]}

Notice that this cannot work without accurate values for the cameraMatrix and distCoeffs (distortion coefficients), which for this camera have been calculated as:

cameraMatrix = numpy.array([[1.01048336e+03, 0.00000000e+00, 9.46630412e+02],
                            [0.00000000e+00, 1.01945395e+03, 5.71135893e+02],
                            [0.00000000e+00, 0.00000000e+00, 1.00000000e+00]])
distCoeffs = numpy.array([[-0.31967893,  0.13367133, -0.00175612,  0.00153122, -0.03052692]])

These numbers are not fully reproducible. However, they do make the picture look okay in the undistortion preview, where straight lines look straight. Maybe there are too many degrees of freedom in the solution.
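
For reference, those five distCoeffs plug into OpenCV’s (k1, k2, p1, p2, k3) radial-plus-tangential distortion model. A sketch of what they do to an ideal normalized (pre-pixel) image point, using the values above:

```python
k1, k2, p1, p2, k3 = -0.31967893, 0.13367133, -0.00175612, 0.00153122, -0.03052692

def distort(x, y):
    """Map an ideal normalized point to where the lens actually images it."""
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return xd, yd

# the optical centre is untouched; off-centre points pull inwards (barrel distortion)
print(distort(0.0, 0.0))            # (0.0, 0.0)
print(distort(0.5, 0.0)[0] < 0.5)   # True
```

Pixel coordinates then come from the cameraMatrix in the usual way, u = fx*xd + cx and v = fy*yd + cy, which is why the two sets of numbers only make sense together.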

Now the hard part. The camera and the orientation sensor are not precisely aligned.

First the vertical axis of the orientation sensor, whose values are given in quaternions (pZ.q0, pZ.q1, pZ.q2, pZ.q3), needs to be extracted to provide a tilt-vector (of the Z-axis):

r00 = pZ.q0*pZ.q0*2 * pZ.iqsq
r33 = pZ.q3*pZ.q3*2 * pZ.iqsq
r01 = pZ.q0*pZ.q1*2 * pZ.iqsq
r02 = pZ.q0*pZ.q2*2 * pZ.iqsq
r13 = pZ.q1*pZ.q3*2 * pZ.iqsq
r23 = pZ.q2*pZ.q3*2 * pZ.iqsq
pZ["tiltx"] = r13 + r02
pZ["tilty"] = r23 - r01
pZ["tiltz"] = r00 - 1 + r33
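
Those six products are the third column of the quaternion’s rotation matrix, ie the direction of the sensor’s Z-axis in world coordinates, with iqsq (as I read the name) being 1/|q|². A self-contained sketch of the same arithmetic, so it can be sanity-checked away from the pandas frames:

```python
def tilt_from_quaternion(q0, q1, q2, q3):
    """Z-axis tilt vector, mirroring the pZ code above (iqsq = 1/|q|^2)."""
    iqsq = 1.0 / (q0*q0 + q1*q1 + q2*q2 + q3*q3)
    r00 = q0*q0*2*iqsq; r33 = q3*q3*2*iqsq
    r01 = q0*q1*2*iqsq; r02 = q0*q2*2*iqsq
    r13 = q1*q3*2*iqsq; r23 = q2*q3*2*iqsq
    return (r13 + r02, r23 - r01, r00 - 1 + r33)

# identity quaternion: the Z-axis points straight up
print(tilt_from_quaternion(1, 0, 0, 0))  # (0.0, 0.0, 1.0)

# a 90 degree roll about X takes the Z-axis to -Y
s = 2**0.5 / 2
print(tilt_from_quaternion(s, s, 0, 0))
```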

Then the (rx, ry, rz) vectors from the video camera images need aligning to the (tiltx, tilty, tiltz) vectors from the orientation sensor.

I have no idea how I cracked this one, but it went a bit like this:

# kingpost vertical vectors from camera
rx, ry, rz = tiltv.rx[t0:t1], tiltv.ry[t0:t1], tiltv.rz[t0:t1]

# Interpolated orientation tilt vector (so the timestamps are the same) 
ax = utils.InterpT(rx, lpZ.tiltx)
ay = utils.InterpT(ry, lpZ.tilty)
az = utils.InterpT(rz, lpZ.tiltz)

# Find the rotation between these two sets of points using SVD technology
# a[xyz] * r[xyz]^T
H = numpy.array([[sum(ax*rx), sum(ax*ry), sum(ax*rz)], 
                 [sum(ay*rx), sum(ay*ry), sum(ay*rz)], 
                 [sum(az*rx), sum(az*ry), sum(az*rz)]])
U, S, Vt = numpy.linalg.svd(H)
R = numpy.matmul(U, Vt)

print("Rotations in XYZ come to", numpy.degrees(cv2.Rodrigues(R)[0].reshape(3)), "degrees")


# Apply the rotations to the camera orientation
rrx, rry, rrz = \
(R[0][0]*rx + R[0][1]*ry + R[0][2]*rz, 
 R[1][0]*rx + R[1][1]*ry + R[1][2]*rz,
 R[2][0]*rx + R[2][1]*ry + R[2][2]*rz)

In this case, the rotations in XYZ came to [ 0.52880368 0.96020647 -80.29792978] degrees.
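
That SVD step is the Kabsch algorithm. The textbook version adds one guard the snippet above skips: if the determinant of U·Vt comes out negative you have a reflection rather than a rotation, and the last singular direction needs its sign flipped. A sketch on synthetic data (`kabsch` is my stand-in name, not from the notebook):

```python
import numpy as np

def kabsch(A, B):
    """Best-fit rotation R with R @ A ~= B, for 3xN point sets (Kabsch algorithm)."""
    H = B @ A.T                         # the same correlation sums as the H above
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(U @ Vt))  # guard against a reflection
    return U @ np.diag([1.0, 1.0, d]) @ Vt

# recover a synthetic 80-degree yaw (about the size found for the camera mount)
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 20))
c, s = np.cos(np.radians(80.0)), np.sin(np.radians(80.0))
Rtrue = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
print(np.allclose(kabsch(A, Rtrue @ A), Rtrue))  # True
```

With clean data the guard never fires, but noisy, nearly-planar vector sets (which tilt vectors are) can tip the SVD into the reflected solution.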

Thus, since I am now thoroughly running out of time, here is the comparison in XY of the unit vectors:

And this is how it looks in the individual components:

I am totally going to forget how any of this works when I get back. That is how hard it is to validate an orientation sensor against a video image, it seems.

Next is to make a charuco board with a flashing light in it with its own orientation sensor and pin it to the back of the pilot’s harness. Then maybe I’ll be measuring the glider relative to the pilot rather than the other way round.

All of this is so hard, and the main objectives haven’t even begun. Why is it so hard to get anywhere?

Thursday, July 25th, 2019 at 6:29 pm - - University

On 15 July I nipped across to Manchester for the evening, having picked up a flier in the Unity Theatre a month before for the performance of Tao of Glass by Phelim McDermott.

I made up for the price of the ticket by some very cheap train fares bought in advance. As I walked across the city in the summer air past a very long queue for a homeless soup kitchen (which should not be a thing in this day and age) I saw a statue of Queen Victoria covered in pigeon poo. It could never have been otherwise from the day it was erected, unless there were no pigeons in Manchester back then.

The Royal Exchange Theatre looks like the Lunar Lander parked indoors. I had never seen it before, but apparently it’s been like this since 1976. Initially I thought it was something they had built especially for the festival, which was a problem for me at the start of the show, where Phelim was sitting in the audience recounting all the amazing performances he’d seen in that theatre when he was young.

Here’s what it looks like inside. The stage is a massive turntable, so it can spin round and show you the whole performance even when the actors are not moving. I don’t know what it does to their sense of direction.

Halfway through the first act I recognized the performer from something I’d seen before. It was from a random show I rather liked called Panic, on for one night at the Unity Theatre (just 2 blocks walk from my house), where Phelim played the Great God Pan with his three nymphs. There was a lot of mythology woven in. The goat-like god Pan suspiciously disappeared at the same time that the goat-like depiction of Satan emerged around the birth of Christ.

My favourite part of the performance of Panic was when Phelim was having an emotional crisis and began going through his collection of self-help books, pulling each one out of the box and progressively shrieking their titles. He specifically singled out Tony Buzan who, “every year writes a new book, and it’s always exactly the same as his previous book!” The quantity was overwhelming. He ended up pouring box after box onto the table.

The Tao of Glass went on about Kintsugi, which doesn’t work for smashed glass. There were some other allegories I’ve been unable to remember to look up. One of the reviews tracked down A Mindell’s theory of three conical layers of Consensus Reality, Dreamland, and Essence. I wish I’d taken notes.

I got drawn into the misdirection about Philip Glass, who helped write the performance with Phelim over a week of workshopping near his home in New York, and was to appear at the end on a Steinway player piano that would reproduce his composition exactly.

But there was a final scene where Phelim lay down next to an old record player to listen to the start of Glassworks, the album through which he first fell in love with the music, playing it at home on repeat (joke!).

At this point Philip Glass himself walked across the stage, joined the musicians in the corner, 4 seats away from me, and played it himself.

I am so lucky to be here. I ought to go to more shows near me.

Apparently there have been some revivals of the older Glass work, like Einstein on the Beach. I just discovered that there are films of it online from a recent show in Paris. I can watch hours of it, like so:

I’d go traveling to shows of Philip Glass pieces if I had a way to find out about them in time, like some folks do for Wagner operas. No one else I know understands it.

It’s a shame I don’t have a video of a Panic or this show I can go back over and get the names of things I want to look up again. We need footnotes, or show notes, or a recording that ticket-holders are allowed to access after the run. How else are we supposed to get a self-education round here?

Thursday, July 25th, 2019 at 3:46 pm - - Kayak Dive

Stop doing and start blogging!

Yesterday I got out on the wreck of the Resurgam, the submarine launched in 1879, which then sank while on tow in the Mersey Bay, only to be discovered 116 years later. It was propelled by a coal-powered steam engine and was lit by candle light.

You’d think that the people of the day would have seen that electric power had to be the way forward for submarines. The inventor, George Garret, appears to have done a lot of marine tech in a life that was shorter than mine, but he then emigrated to America in 1890 to become a rice farmer, which seems a bit of a waste.

I dived it with a half-empty tank (on 100 bar) because I was too incompetent to fetch the right one out of the garage. Luckily it was a shallow dive and I could go easy on the breathing, at the cost of giving myself a headache for the rest of the day. The water was pretty warm, so I skipped gloves, which meant I got nipped by crabs when I forgot to look where I put my hands.

Here’s the remnants of the propeller in my hand:

Here’s the other pointy end:

Here’s the remains of the conning tower:

Maybe if I took off my tank I could have jammed myself down inside. It’s pretty tight.

There was one hole on one side by the sea bed, from which this lobster made a successful escape from the other diving pair who were trying to catch it for their dinner:

Then I bothered a tompot blenny by poking it out of its hole. It slipped round to the other side of the tower only to meet a second blenny who was not pleased by the territorial incursion:

I wonder if there’s been any experiments with territorial species to find out whether they use natural boundaries to demarcate their areas, and to what extent a nipped intruder understands where the line is drawn. Or maybe the animal territories are not actually areas, but instead single perching places from which to leap out and attack intruders that they can see, and that’s the key.

I remember a talk about fiddler crabs on the beach, who have little burrows that they run back to for safety. The experimenter wanted to find out how they navigated to their hole, and put down a piece of sandpaper and fishing line, which they used to quietly drag the unsuspecting crab away from its hole and prove that the crab used its sense of direction in a polar coordinate grid centred on its hole.

Then I surfaced and got fetched by the boat.

Wreck of the Calcium

Quick subsequent dive on the Wreck of the Calcium, also not deep, but with a proper fill in the tank.

First we fall overboard backwards so that our mask is not swept off by the water.

Who doesn’t love a good swim-through filled with fish?:

Here’s a short video of a flatfish with its funny bulbous eyeballs slipping away beneath a school of other fish.

There were a couple small shy conger eels in the boiler:

The lobsters were too numerous and brave to be frightened away and saved:

I used to think the gopro was a waste of money, but compared to the alternatives it can be a lot clearer. You can see the red camera light flashing on my forehead. All four of us divers had headcams.

At one point I lost my headcam (the retaining string wasn’t properly on round my neck). Here’s the episode of it falling off and being kicked around.

I found it quite enjoyable watching this video, though it’s a shame it wasn’t the right way up, so it didn’t see me coming back to fetch it. These blur-o-vision FPOV headcam movies don’t seem to be as engaging as you’d expect. Next time I’m on a dive where I’m absolutely sure I’m going to get back to where I started, it would be neat to park it somewhere looking out, so it could see the divers receding from view, leaving everything to the fish and wandering crabs, before the divers gradually emerge back from the distance.

Reminds me of a trip a few years ago when someone’s helmet cam came off his head on a kayak dive in Loch Sunart, and it filmed the perfect express elevator to the surface, where it was picked up by Becka:

I’ve got to think of some way to stop it floating off, and of securing it to whatever I’ve made. Maybe I’ll have to line off from it to be safe. The jiggling on the line when the divers are out of sight might add a bit of suspense and anticipation.

I got back with a horrible Diesel fume headache, attempted to play underwater hockey, but gave up. Then went round to DoESLiverpool to check on my robot, which had gone offline because somebody had left it out of the charging dock.

Tuesday, July 9th, 2019 at 10:29 am - - Kayak Dive

Things go on. We did some excellent kayak diving up in St Abbs that was planned to take advantage of a student who could be in a sea kayak on the surface so we’d feel more at ease going deeper and further underwater than we’d normally venture on our own.

Our main mission was to see a wolf fish, frequently sighted on Black Carrs rock below 20m.

Here he is:

This was along the low cut down that runs due east from the rock, in a horizontal crack behind an upstanding rock. It’s probably always the same fish that everybody sees. He caught my eye as we were searching along the bottom, and I propped up a cairn on the spot so we could come back to him after pushing on a bit deeper to the brittle star carpets.

And here’s my cairn marking the spot of the shy fish’s lair.

My pics make it look a lot less pretty than it was to be there in the water, but they work for me as evidence.

Our support kayaker (plus visitors) was present when we went down at this deep spot.

As usual they were nowhere to be seen when we came up. They tend to get bored and find something else to go look at, because it all seems well from the surface to non-divers who don’t know what disasters might be unfolding below the water.

The wolf fish dive was on Tuesday 2 July 2019. There was a stiff northwest wind and swell that made it impossible to go near the coast and explore the caves anywhere further round towards Pettico Wick.

The weather and water visibility conditions had not been the best we had hoped for, but the trip had to fit into a narrow time window of people’s availability and Becka not being on a caving expedition.

We stayed overnight in a three bed shared room at Marin Quest, which was a little expensive, but it paid off well when on Monday over breakfast the boat skipper was able to give us the position of the wreck of The President at this spot: 55°52’10.0″N+2°04’25.0″W/@55.8694568,-2.0741817 in a very sheltered channel to the south of Eyemouth, directly in line with a fence stile.

Here’s us loading up the kayaks at the convenient concrete access path near Greenends Gully.

We overshot too far south on the paddle out. The cliffs further towards Burnmouth look well worth exploring, but we didn’t have time for that.

The dive on The President was excellent, progressively finding bigger and bigger bits of scrap steel until we finally hit the boilers. Otherwise, there was not much life.

Sam, our look-out student, spent the time watching dolphins doing leaps and flips close in.

Here’s a blurry shot from a Mark One blurry gopro to prove he saw something jumping.

After a tank changeover at the carpark, we hauled our kayaks against the wind and waves to the north of Eyemouth and into the shelter of Weasel Loch.

Sam took my wallet shopping for junk food as we did a shore dive out of the channel to look for Conger Reef.

We didn’t find the reef, so here’s a picture of a flatfish and small lobster in the rocky wasteland it was supposed to be.

We circled back to the cliff wall, which was spectacular, huge, deep and overhanging, and then found the way back in. I could spend all day shore-diving out of this loch popping in and out of the water trying to get my bearings. Maybe I’d eventually find this reef.

Back by the car we changed into wetsuits and I gave Sam a try dive, during which we saw a small lobster on a ledge at minus one metre.

Stepping back in time to Sunday, when Becka and I arrived in St Abb’s (before Sam came), we dashed out for an afternoon dive on Wuddy Rocks.

Becka managed to haul down the anchor at the start of the dive, but couldn’t stay down because she didn’t have enough lead.

Normally this is my fault for not putting enough on her weight belt, but this time it was because she’d forgot to put it on at all!

Once sorted out, we found the way into the tunnels where we tried out our new diving torches, one wide and one narrow angle.

Up till now I’d been using a Dive Scurion light, which Becka has appropriated into her caving gear and which I’ve nearly lost or broken on a couple of occasions. I’m glad not to bother with that thing again: a burn time of 12 hours is no use when dives are at most a couple of hours a day, and it’s huge, with a dangly wire between the battery and the headset.

Torches are a good investment, because one of the points of diving is to see things. Over the years they’ve become smaller and brighter, until maybe soon you’ll just have some bright specks on the fingertips of your gloves that emit rays when you cup your hand in a particular way.

Then we did a second dive and went looking for Cathedral Rock from the shore. Here are the instructions from Marine Reserve booklet:

To reach Cathedral Rock follow the main gulley between Broad Craig and the harbour wall. Keeping Broad Craig on your left and the training pool on your right, enter a narrow gully which drops down to approximately 5 metres. Swim to the right around the narrow gap and proceed until you reach a pile of angular boulders. From here head approximately 45 degrees to the right, passing over kelp forest on the way, until you reach a small rock face covered with dead men’s fingers. Swim past this rock keeping it on your left shoulder into a sandy gully. Cathedral Rock is on your right, just over a large boulder.

Not surprisingly, I didn’t find it.

The Lawson Wood diver guide (whose position for The President is out by 3 minutes of arc) describes the route like so:

Swim over to Big Green Carr [this is the wrong rock -ed]; keeping it to your left swim south in line with the reef. At the end of the reef you should see a low lying ridge extending at right angles in front of you; pass over this and you will meet a wall that curves to the left over a tumble of large boulders. With this wall to your right, you are now swimming east and you will reach Cathedral Rock in about 12 yards.

The problem with these descriptions is that everything is relative in terms of what constitutes a large boulder or a sandy gully. This is no use underwater, where the visibility is such that you can only see one thing at a time. If you swim into a boulder that’s 2 metres tall, it’s large if it is alone on a rubble-strewn plain, but small if it is surrounded by 8 metre high blocks. When you can see only 5 metres, you can easily persuade yourself either way, and therefore the description is of no use. It might as well have said: turn left at the boulder that once had an octopus on it in 1998.

Had I realized that these descriptions were so utterly defective, I’d have looked online, and found this dive description:

The one thing you must do on this dive is trust your compass, so take your bearings and follow them!

On entry head right and at the end of Broad Craig there is an area of almost white gravel (actually shells and worm-casts); Take a compass bearing of 120degrees and swim approximately 30m to reach the site.

We had driven partway up to St Abbs on the Saturday and slept overnight in a layby on the A7 before seeking out breakfast in Berwick upon Tweed. The cheap eating place was packed out, so we wandered into town and hit upon the Mule on Rouge, which is where I’d be hanging out every day if I lived in this town. Unfortunately, Becka had just decided that we were now on an economy drive, because I haven’t been paying my house bills for a while, so we shared one single bagel.

On the Saturday I had been taking my telepresence robot around Makerfest Liverpool in the Central Library. It’s possible that this toy had something to do with the cashflow crisis.

Isn’t it cute?