Hack Hitchin

Pi Wars week 2: encoders and magnetometers

8th Oct 2018

It’s been a productive week for the Tauradigm team: we’ve tested a prototype projectile launcher, researched sensors for odometry, mapped out the system architecture, evaluated a magnetometer board and ordered some of the key components.

 

Projectile Launcher

First, the projectile launcher. Since the concept design (posted last week) is quite similar to Hitchin’s old skittles ball flinger, it seemed worthwhile to use that for a test, to check it was feasible with the soft juggling balls. I 3D printed a holder to mount the flywheels closer together, so they’d grip the balls:

Modified ball flinger and juggling ball. Notice the scuff marks on the ball from the tyres.

The result is just about OK. It could do with the balls having a bit more energy, so they fly rather than bounce towards the targets. I’m not sure if it’s because the flywheel tyres are soft, as well as the balls, so they’re not getting much grip; because the flywheels are too far apart; or because there’s not enough energy stored in the flywheels.

Still, a promising result and worth further development, since the balls knocked over the books and there’s no way the elastic bands from last year would have managed that.

 

Odometry sensors

Odometry is the fancy word for sensing how far you’ve travelled. The classic example is wheel (or shaft) encoders, where markers on the wheels or motors are counted to estimate the distance travelled (it’s only an estimate, as the wheels may slip). There are now also optical-flow-based sensors, where a series of images is taken, key reference points are identified in each frame, and the travel distance is worked out from there. These eliminate wheel-slip error but introduce other potential errors; for example, if the camera-to-ground distance varies, the perceived speed changes.

As mentioned previously, we ideally want an encoder that doesn’t interfere with our magnetometer (no long-range magnetic fields), is fast (better than 100Hz), is precise (better than 1mm resolution when used on a 50mm wheel, which works out at roughly 160 counts per rev, as the quick check below shows) and isn’t sensitive to dirt, contamination and sunlight.
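The counts-per-rev figure falls straight out of the wheel geometry. A quick sanity check (the numbers are the ones from above; this is just the arithmetic, not our actual code):

```python
import math

# Back-of-envelope check of the encoder resolution requirement:
# 50 mm wheel, 1 mm target resolution (figures from the post).
wheel_diameter_mm = 50.0
target_resolution_mm = 1.0

circumference_mm = math.pi * wheel_diameter_mm            # ~157 mm per revolution
counts_per_rev = circumference_mm / target_resolution_mm  # counts needed per rev

print(f"Need at least {counts_per_rev:.0f} counts per rev "
      f"({circumference_mm:.1f} mm travelled per revolution)")
# -> Need at least 157 counts per rev (157.1 mm travelled per revolution)
```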

Shaft encoders are split into a few variants, mainly magnetic and optical (as discussed in last week’s post).

 

Optical flow

Optical flow solutions fit into two categories: general-purpose cameras with specialist software analysis, and dedicated hardware-based solutions, often derived from optical mice. The mouse chips tend to be cheaper and faster.
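As a flavour of the mouse-chip route, here’s a hypothetical sketch of polling an ADNS-3080-style sensor over SPI from Python. We haven’t picked a part yet, so the chip, register addresses and timing here are assumptions taken from that family of datasheets, not a tested driver:

```python
import time
import spidev

# ADNS-3080-style register map (illustrative; check the datasheet of
# whichever mouse sensor is actually chosen).
REG_MOTION = 0x02
REG_DELTA_X = 0x03
REG_DELTA_Y = 0x04

spi = spidev.SpiDev()
spi.open(0, 0)               # SPI bus 0, chip-select 0 on the Pi
spi.max_speed_hz = 500_000
spi.mode = 3                 # these sensors use SPI mode 3

def read_reg(addr):
    # One SPI transaction: send the register address, clock out the reply.
    # (Real drivers also respect the chip's address-to-data delay.)
    return spi.xfer2([addr, 0x00])[1]

def to_signed(byte):
    # Delta registers are signed 8-bit values.
    return byte - 256 if byte > 127 else byte

while True:
    if read_reg(REG_MOTION) & 0x80:   # motion occurred since last read
        dx = to_signed(read_reg(REG_DELTA_X))
        dy = to_signed(read_reg(REG_DELTA_Y))
        print(f"moved dx={dx} dy={dy} (sensor counts)")
    time.sleep(0.01)                  # ~100 Hz polling, our target rate
```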

So, lots of options available, none ideal. We’re going to start with the cheapest and see what we learn.

 

System Architecture

For most challenges we’re aiming to use map-based navigation, relying on ‘external’ sensors as little as possible (after the issues with light interference last year). To do this, we’re planning to have a microcontroller very rapidly reading simple sensors like the IMU (including the magnetometer) and the encoders, keeping track of where it thinks it has travelled and adjusting the motors accordingly, aiming for a given waypoint, whilst keeping the Pi up to date with what it’s doing and where it thinks it is.

The Pi will keep track of the location on a representative map, use key features to update the perceived location on the map, and give the microcontroller new waypoints to head for. Localisation (figuring out where we are on the map) and route planning will be separate modules. See this page: https://atsushisakai.github.io/PythonRobotics/ for a collection of algorithms written in Python that achieve this, along with animations illustrating each one’s approach. Our Pi will use ToF sensors and image processing to detect obstacles and use them as inputs for correcting the perceived location.
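As a flavour of the microcontroller’s side of that split, here’s a minimal dead-reckoning sketch in Python (the names and constants are illustrative, not our actual firmware): encoder counts give distance travelled, the IMU heading gives direction, and together they advance an (x, y) estimate.

```python
import math

# 50 mm wheel, ~160 counts per rev (illustrative numbers from earlier posts)
MM_PER_COUNT = math.pi * 50 / 160

x_mm, y_mm = 0.0, 0.0

def update_pose(left_counts, right_counts, heading_rad):
    """Advance the position estimate by one sensor reading."""
    global x_mm, y_mm
    # Average the two sides to get the distance the chassis centre moved,
    # then project it along the current heading.
    distance = (left_counts + right_counts) / 2 * MM_PER_COUNT
    x_mm += distance * math.cos(heading_rad)
    y_mm += distance * math.sin(heading_rad)
    return x_mm, y_mm

# e.g. both wheels advanced 40 counts while heading 30 degrees off the x-axis:
print(update_pose(40, 40, math.radians(30)))
```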

 

Magnetometer Evaluation

Since we already had a GY-271 magnetometer (an HMC5883L breakout board), we thought it was worth doing a few tests to see if the magnetic field of the motors would interfere with the results.

The HMC5883L is a 3-axis magnetometer: unlike an accelerometer or gyroscope, it doesn’t sense its orientation to gravity or its turn rate, but by measuring the local magnetic field its readings can be used to calculate the current heading.
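With the board held level, the heading comes from the horizontal field components via atan2. A simplified sketch (no tilt compensation, and the axis conventions depend on how the board is mounted):

```python
import math

def heading_degrees(mag_x, mag_y):
    """Heading relative to magnetic north from level X/Y field readings."""
    heading = math.degrees(math.atan2(mag_y, mag_x))
    return heading % 360          # normalise into 0..360

print(heading_degrees(120, 0))    # field aligned with +X -> 0 degrees
print(heading_degrees(0, 120))    # field aligned with +Y -> 90 degrees
```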

We first did a bench-top test with a single motor and the board. We found that at about 150mm distance the motor could cause a ±2° heading error. At 100mm it was ±20°, and at 50mm it caused more than 45° of error. A few people on Twitter suggested using materials with a high magnetic permeability as shielding for the motors. We managed to borrow some flexible magnetic keeper material (like Giron?), so we tried that:

Interestingly, in most orientations this gave no improvement, and in some orientations it was worse than no shielding at all. We think the material may have been concentrating the motor’s magnetic field or directing more of it towards the magnetometer, or perhaps the earth’s field was also being distorted by the material. We couldn’t entirely enclose the motor, as the power wires need to exit one end and the gearbox output shaft the other. If anyone can suggest a more suitable material or arrangement, please let us know in the comments. Good magnetic shielding materials seem to be very specialist and expensive, and would likely add more weight than a plastic sensor tower, so for now we’re going with the tower.

Here you can see we’ve mounted a wooden stick to the front of our Tiny test chassis. Since the Explorer pHAT doesn’t have pass-through headers, we initially used a PiFace solderless expansion shim to break out the GPIO pins, but we found some of the pins weren’t making a connection, so we had to revert to using jumpers to wire everything up (Explorer pHAT floating in the air on wires, instead of sitting on the Pi). After a bit of calibration, we got the Tiny to drive towards North fairly successfully:
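The drive-towards-North behaviour boils down to a proportional controller on heading error. A minimal sketch of the idea (the gain and the motor interface are illustrative; the real test drove the motors through the Explorer pHAT):

```python
KP = 0.02                      # proportional gain, illustrative tuning value

def wrap_degrees(angle):
    """Wrap an angle error into the range -180..+180."""
    return (angle + 180) % 360 - 180

def heading_hold(current_heading, target_heading=0.0, base_speed=0.5):
    """Return left/right motor speeds that steer towards the target heading."""
    error = wrap_degrees(target_heading - current_heading)
    turn = KP * error
    left = max(-1, min(1, base_speed + turn))
    right = max(-1, min(1, base_speed - turn))
    return left, right          # feed these to the motor driver

print(heading_hold(350))        # pointing 10 degrees left of north -> steer right
```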

From this test, it’s clear we need other sensors to help us figure out how far off course each little deviation takes us, so that instead of just turning back to the correct heading (parallel to the original path but offset), we can turn back onto the desired path. We also need to find a better calibration routine: the few libraries we found still left some errors, so North was OK but South pointed closer to south-south-east. Here’s an example of one of the early calibration test results:

This is after some scaling and conversion, so the units are a little meaningless, but it shows the magnetic field strength in X and Y as the sensor was rotated. Since the earth’s field is constant, all the points should be an equal distance from the origin; clearly the circle is offset to the right and downwards. The scaling looks OK, as both axes vary by the same amount. After we corrected the offsets, we got much better results.
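Correcting that offset is a classic hard-iron calibration: rotate the sensor through a full turn, then subtract the midpoint of each axis so the circle re-centres on the origin. A minimal sketch of the idea (not the library code we actually used):

```python
def hard_iron_offsets(samples):
    """samples: list of (x, y) magnetometer readings over a full rotation."""
    xs = [s[0] for s in samples]
    ys = [s[1] for s in samples]
    # The circle's centre is the midpoint of each axis's extremes.
    offset_x = (max(xs) + min(xs)) / 2
    offset_y = (max(ys) + min(ys)) / 2
    return offset_x, offset_y

def apply_offsets(raw_x, raw_y, offsets):
    """Re-centre a raw reading using the calibration offsets."""
    return raw_x - offsets[0], raw_y - offsets[1]

# e.g. a rotation trace centred on (30, -15) instead of the origin:
samples = [(130, -15), (-70, -15), (30, 85), (30, -115)]
print(hard_iron_offsets(samples))   # -> (30.0, -15.0)
```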

 

Parts are arriving

We’ve started ordering the parts we’re confident we will need, like a Pi, Pi camera and motor controllers:

So lots of progress. Hopefully next week we’ll evaluate a few more sensors and develop the Tiny test bed further.

Pi Wars 2019 – it begins!

30th Sep 2018

First the big news: Hitchin’s A and B teams both got in to Pi Wars 2019!

Before the excitement fades, the feeling of worry sets in: how much work have we just committed to? Can we get it done on time this year? Better get cracking!

Mixed progress has been made on Tauradigm this week. We’ve fleshed out the CAD design a little more, estimating what components we need for all the challenges and checking how they’ll fit:

initial layout

Pi 3B+ in the centre, batteries either side, motor controllers behind them, with a Teensy at the rear (to quickly track the encoders and IMU). The fields of view of the distance sensors (mounted on the underside of the main board) are represented by the cones pointing outwards. The grey sketched rectangles represent the IMU (gyro, accelerometer and magnetometer, for figuring out the robot’s orientation and movement) and a multiplexer, so we can speak to multiple sensors easily.
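The multiplexer is there because several of the distance sensors share the same fixed I2C address, so the Pi has to talk to them one at a time. A hypothetical sketch of how that looks (we haven’t finalised the exact mux chip; this assumes a common TCA9548A-style part and the smbus2 library):

```python
from smbus2 import SMBus

MUX_ADDR = 0x70                # common default address for these muxes

def select_channel(bus, channel):
    """Route the I2C bus to one of the mux's 8 downstream channels."""
    bus.write_byte(MUX_ADDR, 1 << channel)

with SMBus(1) as bus:          # I2C bus 1 on the Pi
    select_channel(bus, 0)     # talk to the sensor on channel 0
    # ... read that ToF sensor at its (shared) address here ...
    select_channel(bus, 1)     # then switch to the next one
```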

At this point it was looking fine.

rear encoder

We’d even included a model of the encoders we were planning to use. It was at this point we realised a couple of issues. Firstly, the motors we’d been hoarding (high-speed, high-power 12mm motors, bought really cheap off eBay in a pack of ten) don’t have rear motor shafts, so those encoders wouldn’t fit. Secondly, would the encoder magnet interfere with our magnetometer? Come to think of it, would the magnets in the motors interfere? Some quick research suggested they would: many people have had issues getting magnetometers to work reliably. It looks like we might have to move the IMU as far from the motors as possible. Like on top of a small tower! OK, if that’s what we need to do…

 

layout

pi noon setup

Next was a quick mock-up of the Pi Noon arrangement, with the camera angled up to see the pin and opponents’ balloons (an issue we had last year, where the balloon disappeared from view just at the critical moment!), and you can see the tower for the IMU. We’ve also added the 5V regulator and the barrel jack for running from a power supply. Looking OK, but space is getting tight and we still need to sort out a solution for the encoders.

ball flinger attachment

 

Next was a drawing of the firing mechanism for Space Invaders (target shooting). This design is based closely on Hitchin’s previous ball launcher (for skittles, http://hackhitchin.org.uk/finalstraight/), but this time firing soft juggling balls. There’s been some discussion with the Pi Wars organisers about whether a speed or energy limit might apply, so this might need to be revised, as it requires quite a lot of energy to work properly even though the speed isn’t that high (probably slower than a Nerf dart). We’re also considering vacuum cannons 🙂
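To illustrate why a speed limit and an energy limit are quite different constraints for us, here’s a rough comparison (all the masses and speeds are illustrative guesses, not measurements):

```python
def kinetic_energy_joules(mass_kg, speed_m_s):
    """Kinetic energy KE = 1/2 * m * v^2."""
    return 0.5 * mass_kg * speed_m_s ** 2

# A heavy-but-slow juggling ball can carry more energy than a
# light-but-fast foam dart, even at a fraction of the speed:
print(kinetic_energy_joules(0.10, 5))     # ~100 g ball at 5 m/s   -> 1.25 J
print(kinetic_energy_joules(0.001, 20))   # ~1 g dart at 20 m/s    -> 0.2 J
```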

Drawing up the launcher highlighted an issue with the camera and IMU mount we had for Pi Noon above, so that’s going to need a rethink.

 

For the encoders, it looks like we could go magnetic or optical on the output shaft, or use optical mouse sensors looking at the floor. The magnetic sensor is probably the most reliable, but has a very low resolution and might interfere more with the magnetometer; the optical sensor may be a little big and may be sensitive to dirt; and the mouse sensors are sensitive to distance from the ground (and might need to be too close to be practical).

magnetic encoder on the output shaft

 

We’re planning a separate post on sensor selection, as it can be challenging and confusing.

 

That’s it for this week. Hopefully we can soon start ordering parts to test.

 

Pi Wars 2019 – The entries are in

24th Sep 2018

It’s that time of year again, when Hitchin Hackspace gets slowly obsessed with Pi-powered robots in the build-up to Pi Wars. Hitchin have again applied with two entries; this post will focus on the entry for Tauradigm, another fully autonomous entry and successor to last year’s Piradigm.

Key bits from the application form:

Team/Robot name: Tauradigm
Team category: Advanced or Professional
Team details: Mark Mellors: Professional Mechanical Engineer, lead on the mechanical and electrical side of the robot. Previously did the majority of the work (across all disciplines) on last year’s (disastrous/’ambitious’) Piradigm.
Rob Berwick: Professional Computerman, lead on the software side. Previously an adviser and supporter on Piradigm
Both Mark and Rob have also contributed to all of Hitchin Hackspace’s Pi Wars entries, from main team members in ‘16 to occasional advisers in ‘18.
Proposed robot details: Fully autonomous in all challenges, obviously
4-wheel drive, Lego/custom wheels and tyres as the default configuration
Custom chassis using unusual materials/construction techniques 🙂
Stylish, lightweight, UFO-themed bodywork
Home-made PCBs for power and signal distribution
Arrays of various sensors, including: internal (encoders and inertial measurement), proximity (distance sensors, reflective) and vision (Pi camera), possibly supported by off-board sensors (beacons, ‘eye in the sky’)
Software:
Separated architectural units
Fast IMU/encoder feedback with continuous location estimation, much like an actual spacecraft (build a virtual map, then use that for navigation, with occasional corrections when known markers are detected)
Advanced route planning algorithms
Automated hardware self test

Attachments for challenges
Space Invaders will have a much more powerful soft-projectile system, with vision-based auto-aiming supported by encoders and distance sensors
Spirit of Curiosity may use one or two homing beacons or visual markers
Navigate by the Stars and Blast Off may use a rocket (jet, no flames or pyrotechnics!) propulsion module
The Hubble Telescope Nebula Challenge and Pi Noon: The Right Stuff will use vision again, supported by distance sensors and encoders (plus be more tolerant of changing lighting conditions this time!)
The Apollo 13 Obstacle Course will be attempted autonomously without on course markers this year, using a range of sensors

Why do you want to enter Pi Wars?: Retribution! To show that fully autonomous is possible! But also, as before, to inspire, share and stretch ourselves.
Which challenges are you expecting to take part in?: Remote-controlled challenges, Autonomous challenges, Blogging (before the event)
Any more information: Piradigm would like to attend as an exhibitor this year

The entry only went in last week (sorry Mike and Tim for leaving it late!) yet we’ve already realised we might need to make some changes to the robot design.

Initial draft CAD rendering:

Render of Tauradigm

Fingers crossed we get accepted!

We should find out in the next week.