Piradigm – Approach to Piwars challenges 2018

12th Nov 2017

Quick progress update: chassis wired and driving in remote control mode!

 

So now I’ve got the basic functionality working, I can start on programming for the challenges. In principle I could compete in three of the challenges right now, as they’re intended to be entered with manual control. As I mentioned last time though, I’m hoping to attempt them autonomously, as well as the mandatory autonomous challenges. Here’s how I’m intending to approach them all:
Somewhere Over the Rainbow (driving to coloured balls in sequence)
I’m hoping to use OpenCV’s contour detection for this challenge: spot shapes, filter them by size, shape and colour to identify the coloured balls, then use the size of the ball (and possibly the height of the arena) to help judge the distance to it. I have a few ideas to make the image processing quicker and more reliable – keep the camera at the same height as the balls, and prioritise looking where the balls should be, based on their expected height and angle relative to the start position.
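As a rough illustration of the judge-distance-from-size idea, here's a minimal sketch using the pinhole camera model. The focal length and ball diameter below are placeholder values I've made up, not calibrated numbers from the actual robot and camera:

```python
# Sketch of estimating distance from a ball's apparent size in the image.
# Both constants are assumptions for illustration: the focal length would
# come from camera calibration, and the ball size from the challenge spec.

BALL_DIAMETER_M = 0.065      # assumed roughly tennis-ball sized target
FOCAL_LENGTH_PX = 500.0      # placeholder; measured via calibration in practice


def distance_to_ball(apparent_diameter_px):
    """Estimate distance (metres) from the ball's diameter in pixels.

    Pinhole model: apparent_size = focal_px * real_size / distance,
    so distance = focal_px * real_size / apparent_size.
    """
    if apparent_diameter_px <= 0:
        raise ValueError("ball not detected")
    return FOCAL_LENGTH_PX * BALL_DIAMETER_M / apparent_diameter_px


# With these numbers, a ball 50 px wide would be about 0.65 m away
print(round(distance_to_ball(50), 3))
```

In practice the apparent diameter would come from the bounding circle of a contour that passed the size/shape/colour filters.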

 

Minimal maze
To help with many of the challenges, I’m intending to use external markers (like QR codes) to track the robot’s position and orientation on the course. This year the minimal maze allows markers to be used, so I’m intending to put an ArUco marker on each corner, in the hope that a wide-angle lens will always be able to see at least one marker, giving me my position on the course at all times. I’ll preprogram waypoints for each corner of the track and use the markers to navigate to them in sequence.
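The navigation step, once a marker has given the robot a pose, could look something like this sketch. The coordinate frame and waypoint positions are invented for illustration; the real ones would be measured from the course:

```python
import math

# Sketch of steering toward preprogrammed waypoints once a corner marker
# has given us a pose estimate. The waypoint coordinates are placeholders.

WAYPOINTS = [(1.0, 0.0), (1.0, 1.0), (0.0, 1.0), (0.0, 0.0)]  # maze corners


def heading_error(x, y, heading_rad, waypoint):
    """Signed angle (radians) to turn to face the waypoint; positive = turn left."""
    wx, wy = waypoint
    desired = math.atan2(wy - y, wx - x)
    err = desired - heading_rad
    # wrap to [-pi, pi] so we always take the short way round
    return math.atan2(math.sin(err), math.cos(err))


# Facing along +x at the origin, the first waypoint is dead ahead
print(round(heading_error(0.0, 0.0, 0.0, WAYPOINTS[0]), 3))
```

The heading error would then feed a simple steering controller, advancing to the next waypoint once the current one is within some tolerance.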

Hitchin Hackspace built a copy of the minimal maze last year. It’s the same this year, with different colours for each wall.

Straight line speed challenge
Like the maze, I’m intending to put a marker at the far end of the course and just drive towards it. Once the marker reaches a certain size, I’ll know I’m at the end of the course and can stop. This is the first challenge I’m going to attempt to program. If I get all the others done and working reasonably reliably, I may come back and try to do it without a marker, just using the colour of the edges of the course as guidance.
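The drive-at-the-marker-until-it's-big strategy could be sketched like this. The frame width, gain and stop threshold are all guesses that would need tuning on the real course:

```python
# Sketch of the straight-line strategy: keep the marker centred in the
# image and stop once it looks big enough (i.e. we're close). All the
# constants here are placeholder values, not tuned numbers.

FRAME_WIDTH_PX = 640
STOP_SIZE_PX = 120        # marker this wide => assume we're at the end
STEER_GAIN = 0.005


def drive_command(marker_centre_x, marker_width_px):
    """Return (left, right) motor speeds in -1..1, or (0, 0) to stop."""
    if marker_width_px >= STOP_SIZE_PX:
        return (0.0, 0.0)                      # end of course: stop
    error = marker_centre_x - FRAME_WIDTH_PX / 2
    steer = max(-0.5, min(0.5, STEER_GAIN * error))
    return (0.5 + steer, 0.5 - steer)          # differential drive


print(drive_command(320, 40))   # centred, far away: drive straight
print(drive_command(320, 150))  # marker large: stop
```

This is just proportional steering on the marker's horizontal offset; a real run would also want to handle frames where the marker isn't detected at all.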

 

Duck shoot (target shooting)
This challenge is intended to be shooting targets while manually steering the robot. I’m hoping to do autonomous aiming through object detection. I’ve picked a gun (blog post coming up on that) and got it electronically actuated, so I “just” need to find the targets and aim. I’m hoping the targets will be rectangles of a consistent colour, or at least something easily identifiable using OpenCV, but that’s up to the organisers and course builders to let me know. I know roughly the size, position and distance of the targets, so I may be able to use that to narrow down which detected shapes are targets.
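Narrowing detections down using known size could be as simple as filtering bounding boxes, along these lines. The area and aspect-ratio thresholds are placeholders until the organisers confirm what the targets actually look like:

```python
# Sketch of filtering detected contour bounding boxes down to likely
# targets using a rough expected size and aspect ratio. The thresholds
# are assumptions for illustration only.

MIN_AREA_PX, MAX_AREA_PX = 400, 5000   # guessed from range and target size
MIN_ASPECT, MAX_ASPECT = 0.5, 2.0      # width/height of a roughly square target


def likely_targets(boxes):
    """Keep (x, y, w, h) boxes whose area and aspect ratio look target-like."""
    keep = []
    for (x, y, w, h) in boxes:
        area = w * h
        aspect = w / h if h else 0
        if MIN_AREA_PX <= area <= MAX_AREA_PX and MIN_ASPECT <= aspect <= MAX_ASPECT:
            keep.append((x, y, w, h))
    return keep


detections = [(10, 10, 30, 30), (0, 0, 300, 20), (100, 50, 40, 60)]
print(likely_targets(detections))  # drops the long thin false positive
```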

 

Pi noon (1 v 1 balloon popping)
This is going to be tricky! I’m again intending to put markers at the corners of the arena so I can judge position. After that, I can either drive around randomly or look for coloured round things (balloons) and drive towards them. Hopefully after I’ve got the code for Over the Rainbow and Minimal Maze challenges working, this one should be more than halfway there. I think spotting balloons may be tricky though.
Slightly deranged golf
The golf challenge is a little like pi noon, in that there’s a round object on a course that’s going to be difficult to catch. I’m going to attempt it the same way, programming a waypoint for the hole and looking for round white objects that might be balls. Very tricky.

Golf Course from last year, note the steep hill at the start that caused the ball to roll into a hazard

Obstacle course
Again, by putting markers on the corners of the course and programming waypoints, like the minimal maze, I’m hoping to navigate the course autonomously. The extra 3D-ness of the obstacle course will make this more difficult, as the markers may not always be visible. The main difficulty will be the turntable though; I may need to put a marker on the turntable, or use some other trick. I’m leaving this challenge until last, as it’s the most difficult.

 

Obviously it’s still early days, so these plans may change as I get into them and find the image processing too challenging for some reason, but hopefully I can complete at least some of the challenges using the camera.

Next week: design and build notes

 

Reminder: please vote for Hitchin Hackspace on the Aviva site here: Hitchin Hackspace Community Workshop. We are very close to getting the funding to enable our Hackspace; we just need a few more votes.

Pi wars 2018

29th Oct 2017

It’s Pi Wars time again! If you were following along the last few years, you’ll know that Hitchin Hackspace has previously entered the Raspberry Pi powered robot competition with some innovative and challenging robot designs, sometimes with great success, often with stress and very late nights towards the end. This time we’re doing things a little differently. On the one hand there’s the A team of Dave, Martin, Brian, Mike and Paul, taking the typical Hitchin approach; on the other hand there’s, well, me. I’m being politely referred to as the rebel, or more frequently Team Defectorcon.
Why the split? I want to take a different approach; a somewhat risky strategy that’s unlikely to be competitive, and I knew the rest of the team would prefer something more along the lines of what they’ve done before.
So what’s the difference? Hitchin Hackspace typically goes for highly optimised designs, with attachments specifically tailored for each challenge, attached to a high performance drivetrain and using carefully selected sensors. I’m going to be using a robot kit as the basis of my entry, and my only sensor is going to be a camera. I’m hoping to use off the shelf items wherever possible even if they may be a little compromised. In addition to that, I’m going to attempt to enter *all* challenges autonomously, even those intended to be driven using remote control. By starting with a kit, I’m hoping that I can get a moving chassis very early on, so I can get maximum time to develop and test the software.

Progress has been very good so far. I decided to use Coretec Robotics’ ‘Tiny’ chassis very early on, as it’s a nice, popular platform that other Pi Wars competitors will recognise, it’s not that expensive and it’s not super fancy (important, as I want this to be a relatively budget build and show others what can be done with simple and cheap mechanics). I saw them at the competition last year and was impressed how well such a small chassis coped with the obstacle course. In fact, in many places it had an easier time than its larger competitors.

The Tiny kit comes with the basics of a moving chassis: wheels, motors, chassis (including camera mount) and motor controller. At a minimum, that leaves me to select the Raspberry Pi board I’m going to use, along with the battery, voltage regulator, camera and LEDs. All Pi Wars bots need extra LEDs :-). Taking those in order:

Pi board: This was an easy one. Since I want to use computer vision for the challenges, it’s inevitably going to be computationally intensive, so I wanted the fastest board I could get: a Pi 3 B.

Battery: As above, due to the processing power (and therefore power consumption) and learning from previous years, I knew I wanted the highest-capacity battery I could fit in. Lithium polymer batteries offer the best energy density and value at this scale, so that decided the battery technology. The stock Tiny kit motors are rated for about 6V, but I expected they’d cope with being slightly over-volted for a bit more speed, so I went with a 2-cell pack (7.4V nominal). HobbyKing do a vast array of lipo batteries and I knew from previous projects that the quality is OK, so I used their battery chooser to find the highest-capacity pack that would fit within the kit chassis: 2.2Ah. That will do nicely :-)
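For a rough sense of what 2.2Ah buys, here's a back-of-envelope runtime sum. The current draws are my own guesses (a Pi 3 under load plus two small motors), not measurements:

```python
# Back-of-envelope runtime check for the 2.2 Ah 2S pack. The current
# figures are rough assumptions, not measured values.

CAPACITY_AH = 2.2
PI_CURRENT_A = 1.0        # Pi 3 working hard, roughly
MOTORS_CURRENT_A = 1.0    # two small geared motors, roughly

total_a = PI_CURRENT_A + MOTORS_CURRENT_A
runtime_h = CAPACITY_AH / total_a
print(f"~{runtime_h * 60:.0f} minutes per charge")
```

Around an hour per charge at those draws, which should comfortably cover testing sessions and a competition day with a spare pack.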

 

Voltage regulator: HobbyKing also do a bunch of regulators (often called a BEC in radio control circles – a Battery Eliminator Circuit, i.e. it eliminates the separate pack that used to be used to power an RC receiver). I picked a 5A version so it would definitely be adequate for the burst current a Pi may need.

Camera: I went for the Raspberry Pi branded camera as I knew it would *just work*. I’ve also bought a clip-on wide-angle lens to hopefully reduce the amount of turning needed to spot things. The quality of the wide-angle lens isn’t great though; I may have been better off getting a SainSmart Pi camera, as it already has a built-in, well-matched wide-angle lens and is a bit cheaper.

LEDs: most Pi Wars bots have some flashy RGB LEDs on them for decoration. I also wanted LEDs but, after a suggestion from Rob Berwick, I’m going to try to use the lighting for something functional. One of the challenges with computer vision is variable lighting conditions, particularly shadows and colour changes. Sharp shadows can create edges and shapes that the detection algorithms can mistake for the actual target. By having very high-power white lights, I’m hoping I can balance out the sun, reducing the effect of shadows. Beating the sun requires *a lot* of light though: about 15 watts by my calculations (1000 lux over about 1m^2 requires 10-20 watts). SparkFun do a pack of five 3-watt LEDs, so I’m going to set up an array of wide-beam headlights.
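The sum behind that figure, roughly as described: white LEDs manage on the order of 100 lumens per electrical watt, and 1000 lux over 1 m^2 is 1000 lumens, so about 10 W at the LEDs before optics and beam spill push it towards 15-20 W. All figures here are ballpark:

```python
# The "beating the sun" estimate. Every figure is a round-number
# assumption: target illuminance, lit area, and LED efficacy.

TARGET_LUX = 1000.0       # bright enough to wash out soft shadows
AREA_M2 = 1.0             # patch of ground in front of the robot
LED_LM_PER_W = 100.0      # typical white power-LED efficacy, roughly

lumens_needed = TARGET_LUX * AREA_M2   # lux * area = lumens
watts_needed = lumens_needed / LED_LM_PER_W
print(f"~{watts_needed:.0f} W of LED power before losses")
```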

 

So those components cover the basics. I also wanted a screen on the bot for debugging, so I went with a fancy touch screen from Adafruit, which also lets me drive menus. Unfortunately, after buying the screen, I realised it had created a bit of a challenge. The screen uses about 8 of the Pi’s GPIO pins, and many of them clashed with the pins of the Explorer motor controller board that comes with the Tiny kit. That controller also doesn’t work well when stacked under a screen, and it can only drive the motors at 5V (from the Pi), not battery voltage. I went looking for another motor controller, but couldn’t find one that met all my needs, particularly the need to stack. The best I could find was a Picon Zero from 4tronix, but that doesn’t stack either, so I needed to find a way to get the necessary control signals to it. Luckily the Adafruit screen includes a second set of pins on the underside, duplicating the full GPIO array, so I’m planning to use a ribbon cable to connect them to the motor controller. Getting this to fit turned out to be a much bigger headache than I’d expected.
With my component selection covered, I’m going to leave it there for this week, other than to say a CAD model of the current design is now on Grabcad here: https://grabcad.com/library/pi-radigm-v1-piwars-robot-1

I’ve done a wiring diagram with Fritzing:


And the build currently looks like this:

 

Next week I’m hoping to get the wiring finished and some basic software going so that it’s driving, and I’ll talk about how I’m planning on attacking each Pi Wars challenge.

Reminder: please vote for Hitchin Hackspace on the Aviva site here: Hitchin Hackspace Community Workshop. We are very close to getting the funding to enable our Hackspace; we just need a few more votes.

Thanks

Mark

*Thanks to Pimoroni and Coretec robotics for letting me share CAD of their parts.

Piwars 3.0 Deadline Looms

28th Mar 2017

The final few days are here!

Obviously all competitors will have completely finished their robots and will have dialled in the settings to perfectly tune each task to run at peak performance… yer right.

Last year we ran so close to the deadline that we were not just tweaking challenge code on the competition day, we were completely writing some challenges from scratch. This year we made a conscious effort to start early so that wouldn’t be required. So far that hasn’t gone exactly to plan. We started early, for sure, but got so bogged down in details and minutiae that little things, like getting the robot driving early on to allow for driving practice, seemed to fall by the wayside.

In fairness, last year we went all out in the last 2 weeks. So much so that it caused a few “issues” with home lives, requiring the use of many, many ‘brownie points’. This year has been a definite improvement. So far we have managed to avoid the last mad-dash effort, but it’s still going to be close.

We have made the stripboard version of the breadboard electronics. It’s pretty simple, as it is mostly just I2C pins with a small number of magic components that make our lovely laser sensors work beautifully. The soldered board is significantly smaller than the breadboard, which will make the bot a little neater.

We’ve been playing with LEDs. This was one simple test to make TITO2 “shine” above the rest. As it’s not part of the basic challenges, we’ve left it a little late. However, Brian has come up with some “bright” ideas… really sorry, couldn’t resist.

Over the last few days/hours we have got the line-following sensor working and mostly coded up. That’s a big improvement on last year, when we wrote that challenge off near the end. It’s taken quite some research to find a sensor that can read the laminated sample provided. With some testing, playing, research and purchasing power, we now have a sensor board that works with the Raspberry Pi – not directly, but via an Arduino Nano. That’s a bit annoying, as if it weren’t for this, the only real electronic device other than the ESCs would be the Pi Zero W. But it’s only on the sensor mount, and the sensor was enough of a pain to get information and code samples for from the manufacturer that I’m still classing it as a win. I should stress that although the sensor picks up the printed line on the vinyl print, the readings are much closer to white than the black-electrical-tape readings, so it may still prove less reliable in tougher testing.

Anyway, I hope you are all progressing well. There may be one or two more posts before the deadline, depending on time, energy and sleep requirements.

🙂

Piwars 3.0 ‘T Minus’ Countdown Continues

25th Mar 2017

Starting to feel the time pressure a little now…

TITO 2 has progressed from early prototype into the early stages of the final bot. This means that the laser-cut chassis has been altered to include all known mounting points for the electronics, as well as being shaped for the b-e-a-utiful 3D-printed clamshell bodywork.

What this has meant is that the motors, battery, wiring harness, pi, connected sensors and all other items have been meticulously and carefully transferred from the prototype to the release candidate chassis. The result of that is the motors are now obviously backwards, the left distance sensor is now the front…well, you can see where I’m going. The motors being wired backwards was actually a blessing in disguise as it wasn’t noticed till then that RC mode was driving backwards (sad face).

So, RC mode now correctly drives forwards, but our autonomous modes all flip out. It should be a simple matter of code changes, but it’s all time that we know we have little of.

I2C OLED screen

The team are working tirelessly to get the robot ‘fighting fit’, so Paul has taken the I2C circuit design and strip board away to solder up a proper (non breadboard) I2C connector/extender for all the I2C sensors and screens that all need to connect to the single set of 4 pins on the Pi. Martin, Rob and I have gone to sit in a dark room and think about what needs to be done on the code base to progress the project ready for yet more driving tests and tweaks on Monday. Paul also knocked up a laser cut rig for the golf challenge, so that will need to be assembled and ‘electrified’ ready for testing on Monday as well.

Mark has been busy in the evenings designing 3D objects to be printed that will safely hold our sensors so that the electronics don’t fall off or get damaged by inevitable contact with the walls.

Brian has volunteered to take charge of the line-following sensor. This is an area that has hit problems. We noticed that the line detection was sketchy at best when using the line sample kindly provided by Piwars Mike and Tim. We think the sample has been laminated with some sort of IR-reflective coating (probably to stop the course/banner from fading in sunlight). This means that our sensors (and apparently other groups’ sensors as well) have struggled to see the change in colour, as they only see a reflection. The effect is similar to the two-way mirrors seen in films, where several ‘usual suspects’ line up in a well-lit room: they cannot see the people judging them in the dark room on the other side of the glass, mainly because their room’s reflections overpower the small amount of light coming from the dark room. To try to overcome this issue we have been playing with all kinds of techniques: blocking out all external light sources, changing the sensor’s distance from the surface, changing the angle of the sensor to the surface, disabling the onboard IR LEDs, and using an external IR source at a significantly different angle. None of these bore much fruit, and neither did the following test to see whether our IR sensors were sensitive to visible light. Good news: they are… bad news: not enough (again, sad face). Our final test was to use black electrical tape over the top of the sample line. This instantly gave a solid detection using the standard sensors in the standard manner. Our options are rapidly disappearing. Two remain: use a different sensor to detect the line (probably the Pi camera currently being tested by Brian), or kindly request that Piwars overlay black electrical tape onto the course.

Anyway, to quote a good film ‘Keep moving forward’ (without googling, you have 3 guesses which film).

Piwars 3.0 ‘T Minus’ Countdown Begins

22nd Mar 2017

I recently received the email from Piwars reminding us that the end is in sight. “Thanks Mike and Tim!”

I don’t know about the rest of you, but it certainly gave me that same warm fuzzy feeling inside that you get when you wake up knowing you are late for work!

Where are we with our robot? Well, that is a very good question. We meet as a group at least twice a week to push the project forward, but we also use our evenings and weekends as well. Although we did get stuck with EM interference issues for a good couple of weeks, we have now recovered from that and made a huge surge forward with getting TITO 2 ready for the impending challenges.

  • TITO 2 , although still in prototype mode, has now successfully and repeatedly navigated our mock up of the Minimal Maze. Seeing it do this really helped me believe that we will get it ready in time.
  • We also have tested TITO 2 on the straight line speed track, however not at full speed…not even close. I’d like to get a few more runs at this to dial in our tweaks so that it doesn’t touch the sides.
  • Our RC mode has also been tested (to the dismay of Mark who very kindly spent at least 40 hours 3D printing the shell). We know that it at least works in this configuration, but there are some significant improvements to be made, as the general consensus is that it is “a little twitchy”. That and the tank steering mode has made it nearly undriveable to all but experts (ie Mark).
  • Line following has hit a snag. Our sensor board is clearly suffering an injury of some sort and so we may have to fall back to OpenCV and a Pi Camera. As we have little to no code for this yet, I’m not even slightly concerned………
  • Slightly Deranged Golf is basically RC mode with a ball moving attachment. This attachment is yet to be added to the bot but has been tested in prototype stage.
  • Pi Noon is again basically RC mode. Once we’ve sorted the steering and throttle issues, I’m sure we will have this pegged!
  • Skittles. This one always fills me with joy. We have made progress and have a prototype that will actually do this. I mean, it’s not tested in any way shape or form. But surely that’s just minor details?

Here are a few short videos showing off what we have accomplished so far.

So, after checking back in with you I hope you can see that we are moving forward!

Piwars 3.0, back from radio silence

3rd Mar 2017

Sorry for no posts recently, it appears life can get really busy, really quickly!

Since the last post we haven’t been idle though. The prototype bot is now drive-able both remotely and autonomously using distance sensors. We have a short video of it following a wall (albeit very slowly), but it is doing it!

https://drive.google.com/open?id=0B2Socfm_uuw9VDBXbFZIaGZmelE

Whilst busily working away testing sensors, we encountered a problem…and I quote Martin

“After some initial success with autonomously driving our test chassis using an old analog distance sensor, we set about upgrading to some snazzier laser distance sensors with much better resolution, talking to the Pi Zero over I2C.  In the test setup the sensors dutifully reported the distance they measured, but as soon as we tried to actually drive the robot the sensors started to reset themselves, losing their bus addresses and returning garbage values. As the problems started when we started turning the motors, the suspected causes were either noise on the power supply, or radio interference.  

We had previously had some problems with these sensors which turned out to be a loose ground pin, but tightening up all the connections and adding a capacitor to smooth the power supply didn’t help.  We then put together a test setup with a Pi reading values from the distance sensor, and a motor wired up to a separate battery through a switch so it would have no effect on the Pi’s power supply. Sure enough, a quick spin of the motor was enough to disrupt the I2C sensor, so radio interference had to be the culprit.
We brought out the oscilloscope, and probed the XSHUT (or “don’t turn off”) pin on the sensor. This pin has to be held at 2.8V for normal running, as bringing it to ground will reset the sensor. There was a little noise on the signal when the sensor was running which was likely to be crosstalk from the signal wires, but at 0.1V it wasn’t significant enough to cause problems. As soon as we ran the motor, however, the scope lit up like the proverbial Christmas tree.  The wire from the XSHUT pin to the Pi’s GPIO header was acting as an antenna, picking up interference from the motor next to it and triggering the sensor’s reset function. 
We tried a couple of different changes to keep the signal under control, with the eventual winner proving to be a large capacitor that would soak up the interference and banish the electrical gremlins.  With the capacitor in place, the oscilloscope showed the XSHUT signal staying calm, and the motor could happily spin away right next to the sensor with no interference issues.  Result: four happy roboteers :)”
Interference before and after
Above is a screenshot of the oscilloscope before and after. Note how the before (upper half) is looking all over the place, and how it has been resolved (mostly) in the lower half of the image to a single static line.
Not only have we managed to resolve a very problematic issue, we have done away with most of the additional controller boards. We moved to a Pi Zero, as was always the intention, but we have dropped the requirement for an Arduino, as we are now using digital I2C distance sensors. Not only that, but we then decided to use a Zero4U board to allow both WiFi and Bluetooth dongles without a huge USB hub plugged into the USB OTG port. That said, the Pi Foundation has just released the Pi Zero W, which has those two components on board… so we dropped the Zero4U from the requirements as well. Other than the ESCs, sensors, battery/regulator and the voltage monitor, the whole thing runs from one single Pi Zero W at the moment.
Barebones
If you watch the video linked above, you will see exactly how much the hardware has been reduced. It’s all coming together now!

Piwars 3.0 Day 4

7th Feb 2017

Hi World,

I promised proper coding and possibly some testing…well I may have to disappoint. Coding has happened (although as some of it’s mine I wouldn’t class it as proper) but we have only limited testing so far, no vehicle movement, just individual components.

We have been attempting to find a secure way to attach our wheels to the motors. Last year we 3D printed wheel hubs and they worked, mostly. In last year’s testing they proved reliable – reliable at falling off, that is – so we resorted to glue, which meant minimal ability to replace motors or wheels should something break… which it did on the day (the rear-left gearbox on the motor was having huge issues). So this year, something new.

Aluminium hub

The silver round metal object is our lovely new aluminium wheel hub. It’s tiny, which is both good and bad. The centre hole is 4mm and fits our motors beautifully; however, the grub screw and 4x bolt holes are just too small for the forces we expect to shove through it. Not to mention that the eBay special has an American-standard screw thread, which none of us can find in our spares boxes.

So, onto the work that needs to be done. We need to re-tap them, but obviously we can’t keep the hole the same size as the new thread would be so weak against the old thread. So we are upsizing them from M3 (ish) to M4. This will give (hopefully) a stronger grip for the bolts that will be holding the wheels onto the motor shaft.

Beautiful 3D-printed hub

And the result: 4x colour-coded Allen cap-head bolts for added stealth. I say stealth; amusingly, the first test 3D print of the lovely new wheel hub (designed by Mark last year and lovingly modified by Paul for the new bolts) was done in a blue ABS plastic, as that was what was still in the 3D printer. Group members have already shown a liking for the blue/black combo, so we will have to wait and see which colour scheme we end up with.

Mark Working Hard

Mark in the photo to the left can be seen working hard (as he usually does) re-tapping the aluminium shaft blocks, and the rest of us can just be seen performing all kinds of ungodly wiring and coding stuff. The really really badly tangled wiring mess in front of Mark is my desk, but they always say “If a cluttered desk is a sign of a cluttered mind, of what, then, is an empty desk a sign?”.

So, coding. There are currently two of us working on this task, and we are taking last year’s code as an example of what and what not to do. The on-board hardware has changed significantly to reduce size and weight, so a lot of the old code is no longer useful, as it was for talking to devices no longer present.

Python Module Sketch

This little sketch is our first stab at trying to organise what talks to what and how many modules we will need. Safety is always paramount with us, so we will be utilising last year’s method of always having something listening to the controller, so the motors can be put into neutral at ANY point, no matter what the robot is doing. This will probably be done with the launcher thread running permanently in the background, always listening for a specific button press and locking out any other module’s access to the motors if need be.
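The always-listening safety idea could be sketched like this: a guard object that any part of the code (for instance the controller-listening thread) can trip, after which the motors are forced to neutral regardless of what a challenge module asks for. The class and method names here are invented for illustration, not the team's actual code:

```python
import threading
import time

# Sketch of a motor e-stop guard. Once emergency_stop() fires (e.g. from
# the launcher thread watching the controller), every later speed request
# is overridden to neutral.

class MotorGuard:
    def __init__(self):
        self._estop = threading.Event()
        self.left = 0.0
        self.right = 0.0

    def emergency_stop(self):
        """Called from the controller-listening thread on the kill button."""
        self._estop.set()

    def set_speeds(self, left, right):
        """Challenge modules call this; ignored once the e-stop has fired."""
        if self._estop.is_set():
            self.left = self.right = 0.0
        else:
            self.left, self.right = left, right


guard = MotorGuard()
guard.set_speeds(0.8, 0.8)
# kill button "pressed" from a background listener thread:
threading.Thread(target=guard.emergency_stop).start()
time.sleep(0.1)
guard.set_speeds(1.0, 1.0)       # challenge code keeps asking...
print(guard.left, guard.right)   # ...but the motors stay in neutral
```

Using an Event keeps the check cheap and thread-safe, so the challenge code never needs to know about the controller at all.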

And then we segue into course construction:

Brian and Johnny Building Course

What you can see here is Brian and Johnny thinking way too hard about the course measurements. They are perfectionists (which I love) but are slightly stifled by the lack of measuring tools. They have found an old school ruler in one of the cupboards and are using that to measure and cut wood for the minimal maze task. Rather them than me.

So with course prep going on and lots of wiring and coding progressing, I REALLY want to see a vehicle test in the next week. I think we can do it (if I pull my thumb out). I know it will boost the teams confidence if the thing actually moves in a controllable way.

Watch this space…..

Piwars 3.0 Day 3

3rd Feb 2017

Oooooh, there is so much I want to share with you all about our newest robot… but I can’t. Sorry, sworn to secrecy 🙁

Don’t dismay, here is what I CAN tell you:

Over the last couple of years I’ve assisted in building several wheeled bots and prototypes (actually, counting back in my head, it’s more than I realised). These range from tiny little three-omni-wheeled micro Pi Noon entrants all the way up to our beloved bighak. One thing that they all have in common during the building stage is that we haven’t ever built a rig testing station. By that I mean (excluding bighak, due to its size and weight) they have all rested on a china mug when performing motor tests. Normally this induces a fair amount of fear in me, as the vehicle spins the dangling wheels up to full speed (not always in the same direction as each other, and not always stably…) with loose wiring precariously close to the grippy rubber tyres.

World, say hello to our newest creation:

Assembled Test Bed

Laser Cutting Test Bed

 

It may not be the most photogenic, but this is a big thing! We now have a ‘safe’ way to test our prototype AND drink our cups of tea, something not previously possible.

Now, we’ve had to deal with this previously, but as I’m in charge of the blog I thought it required mentioning, so that future Pi designs might be tweaked. Every time we come to mount the Pi (or Pi camera module) we very quickly remember that the mounting holes are M2.5. If you didn’t already know, M2.5 bolts are not generally available from hardware stores, as they are tiny. The smallest I’ve seen is M3, of which I now have a small stock, with nylock nuts to pair with them. Just to be clear, I’m not saying they are impossible to come by – far from it; a quick google will probably show loads of places to buy them online. What I’m trying to get across (and probably failing) is that not many DIY projects use them, and so I don’t tend to hold a stock of them at home. This has previously led us to do something drastic:

Drilling out a Pi Zero mount hole

What you can see here is me finger turning a 3mm drill bit through an existing 2mm mounting hole. I’m using my fingers and not a drill mainly because the drill is cordless and not charged as I had forgotten I’d needed to do this 🙁

Before I get any grief, please don’t try this at home. I’m a responsible adult who will take responsibility if I break my own Pi. I don’t want to be responsible for anyone else who copies me.

 

Wii Classic Controller

Onto the “new” toy – ah, the wonders of the second-hand market! Last year we used a Wii controller with its nunchuk to drive our robot wirelessly. This year we wanted to improve on this, as we were basically holding two separate controllers yet only really using one hand/thumb to drive the thing. I’ve been researching PS3 controllers and have done a fair few tests, with positive results when interfacing with a Pi and Python. Whilst the PS3 controller is ergonomic, I don’t like how it pairs with the Bluetooth adapter. It isn’t a simple button press, and it requires knowledge of the ‘dark side’ to get it to work. This means that if the PS3 controller broke or ran out of juice mid-contest, we’d be frantically typing on a laptop to try to pair a backup. So, back to the Wii controller it is, because it easily pairs each time the robot boots by pressing 2 buttons on the controller. This means we can have many spares, and no one needs divine knowledge as to which one is currently paired.

Whilst I was passing a favourite second-hand shop in town, they happened to have a Wii Classic Controller going cheap. I knew the Wii had plastic remote holders in the shape of guns and steering wheels, and that there were balance boards and the like, but I wasn’t aware that these were made… which probably shows just how much use my Wii actually got 🙁

I bought the controller and hurriedly ran home like Charlie from Charlie and the Chocolate Factory to see if I had purchased something useful or whether I’d wasted my money. I was in luck: the cwiid Python module supports it (mind, it took a fair bit of googling and GitHub searching to find the exact state text to request from the module and each button ID to test for).

If you have very good eyes you might be able to see a load of text on the screen behind the controller above. Each one of those is a test for a button state and a print line to report a button pressed.
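Those per-button tests boil down to checking bits in the raw button word the module reports. Here's a small sketch of that decoding; the bit assignments below are invented for illustration, whereas the real values for the classic controller come from cwiid's own constants:

```python
# Sketch of decoding a controller's raw button word into named buttons.
# These bit positions are hypothetical placeholders, not the actual
# cwiid classic-controller constants.

BUTTONS = {
    0x0001: "A",
    0x0002: "B",
    0x0004: "X",
    0x0008: "Y",
    0x0010: "start",
}


def pressed(button_word):
    """Return the names of all buttons whose bits are set in the raw word."""
    return [name for bit, name in sorted(BUTTONS.items()) if button_word & bit]


print(pressed(0x0003))  # A and B held together
```

A table-driven decode like this replaces the screenful of individual if-tests with one loop, and adding a button is just one more dictionary entry.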

So, at this point we have a basic test rig and a basic controller input. Next up will be proper coding and hopefully testing.

Piwars 3.0 Day 2

25th Jan 2017

Before we go any further lets agree to keep any costs between us. I wouldn’t want my family knowing exactly what I’ve spent on my hobbies instead of holidays 😉
First putter test

So with that out of the way, the basic drive-train has been decided and building has started, so next we need to focus on some of the challenges. Golf first, as it has piqued our team’s interest for its comic potential. Amusingly, the photo above actually shows us doing first feasibility tests of an arm-mounted putter on a cheap lightweight servo. When I say putter, I actually mean a metal ring with a pencil stuck to it 🙂

I can’t believe how many ‘unlikely to succeed’ ball-moving suggestions we have gone through for the golf challenge. It started with the simple yet effective servo-mounted putter, paused briefly at a variable-power solenoid kicker, and ended with a gas-powered combustion ball launcher. Seriously, someone (probably me) actually thought it might be funny to “fire” the ball… not a good idea: not power-adjustable between shots and DEFINITELY not guaranteed to keep the ball touching the ground at all times.

Still, we haven’t actually settled on the putter idea, as the solenoid one keeps rearing its little head because it seems ‘simpler’ and ‘less prone to mechanical failure’. On testing the metal putter above, we quickly noticed that any competing solenoid design would either need a reasonable mass on the moving arm or be extremely powerful to have any chance of moving the ball any distance. Then comes the adjustable-power problem… so, so far the putter is winning.

Arduino Nano, with a carefully placed Raspberry Pi NOIR camera in shot.

The first round of tests was done using an Arduino Mega, as the team member writing the test code to make a servo move had one to hand, but obviously we won’t have room for such luxury in our bot, so the second round of testing will be done on the Arduino Nano shown above. The reason for using Arduinos to control the servo is basically PWM timing accuracy. Obviously the Raspberry Pi will be in control of the whole system and will tell the Arduino exactly what to do at any point, but why not sub out some of the low-level work to devices that are designed to do this kind of thing day in and day out? That said, the Arduino will also be used for multiple things in combination with the Pi. One such task will be analog-to-digital conversion, as the Pi has no analog GPIO pins and a Nano is a very cheap way to incorporate this feature.
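The “Pi decides, Arduino times the pulses” split just needs the Pi to send simple commands down a serial link. This is a sketch under stated assumptions: the one-letter `S<channel>,<microseconds>` protocol and both helper names are invented for illustration — the real firmware would define its own commands, and on the Pi the resulting line would be written to the Nano with a serial library:

```python
def angle_to_pulse_us(angle_deg, min_us=1000, max_us=2000):
    """Map a servo angle (0-180 degrees) onto a standard 1-2 ms pulse width."""
    angle_deg = max(0, min(180, angle_deg))  # clamp to the servo's travel
    return int(min_us + (max_us - min_us) * angle_deg / 180)

def servo_command(channel, angle_deg):
    """Build one ASCII command line for the Nano's firmware to parse."""
    return "S%d,%d\n" % (channel, angle_to_pulse_us(angle_deg))

# e.g. swing the putter servo (channel 0) to its 90-degree midpoint:
print(servo_command(0, 90))  # prints the line S0,1500
```

Keeping the protocol as plain ASCII lines means it can be debugged from any serial terminal, and the Nano’s hardware timers take care of the jitter-free PWM that the Pi struggles to produce.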

Motor kit.

Now that the drive-train design has settled down, we thought it wise to buy spares of everything: 8x motors, 8x motor controllers (ESCs) and lots and lots of connectors and wire. Last year we went through soooooo many cheap ESCs. Since then we have found a decent supplier, albeit in Australia, and have stocked up. Plus it made the delivery cost more efficient — well, that is what I’m telling my wife.

Well, onward we go and always attempting to avoid the bunkers 😉

Piwars 3.0 Initial Preparation

17th Jan 2017

Well, after having our entrance accepted maybe we should actually get on and build something.

The Piwars challenges have evolved since the last competition. Some are similar and need only tweaks to our previous design, whereas other challenges are new but kind of build on previous years’ challenges. We’ve been thinking about this for some time (my excuse for not actually doing any work until now), but now is the time to put some parts together and build a drive-able bot.

First things first: planning. Do we dismantle Tito, our old bot, or build from scratch?

Tito Robot

Previous Piwars robot called TTOS aka Tito.

Overwhelming emotions seem to be stopping us from destroying our previous creations. Besides, we intend to make our new bot smaller, lighter and based on a Pi Zero, so other than the wheels and motors (which have been hammered in testing and competing), not much else is transferable.

Laser cut frame

Laser cut prototype frame/chassis.

Mark Mellors kindly designed and laser-cut a frame onto which we can attach our motors and ESCs, so that we could start putting stuff together. I put together the basics of a wiring loom. It’s modular (detachable connectors in key positions), so motors and/or ESCs can be swapped out in case the magical blue smoke is accidentally released 🙂

We suffered last year from at least one damaged gearbox, so this year I’d love to have an easily replaceable drive system should the worst happen.

Prototype wiring loom

Prototype wiring loom

Experience from last year showed that building a robot and being able to drive it competitively are two separate things. We were a little close to the deadline last time, as we were still writing… erm, I mean tweaking… code on the morning of the event and in between challenges. So this year we are all very keen to get a drive-able prototype together so everyone can practice driving and have confidence that we can actually complete a challenge or two.

Prototype progress so far:

  • Frame laser cut.
  • Motor clamps 3D printed.
  • Motors bolted to frame.
  • Motors wired with detachable connectors to ESCs and battery.

Prototype part assembled

Prototype part assembled.