Hack Hitchin

Piradigm – Approach to Pi Wars challenges 2018

12th Nov 2017

Quick progress update: chassis wired and driving in remote control mode!

 

So now I’ve got the basic functionality working, I can start on programming for the challenges. In principle I could compete in three of the challenges right now, as they’re intended to be entered with manual control. As I mentioned last time though, I’m hoping to attempt them autonomously, as well as the mandatory autonomous challenges. Here’s how I’m intending to approach them all:
Somewhere Over the Rainbow (driving to coloured balls in sequence)
I’m hoping to use OpenCV’s contour detection for this challenge: spot shapes, filter them by size, shape and colour to identify the coloured balls, then use the apparent size of the ball (and possibly the height of the arena) to help judge the distance to it. I have a few ideas to make the image processing quicker and more reliable: mount the camera at the same height as the balls, and prioritise looking where the balls should be, based on their expected height and angle relative to the start position.

 

Minimal maze
To help with many of the challenges, I’m intending to use external markers (like QR codes) to help track the robot’s position and orientation on the course. This year the minimal maze allows markers to be used, so I’m intending to put an ArUco marker on each corner, in the hope that a wide-angle lens will always be able to see at least one marker, giving me my position on the course at all times. I’ll pre-program waypoints for each corner of the track and use the markers to help navigate to them in sequence.
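The navigation step boils down to plain trigonometry: if a marker’s position on the course is known, a range-and-bearing observation of it gives the robot’s position, from which the turn and distance to the next waypoint follow. A minimal sketch (the coordinates and angles here are made-up examples):

```python
import math

def locate_from_marker(marker_xy, range_m, bearing_rad):
    """Robot position, given a marker at a known course position and a
    world-frame range/bearing observation of it from the robot."""
    mx, my = marker_xy
    return (mx - range_m * math.cos(bearing_rad),
            my - range_m * math.sin(bearing_rad))

def steer_to_waypoint(robot_xy, heading_rad, waypoint_xy):
    """Signed turn angle (radians, +ve anticlockwise) and straight-line
    distance from the robot's pose to the next waypoint."""
    dx = waypoint_xy[0] - robot_xy[0]
    dy = waypoint_xy[1] - robot_xy[1]
    turn = math.atan2(dy, dx) - heading_rad
    turn = (turn + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi)
    return turn, math.hypot(dx, dy)
```

Running this against each freshly-seen marker, then steering to the waypoints in order, is the whole navigation loop in outline.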

Hitchin Hackspace built a copy of the minimal maze last year. It’s the same this year, with different colours for each wall

Straight line speed challenge
Like the maze, I’m intending to put a marker at the far end of the course and just drive towards it. Once the marker reaches a certain size, I’ll know I’m at the end of the course and can stop. This is the first challenge I’m going to attempt to program. If I get all the others done and working reasonably reliably, I may come back and try to do it without a marker, just using the colour of the edges of the course as guidance.
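That stop condition is the pinhole camera model again: the marker’s apparent width in pixels maps to a distance, and the robot stops once the distance drops below a threshold. A sketch, with placeholder values for focal length, marker size and stop distance:

```python
def marker_distance(pixel_width, marker_width_m, focal_px):
    """Pinhole estimate of distance to a marker of known physical width."""
    return focal_px * marker_width_m / pixel_width

def should_stop(pixel_width, focal_px=500.0, marker_width_m=0.1, stop_at_m=0.5):
    """True once the marker looks big enough, i.e. the robot is close enough."""
    return marker_distance(pixel_width, marker_width_m, focal_px) <= stop_at_m
```

In practice the focal length (in pixels) would come from a one-off camera calibration, and the stop distance from the robot’s braking distance at full speed.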

 

Duck shoot (target shooting)
This challenge is intended to be shooting targets while manually steering the robot. I’m hoping to do autonomous aiming through object detection. I’ve picked a gun (blog post coming up on that) and got it electronically actuated, so I “just” need to find the targets and aim. I’m hoping the targets will be rectangles of a consistent colour, or at least something easily identifiable using OpenCV, but that’s up to the organisers and course builders to let me know. I know roughly the size, position and distance of the targets, so I may be able to use that to narrow down which detected shapes are targets.

 

Pi noon (1 v 1 balloon popping)
This is going to be tricky! I’m again intending to put markers at the corners of the arena so I can judge position. After that, I can either drive around randomly or look for coloured round things (balloons) and drive towards them. Hopefully after I’ve got the code for Over the Rainbow and Minimal Maze challenges working, this one should be more than halfway there. I think spotting balloons may be tricky though.

Slightly deranged golf
The golf challenge is a little like pi noon, in that there’s a round object on a course that’s going to be difficult to catch. I’m going to attempt it the same way, programming a waypoint for the hole and looking for round white objects that might be balls. Very tricky.

Golf course from last year; note the steep hill at the start that caused the ball to roll into a hazard

Obstacle course
Again, by putting markers on the corners of the course and programming waypoints, like the minimal maze, I’m hoping to navigate the course autonomously. The extra 3D-ness of the obstacle course will make this more difficult, as the markers may not always be visible. The main difficulty will be the turntable, though; I may need to put a marker on the turntable or find some other trick. I’m leaving this challenge until last, as it’s the hardest.

 

Obviously it’s still early days, so these plans may change as I get into them and find the image processing too challenging for some reason, but hopefully I can complete at least some of the challenges using the camera.

Next week: design and build notes

 

Reminder: please vote for Hitchin Hackspace on the Aviva site here: Hitchin Hackspace Community Workshop. We are very close to getting the funding to enable our Hackspace; we just need a few more votes.

Pi Wars 2018

29th Oct 2017

It’s Pi Wars time again! If you were following along the last few years, you’ll know that Hitchin Hackspace has previously entered the Raspberry Pi powered robot competition with some innovative and challenging robot designs, sometimes with great success, often with stress and very late nights towards the end. This time we’re doing things a little differently. On the one hand there’s the A team of Dave, Martin, Brian, Mike and Paul, taking the typical Hitchin approach; on the other hand there’s, well, me. I’m being politely referred to as the rebel, or more frequently Team Defectorcon.
Why the split? I want to take a different approach; a somewhat risky strategy that’s unlikely to be competitive, and I knew the rest of the team would prefer something more along the lines of what they’ve done before.
So what’s the difference? Hitchin Hackspace typically goes for highly optimised designs, with attachments specifically tailored for each challenge, attached to a high performance drivetrain and using carefully selected sensors. I’m going to be using a robot kit as the basis of my entry, and my only sensor is going to be a camera. I’m hoping to use off the shelf items wherever possible even if they may be a little compromised. In addition to that, I’m going to attempt to enter *all* challenges autonomously, even those intended to be driven using remote control. By starting with a kit, I’m hoping that I can get a moving chassis very early on, so I can get maximum time to develop and test the software.

Progress has been very good so far. I decided to use Coretec Robotics’ ‘Tiny’ chassis very early on, as it’s a nice, popular platform that other Pi Wars competitors will recognise, it’s not that expensive and it’s not super fancy (important, as I want this to be a relatively budget build and show others what can be done with simple, cheap mechanics). I saw them at the competition last year and was impressed by how well such a small chassis coped with the obstacle course. In fact, in many places it had an easier time than its larger competitors.

The Tiny kit comes with the basics of a moving chassis: wheels, motors, chassis (including camera mount) and motor controller. At a minimum, that leaves me to select the Raspberry Pi board I’m going to use, along with the battery, voltage regulator, camera and LEDs. All Pi Wars bots need extra LEDs :-). Taking those in order:

Pi board: This was an easy one. Since I want to use computer vision for the challenges, it’s inevitably going to be computationally intensive, so I wanted the fastest board I could get: the Pi 3 Model B.

Battery: As above, due to the processing power (and therefore power consumption) and learning from previous years, I knew I wanted the highest capacity battery I could fit in. Lithium polymer batteries offer the best energy density and the best value at this scale, so that decided the battery technology. The stock Tiny kit motors are rated for about 6V, but I expected they’d cope with being slightly over-volted for a bit more speed, so I went with a two-cell pack (7.4V nominal). HobbyKing do a vast array of LiPo batteries and I knew from previous projects that the quality is OK, so I used their battery chooser to find the highest capacity pack that would fit within the kit chassis: 2.2Ah. That will do nicely : – )

 

Voltage regulator: Hobbyking also do a bunch of regulators (often called a BEC in radio control circles, a Battery Eliminator Circuit, i.e. it eliminates the separate pack that used to be used to power an RC receiver). I picked a 5A version so it was definitely adequate for the burst current a Pi may need.

Camera: I went for the Raspberry Pi branded camera as I knew it would *just work*. I’ve also bought a clip-on wide-angle lens, hopefully to reduce the amount of turning needed to spot things. The quality of the wide-angle lens isn’t great though; I might have been better off getting a SainSmart Pi camera, as it already has a built-in, well-matched wide-angle lens and is a bit cheaper.

LEDs: Most Pi Wars bots have some flashy RGB LEDs on them for decoration. I also wanted LEDs but, after a suggestion from Rob Berwick, I’m going to try to use the lighting for something functional. One of the challenges with computer vision is variable lighting conditions, particularly shadows and colour changes. Sharp shadows can create edges and shapes that the detection algorithms mistake for the actual target. By having very high power white lights, I’m hoping I can balance out the sun, reducing the effect of shadows. Beating the sun requires *a lot* of light though: about 15 watts by my calculations (1,000 lux over about 1m² requires 10-20 watts). SparkFun do a pack of five 3-watt LEDs, so I’m going to set up an array of wide-beam headlights.
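The wattage estimate works out as illuminance × area to get lumens, then dividing by the LEDs’ luminous efficacy. The ~80 lm/W figure below is a typical white-LED assumption, not a measured value:

```python
def led_power_w(target_lux, area_m2, efficacy_lm_per_w=80.0):
    """Electrical watts needed to light area_m2 to target_lux,
    given the LEDs' luminous efficacy in lumens per watt."""
    lumens = target_lux * area_m2  # lux is lumens per square metre
    return lumens / efficacy_lm_per_w
```

So 1,000 lux over 1 m² needs 1,000 lm, i.e. around 12.5 W of LED at 80 lm/W, which lands in the 10-20 W range quoted above.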

 

So those components cover the basics. I also wanted a screen on the bot for debugging, so I went with a fancy touch screen from Adafruit, which also lets me drive menus. Unfortunately, after buying the screen, I realised it had created a bit of a challenge. The screen uses about 8 of the Pi’s GPIO pins, and many of them clashed with the pins of the Explorer motor controller board that comes with the Tiny kit. That controller also doesn’t work well when stacked under a screen, and it can only drive the motors at 5V (from the Pi), not battery voltage. I went looking for another motor controller but couldn’t find one that met all my needs, particularly the need to stack. The best I could find is a Picon Zero from 4tronix, but that doesn’t stack, so I needed to find a way to get the necessary control signals to it. Luckily the Adafruit screen includes a second set of pins on the underside, duplicating the full GPIO array, so I’m planning to use a ribbon cable to connect them to the motor controller. Getting this to fit turned out to be a much bigger headache than I’d expected.
With my component selection covered, I’m going to leave it there for this week, other than to say a CAD model of the current design is now on Grabcad here: https://grabcad.com/library/pi-radigm-v1-piwars-robot-1

I’ve done a wiring diagram with Fritzing:

And the build currently looks like this:

 

Next week I’m hoping to get the wiring finished and some basic software going so that it’s driving, and I’ll talk about how I’m planning to attack each Pi Wars challenge.

Reminder: please vote for Hitchin Hackspace on the Aviva site here: Hitchin Hackspace Community Workshop. We are very close to getting the funding to enable our Hackspace; we just need a few more votes.

Thanks

Mark

*Thanks to Pimoroni and Coretec Robotics for letting me share CAD of their parts.

Reflection on Pi Wars

17th Dec 2015

What an event! We came away exhausted but with smiles on our faces and three prizes! Here’s how we did in each event:
Skittles – 1st

The skittles challenge was one of the only ones where our innovations really paid off: our attachment worked just as we’d hoped and we got spares on every round. We even managed to get a strike on our practice go! We only had one minor issue: one of the rotor motors didn’t always spin up properly, but we managed to overcome this on the day by repeatedly spinning up and slowing down until both rotors were spinning properly. The laser pointer worked great for lining up, and knocking the pins down to get the spares was really satisfying. We had so much ball speed that we not only knocked the spare pin over, we smashed it into the backboard! (The backboard was added just because of our attachment : – ) We also had a very minor issue with maneuvering. We knew we would be very front heavy, but the front tyres compressed so much that the attachment hit the ground, adding friction and reducing the grip the rear tyres had. Luckily Mike had time to make some nice little PTFE skids and put lead ballast in the ‘booster’ battery holder. This meant we could actually maneuver, but it was still slightly challenging to line up sometimes.

Scores: 8+2, 9+1, 9+1

 

Speed challenge – 1st


The time spent choosing the motors/voltage/wheel sizes really paid off here, and we got times within 0.1s of what we predicted! Unfortunately we didn’t get the sensor feedback working in time, but luckily it wasn’t actually required. Four grippy wheels, tuning the motor speeds to be equal and aiming straight was all that was needed to avoid the sides. We really should have done more tuning beforehand, as we hit the sides on the first few runs whilst dialling the motor speeds in, but after that it went straight as an arrow. When we did hit the sides, we just lightly glanced off or continued relatively straight whilst sliding along them, so the front bearing guide worked as hoped. Starting at the back of the starting box also allowed us to get some speed up before entering the timed section, saving about 0.1s (fairly significant when your overall time is 1.5s :-). The motors didn’t seem to mind being significantly overvolted for that short period, so we might actually be able to go to an even higher voltage next time, if we can find some suitable motor controllers.

Times: 1.618s, 1.54s, 1.526s

 

Proximity – 3rd


This event was a total fluke for us! We didn’t manage to get the code for our secret, super-accurate sensor (blog post to follow) finished in time, so we were just using a regular IR sensor like many others. In the hour of testing we got on the day, we were seeing +/- 5-10mm of noise in the stop positions, so we settled on a fairly conservative target of 13mm. When we got on the course, the judges commented that many robots using IR had really suffered, so we were then even more nervous. On our first run we got 4.5mm! That was a great, lucky start, but we were worried: was that just random luck, or was the wall colour or reflectivity causing the robot to stop late? If we got 4.5mm +/- 5mm or more, we were quite likely to hit the wall on the next run! We decided to risk leaving the code as it was and lined up for another run. We got 4.5mm again! We’d never been that close or consistent in our practice runs. Maybe the wall reflectivity was actually doing us a big favour. We had originally been testing with a wooden wall, then switched to paper; the paper seemed more consistent, but it’s always hard to tell when the results are quite variable. We crossed our fingers and went for the third run: 0.5mm! What a set of results! We were more shocked than anything after the dodgy test performance. It was clearly a fluke, but we were just happy we had finished without incurring a fault. We weren’t sure how we’d do in the results, as we’d heard of others getting 0.5mm results and several with consistent ~3mm scores. In the end we got third.

Scores: 4.5mm, 4.5mm, 0.5mm

 

Three point turn – 7th


This was another challenge where we had grand plans of clever sensors that didn’t quite pan out, due to being behind with the code. Like the proximity challenge, we spent an hour on the day of the competition tuning the timings in the basic code we had. We knew the floor surface would have quite a strong effect on the turn speed, so we tested on the closest surface we could find: the textured tiled floor. It didn’t take too long to get the timings pretty close, and our test results were surprisingly repeatable. Once we got on the actual playing surface of the challenge, though, we quickly found we needed to change a few parameters. On the first run we overshot and drove straight off the platform after the first turn. Luckily there’s just enough time allowed to tune your code once you’ve started the challenge, so we quickly ran back, got the laptop and guessed some new numbers. On the second attempt we were much closer, but again went slightly outside the playing surface on the second leg and overshot the finish at the end. Again, we tuned the numbers and went for it. It wasn’t a perfect run, but we at least got back behind the starting line. Dave proved himself the master of guessing the magic timing numbers; every one of his guesses gave the spot-on correction we needed for that element.

Times: 16s, 16.04s, 14.5s

 

Obstacle course – 9th


We were quite hopeful of doing well in this challenge, as we had speed, grip, ground clearance and a reasonable amount of driving practice under our belts. We weren’t too worried about most of the obstacles, as they were less severe than the wooden obstacles we’d attempted in testing; the one unknown was the turntable, which looked tricky to line up on and time the entry for. We’d also noticed in some of the previous challenges that the gearbox that had started clicking during testing now wasn’t reversing correctly, so the robot didn’t turn about its centre as it usually did. Rob did a good job of compensating for the dodgy gearbox, getting through the first few obstacles with no faults. The gearbox really scuppered us on the turntable though: he struggled to line up, and the robot didn’t accelerate straight forwards when asked to, causing us to get caught by the rotating bollards. By an unlucky design fluke, the bollards were a close fit with the gap between our wheels and we got wedged. We have nearly enough grip to climb vertical walls, so we initially hoped we’d be able to climb out, but we then got wedged against the wall and couldn’t move. In retrospect, we should have been clearer on the rules: none of us knew what the penalty was for getting rescued, so we took some time deliberating. If we’d known there was no penalty for a rescue, we would have moved the robot much more quickly and could have got a much faster time. The rest of the course was uneventful, although we were slightly disappointed with the run over the seesaw. In testing, we’d found the robot was fast enough to hit small wooden blocks and fly quite a distance, so we were hoping we could hit the ramp at speed and finish the course in the air. In reality, the dodgy gearbox hindered us again, and Rob couldn’t quite get it lined up as he hoped. We were still in the air over the finish line, but only because we were bouncing… Still, 1:26 with one rescue wasn’t too bad; good enough for 9th place.
We shouldn’t dwell too long on the could haves and might have beens, but after looking at the video for our obstacle run, if we’d rescued the robot immediately after we got stuck, we could have finished in less than a minute with one rescue (no penalties), possibly putting us in first place.

Time: 1:26 with one rescue

Pi Noon – second round


Our first round was over very quickly: Rob expertly maneuvered TTOS alongside Bedford Modern then turned in for the kill, popping the balloon in less than ten seconds! The second round didn’t go as well. Mark volunteered to drive this one (mistake?) and we hadn’t had a chance to replace the dodgy gearbox (definitely a mistake), so turning was hampered. Westpark Club had the interesting idea of mounting the balloon/spike on the back of their robot, pointing forwards, making it easier for them to keep the balloon out of the way. After a short period of jostling for position, with TTOS failing to get around their side/back, the two robots came together. Westpark Club had lined up better and burst our balloon first. We were out.

Line following – not placed
Before the day we had made the mount for the line sensor and got some simple code written, but it was completely untested. As soon as we saw the course, we knew it was pointless to distract ourselves trying to get it working. The course was so narrow and twisty that we would really struggle to get around it. We had designed our sensor mount to work best on relatively fast line-following courses, with no turns over 90 degrees and lots of space between turns, like last year’s course. Those courses favour a sensor well in front of the centre of mass. This tight course strongly favoured robots with the sensors very close to the centre of turning, allowing them to follow the very twisty bends and switchbacks. So a no-score on this one.

 

Code – 20th

Our low position in the coding probably reflects more on how little coding we’d done when we submitted it than the actual quality of the code. We’ll do better next year!

 

Aesthetics – 9th

Given we were one of the few entries that didn’t have a mass of wires sticking out, we were hoping to score higher in this challenge.


Build Quality – 9th

Like the aesthetics ‘challenge’, we were hoping to do fairly well here, as the core robot was solid and had some nice features.  Maybe not actually having enough room for the wiring and connectors let us down, or maybe we didn’t ‘sell’ the design enough to the judges.

 

Blogging – 12th

Not a bad score, considering we didn’t start until pretty late in the day and some of the other blogs were excellent.

 

Overall – 3rd!

A fantastic result for our first time at the competition! With a bit more consistency and preparation, we’re hoping for higher next year

🙂

 

Other awards

Most featured
We knew we’d got more attachments than many of the other competitors, but hadn’t considered that this was an award in the run-up to the event, so we were quite surprised when we were announced as the winner!

 

Progress – the final straight?

2nd Dec 2015

 


The team’s been working hard the last couple of days. Let’s compare the progress against the job list from Sunday:

Wiring in the new body:

  • 12v power/motors: done (Mark  + Paul)
  • 5v power: done (Paul)
  • Analogue signals: half done (Mark)
  • PWM signals: not started
  • LCD: done (Dave)
  • Other digital I/O: not started

Obstacle course:

  • Practice driving: not done

Skittles:

  • Mount arms to the servos on the ball launcher: done (Mike)
  • All the wiring (both power and signal)*: mostly done, some routing issues (Mike)
  • Write some code that allows the wii mote to control the servos and the spinner motors*: not started

Pi Noon:

  • Design and print a wire holder*: done (Laser cut, Paul)
  • Practice driving: not done


Straight line speed test:

  • Design and print the distance sensor holder / bumper: done (Mark)
  • Design and print the second (boost) battery holder: not done
  • Turn my pseudo code into real code: not done


Three point turn:

  • Finish designing the camera/line sensor mount and print it out: done (Mark)
  • Design and print the beacon: started (Mike)
  • Get the basic code working*: Dave and Rob have spent two evenings finding and fixing bugs. Getting there but still lots to do
  • Tune the timings*: not started


Line follower:

  • Finish designing the line sensor mount and print it out*: done (Mark)
  • Turn my pseudo code into real code*: not done
  • Tune it*: not done


Proximity alert:

  • Turn my pseudo code into real code*: not done
  • Design and print the attachment for our probe: not done
  • Tune it: not done

* = high priority

T-4 days? Agggghhhhhhh!

The Flinger takes shape….

30th Nov 2015

As part of the night’s work, the Ball Flinger has started to take shape…

Ball Flinger

First stage of assembly

 

Fly wheel assembly

The flywheel assembled, with Kevlar anti-expansion strengthening.

 

 

General view with ball pusher arms installed

Ball pusher servos and arms installed… now to wire it up and attach the speed controllers

 

Mad dash

29th Nov 2015

Like many other teams, we’re now appreciating how much there is left to do with only a week before the competition. On the bright side, we’re all managing to put in a bit of crucial extra time. Tonight we met at our space (I think it’s the first time we’ve ever hired it on a Sunday) to work on our modules:

I (Mark) was working on the wire routing, with Paul’s help:


(Our speed controllers don’t quite fit, so we lost time hacking at the prints to make room)

 

Dave (also with Paul’s help) was working on getting the LCD display working, so that we can easily see what menu/challenge code we’re running:


(Hopefully we can get a video of it working tomorrow, it’s looking really slick)

 

Rob was working on turning my terrible code into something that actually works:


Mike was working on his solid looking ball launcher for the skittles challenge:


And here’s the robot as it stands now, a nice looking but empty body shell and an overly complicated RC car:


What’s left to do? Well, by challenge:

Obstacle course:

  • We could enter this now but it’d be nice to have all the components in the new body shell. That needs a load of wiring doing
  • Practice driving

Skittles:

  • Mount arms to the servos on the ball launcher
  • All the wiring (both power and signal)*
  • Write some code that allows the wii mote to control the servos and the spinner motors*

Pi Noon:

  • Design and print a wire holder*
  • Practice driving

Straight line speed test:

  • We could also enter this now but we really want to do it autonomously
  • Design and print the distance sensor holder / bumper
  • Design and print the second (boost) battery holder
  • Turn my pseudo code into real code

Three point turn:

  • Finish designing the camera/line sensor mount and print it out
  • Design and print the beacon
  • Get the basic code working*
  • Tune the timings*

Line follower:

  • Finish designing the line sensor mount and print it out*
  • Turn my pseudo code into real code*
  • Tune it*

Proximity alert:

  • Turn my pseudo code into real code*
  • Design and print the attachment for our probe
  • Tune it

* = high priority

T-6 days? Agghhhh!

Driving onwards

28th Nov 2015

In the previous drivetrain blog post we found that cordless drill motors were excessively large and powerful for a Pi Wars robot. We continued the drivetrain search, this time looking for smaller motors that could work in a one-per-wheel (four motors total) configuration. From last time we knew that we wanted a total power output of around 120 watts, or 30 watts per motor. At this point we were hoping that we could use the small ‘1000rpm’ gearmotors we’ve used on small combat robots before.

 

Yellow and black colour scheme

They give our beetleweight (~1.5kg) combat robot great performance, and it looks like other competitors are looking at them too. From previous testing we also knew they had a stall current of ~6A at 12V (72W input), so they looked like they might work here. As before, we went to enter them into the drivetrain calculator, but found the 1000rpm motors weren’t listed in its motor database. So this time we put together our own numerical model (spreadsheet) of the torque, acceleration etc., allowing us to change the parameters and see what happens to the performance. It’s something we’ve used successfully many times before, on everything from combat robots to dragsters:

acceleration calculator

So they’re OK, but it looks like they top out very quickly. The top speed is well below the 12m/s we were aiming for. What can we do about that? The usual options are: change the gear ratio (complex for us) or use larger wheels (increasing the effective gearing, which is more easily achieved but increases the stress on the motors). Another solution is running the motors above their rated voltage (something that’s very common in combat robots), though this also stresses the motors (overheating). Playing with the drivetrain calc again, we found that increasing the wheel size worked up to a point (about where the motors stopped having enough torque to wheelspin), then the overall time started to increase again:
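A minimal version of that spreadsheet model can be sketched as a time-stepped simulation. The motor figures below (stall torque, no-load speed, mass, grip) are illustrative guesses rather than our measured data, and stall torque and no-load speed are assumed to scale linearly with voltage:

```python
import math

def sprint_time(distance_m, mass_kg, wheel_d_m, volts,
                stall_torque_nm=0.4, noload_rpm=1000.0, rated_v=12.0,
                motors=4, grip_mu=0.8, dt=1e-3):
    """Time from a standing start to cover distance_m.

    Simple DC motor model: drive force falls linearly with speed from
    stall to the no-load speed, and is capped by the traction limit
    (mu * weight) to model wheelspin."""
    wheel_r = wheel_d_m / 2.0
    t_stall = stall_torque_nm * volts / rated_v
    w_noload = noload_rpm * volts / rated_v * 2.0 * math.pi / 60.0  # rad/s
    v = x = t = 0.0
    while x < distance_m:
        w = v / wheel_r  # motor speed, for direct-drive gearmotors
        force = motors * t_stall * max(0.0, 1.0 - w / w_noload) / wheel_r
        force = min(force, grip_mu * mass_kg * 9.81)  # wheelspin limit
        v += force / mass_kg * dt
        x += v * dt
        t += dt
        if t > 60.0:  # give up: the model never reaches the distance
            return math.inf
    return t
```

Sweeping the wheel diameter and voltage over this function reproduces the trends described above: both raise the top speed, while the traction cap limits how much the extra torque helps off the line.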

wheel size

acceleration results 1

So we can get a good improvement there, but it’s still not quite as fast as we’d hoped for (1.2s). As expected, changing the voltage also worked, but running the motors at 36V is a long way from their rated voltage.

Playing with the inputs more showed a combination of increasing the voltage and the wheel size could get us pretty close to target:

acceleration results 2

24v 95mm traction limited

 

Picking the 95mm, 24V design, we can see from the graph that we have more than enough torque (we’re traction limited) for most of the time and the finish speed is pretty high at ~9m/s, much more than we’ll need for the other challenges. Since the 95mm, 12V performance still looks pretty good, we now wondered if we could run at 12V for most of the challenges, just adding the extra battery pack for the speed challenge. This would reduce weight and stress on the motors when we don’t need the extra speed. As before, we quickly mocked up a drivetrain test platform to confirm the model:

second concept

laser cut chassis

Well that looks promising 🙂 The robot is about as fast as before, but it’s now much easier to fit the electronics in. Reliability seems good, apart from the occasional wheel falling off… It also has a tendency to wheelie or roll over, so we’ll have to be careful with component placement to keep the centre of gravity low, or limit the peak deceleration a little. It’s still a little difficult to get the robot to move slowly reliably (like we’ll need for the proximity challenge), but now the motors are smaller, we can fit smaller wheels just for that challenge, effectively changing the gear ratio and slowing the robot down:

second concept - small wheels configuration

So that’s the drivetrain sorted; better get the code started now 🙂

 

Raspberry Pi Zombie

27th Nov 2015

or “How I Fell in DooDoo and Came Up Smelling of Raspberries”

OK. Picture the scene. It’s late on a Friday night, and you’ve been hacking on your Pi Wars robot all evening. It’s been productive. Between you and your co-roboteer, you’ve ironed out glitches in your remote control code, you’ve soldered up the wiring looms, and you’ve even designed and printed custom parts to mount the Pi onto the baseplate you laser cut earlier. You’re robot-building machines. Go you. High fives all round.

Flushed with success, you both decide to power up the Pi and take it for a spin. So, you plug in the USB cable that you wired up to that adjustable 5v regulator earlier to step down the power from the LiPo battery, and… uh. I’m pretty sure the activity light doesn’t normally do *that* when it’s booting. What the..? Yank the power! Yank the freaking power!

Welcome To Cockupsville. Population: You


OK, wait a second. What just happened? Well, remember that 5v regulator you wired up earlier? The key word there was adjustable, Dingus. And you didn’t check it, did you? You buzzed every single other thing you soldered, but you forgot to check that the output voltage was actually, y’know, ADJUSTED to 5v. So you just sent how many volts into your Pi? 12? Excellent work. Well done. Slow hand clap.

You decide to check the damage, hoping against hope that the Pi just kind of wouldn’t notice that you rammed 12v up its tiny USB port, and you can pretend like nothing happened. After all, no blue smoke came out, so… fine, right? Fine. Probably fine.

Except, no. Adjusting the voltage regulator to 5v (triple checked – bit late now, but whatevs) and trying to boot again does nothing. Well, not exactly *nothing*, but only some flickering of the activity light and no actual booting. Saddest of sad faces.

Alas Poor Pi


So that’s that then. You’ve fried your pi. It has gone toes up. Time to give it a viking funeral.

But. BUT. A bit of Googling seems to suggest a few things:

  1. There is such a thing as a polyfuse
  2. They can heal themselves when they’ve tripped.
  3. Actually flipping *HEAL THEMSELVES*
  4. The Pi has one on the USB power input.

So you leave it an hour and, with great hope in your dumb little heart, you plug it in.

Nothing. Just a bunch more flickering. Probing across the polyfuse seems to suggest that it’s maybe a bit better, but still a loooong way away from being useful as part of a functioning computer. Sigh.

M. Night Shyamalan style plot twist


Fast forward two weeks. You’ve nearly forgiven yourself for frying a perfectly innocent Pi. You’ve ordered a replacement and plumbed it into your bot, and you’re sitting at your desk idly surfing the web when you see, out of the corner of your eye, that poor little dead Pi, half hidden under a pile of papers. “I wonder…”, you, um, wonder.

So, you dig out a phone charger and a cable, and you plug it in. <DEITY> be praised! It’s booting. It lives! You’re like the Doctor freaking Frankenstein of consumer electronics! Except that he made a sort of patchwork quilt of chopped up people and you’ve reanimated a credit card sized computer. Same thing apart from that though. Probably.

Step aside Pi Zero, I give you Pi Zombie!


So, there you have it. You too can get away with stuffing 12v into a 5v hole, if you’re very, VERY lucky. But what have we learned? Well, we’ve learned:

  • If someone gives you an adjustable voltage regulator and tells you that it’s set to 5v, don’t believe a damn word of it.
  • Stick your multimeter across EVERYTHING.
  • Raspberry Pis don’t like 12v up them, Corporal Jones.
  • Polyfuses are an actual thing. They freaking HEAL THEMSELVES, people. Insane.
  • Sometimes you can fall in the doodoo and come up smelling of Raspberries.

Technical Drawings: Interface

26th Nov 2015

Since there are multiple people all working on different elements of the hardware, we’ve found it useful to share CAD and technical drawings describing the key interfaces. For example, here’s the latest drawing showing the overall layout and the key mechanical interfaces:

piwars v0.4

Getting a bit of a track record…

26th Nov 2015

We saw the pictures of the tracks for the Pi Wars challenges and figured the best way to test our robot was to build our own tracks, so we checked out all the specs and figured out a cutting list. Then we had an evening of construction.

Here’s our skittles track

Skittles Challenge

Three point turn track

3 point turn track

Speed run track modelled by Mike (left) and Rob (right)

Speed Run Track