Daniel Vlasits

Pi Wars

This year I took part in Pi Wars (piwars.org) in a team of three. Here is the video we made showcasing the event.


Below I detail some of the specific autonomous coding challenges I faced as Head Coder of the team, but first, the video:



Hubble Telescope Nebula Challenge:


The hardest challenge, and the one I spent the most time on, was the Hubble Telescope Nebula challenge. The setup is that the robot is placed in a 1.2m x 1.2m box with a coloured panel in each of the four corners. The robot has to work out where each colour is and then go to them in the correct order: Red, Blue, Yellow and then Green.


The challenge can be broken down into two key steps:

1.) Detect where colours are

2.) Go to them in the correct order


Step 1: Detect where the colours are:


The key here was to be precise and guarantee that the colours were identified correctly, because the robot could then memorise where each colour was and simply drive to them later. First I developed code that allowed the robot to turn precisely by 45 or 90 degrees. Originally, the robot just turned until the compass reported 45 degrees and then stopped, but it overshot. The system I designed instead was for the robot to quickly get close to the correct heading and then "jig" into place, as you will see below, until the angle was just right. I bought a very impressive compass, the BNO055, which internally takes readings continuously, averages them, and sends a constant smoothed value to the main code.
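
To give an idea of how the jigging worked, here is a minimal sketch of such a two-phase turn. The helpers read_heading() (the smoothed BNO055 heading, in degrees) and set_motors(left, right) are hypothetical stand-ins for our hardware code, and the speeds and tolerances are illustrative:

import time

def heading_error(target, current):
    # Signed smallest difference between two headings, in degrees (-180..180).
    return (target - current + 180) % 360 - 180

def turn_to(target, read_heading, set_motors, coarse=0.6, jig=0.2, tol=2.0):
    # Phase 1: spin quickly until we are roughly facing the target heading.
    while abs(heading_error(target, read_heading())) > 15:
        direction = 1 if heading_error(target, read_heading()) > 0 else -1
        set_motors(coarse * direction, -coarse * direction)
    # Phase 2: "jig" in short pulses until the angle is just right.
    while abs(heading_error(target, read_heading())) > tol:
        direction = 1 if heading_error(target, read_heading()) > 0 else -1
        set_motors(jig * direction, -jig * direction)
        time.sleep(0.05)   # short pulse of turning...
        set_motors(0, 0)   # ...then stop so the compass reading can settle
        time.sleep(0.05)
    set_motors(0, 0)

The stop-and-settle pause in phase 2 is what makes the jigging accurate, and also what makes it slow, which matters later.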


The second part of this was to figure out which colour the robot was looking at. The best solution we settled on was taking the r, g, b values of all pixels and averaging each channel to get a total r, g, b average per photo, then looking at which channels are dominant in each photo. We tried this in two ways.


1.) Hard-code, based on the averages, which photo is which colour

2.) Take a photo of each colour on the day, then calculate the Manhattan distance from the averages of a new photo to the calibration photos and classify it as whichever colour it is closest to


It turned out that solution number 1 was the better solution, especially because we didn't just say the photo with the most red was red, as that could depend on the lighting in each corner. Instead, the code looked at the maximums of the differences between the colours. For example, here is the code for deciding which photo was red:

np.argmax(np.array([r - g - b for i, r, g, b in Pics]))  # index of the photo where red most dominates


The red one was the one with the highest r - g - b. The photo classified as red was then taken out of the options, and the green one was chosen with this:


return np.argmax(np.array([g - b - r for i, r, g, b in Pics]))  # index of the greenest remaining photo


With the green one being the one with the highest g - b - r.
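
Putting the whole step together, here is a minimal sketch of how such a classifier could look, assuming each photo is a NumPy array of shape (height, width, 3) in r, g, b order. The red and green picks follow the snippets above; separating yellow with r + g - b and leaving blue as the leftover is my assumption about the remaining two:

import numpy as np

def average_rgb(photo):
    # Average each channel over all pixels, giving one (r, g, b) per photo.
    return photo.reshape(-1, 3).mean(axis=0)

def classify_corners(photos):
    averages = [average_rgb(p) for p in photos]
    remaining = list(range(len(photos)))   # corner indices still unclassified
    result = {}

    def take(score):
        # Pick the remaining corner maximising the score, then remove it.
        best = max(remaining, key=lambda i: score(*averages[i]))
        remaining.remove(best)
        return best

    result["red"] = take(lambda r, g, b: r - g - b)
    result["green"] = take(lambda r, g, b: g - b - r)
    result["yellow"] = take(lambda r, g, b: r + g - b)   # assumption
    result["blue"] = remaining[0]                        # the last one left
    return result

Removing each classified photo from the pool before the next pick is what makes the ordering robust: even if two photos look similar, each colour label is only handed out once.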


You can see the above section of the nebula challenge in action below:



The second part of the nebula challenge, as stated above, is to visit the colours as fast as possible. Below I want to talk about two interesting things I coded to maximise the speed:


I quickly realised the only realistic approach was for the robot to always go back to the centre and turn to face the next card before going to it. However, the "jigging" process used above was far too slow for this step, so a better tactic had to be devised.


What I coded, in the end, was that the robot would turn approximately using the compass and then, on the way to the colour, adjust its path by slowing the left or the right wheels to fine-tune its heading, without the time loss of the jigging process.
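
Here is a minimal sketch of that on-the-move correction, reusing the hypothetical read_heading() and set_motors() helpers from the turning sketch; the gain and the sign convention (a positive error slows the left wheels) are illustrative assumptions:

def drive_on_heading(target, read_heading, set_motors, base=1.0, gain=0.02):
    # Slow whichever side steers us back towards the target heading,
    # instead of stopping to jig.
    error = (target - read_heading() + 180) % 360 - 180
    set_motors(base - max(0.0, gain * error),   # positive error: slow the left
               base + min(0.0, gain * error))   # negative error: slow the right

This would be called repeatedly in the drive loop, so the heading converges while the robot keeps moving at close to full speed.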


The other issue was that, to avoid crashing into the coloured plates and the walls, the robot had to move quite slowly. I solved this by using the ultrasonic sensors to map the distance to the colour onto a speed, with the trick being that the mapping is done in such a way that the wheels of the robot spin backwards before it has even entered the zone of the colour. This allows the robot to travel forwards at maximum speed and then shoot back out of the zone as fast as possible as well.
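
A minimal sketch of that distance-to-speed mapping, assuming readings in centimetres; the zone size and other numbers are illustrative, but the point is that the commanded speed crosses zero a margin before the zone boundary, so the wheels are already braking the robot while its momentum carries it in:

def speed_for_distance(distance_cm, zone_cm=20.0, flat_out_cm=80.0, margin_cm=10.0):
    # Full speed when still far away from the coloured plate.
    if distance_cm >= flat_out_cm:
        return 1.0
    # Linear map that hits zero *before* the zone and keeps falling, so the
    # wheels spin backwards on approach and then reverse the robot back out.
    zero_at = zone_cm + margin_cm
    return max(-1.0, (distance_cm - zero_at) / (flat_out_cm - zero_at))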


You can see these two features in action in the test run below:




The other challenge I had a lot of fun with was the speed test. The course looked like the picture below, and the aim was to get from one side to the other as fast as possible:



To do this challenge, ultrasonic sensors on the sides were used, as the course had walls on either side. Therefore an algorithm was needed that could take in the distances on either side and, with the help of the compass, tell the motors what to do. The main algorithm just had to keep the robot straight in the middle of the lane.

This was a very interesting problem because, depending on how close you are to either wall, the robot wants to be more or less angled towards the centre for maximum speed. For example, if the robot is very close to the left-hand side it wants a very sharp angle to get back to the middle faster so it doesn't hit the wall, whereas if it is fairly close to the centre a shallower angle will suffice.

The robot had two ultrasonic sensors on each side: on the left, one at the front and one at the back, and likewise on the right. The robot would take the readings from the two ultrasonics on one side and average them, which was a very good approximation for the distance from the centre of the robot to the wall. These values were then mapped onto an angle the robot should be facing at that distance from the wall: very close to the right-hand side, a sharp angle left; very close to the left-hand side, a sharp angle right; close to the middle, straight ahead; and everything in between. The compass would then determine how far off the robot was from that angle, and that error would map onto the speeds the left and right motors should be running at. This was the algorithm used to go straight.
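
Here is a minimal sketch of one step of such a lane-centring loop. The four distance arguments stand for the raw ultrasonic readings in centimetres, read_heading() and set_motors() are the same hypothetical helpers as before, and the maximum angle, gain and sign conventions are illustrative:

def lane_target_angle(left_cm, right_cm, max_angle=45.0):
    # How far off-centre we are: positive means too far right.
    offset = (left_cm - right_cm) / 2.0
    half_lane = (left_cm + right_cm) / 2.0
    # Dead centre -> straight ahead; hugging a wall -> a sharp angle back.
    fraction = max(-1.0, min(1.0, offset / half_lane))
    return max_angle * fraction   # positive angle steers left, by convention

def lane_step(fl, bl, fr, br, lane_heading, read_heading, set_motors, gain=0.02):
    left_cm = (fl + bl) / 2.0     # average front + back readings per side:
    right_cm = (fr + br) / 2.0    # a good estimate from the robot's centre
    target = lane_heading + lane_target_angle(left_cm, right_cm)
    error = (target - read_heading() + 180) % 360 - 180
    set_motors(1.0 - max(0.0, gain * error), 1.0 + min(0.0, gain * error))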


Now all that was needed was something to turn the robot when it reached the bend, as the ultrasonics couldn't take readings if the angle to the wall was greater than 45 degrees. So as soon as the robot picked up a string of useless readings from the front right-facing sensor, it knew it was time to turn, at which point the compass was used to turn roughly 45 degrees by just shutting off the motors on one side, and then the algorithm from above would continue. You can see this code in action in the video at the top; it was the best in our category with a time of 6 seconds.
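
Here is a minimal sketch of that bend detection, assuming (my assumption about the sensor driver) that a useless reading comes back as None or an implausibly large distance, with the same hypothetical read_heading() and set_motors() helpers:

def is_useless(reading, max_cm=200.0):
    return reading is None or reading > max_cm

def wait_for_bend_and_turn(read_front_right, read_heading, set_motors, run=5):
    bad = 0
    while bad < run:                        # wait for a string of bad readings
        bad = bad + 1 if is_useless(read_front_right()) else 0
    start = read_heading()
    set_motors(1.0, 0.0)                    # shut off one side to swing round
    while abs((read_heading() - start + 180) % 360 - 180) < 45:
        pass                                # roughly 45 degrees, by compass
    # ...then hand control back to the lane-centring algorithm above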


It was very helpful that very similar code was needed for the maze challenge as well. On the day, however, something went wrong with the connection from the ultrasonic sensor to the Raspberry Pi and the maze didn't work at all. Here is a video of the maze in action in testing; the wiggling occurs because of the inaccuracy of the ultrasonic sensors.



Here is the link to the PiWars GitHub Repository:

https://github.com/dvlasits/B0bby-T4bles/blob/master/nebula2.py




