There is a lot involved in building your own robot: sourcing adequate parts, knowing how to assemble them, and then getting the robot to do things by writing software. Buying a robot kit glosses over these steps, as the parts come together with assembly instructions and often a software library to make programming simple. For this reason, going from building such a kit to building your own robot can be quite a shock.
This is why this book is valuable. It covers a lot of ground! The author, Danny Staple, guides us through the process of building a wheeled robot without relying on a kit. Instead, Danny describes the components required and provides some recommendations, but allows for variation. Step by step we are shown how to set up a Raspberry Pi, control motors and servos, read inputs from distance sensors and line sensors, and use the Raspberry Pi camera. Everything, hardware and software, is explained from scratch, without assuming any prior knowledge, and in an easy-to-read style. Instead of relying on a library of code, Danny shows us how to implement everything required, a small feature at a time. He also explains the principles behind why he structures the code as he does, so we learn how to extend the software on our own without it becoming a hard-to-maintain mess.
The first couple of chapters describe what a robot is and give some examples of contemporary robots. The next two introduce the Raspberry Pi and how to set one up. The material here is very similar to what is available elsewhere on the internet, but including it here is helpful as it keeps all the required information in one place. Chapter 6 is where we really get started, covering how to choose a chassis for your robot as well as a motor controller. Danny recommends beginning with a simple car-like chassis with two-wheel drive as a good choice for a first robot. This chapter also describes powering the robot and how to provide power for both the Raspberry Pi and the motors. We also see how to assemble the chassis with the Raspberry Pi, motors, and batteries.
Chapters 7 and 8 get us to the point where our robot can move and follow a line drawn with a marker. We also hear why it's good to separate the code that talks to the motor controller from the code managing the robot's behaviour. The optical sensors used for line following are introduced: we find out how they work, how to attach them to the Raspberry Pi, and how to read their values using Python. By the end of the chapter, the robot can follow a circuit drawn with a black marker pen.
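To give a flavour of the kind of behaviour code the book builds up (this function is my own illustrative sketch, not code from the book — the sensor values and speed numbers are assumptions), the core decision for a two-sensor line follower can be as simple as:

```python
def line_follow_step(left_on_line: bool, right_on_line: bool):
    """Return (left_speed, right_speed) in the range 0..1 for a
    two-sensor line follower. Illustrative sketch only."""
    cruise, slow = 0.8, 0.2
    if left_on_line and right_on_line:
        return cruise, cruise  # centred on the line: drive straight
    if left_on_line:
        return slow, cruise    # drifted right: slow the left wheel to turn left
    if right_on_line:
        return cruise, slow    # drifted left: slow the right wheel to turn right
    return slow, slow          # lost the line: creep forward slowly
```

The real value of the book's approach is that, because motor control is behind its own layer, a decision function like this stays free of hardware details and is easy to test on its own.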
Chapter 9 shows us how to control LED strips from our robot, which, as well as providing eye candy, allows the robot to give some indication of its state and makes debugging easier. We also do some soldering for the first time, and the book provides some basic advice on soldering and how to get a good joint. We use the Raspberry Pi's SPI interface to drive the LEDs and see the Python code for doing that. The line-following behaviour gets updated to give us left and right "indicator" signals as the robot turns. This chapter also introduces the HSV (Hue, Saturation, Value) colour system to describe how we can program the LEDs to make rainbow effects.
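The rainbow trick is worth a quick illustration. Sweeping the hue from 0 to 1 at full saturation and value walks around the colour wheel; Python's standard-library colorsys module does the HSV-to-RGB conversion. This sketch is my own, not the book's code:

```python
import colorsys

def rainbow_frame(n_leds: int, offset: float = 0.0):
    """Return a list of (r, g, b) byte tuples spreading the full hue
    wheel across n_leds, shifted by offset (0..1) to animate."""
    frame = []
    for i in range(n_leds):
        hue = (i / n_leds + offset) % 1.0
        # Full saturation and value give the brightest, purest colours
        r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
        frame.append((int(r * 255), int(g * 255), int(b * 255)))
    return frame
```

Incrementing offset a little on each update and sending the resulting frame to the strip gives the scrolling rainbow effect.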
In chapter 11 things really get interesting! We see how to mount ultrasonic distance sensors and connect them to the Raspberry Pi. We're not using I2C sensors, so we need to use raw I/O to time the echo response and work out the distance. This code uses busy waits and may not be accurate, as it doesn't compensate for multitasking. We first write a simple object-avoiding behaviour for the robot and see that it works, but has problems: it can appear indecisive about some obstacles and occasionally rams into things. We can do better than this, so we then develop an algorithm that adjusts the motor speeds more finely based on the distance from an obstacle.
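The distance calculation itself is simple enough to show here. An ultrasonic sensor reports how long the sound pulse took to travel out and back, so halving the round trip at the speed of sound gives the range. This helper is my own sketch (the timing of the echo pulse on the GPIO pins is the hardware-specific part the book covers):

```python
def echo_to_distance_cm(echo_seconds: float,
                        speed_of_sound_ms: float = 343.0) -> float:
    """Convert an echo pulse duration (seconds) to a distance in cm.
    The pulse travels to the obstacle and back, so halve the round trip."""
    return (echo_seconds * speed_of_sound_ms * 100.0) / 2.0
```

A 1 ms echo therefore corresponds to roughly 17 cm, and it's easy to see why busy-wait timing matters: an error of a few hundred microseconds in measuring the pulse shifts the result by several centimetres.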
Now we have several behaviours we see how to build a simple web application on the robot that lets us start and stop all the defined behaviours.
Chapter 12 provides us with a means for our robot to know how much each wheel has turned so we can automatically adjust for variance in motor speed.
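The idea behind this correction can be sketched in a few lines. Counting encoder ticks per wheel, a simple proportional adjustment slows whichever wheel has travelled further. The function and gain value below are my own illustration, not the book's implementation:

```python
def straighten(left_ticks: int, right_ticks: int,
               base_speed: float, gain: float = 0.01):
    """Return adjusted (left_speed, right_speed), clamped to 0..1,
    nudging the robot back to a straight course. Illustrative sketch."""
    error = left_ticks - right_ticks       # positive: left wheel ahead
    adjust = gain * error
    left = max(0.0, min(1.0, base_speed - adjust))
    right = max(0.0, min(1.0, base_speed + adjust))
    return left, right
```

Run repeatedly as the tick counts come in, this keeps two slightly mismatched motors tracking a straight line without any hand tuning of the individual motor speeds.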
Chapter 13 attaches a camera to the robot and allows it to chase objects of a particular colour. We also see how to use the pan-and-tilt mounting to track faces using OpenCV's built-in Haar cascade classifier. The text describes how Haar cascades work and provides some references that go into more detail. The book probably should have pointed out that Haar cascades, while efficient, are not state-of-the-art, and that better, more modern techniques exist.
Chapter 14 introduces voice control by installing Mycroft on a second Raspberry Pi, which controls the robot by making HTTP requests to the simple web application we've been building on the robot.
Chapter 15 then shows us how to build a slightly more complex web UI that provides slider controls to drive the robot as well as a display of what the robot can see. The two sliders only control the wheel motors, but by this point we have all the information we need to add controls for the pan-and-tilt camera platform.
The last two chapters provide links and guidance for what we can do next when contemplating designing and building our own robots.