Nested QR code with the Sumo's wheels against the monitor.
To implement "auto-docking" I intend to use a 7x7cm square nested QR code target and the Sumo's FPV camera. The QR nesting allows detection and recognition at distances of up to 50cm whilst remaining accurate right up to the moment of touching the docking station. The pictures shown here are actual snapshots from my Jumping Sumo's FPV camera (640x480 resolution).
Same QR code image at an approx range of 30cm from the monitor.
I've prototyped some detection code using an install of the python-qrtools package on the Raspberry Pi:
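A sketch of what that detection code could look like, assuming python-qrtools' QR class (which wraps zbar); the snapshot filename and the "SUMO-DOCK" marker string are made-up examples, not anything from my actual setup:

```python
def decode_qr(path):
    """Return the decoded payload of a QR code in an image file, or None."""
    from qrtools import QR  # imported lazily; needs python-qrtools + zbar installed
    qr = QR()
    if qr.decode(path):     # decode() returns True on success
        return qr.data
    return None

def is_dock_marker(payload, marker="SUMO-DOCK"):
    """True if a decoded payload identifies the charging station.
    The marker string is a hypothetical example."""
    return payload is not None and payload.startswith(marker)
```

Usage would be along the lines of `is_dock_marker(decode_qr("snapshot.png"))`.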
This code successfully recognises the two QR codes in these test images, though it takes about half a second to do so.
In the following code the built-in yavta tool is used to push (and buffer) a snapshot from the camera to /dev/stdout, and base64 is then used to send an encoded version back to the Python client, where it is decoded and saved to disk. As before, I have to kill the dragon-prog process to gain access to the camera.
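The client side of that pipeline could look something like this sketch; the exact yavta flags, the use of telnetlib for the connection, and the heuristic for separating base64 payload from shell chatter are all assumptions rather than the definitive implementation:

```python
import base64
import re

# Command run on the Sumo; the exact yavta options are an assumption.
CAPTURE_CMD = "yavta -c1 -n1 -s640x480 -F/dev/stdout /dev/video0 | base64"

def extract_image(raw_output):
    """Pull the base64 payload out of the telnet session output and decode it.
    Lines that can't be base64 (shell prompts, yavta chatter) are skipped."""
    b64_line = re.compile(r"^[A-Za-z0-9+/=]+$")
    payload = "".join(
        line for line in raw_output.splitlines() if b64_line.match(line)
    )
    return base64.b64decode(payload)

def capture_snapshot(host="192.168.2.1", out_path="snapshot.raw"):
    """Telnet to the Sumo, run the capture command, save the frame to disk."""
    from telnetlib import Telnet  # lazy import; telnetlib was removed in Python 3.13
    with Telnet(host) as tn:
        tn.write((CAPTURE_CMD + "; exit\n").encode("ascii"))
        raw = tn.read_all().decode("ascii", "ignore")
    data = extract_image(raw)
    with open(out_path, "wb") as f:
        f.write(data)
    return len(data)
```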
There are a number of alternative image capture routes using yavta, e.g. saving the file to the Sumo's flash or a RAM disk and using FTP to recover it. This one seems to be the quickest (and simplest) for now and will suffice for the purposes of identifying and homing in on the charging station.
As part of my attempts to set up inductive charging on a Parrot Jumping Sumo, I am looking at some form of "docking" control: when the robot is close to the charging station it attempts to self-dock, using the camera and some sort of marker on the charging station.
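As an illustration of what the homing part of that control could look like, here's a minimal proportional-steering sketch; the gain, dead-band and output range are made-up values, not anything from the Sumo's SDK:

```python
FRAME_WIDTH = 640  # matches the 640x480 FPV feed

def steering_command(marker_x, gain=0.005, dead_band=20):
    """Map the marker's horizontal pixel position to a turn rate in
    [-1.0, 1.0]; negative means turn left, 0.0 means drive straight."""
    error = marker_x - FRAME_WIDTH / 2   # pixels off-centre
    if abs(error) <= dead_band:
        return 0.0                       # close enough: head straight in
    return max(-1.0, min(1.0, gain * error))
```

Steering towards the marker like this, frame by frame, is all the "homing" really needs; the dead band stops the robot hunting left and right when it is already lined up.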
To do this I'm going to need to access the video feed, and before I go down the path of setting up the full Parrot SDK I thought I'd have a poke around the Sumo's Linux OS internals.
It turns out that the Sumo has yavta - aka "Yet Another V4L2 Test Application".
So, after telnetting to the Sumo on 192.168.2.1 I first needed to stop the Sumo process that was accessing the camera (disclaimer: this is the process that does "everything" that makes the Sumo work with your phone, so you'll need to reboot to restore normal functionality):
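That stop step looks something like the following sketch, run on the Sumo's own shell after telnetting in; the exact ps output columns on the Sumo's BusyBox shell are an assumption:

```shell
# On the Sumo itself, after `telnet 192.168.2.1`.
# Find the PID of dragon-prog (the process holding the camera) and kill it;
# the !/awk/ filter stops the awk process matching its own command line.
PID=$(ps | awk '/dragon-prog/ && !/awk/ {print $1}')
if [ -n "$PID" ]; then
    kill "$PID"
fi
```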
I've tidied up the install significantly and switched the RFID tag to be on top of the robot. Previously it was at the front (to be close to the detection coil) but that was obscuring the charging coil and camera. Doing it this way also allows me to play with the balancing mode of the Jumping Sumo.
The charging appears to be very slow, though, so the next step will be to push the charging coil completely flush to the box to get the closest possible coupling.
Here's a close-up of the neater install (I've since routed the power supplies through a cable gland):
Obviously the IKEA box will be replaced with something a little more visually appealing (and with less masking tape), but the basic RFID detection and charging-enablement control loop works perfectly. The code for the Raspberry Pi is on GitHub.