thisismyrobot

Littered with robots and Python code

Python's working on DSL

I found a .uci package of Python today and ran it under the latest version of DSL. It installed quickly and works fine, but I was unable to import the pyserial library due to the way that DSL loads .uci files into a read-only memory space. Instead, I had to copy the contents of the "serial" directory in the "pyserial-2.2" archive to the root of my program folder, then rename "__init__.py" to "serial.py".

This got the program starting without error but, as yet, I have been unable to get any actual serial data back from the robot. I assume this is simply a baud rate issue; under XP/Vista I had to set up the USB-RS232 adapter in Device Manager before it would work, and I just need to find the equivalent under Linux.
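For reference, the pyserial side of this is only a handful of settings. Here is a minimal sketch of the link configuration, assuming the CMUCam's default of 115200 baud, 8N1 (the helper name and port label are just illustrative):

```python
def cmucam_serial_settings(port):
    """Keyword arguments for pyserial's Serial() constructor.

    Assumes the CMUCam's defaults: 115200 baud, 8 data bits, no
    parity, one stop bit - the same settings I had to force onto
    the USB-RS232 adapter under XP/Vista.
    """
    return {
        "port": port,
        "baudrate": 115200,
        "bytesize": 8,
        "parity": "N",     # no parity
        "stopbits": 1,
        "timeout": 1,      # don't block forever if the robot stays silent
    }

if __name__ == "__main__":
    import serial  # pyserial
    link = serial.Serial(**cmucam_serial_settings("/dev/ttyS1"))
```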

More to come AFTER tomorrow's exam.

Shopping...

I just ordered a K1001 8 Channel Serial Servo Controller from Ozitronics (under motor control). This fully assembled and tested unit was fantastically priced at $44AUD and can control 8 servo motors over RS232. More importantly, it can use any of its 8 output pins for digital output. Four of these pins are spoken for by the h-bridge in the robot. This should give me much better motor speed control than before, and head-room for expansion with a servo-based arm/gripper etc.

OK, back to study...

It Moves!

Finally, after months of mucking around, the robot moves under its "own" steam.


Since my last post, I have made a small change to the code with the addition of a very simple "fuzzy" motor control algorithm. Specifically, the offset of the target in the camera's view directly influences the pause between switching the motor on and then off again. This allows for large corrections initially, followed by small ones as the robot locks on. These "on" durations can be clearly heard in the video. I have also uploaded the program currently in use here.
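The idea behind the "fuzzy" pulse timing can be sketched in a few lines. Note that the names and constants below are illustrative, not the exact values from the uploaded program:

```python
def pulse_duration(offset, half_frame=40.0, min_on=0.05, max_on=0.5):
    """Map the target's horizontal offset from the frame centre to a
    motor "on" time: a large error gives a long pulse, a small error
    a short blip, so corrections shrink as the robot locks on.
    """
    error = min(abs(offset) / half_frame, 1.0)  # normalise to 0..1
    return min_on + (max_on - min_on) * error
```

A target far off-centre gets the full half-second pulse, while a nearly centred target only gets a 50ms blip.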

Oh, and as a testament to the cross-platform abilities of Python: the laptop used for this demo is running Vista, and all I had to do was install the two USB-RS232 adaptors (from here), set up the baud rates, and change the code to match the allocated ports, and it worked first go!

Next step, installing Python on the MicroClient Jr.

The snake wins me over

It has been an interesting day today - exam study combined with robot-related diversions that led to the altering of two fundamental components of the robot.

As already mentioned, I initially started this project wanting the focus to be on developing light-weight effective vision-based algorithms (keeping my expectations to a minimum...)

I wanted to quickly develop a basic and robust platform that could handle a large range of programming ideas. As part of this I devised an - in theory - fairly cross-platform motor-control circuit based on a keyboard's Caps/Num/Scroll Lock LEDs (described in articles below). Testing with the power of my index finger on the Caps Lock key, I was able to reach a fairly high rate of switching, surely enough to form a rudimentary PWM system for the robot's motors.

In line with my cross-platform ideals, I chose Java as my programming language, primarily due to its Toolkit method setLockingKeyState() for controlling the keyboard's LEDs. Also influencing this decision was a cross-platform (Windows/Linux) serial IO class included in the CMUCam demo files.

Cobbling these together, I created my first Java application for the robot, implementing some very rudimentary functionality - the tracking of a big red box on a very not-red background. This worked: as described in the preceding article, the robot was able to lock onto the box's location. Unfortunately, a glaring problem quickly appeared, related to how Java implements the setLockingKeyState() method. This method is part of the Java GUI toolkit, and Sun accompanies the method description with the following note (see here):

Many GUI operations may be performed asynchronously. This means that if you set the state of a component, and then immediately query the state, the returned value may not yet reflect the requested change.

This is certainly a bit of an understatement. The state set via setLockingKeyState() virtually never reflects the LED's actual state, so the robot will occasionally just "bolt for the door" after turning "off" the LEDs, or not move at all when told to turn, etc.

This led me to a dilemma. Along with the reasons stated above, I chose Java for its platform independence and relatively high-level programming methodology. I wanted to develop an aspect of the robot's programming on my desktop PC in Windows, test it in Windows, pop the file on a USB stick and run the code instantly on the robot for final debugging. I did not want to be altering the code per OS or hardware architecture. The speed of the program was not (at the moment) the most important thing; I just wanted rough results for my ideas in a short time-frame. On top of the keyboard LED problems, I was having problems with the serial IO class, as I was taking it well beyond its initial specification (primarily in the areas of parsing input and timing).

These shortcomings, along with the amount of time spent debugging vs developing, led me to the conclusion that I had to try another programming language.

After much arm-waving, a possible solution hit me. I had used Pyro (Python Robotics) a couple of years ago to play with some simple neural-network ideas I'd had. Whilst not wanting to use the full Pyro system (yet), I did enjoy the high-level power of Python, and 10MB later, I was in business. In just 1 hour, the entire demo application that had taken me days to develop in Java (primarily messing with serial comms) was recreated in Python. A fantastic side effect of the swap to Python is that moving from Windows to Linux requires virtually no changes - a change of the port label from "COM1" to "/dev/ttyS1" is all that is needed.
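That one change is literally a one-liner. A sketch (the function name is my own):

```python
import sys

def default_port():
    """Return the serial port label for the current OS - the only
    code change the Windows-to-Linux move actually required."""
    return "COM1" if sys.platform.startswith("win") else "/dev/ttyS1"
```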

Naturally, there were problems. I was unable to find (so far) a method of altering the keyboard LEDs, so aside from the easy and robust cross-platform RS232 comms, I had hardly progressed. This led me to Google, which led me to this.

Now came the second big change. Windmeadow's idea had me dump the keyboard-controller based interface in favour of using the RS232 RTS and DTR pins to control each motor via my second serial port (as a temporary fix at least). Whilst you could argue this is something I should have done at the start, I really wanted to use the keyboard IO board, as it would make use of the robot's PS/2 port (unused during operation), possibly negating the need for a battery-hungry USB hub.

Regardless, after all this mucking around, I have decided to make the best use possible of the robust serial methods in Python (via pyserial), and I will upgrade the motor control to an RS232 motor-controller board.

Oh, and despite the hacked nature of the RTS/DTR fix, I used it to attain the same functionality as the keyboard LEDs, but with 100% reliability and excellent control. I was able to reliably send pulses to the motor with durations of less than 1/10 of a second!
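The RTS/DTR trick itself is only a few lines under pyserial. A rough sketch - the helper and pin mapping are my own, and `ser` is an open `serial.Serial` object:

```python
import time

def pulse(ser, pin, duration):
    """Pulse one RS232 control line to drive a motor.

    `pin` is "rts" or "dtr"; pyserial's setRTS()/setDTR() switch the
    line directly, and unlike the keyboard LEDs the change always
    takes effect - pulses well under 1/10 of a second are reliable.
    """
    setter = ser.setRTS if pin == "rts" else ser.setDTR
    setter(True)      # line high -> motor on
    time.sleep(duration)
    setter(False)     # line low -> motor off
```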

Well, that's it then. My suggestion: if you want to quickly build something that uses a lot of RS232 and does what you want when you ask it, try Python.

Progress (Finally!)

Finally, (in the middle of my exams, of course) I have some proper progress to report on my robot.

I have determined the first feature/ability benchmark for the robot, and I have made some significant progress towards that benchmark.

Before the first benchmark, I had a number of goals, outlined below:
  • Fabrication of basic frame, motors, camera hardware etc (90% complete)
  • IO board for PC-based motor control (70% complete)
  • Battery systems (50% complete)
Some of these will have to be completed further before the first benchmark can be met; the rest are good enough for proof-of-concept development.

The first benchmark is based on creating some simple vision based navigation software and is defined as such:
  • Provide and act upon locational information on coloured target (100% complete)
    • This is the detection of a red target (as seen in the image on the right), and the reporting of commands to rotate the camera (robot) to centre the target horizontally. This has now been completed, with the robot able to locate and centre on the target autonomously, even if the target is not initially in the frame.
  • Applying commands (50% complete)
    • Finish attaching the camera to the robot, connecting the camera to the MicroClient Jr, and debugging any movement command problems.
  • Path Planning (10% complete)
    • This is the determination and actioning of a path towards the target, including the avoidance of obstacles (Conveniently placed shoes...) along the way.
Now, obviously, I don't expect to create a perfect vision-based navigation program in the next two weeks! As you will notice, the floor is rather conveniently white and the obstacles are rather conveniently contrasting black/purple. If the code under development for the path-planning works, the robot should be able to navigate a simple situation like this with just the single camera. It should also be noted that the CMUCam does the majority of the actual target tracking; most of the work described in this article was in the development of a robust RS232 command suite for the camera and a system for converting the target's locational information into movement commands.
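For the curious, the conversion from tracking data to a steering error is simple once the camera's packet is parsed. A sketch, assuming the CMUCam1's ASCII "M" packet format (M mx my x1 y1 x2 y2 pixels confidence - check your firmware's manual, as this is from memory):

```python
def target_offset(packet, frame_width=80):
    """Return the target centroid's horizontal offset from the frame
    centre (negative = left of centre), or None for a non-M packet.
    """
    fields = packet.split()
    if len(fields) < 2 or fields[0] != "M":
        return None
    mx = int(fields[1])   # centroid column, 0..frame_width-1
    return mx - frame_width // 2
```

The sign of the offset picks the turn direction, and its magnitude feeds the movement commands.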

Speaking of the camera, the image shown above is an actual still from the CMUCam. When recorded, the image has a resolution of 80x143 pixels; I have doubled the horizontal resolution to repair the aspect ratio. Before the alteration, the image looked like this (on left).

Finally, I will post some code for the tracking and centring as soon as it is debugged in my damn-small-linux build.

CeBIT 2008 Australia

I've just returned from CeBIT 2008 in Darling Harbour, Sydney, Australia (thanks to my work) where, in my free time, I did my best to soak up as much embedded-computing technology as I could. Here are some of my highlights.

Backplane Systems
Backplane Systems had a number of exciting competitors to the Norhtec MicroClient Jr that I am using, although most were in a slightly higher price bracket (with accompanying higher specs/performance, of course). Here is an example of one of their fanless micro PCs, the MCP21A. This little beauty comes with 4x RS232/422/485 ports, 4x opto-isolated inputs, a 500MHz processor, USB, etc.

Wincomm and ICP-Australia
Whilst at CeBIT, I found a product that I really want to get my hands on. Both Wincomm and ICP-Australia had a range of fanless, touch-screen panel PCs on display. These come with processors in the 500MHz-1GHz range, 7-10 inch LCDs, 1GB of RAM, etc. Most come with a number of RS232 ports, and will run full-blown Linux installs and/or XP (on the right is an example from ICP-Australia).
All I could think of was strapping a pair of gearboxes, some batteries and a castor wheel onto one, and letting it run free!

There were some other companies I noticed that were involved in embedded computing, particularly ARBOR Australia, SOANAR and AFOLUX.

The whole trip was a fantastic chance to catch up with current embedded-computing technology; I shall definitely be there again next year!

For more info on these companies, google is your friend.

7 gram robot can jump 1.4m every 3 seconds!

(Yes, I'm back, after a study-necessitated break)

Researchers from the Laboratory of Intelligent Systems at EPFL have demonstrated a 7 gram micro-robot that can jump more than 27 times its own body size. Using a tiny pager motor, the robot builds up a store of elastic energy that is released to propel it into the air. The researchers envisage extending the technology with additional power from solar panels and a suite of micro-sensors for remote monitoring in hard-to-reach locations.

Gallery (Gizmag)

Sources: Gizmag, Physorg

Copehill Down's Robot invasion

This is an interesting article that was brought to my attention by one of my intrepid reporters, Nick.

Copehill Down in England, a mock East German village used for urban warfare simulations, will be the focus of a Robot invasion in August.

The UK's Ministry of Defence (MoD) will be conducting a competition to sort the wheat from the chaff when it comes to Robotic Urban Warfare.

Eleven teams will compete to find a number of threats throughout the village, with the winner possibly gaining an MoD contract for wartime deployment.

The New Scientist article briefly covers three of the teams, and a quick Google search will fill in the rest. I will certainly be following this competition!

SwashBot has a friend

It seems to be the time for follow-ups at the moment. Previously, I wrote about an awesome little robot called SwashBot that was based on the swash plate of an RC Helicopter.

Yesterday, I was looking at the pictures again and noticed an addition at the bottom. SwashBot's creator, Crabfu, has just built a larger version, SwashBot 2 (pictured beside the original).

There is a video of SwashBot 2 on his site; it has the same style of gait as the original, but is somewhat larger.

Kudos goes out to Crabfu; both of these bots are fantastic examples of interesting robots built on a budget. I just hope he puts a little automation into them now; there are plenty of microcontrollers that could handle that many servos directly, and that gait, coupled with a simple light-seeking, obstacle-avoiding behaviour, would be fantastic to see.

Hexapod follow-up

Recently, I wrote about Hexapod, the fantastic robot by Matt Denton. Robots Dreams now has another article about Hexapod, this one primarily a video.

If you have 1:33 free in your day, I would certainly watch the video, as it has more fantastic footage of its fluid motion.

source