After a long hiatus on group projects, we are starting up a new project to build smart servos based on modified hobby RC servos. The basic idea is to make a PCB with a microcontroller and some motor drivers that fits into a standard-sized hobby servo case. We might even try to fit into a mini servo case as well. We hope to add continuous rotation and position sensing via AMS magnetic position encoders, but those features may come in a later revision of the board. If you are interested in participating in this project, go to the Servo Project Brainstorming page and start adding your ideas and comments.
We’ve been making slow, intermittent progress on firmware for the Brushbot. We reached a new milestone in November, however. We now have simultaneous IR code transmission and reception working, both between bots and within a single bot, where it is used for obstacle detection.
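For a rough picture of the within-bot side, here is a minimal sketch of reflection-based obstacle detection. The function name, threshold, and sampling scheme are illustrative assumptions, not the actual Brushbot firmware: while a bot transmits its modulated IR burst, it samples its own demodulating receiver, and if enough samples see the carrier, something nearby is probably reflecting the transmission.

```python
# Hypothetical sketch, not the Brushbot firmware: decide whether our
# own IR burst came back as a reflection. receiver_samples holds
# booleans sampled from a demodulating IR receiver while this bot was
# transmitting its burst (active-low receivers would be inverted first).

def detect_obstacle(receiver_samples, threshold=0.5):
    """Return True if enough samples saw the carrier during our own
    transmission, suggesting a nearby object reflected it."""
    samples = list(receiver_samples)
    if not samples:
        return False
    hit_ratio = sum(samples) / len(samples)
    return hit_ratio >= threshold
```

With a 10-sample window, this flags an obstacle once half the samples show the carrier; real firmware would also have to blank out the emitter's direct leakage into its own receiver.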
Cat uploaded a video she took of my dual brushbot avoiding/attacking objects during the November 15 group project meeting. See the PARTS flickr pool.
We had a great time at the November meeting watching and discussing a number of different robot-related videos.
Here are links to most of them:
The URL is: http://github.org/portlandrobotics
Scott Dixon will fill in details of how to use it. Enjoy!
Come on down Monday night and join in on the BrushBotComm activities. We’ll be continuing with the IR receiver/transmitter protocol development and mounting the boards onto BrushBots.
6/28, 7pm bRrainSilo
Hope to see you there!
The next group project meeting is Monday evening — same time, same place:
6/14, 7pm bRrainSilo
How many people want to build up some BrushBotComm boards using the Tiny84 at the meeting? (Tiny44s are still out of stock.)
If there’s enough interest, we’ll do it — please reply on the mailing list, or below, with the number of Tiny84-based boards you want to build.
For those with a board already, let’s start working on the Rx/Tx protocol now that we have the IR working, and figure out the best way to strap these boards onto the BrushBots!
See you all Monday!
No project meeting this Monday 5/31 as it’s a holiday. But don’t let that stop you from dedicating your three-day weekend to concocting the best IR communication protocol the Portland area has ever seen! Or just kick back and enjoy some nice BBQ weather. Whichever.
It would be cool to hear about any progress with the latest (ver0.3) IR Tx/Rx code that was released a couple weeks ago though. Anybody try it yet?
Next project meeting: June 14.
After this morning’s PARTS meeting and Indoor Challenge, a few of us were talking about using optical mouse sensors for tracking robot position rather than wheel-based encoders. Optical mouse sensors use optical-flow techniques to determine X & Y motion based on differences between successive snapshots of the underlying surface texture. These sensors contain very low-resolution (18×18) cameras and all the necessary DSP to distill the information down to a delta-X and a delta-Y value. It’s possible to read the raw image data out as well.
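As a sketch of what talking to one of these parts involves, here is the framing logic for a register read over the ADNS-2620's two-wire synchronous serial interface (SCLK plus a bidirectional SDIO line), with the actual pin wiggling abstracted into callbacks. The register addresses and read-command format below are my reading of the datasheet and should be verified before use.

```python
# Sketch of an ADNS-2620 register read; addresses and command format
# are assumptions from memory of the datasheet -- double-check them.

DELTA_Y = 0x02  # assumed register addresses; verify against datasheet
DELTA_X = 0x03

def read_register(addr, write_bit, read_bit):
    """Clock out a read command (MSB of the address byte cleared to 0
    for reads), then clock in the 8-bit reply, MSB first.

    write_bit(b): drive SDIO to bit b and pulse SCLK (output phase)
    read_bit():   pulse SCLK and sample SDIO, returning 0 or 1
    A real driver would also insert the datasheet's turnaround delay
    (on the order of 100 us) between the address and data phases.
    """
    addr &= 0x7F  # MSB = 0 marks a read
    for i in range(7, -1, -1):
        write_bit((addr >> i) & 1)
    value = 0
    for _ in range(8):
        value = (value << 1) | (read_bit() & 1)
    return value

def to_signed8(value):
    """Delta_X/Delta_Y come back as 8-bit two's complement."""
    return value - 256 if value & 0x80 else value
```

On an AVR the two callbacks would collapse into direct port manipulation; keeping them separate here just makes the bit framing visible and testable.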
I’ve been wanting to play around with optical mouse sensors for a while, and the PARTS Indoor Challenge seemed like a good target application, so I threw some Avago ADNS-2620 sensors into the last DorkBot group order. Unfortunately I didn’t have time this last week to do anything with them. I figured I would need two sensors to track X, Y, and rotation since the mouse sensor doesn’t do rotation. This certainly isn’t a new idea — mouse sensors have been used for robotics for quite a while, but I hadn’t done much research on it until now. Here are some good links:
Precise Dead-Reckoning for Mobile Robots Using Multiple Optical Mouse Sensors (PDF)
Evaluates accuracy of two vs four mouse sensors compared to encoder-based dead reckoning.
Two sensors are significantly better than encoder-based DR if the robot speed is less than the max rate of the mouse sensors; four sensors are even better.
Four sensors are still significantly better than encoder-based DR even when the robot speed is greater than the max rate of the mouse sensors.
Testing was done on a felt surface, however, so the variability of the floor we’ve been running the PARTS Indoor Challenge on may require more than two sensors for robust positioning, even at slower speeds.
Cody’s Robot Optical Motion Sensor #1 (CROMS-1)
Nice writeup on hacking an optical mouse sensor, including creating a custom lens assembly for increased range.
Implementation Of An On-Chip Insect-Inspired Optic Flow Based Navigation Sensor
NASA Tech Brief referenced by Cody’s above writeup. Using optical mouse sensors with custom lens assemblies for flying robots (at low altitudes). Overviews the optical design methodology and presents data from actual flight testing. Concludes that optical mouse sensors are usable for terrain-following behavior on a robotic flier. Free registration is required to access the report.
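To make the two-sensor idea from the dead-reckoning paper concrete, here is a sketch of one pose-update step. The mounting geometry is an assumption chosen for algebraic simplicity (two sensors on the robot's x-axis at +L and -L, axes aligned with the robot frame), not the paper's actual setup.

```python
import math

def pose_update(pose, d1, d2, L):
    """Integrate one step of two-sensor optical dead reckoning.

    Assumed layout (illustrative, not from the paper): sensors at
    (+L, 0) and (-L, 0) in the robot frame, axes aligned with it.
    d1, d2 are their (dx, dy) readings for this step, already
    converted from counts to distance units.

    For a small rigid-body motion (tx, ty, dtheta), a sensor at
    offset r sees  d = (tx, ty) + dtheta * (-r_y, r_x), so here:
        d1 = (tx, ty + dtheta*L),  d2 = (tx, ty - dtheta*L)
    which is inverted below, then rotated into the world frame.
    """
    x, y, theta = pose
    tx = (d1[0] + d2[0]) / 2.0
    ty = (d1[1] + d2[1]) / 2.0
    dtheta = (d1[1] - d2[1]) / (2.0 * L)
    # body-frame translation -> world frame, using the midpoint heading
    c = math.cos(theta + dtheta / 2)
    s = math.sin(theta + dtheta / 2)
    return (x + c * tx - s * ty, y + s * tx + c * ty, theta + dtheta)
```

With this layout the two X readings should agree, so their disagreement makes a handy sanity check; adding a second pair of sensors (the paper's four-sensor case) just extends the same over-determined solve.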
Required components for optical mouse sensor (if you want to start from scratch rather than hack a mouse):
(These aren’t necessarily the best mouse sensor for robot navigation, but they are readily available.)
More to come!
New to the Portland area, and therefore new to the PARTS group.
I’m currently working on a rather large robot that will use several microprocessors. I have just added AVR processors to the list of processors that I work with. Now to get them all talking to each other. More later, as I’m sitting in the April meeting right now…
Hi folks, I created a Flickr pool for us. You have to be approved to join in order to post new photos to the pool.
Here’s the link: