Difference between revisions of "Team Commando"
From GcatWiki
Revision as of 03:58, 7 May 2012
To Do
- command module navigation (Daniel and Matt)
- image processing (Steph, Evan, Jonah)
- decide on a camera
- abstract
- poster
Image Processing
- Camera
  - Logitech Orbit (on a stick)
- Software
  - http://homepages.inf.ed.ac.uk/rbf/BOOKS/PHILLIPS/cips2ed.pdf
  - RoboRealm
    - useful demo: http://www.roborealm.com/tutorial/Fun_with_Roomba/slide070.php
- Method
  - attach a laptop to the iRobot
  - receive a signal from the robot when it has arrived in a room, via the command module serial ports => MATLAB
  - the webcam pans while RoboRealm scans for magenta
    - MATLAB checks whether RoboRealm has found magenta at every degree of the pan
      - if magenta is found, MATLAB places a one in the vector entry corresponding to that room
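The pan-and-scan logic above can be sketched as follows. The team's actual script is MATLAB driving RoboRealm; this is an illustrative Python sketch in which the RoboRealm/Orbit call is stubbed out with a hypothetical `detect_magenta` function, so only the control flow (not the real API) is shown.

```python
def scan_room(room_index, room_vector, detect_magenta, pan_range=range(-90, 91)):
    """Pan the camera across pan_range (degrees); if magenta is seen at any
    angle, set this room's entry in room_vector to 1."""
    for angle in pan_range:
        # In the real setup: command the Orbit to `angle`, then ask
        # RoboRealm whether a magenta blob is currently in view.
        if detect_magenta(angle):
            room_vector[room_index] = 1
            break  # one hit is enough; stop panning early
    return room_vector

# Example with a fake detector that "sees" magenta only at 30 degrees:
rooms = [0, 0, 0]
scan_room(1, rooms, lambda angle: angle == 30)
# rooms is now [0, 1, 0]
```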
- Current dataset
  - 26 closed windows
  - 26 open windows
  - https://www.dropbox.com/sh/hx5rwp9ilakwg5c/8Yf4twUjV1
- RoboRealm
  - Color to detect: magenta?
  - attach the color to the bottom of the window so it can only be detected when the window is open
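The "is this pixel magenta?" decision can be sketched as a hue/saturation threshold, which is roughly what a RoboRealm color filter does internally. The thresholds below are placeholder guesses to be tuned against the window dataset, not the team's actual filter settings.

```python
import colorsys

def is_magenta(r, g, b, hue_tol=0.06, min_sat=0.5, min_val=0.3):
    """r, g, b in [0, 1]. Pure magenta sits at hue 5/6 (300 degrees);
    require enough saturation/value so dark or washed-out pixels don't match."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return abs(h - 5/6) < hue_tol and s > min_sat and v > min_val

print(is_magenta(1.0, 0.0, 1.0))   # pure magenta -> True
print(is_magenta(0.2, 0.2, 0.2))   # dark gray -> False
```

Tightening `hue_tol` is one way to address the later To-Do item about rejecting extraneous magenta-ish objects.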
- What We Have So Far:
  - a RoboRealm module string to find magenta blobs and control the Orbit
  - a MATLAB script that opens RoboRealm, controls the camera, and retrieves magenta detection info
- To-Do:
  - get magenta tape
  - test the Orbit in different light conditions
  - look into more filter options
    - sunglasses seemed like an option
    - Rob McSwain has been emailed about video filters
  - figure out the exact frequency of magenta and how it corresponds to hue/intensity in RoboRealm, so that extraneous magenta objects are not picked up
  - add something to the MATLAB script to interpret the vector once the Roomba has been everywhere
  - test/figure out the serial port pings to MATLAB
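The vector-interpretation To-Do item might end up looking something like this: once the Roomba has visited every room, turn the 0/1 vector into a readable report of which windows were open. Again an illustrative Python sketch (the real script would be MATLAB), and the room names are placeholders.

```python
def report_open_windows(room_vector, room_names):
    """Return the names of rooms whose vector entry is 1 (magenta was seen,
    i.e. the window was open)."""
    return [name for name, hit in zip(room_names, room_vector) if hit == 1]

print(report_open_windows([1, 0, 1], ["Room A", "Room B", "Room C"]))
# -> ['Room A', 'Room C']
```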
- Active Thoughts:
  - How should the camera move? It currently pans 180°; the answer may depend on how the navigation group handles room entering.