Difference between revisions of "Team Commando"

From GcatWiki
  
 
==Image Processing==
*Camera
**Logitech Orbit (on a stick)
*Software
**MATLAB toolbox
**''Image Processing in C'' (Phillips): http://homepages.inf.ed.ac.uk/rbf/BOOKS/PHILLIPS/cips2ed.pdf
**RoboRealm
***useful demo: http://www.roborealm.com/tutorial/Fun_with_Roomba/slide070.php
*Method
**attach a laptop to the iRobot
**receive a signal from the robot when it arrives in a room, via the Command Module serial port => MATLAB
**the webcam pans while RoboRealm scans for magenta
***MATLAB checks whether RoboRealm has found magenta at each degree of the pan
****if found, it places a one in the vector entry corresponding to that room
  
 
*Current dataset
*RoboRealm
**Color to detect: magenta?
**attach the color to the bottom of the window, so it can only be detected when the window is open
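One note for the later to-do about the "exact frequency of magenta": magenta is non-spectral, so it has no single frequency; it does have a definite hue, sitting around 300° in HSV, which is roughly what a hue-based filter like RoboRealm's keys on. A minimal sketch of per-pixel magenta classification using the standard-library `colorsys` (the hue window and saturation/value floors are guesses to tune against real lighting, not values from RoboRealm):

```python
import colorsys

def is_magenta(r, g, b, hue_lo=280, hue_hi=330, min_sat=0.4, min_val=0.3):
    """Classify one RGB pixel (0-255 channels) as magenta via HSV.
    The hue window and sat/val floors are assumptions to tune."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    return hue_lo <= h * 360 <= hue_hi and s >= min_sat and v >= min_val

print(is_magenta(255, 0, 255))  # pure magenta -> True
print(is_magenta(255, 0, 0))    # pure red -> False
```

The saturation/value floors are there so that dim or washed-out pixels (shadows, glare) don't register as magenta even when their hue lands in the window.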
  
*What We Have So Far:
**RoboRealm module string to find magenta blobs and control the Orbit
**MATLAB script that opens RoboRealm, controls the camera, and gets the magenta-detection info
  
*To-Do:
**get magenta tape
**test the Orbit in different lighting conditions
**look into more filter options
***sunglasses seemed like an option
***Rob McSwain has been emailed about video filters
**figure out the exact hue of magenta and how that maps to hue/intensity in RoboRealm, so that extraneous magenta objects are not picked up (magenta is non-spectral, so it has a definite hue but no single frequency)
**add something to the MATLAB script to interpret the vector once the Roomba has been everywhere
**test/figure out the serial-port pings to MATLAB
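For the serial-port pings: whatever byte the Command Module ends up sending, the listener side amounts to "block on the serial stream until the agreed arrival byte shows up, then scan." A sketch of that loop with a plain byte stream standing in for the serial port (the `0xA5` ping byte and the room-index-byte-follows framing are invented for illustration; the real protocol still has to be decided):

```python
import io

ARRIVAL = 0xA5  # hypothetical "robot has entered a room" ping byte

def wait_for_arrival(port):
    """Read the stream until the arrival ping, then return the room
    index byte that (in this invented framing) follows it."""
    while True:
        b = port.read(1)
        if not b:
            return None  # stream closed without a ping
        if b[0] == ARRIVAL:
            room = port.read(1)
            return room[0] if room else None

# stand-in for the serial port: two noise bytes, then ping + room index 3
fake_port = io.BytesIO(bytes([0x00, 0x17, ARRIVAL, 0x03]))
print(wait_for_arrival(fake_port))  # -> 3
```

The same loop should drop in unchanged once `fake_port` is replaced by a real serial-port object, since both expose a blocking `read(n)`.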
  
  
 
*Active Thoughts:
**How should the camera move? It currently pans 180 degrees; the answer may depend on how the navigation people handle entering a room.
**interface with MATLAB: http://www.mathworks.com/matlabcentral/fileexchange/22882

Revision as of 03:58, 7 May 2012
