Team R2D2
To Do
- Connect to iRobot & send command with MATLAB (with laptop) - Cyrus
- Decide on Camera - Leland & Corey
- Subgroup Make Outline
Imaging
Hardware Setup
Summary
- The camera will be mounted in a fixed position; to pan left or right we will turn the robot
- We will have to decide whether we want to build a mechanism to raise/lower the camera
- We need a camera that we can also connect to via Bluetooth; at first this could be a wired connection, but we need a direct USB connection in the final version
- Good Link
Possible Cameras
- Something
- BT-1 - 640x480
- For obtaining online images via MATLAB
- Serial port integrated camera--looks good
- Foscam FI8918W
SmartPhone Camera Vision
- Use an Android phone and VLC
- http://androidcloudcam.net/
- http://lifehacker.com/5650095/ip-webcam-turns-your-android-phone-into-a-remote-camera
- VLC Command Line
Feed Streaming
With VLC
- See this thread: http://forum.videolan.org/viewtopic.php?f=14&t=81119
- http://www.videolan.org/doc/play-howto/en/ch04.html
- Make VLC alias
alias vlc='/Applications/VLC.app/Contents/MacOS/VLC'
- Now connect, and the live feed comes up:
vlc -vvv "http://10.40.181.49/videostream.cgi?user=test&pwd=&rate=3"
With Python/OpenCV
- http://stackoverflow.com/questions/3001881/display-an-webcam-stream-in-pyqt4-using-opencv-camera-capture
- https://www.dropbox.com/s/3el9e70y71f19ig/webcam.py
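A minimal sketch of pulling the same feed into Python with OpenCV's cv2 bindings (assumes OpenCV was built with FFmpeg support and that the camera's MJPEG URL is the videostream.cgi address used above):

# Sketch: read the IP camera's MJPEG stream with OpenCV and display it.
# Assumes the cv2 bindings were built with FFmpeg so VideoCapture can open
# an HTTP URL; the address/credentials are the ones from the VLC step above.
import cv2

STREAM_URL = "http://10.40.181.49/videostream.cgi?user=test&pwd=&rate=3"

cap = cv2.VideoCapture(STREAM_URL)
if not cap.isOpened():
    raise IOError("could not open stream: " + STREAM_URL)

while True:
    ok, frame = cap.read()              # frame is a numpy array (BGR)
    if not ok:
        break
    cv2.imshow("ip camera", frame)
    if cv2.waitKey(30) & 0xFF == 27:    # press Esc to quit
        break

cap.release()
cv2.destroyAllWindows()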
How to connect to IP camera
- Connect to the davidson device using WPA Personal with the passphrase
- Go to the address http://10.40.181.49/
- Log in using our ID and password
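A quick way to sanity-check the connection from Python before bothering with OpenCV (a sketch; it only assumes the address and test user shown above):

# Sketch: confirm the camera answers over HTTP (Python 2 / urllib2).
import urllib2

url = "http://10.40.181.49/videostream.cgi?user=test&pwd=&rate=3"
resp = urllib2.urlopen(url, timeout=5)
print resp.getcode(), resp.info().gettype()   # expect 200 and an MJPEG content type
resp.close()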
Install OpenCV on Mac
- Install with MacPorts here
- Use Command:
sudo port -v install opencv +python27
- Now check which Python versions MacPorts knows about:
port select python
- And confirm OpenCV is installed:
port installed opencv
- Then, depending on MacPorts settings, run the interpreter at:
/opt/local/bin/python2.7
- If we want to set this as the default interpreter:
sudo port select --set python python27
- To uninstall something:
sudo port uninstall opencv @2.3.1a_3+python26
- While we are at it, go ahead and install PIL
sudo port install py27-pil
- Also install the GStreamer Python bindings
sudo port install py27-gst-python
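A quick sanity check that the MacPorts builds are importable (a sketch; run it with the MacPorts interpreter, e.g. /opt/local/bin/python2.7):

# Sketch: confirm the MacPorts OpenCV and PIL installs work.
import cv2
import Image                     # installed by py27-pil
print "OpenCV version:", cv2.__version__
print "camera 0 opens:", cv2.VideoCapture(0).isOpened()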
(Below are rough notes on building OpenCV from source against the Enthought Python Distribution)
- Install with OpenCV
- Otherwise, download the latest OpenCV release for Mac... never mind, this won't work for Enthought, just give up.
- Install ffmpeg using this tutorial
- Install libdc1394
- Configure with cmake, pointing the Python variables at the EPD install (the last argument is the OpenCV source directory):
PYTHON_PACKAGES_PATH=/Library/Frameworks/EPD64.framework/Versions/7.2/lib/python2.7/site-packages \
PYTHON_LIBRARIES=/Library/Frameworks/EPD64.framework/Versions/7.2/Frameworks \
PYTHON_EXECUTABLE=/Library/Frameworks/EPD64.framework/Versions/Current/bin/python \
PYTHON_INCLUDE_DIR=/Library/Frameworks/EPD64.framework/Versions/7.2/include/python2.7 \
cmake -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local -D BUILD_PYTHON_SUPPORT=ON \
  /Users/lelandtaylor/Downloads/OpenCV-2.4.0.tar/OpenCV-2.4.0/
- Run sudo make install
Issues
- How do we connect to the camera wirelessly?
- New leads:
- http://bitshift.bi.funpic.de/en/dslr-remote/manual/bluetooth.php
- http://www.informationweek.com/news/cloud-computing/software/228800042
- Pros: we can use almost any DSLR camera.
- Cons: a lot of (potentially very difficult) coding. And we still need to figure out how to send the photo back to our laptop.
- MATLAB approach:
- http://www.mathworks.com/matlabcentral/newsreader/view_thread/154601
- Pros: it's been done before (though maybe not without the acquisition toolbox, but I think it's possible). It looks very straightforward.
- Cons: the Image Acquisition Toolbox costs a lot of money. Couldn't find anything about using newer versions of MATLAB to do this. We would have to use a low-res webcam.
Image Processing
Matlab Imaging
- Image Acquisition Toolbox - I think this costs money
- MATLAB toolbox to control Canon cameras
OpenCV
- Has some nice functions that could be used to detect objects
- OpenCV (Open Source Computer Vision) is a library of programming functions for real time computer vision.
- "applications of the OpenCV library are Human-Computer Interaction (HCI); Object Identification, Segmentation and Recognition; Face Recognition; Gesture Recognition; Motion Tracking, Ego Motion, Motion Understanding; Structure From Motion (SFM); Stereo and Multi-Camera Calibration and Depth Computation; Mobile Robotics."
- OpenCV Documentation
- Python Interface
- C++ Interface
- List of compatible cameras
- Camera used in FLAIL project: Logitech Webcam Pro 9000
- This camera will work with OpenCV and MATLAB camera control features (see here)
- If we go with openCV for grabbing the images from webcam and matlab for navigation and some processing, then we'll need a way for the two to talk. Here are some (potentially) helpful links. I haven't yet found anything specifically about MATLAB/terminal interaction.
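One simple (untested) way to bridge the two: have a small Python script grab a frame and write it to disk, then call it from MATLAB with system() and load the result with imread. The script name and output path below are made up for illustration.

# grab_frame.py -- hypothetical helper for the OpenCV/MATLAB handoff.
# From MATLAB, something like:
#   system('python grab_frame.py'); img = imread('/tmp/frame.png');
import sys
import cv2

OUT_PATH = "/tmp/frame.png"      # illustrative path, not from the wiki

cap = cv2.VideoCapture(0)        # default webcam
ok, frame = cap.read()
cap.release()
if not ok:
    sys.exit("camera grab failed")
cv2.imwrite(OUT_PATH, frame)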
Below is Python code taken from FLAIL v2
# Imports assumed from the rest of the FLAIL source (the old SWIG-based
# OpenCV Python bindings, pre-cv2):
import opencv
from opencv import adaptors, highgui

##################
# Camera Control #
##################

class FlailCam:
    """
    Controls a USB webcam.

    Uses OpenCV to control the camera, and should therefore work with any
    V4L2-compatible camera (in linux) or just about anything (in Mac OS).
    Webcam support being what it is in Linux, though, it may not work at
    all. If you find yourself in the it-doesn't-even-begin-to-work boat,
    take a look at flailcam.py from FLAIL 1.0. It has a (terrible, kludgy,
    slow) workaround that sometimes worked better.

    The important method (get_image) returns a Python Imaging Library (or
    PIL) object; these are well-documented on the PIL website, at
    <http://www.pythonware.com/library/pil/handbook/image.htm>. You're
    probably most interested in the getpixel, putpixel, and save methods.
    FLAIL 1.0 had a FlailImage class, which wrapped the same PIL image
    class used here; if you'd like an extremely simplified interface, take
    a look at that code. If you'd prefer an OpenCV object, just take a
    look at the get_image source code, and you'll see what to do.
    """

    def __init__(self, index=0):
        """
        Connect to a camera.

        OpenCV (which we use for camera control) numbers these starting
        from 0; the index parameter tells which camera to use. In Linux,
        an index of X implies the device /dev/videoX.
        """
        self.cap = opencv.highgui.cvCreateCameraCapture(index)

    def get_image(self):
        """ Take a picture. Return it as a PIL image. """
        return opencv.adaptors.Ipl2PIL(opencv.highgui.cvQueryFrame(self.cap))
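For reference, a minimal way to use the class above (assuming the old opencv bindings it relies on are importable):

cam = FlailCam()           # camera 0
img = cam.get_image()      # PIL Image
img.save("frame.png")      # write out for inspection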
Here's the potential navigation plan, as discussed by Cyrus, Jack, and Corey on 5/6. Our plan is to break this general plan up, itemize it, edit it, and have each person claim part of it to work on; a rough control-loop sketch follows the list below. Email everyone with comments, suggestions, or new ideas.
- Wall Finding Algorithm
- Assume we start on the 3rd floor, on the right wall by the math department side. Use the wall-finding algorithm developed by Cyrus to navigate along the right wall. In the wall-finding loop, use the Door Recognition Algorithm to find classrooms.
- Door Recognition Algorithm
- Continuously take photos at regular intervals and analyze them for doors. Find doors by placing tape on the door jamb so it is noticeable from the direction the robot approaches. When a door is recognized, call the Door Navigation Algorithm.
- Door Navigation Algorithm
- Navigate some distance to get in front of door opening (maybe wait for the door tape to become a certain size before stopping?), then turn 90 degrees into the door, and go forward until in position. Then call Window Picture Algorithm (ask window team). After taking the picture and analyzing it, call Door Exit Algorithm.
- Door Exit and Re-Wall-Finding
- The robot goes back the same way it came (maybe following the same steps?). Once outside, it turns right 90 degrees, ignores the tape on that door (because it has already seen it and navigated through that classroom), navigates forward a predetermined distance, and then starts wall-finding again.
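A rough Python sketch of how these four pieces could fit together in one control loop. Every helper called here (follow_wall_step, detect_door_tape, tape_size, turn_degrees, drive_forward, drive_backward, drive_fixed_distance, take_window_picture) is a placeholder still to be written, not existing code, and the threshold value is made up:

# Sketch of the navigation plan above; all helpers are placeholders.
TAPE_SIZE_THRESHOLD = 5000        # px; made-up stand-in for "close enough"

def navigate_floor(robot, camera):
    while True:
        # Wall Finding Algorithm: hug the right wall.
        follow_wall_step(robot)

        # Door Recognition Algorithm: look for tape on the door jamb.
        tape = detect_door_tape(camera.get_image())
        if tape is None:
            continue

        # Door Navigation Algorithm: keep going until the tape looks big
        # enough, then turn 90 degrees into the doorway and drive in.
        while tape is not None and tape_size(tape) < TAPE_SIZE_THRESHOLD:
            follow_wall_step(robot)
            tape = detect_door_tape(camera.get_image())
        turn_degrees(robot, 90)
        drive_forward(robot)

        # Window Picture Algorithm (window team's code).
        take_window_picture(camera)

        # Door Exit and Re-Wall-Finding: retrace, turn right, skip this
        # door's tape, go a set distance, then resume wall-finding.
        drive_backward(robot)
        turn_degrees(robot, 90)
        drive_fixed_distance(robot)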
Notes
Jack and Duke met on 4/20 to discuss navigation. The consensus was that our first goal is to figure out how to design a virtual map of a space and write commands to get the robot to a target location in that space. Once we do that, we will work on mapping out Chambers, hopefully using the robot to gather data on the shape of our Chambers floor (which one are we going to do?) and turning that data into a map. We also discussed the possibility of using a traveling salesman problem model to get the robot to our target locations as quickly as possible.
Uninstall Python
- Remove framework
sudo rm -rf /Library/Frameworks/Python.framework/Versions/2.7
- Remove Applications Dir
sudo rm -rf "/Applications/Python 2.7"
- Remove symbolic links in /usr/local/bin
cd /usr/local/bin; ls -l . | grep '../Library/Frameworks/Python.framework/Versions/2.7' | awk '{print $9}' | xargs rm