Advanced Imaging Magazine

Updated: January 12th, 2011 09:49 AM CDT

Exploring Mars in Utah

Student competition combines real-time video streaming and remote robotic exploration
The UCLA team checks out the Rover on “Mars.” (Photo: Point Grey Research)

A close look at the UCLA Rover showing where the Point Grey Dragonfly2 cameras are attached. (Photo: Point Grey Research)

The UCLA Rover operates on a client-server architecture, with the Rover acting as the server. Interfacing with the server program through control subroutines are the serial motor controller, Point Grey Research (Richmond, BC, Canada) Dragonfly2 cameras, and a number of microcontrollers used for sensor integration. (Photo: Point Grey Research)

By Barry Hochfelder

Navigation is primarily visual, with two Dragonfly2 cameras (one color and one black and white) providing streaming video at an average rate of 7 fps over a 900 MHz radio link and 12 fps over 802.11b. The software also allows for static pictures. To save bandwidth, only one camera is active at a time, with the ability to swap among any number of attached cameras (up to the maximum supported by the onboard 1394 hardware).
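A rough back-of-envelope sketch shows why streaming one camera at a time matters. The resolution below is the Dragonfly2's; the frame rates come from the article; the assumed 8-bit mono frames, the ~1 Mbit/s figure for the 900 MHz radio, and the nominal 11 Mbit/s for 802.11b are illustrative assumptions, not the team's stated numbers.

```python
# Back-of-envelope bandwidth estimate: even a single uncompressed camera
# stream overwhelms the wireless links, so frames must be compressed or
# downscaled, and only one camera streams at a time.

WIDTH, HEIGHT = 1032, 776          # Dragonfly2 full resolution
BYTES_PER_PIXEL = 1                # assumption: 8-bit mono; color costs more

def raw_mbit_per_s(fps, bytes_per_pixel=BYTES_PER_PIXEL):
    """Uncompressed video bandwidth in megabits per second."""
    return WIDTH * HEIGHT * bytes_per_pixel * fps * 8 / 1e6

# Link capacities are nominal maxima (assumed); real throughput is lower.
links = {
    "900 MHz radio (assumed ~1 Mbit/s)": (7, 1.0),
    "802.11b (11 Mbit/s nominal)": (12, 11.0),
}

for name, (fps, capacity) in links.items():
    need = raw_mbit_per_s(fps)
    print(f"{name}: {fps} fps raw needs {need:.1f} Mbit/s "
          f"(~{need / capacity:.0f}x link capacity)")
```

At 1032 x 776, even 7 fps of raw 8-bit video is tens of megabits per second, which is why the bandwidth budget drives the one-active-camera design.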

“We use the libdc1394 camera library to interface with the Dragonfly2 and take advantage of the bitmap image format to allow for a custom visual spectroscopy program,” says Boggeri, who will enter his senior year in aerospace engineering. Using an IR-pass lens on the black-and-white camera and the red, green and blue color channels available on the color camera, the team created a program to perform visual spectroscopic measurements of a target to identify its composition and the error factor associated with the measurements.

“That gives us a fourth channel, so we get RGB and IR responses,” he adds. “The program calibrates its readings using the published data for the CCD sensor spectral response against the spectral response of a Spectralon target. (Spectralon is a thermoplastic resin that can be machined into a wide variety of shapes for the fabrication of optical components.) The IR response is pretty accurate. We compare our rough spectral measurements against known values for rocks and minerals. We can say it’s gypsum, or that there’s a high concentration of copper in a rock.”
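The calibrate-then-match idea Boggeri describes can be sketched as follows. This is not the team's code: the four-channel reflectance values and the two-entry material library are made-up illustrations, and the real system calibrates against published CCD spectral-response data rather than a single scalar per channel.

```python
# Sketch of 4-channel (R, G, B, IR) visual spectroscopy:
# 1) normalize target readings by a near-ideal Spectralon reference,
# 2) score candidate materials by sum-squared error ("error factor").
# All reflectance numbers are hypothetical.

def calibrate(target_raw, spectralon_raw):
    """Normalize raw channel readings by the Spectralon reference readings."""
    return [t / s for t, s in zip(target_raw, spectralon_raw)]

def match(signature, library):
    """Return (best-matching material, its sum-squared error)."""
    def sse(ref):
        return sum((a - b) ** 2 for a, b in zip(signature, ref))
    best = min(library, key=lambda name: sse(library[name]))
    return best, sse(library[best])

# Hypothetical per-channel reflectances (R, G, B, IR) for two materials:
library = {
    "gypsum":     [0.85, 0.86, 0.84, 0.80],
    "copper ore": [0.45, 0.30, 0.25, 0.55],
}

sig = calibrate([0.82, 0.83, 0.81, 0.77], [0.98, 0.98, 0.98, 0.97])
name, err = match(sig, library)   # small err -> confident identification
```

The residual error doubles as the measurement's confidence figure: a large minimum error means no library entry fits well.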

One problem the UCLA team ran into this year was underestimating the resolution needed for the visual range-finding mapping task. The target was white PVC pipes of about 10 cm (3.94 inches) in diameter and one to two meters (3.28-6.56 feet) high, with a white 15 x 30 cm (5.9 inches x 11.8 inches) flag attached. Teams had to survey the markers from about 0.8 km (875 yards), then determine the absolute map coordinates of the PVC pipes. “We weren’t able to find a zoom lens,” Boggeri explains. “We based it on the 1032 x 776 resolution of the Dragonfly2, but we miscalculated the size of the markers.

“We knew the absolute dimensions of the marker. With a standard lens, the marker’s apparent size was only one pixel, not enough to carry out an accurate calculation of its distance from the Rover. There are two solutions possible for next year’s competition: one is to go to a higher resolution camera, the other to put a zoom lens on the camera. It narrows the field of view, but you get greater resolution at a distance. I’ve discovered that it’s hard to find zooms for a C-mount camera. And they’re very expensive.”
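The trade-off Boggeri describes falls out of the pinhole camera model: a marker of width w at range d subtends roughly f * w / d pixels, where f is the focal length expressed in pixels. The marker size, range, and sensor width below are from the article; both field-of-view values are assumptions for illustration, since the article does not state the lens used.

```python
import math

# Pinhole model: pixels subtended = focal_px * size / distance.
# Narrowing the field of view (zooming) raises focal_px, trading
# coverage for resolution at a distance.

WIDTH_PX = 1032                 # Dragonfly2 horizontal resolution

def focal_px(hfov_deg, width_px=WIDTH_PX):
    """Focal length in pixels for a given horizontal field of view."""
    return (width_px / 2) / math.tan(math.radians(hfov_deg) / 2)

def apparent_px(size_m, dist_m, hfov_deg):
    """Approximate pixels subtended by an object of size_m at dist_m."""
    return focal_px(hfov_deg) * size_m / dist_m

# Assumed ~45-degree standard lens: a 10 cm pipe at 800 m subtends
# around a pixel or less -- too little for a distance estimate.
px_standard = apparent_px(0.10, 800.0, 45.0)

# Assumed 5-degree zoom: the same pipe covers several times more pixels.
px_zoom = apparent_px(0.10, 800.0, 5.0)
```

Running the numbers this way also answers the "higher resolution or zoom lens" question quantitatively: pixels on target scale linearly with sensor width but roughly with 1/tan(FOV/2), so a modest zoom buys more than a modest resolution bump.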
