By Keith Reid
There is tremendous interest today in the military applications of image analysis technology in such areas as remote sensing, automated targeting and automated navigation. The remote sensing challenges have already been covered. With the other applications, the challenge often involves dynamically changing backgrounds, moving platforms capturing the imagery, a lack of perspective control and the need for real-time analysis under harsh battlefield conditions. Since the output often results in life-or-death decisions, reliability has to be high.
"For us, the real challenge is conducting this image analysis in real-time," said Roger Joel, sales and marketing manager for OCTEC Limited, a U.K.-based company that focuses on image processing technologies for a range of military and security applications. "You can do image analysis in industrial machine vision types of applications fairly quickly, though rarely full frame rate because there isn't actually that much of the necessity to do that. The added challenge for us is to do image analysis in real-time and then get it into a piece of hardware that is sensible and manageable in a military environment."
On the navigation front, the Grand Challenge, hosted by the Defense Advanced Research Projects Agency (DARPA), has the goal of developing self-navigating logistics vehicles using a range of sensors (heavily weighted towards computer/machine vision) that can reduce manpower demands, along with the associated human risk and financial costs. In 2005, five teams completed the grueling 132-mile desert course, with the winning team from Stanford finishing in 6 hours, 53 minutes and 58 seconds to claim the $2 million prize.
Along similar lines, a recent DARPA and NASA project, the Autonomous Airborne Refueling Demonstration System, was undertaken by OCTEC and Sierra Nevada Corporation. It involved the automated air-to-air refueling of a NASA F/A-18B fighter jet, in anticipation of porting the technology to the automated refueling of unmanned aerial vehicles (UAVs). In the basic task, a refueling probe is inserted into a basket (called a drogue) at the end of a hose trailed by the tanker. The basket funnels the probe into a coupling that joins the two aircraft and allows refueling. The drogue and hose are typically moving to some degree in the aircraft's slipstream. This is considered one of the more challenging tasks for a human pilot, and a significant feat for an automated system. It was successfully accomplished on Sept. 13, 2006.
"This process involved two activities," said Joel. "The first was recognizing and finding the drogue, because depending upon the angle that the UAV approaches the tanker the drogue can quite often be obscured or cluttered by the background in the image. You have the tanker engines and fuselage and so on, most of which are of a circular nature, which makes looking for the circular drogue more difficult. Part of the activity was to recognize the drogue from different angles, but we did know what the drogue looked like before we started. The second goal was to work out where the drogue was in relationship to the probe as a positional error to navigate the UAV for a hook up. We were then able to recognize when the probe was fully inserted into the drogue by looking at the image of the probe and the drogue and how they related to each other so that we could confirm that to the pilot of the tanker."