By Leonard A. Hindus
Scientists at MBARI and collaborative researchers use information from the VARS database to investigate individual species and the relationships between them. They document such things as depths, spatial relationships, seasonal occurrences, seawater conditions and diversity of organisms in Monterey Bay waters. Scientists can also use timecode-referenced database queries to return to the archived videotapes for more comprehensive analysis of organisms, to view geological features, to locate deep-sea equipment, or for other research.
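A timecode-referenced query of this kind can be sketched in a few lines. The table layout below (tape identifier, SMPTE timecode, concept name, depth) and the function name `locate_on_tape` are illustrative assumptions, not the actual VARS schema:

```python
# Hypothetical sketch of a timecode-referenced annotation query.
# The schema and sample rows are assumptions for illustration only;
# they are not the real VARS database layout.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE annotation (
    tape_id   TEXT,   -- archived videotape identifier
    timecode  TEXT,   -- SMPTE timecode into that tape
    concept   TEXT,   -- annotated species or feature
    depth_m   REAL)   -- depth at time of observation
""")
conn.executemany(
    "INSERT INTO annotation VALUES (?, ?, ?, ?)",
    [("T0291", "01:12:33:10", "Nanomia bijuga", 412.0),
     ("T0291", "01:48:02:22", "Nanomia bijuga", 388.5),
     ("T0307", "00:05:17:01", "Chiroteuthis calyx", 610.2)])

def locate_on_tape(concept, max_depth):
    """Return (tape, timecode) pairs so a researcher can cue the archive."""
    rows = conn.execute(
        "SELECT tape_id, timecode FROM annotation "
        "WHERE concept = ? AND depth_m <= ? ORDER BY tape_id, timecode",
        (concept, max_depth))
    return rows.fetchall()

hits = locate_on_tape("Nanomia bijuga", 500.0)
```

The timecodes returned let a researcher jump straight to the relevant footage instead of scanning whole tapes.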
MBARI is now developing technology to visually process images for event detection and for recognition of target biological species. The system, which is called AVED (Automatic Visual Event Detection), will use neural network technology to locate and identify animals and features.
"Annotation is very intense work by highly trained people," says Stout. "The annotator has to scan through the tape to locate an animal or feature. They have to identify that feature or classify that animal, enter that information with relevant comments into the knowledge base and then scan for the next animal or feature." Sometimes there are many animals in the frame. Each one must be annotated and entered into the knowledge base with comments on its relationship to the others. Says Stout: "Because of fatigue and eyestrain, we don't allow our annotators to work more than four hours a day. Even a system that did no more than scan the tape and locate animals or features would be a great help."
MBARI is developing an automated system for detecting marine organisms visible in the videos. Video frames are processed with a neuromorphic selective-attention algorithm, and the candidate objects of interest are then tracked across video frames using linear Kalman filters. If an object can be tracked successfully over several frames, it is labeled as potentially "interesting" and marked in the video frames. The plan is for the system to enhance the productivity of human video annotators and, by marking candidate objects, to cue a subsequent object-classification module.
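The track-then-label step can be sketched as follows. This is a minimal illustration of the general technique, not MBARI's implementation: a constant-velocity linear Kalman filter per axis, nearest-neighbor matching with a distance gate, and a persistence threshold before an object is marked "interesting". All names and thresholds (`Track`, `MIN_FRAMES`, `GATE`) are assumptions.

```python
# Sketch: tracking candidate detections across frames with linear Kalman
# filters; objects tracked for MIN_FRAMES frames are marked "interesting".
# Illustrative only -- not the AVED code. Process noise is simplified to q*I.

MIN_FRAMES = 5   # frames an object must persist before it is "interesting"
GATE = 15.0      # max pixel distance for matching a detection to a track
DT = 1.0         # frame interval

class AxisKF:
    """1-D constant-velocity Kalman filter with position-only measurements."""
    def __init__(self, pos, q=1.0, r=4.0):
        self.x = [pos, 0.0]                  # state: [position, velocity]
        self.P = [[10.0, 0.0], [0.0, 10.0]]  # state covariance
        self.q, self.r = q, r                # process / measurement noise

    def predict(self):
        x, v = self.x
        self.x = [x + DT * v, v]
        p00, p01 = self.P[0]; p10, p11 = self.P[1]
        # P = F P F^T + Q for F = [[1, DT], [0, 1]], Q = q*I (simplified)
        self.P = [[p00 + DT*(p01 + p10) + DT*DT*p11 + self.q, p01 + DT*p11],
                  [p10 + DT*p11, p11 + self.q]]
        return self.x[0]

    def update(self, z):
        # Measurement model H = [1, 0]; innovation y = z - predicted position
        y = z - self.x[0]
        s = self.P[0][0] + self.r            # innovation covariance
        k0, k1 = self.P[0][0] / s, self.P[1][0] / s   # Kalman gain
        self.x = [self.x[0] + k0 * y, self.x[1] + k1 * y]
        p00, p01 = self.P[0]; p10, p11 = self.P[1]
        self.P = [[(1 - k0) * p00, (1 - k0) * p01],
                  [p10 - k1 * p00, p11 - k1 * p01]]

class Track:
    """One tracked candidate object: an independent filter per image axis."""
    def __init__(self, x, y):
        self.kfx, self.kfy = AxisKF(x), AxisKF(y)
        self.hits = 1

    def predict(self):
        return self.kfx.predict(), self.kfy.predict()

    def update(self, x, y):
        self.kfx.update(x); self.kfy.update(y)
        self.hits += 1

def track_frames(frames):
    """frames: per-frame lists of detected centroids [(x, y), ...].
    Returns tracks that persisted for at least MIN_FRAMES frames."""
    tracks, interesting = [], []
    for dets in frames:
        preds = [t.predict() for t in tracks]
        unmatched = list(dets)
        for t, (px, py) in zip(tracks, preds):
            best = min(unmatched,
                       key=lambda d: (d[0]-px)**2 + (d[1]-py)**2,
                       default=None)
            if best and ((best[0]-px)**2 + (best[1]-py)**2) ** 0.5 < GATE:
                t.update(*best)
                unmatched.remove(best)
        for (x, y) in unmatched:          # unmatched detections start tracks
            tracks.append(Track(x, y))
        for t in tracks:                  # persistence test: mark once
            if t.hits == MIN_FRAMES:
                interesting.append(t)
    return interesting
```

An organism drifting steadily through the field of view is matched frame after frame and flagged, while one-frame noise detections spawn tracks that never reach the persistence threshold.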
The goal is to identify an organism autonomously in real time. The system will use state-of-the-art feature- and motion-detection chips: silicon implementations of the selected algorithms, modeled after biological vision systems. These neuromorphic devices offer ultra-low power consumption, large dynamic range and intrinsic real-time image processing and feature detection, with greatly reduced data-storage requirements. The science applications focus on mid-water and deep-ocean animals, including animals that use bioluminescence in the deep sea.