Sensor fusion applications are particularly difficult when a sensor pod is located remotely relative to the sensor fusion engine. In applications where weight, bandwidth, or electromagnetic interference is a concern, it is desirable to put the sensor data onto a single high-speed fiber optic link. This tech note describes how the new ARINC 818 protocol can be used to time multiplex three different sensor outputs onto a single fiber for transmission to the sensor fusion processing engine. ARINC 818 is a new video interface and protocol standard developed for high-bandwidth, low-latency, uncompressed digital video transmission. The advantage of ARINC 818 over other protocols, such as GigE, is that ARINC 818 is deterministic and can meet the stringent timing demands of line- or pixel-synchronous displays, in addition to having built-in mechanisms for multiplexing video.
ARINC 818 is a point-to-point, 8B/10B-encoded serial protocol for transmission of video, audio, and data. The protocol is packetized, but it is video-centric and very flexible, supporting an array of complex video functions. Figure 1 shows an example of how an Extended Graphics Array (XGA) image is packetized using ARINC 818. The sum of all Fibre Channel frames required to transport an image is referred to as an ARINC 818 container.
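The packetization above can be illustrated with a short sketch. This is not an implementation of the ARINC 818 wire format; it simply assumes one video line per ADVB frame payload (the actual line-to-frame mapping and payload size are defined by each project's interface control document), and it models frames as plain dictionaries:

```python
# Sketch: packetizing one XGA (1024x768, 24-bit RGB) image into an
# ARINC 818 container. Assumes one video line per ADVB frame payload;
# the real mapping is project-specific. Frame dicts are illustrative.

WIDTH, HEIGHT, BYTES_PER_PIXEL = 1024, 768, 3

def packetize(image_bytes, source_id):
    """Split a raw image into an ordered list of ADVB-style frames."""
    assert len(image_bytes) == WIDTH * HEIGHT * BYTES_PER_PIXEL
    line_size = WIDTH * BYTES_PER_PIXEL  # 3072 bytes per video line
    # Container starts with an ancillary/header ("object 0") frame.
    container = [{"s_id": source_id, "type": "object0"}]
    for line in range(HEIGHT):
        payload = image_bytes[line * line_size:(line + 1) * line_size]
        container.append({"s_id": source_id, "line": line, "payload": payload})
    return container

frames = packetize(bytes(WIDTH * HEIGHT * BYTES_PER_PIXEL), source_id=0x01)
# 1 header frame + 768 video-line frames = 769 frames in the container
```

With one line per frame, an XGA image yields 768 video frames plus the header frame, so the whole container is 769 Fibre Channel frames on the link.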
ARINC 818 for Sensor Fusion
ARINC 818 includes mechanisms for multiple video streams to be time multiplexed on a single link. This is exploited in sensor fusion applications where several video sources are synthesized to achieve a superior composite image. For instance, long wave IR, short wave IR, and visible spectrum video are synthesized for better day/night and all weather vision. For this application, ARINC 818 can be used as a single fiber optic transport from the sensor pod to image processing equipment.
Figure 2 demonstrates how ARINC 818 frames can be interleaved on the link. This example assumes that all video sources are frame locked within the sensor pod, so their frame timings are aligned. In this case, the ARINC 818 container rates are the same and all three containers are transported within a single frame time.
The three sensors produce three separate ARINC 818 containers on the link. These containers are uniquely identified by the source identification (S_ID) field within each ARINC 818 frame header. The three video sources are multiplexed onto the fiber one video line at a time. The ARINC 818 receiver can then demultiplex the streams based on the source ID, reassemble the video from each sensor in real time, and synthesize them into a single fused image.
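The receive-side demultiplexing step can be sketched as follows. The S_ID values and dictionary-based frames here are illustrative assumptions, not values from the ARINC 818 specification; the point is only that grouping on the source ID recovers each sensor's lines in order:

```python
# Sketch: demultiplexing line-interleaved frames from three sensors
# back into per-sensor line streams, keyed on the S_ID field in each
# frame header. S_ID values are hypothetical.

LWIR, SWIR, VISIBLE = 0x01, 0x02, 0x03

def demultiplex(link_frames):
    """Group interleaved frames into one ordered line list per source."""
    streams = {LWIR: [], SWIR: [], VISIBLE: []}
    for frame in link_frames:
        streams[frame["s_id"]].append(frame["payload"])
    return streams

# Simulate one frame time: three sources interleaved a line at a time,
# as in Figure 2.
link = []
for line in range(3):
    for sid in (LWIR, SWIR, VISIBLE):
        link.append({"s_id": sid, "payload": f"sensor{sid}-line{line}"})

streams = demultiplex(link)
# Each stream now holds its own sensor's lines in order, ready to be
# fused into a composite image.
```

Because the interleaving is a line at a time, each per-source stream arrives in line order and can be fed to the fusion engine with only a single line of buffering per source.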