The true purpose of electronic imaging is to convert data into useful information. It doesn't matter how big your pixel array is if you can't get the data off the imager and processed into actionable information quickly enough for your application. With today's ultrafast, über-high-resolution cameras, the limiting factor is often the speed of data transfer between the imager and the PC or PLC that uses the data. With the release of the new FireWire 1394 S3200 and USB 3.0 standards, however, such bottlenecks may become a thing of the past. S3200 operates at 3.932 Gb/second, a fourfold boost over the current S800 FireWire version, whereas USB 3.0 offers 5 Gb/second, 10 times the speed of USB 2.0.
Of course, faster isn't necessarily better, unless it improves performance or enables new applications. Gigabit Ethernet cameras run at, well, one Gb/second, while CameraLink provides point-to-point data transfer at two Gb/second. Does electronic imaging really need anything faster? Does the imaging community stand to benefit from the new standards? Let's take a closer look.
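To put those rates in perspective, here is a back-of-the-envelope comparison of maximum frame rates, assuming a hypothetical 5-Mpixel, 8-bit monochrome camera and ignoring protocol overhead; the figures are the bus speeds quoted above, not vendor benchmarks.

```python
# Illustrative comparison only: how fast could each interface move frames
# from a hypothetical 5-Mpixel, 8-bit camera? Real payload rates are lower
# once protocol overhead is included.

BITS_PER_BYTE = 8
FRAME_BYTES = 5_000_000  # 5 Mpixels x 1 byte/pixel (assumed example camera)

# Approximate usable data rates in bits/second. The FireWire entries apply
# the 8b/10b factor (only ~80% of line-rate bits carry data).
interfaces = {
    "GigE Vision":    1.0e9,
    "CameraLink":     2.0e9,
    "FireWire S800":  0.8 * 983.04e6,
    "FireWire S3200": 0.8 * 3.932e9,
    "USB 3.0":        5.0e9,
}

for name, rate in interfaces.items():
    seconds_per_frame = FRAME_BYTES * BITS_PER_BYTE / rate
    print(f"{name:15s} {1 / seconds_per_frame:6.1f} frames/s")
```

Even at these idealized rates, a gigabit link tops out at a few tens of frames per second for a sensor this size, which is exactly the bottleneck the new standards target.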
Burning up the wire
FireWire, the IEEE 1394 standard, was first released in 1995 and included data transfer rates for the now-glacial S100 (100 Mb/second), S200 (200 Mb/second), and S400 (400 Mb/second). At that point, FireWire was based on data-strobe encoding (alpha signaling), in which the bit rate and data transfer rate are equal. The 2002 update, 1394b, included the S800 (983.04 Mb/second) specification, as well as nominal specifications for S1600 (1.966 Gb/second) and S3200. The most recent 1394-2008 standard combines all previous releases and includes complete definitions of S1600 and S3200.
In an effort to minimize line charging, the 1394b release switched from alpha signaling to 8b/10b encoding (beta signaling), in which each 8-bit word is actually transmitted as a 10-bit symbol. In other words, only 80 percent of the bits on the wire carry data, which is why the usable data rate of roughly 3.2 Gb/second is 20 percent less than the bus speed of 3.932 Gb/second. Today's S800 1394b PHY silicon is backward compatible and supports both alpha and beta signaling on each port, making the ports "bilingual." "Given the difficulty of achieving four Gb/s, it's thought that S3200 PHY ports will be beta only (S800, S1600, and S3200) or alpha only (S100, S200, and S400), thus creating a bilingual PHY," says Richard Mourn, president and founder of Quantum Parametrics (Colorado Springs, Colo.). The approach allows users to achieve backward compatibility as desired while leveraging the higher speed of S3200.
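The 8b/10b bookkeeping is simple enough to sketch in a few lines; the rates are those quoted above, and the function is purely illustrative, not part of any standard.

```python
# Under 8b/10b (beta) signaling, every 8 data bits travel as a 10-bit
# symbol, so the payload rate is 8/10 of the line (bus) rate.

def payload_rate(line_rate_bps: float) -> float:
    """Effective data rate under 8b/10b encoding."""
    return line_rate_bps * 8 / 10

s3200_line = 3.932e9                 # S3200 bus speed, bits/second
print(payload_rate(s3200_line))      # ~3.15e9, i.e. the ~3.2 Gb/s data rate
```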
The shift from S800 to S3200 requires only faster silicon (PHY and link) and very small changes to software; the cable and connectors remain unchanged. Unlike CameraLink, which is point-to-point, and USB, which is based on a master-slave topology, FireWire is a logical bus; as such, it allows multiple devices to share the bus, minimizing wiring. It also offers topological flexibility: devices can not only be arranged in tree configurations but also daisy-chained together or even connected in loop topologies that provide functional benefits such as redundancy and hot swapping.
Challenges do arise when it comes to linking the FireWire camera network to a communications network. FireWire supports the Internet Protocol (IP-1394), but most cameras today implement the Instrumentation and Industrial Digital Camera (IIDC) protocol and use 1394's isochronous facility. As a result, the video streams are not inherently compatible with LAN or Internet communications in the way that GigE Vision output is. Sending image data from a camera network to the shop-floor network requires a computer to convert the data into IP packets and send them. Then again, that process has to take place somewhere, whether in the PC or in the camera, as with GigE Vision. "It's a question of where you want to push the problem," says Mourn. "The [GigE Vision] camera must put the data into an IP-type packet that's not isochronous and not real-time, so it has to buffer more." Such tasks require processing power and memory, which drive up cost. "Do you want to push the problem clear out into the camera and make devices more expensive, or would you rather put the complexity into one host that's probably going to be interfacing with the Internet, anyway?"
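The packetization cost Mourn describes can be roughed out as follows. The snippet counts the IP/UDP packets and header bytes needed to move one frame over standard Ethernet; the header and MTU sizes are standard values, the 5-Mpixel frame is an assumed example, and real streaming protocols such as GigE Vision's add their own headers on top.

```python
# Back-of-the-envelope packetization cost for one image frame over
# IPv4/UDP on standard Ethernet. Whoever does this work (camera or host)
# must hold the frame in memory while slicing it into packets.

MTU = 1500                # standard Ethernet payload bytes per Ethernet frame
HEADERS = 20 + 8          # IPv4 (20 bytes) + UDP (8 bytes) headers
PAYLOAD = MTU - HEADERS   # image bytes per packet

frame_bytes = 5_000_000   # hypothetical 5-Mpixel, 8-bit image
packets = -(-frame_bytes // PAYLOAD)     # ceiling division
overhead_bytes = packets * HEADERS

print(packets, overhead_bytes)           # ~3400 packets per frame
```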
Where's the beef?
Although S3200 has been defined, no companies are currently shipping silicon, nor were those contacted for this article willing to say when they might. The best estimates from industry sources are perhaps late 2009, which could allow S3200 cameras to be demonstrated in early 2010.
So far, S1600 FireWire seems to be somewhat lost in the shuffle. The 1394b update defining the 1.6 Gb/second PHY layer was released in 2006, but so far only a handful of chips, and even fewer cameras, have been released. S800 has continued to be the workhorse, and with the release of S3200, it appears likely the industry will simply leapfrog over S1600 to future-proof systems with S3200. "The problem is that compared to S800, S1600 doesn't offer so many advantages, so there is a need to make the big step and go directly to S3200," says Michael Scholles, business unit manager for sensor and actuator systems at the Fraunhofer Institute for Photonic Microsystems (Dresden, Germany).
S1600 may find use in some niche applications, for example, when integrators need a few more meters of cable run than S3200 can provide. "If you look at a lot of the semiconductor companies out there that can do four or five gigs, they're doing it over shorter distances," says Mourn. "We want to be able to do that speed over a 4.5-meter cable. It's yet to be seen whether they can do that by the fall timeframe. It may well be S1600 in the fall."