The virtual replay visualizations, showing shots never seen before, impressed home viewers and sports television commentators alike. But the real imaging news (machine vision news, in fact) at the 2001 Super Bowl extravaganza in Tampa, FL, hit the headlines only after the game.
The image-processing-based 3-D replay technology, dubbed "EyeVision," was jointly developed by the virtual video advertising and video-processing specialists at Princeton Video Image (PVI) and a team from CBS Sports engineering. With 33 cameras placed around the field, in a manner conceptually similar to the approach used in making the sci-fi film The Matrix (July '99) but pushed to near-real-time use, camera angles on key plays could be reinvented at will to show the best view of the action.
CBS broadcast viewers saw the stunning effect about four times per half. After the touchdown kickoff return by the Baltimore Ravens' Jamal Lewis, for instance, the replay rotated through multiple angles of the action to show he had maintained possession of the football across the goal line. EyeVision can cut a 270-degree swath.
PVI (Lawrenceville, NJ), CBS, and digital asset management firm Core Digital Technologies have formed a joint venture to advance the technology from here. The big news outside Raymond James Stadium was a first that caught many by surprise and drew some protest: the hundred thousand fans and workers were caught on video, their faces digitized on the spot for analysis and matching against a Tampa Police Department mugshot database in a security command center on the stadium grounds (shown at left). There, local, state, and federal law-enforcement officers monitored the results in an effort to prevent terrorism or other major crime.