The basic principle behind virtual sets may not be as new as you think. Way back in 1958, The Old Man and the Sea, starring Spencer Tracy, became one of the first films to use a new blue-screen technique invented by Warner Brothers employee and ex-Kodak researcher Arthur Widmer. Although groundbreaking at the time, it involved the creation of a cumbersome series of female and male motion picture mattes (masks) to filter each image, which after many hours of skilful manipulation in post production were combined into a single image. An even older film technique is rear projection, devised by Farciot Edouart at Paramount Studios in 1933. Here studio actors performed in front of a 'projection tunnel' to fool audiences, with varying degrees of success, into believing they were part of a background scene shot earlier.
Both of these techniques have been enthusiastically embraced and developed by television. An electronic rather than optical filtering method called chromakey or 'colour separation overlay' is routinely used, notably on weather forecasts, to combine a foreground shot against a coloured screen with background graphics. This analogue system has since been replaced by digital compositing, although the terms 'chromakey' and 'CSO' remain in popular use.

The foreground screen
Presenters traditionally perform in front of a blue screen because skin contains very little blue, but the same (hopefully) applies to green, which has become popular as digital cameras capture more detail in the green colour channel. Less light is also needed, as the colour is brighter. Other colours are sometimes used where foreground objects such as models contain blue, green, or both.
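The colour-dominance idea behind chromakey can be sketched in a few lines of Python. This is an illustrative toy, not how any broadcast keyer actually works: real keyers produce soft-edged mattes and suppress colour spill, and the pixel values and dominance threshold below are invented for the example.

```python
# Toy chroma-key sketch: a pixel counts as green-screen background when
# its green value clearly dominates red and blue. The matte is binary
# here (0.0 background, 1.0 foreground); real keyers produce soft edges.

def matte_value(r, g, b, dominance=40):
    """Return 1.0 for foreground, 0.0 for keyable green background."""
    return 0.0 if (g - max(r, b)) > dominance else 1.0

def composite(fg_pixel, bg_pixel):
    """Replace keyed pixels of the foreground with the background plate."""
    alpha = matte_value(*fg_pixel)
    return tuple(round(alpha * f + (1 - alpha) * b)
                 for f, b in zip(fg_pixel, bg_pixel))

print(composite((200, 150, 120), (10, 10, 80)))  # skin tone: kept
print(composite((30, 220, 40), (10, 10, 80)))    # green screen: replaced
```

A real system would compute a fractional alpha near edges, which is exactly where the 'jaggies' discussed later come from when the key is poor.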
Conventional screens need a full lighting rig to minimise shadows, as these look like a different colour to software and may be misinterpreted as part of the foreground, resulting in an inaccurate key. A related technology designed to reduce lighting demands is Chromatte, a grey-looking retro-reflective cloth screen from Reflecmedia, first developed by the BBC. The company claims that its LiteRing LED cluster encircling the camera lens – available in blue or green – is the only lighting required, and light is not wasted on areas the camera cannot see. A matte is created in software in the usual way. A further technique is difference keying, where software compares a stock picture of the blank screen with those including the foreground, although this normally requires several processes in post production. Meanwhile Serious Magic, the US company named after sci-fi novelist and communications satellite pioneer Arthur C Clarke's assertion that "Any sufficiently advanced technology is indistinguishable from magic", manages to get away with a small, foldaway green screen for its Ultra 2 keying software, which can be used on location with relatively simple lighting.
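The difference-keying idea mentioned above, comparing each frame against a stored "clean plate" of the empty screen, reduces to a per-pixel change detector. The sketch below is a minimal one-scanline version; the tolerance value and the pixel data are assumptions for illustration, and real implementations need noise filtering and several post-production passes, as the article notes.

```python
# Difference-key sketch: pixels that differ from the clean plate of the
# empty screen by more than a tolerance are treated as foreground.

def difference_matte(frame, clean_plate, tolerance=30):
    """Per-pixel foreground mask (True = foreground) for one scanline."""
    mask = []
    for (r, g, b), (cr, cg, cb) in zip(frame, clean_plate):
        diff = abs(r - cr) + abs(g - cg) + abs(b - cb)
        mask.append(diff > tolerance)
    return mask

clean = [(20, 180, 30)] * 4                      # empty screen scanline
frame = [(20, 180, 30), (200, 150, 120),         # presenter covers the
         (198, 148, 118), (22, 178, 32)]         # two middle pixels
print(difference_matte(frame, clean))
```

Camera noise is why the tolerance exists: without it, every pixel would flicker between foreground and background from frame to frame.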
The company says this is achieved by using vector keying instead of traditional chromakey techniques: the software builds a mathematical model of the background colour to eliminate the wrinkles and colour variations that can be clearly seen during the shoot. After quickly drawing a garbage matte the user can apply this colour to the rest of the foreground scene, so the screen only has to cover the area actively occupied by the presenter, not the whole frame. According to Serious Magic, vector keying not only copes better with challenging keying conditions – uneven lighting, creased backdrops and fine hair, for example – but is faster, and can even deliver convincing results from reduced-bandwidth sources such as DV. Ultra 2 is priced at around US$495 and works with standard PCs running Windows 2000 or Windows XP. It can also read and write standard AVI, DV, HDV and HD video clips from a range of systems including Avid, Adobe Premiere, Apple Final Cut Pro and Sony Vegas. An as-yet unexploited BBC invention is depth keying, where a close object (such as a presenter) within range of a strobe light is identified as foreground (the strobe effect is subsequently removed), eliminating the need for any screen at all.

Background projection
Background projection has made a recent return as an alternative to the blue-screen technique, following the development of high resolution displays. At this year's NAB in Las Vegas, for instance, virtual set and graphics specialist Orad of Israel unveiled its new Maestro VRX graphics system on the Barco stand, using the Belgian company's large, high brightness Barco iPresent rear projection screen.
Described as a merging of Orad's Maestro on-air graphics system and its VRX cluster middleware technology for high resolution displays, the drag-and-drop Maestro VRX uses 3D and 2D graphic templates produced in Orad's 3Designer authoring software, which are filled with real time information either manually or from an external database. The system supports several projectors, with different graphics on each or the same image divided between them. A further boost at NAB for Orad's 3Designer came from VDS, the specialist in broadcast automation software, content design applications and plug-in products, which announced the release of a plug-in for 3Designer called Twister HD V7 – the next generation of Liberty Paint – on the Orad stand. Described as a paint, graphic content creation and workflow application designed specifically for broadcasters, Twister comes with browser utilities claimed to allow operators to move files between practically any related broadcast device, whether local or remote, open or proprietary. Features include support for third-party metadata formats for creating and updating searchable database information, and the ability to bring Adobe Photoshop layer files (among others) into 3Designer. The advantage of back projection is that the presenter can point to images directly without squinting at an off-camera monitor to check position. Also the background is physically blocked, eliminating both the ghoulish seep-through of background on clothing close in colour to the CSO screen, and the 'jaggies' on fine detail such as hair, where the CSO struggles to correctly define the foreground outline.

Motion tracking
To create a true virtual set it must be possible to change the perspective of the background in relation to the position of the foreground, and to achieve this the virtual set software needs to know not only the pan, tilt, focus and zoom settings of the foreground camera, but also its position in space. Several techniques have been developed over the years to deduce the camera position, including a coded mesh superimposed on the chromakey screen (Orad pattern recognition), a network of discs attached to the lighting grid in the ceiling bearing different concentric barcodes detected by a small vertical camera (Vinten Radamec Free-d), infrared proximity detectors, and even 'dead reckoning' wheel-turn counters on the studio pedestal. At NAB, InterSense of the US showed its IS-900 SCT camera tracking system, which could be seen working on the stands of several virtual production system manufacturers including Brainstorm, Dayang, FOR-A, Hybrid-MC, and vizrt. The system uses patented miniaturised precision motion sensors that detect movement through a hybrid of inertial and ultrasonic tracking. Inertial tracking uses sensors on the spindle of a gyroscope – which, as we all know, likes to stay in a fixed position – to detect the rate of movement and hence calculate the new position. This so-called 'accelerometer' arrangement is very good at responding to fast movement. However when the camera isn't moving, the position tends to drift due to the accumulated effect of even the tiniest eccentricity of the gyro. This is where the ultrasonic part comes in. Like a navy sonar – indeed the kit was originally developed as an SBIR (Small Business Innovation Research) project for the US Navy – it measures high-frequency audio reflections to determine the general position, overriding any spurious gyro readings when the camera is static.
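The inertial/ultrasonic trade-off described above is, in spirit, a sensor-fusion problem: trust the fast but drifting inertial estimate for responsiveness, and lean on the slow but absolute ultrasonic fix to cancel accumulated drift. The sketch below is a generic one-dimensional complementary filter, not InterSense's actual algorithm; the rates, blend weight and positions are invented for illustration.

```python
# One-dimensional complementary-filter sketch: dead-reckon from the
# inertial rate each step, then nudge the estimate towards the absolute
# ultrasonic fix so that drift cannot accumulate indefinitely.

def fuse(position, inertial_rate, ultrasonic_fix, dt=0.02, weight=0.98):
    """One filter step blending prediction with an absolute measurement."""
    predicted = position + inertial_rate * dt      # fast, but drifts
    return weight * predicted + (1 - weight) * ultrasonic_fix

# The camera is actually static at 1.0 m, but the gyro reports a small
# spurious rate. Pure dead reckoning would drift to 1.5 m over these
# 500 steps; the ultrasonic correction holds the estimate near 1.0 m.
pos = 1.0
for _ in range(500):
    pos = fuse(pos, inertial_rate=0.05, ultrasonic_fix=1.0)
print(round(pos, 3))
```

Commercial trackers use far more sophisticated estimators (typically Kalman filters in six degrees of freedom), but the division of labour between the two sensor types is the same.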
Orad also announced a cooperative development project at NAB with the US-based remote camera control systems specialist Telemetrics, which has already borne fruit with versions of its ProSet and SmartSet virtual studio systems featuring new remote-controlled tracking heads. Compatible with both HD and SD formats and multi-camera productions, the remote-controlled tracking heads are claimed to extract the pan, tilt, zoom and focus camera parameters to an accuracy of up to 800,000 interpolated counts. Using a programmable interface, the system has a dedicated panel controlling all camera movements, with a second operator monitoring the virtual studio operation. Orad says the system allows motorised camera movements in both the horizontal and vertical plane, using a heavy-duty belt-driven track system. In a recent installation at Splash Media in Texas, for instance, a ProSet – Orad's high-end virtual studio system powered by four of the company's HDVG (high definition video graphics) rendering platforms – was equipped with an 'H'-shaped track allowing the camera to move back and forth, up and down, and simultaneously back/forth and up/down, in addition to extracting the pan, tilt, zoom and focus parameters of the camera's movement. The company also reports strong sales growth for its Xync camera tracking system for virtual sets. Based on infrared tracking, Xync is claimed to offer free movement within the studio – including dolly, crane and handheld – and supports multi-camera production with just two frames of video delay. As well as up/down and side-to-side positioning, the system also detects depth from the subject, which can be used to determine whether the presenter is behind or in front of objects added to the scene, such as a desk.
Meanwhile Serious Magic has come up with an ingenious way of avoiding the complexities of camera tracking. Instead of measuring the position of a camera walked or wheeled across a large, intensively lit screen, the VirtualTrak technology inside Serious Magic's Ultra 2 keying software simply moves the background.

Ready-made sets
Serious Magic unveiled its latest Master Sets Library 4 at NAB, containing 12 new virtual sets for its Ultra 2 keyer and costing around US$395. All sets include multiple angles, virtual flying camera shots and places to insert additional video sources or graphics, and are claimed to take full advantage of Ultra 2's VirtualTrak technology, which can also insert convincing video reflections into virtual scenes.
Virtualsetworks also offers a series of Virtual Set Packs for standalone or built-in keyers. These include the Apple Shake compositor, Autodesk's Combustion compositing application, FOR-A DigiWarp EX live 2D motion-tracked sets, the Fusion 5 compositor from eyeon Software (which now includes an optional Ultimatte AdvantEdge plug-in), NewTek's Video Toaster 3 post production virtual chromakey, Globecaster's 'studio-in-a-box' Studio 4000, and the Ultra keyer from Serious Magic. The company says hundreds of royalty-free sets are available, each offering at least 12 angles, from just US$99. Incidentally, the veteran keying technology company Ultimatte Corporation has updated its Ultimatte AdvantEdge software for Mac OS X to version 1.6.2, adding compatibility with Apple Shake 4, Apple Final Cut Pro 5 and Adobe After Effects 7, plus 16-bit functionality in Adobe Photoshop. Ultimatte is also now shipping its realtime HD matte compositing system, which is claimed to be compatible with all digital HD and digital cinema 24p (24fps progressive scan) image standards.

Processing developments
As well as sensor-based tracking, Orad's CyberSport virtual set application for sports is able to extract tracking data from the video in real time, so that production staff don't even have to be present at the venue – an especially useful feature for host broadcaster feeds or syndicated material. Orad says the system can be installed in a studio miles away, with 3D graphics overlaid onto the down-streamed video signal in the normal way. CyberSport also provides a series of plug-ins offering features appropriate to particular sports. The soccer (football) plug-in, for instance, includes a 9m distance indicator, a dynamic and static offside line, team formations and logos, scores and stats, and a virtual superimposed Jumbotron display. Additional plug-ins are available for other sports including swimming, athletics, basketball, tennis, American football, baseball, and horse racing. Red Bee Media (formerly BBC Broadcast) has developed a technology called Piero, invented by BBC R&D, that places pictures of real players into a virtual stadium, allowing animated analysis from different viewpoints even if the play has not been captured at these angles. TV Globo in Brazil and Hong Kong Cable both bought the Piero system in time for the World Cup. To form an inclusive graphics package, Piero also features a capability to track players across the grass and place pointers, badges and scores on the pitch in 'live' video. As with systems from the likes of Orad and Vizrt, it can also place virtual advertising in real footage or within virtual stadiums. In June Apple delivered Shake 4.1, described as the first universal version of its industry-leading compositing software, and cut the price from US$2,999 to just US$499. The company says the new release means that Final Cut Studio editors can now take advantage of Shake for sophisticated 3D compositing, keying, image tracking and stabilisation.
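Graphics such as an offside line can be anchored to the pitch because the playing surface is (near enough) a plane, so a single 3x3 homography matrix maps pitch coordinates in metres to screen pixels. The sketch below applies such a mapping in pure Python; the matrix values are made-up assumptions for illustration, whereas a real system like CyberSport estimates the mapping from pitch markings detected in the video.

```python
# Planar-homography sketch: H maps pitch-plane points (x, y) in metres
# to screen pixels via homogeneous coordinates (u, v, w) -> (u/w, v/w).

def project(H, x, y):
    """Apply homography H to pitch point (x, y); return screen pixels."""
    u = H[0][0] * x + H[0][1] * y + H[0][2]
    v = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return (u / w, v / w)

H = [[10.0, 0.0, 320.0],   # illustrative pitch-to-screen mapping,
     [0.0, -2.0, 400.0],   # NOT calibrated from any real camera
     [0.0, 0.01, 1.0]]

# An offside line across the pitch at y = 30 m: project both endpoints
# and draw a screen-space line between them.
print(project(H, 0.0, 30.0), project(H, 40.0, 30.0))
```

Because the line lies at constant pitch depth, both projected endpoints share the same screen height under this toy mapping; with a real camera matrix the line would also acquire the correct perspective slant.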
Meanwhile the size of the hardware needed to process and combine the images has fallen dramatically in recent years. When Brainstorm Multimedia – the Spain-based graphics innovator part-owned by FOR-A of Japan – introduced one of the first virtual sets back in 1993, it needed a fridge-sized sgi Indigo computer wheeled around on heavy-duty casters. Today its latest eStudio V10 software runs on a high performance Windows PC. Similarly the Viz|Virtual Studio from Israel-based virtual set pioneer Vizrt can run on a Windows NT PC or sgi (Silicon Graphics Inc) Onyx, and when used in combination with Panasonic's AV-CGP500 processing unit is claimed to provide the only practical HD virtual set system on the market. Capable of rendering high resolution backgrounds containing some 40,000 polygons per field (or 64,000 with the optional AV-CGP50M01 3D processor board), the Panasonic AV-CGP500 HD platform occupies just two units of rack space, making it small enough for use in limited-function studios or OB vans. The system also supports virtual set software from Kaydara (now part of Autodesk Alias) and Brainstorm, and is claimed to use one-tenth of the electricity of conventional systems. At the heart of Vizrt's realtime graphics systems, including Viz|Virtual Studio, is Viz|Engine. Described as a powerful rendering engine, it was an early adopter of the OpenGL protocol introduced in 1992, which has since become the industry-standard application programming interface (API) for graphics. This, says the company, makes Viz|Engine one of the few 3D graphics systems that can run the same graphics content on multiple hardware platforms and operating systems, including Windows XP/2000, Linux, and sgi Irix. The HD version of Viz|Engine runs on a non-proprietary, rack-mountable Windows PC with an NVIDIA QuadroFX 4400/SDI graphics card, as well as the specialised PC platform from Panasonic.
Viz|Engine systems support up to four live video inputs, which can be embedded in any graphics animation or virtual studio set, and performance can be increased by linking multiple systems together in a cluster.