Wanted: Movie theater projectionists with IT experience in network systems and client/server environments to operate digital cinema systems. Qualifications and requirements: systems analysis capabilities, a thorough working knowledge of e-business applications deployment, including implementation of a variety of IT systems in a computing and business environment. Please, no calls from people with obsolete traditional projectionist skills like “building” movies by physically splicing ads, trailers, PSAs and print reels into one large band of celluloid film stock and then threading it through an opto-mechanical projector while fixing presentation problems like static wraps, mis-frames, sound problems, etc. Successful candidates will be required to wear glasses on the job and must love the smell of popcorn.
This classified ad might not have run anywhere — yet — but it’s only a matter of time until it, or something similar, does. Digital projectors play a movie stored on a hard drive instead of on film, meaning the images don’t get scratched up and are generally brighter and steadier (though digital projection has been criticized for lacking the textured richness of conventional film). Digital projection also means server environments, so the projectionists who operate and maintain these systems must be equal parts geek and artiste.
Digital movies are still in the introduction stage of the business life cycle, but e-cinema — the digital distribution and exhibition of movies — is already in use in more than 1,600 (out of a total of 36,000) U.S. theaters.
E-cinema is now being sexed up by photorealistic 3-D technology in movies like “Journey to the Center of the Earth,” released last July. The film stars Brendan Fraser and is mostly live action, with only the landscape and creatures supplied by computer-generated graphics. “Journey 3-D” is projected using RealD Cinema technology, a format that made its debut in 2005 with the release of “Chicken Little,” a Disney CGI (computer generated imagery) animated feature film.
The entertainment industry and moviegoers eagerly await director James Cameron’s US$200 million 3-D blockbuster “Avatar,” currently set for a December 2009 release. The director plans to create photorealistic computer-generated characters through motion-capture animation technology using his new virtual camera system. The film was originally scheduled for release in May 2009 but was pushed back to allow more time for post-production on the complex photorealistic CGI — and to give theaters worldwide more time to install 3-D projectors.
3-D: The ‘Third Revolution’
Speaking at the 3-D Entertainment Summit held in Los Angeles earlier this month, DreamWorks Animation CEO Jeffrey Katzenberg called the transition from 2-D to 3-D in movies “the third revolution,” comparable in impact to the transformation of films in the 1920s and 1930s from silent to sound and from black-and-white to color.
“With ‘Avatar’ in the works and Disney and DreamWorks making 3-D product, this is no longer an experimental business,” said Bob Dowling, former editor-in-chief and publisher of The Hollywood Reporter and host of the conference. “It is going to happen inexorably, but what it needs now is just creative application. The caveat is that theaters have been significantly affected by the economic downturn, which has slowed the digital roll-out.”
At least 2,500 screens are needed for studios and other industry players to recoup their investments, Dowling told TechNewsWorld.
Global box office revenues for 3-D movies in 2008 totaled $240 million, 70 percent of which came from U.S. 3-D screens, according to Charlotte Jones, Screen Digest’s film and cinema senior analyst.
Seeing in 3-D relies on the concept of “stereopsis,” the binocular depth sense (literally, “solid seeing”). Stereoscopy is the art and science of creating images featuring that stereoptic depth sense. Actual 3-D imagery is produced by creating binocular disparity — delivering a slightly different two-dimensional image to each eye — which the brain then integrates into a single, three-dimensional image through stereopsis.
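The geometry behind stereopsis can be made concrete with the classic stereo triangulation formula: for two parallel viewpoints (eyes or cameras) separated by a baseline, nearer objects shift more between the two images than distant ones, and depth falls out of that disparity. The sketch below is illustrative only — the focal length, baseline, and disparity values are assumptions, not figures from the article.

```python
# Illustrative sketch of stereo triangulation, the geometry behind
# stereopsis. Two viewpoints separated by baseline B (meters), each with
# focal length f (pixels), see a point offset horizontally by disparity
# d (pixels); the point's depth is Z = f * B / d.

def depth_from_disparity(focal_length_px: float,
                         baseline_m: float,
                         disparity_px: float) -> float:
    """Distance (meters) to a point with the given horizontal disparity
    between the left-eye and right-eye images."""
    if disparity_px <= 0:
        raise ValueError("zero disparity means the point is at infinity")
    return focal_length_px * baseline_m / disparity_px

# A near object produces a much larger disparity than a far one.
# Assumed values: f = 1000 px, B = 65 mm (roughly human eye spacing).
near = depth_from_disparity(focal_length_px=1000, baseline_m=0.065, disparity_px=50)
far = depth_from_disparity(focal_length_px=1000, baseline_m=0.065, disparity_px=5)
print(near, far)  # about 1.3 m vs. about 13 m
```

The same relationship, run in reverse, is what lets stereoscopic filmmakers place objects in front of or behind the screen plane by controlling disparity.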
Depth — the quality of deepness that joins length and width to make a third visual dimension (a.k.a. the z-axis) — is big business and getting bigger. The 3-D Consortium (founded by Sharp, Itochu, Sanyo, Sony and others) has projected that in Japan alone, the 3-D stereoscopic market will grow to over $25 billion in the next five years.
3-D technology is being used today to save money and improve productivity and effectiveness in many areas, including computer-aided design, manufacturing and engineering; virtual prototyping; molecular modeling; minimally invasive surgery; petroleum exploration; and dozens of additional applications in military systems, robotics, interactive gaming, e-commerce, 3-D animation and, of course, movies.
The quest to provide moviegoers with a realistic, enjoyable 3-D movie experience has given rise to a business model that’s based on the development of methods that merge film making with information technology. Image capture using specialized cameras, satellite delivery of movies directly into theaters, digital projectors — the computer is transforming big-screen entertainment on all levels.
All movies start with the camera, and Burbank, Calif.-based 3ality Digital has established itself as a leading provider of digital 3-D offerings by bringing to market the first live action film shot entirely in digital 3-D (“U2 3-D”), the first transatlantic 3-D broadcast (Jeffrey Katzenberg’s interview at IBC) and the first scripted television show shot entirely in live digital 3-D (NBC’s “Chuck”).
Film producer and 3ality cofounder Steve Schklair created the company’s 3-D filming technology, intended as an inexpensive and effective way to shoot live events, such as concerts and sporting events, in 3-D. Motion control photography and real-time image processing create a realistic 3-D experience without subjecting the viewer to motion sickness.
3ality Digital utilizes separate and distinct hardware and software at various points throughout the production and post-production supply chain.
Running proprietary software with high-resolution digital cameras, 3ality is able to shoot true 3-D video by producing simultaneous 2-D images for each eye as shooting occurs. This raw 3-D imagery is then stored as digital data directly on clustered storage repository servers, where it is easily and immediately accessible online for 3ality’s post-production team.
“As for integration, all of our components produce metadata that other components can utilize,” said Howard Postley, 3ality Digital’s CTO. “For instance, production systems capture various physical metadata, such as focal distance, interaxial distance and convergence angle, and derive metadata such as depth. Our post production components can use this information for correction, compositing, etc.”
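To show how depth could be derived from the rig metadata Postley mentions, the sketch below computes the convergence distance — the depth at which the two camera axes cross, where an object appears at the screen plane — from interaxial distance and convergence angle. This is a generic geometric illustration assuming a symmetric “toed-in” rig, not 3ality’s actual software, and the numeric values are assumptions.

```python
import math

# Illustrative sketch (not 3ality's code): deriving convergence distance
# from two pieces of capture metadata named in the article -- interaxial
# distance and convergence angle. Assumes a symmetric toed-in rig where
# each camera rotates inward by half the total convergence angle, so the
# two optical axes meet at:
#   D = (interaxial / 2) / tan(convergence_angle / 2)

def convergence_distance(interaxial_m: float, convergence_deg: float) -> float:
    """Depth (meters) at which the two camera axes intersect."""
    half_angle = math.radians(convergence_deg) / 2.0
    if half_angle <= 0:
        return float("inf")  # parallel cameras converge at infinity
    return (interaxial_m / 2.0) / math.tan(half_angle)

# Assumed example: 65 mm interaxial, 1-degree total convergence angle.
print(round(convergence_distance(0.065, 1.0), 2))  # roughly 3.72 m
```

Metadata like this, recorded per shot, is what lets post-production tools reason about where objects sit in depth and correct or composite accordingly.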
3ality Digital camera systems are the only ones available on the market capable of capturing perfectly aligned 3-D images from any cameras or lenses, according to Postley.
“Our automated and rapid setup process measures and compensates for mechanical imperfections of cameras and lenses,” he told TechNewsWorld. “We have digital image processing that provides analytics, scopes, reporting and adjustment for most types of 3-D issues. Our camera platforms and image processors can be networked to be controlled as a single system, as opposed to a bunch of independent cameras.”
On the post side, Postley explained, while other systems are beginning to provide rudimentary tools that offer basic 3-D alignment capabilities, 3ality’s system already provides a comprehensive 3-D-aware toolset designed to allow creative freedom while correcting or preventing most common 3-D errors.
A 3-D Future
We’re still at an early stage in the development of the machinery for content production and delivery, according to David Seigle, president and CEO of In-Three, a 2-D-to-3-D conversion company based in Westlake Village, Calif.
“Studios and producers don’t yet have a reliable way of knowing the proper way to construct 3-D and what the appropriate cost should be,” Seigle told TechNewsWorld. “We are all learning that. Also, enough theaters to make a 3-D-only release economical are only now coming online. Finally, 3-D standards for TV have yet to be developed. All this will be behind us soon, and 3-D will become common.”
While 3-D activity on the big screens dominates the headlines, smaller screens on computers and mobile phones are also providing venues for 3-D activity.
Big Stage Entertainment, a Los Angeles-based media technology company, enables consumers to project themselves photo-realistically into the digital realm. Through a new program called “PortableYou,” Big Stage opened its proprietary 3-D facial modeling system to third parties for seamless integration into video games, virtual worlds, Web sites, mobile apps, kiosk-based systems and more. PortableYou enables any third party to offer a consumer experience that features an animated, 3-D likeness of a user to enhance everything from entertainment to communications and retail.
PortableYou was developed in a Web services environment to allow a user to submit three photos via e-mail to create a realistic 3-D head that is modeled to that individual’s unique specifications. Through PortableYou, 3-D clones are ready to be directly inserted into 3-D animation and modeling systems and environments.
“You can now insert yourself into a wide range of great content on BigStage.com,” said Big Stage CTO and cofounder Jon Snoddy. “We also now have film campaigns on the iPhone. If you download the Big Stage ‘The Spirit’ app for the iPhone, you’ll be able to see yourself in clips from the film, all in Frank Miller’s iconic style.
“We’re constantly refining our system to improve quality and its capabilities, and we can definitely envision the day when filmmakers can make an entire feature that stars the audience,” Snoddy told TechNewsWorld.
Click here to be notified when the next installment in this series is published.