The main reason for this has to do with resolution. The resolution of the Digital Light Processing projectors that we saw for Star Wars Episode 2 last year was 1280x1024. This had two problems. Firstly, it is much lower than the best resolution of 35mm film, which has the equivalent of perhaps 2000 lines rather than 1024, so the image appears grainier than a conventional film print. Secondly, 1280x1024 is a PC resolution, not a film and television resolution. For one thing, the aspect ratio is wrong. To fill a movie screen, the image produced by a 1280x1024 projector has to be stretched horizontally, so that the pixels end up nearly twice as wide as they are tall. This again is not ideal. The best resolution for high definition television, which is also used by the best professional digital video cameras, is 1920x1080. This produces pixels of approximately the correct shape (although they are still stretched a little for Cinemascope 2.35:1 aspect ratios). Plus, as was done with Star Wars, if you shoot a movie with one of these cameras and then project it at 1280x1024, converting from 1080 lines vertically to 1024 introduces scaling artefacts and further distortion.
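To put rough numbers on the pixel-stretching point, here is a small back-of-the-envelope sketch (my own arithmetic, not anything from TI or the article) showing how far each chip's pixels have to be stretched to fill common screen shapes:

```python
# Back-of-the-envelope calculation (my own illustration): how much each
# projected pixel must be stretched horizontally for a given chip
# resolution to fill a screen of a given aspect ratio.

def pixel_stretch(chip_w, chip_h, screen_aspect):
    """Width-to-height ratio of each projected pixel when the full chip
    is stretched to fill a screen of the given aspect ratio."""
    chip_aspect = chip_w / chip_h
    return screen_aspect / chip_aspect

for chip_w, chip_h in [(1280, 1024), (1920, 1080), (2048, 1080)]:
    for screen in (1.85, 2.35):
        stretch = pixel_stretch(chip_w, chip_h, screen)
        print(f"{chip_w}x{chip_h} chip on a {screen}:1 screen -> "
              f"pixels {stretch:.2f}x as wide as they are tall")
```

On a flat 1.85:1 screen the 1280x1024 chip needs roughly 1.5x stretching while 1920x1080 needs almost none; it is the 2.35:1 scope frame that punishes the PC-shaped chip hardest, at close to 1.9x.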
It seems that when Texas Instruments developed the digital projector, they simply took the chip from one of their PC projectors off the shelf and used it to build a cinema projector. This, obviously, was not ideal. This is why it is heartening for me to read in Variety (no free access, unfortunately) that TI have just demonstrated a new digital projection chip. This one was clearly designed specifically for digital projection of motion pictures.
TI was to demonstrate the new chips in two prototype projectors at the ShoWest conference. Their highest-profile appearance was to come Tuesday night with a screening of the new Disney/Pixar animated feature "Finding Nemo" at theaters in the Bally's and Paris hotels.
TI's original Digital Light Processing chips had a resolution of 1280 by 1024, more than double the resolution of standard U.S. analog TV and most flavors of digital television but well below the high-def television resolution of 1920 by 1080.
The new chips will have a resolution of 2048 by 1080, exceeding the so-called 2K level of picture definition. It's a relatively small improvement over HDTV, but crucial for TI's marketing efforts to get its chips accepted by exhibs and studios pondering the multibillion-dollar changeover from film projectors to digital ones.
This will fix the problems of converting from 1080 to 1024 lines, and will largely eliminate the problems with stretched pixels. If the image generated by a digital camera is 1920x1080, the simplest way to convert is to simply leave 64 pixels blank at each side of the image, so that camera pixels and projector pixels correspond exactly. However, if the movie has a 2.35:1 aspect ratio, the full 2048-pixel width could conceivably be used, giving squarer pixels than a 1920x1080 projector could manage. This will work fine from an analogue source, but we are going to need a 2048x1080 camera if it is going to be any help for a digitally filmed project. (I wonder if Sony are working on such a camera.)
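As a concrete sketch of that "leave 64 pixels blank at each side" idea (my own illustration, not anything TI has published), centring a 1920x1080 frame on a 2048x1080 chip lets camera and projector pixels line up one-to-one with no resampling at all:

```python
# Minimal sketch (my own illustration): pillarboxing a 1920x1080 camera
# frame onto a 2048x1080 projection chip so pixels map one-to-one.

CHIP_W, CHIP_H = 2048, 1080
CAM_W, CAM_H = 1920, 1080

X_OFFSET = (CHIP_W - CAM_W) // 2          # 64 blank pixels on the left...
assert CHIP_W - CAM_W - X_OFFSET == 64    # ...and 64 on the right

def camera_to_chip(x, y):
    """Map a camera pixel coordinate to its chip coordinate, unscaled."""
    return x + X_OFFSET, y

print(camera_to_chip(0, 0))        # (64, 0)      top-left corner
print(camera_to_chip(1919, 1079))  # (1983, 1079) bottom-right corner
```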
Unsurprisingly, the article says that people viewing films projected with the new system find that they look a lot better than films projected with the old one.
TI used the new prototype projectors two weeks ago in Hollywood to show side-by-side comparisons of its chips' output to film for about 400 cinematographers, directors, journos and others. Reactions were much warmer than for the original TI chips, which were criticized in particular for their inability to handle subtle dark shades.
This time even cinematographers, who have been TI's most exacting critics in the past, were complimentary of side-by-side comparisons of clips from such films as "Road to Perdition" and "Harry Potter and the Chamber of Secrets."
"For the first time, they saw a 47-foot image at 12-foot-lamberts of brightness with more than 2K resolution," said one participant in the gatherings. "It's a real breakthrough. And the buzz was extraordinarily good. I was a little taken aback by the reaction, and I think TI was, too."
I think we are slowly getting there. This is an improvement. Digital and analogue projection are likely close to indistinguishable in a relatively small cinema, or if you are sitting a reasonable way back. On a big screen, though, you are still going to be able to see the difference.
The article talks about the films "exceeding 2K resolution". In filmmaking, when digital effects shots are made, the live-action film is scanned into a computer, the image is modified digitally, and the digital image is converted back to film. There are two common resolutions at which scanning is done: so-called "2K", in which 2048 horizontal pixels are scanned, and "4K", in which 4096 horizontal pixels are scanned. Most work is done at 2K, which is considered good enough for most purposes. However, images scanned at 2K and then converted back to film are not as good as the images we started with. This is why, for the very best quality work, 4K is used.
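To make the 2K/4K gap concrete, here is some quick arithmetic of my own (the scan heights below are my assumptions for a full-aperture 35mm frame, not figures from the article):

```python
# Rough arithmetic (my own illustration): pixel counts for 2K and 4K film
# scans.  The frame heights are assumed values for a full-aperture 35mm
# scan, not figures quoted in the post.

scans = {"2K": (2048, 1556), "4K": (4096, 3112)}

for name, (w, h) in scans.items():
    print(f"{name}: {w}x{h} = {w * h:,} pixels per frame")

# Doubling both dimensions means a 4K scan carries four times as many
# pixels as a 2K scan, which is why 4K is reserved for the best-quality work.
ratio = (scans["4K"][0] * scans["4K"][1]) / (scans["2K"][0] * scans["2K"][1])
print(f"4K holds {ratio:.0f}x the pixels of 2K")
```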
Personally, I do not want to see a format that is merely considered "good enough" become the standard for digital projection. A digital projection needs, as a minimum, to be as good as what can be done with 35mm film. For this reason, I do not want to see widespread digital cinema until 4K projection is possible. At the moment, digital cameras cannot do 4K, and special effects houses can only generate very small amounts of effects work to a 4K standard, but these are simply the limits of today's technology. Moore's Law tells us that they will not be the limits in a couple of years. And I think it is going to be worth the wait.
And if someone could sneak me in to one of those screenings of Finding Nemo, that would be great.
Update: We have a very bare-bones report on the Finding Nemo screening here.