htfreak Posted April 22, 2006

Not sure if anyone has posted this before, but I found this article that could prove to be an interesting read. Extracted from http://www.hdtvexpert.com/pages_b/reality.html

=================================

1080P — Time for a Reality Check! - July 1, 2005
PETER PUTMAN, CTS

Thinking about buying a new 1080p rear-projection TV, front projector, or LCD TV? You might want to put your credit card back in your wallet after you read this.

It's obvious that the buzzword in consumer TV technology this year is "1080p". Several manufacturers are showing and shipping 1080p DLP and LCoS rear-projection TVs. We've seen RPTVs and front projectors with 1920x1080 polysilicon LCD panels at CES, NAB, and InfoComm. And the trickle of large LCD TVs and monitors with 1920x1080 resolution is turning into a flood.

To get your attention, marketers are referring to 1080p as "full spec" HD or "true" HD, a phrase also used by more than one HD veteran in the broadcast industry. We're hearing about "1080p content" coming out of Hollywood, from broadcasters, from cable systems, and from direct broadcast satellite services. The budding format war between Blu-ray and HD DVD for the next generation of high-definition DVD players promises the same thing — 1080p content at high bit rates, finally realizing the full potential of HDTV.

STOP! Enough of this nonsense. It's time to set the record straight and clear the air about what 1080p is and isn't.

First off, there is no 1080p HDTV transmission format. There is a 1080p/24 production format in wide use for prime-time TV shows and some feature films. But these programs must be converted to 1080i/30 (that's interlaced, not progressive scan) before airing on any terrestrial, satellite, or cable TV network.

What's that, you say? Those 1080p/24 programs could be broadcast as a digital signal? That's true, except that none of the consumer HDTV sets out there would support the non-standard horizontal scan rate required. And you sure wouldn't want to watch 24Hz video for any length of time; the flicker would drive you crazy after a few seconds. No, you'd need to have your TV refresh images at either a 2x (48Hz) or 3x (72Hz) frame rate, neither of which is supported by most HDTVs. If the HDTV has a computer (PC) input, that might work. But if you are receiving the signals off-air or using a DVI HDCP or HDMI connection, you'll be outta luck.

What about live HDTV? That is captured, edited, and broadcast as 1080i/30. No exceptions. At present, there are no off-the-shelf broadcast cameras that can handle 1080p/60, a true progressive format with a fast picture refresh rate. It's just too much digital data to handle and requires way too much bandwidth or severe MPEG compression. (Consider that uncompressed 1920x1080i requires about 1.3 gigabits per second to move around. 1080p/60 would double that data rate.)

How about Blu-ray and HD DVD? If either format is used to store and play back live HD content, it will have to be 1920x1080i (interlaced again) to be compatible with the bulk of consumer TVs. And any progressive-scan content will also have to be interlaced for viewing on the majority of HDTV sets.

Here's why. To cut manufacturing costs, most HDTV sets run their horizontal scan at a constant 33.8 kHz, which is what's needed for 1080i (or 540p). 1080p scans pictures twice as fast, at 67.6 kHz. But most of today's HDTVs don't even support external 720p signal sources, which require a higher 44.9 kHz scan rate.
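As a quick sanity check on those figures, here's a back-of-the-envelope calculation in Python. It's my own illustration, not Putman's, and it assumes 10-bit 4:2:2 sampling (20 bits per active pixel) and the standard 1125-line and 750-line rasters, none of which the article states explicitly.

```python
# Rough data-rate and scan-rate figures for the formats discussed above.
# Assumptions (mine, not the article's): 10-bit 4:2:2 sampling = 20 bits
# per active pixel; 1125 total lines for 1080-line formats, 750 for 720p.

def uncompressed_gbps(width, height, frames_per_sec, bits_per_pixel=20):
    """Active-picture data rate in gigabits per second."""
    return width * height * bits_per_pixel * frames_per_sec / 1e9

def h_scan_khz(total_lines, frames_per_sec):
    """Horizontal scan rate in kHz: total scan lines painted per second."""
    return total_lines * frames_per_sec / 1e3

formats = {
    # name: (width, height, frames per second, total lines per frame)
    "1080i/30": (1920, 1080, 30, 1125),
    "1080p/60": (1920, 1080, 60, 1125),
    "720p/60":  (1280, 720,  60, 750),
}

for name, (w, h, fps, lines) in formats.items():
    print(f"{name}: ~{uncompressed_gbps(w, h, fps):.2f} Gbps uncompressed, "
          f"~{h_scan_khz(lines, fps):.1f} kHz horizontal scan")

# Prints roughly 1.24 Gbps / 33.8 kHz for 1080i/30, 2.49 Gbps / 67.5 kHz for
# 1080p/60, and 1.11 Gbps / 45.0 kHz for 720p/60, in line with the article's
# "about 1.3 Gbps", 33.8 kHz, 67.6 kHz, and 44.9 kHz figures.
```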
In the consumer TV business today, it's all about cutting prices and moving as many sets as possible through big distribution channels. So, I ask you: why would HDTV manufacturers want to add to the price of their sets by supporting 1080p/60, a format that no HDTV network uses?

Here's something else to think about. The leading manufacturer of LCD TVs does not support the playback of 1080p content on its own 1920x1080 products, whether the signal is in the YPbPr component or RGB format. Only the industrial monitor version of this same LCD HDTV can accept a 1920x1080p RGB signal.

Now, don't blame HDTV manufacturers for this oversight. They are only supporting the 1080 format in actual use, 1920x1080i, a legacy digital format that has its roots in the older Japanese MUSE analog HDTV format of the 1980s. That's one big reason that 1080i has remained a production and transmission format.

It gets worse. All kinds of compromises are made in the acquisition, production, and transmission of 1080i content, from cameras with less than full resolution in their sensors and reduced sampling of luminance and chrominance to excessive MPEG compression of the signal as it travels from antenna, dish, or cable to your TV.

But that's not all. To show a 1080i signal, many consumer HDTVs do the conversion from interlaced to progressive scan using an economical, "quickie" approach that throws away half the vertical resolution in the 1080i image. The resulting 540p image is fine for CRT HDTV sets, which can't show all that much detail to begin with. And 540p is not too difficult to scale up to 720p. But a 540p signal played back on a 1080p display doesn't cut the mustard. You will quickly see the loss in resolution, not to mention motion and interline picture artifacts. Add to that other garbage such as mosquito noise and macroblocking, and you've got a pretty sorry-looking signal on your new big-screen 1080p TV.

Oops! Almost forgot: that same 1080p TV may not have full horizontal pixel resolution if it uses 1080p DLP technology. The digital micromirror devices used in these TVs have 960x1080 native resolution, using a technique known as "wobbulation" to refresh two sets of 960 horizontal pixels at high speed, providing the 1920x1080 image. It's a "cost thing" again. (Let's hope these sets don't employ the 540p conversion trick as well!)

To summarize: There are no fast-refresh (30Hz or 60Hz) 1080p production or transmission formats in use, nor are there any looming in the near future — even on the new HD DVD and Blu-ray formats. The bandwidth is barely there for 1080i channels, and it's probably just as well, because most TVs wouldn't support 1080p/60 anyway — they'd just convert those signals to 1080i or 540p before you saw them.

The 1280x720 progressive-scan HDTV format, which can be captured at full resolution using existing broadcast cameras and survives MPEG-2 compression better than 1080i, doesn't make it to most HDTV screens without first being altered to 1080i or 540p in a set-top box or in the HDTV set itself. So what chance would a 1080p signal have?

Still think you've just gotta have that new 1080p RPTV? Wait until you see what standard-definition analog TV and digital cable look like on it…
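To make the "quickie" 540p conversion described in the article a bit more concrete, here's a minimal sketch (my own illustration; the article contains no code) contrasting a cheap single-field line-doubler with a proper weave of both fields. Only the array shapes matter here; real deinterlacers are far more sophisticated.

```python
import numpy as np

def cheap_bob(field):
    """The 'quickie' conversion: keep one 540-line field and repeat each line.
    The output has 1080 rows, but only 540 lines of genuine vertical detail."""
    return np.repeat(field, 2, axis=0)

def weave(odd_field, even_field):
    """Full-resolution deinterlace: interleave the odd and even fields back
    into a 1080-line frame. Perfect for static images; moving content needs
    motion-adaptive processing to avoid combing artifacts."""
    rows, cols = odd_field.shape
    frame = np.empty((rows * 2, cols), dtype=odd_field.dtype)
    frame[0::2] = odd_field   # scan lines 1, 3, 5, ...
    frame[1::2] = even_field  # scan lines 2, 4, 6, ...
    return frame

# Stand-in fields: 540 lines of 1920 pixels each (odd- and even-numbered lines)
odd = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)
even = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)

print(cheap_bob(odd).shape)    # (1080, 1920), but half the rows are duplicates
print(weave(odd, even).shape)  # (1080, 1920), all 1080 captured lines retained
```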
htfreak (Author) Posted April 22, 2006

Here's a follow-on article written by the same guy a few months later: http://www.hdtvexpert.com/pages_b/followup.html

================================

1080P — Q&A — August 22, 2005
PETER PUTMAN, CTS

1080p continues to be a "hot button" story, judging by reader emails. This article addresses specific questions about 1080p display and signal formats.

**********************

Before I go any further, it's important to note that the original article was written as a cautionary piece for prospective buyers of new HDTV sets, who are no doubt hearing a lot of misinformation about 1080p programming, signal processing, and imaging systems from manufacturers these days.

The important thing to remember is that at present, there are no 1080p acquisition, production, or broadcast formats using a fast picture refresh rate (50 Hz or 60 Hz) equivalent to the 720p standard. None. And there aren't likely to be any time soon, as the current crop of 1920x1080 cameras, switchers, graphics generators, processors, and title generators all work in an interlaced format. None of them support 1080p/50/60 at present, although there are some expensive 1080p/50/60 products coming to market, such as Sony's $100,000+ camera shown at NAB 2005. This camera cannot be integrated into an existing 1080i production environment unless its output signal is format-converted to 1080i first.

The only 1080p production format is 1080p/24/25, used to achieve a "film look" for prime-time TV shows and some theatrical motion pictures. Programs shot in 1080p/24 in Hollywood are edited and finished in this format, then transferred to either 720p/60 or 1080i/30 for broadcast in the United States.

On to the questions:

Q. You keep referring to 1080i/30. There isn't any such standard! It's 1080i/60!

A. Classic misinformation. The refresh (frame) rate in any SDTV or HDTV format is set by the total time interval it takes to completely refresh a new picture. In the 720p/60 system, a new progressive-scan image (i.e. a "frame") with 1280 visible pixels on 720 lines is displayed every 1/60 of a second.

In the 1080i system, there are two fields in a frame. One has all of the odd-numbered scan lines (1, 3, 5, 7, 9, etc.) and the other has all of the even-numbered scan lines (2, 4, 6, 8, etc.). Each field is presented in 1/60 of a second. So, in the first 1/60 of a second, we have 1920 pixels on 540 odd-numbered scan lines, and in the next 1/60 of a second, we have the remaining 1920x540 pixels presented on the even-numbered lines.

Do the math: 1/60 + 1/60 = 2/60, or 1/30. So a complete 1920x1080i frame takes 1/30 of a second, not 1/60 (that would imply the interlaced fields are each being presented in 1/120 of a second). In the interests of technical accuracy, the actual frame rates are 1080i/29.97 and 720p/59.94 in the United States.

Q. 1080p/24 programming can be shown on some TVs that have fast refresh rates. What's wrong with broadcasting 1080p/24 HDTV in its native format?

A. True, some HDTVs can handle a 1080p/24 signal by refreshing it three times in a given time interval. That works out to a 72 Hz frame rate, and many plasma and LCD TVs support it. But the vast majority of TVs in use are still CRTs, and many HDTV sets use CRT technology. None of these sets support a 72 Hz or even a 48 Hz (double) scan rate. That's because, to save a few bucks, manufacturers standardized on a 33.8 kHz scan rate, which is what is required to show 1080i/30.
720p/60 (44.9 kHz) often isn't supported, or if it is, the signal is scan-converted down in resolution to 540p. In fact, your 1080i content is often converted to 540p by simply throwing away one of the fields (usually the even one). So you aren't getting full resolution from the 1080i signal to begin with!

Another point to consider: at present, a 72 Hz refresh rate is not supported in the DVI HDCP and HDMI formats for DVD players and set-top receivers. 720p/60 is, as is 1080i/30. (But 1080p/60 is not, nor is 1080p/24.) This may change in the future.

You wouldn't want to watch HDTV programs refreshing at a native 24 Hz rate on a CRT set - the flicker would drive you crazy! And 1080p/24 is totally inappropriate for broadcasting sports. No sports fan would tolerate the motion blur and loss of fine detail in fast-moving objects. Even 1080i/30 would be a better choice.

As it stands now, converting 1080p/24 to 1080i/30 for broadcast is a winning combination. Picture quality is quite good, and the "film look" holds up well even when converted to interlaced scan. On some of the best HDTVs, picture quality is outstanding.

Q. You mention "wobbulation" as the trick used by DLP RPTV set manufacturers to achieve more horizontal resolution on the screen from half-resolution chips. Not all manufacturers use this system.

A. The current digital micromirror device (DMD) used in 1080p RPTVs does not have full 1920x1080 pixel resolution. Instead, it has 960 horizontal pixels. A technique that resembles segmented-field scanning presents 960 horizontal pixels in a brief time interval, then the remaining 960 pixels in the next time interval. Shifting the tiny mirrors slightly at super-fast speeds does the trick. The total time interval is fast enough that you see what appears to be a smooth image with lots of horizontal picture information.

HP is credited with implementing this "wobbulation" technique in its RPTVs, but Samsung, Toshiba, and Mitsubishi all have variations of segmented-field picture scanning under different brand names. It's important not to confuse segmented-field scanning with interlaced scanning. There are no motion artifacts with SF pictures, only with interlaced pictures converted to progressive scan. It's also important to note that some RPTVs use full-resolution 1920x1080 LCoS and LCD imagers, although some of those TVs need help in terms of image quality.

Q. Isn't 1080p required for viewing HD on large screens? The pixels from a 720p or 768p set would be distracting.

A. This might be the best argument for a 1080p display, but it all depends on your viewing distance. The rule of thumb for seating distance, depending on whom you speak to, is 3x or 4x the screen height. If you have a 50-inch screen with 1366x768 pixels, the viewing distance using the 3x rule would be about 75 inches (assuming a picture height of about 25 inches), and with the 4x rule it would be 100 inches.

At a SMPTE presentation a couple of years back at Panasonic's Advanced Television Labs in New Jersey, the development of the 720p standard was discussed, and a 4x screen-height viewing distance was recommended for displays with that resolution. That would mean a viewer should be about 8.3 feet from that 50-inch screen, which is not at all unusual for a viewing distance in most American homes. Switching to a 1080p display and using the 3x recommendation would allow you to sit six feet from the same 50-inch screen.
In my house, that would mean I could just barely use my reclining couch without hitting the TV cabinet or stand. Would a larger screen make a difference? With a 60-inch 720p set, I'd need to be 120 inches, or 10 feet, away. With a 1080p 60-inch set, that distance could be optimally cut to 7.5 feet. I've had 61-inch plasma TVs in my studio for review on more than one occasion, and really haven't noticed the pixel structure at distances of 8 feet or more. So, I guess it depends on the individual. (Keep in mind that RPTVs with screen sizes of 60 inches or more are a very small part of the market at present.)

Q. So why are manufacturers pushing people to buy 1080p HDTV sets?

A. Because the prices of 720p and 768p RPTVs are dropping to a point where profit margins are becoming very slim. You can thank the aggressive pricing from plasma and LCD manufacturers for that. 1080p isn't feasible in plasma TVs smaller than 70 inches as of now, and 1080p LCD TVs are too darned expensive. 1080p provides a nice market niche for manufacturers of RPTVs — for now.

Problem is, prices of these 1080p sets are already being discounted, pushing 720p and 768p prices even lower. At some point, RPTV manufacturers will likely opt out of 720p and 768p altogether in favor of 1080p products. Let's hope that the "half resolution" problem, SDTV (480i) scaling issues, and other issues such as clipped bandwidth are resolved by then!
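Going back to the seating-distance rule of thumb in the Q&A above, here's a small calculator (my own sketch, not Putman's) for the 3x and 4x picture-height guidelines on 16:9 screens. The exact heights it computes differ slightly from the article's rounded 25-inch and 30-inch figures.

```python
import math

def picture_height_in(diagonal_in, aspect=(16, 9)):
    """Picture height of a 16:9 screen, derived from its diagonal size."""
    w, h = aspect
    return diagonal_in * h / math.hypot(w, h)

def seating_distance_ft(diagonal_in, multiple):
    """Recommended seating distance = multiple x picture height, in feet."""
    return multiple * picture_height_in(diagonal_in) / 12.0

for diag in (50, 60):
    print(f"{diag}-inch screen: 4x height = {seating_distance_ft(diag, 4):.1f} ft, "
          f"3x height = {seating_distance_ft(diag, 3):.1f} ft")

# 50-inch: roughly 8.2 ft (4x) and 6.1 ft (3x); 60-inch: roughly 9.8 ft and
# 7.4 ft. The article's 8.3 ft / 6 ft / 10 ft / 7.5 ft figures come from
# rounding the picture heights up to 25 and 30 inches.
```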
htfreak (Author) Posted April 22, 2006

Thought I'd just post this matrix here that I got off AVSforum, for those that get confused by the numbers game:

480p Displays:
___480i sources require deinterlacing only
___720p sources require downscaling only
___1080i sources require downscaling and deinterlacing
___1080p sources require downscaling
Great for DVDs (no scaling artifacts), but loss of detail with HD sources.

720p Displays:
___480i sources require upscaling and deinterlacing
___720p sources display natively
___1080i sources require downscaling and deinterlacing
___1080p sources require downscaling
Great for 720p material, but requires upscaling and deinterlacing for DVD (not so bad - no loss of detail), downscaling and deinterlacing for 1080i, and downscaling for 1080p (not so good - loss of detail).

1080p Displays:
___480i sources require upscaling and deinterlacing
___720p sources require upscaling
___1080i sources require deinterlacing
___1080p sources display natively
Great for 1080i and 1080p sources, but requires upscaling for 720p (not so bad - no loss of detail) and upscaling and deinterlacing for 480i (not so bad - no loss of detail).
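The same matrix can also be written as a small lookup, sketched below (my own restatement, not from AVSforum). It assumes all three displays are progressive-scan fixed-pixel panels, which is what the matrix implies.

```python
def processing_steps(source, display):
    """source and display given as (lines, scan), e.g. (1080, 'i') or (720, 'p')."""
    src_lines, src_scan = source
    dst_lines, _ = display          # the displays in the matrix are all progressive
    steps = []
    if src_scan == 'i':
        steps.append("deinterlacing")
    if src_lines < dst_lines:
        steps.append("upscaling")
    elif src_lines > dst_lines:
        steps.append("downscaling")
    return steps or ["native display"]

sources = [(480, 'i'), (720, 'p'), (1080, 'i'), (1080, 'p')]
displays = [(480, 'p'), (720, 'p'), (1080, 'p')]

for disp in displays:
    print(f"{disp[0]}p display:")
    for src in sources:
        print(f"  {src[0]}{src[1]} source: {' + '.join(processing_steps(src, disp))}")
```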
varun1624705824 Posted April 23, 2006

This is good information, thanks. Here's one for the gurus: logically speaking, would it be "easier" for an LCD to scale 540i to a native-1080 display or 540i to a native-720 display? To my mind, 540 to 1080 should be easier (just double the horizontal information without introducing other scaling artifacts). What say? - V.