
Posted
~

Coz my plan was to get a 1080p 50" panel in a year or so (if they exist and get cheaper), but maybe that's not such a good idea...?

--Geoff

G'day Geoff,

I don't think any of us would ever describe any sort of AV upgrade as being not such a good idea!! :D

But seriously, I'm only going by what Owen has been banging his head against the wall about, and the science makes perfect sense to me. There is a certain distance beyond which your eyes cannot resolve the extra number of pixels. Will the quality of the source, the processing and the settings used to drive those pixels still matter? Yes. But the sheer number of them? No.

And from memory, he puts this distance at well under 3m for a 50" display. Couldn't say with any confidence where your 2.5m distance puts you, but I'd wager that you're right on the cusp. If it were me, for a 50" screen, I'd be biasing my outlay on processing and other image quality factors, rather than the sheer number of pixels.

Cheers,

Aaron.

(On a completely different matter, I've always loved the language of your posts. In particular, the use of "yer". Personality can be conveyed by means other than just avatars and emoticons!! :blink: )


Posted
With my xbox 360 connected, I do not notice any of the things you mention when feeding it either 720p or 1080i.

Andrew.

Quite possibly - the lag is usually only about 1-3 frames, which most people won't really notice, and the artifacts from the de-interlacing are only really visible on a bad de-interlace and if you look for them. I do notice them though :blink: so I like 1080p. But 1080i should be fine for most people.
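For anyone wondering what 1-3 frames of lag means in time, here's a rough back-of-the-envelope conversion (assuming the panel refreshes at the 50Hz or 60Hz source rate; actual lag varies by display):

```python
# Rough conversion of processing lag in frames to milliseconds.
# Assumes the display refreshes at the source rate (50 Hz or 60 Hz).
for rate_hz in (50, 60):
    frame_ms = 1000.0 / rate_hz
    for frames in (1, 2, 3):
        print(f"{frames} frame(s) at {rate_hz} Hz ~= {frames * frame_ms:.0f} ms")
```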

Posted
So, for example, a 1080p source broadcast at 1080i and scaled down to 720p would be indistinguishable at that distance to a panel that is natively 1080p?

Yes, essentially.

A chart has been quoted several times to elaborate, but you can also try this.

1080p means 1920 pixels across the screen width (~1100mm for 50"), i.e. pixels are ~0.6mm apart.

So draw a series of dots 0.6mm apart on a piece of paper and view it from 2.5m, do you see dots or a line?
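If you'd rather do the sums than draw the dots, here's a rough sketch of the same test: it works out the angle one pixel subtends at 2.5m and compares it with the commonly quoted ~1 arcminute limit of 20/20 vision (the 1100mm width and the 1 arcminute figure are the assumptions here, not gospel):

```python
import math

screen_width_mm = 1100.0   # approx. width of a 50" 16:9 panel
pixels_across   = 1920     # 1080p horizontal pixel count
distance_mm     = 2500.0   # 2.5 m viewing distance

pitch_mm = screen_width_mm / pixels_across                      # ~0.57 mm per pixel
angle_arcmin = math.degrees(math.atan(pitch_mm / distance_mm)) * 60

print(f"pixel pitch: {pitch_mm:.2f} mm")
print(f"angle per pixel at 2.5 m: {angle_arcmin:.2f} arcmin")   # ~0.8 arcmin
# ~0.8 arcmin is just under the ~1 arcmin limit, i.e. right on the cusp
```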

Posted
1080p means 1920 pixels across the screen width (~1100mm for 50"), i.e. pixels are ~0.6mm apart.

No, 1080p means it has 1080 pixels vertically - it can have 1280, 1366 or any resolution horizontally.

p means progressive - it can display ALL horizontal lines in every frame.

i means interlaced - it cannot scan as fast vertically so it only does every SECOND horizontal line in each frame. There are still 1080 horizontal lines displayed, just not as quickly.

Current HDTV broadcast standards mean we will not have 1080p broadcasts, only 1080i. For progressive, you are limited to 720 lines.

Non-broadcast sources - games or PCs - can be 1080p or 1080i.
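For anyone who finds the i/p business abstract, here's a toy illustration (no particular broadcast standard implied) of how an interlaced signal delivers one frame's 1080 lines as two alternating fields:

```python
# Toy illustration: one 1080-line progressive frame split into the
# two fields an interlaced signal would carry.
frame = [f"line {n}" for n in range(1080)]

top_field    = frame[0::2]   # lines 0, 2, 4, ... (540 lines)
bottom_field = frame[1::2]   # lines 1, 3, 5, ... (540 lines)

# All 1080 lines are still there, just delivered as two half-frames in turn.
print(len(top_field), len(bottom_field))   # 540 540
```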

Mike

Posted
i means interlaced - it cannot scan as fast vertically so it only does every SECOND horizontal line in each frame. There are still 1080 horizontal lines displayed, just not as quickly.

I think once again we are getting confused between progressive/interlaced inputs and displays.

All plasmas and LCDs (and some CRTs) are progressive displays.

The question is whether they accept progressive inputs or not.

Andrew.

Posted
No, 1080p means it has 1080 pixels vertically - it can have 1280, 1366 or any resolution horizontally...

~

Splitting hairs, if not taking his post out of context. I think Anton's example was a fantastic way of testing the benefit of an increase in display resolution in the home environment... :blink:

Posted

"1080p means 1920 pixels across the screen width "

Splitting hairs, if not taking his post out of context. I think Anton's example was a fantastic way of testing the benefit of an increase in display resolution in the home environment... :blink:

There is massive confusion about HDTV terms. It only adds to the confusion to imply that "1080p" says anything about the number of pixels in a horizontal line. I was only trying to clarify any possible confusion resulting from that one inaccuracy.

Posted
~

There is massive confusion about HDTV terms...

~

Agree entirely. And I didn't mean to suggest otherwise, or seem narky. Apologies... :D I just thought Anton's suggestion was so simplistically brilliant, I'd hate to have seen it swallowed up in 1920 this, 768 that, progressive there and interlaced here...

You'd have to agree that if we enthusiasts can't even come to some consensus on a lot of this stuff, what chance does the consumer have who just wants to keep up with the Joneses... :blink::P

Posted
"1080p means 1920 pixels across the screen width "

There is massive confusion about HDTV terms. It only adds to the confusion to imply that "1080p" says anything about the number of pixels in a horizontal line. I was only trying to clarify any possible confusion resulting from that one inaccuracy.

The HDTV terms themselves are *not* confusing.

Every reference site I looked up says 1080p stands for 1920x1080 (& progressive), not just 1080 lines.

IMO confusion is caused by calling 1024x1080 "1080p", and 1366x768 "1080i".

Posted
The HDTV terms themselves are *not* confusing.

Every reference site I looked up says 1080p stands for 1920x1080 (& progressive), not just 1080 lines.

The standard horizontal resolution of a 1080 signal is 1920 pixels - but depending on source, transmission, storage and decoder, you may see no difference if you view a 1080 signal on a display with 1366 horizontal resolution.

Perhaps a better description would be "a 1080 signal has a maximum possible horizontal resolution of 1920 pixels"

IMO confusion is caused by calling . . . . 1366x768 "1080i".

. . . Amen !!!!

Posted

HD Ready - Wikipedia does have "a" definition of this term at http://en.wikipedia.org/wiki/HD_ready.

" The term has had official use in Europe since January 2005 when, EICTA (European Information, Communications and Consumer Electronics Technology Industry Associations) announced the requirements for the label.

EICTA introduced the label as a quality sign for the differentiation of display equipment, capable of processing and displaying high-definition signals."

"The fact that a product bears the label "HD ready" does not necessarily mean that it can display the full picture resolution possible from a HD source. Most HD-ready sets do not have enough pixels to give true pixel-for-pixel representation without interpolation of the higher HD resolution (1920x1080) - or even the lower HD resolution (1280x720) horizontally (CRT based sets, or the plasma based sets with 1024x768 resolution)."

"HD ready requirements

In order to be awarded the label “HD ready” a display device has to cover the following requirements:

Display, display engine

The minimum native resolution of the display (e.g. LCD, PDP) or display engine (e.g. DLP) is 720 physical lines in wide aspect ratio.

Video Interfaces

The display device accepts HD input via:

Analog YPbPr. “HD ready” displays support analog YPbPr as a HD input format to allow full compatibility with today's HD video sources in the market. Support of the YPbPr signal should be through common industry standard connectors directly on the HD ready display or through an adaptor easily accessible to the consumer; and:

DVI or HDMI

HD capable inputs accept the following HD video formats:

1280x720 @ 50 and 60Hz progressive scan (“720p”), and

1920x1080 @ 50 and 60Hz interlaced (“1080i”)

The DVI or HDMI input supports copy protection (HDCP) "

I assume Consumer Protection laws in Australia don't require sets labelled "HD Ready" to comply with these requirements.

Mike

Posted

The best LCD PQ I've ever seen was on the newly released Sharp 42 running SWAT Bray.

I think these new Sharps have 2000:1 native contrast ratios and it shows... there was a blue SS Camaro in one scene and its detail and clarity were sensational.

I was standing between 2-2.5 metres away (roughly) and it looked superb.

For all I know the PQ would be the same with a 720p source, but I'd let my eyes make the decision before assuming that 1080 makes no difference.

As I said before, both the new 32 (720p) and 42 Sharp LCDs have the best LCD PQ... not surprisingly, they have very high contrast ratios.

I've also yet to see PQ on a 3LCD Sony that would compel me to do more than swap my 13-year-old 51cm bedroom CRT for it.

Posted
For all I know the PQ would be the same with a 720p source, but I'd let my eyes make the decision before assuming that 1080 makes no difference.

No problem with you doing that, but I'm just pointing out that in the last few posts we talked about the same 1080 source fed to screens of different (native) resolutions, whereas you mention different source resolutions on one particular screen. The two scenarios are not quite the same.

Posted

[Two 1080 screenshot thumbnails were attached here.]

Here are some 1080 stills... I'm not sure if it's native 1080 or an upconvert, but either way I can never get this sort of quality from any of the non-1080 channels... having said that, the ABC's best 720p broadcasts do look awesome, but they're rare.

I'm not going to buy a large 1080 HDTV any time soon because I barely watch FTA/Foxtel as it is.

My eyes tell me that smaller screens like 32s can get away with 720p and 1200:1 contrast ratios... but once you break the 40in barrier, you at least need a higher contrast ratio, if not the extra res (but I'm unsure about that).

Btw, if someone wants me to email them the original 1080 snapshots, just ask.

Posted
Don't let i(interlaced) and p(progressive) confuse you - it has no impact on the resolution of the displayed image (or definition). A progressive display is just updated twice as fast as an interlaced display, therefore can display motion much better. On static images, i and p will look identical.

Mike

That’s not actually how things pan out.

Yes on static or low motion 1080p and 1080i are effectively identical resolution, however the motion issue is not what you expect.

The only 1080p source is 24 frames per second, or 24 motion updates per second if you like, which has rather poor motion characteristics.

1080i from an interlaced video camera has 50 or 60 motion updates per second, which results in MUCH smoother motion than 1080p 24.

1080i gives up a little vertical resolution in motion for much smoother motion, which is a good trade off IMHO.

1080p 24 can also be encoded as 1080i 50 or 60 and deinterlaced into its original progressive form (1080p 24/25).

1080i is actually a very good format but it is often misunderstood.

1080p 50/60 is ultimately a little better, but we are years away from seeing anything like that. Not even BluRay or HDDVD support 1080p 50/60, and no chance we will see it on free to air or even pay TV in the foreseeable future as the bandwidth requirements are just too high.
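A rough, uncompressed pixel-rate comparison gives a feel for why 1080p50/60 is such a big ask for broadcast and the current disc formats (raw rates only; real bitrates depend entirely on the codec):

```python
# Raw (uncompressed) pixel rates, ignoring compression, just to show the
# relative bandwidth demands of the common HD formats.
formats = {
    "720p50":  (1280,  720, 50),   # 50 full frames per second
    "1080i50": (1920, 1080, 25),   # 50 fields = 25 full frames per second
    "1080p24": (1920, 1080, 24),
    "1080p50": (1920, 1080, 50),
}

for name, (w, h, fps) in formats.items():
    print(f"{name}: {w * h * fps / 1e6:.0f} Mpixel/s")
# 1080p50 is roughly double the raw rate of 1080i50 or 720p50
```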

Posted
I think these new Sharps have 2000:1 native contrast ratio's and it shows.......there was a blue SS Camaro on scene and it's detail and clarity was sensational.

It shows all right - 2000:1 is PISS POOR, especially considering the brightness of the display. Just divide the brightness by 2000 and you get the pathetic black level of LCDs.

A good contrast ratio is more like 20,000:1 and if real blacks are the go, a high quality CRT setup is more like 200,000:1.

When you have seen blacks that are truly black, not some shade of grey, you will realise why it is so important for a truly enjoyable image when watching movies etc.

Posted
It shows all right - 2000:1 is PISS POOR, especially considering the brightness of the display. Just divide the brightness by 2000 and you get the pathetic black level of LCDs.

A good contrast ratio is more like 20,000:1 and if real blacks are the go, a high quality CRT setup is more like 200,000:1.

When you have seen blacks that are truly black, not some shade of grey, you will realise why it is so important for a truly enjoyable image when watching movies etc.

I think you need to go and look at the new Sharps... "actually wait, don't do that, you might be tempted to take a hammer to them"... GAGAGAGAGAGAAA

My point is, regardless of how bad the new Sharps are compared to the very best, they now have the best LCD PQ that I've seen... and I think that "most" people coming from a small boxoid CRT to a 32 Sharp LCD will be happy with it overall, and I think that HDTV/disc on the new Sharp 42s will entice people to buy.

As I've said before, I wouldn't buy a large 1080 LCD for SDTV and especially Foxtel... but if you've got the bray/PS3 bug, then it's all good.

Posted
...

A good contrast ratio is more like 20,000:1 and if real blacks are the go, a high quality CRT setup is more like 200,000:1.

...

An article by R. N. Clarke refers to the eye's ability to detect an intensity range -- in one scene -- of 1,000,000:1 as when viewing the night sky with the moon and a magnitude 3 star in the same field of view. See The Dynamic Range of the Eye.

Existing movie cameras (film or digital) cannot capture anything like a 1,000,000:1 ratio in one scene, and need to have the lens aperture and/or exposure time per frame increased when set up for photographing a dim scene.

Some modern LCD flat panels can vary the brightness of their backlight, in response to picture content. When displaying a twilight scene, the backlight of the display is dimmed. Some projectors can dynamically vary an iris to limit the light from the lamp (e.g. Sony SXRD rear-pro TVs).

It is relevant to mention that more than 8 bits of intensity are needed to smoothly represent an intensity ratio of 1,000,000:1, as the eye can distinguish more than 256 levels of intensity where a very wide intensity range is involved in one scene. Current video protocols for consumer display may attempt to hold video levels (for each of Red, Green and Blue) in the range 16 to 235, which is only 220 discrete levels if the scene involves black, greys and white.

Years ago analogue television involved quite limited contrast ratios. If we attempt to require modern displays to represent a very wide range of intensity in any one scene, we need perhaps 10 bits for intensity gradations. [When we talk about 24-bit colour we refer to 8 bits of intensity for each of Red Green and Blue. Essentially the intensity is 8-bit.]
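To put some numbers on the 8-bit versus 10-bit point, below is a quick tally of the studio-range code values (16-235 for 8-bit as mentioned above, and the standard 64-940 range for 10-bit luma):

```python
# Count of discrete "studio range" code values from black to white.
ranges = {
    8:  (16, 235),    # 8-bit video: black = 16, white = 235
    10: (64, 940),    # 10-bit video: black = 64, white = 940
}

for bits, (black, white) in ranges.items():
    levels = white - black + 1   # inclusive count of code values
    print(f"{bits}-bit studio range: {levels} levels from black to white")
# 220 levels vs 877 levels - roughly 4x finer gradation at 10 bits
```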

There is also the question of how to deal with a variation in overall intensity level between different scenes. There can be an efficiency in compressing the difference in overall video level between bright scenes and the dim scenes.

A twilight scene will usually be captured by the movie producer such that the peak intensity level in the final film for distribution is lower than for a midday scene, but not greatly lower. Because of limitations in film or digital camera capture, it is likely the lens aperture (or exposure time) will be at least partially adjusted so as to bring the brightness of the twilight scene up. In this way, the difference in overall light level between a midday and twilight scene can be compressed.

The technology in a modern display device can attempt to undo the compression the producer has introduced. The display can make dim scenes dimmer, and bright scenes brighter. However there can be side-effects. When a film camera pans across a midday scene and reaches shadows, the aperture and exposure time of the camera stay constant. However, the display electronics in a consumer's home display may interpret the shadowy scene taken at midday as being from a twilight scene, and dim the display inappropriately, making the shadows even darker.

As I have mentioned once before in this Forum, it could be useful to develop a video protocol that specifically encodes a brightness level adjustment factor for each frame, rather than the display device electronics relying on an algorithm that examines peak or average brightness levels of the incoming source material and attempts to infer an appropriate brightness adjustment.

Classic films could have these factors added in during the transfer to digital encoding, by a photography expert, so as to give a partial restoration of appropriate brightness levels for different scenes.

Posted

The sensors used in the highest quality still cameras digitise the light using a 12-bit linear converter. This means that it can only capture a brightness ratio of 4096 to 1. For still images this is saved in an 8-bit JPG format, but because the 8 bits are non-linear, the original 4096:1 ratio can be restored.

But this depends on the display medium - photographs on paper have a maximum contrast of 100:1.

So if you have a device that claims to display contrast greater than 4000:1 it's a waste - you will need to find a very specialised signal source to feed it with signals that have a brightness ratio greater than 4000:1.

The resolution and contrast of video are less than for still pictures because the eye cannot detect as much detail in a rapidly changing image.
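The trick that lets a 12-bit linear capture survive an 8-bit file is the non-linear encoding mentioned above. A minimal sketch of the idea, using a plain power-law gamma of 2.2 as the assumed curve (real JPEG/camera curves differ):

```python
# Minimal sketch: squeeze 12-bit linear values (0-4095) into 8 bits with a
# power-law "gamma" curve, then expand them back. Shadow detail survives
# because the curve spends more codes on dark values than a linear mapping.
GAMMA = 2.2  # illustrative only, not a specific camera or JPEG curve

def encode(linear: int) -> int:
    return round(((linear / 4095.0) ** (1.0 / GAMMA)) * 255)

def decode(code: int) -> int:
    return round(((code / 255.0) ** GAMMA) * 4095.0)

for v in (1, 16, 256, 4095):   # deep shadow through to full scale
    print(f"{v:4d} -> {encode(v):3d} -> {decode(encode(v)):4d}")
```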

Posted
An article by R. N. Clarke refers to the eye's ability to detect an intensity range -- in one scene -- of 1,000,000:1...

~

...it could be useful to develop a video protocol that specifically encodes a brightness level adjustment factor for each frame, rather than the display device electronics relying on an algorithm that examines peak or average brightness levels of the incoming source material and attempts to infer an appropriate brightness adjustment.

Nice write up mate, but you have missed the point.

What I am getting at has nothing to do with the eye's dynamic range or the limitations of film or video; it's all about black level.

Put simply, take a typical LCD with a peak light output of 450 cd/m² and a native panel contrast ratio of 2000:1. Black level on this display will be 450/2000 = 0.225 cd/m². Now this is a long way from black. If we add in dynamic backlighting, which makes dark scenes dimmer, not brighter, so peak brightness remains the same, we can get a contrast ratio of 6000:1 according to the manufacturer's specs. Black level works out at 450/6000 = 0.075 cd/m².

This number may seem impressive, but it’s actually still very grey in a dim or dark environment.

Now take as an example my now retired Hitachi CRT RPTV.

I typically ran it with a peak brightness of 75 cd/m², which I find plenty bright for normal viewing and still too bright for a dark room.

Now CRTs have the ability to turn off completely and make absolute black in a totally dark environment; this gives them an essentially infinite contrast ratio, but let's say 200,000:1. Black level works out at 75/200,000 = 0.000375 cd/m², well below the threshold of human perception.

To get the same black level out of the LCD would require it to have a contrast ratio of 1,200,000:1 due to its extra brightness: 450/1,200,000 = 0.000375 cd/m².


As you can see, people, the brighter the display, the higher the contrast ratio needs to be to maintain the same black level.
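Owen's sums laid out as a quick calculation, using the figures from his example (450 cd/m² for the LCD, 75 cd/m² for the CRT RPTV):

```python
# Black level = peak white brightness / contrast ratio (in cd/m^2).
examples = {
    "LCD, native 2000:1":            (450, 2_000),
    "LCD, dynamic backlight 6000:1": (450, 6_000),
    "CRT RPTV, say 200,000:1":       (75, 200_000),
}

for name, (peak, contrast) in examples.items():
    print(f"{name}: black level = {peak / contrast:.6f} cd/m^2")

# Contrast the 450 cd/m^2 LCD would need to match the CRT's black level:
print(f"required: {450 / (75 / 200_000):,.0f}:1")   # 1,200,000:1
```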

Manufacturers concentrate on making displays brighter and brighter, as bright displays attract customers like moths to a flame in retail environments, whereas black levels are not noticeable in a retail environment and a lot of people (if not most) just don't care about blacks anyway. Just look at how most people adjust their TVs and it's plain they don't care.

If you always view in a well lit room, black level is not an issue, but for Home Theatre, great blacks are a must as they add so much to the viewing experience.

The extra brightness available with modern displays is unusable in a Home Theatre environment as the screen will be too bright for comfort, especially if it is a large screen.

I am in the process of optimising my new display for Home Theatre. This involves optimising the inherent contrast ratio of the display and discarding 50% of the display's brightness to gain 50% better blacks. This is only possible to do on projection systems.

On flat panels you have to live with what you get.

Posted
Go down to the DVB section

1440 or 1920 x 1080 x 25 frames/s progressive (1080p25 - film mode)

These are pretty much for broadcasting, whereas we were talking about 1080p in the context of screen resolution.

If you were to buy a 1080p screen, take it home & find out it's only 1440x1080 pixels I'm sure you'd cry foul :blink:

Posted

True blacks.

First point here is that I am definitely not trolling!

Here's an exercise. Look at any show on TV, especially filmed outdoor material like, say CSI or Third Watch. Blacks Blacks Blacks. Right?

Now sit in your lounge room and look around you. What blacks do you see? Well, in my case there's my black leather business shoes as I just got home from work. There's the black bezel around my computer monitor, there's my torch in case of power cuts (QLD) - erm, there's a set of dumbbells in the corner with black disks on a chrome rod... that's about it.

Sit out in your yard... you probably won't see a black anywhere.

The point I'm making is that apart from coal and a few bird and animal species, the only black you will ever see in real life is some object that has been deliberately coloured black by a human.

The reason we see blacks on TV and Movies is because they are an artifact of a system of representation that really does not convey the richness of visual information that we would see if we were actually there watching the scene.

So endless arguments about 'true' blacks are as useful as arguments about the number of angels you can fit on the head of a pin. Interesting to the enthusiasts but really nothing to do with the real world.

In the case of plasmas and lcds you are really saying that one of them shows an aspect of a CRAP picture better than the other.

Oops, MLXXX - just noticed your post above; please feel free to move this post. Cheers.
