
Posted

Article

"1080p, or 1080 progressive, is a very high resolution video format and screen specification. It is one of the ATSC HDTV specified formats which includes 720p, 1080i, and 1080p. If you are even casually interested in Home Theater, you no doubt have heard the term 1080p, and if so, you most likely have been misinformed about it. Common misconceptions being spread include that there is no media to carry it, that you need an enormous screen to benefit from it, and on the whole you just shouldn't care about it. Why the industry has persisted in the charade is beyond the scope of this piece, but suffice it to say, if you don't care about 1080p now, you will."

Posted

Great. More misinformation.

In real terms -

1080p = 1080 lines @ 50 or 60 FRAMES per second (NOT 24, 25 or 30)

1080i = 1080 lines @ 50 or 60 FIELDS per second (where 1 field is all of the even or odd numbered lines in a frame)

NO current format includes 1080p 50/60 in its specification. The only common 1080p sources are Xbox360/PS3/Computer.

1080p25 can be stored perfectly in a 1080i signal because 50 fields can be woven together to create 25 frames.
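If it helps to see that concretely, here is a minimal sketch (plain Python/NumPy with synthetic frame data, my own illustration rather than anything from the article) of splitting 25 progressive frames into 50 fields and weaving them straight back:

```python
import numpy as np

# A minimal sketch of the "1080p25 inside 1080i50" idea using synthetic data.
HEIGHT, WIDTH, FPS = 1080, 1920, 25

rng = np.random.default_rng(0)
frames = rng.integers(0, 256, size=(FPS, HEIGHT, WIDTH), dtype=np.uint8)

# Split each frame into two fields: even lines and odd lines (50 fields/sec total).
top_fields = frames[:, 0::2, :]      # lines 0, 2, 4, ...
bottom_fields = frames[:, 1::2, :]   # lines 1, 3, 5, ...

# Weave each field pair back together into a full frame.
rebuilt = np.empty_like(frames)
rebuilt[:, 0::2, :] = top_fields
rebuilt[:, 1::2, :] = bottom_fields

# Because both fields of each pair came from the same instant in time,
# the reconstruction is bit-for-bit identical to the original frames.
assert np.array_equal(frames, rebuilt)
```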

That's why Channel 7's 576p50 is pretty useless for movies/dramas etc., since most of the time they are filmed at 25 fps, which means that every frame is simply repeated. It also means that 576i is just as good.

I find all this stuff pretty simple, but it seems there are lots of so-called experts out there who still can't get their heads around it.

Posted

Another article that just serves to muddy the water.

The main reason people pay thousands of dollars extra pursuing 1080p is to gain a roughly two-fold increase in screen pixels, not for the advantages of progressive 1080 material over interlaced 1080 material.

Posted
Great. More misinformation. [...] 1080p = 1080 lines @ 50 or 60 FRAMES per second (NOT 24, 25 or 30) [...]

Thanks Andrew - that article did confuse me a little. I'm still fairly new to all this picture techo stuff.

When you say 1080p can be in 50 or 60 frames a second, you're talking about the NTSC / PAL formats, yes?

So, I'm assuming that an Australian XBOX360 outputs 1080p60?

Posted

Yep, another misleading article, very disappointing.

1080p 50/60 frames per second just does not exist and is not supported even by Blu-ray or HD DVD.

The only 1080p format available now or in the foreseeable future is at 24 or 25 frames per second, which is fine for film source but no good for sport or other similar live events due to its low frame rate.

1080i 50/60 is definitely preferable to 1080p 24/25 for this type of content, but it requires very complicated deinterlacing for best results, so people should be very concerned about the quality of 1080i processing if they are considering a 1080p display.

A 1080p display with poor deinterlacing will be inferior to a 768p display with high quality deinterlacing.

1080p 24/25 can be carried in a 1080i video signal without loss, but it's up to the deinterlacer in the display to recover it, although unlike true interlaced source it is simple to deinterlace properly.

1080p 24 source encoded as 1080i is the same as 1080p 24, so for progressive source 1080i and 1080p can be considered identical. Both deliver 24 1920x1080 frames per second.

  • 1 month later...
Posted
Yep, another misleading article, very disappointing. [...] The only 1080p format available now or in the foreseeable future is at 24 or 25 frames per second [...]

Mmm, I agree the HD disc sources are 1080p24 Owen.

Doesn't the Tosh HDa1 then convert to 1080i60 and then your scaler or display return it to progressive?

For example, my VP30 scaler has output settings for 1080p50 and 1080p60 - so those signals do exist and are sent to my display - which it seems to be able to handle fine. So 1080i60 (and other signal resolutions) in, 1080p60 out to the HD1.

Certainly, when Blu-ray and HD DVD players with 1080p24 output are out in force, one will be purchased (as judder is certainly noticeable and slightly annoying).

Posted
Mmm, I agree the HD disc sources are 1080p24 Owen.

Doesn't the Tosh HDa1 then convert to 1080i60 and then your scaler or display return it to progressive?

Yep that’s what happens, and it should look just as good as 1080p output if the deinterlacing is any good, which is not difficult for progressive source content.

For example, my VP30 scaler has output settings for 1080p50 and 1080p60 - so those signals do exist and are sent to my display - which it seems to be able to handle fine. So 1080i60 (and other signal resolutions) in, 1080p60 out to the HD1.

You can take 1080p 24 up to 1080p 120 on a PC, but you aren’t gaining anything, as the source was 24fps. All that happens is frames are repeated, which is pointless.

If you have a HTPC, a high-end scaler like the VP50, or preferably one of the Silicon Optix powered scalers, 1080i 50/60 from an interlaced video camera is deinterlaced to 1080p 50 or 60, with 50 or 60 separate and distinct frames per second, as well as 1080 resolution. Then 1080p 50/60 input to the display becomes useful, as you can bypass the TV's internal deinterlacing with something better.

The VP30 only uses basic bob deinterlacing on true interlaced 1080i as far as I know, and most current 1080p displays have better systems, so the VP30 should be bypassed for 1080i video source.
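For what it's worth, here is a rough sketch of the bob-versus-weave difference for progressive source (NumPy only, synthetic data; not how any particular scaler is actually implemented):

```python
import numpy as np

# Rough illustration (synthetic data) of weave vs. bob deinterlacing for
# a field pair that originated from a single progressive frame.
rng = np.random.default_rng(1)
frame = rng.integers(0, 256, size=(1080, 1920), dtype=np.uint8)
top = frame[0::2, :]      # field 1: even-numbered lines
bottom = frame[1::2, :]   # field 2: odd-numbered lines

# Weave: interleave the two fields -> exact reconstruction of the frame.
weave = np.empty_like(frame)
weave[0::2, :] = top
weave[1::2, :] = bottom

# Bob: treat each field on its own and line-double it back to 1080 lines
# (real bob deinterlacers interpolate, but the vertical detail carried by
# the other field is gone either way).
bob = np.repeat(top, 2, axis=0)

print("weave error:", int(np.abs(weave.astype(int) - frame.astype(int)).max()))  # 0
print("bob error:  ", int(np.abs(bob.astype(int) - frame.astype(int)).max()))    # large
```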

Posted
Great. More misinformation. [...] 1080p = 1080 lines @ 50 or 60 FRAMES per second (NOT 24, 25 or 30) [...]

Remind me to send an e-mail to Panavision that their cameras are running at the wrong speed. :blink:

Posted

The conclusions of the article are not so much about the 1080p versus 1080i issue (though the article spends a lot of time explaining p vs i) as about the almost self-evident proposition that a 1920x1080 display gives better picture detail than 1366x768 and lower resolution formats.

The article suggests that a low resolution format program might best be displayed on a display of exactly the same resolution format, rather than being resampled for display in a higher resolution format. [It doesn't delve into the question of overscan.]

It makes the point that resampling to a higher resolution can result in relatively little loss in picture information, but in contrast a downsampling from 1920x1080 to a lower format is likely to result in a more significant loss of picture information.

This seems perfectly reasonable.
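In case a concrete toy example helps, here is a sketch of that asymmetry (pure NumPy and nearest-neighbour index mapping, chosen only for simplicity; real scalers filter rather than simply drop pixels):

```python
import numpy as np

# Sketch of the asymmetry the article argues for: downscaling 1920-wide material
# to a 1366-wide panel must throw pixels away, while upscaling lower-resolution
# material to 1920 does not.
def nn_indices(src_width, dst_width):
    """Which source columns a nearest-neighbour resample to dst_width would read."""
    return (np.arange(dst_width) * src_width / dst_width).astype(int)

# Downscale 1920 -> 1366: hundreds of source columns are never sampled at all.
down_idx = nn_indices(1920, 1366)
print("source columns dropped going 1920 -> 1366:", 1920 - len(np.unique(down_idx)))

# Upscale 960 -> 1920: every source column is still sampled (each exactly twice),
# so the original detail is preserved, just spread over more pixels.
up_idx = nn_indices(960, 1920)
print("source columns dropped going 960 -> 1920:", 960 - len(np.unique(up_idx)))
```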

It's still really motherhood statement material, though very clearly explained.

However I felt a part on the last page (particularly the part I am highlighting in blue font) contained what was perhaps too much of a simplification:

With the 720 x 480 output image, you get a similar deal, but without any fine detail, and with the same artifacts of wide areas of shading that, in the original image, simply don't exist.
Now, you can take the argument that the 1920 x 1080 pattern is in such fine detail, that it doesn't apply to real world images. Oh, really?
Has anyone ever considered that 1920 x 1080, the highest ATSC resolution HD format, is a mere 2 megapixels? Is somebody going to actually propose that the grille of a truck in the distance will neither move nor have closely spaced lines in the image? Hmmm?

I think the writers might usefully have considered referring to the fact that if a 1920x1080 format video camera followed good practice, the grille of a truck would not be transmitted with 1920 pixel horizontal resolution in the first place. Good practice requires suppression of detail variation at the one pixel level. This is consistent with the Nyquist theorem and with avoiding false detail (aliasing). It also helps keep the encoded bandwidth of the video signal more manageable.

However explaining this concept is not easy. And the fact would remain that if the truck grille were close enough to the camera to be captured with a good practice visible resolution of 960 vertical lines, a 1920x1080 display could do reasonable justice to that 960 line visible resolution, whereas a 1366x768 display would struggle.
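For anyone who wants to see the aliasing point in numbers, here is a rough one-dimensional sketch (NumPy only; the 800-cycle figure and the sample counts are purely illustrative):

```python
import numpy as np

# Rough 1-D sketch of the Nyquist point above: the same fine "grille" pattern,
# captured with enough samples versus too few. An 800-cycle-per-picture-width
# pattern needs at least 1600 samples; 1920 can carry it, 1366 cannot.
def capture(n_samples, cycles=800):
    """Point-sample a sinusoidal grille of `cycles` cycles per picture width."""
    return np.sin(2 * np.pi * cycles * np.arange(n_samples) / n_samples)

hd = capture(1920)      # enough samples: the pattern is recorded correctly
lowres = capture(1366)  # too few samples, no optical low-pass filter in front

print("dominant frequency at 1920 samples:", np.argmax(np.abs(np.fft.rfft(hd))))      # 800 (correct)
print("dominant frequency at 1366 samples:", np.argmax(np.abs(np.fft.rfft(lowres))))  # 566 (false coarse detail)
# Good practice is to suppress detail the sampling grid cannot represent
# (the camera's optical low-pass filter), so it records as smooth grey
# rather than as a pattern that was never in the scene.
```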

So my technical quibble makes no difference to the conclusions reached in the article.

It's also a pity the article doesn't refer to the fact that most 1920x1080 flat panel displays crop and resample a 1920x1080 source, so as to give overscan. This might have given the article a more practical bent.

Although this resampling has only a minor effect, it is immediately noticeable with computer generated text. On my 60" SXRD, the blurring effect of resampling is also noticeable with a high resolution still picture, if I cycle the set between 1:1 pixel mapping (screen area set to normal) and overscan (screen area set to +1).
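For the curious, a minimal sketch of what that crop-and-rescale does to single-pixel detail (Pillow for the resample; the 2.5% overscan figure is my assumption for illustration, not a measured value for any particular set):

```python
import numpy as np
from PIL import Image

# Sketch of the overscan behaviour described above.
src = np.zeros((1080, 1920), dtype=np.uint8)
src[:, ::2] = 255                       # single-pixel vertical lines, i.e. crisp "computer text" detail

overscan = 0.025                        # assumed overscan fraction, for illustration only
crop_x = int(1920 * overscan / 2)       # pixels trimmed from each side
crop_y = int(1080 * overscan / 2)
cropped = src[crop_y:1080 - crop_y, crop_x:1920 - crop_x].copy()

# The panel then stretches the cropped image back out to its native 1920x1080 grid.
stretched = np.asarray(Image.fromarray(cropped).resize((1920, 1080), Image.LANCZOS))

print("distinct pixel values with 1:1 mapping:", len(np.unique(src)))        # 2 (pure black and white)
print("distinct pixel values after overscan:  ", len(np.unique(stretched)))  # many grey levels: the lines blur
```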

Posted

I find it hard to believe that anyone can't see (I know all the optos reckon you can't) the difference between a 1080p picture and a 1080i or 720p source. I ask all my friends and their kids which source they like by A-B testing, and I have only had smart arses who know it's coming pick the lesser res. OK, burn me, but I tell the truth. Don't tell me you can't handle the truth!

Posted

What size and resolution is the display and at what distance is it viewed?

Most displays are too small for your eyes to resolve 1080 resolution at a normal viewing distance, so 720 usually looks the same.

1080p and 1080i should, and do, look identical on a decent 1080p display, but 720p is noticeably softer if you sit close enough to resolve 1080.
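To put rough numbers on that (assuming the common 20/20 benchmark of about one arc-minute per pixel and a 16:9 panel; the sizes are just examples):

```python
import math

# Back-of-envelope sketch of the viewing-distance point, assuming the usual
# 20/20 benchmark of roughly one arc-minute per pixel (an assumption, not a spec).
def max_distance_to_resolve(diagonal_inches, vertical_pixels):
    """Farthest viewing distance (metres) at which each pixel still subtends ~1 arc-minute."""
    height_m = diagonal_inches * 0.0254 * 9 / math.hypot(16, 9)  # 16:9 panel height
    pixel_m = height_m / vertical_pixels
    return pixel_m / math.tan(math.radians(1 / 60))

for size in (42, 50, 60):
    print(f'{size}" panel: 1080 lines resolvable out to ~{max_distance_to_resolve(size, 1080):.1f} m, '
          f'720 lines out to ~{max_distance_to_resolve(size, 720):.1f} m')
```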

Posted
Article

"1080p, or 1080 progressive, is a very high resolution video format and screen specification. [...]"

What this doesn't explain or touch on is that frame rate is critical with respect to the human eye and perceived flicker and judder. The optimum frame rate is between 55 and 65 fps.

We are still locked into 24 fps because it WAS ADDED to the HD specs purely for electronic film production. The original spec was 1125i60 (and analogue); it then moved to 1080i60/50 digital. Progressive mode and 24/25/30 came later, from film industry lobbying.

Most high-end HD cameras can now shoot 1080p50/60, and most likely they will add (maybe) 1080p48, which would be compatible with film release. The so-called de-interlace issue will then go away. The downside is that 1080p60 will chew up nearly twice the data of 1080i60.
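That "nearly twice the data" remark is just raw sample-rate arithmetic (uncompressed luma counts only; encoded bitrates depend on the codec):

```python
# Raw luma sample rates (uncompressed, before any codec) behind the
# "nearly twice the data" remark. Encoded bitrates will differ, of course.
width, height = 1920, 1080

progressive_60 = width * height * 60        # 60 full frames per second
interlaced_60 = width * (height // 2) * 60  # 60 fields, each half the lines

print(f"1080p60: {progressive_60 / 1e6:.1f} Mpixels/s")
print(f"1080i60: {interlaced_60 / 1e6:.1f} Mpixels/s")
print(f"ratio:   {progressive_60 / interlaced_60:.1f}x")
```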

The first market is 'film' post-production, then e-movie presentations, and then, within 2-5 years, full 1080p50/60 home screens. Then sport will look great. I think the article had a 'help push DVD up-scaling / progressive-mode screens' sales angle - not that there is anything wrong with that.
