
High Definition HQV Benchmarks


ocujos





There is a reason for this - money. Do you know how many millions of lines of code have gone into the new demuxers, new decoders and all the rest of it? Then, the system requirements are sky-high even for well-optimised code.

If they start doing per-pixel processing like the Teranex chip does in software - sure it can be done, but imagine the impact on system requirements then - in a market where the companies are already nervous about selling their product.


I have no idea what they're on about in this article. My old PC was an AMD with an ATI card and it passed the HD deinterlacing tests. My new PC is an Intel with an nVidia card and it also passes the tests - even using the Cyberlink decoder with hardware acceleration turned on. (It fails if I turn off hardware acceleration)




And the result is ..... shocking :blink:

Full article here.

I am already thinking about the next HTPC I am going to build and it does make you stop and think.

I was planning a system based around a HD-DVD/Blu-ray PC drive. If the video card and drivers can't process and display the output at its optimum, it defeats the purpose of having an all-in-one HTPC!

The article does point out that the problem is mostly with 1080i material, not 1080p.

Still, there isn't much point buying an expensive premium video card if it can't handle everything you throw at it.


That whole article is somewhat iffy though IMO. They don't mention what the rest of the computer specs were (they don't even specify what card was used) or what any of the software settings were.

I'd like to see an AVIVO vs PureVideo comparison done properly, not relying on a single benchmark disc and with less of a focus on 3:2 pulldown, which is only relevant with NTSC material and will become even less important with HD-DVD and Blu-ray.
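For anyone who hasn't met 3:2 pulldown before, the cadence itself is easy to picture: each 24 fps film frame alternately contributes two or three fields to reach 60 fields per second. A toy sketch (the helper name is mine, purely illustrative):

```python
def telecine_32(frames):
    """Expand 24p film frames into 60i fields via 3:2 pulldown.
    Frames alternately contribute 2 then 3 fields, so 4 film
    frames become 10 fields (24 fps -> 60 fields/s)."""
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    return fields

print(telecine_32(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
```

A deinterlacer that detects this repeating 2-3 pattern can reassemble the original film frames perfectly, which is why pulldown detection is such a focus of NTSC-oriented benchmarks.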

These tests always seem to be done by hardware review sites who really don't seem to know as much as they need to in order to do a proper review. It would be good to get an HTPC set up by someone who knew what they were doing and then throw it in against the standalone HD-DVD/Blu-ray players.


Glider, consider this question:

Q: What HTPC products currently allow you to output the new audio formats over HDMI or 8 analog outs?

To my knowledge none.

Correct, that's why I'm waiting for the new AMD 690G based HTPC HDMI/HDCP motherboard from MSI (K9AGM2-FHI) to rectify the situation. :blink:




The AnandTech review, if you can call it that, is a piece of utter, utter crap.

They clearly have no idea what they are doing when it comes to video playback on a PC and the issues involved. They should stick to PC hardware.

There is a glaring flaw in their test, and that is Power DVD 6.5, which is rubbish. I know because I have used it.

To get an Nvidia card to perform as it was intended you need an application that can run in DXVA VMR9 full screen exclusive renderless mode, which Power DVD cannot do.

The only player I know of that can run in this mode properly is TheaterTek, which comes with nVidia's PureVideo decoder.

It performs flawlessly with 1080i content on my 7800 and 8800 video cards. Deinterlacing on the 7800 is excellent, and full resolution is passed with both true interlaced and progressive sourced (film) 1080i without issue. (I have run resolution tests.)

The 8800 is definitely superior to the 7800 as it maintains more detail in motion with true interlaced source (from a 1080i video camera), and has all round significantly better colour and video quality than the 7800 with any video source, something that AnandTech failed to notice.

The 8800 is also fussy about YUV mixing, and even a minor setting change like that has a dramatic effect on deinterlacing performance, not that AnandTech would know.

Bad edit detection is currently broken in the 8800 drivers, but since the 7800 works perfectly, I expect this will be addressed soon.

Noise reduction is off by default in the nVidia drivers, and it was not functional on the 7800 in VMR9 on the last driver revision I used, but was functional in overlay mode.

Noise reduction IS functional on the 8800 in VMR9, but I’ll bet the guys at AnandTech had no idea how to enable it on either card. :blink:

Noise reduction is usually done in the display, not the source and criticizing the movie studios for not filtering noise is ridiculous.

All video that is to be compressed usually undergoes pre-filtering to remove out-of-band components and as much noise as possible prior to compression.

If you try and filter all the film grain out, you degrade the image, so studios wisely don’t.

AnandTech should be ashamed of themselves for posting such a blatantly inaccurate and ill-researched article. I am surprised ATI and nVidia have not taken legal action to force them to take the article down.

ATI may not be too pissed as AnandTech never mentioned them and continuously got them mixed up with AMD, what a bloody joke. :D

Remember people, don’t believe everything you read on the net, even from “respected sites”.


AMD did buy ATI, so using AMD by name is not an issue really.

It's a good idea to shoot holes in the article, because there are problems with it, some of which you mentioned.

But one main issue is that for most users who simply want to use their current ATI and nVidia cards to display hi-def interlaced material from their PCs, there are major problems in trying to get it to work correctly "out of the box" with a program like PowerDVD or WinDVD. And for those people this article is a nice wakeup call IMO.


There is a glaring flaw in their test, and that is Power DVD 6.5, which is rubbish. I know because I have used it.

To get an Nvidia card to perform as it was intended you need an application that can run in DXVA VMR9 full screen exclusive renderless mode, which Power DVD cannot do.

Zoom Player also has the option.

ATI may not be too pissed as AnandTech never mentioned them and continuously got them mixed up with AMD, what a bloody joke. :blink:

As said above AMD=ATI.


AMD did buy ATI, so using AMD by name is not an issue really.

Fair enough, but the graphics card line still goes by the name ATI, does it not?

I have never heard of an AMD video card, and I am sure most readers of that article have not either.

It's a good idea to shoot holes in the article, because there are problems with it, some of which you mentioned.

But one main issue is that for most users who simply want to use their current ATI and nVidia cards to display hi-def interlaced material from their PCs, there are major problems in trying to get it to work correctly "out of the box" with a program like PowerDVD or WinDVD. And for those people this article is a nice wakeup call IMO.

If the article stated that the problem was with the software they were using, I would not have a problem with it, but they lay the blame on AMD (ATI) and nVidia, which is just plain wrong.

I have not used a current ATI card, but my nVidia cards work great, and provide outstanding quality and deinterlacing when driving a 70” 1080p display if the right software is used.

WinDVD and PowerDVD are trying to get Blu-ray and HD-DVD playback happening, but so far they don't appear to have things sorted out very well.

1080i and 1080p MPEG-2 playback works exceptionally well in TheaterTek, which goes to prove that the hardware deinterlacing of 1080i is working fine.

TT does not yet support h.264 or VC1 content.

Zoom Player also has the option.

Yes, ZoomPlayer has long supported full screen exclusive renderless mode, but it never worked properly on my system. (I have not tried the latest version.)

Film (progressive) sourced 1080i has not been a problem for years, and Dscaler5 in ZoomPlayer works just fine without DXVA; however, true interlaced 1080i has been a nut only TT seems to have been able to crack.


  • 2 weeks later...


Just for some more fun have a read of this, an excellent LOCAL article on interlace/progressive flags on DVDs:

http://www.connectedhome.com.au/articles_novdec_reg_02.php

This also happens with digital TV in Australia, which leads us to need these $500 gfx cards to get good quality.

I haven't tested with HD-DVD/Blu-ray but I wouldn't be surprised if it's the same.

So all this bad edit correction is most definitely needed.

Owen,

Did you do your 1080i resolution tests with incorrectly flagged HD streams? Because otherwise, a $50 gfx card can read correctly flagged 1080i and give a very good picture that rivals one produced by a $500 gfx card.

I find the biggest problem is that you need to run a special DXVA mode to get this stuff to work. If TheaterTek is the only program that can do it then all the tech is useless, because I can't get this quality with MCE, which is going to become widely used with Vista (and is the best HTPC front end IMHO).

Worse yet are people who think their system is working PERFECTLY when it's not even coming close, but they just don't know it. Then you have all these people screaming blue murder when an article like the HD HQV one (which I agree is incorrect for using PowerDVD 6.5) or the Xbox 360's shocking DVD quality come out, shooting down the TRUTH.

THE SOLUTION:

Personally I'd rather not have to worry about decoders and $500 gfx cards just to fix the mistakes made by the idiots who author this content and give it the wrong flags. If DVD/HD-DVD/Blu-ray/HDTV all had correct flagging (which is completely possible and in the relevant specs) we would only need basic gfx cards and decoders that could read flags (which they all do) and could deinterlace half decently.

Or more to the point, get rid of interlaced content altogether! Why does it even exist on new technology (HD-DVD/Blu-ray) introduced in 2006? There is no need for it and it only causes problems.


I think technology implementation timing has been botched.

Isn't it ridiculous that we have progressive displays but interlaced HD and have to jump through hoops to work around the incompatibility?

Me thinks they should have stuck with interlaced CRT displays until source materials evolved to progressive and then introduce progressive display to the masses at the same time. :blink:

At least with the release of HD-DVD and Blu-ray, convergence of display and player is slowly occurring, yet there are still incompatibilities (i.e. little true 24/48/72p end to end).


Just for some more fun have a read of this, an excellent LOCAL article on interlace/progressive flags on DVDs:

http://www.connectedhome.com.au/articles_novdec_reg_02.php

This also happens with digital TV in Australia, which leads us to need these $500 gfx cards to get good quality.

I haven't tested with HD-DVD/Blu-ray but I wouldn't be surprised if it's the same.

So all this bad edit correction is most definitely needed.

You can get "good" quality with almost any graphics card, however if you want something better it will cost you.

I have used in order an ATI 9600 Pro, nVidia 6600GT, nVidia 7800 and now an nVidia 8800.

Each step up yielded improvements in video quality and deinterlacing. The 7800 was the first card that handled true interlaced 1080i (from a 1080i video camera) properly under all circumstances (only with TheaterTek).

The biggest improvement was with the 8800, which has colour processing, scaling capabilities and deinterlacing that make every previous card look sad.

I liken the difference to a comparison between a cheap no name Plasma and a Pioneer.

Owen,

Did you do your 1080i resolution tests with incorrectly flagged HD streams? Because otherwise, a $50 gfx card can read correctly flagged 1080i and give a very good picture that rivals one produced by a $500 gfx card.

Remember, 1080i comes in two forms: true interlaced from a 1080i video camera, and progressive sourced from 24 fps film or a 1080p video camera.

The progressive sourced 1080i (film) is dead easy to deal with, and software-only decoders like Dscaler5 handle it perfectly, as well as progressive sourced DVDs, with no assistance from video card hardware or drivers, as only simple weave deinterlacing is required for a perfect result.
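Weave deinterlacing really is that simple for progressive source: the two fields are just the even and odd lines of a single frame, so interleaving them reconstructs it exactly. A toy sketch (names are mine, purely illustrative):

```python
def weave(top_field, bottom_field):
    """Rebuild a progressive frame from its two fields.
    top_field holds the even lines, bottom_field the odd lines;
    for progressive (film) source both fields come from the same
    instant in time, so this reconstruction is lossless."""
    frame = []
    for even_line, odd_line in zip(top_field, bottom_field):
        frame.append(even_line)
        frame.append(odd_line)
    return frame

print(weave(["line0", "line2"], ["line1", "line3"]))
# ['line0', 'line1', 'line2', 'line3']
```

For true interlaced source the two fields are 1/50th or 1/60th of a second apart, which is exactly why a naive weave produces combing on motion and something smarter is needed.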

The big problem has always been the true interlaced source from interlaced video cameras, which has been very challenging to deinterlace well.

I don't know if my 1080i test videos are flagged or not, but my biggest test is true 1080i video streams that have damage (dropouts). After a dropout, the decoder has no way of knowing the field order of the video stream other than via flags in the video or bad edit detection in the decoder/hardware.
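One way to picture bad edit detection: weave the fields both ways and measure combing, since pairing fields in the wrong order interleaves lines from different instants and produces large line-to-line differences. A toy 1-D model (my own illustration, nothing like the actual driver code):

```python
def combing_score(lines):
    """Sum of absolute differences between adjacent 'lines'
    (each line reduced to a single brightness number)."""
    return sum(abs(a - b) for a, b in zip(lines, lines[1:]))

def pick_field_order(field_a, field_b):
    """Weave the two fields both ways and keep the pairing
    that shows less combing."""
    weave_ab = [x for pair in zip(field_a, field_b) for x in pair]
    weave_ba = [x for pair in zip(field_b, field_a) for x in pair]
    if combing_score(weave_ab) <= combing_score(weave_ba):
        return "a-first"
    return "b-first"

# A smooth gradient: even lines 0,2,4 belong before odd lines 1,3,5.
print(pick_field_order([0, 2, 4], [1, 3, 5]))  # a-first
```

A real implementation works on actual pixel data and motion history, but the principle of trying both orders and scoring the result is the same.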

My ATI 9600Pro and nVidia 6600GT could never handle this properly and it was not until I upgraded to the 7800 that I got to see bad edit detection and correction in practice.

The 7800 was unflappable, and always played any video unconditionally smoothly, even badly edited or damaged video.

Bad edit detection is currently not functional in the 8800 drivers, and like older cards it's hit and miss whether video will be displayed in the correct field order after a bad video edit or dropout.

Since the 7800 worked perfectly, I expect the 8800 drivers will be sorted out soon.

I find the biggest problem is that you need to run a special DXVA mode to get this stuff to work. If TheaterTek is the only program that can do it then all the tech is useless, because I can't get this quality with MCE, which is going to become widely used with Vista (and is the best HTPC front end IMHO).

As I said above, a simple software decoder can give a perfectly deinterlaced progressive output if the video was originally progressive (film). It's the true interlaced source that has been the problem.

nVidia cards usually require VMR9 Full Screen Exclusive Renderless mode for smooth playback, nothing new about that.

I don’t use MCE and don’t want to in any form, Vista or XP, as I find “front ends” very limiting and inefficient. I much prefer to use the best possible application for each task.

I have two HTPCs; the one the family uses has a remote and uses HIP to start, stop and control the desired applications: DNTV (live TV), TheaterTek (DVDs and other video files), Dscaler4 (Austar). No need for a cumbersome front end.

For anything more than basic functions it's much better to use a cordless mouse.

My main system is driven like any normal PC with a mouse and keyboard, and that’s the way I like it.

Worse yet are people who think their system is working PERFECTLY when it's not even coming close, but they just don't know it. Then you have all these people screaming blue murder when an article like the HD HQV one (which I agree is incorrect for using PowerDVD 6.5) or the Xbox 360's shocking DVD quality come out, shooting down the TRUTH.

Yes, many people “think” they are getting good playback performance, but until you see it done properly you really have nothing to compare with.

THE SOLUTION:

Personally I'd rather not have to worry about decoders and $500 gfx cards just to fix the mistakes made by the idiots who author this content and give it the wrong flags. If DVD/HD-DVD/Blu-ray/HDTV all had correct flagging (which is completely possible and in the relevant specs) we would only need basic gfx cards and decoders that could read flags (which they all do) and could deinterlace half decently.

Or more to the point, get rid of interlaced content altogether! Why does it even exist on new technology (HD-DVD/Blu-ray) introduced in 2006? There is no need for it and it only causes problems.

Most HD disks contain film sourced content which is progressive and very easy to deinterlace perfectly with simple software routines.

The problem is with "true interlaced" source from a video camera. This can be flagged or edited poorly and can be damaged in transmission, so bad edit detection and correction will always be required.
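On the flagging point: MPEG-2 does carry per-picture flags for exactly this (progressive_frame, top_field_first), and a decoder's basic choice can be sketched like so (a simplified illustration of the idea, not any real decoder's logic):

```python
def deinterlace_strategy(progressive_frame, top_field_first):
    """Choose a deinterlacing strategy from MPEG-2 per-picture flags.
    A frame flagged progressive can be woven losslessly; a true
    interlaced frame needs motion-adaptive processing, and getting
    the field order wrong produces the cadence errors discussed
    in this thread."""
    if progressive_frame:
        return "weave"
    order = "top field first" if top_field_first else "bottom field first"
    return "motion-adaptive, " + order

print(deinterlace_strategy(True, True))    # weave
print(deinterlace_strategy(False, False))  # motion-adaptive, bottom field first
```

Badly authored content is, in effect, content where these flags don't match the pictures, which forces the hardware to guess.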




I think technology implementation timing has been botched.

Isn't it ridiculous that we have progressive displays but interlaced HD and have to jump through hoops to work around the incompatibility?

Me thinks they should have stuck with interlaced CRT displays until source materials evolved to progressive and then introduce progressive display to the masses at the same time. :blink:

At least with the release of HD-DVD and Blu-ray, convergence of display and player is slowly occurring, yet there are still incompatibilities (i.e. little true 24/48/72p end to end).

Interlaced video will be with us for a long while yet and 1080i is the standard for HD, so people had better get used to dealing with it.

CRTs just worked with interlaced video and we never gave interlacing a second thought.


What exactly does TT do that the others don't? I'm using MCE2005 (VMR9) and I can play 1080i MPEG2 (video source) perfectly well. I get nice adaptively deinterlaced 1080p50. I can use my Dvico DXVA decoder or Cyberlink 7 DXVA decoder and both work perfectly.

I believe MCE can use VMR9 full screen exclusive renderless mode, just like TT and some other applications can, although all the other apps I have tried don't work anywhere near as well as TT.

As for MCE, I can't comment. You would like to think that Microsoft could write something that would work properly on their OS.

I did hear that TT was using a custom "allocator presenter", whatever that is, that other players were not using.

MCE may also use it, I don’t know.

If you are using a Radeon card, you don’t need full screen exclusive renderless mode, but nVidia cards do for VMR9.

1080i generally works quite well on most systems these days, but when you see what an 8800 can do with it you may have to reconsider what you thought was good in the past.


Ian I agree with your point. Mostly I think interlacing is hanging around due to broadcasting. Interlaced fields use less bandwidth than progressive frames.

That is a commonly held misconception.

In fact 1080i 50 uses exactly the same bandwidth as 1080p 25, and a little more than the 1080p 24 that is used on Blu-ray and HD-DVD.

1080p is easier to compress than 1080i, so 1080p 24 can use significantly LESS bandwidth than 1080i 50 and much less than 1080i 60.

1080p 50 or 60 uses double the bandwidth of 1080i 50 or 60, but it is NOT supported by Blu-ray or HD-DVD.
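The raw arithmetic backs this up: an interlaced field carries half the lines of a full frame, so the uncompressed sample rates work out as follows (luma samples only, my own back-of-envelope sketch):

```python
def raw_pixel_rate(width, height, rate, interlaced=False):
    """Uncompressed luma samples per second. For interlaced video
    'rate' counts fields, and each field carries half the lines."""
    lines = height // 2 if interlaced else height
    return width * lines * rate

i50 = raw_pixel_rate(1920, 1080, 50, interlaced=True)  # 1080i50
p25 = raw_pixel_rate(1920, 1080, 25)                   # 1080p25
p24 = raw_pixel_rate(1920, 1080, 24)                   # 1080p24
p50 = raw_pixel_rate(1920, 1080, 50)                   # 1080p50

print(i50 == p25)      # True: identical raw rate
print(p24 < i50)       # True: 24p is slightly lower
print(p50 == 2 * i50)  # True: 50p doubles it
```

Compressed bitrates differ again, since progressive material compresses more efficiently than interlaced, but the raw rates show why "interlaced uses less bandwidth" is a misconception.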

1080p 50/60 is a non issue as nothing is captured in that format.

Film is 24 frames per second and video cameras run in 1080i 50/60 or 1080p 24.

1080i 50/60 has a big advantage over 1080p 24, and that is motion smoothness.

With 50 or 60 motion updates per second compared to 24 for 1080p, there is just no comparison for sport or other fast moving content.

We have been watching 576i 50 PAL TV all our lives and motion smoothness is a non issue. 576p 25 is a disaster in comparison.


I'm glad that Owen is with us as he's one of very few people on these forums that understand (de)interlacing in-depth.

Can you please tell me (if you know) how the nVidia 8800 compares to the X-series ATI cards for de-interlacing true 1080i content?

Assuming that 1080i content broadcast here in Australia is true 1080i, de-interlacing it correctly is a must have feature for my HTPC system as I have a digital, progressive display (Panasonic AE700 projector).


Thanks for the compliment mate, I try to dispel the myths about 1080i-1080p whenever possible, and in general I think the message is getting through.

I have no first-hand experience with X-series ATI cards, so I can't help you much.

The only way you will know for sure how any card will perform is to buy one and test it out. You should be able to find a retailer who will let you return the one you don’t want.

I purchased the 8800 as an experiment, and fully expected it to be no better than my already very good 7800, and was all set to return it if it did not provide a tangible improvement.

However, once I experienced the very obviously superior colour processing, scaling and anti-aliasing, as well as improved deinterlacing, I could not give the 8800 up, even though the drivers are very buggy and not fully functional at this time.



