
Xtremeplace Blu-ray player shootout test, 1st Jan 2010




The enjoyment of hi-fidelity music, as well as home theatre, can be a rather subjective affair, with some favoring one brand over another with such fervor that their enthusiasm borders on the zealous.


Magazine reviews are not always helpful either, due to the flowery but essentially useless language used, or because they are reined in on how much negative information they can publish, given the need to attract advertising dollars and fill the pages of the magazine with ads. Witness how each “Awards” issue of hi-fi magazines is filled with ads from the very companies that have just been awarded best product of the year.


Or how some companies always do well, and ads bearing their names pepper the magazines. This phenomenon is present not only in print but also in internet-based hi-fi forums. Witness too the emergence of paid forum members who post glowing comments or “unbiased” reviews of products and counter any negative remarks about specific products. This is the reality of the hi-fi business, where a bad comment can make or break a new product.


At the end of the day, if enough people agree on a certain product, there should be some truth to it. However, how many of us can claim access to all those lovely products on the market?


Demo sessions in the shops give us an idea of how the intended purchase will sound or look, but often there are distractions from fellow shoppers, or impatient dealers twiddling their thumbs and tapping their feet as they wait for you to finish your demo. Some popular shops have become so successful that their sales staff are no longer as friendly as before, and for a new entrant into the world of home theatre (HT) or hi-fi, it becomes an unnecessarily daunting ordeal to step into a dealer’s lair and ask for a demo.


This is where our review, or shootout, comes in. There are relatively few such sessions where people are willing to travel miles, bring their favorite player along, and try it out against other players. On some forums, fans of certain brands seem to feel their player can do no wrong, yet refuse a head-to-head objective session against other players, with opinions garnered from a wider audience.


We believe this is one of the first sessions to gather a series of current-generation Blu-ray players in one place to be assessed with a well-defined methodology.





The first objective of this session was to make friends, and to help new members enjoy and try out new gear in the comfort of a home, something they would not otherwise be able to do.


The main thrust was still to see if we could differentiate between the players using a set series of material, and to find out whether paying more does yield a better player.





Our associated test equipment was:


Revel Studio 2 based (Revel voice centre) HT setup. Subs are SVS PB12+ and Def Tech Supercube Reference.


An Onkyo 5007 processor ran in pass-through mode to a Panasonic A3000E LCD projector displaying on a 133” Remaco screen. Goldmund 200 W monoblocks drove the L/R/C, a Theta Dreadnought drove the sides, and an Emotiva XPA-5 drove the front highs and rears.


Our Blu-ray players were:


Sony BD 765 aka BD 760 in certain markets.

Oppo BD 83 (standard edition)

Panasonic BD 60

Pioneer LX71

Pioneer BD-23

Denon DVD-A1UD

Sony PS 3 20G

Samsung 2550 (USA model)




24-gauge, 2 m Monoprice HDMI cables were used throughout the test in all cases. Each player was connected to its own HDMI input (the Onkyo has 8 inputs). The standard power cable was used in each case, except for the Denon, where the owner brought his own IEGO customized cable.


All players were plugged into a GW TD 1000 power conditioner running from a dedicated socket.


Time trials:


We also did time trials of


- Time to start up – how long a player took to eject its tray from a fully switched-off state.

- Initiation time – the time taken for a disc to load and show the first FBI warning page (we used a Paul Simon concert disc with no BD-Live content that was tested to work on all the players).




The session consisted of 3 segments.


The objective segment employed the HD HQV Benchmark Blu-ray Disc (http://www.hdtvsupply.com/hqv-benchmark.html).



The Objective 1080p Screen fine resolution test

Using the HQV test disc, we displayed images in the Film Resolution Loss Test segment and looked for the detail and definition of the lines on the screen, image stability, and any flicker.


In this test, a horizontal pan over the standardized SMPTE test pattern was recorded at 24 frames per second. The 1080p24 source was then transferred to the 1080i60 broadcast standard. If the processor handles the signal properly, the boxes with the striped horizontal lines will remain intact. If not, either the boxes will “strobe” between all black and all white, or you will see vertical bands on the sides of the box. Any strobing or banding constitutes a fail.


This test is relevant for testing Blu-ray and HD DVD players for any content that is 1080i and was sourced from a 1080p master that underwent a telecine process. This includes some concert footage, documentaries, films, and many television shows.


The goal of this test is to evaluate whether a processor can correctly identify the source cadence and apply the proper inverse cadence – that is, whether your video processor, player or display can recognize the source type and apply the right de-interlacing to recreate the full 1080p image.
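The 2:3 pulldown cadence described above can be sketched in a few lines. This is an illustrative model only (not the HQV disc’s actual processing): four film frames become ten interlaced fields, and a deinterlacer that locks onto the cadence can undo the mapping exactly, recovering the original 1080p frames.

```python
# Illustrative sketch of 2:3 pulldown (1080p24 -> 1080i60) and its inverse.
# Frames are stand-in labels; a real deinterlacer works on pixel fields.

def telecine_2_3(frames):
    """Map 24 fps film frames to 60i fields using the 2:3 pulldown pattern."""
    fields = []
    for i, frame in enumerate(frames):
        repeat = 2 if i % 2 == 0 else 3      # alternate 2 fields, then 3
        for _ in range(repeat):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

def inverse_telecine(fields):
    """Recover the original frames under an ideal cadence lock.

    Drops repeated consecutive fields (assumes adjacent film frames differ).
    """
    recovered = []
    for frame, _ in fields:
        if not recovered or recovered[-1] != frame:
            recovered.append(frame)
    return recovered

frames = ["A", "B", "C", "D"]        # 4 film frames -> 10 fields (24 -> 60)
fields = telecine_2_3(frames)
assert len(fields) == 10
assert inverse_telecine(fields) == frames
```

A processor that fails to detect this cadence instead deinterlaces the fields as video, throwing away half the vertical resolution – which is exactly what the strobing boxes in the test pattern expose.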


1080p screen fine resolution moving test


The same image used above in motion provided information on judder and flicker.


Real image test





We used the image of the football pitch to test how the images would look in a real shot. Any moiré or flickering in the upper stands indicates half-resolution processing. This test provides a real-world video that shows how improper video processing can affect an active image. The stands in this stadium are very high in detail, and a good processor, player or display should be able to reconstruct the 1080p image with all of its intended resolution.



The next segment was a subjective ranking of the players from the best to the worst.


We used a segment from the “Walmart” version of Transformers II – the forest fight scene (59 min 06 s to 60 min 30 s) – and played the same scene on every player.


We had a single member who operated the various players whilst the participants were blinded to which player was being used. The order of play was randomized by the tester.


The movie was first watched on the most expensive player – the Denon – to appreciate the scene with and without the sound. Then the test began.


We ranked the players for video quality, and for HT sound quality.


We understand that it was a subjective assessment, and we basically asked our participants to rank the players according to how they performed, in terms of picture quality, involvement of sound, immersion in the scene and clarity (dialogue, effects and direction).




Objective tests:


Time trials:


Speed test:

The standout here was the Oppo, which opened its drawer within 5 s. For the rest there was little difference between the players; the mean was 16.5 seconds, and the Denon also did quite well at 6 s.


Response test:


The Oppo was again the fastest, taking 20 s from the drawer closing to the menu appearing. The mean was 34.2 s. The differences between players were not statistically significant.


As for the remotes, only the Denon, Oppo and Sony BD 765 had backlights, with differently shaped buttons allowing better use in the dark. All the players had eject buttons on the remote and a soft on/off button.


The Oppo had the most controls on the front panel, allowing most of the functions to be used without the remote. Most of the players, except the Sonys, had enough buttons to play, skip chapters and eject discs. The Sony BD 765 in particular had a poor design, with the whole front panel flipping out and blocking the rest of the buttons.


All the players were Profile 2.0, and each was code-free for SD DVDs, but only the Oppo could be made region-free for Blu-ray locally. Kits exist for the Panasonic, but you have to ship them over or send your player away for modification.


Only the Oppo could play VCDs. The Denon and the Oppo are universal players with the ability to play SACDs and DVD-As. The Denon also comes with Gen IV of the Denon link for better signal transmission.


We did not have the time to test the true audio capabilities of the players using their own DACs and analogue outputs; that will be the subject of a future test.


The price range runs from SING$ 5xxx down to under $3xx for the Panasonic, so the players represent a wide spectrum. It would have been nice to get our hands on a local Samsung or Philips player, since they represent the more budget models, but this is noted as one of the limitations of the study.



Picture quality




Denon DVD-A1UD


The Denon had stable and sharp images, but in the motion test it suffered from alternating color. The colors were also a little soft in the stadium image.


Panasonic BD 60


The image was stable with no flickering, but there was alternating color. The colors were also a little soft in the stadium image.


Oppo BD 83


The Oppo image was stable, with good definition but there was still flicker in motion. Again the image was a bit soft.




Pioneer LX71 and BD-23


Both Pioneers did well in their tests. Image quality was more than acceptable, participants were all impressed with the picture quality, and they were in no way surpassed by the Oppo. Most felt their colors were the most neutral, though a bit soft in the stadium shot.


Sony BD 765


The Sony had the sharpest image in the stadium shot, but its resolution test result was not as good, with flicker, and the player also did not do as well as the Pioneers in motion.


It must be mentioned that the Samsung and the Denon had some HDMI sync issues, which took a while and several reboots to resolve.


Sony PS 3


How did this old war horse do? Not too shabby at all. It could play everything, something the Denon could not (the AVCHD disc would not load), and the images were more than serviceable. There was little between it and the middle segment. Colors were a little soft in the stadium shot.



So who came out tops for video?


I will get this out of the way first: everyone was dying to know how the Denon did. I won’t hide the fact that the Denon was the best of the lot, except for its response time. This top-of-the-pile machine performed best for audio and video in both our objective and subjective tests.


It was ranked best by most of the testers, even on repeat tests.




Here is the big caveat: we found that you could discern a difference between players. The Denon was a step ahead of the rest, and even the Oppo could not outperform it. For the big Oppo fans this may be sad news, but take note of the big gap in price.


The Oppo was a fine performer, able to do much of what the Denon could do for less. With the added audio capabilities of SACD and DVD-A playback, it is quite a bargain.


The Pioneers were also very decent performers, keeping up with the Oppo, except in response time.


The Sony was no slouch and it had sharper images than many of the others in real images.


Finally, the Panasonic, which was identified as the lowest-scoring model, was more than adequate in real-world tests.


If the results did not prove as explicit or as conclusive as many may have wanted, then you have grasped the true thrust of the results.



Player              Start-up (s)   Initiation (s)

Pioneer LX71            30             41.5
Pioneer Elite 23        21             35
Oppo 83                  3             20
Sony 765                 4             35
Samsung 2550             7             na
Panasonic BD60          23             37
Denon                    7             38
PS3 (20 GB)             na             33
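The summary figures quoted earlier can be recomputed directly from the table’s raw numbers. A quick sketch (“na” entries are simply excluded from each mean); note that the initiation mean reproduces the 34.2 s quoted above, while the start-up entries as tabulated average about 13.6 s:

```python
# Recompute mean times (seconds) from the table; "na" entries are excluded.
startup = {"Pioneer LX71": 30, "Pioneer Elite 23": 21, "Oppo 83": 3,
           "Sony 765": 4, "Samsung 2550": 7, "Panasonic BD60": 23,
           "Denon": 7}                      # PS3 start-up was "na"
initiation = {"Pioneer LX71": 41.5, "Pioneer Elite 23": 35, "Oppo 83": 20,
              "Sony 765": 35, "Panasonic BD60": 37, "Denon": 38,
              "PS3 (20 GB)": 33}            # Samsung initiation was "na"

mean_startup = sum(startup.values()) / len(startup)        # ~13.6 s
mean_init = sum(initiation.values()) / len(initiation)     # ~34.2 s

print(f"start-up: {mean_startup:.1f} s, initiation: {mean_init:.1f} s")
```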



In HT audio


Again the Denon was tops, but the differences between the other players via HDMI were so small that it was hard to rank the rest; they remained bunched together. Differences from room treatment, speakers, amps, etc. will make a bigger difference.




Special thanks to:


tonedeft - for being such a wonderful host

wizardofoz for bringing his coffee machine + HDMI cables - superb coffee!

audio - the man with the big heart for sharing his Kick ass Denon and being such a great sport!

Austrich - wine and food to go with the demo - generous to a fault

francis woo - the man with the gear - bringing his Pioneer and the conditioners and discs

allen - the IT guru who was our scribe

AJ - lights and Oppo

HT 102 - bringing the remote AJ forgot and the Panny + drinks

newkid - being the conductor of the test + the Sammy

C722 - for bringing his Brand New Sony 765 for testing, we opened the box for the first time that night!


All warm souls without whose help, I could not have done this. Kudos and thanks to all!


The Denon was like a Maserati to me: sleek, gleaming and supremely well built. You don’t buy top-of-the-line items just for function. From my own experience with the Reference series Marantz, you pay for the tactile feel of top-notch build quality and smooth controls – see how the drawer flows, almost oozes, out.


Although it was not 5 times better than the Oppo, it was better and for those with deep pockets, rest assured your money goes somewhere tangible and visible.


The Oppo is not cheap either, but it is multi-functional and does perform better than the <$500 crop. Yet do not feel bad if the basic ones are all you aspire to. The differences are in audio and upscaling, so if you have a proper CD player and only use it for Blu-ray playback, you can indeed be happy with the cheaper, better-value options.







At the end of the day, we could have been more stringent in applying the test. Using a more challenging test disc – one with more demanding colors, movement and flicker – might have been better. Transformers II, being a CGI-heavy movie, may not be the best test disc, especially for flesh tones.


There is also an element of fatigue, and the question of whether the sequence of players mattered. We also needed to spend more time calibrating the players to the processor and display; all tests were done with the players “raw” – without ISF or other calibration to ensure the best output from each machine – which may have influenced the results.


Finally, we cannot emphasize enough that this was not a test of audio performance.


We also did not test the upscaling capabilities of the players. As this gathering was mainly hi-def based, none of the members felt this was of much importance. So if you have a large collection of SD DVDs, you may want to check which player has the better upscaler. It is also worth reading up on upscaling: a good video chip in either the projector or the player can be enough – you do not need more than one.


In the HT audio test, the gain needs to be carefully matched so that any differences in sound quality are not due to a difference in volume.
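Level matching is usually done by measuring each source with the same test signal and trimming gains to a common reference. A minimal sketch with hypothetical SPL readings (the player names and numbers are illustrative, not from this session):

```python
# Hypothetical pink-noise SPL readings (dB) at the listening seat, one per
# player. Trims bring every player to the reference level so that a louder
# source cannot masquerade as a "better-sounding" one.

readings = {"Player A": 75.0, "Player B": 76.4, "Player C": 74.1}

ref = readings["Player A"]                   # chosen reference level
trims = {name: round(ref - spl, 1)           # dB trim to apply per player
         for name, spl in readings.items()}

for name, trim in trims.items():
    print(f"{name}: trim {trim:+.1f} dB")
```

Listeners tend to prefer the louder of two otherwise similar sources even at fractions of a decibel, which is why unmatched levels can quietly bias a subjective ranking.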



A point from HT 102:


The Panasonic used in the test was not running the latest firmware. With the latest, start-up times are now better.


- actually the Panny was pretty good already, so that was not a big deal for it.


At the latest prices it is a pretty decent player.




Our tests were done in a rather exacting environment: a 133” screen, test discs and close scrutiny, with some participants going right up to the screen to examine details. We did find differences.


Between the best and the worst, it was easier to differentiate. But given the price gulf of close to 10 times or more, buyers will need to ask themselves what kind of budget they are looking at.


This brings us to our recommendations.


Firstly, if money is no object, you want the best, and you have a large projector-based setup, the Denon is the king. In build quality, responsiveness, and of course audio and picture quality, it is the top of the pile.


However, for those with more real-world budgets and more modest setups, the differences between the other players are far smaller.


If the display is a 40–50” flatscreen, the differences are much harder to pick up, and you should choose a player based on factors other than raw performance: feature set, price, ergonomics, or even compatibility with your system will matter more.


Again our tests were more concerned with HT performance.


The truth is that the differences between players in terms of their hi-def performance were much smaller than in the audio realm. In particular, for HT audio, the differences were simply not worth the price difference.


However if you are using it as an audio source, you need to audition more carefully.


If you are entering the world of Hi Def, understand your own needs better, and divide your funds accordingly. The subwoofer, amp and speakers will have a great impact on the whole HT experience.


The Pioneers impressed me with how well they did compared to the Oppo, and if you don’t like or don’t need the additional functions of the Oppo, they are a good bet.


For $1000, the Oppo is an all round player which can be a blind buy that will satisfy buyers who like an all in one player.


Yet it is not that cheap, and you can easily find many players for less than half that price. They will do their job well for Blu-ray, and the differences in load-up time and HT audio performance may not be that important.



Right almost done :)


I got this comment from a chap in the UK:


I only scan-read, but is it correct to assume the HQV tests in post #3 and the "Stadium" sequence mentioned in the picture quality summary are assessing film-based deinterlacing, i.e. film encoded at 1080i? Personally I think this is an important aspect if you need the player to deinterlace HD interlaced material, but in reality most Blu-ray films are progressive encodes. You may also find that other HD cadence test material trips up different aspects of deinterlacing solutions: a Silicon Optix based player that performs well with Silicon Optix test material may do less well with third-party test material. As this is a Euro forum, the other thing to bear in mind is the potential deinterlacing performance difference between interlaced "film" at 60 Hz (the nice, easy 3:2 cadence) and film at 50 Hz (the not-so-nice 2:2, same as video).

I believe each contributor will share their own purchase prices as they wish.


Take note that the Samsung was a US model.


The Panasonic BD 60 can be had for <$300.


The Oppo is a standard price from Patrick in Adelphi.




I've posted a video on YouTube of the HQV sequence test... on the Denon before the break, I think, but I could be wrong.


Updated the YouTube link with one that should be working now. It was in HD 720p, but I'm not sure what YouTube might have done to it.



you can download and save a 22MB version www.lovethe.net/xpl/br-shootout.m4v


Personally, while the HQV test did show some marked differences, I didn't think the lower-end players were too hard done by in real-world playback. The Denon did have some edge, but was rather picky about the setup and didn't want to sync up with the rest of the system, whereas the others were much more plug and play.


The Oppo, while being the second-highest priced, has a rather better spec; its firmware upgrades and feature set seemed to my mind ahead of the lower-priced machines visually. If I ever upgrade from the PS3, that is the route I would take, retiring my Oppo 983 DVD player to another room where there is no need for BR.


Was frame creation on Vincent's AE3K disabled? Also, was the response set to normal, and gamma to 2.2?

Flicker on the high band could be caused by frame creation, and soft colour could be due to the gamma curve (locked at 2.2 by the service center when we fine-tuned it) and to Ghawk's screen.


Frame creation was off when we checked.

Not sure about the gamma curve; that's what we suspected too, as I could see the color contrast on the screen was a bit too much.


Maybe some kind soul would like to help calibrate, or contribute settings for, Vincent's equipment.  :P




I presume that basic video calibration has been done properly on the projector, e.g. using the DVE HD Blu-ray calibration disc?

Whatever changes are made to the gamma curve – be it 2.2 or 2.4 – re-calibration of the contrast and brightness levels is a must!


Link to comment
Share on other sites

Guest francis woo

I presume that basic video calibration has been done properly on the projector, e.g. using the DVE HD Blu-ray calibration disc?

Whatever changes are made to the gamma curve – be it 2.2 or 2.4 – re-calibration of the contrast and brightness levels is a must!



Not a techie, but somehow I could see that the pictures were soft compared to the recent demo by Peng during the AV show with the 3K!!...... :P


