

Posted

Guys,

Firstly, this is not about just channel 9 or Sydney, it's a query about digital TV in general.

Was discussing how bad channel 9's HD picture was with someone. He was telling me that different areas of Sydney will give you a better looking HD picture. My argument was that as long as they can receive the signal and display it without breakup, the quality of the picture will be the same for any location.

He was saying that if you travel around to different Bing Lee stores you can see different picture quality at different locations around Sydney. I suggested that this is simply down to the cabling at different stores.

Is there any online info that I can point him to that explains a digital signal will not show gradual degrading like analogue, or am I incorrect?

Posted

What you say is correct.

As long as there is adequate signal strength, and the errors in the received signal are insignificant, the 'picture quality' capability of the received digital data stream shouldn't vary between locations.

With digital reception, there are three main states.

Perfect reception, no reception, and in the middle a zone where the picture and sound pixellate and chirp (the digital cliff area).

Please have a look at the docs in my sig below, this will hopefully explain some of the more important aspects of getting reliable digital reception. It should also provide a better explanation to your question.

Posted

You are basically correct. Digital TV is essentially an all-or-nothing proposition, so if you are receiving sufficient signal then you get exactly the same information everywhere. If the signal strength drops too far you'll start getting pixelisation and something called macro blocking, both of which corrupt areas of the screen to the point of being unviewable. You'll also start getting audio dropouts. What you won't get is a gradual degradation of the overall picture quality.

Differences between stores are, as you say, down to factors after the signal arrives: store lighting, the cables used (a big influence), TV settings (contrast etc.), and also the STB that might be hooked up. STBs translate digital signals into analogue ones and they don't all do an equally good job (especially with HD), though this last factor is not huge.

As a relatively simple example, you might get a 100% image for signal strengths better than 68%, pixelisation and audio dropouts from 64% to 68%, and a blank screen anywhere below 64%.
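Just to make that all-or-nothing behaviour concrete, here's a trivial sketch of the example above. The 68% and 64% thresholds are the hypothetical figures from this post, not real DVB-T parameters:

```python
# Illustrative sketch of the "digital cliff" described above.
# The 68% / 64% thresholds are made-up example figures, not real
# DVB-T receiver specifications.

def reception_state(strength_percent):
    """Map an indicative signal-strength reading to a viewing state."""
    if strength_percent >= 68:
        return "perfect picture"                 # full quality, identical everywhere
    elif strength_percent >= 64:
        return "pixelisation / audio dropouts"   # the digital cliff zone
    else:
        return "blank screen"                    # no usable reception

for s in (80, 66, 50):
    print(s, "->", reception_state(s))
```

The point is that between 68% and 100% the output is identical; there is no "slightly worse" picture the way there is with a slightly snowy analogue signal.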

An offshoot of this topic is expensive digital HDMI cables. It's essentially a store con where they sell people $300 HDMI cables when a $50 one will do exactly the same job. You can read a recent article on this here

Regards

Peter Gillespie

Posted
So if you are receiving sufficient signal...

If the signal strength drops too much...

... you might get 100% image for signal strengths better than 68%.

Pixelisation and audio drop outs from 64% to 68%. Anything below 64% is a blank screen.

Sorry PG, I might use your wording above to clarify something that I see written fairly regularly on posts in forums. I know you probably used the above as a simple example, but maybe it can help clarify something.

In simplified terms, picture breakup (pixellation) and sound problems (chirping) in a DVB-T system are caused by too many errors coming through to the receiver. The inherent Forward Error Correction (FEC) used in the DVB-T system (in this case at the reception end, in the receiver itself) can no longer correct such a high level of errors. Problems become visible and audible. Reception is in the digital cliff area.

As long as the received signal is of adequate signal strength, it is essentially down to the errors within the received transmission. In other words, the Bit Error Rate (BER).

The receiver should have its input signal (read strength here) presented to it within an operating window over which it is happy to operate: not too strong, not too weak. If the signal is towards one extreme or the other, then things like amplifiers or attenuators may by themselves appear to 'reduce' errors. In fact what they are probably doing, IMO, is getting the received signal to conform better to the operating window that the receiver needs.
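To illustrate that operating-window idea, here's a rough sketch. The dBµV limits below are placeholder numbers I've made up for the example; real receivers will quote their own figures in their specs:

```python
# Sketch of the "operating window" idea: the receiver wants its input
# level within a range. The limits below are hypothetical placeholder
# values, not real receiver specifications.

MIN_LEVEL_DBUV = 45   # hypothetical lower limit of the window
MAX_LEVEL_DBUV = 75   # hypothetical upper limit of the window

def suggest_adjustment(level_dbuv):
    """Suggest what might bring the input level back into the window."""
    if level_dbuv < MIN_LEVEL_DBUV:
        return "too weak: an amplifier may help"
    if level_dbuv > MAX_LEVEL_DBUV:
        return "too strong: an attenuator may help"
    return "within window: leave as-is"

for level in (40, 85, 60):
    print(level, "dBuV ->", suggest_adjustment(level))
```

Note the amplifier or attenuator isn't improving the data itself; it's just moving the level into the range where the receiver can do its job properly.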

And bear in mind that there could be many channels being presented to the receiver (analogue TV, digital TV, FM radio etc.). One or more of these could be affecting reception, especially if they are too strong.

Ideally in a reception system you would only have the channels of interest, they would all be of the same signal level, and they would be of the correct level to allow the receiver to operate properly (within the operating window). They would of course have low BER characteristics (this being affected by many factors).

Changing the received signal strength up and down over a reasonable range in the above 'ideal' situation would/should not, IMO, push reception into the digital cliff area. It is not directly a 'strength' thing as such.

As is often pointed out in these forums, as long as there is adequate signal level, it is the Bit Error Rate that is all-important.
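One way to see why the errors are what matter: DVB-T's outer error correction is a Reed-Solomon RS(204,188) code, which can correct up to 8 erroneous bytes in each 204-byte packet. This toy sketch (which ignores the inner Viterbi coding entirely) shows how the same signal strength can give either a clean or a broken picture depending purely on the error count:

```python
# Toy model of why BER, not raw strength, decides viewability once the
# level is adequate. DVB-T's outer FEC is Reed-Solomon RS(204,188),
# which can correct up to 8 erroneous bytes per 204-byte packet.
# This deliberately ignores the inner (Viterbi) coding stage.

RS_CORRECTABLE_BYTES = 8

def packet_viewable(byte_errors):
    """True if the outer FEC can fully correct this packet."""
    return byte_errors <= RS_CORRECTABLE_BYTES

# Two packets received at the same signal strength, different error counts:
print(packet_viewable(3))    # within FEC's capability -> clean output
print(packet_viewable(12))   # beyond FEC's capability -> breakup territory
```

Below the FEC limit the corrected output is bit-perfect; above it, errors get through and you are on the cliff. There is no in-between "slightly degraded" output from the decoder.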
