
Recommended Posts

Posted
35 minutes ago, Stereophilus said:

Next, let’s swap our Cat 5e for an audio Ethernet cable, maybe a Cat 7 spec with external shielding.  Now we permit the potential passage of noise from our network into our DAC via the shield (attached to ground).

 

This would seem to go against the marketing depts of said cable manufacturers. They refer to their cables and switches as improving noise reduction, not letting it propagate on the network. If they really were selling snake oil somebody would've used this idea already!

 

  • Volunteer
Posted (edited)
34 minutes ago, Stereophilus said:

I think you are unfairly misrepresenting me.  I never said “greater digital purity”, or that I want to justify power supplies or switches at exorbitant prices… The only “pitch” I presented in my post is that noise entering the DAC via shielded Ethernet could have an audible effect by acting effectively like dither.  

 

I wasn't poking at you, dude. Just restating what I've said before but with the subtlety of the point on changing power supplies to deliberately tune the noise as an example.

 

The comment on digital purity is in juxtaposition to the claims that shielded cables deliver a better sound. They may deliver a preferred sound due to the distortions they can propagate from upstream kit, but the acolytes of over-specced cables and rebadged zyxel switches posit that this is  purity/accuracy manifest. It isn't.

 

You may be on to something. However, trying to control the effect you hypothesise would be pretty hit-and-miss even if you have identified the realities. I think many would suggest it is better to stop all noise and then use the DAC controls and/or DSP to get the tuned sound you want - what do you surmise about this position?

Edited by El Tel
  • Like 1
Posted (edited)
8 hours ago, El Tel said:

You may be on to something, however, trying to control the effect you hypothesise would be pretty hit-and-miss even if you may have identified the realities. I think many would be suggesting it is better to stop all noise and then use the DAC controls and/or DSP to get the tuned sound you want - what do you surmise with this position?

Sure. I have no problem with that, but I don't necessarily think it's completely black and white.  If we accept my hypothesis that network noise entering the DAC can act as dither then we should expect to see a variety of different reports on the perceived effect (depending on the DAC, the listener, the system, etc).  And we do.  The point of this would be that noise (acting as dither) may not be a bad thing, it may (depending on a few variables) be a good thing for certain aspects of digital audio.

 

8 hours ago, Hydrology said:

This would seem to go against the marketing depts of said cable manufacturers. They refer to their cables and switches as improving noise reduction, not letting it propagate on the network. If they really were selling snake oil somebody would've used this idea already!

I'm not asserting snake oil, although I agree my thought does go against their marketing. I don't have answers for that.

Edited by Stereophilus
Spelling
  • Like 1
  • Volunteer
Posted
1 hour ago, Stereophilus said:

The point of this would be that noise (acting as dither) may not be a bad thing, it may (depending on a few variables) be a good thing for certain aspects of digital audio.

 

Accepting the hypothesis as possible, then as I said, identifying exactly what and how the effect is manifesting is only part of the tale; one would likely struggle to control the level of the effect and therefore find it more or less impossible to tune the effect to personal taste.

 

It's not a theory I'm inclined to ignore or dismiss, as it cannot be debunked in the same way as some of the claims of audiophile-credentialed digital equipment such as cables, switches and routers; most of those claims don't stand up to even the simplest enquiries into how networking actually functions. It would be good for it to be tested with the appropriate apparatus and measurements, using U/UTP as a control for each test/measurement exercise. An interesting avenue of thought; good work.

 

1 hour ago, Stereophilus said:

my thought does go against their marketing

 

I wouldn't even break stride on that thought. Marketing claims are not always grounded in what actually happens technically. None of the questionable claims are made by reputable companies whose business is primarily enterprise customers, including audio distribution and studio environments. The distinction is that shielded cables can reduce external influences on the cable itself along its run (that is not in dispute, although it is wholly unnecessary for the <250MHz application of a CAT6 1Gbps-compliant cable), but the nature of the terminated shield means it is susceptible to introducing noise by creating electrical continuity with attached devices. And let's not forget that, by design, the twisted pair inherently reduces electromagnetic radiation outwards from the pair and crosstalk between neighbouring pairs; it also improves rejection of external electromagnetic interference over that of a single conductor.
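The common-mode rejection point above is easy to see in numbers. Here is a rough numpy sketch (a toy model of differential signalling, not of a real Ethernet PHY; all levels are invented): noise that couples equally into both conductors of a pair cancels when the receiver takes the difference.

```python
import numpy as np

rng = np.random.default_rng(0)

# Differential signalling: the data rides on the *difference* between the two
# conductors of a pair, so interference that couples equally into both
# (common-mode) cancels at the receiver.
t = np.arange(1000)
data = np.sign(np.sin(2 * np.pi * t / 50))   # crude bit-like waveform

wire_p = +0.5 * data                          # conductor A carries +signal/2
wire_n = -0.5 * data                          # conductor B carries -signal/2

common_noise = 0.3 * rng.standard_normal(t.size)  # couples into BOTH wires
wire_p_noisy = wire_p + common_noise
wire_n_noisy = wire_n + common_noise

received = wire_p_noisy - wire_n_noisy        # differential receiver

print(np.allclose(received, data))            # True: common-mode noise cancelled
```

A shield, by contrast, is a single conductor bonded to ground at the connector, which is exactly why it can carry noise *into* a device rather than rejecting it.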

  • Like 1
Posted
11 hours ago, Stereophilus said:

Now we permit the potential passage of noise from our network into our DAC via the shield (attached to ground).

Maybe.

 

If there is "noise" on the cable... then it is not a given that it will go anywhere. It is the audio device designer's job to ensure that it doesn't go anywhere important.

 

This same logic (both that it is not a given that it will have an effect, and that it is the designer's job to eliminate this outcome) applies to other things too... not just cables with grounded shields. It applies to any "noise" that might be carried on the signal pairs, or any signal jitter, or resulting packet jitter, that might make it through to some circuit that cares about it (e.g. a clock, or DAC, or, less likely, an analogue audio circuit).

 

11 hours ago, Stereophilus said:

I wonder if permitting network noise into the DAC is actually acting like the addition of dither…

Yes, it may be raising the noise floor in the analogue audio electronics.

 

However this is either inaudible or bad.    It is an audio myth to think this could ever be good.

 

11 hours ago, Stereophilus said:

That is, effectively increasing the noise floor in the DAC to improve quantisation distortion and SNR

Improve quantisation distortion.    No.

 

Improve SNR.... No..... worsen SNR, yes.

 

11 hours ago, Stereophilus said:

This could potentially be audible.

Inaudible or bad.

 

11 hours ago, Stereophilus said:

Furthermore, I wonder if the audible difference in Audio Ethernet cables are due to the way they “shape” the noise / dither they permit into the DAC?  In such a way the cables “tune” the effect the noise exerts within the DAC.

The bottom line is that if this is happening in a way which could be even remotely audible... it is measurable (quantifiable).

 

So even leaving aside the "is it good or bad" generalisations ..... firstly it can be demonstrated.

 

There are reasons why it is not quantified.  ;) 

Posted
11 hours ago, Stereophilus said:

I had a thought about Ethernet cables today… potentially a dangerous thing.  I am also putting this out as completely speculative.  But I would value the thoughts of @El Tel, @davewantsmoore, @BugPowderDust and @Ittaku.  Please shoot me down if I’m way off track.
 

I wonder if we are thinking about Audio Ethernet cables in the wrong way.  What if they are actually adding noise?

 

Let's assume our standard Cat 5e patch cable is perfectly transmitting digital data. It has no shield and no ground link. It is an intrinsically galvanically isolated connection. All is good.

 

Next, let’s swap our Cat 5e for an audio Ethernet cable, maybe a Cat 7 spec with external shielding.  Now we permit the potential passage of noise from our network into our DAC via the shield (attached to ground).

 

Bad right?  Maybe not.  I wonder if permitting network noise into the DAC is actually acting like the addition of dither… That is, effectively increasing the noise floor in the DAC to improve quantisation distortion and SNR.  This could potentially be audible.

 

Furthermore, I wonder if the audible differences in Audio Ethernet cables are due to the way they "shape" the noise/dither they permit into the DAC? In such a way the cables "tune" the effect the noise exerts within the DAC.

Sounds like a most plausible possibility to me.

  • Like 1
Posted
2 hours ago, davewantsmoore said:

Improve quantisation distortion.    No.

 

Improve SNR.... No..... worsen SNR, yes.

 

Inaudible or bad.

So, admittedly, I've been working on the basis of some assumptions here, i.e. that "the noise" will travel from wherever, into the DAC, along the shield of the cable. Also assuming that, for the effect to be audible, the DAC must be in some way susceptible to "the noise". But I wonder if this noise entering on the digital side might act like dither rather than packet jitter? Because if it does act like dither, it will improve quantisation distortion and SNR in the same way dither does.
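The trade dither actually makes can be seen with a toy quantiser in a few lines of numpy. This is an idealised sketch (an exaggerated 8-bit quantiser with made-up levels, nothing to do with any real DAC): TPDF dither decorrelates the quantisation error from the signal, but the total error power, and hence SNR, gets worse, not better.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1 << 16
t = np.arange(n) / 48_000
lsb = 2.0 / 256                                 # step of a crude 8-bit quantiser
x = 1.5 * lsb * np.sin(2 * np.pi * 1000 * t)    # low-level 1 kHz tone, a few LSBs tall

def quantise(sig, dither):
    # TPDF dither: sum of two uniform variables, each +/- half an LSB
    d = (rng.uniform(-0.5, 0.5, n) + rng.uniform(-0.5, 0.5, n)) * lsb if dither else 0.0
    return np.round((sig + d) / lsb) * lsb

err_plain = quantise(x, False) - x   # error correlated with the signal (distortion)
err_dith = quantise(x, True) - x     # error decorrelated, but with more total power

print("plain RMS error:   ", err_plain.std())
print("dithered RMS error:", err_dith.std())   # larger: SNR is worse, not better
```

So dither does linearise the quantiser (the error stops tracking the signal), which is the part of the hypothesis with some footing; what it never does is lower the noise floor.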

 

Posted
2 hours ago, Stereophilus said:

Ie that "the noise" will travel from wherever, into the DAC, along the shield of the cable.

Possible, but that would be a bad design.

 

2 hours ago, Stereophilus said:

Because if it does act like dither it will improve quantisation distortion and SNR in the same way dither does.

No

Posted
1 hour ago, davewantsmoore said:

No

Ok. A little deflating for this idea, but I cannot pursue the point further as I don't have the technical knowledge.

  • Volunteer
Posted (edited)
1 hour ago, Stereophilus said:

Ok. A little deflating for this idea, but I cannot pursue the point further as I don't have the technical knowledge.

 

This is a learning opportunity. Go dig, search, read. You're more motivated than anyone else to find the explanation here (I'm already happy to consign everything else after the fact that noise propagates with shielded Ethernet to the "wotevs" column as it matters not if you know how to avoid that). 

 

You could always take your hypothesis to ASR and someone there might pick it up and run with it. A fair few of them actually know what they are doing. Stay open-minded, test your conclusions and then they can stand up to scrutiny. Experiment and have fun; remember:

[image: lab.jpeg]

 

I think that your investigation is nothing more than satisfying your own curiosity, and that's absolutely ok. I think that no matter what you might discover, trying to find, sell, promote or otherwise control any upsides would be impossible; the unintended propagation of noise in the digital realm, when the prevention of it is already known, means folk would not countenance deviation from the goal of digital purity. They can tune anything after the fact (DSP, any analogue influences, etc.) with greater certainty, so why bother doing otherwise?

Edited by El Tel

Posted
42 minutes ago, El Tel said:

You could always take your hypothesis to ASR and someone there might pick it up and run with it. A fair few of them actually know what they are doing.

I would caution against this. You'll be ridiculed, and even if someone tried to measure something, it'd be a foregone conclusion they'll find nothing. It would be impossible for the noise levels you're talking about to have a measurable effect on the audible output, especially with ASR's blinkered approach to audio output.

  • Volunteer
Posted
2 minutes ago, Ittaku said:

I would caution against this. You'll be ridiculed and even if someone tried to measure something, it'd be a foregone conclusion they'll find nothing. The noise levels you're talking about having a measurable effect on the audible output would be impossible, especially with ASR's blinkered approach to audio output.

 

I get what you're driving at, but if it were framed according to the hypothesis that noise is propagated by the shielding, and we let them prove that (I have previously proven it myself whilst chasing down a different issue with a ham radio friend of mine), then the conversation could be coaxed into the realm of "so, noise propagates; what effects can we see when it does, and what influence can we have on the noise itself within the wider picture of the sound output?" That can either be impacts that can be heard, or frequency changes of the actual output that can be plotted.

Posted
10 hours ago, El Tel said:

 

This is a learning opportunity. Go dig, search, read. You're more motivated than anyone else to find the explanation here (I'm already happy to consign everything else after the fact that noise propagates with shielded Ethernet to the "wotevs" column as it matters not if you know how to avoid that). 

 

You could always take your hypothesis to ASR and someone there might pick it up and run with it. A fair few of them actually know what they are doing. Stay open-minded, test your conclusions and then they can stand-up to scrutiny. Experiment and have fun; remember:

so why bother?

It doesn’t have to be ASR.  
 

http://archimago.blogspot.com/2015/02/measurements-ethernet-cables-and-audio.html?m=1

 

Same method, same finding. Does it surprise? What he says in the final conclusion says it all.

  • Like 2
  • Volunteer
Posted
1 hour ago, Addicted to music said:

It doesn’t have to be ASR.  
 

http://archimago.blogspot.com/2015/02/measurements-ethernet-cables-and-audio.html?m=1

 

Same method, same finding:  does it surprise?   what he says in the final conclusion says it all.

 

Good digging. This is merely comparing cable to cable based on post-DAC output, but it certainly puts a stake in the ground. And yes, you have a point - the conclusions say it all for the twisted pairs themselves, but they don't take the shielding into account. His use of the ADC is way more sophisticated than the mic placed in front of the studio monitor used in my interference testing (although I was not measuring accuracy, merely looking for significant noise artefacts coming from a pedestal fan deliberately placed on the switch itself).

Posted (edited)
1 hour ago, El Tel said:

 

Good digging. This is merely comparing cable to cable based on post-DAC output but it certainly puts a stake in the ground. And yes, you have a point - the conclusions say it all for the twisted pairs themselves, but doesn't take into account the shielding. His use of the ADC is way more sophisticated than the mic placed in front of the studio monitor used in my interference testing (although I was not measuring accuracy, merely looking for significant noise artefacts coming from the deliberate introduction of a pedestal fan placed on the switch itself).


I have never experienced noise affecting data on an Ethernet cable (mostly 5e) or switches. And I've seen the noisiest places on earth, yet the client can transmit 12GB of data to a server day in, day out! If anyone here would take the time to understand how the Ethernet protocol works: it's literally perfect regardless of what "noises" you say affect it, provided you work within the physical and soft boundaries of Ethernet. It's that simple. Like the conclusion says, "if it sounds different then something is faulty", and from a service point of view that is 100 percent spot on! Note most clients operate no better than 5e and a $10 switch; they may invest in firewalls, but that's it! Getting perfect data to a terminal and server is all that's needed; the rest is just extras! BTW, none are shielded! Even if it was shielded Ethernet, the transmission of the signal wouldn't be affected by HF unless there's a fault somewhere! Even better is wifi: where we sat, a phone streaming near a noisy microwave played uninterrupted; however, when the FM/AM radio was placed near it, the FM reception was affected so badly it was not audible! And as we all know, FM/AM isn't digital. Nuff said!
 

Edited by Addicted to music

Posted
11 minutes ago, Addicted to music said:


I have never experienced noise affecting data on an Ethernet cable (mostly 5e) or switches. And I've seen the noisiest places on earth, yet the client can transmit 12GB of data to a server day in, day out! If anyone here would take the time to understand how the Ethernet protocol works: it's literally perfect regardless of what "noises" you say affect it, provided you work within the physical and soft boundaries of Ethernet. It's that simple. Like the conclusion says, "if it sounds different then something is faulty", and from a service point of view that is 100 percent spot on! Note most clients operate no better than 5e and a $10 switch; they may invest in firewalls, but that's it! Getting perfect data to a terminal and server is all that's needed; the rest is just extras! BTW, none are shielded! Even if it was shielded Ethernet, the transmission of the signal wouldn't be affected by HF unless there's a fault somewhere! Even better is wifi: where we sat, a phone streaming near a noisy microwave played uninterrupted; however, when the FM/AM radio was placed near it, the FM reception was affected so badly it was not audible! Nuff said!
 

The point of my thought experiment was not to dispute anything at all relating to data integrity.  The point was to question if spurious noise piggy-backing on shielded network cables (not the standard unshielded ones) could influence DAC output by acting as a form of dither.  I understand what you are saying. I understand cable shielding is not needed in ethernet transmission (at this level anyway).

 

The scenario proposed is a non-standard one. A well-meaning audiophile tries out a Cat 7 cable to his/her DAC/streamer and hears a difference. Could that difference be due to the shield on the cable conveying noise to the DAC, altering the DAC's behaviour in an audible way?

 

Now, I need to do more reading on how this might work at the DAC end.  It’s just a thought bubble at this point.

  • Like 2
  • Volunteer
Posted (edited)
22 minutes ago, Stereophilus said:

A well-meaning audiophile tries out a Cat 7 cable to his/her DAC/streamer and hears a difference. Could that difference be due to the shield on the cable conveying noise to the DAC, altering the DAC's behaviour in an audible way?

 

This bit here is the crux of it. I know you're wanting to investigate the downstream effects in the DAC more and to isolate what parts are being affected. That's a laudable aim in the pursuit of the full picture. But to avoid the conveyed noise in the first place and then shuffle your DAC processing/settings around to get a controlled outcome is the ideal. Everything else is, well, it's just noise.

 

You should totally continue to investigate your hypothesis - it's a better place to start than most of the whacko stuff I've seen others come out with.

Edited by El Tel
  • Like 2
Posted
3 hours ago, Addicted to music said:

It doesn’t have to be ASR.  
 

http://archimago.blogspot.com/2015/02/measurements-ethernet-cables-and-audio.html?m=1

 

Same method, same finding:  does it surprise?   what he says in the final conclusion says it all.

Surprise - no! Each of us will have a different experience with a topic such as networking cables and components. There is always more than one way of considering or experiencing a topic on which there is no absolute correct position. It doesn't surprise me. Does it surprise you? Read this and other sections on the site.

 

https://audiobacon.net/2017/07/09/sotm-iso-cat6-special-edition-the-flavors-of-audiophile-ethernet/3/

John

  • Like 1
Posted
5 hours ago, Addicted to music said:

 

He doesn't need to go as far as saying it is impossible.    Simply saying:

 

Quote

 

 that device should be returned because it is obviously defective

 

Seems sufficient.

 

The advertising standards case he linked is interesting.

 

Quote

The Chord Company explained that it was difficult to technically measure the improvement in sound quality which their Ethernet cables produced compared with standard Ethernet cables

 

What people need to understand, no matter how they feel about whether their cables and equipment, etc., make a difference to their sound... is that this statement is complete BS... and their only save is that the word "difficult" is subjective ('cos they can say, "no, well, it's difficult for us").

 

What they may mean, is that it's difficult to show those measurements in a way that is palatable to their target market..... but that's a different thing altogether.

 

They could measure things which are 1000s of times smaller than anything that could even be remotely audible, and point to them and say "see, there it is"..... and perhaps they don't do that because that would just be a jumping off point for a discussion about how their assessment of the evidence was wrong.

 

.... or what I think is more likely is that they don't even measure things that small, at least not in any sort of reproducible way.... which means they're better off just not going down that track at all.

  • Like 1
  • Volunteer
Posted
23 minutes ago, davewantsmoore said:

or what I think is more likely is that they don't even measure things that small, at least not in any sort of reproducible way

 

Yep, this.

 

Let's not conflate what anyone thinks they heard or experienced with actual plotted measurements.

 

My pedestal fan test, the outcome of which was seeing a 50Hz echo on the frequency plot when using S/FTP that then disappeared when swapping to UTP, should be pretty easy to replicate. You may not even hear a difference, but if I could see it on a mic sample, you would see it on a plot using something like the ADC and some software on the PC plugged into the ADC output.
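For anyone wanting to replicate that kind of plot-based check, here is a minimal numpy sketch of the analysis end. The capture is synthetic (broadband programme noise plus a 50Hz spur at an invented level, standing in for the mains leakage described above); a real test would feed ADC samples in instead.

```python
import numpy as np

fs, n = 48_000, 1 << 16
t = np.arange(n) / fs
rng = np.random.default_rng(2)

# Synthetic stand-in for the captured output: broadband programme noise plus
# a mains-rate 50Hz spur. All levels here are invented for illustration.
programme = 1e-4 * rng.standard_normal(n)
spur = 1e-3 * np.sin(2 * np.pi * 50 * t)
capture = programme + spur

# Windowed FFT: the same kind of frequency plot a measurement rig would draw.
spec = np.abs(np.fft.rfft(capture * np.hanning(n)))
freqs = np.fft.rfftfreq(n, d=1 / fs)
peak_hz = freqs[np.argmax(spec)]
print(f"dominant spectral component: {peak_hz:.1f} Hz")  # lands at the 50Hz spur
```

Swapping cables between captures and comparing the height of the bin near 50Hz is exactly the S/FTP-vs-UTP comparison, just with numbers instead of ears.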

 

Those who are using shielded cables don't know why they hear something they prefer. Let's just cut to the chase and be honest about enjoying the distortions they display from upstream kit.

Posted
3 hours ago, Stereophilus said:

The point of my thought experiment was not to dispute anything at all relating to data integrity.  The point was to question if spurious noise piggy-backing on shielded network cables (not the standard unshielded ones) could influence DAC output by acting as a form of dither.  I understand what you are saying. I understand cable shielding is not needed in ethernet transmission (at this level anyway).

 

The scenario proposed is a non-standard one. A well-meaning audiophile tries out a Cat 7 cable to his/her DAC/streamer and hears a difference. Could that difference be due to the shield on the cable conveying noise to the DAC, altering the DAC's behaviour in an audible way?

 

Now, I need to do more reading on how this might work at the DAC end.  It’s just a thought bubble at this point.

A possible solution looking for an issue to solve?

 

Cat6a is shielded and it is also tested in the link I provided.

 

Posted
On 06/03/2022 at 11:35 PM, Stereophilus said:

Bad right?  Maybe not.  I wonder if permitting network noise into the DAC is actually acting like the addition of dither… That is, effectively increasing the noise floor in the DAC to improve quantisation distortion and SNR.  This could potentially be audible.

While this is at the extreme fringes of what I'd consider might be happening in a system, why are you considering the effect to be more akin to dither instead of jitter or distortion in the output signal? There's every possibility that what's being heard is coloration and not an improvement on the original source.

 

Has anyone performed these tests, and do they yield any positive/negative effects higher than about -120dB?

 

While I'd be curious to see A vs B output tests of a single DAC with different-spec Ethernet cables, I'm not expecting to see anything substantial - if anything, a less positive result for the shielded cables. The reason the manufacturers seem intent on pushing the Cat 7 cable bandwagon is "surely a higher-spec cable is better, right?", without any logic applied as to whether this will create a better signal transmission outcome.

 

All I want in my system is something that renders the incoming signal as close as possible to what the ADC captured in the studio. 

  • Love 1
Posted
49 minutes ago, BugPowderDust said:

All I want in my system is something that renders the incoming signal as close as possible to what the ADC captured in the studio. 

Yes, me too. But we don't know what the ADC captured. So perhaps the best we can do is interpret what we hear from our systems and judge whether that is what the recording was intended to capture. Then, if desired, add a dash of preferred coloration/distortion/noise (like herbs and spices to food).

  • Like 2
Posted
17 minutes ago, dbastin said:

But we don't know what the ADC captured. 

No, but we do know what the ADC captured and what was subsequently digitally mastered. That gets rendered into PCM (or maybe DSD), and we can validate it at the bit level in perpetuity.

 

I want an accurate system, not a coloured one.  
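The bit-level validation point is worth making concrete. A minimal sketch using Python's standard hashlib over a pretend PCM payload (the byte values here are made up; any byte stream behaves the same way):

```python
import hashlib

# Pretend PCM payload: values are arbitrary, any byte stream works the same way.
pcm = bytes(range(256)) * 1000          # 256,000 "sample" bytes

digest = hashlib.sha256(pcm).hexdigest()

# A bit-identical copy hashes to the same digest...
copy = bytearray(pcm)
assert hashlib.sha256(copy).hexdigest() == digest

# ...while flipping a single bit anywhere changes the digest completely.
copy[12345] ^= 0b00000001
assert hashlib.sha256(copy).hexdigest() != digest
print("single-bit change detected")
```

This is why "the cable changed the bits" is the one claim that is trivially testable: hash the stream at the source and at the endpoint and compare.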

Posted
5 minutes ago, BugPowderDust said:

No, but we do know what the ADC captured and was subsequently digitally mastered. That gets rendered into PCM (or maybe DSD) and we can validate that at the bit level in perpetuity.

 

I want an accurate system, not a coloured one.  

Umm, I will rephrase... we don't know what the recording captured by the ADC (or the digital master) sounds like. We get the verified bits delivered through things that could, and seem to, add various colours in different ways.

 

Even when witnessing the Absolute Sound - a live instrument - you hear what it sounds like, but to compare it in your system you'd need to record it and then play it back, and remember what it sounded like live (not what you wanted it to sound like). Potential changes occur during the recording (quality of mic, software, hardware, cables, etc.) and also during playback.

 

Perhaps a good reference is a conductor or music producer of an orchestra during the recording, doing his/her best to remember what it sounded like and comparing that memory to a playback of the recording of the very same performance. This approach is probably possible with other sized bands of acoustic instruments recorded live. Either way, a very tall order, probably with a high margin of inaccuracy.

 

To obtain 100% accurate playback, every piece of the playback system needs to be 100% accurate, neutral and colourless, and capable of reproducing it. This is likely impossible. So I'd suggest we audiophiles are in pursuit of a satisfactory representation of the Absolute Sound.

 

Presumably all the audio gear involved in the recording (including the playback systems used during recording) pursues the same goal, but the extent to which that is achieved???

 

From the moment a sound wave hits the microphone diaphragm, or guitar pick up, that is being changed with and/or without intent to craft the final result.

  • Like 1
