
Recommended Posts

Posted (edited)

So if optical is the way to go, what wavelength SFP do I use?

 

Or do I just install a DWDM network and tune the system? 

 

You just can't play Diana Krall on anything but 1598.89nm to do her justice. 

 

Should I be running the light levels near the upper or lower thresholds or somewhere around the middle? 

 

SFP vs SFP+ vs XFP.. Does bottom end get any better? 

 

 

We could really beat this subject to death.. 

 

 

Seriously thinking of relabelling the hundreds of old SFPs lying around at work as 'audiophile' and putting them up on eBay. 

 

 

Conspiracy theory #1637

 

It's got nothing to do with the Ethernet side of the switch, it's the different power supplies of each switch backfeeding different amounts of noise into the premises power cabling causing the hifi equipment to sound different.. 

 

To fix the problem, spend more on power cables. 

 

But in all seriousness... 

If anyone organises a GTG in Melbourne, I might be able to supply a range of carrier grade switches and some single and multi mode optics to play with for the day..  Might even throw BiDi into the mix... Got a 1km fibre launch lead hanging around somewhere too. 

 

On a final note. 

 

How would streaming over ATM sound, anyone still on ADSL? 

 

Edited by Hytram

Posted
48 minutes ago, Hytram said:

We could really beat this subject to death..

Should we assume you are being sarcastic for the humour?

 

I wonder if anyone has tried all those variables?  They might matter, but I'm guessing you'd think it would be a waste of time.

 

Anybody wanna explore these types of optical options, to get an idea if they actually matter to audio? Let's beat it to death to find out.

Posted
1 hour ago, Hytram said:

To fix the problem, spend more on power cables.

In my limited experience with my 2006 Belkin modem (10/100), the LPS mattered, the power cord mattered, the power conditioner mattered.  Each was a small-to-moderate gain; together, quite a considerable improvement.

Posted
5 hours ago, Stereophilus said:

I don’t think I’ll be buying one at that price tbh.

Me neither, but it might be fun to hear ... or maybe too hard to un-hear, and agonising trying not to justify the desire to keep it and its expense.

Posted
5 hours ago, dbastin said:

Should we assume you are being sarcastic for the humour?

 

I wonder if anyone has tried all those variables?  They might matter, but I'm guessing you'd think it would be a waste of time.

 

Anybody wanna explore these types of optical options, to get an idea if they actually matter to audio? Let's beat it to death to find out.

@Hytram is being facetious. It’s clear from their post that they know more about networking than the average bear. 
 

This thread is a mess and should be locked. What started as “why a standard switch will suffice”, had plenty of really useful recommendations for improving your network peppered throughout. It has now turned into a bunch of people (with admittedly no understanding of networking or IT) denying expert opinions and instead suggesting we test every permutation in case it makes a difference when 30 years of real world science tells you it won’t.

 

I have to wonder at this juncture, how many audiophiles are climate change deniers? The Venn diagram would be amusing for sure.


Posted
7 hours ago, Hytram said:

How would streaming over ATM sound, anyone still on ADSL? 

 

Wouldn't the other customers, behind you in the queue, get annoyed about you standing at the ATM listening to music?

Posted (edited)

 

45 minutes ago, recur said:

@Hytram is being facetious. It’s clear from their post that they know more about networking than the average bear. 

 

Damn, I usually get called a cynical bastard..

 

Originally a licensed electrician, and for the last 20 years building, maintaining and fault finding infrastructure from core and access to edge in major ISP networks, I haven't got much past layer 3, so most of my experience is practical and physical... 

 

All I'll say is there are things in hifi that simply do not make sense to me, things that should not matter but do.

 

Things that seem to defy logic. 

 

I think the main difference between measuring something and hearing something is the end measuring device: one is mechanical/digital and the other is flesh/emotional and not calibrated. 

 

I am on the fence with all this Ethernet stuff; I am only just back into the hifi area because of renovations and haven't even started to prove/disprove power cables yet! 

 

 

Edited by Hytram
Posted
1 minute ago, bob_m_54 said:

Wouldn't the other customers, behind you in the queue, get annoyed about you standing at the ATM listening to music?

If the line went round the block, would it be a token ring? 

Posted
8 hours ago, Hytram said:

I think the main difference between measuring something and hearing something is the end measuring device: one is mechanical/digital and the other is flesh/emotional and not calibrated. 

 

I am on the fence with all this Ethernet stuff; I am only just back into the hifi area because of renovations and haven't even started to prove/disprove power cables yet! 

 

This is a core sentiment. 

 

Measurements are only as good as the experiment behind them; my other half isn't calibrated, but having her observe that I'd 'changed something, it sounds different' after an Ethernet change was the start of this journey.    

 

Probably worth fostering an understanding among us of what can change with different Ethernet configurations.  

 

I think this (http://www.youtube.com/watch?v=UhBOnebNhe8&t=0m51s) is a crock... though happy to be corrected. And it's relevant to the topic. 

 

Thoughts?

Posted (edited)
15 hours ago, recur said:

This thread is a mess and should be locked. What started as “why a standard switch will suffice”, had plenty of really useful recommendations for improving your network peppered throughout. It has now turned into a bunch of people (with admittedly no understanding of networking or IT) denying expert opinions and instead suggesting we test every permutation in case it makes a difference when 30 years of real world science tells you it won’t.

I gather high end audio use of ethernet is a fairly recent thing.  And it behaves differently to conventional ethernet applications - our audio systems effectively put ethernet under a specialised microscope and thus reveal issues.

 

What probably began as a means of getting audio data from A to B has likely led some to explore how that can be optimised for audio - and to make discoveries.

 

Now I'd suggest it is about how we can adapt ethernet tech from the 'real world' for the new world of audio - a purpose it was not really intended for.

 

Things that don't usually matter do matter to audio, and it seems we are in the relatively early phase of discovering what does and does not matter for audio. 

 

I know I don't know much about the technicalities, and frankly don't really need to.  What matters to me is whether something improves sound quality - to me, and other like-minded people.

 

I am encouraging people to be open minded, to experiment, to challenge convention.  That leaves us with a lot of possibilities to try - many won't help us, but we are likely to discover something that does.

 

I would hope we consider the opinions and guidance of the experts in this community, to help guide people's experiments.

 

I apologise for suggesting experiments that may have been done to death in conventional ethernet applications - that might not be so for high end audio applications.  But if you know something has been tried and tested for high end audio, please let us know.

 

People are welcome to settle for the conventional approach.

 

If people feel this is going too much in a different direction to the original thread topic, maybe I could create another thread?

Edited by dbastin

Posted
2 hours ago, rmpfyf said:

I think this (http://www.youtube.com/watch?v=UhBOnebNhe8&t=0m51s) is a crock... though happy to be corrected. And it's relevant to the topic. 

 

Thoughts?

Excellent... he confused not only himself but most probably others too, talking about data correction and streaming via USB cable, then quickly jumping to streaming and network cables, finishing it off with audiophile jargon. He talks about it like it takes weeks to resend correct data over a network cable. Anyway...

 

Some basics for dbastin, as a follow-up on what recur said earlier:

 

 

There are 4 types of streaming services people would typically be using:

- personal cloud based storage

- P2P and on demand services (Spotify, Netflix, YT, etc.)

- Real time stream from internet (radio stations, live broadcast etc.) 

- Home/local network stream (from NAS or other devices within network)   

 

Streaming protocols:

- RTP, RTSP, RTMP, HLS, HTTP/s, UPNP, DAAP and many others with their subsequent/ underlying protocols 

For us, the important ones are HTTP/s, UPNP and potentially DAAP in the case of Apple; all three use TCP.

The only potential exception would be real-time streams, which might use UDP.

Information about the differences between the protocols is available on Wikipedia.

 

Besides using TCP to transfer data between networked devices (streamer and storage via router/switch), we need an app/OS which guarantees sufficient buffering, controls the stream with an asynchronous (pull) method, and acts as the controller (master clock). Here comes the important part: it's up to the app/OS of the streamer/renderer to set the right buffer size and to issue the proper commands for when and how to pull data from upstream (cloud/P2P/NAS). In general, if the buffer size is sufficient and properly used, there's no need to worry about other aspects of the streaming protocol, since the music/stream is cached in local memory prior to playback; some apps use a progressive-download method and cache an entire song or more. As said before, an easy check is to unplug the network cable and listen for how long playback lasts (as mentioned elsewhere, JRiver in its basic config lasts about 3-5 seconds with a 16/44 PCM file, Spotify an entire song). In that case the quality of the upstream, including cables/connections and switch/router, is completely irrelevant, because the data was received intact and ahead of playback. It then behaves as if it were originally stored on a USB/HDD drive attached to the streamer, which is why people can't distinguish between local and networked playback of the same file.

Of course, this is all possible only if the streamer and the entire network use appropriate network/streaming standards and protocols. If they are purpose-built to ignore buffering and TCP, and work in isochronous (push) mode, you might experience sound quality degradation; how much depends on different factors. But then I would ask why you are trying to fix cables/routers if they aren't broken and, in a proper setup, completely irrelevant.
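
To make the buffering point concrete, here is a minimal pull-style sketch in Python: it pre-fills a local buffer from an HTTP source before 'playback' begins, which is why the arrival timing of individual packets upstream stops mattering once the buffer is healthy. The URL, chunk size and buffer target are placeholders, not a real service or a real player:

```python
# Minimal pull-style buffering sketch (illustrative only).
# The URL, chunk size and buffer target below are placeholders.
import time
import urllib.request
from collections import deque

STREAM_URL = "http://example.com/track.flac"   # placeholder source
CHUNK = 64 * 1024                              # bytes pulled per read
TARGET_BUFFER = 5 * 1024 * 1024                # ~5 MB cached before playback starts

def play(chunk: bytes) -> None:
    # Stand-in for handing samples to the renderer/DAC at a fixed rate.
    time.sleep(0.01)

buffer = deque()
buffered = 0

with urllib.request.urlopen(STREAM_URL) as src:
    # 1. Pre-fill: pull data ahead of playback (TCP guarantees it arrives intact and in order).
    while buffered < TARGET_BUFFER:
        data = src.read(CHUNK)
        if not data:
            break
        buffer.append(data)
        buffered += len(data)

    # 2. Playback: consume from the local buffer and top it up as we go.
    #    Once this loop runs, upstream packet timing is hidden behind the buffer.
    while buffer:
        play(buffer.popleft())
        data = src.read(CHUNK)
        if data:
            buffer.append(data)
```

Unplugging the cable mid-song, as suggested above, simply shows you how much is sitting in that buffer.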

 

We can go into more detail if anyone's interested, but we would need to work on a specific case, as the options are pretty much endless.

 

We can further discuss how it all affects the DAC...

 

some reading

https://en.wikipedia.org/wiki/Transmission_Control_Protocol

https://en.wikipedia.org/wiki/User_Datagram_Protocol

https://en.wikipedia.org/wiki/Digital_Audio_Access_Protocol

https://en.wikipedia.org/wiki/Data_buffer

How the control mechanism works: http://upnp.org/specs/av/UPnP-av-AVArchitecture-v2.pdf

Plenty of other papers about streaming

 

2 hours ago, dbastin said:

Now I'd suggest it is about how we can adapt ethernet tech from the 'real world' for the new world of audio - a purpose it was not really intended for.

 

It has existed for decades - the diagram below is over 20 years old and still valid. Companies have just developed better/newer codecs, protocols and mechanisms to further improve its aspects and interfaces.

 

[Image: how-real-streaming-works.png]

 

https://patents.google.com/patent/US5793980 

 

 

Posted
6 hours ago, kukynas said:

...in general, if the buffer size is sufficient and properly used, there's no need to worry about other aspects of the streaming protocol, since the music/stream is cached in local memory prior to playback...

 

 

So - we can assume the data is getting there fine.

 

What else is there?

Posted
13 minutes ago, rmpfyf said:

So - we can assume the data is getting there fine.

 

What else is there?

 

Are you going to say jitter?  If so, perhaps you could describe, or link to, a little detail on how the mechanism works and how the ethernet switch affects it, especially if the final device playing the music has local buffers feeding the local DAC circuitry.  Wouldn't any OS jitter be isolated from upstream and have to come from that final device only?   

 

I know we talked about this before with regard to the raspberry Pi and USB DAC, but I am having great trouble seeing how the upstream network matters.

 

Posted
30 minutes ago, rmpfyf said:

So - we can assume the data is getting there fine.

 

What else is there?

I haven’t been following this thread closely so I may well be missing a nuance here. I’m interested in the “what else” that you seem to be alluding to 

Posted (edited)
9 hours ago, kukynas said:

It has existed for decades

Thanks for spending the time to share that info.  I honestly appreciate it.

 

I am confident the audio data gets from A to B completely intact.  As proof, I am achieving very good sound quality with my 2006 iiNet/Belkin Bob modem operating as a switch (the main modem is upstream).  I didn't choose it for its performance; it was a spare lying around and handy as a wifi AP too. Interestingly, given its age, its speed is limited to 10/100 - handy for audio?

 

However, it performs much much better with:

 

- LPS (Core Audio)

- upgraded power cord (Shunyata ztron Alpha digital)

- ground treatment (Akiko USB tuning stick)

- Power conditioner (Shunyata Triton V2)

 

Again, this is stuff I already had so was available to experiment with.

 

And with Roon I gather it is controlling the data flow/management.

 

On a related subject, I changed from Wireworld Starlight 7 Cat 8 (which is far superior to Cat 5e) to Platinum, which was a considerable step up in SQ - much greater than I'd been expecting; I consciously expected diminishing returns on the cost.  The puzzling thing is, these cables are exactly the same except for their conductors - Starlight is silver plated copper, Platinum is solid core silver.  Hence this interesting experiment.

 

And a final twist: I changed the Cat 6 cable (supplied by Antipodes) that ran from the Bob to the wall outlet to a WW Starlight.  It took maybe 40-60 seconds, while music kept playing with the data going from my server to renderer via the Bob. The cable change gave a small improvement in sound quality which took me only a couple of songs to be certain of - and that cable was not involved in the data transmission!

 

All the while, the data was being transmitted as intended - as per decades of practice.

 

At that point I think I gave up trying to understand this mysterious stuff.

 

Before someone says it was some sort of bias, I am on the fence with this - I really do believe the experts but also follow and believe the outcomes many are getting from the audiophile stuff. 

 

I've set this all out not to brag or to seek an explanation for why this all occurred, but to illustrate how, despite ethernet existing for decades, its application to high end audio seems to have a lot of unpredictable and perhaps unexplainable outcomes.

 

So I agree with rmpfyf ... other than data transmission, what else is going on here?  And how do we find out?

Edited by dbastin

Posted
1 hour ago, aussievintage said:

Are you going to say jitter?  If so, perhaps you could describe, or link to, a little detail on how the mechanism works and how the ethernet switch affects it, especially if the final device playing the music has local buffers feeding the local DAC circuitry.  Wouldn't any OS jitter be isolated from upstream and have to come from that final device only?   

 

I know we talked about this before with regard to the raspberry Pi and USB DAC, but I am having great trouble seeing how the upstream network matters.

 

51 minutes ago, sir sanders zingmore said:

I haven’t been following this thread closely so I may well be missing a nuance here. I’m interested in the “what else” that you seem to be alluding to 

 

(In the spirit of this being a discussion forum) well, we seem to have some broad agreement as to why it's likely not the integrity of the data stream that's affected. This is progress, and agrees with the few networking specialists that have chimed in also. It also debunks a bit of snakeoil among vendors (didn't mean to pick on Wireworld specifically). 

 

We could probably agree that if data gets to the NIC OK then it's probably getting to the DAC complete also, or we'd have all sorts of audible giveaways that data is missing - it'd not be a little detail here and there, it'd be complete chunks (ie packets) of data missing, and so there'd be intermittent silence. (I've only had this once with a bad cable and the problems were obvious). 

 

So what else can there be? Before the regulars get started, let's not rush to discrediting what people hear.

 

Yes, @aussievintage, if the data is getting there complete and in the right order then the only thing that can reasonably be affected is the timing with which it's delivered to the DAC IC. That'd manifest as jitter.

 

The mechanism is complex however and changes for most any configuration between 'Ethernet NIC is over here' and 'DAC IC is over there'. 

 

My current configuration (which is in the process of being upgraded to something more resilient to jitter) is Ethernet NIC in PC > USB > Amanero > I2S to DAC IC. From a ton of experiments I can attest (and again, this is speaking only to my own experiences):

  • The USB buffer and clocking mechanism on the Amanero isn't perfect. The buffer is small, clocking is FPGA-driven - cycle-to-cycle differences in processing overhead or timing will affect clocking accuracy
  • USB sends data regularly though what's in the packet can vary depending on what's upstream

 

Between these two there's a clear mechanism for 'what happens at the PC can affect timing out the DAC'. 

 

Amanero recommends a galvanic isolator in place. I don't have it, because I've been lazy. My Amanero is powered independent of the PC off its own LPS though if it weren't the case (or if there were any interaction) the power to the FPGA could well be (as it is on a PC) rectified from an upstream DC voltage source that isn't too clean (from an AC source that may be compromised). On a PC this is common; CPU power is relatively high current and inherently noisy given typical designs intended to manipulate voltage/power states extremely quickly and not necessarily with extremely low noise (which remains present even if we lock CPU speed post-POST and LPS it all into a standard motherboard). Small, periodic cycle-to-cycle variances in CPU performance are inherent - anyone interested is welcome to download their CPU technical manuals and read up (stuff like this is why IMHO a well-designed 5V ARM rig tends to sound so good). This is replicated throughout any aspect of the PC - power tolerances are a reason why we have buffers everywhere, buffers everywhere are not a panacea for all potential jitter. 
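
(As a software-side aside: on a Linux playback box the cpufreq sysfs interface is one way to at least pin the frequency governor, related to the 'lock CPU speed' point above. A rough sketch only - it assumes the system exposes these files and is run as root, and as noted it is not a cure for the underlying noise issues:)

```python
# Rough sketch: pin every core's cpufreq governor to "performance" on Linux.
# Needs root; not every system (or VM) exposes these sysfs files.
from pathlib import Path

for gov in sorted(Path("/sys/devices/system/cpu").glob("cpu[0-9]*/cpufreq/scaling_governor")):
    previous = gov.read_text().strip()
    gov.write_text("performance")      # stop frequency scaling on this core
    print(f"{gov.parent.parent.name}: {previous} -> performance")
```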

 

In the same vein there's of course the potential for conducted **** along the Ethernet cable making a difference, which is why we should be running UTP to our machines or at least some sort of isolator with an STP cable if using the latter helps with interference noise along the route. Yes, Ethernet has an isolator locally though these are not perfect isolation just because they're an isolator. Isolation is used with standard Ethernet in many applications. I run STP (last leg) and an isolator, and the results are audible.

 

So what else could affect any inconsistency in what's getting into the USB interface?

 

Aside from quality-of-power issues there's a general-purpose CPU in mine doing more than just playing music - this isn't a CD player running firmware to do one thing only with very few interrupts related to playback only. Even with multiple cores, interrupts are shared across the CPU. Packets landing at the NIC should only generate a CPU interrupt when they're determined relevant to my machine (or broadcast/multicast), though it's a built-in Ethernet port and I have no idea how much of this 'is this mine' checking is handled by the CPU. I can tell you that moving the Ethernet interrupt to a different core from playback makes a difference. (I'd also point out that the oscillators timing what a PC actually does are not reference-grade items - again, reasons for buffers). 
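
For anyone on Linux who wants to try the interrupt-affinity experiment, a rough sketch of the idea follows. The interface name and core number are examples only, it needs root, and the irqbalance daemon (if running) may re-spread the IRQs later:

```python
# Rough sketch: pin the Ethernet NIC's IRQ(s) to one core, away from playback.
# Interface name and core number are examples only; needs root on Linux.
from pathlib import Path

IFACE = "eth0"    # assumed NIC name - check `ip link` on your machine
NIC_CORE = "3"    # CPU core that should take the network interrupts

def irqs_for(interface):
    """Find IRQ numbers whose /proc/interrupts line mentions the interface."""
    irqs = []
    for line in Path("/proc/interrupts").read_text().splitlines():
        if interface in line:
            irqs.append(line.split(":")[0].strip())
    return irqs

for irq in irqs_for(IFACE):
    # smp_affinity_list takes a CPU list such as "3" or "2-3".
    Path(f"/proc/irq/{irq}/smp_affinity_list").write_text(NIC_CORE)
    print(f"IRQ {irq} -> core {NIC_CORE}")

# Note: if the irqbalance daemon is running it may move these again later.
```

The playback process itself can then be kept on a different core with something like taskset.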

 

So potentially there's an avenue for the periodicity of packets from the switch to matter. The networking heads here are right, a router/switch that's CPU-driven over multiple ports will suffer the same issues here a CPU managing your audio will, whereas a pro-grade switch with dedicated resources per port simply won't. Sticking a reference-grade oscillator+LPS in the thing will help but it's something of a band-aid if the overall design is compromised. (I've also got no idea why people coo over 'audiophile' switches and then don't run the shortest possible Ethernet cable they can find into their device - it's about as silly as paying megadollars for a super clock for your DAC, and then installing it three rooms away... and sending the signal over an Ethernet cable... 'shortest possible cable' is not a 1m cable!).

 

Pro-grade switches can indeed be had fanless and small-ish. 

 

So there, a 'good' regular switch can suffice - if it's not injecting AC noise into DC, if it's got a PHY per port and isn't CPU-driven (others have described this better), if it's installed with UTP cabling as short as possible to the device it's intended for, if whatever it's plugged into has CPU offload relevant to whatever is processing your music... etc.

 

Or you can have proper buffering and reclocking at the playback end (which is what I'm trying for - results to come in a few months!)

Posted

(I'd offer that the above is just my journey, setups will differ, and all are welcome to throw rocks. 

 

Also, running a spanky switch with an increasingly shorter cable begs the question of why that capability doesn't just sit inside my PC, so my next experiment will be to run Ethernet over optical to an optical card in the PC with good CPU offload - assuming that's got a buffer and oscillator, the latter might be up for replacement, the card independently powered, etc etc...

 

...but this is a science experiment of my own time. You could see how A$1k retail for an etherREGEN in a box is not expensive for what it could be IMHO).

Posted (edited)
1 hour ago, rmpfyf said:

 

 

So there, a 'good' regular switch can suffice - if it's not injecting AC noise into DC, if it's got a PHY per port and isn't CPU-driven, if it's installed with UTP cabling as short as possible to the device it's intended for...

Thanks for the excellent and impartial summary.  This is the sort of post that is both practical and educational.

 

For those of us running bespoke, supposedly “low noise” server/streamers (in my case Antipodes) there isn’t really any easy way to alter the CPU interrupts.  Thus, most here would be looking for what you describe above: the most reasonably priced switch with low AC noise in to DC, a PHY per port, not CPU driven.  Presumably short cat 8 cable into server/streamer.  If not the Audiophile examples mentioned in this thread, then are there any others?

Edited by Stereophilus
Posted
1 minute ago, Stereophilus said:

Thanks for the excellent and impartial summary.  This is the sort of post that is both practical and educational.

 

Ha! It's fraught with my experience - I'm sure others will chime in with their own and we'll be back to vigorous disagreement shortly :D 

 

1 minute ago, Stereophilus said:

For those of us running bespoke, supposedly “low noise” server/streamers (in my case Antipodes) there isn’t really any easy way to alter the CPU interrupts.  Thus, most here would be looking for what you describe above: the most reasonably priced switch with low AC noise in to DC, a PHY per port, not CPU driven.  Presumably short cat 8 cable into server/streamer.  If not the Audiophile examples mentioned in this thread, then are there any others?

 

 

Well... if we can get over it not being about the data, etherREGEN is about the only thing on market really designed ground-up for purpose. That's not to suggest it's 'best' (you're at the mercy of whatever power supply/oscillators/etc are in it), just to suggest that it includes all that's listed.

 

I'd defer to @recur on the rest as he was kind enough to point out other options to me - Cisco Catalyst, Cisco Meraki, Juniper and others all have fanless options in the 8-12 port range that are so equipped on the PHY/CPU driven bit, though you'll probably be on your own managing leakage currents (eg. https://audiophilestyle.com/forums/topic/37034-smps-and-grounding/ - this said, I'd not be too surprised if the switches inherently offer decent performance and, in being intended to be very 'elsewhere' relative to your audio system, it might not matter). Also, unless you buy these new, an etherREGEN is cheaper and comes with a warranty/company you can annoy should anything go squiffy. Other 'audiophile' attempts seem to be cheap switches that are tweaked, which leads to valid comments around silk purses and sow's ears. 

 

It shouldn't surprise that the 'networking' bits of the etherREGEN, which is the work of two people in a small operation, likely don't match what Cisco does. Depending on how it's installed and how well your DAC does reclocking, there's every chance that what you'd hear out of an industrial switch vs an etherREGEN is just 'different' and that 'better' is a subjective thing.

 

So, swings and roundabouts. If you want something new that does well, the audiophile switches are what we've got and the etherREGEN is best in intent. Deeper pockets - try a Cisco Catalyst. Deepest? Wait for Paul Pang or similar to take one and tweak it (whether necessary or not).

 

Also - CAT8 is STP - would suggest UTP CAT 6 (pending environment, try both). 

Posted
On 01/02/2020 at 11:35 PM, rmpfyf said:

So - we can assume the data is getting there fine.

 

What else is there?

 

What else would you expect there? It sits in the cache (RAM) waiting to be played, the same way the same song sits on an HDD, except this one arrived fragmented into packets.

 

 

If you performed the previously suggested test, by now you know whether your streamer is buffering and what the size of the buffer is. I would suggest doing the simple test below and checking whether you can detect any differences; a sketch for confirming the two copies match follows the steps.

 

 

First: take two copies of the same file; save one onto a local hard drive (HDD/USB/etc.) and save the other onto cloud storage (Google Drive, OneDrive, etc.) or a NAS.

Second: load both files into the same playlist and perform an A/B comparison.
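
A minimal sketch of the 'confirm the copies match' step - the paths are placeholders for wherever the two copies were saved:

```python
# Confirm the local and networked copies are bit-identical before listening.
# Both paths are placeholders for wherever you saved the two copies.
import hashlib
from pathlib import Path

def sha256(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            h.update(block)
    return h.hexdigest()

local_copy = Path("/music/local/track.flac")      # on the streamer's own drive
network_copy = Path("/mnt/nas/music/track.flac")  # on the NAS or cloud mount

if sha256(local_copy) == sha256(network_copy):
    print("Copies are bit-identical - any audible difference is not in the data.")
else:
    print("Copies differ - fix that before comparing by ear.")
```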

 

On 01/02/2020 at 11:56 PM, aussievintage said:

 

Are you going to say jitter?  If so, perhaps you could describe, or link to, a little detail on how the mechanism works and how the ethernet switch affects it, especially if the final device playing the music has local buffers feeding the local DAC circuitry.  Wouldn't any OS jitter be isolated from upstream and have to come from that final device only?   

 

precisely...

On 02/02/2020 at 2:22 AM, rmpfyf said:

My current configuration (which is in the process of being upgraded to something more resilient to jitter) is Ethernet NIC in PC > USB > Amanero > I2S to DAC IC.

 

 still the same R2R DAC? 

 

On 02/02/2020 at 2:22 AM, rmpfyf said:
  • The USB buffer and clocking mechanism on the Amanero isn't perfect. The buffer is small, clocking is FPGA-driven - cycle-to-cycle differences in processing overhead or timing will affect clocking accuracy
  • USB sends data regularly though what's in the packet can vary depending on what's upstream

 

- yes, unfortunately

- yes, timing may vary, but only up to the USB transmitter, which sets the master clock down the line

   

I think your main issue here is your DAC design, and I'm a bit afraid it'll be pretty difficult to fix even with the best reclocking card available, unless it's mounted on pins right next to your DAC chip - but I might be wrong. If you want to stay with an R2R DAC, you might look for a PCBA with a modern digital front end which has all this sorted and still enjoy your R2R DAC's characteristics; I've heard only positive feedback from the local guys who are running them with a tube output stage.

 

Posted (edited)
12 hours ago, kukynas said:

What else would you expect there? It sits in the cache (RAM) waiting to be played, the same way the same song sits on an HDD, except this one arrived fragmented into packets.

 

 

If you performed the previously suggested test, by now you know whether your streamer is buffering and what the size of the buffer is. I would suggest doing the simple test below and checking whether you can detect any differences.

 

 

First: take two copies of the same file; save one onto a local hard drive (HDD/USB/etc.) and save the other onto cloud storage (Google Drive, OneDrive, etc.) or a NAS.

Second: load both files into the same playlist and perform an A/B comparison.

 

There's a slight difference - your thesis here neglects network-driven interrupts. Even an infinite buffer won't counter that.

 

12 hours ago, kukynas said:

- yes, unfortunately

- yes, timing may vary, but only up to the USB transmitter, which sets the master clock down the line

 

Not exactly - the USB receiver sets timing relevant to what it gets if async. Accordingly, everything upstream contributes.

 

The design is further limited by the timing mechanism not being fully independent of this.

 

12 hours ago, kukynas said:

I think your main issue here is your DAC design, and I'm a bit afraid it'll be pretty difficult to fix even with the best reclocking card available, unless it's mounted on pins right next to your DAC chip - but I might be wrong. If you want to stay with an R2R DAC, you might look for a PCBA with a modern digital front end which has all this sorted and still enjoy your R2R DAC's characteristics; I've heard only positive feedback from the local guys who are running them with a tube output stage.

 

Currently still on my R2R much of the time though not sure the DAC IC counts for much - timing errors resolve differently in newer chips but are still present.

 

As you suggest, my current mod is a buffer/reclocker right next to the IC... where it should be :)

Edited by rmpfyf
Posted
12 hours ago, kukynas said:

First: take two copies of the same file; save one onto a local hard drive (HDD/USB/etc.) and save the other onto cloud storage (Google Drive, OneDrive, etc.) or a NAS.

Second: load both files into the same playlist and perform an A/B comparison.

 

One last thing - I already do this. I have a bunch of bash scripts written to do this sort of stuff (including RAM disks, dropping Ethernet interfaces, etc). They're immensely useful. (I've also made a version supporting blind/double blind testing, but really, who has time, and I'm not that insecure - A/B works fine). 

 

Everyone should do this. I'm a Linux user so this is relatively easy for me, though my computer is very 'rough' to use; I'd encourage anyone to either use the same functionality or to write to their package developers and request it.
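
A sketch of the RAM-disk idea for anyone who wants to try it (simplified, not the exact scripts I use): copy the track into a RAM-backed path first, then play it from there, so neither the network nor the local disk is touched during playback. /dev/shm is tmpfs on most Linux distributions; the source path and player command are placeholders.

```python
# Sketch: copy a track into RAM and play it from there, so neither the
# network nor the local disk is touched during playback.
# The source path and player command are placeholders.
import shutil
import subprocess
from pathlib import Path

SOURCE = Path("/mnt/nas/music/track.flac")   # networked original (placeholder)
RAM_COPY = Path("/dev/shm/track.flac")       # tmpfs-backed copy on most Linux systems
PLAYER = ["mpv"]                             # substitute whatever player you use

shutil.copy2(SOURCE, RAM_COPY)               # pull the whole file into RAM first
subprocess.run(PLAYER + [str(RAM_COPY)], check=True)
RAM_COPY.unlink()                            # tidy up afterwards
```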

Posted (edited)

TLDR

 

Re: network switching... audiophile switching, please give me a break... sometimes this hobby can be a little ridiculous.

 

Just don't use cheap rubbish switching or rubbish cable and you will be fine. Cheap and nasty like TP-Link, D-Link, etc.

 

If you want quality, buy from a quality vendor like Aruba, Allied, Cisco or Juniper etc... and light your entire house up properly. Just get it off fleabay cheap and it'll do the job for many years to come.

 

Edited by sjay
