
Ethernet Routers for Audio



9 minutes ago, BugPowderDust said:

@Ittaku how is your i7 NUC going in your system? Maybe we both need to upgrade to something fancy because Dale told us to. We're leaving gains on the table!

Great, couldn't tell it apart from the $50K(?) Taiko streamer. Guess I need a more resolving system.

  • Like 1



21 minutes ago, El Tel said:

 

Ethernet over copper incorporates galvanic isolation of every switch port, if an upstream change is made, the only way for a difference (such as interference) to occur elsewhere in the network is for that to traverse a switch port by circumventing the isolation. That is possible when using SFTP (like CAT7/8). I do not use anything above CAT6 for this very reason. It is also theoretically possible if equipment is placed closely together (or touching in some way).

 

We're well into the realms of psychoacoustics here. If it is not measurable, our ears won't hear it.

How does SFTP/Cat 7 or 8 circumvent the galvanic isolation in a switch?

Is it about how the shield is connected?

How is that prevented in the design of switch and routers ports?


31 minutes ago, BugPowderDust said:

@Ittaku how is your i7 NUC going in your system? Maybe we both need to upgrade to something fancy because Dale told us to. We're leaving gains on the table!

It was just a suggestion to reconsider based on my experience, not telling you to upgrade to 'fancy'.

Edited by dbastin

  • Volunteer
42 minutes ago, dbastin said:

How does SFTP/Cat 7 or 8 circumvent the galvanic isolation in a switch?

Is it about how the shield is connected?

Pretty much. You're effectively grounding the connected ports together, which risks transmission of out-of-band noise, i.e. noise other than the inherent EMI of the twisted pairs within the insulated jacket of the cable.

 

CAT7 is designed for running 10Gbps and that's why the shielding exists. It's there to reduce crosstalk and external interference at those frequencies. It is massive overkill for domestic use and can only really allow interference and issues to propagate elsewhere in your network. A poor power supply in an upstream switch that causes any sort of noise will not have that noise transmitted to downstream ports and connected devices when using unshielded CAT6, for example. The irony is that the higher standard and its additional cost force you to spend unnecessary money on switch and router power supplies. There is no qualitative improvement, there is no latency improvement, and you simply don't need the bandwidth.
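For scale, a back-of-envelope sketch (my own figures, not from the post above) of how little of a 10 Gbps link even an extreme hi-res PCM stream occupies:

```python
# Raw PCM bitrate vs link capacity (illustrative figures only).
def stream_bitrate_bps(sample_rate_hz: int, bit_depth: int, channels: int) -> int:
    """Raw PCM bitrate before any network framing overhead."""
    return sample_rate_hz * bit_depth * channels

cd = stream_bitrate_bps(44_100, 16, 2)        # CD audio
extreme = stream_bitrate_bps(768_000, 32, 2)  # an "extreme" hi-res stream

print(f"CD audio:        {cd / 1e6:.2f} Mbps")
print(f"768 kHz/32-bit:  {extreme / 1e6:.2f} Mbps")
print(f"Fraction of a 10 Gbps CAT7 link: {extreme / 10e9:.4%}")
```

Even the extreme case is under half a percent of what CAT7 is specified to carry.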

 

Structured cabling installations for CAT7 and CAT8 have a higher commissioning and testing failure rate than unshielded installations due to the issue of poorly terminated shielding.

 

42 minutes ago, dbastin said:

How is that prevented in the design of switch and routers ports?

I'm not sure it can be prevented by the port designs as they conform to specifications agreed by IEEE standards. It's the cable that is creating the potential for an issue after all, not the port. Same way as there is very little to stop you picking up a jerry can with diesel in it and filling up your petrol vehicle - this is difficult at the servo due to nozzle design, but entirely doable when picking up the wrong jerry can at home.

 

In the enterprise switching world, a large Juniper or Cisco switch has specialist modular power supplies within it that are designed for purpose. If you buy a 9300 chassis, you'll be forking out an additional USD4k for a pair of power supplies alone and that's before you have put any line cards into it to provide data ports. You won't get the same kind of issues from enterprise network chassis power supplies and line cards as they are not using wall warts.

 

My head is starting to hurt now...

Edited by El Tel
  • Thanks 1



3 hours ago, El Tel said:

My head is starting to hurt now...

I know that feeling.

 

I am puzzled by the so-called galvanic isolation for Ethernet.  I gather it is there for electrical safety, so an electrical fault cannot be passed along the cable.  Audio people seem to think it isolates from 'noise', which does not appear to be true.  The isolation transformer design may not be targeted at stopping the 'noise'.  I am not convinced it can be relied upon for isolation from noise that affects audio - not like fibre or wifi.

 

Further, I would've thought that galvanic isolation would also apply to the noise being transmitted by the shield.  Apparently not.

 

So then it becomes an issue of how the shield is terminated, and at which end.  There is plenty to read from EtherRegen users about that.  If Cat 7 and 8 have high failure rates on site, that may be due to the workmanship and/or tools involved in terminating on site (i.e. inferior to a workshop).

 

But despite your claims that Cat 6A and 7 are not good for audio due to allowing "interference to propagate", my audio-grade Ethernet cables by Shunyata and JCAT are Cat 6A, and the one by Synergistic Research is Cat 7.  These companies would surely employ the right experts and would not adopt those standards if they were inherently flawed.  And surely they would have concluded how best to terminate the shields.

 

Nevertheless, if fibre and wifi are immune to passing on out-of-band noise (due to galvanic isolation, no shield), why can I easily hear the difference when I change the cables in bold?

 

nbn > cable > FMC > fibre > router > cable > server > cable > wifi > ...

 

And last night I changed the power supply to the router (above) and could easily hear the difference, even though it is isolated from the audio gear by wifi and passes through an Antipodes server that has a very low noise floor and a built in 2 port switch (on the mobo).

 

I gave up trying to understand these things long ago.  But while exploring the combinations of building network 'lego' can be head hurting, the pleasure of the outcome is worth it. 

 

Sit back and enjoy the pleasure of what you have (with a drop or two of something), and be open to trying new things without understanding how or why they work.  You might find yourself enjoying greater pleasure.

Edited by dbastin

  • Volunteer

I'm fried with this now. I just don't have the energy to write a full response.

 

I am desperately aware that this might seem rude, but it is not my intention. Honestly. I am framing my exasperation more than attacking anything that has been said. I am minded of a Russian physicist (whose name escapes me right now), who said, and this is not word for word, but from memory and somewhat paraphrased - "If the text of each phrase requires a paragraph (to clarify, challenge or disprove), each paragraph requires a section, each section requires a chapter in response, and each chapter requires a book, the assertions become irrefutable and, therefore, acquire features of truth. This truthfulness I describe as transcendental."

 

George Horne, one time Bishop of Norwich, wrote in his 1786 Letters on Infidelity (and this is a direct quote, not paraphrasing) - "Pertness and ignorance may ask a question in three lines, which it will cost learning and ingenuity thirty pages to answer. When this is done, the same question shall be triumphantly asked again the next year, as if nothing had ever been written upon the subject. And as people in general, for one reason or another, like short objections better than long answers, in this mode of disputation (if it can be styled such) the odds must ever be against us; and we must be content with those for our friends who have honesty and erudition, candour and patience, to study both sides of the question."

 

It is with all sincerity and hope, that I have not unduly offended people (and I unreservedly apologise if so - I try not to attack the individual but tackle the argument with voracious tenacity). With that, I'm going to duck out of this thread for the foreseeable after one last point:

 

36 minutes ago, dbastin said:

why can I easily hear the difference when I change the cables in bold?

 

I don't know why you hear a difference and providing the burden of proof in the face of documented electrical engineering standards is not in my purview as it is your experience not mine.

 

If you re-read my earlier posts, you will see I have experimented with various "audiophile" networking devices and cables over time. I'm not averse to changing my mind, but I need something compelling which I just have not heard or been able to measure. Maybe my ears are shot; my brain certainly is now.

 

Thank you. This has been fun, but oft-times difficult and unintentionally confrontational.

  • Like 2

11 hours ago, dbastin said:

These companies would surely employ the right experts and not adopt those standards if they were inherently flawed.  And surely they would have concluded how to best terminate the shields.

This is far from the case. Stereophile got a bunch of audiophile Ethernet cables tested a few years back and none of them passed a standard patch-cable testing tool. I.e. they were outside spec - the kinds of cables we would ordinarily chop in two with scissors and bin if we found them on a network.

 

I wonder if the people positing these profound experiences here need to get at least an introductory-level understanding of networking under their belt before they can comprehend what is going on electrically, and then electronically, above the cable?
 

This isn’t to point fingers but to reiterate @El Tel’s position above. The questions that get asked here are usually the same ones in disguise or maybe attempts to whittle down a perceived weakness in established electrical engineering theory that is so widely deployed, you are using it to read this post now.

 

Studying a Cisco CCNA or a Juniper JNCIA would be a wonderful step towards bridging the gulf in the conversation here. We're being asked to explain VLANs and cabling standards still, years into this and other threads here.

 

Do you ask your radiographer if they are using AudioQuest power cables or Synergistic Research power cables?


I’m sure @Ittaku will tell you from personal experience the sensitivity of this equipment and I know for a fact (I have two publicly listed medical imaging customers today) that their cables are anything but fancy.

 

At a certain point you need to ask if expectation bias, your environment or your mental state (tired, happy, sad, bored) is a more likely explanation than the leaps made to justify a new and never-before-experienced electrical phenomenon.
 

Introspection definitely seems unusual around here.


4 minutes ago, BugPowderDust said:

I’m sure @Ittaku will tell you from personal experience the sensitivity of this equipment and I know for a fact (I have two publicly listed medical imaging customers today) that their cables are anything but fancy.

Haha stop getting me involved, they're not going to listen to anything you say.

(But the radiology department used ordinary power cords and bulk cat5 cable.)

  • Haha 1

52 minutes ago, Ittaku said:

(But the radiology department used ordinary power cords and bulk cat5 cable.)

 

There is no timing sensitive data conversion in radiology, except ultrasound video imaging.  There is actually information within the radiology industry that power supply noise is important to the imaging achieved for ultrasound. See Here and Here for example.

 

However, we digress. 
 

What seems important to me is understanding that the only point at which digital data transmission affects the analog signal is at the DAC - particularly the clock within the DAC.  Now, ideally, DACs are designed to be immune to any upstream issues, using buffers, isolation, high-precision clocks, or ideally all three.  However, I posit that even the makers of the world's most over-engineered DAC still consider upstream noise an issue, as their webpage indicates Here.
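The clock sensitivity being posited can at least be given rough numbers. This is the standard back-of-envelope jitter arithmetic from sampling theory (my own illustration, not from any manufacturer): a timing error dt at the conversion instant turns into an amplitude error on a sine, worst case about A * 2*pi*f * dt, which caps the achievable SNR.

```python
import math

# Approximate SNR limit imposed by sampling-clock jitter on a full-scale
# sine wave: SNR = -20*log10(2*pi*f*tj), the textbook jitter formula.
def jitter_snr_db(signal_hz: float, rms_jitter_s: float) -> float:
    """SNR ceiling (dB) for a full-scale sine at signal_hz with rms jitter."""
    return -20 * math.log10(2 * math.pi * signal_hz * rms_jitter_s)

# A 10 kHz tone with 1 ns of rms jitter, then with 10 ps:
print(f"1 ns jitter:  {jitter_snr_db(10_000, 1e-9):.1f} dB")   # ~84 dB
print(f"10 ps jitter: {jitter_snr_db(10_000, 10e-12):.1f} dB") # ~124 dB
```

Whether nanosecond-scale jitter actually survives the journey from network to DAC clock is the contested part; the formula only says what it would cost if it did.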

 

It may be a stretch to ask people to accept that even modern, ultra expensive DACs remain sensitive to digital noise upstream, but as it seems to be a point of ridicule for those who own these DACs I thought I’d point out some of the irony on display here.

 

 

  • Like 3



  • Volunteer

I don't know why I do this to myself. In the hopes that some poor, inexperienced individual doesn't get dragged-in, indoctrinated and starts pouring good money away, I guess... I do not have faith in my knowledge as faith requires no evidence and a mere belief that something is so. Thousands of scientific-research-years and peer-reviewed outputs carry the weight for me; I do not need faith.

 

Beating Dead GIF - Beating Dead Horse - Discover & Share GIFs

 

Of course a manufacturer is going to put out mealy-mouthed explanations of their engineering "concerns" and how they have solved them. Why? Because then they have created the doubt-driven demand for their product. It's marketing 101. Digital signals are not a single continuous electrical stream, unlike analogue signals. In Ethernet, they are regenerated via induced current across the isolation boundaries (switch ports), both inbound and outbound. If you haven't grounded the ports together through shielding connections, there is no way to breach the inherent galvanic isolation. This is electronics 101.

 

A 10 MHz clock for a DAC is overkill. How many people listen beyond the stock offerings of 24/192 hi-res? One or two? OK, let's say you regularly listen at 768 kHz sampling rates. Guess what? A clock running at double the sampling rate is entirely adequate, even over-engineered.

According to the 2001 Master Handbook of Acoustics (F. Alton Everest, https://en.wikipedia.org/wiki/F._Alton_Everest), the shortest perceptible sound that registers as an identifiable tone is in the 100 ms range: 0.1 s. Anything shorter than that becomes audible as a click rather than a tone. This can be stretched a little if the tone is a higher frequency. I'll be very generous here and assume one genetically anomalous individual can detect a tone, and not a click, at 0.01 s duration (10 ms). In fact, make it a Marvel superhero, 🎶Earman🎶, who can detect a tone 100 times shorter than you or I can: 0.001 s (1 ms). That 1 ms is 44.1 samples of a CD, or 768 samples of your high-sample-rate 768 kHz track.

A 10 MHz clock would provide the granularity to shift a single sample backwards or forwards by approx 1/13th of a sample period at 768 kHz (each tick is down in the 0.0000001 s range - yep, 7 decimal places). It is beyond any scientific question that this is imperceptible to the human ear and auditory cortex, even for our superhero. It's like being able to tell the difference between a vinyl LP rotating at spec and one approx 0.000006 rpm away from spec.
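The tick-versus-sample arithmetic can be recomputed from first principles (my own figures, derived only from the clock and sample rates):

```python
# One tick of a 10 MHz clock versus one sample period at common audio rates.
tick = 1 / 10e6  # 100 ns per 10 MHz clock cycle

for rate in (44_100, 768_000):
    period = 1 / rate
    print(f"{rate} Hz: sample period {period * 1e9:.0f} ns, "
          f"one clock tick = 1/{period / tick:.1f} of a sample")
```

At 768 kHz the sample period is about 1302 ns, so a 100 ns tick is roughly a thirteenth of a sample; at CD rates it is closer to 1/227th.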

 

Sheesh.

 

<WhyAmIBothering> And.... breathe. </InnerMonologue>

  • Haha 1

15 hours ago, El Tel said:

I'm fried with this now. I just don't have the energy to write a full response.

 

I am desperately aware that this might seem rude, but it is not my intention. Honestly. I am framing my exasperation more than attacking anything that has been said. I am minded of a Russian physicist (whose name escapes me right now), who said, and this is not word for word, but from memory and somewhat paraphrased - "If the text of each phrase requires a paragraph (to clarify, challenge or disprove), each paragraph requires a section, each section requires a chapter in response, and each chapter requires a book, the assertions become irrefutable and, therefore, acquire features of truth. This truthfulness I describe as transcendental."

 

George Horne, one time Bishop of Norwich, wrote in his 1786 Letters on Infidelity (and this is a direct quote, not paraphrasing) - "Pertness and ignorance may ask a question in three lines, which it will cost learning and ingenuity thirty pages to answer. When this is done, the same question shall be triumphantly asked again the next year, as if nothing had ever been written upon the subject. And as people in general, for one reason or another, like short objections better than long answers, in this mode of disputation (if it can be styled such) the odds must ever be against us; and we must be content with those for our friends who have honesty and erudition, candour and patience, to study both sides of the question."

 

It is with all sincerity and hope, that I have not unduly offended people (and I unreservedly apologise if so - I try not to attack the individual but tackle the argument with voracious tenacity). With that, I'm going to duck out of this thread for the foreseeable after one last point:

 

 

I don't know why you hear a difference and providing the burden of proof in the face of documented electrical engineering standards is not in my purview as it is your experience not mine.

 

If you re-read my earlier posts, you will see I have experimented with various "audiophile" networking devices and cables over time. I'm not averse to changing my mind, but I need something compelling which I just have not heard or been able to measure. Maybe my ears are shot; my brain certainly is now.

 

Thank you. This has been fun, but oft-times difficult and unintentionally confrontational.

You have been a true gentleman in this debate.  I value that.  I have not felt confronted or offended.

 

I have been persistent and relentless with my challenging questions, but please be assured it is not personal - just seeking better understanding, and proposing that there is an alternative to the gospel of the Ethernet standards.  Thank you for hanging in as long as you did while I rode the wave of your enthusiasm to reply.

 

We have different experiences - it is as simple as that.  If we try to understand why, it gets complicated and potentially offensive.

 

4 hours ago, BugPowderDust said:

I wonder if the people positing these profound experiences here need to get at least an introductory level understanding of networking under their belt before they can comprehend what is going on electrically then electronically above the cable?

I am not sure that would change what they experience.

 

4 hours ago, BugPowderDust said:

At a certain point you need to ask if expectation bias, your environment or your mental state (tired, happy, sad, bored) is a more likely explanation than the leaps made to justify a new and never-before-experienced electrical phenomenon.

Many people have experienced the kinds of things I have shared here - I doubt it is never before experienced phenomena.  Many are not prepared to share that here.  I am not justifying my experience, just sharing.  Is it all in my head?  Probably, after all, isn't everything we experience processed (filtered, interpreted, etc) by our heads?  This is rhetorical, no answer required.

 

3 hours ago, Stereophilus said:

There is no timing sensitive data conversion in radiology, except ultrasound video imaging.  There is actually information within the radiology industry that power supply noise is important to the imaging achieved for ultrasound. See Here and Here for example.

Hence the existence of https://clearimagescientific.com/products/

 

3 hours ago, Stereophilus said:

It may be a stretch to ask people to accept that even modern, ultra expensive DACs remain sensitive to digital noise upstream, but as it seems to be a point of ridicule for those who own these DACs I thought I’d point out some of the irony on display here.

And this suggests another bias at play.

 

It does seem to be a phenomenon.  Whether you believe it or not is up to you.

 

I refer to a statement I made a couple of pages back ... It is very hard to change strongly and deeply held beliefs.  The mind is a very persuasive manipulator of perception.  I wonder if objectivity is even really possible.  But that can be debated elsewhere - here we talk about using routers in a network for hifi audio.


20 minutes ago, El Tel said:

It's like being able to tell the difference between a vinyl LP rotating at spec and approx 0.000006 rpm away from spec.

Oh oh, quick, over to the vinyl forums. There will be angst a stirring. 

  • Haha 1

2 minutes ago, dbastin said:

And this suggests another bias at play.

 

There is a concept called nuance in this world.

 

I don't have to subscribe to everything a manufacturer says or does in order to like their product. I disagree with quite a lot of the hyperbole on the MSB website and I definitely object to the criminal and abhorrent behaviour of the company founder Larry Gullman.


8 hours ago, BugPowderDust said:

Introspection definitely seems unusual around here.

Craig,

Is the lack of introspection limited to just one side of the discussion?  To me it could equally apply to all posters on this topic, similar threads, and even many other forums.  I am definitely not exempt from a lack of introspection.  I do, however, constantly ask myself what is happening and why on a myriad of audio topics.  What is real, what is not, and what is maybe imagined and reflective of possible bias?  What are the answers?  It is partly what attracts me to the pursuit of audio, as well as the listening.

John




2 hours ago, El Tel said:

I don't know why I do this to myself. In the hopes that some poor, inexperienced individual doesn't get dragged-in, indoctrinated and starts pouring good money away, I guess... I do not have faith in my knowledge as faith requires no evidence and a mere belief that something is so. Thousands of scientific-research-years and peer-reviewed outputs carry the weight for me; I do not need faith.

 

Beating Dead GIF - Beating Dead Horse - Discover & Share GIFs

 


 

I have read and understood everything you have written here.  The reason I participate in these fora is to learn and share experience.  I understand that there is a line where established scientific explanation and experience can seem to conflict.  I deal with that line daily in my field of expertise.  I am neither naive to your position, nor am I in agreement that a body of experience should be dismissed out of hand, no matter how misguided that experience seems to be in relation to the established science.
 

2 hours ago, El Tel said:

Of course a manufacturer is going to put out mealy-mouthed explanations of their engineering "concerns" and how they have solved them. Why? Because then they have created the doubt-driven demand for their product. It's marketing 101. Digital signals are not a single continuous electrical stream, unlike analogue signals. In Ethernet, they are regenerated via induced current across the isolation boundaries (switch ports), both inbound and outbound. If you haven't grounded the ports together through shielding connections, there is no way to breach the inherent galvanic isolation. This is electronics 101.

When multiple experts in a given field (in this case the designers of audio DACs) all arrive at the same conclusion, what does it tell you? The cynic would suggest collusion, although I think the likelihood very small.  We're considering people who are both pioneering thinkers and well-respected engineers.  People like Andreas Koch, Bruno Putzeys, Ted Smith, Rob Watts… I mean the list goes on and on.  This debate about how and why "noise" arriving at the digital side of a DAC causes an analog distortion needs to be played out.  Ethernet transmission is also governed by a clock, asynchronously.  Because asynchronous communication buses don't share a separate clock signal, the transmitter has to encode each transmission in a way that allows the receiver to know when one bit ends and the next begins. Ethernet's solution is to start every transmission with a long series of alternating 0 and 1 bits -- the preamble -- which allows the receiver to temporarily synchronize its bit-clock with the transmitter's clock for the duration of that transmission. As soon as one frame ends and the next begins, the temporary synchronization must begin again.  That suggests to me that phase noise in the transmitting clock can be transferred to the receiving clock, and thence, potentially, down the chain to the DAC clock.
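The preamble mechanism described above can be sketched as a toy (my own illustration - real PHYs work on analogue line symbols; the bit patterns here are the LSB-first on-the-wire rendering of the IEEE 802.3 preamble bytes 0x55 and the SFD byte 0xD5):

```python
# Toy frame-alignment search: the receiver scans the incoming bitstream for
# seven preamble bytes followed by the start frame delimiter, and aligns
# its bit boundaries there.
PREAMBLE = [1, 0, 1, 0, 1, 0, 1, 0] * 7  # alternating bits, 7 bytes
SFD = [1, 0, 1, 0, 1, 0, 1, 1]           # final pair of 1s marks frame start

def find_frame_start(bits):
    """Return the index where the frame payload begins, or -1 if not found."""
    pattern = PREAMBLE + SFD
    for i in range(len(bits) - len(pattern) + 1):
        if bits[i:i + len(pattern)] == pattern:
            return i + len(pattern)
    return -1

# Some idle-line noise, then a frame:
stream = [0, 0, 1] + PREAMBLE + SFD + [1, 1, 0, 1]
start = find_frame_start(stream)
assert stream[start:] == [1, 1, 0, 1]
```

The sketch only shows the alignment step; whether the transmitter's phase noise persists in the receiver after this resynchronisation is exactly the point under debate.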

 

2 hours ago, El Tel said:

A 10 MHz clock for a DAC is overkill. How many people listen beyond the stock offerings of 24/192 hi-res? One or two? OK, let's say you regularly listen at 768 kHz sampling rates. Guess what? A clock running at double the sampling rate is entirely adequate, even over-engineered. According to the 2001 Master Handbook of Acoustics (F. Alton Everest, https://en.wikipedia.org/wiki/F._Alton_Everest), the shortest perceptible sound that registers as an identifiable tone is in the 100 ms range: 0.1 s. Anything shorter than that becomes audible as a click rather than a tone. This can be stretched a little if the tone is a higher frequency. I'll be very generous here and assume one genetically anomalous individual can detect a tone, and not a click, at 0.01 s duration (10 ms). In fact, make it a Marvel superhero, 🎶Earman🎶, who can detect a tone 100 times shorter than you or I can: 0.001 s (1 ms). That 1 ms is 44.1 samples of a CD, or 768 samples of your high-sample-rate 768 kHz track. A 10 MHz clock would provide the granularity to shift a single sample backwards or forwards by approx 1/13th of a sample period at 768 kHz (each tick is down in the 0.0000001 s range - yep, 7 decimal places). It is beyond any scientific question that this is imperceptible to the human ear and auditory cortex, even for our superhero. It's like being able to tell the difference between a vinyl LP rotating at spec and one approx 0.000006 rpm away from spec.

 

Sheesh.

 

<WhyAmIBothering> And.... breathe. </InnerMonologue>

The assumptions you are making here are that 1) the clock deviations are isolated, 2) they are small in scale, and 3) they result in a predictable analog output deviation corresponding to the digital deviation in the clock.  I would argue none of these assumptions holds true all of the time.  But the subject of jitter is, I agree, highly contentious.  If your natural inclination is to disavow the work of leading DAC manufacturers in this area as "doubt-driven demand" then I think it obvious I cannot hope to sway your opinion on this.


  • Volunteer

I don't need swaying any more than I need to take on board the musings of the folk that knock on my door from the Church of the Latter-day Saints.

 

You have not got the correct understanding of asynchronous timing. Do a search on the encoding methods I've referenced earlier. It all starts with Manchester encoding and goes from there. The receiving network port doesn't need its own clock to interpret the timing of an incoming stream: that is already baked into the encoding and can be read directly from it.
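For anyone wanting to see the "timing baked into the encoding" point concretely, here is a toy Manchester codec (my own illustration; real 10BASE-T signals are voltage transitions on a wire, not lists of ints). The decoder recovers the data purely from the direction of each mid-bit transition, with no reference to the transmitter's clock:

```python
# Toy Manchester coding as used by 10BASE-T Ethernet: every data bit becomes
# a guaranteed mid-bit transition, so timing travels with the signal itself.
def manchester_encode(bits):
    """IEEE 802.3 convention: 0 -> high-then-low, 1 -> low-then-high."""
    out = []
    for b in bits:
        out += [1, 0] if b == 0 else [0, 1]
    return out

def manchester_decode(symbols):
    """Each half-bit pair contains a transition; its direction is the bit."""
    return [0 if pair == (1, 0) else 1
            for pair in zip(symbols[::2], symbols[1::2])]

data = [1, 0, 1, 1, 0]
encoded = manchester_encode(data)
assert manchester_decode(encoded) == data
print(encoded)  # [0, 1, 1, 0, 0, 1, 0, 1, 1, 0]
```

Note this guarantees bit recovery; whether any analogue phase noise rides along regardless is the separate question being argued in the thread.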

 

Your appeals to authority are not the same as citing authors of peer reviewed papers. I venture that these audio personalities do not know what established networking professionals do.


11 minutes ago, El Tel said:

I don't need swaying any more than I need to take on board the musings of the folk that knock on my door from the Church of the Latter-day Saints.

No zealotry here.  If anything I’m the opposite, I’m a proponent of considering all available information and choices.

 

11 minutes ago, El Tel said:

You have not got the correct understanding of asynchronous timing. Do a search on the encoding methods I've referenced earlier. It all starts with Manchester Encoding and goes from there. The receiving network port doesn't need its clock to interpret the timing of an incoming stream. That is already baked in to the encoding and can be read directly from it.

I’ll read further.  I thought I had it right.  Do you suggest that there is no influence by the transmitter clock on the receiver clock even after decoding?

 

11 minutes ago, El Tel said:

Your appeals to authority are not the same as citing authors of peer reviewed papers. I venture that these audio personalities do not know what established networking professionals do.

Who are the peers of DAC engineers if not other DAC engineers?  We are discussing the influence (or lack thereof) of the network specifically to the application of digital to analog conversion of streamed music.  Networking data is indisputably robust and foolproof.  But where that intersects with conversion of digital data to an analog stream of music remains the remit of the audio engineer.


  • Volunteer
5 hours ago, Stereophilus said:

Do you suggest that there is no influence by the transmitter clock on the receiver clock even after decoding?

Exactly. The clock exists for two main functions: one is preparing the transmission of the stream, where it is referenced for encoding the timing. The second is logging information to a repository with reliable timestamps, in order to compare wider network events.

 

5 hours ago, Stereophilus said:

Who are the peers of DAC engineers if not other DAC engineers?

 

I don't disagree. But.... This is like the poachers being the gamekeepers. They have a vested interest in creating obfuscation or alluding to areas of concern that they can fix. The IEEE has no such commercial conflicts.

Edited by El Tel

2 minutes ago, El Tel said:

This is like the poachers being the gamekeepers. They have a vested interest in creating obfuscation or alluding to areas of concern that they can fix. The IEEE has no such commercial conflicts.

There's also a vicious cycle - manufacturer A finds an area they can introduce a completely new product X to fix an alleged problem. Manufacturer B then sees it both as an opportunity to create yet another device to add to their market, and also feel the demand from the marketplace that they need to produce an X themselves. Before you know it, there are 10 new electronic devices between the two devices that actually do the work.

  • Love 1



19 minutes ago, Ittaku said:

There's also a vicious cycle - manufacturer A finds an area they can introduce a completely new product X to fix an alleged problem. Manufacturer B then sees it both as an opportunity to create yet another device to add to their market, and also feel the demand from the marketplace that they need to produce an X themselves. Before you know it, there are 10 new electronic devices between the two devices that actually do the work.

The obvious rebuttal to this is: why then does the MSB Select DAC II have audible differences from the MSB Reference DAC?  Not my assessment, but yours.  This question is posed notwithstanding the following:

 

4 hours ago, BugPowderDust said:

I don't have to subscribe to everything a manufacturer says or does in order to like their product. I disagree with quite a lot of the hyperbole on the MSB website and I definitely object to the criminal and abhorrent behaviour of the company founder Larry Gullman.

I apologise for integrating the quotes of 2 different people into the single rebuttal, but it seems to be the common position held for both.

 

My point here is: why are 2 staunch advocates of the "network has no impact on audio" argument also making purchasing decisions in favour of a DAC brand that differentiates itself on the basis of how it treats/isolates streamed audio inputs?  I mentioned the irony of this before, but surely it warrants some degree of explanation… If clocks/jitter/upstream noise/power mean nothing in the realm of digital audio, why choose to buy MSB DACs that highlight these factors as a point of engineering differentiation, and at no inconsiderable cost?

  • Like 1

36 minutes ago, Stereophilus said:

The obvious rebuttal to this is: why then does the MSB Select DAC II have audible differences from the MSB Reference DAC?  Not my assessment, but yours.  This question is posed notwithstanding the following:

I was talking about adding new, previously non-existent components to the audio chain and making them a new "audio device" - like audiophile Ethernet switches. I'm not sure how that relates to the completely different electronics within the Select vs the Reference.

Edited by Ittaku


