So having worked in electronics retail for 2 years, I got very comfortable with the idea that when you're transmitting digital content, the quality of the cable doesn't matter: either all the bits get there or they don't (and when they don't, you get very noticeable visual/audio artifacts that can't be missed).
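Just to put a toy example behind the "all the bits get there or they don't" idea (purely my own illustration in Python, not tied to HDMI or any particular cable): flipping a single bit in a digital sample doesn't subtly soften the value, it jumps it by a fixed power of two, which is why corruption tends to show up as obvious speckles or dropouts rather than a gentle loss of quality.

```python
# Toy illustration: a single flipped bit shifts an 8-bit value by a power of two.
sample = 0b1011_0010                  # pretend this is one 8-bit pixel channel (value 178)

for bit in range(8):
    corrupted = sample ^ (1 << bit)   # flip exactly one bit
    print(f"bit {bit} flipped: {sample} -> {corrupted} (off by {abs(sample - corrupted)})")

# Flipping bit 7 moves the value by 128 out of a 0-255 range: a visible
# speckle, not a subtle change.
```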
However, I had an audiophile coworker who believed that, in the case of HDMI at least (which was primarily what we were debating), there's error correction built into the standard. His position was that cheaper cables do in fact cause errors; they just get smoothed over by the error correction in the HDMI decoder (in most cases in the TV), but that the image and/or sound would be better if there were fewer errors to begin with. But then, there's this, this, and this.
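For what it's worth, what he's describing sounds like forward error correction to me, i.e. extra bits that let the receiver locate and silently repair a flipped bit. Whether HDMI actually does this for the video stream is exactly what I'm unsure about, so the sketch below is just a generic Hamming(7,4) example of the concept, not a claim about what an HDMI decoder really implements:

```python
# Hamming(7,4): 4 data bits + 3 parity bits; any single flipped bit in the
# 7-bit codeword can be located and silently corrected by the receiver.
# (Generic FEC illustration only; not necessarily anything HDMI actually does.)

def encode(d):                      # d = [d1, d2, d3, d4]
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    # codeword positions 1..7: p1 p2 d1 p3 d2 d3 d4
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def decode(c):                      # c = 7-bit codeword, possibly corrupted
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]  # parity check over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]  # parity check over positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]  # parity check over positions 4,5,6,7
    syndrome = s1 + 2 * s2 + 4 * s3 # 1-based position of the bad bit, 0 if clean
    c = c[:]
    if syndrome:
        c[syndrome - 1] ^= 1        # repair it; nothing visible reaches the screen
    return [c[2], c[4], c[5], c[6]], syndrome

data = [1, 0, 1, 1]
word = encode(data)
word[5] ^= 1                        # simulate one bit mangled in transit
recovered, where = decode(word)
print(recovered == data, "error was at position", where)   # True, position 6
```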
So my questions are a) is there any validity to what he's saying about HDMI specifically? Obviously this isn't exactly a home theater forum, but I figure some dudes who really have a grasp on digital encoding/decoding might know; and b) does the same apply to SPDIF, ADAT optical, USB, and/or Firewire? To quote from another thread:
[Standard RCA cables] "may" work [for SPDIF] but it is a common misconception. Even though the signal is digital, it is still transmitted through the wire via an analog carrier at very high frequencies. So the cable still matters. What you get with extra cheap cables are clock errors and signal degradation that have to be corrected. It may not even sync.
But unless SPDIF has the elusive error correction that HDMI supposedly does, I just can't believe clocking errors and signal degradation (degradation of a bitstream that has to be decoded back into an analog waveform, of course) wouldn't be immediately noticeable.
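For context on where the "clock errors" argument even comes from: as far as I understand it, S/PDIF carries the clock and the data on the same wire using biphase-mark coding, so the receiver has to recover its clock from transitions in the data signal itself. Here's a toy sketch of that encoding (just the bit-level idea, not a real S/PDIF frame with preambles, channel status, etc.):

```python
# Biphase-mark coding (the line code S/PDIF uses, as far as I know):
# every bit cell begins with a level transition, and a logical 1 gets an
# extra transition mid-cell. Those guaranteed edges are what the receiver
# recovers its clock from.

def bmc_encode(bits, level=0):
    halves = []                      # two half-cell levels per data bit
    for b in bits:
        level ^= 1                   # mandatory transition at every cell boundary
        first = level
        if b:                        # a 1 adds a second transition mid-cell
            level ^= 1
        halves.append((first, level))
    return halves

print(bmc_encode([1, 0, 1, 1, 0]))
# -> [(1, 0), (1, 1), (0, 1), (0, 1), (0, 0)]
```

So sure, a bad cable can mess with the timing of those edges; what I can't square is how that would get past decoding without turning into an obvious glitch.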
NERD FIGHT