Digital cables - let's debunk this once and for all

MarcusGHedwig

Member
So having worked in electronics retail for 2 years, I got very comfortable with the idea that when you're transmitting digital content, the quality of the cable you're using doesn't matter; either all the bits get there or they don't (the latter causing very noticeable visual/audio artifacts that can't be missed).

However, I had an audiophile coworker who believed that, in the case of HDMI at least (which was primarily what we were debating), there's error correction built into the standard. So he believed cheaper cables did in fact cause errors; they were just smoothed over by the error-correction logic in the HDMI receiver (in most cases the TV), but the image and/or sound would be better if there were fewer errors to begin with. But then, there's this, this, and this

So my questions are a) is there any validity to what he's saying about HDMI specifically? Obviously this isn't exactly a home theater forum, but I figure some dudes who really have a grasp on digital encoding/decoding might know; and b) does the same apply to SPDIF, ADAT optical, USB, and/or Firewire? To quote from another thread:

[Standard RCA cables] "may" work [for SPDIF] but it is a common misconception. Even though the signal is digital, it is still transmitted through the wire via an analog carrier at very high frequencies. So the cable still matters. What you get with extra cheap cables are clock errors and signal degradation that have to be corrected. It may not even sync.

But unless SPDIF has the elusive error correction that HDMI supposedly does, I just can't believe clocking errors and signal degradation (degradation of a bitstream that has to be decoded back into an analog waveform, of course) wouldn't be immediately noticeable.

NERD FIGHT
 
My experience with HDMI cables longer than 5 m is:

Cheap cable: Couldn't do 1080p, but could do 1080i.
Expensive cable: Could do 1080p.

I'm still raging because of this... New TV, new video card. Who would think to start by checking whether it's the cable's fault?
 
The HDMI thing is just sales bullshit.
My experience is that digital cables make zero difference if they work, but the great thing about S/PDIF is that it's incredibly easy to do a sum test in a DAW. Whether it's error correction or perfect transmission, if the results cancel 100% then the difference is academic and practically irrelevant.
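
For anyone curious what that sum test amounts to outside a DAW, here's a minimal Python sketch, assuming two sample-aligned WAV captures of the same material made through the two cables (the file names are just placeholders):

```python
# Minimal null-test sketch: subtract two captures of the same material made
# through different cables. A bit-perfect transfer nulls to exactly zero.
# Assumes the captures are sample-aligned and share format/sample rate.
import numpy as np
from scipy.io import wavfile

rate_a, take_a = wavfile.read("capture_cheap_cable.wav")       # placeholder names
rate_b, take_b = wavfile.read("capture_expensive_cable.wav")
assert rate_a == rate_b, "sample rates must match"

n = min(len(take_a), len(take_b))                               # trim to common length
residual = take_a[:n].astype(np.int64) - take_b[:n].astype(np.int64)

if np.count_nonzero(residual) == 0:
    print("100% null: captures are bit-identical")
else:
    print(f"{np.count_nonzero(residual)} samples differ, max error {np.abs(residual).max()}")
```

In a DAW it's the same operation: flip the polarity of one track, sum it with the other, and if the meter sits at -inf they cancel completely.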

Edit: And I'll add that the videophile/audiophile thing is the same garbage. I'm constantly around pro audio and TV production and I never see $500 spdif or HDMI cables in either of those applications. An ESPN video engineer I know recommended I buy HDMI cables from monoprice. "Same shit. If it fails buy ten more."
 
^A strong argument indeed :D And FTR, I bought two 6' HDMI cables from Hong Kong for $.99 each back when I got my 360 and PS3 (2007 holidays), and they've survived countless trips back and forth from college and still work great (and have no problem transmitting 1080p)
 
With digital cables there aren't a whole lot of advantages to using higher-end cables until you start using long lengths. Most people's home theaters shouldn't be running cable long enough for it to matter.

One example off the top of my head is Cat5 vs Cat5e. Cat5e is required if you want a 1000BASE-T network to reach full speed on a 100 m run... but if you're just plugging a 10-foot cable from your laptop into your router, cheaper Cat5 will function fine.
 
That's an excellent point. Any cable's shortcomings are amplified with length. That said, in most scenarios it's often more cost-effective and reliable to add a repeater or a Cat5/6 extender, or, if you're going really far, to convert to fiber. There is certainly a level of cable that is absolute garbage, but unfortunately price isn't a great indicator.
 
Apparently with HDMI it is actually a noticeable problem, although it's more a problem with the transmitter and receiver than with the cable specifically; the cable can just aggravate problems that exist with or without any cable being plugged in. It's not a "SOUND" thing at all... (it usually never is with cables, even the anal-log ones)
 
^What are the visual/aural manifestations of this problem?

...you either have a picture or you don't :p

Again, it's more the transmitter/receiver. If they're sending a poor signal you'll get "sparkles" (if you remember old analog television sets with no reception, it's kind of like that white "snow" with the vague image still viewable). So as far as I can guess, the only cables that will "improve" the quality are the ones with booster chips IN the actual cable, although this isn't an "improvement", it's just making up for the shortcomings of the transmitter/receiver. If the trans/rec are doing their job, then any old cheap-shit HDMI cable will do what it's supposed to, which is either work or not.

Never argue with an audiophool; they're borderline religious fanatics. They BELIEVE in all the cable garbage and the foreskin effect, and there's not a whole lot you can do to convince them otherwise. Just as with the other religious fanatics, the planet is only 6,000 years old and science be damned.
 
It only has to be good enough for the level/edge triggers (can't remember which they use) on the converters at both ends to see it. Crap cables mean you may not make the level, or you slew an edge and get signal distortion. Beyond that, the rest is crap.
 
With digital cables there aren't a whole lot of advantages to using higher-end cables until you start using long lengths. Most people's home theaters shouldn't be running cable long enough for it to matter.

One example off the top of my head is Cat5 vs Cat5e. Cat5e is required if you want a 1000BASE-T network to reach full speed on a 100 m run... but if you're just plugging a 10-foot cable from your laptop into your router, cheaper Cat5 will function fine.

Nope, Cat5 won't even sync at gigabit. Literally just ran into this with a 3 ft cable while transferring some video. Grabbed a Cat5 cable and was getting 8-10 MB/s. WTF!?!? Double-checked and the switch said 100Mb. Grabbed a Cat5e, back to 95 MB/s, yeah baby.

On to other digital cables....

My understanding is that a digital signal is a square wave in the analog domain. But it is still a wave in the ANALOG domain. Based on my knowledge of gate logic, if those waves round off at all or become distorted, then the gate will trigger late, or trigger early by interpreting another spike as an edge. Thus you get errors.
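
To make that concrete, here's a toy Python sketch (not a real channel model, just an assumed "rounded edges plus noise" picture) showing how a receiver that slices the analog waveform against a threshold can read the wrong bit once the edges get smeared:

```python
# Toy illustration: a bitstream is carried as an analog square wave, and the
# receiver decides 1/0 by comparing the voltage at each sample instant against
# a threshold. Rounded edges plus noise can push a sample to the wrong side
# of the threshold -> bit errors.
import numpy as np

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 64)                            # random payload
samples_per_bit = 8
clean = np.repeat(bits, samples_per_bit).astype(float)   # ideal square wave

# Crude "bad cable" stand-in (an assumption, not a measured channel):
# a moving-average low-pass that rounds the edges, plus additive noise.
kernel = np.ones(12) / 12
degraded = np.convolve(clean, kernel, mode="same") + rng.normal(0, 0.15, clean.size)

# Sample each bit period at its midpoint and slice against the threshold.
threshold = 0.5
mid = np.arange(len(bits)) * samples_per_bit + samples_per_bit // 2
received = (degraded[mid] > threshold).astype(int)

print(f"{int(np.sum(received != bits))} bit errors out of {len(bits)}")
```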

In networking you typically use TCP/IP which means that if the receiving end has an error it re-requests the packet. Thus error correction.

In other digital signals, it would depend on the protocol. But again, my understanding is that the "error correction" is more like taking averages or making assumptions based on what you have already seen.
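
A tiny sketch of that distinction, with made-up numbers: retransmission (the TCP-style path) gets the exact data back, while concealment just guesses the missing value from its neighbours:

```python
# Retransmission vs. concealment, with made-up sample values.
import numpy as np

original = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
received = original.copy()
received[3] = np.nan                      # pretend sample 3 arrived corrupted

# Retransmission: re-request the damaged chunk and get the exact value back.
retransmitted = received.copy()
retransmitted[3] = original[3]

# Concealment: guess the missing value from its neighbours
# (an approximation, not a correction).
concealed = received.copy()
concealed[3] = (received[2] + received[4]) / 2

print("retransmitted:", retransmitted)    # exact
print("concealed:   ", concealed)         # happens to land on 0.6 only because the data is linear
```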

I also know that the impedance rating of digital cables is 75 ohms and the tolerances are much tighter, while analog cables range quite a bit. The impedance of the cable affects frequency response; analog is of course pretty low in the grand scheme, while digital is up in the MHz/GHz range. And even for clocking, terminators need to be held to very tight tolerances, otherwise errors occur.

Before hitting submit, I found this:

http://www.bluejeanscable.com/articles/digitalanalog.htm

Turns out I am not completely misinformed.
 
At least with S/PDIF and AES/EBU the cable does make a difference.

AES/EBU needs 110-ohm cable, S/PDIF needs 75-ohm.

If the cables don't match those specifications, reflections in the cable can cause jitter and even dropouts.

Not likely to cause problems with short cables, but...
 
whatta ya mean "reflections"? Spdif is a coaxial style cable with heavy shield and insulator.
 
Hi! I'm not a native English speaker and not an electrical engineer, just an audio engineer, so it's a bit hard for me to describe in detail, but the main reason is what's called (in German) "Wellenwiderstand" (directly translated, that would be wave impedance, i.e. characteristic impedance).

In short, if the impedance doesn't match (the impedance of both connectors AND the cable), internal reflections of the signal between the connections of the components can degrade the signal (through subtraction and addition of the signal).
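
For anyone who wants a number to hang on that, the standard formula for how much of a signal bounces back at an impedance mismatch is Gamma = (Z_load - Z_line) / (Z_load + Z_line). A quick sketch with nominal values (the 50-ohm case is just an assumed "generic coax" stand-in):

```python
# Back-of-envelope illustration of the "Wellenwiderstand" (characteristic
# impedance) point above: the fraction of an incident edge reflected at a
# junction is Gamma = (Z_load - Z_line) / (Z_load + Z_line).
# A matched cable reflects nothing; a mismatched one bounces part of each
# edge back and forth, which is where the jitter/dropout risk comes from.
def reflection_coefficient(z_load_ohms: float, z_line_ohms: float) -> float:
    return (z_load_ohms - z_line_ohms) / (z_load_ohms + z_line_ohms)

# S/PDIF input (75 ohm) fed with proper 75-ohm coax vs. generic ~50-ohm coax,
# and an AES/EBU input (110 ohm) fed with 75-ohm cable (nominal values only).
for label, z_in, z_cable in [
    ("S/PDIF in, 75-ohm cable ", 75, 75),
    ("S/PDIF in, 50-ohm cable ", 75, 50),
    ("AES/EBU in, 75-ohm cable", 110, 75),
]:
    gamma = reflection_coefficient(z_in, z_cable)
    print(f"{label}: {gamma:+.0%} of the incident edge is reflected")
```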
 
As we get further into the technical realities, I want to revisit the thesis of what many of us are saying. Unlike analog, where capacitance can affect the sound of the material in low-voltage circuits, with digital it is simply a matter of the efficiency of the transfer. At a certain point, if a cable measures to spec, the signal either gets there or it doesn't. Point being, there is a case for buying cable with specific tolerances and quality connectors, but beyond that it's all smoke and mirrors. When you start talking about a $500 5' HDMI cable looking better than a $50 one, you are probably self-deluded. Likewise, if you run S/PDIF out to in on your DAW and it nulls, no amount of money is going to improve the accuracy beyond perfect.
 
Digital error correction ensures that errors don't get processed as data. This handshaking happens on both sides. If a cable is bad or intermittent, what will happen is your bandwidth will get reduced (hence the example of 1080i working but 1080p not working with a cheap HDMI cable.)

You're never going to "see" or "hear" errors, as they won't get processed, but you will have reduced bandwidth and/or latency issues. You will only ever experience this sort of thing as slow/reduced throughput, blatant skips or a complete loss of signal.

For the "audiophile theory" to be correct, there would have to be error-rounding fuzzy logic intentionally written into the error correction scheme - sort of a "virtual analog interference" algorithm. I don't think anyone would want that, and it would be difficult to engineer (also painfully and hilariously obvious when the algorithm tries to "guess" what's happening between arbitrary-length missing chunks of data.)
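
That's essentially what error detection looks like in practice. A hedged sketch using a generic CRC (not the actual scheme HDMI or S/PDIF use, which differ) shows why a bad block gets rejected outright rather than subtly degrading the picture or sound:

```python
# Sketch of why detected errors don't show up as subtle "quality loss":
# a checksum over each block either matches (the data is used as-is) or it
# doesn't (the block is rejected / re-sent / muted). Nothing in between.
import zlib

def send_block(payload: bytes) -> tuple[bytes, int]:
    return payload, zlib.crc32(payload)

def receive_block(payload: bytes, crc: int) -> bytes:
    if zlib.crc32(payload) != crc:
        raise ValueError("CRC mismatch: block rejected, not 'smoothed over'")
    return payload

payload, crc = send_block(b"\x01\x02\x03\x04")
print(receive_block(payload, crc))            # clean: passes through untouched

corrupted = b"\x01\x02\x03\x05"               # a flipped bit in transit
try:
    receive_block(corrupted, crc)
except ValueError as e:
    print(e)
```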

So, a cheap digital cable will work exactly the same as an expensive one, unless your cable goes bad - in which case the problem will be readily apparent to anyone. I've had cheap cables die more often than expensive ones, so you get what you pay for in terms of reliability - up to a reasonable point. However, spending $1100 on an HDMI cable is completely ridiculous (but audiophiles do it all the time.)