DI box Input Impedance

So what difference would it make if, say for example, a passive pickup with a 35 kΩ output impedance goes into my Countryman with an input impedance of 10,000 kΩ? That leaves me with 77 percent of the signal (assuming this is already a bad signal?). What difference would it make going from there into my Line 6 UX2 input of 1 MΩ?

10000 / 10035 × 100 = 99.65

99.65% not 77% :)
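
If you want to check the numbers yourself, here's a minimal Python sketch of that loading calculation; the 35 kΩ pickup figure is just the illustrative value from the question above, and the three input impedances are typical examples, not specs for any particular unit:

```python
# Voltage-divider model of pickup loading: the pickup's source impedance and
# the DI/interface input impedance form a divider, so the surviving fraction
# of signal is Z_in / (Z_in + Z_source).

def signal_fraction(z_source_ohms, z_input_ohms):
    """Fraction of the pickup's open-circuit voltage that reaches the input."""
    return z_input_ohms / (z_input_ohms + z_source_ohms)

pickup = 35_000  # 35 kohm passive pickup (illustrative figure from the post above)

for name, z_in in [("10 Mohm Countryman-style input", 10_000_000),
                   ("1 Mohm hi-Z input", 1_000_000),
                   ("470 kohm hi-Z input", 470_000)]:
    print(f"{name}: {signal_fraction(pickup, z_in) * 100:.2f}% of the signal")

# 10 Mohm Countryman-style input: 99.65% of the signal
# 1 Mohm hi-Z input: 96.62% of the signal
# 470 kohm hi-Z input: 93.07% of the signal
```

Even the 470 kΩ case only loses a few percent of level; the more audible effect of a low input impedance is on the pickup's resonance and top end, not the overall volume.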

And as i said above:
If your interface has a hi-Z input, you don't need to buy a Direct Injection box.
 
Hmm, I wonder. Maybe the distance/cable between the guitar and the console/audio interface/whatever is long enough to pick up some nasty RFI?
 
Then explain why so many top producers use DIs into their top interfaces? For example, a Countryman Type 85 into an RME Fireface 800? That's where I'm puzzled!

Most probably it is because they think (and they are most probably right) that some uber-quality and uber-expensive external specialized vintage tube mic preamp gives a "warmer" or better sound than that little internal preamp in the hi-Z input (if the hi-Z input in their interfaces has any preamp) :)

IMHO, at our level, and even at a low-level professional level, that additional coloring of the sound is not that important - you can get outstanding results without it.

Or !

Maybe because the cable from the DI to the interface is balanced.
 
I noticed absolutely 0 difference when A/Bing my Little Labs Redeye vs the Fireface 400. Comparison of DIs with the Fireface 400 and Focusrite Saffire Pro 40 had no perceivable difference, either.
 
Maybe because the cable from the DI to the interface is balanced.

Ah, that makes sense. The RME FF800 has balanced inputs.
But I would still love to get a definite answer.
I've also heard that going into the RME FF800 channel 1 with a Countryman in front is a NO-NO! I think the signal is too hot. Channel 1 has speaker emulation and drive. Not sure if they just use channel 2 then?
 
I noticed absolutely 0 difference when A/Bing my Little Labs Redeye vs the Fireface 400. Comparison of DIs with the Fireface 400 and Focusrite Saffire Pro 40 had no perceivable difference, either.

Most instrument inputs on interfaces have impedances high enough to avoid a loading effect on guitars. If that's the case, you will hear no difference and a DI is not needed.

For the general populace: most of the time the interface manuals that can be downloaded from the manufacturer's website will have specs for the interface of choice, and if the instrument input impedance is at least 470 kΩ, then there is really no reason whatsoever to NEED a DI - it won't really do anything.

I know this comes up a bit too, and it has been asked here: even if the instrument input has a high impedance, why do the DIs still sound dull? The answer, most of the time, would be either bad cables or cables that are too long. You have to remember we are talking about the fact that guitar pickups have a very high output impedance as a source, and that, paired with the capacitance of the cable, creates a pretty extreme low-pass filter.
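
To put a rough number on that, here's a sketch of the first-order RC low-pass formed by the pickup's source impedance and the cable capacitance. The 10 kΩ effective source impedance and ~100 pF/m of cable are just ballpark assumptions for illustration; real pickups also have inductance, which turns this into a resonant peak rather than a plain rolloff, but the trend (longer cable, lower cutoff) is the same:

```python
import math

def cutoff_hz(source_impedance_ohms, cable_capacitance_farads):
    """-3 dB point of the RC low-pass formed by source impedance and cable capacitance."""
    return 1.0 / (2 * math.pi * source_impedance_ohms * cable_capacitance_farads)

z_pickup = 10_000        # assumed effective source impedance, ohms
pf_per_metre = 100e-12   # assumed guitar-cable capacitance, ~100 pF per metre

for metres in (3, 6, 12):
    c = metres * pf_per_metre
    print(f"{metres} m of cable: f_c ~ {cutoff_hz(z_pickup, c) / 1000:.1f} kHz")

# 3 m of cable: f_c ~ 53.1 kHz
# 6 m of cable: f_c ~ 26.5 kHz
# 12 m of cable: f_c ~ 13.3 kHz
```

A buffered DI right at the guitar sidesteps this, because the long run after the DI is driven from a low impedance and is usually balanced anyway.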
 
Ah crap, my maths clearly sucks. But I get it now. Then explain why so many top producers use DIs into their top interfaces? For example, a Countryman Type 85 into an RME Fireface 800? That's where I'm puzzled!

Also consider workflow. Typically when you run a DI, you are still running that out to an amp for monitoring/recording. The DI is just for insurance or reamping later, but you typically need something to monitor through.

Yeah, you can use a plugin, but your latency has to be set really low, which isn't always an option, especially when you are working with lots of tracks or laying down overdubs after some mixing has been done and CPU loads are high.

Also, something like the Type 85 has a really, really high input impedance and a transformer, which will color the signal slightly, and typically in all good ways. So often there is a preference just based on the DI and its transformer. The differences are subtle but definitely there.
 
Any amplification is going to colour the signal. If you do that critical amplification from guitar level to mic level really nicely, with minimal and/or desirable colouring (e.g. an expensive DI box), then you're gonna get better results than you would with an interface hi-Z input.


edit: even if you're not amplifying the signal that much, any process sounds better when done with better equipment!
 
Most probably it is because they think (and they are most probably right) that some uber-quality and uber-expensive external specialized vintage tube mic preamp gives a "warmer" or better sound than that little internal preamp in the hi-Z input (if the hi-Z input in their interfaces has any preamp) :)

IMHO, at our level, and even at a low-level professional level, that additional coloring of the sound is not that important - you can get outstanding results without it.

Or !

Maybe because the cable from the DI to the interface is balanced.

My only question is: in order to capture that great DI sound and clarity, would the interface's input impedance have to be the same as or higher than the DI's impedance?
 
I have another question.
I own a Focusrite Saffire 6 USB. This interface has a hi-Z input, but all the guitars I tried on it were clipping the hi-Z input, with the PAD in the inner position and zero gain on the input preamp.
Two of them have SD SH-8 Invaders in the bridge. I've also tested a Gibson Flying V and a Gibson SG Standard. Both Gibsons were clipping too.

Then I switched the INST and PAD buttons to the outer position, so as far as I understand my interface input became an ordinary line input (am I right?). The clipping was gone in this mode, with the gain set to 6 of 10 to get input peaks of -3 dB in Reaper.
But as far as I know, recording DIs into a line input is not a good idea because the signal loses frequencies (I don't know how to say it correctly in English; I hope you understand me).

So then I decided to use my friend's EBS MicroBass II as a DI box. I followed the manual to make the right connection, but I found that the signal recorded this way, at the same peaks (about -3 dB), sounds no better than the signal recorded directly into the interface's line input (PAD ON!!!!, INST off).
Why?
Also, I had to switch on the PAD on my interface input when connecting guitars through the DI box to the Saffire 6 USB (otherwise it was clipping and showed me PEAKS over 0 dB). So does that mean the EBS MicroBass II can't work as a DI box?

So now I don't understand what I'm doing wrong.
The hi-Z input was clipping, so I tried to use a DI box.
But the DI box made my signal very similar to the signal recorded directly into the line input of the interface. So what is the purpose of a DI box if it doesn't make the signal better than the line signal?

P.S.: Sorry for my bad English. I don't know it well enough to express my thoughts more clearly.

P.P.S.: Thanks for helping me.
 
Welcome!

Your English is good enough, at least for me to understand :)

A good instrument (guitar) input should NOT clip, even with the hottest passive guitar pickups.

Maybe there is something wrong with your unit?

Can you copy-paste the full tech specs of that input?
 
The Saffire 6 USB has too low a maximum input level even with the pad engaged; +7 dBu is like a joke with most pickups :)
http://www.focusrite.com/products/audio_interfaces/saffire_6_usb/specifications/
"Instrument Inputs
• Dynamic Range (A-weighted): 105 dB
• SNR (A-weighted): 105 dB
• Frequency Response 20Hz-20KHz +/- 0.1 dB
&#8226; THD+N, -1dBFS, min gain, no pad < 0.0025%
&#8226; Maximum Input Level for 0 dBFS (minimum gain, no pad): -3 dBu&#8232;
&#8226; Maximum Input Level for 0 dBFS (maximum gain, no pad): -53 dBu&#8232;
&#8226; Maximum Input Level for 0 dBFS (minimum gain, with pad): +7 dBu
&#8226; Maximum Input Level for 0 dBFS (maximum gain, with pad): -43 dBu
&#8226; Pad Attenuation: 10 dB
&#8226; Crosstalk: > 80 dB"
A GOOD instrument input should not clip, but bad ones can :)
It's all in the hands of the engineers...
There are even worse things; for example, the Infrasonic Amon has an instrument input with 0 dBFS = +0 dBV (i.e. +2.2 dBu)...
It's like the engineers never had a guitar and didn't know anything about real pickups, or maybe they were using specifications like the ones published for DiMarzio pickups, i.e. RMS measurements :)
 
The problem with the Focusrite Saffire 6 USB is most likely clipping at the input-stage op-amp or somewhere at a later stage (poor quality or design), or too low a supply voltage. Solid-state amps can only produce a voltage swing of less than half the supply voltage before clipping. Some pickups can produce quite a big signal; heck, an EMG-81 can probably clip most any consumer-level sound card input.

Edit: Also those dBu/dBV can be really misleading as you'd need to know the impedance and the frequency of the feeding device they used for the measurements to compare. Many manufacturers use their own methods for testing so the results can be non-standard. What you really need to know is the maximum input voltage in volts before clipping.
 
dBu/dBV is the voltage; just use the formula 0.775 × 10^(A/20), where A is the level in dBu, or 10^(A/20), where A is the level in dBV.
http://en.wikipedia.org/wiki/Decibel
"dBV
dB(1 VRMS) – voltage relative to 1 volt, regardless of impedance."
The same thing applies to dBu; the only difference is the reference voltage - 0.775 V.
The bigger problem is that too little information exists about the peak level of pickups; it exists for EMGs or Blackouts-1, but not for passive pickups.
The values supplied by DiMarzio are RMS values, I think, with the peak level several times greater...
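
As a worked example of those formulas against the Saffire 6 USB numbers quoted above (a sketch only; the peak figure assumes a sine wave):

```python
def dbu_to_volts(level_dbu):
    """Convert a dBu level to volts RMS (0 dBu = 0.775 V RMS)."""
    return 0.775 * 10 ** (level_dbu / 20)

def dbv_to_volts(level_dbv):
    """Convert a dBV level to volts RMS (0 dBV = 1.0 V RMS)."""
    return 10 ** (level_dbv / 20)

max_with_pad = dbu_to_volts(7)     # Saffire 6 USB maximum input, pad engaged
max_no_pad = dbu_to_volts(-3)      # maximum input at minimum gain, no pad

print(f"+7 dBu = {max_with_pad:.2f} V RMS (~{max_with_pad * 2 ** 0.5:.2f} V peak on a sine)")
print(f"-3 dBu = {max_no_pad:.3f} V RMS")
print(f"+0 dBV = {dbv_to_volts(0):.2f} V RMS")

# +7 dBu = 1.74 V RMS (~2.45 V peak on a sine)
# -3 dBu = 0.549 V RMS
# +0 dBV = 1.00 V RMS
```

So even with the pad in, anything that swings past roughly 2.5 V peak at the input will clip that converter, which lines up with the clipping described above if the pickups' transient peaks get that high.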
 
http://en.wikipedia.org/wiki/Decibel
"dBV
dB(1 VRMS) – voltage relative to 1 volt, regardless of impedance."
The same thing applies to dBu; the only difference is the reference voltage - 0.775 V.
Yeah, that's true but you forgot this part: "dBu can be used regardless of impedance, but is derived from a 600 Ω load dissipating 0 dBm (1 mW)." - http://en.wikipedia.org/wiki/DBu#Voltage

What I mean is that manufacturers can use different methods for testing their gear so they get results that suit them better, i.e. non-standard. It's kinda like when they use an A-weighted curve for dynamic range to get higher numbers.

Edit: Also some manufacturers might actually mean dBμV or dBuV - "dB(1 μVRMS) – voltage relative to 1 microvolt. Widely used in television and aerial amplifier specifications. 60 dBμV = 0 dBmV." They just 'forget' the V at the end and use dBμ/dBu.
 
Yeah, that's true but you forgot this part: "dBu can be used regardless of impedance, but is derived from a 600 Ω load dissipating 0 dBm (1 mW)." - http://en.wikipedia.org/wiki/DBu#Voltage
This just means that the reference level (0.775 V) is derived from those assumptions, instead of the 1.0 V used for dBV.

In most cases interfaces have the correct voltage specification in dBu or dBV.
Anyway, most manufacturers don't specify voltage characteristics directly in volts. All the gear that I have has a declared level that can be measured with a multimeter (at least the RMS value on a sine wave). I think that for a single device, output and input levels should be measured in the same way, so the output level of a device can be used as some kind of reference.
 
Here's one for you wizards. I've often wondered about the quality of a hi-Z input vs. just high impedance. For example, the hi-Z input of an API pre is 470K (I think) - not low, but relatively low compared to a Countryman - yet IMO it still sounds just as good or better. Or is it more to do with the fact that once you get up into the high-90% signal-through range you can't hear a difference?
 
Here's one for you wizards. I've often wondered about the quality of a hi-Z input vs. just high impedance. For example, the hi-Z input of an API pre is 470K (I think) - not low, but relatively low compared to a Countryman - yet IMO it still sounds just as good or better. Or is it more to do with the fact that once you get up into the high-90% signal-through range you can't hear a difference?

Anybody that is using pedals before their amp is already dealing with this - Tube Screamers in particular have an input impedance a little lower than 470K (the loss in signal from the pickup is made up by the buffer stages in the pedal), which means it isn't that huge of a difference as long as the input impedance is somewhat larger than the pickup's output impedance.

Here's the thing though - I have mentioned this before, probably in this thread - you still want DIs and reamp boxes to directly emulate the same input/output impedances as if it was a straight cable to the amp. I have heard a lot of people say that when they reamp guitars, the tone is brighter and more pushed than if it was played live. This mostly has to do with the output impedance of the reamp box more so than the DI, and the difference in output impedance between the reamp box and a real pickup can be drastic. The good thing, however, is that the reamp box can't change the peak resonance of the pickup; the DI box of choice, on the other hand, will have an effect on the pickup's resonance.
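
For what it's worth, here's a very simplified sketch of why the load matters for that resonance: treat the pickup as an inductor feeding the cable/input capacitance, with the load resistance (the DI or amp input) damping the peak. The inductance, capacitance and load values are purely illustrative, and real pickups have winding resistance and losses that keep the actual Q far lower than these numbers; the point is only the trend that a heavier load flattens the peak:

```python
import math

L = 3.0        # assumed pickup inductance, henries (humbucker ballpark)
C = 500e-12    # assumed cable + input capacitance, farads

f_res = 1.0 / (2 * math.pi * math.sqrt(L * C))
print(f"Resonant peak around {f_res / 1000:.1f} kHz")

# Approximate Q for a load resistance across that LC (ignoring coil losses):
for r_load in (200_000, 470_000, 1_000_000):
    q = r_load * math.sqrt(C / L)
    print(f"{r_load // 1000} kohm load: Q ~ {q:.1f}")

# Resonant peak around 4.1 kHz
# 200 kohm load: Q ~ 2.6
# 470 kohm load: Q ~ 6.1
# 1000 kohm load: Q ~ 12.9
```

That's why the DI's input impedance shows up in the captured tone, while the reamp box only ever sees the recorded signal and the amp's roughly 1M input.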

So if you are being really picky in terms of impedance matching, you need to find the input impedance of your interface and then select the correct transformer: the effective input impedance of the DI is a relationship between the turns ratio and the input impedance of the interface (for most high-gain amps the input impedance is 1M). The same goes for the output impedance of the reamp box: the turns ratio and the output impedance of the interface determine the effective output impedance, and it should match the output impedance of the pickups used - most high-output pickups, active pups aside, are around 16K.
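
A quick sketch of that reflected-impedance relationship; the turns ratios and the interface impedances below are made-up illustrative numbers, not specs for any particular DI or reamp box:

```python
def reflected_input_impedance(turns_ratio, load_ohms):
    """Impedance seen at the transformer primary when the secondary is loaded.
    turns_ratio = N_primary / N_secondary."""
    return (turns_ratio ** 2) * load_ohms

def reflected_output_impedance(turns_ratio, source_ohms):
    """Impedance seen at the secondary when the primary is driven from source_ohms."""
    return source_ohms / (turns_ratio ** 2)

# DI direction: guitar into the primary, mic input on the secondary.
n_di = 12           # assumed 12:1 step-down ratio
mic_input = 1_500   # assumed 1.5 kohm mic-preamp input impedance
print(f"Effective DI input impedance: {reflected_input_impedance(n_di, mic_input) / 1000:.0f} kohm")

# Reamp direction: interface line out into the primary, amp on the secondary.
n_reamp = 1 / 12    # step-up toward the amp
line_out = 100      # assumed 100 ohm line-output impedance
print(f"Effective reamp output impedance: {reflected_output_impedance(n_reamp, line_out) / 1000:.1f} kohm")

# Effective DI input impedance: 216 kohm
# Effective reamp output impedance: 14.4 kohm
```

With those made-up numbers the reamp's effective output impedance lands near the ~16K pickup figure mentioned above, which is the whole point of choosing the ratio carefully.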

If you had that highly calibrated a setup, then you would not be able to tell the difference between reamps and a live recording - that is, if you have good aliasing filters on your D/A converter.