The white noise voltage spectral density of a resistor is given by
Vn^2 = 4*k*T*R
where
k = Boltzmann's constant (1.38e-23 J/K)
T = absolute temperature, assume 300 K
R = resistance in ohms
It is also well known that the CURRENT noise spectral density is given by
In^2 = (4*k*T)/R
This makes perfect sense given Ohm's law, or general circuit intuition. If you have a high-output-impedance current source and you are looking at the output current, then noise voltage fluctuations at the output won't change the output current much; they get divided by the large output resistance. The same goes for a low-output-impedance voltage source: noise current variations won't affect the output voltage much, because they get multiplied by a small R.
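For concreteness, here's a quick numeric sanity check (the R = 1 kΩ, T = 300 K values are just assumptions for illustration) showing that the two expressions are really the same thermal noise, related by Ohm's law:

```python
import math

k = 1.38e-23   # Boltzmann constant, J/K
T = 300.0      # temperature, K
R = 1e3        # example resistance, ohms (assumed value for illustration)

# Thermal noise voltage spectral density: Vn^2 = 4*k*T*R
vn = math.sqrt(4 * k * T * R)          # V/sqrt(Hz)

# Thermal noise current spectral density: In^2 = (4*k*T)/R
i_n = math.sqrt(4 * k * T / R)         # A/sqrt(Hz)

print(f"Vn = {vn * 1e9:.2f} nV/rtHz")  # ~4.07 nV/rtHz for 1 kOhm
print(f"In = {i_n * 1e12:.2f} pA/rtHz")  # ~4.07 pA/rtHz for 1 kOhm

# Ohm's law check: the short-circuit noise current is just Vn / R
print(math.isclose(i_n, vn / R))       # True
```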
My question is twofold.
First: does this mean that in a current-mode circuit, that is, a circuit where the signal stays a current (current-mode op-amps, for example), you are free to use large resistors and still keep things low noise, since a larger resistance "reduces" the current noise?
Second: in the following datasheet for the CA3094 operational transconductance amplifier,
https://www.electro-tech-online.com/custompdfs/2010/06/CA309428AB29.pdf
on pages 4 and 5 they describe their noise test setup. For measuring output CURRENT noise they use a large source resistor (1 MΩ), and for measuring output VOLTAGE noise they use a 0 Ω source resistance. In terms of voltage noise, the 0 Ω resistor contributes nothing; in terms of current noise, the 1 MΩ resistor contributes very little as well.
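To put rough numbers on those source-resistor choices (my own arithmetic, not figures from the datasheet): a 1 MΩ source has a large voltage noise but only a tiny current noise, while a 0 Ω source has no thermal noise of either kind, so each setup makes the source's contribution negligible for the quantity being measured. A quick comparison against an assumed 1 kΩ source:

```python
import math

k, T = 1.38e-23, 300.0

def vn_nV(R):   # thermal noise voltage density, nV/rtHz
    return math.sqrt(4 * k * T * R) * 1e9

def in_pA(R):   # thermal noise current density, pA/rtHz
    return math.sqrt(4 * k * T / R) * 1e12

# 1 MOhm source for the current-noise test: only ~0.13 pA/rtHz of current noise
print(f"1 MOhm: Vn = {vn_nV(1e6):.0f} nV/rtHz, In = {in_pA(1e6):.3f} pA/rtHz")

# A smaller source, e.g. 1 kOhm, would inject ~4 pA/rtHz instead
print(f"1 kOhm: Vn = {vn_nV(1e3):.1f} nV/rtHz, In = {in_pA(1e3):.2f} pA/rtHz")
```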
Am I thinking about this right? Would someone mind explaining this?
It seems like a paradox that increasing resistance "decreases" noise. It sort of makes sense since, in a current-mode circuit, you don't care what the voltage fluctuations are, but I'm not sure I trust my intuition.
Many thanks to anyone who can shed some light on this.