What is the point of a voltage divider if you can't drive anything with it?


Solution 1

Oh, but you can. You can drive a high-impedance input with it...including a buffer, which can then in turn be used to drive whatever you want. The more current you draw, the more the voltage will droop, so you just make sure to draw as little current as possible, so that the output is, for example, 99.9% of what the divider formula says it should be.

The divider formula is simply an equation that holds true under certain ideal conditions. If you want to mathematically analyze it under real conditions, the equation gets complicated and case-specific, so often it is just easier to arrange your real-world usage so that the equation's assumptions are approximated very closely.
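A minimal sketch (Python, with made-up component values) of how the loaded output compares with the ideal formula, modeling the load as a resistor in parallel with the bottom leg of the divider:

```python
# Divider: Vin across R1 (top) and R2 (bottom); load RL attached across R2.
# Ideal (unloaded) output: Vout = Vin * R2 / (R1 + R2).
# Loaded output: same formula with R2 replaced by R2 || RL.

def divider_out(vin, r1, r2, rl=float("inf")):
    r2_eff = r2 if rl == float("inf") else (r2 * rl) / (r2 + rl)
    return vin * r2_eff / (r1 + r2_eff)

vin, r1, r2 = 10.0, 1e3, 1e3               # 10 V source, two 1 kOhm resistors
ideal = divider_out(vin, r1, r2)           # 5.0 V, the textbook answer
loaded = divider_out(vin, r1, r2, rl=1e6)  # driving a 1 MOhm buffer input
print(ideal, loaded, loaded / ideal)       # loaded output is ~99.95% of ideal
```

With a load impedance a thousand times larger than the divider resistors, the droop is a twentieth of a percent, which is the regime the answer describes.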

Solution 2

You don't have to draw significant current to "use" a voltage. For example, if you want to measure the output voltage, which is a perfectly useful thing to do, then you can just attach a voltmeter. An ideal voltmeter draws no current at all.

If you wanted to drive something at a lower voltage than the input, you wouldn't use a voltage divider because that would be extremely wasteful; most of the energy would be lost in the resistors.

Solution 3

In a high-impedance amplifier, the currents are small in proportion to the voltages present. In this case, voltage dividers are popular and commonly used to prescale the overall gain of the amplifier's first stage and also to vary the output level of the amplifier.

The effect you mention (finite current flow causing the voltages in the divider circuit to shift) is called loading and can be minimized even in low-impedance circuits through appropriate choices of the resistances in the divider circuit.

Solution 4

The word "anything" is too broad and depends on your context.

If your domain is electronics (as opposed to for example Electrotechnics), voltage dividers with negligible loads are used everywhere.

"Negligible" means that the current is so low that we don't care about its effect.

Example: a voltage divider connected to a $2\ V$ power supply, made of two $1\ k\Omega$ resistors, and sourcing $1\ \mu A$ delivers a voltage of $0.9995\ V$ instead of $1\ V$.

You may ask: in that case, why not use a single resistor instead of $2$ resistors?

Answer: very often, the specified $1\ \mu A$ is not the actual load current but a maximum value. The divider keeps the output within a known range despite this load-current uncertainty.
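The numbers above can be checked directly with the Thevenin equivalent of the divider (Python sketch):

```python
# 2 V supply, two 1 kOhm resistors, load drawing anywhere up to 1 uA.
vin, r1, r2 = 2.0, 1e3, 1e3
r_thevenin = r1 * r2 / (r1 + r2)   # 500 Ohm source impedance seen at the tap
v_open = vin * r2 / (r1 + r2)      # 1 V with no load attached

for i_load in (0.0, 1e-6):         # load current from zero to the 1 uA maximum
    print(f"{i_load * 1e6:.0f} uA -> {v_open - r_thevenin * i_load:.4f} V")
# Output stays between 0.9995 V and 1.0000 V for any current in that range.
```

This is the point about uncertainty: whatever the load actually draws below its $1\ \mu A$ maximum, the output is pinned to within $0.5\ mV$ of $1\ V$.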

Solution 5

This is perhaps a niche concern, but one advantage of voltage dividers is that they also cleanly divide down the noise.(*)

In very low temperature electronics experiments, you might want to drive a very sensitive device that can't handle more than 10 microvolts of applied voltage, but your voltage source might be far noisier than that. So you instead start out with 1 volt at your voltage source and divide it down 100,000x.

As you say, the resistance of the divider does end up adding to the final measured resistance of your device. But, this can just be subtracted off after the measurement is done.

(*) - they can also add back in a lot of noise if one is not careful of ground loops: the voltage divider adds its local ground voltage noise fully into the output.
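A quick sketch (Python, with hypothetical values) of the division ratio and the series-resistance correction described above:

```python
# Divide ~1 V down to ~10 uV with a roughly 100,000:1 divider, then correct
# a measured resistance for the divider's own output (Thevenin) resistance.
r1, r2 = 1e6, 10.0                 # ratio = (r1 + r2) / r2, about 100,001:1
vin = 1.0
vout = vin * r2 / (r1 + r2)        # ~10 uV applied to the device
r_div = r1 * r2 / (r1 + r2)        # ~10 Ohm appearing in series with the device

r_measured = 1510.0                # hypothetical measured total resistance
r_device = r_measured - r_div      # subtract the divider's contribution
print(f"{vout * 1e6:.2f} uV applied, device resistance ~ {r_device:.1f} Ohm")
```

The source noise is attenuated by the same ~100,000:1 ratio as the signal, which is why this trick works, and the ~10 Ohm divider output resistance is what gets subtracted off after the measurement.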



Author: Qmechanic

Updated on July 26, 2020

Comments

  • Qmechanic
    Qmechanic over 3 years

    The voltage divider formula is only valid if no current is drawn from the output, so how could dividers be used practically? Using the voltage for anything would require drawing current, which would invalidate the formula. So what's the point; how can they be used?

    • Ralf Kleberhoff
      Ralf Kleberhoff over 3 years
      If you know the characteristics of the device attached to the output, e.g. having a fixed impedance against ground, you can take that into account when applying the voltage divider formula.
  • Peter Mortensen
    Peter Mortensen over 3 years
    Voltmeters often have a well-defined standard input impedance, like 10 MΩ (even if it is possible to make it much higher).
  • uhoh
    uhoh over 3 years
    This is a fascinating answer! I'll think about this for a while and may post a related question, thanks!
  • supercat
    supercat over 3 years
    I think it might be useful to, instead of saying that the "1μA" isn't the real value, say that it drives a load current that is specified to be no greater than 1μA (note that you redundantly specify the unit), and that this load will cause the output voltage to be somewhere between 0.9995V and 1.0000V, rather than being precisely one volt. The precise value of the output when the load happens to be 1.00μA isn't as important as the range of values that could result from currents between 0.00μA and 1.00μA.