I could swear I saw a sample project to do this somewhere... I have a power
source that will always read between 11 - 14, or 0 volts. (It's the output of a
charge controller for a solar powered WiFi tower.) I'd like to periodically
read the output voltage, and send it to my desktop computer.
I already have the means to send small data samples from a SuperPro over the
network (via TTY session on network radio's serial port) but I'm not sure how to
read voltage. Ideally resolution would be 0.01V but 0.1V would be workable.
I see that the AD pins read voltage between 0 and 3.3V, so would I need to drop
the voltage with resistors? How do I calculate the resistance value? The
highest quality resistors are 5%, right? So I'd just have to calibrate the
values returned against a DMM?
One more thing that confuses me: I put a resistor of a few kilohms between my
meter and a 12 volt source, and it still read 12 volts. Why?
Any help appreciated.
-Mark McGinty
measuring voltages greater than 3.3V
-
- Posts: 1462
- Joined: Fri Oct 19, 2012 5:11 am
Re: measuring voltages greater than 3.3V
Hi Mark,
Basic Ohms Law is what you need.
-
- Posts: 1462
- Joined: Fri Oct 19, 2012 5:11 am
Re: measuring voltages greater than 3.3V
I did not know that "pick your own value" resistors are available:
http://www.digikey.com/product-search/e ... esistors/66806?k=resistors
1% resistors are very common.
Resistors in ICs can be passivated first and then "trimmed" with a laser
through the passivation.
The standard values available in 1% are different from those in 5%, which in
turn are different from those in 10%.
Take a look here:
http://hyperphysics.phy-astr.gsu.edu/hb ... oldiv.html
A voltmeter gives you a good result because its input resistance is much greater
than the source resistance. A battery might have an internal resistance under 1
ohm, while a typical meter has an input resistance of about 10 M ohms. 1 ohm in
parallel with 10 M ohms is extremely close to 1 ohm, so the meter doesn't
disturb the circuit.
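The loading effect is just a voltage divider between whatever resistance is in series with the meter and the meter's input resistance. A minimal Python sketch (the 10 M ohm meter resistance is an assumed typical value) also shows why a few kilohms in series barely changes the reading:

```python
def loaded_voltage(v_source, r_series, r_meter):
    """Voltage the meter actually sees: the meter's input resistance
    forms a divider with the resistance in series with it."""
    return v_source * r_meter / (r_series + r_meter)

# 12 V source, ~1 ohm internal resistance, 10 M ohm meter: essentially 12 V
print(loaded_voltage(12.0, 1.0, 10e6))

# Even 3 kilohms in series drops almost nothing into a 10 M ohm meter,
# which is why the reading through the resistor is still ~12 V.
print(loaded_voltage(12.0, 3000.0, 10e6))
```

This is exactly the situation in the original question: a few kilohms against 10 megohms moves the reading by only a few millivolts, below the meter's display resolution.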
Electronic devices also have an input bias current associated with them, which
can be on the order of 1e-12 A (picoamps). This affects a measurement too, and
input bias current depends on all sorts of things, such as temperature.
You might be better off with an expanded-scale voltmeter and, say, a comparator:
e.g. a 10-16 V scale, with a digital indication when the voltage drops below,
say, 9.5 volts.
With simple A/D converters there is something called quantization error, which
is easy to see if you hypothetically set the full-scale count to 100. A reading
of 1 count carries an inherent +/- half-count uncertainty, which is already a
50% reading error; at a reading of 3 it's less, and at 50 it's less still.
That's also why some measurements offset the zero of the process variable,
mapping the measured range to e.g. 1-5 V.
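The quantization argument can be checked in a couple of lines, assuming the hypothetical 100-count full scale and a +/- half-count uncertainty:

```python
FULL_SCALE = 100  # hypothetical full-scale count from the example above

def relative_error(counts, uncertainty=0.5):
    """Relative error contributed by the converter's +/- half-count
    quantization uncertainty at a given reading."""
    return uncertainty / counts

for counts in (1, 3, 50):
    print(f"reading of {counts:3d}: {relative_error(counts):.1%} error")
```

A reading of 1 count carries a 50% error, while a reading of 50 carries only 1%, which is why you want the signal to use as much of the converter's range as possible.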
When you measure the outside world, all sorts of things can happen. One of the
most common is a difference in ground potentials; you may also need protection
against a reversed battery or an out-of-control alternator, for instance.
Automotive transients can easily range from -200 V to +50 V.
This is probably much more information than you need.
-
- Posts: 1462
- Joined: Fri Oct 19, 2012 5:11 am
Re: measuring voltages greater than 3.3V
I agree with Will, but want to clarify how the resistor divider works. Connect
the 12K 1% resistor to your voltage source, and connect the 3K 1% resistor to
the return (ground) of your source. Connect the remaining ends of the two
resistors together, and also connect that junction to the input of your A/D
converter. The two resistors form a voltage divider between ground (0 V) and
the input voltage (~15 V). The junction voltage is given by:
Junction Voltage = Input Voltage x (3000 / 15000)
1/8-watt resistors or larger are sufficient to dissipate the power.
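In code, the divider math and its inverse (scaling the ADC-pin voltage back up to the source voltage) look like this, using the 12K/3K values from the post above:

```python
R_TOP = 12_000.0  # 12K, from the source to the junction
R_BOT = 3_000.0   # 3K, from the junction to ground

def junction_voltage(v_in):
    """Voltage at the divider junction, i.e. at the ADC pin."""
    return v_in * R_BOT / (R_TOP + R_BOT)

def source_voltage(v_junction):
    """Invert the divider to recover the original source voltage."""
    return v_junction * (R_TOP + R_BOT) / R_BOT

print(junction_voltage(15.0))  # 15 V in -> 3.0 V at the ADC pin
print(source_voltage(2.4))     # 2.4 V at the pin -> 12.0 V source
```

The ratio is 1:5, so a 15 V maximum input stays safely under the 3.3 V limit of the A/D pins.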
-
- Posts: 1462
- Joined: Fri Oct 19, 2012 5:11 am
Re: measuring voltages greater than 3.3V
Thanks Dan,
I should have made that clear.
"KeepIt SimpleStupid" I have standard value 0.1% resistors from 1950's and used
special value 0.01%? from Vishay 25 years ago.
Cheers
Will
-
- Posts: 1
- Joined: Sat Feb 16, 2013 12:01 pm
Re: measuring voltages greater than 3.3V
I have tried it, but the connection between the two resistors was giving me a problem. Can anyone tell me what the problem might be?
Re: measuring voltages greater than 3.3V
Below is a sample circuit to measure a 12 V signal. It divides the input by 4, so a maximum of 3 V reaches the ARM.
It's not required, but a zener diode can be used to limit the voltage if spikes are possible on the input.
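To turn raw ADC counts back into the 12 V-side voltage with that divide-by-4 circuit, scale counts to the pin voltage and multiply by 4. This sketch assumes a 16-bit converter with a 3.3 V reference; substitute your board's actual resolution and reference voltage:

```python
V_REF = 3.3            # assumed ADC reference voltage
ADC_MAX = 65535        # assumed 16-bit converter full-scale count
DIVIDER_RATIO = 4.0    # the circuit divides the input by 4

def counts_to_input_volts(counts):
    """Convert a raw ADC count to the voltage on the 12 V side
    of the divider."""
    v_pin = counts * V_REF / ADC_MAX
    return v_pin * DIVIDER_RATIO

print(counts_to_input_volts(ADC_MAX))      # full scale corresponds to ~13.2 V
print(counts_to_input_volts(ADC_MAX // 2)) # mid scale, ~6.6 V
```

One ADC count corresponds to roughly 0.2 mV at the input with these assumptions, so the 0.01 V resolution the original poster wants is limited by resistor tolerance and calibration, not by the converter.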