23-07-2014, 05:30 PM
A question came up the other day regarding voltage drop in a vehicle's electrical system (12 V nominal).
It concerned working out the voltage drop along a feed cable to an old-standard 55 watt headlamp bulb under varying supply voltages, without using a meter. As I understand it, the voltage across the battery terminals (good, well-connected terminals) might vary between, say, 12.6 volts and 14.4 volts, depending on whether the engine is running, what loads are consuming power, and so on. Let's assume the cable is connected directly to the battery terminals and is working within a normal temperature range; the cable's conductor has a CSA (cross-sectional area) of 1 square millimetre and a resistance of 0.054 ohms over its 3 metre length. The bulb is connected and consuming power.
1) Assume that the bulb is rated at 55 watts at a supply voltage of 12.6 volts.
2) Results are needed for two battery terminal voltages of 12.6 volts and 14.4 volts, assume that at those two voltages their respective voltages are constant.
So how is the voltage drop along the cable calculated, bearing in mind that the bulb's resistance will vary between the two different supply voltages?
Would the first graph in the link below help, or is it a chicken-and-egg situation?
http://www.clarisonus.com/Archives/TubeT...istics.pdf
This is not a trick question, I'm just interested.
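One way to break the chicken-and-egg is to iterate: guess the bulb voltage, work out the current it would draw, subtract the cable drop, and repeat until the answer stops changing. The sketch below does this in Python. It assumes the common empirical tungsten-filament law that current scales roughly as V^0.6 (equivalently, power as V^1.6); that exponent is my assumption for illustration, not something from the question, and a different exponent would shift the numbers slightly.

```python
# Iterative solution to the cable-drop / bulb-resistance "chicken and egg".
# Filament model is an ASSUMPTION: I proportional to V**0.6 (typical
# tungsten-lamp approximation); only R_CABLE, P_RATED and V_RATED come
# from the original question.

R_CABLE = 0.054               # ohms: 3 m of 1 mm^2 cable (from the question)
P_RATED = 55.0                # watts at the rated voltage
V_RATED = 12.6                # volts
I_RATED = P_RATED / V_RATED   # rated current, about 4.37 A
EXPONENT = 0.6                # assumed tungsten current/voltage exponent

def bulb_current(v_bulb):
    """Filament current at v_bulb volts, under the assumed power law."""
    return I_RATED * (v_bulb / V_RATED) ** EXPONENT

def cable_drop(v_supply, iterations=50):
    """Fixed-point iteration: guess the bulb voltage, compute the
    current, recompute the bulb voltage from the cable drop, repeat."""
    v_bulb = v_supply                     # first guess: no drop at all
    for _ in range(iterations):
        i = bulb_current(v_bulb)
        v_bulb = v_supply - i * R_CABLE   # Ohm's law for the cable
    return v_supply - v_bulb              # volts lost along the cable

for v in (12.6, 14.4):
    print(f"{v} V supply: cable drop is roughly {cable_drop(v):.3f} V")
```

With these assumptions the iteration settles within a few passes, giving a drop in the region of a quarter of a volt at both supply voltages (slightly more at 14.4 V, since the hotter filament draws more current). The graph in the linked PDF serves the same purpose as the power-law model here: it gives you the lamp's current at a given voltage, which is the missing piece the iteration needs.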
Lawrence.