I've read and watched a lot of basic electronic tutorials but I still don't quite get current.
So I bought an Antec power supply to play with. On the label it specifies +5V at a max. of 30 A and a min. of 1.5 A.
Does that mean that the minimum current it will output on the +5v is 1.5A?
I hooked it up to a blue LED and measured the current drawn, which was about 80 mA. Then I hooked up the same LED to a "Switching Universal Power Supply" rated 4.5 V, 1700 mA, and the current draw was about 8 mA, which didn't make any sense to me.
To clear up my confusion I got a 10 Ohm, 5 W resistor and connected it to each power supply as above. It drew over 400 mA from the Antec power supply and 26 mA from the smaller one.
I don't understand where the rest of the current goes. I thought the min. 1.5 A on the label meant that the power supply was going to push that much current through whether it was needed or not, even if it burns out the device.
And what happened to Ohm's law? The 10 Ohm resistor at 4.5 V should draw 450 mA, but that did not happen with the smaller power supply.
"Min 1.5A" Your supply does not meet some specification at less than 1.5A. It was not designed to work with less than 1.5A. This is typical of switching power supplies.On the label it specifies +5V for a Max. 30A and Min 1.5 A
Hi kal.a, welcome to ETO
There is not much in life that is certain except death and taxes, as the saying goes. Well, rest assured, you can add Ohm's law to that list. Ohm's law is the basis for all electronics and you are wise to get the hang of it as a first move.
You were also wise to drop the LED and move to a pure resistor for your experiments.
The 1.5 A to 30 A specification for your Antec power supply unit (PSU) does not mean what you think: what it means is that the PSU is not designed to work at less than 1.5 A of output current and will produce a maximum current of 30 A. If you operate that PSU at less than a 1.5 A load, one or more of the following may happen:
(1) you may damage it
(2) it may not work at all
(3) it may produce a voltage higher than 5 V
(4) it may produce a large amount of hash and noise at its output.
That type of PSU is intended for applications where there is always a current drain, like in a personal computer for example. It is not designed to be a bench power supply.
The answer to your problem is to permanently connect a resistor of the right value and wattage across the power supply output so that it draws 1.5 A.
(1) Ohm's law says:
(1.1) I=V/R or
(1.2) V=I*R or
(1.3) R= V/I
In this case you need to find the value of R, so use formula (1.3):
R= V/I
Where:
V= 5V
I= 1.5A
Thus R = 5/1.5 = 3.33 Ohms. The closest standard resistor value is 3.3 Ohms, so that would be ideal. You can work out how much current that would draw from the 5 V PSU.
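If it helps, here is the same load-resistor arithmetic worked through in a few lines of Python (the voltage and minimum current come straight from the label; 3.3 Ohms is the standard value mentioned above):

```python
# Dummy-load resistor for a switching PSU that needs a minimum load.
V = 5.0       # PSU output voltage (volts)
I_min = 1.5   # minimum load current from the label (amps)

R_ideal = V / I_min       # Ohm's law: R = V / I
print(round(R_ideal, 2))  # -> 3.33 (ohms)

R_std = 3.3               # closest standard resistor value
I_actual = V / R_std      # current the 3.3 ohm resistor actually draws
print(round(I_actual, 2)) # -> 1.52 (amps, just above the 1.5 A minimum)
```

Note that rounding down to the standard 3.3 Ohm value pushes the current slightly above the 1.5 A minimum, which is the safe direction.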
(2) The power in Watts (W) dissipated by a resistor is:
(2.1) W= V * I or
(2.2) W=( I*I) * R or
(2.3) W= (V*V)/R
You know the Volts and you now know the resistor value, so use formula (2.3) to find the minimum wattage rating for the load resistor:
W = (5*5)/3.3 = 7.58 W
Thus 7.58 W is the minimum power rating of a resistor that would reliably do the job, provided it is in free air at 25 degrees Celsius or lower. Best to go for at least a 10 W resistor though. Bear in mind that the resistor will get very hot.
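The wattage step can be checked the same way, using formula (2.3) with the values already worked out above:

```python
# Power dissipated by the 3.3 ohm dummy-load resistor: W = V^2 / R
V = 5.0   # PSU output voltage (volts)
R = 3.3   # chosen standard resistor value (ohms)

W = V**2 / R
print(round(W, 2))  # -> 7.58 (watts), so specify at least a 10 W part
```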
This resistor would be perfect, especially if bolted to a heatsink (sheet of aluminium or PSU case possibly): https://uk.rs-online.com/web/p/panel-mount-fixed-resistors/0160708/
Thanks Spec.
Your PSU is an ATX type intended for desktop personal computers: https://www.newegg.com/Product/Product.aspx?Item=N82E16817103917
If you don't configure it properly it will do all sorts of strange things, but with a bit of work they make excellent bench PSUs. There is heaps of data on the internet showing how to do this. The first thing to sort out is the turn-on signal; I used to know the approach but have forgotten. If you get stuck I will investigate further if you like.
Thanks a lot, I think this post cleared up some of my confusion.
Thanks Spec.
I'd like to go back to your earlier reply and my misunderstanding. I understood current to be like an available water volume that is controlled by how much I open the tap: if I open the tap just a little bit I get very little water, even though the water pressure in the pipe is constant. In other words, the device draws as much current as it needs and not any more. Then that turned out to be wrong, and I learned that I have to control the amount of current using resistance to ensure that no more current than the device needs is being forced upon it.
And that led to my experiment with the power supply vs. the transformer. I wanted a current source that was high and constant, to measure across a resistor or to burn out an LED, but the fact that that did not happen confused me.
As already stated, a PSU does not force current out of its terminals. It has no idea how much current is flowing from its output terminals; all it knows is that it must keep the output voltage fixed at, say, 5 V. Only when the current reaches the limit for the PSU will it say enough is enough and drop its output voltage to zero so no current flows. This is done to protect the PSU from damage (note that I have simplified greatly for explanation). Always remember that voltage is the master and current is the slave, not the other way around.

And when you wrote that I'm wrong in my understanding that the power supply will always put out 1.5 A, that confused me even more. How is the current put out by a power supply (not mine, but a reliable power supply) controlled? What determines how much is output?
By the way, I'm not sure what you mean by "configure it properly", but I watched a couple of YouTube clips and got it to work, and it puts out all the voltages it is supposed to. The power supply is not that important to me though; I just want to understand.
Hi kal.a,
No sweat about the info
Good to meet you.
(if you ever like my posts please click the 'like' symbol, bottom right)
You are right: current is like water and voltage is like water pressure. Resistance is like a tap: low resistance (1 Ohm) is the tap right open, so much water flows; when the tap is almost closed (1 megaohm) little water flows.
All you have to do now is translate that to electronics. Water = electrons, water pressure = volts, the tap is resistance. That is all there is to basic electrical circuits. A power supply is like the ocean: it has a vast supply of electrons (water) that will never deplete. To be specific, 1 Amp is about 6.24 x 10^18 electrons passing a point in a wire in a second (that is roughly 6240000000000000000 electrons). But never mind how much water is in the ocean: the water will only flow if there is a height difference. Just so with electrons; they will only flow if forced to do so by a voltage difference.
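That electrons-per-second figure can be checked in two lines, using the standard value of the elementary charge (about 1.602 x 10^-19 coulombs) and the fact that 1 Amp is 1 coulomb per second:

```python
# Electrons per second in a 1 A current: 1 C/s divided by the charge
# carried by a single electron.
e = 1.602e-19              # elementary charge in coulombs
electrons_per_amp = 1 / e  # electrons passing a point per second at 1 A
print(f"{electrons_per_amp:.2e}")  # -> 6.24e+18
```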
A PSU is designed to provide a constant voltage at its terminals regardless of what current you take or don't take. But every PSU has a limit. In the case of your PSU, the 5 V line has a limit of 30 Amps: just think how many electrons that is!
I have quoted above the part that is wrong: you didn't want a current supply, you wanted a voltage supply. Practically all supplies are voltage supplies. That is what confused you. Suppose you had a 5 V PSU that could provide a maximum current of 3 A and you connected a 2 Ohm resistor across it. How much current would flow through the 2 Ohm resistor? I = 5 V / 2 Ohms = 2.5 A. Now take the same 2 Ohm resistor and connect it to a 5 V PSU which has a current capacity of 1000 Amps. How much current would now flow through the resistor? Can you tell me?
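A small sketch of the point being made: the current through the resistor depends only on the voltage and the resistance, never on the supply's current capacity (the capacity is only a ceiling).

```python
# Same 2 ohm resistor on two 5 V supplies of very different capacity.
V = 5.0  # both supplies regulate to 5 V
R = 2.0  # load resistance in ohms

I = V / R                 # Ohm's law sets the current in both cases
print(I)                  # -> 2.5 (amps from the 3 A supply)
print(I)                  # -> 2.5 (exactly the same from the 1000 A supply)
```

The 3 A supply would be working near its limit at 2.5 A; the 1000 A supply would barely notice the load, but the resistor draws the same 2.5 A from each.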
The 2191U1 did not light up at all and had 0 Vf.
I tried it again and the 24 V LED actually lights up, but so dimly I couldn't see it; as soon as I pull out the red LED, the 24 V LED lights up quite brightly.
Eah???

A DC supply cannot generally be used to charge a battery without a series diode.