Hi there. I'm a mechanical engineer, so I'm struggling a bit with this particular problem...
The problem: I want to test some electrodes, but when I run the system the electrodes effectively short the circuit (which is to be expected). The power supply trips until the load is removed - I assume because the load is trying to draw too much current?
Previous activity: I've built and tested some basic voltage- and current-limiting circuits, which work to an extent. The transistors successfully limit the current to 3 A each (near the maximum rating of the transistor). However, because of this shorting effect, the load voltage drops from 12 V down to 3 V - the remaining 9 V is dropped across the pass transistor - and the heat / power dissipation is significant.
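To put a number on that heat, here's a quick sketch using only the figures stated above (12 V supply, load sitting at 3 V, 3 A limit per transistor) - the pass transistor absorbs the voltage difference at the full limited current:

```python
# Per-transistor dissipation in a linear current limiter,
# using the figures from the question (not measured values).
V_supply = 12.0   # supply voltage (V)
V_load = 3.0      # voltage remaining across the electrodes (V)
I_limit = 3.0     # current limit per transistor (A)

# The transistor drops (V_supply - V_load) at I_limit, all as heat:
P_transistor = (V_supply - V_load) * I_limit
print(P_transistor)  # 27.0 W of heat per transistor
```

27 W in a single transistor is well into heatsink-and-fan territory, which matches the "significant heat" you're seeing.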
The aim: While the above works, my main concern is the inefficiency of this power-supply method (as it happens 3 V is adequate - the voltage isn't critical here). It wastes a lot of power and creates a lot of heat. So is it possible to create a power-supply circuit that 1) doesn't drop the voltage when shorted (it simply continues to limit current to 3 A regardless), and 2) doesn't require excessive cooling to maintain such operation?
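For context on how inefficient the linear approach is versus a switching (buck-type) constant-current supply: the sketch below compares the two using the question's own numbers. The 90% converter efficiency is an assumed typical figure for a small buck converter, not a measured value from this setup:

```python
# Linear limiter vs. hypothetical switching constant-current supply.
V_in, V_out, I = 12.0, 3.0, 3.0   # figures from the question

P_load = V_out * I           # useful power in the load: 9 W
P_in_linear = V_in * I       # linear limiter draws 12 V * 3 A = 36 W
eff_linear = P_load / P_in_linear   # 0.25 -> only 25% reaches the load

eff_buck = 0.90                     # ASSUMED typical buck efficiency
P_in_buck = P_load / eff_buck       # ~10 W drawn from the supply
P_heat_buck = P_in_buck - P_load    # ~1 W of heat instead of 27 W
print(eff_linear, round(P_heat_buck, 1))  # 0.25 1.0
```

In other words, a converter that steps the voltage down rather than burning off the excess would dissipate roughly a watt instead of tens of watts, which is the kind of improvement the question is after.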
Thoughts: A new current-limiting circuit, using different transistors, or better regulator control?
Lower the current-limiting threshold of the transistors (just use more transistors, each set lower)?
Use a transformer that only supplies a specific current, regardless of draw? Does such a thing exist?
Any ideas, thoughts or suggestions would be most appreciated. Useful learning references would also be welcome.
Thanks in advance. Rich