Varying voltage in a circuit

Kisen

Member
Hi,

I suspect that what I am about to ask will be OK, but wanted to get opinions.

I have a design I am working on. It is a low-power design running from a coin cell. The coin cell voltage is boosted to 5 V and stored in a supercapacitor.
The supercap then feeds a variable regulator that I can switch between two preset voltages, in my case 2.0 V and 3.3 V.

Typically I intend for the regulator to sit at 2.0 V, not doing anything particularly exciting. An MCU wakes up periodically and reads some data from sensors, EEPROM, etc. These devices are all within spec running at 2.0 V, and all are rated for up to 3.6 V.

There are, however, two occasions where I need the higher 3.3 V supply: transmitting RF, and driving an LED controller.

I plan to periodically switch the regulator to 3.3 V, which will also raise the supply voltage to the aforementioned devices. The voltage change has slew control, so it isn't "violent", and I can account for the change in software, i.e. waiting for the voltage to finish transitioning.
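
Roughly what I mean by accounting for it in software. This is only a sketch; the pin number and the gpio/adc/delay helpers are placeholders for whatever the MCU's HAL provides, not a real vendor API:

/* Sketch of the rail-switch sequence. PIN_VSEL, gpio_write(),
   adc_read_rail_mv() and delay_ms() are placeholder names. */

#include <stdint.h>
#include <stdbool.h>

#define PIN_VSEL     7      /* regulator select: 0 = 2.0 V, 1 = 3.3 V */
#define RAIL_3V3_MV  3300
#define RAIL_TOL_MV  50     /* accept within 50 mV of target */

extern void     gpio_write(int pin, int level);
extern uint16_t adc_read_rail_mv(void);   /* rail voltage in mV */
extern void     delay_ms(uint32_t ms);

/* Raise the rail to 3.3 V and block until it has settled,
   so RF/LED work only starts on a stable supply. */
static bool rail_raise_3v3(void)
{
    gpio_write(PIN_VSEL, 1);
    for (int i = 0; i < 100; i++) {        /* ~100 ms timeout */
        if (adc_read_rail_mv() >= RAIL_3V3_MV - RAIL_TOL_MV)
            return true;                   /* safe to start RF/LEDs */
        delay_ms(1);
    }
    return false;                          /* rail never settled */
}

The same pattern in reverse (switch to 2.0 V, wait for the rail to fall into tolerance) would cover the return to the low-power state.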

I have never done this before; my supply voltages are normally a stable 3.3 V or similar. So my question is...

Does a device usually suffer any negative effect from such voltage changes when they are done in a controlled manner?

Thanks
 
Why not just run at 3.3 V all the time? Changing supply voltages is quite likely to cause intermittent problems.

I'm also dubious about the point of the supercap; generally people seem to try to use them for purposes they aren't suitable for, and of course a coin cell has very little capacity to start with.

As in all threads of this type, you'd do a lot better to tell us EXACTLY what you're trying to do, and why this way.
 
The purpose of the supercap is to store energy that can be discharged quickly. The coin cell has about 600 mAh of capacity, but discharging it at more than a few mA will shorten its overall life. So I am feeding the energy into the supercap with a charge current limit of 1 mA. This takes time, so the energy stored in it should be used wisely.
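
To put a number on "takes time", using I = C·dV/dt for a constant 1 mA into 1 F:

dV/dt = I / C = 1 mA / 1 F = 1 mV/s
t(3.3 V -> 5.0 V) = 1.7 V / 1 mV/s = 1700 s  (~28 min)

and that ignores leakage and converter losses, so the real recovery is slower still.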

For a 1 F supercap, the usable energy between 3.3 V (circuit minimum) and 5.0 V (fully charged) is ~7 J, whereas the energy between 2.0 V and 5.0 V is ~10.5 J. Running at 3.3 V 100% of the time clearly gives me less usable energy. So the idea is to use 3.3 V for demanding jobs like the LEDs and RF transmission, then once those are done, fall back to 2.0 V, where there is plenty of usable energy available while the supercap charge recovers.
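
For anyone checking those figures, they come from the capacitor energy formula E = ½·C·V², taking the difference between the two voltage levels:

E(5.0 V -> 3.3 V) = 0.5 × 1 F × (5.0² − 3.3²) ≈ 7.1 J
E(5.0 V -> 2.0 V) = 0.5 × 1 F × (5.0² − 2.0²) ≈ 10.5 J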


Nigel Goodwin, you mention intermittent problems... this is what I would like to get to the bottom of.
So, to ask the question again: does a device suffer in any way when its voltage moves between two fixed points within its acceptable voltage range?

For example... an EEPROM with a 1.6 V to 3.6 V range. If I design it to work at 1.8 V, it happily will. If I design it to work at 3.3 V, it happily will. But what happens if it does some work at 1.8 V, then moves to 3.3 V and does some work, then goes back to 1.8 V again?
 

One idea would be to keep it regulated at 2 V while the power-hungry parts get the 3.3 V.

Another idea would be to change the voltage gradually over time. Even a second is probably enough.
 