Hi,
I suspect that what I am about to ask will be OK, but I wanted to get some opinions.
I have a design I am working on. It is a low-power design running from a coin cell. The coin cell voltage is boosted to 5V and stored on a supercapacitor.
The supercap then feeds a variable regulator that I can switch between one of two preset voltages, in my case 2.0V and 3.3V.
Typically I intend for the regulator to sit at 2.0V and not do anything particularly exciting: an MCU wakes up periodically and reads some data from sensors, EEPROM, etc. These devices are all within spec running at 2.0V, and all are rated for up to 3.6V.
There are, however, two occasions where I need the higher 3.3V supply: transmitting RF and driving an LED controller.
I plan to periodically switch the regulator to 3.3V, which will also raise the supply voltage to the aforementioned devices. The voltage change has slew-rate control so it is not "violent", and I can account for the change in software, i.e. waiting for the voltage to finish transitioning.
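For reference, the switch-and-wait sequence I have in mind in software is roughly the sketch below. The pin helper, ADC read, delay and threshold are all placeholders, not tied to any specific MCU or regulator:

```c
/* Rough sketch of the rail-switch sequence.
 * All pin/ADC helpers are placeholders for whatever the real
 * MCU's HAL provides - none of this is tied to a specific part. */

#include <stdbool.h>
#include <stdint.h>

#define VSEL_2V0    0       /* regulator select pin low  -> 2.0V preset */
#define VSEL_3V3    1       /* regulator select pin high -> 3.3V preset */
#define VDD_OK_MV   3200    /* millivolts: rail considered "settled"   */

/* Placeholder HAL hooks - to be replaced with the real MCU calls. */
static void     vsel_pin_write(int level) { (void)level; }
static uint16_t vdd_read_mv(void)         { return 3300; }
static void     delay_ms(uint32_t ms)     { (void)ms; }

/* Raise the rail to 3.3V and block until it has actually slewed up,
 * by polling the supply via the ADC with a worst-case timeout. */
static bool rail_goto_3v3(void)
{
    vsel_pin_write(VSEL_3V3);

    for (int i = 0; i < 50; i++) {      /* ~50ms timeout at 1ms steps */
        if (vdd_read_mv() >= VDD_OK_MV)
            return true;                /* rail has settled, safe to TX */
        delay_ms(1);
    }
    return false;                       /* rail never came up - skip TX */
}

static void rail_goto_2v0(void)
{
    vsel_pin_write(VSEL_2V0);
    /* No wait needed here: dropping back down only has to finish
     * before the next time 3.3V is requested. */
}

int main(void)
{
    if (rail_goto_3v3()) {
        /* ... RF transmit / LED controller work goes here ... */
    }
    rail_goto_2v0();
    return 0;
}
```

Whether I poll the rail like this or just use a fixed worst-case delay will depend on what the regulator's slew control actually guarantees.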
I have never done this before; my supply voltages are normally a stable 3.3V or similar. So my question is...
Does a device usually suffer any negative effects from this kind of supply voltage change when it is done in a controlled manner?
Thanks