Of course, using your 25 W resistor wouldn't hurt either. 10 ohms might be close enough to 8.57 ohms for you, and a 25 W part would barely warm up in service, though it would be physically very large.
The above holds true only when the device always draws 350 mA. If the current varies and 350 mA is just the maximum, then all bets are off on using a resistor in series with the device. For example, a load like a lamp would work fine, but a load like a small amplifier driving speakers will not draw a constant current. What is the device in this case?
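As a sketch of the arithmetic behind the 8.57 ohm figure, assuming the load really does draw a fixed 350 mA (the constant-current case discussed above):

```python
# Sizing a series dropping resistor for a constant-current load.
# Assumes the load always draws exactly 0.35 A -- see the caveat above.
V_SUPPLY = 6.0   # battery voltage, volts
V_LOAD = 3.0     # voltage required by the load, volts
I_LOAD = 0.35    # load current, amps

v_drop = V_SUPPLY - V_LOAD   # voltage the resistor must drop
r = v_drop / I_LOAD          # Ohm's law: R = V / I
p = v_drop * I_LOAD          # power dissipated in the resistor

print(f"R = {r:.2f} ohms")   # about 8.57 ohms
print(f"P = {p:.2f} W")      # about 1.05 W, so a 25 W part barely warms
```

The roughly 1 W dissipation is why a 25 W resistor stays cool: it is rated for many times the actual power.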
I am using a 6 volt battery on a DC-to-DC converter that needs 3 volts at 350 mA, so am I doing the right thing using a resistor to drop the voltage from 6 volts to 3 volts at 350 mA?
Is there a better way to drop the voltage from 6 volts DC to 3 volts DC at 350 mA?
mrel
Most of the DC/DC converters I have worked with, from manufacturers like Cosel USA and Power One, generally have an input range rather than a fixed input voltage. Not knowing what you have and not seeing a data sheet, it is hard to guess.
Generally speaking, the current consumed by a DC/DC converter is a function of the output current and voltage. If the load varies, then the input current will vary. Should that be the case, and if the DC/DC converter requires a fixed input voltage, then no, using a resistor as part of what amounts to a voltage divider won't work.
Again, a link to the data sheet for what you have would help considerably.