Hey, you just caught me editing the post for typos!
BT, I have to say there are things I don't understand yet. Will I still need the LED board at all or will the PIC convince the power board that all is ok?
You shouldn't need the LED board at all! Although I have only run this for 4 minutes at a time, I only concentrated on what the LED board sent out when it wasn't throwing an error. It was consistent, and I tried to account for every situation - start, stop, speed up/down, key removed. Every other feature the LED board has, like changing programs etc., can only affect the speed value sent to the power/motor board. As this unit didn't have an incline motor, all it can really do is control the motor.
When the 2 packets are sent to set the speed will the controller stay at that speed or does it need a constant feed of identical packets?
It appears to need constant packets, regardless of whether the speed value is changed.
To change speed (depending on the answer above), would I simply change the binary value all at once and let the motor board ramp the speed up or down, or do I need to change the binary one bit at a time?
Good question. Given that the LED board sends out speed values that are incremented at the start (slow start-up), I believe it is down to the control signal to determine the slow start - the motor board just sets the speed to what it's told.
I'm sure it won't change speed 'instantly', but say you set it to 4/32, then suddenly 24/32 - I don't think it would take long to change speed, maybe a second?
If it really does require the speed to only change one step at a time...
Perhaps it sends each one 'twice' for error detection, meaning both speed packets will have the same speed value, which can only change once it sends the next two speed packets. Updating the speed value would then happen every 4 packets (two speed, two control), giving a maximum 'change' of speed of 1/32 (assuming a 5-bit speed value) every ~400ms. That would mean if you were to increment the speed as quickly as it wants (one step at a time), it would take over 12 seconds to go from 0 to max rpm. That seems far too high to me, so although it's not the best reason to think so... I'm fairly sure the speed value is absolute, and can be changed by any amount in the next packet - any remaining ramp up/down period would be down to the power board's micro.
One other thing that puzzles me is the timing issues. I have never had much success with interrupts and tend to avoid them if I can. Will I need them, or is there a way I can create the packets without them? Maths is not my strong point, so any simple way is always best for me.
Plus any way you can simplify the method will always benefit others with the same problems.
Yeah, I used to avoid interrupts wherever possible.
But once you start using timers as 'heartbeats' for periodic updating of things, you start to get into the mindset of 'synchronous events' that need to be done at regular intervals, and asynchronous events that can be done whenever there's time, with no real urgency. It can get complicated very quickly, but it also allows a micro to do a hell of a lot... and react more quickly to events.
With a PIC pumping out the signal to the motor, if you don't use interrupts - only fixed delays between output transitions - then your micro will be tied up 'doing stuff' for the entire packet. But even then that's only 60ms out of every 100ms, so 60% of the time will be busy. Inefficient, yes, but that still leaves 40ms in every 100ms to read ADCs, a switch, calculate speeds etc., and on a PIC running at 4MHz, that still gives you 40k instructions.
If you were to use the PIC to read in serial data, or capture fast events, then interrupts would have to be used. For safety I would use at least one interrupt - the stop button. If the PIC has just started sending one packet and you hit the stop button, it won't check that button for at least 60ms (whilst it's kicking out bits), and even then you need to wait maybe another 20ms to check it wasn't a glitch (button debounce), so already we're approaching a 100ms delay between hitting an emergency stop and the motor *starting* to stop. Even so, 100ms isn't that long.
So, do you need interrupts? Naaa. It would be nice, though! The output signal changes every ~1.625ms, which on a 4MHz PIC is... 1625 instructions. Even if the interrupt uses 125 instructions, that means you have 1500 to play with *between* bits in the packet. Add on the large gap between packets and you have a situation where your PIC is free to do other things 95% of the time, as opposed to 40%.
I would just use Timer1 to overflow every 100ms, and in your interrupt, just raise a flag. In your main code you check that flag; if set, you send a packet, then increment a 2-bit counter that determines which 'packet' you send (two speed, two control = 4 packets). After it has sent the packet, it reads the ADC and checks the status of the buttons - and makes the appropriate changes to speed, or starts/stops the signal, then clears the timer flag. Then it just loops back, waiting for this flag to be set again (by the timer interrupt). As long as you do your ADC reading and button checking within 40ms, it'll be fine. The timer just ensures a packet is sent every 100ms.
It's simple, maybe not the most efficient, but it'll work, and you'll read your ADC/buttons every 100ms. If it seems slow to react, you can always knock something more complicated up later (using interrupts).
Can't thank you enough for all the work you put into this project, mate. I know it's doubtful, but if you ever get stuck and I can help, I'm there like a shot.
Any time sir
My chicken dinner awaits!
BT