Hi all,
I'm working on practice exams for my final, which is next week, and came across a problem I wasn't able to solve. Any advice is highly appreciated. The question is as follows:

You have an embedded system using an MCU with an 8-bit count-up timer. You also have an external sensor that monitors the revolutions of a motor and generates an interrupt once per revolution. The timer, which ticks at a 100 kHz rate, doesn't generate interrupts; instead it sets a flag (roll) when it rolls over, and this flag must be manually reset. The question: how can the ISR calculate the motor RPM, and what is the minimum RPM value that can be calculated?

I haven't been able to solve this. Any help is highly appreciated. Thanks!
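My best attempt so far is the sketch below: the ISR reads the free-running count on each revolution interrupt and turns the tick difference into RPM. The register names (`TCNT`, `ROLL`) are placeholders I made up since the question doesn't name them, and I've used plain globals as stand-ins for the memory-mapped hardware registers so I could test the logic on my PC:

```c
#include <stdint.h>

/* Stand-ins for hardware registers -- on the real MCU these would be
   memory-mapped; the names TCNT/ROLL are my own placeholders. */
volatile uint8_t TCNT;   /* 8-bit count-up timer, ticks at 100 kHz  */
volatile uint8_t ROLL;   /* set by hardware on rollover, cleared manually */

#define TICK_HZ 100000UL /* timer tick rate given in the question */

static uint8_t  last_count; /* timer value at the previous revolution */
uint32_t rpm;               /* most recent RPM reading */

/* ISR: called once per motor revolution. */
void revolution_isr(void)
{
    uint8_t now = TCNT;
    /* Mod-256 subtraction on uint8_t gives the elapsed ticks even
       when the timer rolled over exactly once in between. */
    uint8_t ticks = (uint8_t)(now - last_count);
    last_count = now;

    if (ROLL) {
        ROLL = 0; /* the flag must be reset manually */
        /* A single flag can't tell me whether the timer rolled over
           more than once between interrupts, so a slow motor would
           give an ambiguous count -- I think that's where the
           "minimum RPM" part of the question comes from. */
    }

    if (ticks != 0) {
        /* period = ticks / TICK_HZ seconds,
           so RPM = 60 / period = 60 * TICK_HZ / ticks */
        rpm = (60UL * TICK_HZ) / ticks;
    }
}
```

My thinking is that the largest interval the ISR can resolve is bounded by the 8-bit range (256 ticks at 10 µs each), which would set the slowest measurable speed, but I'm not confident about the exact bound or how the roll flag changes it.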