Hello everybody
Same here! Thank you for being patient with me...
Nigel Goodwin said:
demestav said:
Well, I agree with the calculations in both posts. What seems strange to me is this: "so digit 1 gets 1A for 0.5 seconds, then digit 2 gets 1A for 0.5 seconds". Why should each digit get 0.5 seconds? All it needs is a few milliseconds, right?
Yes, the time examples were only to show that the actual time doesn't affect the brightness, and as I explained the 0.5 second example would exhibit extreme flickering.
Do not get me wrong, Nigel. I know you used 0.5 seconds only as an example. What I meant is: why do you change the time needed to update a single digit? If your code needs 10us to update one digit, then that's it: one digit takes 10us, two digits take 20us, three digits take 30us, and so on. When the refresh rate changes, that per-digit time should stay the same. If you decide to refresh the display 10 times per second, refreshing each digit should still take the same 10us, since your code for refreshing one digit has not changed (i.e. it takes the same time as before).
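To put rough numbers on that, here is a minimal sketch (the 10us per-digit figure, the digit counts and the names are just assumptions for illustration, not your actual code) showing that one refresh pass simply costs the number of digits times the fixed per-digit time, whatever refresh rate you pick:

Code:
#include <stdio.h>

#define T_DIGIT_US 10.0   /* time the code needs to drive one digit (assumed figure) */

/* one full pass over the display just repeats the same per-digit routine */
static double one_pass_us(int num_digits)
{
    return num_digits * T_DIGIT_US;
}

int main(void)
{
    for (int digits = 1; digits <= 4; digits++)
        printf("%d digit(s): one refresh pass takes %.0f us\n",
               digits, one_pass_us(digits));

    /* changing the refresh rate only changes how often this pass runs,
       not how long each digit takes inside it */
    return 0;
}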
EXAMPLE TAKEN FROM YOUR POST
1Hz, current set at 1A by a resistor, so digit 1 gets 1A for 0.5 seconds, then digit 2 gets 1A for 0.5 seconds - so each digit gets an average of 0.5A, but the display will obviously exhibit extreme flicker!
1000Hz, same resistor: digit 1 gets 1A for 0.5ms, then digit 2 gets 1A for 0.5ms, so each digit gets EXACTLY the same average of 0.5A.
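Just to check the arithmetic in the quoted example, here is a small sketch (the 1A peak and the two digits come from the quote; the assumption that each digit is driven for half of the refresh period is how I read your example):

Code:
#include <stdio.h>

int main(void)
{
    const double peak_a     = 1.0;          /* peak current set by the resistor (from the quote) */
    const int    num_digits = 2;
    const double rates_hz[] = { 1.0, 1000.0 };

    for (int i = 0; i < 2; i++) {
        double period_s  = 1.0 / rates_hz[i];
        /* in the quoted example each digit is driven for half the refresh period */
        double on_time_s = period_s / num_digits;
        double avg_a     = peak_a * on_time_s / period_s;
        printf("%7.0f Hz: on-time per digit = %g s, average current = %.2f A\n",
               rates_hz[i], on_time_s, avg_a);
    }
    return 0;
}

Either way the duty cycle is 50%, so the average works out to 0.5A at both frequencies, which is exactly the point of the quote.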
How I see it: starting from the second example, I follow you. Each digit takes 0.5ms to refresh, every 1ms.
Now, going to the first example: since the refresh is 1Hz, the whole display should be updated once every second. Fair enough. However, why is the time needed to update one digit increased to 0.5 seconds? As I see it, it should be the same routine as in the example above, i.e. taking 0.5ms to refresh one digit. Therefore the brightness should be much lower, since each digit carries current for only 0.5ms every second, instead of 0.5ms every 1ms as in the previous example.
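And here is the same calculation under my reading, where the per-digit on-time stays fixed at 0.5ms and only the refresh rate drops to 1Hz (again just a rough sketch to show where we disagree, not anyone's actual code):

Code:
#include <stdio.h>

int main(void)
{
    const double peak_a     = 1.0;       /* same 1A peak as in the quoted example */
    const double on_time_s  = 0.5e-3;    /* per-digit on-time fixed at 0.5ms by the code (my reading) */
    const double rates_hz[] = { 1000.0, 1.0 };

    for (int i = 0; i < 2; i++) {
        double period_s = 1.0 / rates_hz[i];
        /* the digit is lit for 0.5ms once per refresh period */
        double avg_a    = peak_a * on_time_s / period_s;
        printf("%7.0f Hz: lit 0.5 ms every %g s, average current = %.4f A\n",
               rates_hz[i], period_s, avg_a);
    }
    return 0;
}

Under this reading the average current at 1Hz drops by a factor of 1000, which is why I would expect the display to go very dim rather than merely flicker.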
But as I told Mike, I think the problem is in how the refresh rate is defined. My point of view is shown in the figure I posted yesterday, which I think explains it well. If you look at it, I am sure you will understand how I see it.
Thank you for reading.