A few more clarifications. First, the 10.0273790935 figure: divide it by 10 and you get 1.00273790935, which is the number of SIDEREAL seconds that tick by in one second of SOLAR time. In other words, a sidereal clock runs just a hair faster than a standard time clock. So, when a sidereal clock reaches a full 24 hours, only 23 hours, 56 minutes, and 4.1 seconds of solar time have elapsed. That daily difference of 3 minutes, 56 seconds adds up to one whole extra rotation over the course of a year, which is why the stars come back to the same spot at the same clock time every twelve months. Notice how the sun rises just a little earlier every morning lately? Well, so do the stars. For instance, I go out in my back yard and see a beautiful star juxtaposed between a couple of mountains. I want a photo, so the next night I set up my camera and the star's not there. Without compensating for the difference between solar time and sidereal time, I missed it. It had moved on 3 minutes and 56 seconds earlier.
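A quick back-of-the-envelope check of that arithmetic in Python, using nothing but the 1.00273790935 ratio (plain standard-library code, no clock hardware assumed):

RATIO = 1.00273790935   # sidereal seconds that elapse per solar second

# How much solar time passes while a sidereal clock counts a full 24 hours?
sidereal_day_solar_s = 24 * 3600 / RATIO          # about 86164.09 solar seconds

h, rem = divmod(sidereal_day_solar_s, 3600)
m, s = divmod(rem, 60)
print(f"24h sidereal = {int(h)}h {int(m)}m {s:.1f}s solar")   # 23h 56m 4.1s

# Daily gain of the sidereal clock over the solar clock:
gain = 24 * 3600 - sidereal_day_solar_s
print(f"daily gain: {gain:.1f} s (about 3 min 56 s)")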
Well, I've been told that the lowest crystal frequency I could get with this ratio is 10.0273790935 MHz. Please notice the "MHz" there. Using a 10 Meg + crystal requires 6 stages of division. I wouldn't mind that except for the creeping error that I KNOW I would get, and that's what I need to solve. If I can get the clock frequency I need, the rest of the circuit is easy.
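For what it's worth, here's a rough Python sketch of where the error would actually come from. It assumes a chain of decade dividers bringing the crystal down to one pulse per sidereal second, plus a generic +/-20 ppm crystal tolerance; both of those numbers are my own assumptions, not anything settled in this thread. The point it makes is that dividing by an exact integer adds no error of its own, so any creep comes from the crystal itself (tolerance, aging, temperature), not from the number of divider stages:

CRYSTAL_HZ = 10_027_379.0935    # the 10.0273790935 MHz crystal
DIVIDE_BY  = 10**7              # assumed decade-divider chain down to ~1 Hz
TOL_PPM    = 20e-6              # assumed +/-20 ppm crystal tolerance

out_hz = CRYSTAL_HZ / DIVIDE_BY
print(f"divided output: {out_hz:.10f} Hz = one pulse per sidereal second")

# Integer division is exact; the drift budget is set by the crystal alone.
drift_per_day = TOL_PPM * 86164.09      # seconds gained or lost per sidereal day
print(f"worst-case drift at 20 ppm: +/-{drift_per_day:.2f} s per day")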
As for reader "mneary," who asks "How will I prove it?": simple. I take that same camera, tripod, and azimuthal motor drive, point it at the star when I see it, and lock it down. The next night, 3 minutes and 56 seconds earlier, I look through the camera and there it is. I do this every night (clear weather permitting) for a few weeks, months, years, whatever, and if I keep seeing the same star through the camera, I'd say that's proof enough.
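If it helps with planning those follow-up nights, here's a small Python sketch that prints when to look on each successive night, shifting roughly 3 minutes 56 seconds earlier each time. The starting date and time are placeholders only:

from datetime import datetime, timedelta

SHIFT = timedelta(seconds=86400 - 86164.0905)    # ~3 min 55.9 s per night

first_look = datetime(2024, 1, 1, 21, 0, 0)      # placeholder: 9:00 pm on night one
for night in range(7):                           # a week of clear skies, hopefully
    t = first_look + night * (timedelta(days=1) - SHIFT)
    print(f"night {night + 1}: look through the camera around {t:%H:%M:%S}")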
I'm sorry if this comes off as being pompous or asinine, but I like astrophotography and just plain stargazing.
Any help or ideas are most welcome. Thank you all. Peace. <ckd>