The CMOS clock that keeps track of time and date on computers drifts at least a few times more than even the cheapest standalone clock or watch. Why is this, and why can't it match the accuracy of a standard digital watch?
A typical clock has a 32,768 Hz crystal that is divided by 2^15 to derive one pulse per second. Considering that cheap watches can be made for a dollar or two in quantity, I really don't understand why a motherboard clock cannot manage an accuracy better than 15 s/month.
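Just to show my arithmetic (a quick back-of-the-envelope sketch in Python, nothing motherboard-specific; the 30-day month is my simplification):

```python
# A 32,768 Hz watch crystal divided by 2^15 gives exactly 1 pulse per second.
crystal_hz = 32768
pulses_per_second = crystal_hz / 2**15   # -> 1.0

# 15 seconds of error per (30-day) month expressed as a frequency tolerance.
seconds_per_month = 30 * 24 * 3600       # ~2.59 million seconds
drift_ppm = 15 / seconds_per_month * 1e6

print(pulses_per_second)                 # 1.0
print(round(drift_ppm, 1))               # ~5.8 ppm crystal tolerance needed
```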
I've heard motherboard clock generators run at 20 MHz or so, and my guesstimates (checked in the sketch below) are:
67.108864 MHz divided by 2^26, or 33.554432 MHz divided by 2^25, or 16.777216 MHz divided by 2^24, or 4.194304 MHz divided by 2^22.
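These exact frequencies are just my assumption, not datasheet values; the sketch below only confirms that each one is an exact power of two in hertz, so a plain binary divider chain could reach 1 Hz:

```python
# Candidate oscillator frequencies (my guesses) and the divider needed for 1 Hz.
candidates_mhz = [67.108864, 33.554432, 16.777216, 4.194304]

for f in candidates_mhz:
    hz = round(f * 1e6)
    n = hz.bit_length() - 1              # exponent if hz is a power of two
    if hz == 2**n:
        print(f"{f} MHz = 2^{n} Hz -> divide by 2^{n} for 1 Hz")
    else:
        print(f"{f} MHz is not an exact power of two")
# Prints 2^26, 2^25, 2^24, 2^22 respectively.
```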
Does a motherboard have a different type of time base than a typical clock, and if it does, how is the time base derived on computer motherboards?