High precision timing on a PC.

Armitage

Banned
Feb 23, 2001
8,086
0
0
So, for a project I'm speccing out, I'm going to need to grab data off an external device (a CCD) and give it a very high precision timetag for the start and end times of the exposure. I need to get the accuracy on the timestamp down to less than 1 ms. A PC clock with a good NTP sync might be good to 10s of ms ... obviously not good enough.

So I was thinking about hanging one of these off the box. They get down to about 10us syncing to CDMA cell phone signals. GPS based time references might be an option also.

Has anybody done this kind of stuff before? Suggestions on where to start wrt this kind of hardware & programming?
Thanks
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Armitage
So, for a project I'm speccing out, I'm going to need to grab data off an external device (a CCD) and give it a very high precision timetag for the start and end times of the exposure. I need to get the accuracy on the timestamp down to less than 1 ms. A PC clock with a good NTP sync might be good to 10s of ms ... obviously not good enough.

So I was thinking about hanging one of these off the box. They get down to about 10us syncing to CDMA cell phone signals. GPS based time references might be an option also.

Has anybody done this kind of stuff before? Suggestions on where to start wrt this kind of hardware & programming?
Thanks

Well... how complex you need to get depends on what exactly you're trying to do.

PCs can be *very* good relative timers (accurate down to microseconds), although to get this level of accuracy you need to either run an RTOS or, if you're using a Windows program, run your threads at REALTIME priority, which keeps the scheduler from messing with you while you wait. However, getting a very, very accurate *absolute* timestamp (with regard to some sort of external clock) can be tough. That's where a gizmo like that time sync thing might come in handy, since it can get very accurate, low-latency timing measurements from a remote source.
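To illustrate the relative-timing side (a Linux sketch rather than the Windows setup described above; `SCHED_FIFO` is the POSIX analogue of REALTIME priority, and the priority value 50 is arbitrary):

```python
# Sketch: real-time scheduling plus a monotonic high-resolution clock on
# Linux. SCHED_FIFO needs root (or CAP_SYS_NICE); fall back silently if not.
import os
import time

try:
    os.sched_setscheduler(0, os.SCHED_FIFO, os.sched_param(50))
except (PermissionError, AttributeError):
    pass  # not root, or not Linux: the scheduler may still preempt us

t0 = time.clock_gettime(time.CLOCK_MONOTONIC)  # relative time, never steps
# ... wait for / handle the event of interest here ...
t1 = time.clock_gettime(time.CLOCK_MONOTONIC)

elapsed_us = (t1 - t0) * 1e6  # microsecond-resolution interval
```

Note the split: `CLOCK_MONOTONIC` is what you want for *relative* intervals, while an NTP/GPS-disciplined `CLOCK_REALTIME` is what you'd read for an *absolute* stamp.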

You haven't really provided enough information. Are you polling the external device? Does it trigger something that tells you when to start capturing data? Is it delivered in a packetized form, or are you capturing analog data? What kind of interface hardware are you using? You say you need to timestamp the start and end times of an "exposure" -- are you setting the length of the exposure yourself, or polling some sort of external flag, or having something trigger an interrupt to tell you when to stop?
 

Lynx516

Senior member
Apr 20, 2003
272
0
0
The most elegant solution to this would be to use a microcontroller to grab the data off the CCD and then forward it to the PC. This way you can accurately (down to the clock cycle) grab data at the correct time.

 

Armitage

Banned
Feb 23, 2001
8,086
0
0
Yea, I expect I'll have to use an RTOS ... maybe one of the RT Linux hacks. Something else I need to learn about.

Basically, I will trigger the "exposure" ... i.e. the start & stop times of the CCD integration. The precise time that the exposure starts and stops isn't important so long as I know what it is. What I mean is, if I ask for the CCD exposure to start @ 01:00:00.000000, but it actually starts at 01:00:00.010000, that's OK so long as I know what time it actually started. The interface to the CCD will likely be USB. The problem of characterizing the CCD hardware & software is a whole other issue that I haven't considered just yet.

Right now I'm interested in how to access & use an external time reference like this, and whether it's really feasible to get sub-millisecond accuracy on PC hardware, even with this sort of external time reference. I guess a primer on real-time OSen & programming would be in order as well.
 

Armitage

Banned
Feb 23, 2001
8,086
0
0
Originally posted by: Lynx516
The most elegant solution to this would be to use a microcontroller to grab the data off the CCD and then forward it to the PC. This way you can accurately (down to the clock cycle) grab data at the correct time.

Well, the CCD package will have a microcontroller on board, I suspect, and the clock cycles on there (if I have access to them?) will likely be as good as it gets for relative timing ... i.e. the length of the exposure. But how to correlate that back to absolute UTC time is the problem.

As I see it, I have the following issues:

1. How to access a very accurate absolute time on the PC. This one should be fairly easy with the right HW.
2. How to correlate this very accurate timestamp with some action (start of the CCD integration)
3. How to get the actual integration time. Clock cycles on the CCD microcontroller?

I'm making this up as I go along, if you haven't noticed :D
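For issue #2, one common approach (a sketch only; `trigger_exposure` is a hypothetical stand-in for whatever USB call actually starts the integration) is to bracket the trigger with two timestamps and take the midpoint, with half the window as the error bound:

```python
import time

def trigger_exposure():
    """Hypothetical stand-in for the real call that starts CCD integration."""
    pass

t_before = time.clock_gettime(time.CLOCK_REALTIME)
trigger_exposure()
t_after = time.clock_gettime(time.CLOCK_REALTIME)

t_start = (t_before + t_after) / 2      # best estimate of the trigger time
uncertainty = (t_after - t_before) / 2  # seconds; must stay under 0.5e-3
                                        # for the 1 ms goal
```

On a loaded non-real-time OS that bracketing window can blow up to milliseconds, which is where the RTOS comes back in.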
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Armitage
Originally posted by: Lynx516
The most elegant solution to this would be to use a microcontroller to grab the data off the CCD and then forward it to the PC. This way you can accurately (down to the clock cycle) grab data at the correct time.

Well, the CCD package will have a microcontroller on board, I suspect, and the clock cycles on there (if I have access to them?) will likely be as good as it gets for relative timing ... i.e. the length of the exposure. But how to correlate that back to absolute UTC time is the problem.

As I see it, I have the following issues:

1. How to access a very accurate absolute time on the PC. This one should be fairly easy with the right HW.
2. How to correlate this very accurate timestamp with some action (start of the CCD integration)
3. How to get the actual integration time. Clock cycles on the CCD microcontroller?

I'm making this up as I go along, if you haven't noticed :D

The problem is that the answers to questions 2 and 3 are *highly* dependent on your software and hardware setup, the exact requirements for your project, and how precise you need to be. You need more details on the system, and probably some more research, before anyone (including yourself :p) is going to be able to answer those questions.

 

f95toli

Golden Member
Nov 21, 2002
1,547
0
0
Another thing you need to specify is how accurate the time-stamp needs to be over time. How long would a typical measurement be? Hours, days (weeks, months, years)? ms accuracy over a period of several weeks is probably not possible with your typical PC clock.
And do you need to know the absolute time or is relative time enough?
 

Armitage

Banned
Feb 23, 2001
8,086
0
0
Originally posted by: f95toli
Another thing you need to specify is how accurate the time-stamp needs to be over time. How long would a typical measurement be? Hours, days (weeks, months, years)? ms accuracy over a period of several weeks is probably not possible with your typical PC clock.
And do you need to know the absolute time or is relative time enough?

Yea, ms accuracy from a PC clock is definitely out for absolute time stamps. For relative timestamps, you might get that for periods of 10s of minutes. In any case, that's why I need to go to an external time & frequency reference as noted in the 1st post.

A typical exposure would be on the order of tenths of a second. The actual length of the exposure does not need to be controlled to very high precision ... a hundredth of a second, maybe. IOW, if I want a 0.1 second exposure and get 0.11 or 0.09 seconds, no big deal.

But the absolute time for the endpoints of the exposure needs to be known to microsecond accuracy.
 

Armitage

Banned
Feb 23, 2001
8,086
0
0
Originally posted by: Matthias99
Originally posted by: Armitage
Originally posted by: Lynx516
The most elegant solution to this would be to use a microcontroller to grab the data off the CCD and then forward it to the PC. This way you can accurately (down to the clock cycle) grab data at the correct time.

Well, the CCD package will have a microcontroller on board, I suspect, and the clock cycles on there (if I have access to them?) will likely be as good as it gets for relative timing ... i.e. the length of the exposure. But how to correlate that back to absolute UTC time is the problem.

As I see it, I have the following issues:

1. How to access a very accurate absolute time on the PC. This one should be fairly easy with the right HW.
2. How to correlate this very accurate timestamp with some action (start of the CCD integration)
3. How to get the actual integration time. Clock cycles on the CCD microcontroller?

I'm making this up as I go along, if you haven't noticed :D

The problem is that the answers to questions 2 and 3 are *highly* dependent on your software and hardware setup, the exact requirements for your project, and how precise you need to be. You need more details on the system, and probably some more research, before anyone (including yourself :p) is going to be able to answer those questions.

Yea, I know. I guess I'm trying to figure out if it's feasible at all, and what kind of hardware & software would be required to accomplish it. The question here is likely premature! Just looking for where to start & a general sanity check.

FWIW, initially the CCD would likely be a webcam.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Armitage
But the absolute time for the endpoints of the exposure needs to be known to microsecond accuracy.

This is a very tough constraint -- why do you need such high accuracy on the absolute time? Are you trying to capture a very short-lived event or something? It sounds like you don't care very much about the exposure length, though... not making any sense to me. :p
 

Armitage

Banned
Feb 23, 2001
8,086
0
0
The application I have in mind is tracking and taking data on satellites with amateur-class telescopes. Believe it or not, there is a large community of amateur astronomers using webcams to take deep space & planetary images. I want to try it with satellites.

To get metric data, you keep the scope stationary, and the satellite moves across the field of view leaving a streak. The start & stop times of the exposure correlate to the end points of the streak. You can figure out the angular position of those endpoints by measuring relative to the surrounding starfield. But the position data has to have accurate time associated with it to be useful.

Depending on the orbit & the field of view (FOV) of the scope, the satellite could be in view from a fraction of a second to several seconds. For the configuration I'm favoring at the moment, most LEO satellites will transit a single pixel ... the best measurement I can get off of a CCD ... in < 1 ms. It'd be great if I could get my time hack down to microseconds to support that same resolution. If not, it's still useful, but that's the goal!
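An order-of-magnitude check on that pixel-transit number (every figure below is an illustrative assumption, not a value from the thread):

```python
# Rough pixel-transit time for a LEO pass. All numbers are assumptions:
leo_rate_deg_s = 1.1     # apparent angular rate of a low LEO satellite
fov_deg = 0.5            # assumed telescope field of view
pixels_across = 640      # webcam-class sensor width

pixel_scale_deg = fov_deg / pixels_across     # ~2.8 arcsec per pixel
transit_s = pixel_scale_deg / leo_rate_deg_s  # time to cross one pixel
# transit_s works out to roughly 0.0007 s, i.e. under 1 ms per pixel
```

So with these assumed numbers the sub-millisecond figure holds, and a microsecond-level time hack would indeed use the full spatial resolution of the sensor.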

As far as the exposure length ... as long as you expose long enough to get a good star background, and the satellite is bright enough to leave a streak, you're good. The precise exposure time isn't important, so long as you know the endpoints.
 

harrkev

Senior member
May 10, 2004
659
0
71
I did something like this once, years ago.

The key is to use DOS. You do NOT want to use Windows, as it is not set up for this type of thing.

The following info is from memory, which may be fuzzy. But the general idea is correct...

There is a timer chip in the PC. It has been there since the original PC back in the early 80's, except that modern computers build it into the chipset. This timer "chip" has three separate timers: one controls the PC speaker (the one on the mobo), and another is used to generate the interrupts in Windows that do the time-slicing. I believe the clock rate going into this chip is something like 1.19 MHz.

Sooo, you have to bypass Windows, and then have software which will set this timer chip. Then have a polling loop wait until it goes off and do your business.

Go to this web site. It also involves Linux and using the speaker, but there is a lot of info to get you started:
http://www.linuxgazette.com/issue69/mathew.html
According to this article, the chip is an 8254. You should be able to get a datasheet for this chip. Otherwise, you can go to the Intel web site and download some chipset datasheets. Since EVERY PC has this chip installed, the datasheet for ANY chipset should help, as long as it is a good datasheet.
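Reading the 8254 itself takes port I/O (DOS, or `ioperm`/`outb` as root under Linux), but the count-to-time conversion can be shown portably. The PIT's input clock on PC-compatibles is 1,193,182 Hz:

```python
PIT_HZ = 1_193_182  # the PC's historical timer input clock, ~1.19 MHz

def pit_counts_to_us(counts: int) -> float:
    """Convert 8254 input-clock ticks to microseconds."""
    return counts * 1e6 / PIT_HZ

# The 16-bit counter wraps every 65536 ticks, so on its own it can only
# time intervals up to about 55 ms:
wrap_ms = pit_counts_to_us(65536) / 1000
```

That ~55 ms wrap is why DOS-era code chained the PIT with a software tick count, and it underlines the point later in the thread that this gives relative, not absolute, time.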
 

Armitage

Banned
Feb 23, 2001
8,086
0
0
Originally posted by: harrkev
I did something like this once, years ago.

The key is to use DOS. You do NOT want to use Windows, as it is not set up for this type of thing.

The following info is from memory, which may be fuzzy. But the general idea is correct...

There is a timer chip in the PC. It has been there since the original PC back in the early 80's, except that modern computers build it into the chipset. This timer "chip" has three separate timers: one controls the PC speaker (the one on the mobo), and another is used to generate the interrupts in Windows that do the time-slicing. I believe the clock rate going into this chip is something like 1.19 MHz.

Sooo, you have to bypass Windows, and then have software which will set this timer chip. Then have a polling loop wait until it goes off and do your business.

Go to this web site. It also involves Linux and using the speaker, but there is a lot of info to get you started:
http://www.linuxgazette.com/issue69/mathew.html
According to this article, the chip is an 8254. You should be able to get a datasheet for this chip. Otherwise, you can go to the Intel web site and download some chipset datasheets. Since EVERY PC has this chip installed, the datasheet for ANY chipset should help, as long as it is a good datasheet.

An interesting article in its own right, but as I need absolute time hacks, I don't think it helps. This approach can only really get you relative time, and likely only good to ms over short durations.
Thanks