I took pictures of almost every split clock I could find on the Chicago
Marathon course Sunday and compared them to the clock in my digital camera.
I found a few very minor differences.
These split times are in relation to the clock in my digital camera. I
really don't think it was the clock in the camera that did all the drifting.
Mile 1 OK
Mile 2 -2
Mile 3 -2
Mile 4 -1
Mile 5 OK
Mile 6 +1
Mile 7 -1
Mile 8 -1
Mile 9 OK
Mile 10 -1
Mile 11 OK
Mile 12 OK
Mile 13 -1
Mile 14 -1
Mile 15 -1
Mile 16 OK
Mile 17 OK
Mile 18 No split clock found
Mile 19 -1
Mile 20 -1
Mile 21 -1
Mile 22 OK
Mile 23 -1
Mile 24 OK
Mile 25 OK
Mile to go -1
Mile 26 OK
The finish line clock was used as the reference for the other clocks.
While they are close enough for government work, I think the differences show
that they were started by hand.
A quick explanation of how I did the above:
Using a picture I took of the finish line clock, I calculated that the start
of the race was when the clock on my camera said 7:31:04.
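That back-calculation is just a subtraction in seconds of the day. A minimal sketch of it (the finish-clock reading and camera timestamp below are made-up example values; only the 7:31:04 result comes from the note above):

```python
def to_seconds(hms):
    """Convert an H:MM:SS string to seconds of the day."""
    h, m, s = (int(x) for x in hms.split(":"))
    return h * 3600 + m * 60 + s

def to_hms(total):
    """Format seconds of the day back as H:MM:SS."""
    h, rem = divmod(total, 3600)
    m, s = divmod(rem, 60)
    return "%d:%02d:%02d" % (h, m, s)

# Hypothetical example: a finish-line photo taken at 11:46:34 camera time
# while the finish clock read 4:15:30 of race time.
start = to_seconds("11:46:34") - to_seconds("4:15:30")
print(to_hms(start))  # 7:31:04
```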
I wrote a short program to dig the time out of the JPG file, convert both
times to seconds of the day, subtract 7:31:04, and calculate the elapsed race
time. The program would then rename the .JPG, appending
"_Race_Time__?H??M??" to the end of the original file name.
After that all I had to do was view the photos with a split clock in them,
compare the clock to the "Race_Time__" in the file name, and see how far
apart the two were.
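The rename step above could be sketched like this (a rough reconstruction, not the original program; extracting the EXIF timestamp from the JPG, e.g. with Pillow's `getexif()`, is assumed to happen elsewhere, and the exact "?H??M??" file-name pattern is my guess at the intended format):

```python
import os

START = "7:31:04"  # camera time at the race start, from the finish-line photo

def to_seconds(hms):
    """Convert an H:MM:SS string to seconds of the day."""
    h, m, s = (int(x) for x in hms.split(":"))
    return h * 3600 + m * 60 + s

def race_time_suffix(camera_hms, start_hms=START):
    """Elapsed race time formatted like _Race_Time__4H15M30."""
    elapsed = to_seconds(camera_hms) - to_seconds(start_hms)
    h, rem = divmod(elapsed, 3600)
    m, s = divmod(rem, 60)
    return "_Race_Time__%dH%02dM%02d" % (h, m, s)

def rename_with_race_time(path, camera_hms):
    """Append the race-time tag to the photo's file name, keeping the extension."""
    base, ext = os.path.splitext(path)
    new_path = base + race_time_suffix(camera_hms) + ext
    os.rename(path, new_path)
    return new_path
```

For a photo whose EXIF time was 11:46:34, `race_time_suffix("11:46:34")` would tag it with an elapsed time of 4 hours, 15 minutes, 30 seconds.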
I've done this before with a bunch of finish line photos; over two hours,
only one photo was 1 second off. Since it has worked so well before, I was
surprised to see how many clocks at Chicago were 2 seconds off.
Note: Unlike the file time used by IBM, which is always an even second,
the time stored in the JPG itself can have an odd second.
Any questions or ideas?