The only problem I see with your odometer calibration technique has nothing to do with your rollatape measurement device (not really even sure what that is). Let's assume you accurately measured the 1/10th mile course. The trouble is that checking your odometer over only 1/10th of a mile, which is essentially a single division on the odometer, means the gauge R&R (repeatability and reproducibility) of the odometer is not adequate to get a meaningful individual reading. To improve the significance, you would have to repeat the experiment many times and then analyze those results as a group. Only then could you apply some calculations to your 1/10th mile data and determine whether the results are significant.
If you ran the same experiment over 1 mile, 10 miles or 100 miles, then since the R&R of the odometer stays roughly constant, you would find the significance improves as the mileage increases (up to a point of diminishing returns).
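To put rough numbers on that, here is a quick Python sketch. The 0.1 mile odometer division, the +/- half-division reading error, and the 2% scale error are all assumed values for illustration, not anything measured:

```python
# Rough sketch with assumed numbers: odometer resolution of 0.1 mile,
# so any single reading is only good to about +/- half a division.
import random
import statistics

DIVISION = 0.1            # smallest odometer division, in miles (assumed)
READ_ERR = DIVISION / 2   # worst-case error of a single reading

# 1) One check: relative error of a single reading vs. course length.
for course in (0.1, 1.0, 10.0, 100.0):
    print(f"{course:6.1f} mi course: single reading good to +/-{READ_ERR / course:.2%}")

# 2) Repeating the 1/10th mile check many times and analyzing the group.
def one_check(true_scale=1.02):
    """One 0.1 mile check; the odometer is assumed to read 2% long, and where
    the drum happens to sit at start/stop adds +/- half a division of noise."""
    return true_scale * 0.1 + random.uniform(-READ_ERR, READ_ERR)

runs = [one_check() for _ in range(50)]
mean = statistics.mean(runs)
sem = statistics.stdev(runs) / len(runs) ** 0.5   # standard error of the mean
print(f"50 repeated checks: mean {mean:.3f} mi, +/-{sem:.3f} mi (1 standard error)")
```

Averaging the 50 runs pins down the odometer's scale error far better than any single 0.1 mile reading could, which is the same effect you get for free by simply driving a longer known distance.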
The same principle applies to the highway mile markers along the side of the road. The distance between individual 1/10th mile markers is variable, since they tend to be placed wherever a convenient spot is. But the absolute accuracy of sign placement becomes less and less significant as the total distance increases. In other words, even though the first sign you pass may be off by, say, 20 feet (somewhat significant in a 1 mile measurement), the 100th mile post is also only likely to be off by about the same amount. That 20 foot error is insignificant over 100 miles: roughly 0.4% of one mile, but only about 0.004% of 100 miles.
A GPS's inherent inaccuracy is a similar situation. A GPS may only be able to locate you to within 3-5 meters absolute position (1-3 m with WAAS). That would significantly affect your distance/speed calculation over 1/10th of a mile, but the GPS will be accurate to within the same 3-5 meters absolute position at the end of a 100 mile trip as it was at the beginning.
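Again, just to put numbers on it, here is a similar sketch assuming a worst case of about 5 m of position error at each end of the trip (and ignoring how a particular receiver actually integrates distance along the way):

```python
# Rough sketch with assumed numbers: ~5 m of GPS position error at each
# endpoint, so roughly 10 m of worst-case error in the measured distance,
# no matter how long the trip is.
M_PER_MILE = 1609.344
GPS_ERR_M = 5.0                  # assumed per-fix position error, in meters
WORST_CASE_M = 2 * GPS_ERR_M     # start fix and end fix could both be off

for trip_miles in (0.1, 1.0, 10.0, 100.0):
    rel_err = WORST_CASE_M / (trip_miles * M_PER_MILE)
    print(f"{trip_miles:6.1f} mi trip: distance/speed off by at most ~{rel_err:.3%}")
```

With those assumed numbers, the worst case is around 6% over 1/10th of a mile, but it shrinks to a few thousandths of a percent by the time you have driven 100 miles.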
I won't get into Doppler radars as that could become political.