How accurately does GPS measure speed?

28 November 2010

Question

Hi Dr. Smith,
Would you be so kind as to clarify this question about GPS speed monitoring on next week's show with Redi?

My thoughts are as follows, but they are ever so slightly incoherent.

In South Africa a good signal will allow a WAAS-enabled GPS to be accurate to around 3 m on average, with variation of up to 15 m in the horizontal plane; this is usually trebled in the vertical plane, so 9 to 45 m.
Most GPS units (outdoors-type Garmins) "take a point" every second, whether or not it is saved as a track or waypoint. So if you use one to monitor your speed at around 120 km/h, where you travel roughly 33.3 m per second, you can have a horizontal error of up to 30 m, but mostly the distance measured could vary from -6 to +6 m around 33.3 m, which seems very inaccurate, contrary to the "evidence" on display.
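To make the arithmetic above concrete, here is a short Python sketch; the error figures are the ones quoted in the question, not authoritative values:

```python
# Illustrative check of the arithmetic above (figures from the question).
speed_kmh = 120.0
speed_ms = speed_kmh / 3.6          # 120 km/h is roughly 33.3 m/s
print(f"{speed_kmh} km/h = {speed_ms:.1f} m/s")

# If each one-second fix can be off by roughly +/-3 m (the ~3 m average
# error), the apparent distance between two consecutive fixes can differ
# from the true 33.3 m by up to about +/-6 m, as the question suggests.
pos_error_m = 3.0
worst_speed_low = (speed_ms - 2 * pos_error_m) * 3.6
worst_speed_high = (speed_ms + 2 * pos_error_m) * 3.6
print(f"Naive per-second speed could read {worst_speed_low:.0f} "
      f"to {worst_speed_high:.0f} km/h")
```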

So what is the reason for the accuracy, pure averaging?

Or does the error follow a pattern? That is, will two points taken directly after each other err in more or less the same direction and to the same extent? I ask this because there is apparently still some error introduced by governments, and this may cause a "pattern-following" error. Or will the ionospheric, atmospheric, and satellite delays for two consecutive measurements be roughly the same?
I'm not even considering the vertical plane, as my trig is non-existent, but you may.
BTW, car manufacturers err on the side of caution for speedometers but are spot-on for odometers, unless you change your tyre circumference or gearing.
Sincerely
Andre Grobler

Answer

Dave - The two devices are measuring speed in very different ways. The speedometer is measuring the number of times your wheels turn every second or every minute, and if you know the circumference of your wheels you can work out how fast you're going.

There can be errors in that because the tyres will wear down, and that will change the circumference of your wheels. Quite often, I think they build speedometers to slightly over-read, which is quite good because that way you get fewer speeding tickets.
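As an illustrative sketch of that wheel-rotation calculation (the wheel size and rotation rate are made-up example values, not figures from the answer):

```python
import math

# Sketch of how a speedometer derives speed from wheel rotation.
# The wheel diameter and rotation rate below are made-up example values.
wheel_diameter_m = 0.63                      # assumed nominal wheel size
circumference_m = math.pi * wheel_diameter_m

rotations_per_second = 14.7                  # hypothetical sensor reading
indicated_speed_ms = rotations_per_second * circumference_m
print(f"Indicated speed: {indicated_speed_ms * 3.6:.0f} km/h")

# Worn tyres have a slightly smaller circumference, so the wheels turn
# faster for the same road speed and the speedometer over-reads slightly.
worn_circumference_m = math.pi * (wheel_diameter_m - 0.01)
true_speed_ms = rotations_per_second * worn_circumference_m
print(f"True speed with worn tyres: {true_speed_ms * 3.6:.0f} km/h")
```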

Old-fashioned speedometers also weren't as accurate as modern ones. They weren't computerised and they just tended to be less sensitive at high speeds.

The GPS is basically measuring your position repeatedly, measuring how far you move over a certain period of time, and then dividing that distance by the time it's averaged over; that gives your speed.
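A minimal sketch of that position-differencing idea, assuming two hypothetical one-second fixes already projected to local east/north coordinates in metres:

```python
import math

# Minimal sketch of speed from position differencing, assuming two
# hypothetical GPS fixes given as local east/north coordinates in metres.
fix1 = (0.0, 0.0)        # (east, north) at t = 0 s
fix2 = (33.1, 2.5)       # one second later

dt_s = 1.0
distance_m = math.hypot(fix2[0] - fix1[0], fix2[1] - fix1[1])
speed_ms = distance_m / dt_s
print(f"Estimated speed: {speed_ms:.1f} m/s ({speed_ms * 3.6:.0f} km/h)")
```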

If you're stationary, the GPS will quite often still give you a speed, so the GPS is not at all accurate at giving a speed when you're going very, very slowly, because the errors in position can be a few metres. And the accuracy of the speed will depend on how long it's averaging over to get the speed.

So if it's averaging over 10 minutes, the GPS will be more accurate; if it's averaging over 2 seconds, the speedometer certainly will be.
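A small simulation makes the averaging point concrete; this is only a sketch, assuming independent position errors of about 3 m on each fix (an assumed noise level, not a measured one):

```python
import random

# Sketch: how the averaging window affects a position-differenced speed,
# assuming independent Gaussian position noise of ~3 m per fix (assumed).
random.seed(1)
true_speed_ms = 33.3
sigma_m = 3.0

def estimated_speed(window_s):
    """Speed from the first and last fix of an averaging window."""
    start = 0.0 + random.gauss(0.0, sigma_m)
    end = true_speed_ms * window_s + random.gauss(0.0, sigma_m)
    return (end - start) / window_s

# The same few metres of noise matter far less over a long window.
for window in (2, 60, 600):
    print(f"{window:>4} s window: {estimated_speed(window):.2f} m/s "
          f"(true {true_speed_ms} m/s)")
```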

Comments

Even if the GPS receiver did not calculate speed using Doppler shift, as stated in Mi5's comment, the answer would still be incorrect. The answer given doesn't consider the difference between absolute and relative error. If the absolute position error is consistently off in the same direction and to roughly the same extent, the absolute error cancels out when taking a relative measurement such as speed. This point was even mentioned in the original question. As a result, it is possible for GPS to return accurate speed measurements using relatively inaccurate position measurements.
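Here is a sketch of that cancellation, assuming each fix's error is a shared slowly varying bias plus a small independent jitter (both noise magnitudes are assumptions chosen for illustration):

```python
import random

# Sketch: correlated (common-mode) position errors largely cancel when
# differencing consecutive fixes. Noise magnitudes below are assumptions.
random.seed(2)
true_speed_ms = 33.3
common_bias_m = random.gauss(0.0, 5.0)   # error shared by both fixes
jitter_m = 0.3                           # small independent per-fix error

pos1 = 0.0 + common_bias_m + random.gauss(0.0, jitter_m)
pos2 = true_speed_ms * 1.0 + common_bias_m + random.gauss(0.0, jitter_m)

# The shared bias drops out of the difference; only the jitter remains.
print(f"Absolute error of each fix: ~{common_bias_m:.1f} m")
print(f"Speed from differencing:    {pos2 - pos1:.2f} m/s "
      f"(true {true_speed_ms} m/s)")
```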

GPS receivers do not measure speed by differencing positions over time. They could, but it would be grossly inaccurate. The speed is actually measured as a side effect of the solution to another problem: the satellites are moving, causing a Doppler shift in the received radio frequency. The receiver has to adjust for this for each satellite, but any remaining discrepancy in the received frequency must be due to the movement of the GPS receiver itself. Thereby, the GPS ends up knowing how fast it is moving to an accuracy much greater than that attainable from position differences. Try searching for GPS Doppler.
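As a back-of-the-envelope sketch of the Doppler relation (one satellite, one dimension; the leftover frequency offset is a made-up number, though the L1 carrier frequency is real):

```python
# Sketch of the Doppler idea in one dimension: after removing the shift
# expected from the satellite's own motion, the leftover frequency offset
# reflects the receiver's radial velocity. The offset value is illustrative.
C_MS = 299_792_458.0        # speed of light, m/s
F_L1_HZ = 1_575_420_000.0   # GPS L1 carrier frequency, Hz

# Hypothetical leftover Doppler shift after accounting for satellite motion:
residual_shift_hz = 175.0

# For v << c, the Doppler relation gives v = c * df / f.
radial_speed_ms = C_MS * residual_shift_hz / F_L1_HZ
print(f"Receiver radial speed: {radial_speed_ms:.1f} m/s "
      f"({radial_speed_ms * 3.6:.0f} km/h)")
```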
