I'm building a small yacht racing app to measure my speed upwind. Speeds are relatively low, in the 6-10 knot range, and the inherent GPS positional error causes significant fluctuations in the reported speed (which is why I'm going to this effort instead of using a hand-held GPS unit).
So I'm trying to smooth out the errors by calculating speed over longer baselines of 5 and 10 calls to the onNmeaReceived event (which fires by default every 1000 ms), using the lat/lon and time from the NMEA $GPRMC sentence.
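For reference, here is a minimal sketch of the kind of calculation I mean. It assumes the standard $GPRMC field layout (lat/lon in fields 3-6, speed over ground in knots in field 7), does no checksum validation, and uses the haversine formula for the distance between fixes; the class and method names are just illustrative.

```java
import java.util.Locale;

public class GprmcSpeed {
    static final double EARTH_RADIUS_M = 6371000.0;
    static final double MPS_PER_KNOT = 0.514444;

    // Convert an NMEA "ddmm.mmmm" coordinate field plus its hemisphere
    // indicator ("N"/"S"/"E"/"W") into signed decimal degrees.
    static double nmeaToDegrees(String field, String hemi) {
        double raw = Double.parseDouble(field);
        int degrees = (int) (raw / 100);
        double minutes = raw - degrees * 100;
        double dd = degrees + minutes / 60.0;
        return (hemi.equals("S") || hemi.equals("W")) ? -dd : dd;
    }

    // Great-circle distance in metres between two lat/lon points (haversine).
    static double haversineM(double lat1, double lon1, double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
    }

    // Extract {latDeg, lonDeg, sogKnots} from a $GPRMC sentence.
    // Field 7 is the receiver's reported speed over ground in knots.
    static double[] parseGprmc(String sentence) {
        String[] f = sentence.split(",");
        double lat = nmeaToDegrees(f[3], f[4]);
        double lon = nmeaToDegrees(f[5], f[6]);
        double sog = Double.parseDouble(f[7]);
        return new double[] { lat, lon, sog };
    }

    public static void main(String[] args) {
        // Two fixes 10 seconds apart (hypothetical sample sentences).
        String a = "$GPRMC,123519,A,4807.038,N,01131.000,E,022.4,084.4,230394,003.1,W*6A";
        String b = "$GPRMC,123529,A,4807.100,N,01131.000,E,022.4,084.4,230394,003.1,W*6A";
        double[] f1 = parseGprmc(a);
        double[] f2 = parseGprmc(b);
        double meters = haversineM(f1[0], f1[1], f2[0], f2[1]);
        double computedKnots = meters / 10.0 / MPS_PER_KNOT;
        System.out.printf(Locale.US,
                "reported SOG: %.1f kn, computed over 10 s baseline: %.1f kn%n",
                f1[2], computedKnots);
    }
}
```

In the real app the two fixes would be the first and last sentences of the 5- or 10-sample window rather than hard-coded strings; it's the windowed distance/time figure from code like this that disagrees with the reported SOG field.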
When I compare the NMEA-reported speed to the speed computed since the previous event, they are significantly different, so I'm wondering how the speed provided in the $GPRMC sentence is computed. Possible answers could be:
- Deduced from the GPS Doppler shift?
- Calculated as distance/time over the previous second?
- Something else?
The documentation is entirely silent on this matter.
Using Android 4.0.3 on a Samsung SGS II.