I'm creating a tracking app. At start-up the GPS first looks for satellites; the more satellites it finds, the higher the position accuracy. After some time the coordinates and speed become stable.
Like everyone, I want to detect real movement. That's no problem under good conditions/good reception after a while. But what about a non-optimal situation (e.g. under a bridge, or bad weather where GPS works but may get inaccurate data)? Then the position "jumps" a few meters and the speed may show 5 m/s.
Google Maps shows this effect as a bigger (less accurate) / smaller (more accurate) circle. Any formula or idea how to check this programmatically (i.e. does the object really move, or is it just a GPS adjustment)?
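One simple programmatic check is to compare the displacement between two fixes against their reported accuracy radii: if the jump is smaller than the combined uncertainty, treat it as GPS noise rather than movement. This is a sketch of that heuristic, not a definitive answer; the class name and the speed threshold are illustrative choices, not anything from the platform:

```java
import android.location.Location;

// Sketch: treat a fix as "real movement" only when the displacement
// between two fixes exceeds their combined accuracy radii.
// MIN_SPEED_MS is an illustrative threshold, not a standard value.
public class MovementDetector {
    private Location lastFix;
    private static final float MIN_SPEED_MS = 0.5f; // ignore near-zero speeds

    public boolean isRealMovement(Location fix) {
        if (lastFix == null) {
            lastFix = fix;
            return false;
        }
        float distance = lastFix.distanceTo(fix);                 // meters
        float noiseRadius = lastFix.getAccuracy() + fix.getAccuracy();
        boolean moved = distance > noiseRadius && fix.getSpeed() > MIN_SPEED_MS;
        if (moved) {
            lastFix = fix; // only advance the reference on accepted movement
        }
        return moved;
    }
}
```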
Yep, I saw that parameter, but what exactly does it mean? I thought it was something to set like "50 m accuracy is OK for me" or "I need it down to 1 m". So if I need to know how accurate the position and the movement are - how do I do that?
If you get an accuracy of 15, it means there is a 68% probability that the true location is within a radius of 15 m.
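To put numbers on other confidence levels: if you assume a circular Gaussian error model (so the radial error is Rayleigh-distributed), the reported 68% radius can be scaled to any probability with r(p) = sigma * sqrt(-2 * ln(1 - p)). That error model is an assumption, not something the Android docs guarantee:

```java
// Sketch: scale the reported 68% radius to other confidence levels,
// assuming a circular Gaussian error model (radial error ~ Rayleigh).
public final class AccuracyScale {
    private AccuracyScale() {}

    public static double radiusAtConfidence(float accuracy68m, double p) {
        // Recover sigma from the 68% radius, then scale to probability p.
        double sigma = accuracy68m / Math.sqrt(-2.0 * Math.log(1.0 - 0.68));
        return sigma * Math.sqrt(-2.0 * Math.log(1.0 - p));
    }

    public static void main(String[] args) {
        // A reported accuracy of 15 m (68%) is roughly a 24 m radius at 95%.
        System.out.printf("95%% radius: %.1f m%n", radiusAtConfidence(15f, 0.95));
    }
}
```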
There are no percentages.
What does 99% accuracy mean to you?
In your case, with GPS1.Start(10000, 10), you will get locations no more often than every 10 seconds, and only when the position has changed by at least 10 meters.
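For reference, here is roughly what that call maps to on the plain Android API — a sketch that assumes the location permission has already been granted and skips error handling:

```java
import android.content.Context;
import android.location.LocationListener;
import android.location.LocationManager;

// Sketch of the underlying Android call behind GPS1.Start(10000, 10):
// minTime = 10000 ms, minDistance = 10 m. Assumes ACCESS_FINE_LOCATION
// has already been granted.
public class GpsStarter {
    public void start(Context context, LocationListener listener) {
        LocationManager lm =
                (LocationManager) context.getSystemService(Context.LOCATION_SERVICE);
        lm.requestLocationUpdates(
                LocationManager.GPS_PROVIDER,
                10000L, // minimum time between updates, in milliseconds
                10f,    // minimum distance between updates, in meters
                listener);
    }
}
```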
Just trying to understand: if you set a geo-fence of 15x15 m, how do you know you are inside the fence, rather than "probably inside with a 68% chance"? How does a car GPS know that you are on the road, and not just "probably"? Or does it simply assume that (because a car will most likely stay on the road that lies inside the radius)?
Most GPS apps that I've used show a small dot at the real coordinates on the map, and the car on the nearest road... Sometimes, when the real coordinates are equidistant from two roads, the GPS doesn't "know" which one to choose and keeps trying to recalculate!
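That "snap to the nearest road" behaviour is known as map matching: project the fix onto each candidate road segment and pick the closest one. Below is a toy sketch of the geometric core in planar (x, y) meters; real matchers work on geodetic coordinates and use heading and history to break ties between equidistant roads:

```java
import java.util.List;

// Toy map-matching sketch: snap a point to the nearest road segment.
public class RoadSnapper {
    public record Point(double x, double y) {}
    public record Segment(Point a, Point b) {}

    public static Point snapToSegment(Point p, Segment s) {
        double dx = s.b().x() - s.a().x();
        double dy = s.b().y() - s.a().y();
        double len2 = dx * dx + dy * dy;
        // Parameter t of the perpendicular projection, clamped to the segment.
        double t = len2 == 0 ? 0
                : Math.max(0, Math.min(1,
                  ((p.x() - s.a().x()) * dx + (p.y() - s.a().y()) * dy) / len2));
        return new Point(s.a().x() + t * dx, s.a().y() + t * dy);
    }

    public static Point snapToNearestRoad(Point p, List<Segment> roads) {
        Point best = null;
        double bestDist = Double.MAX_VALUE;
        for (Segment road : roads) {
            Point candidate = snapToSegment(p, road);
            double d = Math.hypot(candidate.x() - p.x(), candidate.y() - p.y());
            if (d < bestDist) { // on a tie, the first road found wins
                bestDist = d;
                best = candidate;
            }
        }
        return best;
    }
}
```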