Hi All.
My app implements an interface that starts a service to monitor the user's latitude and longitude via GPS.
The service runs in a loop using StartServiceAt with a period of 5 seconds. Every 5 seconds the app calls a web service to register the user's position.
Everything works like a charm, except for the RAM consumption. After 30 or 40 minutes, the application, which started at approximately 13 MB, can grow to 90-100 MB.
How can I identify which piece of code is responsible for this memory growth?
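For context, here is the kind of per-cycle heap logging I was thinking of adding. This is just a sketch in plain Java (the service itself is B4A), assuming the growth is on the managed Dalvik/ART heap; the class and method names are my own:

```java
// Minimal sketch: log used managed-heap bytes each service cycle.
// If the number climbs steadily across cycles (even after GC runs),
// it points at objects the service keeps alive between cycles.
public class HeapLogger {

    // Bytes currently allocated on the managed heap.
    public static long usedHeapBytes() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        // In the real app this line would run once per 5-second cycle.
        System.out.println("used heap: " + usedHeapBytes() + " bytes");
    }
}
```

The idea would be to print this value at the start of each StartServiceAt cycle and watch whether it keeps climbing, to narrow down which part of the cycle allocates without releasing.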
Thanks