Android Question: How to correctly run a long operation in the main thread?

peacemaker

Expert
Licensed User
Longtime User
My app must initialize a database before anything else; no other operations are allowed until it finishes, so the process can run in the main thread with a progress bar.
The init reads a huge text file (20,000 to 100,000 lines, several megabytes or tens of megabytes) line by line.

I usually call DoEvents inside some IF condition. But here the operation is very long, maybe 5 minutes, and the user may need to interrupt it while watching the progress bar percentage.

What is the correct way to write such a loop with minimal delay?
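Roughly what I mean, as a sketch only (Cancelled, pbImport and TotalLines are placeholder names of mine, not real project code):

B4X:
'Sketch: Cancelled is a global flag set by a "Cancel" button,
'pbImport is a ProgressBar, TotalLines is known or estimated in advance.
Dim tr As TextReader
tr.Initialize(File.OpenInput(File.DirRootExternal, "export.txt"))
Dim count As Int = 0
Dim line As String = tr.ReadLine
Do While line <> Null And Cancelled = False
    'parse the line and INSERT it into SQLite here
    count = count + 1
    If count Mod 200 = 0 Then
        pbImport.Progress = count * 100 / TotalLines
        DoEvents 'keeps the UI responsive and avoids the ANR dialog
    End If
    line = tr.ReadLine
Loop
tr.Close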
 

peacemaker

Expert
Licensed User
Longtime User
It makes no difference where; the interface must be locked by this process. But what I mean is: is the only way a loop with some condition for interrupting, calculating the percent for the progress bar, and calling DoEvents under some extra condition to avoid the system's "app is not responding" alert?
 
Upvote 0

peacemaker

Expert
Licensed User
Longtime User
Thanks for the tip.
But the source file is really huge, see my first post. I'm not sure there is enough memory to open the whole source file, or to collect that many SQL.AddNonQueryToBatch calls before the INSERT.
So I'm working line by line.
 
Upvote 0

Mahares

Expert
Licensed User
Longtime User
But the source file is very huge, see my first post. I'm not sure there is enough memory to open the whole source file
I used it to import a 14 MB text file with 175,000 lines into a table on a Galaxy Tab A tablet. It worked.
Can you put up a copy of your huge text file (maybe in Dropbox), even with fake data and the table structure, and give the forum a chance to test it?
 
Upvote 0

peacemaker

Expert
Licensed User
Longtime User
Wow! How long does your processing of 175K lines take?
 
Upvote 0

peacemaker

Expert
Licensed User
Longtime User
It's a commercial development request, and I implement what is requested: first of all, an exported DB in a text file (100-200 characters in 6 TAB-separated fields per line) must be imported, and then the app works with this big database.

Super! Only 16 seconds. For me it is 3 minutes, with a progress bar, for a 4 MB file of 28,000 lines, if I read line by line, parse with RegEx and INSERT into SQLite one row at a time via DBUtils.
Yes, I can optimize by dropping the DBUtils subs and using direct SQLite requests, but it seems the difference will be minimal.

So now you should understand the reason for my question. Yes, I can try to read the whole file first; maybe 4 MB is not a problem, but the files will be several times bigger, and the device is an Android-based barcode scanner with only 1 GB of memory.
 
Upvote 0

peacemaker

Expert
Licensed User
Longtime User
Hmmm, good question; I need to check that.
 
Upvote 0

RandomCoder

Well-Known Member
Licensed User
Longtime User
How often are you trying to update the progress bar using DoEvents? This will have a significant effect.
 
Upvote 0

peacemaker

Expert
Licensed User
Longtime User
HA! Reading the file line by line, without INSERTing into SQLite, takes just 6 seconds for 28,000 lines!
The progress bar is updated every 200 records; with every 500 it is not comfortable, the user interface delays/hangs.

So the SQLite inserting needs to be improved.
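The usual way to speed up many SQLite INSERTs is to wrap them all in a single transaction, so SQLite commits once instead of once per row. A sketch only (Starter.SQL and the db table are from my snippets; tr, count, a and b are placeholders for the reader, counter and parsed values):

B4X:
'Sketch: one transaction around the whole import instead of one commit per row.
Starter.SQL.BeginTransaction
Try
    Dim line As String = tr.ReadLine 'tr is the TextReader over the source file
    Do While line <> Null
        Dim p() As String = Regex.Split(Chr(9), line) 'fields are TAB-separated
        Starter.SQL.ExecNonQuery2("INSERT INTO db VALUES (?,?,?,?,?,?,?)", _
            Array As Object(count, p(0), p(1), p(2), p(3), a, b))
        count = count + 1
        line = tr.ReadLine
    Loop
    Starter.SQL.TransactionSuccessful
Catch
    Log(LastException)
End Try
Starter.SQL.EndTransaction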
 
Upvote 0

peacemaker

Expert
Licensed User
Longtime User
Re-checked with direct requests to SQLite: the same situation. Inserting into the SQLite DB line by line is the much slower part.

UPD: a loop adding SQL.AddNonQueryToBatch 28,000 times takes just as long.
You can insert e.g. 1000 rows and then push the next 1000 with a timer of 0.5 s or so.
How do _you_ make it so fast?
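For reference, the batch variant I tried looks roughly like this; a sketch only (Lines would be a List of the already-read lines, and a and b are placeholders for extra parsed values):

B4X:
'Sketch of the batch variant: statements are queued first, then executed
'asynchronously inside a single transaction by ExecNonQueryBatch.
For i = 0 To Lines.Size - 1
    Dim p() As String = Regex.Split(Chr(9), Lines.Get(i))
    Starter.SQL.AddNonQueryToBatch("INSERT INTO db VALUES (?,?,?,?,?,?,?)", _
        Array As Object(i, p(0), p(1), p(2), p(3), a, b))
Next
Starter.SQL.ExecNonQueryBatch("SQL") 'raises SQL_NonQueryComplete when done

Sub SQL_NonQueryComplete (Success As Boolean)
    Log("Import finished: " & Success)
End Sub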
 
Last edited:
Upvote 0

peacemaker

Expert
Licensed User
Longtime User
B4X:
Starter.SQL.ExecNonQuery2("INSERT INTO db VALUES (?,?,?,?,?,?,?)", Array As Object(count, p(0), p(1), p(2), p(3), a, b))
called for each line.
 
Upvote 0

peacemaker

Expert
Licensed User
Longtime User
It's a loop: reading line by line, parsing, and inserting with this SQL request, that's all. But the same loop with only the line reading, without the SQL insert, takes just 6 seconds for 28,000 lines.
 
Upvote 0