Bulk insert in sqlite out of memory - sql.js [500k records]


I'm using a SQLite database with the sql.js library in an Electron project. When I need to update my DB, I call a SOAP service which returns a JSON-formatted string with all the queries that I need to run, like this:
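For illustration, a payload of this kind might look like the following (the `queries` field name and the statements themselves are assumptions, not the actual service response):

```javascript
// Hypothetical payload shape -- the field name "queries" and the
// statements are illustrative, not the real SOAP service output.
const payload = {
  queries: [
    "INSERT INTO items (id, name) VALUES (1, 'foo');",
    "INSERT INTO items (id, name) VALUES (2, 'bar');"
  ]
};

console.log(payload.queries.length);
```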


The total number of records to be inserted is about 400k (but it can grow in the future).

When I call the SOAP service, it throws an error:

> <--- Last few GCs --->

>   125053 ms: Mark-sweep 1247,0 (1301,1) -> 1089,3 (1143,4) MB, 440,5 / 0,0 ms [allocation failure] [GC in old space requested].
>   125173 ms: Mark-sweep 1089,3 (1143,4) -> 1089,0 (1143,4) MB, 119,8 / 0,0 ms [allocation failure] [GC in old space requested].
>   125287 ms: Mark-sweep 1089,0 (1143,4) -> 1089,0 (1104,4) MB, 113,6 / 0,0 ms [last resort gc].
>   125384 ms: Mark-sweep 1089,0 (1104,4) -> 1089,0 (1104,4) MB, 97,6 / 0,0 ms [last resort gc].

> <--- JS stacktrace --->

> ==== JS stack trace =========================================

> Security context: 0x13047a0cc421 <JS Object>
>     2: wb [/Users/gaetano/Desktop/sqlite/node_modules/sql.js/js/sql.js:~350] [pc=0x24ced26fb87](this=0x2ecf4e3070e9 <an Object with map 0xc5e4890dfc9>,a=0x17e1a2e49d41 <a dc with map 0xf7c43efd9f9>)
>     3: Sa [/Users/gaetano/Desktop/sqlite/node_modules/sql.js/js/sql.js:~350] [pc=0x24cec57039a](this=0x2ecf4e3070e9 <an Object with map 0xc5e4890dfc9>,a=0x17e1a2e49d41 <a dc with map 0xf7c43efd9f9>,...

It's not clear to me whether this comes from Node.js running out of memory or from SQLite.

Is there something I can “increase”, like a PRAGMA statement or something like that?

Thank you!



It's not a SQLite issue, but a garbage-collection issue.
Actually, I didn't find how to flush memory or manage the garbage collection.
I've been trying with the --enable-gc flag on start, but nothing happens…
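If the goal is to give V8 more headroom rather than to trigger collection manually, the usual knob is `--max-old-space-size` (value in MB). A sketch of how it could be passed, assuming the entry point is `main.js` (the 4096 MB value is an illustrative choice):

```shell
# Raise V8's old-space limit to ~4 GB for a plain Node script
node --max-old-space-size=4096 main.js

# For an Electron app, V8 flags can be forwarded with --js-flags
# (they can also be set in code via app.commandLine.appendSwitch)
electron --js-flags="--max-old-space-size=4096" .
```

Note this only raises the ceiling; if the payload keeps growing, the crash will come back at a higher limit.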


How big is the JSON formatted string? If each of the 400,000 INSERT statements averages 512 bytes, then that string is going to be ~195.3MB, not counting the overhead of the JSON itself. Does it have to be submitted all in one gigantic string?
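One way to avoid materializing a single ~200 MB string is to process the statements in batches, so only one batch's worth of SQL text exists at a time. A minimal sketch, assuming the payload has already been parsed into an array of statement strings (the batch size, table name, and the commented `db.exec` call are illustrative assumptions):

```javascript
// Sketch: yield the parsed INSERT statements in fixed-size batches so
// only one batch's worth of SQL text is joined into a string at a time.
function* batches(statements, size) {
  for (let i = 0; i < statements.length; i += size) {
    yield statements.slice(i, i + size);
  }
}

// Illustrative stand-in for the parsed payload
const statements = Array.from({ length: 25 }, (_, i) =>
  `INSERT INTO items (id) VALUES (${i});`);

for (const batch of batches(statements, 10)) {
  const sql = batch.join('\n');
  // db.exec('BEGIN; ' + sql + ' COMMIT;');  // run each batch in its own transaction
  console.log(batch.length);
}
```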


Yes, it must be submitted in one giant string because it's part of the update statement.
Actually, I'm now downloading the entire SQLite DB via FTP instead. It looks quicker :wink:


I don't fully know the architecture of this one, mate, but I'm inclined to advise a re-think. Is SQLite the best fit for the purpose? Especially if the data is going to grow (I'm thinking of performance, indexing…). How important is the longevity of the data, too? (backups, exporting, etc.)

Is the data you are talking about static? If so, ship it with the app as seed data.

Maybe the INSERT/UPDATE statements could be generated locally from parameters that are downloaded on their own (to help reduce the incoming payload)?
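Following that idea, a sketch of generating the statement text locally and binding only the downloaded parameters, instead of shipping full SQL in the payload (the table and column names are illustrative; the commented lines assume sql.js's `prepare`/`run`/`free` API):

```javascript
// Sketch: build one parameterized INSERT locally and bind each downloaded
// row to it, instead of transferring 400k full statements over the wire.
function toInsert(table, columns) {
  const placeholders = columns.map(() => '?').join(', ');
  return `INSERT INTO ${table} (${columns.join(', ')}) VALUES (${placeholders})`;
}

// Illustrative stand-in for the downloaded parameter rows
const rows = [{ id: 1, name: 'a' }, { id: 2, name: 'b' }];
const sql = toInsert('items', ['id', 'name']);

// With sql.js, bind each row rather than concatenating SQL:
// const stmt = db.prepare(sql);
// for (const r of rows) stmt.run([r.id, r.name]);
// stmt.free();
console.log(sql);
```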