Description
Describe your environment
- Operating System version: Heroku
- Firebase SDK version: 8.3.0
- Firebase Product: Firestore
Describe the problem
I'm parsing a CSV with 130k lines, and as I iterate through each row I add it to the collection, rate limited to 450 writes/s. However, with each CSV parse Firestore seems to use more and more memory, and the memory is never released. Removing the collection().add() call fixes the problem, and parsing the CSV alone never goes above 80 MB.
Steps to reproduce:
I'd imagine you could run the relevant code through a loop; in my case it was 130k iterations for the CSV size. If there is a more efficient way of achieving this, please do let me know.
Relevant Code:
// Inside the per-row callback of the CSV parser:
db.collection("games")
  .doc(req.body.docid)
  .collection("players")
  // .add() creates a document with an auto-generated ID; it is a method of
  // CollectionReference, so the extra .doc() call before it was invalid.
  .add({
    Number: row[0],
    FirstName: row[1],
    Lastname: row[2],
    Email: row[3],
    Prize: row[4],
    gameID: req.body.docid,
    guid: row[5]
  })
  .then(() => {
    progressCount++;
    updateProgress(req.body.docid);
  })
  .catch(err => {
    console.log(err.message);
  });
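On the "more efficient way" question: one common approach is Firestore batched writes, which commit up to 500 operations per round trip instead of creating 130k individual pending promises. A minimal sketch, assuming the same `db`, `docid`, and `row` layout as the snippet above (the `chunk` and `importRows` helpers are hypothetical names introduced here for illustration):

```javascript
// Pure helper: split an array into groups of at most `size` rows.
function chunk(rows, size) {
  const out = [];
  for (let i = 0; i < rows.length; i += size) {
    out.push(rows.slice(i, i + size));
  }
  return out;
}

// Hypothetical import routine using batched writes (500 is Firestore's
// per-batch operation limit). Awaiting each commit before building the
// next batch keeps the number of in-flight writes bounded.
async function importRows(db, docid, rows) {
  for (const group of chunk(rows, 500)) {
    const batch = db.batch();
    for (const row of group) {
      // doc() with no argument returns a reference with an auto-generated ID
      const ref = db.collection("games").doc(docid).collection("players").doc();
      batch.set(ref, {
        Number: row[0],
        FirstName: row[1],
        Lastname: row[2],
        Email: row[3],
        Prize: row[4],
        gameID: docid,
        guid: row[5]
      });
    }
    await batch.commit();
  }
}
```

This is a sketch, not a confirmed fix for the memory growth itself, but bounding the number of outstanding writes is usually the first thing to try for bulk imports of this size.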
