Cost of each mutation grows as more mutations are in a transaction #3046
Labels:
- area/performance: Performance related issues.
- kind/enhancement: Something could be better.
- priority/P1: Serious issue that requires eventual attention (can wait a bit).
- status/accepted: We accept to investigate/work on it.
- status/needs-attention: This issue needs more eyes on it; more investigation might be required before accepting/rejecting it.
I originally asked this on Slack, but it might be more useful to track it as an issue.
Every few days our application needs to insert up to 3 million predicates (this number may grow) into the database. To assess Dgraph's performance, I wrote the little Python script below to benchmark the time it takes to insert 1,000, 10,000, 30,000, 50,000, and 100,000 predicates. Results are as follows:
The growth in time is a bit worrying. Why does inserting 100 thousand predicates take 70x as long as inserting 10 thousand predicates?
Here's the script:
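(The script itself did not survive in this text. As a hypothetical reconstruction of what such a benchmark might look like: the mutation payload and timing loop below are illustrative, with the actual commit injected as a callable; in the real script that callable would be a pydgraph transaction's mutate-and-commit.)

```python
# Hypothetical sketch -- the original benchmark script was not preserved.
# `commit_fn` stands in for the real Dgraph commit (e.g. a pydgraph
# transaction mutating the payload and committing); here it is pluggable
# so the timing harness itself is self-contained.
import time

def make_nquads(n):
    """Build an N-Quads payload that sets one string predicate on n new nodes."""
    return "\n".join(f'_:node{i} <name> "person {i}" .' for i in range(n))

def benchmark(commit_fn, sizes=(1000, 10000, 30000, 50000, 100000)):
    """Time one commit per payload size; return {size: elapsed_seconds}."""
    timings = {}
    for n in sizes:
        payload = make_nquads(n)
        start = time.monotonic()
        commit_fn(payload)  # one transaction containing n mutations
        timings[n] = time.monotonic() - start
    return timings
```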
Initially, I thought it was because of the full-text index, so I also tried without @index(fulltext). Here are the results: it's slightly better, but the time growth is still worrying.
Any guidance is appreciated.
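(Not from the original report.) The issue title suggests per-mutation cost grows with the number of mutations already in the transaction, so a common workaround is to split a large load into many small transactions, each committed separately. A minimal batching sketch, again with the commit call stubbed out as a `commit_fn` parameter:

```python
# Illustrative only -- `commit_fn` stands in for a real per-transaction
# Dgraph commit. Committing in fixed-size batches keeps each transaction
# small, avoiding the superlinear cost of one giant transaction.
def commit_in_batches(triples, commit_fn, batch_size=10000):
    """Commit a list of N-Quad strings in separate fixed-size transactions.

    Returns the number of transactions committed.
    """
    batches = 0
    for start in range(0, len(triples), batch_size):
        batch = triples[start:start + batch_size]
        commit_fn("\n".join(batch))  # one small transaction per batch
        batches += 1
    return batches
```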
Configurations:
Dgraph version : v1.0.11
Commit SHA-1 : b2a09c5
Commit timestamp : 2018-12-17 09:50:56 -0800
Branch : HEAD
Go version : go1.11.1