Postgres as GraphDB: high volume of data #1741
Replies: 3 comments
-
You can still apply some of the optimization techniques you would use in plain PostgreSQL, such as batch processing and parallel insertion. In your local test app, you could use Postgres's performance monitoring tools, like the pg_stat_statements module or pgAdmin, keep comparing execution times, and adjust the logic in your application accordingly. @jrgemignani @rafsun42 do you have any insight on this?
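For illustration, here is a minimal sketch of what batched, parallel insertion plus a pg_stat_statements check could look like from a Python loader. Everything specific in it is an assumption, not something from this thread: the connection string, the `nodes_staging` table, and the `node_id`/`label`/`props` columns are placeholders, and pg_stat_statements must be listed in `shared_preload_libraries` and created with `CREATE EXTENSION` before the timing query returns anything.

```python
# Hypothetical loader sketch: batched, parallel CSV insertion with psycopg2.
# Table and column names below are placeholders, not from this thread.
import csv
import multiprocessing as mp

import psycopg2
from psycopg2.extras import execute_values

DSN = "dbname=graphdb user=postgres"  # placeholder connection string
BATCH_SIZE = 10_000

def batches(path, size=BATCH_SIZE):
    """Yield lists of row tuples read from a CSV file."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        batch = []
        for row in reader:
            batch.append(tuple(row))
            if len(batch) == size:
                yield batch
                batch = []
        if batch:
            yield batch

def load_chunk(rows):
    """Insert one batch in a single transaction; each worker opens its own connection."""
    conn = psycopg2.connect(DSN)
    try:
        with conn, conn.cursor() as cur:
            execute_values(
                cur,
                "INSERT INTO nodes_staging (node_id, label, props) VALUES %s",
                rows,
            )
    finally:
        conn.close()

def slowest_statements(limit=5):
    """Print the slowest statements recorded by pg_stat_statements.

    Requires the extension to be preloaded and created; the timing column
    is total_exec_time on PostgreSQL 13+, total_time on older versions.
    """
    conn = psycopg2.connect(DSN)
    try:
        with conn, conn.cursor() as cur:
            cur.execute(
                "SELECT query, calls, total_exec_time "
                "FROM pg_stat_statements "
                "ORDER BY total_exec_time DESC LIMIT %s",
                (limit,),
            )
            for query, calls, total in cur.fetchall():
                print(f"{total:>10.1f} ms  {calls:>6} calls  {query[:80]}")
    finally:
        conn.close()

if __name__ == "__main__":
    # One worker per core; compare wall-clock and pg_stat_statements
    # timings while tuning BATCH_SIZE and the worker count.
    with mp.Pool(processes=4) as pool:
        pool.map(load_chunk, batches("Nodes.csv"))
    slowest_statements()
```

At the volumes described below (roughly 7 GB every 10 minutes), `COPY` into a staging table (e.g. via `cur.copy_expert`) will usually beat row-based INSERTs by a wide margin, so treat this as a starting point for measuring rather than a final design.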
-
@markgomer @jorgemgomes
-
Hi everyone,
First things first: I am new to graph databases. Each day I have approximately 1 TB of data stored in S3 in .csv format (edges and nodes), and I need to insert it into a graph database every 10 minutes (roughly 7 GB per 10-minute window).
I would like to know:
- Is PostgreSQL a good choice to act as a graph database for this use case?
- What would be the best way to achieve this in terms of performance? In other words, what kind of application do I need to build to send the data to PostgreSQL every 10 minutes?
Sample of data:
Nodes.csv:
Edges.csv:
Thanks for your input!