I have thousands of records to insert into the tables, so I want to try a bulk insert kind of feature. #659
Unanswered
patelronakp asked this question in Questions
Replies: 1 comment
-
You should be able to use Session.bulk_insert_mappings. Just pass your model as the 'mapper'. An example using the tutorial models:
session.bulk_insert_mappings(Hero, [{"name": "Deadpond", "secret_name": "Dive Wilson"}])
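For context, here is a runnable sketch of that suggestion, assuming the Hero model and a SQLite database as in the SQLModel tutorial (the extra "Rusty-Man" row is just illustrative):

```python
from typing import Optional

from sqlmodel import Field, Session, SQLModel, create_engine


class Hero(SQLModel, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    name: str
    secret_name: str


engine = create_engine("sqlite:///heroes.db")
SQLModel.metadata.create_all(engine)

rows = [
    {"name": "Deadpond", "secret_name": "Dive Wilson"},
    {"name": "Rusty-Man", "secret_name": "Tommy Sharp"},
]

with Session(engine) as session:
    # bulk_insert_mappings is inherited from the underlying SQLAlchemy Session;
    # the SQLModel class is passed as the mapper, and each dict becomes one row.
    session.bulk_insert_mappings(Hero, rows)
    session.commit()
```

Because the rows are plain dicts rather than tracked ORM instances, this skips most of the per-object overhead that makes add_all slow for large batches.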
-
Example Code
Description
SQLAlchemy has a bulk_insert_mappings option, but I didn't find it on SQLModel's AsyncSession object, so I want to know how to do bulk inserts using SQLModel's AsyncSession.
I tried the same bulk insert with add_all, but it takes a long time to insert 70K records into the table. I then created a sample project with SQLAlchemy's bulk_insert_mappings, and it inserts the same data within a few seconds.
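As a minimal sketch of one way to reach bulk_insert_mappings from an AsyncSession, you can drop into the underlying sync session with run_sync. This is based on SQLAlchemy's async API rather than anything confirmed in this thread, and the aiosqlite URL and Hero model are illustrative assumptions:

```python
import asyncio
from typing import Optional

from sqlalchemy.ext.asyncio import create_async_engine
from sqlmodel import Field, SQLModel
from sqlmodel.ext.asyncio.session import AsyncSession


class Hero(SQLModel, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    name: str
    secret_name: str


async def bulk_insert(rows: list[dict]) -> None:
    # Illustrative async engine; any async driver supported by SQLAlchemy works.
    engine = create_async_engine("sqlite+aiosqlite:///heroes.db")
    async with engine.begin() as conn:
        await conn.run_sync(SQLModel.metadata.create_all)

    async with AsyncSession(engine) as session:
        # run_sync hands the underlying sync Session to the callback,
        # where bulk_insert_mappings is available.
        await session.run_sync(
            lambda sync_session: sync_session.bulk_insert_mappings(Hero, rows)
        )
        await session.commit()


asyncio.run(bulk_insert([{"name": "Deadpond", "secret_name": "Dive Wilson"}]))
```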
Operating System
Windows
Operating System Details
No response
SQLModel Version
sqlmodel Version: 0.0.8
Python Version
Python 3.10.10
Additional Context
If anyone has a solution for bulk inserting efficiently, please share your suggestions.