I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
[FOR CHINESE USERS] Please submit issues in English, or they will be closed. Thank you! :)
Please do not modify this template :) and fill in all the required fields.
Dify version
0.14.1
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
When deploying Dify with pgvector as the vector store, vector tables are created in PostgreSQL with names starting with embedding_vector_index_. However, when a new migration file is generated, the migration will delete all of these vector tables.
There are two potential solutions to address this issue:
1. Store the base data in one database and store the vectors in another database.
2. Add a line of code to the migration file to include an exception for these vector tables:
To prevent migration files from deleting vector tables with names starting with embedding_vector_index_ in PostgreSQL, you can implement a custom function in the env.py file to exclude these tables from migrations. The include_name function you've mentioned is a suitable approach. Here's how it works:
Custom Function: The include_name function checks the name of each table during the migration process. If the table name starts with embedding_vector_index_, it returns False, effectively excluding these tables from being affected by the migration.
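The filtering predicate can be checked in isolation before wiring it into Alembic (a standalone sketch; the table names in the checks are hypothetical examples):

```python
def include_name(name, type_, parent_names):
    """Autogenerate filter hook: return False to exclude an object by name."""
    if type_ == "table":
        # Exclude runtime-created pgvector tables
        return not name.startswith("embedding_vector_index_")
    return True


# Quick sanity checks with illustrative names:
assert include_name("embedding_vector_index_abc123_node", "table", {}) is False
assert include_name("apps", "table", {}) is True
# Non-table objects (indexes, columns, ...) are never filtered by this hook:
assert include_name("embedding_vector_index_abc123_idx", "index", {}) is True
```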
Integration in env.py: You need to ensure that this function is integrated into the migration context configuration. This is done by adding include_name=include_name to the context.configure call in the run_migrations_online function.
Here's a brief overview of the code snippet you provided:
```python
def include_name(name, type_, parent_names):
    if type_ == "table":
        # Ignore all tables starting with 'embedding_vector_index_'
        return not name.startswith('embedding_vector_index_')
    return True


def run_migrations_online():
    ...
    with connectable.connect() as connection:
        context.configure(
            connection=connection,
            target_metadata=get_metadata(),
            process_revision_directives=process_revision_directives,
            include_object=include_object,
            include_name=include_name,  # Add this line
            **current_app.extensions['migrate'].configure_args
        )
    ...
```
By implementing this change, you ensure that the migration process respects the presence of these vector tables and does not delete them inadvertently. This approach allows you to manage your vector tables effectively without needing to separate them into a different database.
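Since the `context.configure` call above already passes an `include_object` hook, the same guard could alternatively live there (a sketch assuming Alembic's standard `include_object` signature; the body of Dify's existing hook is not shown in this issue):

```python
def include_object(object, name, type_, reflected, compare_to):
    # Tables that exist in the database but not in target_metadata are
    # passed with reflected=True and compare_to=None; autogenerate would
    # emit a drop_table for them unless we return False here.
    if type_ == "table" and name.startswith("embedding_vector_index_"):
        return False
    return True
```

Either hook works; `include_name` is the simpler choice when filtering purely by name, while `include_object` also sees the reflected object itself.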
The file to modify is /api/migrations/env.py.
✔️ Expected Behavior
Migration files should not delete vector tables, or they should provide a way to manage these tables properly in the migration process.
❌ Actual Behavior
No response