Semantic Kernel has a memory feature that lets you store data in a vector database (such as Postgres) and then retrieve it through semantic search that SK understands.
I think it would be better to show off the workflow of using SK memory rather than the custom approach to managing embeddings and vector queries that is currently in eShop.
Note: Memory in SK is currently marked as Experimental, so it does pose a risk to adoption.
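To illustrate the workflow, a minimal sketch of wiring up SK memory with a Postgres store might look like the following. This uses the `MemoryBuilder` and `PostgresMemoryStore` types from the SK memory packages; the connection string, endpoint, and deployment name are placeholders, not values from this repo:

```csharp
using Microsoft.SemanticKernel.Connectors.Postgres;
using Microsoft.SemanticKernel.Memory;

// Build an ISemanticTextMemory backed by pgvector on Postgres.
// Endpoint, API key, and deployment name below are placeholders.
var memory = new MemoryBuilder()
    .WithAzureOpenAITextEmbeddingGeneration(
        deploymentName: "text-embedding-ada-002",
        endpoint: "https://my-aoai.openai.azure.com/",
        apiKey: Environment.GetEnvironmentVariable("AOAI_API_KEY")!)
    .WithMemoryStore(new PostgresMemoryStore(
        connectionString: "Host=localhost;Database=catalog;Username=postgres;Password=<password>",
        vectorSize: 1536)) // must match the embedding model's dimensions
    .Build();
```

SK then handles generating embeddings on save and on query, so the application never touches vectors directly.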
This moves away from the custom solution for vector searching on Postgres and uses the SK memory feature instead. It adds some new SK dependencies for memory (and the Postgres memory store), and removes the Embedding column from the current data model, since that data now lives in a new table.
Refactored the seed logic to load the memory store using the previously generated embeddings.
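The seed step can be sketched as below. Since `ISemanticTextMemory.SaveInformationAsync` would regenerate embeddings, loading previously generated ones goes through the underlying `IMemoryStore` with `MemoryRecord.LocalRecord`. The `CatalogItem` shape (Id, Name, Description, Embedding) is an assumption about the seed data, not taken from the eShop source:

```csharp
using Microsoft.SemanticKernel.Connectors.Postgres;
using Microsoft.SemanticKernel.Memory;

// Seed sketch: upsert previously generated embeddings directly into the
// Postgres memory store, skipping embedding generation.
IMemoryStore store = new PostgresMemoryStore(connectionString, vectorSize: 1536);

if (!await store.DoesCollectionExistAsync("catalog"))
{
    await store.CreateCollectionAsync("catalog");
}

foreach (var item in seedItems) // hypothetical seed data with precomputed vectors
{
    var record = MemoryRecord.LocalRecord(
        id: item.Id.ToString(),
        text: item.Description,
        description: item.Name,
        embedding: item.Embedding); // previously generated float vector
    await store.UpsertAsync("catalog", record);
}
```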
Changed the CatalogAPI route to use the ISemanticTextMemory search feature to query memory, rather than the custom SQL query. This does mean the distance is no longer surfaced, and pagination is currently lost because SK memory doesn't support it (we could roll that ourselves if we want).
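The route change can be sketched as below. `SearchAsync` returns `MemoryQueryResult` items with a `Relevance` score rather than a raw distance, and it takes only a `limit` with no offset, which is why distance and pagination are lost. The collection name, query variable, and limit here are illustrative assumptions:

```csharp
using Microsoft.SemanticKernel.Memory;

// Search sketch: replace the custom SQL vector query with SK memory search.
var results = new List<MemoryQueryResult>();
await foreach (var result in memory.SearchAsync(
    collection: "catalog",
    query: text,
    limit: 12,                // caps results, but no offset: pagination is lost
    minRelevanceScore: 0.5))  // relevance (similarity), not raw distance
{
    results.Add(result);
}

// Map hits back to catalog item ids via the stored record metadata.
var ids = results.Select(r => int.Parse(r.Metadata.Id)).ToList();
```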
Included a fix so that AOAI can be deployed (issue dotnet#280), and added pgAdmin for easier debugging of the data in the database.