
Memory Module Embedder #5

Open
BillChan226 opened this issue Mar 29, 2024 · 1 comment
@BillChan226

Hi, thanks again for the incredible work!

While trying to reproduce Agent-Driver end-to-end, I'm running into some problems with the memory module. It seems that the embeddings of the encoded queries have already been cached in the local memory file database.pkl. I'm wondering which encoding model you used to produce the embeddings of the queries and keys. Did you simply use the coordinates of the ego-states and historical trajectories, and a one-hot vector to embed the mission goal? If not, I would really appreciate it if you could share the encoder you used.

Thanks!

@Jay-Ye (Collaborator)

Jay-Ye commented Apr 18, 2024

Hi, thanks for your interest in our work.
We retrieve the memory scenarios by querying [ego_states, mission_goal, ego_hist_traj]. Please see here for details of the query vector. Specifically, the mission goal is embedded as a 3x1 one-hot vector indicating whether the goal is to go left, right, or straight.
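
To make the query construction concrete, here is a minimal sketch of how such a vector could be assembled. This is an illustration only, assuming ego_states and ego_hist_traj are flat coordinate arrays; the function names, dimensions, and field layout are hypothetical and may differ from Agent-Driver's actual memory module code.

```python
import numpy as np

# Hypothetical goal ordering; the actual index assignment in the
# repository may differ.
GOALS = ("left", "right", "straight")

def encode_mission_goal(goal: str) -> np.ndarray:
    """Embed the mission goal as a 3x1 one-hot vector (left/right/straight)."""
    vec = np.zeros(len(GOALS))
    vec[GOALS.index(goal)] = 1.0
    return vec

def build_query(ego_states: np.ndarray, goal: str,
                ego_hist_traj: np.ndarray) -> np.ndarray:
    """Concatenate [ego_states, mission_goal, ego_hist_traj] into one query vector."""
    return np.concatenate([ego_states.ravel(),
                           encode_mission_goal(goal),
                           ego_hist_traj.ravel()])

# Example: 4 ego-state values plus a 3-waypoint (x, y) history
# gives a 4 + 3 + 6 = 13-dimensional query.
query = build_query(np.array([0.0, 1.2, 0.5, 0.1]), "straight", np.zeros((3, 2)))
print(query.shape)  # (13,)
```

A vector built this way can then be compared against the cached keys in database.pkl with a standard nearest-neighbor lookup (e.g. Euclidean or cosine distance) to retrieve similar past scenarios.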
