When memorizing a sequence (a 1D intervention), is it possible to attend to it, as in 'where is GO-> located' (Stanford)?
I'd be interested in using pyreft for 'online learning', similar to the associative-memory approaches proposed in Larimar/MemoryLLM/CameLoT/Memory of Amortized Contexts. Those projects lack the implementations, usable interfaces, and possibilities to transfer/load learned behavior that pyreft comes with.
As an alternative, would I train and load (hundreds of) partitioned sub-LoReFTs to achieve the same?
chris-aeviator changed the title from [Question] How to attend to memorized example? to [Question] How to attend to memorized intervention? on Apr 20, 2024
@chris-aeviator Thanks for the question. Could you elaborate on the memorization example? The current setup trains an intervention to memorize the whole sequence. Could you explain the "attending" part? Thanks!
frankaging changed the title from [Question] How to attend to memorized intervention? to [P1] How to attend to memorized intervention? on Apr 20, 2024
Let's view the memorized sequence as an 'LLM memory' for a second. If one could selectively write to (and query) these memorized sequences and attend to them, it would be possible to build a long-term memory. The projects I mentioned more or less do a top-k nearest-neighbor search over keys for that query, but that is not the point of my issue; I'm looking for controllable queries/writes to the 'memory'.
Using a LoReFT intervention allows me to attend to the trained material, but it seems not suited to training a single piece of text (I might be wrong).
What I mean by attending to the memory: there is no way to interact with a 1D memorized sequence. I cannot mix a user query with the memorized-sequence query; it either looks up the memorized sequence or has no 'access to it' (when mixed with a user query).
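For context, the top-k key lookup that the cited projects use can be sketched in a few lines. This is a minimal, illustrative example only; the class and method names (`MemoryStore`, `write`, `query`) are hypothetical and not part of pyreft or any of the projects above. The idea is that each memorized item (e.g. a trained intervention) is stored under a key vector, and an incoming query is routed to the closest keys:

```python
import math

class MemoryStore:
    """Hypothetical associative memory: key vectors -> stored payloads."""

    def __init__(self):
        self.keys = []    # one key vector per memorized item
        self.values = []  # payloads, e.g. ids of trained interventions

    def write(self, key, value):
        # 'write to memory': store a payload under a key vector
        self.keys.append(key)
        self.values.append(value)

    def query(self, q, k=1):
        # 'query memory': top-k nearest-neighbor search by cosine similarity
        def cos(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            na = math.sqrt(sum(x * x for x in a))
            nb = math.sqrt(sum(y * y for y in b))
            return dot / (na * nb)

        scored = sorted(
            ((cos(q, key), v) for key, v in zip(self.keys, self.values)),
            key=lambda t: t[0],
            reverse=True,
        )
        return [v for _, v in scored[:k]]

mem = MemoryStore()
mem.write([1.0, 0.0], "intervention_A")  # e.g. a LoReFT trained on text A
mem.write([0.0, 1.0], "intervention_B")  # e.g. a LoReFT trained on text B
print(mem.query([0.9, 0.1], k=1))  # → ['intervention_A']
```

A controllable version of "mixing" a user query with the memory would then amount to deciding, per query, which stored keys to read and how to blend them, rather than an all-or-nothing lookup.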