
Conversation

belerico
Contributor

This PR introduces LongLoRA as in #1237 for both LoRA and full fine-tuning, while also enabling it during generation.

belerico added 30 commits April 3, 2024 10:54
@rasbt
Contributor

rasbt commented Apr 24, 2024

Wow, this is a huge PR! Thanks for working on that!
Starting with some high-level feedback: is LongLoRA always enabled by default? I think we should probably make this optional, similar to GaLore for example. I.e., adding

use_longlora: bool = False,
     """Whether to enable LongLoRA."""

What do you think?
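To make the suggestion above concrete, here is a minimal sketch of how such an opt-in flag could gate the LongLoRA code path so that default behavior is unchanged. The config class, field names (other than `use_longlora` itself), and the `build_attention` helper are hypothetical illustrations, not the actual litgpt API.

```python
from dataclasses import dataclass


@dataclass
class TrainConfig:
    # Hypothetical config sketch; only `use_longlora` mirrors the
    # suggestion in the comment above.
    use_longlora: bool = False
    """Whether to enable LongLoRA."""
    longlora_n_groups: int = 4
    """Number of groups for shifted sparse attention (illustrative)."""


def build_attention(config: TrainConfig) -> str:
    # Gate the LongLoRA path behind the flag so that, as with GaLore,
    # users who do not opt in get the standard attention unchanged.
    if config.use_longlora:
        return "shifted-sparse-attention"
    return "standard-attention"


print(build_attention(TrainConfig()))                   # standard-attention
print(build_attention(TrainConfig(use_longlora=True)))  # shifted-sparse-attention
```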

@belerico
Contributor Author

Hi @rasbt, I'm closing this because I've seen that the history contains some unrelated commits, including ones about the removal of the last layers: my bad, because I've put everything together for some experiments I'm running. I'll close this and open a new branch with the history fixed.
Sorry about this.

@rasbt
Contributor

rasbt commented Apr 24, 2024

I think it might have been possible to rebase, but all good.
Next time, here's one approach to do this (thanks to @awaelchli who explained this to me):

# update main
git checkout main
git pull origin main

# update target
git checkout branch-to-merge
git pull origin branch-to-merge

# make backup
git checkout -b backup-branch-to-merge

# start rebasing
git checkout branch-to-merge
git rebase main branch-to-merge --interactive

# remove overlap
# delete the ones that are already in main
# save and quit 

# resolve any merge conflicts, then
git add file
git rebase --continue

git push -f origin branch-to-merge
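During the interactive step, git opens a todo list in your editor, one line per commit on the branch. The "remove overlap" step above means changing `pick` to `drop` (or deleting the line) for commits that are already in main. A sketch of what that file looks like, with made-up commit hashes and subjects:

```
pick a1b2c3d Add LongLoRA attention
drop e4f5a6b Remove last layers
pick 9c8d7e6 Enable LongLoRA during generation
```

Saving and closing the editor then lets the rebase replay only the `pick`ed commits on top of main.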

