
Memory leak? #115

Open
josecannete opened this issue Sep 5, 2022 · 1 comment

@josecannete

Hello, thank you so much for your work.

I'm trying to use the FlopCountAnalysis class, but afterwards I'm not able to free the GPU memory it used.

As a minimal example, without the FlopCountAnalysis I can do something like:

```python
from transformers import AutoModel
import torch

model = AutoModel.from_pretrained('dccuchile/bert-base-spanish-wwm-uncased')
query = torch.randint(low=0, high=20, size=(8, 16))

print(torch.cuda.memory_allocated())

model.to("cuda:0")
query = query.to("cuda:0")

print(torch.cuda.memory_allocated())

del model
del query

print(torch.cuda.memory_allocated())
```

And that prints "0", "439937024", "0".

When using the FlopCountAnalysis:

```python
from transformers import AutoModel
import torch
from fvcore.nn import FlopCountAnalysis

model = AutoModel.from_pretrained('dccuchile/bert-base-spanish-wwm-uncased')
query = torch.randint(low=0, high=20, size=(8, 16))

print(torch.cuda.memory_allocated())

model.to("cuda:0")
query = query.to("cuda:0")

print(torch.cuda.memory_allocated())

counter = FlopCountAnalysis(model, inputs=query)
total = counter.total()

print(torch.cuda.memory_allocated())

del model
del query
del counter
del total

print(torch.cuda.memory_allocated())
```

It shows "0", "439937024", "530033664", "530033664".
I expect the final memory allocated to be 0 again.

I also tried adding:

```python
import gc

gc.collect()
torch.cuda.empty_cache()
```

at the end, but the result was the same.
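One way to check whether `del` plus `gc.collect()` actually releases the model, or whether something (for instance, a cached trace inside the analysis object) is still holding a reference to it, is a plain-Python weakref probe. This is a generic sketch that needs no GPU and no fvcore; the `Model` class here is just a hypothetical stand-in for the real network:

```python
import gc
import weakref


class Model:
    """Stand-in for a large object such as a torch model."""

    def __init__(self):
        self.weights = [0.0] * 1000


model = Model()
ref = weakref.ref(model)  # a weak reference does not keep the object alive

del model
gc.collect()

# If some other live object still referenced the model, ref() would
# return it; None means the object was really freed.
print(ref() is None)  # → True
```

Running the same probe against the model right after `FlopCountAnalysis` would show whether the counter keeps the model reachable even after `del`.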

Is there a proper way to free the memory?

Thank you.

@abhishekaich27

Same issue. Any inputs here, @ppwwyyxx?
