Replies: 5 comments
>>> brian2
[April 29, 2020, 2:05pm]
I have a GPU with 6 GB of memory and received an out-of-memory error. I reduced the batch size in the config file with:

import json
from utils.generic_utils import load_config

CONFIG = load_config('config.json')
CONFIG['datasets'][0]['path'] = '../LJSpeech-1.1/'
CONFIG['output_path'] = '../'
CONFIG['batch_size'] = 8
CONFIG['eval_batch_size'] = 8

with open('config.json', 'w') as fp:
    json.dump(CONFIG, fp)

but I still get the memory error:

RuntimeError: CUDA out of memory. Tried to allocate 106.00 MiB (GPU 0; 5.93 GiB total capacity; 3.20 GiB already allocated; 25.44 MiB free; 3.47 GiB reserved in total by PyTorch)
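[Editor's note: the config-update step above can be reproduced with the standard-library json module alone, without TTS's load_config helper. The sketch below uses a temporary file and a made-up minimal config purely for illustration; the keys mirror those in the post, but the file path and starting values are assumptions.]

import json
import os
import tempfile

# Illustrative config file with a larger starting batch size
cfg_path = os.path.join(tempfile.mkdtemp(), "config.json")
with open(cfg_path, "w") as fp:
    json.dump({"batch_size": 32, "eval_batch_size": 32}, fp)

# Load, lower both batch sizes (as in the post), and write back
with open(cfg_path) as fp:
    config = json.load(fp)
config["batch_size"] = 8
config["eval_batch_size"] = 8
with open(cfg_path, "w") as fp:
    json.dump(config, fp)

# Re-read the file to confirm the change actually landed on disk
with open(cfg_path) as fp:
    saved = json.load(fp)
print(saved["batch_size"])

Re-reading the file after writing is a cheap sanity check that the training run will pick up the reduced batch size rather than a stale value.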
[This is an archived TTS discussion thread from discourse.mozilla.org/t/how-to-change-batch-size]