Popular repositories
- lumi-llm-scaling (Public, forked from spyysalo/lumi-llm-scaling)
  Scripts and documentation on scaling large language model training on the LUMI supercomputer
  Shell
- lm-evaluation-harness (Public, forked from EleutherAI/lm-evaluation-harness)
  A framework for few-shot evaluation of autoregressive language models.
  Python · 1 star
- Megatron-DeepSpeed (Public, forked from deepspeedai/Megatron-DeepSpeed)
  Ongoing research training transformer language models at scale, including BERT & GPT-2
  Python · 1 star