Optimizers


Copyright (c) Meta Platforms, Inc. and affiliates. All rights reserved.

Description

Optimizers is a GitHub repository of PyTorch optimization algorithms. It is designed for external collaboration and development.

It currently includes the following optimizers:

  • Distributed Shampoo

See the CONTRIBUTING file for how to help out.

License

Optimizers is BSD licensed, as found in the LICENSE file.

Installation and Dependencies

This code requires python>=3.10 and torch>=2.5.0. Install distributed_shampoo with all dependencies:

git clone git@github.com:facebookresearch/optimizers.git
cd optimizers
pip install .

If you also want to try the examples, replace the last line with pip install ".[examples]".

Usage

After installation, basic usage looks like:

import torch
from distributed_shampoo import AdamGraftingConfig, DistributedShampoo

model = ...  # Instantiate model

optim = DistributedShampoo(
    model.parameters(),
    lr=1e-3,
    betas=(0.9, 0.999),
    epsilon=1e-8,
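    # Grafting uses Adam to set the step-size scale applied to Shampoo's update direction.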
    grafting_config=AdamGraftingConfig(
        beta2=0.999,
        epsilon=1e-8,
    ),
)
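
DistributedShampoo follows the standard torch.optim.Optimizer interface, so it drops into an ordinary PyTorch training loop. The sketch below is illustrative only; the dataloader and loss function are placeholders, not part of this repository.

# Minimal training-loop sketch; dataloader and loss_fn are placeholders.
loss_fn = torch.nn.CrossEntropyLoss()

for inputs, targets in dataloader:
    optim.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    optim.step()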

For more details, please see the additional documentation for Distributed Shampoo, especially its How to Use section.