update URL github.com/deepmind -> github.com/google-deepmind and branch to main

PiperOrigin-RevId: 598769941
fabianp authored and OptaxDev committed Jan 16, 2024
1 parent 35c719f commit 1be84ff
Showing 3 changed files with 25 additions and 20 deletions.
22 changes: 11 additions & 11 deletions README.md
@@ -1,6 +1,6 @@
# Optax

-![CI status](https://github.com/deepmind/optax/workflows/tests/badge.svg)
+![CI status](https://github.com/google-deepmind/optax/workflows/tests/badge.svg)
[![Documentation Status](https://readthedocs.org/projects/optax/badge/?version=latest)](http://optax.readthedocs.io)
![pypi](https://img.shields.io/pypi/v/optax)

@@ -41,7 +41,7 @@ pip install optax
or you can install the latest development version from GitHub:

```sh
-pip install git+https://github.com/deepmind/optax.git
+pip install git+https://github.com/google-deepmind/optax.git
```

## Quickstart
@@ -77,7 +77,7 @@ updates, opt_state = optimizer.update(grads, opt_state)
params = optax.apply_updates(params, updates)
```

-You can continue the quick start in [the Optax quickstart notebook.](https://github.com/deepmind/optax/blob/main/examples/quick_start.ipynb)
+You can continue the quick start in [the Optax quickstart notebook.](https://github.com/google-deepmind/optax/blob/main/examples/quick_start.ipynb)
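
For orientation while reading the diff, the quickstart that this link points to boils down to roughly the following training loop. The toy parameters, data and loss function here are illustrative stand-ins, not the notebook's exact code:

```python
import jax
import jax.numpy as jnp
import optax

# Illustrative linear-regression setup (placeholder data, not from the notebook).
params = {'w': jnp.zeros((2,)), 'b': jnp.zeros(())}
xs = jnp.array([[1.0, 2.0], [3.0, 4.0]])
ys = jnp.array([1.0, 2.0])

def loss_fn(params, xs, ys):
  preds = xs @ params['w'] + params['b']
  return jnp.mean((preds - ys) ** 2)

optimizer = optax.adam(learning_rate=1e-2)
opt_state = optimizer.init(params)

for _ in range(100):
  grads = jax.grad(loss_fn)(params, xs, ys)
  updates, opt_state = optimizer.update(grads, opt_state)
  params = optax.apply_updates(params, updates)
```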


## Components
@@ -86,7 +86,7 @@ We refer to the [docs](https://optax.readthedocs.io/en/latest/index.html)
for a detailed list of available Optax components. Here, we highlight
the main categories of building blocks provided by Optax.

-### Gradient Transformations ([transform.py](https://github.com/deepmind/optax/blob/main/optax/_src/transform.py))
+### Gradient Transformations ([transform.py](https://github.com/google-deepmind/optax/blob/main/optax/_src/transform.py))

One of the key building blocks of `optax` is a `GradientTransformation`.

@@ -107,7 +107,7 @@ state = tx.init(params) # init stats
grads, state = tx.update(grads, state, params) # transform & update stats.
```
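
Filled out, the collapsed snippet above might look like this; `scale_by_adam` is used purely as an example of a `GradientTransformation`, and the toy params/grads are placeholders:

```python
import jax.numpy as jnp
import optax

params = {'w': jnp.zeros((3,))}
grads = {'w': jnp.array([0.1, -0.2, 0.3])}

# A GradientTransformation is a pair of pure functions (init, update);
# all statistics (here Adam's moment estimates) live in `state`.
tx = optax.scale_by_adam()
state = tx.init(params)                           # init stats
updates, state = tx.update(grads, state, params)  # transform & update stats
```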

-### Composing Gradient Transformations ([combine.py](https://github.com/deepmind/optax/blob/main/optax/_src/combine.py))
+### Composing Gradient Transformations ([combine.py](https://github.com/google-deepmind/optax/blob/main/optax/_src/combine.py))

The fact that transformations take candidate gradients as input and return
processed gradients as output (in contrast to returning the updated parameters)
@@ -127,7 +127,7 @@ my_optimiser = chain(
scale(-learning_rate))
```
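
A fuller version of this `chain` example might read as follows; the specific transformations (`clip_by_global_norm`, `scale_by_adam`) are an illustrative choice, not prescribed by the README:

```python
import optax

learning_rate = 1e-3

# chain threads the updates through each transformation in order,
# ending with scale(-lr) so the result is a descent step.
my_optimiser = optax.chain(
    optax.clip_by_global_norm(1.0),
    optax.scale_by_adam(b1=0.9, b2=0.999, eps=1e-8),
    optax.scale(-learning_rate),
)
```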

-### Wrapping Gradient Transformations ([wrappers.py](https://github.com/deepmind/optax/blob/main/optax/_src/wrappers.py))
+### Wrapping Gradient Transformations ([wrappers.py](https://github.com/google-deepmind/optax/blob/main/optax/_src/wrappers.py))

Optax also provides several wrappers that take a `GradientTransformation` as
input and return a new `GradientTransformation` that modifies the behaviour
@@ -148,7 +148,7 @@ Other examples of wrappers include accumulating gradients over multiple steps
or applying the inner transformation only to specific parameters or at
specific steps.
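
As a sketch of the wrapper pattern described above (the particular wrappers and settings are chosen for illustration):

```python
import optax

base = optax.adam(1e-3)

# Accumulate gradients and only apply them every 4th step.
accumulating = optax.MultiSteps(base, every_k_schedule=4)

# Reject updates that contain NaN/inf, giving up after 5 bad steps in a row.
robust = optax.apply_if_finite(base, max_consecutive_errors=5)
```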

-### Schedules ([schedule.py](https://github.com/deepmind/optax/blob/main/optax/_src/schedule.py))
+### Schedules ([schedule.py](https://github.com/google-deepmind/optax/blob/main/optax/_src/schedule.py))

Many popular transformations use time-dependent components, e.g. to anneal
some hyper-parameter (e.g. the learning rate). Optax provides for this purpose
@@ -176,7 +176,7 @@ optimiser = chain(
scale_by_schedule(schedule_fn))
```
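
Expanded, the schedule example might look roughly like the following; the linear decay from 1.0 to 0.0 over 1000 steps is an illustrative choice:

```python
import optax

learning_rate = 1e-3

# A schedule is just a function from step count to a scalar.
schedule_fn = optax.polynomial_schedule(
    init_value=1.0, end_value=0.0, power=1, transition_steps=1000)

optimiser = optax.chain(
    optax.scale_by_adam(),
    optax.scale_by_schedule(schedule_fn),  # anneal the step size over time
    optax.scale(-learning_rate),
)
```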

-### Popular optimisers ([alias.py](https://github.com/deepmind/optax/blob/main/optax/_src/alias.py))
+### Popular optimisers ([alias.py](https://github.com/google-deepmind/optax/blob/main/optax/_src/alias.py))

In addition to the low-level building blocks, we also provide aliases for popular
optimisers built using these components (e.g. RMSProp, Adam, AdamW, etc, ...).
@@ -192,7 +192,7 @@ def adamw(learning_rate, b1, b2, eps, weight_decay):
scale_and_decay(-learning_rate, weight_decay=weight_decay))
```
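
In practice the alias is used directly rather than re-deriving it; the hyper-parameter values below are placeholders:

```python
import optax

# Roughly equivalent to chaining scale_by_adam with weight decay and a -lr scale.
optimizer = optax.adamw(
    learning_rate=1e-3, b1=0.9, b2=0.999, eps=1e-8, weight_decay=1e-4)
```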

-### Applying updates ([update.py](https://github.com/deepmind/optax/blob/main/optax/_src/update.py))
+### Applying updates ([update.py](https://github.com/google-deepmind/optax/blob/main/optax/_src/update.py))

After transforming an update using a `GradientTransformation` or any custom
manipulation of the update, you will typically apply the update to a set
@@ -212,7 +212,7 @@ critical to support composing a sequence of transformations (e.g. `chain`), as
well as combining multiple updates to the same parameters (e.g. in multi-task
settings where different tasks need different sets of gradient transformations).
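
A minimal sketch of that incremental update step (the pytrees here are toy values):

```python
import jax.numpy as jnp
import optax

params = {'w': jnp.array([1.0, 2.0])}
updates = {'w': jnp.array([-0.1, 0.05])}

# apply_updates only adds the (already transformed and scaled) updates to the
# params, so several update pytrees can be applied to the same params in turn.
params = optax.apply_updates(params, updates)
```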

-### Losses ([loss.py](https://github.com/google-deepmind/optax/tree/master/optax/losses))
+### Losses ([loss.py](https://github.com/google-deepmind/optax/tree/main/optax/losses))

Optax provides a number of standard losses used in deep learning, such as
`l2_loss`, `softmax_cross_entropy`, `cosine_distance`, etc.
@@ -229,7 +229,7 @@ avg_loss = jnp.mean(huber_loss(predictions, targets))
sum_loss = jnp.sum(huber_loss(predictions, targets))
```
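
Another loss from the same module, reduced explicitly by the caller just as with `huber_loss` above (logits and labels are toy values):

```python
import jax.numpy as jnp
import optax

logits = jnp.array([[2.0, 1.0, 0.1]])
labels = jnp.array([[1.0, 0.0, 0.0]])  # one-hot targets

# Losses are returned per example; reduce them yourself.
loss = jnp.mean(optax.softmax_cross_entropy(logits, labels))
```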

-### Second Order ([second_order.py](https://github.com/google-deepmind/optax/tree/master/optax/second_order))
+### Second Order ([second_order.py](https://github.com/google-deepmind/optax/tree/main/optax/second_order))

Computing the Hessian or Fisher information matrices for neural networks is
typically intractable due to the quadratic memory requirements. Solving for the
11 changes: 8 additions & 3 deletions docs/conf.py
@@ -263,9 +263,14 @@ def linkcode_resolve(domain, info):
return None

# TODO(slebedev): support tags after we release an initial version.
-return 'https://github.com/deepmind/optax/tree/master/optax/%s#L%d#L%d' % (
-    os.path.relpath(filename, start=os.path.dirname(
-        optax.__file__)), lineno, lineno + len(source) - 1)
+return (
+    'https://github.com/google-deepmind/optax/tree/main/optax/%s#L%d#L%d'
+    % (
+        os.path.relpath(filename, start=os.path.dirname(optax.__file__)),
+        lineno,
+        lineno + len(source) - 1,
+    )
+)


# -- Intersphinx configuration -----------------------------------------------
12 changes: 6 additions & 6 deletions docs/index.rst
@@ -1,4 +1,4 @@
-:github_url: https://github.com/deepmind/optax/tree/master/docs
+:github_url: https://github.com/google-deepmind/optax/tree/main/docs

Optax
-----
@@ -34,7 +34,7 @@ can be used to obtain the most recent version of Optax::
pip install git+git://github.com/google-deepmind/optax.git

Note that Optax is built on top of JAX.
-See `here <https://github.com/google/jax#pip-installation-cpu>`_
+See `here <https://github.com/google/jax?tab=readme-ov-file#installation>`_
for instructions on installing JAX.


@@ -86,19 +86,19 @@ The Team
The development of Optax is led by Ross Hemsley, Matteo Hessel, Markus Kunesch
and Iurii Kemaev. The team relies on outstanding contributions from Research
Engineers and Research Scientists from throughout
-`DeepMind <https://github.com/deepmind/jax/blob/main/deepmind2020jax.txt>`_ and
-Alphabet. We are also very grateful to Optax's open source community for
+`Google DeepMind <https://deepmind.google/discover/blog/using-jax-to-accelerate-our-research/>`_
+and Alphabet. We are also very grateful to Optax's open source community for
contributing ideas, bug fixes, issues, design docs, and amazing new features.

The work on Optax is part of a wider effort to contribute to making the
-`JAX Ecosystem <https://github.com/deepmind/jax/blob/main/deepmind2020jax.txt>`_
+`JAX Ecosystem <https://deepmind.google/discover/blog/using-jax-to-accelerate-our-research/>`_
the best possible environment for ML/AI research.

Support
-------

If you are having issues, please let us know by filing an issue on our
-`issue tracker <https://github.com/deepmind/optax/issues>`_.
+`issue tracker <https://github.com/google-deepmind/optax/issues>`_.


License
