
Major update #25

Open · wants to merge 98 commits into master
Conversation

franckma31
Collaborator

Major revision 1.0
- Update parametrization to use register_parametrization (replacing the previous hook-based mechanism)
- Update unit tests and share common tests with deel-lip
- Add new layers to support ResNet-like 1-Lipschitz networks
- Add multiclass losses
- Normalizers (Spectral and Bjorck): updated with a stopping criterion based on an epsilon value (eps) rather than a fixed number of iterations
- Add the Bjorck "parenthesis trick" (reduces computational complexity)
- Introduce a new parameter disjoint_neurons in FrobeniusLinear to handle multiple output neurons (as in the deel-lip library)
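The move from hooks to PyTorch's parametrization API can be illustrated with a minimal sketch. This is an illustration of the torch.nn.utils.parametrize mechanism only, not the torchlip implementation; the SpectralNormalize module below is hypothetical:

```python
import torch
import torch.nn as nn
import torch.nn.utils.parametrize as parametrize

# Hypothetical parametrization: rescale the weight by its spectral norm
# so the layer is 1-Lipschitz. register_parametrization recomputes the
# constrained weight transparently on every access, which is what
# replaces the older forward-pre-hook mechanism.
class SpectralNormalize(nn.Module):
    def forward(self, w):
        # Largest singular value of the weight matrix
        sigma = torch.linalg.matrix_norm(w, ord=2)
        return w / sigma

layer = nn.Linear(4, 3)
parametrize.register_parametrization(layer, "weight", SpectralNormalize())

x = torch.randn(2, 4)
y = layer(x)  # the forward pass uses the normalized weight
print(torch.linalg.matrix_norm(layer.weight, ord=2))  # ~1.0
```

Because the parametrization is applied on attribute access, `layer.weight` always returns the constrained matrix, while the raw trainable tensor lives in `layer.parametrizations.weight.original`.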
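The eps-based stopping criterion and the parenthesis trick for the Bjorck normalizer can be sketched as follows. This is a minimal illustration of the standard order-1 Bjorck iteration, not the torchlip source; bjorck_normalize is a hypothetical helper:

```python
import torch

def bjorck_normalize(w: torch.Tensor, eps: float = 1e-6, max_iters: int = 100) -> torch.Tensor:
    """Order-1 Bjorck orthonormalization: W <- 1.5 W - 0.5 W (W^T W),
    stopped when the update norm falls below eps instead of after a
    fixed number of iterations."""
    # Pre-scale so the largest singular value is <= 1 (needed for convergence).
    w = w / torch.linalg.matrix_norm(w, ord=2)
    m, n = w.shape
    for _ in range(max_iters):
        # "Parenthesis trick": group the triple product so the
        # intermediate Gram matrix is formed on the smaller dimension,
        # reducing the cost of the matrix products.
        if m <= n:
            w_next = 1.5 * w - 0.5 * ((w @ w.T) @ w)
        else:
            w_next = 1.5 * w - 0.5 * (w @ (w.T @ w))
        if torch.linalg.matrix_norm(w_next - w) < eps:
            w = w_next
            break
        w = w_next
    return w

torch.manual_seed(0)
w = torch.randn(3, 5)
w_ortho = bjorck_normalize(w)
# Rows are now approximately orthonormal: W W^T ≈ I
print(w_ortho @ w_ortho.T)
```

For a weight of shape (m, n), forming `w @ w.T` first costs O(m²n) versus O(mn²) for `w.T @ w`, so picking the grouping by the smaller dimension is what reduces the per-iteration complexity.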

cofri and others added 30 commits June 14, 2022 11:15
Note that the black hook was broken, requiring an update for this hook. In the
meantime, the other hooks were updated too.
Moreover, some other files did not pass the "trailing spaces" hook; they were
fixed.
NumPy < 1.22.0 has a vulnerability (https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-34141).
This is an opportunity to remove the pin to a specific version and to
fetch the latest NumPy release.
Update versions (requirements, tox, pre-commit hooks)

Package versions in multiple files have been updated:
- NumPy 1.19.5 has a vulnerability and must be avoided (https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-34141). The pinned version in requirements.txt is therefore removed to fetch the latest NumPy release.
- The tox virtual environments were fetching different torch/torchvision versions from the ones in requirements.txt. These versions are now aligned in both files to torch==1.10.2+cu113 and torchvision==0.11.3+cu113.
- The pre-commit hooks have been updated to their latest versions. Note that the black hook 21.6b0 was broken: an update was mandatory.
A new parameter is introduced in FrobeniusLinear to handle multiple output
neurons. If disjoint_neurons is True, each output neuron is 1-Lipschitz,
imitating multiple networks with a single output each. If disjoint_neurons is
False, a Frobenius normalization is performed on the whole weight matrix.
New parameter "disjoint_neurons" in FrobeniusLinear

Until now, the FrobeniusLinear layer was offered as a replacement for SpectralLinear when only one output neuron is used.
In this PR, a new parameter disjoint_neurons is introduced in FrobeniusLinear to handle multiple output neurons (as in the deel-lip library):
- If disjoint_neurons = True, each output neuron is 1-Lipschitz. This is equivalent to having multiple networks with a single output neuron each.
- If disjoint_neurons = False, a Frobenius normalization is performed on the whole weight matrix.
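The two modes can be sketched in a few lines. This is an illustration of the normalization logic described above, not the torchlip source; frobenius_normalize is a hypothetical helper:

```python
import torch

def frobenius_normalize(w: torch.Tensor, disjoint_neurons: bool = True) -> torch.Tensor:
    """Sketch of the two FrobeniusLinear modes for a weight of shape
    (out_features, in_features)."""
    if disjoint_neurons:
        # Normalize each output neuron (row) independently, so each
        # neuron is 1-Lipschitz on its own, as if it were the single
        # output of a separate network.
        return w / w.norm(dim=1, keepdim=True)
    # Otherwise normalize the whole weight matrix by its Frobenius norm.
    return w / w.norm()

w = torch.randn(3, 4)
w_rows = frobenius_normalize(w, disjoint_neurons=True)
print(w_rows.norm(dim=1))  # each row has unit norm
```

Note that the Frobenius norm upper-bounds the spectral norm, so normalizing the whole matrix by it is a (conservative) way to enforce the 1-Lipschitz constraint.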
Keep the layer names when vanilla exporting
It no longer depends on the input shape.
Franck Mamalet and others added 26 commits November 27, 2024 17:31
…fle; modify pytest (warning: only 2D inputs and single-value kernel sizes are supported)
Add several new layers to support ResNet-like architectures
@franckma31 franckma31 requested a review from cofri January 24, 2025 10:44
franckma31 and others added 2 commits February 3, 2025 11:43
Deprecated spectral_ and bjorck_ initialization
Update to remove deprecated initialization functions
4 participants