The higher vertical levels show poor skill during inference/forecasting. This is a common issue and is often solved by using a weighting scheme during training. I suggest downweighting the highest layers, but of course one can also try the opposite and emphasize the model's attention (weights) on these upper layers.
Change the weights in the `constants.py` script; currently they are all set to one.
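A minimal sketch of what such a per-level weighting could look like in a PyTorch-style loss, assuming the vertical levels sit in the last tensor dimension; the names `level_weights` and `weighted_mse` and the example values are hypothetical and not taken from `constants.py`:

```python
import torch

# Hypothetical per-level weights (the actual entries in constants.py are
# currently all 1.0). Smaller values for the uppermost vertical levels
# downweight their contribution to the training loss.
# Index 0 = lowest level, last index = highest level.
level_weights = torch.tensor([1.0, 1.0, 1.0, 0.5, 0.1])


def weighted_mse(prediction: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """MSE where each vertical level's squared error is scaled by its
    weight before averaging. Tensors are assumed to have shape
    (batch, ..., num_levels), with levels in the last dimension."""
    squared_error = (prediction - target) ** 2
    return (squared_error * level_weights).mean()
```

Emphasizing the upper layers instead would simply mean setting those entries to values larger than one.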
- Added linting and formatting using pre-commit hooks
- Triggered pre-commit on pushes to the main branch
- Changed line length to 80
- Fixed existing formatting issues
- Added a section about development with pre-commit hooks
---------
Co-authored-by: joeloskarsson <[email protected]>