Warning
Currently under heavy development; please check out RecurrentLayers.jl for a more polished library of recurrent layers.
LuxRecurrentLayers.jl extends the recurrent layer offering of Lux.jl by providing implementations of additional recurrent layers not available in base deep learning libraries.
LuxRecurrentLayers.jl is not in the General registry yet! To install it, please use:
```julia
julia> ]
pkg> add https://github.com/MartinuzziFrancesco/LuxRecurrentLayers.jl
```
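For scripts or CI, the same installation can be done through the Pkg API instead of the Pkg REPL mode:

```julia
# Install directly from the GitHub repository via the Pkg API
using Pkg
Pkg.add(url="https://github.com/MartinuzziFrancesco/LuxRecurrentLayers.jl")
```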
```julia
using Lux, LuxRecurrentLayers, Random

# Seeding
rng = Random.default_rng()
Random.seed!(rng, 0)

# Define the recurrent model (a cell in this case)
rnn = AntisymmetricRNNCell(3 => 5)

# Get parameters and states
ps, st = Lux.setup(rng, rnn)

# Random input
inp = rand(Float32, 3)

# Forward pass with random input
output, st = Lux.apply(rnn, inp, ps, st)
```
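To run a cell over a whole sequence, the hidden state has to be carried between timesteps by hand. The sketch below is only illustrative: it assumes `AntisymmetricRNNCell` follows the same calling convention as Lux's built-in recurrent cells, where the first call returns an `(output, carry)` pair and later calls take an `(input, carry)` tuple; `run_sequence` is a hypothetical helper, not part of the library.

```julia
# Minimal sketch: thread the carry through the cell by hand.
# Assumes the cell follows Lux's recurrent cell convention:
# first call -> ((y, carry), st); later calls take (x, carry).
function run_sequence(cell, xs, ps, st)
    (y, carry), st = Lux.apply(cell, xs[1], ps, st)
    outputs = [y]
    for x in xs[2:end]
        (y, carry), st = Lux.apply(cell, (x, carry), ps, st)
        push!(outputs, y)
    end
    return outputs, st
end

# Hypothetical sequence of 10 timesteps, each with 3 features
sequence = [rand(rng, Float32, 3) for _ in 1:10]
outputs, st = run_sequence(rnn, sequence, ps, st)
```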
- RecurrentLayers.jl: an equivalent library, providing recurrent layers for Flux.jl.
- ReservoirComputing.jl: reservoir computing utilities for scientific machine learning; essentially, neural networks trained without gradient-based optimization.