The repository contains a collection of artificial intelligence / machine learning algorithms implemented in Fortran. Minimal sketches of the recurring building blocks (distance metrics, the Adam optimiser, attention, and the activation functions) follow the table.
File | Description |
---|---|
knn_8.f90 | k-nearest neighbours regression algorithm employing the Euclidean distance |
knn_8_manhattan.f90 | k-nearest neighbours regression algorithm employing the Manhattan distance |
knn_evolutionary.f90 | Evolutionary algorithm finding the optimal k value for the k-nearest neighbours method |
recurrent_network.f90 | Recurrent neural network employing the Adam gradient descent optimisation algorithm |
rnn_att_3.f90 | Recurrent neural network employing the Adam gradient descent optimisation algorithm and element-wise dot-product attention |
rnn_mh_att_3.f90 | Recurrent neural network employing the Adam gradient descent optimisation algorithm and multi-head dot-product attention |
rnn_add_att_3.f90 | Recurrent neural network employing the Adam gradient descent optimisation algorithm and element-wise additive attention |
rnn_mh_sc_att_3.f90 | Recurrent neural network employing the Adam gradient descent optimisation algorithm and scaled multi-head dot-product attention |
linear_3.f90 | 3-neuron linear network employing the Adam gradient descent optimisation algorithm |
linear_3_t_test.f90 | 3-neuron linear network employing the Adam gradient descent optimisation algorithm, with a t-test |
linear_3_nadam.f90 | 3-neuron linear network employing the Nadam gradient descent optimisation algorithm |
linear_8_att.f90 | 8-neuron linear network employing the Adam gradient descent optimisation algorithm and element-wise dot-product attention |
linear_8_mh_att.f90 | 8-neuron linear network employing the Adam gradient descent optimisation algorithm and multi-head attention |
sompr_5.f90 | Second order multivariate polynomial regression employing the Adam gradient descent optimisation algorithm |
gelu_att.f90 | GELU activation function, Adam gradient descent optimisation algorithm, element-wise dot-product attention |
mish_att.f90 | Mish activation function, Adam gradient descent optimisation algorithm, element-wise dot-product attention |
smish_att.f90 | Smish activation function, Adam gradient descent optimisation algorithm, element-wise dot-product attention |
gms_att.f90 | GELU + Mish + Smish activation functions, Adam gradient descent optimisation algorithm, element-wise dot-product attention |
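The two k-NN regression programs differ only in their distance metric. Below is a minimal sketch of both metrics on single-precision feature vectors; the program and function names are illustrative and not taken from the repository sources.

```fortran
! distance_sketch.f90 -- illustrative, not a file from this repository.
program distance_sketch
  implicit none
  integer, parameter :: n = 3
  real :: a(n) = [1.0, 2.0, 3.0]
  real :: b(n) = [4.0, 6.0, 3.0]

  print *, 'Euclidean distance:', euclidean(a, b)
  print *, 'Manhattan distance:', manhattan(a, b)

contains

  ! Euclidean (L2) distance: square root of the sum of squared differences.
  pure function euclidean(x, y) result(d)
    real, intent(in) :: x(:), y(:)
    real :: d
    d = sqrt(sum((x - y)**2))
  end function euclidean

  ! Manhattan (L1) distance: sum of absolute differences.
  pure function manhattan(x, y) result(d)
    real, intent(in) :: x(:), y(:)
    real :: d
    d = sum(abs(x - y))
  end function manhattan

end program distance_sketch
```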
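Most of the networks listed above share the Adam optimiser. The following is a generic single-parameter Adam loop with the standard bias-corrected moment estimates; the hyperparameter values and the toy objective are assumptions for illustration, and this is not code from the repository. Nadam (linear_3_nadam.f90) differs only in applying a Nesterov-style look-ahead to the first-moment term.

```fortran
! adam_sketch.f90 -- illustrative, not a file from this repository.
program adam_sketch
  implicit none
  real :: theta = 0.0                ! parameter being optimised
  real :: m = 0.0, v = 0.0           ! first and second moment estimates
  real :: g, mhat, vhat              ! gradient and bias-corrected moments
  integer :: t
  real, parameter :: alpha = 0.01, beta1 = 0.9, beta2 = 0.999, eps = 1.0e-8

  ! Toy objective: minimise f(theta) = (theta - 5)**2, gradient 2*(theta - 5).
  do t = 1, 2000
     g = 2.0 * (theta - 5.0)
     m = beta1 * m + (1.0 - beta1) * g        ! biased first moment
     v = beta2 * v + (1.0 - beta2) * g * g    ! biased second moment
     mhat = m / (1.0 - beta1**t)              ! bias correction
     vhat = v / (1.0 - beta2**t)
     theta = theta - alpha * mhat / (sqrt(vhat) + eps)
  end do
  print *, 'theta after Adam:', theta         ! approaches 5.0
end program adam_sketch
```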
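Several programs combine the optimiser with element-wise dot-product attention, optionally scaled. A toy sketch of one scaled element-wise attention step follows, under the assumption that "element-wise" means the scores are formed by element-wise products of query and key vectors; the names, dimensions, and values are illustrative only.

```fortran
! attention_sketch.f90 -- illustrative, not a file from this repository.
program attention_sketch
  implicit none
  integer, parameter :: n = 4
  real :: q(n) = [0.1, 0.4, 0.2, 0.3]   ! query
  real :: k(n) = [0.3, 0.1, 0.5, 0.1]   ! key
  real :: v(n) = [1.0, 2.0, 3.0, 4.0]   ! value
  real :: scores(n), weights(n), context(n)

  ! Element-wise scores, scaled by sqrt(n) as in scaled dot-product attention.
  scores = q * k / sqrt(real(n))
  weights = softmax(scores)
  context = weights * v                 ! attention-weighted values
  print *, 'attention weights:', weights
  print *, 'context vector:  ', context

contains

  ! Numerically stable softmax: subtract the maximum before exponentiating.
  pure function softmax(x) result(s)
    real, intent(in) :: x(:)
    real :: s(size(x))
    s = exp(x - maxval(x))
    s = s / sum(s)
  end function softmax

end program attention_sketch
```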
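The last four programs swap in different activation functions. The standalone sketch below gives reference formulas for GELU (the exact form, via the erf intrinsic), Mish, and Smish; the program itself is illustrative and not part of the repository.

```fortran
! activations_sketch.f90 -- illustrative, not a file from this repository.
program activations_sketch
  implicit none
  real :: x
  integer :: i

  do i = -2, 2
     x = real(i)
     print '(a,f5.1,3(a,f8.4))', 'x =', x, '  gelu =', gelu(x), &
          '  mish =', mish(x), '  smish =', smish(x)
  end do

contains

  ! GELU: x * Phi(x), with Phi the standard normal CDF.
  pure function gelu(x) result(y)
    real, intent(in) :: x
    real :: y
    y = 0.5 * x * (1.0 + erf(x / sqrt(2.0)))
  end function gelu

  ! Mish: x * tanh(softplus(x)), where softplus(x) = log(1 + exp(x)).
  pure function mish(x) result(y)
    real, intent(in) :: x
    real :: y
    y = x * tanh(log(1.0 + exp(x)))
  end function mish

  ! Smish: x * tanh(log(1 + sigmoid(x))).
  pure function smish(x) result(y)
    real, intent(in) :: x
    real :: y
    y = x * tanh(log(1.0 + 1.0 / (1.0 + exp(-x))))
  end function smish

end program activations_sketch
```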