
add gradient_descent algorithm #580

Merged · 3 commits · Oct 16, 2023

Conversation

@Navaneeth-Sharma (Contributor) commented Oct 16, 2023

Description

This contribution adds a Rust function implementing gradient descent optimization. Gradient descent is an iterative algorithm for minimizing a mathematical function. The function is designed to work with both univariate (single value) and multivariate (multiple values) objectives, with inputs represented as vectors of f64 values.

The function repeatedly updates the values in the x vector in the direction that decreases the function's value. The update rule for each value in x is: x_{k+1} = x_k - learning_rate * gradient_of_function(x_k).

To validate the optimization process, the code includes test cases that check whether the final optimized vector matches the expected result within a specified tolerance, confirming that the optimizer behaves correctly across different objective functions. Covers #578
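The update rule above can be sketched as follows. This is a minimal illustration, not the exact code merged in this PR; the function name, signature, and the example objective (f(x, y) = x² + y², whose gradient is (2x, 2y)) are assumptions for demonstration.

```rust
// Hypothetical sketch of the gradient descent update described above;
// the merged implementation's signature may differ.
fn gradient_descent(
    derivative: impl Fn(&[f64]) -> Vec<f64>,
    x: &mut Vec<f64>,
    learning_rate: f64,
    num_iterations: usize,
) {
    for _ in 0..num_iterations {
        let grad = derivative(x);
        for (xi, gi) in x.iter_mut().zip(grad.iter()) {
            // x_{k+1} = x_k - learning_rate * gradient_of_function(x_k)
            *xi -= learning_rate * gi;
        }
    }
}

fn main() {
    // Minimize f(x, y) = x^2 + y^2; its gradient is (2x, 2y),
    // so the minimum is at (0, 0).
    let derivative = |x: &[f64]| x.iter().map(|xi| 2.0 * xi).collect::<Vec<f64>>();
    let mut x = vec![5.0, -3.0];
    gradient_descent(derivative, &mut x, 0.1, 100);
    // Each step scales x by (1 - 0.1 * 2) = 0.8, so x converges toward zero.
    assert!(x.iter().all(|xi| xi.abs() < 1e-6));
    println!("{:?}", x);
}
```

With a fixed learning rate, convergence depends on the curvature of the objective; a rate that is too large makes the iterates diverge, which is why the tests check the result only within a tolerance.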

@siriak (Member) left a comment


Looks good, thanks!

@siriak siriak merged commit ecafde6 into TheAlgorithms:master Oct 16, 2023
4 checks passed
2 participants