Implementing RNN Structure #195
Comments
For example, you can use an LSTM to encode data into the state.
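A minimal sketch of what "use an LSTM to encode data into the state" could look like in plain PyTorch (the feature count, window length, and hidden size below are illustrative assumptions, not values from FinRL itself):

```python
import torch
import torch.nn as nn

class LSTMStateEncoder(nn.Module):
    """Encode a window of market observations into a fixed-size state vector.

    Hypothetical sketch: n_features and hidden_size are illustrative,
    not taken from FinRL's actual environment.
    """

    def __init__(self, n_features: int = 5, hidden_size: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_features,
                            hidden_size=hidden_size,
                            batch_first=True)

    def forward(self, obs_window: torch.Tensor) -> torch.Tensor:
        # obs_window: (batch, window_length, n_features)
        _, (h_n, _) = self.lstm(obs_window)
        # Use the final hidden state as the encoded "state"
        return h_n[-1]  # (batch, hidden_size)

encoder = LSTMStateEncoder()
window = torch.randn(2, 30, 5)  # 2 samples, 30 timesteps, 5 features each
state = encoder(window)
print(state.shape)              # torch.Size([2, 64])
```

The encoded vector can then be handed to the agent as (part of) its observation, so the RL algorithm itself stays feed-forward while the encoder captures temporal structure.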
I was going to ask the same question, but thankfully came across this and another discussion. I know that stable-baselines (v1) supported recurrent (LSTM) policies. It also seems that implementing LSTM in stable-baselines3 is non-trivial, according to this thread and this thread. So I was wondering what options I am left with. Could you please share an article link explaining what exactly you meant by "LSTM to encode data into the state" in your last reply? Also, I know FinRL is built atop stable-baselines3, which targets PyTorch, whereas stable-baselines (v1) targets TensorFlow. So I was wondering about the interoperability of FinRL with stable-baselines (v1), since it supports recurrent policies.
We previously spent less than 30 minutes switching from the TensorFlow-based stable-baselines (v1) to the PyTorch-based stable-baselines3, so it should not take you long to switch back.
Hello, I have the same problem. I also want to build an LSTM feature extractor on SB3 for financial time-series data, using the MlpPolicy policy network; the feature extractor itself would be an LSTM network. After reading the discussion above, I want to ask whether FinRL now has an LSTM feature extractor. Thank you for your previous exploration.
Apologies in advance for the naivety... I wanted to ask about the possibility of applying recurrent networks within the framework (e.g., using an LSTM during training).
If this is possible, where and how would it be implemented?
Thanks in advance for any help.