Environment for Reinforcement Learning Project

6 views (last 30 days)
Hi everyone!
I'm currently looking to work on a small reinforcement learning project. Friends have recommended OpenAI Gym (https://gym.openai.com/envs/#classic_control), which provides many classical and non-classical control environments to which one can apply reinforcement learning rules. However, these are Python-based. Being a MATLAB user myself, I was wondering whether anyone knows of something like OpenAI Gym where I can download an environment (I'm interested in the Lunar Lander env, but it's not a strong preference) and apply RL rules easily.
I'd appreciate any tips!

Accepted Answer

Emmanouil Tzorakoleftherakis
Edited: Emmanouil Tzorakoleftherakis 2020-7-21
Hello,
We are working on providing an interface between OpenAI Gym and Reinforcement Learning Toolbox, but this will take some more time. In the meantime, you could use community posts like this one to get an idea of how this could be accomplished. I have not personally tried the code in the link above, but it seems to be along the lines of what you are looking for.
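Very roughly, such a bridge is a subclass of rl.env.MATLABEnvironment whose step and reset methods call into the Python environment through MATLAB's Python interface. Below is a minimal, untested sketch along those lines, assuming gym is installed in the Python distribution that pyenv points to, and using the discrete CartPole environment purely as an example; the class name, spec sizes, and conversions are illustrative and not taken from the linked post.

classdef GymCartPoleEnv < rl.env.MATLABEnvironment
    properties
        open_env  % handle to the Python gym environment
    end
    methods
        function this = GymCartPoleEnv()
            % Observation: [x, xdot, theta, thetadot]; action: 0 or 1 (discrete)
            obsInfo = rlNumericSpec([4 1]);
            actInfo = rlFiniteSetSpec([0 1]);
            this = this@rl.env.MATLABEnvironment(obsInfo, actInfo);
            this.open_env = py.gym.make('CartPole-v1');
        end
        function [Observation, Reward, IsDone, LoggedSignals] = step(this, Action)
            % Older gym API: step returns the tuple (obs, reward, done, info)
            result = this.open_env.step(int16(Action));
            Observation   = double(result{1})';   % numpy-to-MATLAB conversion may differ by release
            Reward        = double(result{2});
            IsDone        = logical(result{3});
            LoggedSignals = [];
        end
        function InitialObservation = reset(this)
            InitialObservation = double(this.open_env.reset())';
        end
    end
end

Once something like this works, env = GymCartPoleEnv(); can be passed to train like any other Reinforcement Learning Toolbox environment.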
Hope that helps.
  2 Comments
John Adams 2021-11-29
Hi Emmanouil,
When will this interface be ready?
I am currently trying to interface using the link you posted above, and it works fine for discrete action problems, as in the example in the link, using "this.open_env.step(int16(Action));" for the discrete cart pole problem. However, for the continuous cart problem I get the following error when calling the step function [this.open_env.step(double(Action));]:
Python Error: TypeError: 'float' object is not subscriptable
How can this problem be avoided?
Thx!
Alberto Tellaeche 2023-2-20
The same problem here... when actions are continuous, the "object is not subscriptable" problem appears; no matter whether you use a 'float' or cast the data to 'single', the error remains the same.
Thank you,
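A hedged guess at the cause, based only on the error message rather than on the linked code: for continuous (Box) action spaces, Gym environments typically index into the action (e.g. action[0]), and a bare MATLAB double scalar arrives on the Python side as a plain float, which cannot be indexed. Wrapping the scalar in a one-element Python list (or a 1-D numpy array) before calling step may avoid the error, depending on the environment's input checks; the variable names below mirror the snippets above and are otherwise illustrative.

action_py = py.list({double(Action)});              % Python side receives [action]
result = this.open_env.step(action_py);             % the env can now evaluate action[0]
% or, as a 1-D numpy array:
% result = this.open_env.step(py.numpy.array(action_py));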


More Answers (0)
