ExperienceBuffer has Length 0 when I load a saved agent and continue reinforcement learning training
Hi all,
I'm trying to continue training a saved agent. In the training options of this saved agent, SaveExperienceBufferWithAgent is set to true. But when I load the saved_agent and inspect the ExperienceBuffer property, its Length is 0. I looked for this property in the documentation, but there is no information on it. If I stop a training run and directly check the Length property of the agent in the workspace, it has a nonzero value.
My question is: what does this Length mean? If it is 0 when I continue training with a saved agent, as described in https://de.mathworks.com/matlabcentral/answers/495436-how-to-train-further-a-previously-trained-agent?s_tid=answers_rc1-2_p2_MLT , does training really continue with the saved agent and the saved experience buffer?
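For reference, this is roughly how I check it (a minimal sketch; the file name and the saved_agent variable name are from my own setup and may differ):

% Load an agent saved during training and inspect its experience buffer.
data  = load('savedAgents/Agent100.mat');  % file written by train() when SaveAgentCriteria is met
agent = data.saved_agent;                  % variable stored in the saved-agent file
disp(agent.AgentOptions.SaveExperienceBufferWithAgent)  % shows true in my case
disp(agent.ExperienceBuffer.Length)        % shows 0, i.e. the buffer came back empty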

Yours
Accepted Answer
Takeshi Takahashi
2021-4-20
Length 0 means there isn't any experience in the buffer. I think the experience buffer was not saved because of this bug. Please set agent.AgentOptions.SaveExperienceBufferWithAgent = true immediately before saving the agent.
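A sketch of that workaround and of resuming training with the restored buffer (env, trainOpts, and the file names are placeholders, not from the original post):

% Set the option right before saving so the buffer is stored with the agent.
agent.AgentOptions.SaveExperienceBufferWithAgent = true;
save('trainedAgent.mat', 'agent');

% Later: reload and continue training without clearing the restored buffer.
data  = load('trainedAgent.mat');
agent = data.agent;
agent.AgentOptions.ResetExperienceBufferBeforeTraining = false;
trainingStats = train(agent, env, trainOpts);  % resumes with the saved experiences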
2 Comments
Dmitriy Ogureckiy
2023-1-12
Can I ask: are the network weights saved when the agent is saved between simulations?
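One way to check this yourself (a sketch, assuming an agent with an actor representation; the file name is a placeholder) is to compare the learnable parameters before saving and after loading:

paramsBefore = getLearnableParameters(getActor(agent));
save('checkpoint.mat', 'agent');
loaded      = load('checkpoint.mat');
paramsAfter = getLearnableParameters(getActor(loaded.agent));
isequal(paramsBefore, paramsAfter)   % true if the weights were saved with the agent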