How can I use RL Agent with PX4 Host Target?
Hi, I have a question related to Pixhawk and Reinforcement Learning Toolbox. I want to use PX4 Host Target to run the RL training algorithm before deploying the trained RL agent to the real Pixhawk board. But when I start training, MATLAB shows the following warnings. How can I use an RL Agent and PX4 Host Target together?
-----------------------------------------------------------------------------------------------------------------------
Warning: The px4.internal.block.Subscriber System object has private or protected properties, but does not implement both
the saveObjectImpl and loadObjectImpl methods. The save, load, and clone methods may not copy the object exactly when it is
locked.
Warning: The px4.internal.block.Subscriber System object has private or protected properties, but does not implement both
the saveObjectImpl and loadObjectImpl methods. The save, load, and clone methods may not copy the object exactly when it is
locked.
Warning: The px4.internal.block.PWM System object has private or protected properties, but does not implement both the
saveObjectImpl and loadObjectImpl methods. The save, load, and clone methods may not copy the object exactly when it is
locked.
Warning: The px4.internal.block.Subscriber System object has private or protected properties, but does not implement both
the saveObjectImpl and loadObjectImpl methods. The save, load, and clone methods may not copy the object exactly when it is
locked.
Warning: The px4.internal.block.Publisher System object has private or protected properties, but does not implement both the
saveObjectImpl and loadObjectImpl methods. The save, load, and clone methods may not copy the object exactly when it is
locked.
-----------------------------------------------------------------------------------------------------------------------
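For context on what the warnings mean: they are a generic MATLAB System object diagnostic, not PX4-specific. A System object with private or protected properties should implement both `saveObjectImpl` and `loadObjectImpl` so that `save`, `load`, and `clone` preserve its internal state while locked. A minimal, hypothetical sketch (the class name and `Topic` property are illustrative, not the actual `px4.internal.block` implementation):

```matlab
classdef MySubscriber < matlab.System
    % Hypothetical System object illustrating the save/load pattern
    % the warning refers to; not the px4.internal.block source.
    properties (Access = private)
        Topic  % private state that must survive save/load/clone
    end
    methods (Access = protected)
        function s = saveObjectImpl(obj)
            % Save the base-class (public) state, then append private state
            s = saveObjectImpl@matlab.System(obj);
            s.Topic = obj.Topic;
        end
        function loadObjectImpl(obj, s, wasLocked)
            % Restore private state, then load the base-class state
            obj.Topic = s.Topic;
            loadObjectImpl@matlab.System(obj, s, wasLocked);
        end
    end
end
```

Since these are shipped MathWorks blocks, the warnings are typically harmless noise rather than the cause of a training failure.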
5 comments
Ankur Bose
2022-5-20
I don't think these warnings are responsible for the RL issue you are facing. I suggest reaching out to MathWorks Tech Support: https://www.mathworks.com/support/contact_us.html
Answer (1)
Ankur Bose
2023-1-24
Manually closing this question, as the user has been recommended to reach out to MathWorks Tech Support.
0 comments