Load a pretrained neural network object in rlNeuralNetworkEnvironment

Hi,
I want to train an RL MBPO agent that samples from a model. The model is a deep learning network object already trained in MATLAB. I am wondering how I can load its weights inside the environment object. The examples for rlNeuralNetworkEnvironment show how to define a network structure, but how can I use my pretrained weights with it?
Best Regards,
Vasu

Answers (1)

Emmanouil Tzorakoleftherakis
Hi Vasu,
You can use a pretrained environment model with an MBPO agent as follows (a short sketch follows the steps):
1) Create an rlContinuousDeterministicTransitionFunction with the trained dlnetwork if it is deterministic, or an rlContinuousGaussianTransitionFunction if it is stochastic (with mean and standard-deviation output heads).
2) Create an rlNeuralNetworkEnvironment using the transition function defined in step 1.
3) Create the MBPO agent.
4) Set LearnRate = 0 in the TransitionOptimizerOptions of rlMBPOAgentOptions to avoid updating the model during training.
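For concreteness, here is a minimal sketch of those four steps for a deterministic model. The names trainedNet, realEnv, myRewardFcn, and myIsDoneFcn, as well as the network input/output layer names, are placeholders you would replace with your own; the SAC base agent is just one example of an off-policy agent you could use.

% Observation and action specs, taken here from your real environment
obsInfo = getObservationInfo(realEnv);
actInfo = getActionInfo(realEnv);

% 1) Wrap the pretrained dlnetwork in a deterministic transition function.
%    Use rlContinuousGaussianTransitionFunction instead if your network
%    outputs a mean and a standard deviation.
transitionFcn = rlContinuousDeterministicTransitionFunction( ...
    trainedNet, obsInfo, actInfo, ...
    ObservationInputNames="state", ...
    ActionInputNames="action", ...
    NextObservationOutputNames="nextObservation");

% 2) Build the neural network environment from the transition function,
%    plus reward and is-done function handles
env = rlNeuralNetworkEnvironment(obsInfo, actInfo, transitionFcn, ...
    @myRewardFcn, @myIsDoneFcn);

% 3) Create the MBPO agent around an off-policy base agent (e.g., SAC)
baseAgent = rlSACAgent(obsInfo, actInfo);

% 4) Freeze the pretrained transition model during training by setting
%    the transition optimizer learning rate to zero
mbpoOpts = rlMBPOAgentOptions;
mbpoOpts.TransitionOptimizerOptions = rlOptimizerOptions(LearnRate=0);

agent = rlMBPOAgent(baseAgent, env, mbpoOpts);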
Hope this helps
