Reinforcement Learning Memory Error
4 views (last 30 days)
When I turn on the SaveExperienceBufferWithAgent option, I get the following warning:
Warning: Unable to save the agent to the directory "savedAgents". Increase the
disk space or check SaveAgentCriteriaValue in training options.
> In rl.train/TrainingManager/saveAgentToDisk (line 653)
In rl.train/TrainingManager/updateDisplaysFromTrainingInfo (line 717)
In rl.train/TrainingManager/update (line 147)
In rl.train.TrainingManager>@(info)update(this,info) (line 437)
In rl.train/Trainer/notifyEpisodeFinishedAndCheckStopTrain (line 56)
In rl.train.SeriesTrainer>iUpdateEpisodeFinished (line 31)
In rl.train.SeriesTrainer>@(src,ed)iUpdateEpisodeFinished(this,ed) (line 17)
In rl.env/AbstractEnv/notifyEpisodeFinished (line 324)
In rl.env.SimulinkEnvWithAgent.executeSimsWrapper/nestedSimFinishedBC (line 222)
In rl.env.SimulinkEnvWithAgent>@(src,ed)nestedSimFinishedBC(ed) (line 232)
In Simulink/SimulationManager/handleSimulationOutputAvailable
In Simulink.SimulationManager>@(varargin)obj.handleSimulationOutputAvailable(varargin{:})
In MultiSim.internal/SimulationRunnerSerial/executeImplSingle
In MultiSim.internal/SimulationRunnerSerial/executeImpl
In Simulink/SimulationManager/executeSims
In Simulink/SimulationManagerEngine/executeSims
In rl.env/SimulinkEnvWithAgent/executeSimsWrapper (line 244)
In rl.env/SimulinkEnvWithAgent/simWrapper (line 267)
In rl.env/SimulinkEnvWithAgent/simWithPolicyImpl (line 424)
In rl.env/AbstractEnv/simWithPolicy (line 82)
In rl.task/SeriesTrainTask/runImpl (line 33)
In rl.task/Task/run (line 21)
In rl.task/TaskSpec/internal_run (line 166)
In rl.task/TaskSpec/runDirect (line 170)
In rl.task/TaskSpec/runScalarTask (line 194)
In rl.task/TaskSpec/run (line 69)
In rl.train/SeriesTrainer/run (line 24)
In rl.train/TrainingManager/train (line 421)
In rl.train/TrainingManager/run (line 211)
In rl.agent.AbstractAgent/train (line 78)
The disk I am running this on still has 100 GB of free space. What is causing this issue?
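For reference, here is a rough sketch of how the relevant options are set. The agent type, buffer length, and save criteria shown are placeholders for illustration, not my exact setup:

% Agent options: saving the experience buffer with the agent makes each
% saved agent file include the full replay buffer, which can be very large.
agentOpts = rlDDPGAgentOptions( ...
    'ExperienceBufferLength', 1e6, ...
    'SaveExperienceBufferWithAgent', true);

% Training options: agents meeting the criterion are written to the
% "savedAgents" directory named in the warning above.
trainOpts = rlTrainingOptions( ...
    'SaveAgentCriteria', 'EpisodeReward', ...
    'SaveAgentValue', 100, ...
    'SaveAgentDirectory', 'savedAgents');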
5 comments
Answers (0)