"Brace indexing is not supported for variables of this type" when training multiple agents

I am training multiple agents with a custom environment. When I try to run training, I get the following error:
`Error using rl.internal.train.OnlineMultiAgentSeriesTrainer/run_ (line 97)
Brace indexing is not supported for variables of this type.`
maxepisodes = 200;
maxsteps = 150;
trainingOptions = rlMultiAgentTrainingOptions(...
'MaxEpisodes',maxepisodes,...
'MaxStepsPerEpisode',maxsteps,...
'StopOnError','on',...
'Verbose',false,...
'Plots','training-progress',...
'StopTrainingCriteria','AverageReward',...
'StopTrainingValue',Inf,...
'ScoreAveragingWindowLength',10, ...
SaveAgentCriteria="AverageReward",SaveAgentValue=0.001 ...
);
%% Train agent
trainingStats = train([agent, agent1],env,trainingOptions);
My environment is a class defined as:
`classdef intiail < rl.env.MultiAgentEnvironment`

Accepted Answer

Namnendra 2024-4-8
Hi Nasim,
The error you're encountering, `Brace indexing is not supported for variables of this type`, typically occurs when MATLAB expects an object or structure that supports brace indexing `{}`, but instead encounters a variable type that does not, such as a custom class or an unexpected data type. This can happen in various contexts, but in your case, it seems related to how the agents or the environment are being passed or handled within the training routine.
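As a minimal illustration of the error itself (using hypothetical agent variables `agentA` and `agentB`): brace indexing works only on cell arrays and a few container types, so applying `{}` to an ordinary object array reproduces exactly this message.

```matlab
% Object array: concatenation with [] supports only parentheses indexing.
agents = [agentA, agentB];
% agents{1}   % Error: Brace indexing is not supported for variables of this type.

% Cell array: brace indexing is valid.
agents = {agentA, agentB};
first = agents{1};   % OK
```

If some internal training code expects a cell array (one cell per agent) but receives a plain array or object instead, this is the error it raises.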
Given the information and the code snippet you've provided, let's go through some troubleshooting steps and considerations that might help resolve this issue:
1. Verify Agent and Environment Compatibility
Ensure that both your agents (`agent`, `agent1`) and your custom environment (`env`) are correctly implemented and compatible with the MATLAB Reinforcement Learning Toolbox's requirements for multi-agent training.
- Agents: Verify that both `agent` and `agent1` are instances of agents that are supported for multi-agent training (e.g., `rlDQNAgent`, `rlPGAgent`, etc.). Each agent should be properly configured with its observation and action specifications matching those defined in your environment.
- Environment: Your custom environment class `intiail` (assuming there's a typo and it should be `initial`) must correctly implement all the required methods (`reset`, `step`, `getObservationInfo`, `getActionInfo`) and properties as expected by the `rl.env.MultiAgentEnvironment` abstract class. Make sure that the `step` method returns observations, rewards, isDone flags, and any info as cell arrays, one cell per agent, if that's how your environment is supposed to work.
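As a sketch of what a two-agent `step` method might look like (all variable names here are illustrative, and the exact output shapes — cell array vs. numeric vector for rewards — depend on your release, so check the `rl.env.MultiAgentEnvironment` documentation for your version):

```matlab
function [nextObs, reward, isDone, info] = step(this, action)
    % For multiple agents, actions typically arrive as a cell array,
    % one cell per agent.
    a1 = action{1};
    a2 = action{2};

    % ... update the environment state using a1 and a2 ...

    nextObs = {obs1, obs2};     % one observation cell per agent
    reward  = [r1, r2];         % per-agent rewards (may need to be a
                                % cell array in some releases)
    isDone  = episodeFinished;  % shared termination flag
    info    = [];
end
```

Returning a plain array where a cell array is expected (or vice versa) is a common source of the brace-indexing error during training.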
2. Check Environment and Agent Initialization
Before training, ensure that the environment and agents are initialized correctly and can interact with each other as expected. You can do this by manually calling the `step` and `reset` methods of your environment with appropriate actions for the agents and checking the outputs.
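A quick manual sanity check along these lines (assuming two agents and numeric action specs; the exact structure of `getActionInfo`'s output for a multi-agent environment may differ in your release):

```matlab
% Exercise reset/step by hand before calling train.
obs = reset(env);
assert(iscell(obs) && numel(obs) == 2, ...
    'reset should return one observation cell per agent');

% Build one in-range action per agent from the action specifications.
actInfo = getActionInfo(env);
act = {actInfo{1}.UpperLimit, actInfo{2}.UpperLimit};

[nextObs, reward, isDone] = step(env, act);
disp(nextObs); disp(reward); disp(isDone);
```

If this manual loop already throws the brace-indexing error, the problem is confirmed to be inside your environment's `reset`/`step` rather than in the training routine.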
3. Debugging the Custom Environment
Given the error message, there's a possibility that the issue lies in how your custom environment handles indexing or returns its outputs. Specifically, MATLAB expects certain outputs to be in cell arrays or structures that support brace indexing when dealing with multiple agents.
- Step and Reset Methods: Double-check the `step` and `reset` methods of your custom environment. Ensure that for multiple agents, you're returning the observations, rewards, and done flags in cell arrays, where each cell corresponds to an agent.
4. Simplify to Identify the Issue
If possible, simplify your setup to isolate the issue. Start by training with only one agent in the environment to see if the problem persists. If it works with a single agent, the issue likely lies in how multiple agents are managed or how their outputs are handled.
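A sketch of that isolation test (here `singleEnv` is a hypothetical single-agent version of your environment that you would need to create):

```matlab
% Train one agent with standard single-agent options; if this works,
% the multi-agent plumbing (cell-array handling) is the likely culprit.
singleOpts = rlTrainingOptions( ...
    'MaxEpisodes', 50, ...
    'MaxStepsPerEpisode', maxsteps, ...
    'Verbose', false, ...
    'Plots', 'training-progress');

stats = train(agent, singleEnv, singleOpts);
```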
The error suggests a mismatch in expected data types or structures, particularly related to how multiple agents' data is indexed or returned. Carefully review your custom environment's implementation, especially the output formats of key methods, and ensure compatibility with MATLAB's expectations for multi-agent setups.
I hope the above steps resolve the issue.
Thank you.
