Multi-agent reinforcement learning for gain tuning in power electronics
2 Comments
Answer (1)
Hi @Muhammad,
Upon reviewing the code, the observation and action specifications are created with rlNumericSpec, which is a valid class for defining observation and action spaces. However, the error message indicates a mismatch in the expected input types, which usually comes from how observationInfo and actionInfo are passed to rlSimulinkEnv. For a multi-agent environment, the agent block paths, observation specs, and action specs must each be passed as cell arrays with one element per agent, and every element must be an rl.util.RLDataSpec object (rlNumericSpec is a subclass of rl.util.RLDataSpec, so this is satisfied as long as the specs are not wrapped or converted elsewhere). Here is a revised version of the code with additional checks:
% Load the Simulink model (model name without the .slx extension)
mdl = 'RL_two_agents';
open_system(mdl);

% I/O specifications for agent A
obsInfo1 = rlNumericSpec([3, 1]);
obsInfo1.Name = 'observations1';
obsInfo1.Description = 'a, b, c';
actInfo1 = rlNumericSpec([1, 1], 'LowerLimit', 1, 'UpperLimit', 20);
actInfo1.Name = 'gain_1';

% I/O specifications for agent B
obsInfo2 = rlNumericSpec([3, 1]);
obsInfo2.Name = 'observations2';
obsInfo2.Description = 'a2, b2, c2';
actInfo2 = rlNumericSpec([1, 1], 'LowerLimit', 0.01, 'UpperLimit', 1);
actInfo2.Name = 'gain_2';

% Combine observation and action specs into cell arrays (one element per agent)
observationInfo = {obsInfo1, obsInfo2};
actionInfo = {actInfo1, actInfo2};

% Create the multi-agent reinforcement learning environment
env = rlSimulinkEnv(mdl, ...
    {'RL_two_agents/RL Agent1', 'RL_two_agents/RL Agent2'}, ...
    observationInfo, actionInfo);

% Set the reset function
env.ResetFcn = @(in)localResetFcn(in);
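The code above assumes a localResetFcn is already defined on the path. If it is not, a minimal sketch of one might look like this; the variable name 'refValue' is only a placeholder, so replace it with whatever your model actually needs to reset at the start of each episode:

% Minimal reset-function sketch: receives and returns a Simulink.SimulationInput
function in = localResetFcn(in)
    % Example: randomize a reference value for each training episode
    % ('refValue' is a hypothetical workspace variable used by the model)
    in = setVariable(in, 'refValue', 10 + 5*rand);
end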
Make sure that the agent block paths are passed as a cell array to rlSimulinkEnv, and verify that observationInfo and actionInfo are cell arrays of rl.util.RLDataSpec objects.
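As a quick sanity check (rlNumericSpec inherits from rl.util.RLDataSpec), you can assert this before creating the environment; a minimal sketch:

% Optional check: every spec element should be an rl.util.RLDataSpec
assert(all(cellfun(@(s) isa(s, 'rl.util.RLDataSpec'), observationInfo)), ...
    'observationInfo must contain only rl.util.RLDataSpec objects');
assert(all(cellfun(@(s) isa(s, 'rl.util.RLDataSpec'), actionInfo)), ...
    'actionInfo must contain only rl.util.RLDataSpec objects');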
Hope this helps.
11 Comments
Hi @Muhammad,
This error typically arises when the Items property of a UI control (such as a dropdown or list box) is not set correctly. In MATLAB, Items must be either a 1-by-N cell array of character vectors or a string array, so make sure the data you assign to it is in one of these formats. For example:
Cell Array
app.DropDown.Items = {'Item1', 'Item2', 'Item3'};
String Array
app.DropDown.Items = ["Item1", "Item2", "Item3"];
Verify the source of the data being assigned to Items. If it is being generated dynamically (e.g., from a simulation), print it out before assignment to confirm its structure. If you are using a function to generate this list, ensure that it returns the correct format. Here is an example of how to correctly set up a dropdown in your App:
function startupFcn(app)
    % Example items for the dropdown
    items = {'Agent1', 'Agent2', 'Agent3'};
    app.DropDown.Items = items;   % Assign a cell array of character vectors
end
If you are setting Items within a callback function, make sure that it is being executed after all necessary variables are initialized and available.
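If the list is generated dynamically (for example from simulation results or numeric gain values), convert it to a string array or a cell array of character vectors before assignment. A small sketch with hypothetical values:

% Hypothetical: gains produced by a tuning run
gains = [1.5 3.2 7.8];

% String-array form
app.DropDown.Items = "Gain = " + string(gains);

% Equivalent cell-array form
app.DropDown.Items = cellfun(@(g) sprintf('Gain = %.2f', g), ...
    num2cell(gains), 'UniformOutput', false);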
Hope this helps.