How do I create multi-dimensional discrete actions for MATLAB reinforcement learning?
% Candidate values for each of the two action dimensions
action1_values = 0:1:40;
action2_values = 0:1:40;
% Every combination of the two dimensions (41 x 41 = 1681 pairs)
[action1, action2] = ndgrid(action1_values, action2_values);
discreteActions = [action1(:), action2(:)];
% One cell per action, each holding a 1x2 row vector
actionCellArray = num2cell(discreteActions, 2);
actInfo = rlFiniteSetSpec(actionCellArray);
The code above is what I created; it generates a cell array of 1681 elements. What I am unsure about is whether the agent can learn by adjusting the magnitude of the numbers, the way it would in a continuous action space.
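For comparison, here is a minimal sketch (not part of the original question; it assumes Reinforcement Learning Toolbox) of the same 2-D action range expressed as a continuous specification, which is what lets a continuous-action agent such as rlDDPGAgent learn by varying action magnitudes directly:

% Sketch only: continuous counterpart of the discrete action space above.
% rlNumericSpec gives the agent a real-valued 2x1 action bounded to [0, 40].
actInfo = rlNumericSpec([2 1], ...
    'LowerLimit', [0; 0], ...
    'UpperLimit', [40; 40]);

With rlFiniteSetSpec, by contrast, a value-based agent such as rlDQNAgent typically treats the 1681 entries as unordered categories, scoring each element separately rather than exploiting the numeric ordering of the values.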
2 Comments
Mr. Pavl M.
on 24 Oct 2024
Answer to the question:
Suppose the agent learns on the grid by taking steps, memorizing values in the vicinity of the current step, with the learning steps connected as a Markov chain.
What is the motivation, from real-world trials, for tending toward a continuous space?
In MPPL, NCE, TCE, and any computerized environment, everything is numeric and discrete. A continuous space can be approximated by a discrete one as the sampling step (grid_res) approaches 0; the continuous-to-discrete approximation error depends on the agent's learning process and learning environment, which is why I added the for loop below. Continuous values can also be simulated with symbolic math.
What are the objective and fitness function of the desired learning, formulated in detail symbolically or numerically?
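As a hypothetical illustration of the symbolic-math remark (my example, assuming the Symbolic Math Toolbox; the quadratic fitness function is made up for demonstration):

% Hypothetical example: a continuous action represented symbolically, with
% a made-up quadratic fitness function optimized in closed form.
syms a1 a2 real
fitness = -(a1 - 20)^2 - (a2 - 20)^2;       % example objective, peak at (20, 20)
bestA1 = solve(diff(fitness, a1) == 0, a1)  % returns 20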
Have you tried the following:
clc
clear
close all
Grid_dim = 40;
start = 0;
% Sweep the grid resolution from coarse to fine. Note: the action set has
% numel(start:grid_res:Grid_dim)^2 elements, so very small grid_res values
% (the original 1e-7 lower bound) would exhaust memory; keep the sweep coarse.
for grid_res = 0.5:-0.1:0.1
    % Action space:
    action1_values = start:grid_res:Grid_dim;
    action2_values = start:grid_res:Grid_dim;
    [action1, action2] = ndgrid(action1_values, action2_values);
    discreteActions = [action1(:), action2(:)];
    actionCellArray = num2cell(discreteActions, 2);
    actInfo = rlFiniteSetSpec(actionCellArray);
    % Learning environment space:
    % Agent learning logic code:
    % Fitness function:
end
% ...
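One practical caveat worth checking (my addition, not from the original comment): the size of the discrete action set grows quadratically with the resolution, which is why very fine grid_res values are infeasible:

% Element count of the discrete action set for a given resolution.
grid_res = 0.1;
nActions = numel(0:grid_res:40)^2   % 401^2 = 160801 actions

A value-based agent must score every one of these actions at each step, so resolutions much finer than this quickly become impractical.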
Answers (0)