Putting symbolic array into matrix

3 views (last 30 days)
Hello,
I am trying to take a symbolic variable, subtract it from every value in an array (of some 1×n size), and then solve for this unknown variable later, but I can't seem to get the symbolic array to nest correctly.
Here is an example of what I am trying to do:
syms q0
Vy=[300 500 300]
q(1,:)=[q0-(Vy.*2.*4./8)]
After this, q(i,:) will be assigned inside a for loop that uses this symbolic variable.
I keep getting an error, and I don't know how to modify this so that I can still solve for q0 later on.
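For reference, a minimal sketch of the intended workflow, with a placeholder equation (the row sum set to zero) purely for illustration; the real equation would come from the rest of the code:
syms q0
Vy = [300 500 300];
q(1,:) = q0 - Vy.*2.*4./8;         % symbolic row: [q0 - 300, q0 - 500, q0 - 300]
sol = solve(sum(q(1,:)) == 0, q0)  % later solve for q0; here sol = 1100/3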
Thanks for the help!!!

Accepted Answer

Walter Roberson, 2022-4-29
That code works in the form you posted.
syms q0
Vy=[300 500 300]
Vy = 1×3
   300   500   300
q(1,:)=[q0-(Vy.*2.*4./8)]
q =
[q0 - 300, q0 - 500, q0 - 300]
The most common mistake for something like that would have been to initialize q using zeros(), such as
q = zeros(5,3);
If you had initialized q to double precision, then assigning into q(1,:) would fail, because the value you are assigning contains a symbolic variable and a double array cannot store it.
The cure for that would be to use something like
q = zeros(5, 3, 'sym');
for new enough versions of MATLAB, or
q = sym(zeros(5,3));
if your MATLAB is older than that.
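To make that concrete, here is a quick sketch (the 5×3 size and the loop body are illustrative, not taken from the question) of the failing pattern and the symbolic preallocation that fixes it:
syms q0
Vy = [300 500 300];
% q = zeros(5,3);              % double preallocation: the next assignment would then error,
% q(1,:) = q0 - Vy.*2.*4./8;   % because a double array cannot store a sym value
q = zeros(5, 3, 'sym');        % symbolic preallocation (use q = sym(zeros(5,3)); on older releases)
for i = 1:5
    q(i,:) = i*q0 - Vy.*2.*4./8;   % each row stays symbolic in q0
end
q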

More Answers (0)
