Get ready to roll up your sleeves at MATLAB EXPO 2025 – our global online event is back, and this year we’re offering 10 hands-on workshops designed to spark innovation and deepen your skills with MATLAB Online and Simulink Online.
Whether you're exploring AI, modeling batteries, or building carbon trackers, these live workshops are your chance to:
  • Work directly in MATLAB and Simulink Online
  • Solve real-world challenges with guidance from MathWorks experts
  • Connect with peers across industries
  • Ask questions and get live feedback
Join the Experience to learn more about each workshop below!
Which workshop are you most excited to attend?!
Day 1:
  • Beyond the Labels: Leveraging AI Techniques for Enlightened Product Choices
  • A Hands-On Introduction to Reinforcement Learning with MATLAB and Simulink
  • Curriculum Development with MATLAB Copilot and Generative AI
  • Simscape Battery Workshop
  • Generating Tests for your MATLAB code
Day 2:
  • Hands-On AI for Smart Appliances: From Sensor Data to Embedded Code
  • A Hands-On Introduction to Reduced Order Modeling with MATLAB and Simulink
  • Introduction to Research Software and Development with Simulink
  • Hack Your Carbon Impact: Build and Publish an Emissions Tracker with MATLAB
  • How to Simulate Scalable Cellular and Connectivity Networks: A Hands-On Session
We look forward to Accelerating the Pace of Engineering and Science together!
It’s an honor to deliver the keynote at MATLAB EXPO 2025. I'll explore how AI changes the game in engineered systems, bringing intelligence to every step of the process from design to deployment. This short video captures a glimpse of what I’ll share:
What excites or challenges you about this shift? Drop a comment or start a thread!
I'm working on training neural networks without backpropagation / automatic differentiation, using locally derived analytic forms of the update rules. Because a direct formula can be derived for each update rule, this removes a lot of the overhead that automatic differentiation otherwise requires.
However, MATLAB's neural network functionality is currently built entirely around backpropagation and automatic differentiation, e.g., the dlgradient function and the requirement that everything be a dlarray during training.
I have two main requests, specifically for the functions that perform a single operation within a single layer of a neural network, such as "dlconv", "fullyconnect", "maxpool", "avgpool", "relu", etc.:
  • These functions should also accept plain gpuArray data instead of requiring everything to be dlarrays.
  • These functions currently perform only the forward pass. I request that they also be able to perform the backward pass on request, e.g., via an additional input flag that can be "forward" (default) or "backward", with the function then taking all the inputs needed for that operation (the "avgpool" forward pass needs only the input data and the pooling parameters, but the "avgpool" backward pass needs the derivative w.r.t. the avgpool output, the pooling parameters, and the original data dimensions). I know there is a maxunpool function that achieves this for maxpool, but it has significant issues when used this way rather than through backpropagation in a dlgradient-type layer, see (https://www.mathworks.com/matlabcentral/answers/2179587-making-a-custom-way-to-train-cnns-and-i-am-noticing-that-avgpool-is-significantly-faster-than-maxpo?s_tid=srchtitle).
I don't know how many people would benefit from this feature, and anyone could always build these functionalities themselves with MATLAB scripts, cuDNN MEX files, etc., but it would still be nice for MATLAB to support this for more customizable neural network training.
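As a rough sketch of the kind of analytic backward pass I mean (my own illustration, not an existing MathWorks API): for average pooling with non-overlapping p-by-p windows and stride p, the backward pass just spreads each output-gradient value equally over the p*p inputs that produced it, and can be written with plain gpuArray-compatible operations:
p = 2;                                   % pooling window size (and stride)
dLdY = gpuArray(rand(2, 2, 'single'));   % gradient w.r.t. the avgpool output
dLdX = repelem(dLdY, p, p) / (p*p);      % gradient w.r.t. the avgpool input (4-by-4)
A built-in "backward" option on avgpool itself would generalize this to overlapping windows, padding, and batched N-D data without each user having to re-derive and hand-code it.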
Arkadiy Turevskiy
Last activity on 15 Oct 2025 at 16:08

Please share with us how you are using AI in your control design workflows and what you want to hear most in our upcoming talk, 4 Ways to Improve Control Design Workflows with AI.
Arkadiy
Hello Everyone, I’m Vikram Kumar Singh, and I’m excited to be part of this amazing MATLAB community!
I’m deeply interested in learning more from all of you and contributing wherever I can. Recently, I completed a project on modeling and simulation of a Li-ion battery with a Battery Management System (BMS) for fault detection and management.
I’d love to share my learnings and also explore new ideas together with this group. Looking forward to connecting and growing with the community!
Excited for MATLAB EXPO 2025!
I’m a Master’s student in Electrical Engineering at UNSW Sydney, researching EV fleet charging and hybrid energy strategies integrating battery-electric and hydrogen fuel cell vehicles.
LinkedIn link: www.linkedin.com/in/yuanzhe-chen-6b2158351
ResearchGate link: https://www.researchgate.net/profile/Yuanzhe-Chen-9?ev=hdr_xprf
#MATLABEXPO #EV #FCEV #SmartGrid
Inspired by @xingxingcui's post about old MATLAB versions and @유장's post about an old Easter egg, I thought it might be fun to share some MATLAB-Old-Timer Stories™.
Back in the early 90s, MATLAB had been ported to MacOS, but there were some interesting wrinkles. One that kept me earning my money as a computer lab tutor was that MATLAB required file names to follow Windows standards - no spaces or other special characters. But on a Mac, nothing stopped you from naming your script "hello world - 123.m". The problem came when you tried to run it. MATLAB was essentially doing an eval on the script name, assuming the file name would follow Windows (and MATLAB) naming rules.
So now imagine a lab full of students taking a university course. As is common in many universities, the course was given a numeric code. For whatever historical reason, my school at that time was also using numeric codes for the departments. Despite being told the rules for naming scripts, many students would default to something like "26.165 - 1.1" for problem one on HW1 for the intro applied math course 26.165.
No matter what they did in their script, when they ran it, MATLAB would just say "ans = 25.0650".
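In other words, running a script named "26.165 - 1.1.m" amounted to MATLAB evaluating the file name as an arithmetic expression:
eval('26.165 - 1.1')   % ans = 25.0650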
Nothing brings you more MATLAB-god credibility as a student tutor than walking over to someone's computer, taking one look at their output, saying "rename your file", and walking away like a boss.
It was 2010 when I was a sophomore in university. I chose to learn MATLAB because of a mathematical modeling competition, and the university provided MATLAB 7.0, a very classic release. To get started, I borrowed many MATLAB books from the library and began by learning simple numerical calculations, plotting, and solving equations. Gradually I was drawn in by MATLAB’s powerful capabilities and became interested; I often used it as a big calculator for fun. That version didn’t have MATLAB Live Script; instead it used MATLAB Notebook (M-Book), which allowed MATLAB functions to be used directly within Microsoft Word, and it also had the Symbolic Math Toolbox’s MuPAD interactive environment. These were later gradually replaced by Live Scripts introduced in R2016a. There are many similar examples...
Out of curiosity, I still have screenshots on my computer of MATLAB 7.0 running in compatibility mode. I'd love to hear your thoughts!
Edit 15 Oct 2025: Removed incorrect code. Replaced symmatrix2sym and symfunmatrix2symfun with sym and symfun, respectively (the latter supported as of R2024b).
The Symbolic Math Toolbox does not have its own dot and cross functions. That's OK (maybe) for garden-variety vectors of sym objects, where those operations get shipped off to the base MATLAB functions
x = sym('x',[3,1]); y = sym('y',[3,1]);
which dot(x,y)
/MATLAB/toolbox/matlab/specfun/dot.m
dot(x,y)
ans = y1*conj(x1) + y2*conj(x2) + y3*conj(x3)
which cross(x,y)
/MATLAB/toolbox/matlab/specfun/cross.m
cross(x,y)
ans =
 x2*y3 - x3*y2
 x3*y1 - x1*y3
 x1*y2 - x2*y1
But now we have symmatrix et al., and things don't work as nicely
clearvars
x = symmatrix('x',[3,1]); y = symmatrix('y',[3,1]);
z = symmatrix('z',[1,1]);
sympref('AbbreviateOutput',false);
dot() expands the result, which isn't really desirable for exposition.
eqn = z == dot(x,y)
eqn = 
Also, dot() returns the result in terms of the conjugate of x, which can't be simplified away at the symmatrix level
assumeAlso(sym(x),'real')
class(eqn)
ans = 'symmatrix'
try
eqn = z == simplify(dot(x,y))
catch ME
ME.message
end
ans = 'Undefined function 'simplify' for input arguments of type 'symmatrix'.'
To get rid of the conjugate, we have to resort to sym
eqn = simplify(sym(eqn))
eqn = 
but again we are in expanded form, which defeats the purpose of symmatrix (et al.)
But at least we can do this to get a nice equation
eqn = z == x.'*y
eqn = 
dot errors with symfunmatrix inputs
clearvars
syms t real
x = symfunmatrix('x(t)',t,[3,1]); y = symfunmatrix('y(t)',t,[3,1]);
try
dot(x,y)
catch ME
ME.message
end
ans = 'Invalid argument at position 2. Symbolic function is evaluated at the input arguments and does not accept colon indexing. Instead, use FORMULA on the function and perform colon indexing on the returned output.'
Cross works (accidentally IMO) with symmatrix, but expands the result, which isn't really desirable for exposition
clearvars
x = symmatrix('x',[3,1]); y = symmatrix('y',[3,1]);
z = symmatrix('z',[3,1]);
eqn = z == cross(x,y)
eqn = 
And it doesn't work at all if an input is a symfunmatrix
syms t
w = symfunmatrix('w(t)',t,[3,1]);
try
eqn = z == cross(x,w);
catch ME
ME.message
end
ans = 'A and B must be of length 3 in the dimension in which the cross product is taken.'
In the latter case we can expand with
eqn = z == cross(sym(x),symfun(w)) % x has to be converted
eqn(t) = 
But we can't do the same with dot (as shown above, dot doesn't like symfun inputs)
try
eqn = z == dot(sym(x),symfun(w))
catch ME
ME.message
end
ans = 'Invalid argument at position 2. Symbolic function is evaluated at the input arguments and does not accept colon indexing. Instead, use FORMULA on the function and perform colon indexing on the returned output.'
Looks like the only choice for dot with symfunmatrix is to write it by hand at the matrix level
x.'*w
ans(t) = 
or at the sym/symfun level
sym(x).'*symfun(w) % assuming x is real
ans(t) = 
Ideally, I'd like to see dot and cross implemented for symmatrix and symfunmatrix types where neither function would evaluate, i.e., expand, until both arguments are subs-ed with sym or symfun objects of appropriate dimension.
Also, it would be nice if symmatrix could be assumed to be real. Is there a reason why being able to do so wouldn't make sense?
try
assume(x,'real')
catch ME
ME.message
end
ans = 'Undefined function 'assume' for input arguments of type 'symmatrix'.'
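Until then, a tiny helper along these lines (just wrapping the x.'*y workaround shown above; note it assumes real vectors, since unlike dot it applies no conjugate) can stand in for dot at the matrix level:
function d = matdot(a, b)
% Unexpanded inner product for symmatrix / symfunmatrix column vectors.
% Assumes real-valued inputs; no conjugate is taken, unlike dot.
d = a.' * b;
end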
Pavan Kumar
Last activity on 10 Oct 2025 at 13:00

Excited to link up, sync, and be a part of a better learning experience
Patrick
Last activity on 16 Oct 2025 at 17:15

PROMISE
Last activity on 9 Oct 2025 at 20:48

Excited to link up
Gregory Vernon
Last activity on 8 Oct 2025 at 13:32

Something that I periodically wonder about is whether an integration with the Rubi integration rules package would improve symbolic integration in MATLAB's Symbolic Math Toolbox. The project is open source under an MIT license, has a Mathematica implementation, and supposedly SymPy is working on an implementation of its own. Much of my intrigue comes from this 2022 report that compared the previous version of Rubi (4.16.1) against various CAS systems, including MATLAB R2021a (MuPAD).
While not really an official metric for Rubi, this does "feel" similar to my experience computing symbolic integrals in the MATLAB Symbolic Math Toolbox vs. Maple/Mathematica. What do y'all think?
Benjamin
Last activity on 2 Oct 2025 at 17:03

Excited to learn more about MathWorks
Nermin
Last activity on 2 Oct 2025 at 15:23

Looking forward to the Expo!