When Cody hides test cases, test your function on random small inputs (plus any edge cases you can think of) before submitting. If it holds up across many of those, it will almost always pass the grader.
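For example, a quick random-testing harness might look like this (bruteForce and funToTest are hypothetical placeholders for a slow-but-obviously-correct reference and the solution you plan to submit):

bruteForce = @(n) sum(1:n);        % obviously correct, possibly slow
funToTest  = @(n) n*(n+1)/2;       % the clever version you plan to submit
for trial = 1:1000
    n = randi(50);                 % small random input
    assert(isequal(funToTest(n), bruteForce(n)), 'Mismatch at n = %d', n)
end
disp('All random trials passed')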
I set up my 3D matrix with the players along the 3rd dimension and encoded three states: -1 (player does not hold the card), 1 (player holds the card), and 0 (unknown). I processed the fixed turns (-1 and 1) first, then cycled through the conditional turns (0), checking each player's cards against the hints until the puzzle was solved. The key for me in cracking several of the tests (11, 17, and 19) was looking at the 1's and 0's held by each player.
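A minimal setup sketch of that encoding (the sizes here are hypothetical; the actual contest problem defines the card layout), so the summary expressions below have something to operate on:

nRows = 3; nCols = 6; nPlayers = 4;            % hypothetical card grid and player count
cardState = zeros(nRows, nCols, nPlayers);     % 0 = unknown, for every player
cardState(1, 2, 3)       =  1;                 % player 3 is known to hold card (1,2)
cardState(1, 2, [1 2 4]) = -1;                 % so the other players cannot hold it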
sum(cardState == 1, 3);                          % zeros in this 2D matrix mark cards no player is known to hold: possible solution cards
sum(cardState == 0, 3) > 0;                      % ones in this 2D matrix mark the only remaining unknown positions
sum(cardState == 1, 3) | sum(cardState == 0, 3) > 0;  % OR-ing the two together can provide valuable information
Some MATLAB Cody problems prohibit loops (for, while) or conditionals (if, switch), forcing creative solutions.
One elegant trick is to use nested functions and recursion to achieve the same logic while staying within the rules.
Example: Recursive Summation Without Loops or Conditionals
Suppose loops and conditionals are banned, but you need to compute the sum of the numbers from 1 to n. This is a toy example, and obviously n*(n+1)/2 would be the preferred solution, but it illustrates the pattern.
function s = sumRecursive(n)
    zero = @(x) 0;                       % base-case branch: ignores its input and returns 0
    s = helper(n);                       % call the nested recursive function
    function out = helper(k)
        L = {zero, @helper};             % index 1 stops the recursion, index 2 recurses
        out = k + L{(k > 0) + 1}(k - 1); % (k>0)+1 picks the branch, replacing an 'if'
    end
end
sumRecursive(10)
- The helper function calls itself until the base case is reached.
- Indexing into the cell array with (k>0)+1 acts as the 'if' replacement.
- Nested functions share the parent function's workspace (here, the handle zero), so you can keep state across calls.
Tips:
- Replace 'if' with logical indexing into a cell array.
- Replace for/while with recursion.
- Nested functions are local and can access outer variables, avoiding global state.
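As a second, stand-alone sketch of the same 'if' replacement (clampToZero and identity are just hypothetical branches):

clampToZero = @(x) max(x, 0);          % branch to run when the condition is true
identity    = @(x) x;                  % branch to run when the condition is false
branches    = {identity, clampToZero}; % index 1 = false, index 2 = true
x = -3;
y = branches{(x < 0) + 1}(x)           % x < 0, so clampToZero runs and y = 0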
I realized that using vectorized logic instead of nested loops makes Cody solutions run much faster and read more cleanly. Functions like any(), all(), and logical indexing can often replace multiple for-loops easily!
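For instance, a row-wise check that would otherwise need a loop can be written in one call (the matrix here is just made-up data):

A = [1 2 3; -4 5 6; 7 8 9];
hasNeg  = any(A < 0, 2)     % logical column vector marking rows that contain a negative
negVals = A(A < 0)          % logical indexing extracts the negative values directly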
Many MATLAB Cody problems involve recognizing integer sequences.
If a sequence looks familiar but you can’t quite place it, the On-Line Encyclopedia of Integer Sequences (OEIS) can be your best friend.
OEIS will often identify the sequence, provide a formula, recurrence relation, or even direct MATLAB-compatible pseudocode.
Example: Recognizing a Cody Sequence
Suppose you encounter this sequence in a Cody problem:
1, 1, 2, 3, 5, 8, 13, 21, ...
Entering it on OEIS yields A000045 – The Fibonacci Numbers, defined by:
F(n) = F(n-1) + F(n-2), with F(1)=1, F(2)=1
You can then directly implement it in MATLAB:
function F = fibSeq(n)
    F = zeros(1, n);            % preallocate
    F(1:2) = 1;                 % F(1) = F(2) = 1
    for k = 3:n
        F(k) = F(k-1) + F(k-2);
    end
end
fibSeq(15)
When solving MATLAB Cody problems involving very large integers (e.g., factorials, Fibonacci numbers, or modular arithmetic), you might exceed MATLAB’s built-in numeric limits.
To overcome this, you can use Java’s java.math.BigInteger directly within MATLAB — it’s fast, exact, and often accepted by Cody if you convert the final result to a numeric or string form.
Below is an example of using it to find large factorials.
function s = bigFactorial(n)
    import java.math.BigInteger
    f = BigInteger('1');
    for k = 2:n
        f = f.multiply(BigInteger(num2str(k)));
    end
    s = char(f.toString);       % return as a string to avoid double-precision overflow
end
bigFactorial(100)
Hi cool guys,
I hope you are coding so cool!
FYI, in Problem 61065. Convert Hexavigesimal to Decimal in Cody Contest 2025 there's a small issue with the text:
[ ... For example, the text ‘aloha’ would correspond to a vector of values [0 11 14 7 0], thus representing the base-26 value 202982 = 11*263 + 14*262 + 7*26 ...]
The bold section should be:
202982 = 11*26^3 + 14*26^2 + 7*26
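A quick way to double-check the corrected expansion in MATLAB, using the digit vector from the problem statement:

v = [0 11 14 7 0];          % 'aloha' with 'a' = 0, ..., 'z' = 25
value = polyval(v, 26)      % 11*26^3 + 14*26^2 + 7*26 = 202982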
Title: Looking for Internship Guidance as a Beginner MATLAB/Simulink Learner
Hello everyone,
I’m a Computer Science undergraduate currently building a strong foundation in MATLAB and Simulink. I’m still at a beginner level, but I’m actively learning every day and can work confidently once I understand the concepts. Right now I’m focusing on MATLAB modeling, physics simulation, and basic control systems so that I can contribute effectively to my current project.
I’m part of an Autonomous Underwater Vehicle (AUV) team preparing for the Singapore AUV Challenge (SAUVC). My role is in physics simulation, controls, and navigation, and MATLAB/Simulink plays a major role in that pipeline. I enjoy physics and mathematics deeply, which makes learning modeling and simulation very exciting for me.
On the coding side, I practice competitive programming regularly—
• Codeforces rating: ~1200
• LeetCode rating: ~1500
So I'm comfortable with logic-building and problem solving.
What I’m looking for:
I want to know how a beginner like me can start applying for internships related to MATLAB, Simulink, modeling, simulation, or any engineering team where MATLAB is widely used (including companies outside MathWorks).
I would really appreciate advice from the community on:
- What skills should I strengthen first?
- Which MATLAB/Simulink toolboxes are most important for beginners aiming toward simulation/control roles?
- What small projects or portfolio examples should I build to improve my profile?
- What is the best roadmap to eventually become a good candidate for internships in this area?
Any guidance, resources, or suggestions would be extremely helpful for me.
Thank you in advance to everyone who shares their experience!
Parallel Computing Onramp is here! This free, one-hour self-paced course teaches the basics of running MATLAB code in parallel using multiple CPU cores, helping users speed up their code and write code that handles information efficiently.
Remember, Onramps are free for everyone - give the new course a try if you're curious. Let us know what you think of it by replying below.
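As a small taste of what the course covers, here is a minimal parfor sketch (assuming Parallel Computing Toolbox is installed and a parallel pool is available):

n = 8;
results = zeros(1, n);
parfor k = 1:n
    results(k) = sum(rand(1e6, 1));   % independent iterations run on separate workers
end
disp(results)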
From my experience, MATLAB's Deep Learning Toolbox is quite user-friendly, but it still falls short of libraries like PyTorch in many respects. Most users tend to choose PyTorch because of its flexibility, efficiency, and rich support for many mathematical operators. In recent years, the number of dlarray-compatible mathematical functions added to the toolbox has been very limited, which makes it difficult to experiment with many custom networks. For example, svd is currently not supported for dlarray inputs.
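A minimal illustration of the limitation described above (assuming Deep Learning Toolbox); the workaround strips the dlarray and therefore loses the gradient trace:

X = dlarray(rand(4));
% svd(X)                    % not on the dlarray support list, so this errors
S = svd(extractdata(X))     % works, but the result is no longer traced for gradients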
This link (List of Functions with dlarray Support - MATLAB & Simulink) lists all functions that support dlarray as of R2026a — only around 200 functions (including toolbox-specific ones). I would like to see support for many more fundamental mathematical functions so that users have greater freedom when building and researching custom models. For context, the core MATLAB mathematics module contains roughly 600 functions, and many application domains build on that foundation.
I hope MathWorks will prioritize and accelerate expanding dlarray support for basic math functions. Doing so would significantly increase the Deep Learning Toolbox's utility and appeal for researchers and practitioners.
Thank you.
Hey Relentless Coders! 😎
Let’s get to know each other. Drop a quick intro below! This is your chance to meet your teammates, find coding buddies, and build connections that make the contest more fun and rewarding!
You can share:
- Your name or nickname
- Where you’re from
- Your favorite coding topic or language
- What you’re most excited about in the contest
Let’s make Team Relentless Coders an awesome community—jump in and say hi! 🚀
Hey Cool Coders! 😎
Let’s get to know each other. Drop a quick intro below! This is your chance to meet your teammates, find coding buddies, and build connections that make the contest more fun and rewarding!
You can share:
- Your name or nickname
- Where you’re from
- Your favorite coding topic or language
- What you’re most excited about in the contest
Let’s make Team Cool Coders an awesome community—jump in and say hi! 🚀
Welcome to the Cody Contest 2025 and the Relentless Coders team channel! 🎉
You never give up. When a problem gets tough, you dig in deeper. This is your space to connect with like-minded coders, share insights, and help your team win. To make sure everyone has a great experience, please keep these tips in mind:
- Follow the Community Guidelines: Take a moment to review our community standards. Posts that don’t follow these guidelines may be flagged by moderators or community members.
- Ask Questions About Cody Problems: When asking for help, show your work! Include your code, error messages, and any details needed to reproduce your results. This helps others provide useful, targeted answers.
- Share Tips & Tricks: Knowledge sharing is key to success. When posting tips or solutions, explain how and why your approach works so others can learn your problem-solving methods.
- Provide Feedback: We value your feedback! Use this channel to report issues or share creative ideas to make the contest even better.
Have fun and enjoy the challenge! We hope you’ll learn new MATLAB skills, make great connections, and win amazing prizes! 🚀
Welcome to the Cody Contest 2025 and the Cool Coders team channel! 🎉
You stay calm under pressure. No panic, no chaos—just smooth problem-solving. This is your space to connect with like-minded coders, share insights, and help your team win. To make sure everyone has a great experience, please keep these tips in mind:
- Follow the Community Guidelines: Take a moment to review our community standards. Posts that don’t follow these guidelines may be flagged by moderators or community members.
- Ask Questions About Cody Problems: When asking for help, show your work! Include your code, error messages, and any details needed to reproduce your results. This helps others provide useful, targeted answers.
- Share Tips & Tricks: Knowledge sharing is key to success. When posting tips or solutions, explain how and why your approach works so others can learn your problem-solving methods.
- Provide Feedback: We value your feedback! Use this channel to report issues or share creative ideas to make the contest even better.
Have fun and enjoy the challenge! We hope you’ll learn new MATLAB skills, make great connections, and win amazing prizes! 🚀
I am excited to join this community to learn more, particularly MATLAB/Simulink.
I'm developing a comprehensive MATLAB programming course and seeking passionate co-trainers to collaborate!
Why MATLAB Matters: Many people underestimate MATLAB's significance in:
- Communication systems
- Signal processing
- Mathematical modeling
- Engineering applications
- Scientific computing
Course Structure:
- Foundation Module: MATLAB basics and fundamentals
- Image Processing: Practical applications and techniques
- Signal Processing: Analysis and implementation
- Machine Learning: ML algorithms using MATLAB
- Hands-on Learning: Projects, assignments.
What I'm Looking For:
- Enthusiastic educators willing to share knowledge
- Experience in any MATLAB application area
- Commitment to collaborative teaching
Interested in joining as a co-trainer? Please comment below or reach out directly!
Poll results (35 votes):
- Online Doc + System Browser: 11%
- Online Doc + Dedicated Browser: 11%
- Offline Doc + System Browser: 11%
- Offline Doc + Dedicated Browser: 23%
- Hybrid Approach (Support All Modes): 23%
- User-Definable / Fully Configurable: 20%
I'm working on training neural networks without backpropagation / automatic differentiation, using locally derived analytic forms of the update rules. Because this gives a direct formula for each update, it removes a lot of the overhead that automatic differentiation otherwise requires.
However, MATLAB's neural-network functionality is currently built entirely around backpropagation and automatic differentiation, such as the dlgradient function and the requirement that everything be a dlarray during training.
I have two main requests, specifically for the functions that perform a single operation within a single layer of a neural network, such as "dlconv", "fullyconnect", "maxpool", "avgpool", "relu", etc.:
- These functions should also accept plain gpuArray data instead of requiring everything to be a dlarray.
- These functions currently perform only the forward pass. I request that they also be able to perform the backward pass when the user asks for it, for example via an extra input flag that is "forward" (default) or "backward", with the function then taking all the inputs needed for that direction (e.g. the "avgpool" forward pass needs only the input data and the pooling parameters, while the "avgpool" backward pass needs the derivative w.r.t. the avgpool output, the pooling parameters, and the original data dimensions). I know there is a maxunpool function that achieves this for maxpool, but it has significant issues when used this way rather than through backpropagation in a dlgradient-type layer; see https://www.mathworks.com/matlabcentral/answers/2179587-making-a-custom-way-to-train-cnns-and-i-am-noticing-that-avgpool-is-significantly-faster-than-maxpo?s_tid=srchtitle.
I don't know how many people would benefit from this feature, and one could always build these capabilities with MATLAB scripts, cuDNN MEX files, etc., but it would still be nice for MATLAB to support this kind of customizable neural-net training out of the box.
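For context, a minimal sketch of the current forward-pass-only workflow the post refers to (assuming Deep Learning Toolbox); avgpool accepts a dlarray but not a plain gpuArray, and exposes no backward entry point:

X = dlarray(rand(28, 28, 3, 16, 'single'), 'SSCB');  % height x width x channels x batch
Y = avgpool(X, 2, 'Stride', 2);                      % forward pass only
size(Y)                                              % 14 x 14 x 3 x 16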
I recently published this blog post about resources to help people learn MATLAB: https://blogs.mathworks.com/matlab/2025/09/11/learning-matlab-in-2025/
What are your favourite MATLAB learning resources?