Two algorithms with the same complexity in Matlab

I implemented two algorithms in MATLAB that have the same theoretical complexity. One algorithm uses vectorized code and the other uses for loops. I measured the running time of each algorithm for increasing input sizes.
I noticed the following: 1) Algorithm 1, with vectorized code, is much faster than algorithm 2. 2) When I plotted time vs. size for each algorithm, both showed N^2 growth, but the curve for algorithm 2 grows much faster.
Can I conclude that both algorithms have the same complexity and ignore their actual running times?
I'm just measuring how each algorithm scales for different input sizes.
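A minimal sketch of such a timing experiment, assuming two placeholder implementations named algorithmVectorized and algorithmLooped that each take a vector input, might look like this:

% Sketch: time two placeholder implementations over increasing sizes.
% algorithmVectorized and algorithmLooped are hypothetical names standing
% in for the two implementations being compared.
sizes = 500:500:5000;
tVec  = zeros(size(sizes));
tLoop = zeros(size(sizes));
for k = 1:numel(sizes)
    x = rand(sizes(k), 1);
    tic; algorithmVectorized(x); tVec(k)  = toc;
    tic; algorithmLooped(x);     tLoop(k) = toc;
end

% Both curves may show N^2 growth while differing by a large constant factor.
plot(sizes, tVec, 'o-', sizes, tLoop, 's-');
legend('vectorized', 'for-loop', 'Location', 'northwest');
xlabel('N'); ylabel('time (s)');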

Accepted Answer

Walter Roberson
Walter Roberson on 15 Mar 2013
Is the plot of the ratio of their running times tending towards linear? If it is not, then they have different complexities.
Did you try some quite big problems? Past a certain point, MATLAB hands a number of vectorized code patterns over to multithreaded libraries. If you have not turned off multithreading, you can get misleading measurements.
R2012a and newer (I think; it might be R2011b) can find patterns that can be submitted to the multithreaded libraries even in some cases where the code is not written as vectorized, so your conclusions might depend upon which release you are using, and upon whether you have an "end" statement matching your "function" statement (if you do, the code can be optimized more than if you do not).
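One possible way to act on both suggestions, reusing the sizes, tVec, and tLoop vectors from the sketch above, is to plot the runtime ratio and to repeat the timings with MATLAB restricted to a single computational thread:

% Sketch: ratio of the measured running times.
ratio = tLoop ./ tVec;
plot(sizes, ratio, 'o-');
xlabel('N'); ylabel('tLoop / tVec');   % a ratio that levels off suggests identical scaling

% Repeat the timings single-threaded to rule out the multithreaded-library effect.
oldThreads = maxNumCompThreads(1);     % restrict MATLAB to one computational thread
% ... rerun the timing loop from above here ...
maxNumCompThreads(oldThreads);         % restore the previous thread count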

More Answers (2)

Jan
Jan on 16 Mar 2013
The runtime of an algorithm can follow a rule like:
runtime = a * n^2 + b * n^8
When b is sufficiently small, the runtime can appear to depend quadratically on the input size n for sufficiently small problems. The real nature of the algorithm needs to be explored by a scientific analysis.
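As a quick numeric illustration of this point (with coefficients a and b chosen arbitrarily for the sketch), the high-order term can be invisible over a whole range of small sizes:

% Sketch: the n^8 term is masked at small n when its coefficient is tiny.
a = 1e-6;            % arbitrary coefficient of the quadratic term
b = 1e-22;           % arbitrary, very small coefficient of the high-order term
n = (10:10:200)';
t = a*n.^2 + b*n.^8;

% Over this range the curve has slope ~2 on a log-log plot, even though
% the asymptotic growth is n^8.
loglog(n, t, 'o-');
xlabel('n'); ylabel('modelled runtime');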
But such a deep analysis is not trivial, as it is not easy to define "complexity" accurately. Note that an algorithm does not run independently of the hardware: as the problem grows, the sizes of the cache line, the 1st- and 2nd-level caches, and even the amount of RAM come into play, for instance when disk swapping slows the processing down substantially.
Measuring the complexity of an algorithm by comparing runtimes therefore does not allow strong conclusions.
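What runtime measurements can still provide is an empirical growth exponent over the tested range, which is not the same as the asymptotic complexity. A sketch of such a fit, with algorithmUnderTest as a hypothetical placeholder for the implementation being timed:

% Sketch: fit an empirical growth exponent to measured runtimes.
% algorithmUnderTest is a hypothetical placeholder.
sizes = round(logspace(2, 4, 8));
t = zeros(size(sizes));
for k = 1:numel(sizes)
    x = rand(sizes(k), 1);
    tic; algorithmUnderTest(x); t(k) = toc;
end

% Fit log(t) = p(1)*log(n) + p(2); the slope p(1) is the empirical exponent
% over the tested range only; it is not a proof of asymptotic complexity.
p = polyfit(log(sizes), log(t), 1);
fprintf('Empirical growth exponent: %.2f\n', p(1));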

Image Analyst
Image Analyst on 15 Mar 2013
I don't know how you're measuring complexity. The vectorized approach will almost always be fewer lines of code, and it's usually, but not always, faster. Since it's faster for your code, just go with it. I'm not sure why you care only about the "complexity" and want to "ignore their actual running times". Just go with whatever is faster for the size of data you expect to encounter most.
  2 Comments
John
John on 15 Mar 2013
I need to analyze complexities, not running times
Image Analyst
Image Analyst on 16 Mar 2013
You never answered Walter's questions, but since you accepted his answer, I assume that it answered your question and so this question is done.
