Looping Matrix Multiplication Performance Improvement

Hi all! I'm struggling to reduce the computation time of a function I created:
function [beta,covariance,residuals] = hac_regression(y,x,ratio)
    t = length(y);
    [beta,~,residuals] = regress(y,x);            % OLS fit (Statistics Toolbox)
    h = diag(residuals) * x;                      % scale each row of x by its residual
    q_hat = (x.' * x) / t;
    o_hat = (h.' * h) / t;                        % lag-0 term
    l = round(ratio * t,0);                       % lag truncation parameter
    for i = 1:(l - 1)
        % lag-i cross product, weighted below by the Bartlett kernel (l - i)/l
        o_tmp = (h(1:(t-i),:).' * h((1+i):t,:)) / (t - i);
        o_hat = o_hat + (((l - i) / l) * (o_tmp + o_tmp.'));
    end
    covariance = (q_hat \ o_hat) / q_hat;         % sandwich form: inv(q_hat) * o_hat * inv(q_hat)
end
I profiled the function, and I think nothing can be done about the built-in "regress" call.
But I'm wondering whether the loop can be optimized to reduce its overhead. For computations on very large datasets, even a small 5% improvement adds up to a substantial reduction in overall run time.
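One idea I have not profiled yet (so treat it as an assumption on my part) is to avoid building the full t-by-t diagonal matrix when forming h; with implicit expansion (R2016b and later) the same t-by-k matrix can be obtained element-wise:
% Untested sketch: residuals is t-by-1 and x is t-by-k, so implicit
% expansion scales row j of x by residuals(j), which is exactly what
% diag(residuals) * x computes, without allocating the t-by-t diagonal.
h = residuals .* x;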
Below is a reproducible example:
y = rand(100,1);
x = rand(100,3);
[beta,covariance,residuals] = hac_regression(y,x,0.1);
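For reference, this is roughly how I am timing candidate changes; timeit is in base MATLAB, and hac_regression_mod is just a placeholder name for any modified copy of the function:
% Timing sketch on a larger random problem so the numbers are stable.
y_big = rand(1e4,1);
x_big = rand(1e4,3);
t_orig = timeit(@() hac_regression(y_big, x_big, 0.1));
% t_mod = timeit(@() hac_regression_mod(y_big, x_big, 0.1));  % hypothetical variant
fprintf('original: %.4f s\n', t_orig);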
Thanks for your help!

Answers (0)

Release

R2018a
