It is at least easy to vectorize your inner loop:
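A sketch of what the vectorized inner loop looks like. Note that the original loop body isn't shown here, so the output variable out and the squared-norm reduction are placeholders for whatever the original code computed per pair of rows; only the indexing pattern is the point:

```matlab
T = size(X, 1);
out = zeros(T, T);
for c = 1:T-1
    % All differences X(cc,:) - X(c,:) for cc = c+1:T in one shot,
    % using implicit expansion (X(c,:) is expanded along dimension 1).
    d = X(c+1:T, :) - X(c, :);      % (T-c)-by-N matrix of differences
    out(c+1:T, c) = sum(d.^2, 2);   % placeholder row-wise reduction
end
```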
Here, I'm using the new implicit expansion (automatic singleton expansion, introduced in R2016b). If you have an older version of MATLAB you'll need to use
bsxfun for two of the subtraction operations. For example,
X(c+1:T,:)-X(c,:) is the same as bsxfun(@minus, X(c+1:T,:), X(c,:)).
What is happening in this bit of code is that instead of looping over
cc=c+1:T, we take all of those indices at once, replacing the scalar computation for each cc with a single matrix operation.
d is then a matrix with multiple rows (9 in the first iteration, and one fewer in each subsequent iteration).
Surprisingly, this is slower than the double loop, and similar in speed to Jodag's answer.
Next, we can try to improve indexing. Note that the code above extracts data row-wise from the matrix. MATLAB stores data column-wise. So it's more efficient to extract a column than a row from a matrix. Let's transpose X:
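The transposed version might look like this (same placeholder reduction as before; the only change is that we extract contiguous columns of Xt instead of strided rows of X):

```matlab
Xt = X.';                  % transpose once, up front
T = size(Xt, 2);
out = zeros(T, T);
for c = 1:T-1
    % Columns of Xt are contiguous in memory, so this extraction
    % is cheaper than pulling rows out of X.
    d = Xt(:, c+1:T) - Xt(:, c);      % N-by-(T-c) matrix of differences
    out(c+1:T, c) = sum(d.^2, 1).';   % placeholder column-wise reduction
end
```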
This is more than twice as fast as the code that indexes row-wise.
But of course the same trick can be applied to the code in the question, speeding it up by about 50%:
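Since the question's original double loop isn't reproduced here, this is only a sketch of the transposed version of it, with a placeholder body:

```matlab
Xt = X.';
T = size(Xt, 2);
out = zeros(T, T);
for c = 1:T-1
    for cc = c+1:T
        d = Xt(:, cc) - Xt(:, c);   % column extraction is cache-friendly
        out(cc, c) = sum(d.^2);     % placeholder for the original computation
    end
end
```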
My takeaway message from this exercise is that MATLAB's JIT compiler has improved things a lot. Back in the day, any sort of loop would grind code to a halt. Today it's not necessarily the worst approach, especially if all you do inside the loop is call built-in functions.