I'm trying to find the cumulative RMS of a vector. (The data happens to be model error, labelled devCoM for deviation of centre of mass. The calculation should become less accurate as time proceeds, so the cumulative error is important.)
I can find the cumulative sum by taking
...but this fluctuates about zero, since positive and negative errors cancel.
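A minimal sketch of what the cumulative sum looks like, assuming NumPy and a made-up zero-mean error vector standing in for devCoM (the original code and data are not shown, so everything here is illustrative):

```python
import numpy as np

# Hypothetical stand-in for devCoM: zero-mean noise whose spread grows
# with time, mimicking a model whose error accumulates.
rng = np.random.default_rng(0)
t = np.arange(1, 1001)
devCoM = rng.normal(0.0, 1.0, size=t.size) * np.sqrt(t) * 1e-3

# Running sum of the raw (signed) errors: positive and negative
# deviations cancel, so this wanders about zero instead of growing.
cum_sum = np.cumsum(devCoM)
```

Because the signs cancel, the growing magnitude of the error is invisible in `cum_sum`.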
I can find the RMS of the entire dataset by taking
...but this yields a single number, so it doesn't reveal the expected increase in error over time.
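Again as a hedged sketch (the actual code was not shown), the whole-dataset RMS under the same hypothetical devCoM would be:

```python
import numpy as np

# Hypothetical stand-in for devCoM, as before.
rng = np.random.default_rng(0)
devCoM = rng.normal(size=1000)

# RMS over the entire record: one scalar, so any growth of the error
# with time is averaged away.
rms_all = np.sqrt(np.mean(devCoM**2))
```

This is a useful summary statistic, but it collapses the time dimension entirely.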
Is there some way I could combine these methods?