I would like to compare the speed of two algorithms
solving the same problem.
As far as I understand, the tic/toc commands measure the
wall-clock time between the start and the end of the
computation, so they do not separate out the time the
processor spends on other tasks.
I'm using a multi-processor, multi-user machine, so the
CPU share allotted to my process is neither 100% nor
constant. The measured times fluctuate so much that I
can hardly draw any conclusion.
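For reference, here is a minimal sketch of how I'm timing things now (the function names `algorithm1`/`algorithm2` and the input are placeholders for my actual code):

```matlab
% Time each algorithm on the same input with tic/toc (wall-clock time).
x = rand(1000, 1);          % example input

tic;
result1 = algorithm1(x);    % placeholder for my first algorithm
t1 = toc;

tic;
result2 = algorithm2(x);    % placeholder for my second algorithm
t2 = toc;

fprintf('algorithm1: %.3f s, algorithm2: %.3f s\n', t1, t2);
```

Repeating this a few times gives wall-clock times that vary widely from one run to the next.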
Is it possible to obtain realistic measurements of
the speed of my algorithms?
Thanks for your help!