For timing an algorithm (approximately in ms), which of these two approaches is better:
clock_t start = clock();
algorithm();
clock_t end = clock();
double t = (double)(end - start) * 1000.0 / CLOCKS_PER_SEC; // elapsed time in ms
How about gettimeofday()? When it is called, it fills in two structs (timeval and timezone) with timing information. Usually, passing a timeval struct is enough and the timezone argument can be set to NULL. The filled-in timeval struct has two members, tv_sec and tv_usec: tv_sec is the number of seconds since 00:00:00, January 1, 1970 (the Unix Epoch), and tv_usec is the additional number of microseconds on top of tv_sec. Thus, one can get the current time with microsecond resolution.
It can be used as follows:
#include <sys/time.h>

struct timeval start_time;
gettimeofday(&start_time, NULL);     // passing NULL for the timezone is usually enough
long seconds  = start_time.tv_sec;   // whole seconds since the Unix Epoch
long useconds = start_time.tv_usec;  // additional microseconds
long long desired_time = seconds * 1000000LL + useconds; // total time in microseconds
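To time algorithm() in milliseconds, as the original question asks, one can take a timestamp before and after the call and subtract. Below is a minimal sketch along those lines; algorithm() stands for whatever function is being measured, and the busy loop used here as a stand-in is only for illustration.

#include <stdio.h>
#include <sys/time.h>

void algorithm(void)                 // stand-in for the code being timed
{
    for (volatile long i = 0; i < 10000000; ++i)
        ;
}

int main(void)
{
    struct timeval start_time, end_time;

    gettimeofday(&start_time, NULL); // timestamp before
    algorithm();
    gettimeofday(&end_time, NULL);   // timestamp after

    // convert both timestamps to microseconds and take the difference
    long long start_us = start_time.tv_sec * 1000000LL + start_time.tv_usec;
    long long end_us   = end_time.tv_sec   * 1000000LL + end_time.tv_usec;
    double elapsed_ms  = (end_us - start_us) / 1000.0;

    printf("algorithm() took %.3f ms\n", elapsed_ms);
    return 0;
}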