
comparing time of algorithms

1 reply [Last post]
Joined: 2007-09-07
Points: 0

Hi everyone,
I'm trying to implement the DFS and BFS algorithms and compare the time each one requires to find the solution in a maze.
What I did is the following:
at the beginning of each algorithm
long start = System.currentTimeMillis();

and at the end:
long elapsed = System.currentTimeMillis() - start;
The problem is that for the same algorithm on the same maze I get a different number each time.

What should I do to get the time (or CPU time) each algorithm requires to solve a maze?


Joined: 2004-04-26
Points: 0

The method System.currentTimeMillis() is too inaccurate for performance measurements in the millisecond range. On Windows its clock typically only ticks about every 15 milliseconds.
Use System.nanoTime() instead, which gives you the most precise timer available on the system. If you need milliseconds, convert the returned nanoseconds, and you'll get better measurement data.
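
A minimal sketch of what that could look like. Here solveMaze() is a hypothetical placeholder for your DFS or BFS call; warming up the JIT and averaging over several runs also helps reduce the run-to-run variation you are seeing:

```java
public class MazeTiming {

    // Placeholder workload (hypothetical) -- replace with your DFS/BFS maze solver.
    static long solveMaze() {
        long sum = 0;
        for (int i = 0; i < 1_000_000; i++) {
            sum += i;
        }
        return sum;
    }

    public static void main(String[] args) {
        // Warm up so the JIT has compiled the hot path before we measure.
        for (int i = 0; i < 10; i++) {
            solveMaze();
        }

        // Average over several runs; a single run still fluctuates.
        int runs = 20;
        long totalNanos = 0;
        for (int i = 0; i < runs; i++) {
            long start = System.nanoTime();
            solveMaze();
            totalNanos += System.nanoTime() - start;
        }

        // Convert nanoseconds to milliseconds for reporting.
        double avgMillis = (totalNanos / (double) runs) / 1_000_000.0;
        System.out.println("average: " + avgMillis + " ms");
    }
}
```

Note that System.nanoTime() is only meant for measuring elapsed intervals; its absolute value has no meaning on its own.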