
Timing a sorting algorithm?

Options
  • 25-10-2006 10:43pm
    #1
    Registered Users Posts: 4,946 ✭✭✭


    Hey guys, I've been writing a program to sort an array (wow!). Part of the question is to time the sorting algorithm.

    I got my hands on a stopwatch class, which I've added to my program, but I can't get it to call the start/stop/get-overall-time methods... there's a bit to it, so I was wondering if there is a way I can do a very basic timer that doesn't involve 5/6 different methods and can be part of the algorithm method itself.

    Here's a basic insertion sort:


    // Insertion-Sort(a)
    for (j = 1; j < a.length; j++) {
        temp = a[j];
        i = j - 1;

        // shift larger elements right until temp's slot is found
        while ((i >= 0) && (a[i] > temp)) {
            a[i+1] = a[i];
            i = i - 1;
        }

        a[i+1] = temp;
    }


    Is there any way I can do a timer in ms within that code that will start when the algorithm starts and end when it's done? I also have to be able to get the overall time: endTime - startTime = duration. The most basic code will do.

    Cheers!


Comments

  • Registered Users Posts: 1,481 ✭✭✭satchmo


    Language?

    If it's Java try System.currentTimeMillis() - but as far as I remember the resolution is pretty crap, something like 10ms (depending on the implementation). If using this just make sure to test on large datasets to get a reasonable estimate.
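    For what it's worth, a minimal self-contained sketch of that approach might look like this (the sort() method here is just an assumed stand-in insertion sort so the example compiles on its own):

    ```java
    // Sketch: timing a sort with System.currentTimeMillis().
    public class TimingDemo {

        // plain insertion sort, used as the workload to time
        static void sort(int[] a) {
            for (int j = 1; j < a.length; j++) {
                int temp = a[j];
                int i = j - 1;
                while (i >= 0 && a[i] > temp) {
                    a[i + 1] = a[i];
                    i = i - 1;
                }
                a[i + 1] = temp;
            }
        }

        public static void main(String[] args) {
            // big random array - small inputs finish in under the ~10ms resolution
            int[] a = new int[100000];
            for (int n = 0; n < a.length; n++) {
                a[n] = (int) (Math.random() * 1000000);
            }

            long startTime = System.currentTimeMillis();
            sort(a);
            long elapsedTime = System.currentTimeMillis() - startTime;

            System.out.println("Sorting took " + elapsedTime + " milliseconds");
        }
    }
    ```

    Because of the coarse resolution, averaging several runs (or using a larger input) gives a steadier number than a single measurement.
    
    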

    If C/C++ (and Win32), check out QueryPerformanceCounter/QueryPerformanceFrequency, which give you resolution down in the nanosecond range. The other option is timeGetTime(), but the resolution isn't as good - multiples of milliseconds again.


  • Registered Users Posts: 590 ✭✭✭dal


    Yes, why can't you just get the system time before you start your algorithm and then the system time after? Isn't that guaranteed to be accurate in whatever language you are using?


  • Registered Users Posts: 4,946 ✭✭✭red_ice


    I'm doing it in Java!

    Here's the basics of what I've done timer-wise in the sorting:


    long startTime;
    long endTime;
    long elapsedTime;

    startTime = System.currentTimeMillis();
    for (j = 1; j < a.length; j++) {
        temp = a[j];
        i = j - 1;

        while ((i >= 0) && (a[i] > temp)) {
            a[i+1] = a[i];
            i = i - 1;
        }
        a[i+1] = temp;
    }
    endTime = System.currentTimeMillis();
    elapsedTime = endTime - startTime;

    // print the sorted array
    for (int x = 0; x < a.length; x = x + 1) {
        System.out.println(a[x]);
    }
    System.out.println("The Sorting took");
    System.out.println(elapsedTime);
    System.out.println("milliseconds");


    Should that work?

    I think the array I'm using is too small for it to give any time difference. Here's the output...

    The Sorting took
    0
    milliseconds

    Any ideas?


  • Registered Users Posts: 1,481 ✭✭✭satchmo


    Yeah, looks fine - how big is the array? Try initialising it with random numbers and giving it a size of 100,000 or more; you should see a difference then.


  • Registered Users Posts: 4,946 ✭✭✭red_ice


    Got it working!

    This bad boy sorted it out ^^


    int a[] = new int[10000];
    for (int n = 0; n < a.length; n = n + 1) {
        a[n] = (int)(Math.random() * 1000000) % 10000;
    }

    cheers lads


  • Closed Accounts Posts: 619 ✭✭✭Afuera


    For more accurate timing in Java you could also look into the System.nanoTime() method. This'll only work for 1.5+.
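    A minimal sketch of the nanoTime() approach, assuming Java 1.5+ (the dummy loop just stands in for whatever work you want to time):

    ```java
    // Sketch: timing with System.nanoTime() (Java 1.5+).
    // nanoTime() is meant for measuring elapsed intervals, not wall-clock time.
    public class NanoTimingDemo {
        public static void main(String[] args) {
            long start = System.nanoTime();

            // the work being timed - here just a dummy summation loop
            long sum = 0;
            for (int i = 0; i < 1000000; i++) {
                sum += i;
            }

            long elapsedNanos = System.nanoTime() - start;
            System.out.println("Took " + (elapsedNanos / 1000000.0) + " milliseconds");
        }
    }
    ```

    Dividing by 1,000,000.0 converts nanoseconds to milliseconds if you want output comparable to currentTimeMillis().
    
    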

