Why is my binary search algorithm so slow despite theoretically being O(log n) time?


I implemented a binary search which, in theory, should run in O(log N) time, and this holds up when I count the number of loop iterations. However, when it actually runs, it is extremely slow.

int binary_search(int i, vector<int> list) {
   int min_ = 0;
   int max_ = list.size();
   while (max_ != min_ + 1) {
       if (list[(max_ + min_)/2] > i) {
           max_ = (max_ + min_)/2;
       } else if (list[(max_ + min_)/2] <= i) {
           min_ = (max_ + min_)/2;
       }
       }
   }
   return min_;
}

Can anyone explain why my algorithm is so slow?

CodePudding user response:

For starters, you're making a copy of the vector<int> list that is passed in. Change it to be passed by reference:

Instead of this:

int binary_search(int i, vector<int> list) {

This:

int binary_search(int i, const vector<int>& list) {

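Putting that together, a minimal sketch of the full function with the parameter taken by const reference might look like this (the search logic is the asker's, unchanged; it assumes a non-empty, sorted vector):

#include <vector>
using std::vector;

// Same algorithm as in the question; only the parameter passing changes.
// Taking the vector by const reference avoids copying it on every call.
int binary_search(int i, const vector<int>& list) {
    int min_ = 0;
    int max_ = list.size();
    while (max_ != min_ + 1) {
        if (list[(max_ + min_)/2] > i) {
            max_ = (max_ + min_)/2;
        } else {
            min_ = (max_ + min_)/2;
        }
    }
    return min_;
}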
CodePudding user response:

Either your function is being called:

  1. Repeatedly with large lists: O(n log n)
  2. Once with a very, very large list: O(log n) + a large constant

In a practical sense, you'd benefit from minimizing overhead, and possibly pre-sorting the list.

Edit: And as Selbie says, pass your vector in by reference.
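For completeness, a minimal usage sketch, assuming the const-reference version of binary_search shown in the first answer and an already-sorted vector:

#include <iostream>
#include <vector>
using std::vector;

// Declaration of the const-reference version defined above.
int binary_search(int i, const vector<int>& list);

int main() {
    // Binary search only works on sorted input, so the data is built sorted.
    vector<int> data = {1, 3, 5, 7, 9, 11};

    // Each call now borrows the vector instead of copying it,
    // so repeated calls stay close to O(log n) apiece.
    std::cout << binary_search(7, data) << '\n';   // prints 3
    std::cout << binary_search(10, data) << '\n';  // prints 4 (last index with value <= 10)
    return 0;
}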
