Hello, I want to draw the time/array-size graph for the binary and linear search algorithms (worst, best, and average case for array sizes 1 000, 10 000, 100 000, and 1 000 000). I'm using Dev-C++. However, when I run the code for array size 1 000 000, the program crashes. Here is the code:

#include <stdio.h>

int binarySearch(int arr[], int l, int r, int x)
{
   if (r >= l)
   {
        int mid = l + (r - l)/2;

        if (arr[mid] == x)  return mid;

        if (arr[mid] > x) return binarySearch(arr, l, mid-1, x);

        return binarySearch(arr, mid+1, r, x);
   }

   return -1;
}

int main(void)
{
   int arr[10] = {2,5,8,9,15,18,19,25,34,50};
   int n = sizeof(arr)/ sizeof(arr[0]);
   int x = 10;
   int result = binarySearch(arr, 0, n-1, x);
   (result == -1)? printf("Element is not present in array")
                 : printf("Element is present at index %d", result);
   return 0;
}
Can Özgen

1 Answer

You are allocating the array arr on the stack. For 1 000 000 elements you need 4 MB of memory (assuming that sizeof(int) == 4). On Windows, for example, the stack in most cases has a default limit of 1 MB. For a quick fix, define arr as static:

static int arr[size];

or put it in global scope outside the function body, or, as others said, use dynamic allocation (see the sketch below).
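
For illustration, here is a minimal sketch of the dynamic-allocation approach; the array size, the fill loop, and the search key here are placeholders rather than part of your original code, and binarySearch is assumed to be the function defined in your question:

#include <stdio.h>
#include <stdlib.h>

/* binarySearch as defined in the question */
int binarySearch(int arr[], int l, int r, int x);

int main(void)
{
   size_t size = 1000000;                   /* the large array now lives on the heap */
   int *arr = malloc(size * sizeof *arr);
   if (arr == NULL)
   {
       fprintf(stderr, "Allocation failed\n");
       return 1;
   }

   for (size_t i = 0; i < size; i++)        /* fill with sorted values so binary search works */
       arr[i] = (int)(2 * i);

   int x = 10;                              /* example key; adjust as needed */
   int result = binarySearch(arr, 0, (int)size - 1, x);
   (result == -1) ? printf("Element is not present in array")
                  : printf("Element is present at index %d", result);

   free(arr);                               /* release the heap memory when done */
   return 0;
}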

Daniel Sęk