What's the fuss about?
I'm trying to find the distance between every pair of points stored in the array arr (the distance between two points in a 2D plane, from (x1, y1) to (x2, y2)) and then calculate and return the minimum of those distances.
However, the problem is that my source code produces a random garbage output.
The idea is to get the distance between points (x1, y1) and (x2, y2) with the formula sqrt((x1 - x2)^2 + (y1 - y2)^2).
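For reference, this is how I picture that formula written out as a standalone helper (the name point_distance and the use of double are just for illustration, not part of my program):

#include <cmath>

// Illustrative helper: Euclidean distance between (x1, y1) and (x2, y2),
// i.e. sqrt((x1 - x2)^2 + (y1 - y2)^2) spelled out with plain multiplication.
double point_distance(double x1, double y1, double x2, double y2)
{
    double dx = x1 - x2;
    double dy = y1 - y2;
    return std::sqrt(dx * dx + dy * dy);
}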
To do this I select 4 elements in each iteration: x1 = arr[0], x2 = arr[1], y1 = arr[2], y2 = arr[3]. x1 and x2 stay constant for each iteration of i, while y1 and y2 change with every iteration of j and the distance between x1, x2 and y1, y2 is calculated. Finally, the shortest of those distances is chosen and returned to main().
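To make the intended pairing concrete, here is the iteration pattern I have in mind, written out on its own (the function name visit_all_pairs is made up, and the loop bounds are just my assumption about how the flat vector should be walked):

#include <cstddef>
#include <vector>

// Sketch of the pairing I intend: the outer loop fixes the elements at
// (i, i + 1), the inner loop walks over every later pair at (j, j + 1).
void visit_all_pairs(const std::vector<int>& arr)
{
    for (std::size_t i = 0; i + 1 < arr.size(); i += 2)
    {
        for (std::size_t j = i + 2; j + 1 < arr.size(); j += 2)
        {
            // compare the elements at (i, i + 1) with the elements at (j, j + 1)
        }
    }
}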
What have I done to solve this mess?
Adding debugging statements to the source code reveals that the main culprits are random garbage values (which shouldn't even exist!).
Another culprit is that sqrt(arg) is returning a random garbage value. For example, when calculating the distance between (4, 4) and (1, 100), the result should be sqrt(0 + (-99)^2) = 99, but instead it outputs -2147483648.
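Just to sanity-check the arithmetic I expect in isolation (a throwaway check, not part of my program), something like this should print 99:

#include <cmath>
#include <iostream>

int main()
{
    // Expected arithmetic for the example above:
    // (4 - 4) = 0, (1 - 100) = -99, 0*0 + (-99)*(-99) = 9801, sqrt(9801) = 99.
    std::cout << std::sqrt(0.0 * 0.0 + (-99.0) * (-99.0)) << '\n';
    return 0;
}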
Here's my code:
#include <iostream>
#include <vector>
#include <cmath>

using std::sqrt;
using std::cin;
using std::cout;
using std::endl;
using std::vector;

int dist_cal(vector<int>&, int);

int main()
{
    int num_pairs = -1;
    cout << "Enter the number of pairs of point co-ordinates (x, y) that you want to enter: " << endl;
    cin >> num_pairs;

    vector<int> points;
    cout << "Now enter the (x, y) co-ordinate pairs: " << endl;
    for (int i = 0; i < num_pairs * 2; i++)
    {
        int buff;
        cin >> buff;
        points.push_back(buff);
    }

    cout << "The minimum distance between the array of points entered is " << dist_cal(points, num_pairs) << "." << endl;
    return 0;
}

int dist_cal(vector<int>& arr, int num_pairs)
{
    int min_distance = -1, temp_distance = -1, x1, x2, y1, y2, itr_count = 0;

    for (int i = 0; i <= num_pairs; i += 2)
    {
        x1 = arr[i + 0];
        x2 = arr[i + 1];
        for (int j = i + 2; j <= num_pairs; j += 2)
        {
            y1 = arr[j + 0];
            y2 = arr[j + 1];
            temp_distance = sqrt((x1 - x2)^2 + (y1 - y2)^2);
            if (itr_count == 0)
            {
                min_distance = temp_distance;
                itr_count++;
            }
            if (min_distance > temp_distance)
            {
                min_distance = temp_distance;
            }
        }
    }
    return min_distance;
}
I understand that this method is naive and O(n^2), but before moving on to faster algorithms I want to get the most basic approach working, for my own mental sanity.
For an input:
4
4 4
7 8
1 100
4 4
the output should be 0, since the first and fourth points are both (4, 4) and their distance is zero.
The actual output reads:
The minimum distance between the array of points entered is -2147483648.
What am I doing wrong here? Alternative (and more efficient) algorithms are also welcome! Thanks in advance! :)