
I'm working in JavaScript, and this is confusing because the code returns the correct sum of primes for every input I've tested so far, including larger numbers. The one exception: for 977 it returns the sum of primes for 976, which is 72179, instead of the sum for 977, which is 73156.

function sumPrimes(num) {

    var sum = 0;
    var count = 0;
    var array = [];
    var upperLimit = Math.sqrt(num);
    var output = [];

    for (var i = 0; i < num; i++) {
        array.push(true);
    }

    for (var j = 2; j <= upperLimit; j++) {
        if (array[j]) {
            for (var h = j * j; h < num; h += j) {
                array[h] = false;
            }
        }
    }

    for (var k = 2; k < num; k++) {
        if (array[k]) {
            output.push(k);
        }
    }

    for (var a = 0; a < output.length; a++) {
        sum += output[a];
        count++;
    }

    return sum;
}

sumPrimes(977);
php-dev
Daniel Semel

1 Answer


The problem stems from the fact that your "sieve" array is indexed from 0, but your algorithm assumes that array[n] represents the number n.

Since you want array[n]===true to mean that n is prime, you need an Array of length 978 if you want the last item to be indexed as array[977] and mean the number 977.

The issue is fixed by changing every instance of < num to < num + 1 (equivalently, <= num), so that the number num itself is included in the sieve.
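As a sketch, here is the original function with those bounds corrected (and the unused count/output bookkeeping folded into a single summing pass; the logic is otherwise unchanged):

```javascript
function sumPrimes(num) {
    // Index i represents the number i, so the sieve needs num + 1 slots
    // (indices 0..num) for array[num] to exist.
    var array = new Array(num + 1).fill(true);
    var upperLimit = Math.sqrt(num);
    var sum = 0;

    // Sieve of Eratosthenes: cross off multiples of each prime up to sqrt(num).
    for (var j = 2; j <= upperLimit; j++) {
        if (array[j]) {
            for (var h = j * j; h <= num; h += j) {
                array[h] = false;
            }
        }
    }

    // Sum every index still marked prime, including num itself.
    for (var k = 2; k <= num; k++) {
        if (array[k]) {
            sum += k;
        }
    }
    return sum;
}

console.log(sumPrimes(977)); // 73156
console.log(sumPrimes(10));  // 17 (2 + 3 + 5 + 7)
```

Because 977 is itself prime, the off-by-one in the original version dropped exactly 977 from the total (73156 − 72179 = 977), which is why composite or smaller inputs happened to look correct.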

A. Vidor