This works:
func averageOf(numbers: Int...) -> Float {
    var sum = 0
    for number in numbers {
        sum += number
    }
    return Float(sum) / Float(numbers.count)
}
averageOf() // (not a number)
averageOf(42, 597, 12) // (217.0)
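As a side note, the (not a number) result for the empty call seems to be ordinary Float behaviour, since sum and numbers.count are both zero there and the return expression is just zero divided by zero. A quick sanity check outside the function:

Float(0) / Float(0) // same result as averageOf() above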
But this doesn't:
func averageOf(numbers: Int...) -> Float {
    var sum = 0
    for number in numbers {
        sum += number
    }
    return Float(sum / numbers.count)
}
averageOf()
averageOf(42, 597, 12)
It gives me this error on the } line:
Execution was interrupted, reason: EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0)
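Narrowing it down a bit: if I comment out the empty call and keep only the three-argument one, this version seems to run fine, so the empty call appears to be the one that trips it:

//averageOf() // commenting this out makes the error go away
averageOf(42, 597, 12) // (217.0)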
I stumbled upon another question with the same first and second snippets of code, and its author apparently doesn't get the same errors.
Let's remove that cast:
func averageOf(numbers: Int...) -> Float {
    var sum = 0
    for number in numbers {
        sum += number
    }
    return sum / numbers.count
}
averageOf()
averageOf(42, 597, 12)
It gives me this error on the division sign:
Cannot invoke '/' with an argument list of type '(@lvalue Int, Int)'
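In case it's relevant, the same mismatch seems to be reproducible outside the function with two plain Int constants (the names here are just made up for the check):

let total = 651          // Int
let elements = 3         // Int
let quotient = total / elements // Int divided by Int gives an Int (217)
//let asFloat: Float = quotient // uncommenting this fails: Int isn't converted to Float implicitly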
If I then change the return type of the function to Int:
func averageOf(numbers: Int...) -> Int {
    var sum = 0
    for number in numbers {
        sum += number
    }
    return sum / numbers.count
}
averageOf()
averageOf(42, 597, 12)
I get the same EXC_BAD_INSTRUCTION error.
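For comparison (my own experiment, maybe unrelated), a bare Int division whose right-hand side happens to be zero also stops the playground:

let divisor = 0
//let oops = 1 / divisor // uncommenting this traps at runtime with a similar EXC_BAD_INSTRUCTION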
If I cast only numbers.count:
func averageOf(numbers: Int...) -> Int {
    var sum = 0
    for number in numbers {
        sum += number
    }
    return sum / Float(numbers.count)
}
averageOf()
averageOf(42, 597, 12)
I get this error on the division sign:
Cannot invoke 'init' with an argument list of type '(@lvalue Int, $T5)'
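This mixed-type case also seems reproducible on its own (again, made-up names): dividing a declared Int by a Float doesn't compile for me either.

let intSum = 651            // Int
let floatCount = Float(3)   // Float
//let average = intSum / floatCount // uncommenting this fails: no '/' takes an Int and a Float together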
I also get this error if I change the return type back to Float.
All of this makes no sense to me. Is it Xcode going postal, or have I missed something subtle?