I'm trying to learn Swift and I made a simple average function:
func average(numbers: Int...) -> Float {
    var sum = 0
    for number in numbers {
        sum += number
    }
    return Float(sum) / Float(numbers.count)
}
average(numbers: 1, 2, 3, 4, 5, 6)
This gives me the correct result: 3.5. However, I am wondering why I have to cast both sum and numbers.count to Float. I tried casting this way:
return Float(sum / numbers.count)
but that gives me just 3.0.
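Breaking the failing version apart (a minimal sketch with the same six inputs, variable names just for illustration), it looks like the division truncates before the cast ever runs:

let sum = 21                    // 1 + 2 + 3 + 4 + 5 + 6
let count = 6
let quotient = sum / count      // 3: Int / Int is integer division, so the remainder is dropped
let asFloat = Float(quotient)   // 3.0: the cast only converts the already-truncated Int

Is that what is happening here?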