
I was playing around with the reduce function on collections in Swift.

// The reduce method should return an Int with the value 3141.
let digits = ["3", "1", "4", "1"]
    .reduce(0) { (total: Int, digit: String) in
        return ("\(total)" + digit).toInt()!
    }

The function gives the correct output, but why does ("0" + "1").toInt()! return 1 as an Int rather than 0? The combined string to be converted is "01". I assume this is a String that the function cannot convert to an Int directly. Does it just default to the second character instead?
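
For reference, here is what each step of that reduce evaluates to (a minimal sketch, assuming Swift 1.x, where String has a toInt() -> Int? method):

// Each step concatenates the running total with the next digit,
// then parses the result as a base-10 Int.
("\(0)" + "3").toInt()!      // "03"   -> 3 (the leading zero is dropped)
("\(3)" + "1").toInt()!      // "31"   -> 31
("\(31)" + "4").toInt()!     // "314"  -> 314
("\(314)" + "1").toInt()!    // "3141" -> 3141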

DanielRak
  • What? Why would an integer parser ever stop at the first character? Should "123" be parsed as just 1? – Jonathon Reinhart Oct 13 '14 at 06:06
  • "I assume this is a String that the function cannot covert to an Int directly." You assume wrong. :) Strings that have leading zeros are pretty easy to parse. – Paul Manta Oct 13 '14 at 06:06
  • Sorry, this is new to me. Is there a universal rule that skips leading zeros when a string is being parsed? Just to be clear, I didn't assume that only 0 would be parsed. I assumed that the function would not be able to parse at all because "01" isn't a real number. – DanielRak Oct 13 '14 at 06:09
  • When you see the number 001 do you interpret it as zero? I'm just pointing out the flaw in your logic. Also note that a literal with a leading zero in many languages is interpreted as octal (base 8). – Jonathon Reinhart Oct 13 '14 at 06:12
  • I assume that the function toInt() does exactly what it says it does. What warrants 001 becoming 1? I'm just trying to figure out if there's some sort of standard rule this function is using for its logic, such as: "Also note that a literal with a leading zero in many languages is interpreted as octal (base 8)". Thanks. – DanielRak Oct 13 '14 at 06:15
  • It's because of a concept called "encoding". The computer doesn't store numbers exactly as you type them. It tries to compress them into something smaller that is easier to store. What you type is in something called ASCII (or UTF-8 or ...) that may take 8 (or more) bits per character you type. That's what strings are made of. However, integers use a fixed number of bits that are usually much fewer and can be calculated much faster. – candied_orange Oct 13 '14 at 06:21
  • Thanks for the explanation. So the computer actually stores 01 as 1, which is why that function is returning it as 1, correct? – DanielRak Oct 13 '14 at 06:24
  • According to https://developer.apple.com/library/prerelease/mac/documentation/Swift/Conceptual/Swift_Programming_Language/TheBasics.html an Int is the native word size. On my 64-bit computer that's, well, 64 bits. It's stored as 0000000000000000000000000000000000000000000000000000000000000001, but when you ask it to display you just see a 1. If you have a Windows 7 computer handy, play with the calculator in View | Programmer mode. It shows both what's stored and what is displayed. Play with Dec & Bin. Play with Byte & Word. – candied_orange Oct 13 '14 at 06:38
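
Picking up on the comments above, this is easy to check in a playground (a minimal sketch, still assuming the Swift 1.x toInt() API):

// Parsing drops the leading zero; the Int keeps only the numeric value.
let a = "01".toInt()!    // 1
let b = "1".toInt()!     // 1
a == b                   // true: both are the same Int value in memory
"01" == "1"              // false: as Strings they are different character sequences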

1 Answer


"0"+"1" == "01". You're doing concatenation not addition. You lose the 0 when you convert to int because it's a leading zero.

Leading zeros are usually dropped as meaningless, but in some contexts they actually signal that you're expressing an octal (base 8) number. Even if that were the case here, it'd still end up evaluating to 1.
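
To make the concatenation-versus-addition distinction concrete (a minimal sketch, assuming the Swift 1.x toInt() API from the question):

let zero = "0"
let one = "1"
zero + one                      // "01" (String concatenation)
zero.toInt()! + one.toInt()!    // 1    (Int addition: 0 + 1)
"01".toInt()!                   // 1    (the leading zero carries no value in base 10)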

candied_orange
  • The OP is aware of this. He assumed, however, that only the 0 would be parsed for some reason. – Jonathon Reinhart Oct 13 '14 at 06:08
  • Just to be clear, I didn't assume that only 0 would be parsed. I assumed that the function would not be able to parse at all because "01" isn't a real number. – DanielRak Oct 13 '14 at 06:12
  • This actually isn't a bad question. It just demonstrates a lack of familiarity with how most languages typically parse and store a number. There are languages out there that would store the number exactly as he typed it; Swift just isn't one of them. – candied_orange Oct 13 '14 at 07:02