
Since Swift 2.0 I have the problem that global variables only work correctly if I define one global variable per line. Local variables are not affected. In Swift 1.0 this worked fine, so something must have changed on Apple's side.

Example:

var x1: Int = 0
var y1: Int = 0
var z1: Int = 0

func Test1()
{
    x1 = 30
    y1 = 20
    z1 = 10
}

var x2: Int = 0, y2: Int = 0, z2: Int = 0

func Test2()
{
    x2 = 30
    y2 = 20
    z2 = 10
}

func main_program()
{
    Test1()
    Test2()
    print("x1=\(x1), y1=\(y1), z1=\(z1)\n")
    print("x2=\(x2), y2=\(y2), z2=\(z2)\n")
}

When you now call main_program, you get

x1=30, y1=20, z1=10
x2=0, y2=0, z2=10

in the Terminal. But both lines should be identical, and they were identical with Swift 1.0.

So I had to change all my programs to define only one global variable per line. If the variables are part of some formulas, you get strange results, not just zeros.

Is this a bug in Swift, or is there a deeper reason for this behaviour?

j.s.com

1 Answer


From the release notes:

Known Issues in Xcode 7 beta 6 — Swift 2.0 and Objective-C

Declaring multiple global variables in a single var or let may cause corruption of their values. (22207407) Workaround: Declare each variable using a separate var or let.
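In other words, the fix is to split every combined global declaration into separate statements. A minimal sketch of the workaround (variable names here are hypothetical, not from the question):

```swift
// Affected by the bug (22207407): multiple globals in one var statement.
// var a: Int = 0, b: Int = 0, c: Int = 0

// Workaround: declare each global in its own var statement.
var a: Int = 0
var b: Int = 0
var c: Int = 0

func setAll() {
    a = 30
    b = 20
    c = 10
}

setAll()
print("a=\(a), b=\(b), c=\(c)")
```

With separate declarations, all three globals keep the values assigned to them, matching the Swift 1.0 behaviour.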

matt