
How would I go about creating a 3-dimensional array of UInt16? with a set size where each element is by default set to nil?

My attempt was

var courseInfo = UInt16?(count:10, repeatedValue: UInt16?(count:10, repeatedValue: UInt16?(count:10, repeatedValue:nil)))

although that doesn't seem to work. Any ideas?

Lahav

2 Answers


Your code doesn't compile because you weren't creating arrays; you were applying array-style initializers to the UInt16? type itself.

Let's start with the base case: how do you make a one-dimensional array?

Array<UInt16?>(count: 10, repeatedValue: nil)

What if we wanted a two-dimensional array? Well, now we are no longer initializing an Array<UInt16?>; we are initializing an Array of Arrays of UInt16?, where each sub-array is itself initialized with UInt16? values.

Array<Array<UInt16?>>(count:10, repeatedValue: Array<UInt16?>(count:10, repeatedValue:nil))

Repeating this for the 3-dimensional case just requires more of the same ugly nesting:

var courseInfo = Array<Array<Array<UInt16?>>>(count:10, repeatedValue: Array<Array<UInt16?>>(count:10, repeatedValue: Array<UInt16?>(count:10, repeatedValue:nil)))

I'm not sure this is the best way to model a 3D structure, but it is the closest thing to the code you already have.

EDIT:

Martin R pointed out in the comments that a neater solution is

var courseInfo : [[[UInt16?]]] = Array(count: 10, repeatedValue: Array(count : 10, repeatedValue: Array(count: 10, repeatedValue: nil)))

This works by moving the type declaration out front, which makes the repeatedValue: parameter unambiguous.
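To sanity-check the result, elements can be read and assigned through chained subscripts. Here is a small sketch using a 2×2×2 literal so it stays self-contained; the 10×10×10 version behaves the same:

```swift
// Small 2x2x2 sketch; every element defaults to nil
var grid: [[[UInt16?]]] = [
  [[nil, nil], [nil, nil]],
  [[nil, nil], [nil, nil]]
]

grid[0][1][1] = 42          // assign through chained subscripts
print(grid[0][1][1] ?? 0)   // 42
print(grid[1][0][0] == nil) // true
```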

Alex Popov
    Alternatively: `let courseInfo : [[[UInt16?]]] = Array(count: 10, repeatedValue: Array(count : 10, repeatedValue: Array(count: 10, repeatedValue: nil)))` – Martin R Jan 07 '16 at 22:52
  • Much neater, by putting the type declaration out front and letting the parser figure out the rest. +1, I'd add this to the answer but I don't know how to credit you in it :/ – Alex Popov Jan 07 '16 at 22:53
  • Don't worry about that :) – Martin R Jan 07 '16 at 22:55

Build yourself an abstraction to allow:

var my3DArrayOfOptionalUInt16 = Matrix<UInt16?> (initial: nil, dimensions: 10, 10, 10)

using something like:

struct Matrix<Item> {

  var items : [Item]
  var dimensions : [Int]

  var rank : Int {
    return dimensions.count
  }

  init (initial: Item, dimensions : Int...) {
    precondition(Matrix.allPositive(dimensions))
    self.dimensions = dimensions
    // Store all elements in one flat array of size d0 * d1 * ... * dn
    self.items = [Item](count: dimensions.reduce(1, combine: *), repeatedValue: initial)
  }

  subscript (indices: Int...) -> Item {
    get {
      precondition(Matrix.validIndices(indices, dimensions))
      return items[indexFor(indices)]
    }
    set {
      precondition(Matrix.validIndices(indices, dimensions))
      items[indexFor(indices)] = newValue
    }
  }

  // Row-major mapping: for dimensions [d0, d1, d2], indices [i0, i1, i2]
  // map to the flat index (i0 * d1 + i1) * d2 + i2
  func indexFor (indices: [Int]) -> Int {
    return zip(indices, dimensions).reduce(0) { $0 * $1.1 + $1.0 }
  }

  static func validIndices (indices: [Int], _ dimensions: [Int]) -> Bool {
    return indices.count == dimensions.count &&
      zip(indices, dimensions).reduce(true) { $0 && $1.0 >= 0 && $1.0 < $1.1 }
  }

  static func allPositive (values: [Int]) -> Bool {
    return values.reduce(true) { $0 && $1 > 0 }
  }
}
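If it helps to see the flattening on its own, the row-major mapping that indexFor performs can be sketched as a standalone function (flatIndex is just an illustrative name, not part of the struct above):

```swift
// Row-major flattening: for dimensions [d0, d1, d2],
// indices [i0, i1, i2] map to (i0 * d1 + i1) * d2 + i2
func flatIndex(_ indices: [Int], _ dimensions: [Int]) -> Int {
  return zip(indices, dimensions).reduce(0) { $0 * $1.1 + $1.0 }
}

print(flatIndex([1, 2, 3], [10, 10, 10])) // 123
print(flatIndex([0, 0, 0], [10, 10, 10])) // 0
```

Consecutive values of the last index land in consecutive slots of the flat storage, which is the usual row-major layout.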
GoZoner