I played around with Swift again today and needed an `undefined()` function: a function that can take on any type you want, but crashes when it is actually run/evaluated. That's useful if you haven't had the time to implement a certain expression yet but want to see whether the program type-checks.
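For reference, a minimal sketch of the same idea in current Swift syntax (`#file`/`#line` instead of the old `__FILE__`/`__LINE__`; the `lookup` helper is made up purely for illustration):

```swift
// A bottom-typed helper: type-checks as any T, traps if evaluated.
func undefined<T>(file: StaticString = #file, line: UInt = #line) -> T {
    fatalError("undefined", file: file, line: line)
}

// Type-checks even though one branch is unimplemented;
// traps only if that branch is actually taken.
func lookup(_ key: String) -> Int {
    key == "answer" ? 42 : undefined()
}

print(lookup("answer"))  // prints "42"
```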
I also implemented a `Result<T>` and a (questionable) function `chain` that returns the left `Result<>` if it is not successful, otherwise the right `Result<>`. The signature is therefore `chain<L,R>(l: Result<L>, r: @autoclosure () -> Result<R>) -> Result<R>`. I defined the right `Result<>` as `@autoclosure` because there's no need to evaluate it if the left is a failure.
I'm not interested in the usefulness of (or improvements to) any of this; I'm just interested in why my program crashes in the line marked [L3].
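The laziness that `@autoclosure` buys can be sketched with the stdlib `Result` of current Swift (the `Oops` error type and `rhs` helper are made up for the demo, not part of the original code):

```swift
// chain-like combinator: the right-hand Result is wrapped in an
// @autoclosure, so it is only evaluated when the left side succeeded.
func chain<L, R>(_ l: Result<L, Error>,
                 _ r: @autoclosure () -> Result<R, Error>) -> Result<R, Error> {
    switch l {
    case .failure(let e): return .failure(e)
    case .success:        return r()
    }
}

struct Oops: Error {}
let failure: Result<String, Error> = .failure(Oops())

var rhsEvaluated = false
func rhs() -> Result<Int, Error> {
    rhsEvaluated = true
    return .success(1)
}

_ = chain(failure, rhs())
print(rhsEvaluated)  // prints "false": the right side was never evaluated
```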
Please note that, as expected:
- [L1] works fine (`||` is lazily evaluated)
- [L2] works fine (`chain` is also lazy in its right argument)

But, weirdly, [L3] crashes the program evaluating `undefined()`.
From my understanding, L2 and L3 should be equivalent at runtime: L3's cast is just telling the type checker something it already knows... By the way, the same effect occurs if you change the type of `undefined` to `() -> Result<T>` (instead of `() -> T`); then it all works even without the `as Result<String>`.
This is all my code:
import Foundation

enum Result<T> {
    case Success(@autoclosure () -> T)
    case Failure(@autoclosure () -> NSError)
}

func undefined<T>(file: StaticString = __FILE__, line: UWord = __LINE__) -> T {
    fatalError("undefined", file: file, line: line)
}

func chain<L, R>(l: Result<L>, r: @autoclosure () -> Result<R>) -> Result<R> {
    switch l {
    case .Failure(let f):
        return .Failure(f())
    case .Success(let _):
        return r()
    }
}

func testEvaluation() {
    let error = NSError(domain: NSPOSIXErrorDomain, code: Int(ENOENT), userInfo: nil)
    let failure: Result<String> = .Failure(error)

    assert(true || undefined() as Bool)  // [L1]: works
    let x: Result<String> = chain(failure, undefined() as Result<String>)  // [L2]: works
    print("x = \(x)\n")
    let y: Result<String> = chain(failure, undefined())  // [L3]: CRASHES THE PROGRAM EVALUATING undefined()
    print("y = \(y)\n")
}

testEvaluation()
Believe it or not, the output of the program is:
x = (Enum Value)
fatal error: undefined: file main.swift, line 27
Illegal instruction: 4
That can't be right! Why would `as Result<String>` (which is the only difference between L2 and L3) change the output of the program? That should be handled entirely by the type checker, which means at compile time, no? Is this a compiler bug?
The quickest way for you to reproduce this should be to copy all my code into the clipboard and run:

cd /tmp
pbpaste > main.swift
xcrun -sdk macosx swiftc main.swift
./main
actual output:
x = (Enum Value)
fatal error: undefined: file main.swift, line 27
Illegal instruction: 4
expected output:
x = (Enum Value)
y = (Enum Value)