I need to validate the length of a string. The allowed values for the character count are:
- 6 – 9 characters
- 12 characters
- 15 characters
All strings with a different character count are invalid. Thus, I would like to create a Swift function that accepts a variable number of ranges and evaluates the string's length:
```swift
extension String {
    func evaluateLength(validCharacterCounts: Range<Int>...) -> Bool {
        // Implementation
    }
}
```
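(For context, a minimal sketch of what such an implementation might look like, assuming the function only needs to test whether the character count falls into any of the supplied ranges; the body itself is not the point of the question:)

```swift
extension String {
    // Sketch: true if the character count falls inside any of the given ranges.
    func evaluateLength(validCharacterCounts: Range<Int>...) -> Bool {
        return validCharacterCounts.contains { $0.contains(self.count) }
    }
}
```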
Now I can call the function with a single `Int` range:

```swift
"Live long and prosper".evaluateLength(validCharacterCounts: 6..<10)
```
and with multiple `Int` ranges:

```swift
"Live long and prosper".evaluateLength(validCharacterCounts: 6..<10, 15..<20)
```
But I cannot call the function with single, isolated integer values:

```swift
"Live long and prosper".evaluateLength(validCharacterCounts: 6..<10, 12, 15)
```
because `12` and `15` are typed as `Int` and not as `Range<Int>`:

> Swift compile error: Cannot convert value of type 'Int' to expected argument type 'Range'
Is there a way to treat a single Integer as a `Range` in Swift, like casting it automatically to `Range<Int>`?

(After all, `5` is equivalent to `5..<6`, so mathematically speaking `5` is a range as well.)
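One workaround that compiles today is to spell the single values as one-element ranges by hand, based on that same equivalence; the question is whether this conversion can happen automatically instead:

```swift
// Explicit one-element ranges compile, but are noisy to write:
"Live long and prosper".evaluateLength(validCharacterCounts: 6..<10, 12..<13, 15..<16)
```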