If the array in the JSON is at the root level, the code is simple and beautiful:
try JSONDecoder().decode([T].self, from: data)
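End to end, that happy path looks like this (the Coin model and inline JSON here are just a minimal sketch):

```swift
import Foundation

// Placeholder model; synthesized Decodable conformance is enough here.
struct Coin: Decodable {
    let name: String
    let price: Double
}

let json = """
[
    { "name": "Bitcoin", "price": 7783.19 },
    { "name": "Ethereum", "price": 198.48 }
]
""".data(using: .utf8)!

// The synthesized conformance walks the root-level unkeyed container for us.
let coins = try JSONDecoder().decode([Coin].self, from: json)
print(coins.count) // 2
```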
But how does it work under the hood?
I want to understand this in order to implement a custom decoder (with the same calling style) for the case where the array is not at the root level.
For example:
{
    "status": "success",
    "data": {
        "base": {
            "symbol": "USD",
            "sign": "$"
        },
        "coins": [
            {
                "name": "Bitcoin",
                "price": 7783.1949110647
            },
            {
                "name": "Ethereum",
                "price": 198.4835955777
            },
            {
                "name": "Tether",
                "price": 1.0026682789
            },
            {
                "name": "Litecoin",
                "price": 45.9617330332
            }
        ]
    }
}
struct Coin: Decodable {
    let name: String
    let price: Double

    init(from decoder: Decoder) throws {
        let rootContainer = try decoder.container(keyedBy: CodingKeys.self)
        let nestedContainer = try rootContainer.nestedContainer(keyedBy: CodingKeys.self, forKey: .data)
        var unkeyedContainer = try nestedContainer.nestedUnkeyedContainer(forKey: .coins)
        let coinContainer = try unkeyedContainer.nestedContainer(keyedBy: CodingKeys.self)
        name = try coinContainer.decode(String.self, forKey: .name)
        price = try coinContainer.decode(Double.self, forKey: .price)
    }

    enum CodingKeys: String, CodingKey {
        case data
        case coins
        case name
        case price
    }
}
It almost works!
With .decode(Coin.self, from: data)
it returns just a single Coin — the very first element of the array.
With .decode([Coin].self, from: data)
sadly, it throws the error:
Expected to decode Array, but found a dictionary instead.
It looks like I'm missing some last step to make it work the way I want.
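For what it's worth, I know the nesting can be peeled off with throwaway wrapper types (the Response and DataContainer names below are just placeholders), but I'd like to avoid that boilerplate and keep the one-line calling style:

```swift
import Foundation

// Hypothetical wrapper types mirroring the JSON shape, one per nesting level.
struct Response: Decodable {
    struct DataContainer: Decodable {
        let coins: [Coin]
    }
    let data: DataContainer
}

struct Coin: Decodable {
    let name: String
    let price: Double
}

let json = """
{
    "status": "success",
    "data": {
        "coins": [
            { "name": "Bitcoin", "price": 7783.1949110647 },
            { "name": "Ethereum", "price": 198.4835955777 }
        ]
    }
}
""".data(using: .utf8)!

// Works, but the call site has to dig through the wrappers.
let coins = try JSONDecoder().decode(Response.self, from: json).data.coins
print(coins.map(\.name)) // ["Bitcoin", "Ethereum"]
```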