I was just investigating D3's interpolateObject function and noticed some strange behavior. However, I'm not very familiar with D3, so it could be that I'm simply misunderstanding something. Given the following data and interpolation function:
var a = {"Country": "Ireland", "Year": 2010, "Data": 10};
var b = {"Country": "Ireland", "Year": 2015, "Data": 50};
var iFunc = d3.interpolateObject(a, b);
The following results are as expected:
console.log(iFunc(0.2)) // Returns: { Country: "Ireland", Year: 2011, Data: 18 }
console.log(iFunc(0.4)) // Returns: { Country: "Ireland", Year: 2012, Data: 26 }
However, when both function calls are passed to the same console.log, like this:
console.log(iFunc(0.2), iFunc(0.4))
The output is just the second object, twice:
{ Country: "Ireland", Year: 2012, Data: 26 } { Country: "Ireland", Year: 2012, Data: 26 }
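It also looks to me as though the two calls might be returning the very same object rather than two separate copies, although I'm not sure whether that is expected. Comparing the return values directly seems to bear that out:

console.log(iFunc(0.2) === iFunc(0.4)) // Logs: true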
And when the function calls are put inside an array, like so:
console.log([iFunc(0.2), iFunc(0.4)])
The interpolated values are doubled relative to the previous output, and both array elements are again identical:
[{ Country: "Ireland", Year: 2014, Data: 42 }, { Country: "Ireland", Year: 2014, Data: 42 }]
What is going on here?
The reason I am investigating this is that I'd like to create a series of intermediate objects using something like:
var iVals = d3.range(0, 1, 0.2).map( iFunc );
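Given the behavior above, I assume this would just give me the same object repeated five times, with every entry showing the values from the final call (t = 0.8) rather than five distinct snapshots:

console.log(iVals) // I expect all five entries to show: { Country: "Ireland", Year: 2014, Data: 42 }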
If anybody can show me how I could achieve this, I'd really appreciate it!