I am trying to calculate a weighted average from a 2D JavaScript array. This is the array:
var timestamp = [
[1524751200, 6000],
[1556287200, 6000],
[1587909600, 6000],
[1619445600, 6000],
[1650981600, 3000],
[1682517600, 3000],
[1714140000, 3000],
[1745676000, 3000],
[1777212000, 1500],
[1808748000, 1500],
[1840370400, 1500],
[1871906400, 1500],
[1903442400, 750],
[1934978400, 750],
[1966600800, 750],
[1998136800, 750],
[2029672800, 375],
[2061208800, 375],
[2092831200, 375],
[2124367200, 375],
[2155903200, 187.5],
[2187439200, 187.5],
[2219061600, 187.5],
[2250597600, 187.5],
[2282133600, 93.75],
[2313669600, 93.75],
[2345292000, 93.75],
[2376828000, 93.75],
[2408364000, 46.875],
[2439900000, 46.875],
[2471522400, 46.875],
[2503058400, 46.875],
[2534594400, 23.4375],
[2566130400, 23.4375],
[2597752800, 23.4375],
[2629288800, 23.4375],
[2660824800, 11.71875],
[2692360800, 11.71875],
[2723983200, 11.71875],
[2755519200, 11.71875],
[2787055200, 5.859375],
[2818591200, 5.859375],
[2850213600, 5.859375],
[2881749600, 5.859375],
[2913285600, 2.9296875],
[2944821600, 2.9296875],
[2976444000, 2.9296875],
[3007980000, 2.9296875],
[3039516000, 1.46484375],
[3071052000, 1.46484375],
[3102674400, 1.46484375],
[3134210400, 1.46484375],
[3165746400, 0.732421875],
[3197282400, 0.732421875],
[3228904800, 0.732421875],
[3260440800, 0.732421875],
[3291976800, 0.3662109375],
[3323512800, 0.3662109375],
[3355135200, 0.3662109375],
[3386671200, 0.3662109375],
[3418207200, 0.18310546875],
[3449743200, 0.18310546875],
[3481365600, 0.18310546875],
[3512901600, 0.18310546875],
[3544437600, 0.091552734375],
[3575973600, 0.091552734375],
[3607596000, 0.091552734375],
[3639132000, 0.091552734375],
[3670668000, 0.0457763671875],
[3702204000, 0.0457763671875],
[3733826400, 0.0457763671875],
[3765362400, 0.0457763671875],
[3796898400, 0.02288818359375],
[3828434400, 0.02288818359375],
[3860056800, 0.02288818359375],
[3891592800, 0.02288818359375],
[3923128800, 0.011444091796875],
[3954664800, 0.011444091796875],
[3986287200, 0.011444091796875],
[4017823200, 0.011444091796875],
[4049359200, 0.0057220458984375],
[4080895200, 0.0057220458984375],
[4112431200, 0.0057220458984375],
[4143967200, 0.0057220458984375],
[4175503200, 0.0028610229492188],
[4207039200, 0.0028610229492188],
[4238661600, 0.0028610229492188],
[4270197600, 0.0028610229492188],
[4301733600, 0.0014305114746094],
[4333269600, 0.0014305114746094],
[4364892000, 0.0014305114746094],
[4396428000, 0.0014305114746094],
[4427964000, 0.0007152557373047],
[4459500000, 0.0007152557373047],
[4491122400, 0.0007152557373047],
[4522658400, 0.0007152557373047],
[4554194400, 0.00035762786865235],
[4585730400, 0.00035762786865235],
[4617352800, 0.00035762786865235],
[4648888800, 0.00035762786865235]
];
Column 1 is a Unix timestamp; each one falls on 26.04 (April 26) of a year, covering the next 100 years. Column 2 is a value that halves every 4 years. I need the weighted average of all the column-2 values, where each value's weight is the number of rows it appears in. The condition I must apply: the user supplies a date, and only the rows whose timestamps are smaller than that date should enter the calculation. I have no idea where to start.
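To make the requirement concrete, here is my best guess at a sketch (the function name weightedAverageBefore and the assumption that the user input arrives as a date string like "2030-01-01" are mine, not requirements):

function weightedAverageBefore(data, cutoffDateString) {
  // Convert the user's date to a Unix timestamp in seconds,
  // the same unit as column 1 of the array.
  var cutoff = new Date(cutoffDateString).getTime() / 1000;

  // Keep only the rows whose timestamp is smaller than the cutoff.
  var rows = data.filter(function (row) {
    return row[0] < cutoff;
  });

  if (rows.length === 0) return NaN; // no rows before the cutoff

  // Each row counts once, so a value's weight is just the number of
  // rows it occurs in; the weighted average therefore reduces to the
  // plain mean of the filtered column-2 values.
  var sum = rows.reduce(function (acc, row) {
    return acc + row[1];
  }, 0);

  return sum / rows.length;
}

// Example: average of all values dated before 1 Jan 2030
console.log(weightedAverageBefore(timestamp, "2030-01-01"));

Is this the right way to think about it, or is there a more idiomatic approach?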