
I have a data set and a list of model points derived from it, and I would like to estimate how well the model fits the data. What is the most appropriate statistical test to use? Here is some example data, as tuples of (n, x, y) for the real and model values.

points = [(0, 710.804, 493.076), (1, 710.117, 491.902), (2, 709.565, 491.409), (3, 709.036, 490.947), (4, 707.839, 490.396), (5, 707.424, 491.456), (6, 706.889, 491.887), (7, 705.913, 492.917), (8, 705.037, 494.022), (9, 704.58, 494.882), (10, 703.758, 496.085), (11, 704.105, 496.934), (12, 704.552, 497.723), (13, 704.833, 498.17), (14, 705.204, 498.656), (15, 706.027, 498.929), (16, 706.932, 499.248), (17, 708.041, 499.156), (18, 708.849, 498.768), (19, 709.379, 498.487), (20, 709.797, 497.853), (21, 710.272, 497.212), (22, 710.494, 496.753), (23, 710.871, 495.957), (24, 711.033, 495.003), (25, 711.018, 493.997), (26, 710.804, 493.076)]

model = [(0, 712.284, 492.011), (1, 711.531, 489.898), (2, 710.708, 488.997), (3, 709.905, 488.067), (4, 707.919, 486.965), (5, 707.237, 489.079), (6, 706.417, 489.966), (7, 705.162, 491.933), (8, 704.354, 493.699), (9, 703.776, 494.777), (10, 702.083, 496.42), (11, 702.5, 497.663), (12, 702.926, 498.969), (13, 703.25, 499.746), (14, 703.711, 500.649), (15, 705.015, 501.098), (16, 706.428, 501.762), (17, 708.238, 501.564), (18, 709.486, 500.737), (19, 710.268, 500.204), (20, 710.748, 499.029), (21, 711.226, 497.932), (22, 711.343, 497.201), (23, 711.729, 496.139), (24, 712.0, 494.919), (25, 712.217, 493.526), (26, 712.284, 492.011)]
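For context, here is a minimal sketch of the kind of comparison I have in mind: pairing the points by their index n, taking the Euclidean distance between each real/model pair, and summarising with the root-mean-square error. This is just an illustration (on a subset of the data above), not a claim that RMSE is the right statistic, which is what I'm asking about.

```python
import math

# Subset of the (n, x, y) tuples from the question, for illustration.
points = [(0, 710.804, 493.076), (1, 710.117, 491.902),
          (2, 709.565, 491.409), (3, 709.036, 490.947)]
model = [(0, 712.284, 492.011), (1, 711.531, 489.898),
         (2, 710.708, 488.997), (3, 709.905, 488.067)]

# Per-point Euclidean residuals, pairing real and model points by index n.
residuals = [math.hypot(px - mx, py - my)
             for (_, px, py), (_, mx, my) in zip(points, model)]

# Root-mean-square error over all paired points.
rmse = math.sqrt(sum(r * r for r in residuals) / len(residuals))
print(round(rmse, 3))
```

Is something along these lines reasonable, or is there a proper statistical test for this situation?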
