Sorry, I'm new here at Stack Overflow. I am working with Self-Organizing Maps (SOM), which project high-dimensional data onto a 2D plane where each "node" (box) in the plane holds a weight vector with relatively small values, ranging from 0 to 1.
Here is an example of the lattice (matrix) that is produced, which I want to color with RGB values.
For example, a box in the illustration contains weights like
[0.000124, 0.000000004, 0.999923, 0.119999220, 0.999311, 0.000552, 0.00000001223, 0.00045, 0.00001132, 0.99999998211, 0.00000000000000008821]
My algorithm ONLY considers the FIRST THREE VALUES:

    // Note: the cast has to wrap the whole product. My original
    // (int) this.getDoubleElementAt(0) * 255 truncated the weight to 0
    // before multiplying, so the red channel was effectively always 0.
    this.setNodeColor(new Color((int) (this.getDoubleElementAt(0) * 255),
                                (int) (this.getDoubleElementAt(1) * 255),
                                (int) (this.getDoubleElementAt(2) * 255)));

and this algorithm produces the illustration above.
Each box in the lattice has its own weight vector like the one posted above. Now consider the given vector: it belongs to the first BOX, at row 0, column 0 of the lattice.
Maybe you are wondering why it has a BLUE color. The reason is that the first 3 values of the vector are approximately (0, 0, 1); scaled to RGB this produces a BLUEISH color, since 0.999923 x 255 truncates to 254, giving (0, 0, 254).
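Just to double-check that math in code (plain java.awt.Color and the same truncating cast as above; the array w is simply the first three example values):

    double[] w = {0.000124, 0.000000004, 0.999923}; // first three weights of the box
    int r = (int) (w[0] * 255); // 0.000124 * 255 ~ 0.03, truncates to 0
    int g = (int) (w[1] * 255); // effectively 0
    int b = (int) (w[2] * 255); // 0.999923 * 255 ~ 254.98, truncates to 254
    System.out.println(new java.awt.Color(r, g, b)); // java.awt.Color[r=0,g=0,b=254]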
Now I want to use ALL the weights of a node to produce the color for the BOX, so that the lattice shows color variations like below.
I am also aware that, since the values are so small, I probably need some NORMALIZATION.
I want to ask for any ideas on how to:
- NORMALIZE the values
- USE all the WEIGHTS to come up with a color to paint the NODE (this is the more important part; see the sketch below for what I mean)
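To make the question concrete, here is a rough sketch of the kind of thing I have in mind. The min-max scaling, the grouping of the weights into three averaged buckets, and all the names (NodeColorSketch, minMaxNormalize, colorFromAllWeights) are just my guesses at an approach, not something I know to be correct:

    import java.awt.Color;

    public class NodeColorSketch { // hypothetical helper, not my real class

        // Min-max normalization: stretch the weights so the smallest maps to 0
        // and the largest to 1 (my guess at what "normalize" should mean here).
        static double[] minMaxNormalize(double[] w) {
            double min = Double.POSITIVE_INFINITY, max = Double.NEGATIVE_INFINITY;
            for (double v : w) {
                min = Math.min(min, v);
                max = Math.max(max, v);
            }
            double range = max - min;
            double[] out = new double[w.length];
            for (int i = 0; i < w.length; i++) {
                out[i] = (range == 0) ? 0 : (w[i] - min) / range;
            }
            return out;
        }

        // One possible way to use ALL weights: split them into three roughly
        // equal groups and average each group into one RGB channel.
        static Color colorFromAllWeights(double[] w) {
            double[] n = minMaxNormalize(w);
            double[] sum = new double[3];
            int[] count = new int[3];
            for (int i = 0; i < n.length; i++) {
                int bucket = i * 3 / n.length; // 0, 1, or 2
                sum[bucket] += n[i];
                count[bucket]++;
            }
            return new Color((int) (sum[0] / count[0] * 255),
                             (int) (sum[1] / count[1] * 255),
                             (int) (sum[2] / count[2] * 255));
        }

        public static void main(String[] args) {
            double[] weights = {0.000124, 0.000000004, 0.999923, 0.119999220,
                    0.999311, 0.000552, 0.00000001223, 0.00045, 0.00001132,
                    0.99999998211, 0.00000000000000008821};
            System.out.println(colorFromAllWeights(weights));
        }
    }

Is averaging groups like this a reasonable way to collapse 11 weights into 3 channels, or is there a standard technique for SOM coloring?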
Thank you. Feel free to ask if anything needs more clarification.