I have a 3D point cloud (XYZ) where Z can be either a position or an energy. I want to project the points onto a 2D surface as an n-by-m grid (in my problem n = m), such that each grid cell holds the maximum difference of the Z values of the points in that cell (when Z is position), or the sum of their Z values (when Z is energy).
For example, suppose there are 500 points in the range 0 <= (x,y) <= 20. Say the xy-plane is partitioned n-by-m, e.g. 4-by-4; by which I mean both the x and y directions are split into 4 partitions with an interval of 5 (so the maximum is 20). Each of these cells should then hold the sum, or the maximum difference, of the Z values of the points that fall in the corresponding column of the defined xy-plane.
I made a simple XYZ array just for a test, where in this case Z denotes the energy of each point:
% Build a test set of points on an integer grid with random extents.
% random(...) needs the Statistics Toolbox; pts avoids shadowing the
% built-in function "table".
n = 1;
for i = 1:2*round(random('Uniform',1,5))
    for j = 1:2*round(random('Uniform',1,5))
        pts(n,:) = [i, j, random('Normal',1,1)];   % [x y z], z ~ N(1,1)
        n = n + 1;
    end
end
How can this be done without loops?
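For concreteness, here is a vectorized sketch of the kind of result I am after. I am not sure accumarray is the right tool; the grid size n, the range [lo, hi], the index clamping, and the random test points are all my own assumptions, not part of the actual problem data.

```matlab
% Sketch: bin N points into an n-by-n grid and aggregate Z per cell.
% Assumes all points lie in [lo, hi] in both x and y.
n  = 4;  lo = 0;  hi = 20;
pts = [rand(500,2)*(hi-lo)+lo, randn(500,1)];    % 500 test points [x y z]

w  = (hi - lo) / n;                              % cell width
ix = min(floor((pts(:,1) - lo) / w) + 1, n);     % x bin index, clamped at n
iy = min(floor((pts(:,2) - lo) / w) + 1, n);     % y bin index, clamped at n

% Energy case: sum of Z over each cell (empty cells get 0)
S = accumarray([ix iy], pts(:,3), [n n], @sum, 0);

% Position case: maximum difference of Z within each cell
D = accumarray([ix iy], pts(:,3), [n n], @(z) max(z) - min(z), 0);
```

Both S and D come out n-by-n, with one value per grid cell, and no explicit loops.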