We are using CesiumJS as an overlay in a spherical-geometry app. The app renders its own sphere at a known size in pixels, and the user may zoom in/out to change that size. We would like to superimpose CesiumJS as another layer and have its globe rendered to match the size of the app's sphere. In the screenshot below, the red circle is our sphere (with a known radius in pixels) and the earth globe is rendered by CesiumJS.
We were able to compute the projected size of the globe (in pixels) at the camera projection plane using the camera frustum:
```typescript
const frustum = camera.frustum as PerspectiveFrustum;

// Compute the projected area of a pixel at the globe XY-plane?
const pixelDimensions = frustum.getPixelDimensions(
  scene.drawingBufferWidth,
  scene.drawingBufferHeight,
  camera.getMagnitude(), // distance of the camera from the projection plane?
  (scene as any).pixelRatio,
  tmpCartesian2
);

// Projected globe radius in pixels
const globeRadius = CESIUM_GLOBE_RADIUS_IN_METERS / pixelDimensions.y;
```
For additional background:

- `CESIUM_GLOBE_RADIUS_IN_METERS` is the actual earth ellipsoid radius (≈ 6.371 million meters).
- `getPixelDimensions()` gives the width/height (in meters) of the area covered by one screen pixel at a certain distance (the third argument of the function call).
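Inverting that relation is how we compute the camera distance to move to. A minimal sketch outside Cesium (the function names are our own, and we assume `getPixelDimensions()` uses the standard pinhole relation `pixelSizeY(d) = 2 · d · tan(fovy / 2) / bufferHeight` for a perspective frustum):

```typescript
// Sketch of the matching calculation, independent of Cesium.
// Assumed pinhole relation for a perspective frustum:
//   pixelSizeY(d) = 2 * d * tan(fovy / 2) / bufferHeight   (meters per pixel)

const CESIUM_GLOBE_RADIUS_IN_METERS = 6_371_000;

// Size (in meters) covered by one screen pixel at distance d
function pixelSizeY(d: number, fovy: number, bufferHeight: number): number {
  return (2 * d * Math.tan(fovy / 2)) / bufferHeight;
}

// Projected globe radius in pixels when the camera is at distance d
function projectedGlobeRadiusPx(d: number, fovy: number, bufferHeight: number): number {
  return CESIUM_GLOBE_RADIUS_IN_METERS / pixelSizeY(d, fovy, bufferHeight);
}

// Distance at which the globe should project to targetRadiusPx pixels
// (algebraic inverse of projectedGlobeRadiusPx)
function distanceForRadiusPx(targetRadiusPx: number, fovy: number, bufferHeight: number): number {
  return (
    (CESIUM_GLOBE_RADIUS_IN_METERS * bufferHeight) /
    (2 * targetRadiusPx * Math.tan(fovy / 2))
  );
}
```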
Using this approach we calculate the required distance to move the CesiumJS camera so that the globe matches the sphere's dimensions. Our debugging output shows that after the camera translates, the CesiumJS globe is projected at the same size as our app's unit sphere (agreeing to several decimal places). However, the screenshot shows otherwise: it is off by about a hundred pixels when zoomed in close. We also notice that the difference shrinks as we zoom out:
We suspect we are supplying an incorrect value for the third argument, but we don't know what value to pass instead.
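For reference, one geometric effect with this same signature (large error up close, vanishing as we zoom out) is that `R / pixelSize(d)` treats the globe as a flat disc of radius R at the distance of its center, whereas the visible silhouette of a sphere is its tangent cone, with angular radius `asin(R / d)`. A sketch of the exact projection for a sphere centered on the optical axis (helper names are our own, not Cesium API):

```typescript
// Exact projected radius (in pixels) of a sphere of radius R whose center
// lies on the optical axis, with the camera at distance d from the CENTER.
// The silhouette is the tangent cone with half-angle asin(R / d); it projects
// through a pinhole camera with focal length (in pixels)
//   f = (bufferHeight / 2) / tan(fovy / 2)

function exactSphereRadiusPx(R: number, d: number, fovy: number, bufferHeight: number): number {
  const halfAngle = Math.asin(R / d); // angular radius of the silhouette
  const focalPx = bufferHeight / (2 * Math.tan(fovy / 2));
  return focalPx * Math.tan(halfAngle);
}

// Linear estimate used above: R / pixelSize(d), i.e. tan(halfAngle) ≈ R / d
function linearSphereRadiusPx(R: number, d: number, fovy: number, bufferHeight: number): number {
  return (R * bufferHeight) / (2 * d * Math.tan(fovy / 2));
}
```

Since `tan(asin(x)) = x / sqrt(1 - x²) > x`, the exact silhouette radius is always larger than the linear estimate, and the two converge as d grows relative to R, i.e. as the camera zooms out.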