So the idea is quite simple: given the sun's position (azimuth and elevation), I want my app to display a shape in augmented reality when the camera points at the sun.

So there are a few steps:

  1. Convert the azimuth and elevation into radians, then into Cartesian coordinates, to get a simple vector {x, y, z}.
  2. Get the phone's gyroscope data to obtain its orientation in space as a 3D vector {x, y, z}.
  3. Compute new coordinates for the sun relative to the phone's orientation.
  4. Display a random shape with Three.js at these coordinates.

Steps 1 and 2 are quite easy. There are a lot of APIs out there that give the sun's position for a given location. I then used a formula to convert the sun's spherical coordinates into Cartesian ones:

x = R * cos(ϕ) * sin(θ)
y = R * cos(ϕ) * cos(θ)
z = R * sin(ϕ)

where R is the distance of the point from the origin, θ is the azimuth and ϕ is the elevation.
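
In code, the conversion looks something like this (a minimal sketch; sunToCartesian is just a name I made up, and I'm assuming the API returns the angles in degrees):

// Convert the sun's azimuth θ and elevation ϕ (in degrees)
// into a Cartesian vector {x, y, z} at distance R from the origin.
function sunToCartesian(azimuthDeg, elevationDeg, R = 1) {
  const theta = azimuthDeg * (Math.PI / 180);
  const phi = elevationDeg * (Math.PI / 180);
  return {
    x: R * Math.cos(phi) * Math.sin(theta),
    y: R * Math.cos(phi) * Math.cos(theta),
    z: R * Math.sin(phi),
  };
}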

I got the device's orientation in space with Expo, using their DeviceMotion API (documentation here).
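
With the expo-sensors package, the subscription looks roughly like this (a sketch; the rotation object holds the alpha/beta/gamma Euler angles in radians):

import { DeviceMotion } from 'expo-sensors';

let deviceAngles = { alpha: 0, beta: 0, gamma: 0 };

DeviceMotion.setUpdateInterval(100); // ms between updates
const subscription = DeviceMotion.addListener(({ rotation }) => {
  if (rotation) {
    // Store the latest orientation for the render loop.
    deviceAngles = rotation;
  }
});
// subscription.remove() when done.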

I'm really struggling with the third step. I don't know how to combine the sun's and the device's coordinates in space, and project the whole thing through Three.js's perspective camera.

I found this post the other day: Compare device 3D orientation with the sun position, but I found the explanations a bit confusing.
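
My current (untested) idea, adapted from three.js's DeviceOrientationControls, is to leave the shape at the sun's fixed world coordinates and instead rotate the camera to match the phone's orientation, so the shape only becomes visible when the phone points at the sun:

const euler = new THREE.Euler();
// The camera looks out of the back of the device, hence this extra rotation.
const screenTransform = new THREE.Quaternion(-Math.sqrt(0.5), 0, 0, Math.sqrt(0.5));

function updateCamera(camera, alpha, beta, gamma) {
  // The device reports intrinsic ZXY angles; 'YXZ' is the matching three.js order.
  euler.set(beta, alpha, -gamma, 'YXZ');
  camera.quaternion.setFromEuler(euler);
  camera.quaternion.multiply(screenTransform);
  // Note: this ignores screen-orientation changes (portrait/landscape).
}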

Let's say I want to display a cube with Three.js:

// A small green cube that should end up at the sun's position.
const geometry = new THREE.BoxGeometry(0.07, 0.07, 0.07);
const material = new THREE.MeshBasicMaterial({ color: 0x00ff00 });
const cube = new THREE.Mesh(geometry, material);

// Placeholder position: one unit in front of the camera.
cube.position.x = 0;
cube.position.y = 0;
cube.position.z = -1;

The final goal here is to find the correct {x, y, z} so the cube is displayed at the sun's location. This vector will of course have to be updated every time the user moves their phone in space.
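
Putting the sketches above together, I imagine the whole thing would look roughly like this (azimuth and elevation come from the sun-position API; camera is the Three.js perspective camera):

// Place the cube at the sun's fixed world coordinates once.
// Beware: my formula has z pointing up, while three.js has y up,
// so the axes probably need remapping (e.g. swapping y and z).
const sun = sunToCartesian(azimuth, elevation, 5);
cube.position.set(sun.x, sun.y, sun.z);

// Rotate the camera whenever the phone moves.
DeviceMotion.addListener(({ rotation }) => {
  if (rotation) {
    updateCamera(camera, rotation.alpha, rotation.beta, rotation.gamma);
  }
});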
