I'm trying to use Python to compute the apparent magnitude of the star Algol.
(Wikipedia: Algol's magnitude is usually near-constant at 2.1, but regularly dips to 3.4 every 2.86 days during the roughly 10-hour-long partial eclipses.)
Where's the error in my code? I'm getting a constant 2.2 and no visible dips towards that 3.4.
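For reference, here's a rough sketch of the kind of light curve I'd expect from that description (my own toy model built only from those numbers; placing an eclipse midpoint at hour 0 is an arbitrary assumption, not Algol's real ephemeris):

# Toy expected light curve: period 2.86 days, ~10-hour eclipses, magnitude 2.1 to 3.4.
PERIOD_H = 2.86 * 24        # orbital period in hours
ECLIPSE_H = 10.0            # approximate eclipse duration in hours
M_BRIGHT, M_DIM = 2.1, 3.4  # out-of-eclipse and mid-eclipse magnitudes

def expected_magnitude(hours):
    phase = hours % PERIOD_H
    dt = min(phase, PERIOD_H - phase)  # hours from the nearest (assumed) eclipse midpoint
    if dt > ECLIPSE_H / 2:
        return M_BRIGHT
    # Simple linear ramp down to the minimum at mid-eclipse.
    return M_BRIGHT + (M_DIM - M_BRIGHT) * (1 - dt / (ECLIPSE_H / 2))

for h in range(0, 100, 4):
    print(h, round(expected_magnitude(h), 2))

So over a 100-hour window like the one my script samples, I'd expect at least one clear dip. Here's my actual code: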
from skyfield.api import Star, load
from skyfield.data import hipparcos
from datetime import timedelta
import math

# Load the JPL ephemeris DE421 (covers 1900-2050).
planets = load('de421.bsp')
earth = planets['earth']

# Load the Hipparcos star catalogue.
with load.open(hipparcos.URL) as f:
    df = hipparcos.load_dataframe(f)

# Create a timescale and get the current time.
ts = load.timescale()
now = ts.now()

# Algol is HIP 14576 in the Hipparcos catalogue.
algol = Star.from_dataframe(df.loc[14576])

absolute_magnitude = -0.07  # Algol's absolute magnitude

for x in range(100):
    time = now + timedelta(hours=x)
    astrometric = earth.at(time).observe(algol)
    ra, dec, distance = astrometric.radec(epoch=time)
    # Distance modulus m = M + 5*log10(d_pc) - 5, converting the distance from AU to parsecs.
    apparent_magnitude = absolute_magnitude + 5 * math.log10(distance.au / 206264.80749673) - 5
    print(apparent_magnitude)
Output:
2.20099154236185
2.200991542408758
2.2009915424554958
2.2009915425020647
etc.
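For what it's worth, the constant 2.20 is what the bare distance-modulus line gives for a star at roughly Algol's catalogue distance (about 28.5 pc, around 90 light-years; that round figure is my own assumption, not something the script computes separately):

import math

absolute_magnitude = -0.07
distance_pc = 28.5  # assumed approximate distance to Algol in parsecs
print(absolute_magnitude + 5 * math.log10(distance_pc) - 5)  # prints roughly 2.20

So the magnitude value itself looks consistent; it's the variation that never shows up.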