
I'm trying to use Python to compute the apparent magnitude of the star Algol.

Wikipedia: "Algol's magnitude is usually near-constant at 2.1, but regularly dips to 3.4 every 2.86 days during the roughly 10-hour-long partial eclipses."

Where's the error in my code? I'm getting 2.2 and no visible dips toward that 3.4.

from skyfield.api import Star, load
from skyfield.data import hipparcos
from datetime import timedelta
import math

# Load the JPL ephemeris DE421 (covers 1900-2050).
planets = load('de421.bsp')
earth = planets['earth']

# Load the Hipparcos star catalog
with load.open(hipparcos.URL) as f:
    df = hipparcos.load_dataframe(f)

# Create a timescale
ts = load.timescale()
now = ts.now()

# Load Algol (HIP 14576) from the Hipparcos catalog
algol = Star.from_dataframe(df.loc[14576])


absolute_magnitude = -0.07

for x in range(100):
    time = now + timedelta(hours=x)
    astrometric = earth.at(time).observe(algol)

    ra, dec, distance = astrometric.radec(epoch=time)

    # distance modulus m = M + 5*log10(d_pc) - 5, with distance.au converted to parsecs
    apparent_magnitude = absolute_magnitude + (5 * math.log10((distance.au/206264.80749673))) - 5
    print(apparent_magnitude)

Output:

2.20099154236185
2.200991542408758
2.2009915424554958
2.2009915425020647
etc.
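
The line that computes apparent_magnitude is the standard distance modulus, m = M + 5*log10(d_pc) - 5 (distance.au / 206264.8 converts astronomical units to parsecs), so it depends only on Algol's distance and can only ever print a near-constant value. As a quick check, the distance it implies can be backed out of the printed number (inferred from the output above, not looked up in the catalogue):

import math

M = -0.07                      # absolute magnitude used above
m_printed = 2.201              # value the loop prints

# m = M + 5*log10(d_pc) - 5  =>  d_pc = 10 ** ((m - M + 5) / 5)
d_pc = 10 ** ((m_printed - M + 5) / 5)
print(d_pc)                               # roughly 28.5 parsecs
print(M + 5 * math.log10(d_pc) - 5)       # gives back ~2.201, regardless of the time step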
  • I'm showing the absolute magnitude of Algol as -0.11. Where did you get -0.07? And nothing in here is taking into account the occultation. You're computing a result strictly based on distance. That's never going to dip. – Tim Roberts May 07 '23 at 00:06
  • -0.07 is mentioned on Wikipedia. Changing it to -0.11 results in 2.16. Could you please give some tips on how to take the occultation into account? Cause I have no idea. – velkyvont May 07 '23 at 01:04
  • No idea. That's not a computing question, that's an astronomy question. You might check out https://astronomy.stackexchange.com/ – Tim Roberts May 07 '23 at 05:56

0 Answers
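
One way to take the eclipse into account, along the lines the comments ask about, is to bolt a simple phase model onto the distance-modulus value: fold time on Algol's ~2.867-day period and add a dip of up to 3.4 - 2.1 ≈ 1.3 magnitudes during the roughly 10-hour primary eclipse. The sketch below does this with a crude triangular profile; the epoch of primary minimum (T0_JD) is a placeholder that would need to come from a real ephemeris (e.g. the GCVS), and the period, depth, and duration are simply the figures quoted from Wikipedia in the question.

PERIOD_DAYS = 2.867            # orbital period (the question quotes ~2.86 days)
T0_JD = 2460000.0              # placeholder epoch of primary minimum (Julian Date) - take from an ephemeris
ECLIPSE_HOURS = 10.0           # approximate duration of the partial eclipse
DEPTH_MAG = 3.4 - 2.1          # full depth of the dip in magnitudes

def eclipse_dip(jd):
    """Extra (fainter) magnitudes to add near primary minimum at Julian Date jd."""
    phase = ((jd - T0_JD) / PERIOD_DAYS) % 1.0            # 0.0 at mid-eclipse
    hours_from_minimum = min(phase, 1.0 - phase) * PERIOD_DAYS * 24.0
    half = ECLIPSE_HOURS / 2.0
    if hours_from_minimum >= half:
        return 0.0
    return DEPTH_MAG * (1.0 - hours_from_minimum / half)  # crude triangular eclipse profile

# Inside the loop above, time.tt gives the Julian Date (TT), so the dip can be added with:
# apparent_magnitude += eclipse_dip(time.tt)

With a correct T0_JD, the loop's output would then sit near 2.2 most of the time and climb toward roughly 3.4-3.5 for a few hours around each minimum, instead of staying flat.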