I am using APScheduler to create background jobs in my FastAPI application. However, when I start the application with more than one worker, the job is duplicated across the workers, resulting in all kinds of errors and potential security flaws.
Here is a simplified example:
from apscheduler.schedulers.asyncio import AsyncIOScheduler
from fastapi import FastAPI

def some_job():
    print("hello world")

app = FastAPI()

# the scheduler is created at module level, so every worker that imports
# this module gets its own copy
shed = AsyncIOScheduler()
shed.add_job(some_job, "interval", seconds=5)

@app.get("/test")
async def test():
    return {"Hello": "world"}

shed.start()
To run it, use: uvicorn main:app --workers 4
This results in hello world being printed 4 times every time the job is triggered, once per worker.
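As far as I can tell, each uvicorn worker is a separate process that imports the module and starts its own scheduler, which is why the job fires once per worker. A quick check that seems to confirm this is printing the process ID inside the job (same job as above, just with os.getpid() added):

import os

def some_job():
    # with 4 workers this prints 4 different PIDs every interval,
    # showing that each worker process runs its own scheduler
    print(f"hello world from pid {os.getpid()}")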
Is there a way to run a single scheduler instance shared across all the workers (at the parent process level)?
I researched some solutions online, but most of them use Celery and similar modules, or lock files and shared memory. Both options are too complicated, and I would prefer to use the minimum number of extra modules.
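For reference, the lock-file approach I found looks roughly like this (a minimal sketch using only the standard library's fcntl, so it is Unix-only; the path /tmp/scheduler.lock is just a placeholder name). Each worker tries to take an exclusive non-blocking lock, and only the one that succeeds starts the scheduler:

import fcntl

def acquire_scheduler_lock(path="/tmp/scheduler.lock"):
    # try to take an exclusive, non-blocking lock on the file;
    # exactly one worker process will succeed
    f = open(path, "w")
    try:
        fcntl.flock(f, fcntl.LOCK_EX | fcntl.LOCK_NB)
        return f  # keep the file object alive so the lock stays held
    except OSError:
        f.close()
        return None

lock = acquire_scheduler_lock()
if lock is not None:
    # only the lock-holding worker schedules and starts the job
    shed.add_job(some_job, "interval", seconds=5)
    shed.start()

This does limit the scheduler to a single worker, but it feels like more machinery than such a common setup should need.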