
I have installed Airflow locally and I am changing the executor to run parallel tasks. For that, I changed the following (the resulting config section is sketched after the list):

1- the Database to Postgres 13.3

2- in the config file sql_alchemy_conn = postgresql+psycopg2://postgres:postgres@localhost/postgres

3- executor = LocalExecutor
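
For reference, the relevant part of airflow.cfg now looks roughly like this (a sketch of the [core] section with the values above; the file lives under AIRFLOW_HOME, which defaults to ~/airflow):

[core]
# from change 2 (the Postgres 13.3 database from change 1)
sql_alchemy_conn = postgresql+psycopg2://postgres:postgres@localhost/postgres
# from change 3
executor = LocalExecutor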

I have checked the DB and there are no errors: airflow db check --> INFO - Connection successful, and airflow db init --> Initialization done.

These are the errors I receive, even though I don't use SQLite at all:

1- {dag_processing.py:515} WARNING - Because we cannot use more than 1 thread (parsing_processes = 2 ) when using SQLite. So we set parallelism to 1.

2- I receive this error from the Airflow web interface:

The scheduler does not appear to be running.
The DAGs list may not update, and new tasks will not be scheduled.

So should I make any other changes?

2 Answers


Did you actually restart your Airflow webserver/scheduler after you changed the config?
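
Restarting here means stopping the running scheduler and webserver processes and starting them again, so that the new airflow.cfg is re-read. A minimal sketch, assuming you run them in the foreground (adjust if you use systemd or another supervisor):

# stop the old processes first (Ctrl+C or kill), then:
airflow scheduler
airflow webserver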

Jarek Potiuk

The following logging statement:

{dag_processing.py:515} WARNING - Because we cannot use more than 1 thread (parsing_processes = 2 ) when using SQLite. So we set parallelism to 1.

It comes from Airflow 2.0.1, specifically from this code fragment:

if 'sqlite' in conf.get('core', 'sql_alchemy_conn') and self._parallelism > 1:
    self.log.warning(
        "Because we cannot use more than 1 thread (parsing_processes = "
        "%d ) when using sqlite. So we set parallelism to 1.",
        self._parallelism,
    )
    self._parallelism = 1

This means that, based on your [core] sql_alchemy_conn setting, it is somehow still on 'sqlite'. If you are certain you changed airflow.cfg and restarted all Airflow services, it might be picking up another copy of airflow.cfg than you expect. Please inspect the logs to verify it is using the correct one.
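
To double-check which configuration is actually picked up, you can ask Airflow's own config parser (a minimal sketch against Airflow 2.0.x; run it with the same user and environment as your scheduler):

import os

# conf is the same parser used in the fragment above; AIRFLOW_CONFIG is the
# resolved path of the airflow.cfg file that was loaded.
from airflow.configuration import AIRFLOW_CONFIG, conf

print("AIRFLOW_HOME:", os.environ.get("AIRFLOW_HOME", "~/airflow (default)"))
print("config file:", AIRFLOW_CONFIG)
print("sql_alchemy_conn:", conf.get("core", "sql_alchemy_conn"))
print("executor:", conf.get("core", "executor"))

Also note that environment variables of the form AIRFLOW__CORE__SQL_ALCHEMY_CONN override whatever is in the file, so check whether one of those is set for the scheduler process.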

Jorrick Sleijster