You can refer to the answer at the following link:
https://alanbuxton.wordpress.com/2018/10/09/using-local-settings-in-a-scrapy-project/
I've copied it here for quick reference:
Edit the settings.py file so it reads from additional settings files depending on a SCRAPY_ENV environment variable.
Move all the settings files to a separate config directory (and change scrapy.cfg so it knows where to look; see the sketch just below).
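For example, if settings.py itself ends up inside the config package, scrapy.cfg would point at it like this (the exact module path is an assumption and depends on your project layout):
[settings]
default = config.settings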
The magic happens at the end of settings.py:
from importlib import import_module
from scrapy.utils.log import configure_logging
import logging
import os
SCRAPY_ENV = os.environ.get('SCRAPY_ENV', None)
if SCRAPY_ENV is None:
    raise ValueError("Must set SCRAPY_ENV environment var")
logger = logging.getLogger(__name__)
configure_logging({'LOG_FORMAT': '%(levelname)s: %(message)s'})
# Load the file if it exists; incorporate any names starting with an
# uppercase letter into globals()
def load_extra_settings(fname):
    if not os.path.isfile("config/%s.py" % fname):
        logger.warning("Couldn't find %s, skipping" % fname)
        return
    mdl = import_module("config.%s" % fname)
    names = [x for x in mdl.__dict__ if x[0].isupper()]
    globals().update({k: getattr(mdl, k) for k in names})
load_extra_settings("secrets")
load_extra_settings("secrets_%s" % SCRAPY_ENV)
load_extra_settings("settings_%s" % SCRAPY_ENV)
Then, in any Python file where you want to read the variables defined in the settings, use the following code:
from scrapy.utils.project import get_project_settings
settings = get_project_settings()
env_variable = settings.get('ENV_VARIABLE')
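So, assuming a hypothetical spider called myspider and the settings_development.py example above, running SCRAPY_ENV=development scrapy crawl myspider would load config/secrets.py, config/secrets_development.py and config/settings_development.py (skipping any that don't exist), and settings.get('ENV_VARIABLE') would return the value defined there.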