The Dask client spams warnings in my Jupyter Notebook output. Is there a way to switch off Dask warnings?
The warning text looks like this: "distributed.worker - WARNING - Memory use is high but worker has no data to store to disk. Perhaps some other process is leaking memory? Process memory: 3.16 GB -- Worker memory limit: 4.20 GB"
The problem appears after running this code:
import pandas as pd
from sqlalchemy import create_engine, MetaData
from sqlalchemy import select, insert, func
import dask.dataframe as dd
from dask.distributed import Client
client = Client(n_workers=4, threads_per_worker=4, processes=False)
engine = create_engine(uri)
meta_core = MetaData()
meta_core.reflect(bind=engine)
table = meta_core.tables['table']
dd_main = dd.read_sql_table(
    table=table,
    uri=uri,
    index_col='id'
)
dd_main.head()
After executing the chunk above, I get so many of these warnings in every Jupyter cell that I can't even find my actual output among them.
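The only workaround I've found so far is raising the level of the logger named in the warning itself (`distributed.worker`) through the standard `logging` module, so that WARNING-level records are dropped. This is just a sketch of what I've tried; I'm not sure it's the intended way to configure this:

```python
import logging

# The warning text starts with "distributed.worker", which is the name
# of the Python logger that emits it. Raising its level to ERROR drops
# WARNING-level records, so the memory warnings no longer reach the
# notebook output. This does not fix the underlying memory pressure,
# it only hides the messages.
logging.getLogger("distributed.worker").setLevel(logging.ERROR)
```

Is there a supported Dask/distributed configuration option for this instead, so the warnings are silenced for all workers rather than just this one logger?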