I need to round a column in PySpark using Banker's Rounding (round-half-to-even, where values exactly halfway between two numbers are rounded to the nearest even one).
So far, I've tried this:
from pyspark.sql.functions import col, round as _round

# Multiply by the exchange rate, then round the result
df = df.withColumn(new_name, col(old_name) * col('ExchangeRate'))
df = df.select("*", _round(col(new_name)))
Even though I'm running this in Python 3 (where the built-in round() already uses Banker's Rounding), PySpark's round() still applies the HALF_UP rounding mode. I can't use Python's built-in round() because it doesn't work on a Column object.
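To make the difference concrete, here is a minimal sketch of the behaviour I'm describing (the literal values 2.5 and 3.5 are just example halfway cases):

from pyspark.sql import SparkSession
from pyspark.sql.functions import round as _round, lit

spark = SparkSession.builder.getOrCreate()

# Python 3's built-in round() uses Banker's Rounding: halfway values go to the nearest even number
print(round(2.5))   # prints 2
print(round(3.5))   # prints 4

# Spark's round() applies HALF_UP, so the same halfway value is rounded away from zero
spark.range(1).select(_round(lit(2.5)).alias("spark_round")).show()
# shows 3.0, not 2.0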
Is there a way to force PySpark's round() to use Banker's Rounding?