For example, I have two syntaxes that accomplish the same thing on a finance DataFrame:
Spark SQL
df.filter("Close < 500").show()
PySpark
df.filter(df["Close"] < 500).show()
Is one of them better for any reason like performance, readability or something else I'm not thinking about?
I'm asking because I'm about to start implementing PySpark in my company, and whatever route I choose will probably become canon there.
Thanks!