I'd like to append the columns of one PySpark DataFrame to another, i.e. concatenate the two DataFrames column-wise.
In pandas, this would look like:
import pandas as pd

df1 = pd.DataFrame({'x': ['a', 'b', 'c']})
df2 = pd.DataFrame({'y': [1, 2, 3]})
pd.concat((df1, df2), axis=1)
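which gives a single frame with both columns side by side:

   x  y
0  a  1
1  b  2
2  c  3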
Is there a way to accomplish this in PySpark? All I can find is either concatenating the string contents of columns or doing a join on a key column.
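For reference, the closest join-based workaround I can picture is sketched below; the with_row_index helper and the row_idx column name are my own, purely for illustration, and it relies on zipWithIndex to hand out consecutive indices.

from pyspark.sql import SparkSession
from pyspark.sql.types import LongType, StructField, StructType

spark = SparkSession.builder.getOrCreate()

df1 = spark.createDataFrame([('a',), ('b',), ('c',)], ['x'])
df2 = spark.createDataFrame([(1,), (2,), (3,)], ['y'])

def with_row_index(df, name='row_idx'):
    # zipWithIndex assigns each row a consecutive index in the DataFrame's current order
    schema = StructType(df.schema.fields + [StructField(name, LongType(), False)])
    return df.rdd.zipWithIndex().map(lambda pair: tuple(pair[0]) + (pair[1],)).toDF(schema)

# index both frames, join on the shared index, then drop it again
result = (with_row_index(df1)
          .join(with_row_index(df2), on='row_idx')
          .drop('row_idx'))
result.show()

Something along those lines should work, but it feels heavyweight next to a one-line pd.concat, so I'm hoping there is a more direct way.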