
I have a dataframe in PySpark, df_all. It has some data, and I need to do the following:

count = ceil(df_all.count()/1000000)

This gives the following error:

TypeError: Invalid argument, not a string or column: 0.914914 of type <class 'float'>. For column literals, use 'lit', 'array', 'struct' or 'create_map' function.

How can I use the ceil function in PySpark?

user2280352

1 Answer


For your requirement, this should work:

import math

count = math.ceil(df_all.count()/1000000)
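
The likely cause of the error (reading from the traceback) is that ceil was imported from pyspark.sql.functions, which operates on DataFrame columns, whereas df_all.count() returns a plain Python int on the driver, so the standard library's math.ceil is the right tool for that arithmetic. A minimal sketch of the difference, using a made-up example DataFrame:

import math

from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical data, just to illustrate the two kinds of ceil
df = spark.createDataFrame([(0.4,), (1.2,), (2.7,)], ["x"])

# pyspark.sql.functions.ceil applies to a column inside a DataFrame expression
df.withColumn("x_ceil", F.ceil("x")).show()

# df.count() is an ordinary Python int, so driver-side arithmetic on it
# uses math.ceil from the standard library
count = math.ceil(df.count() / 1000000)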
o_O