
I have a function that is in main_file.py

def transform_table(table_name, start, end):
    return sparkSession.table(table_name).filter(column.between(start, end))

I wonder if it's possible to mock the `sparkSession.table` function call. This is what I have so far, but it fails with a "Column is not iterable" error:

@mock.patch("main_file.pyspark.sql.SparkSession")
@pytest.mark.usefixtures("spark")
def test(self, mocked, spark):
    injected_df = spark.createDataFrame([(1, 100, "ketchup"), (2, 200, "mayo")], ["id", "qty", "condiment"])
    expected_df = spark.createDataFrame([(1, 100, "ketchup")], ["id", "qty", "condiment"])

    mocked.table.return_value = injected_df

    obj = Class(spark)
    result_df = obj.transform_table("useless table name", 100, 150)
    assert result_df.collect() == expected_df.collect()
user3746406
  • Can you provide an example with an included data excerpt? That would help. Where did you write your `transform_table` function? If it is inside a different file, e.g. `myfile.py`, you need to include this in your `mock.patch` path, e.g. `@mock.patch("myfile.pyspark.sql.SparkSession")` – Lukas Hestermeyer Mar 09 '23 at 07:18
  • @LukasHestermeyer not sure why StackOverflow deleted my comment but I updated – user3746406 Mar 09 '23 at 17:10
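As the comment points out, `mock.patch` must target the name where it is *looked up*, not where it is defined. One way to sidestep the patch-path problem entirely is to inject the session into the class and hand the test a plain `MagicMock`. A minimal, pyspark-free sketch of that pattern (the `Transformer` class and the `qty` column name are hypothetical stand-ins for the class in `main_file.py`, not the asker's actual code):

```python
from unittest import mock

class Transformer:
    """Stand-in for the class in main_file.py that owns transform_table."""
    def __init__(self, spark):
        # The session is injected, so tests can pass a mock instead of
        # having to guess the right mock.patch import path.
        self.spark = spark

    def transform_table(self, table_name, start, end):
        df = self.spark.table(table_name)
        # Column.between is the real PySpark method; on a MagicMock this
        # just records the call, which is all the test needs.
        return df.filter(df.qty.between(start, end))

def test_transform_table():
    fake_spark = mock.MagicMock(name="spark")
    fake_df = mock.MagicMock(name="df")
    fake_spark.table.return_value = fake_df

    result = Transformer(fake_spark).transform_table("sauces", 100, 150)

    # The table lookup went through the mocked session...
    fake_spark.table.assert_called_once_with("sauces")
    # ...and the result is whatever df.filter(...) produced.
    assert result is fake_df.filter.return_value
```

This only verifies the call wiring; asserting on actual row contents (as the `expected_df` comparison above tries to do) still needs a real local `SparkSession` fixture rather than a mock.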

0 Answers