In order to run some unit tests on my data I am using PyDeequ. Is there a way to filter out the rows that violate the defined constraints? I was not able to find anything about this online. Here is my code:
from pydeequ.checks import Check, CheckLevel
from pydeequ.verification import VerificationResult, VerificationSuite

# Read the source CSV
df1 = (spark
    .read
    .format("csv")
    .option("header", "true")
    .option("encoding", "ISO-8859-1")
    .option("sep", ",")
    .load("addresses.csv"))

# Constraints to verify on the data
check = Check(spark, CheckLevel.Warning, "Review Check")

checkResult = (VerificationSuite(spark)
    .onData(df1)
    .addCheck(
        check
        .isComplete("Nome")
        .isComplete("Citta")
        .isUnique("CAP")
        .isUnique("Number")
        .isContainedIn("Number", ["11", "12", "13", "14", "15", "16"]))
    .run())

# One row per constraint, with its status and message
checkResult_df = VerificationResult.checkResultsAsDataFrame(spark, checkResult)
checkResult_df.show()
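To make it concrete, the result I am after is equivalent to what this plain PySpark filter produces when I re-express the same constraints by hand (just a sketch to illustrate the intent, using the column names from my data; I would rather get these rows from the PyDeequ verification result directly instead of duplicating the logic):

from pyspark.sql import functions as F
from pyspark.sql import Window

# Hand-written equivalent of the constraints above, only to show what
# "rows that violate the constraints" means for me.
cap_window = Window.partitionBy("CAP")
number_window = Window.partitionBy("Number")

violations = (df1
    .withColumn("cap_dup", F.count(F.lit(1)).over(cap_window) > 1)        # isUnique("CAP")
    .withColumn("number_dup", F.count(F.lit(1)).over(number_window) > 1)  # isUnique("Number")
    .filter(
        F.col("Nome").isNull()          # isComplete("Nome")
        | F.col("Citta").isNull()       # isComplete("Citta")
        | F.col("cap_dup")
        | F.col("number_dup")
        | ~F.col("Number").isin("11", "12", "13", "14", "15", "16")  # isContainedIn
    )
    .drop("cap_dup", "number_dup"))

violations.show()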