I am using PyDeequ to run some unit tests on my data. Is there a way to filter out the rows that violate the defined constraints? I was not able to find anything online. Here is my code:

from pydeequ.checks import Check, CheckLevel
from pydeequ.verification import VerificationResult, VerificationSuite

df1 = (spark
       .read
       .format("csv")
       .option("header", "true")
       .option("encoding", "ISO-8859-1")
       .load("addresses.csv", sep=','))

check = Check(spark, CheckLevel.Warning, "Review Check")

checkResult = (VerificationSuite(spark)
    .onData(df1)
    .addCheck(
        check
        .isComplete("Nome")
        .isComplete("Citta")
        .isUnique("CAP")
        .isUnique("Number")
        .isContainedIn("Number", ("11","12","13","14","15","16"))
    )
    .run())

checkResult_df = VerificationResult.checkResultsAsDataFrame(spark, checkResult)
checkResult_df.show()

1 Answer


Filtering checkResult_df for rows where constraint_status equals "Failure" should give you what you're looking for.

Building off the above example:

from pydeequ.checks import Check, CheckLevel
from pydeequ.verification import VerificationResult, VerificationSuite
from pyspark.sql import functions as F

df1 = (spark
       .read
       .format("csv")
       .option("header", "true")
       .option("encoding", "ISO-8859-1")
       .load("addresses.csv", sep = ','))

check = Check(spark, CheckLevel.Warning, "Review Check")

checkResult = (VerificationSuite(spark)
    .onData(df1)
    .addCheck(
        check
        .isComplete("Nome")
        .isComplete("Citta")
        .isUnique("CAP")
        .isUnique("Number")
        .isContainedIn("Number", ("11","12","13","14","15","16"))
    )
    .run())

checkResult_df = VerificationResult.checkResultsAsDataFrame(spark, checkResult)

# Added this snippet
# Filtering for any failed data quality constraints
df_checked_constraints_failures = \
    (checkResult_df
     .filter(F.col("constraint_status") == "Failure"))

It also might be helpful to alert or log these failures:

import logging

logger = logging.getLogger(__name__)

# If any data quality check fails, log it, raise an exception, or alert Slack
failure_count = df_checked_constraints_failures.count()
if failure_count > 0:
    # .show() prints to stdout and returns None, so collect the failing
    # rows into a string before handing them to the logger
    failures_as_text = "\n".join(
        str(row) for row in df_checked_constraints_failures.collect()
    )
    logger.info("Failed data quality constraints:\n%s", failures_as_text)
    # maybe raise an exception here
    # maybe send a POST message to a Slack webhook for a channel that monitors applications
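
For the Slack option mentioned in the comments, here is a minimal sketch using only the standard library; the webhook URL is a placeholder you would replace with your own channel's incoming webhook:

import json
import urllib.request

# Placeholder webhook URL -- substitute your channel's incoming webhook
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"

def alert_slack(message: str) -> None:
    # Slack incoming webhooks accept a JSON body with a "text" field
    payload = json.dumps({"text": message}).encode("utf-8")
    request = urllib.request.Request(
        SLACK_WEBHOOK_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)

alert_slack(f"{failure_count} data quality constraint(s) failed in Review Check")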
  • This explains how to filter for constraints that failed, not how to filter rows from the original dataframe that failed those constraints – Prassi May 02 '23 at 13:21
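
As the comment above points out, checkResult_df describes constraints, not rows, so PyDeequ itself does not hand back the failing rows. One workaround is to re-express each constraint from the question as a plain PySpark filter against df1. This is a minimal sketch assuming the column names from the question (Nome, Citta, CAP, Number); the variable names are illustrative, and the window-based duplicate check mirrors isUnique only approximately, since it duplicates the constraint logic by hand rather than reusing Deequ's internals:

from pyspark.sql import functions as F
from pyspark.sql.window import Window

# Completeness violations: rows with a null Nome or Citta
incomplete_rows = df1.filter(F.col("Nome").isNull() | F.col("Citta").isNull())

# Uniqueness violations: CAP or Number values that occur more than once
cap_window = Window.partitionBy("CAP")
number_window = Window.partitionBy("Number")
duplicate_rows = (df1
    .withColumn("cap_count", F.count("*").over(cap_window))
    .withColumn("number_count", F.count("*").over(number_window))
    .filter((F.col("cap_count") > 1) | (F.col("number_count") > 1))
    .drop("cap_count", "number_count"))

# Containment violations: Number outside the allowed set
out_of_range_rows = df1.filter(~F.col("Number").isin("11", "12", "13", "14", "15", "16"))

# All rows that violate at least one constraint
violating_rows = (incomplete_rows
    .unionByName(duplicate_rows)
    .unionByName(out_of_range_rows)
    .dropDuplicates())

# And the rows that pass every constraint
clean_rows = df1.subtract(violating_rows)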