
I have multiple files in a folder. I want to pattern match to check whether a particular file is present, and if it is, store the whole file path in a variable.

How can I achieve this in PySpark?

harshith
  • [`input_file_name`](https://spark.apache.org/docs/latest/api/python/reference/pyspark.sql/api/pyspark.sql.functions.input_file_name.html) – blackbishop Aug 08 '22 at 08:13
  • Does this answer your question? [Spark load data and add filename as dataframe column](https://stackoverflow.com/questions/39868263/spark-load-data-and-add-filename-as-dataframe-column) – blackbishop Aug 08 '22 at 08:15
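
For completeness, the [`input_file_name`](https://spark.apache.org/docs/latest/api/python/reference/pyspark.sql/api/pyspark.sql.functions.input_file_name.html) approach suggested in the comments attaches each row's source path as a column when Spark reads the files; a minimal sketch (the folder path, glob pattern, and CSV options here are placeholder assumptions):

from pyspark.sql.functions import input_file_name

# Read all files matching a glob pattern and record each row's source file path
df = spark.read.option("header", True).csv("/mnt/repro/sample*.csv") \
        .withColumn("source_file", input_file_name())

# Distinct source paths that contributed rows
df.select("source_file").distinct().show(truncate=False)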

1 Answer


Since you want to store the whole path in a variable, you can achieve this with a combination of dbutils and regular-expression pattern matching.

  • Use dbutils.fs.ls(path) to get the list of files present in a folder (a mounted storage account or a DBFS path) and assign its return value to a variable called files.
# My sample path - a mounted storage account folder
files = dbutils.fs.ls("/mnt/repro")
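
Each entry returned by dbutils.fs.ls is a FileInfo object that exposes, among other fields, name and path, which is what the loop below relies on; for a quick look (assuming the folder is not empty):

# Each FileInfo looks roughly like:
# FileInfo(path='dbfs:/mnt/repro/sample1.csv', name='sample1.csv', size=1024)
print(files[0].path, files[0].name, files[0].size)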
  • Loop through this list. Using Python's re.match(), check whether each item's file name matches your pattern; if it does, append its path to a result list.
import re

matched_files = []
for file in files:
    # "sample.*csv" is the pattern the file name must match
    if re.match("sample.*csv", file.name):
        matched_files.append(file.path)

print("Matched files:", matched_files)

Sample output: matched_files holds the full dbfs:/ paths of the files whose names match the pattern.

Saideep Arikontham