I want to get the length of each .txt file in a folder. The files are all plain text and are all in the same directory. Each file name begins with a date in day Mon year format, followed by a news title that may contain upper- and lower-case letters and characters such as spaces, '-' and ','. A full file path looks like this:
folder_path = '/home/runner/Final-Project/folder1/12 Aug 2020 File Name With Different Format.txt'
I have already sorted the txt files chronologically by this leading day-month-year part, like below:
12 APR 2019 Name's something Something.txt
13 APR 2019 World's - as Countr something.txt
14 APR 2019 Name and location.txt
15 APR 2019 Name then location,for something.txt
The code I use for the sorting is below:
import re
import pandas as pd
import seaborn as sns
from matplotlib import pyplot as plt
from datetime import datetime
import os
import glob

folder_path = '/home/runner/Final-Project/folder1'

# Collect just the file names (without the directory) of every .txt file
results = [os.path.basename(filename) for filename in glob.glob(os.path.join(folder_path, '*.txt'))]

# Sort by the first three tokens of each name, parsed as a date in 'day Mon year' format
out_1 = sorted(results, key=lambda file: datetime.strptime(' '.join(file.split()[:3]), '%d %b %Y'))

print(*out_1, sep='\n')
How do I get the length of each txt file, i.e. the word count of each text file, in this date-sorted order?
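For reference, this is a minimal sketch of what I think the counting step could look like, assuming "length" means the number of whitespace-separated words and that every file can be opened as UTF-8 text (it reuses folder_path and out_1 from the code above):

# Count words per file, in the same date-sorted order as out_1
word_counts = {}
for name in out_1:
    full_path = os.path.join(folder_path, name)
    with open(full_path, encoding='utf-8') as f:
        text = f.read()
    # str.split() with no arguments splits on any run of whitespace,
    # so the length of the resulting list is a simple word count
    word_counts[name] = len(text.split())

for name in out_1:
    print(name, word_counts[name])

Is an approach like this reasonable, or is there a better/more idiomatic way to get the counts while keeping the date order?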