I have a list of URLs like this:

```python
url = ['url_1', 'url_2', 'url_3']
```

The full list contains 300 elements.
Since the pages share a similar HTML structure, I wrote a function to crawl each one and extract the information I need:

```python
import requests
from bs4 import BeautifulSoup

def get_department_and_units_hours(url):
    res = requests.get(url)
    soup = BeautifulSoup(res.content, "html.parser")
    # Collect the text of every <td> cell, dropping the first one
    data = [item.string for item in soup.find_all('td')]
    data = data[1:]
    return data
```
Then I iterate through the list and append each result to another list:

```python
department_and_units_hours = []
for item in url:
    department_and_units_hours.append(get_department_and_units_hours(item))
print(department_and_units_hours)
```
When I run it, there is no response: nothing happens, and the contents of the list are never printed. Why does this problem occur, and how can I fix it? I really have no idea.
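For context, here is a hedged sketch of how I could make the hang easier to diagnose: `requests.get()` has no default timeout, so a single slow or unresponsive server can block the loop indefinitely with no output. This version (the `url_1`-style URLs are just the placeholders from above) separates parsing from fetching, adds a timeout, and prints progress per URL, assuming that is an acceptable restructuring:

```python
import requests
from bs4 import BeautifulSoup

def parse_td_strings(html):
    """Extract the text of every <td> cell, dropping the first one."""
    soup = BeautifulSoup(html, "html.parser")
    data = [item.string for item in soup.find_all("td")]
    return data[1:]

def get_department_and_units_hours(url):
    # A timeout makes a slow server raise an exception
    # instead of silently blocking the whole loop.
    res = requests.get(url, timeout=10)
    res.raise_for_status()
    return parse_td_strings(res.content)

if __name__ == "__main__":
    urls = ["url_1", "url_2", "url_3"]  # placeholder URLs from the question
    department_and_units_hours = []
    for i, item in enumerate(urls, start=1):
        try:
            department_and_units_hours.append(get_department_and_units_hours(item))
        except requests.RequestException as exc:
            print(f"{i}/{len(urls)} failed: {exc}")
        else:
            print(f"{i}/{len(urls)} done")
    print(department_and_units_hours)
```

With the per-URL progress output, it becomes clear whether the script is stuck on one request or simply still working through all 300 pages.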