
The following code is only returning the first game. I would like to get all the Week 5 games and lines into a dataframe. Thanks in advance.

import pandas as pd # library for data analysis
import requests # library to handle requests
from bs4 import BeautifulSoup # library to parse HTML documents

# get the response in the form of html
url = "https://www.espn.com/nfl/lines"
response = requests.get(url)

# parse data from the html into a beautifulsoup object
soup = BeautifulSoup(response.text, 'html.parser')
# find() returns only the first matching <section class="Card">, i.e. only the first game
indiatable = soup.find('section', {'class': "Card"})

df = pd.read_html(str(indiatable))
# convert list to dataframe
df = pd.DataFrame(df[0])
print(df.head())

df

           9:30 AM    REC (ATS)  LINE  OPEN   ML    FPI
0    New York Jets  1-3 (1-3-0)  45.0  43.5  130  42.8%
1  Atlanta Falcons  1-3 (1-3-0)  -2.5  -2.5 -150  56.9%
Pmannn

2 Answers


The data you are looking for can be fetched with an API call.

Just iterate over the response and build the DataFrame(s); a sketch of that step follows the snippet below.

See below:

import requests

url = 'https://site.web.api.espn.com/apis/v2/scoreboard/header?sport=football&league=nfl&region=us&lang=en&contentorigin=espn&buyWindow=1m&showAirings=buy%2Clive%2Creplay&showZipLookup=true&tz=America/New_York'

r = requests.get(url)
if r.status_code == 200:
  print(r.json())
else:
  print(f'Oops - status code is {r.status_code}')
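
Once the request succeeds, you can walk the JSON and collect one row per game. The sketch below is a minimal example of that iteration; the nested key names ('sports', 'leagues', 'events', 'odds', 'details', 'overUnder') are assumptions about the payload shape, so inspect print(r.json()) first and adjust them to whatever the response actually contains.

import pandas as pd
import requests

url = 'https://site.web.api.espn.com/apis/v2/scoreboard/header'
params = {
    'sport': 'football', 'league': 'nfl', 'region': 'us', 'lang': 'en',
    'contentorigin': 'espn', 'buyWindow': '1m',
    'showAirings': 'buy,live,replay', 'showZipLookup': 'true',
    'tz': 'America/New_York',
}

r = requests.get(url, params=params)
r.raise_for_status()
data = r.json()

rows = []
# Walk the nested structure; the key names used here are assumptions --
# print(r.json()) and adjust them to match the real payload.
for sport in data.get('sports', []):
    for league in sport.get('leagues', []):
        for event in league.get('events', []):
            odds = event.get('odds', {})
            rows.append({
                'game': event.get('name'),
                'date': event.get('date'),
                'line': odds.get('details'),
                'over_under': odds.get('overUnder'),
            })

df = pd.DataFrame(rows)
print(df)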
balderman
  • `requests.get('https://site.web.api.espn.com/apis/v2/scoreboard/header', params={'sport': 'football', 'league': 'nfl', 'region': 'us', 'lang': 'en', 'contentorigin': 'espn', 'buyWindow': '1m', 'showAirings': 'buy,live,replay', 'showZipLookup': 'true', 'tz': 'America/New_York'})` – Olvin Roght Oct 08 '21 at 21:14

You can use pandas alone for this:

dfs = pd.read_html("https://www.espn.com/nfl/lines")

`dfs` is a list of DataFrames, one per HTML table on the page.

To merge them into a single DataFrame:

df = pd.concat(dfs)
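
Putting it together, a minimal end-to-end sketch (assuming lxml or another parser backend for read_html is installed):

import pandas as pd

# read_html returns one DataFrame per HTML table on the page
dfs = pd.read_html("https://www.espn.com/nfl/lines")

# stack them into a single frame with a clean 0..n-1 index
df = pd.concat(dfs, ignore_index=True)
print(df.shape)
print(df.head())

One caveat: judging by the output in the question, the first column header of each table is the kickoff time, which differs from game to game, so the concatenated frame may contain several time-named columns with NaNs that still need cleaning up.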
dimay