
I am trying to scrape the values of the first two sections, i.e. the 1X2 and DOUBLE CHANCE sections, using bs4 and urllib.request from this website: https://web.bet9ja.com/Sport/SubEventDetail?SubEventID=76512106. The code I have written is:

import bs4 as bs
import urllib.request

source = urllib.request.urlopen('https://web.bet9ja.com/Sport/SubEventDetail?SubEventID=76512106')
soup = bs.BeautifulSoup(source,'lxml')

for div in soup.find_all('div', class_='SEItem ng-scope'):
    print(div.text)

When I run it, I get no output at all. Can anyone please help me?

2 Answers


The page is loaded via JavaScript, so you have two options: either use Selenium or call the API directly.

Instead of using Selenium, I've called the API directly and got the required info.

Further explanation about XHR & API requests can be found here.

import requests

# Payload expected by the endpoint: the odds-group ID and the sub-event ID
# taken from the page URL (SubEventID=76512106).
data = {
    'IDGruppoQuota': '0',
    'IDSottoEvento': '76512106'
}


def main(url):
    # The endpoint returns JSON; the markets sit under d -> ClassiQuotaList.
    r = requests.post(url, json=data).json()
    count = 0
    for item in r['d']['ClassiQuotaList']:
        count += 1
        # Market name plus the list of odds for that market.
        print(item['ClasseQuota'], [x['Quota'] for x in item['QuoteList']])
        if count == 2:  # only the first two sections (1X2 and Double Chance)
            break


main("https://web.bet9ja.com/Controls/ControlsWS.asmx/GetSubEventDetails")

Output:

1X2 ['3.60', '4.20', '1.87']
Double Chance ['1.83', '1.19', '1.25']
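
If you only want the first section together with its outcome labels (as discussed in the comments below), a minimal sketch along these lines should work. Note the assumption: the labels 1, X, 2 are hard-coded here rather than read from the response, on the assumption that the odds come back in that order, as in the output above.

import requests

data = {'IDGruppoQuota': '0', 'IDSottoEvento': '76512106'}
url = "https://web.bet9ja.com/Controls/ControlsWS.asmx/GetSubEventDetails"

r = requests.post(url, json=data).json()

# Take only the first market (1X2) and pair each quote with a label.
# Assumption: the odds arrive in 1, X, 2 order, matching the output above.
first = r['d']['ClassiQuotaList'][0]
labels = ['1', 'X', '2']
pairs = ', '.join(f"{label} {x['Quota']}" for label, x in zip(labels, first['QuoteList']))
print(first['ClasseQuota'], pairs)
# e.g. 1X2 1 3.60, X 4.20, 2 1.87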
  • This is great. Ahmed, where should i refer to advance in requests (especially post, like you did)? – Joshua Varghese Apr 10 '20 at 13:04
  • @InfinityTM tracking the `XHR` request, [check](https://stackoverflow.com/questions/61044616/using-find-all-function-returns-an-unexpected-result-set/61045691#61045691) let me know if it's not clear yet or if you have additional questions – αԋɱҽԃ αмєяιcαη Apr 10 '20 at 13:06
  • @InfinityTM within the network tab, once you click on the `XHR` request made. you can see [params](https://imgur.com/MzLsU0e) – αԋɱҽԃ αмєяιcαη Apr 10 '20 at 13:19
  • @αԋɱҽԃαмєяιcαη I tried this https://bpaste.net/W45Q but getting [] can you please help me out –  Apr 10 '20 at 13:58
  • @sweety this is the code which I gave to you over the [chat](https://chat.stackoverflow.com/transcript/message/49079101#49079101) so you can check what I've wrote for you there then. – αԋɱҽԃ αмєяιcαη Apr 10 '20 at 14:05
  • Ok I will try @αԋɱҽԃαмєяιcαη –  Apr 10 '20 at 14:08
  • @αԋɱҽԃαмєяιcαη I should get only this 1X2 1 3.60, X 4.20, 2 1.87 –  Apr 10 '20 at 14:14
  • @sweety i think that my answer is already provided for you the expected output. if you are asking about selenium right now. so you browse the community as it's will be very duplicated question. – αԋɱҽԃ αмєяιcαη Apr 10 '20 at 14:15
  • actually your answer getting all the values I want to get only first one and even the 1,x in the section are not coming @αԋɱҽԃαмєяιcαη –  Apr 10 '20 at 14:17
  • @sweety your question is `I am trying to scrape the first two sections values i.e 1*2 and DOUBLECHANCE sections values` and you have received an answer according to it. – αԋɱҽԃ αмєяιcαη Apr 10 '20 at 14:18
  • yes @αԋɱҽԃαмєяιcαη but I should get like this I mentioned 1*2 1 -- DoubleChance values –  Apr 10 '20 at 14:19
  • only two I need to get please say that @αԋɱҽԃαмєяιcαη –  Apr 10 '20 at 14:20
  • @αԋɱҽԃαмєяιcαη one question where did you find this div.SEOddLnk.ng-binding it is not been showing in html code –  Apr 10 '20 at 14:28

Try:

import bs4 as bs
import urllib.request

source = urllib.request.urlopen('https://web.bet9ja.com/Sport/SubEventDetail?SubEventID=76512106')
soup = bs.BeautifulSoup(source, 'lxml')

# Print the class list of every <div> that has one, so you can see
# which classes are actually present in the HTML that urllib receives.
for div in soup.find_all('div'):
    try:
        print(div['class'])
    except KeyError:
        pass  # <div> without a class attribute

This helps you find the classes that are actually available on the <div> tags.
You get nothing when the class you search for does not exist in the downloaded HTML. This happens because many sites are generated dynamically with JavaScript, so a plain HTTP request never receives that content. In these cases, you need to use Selenium (see the sketch below).
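
As a rough illustration of the Selenium route, a sketch like the following could render the page first and then hand the HTML to BeautifulSoup. The class name SEItem is taken from the question, and the 20-second wait is an arbitrary choice; the page markup may have changed since.

import bs4 as bs
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()  # requires a matching ChromeDriver
driver.get('https://web.bet9ja.com/Sport/SubEventDetail?SubEventID=76512106')

# Wait until the JavaScript-rendered odds containers exist (up to 20 s).
WebDriverWait(driver, 20).until(
    EC.presence_of_element_located((By.CLASS_NAME, 'SEItem'))
)

soup = bs.BeautifulSoup(driver.page_source, 'lxml')
driver.quit()

for div in soup.find_all('div', class_='SEItem'):
    print(div.text)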
