
I imagine it's pretty easy to do, but I can't figure out what I'm doing wrong. I'm using Abraham's TwitterOAuth library to gain access. I'm building a database with my followers' information: screen name, user name and Twitter ID. Nothing too special.

I referenced Twitter's "cursoring" page, especially the pseudo code, to make my code. For those who don't want to click the link, said pseudo code looks like the following:

cursor = -1

api_path = "https://api.twitter.com/1.1/endpoint.json?screen_name=targetUser"

do {
    url_with_cursor = api_path + "&cursor=" + cursor
    response_dictionary = perform_http_get_request_for_url( url_with_cursor )
    cursor = response_dictionary[ 'next_cursor' ]
}
while ( cursor != 0 )

With every request, the end user gets a "cursor" which allows them to navigate through "pages" of results. Each page holds 20, and if you have 200 followers you have to go through 10 pages. I have over 900 followers. I modified it to look like the following:

<?php
include('config.php');        // db connection
include('twitter_oauth.php'); // oauth connection

$consumerKey    = 'xxx';
$consumerSecret = 'xxx';
$OAuthToken     = 'xxx';
$OAuthSecret    = 'xxx';

// Create the connection once, outside the loop
$tweet = new TwitterOAuth($consumerKey, $consumerSecret, $OAuthToken, $OAuthSecret);

$cursor = -1;
echo '<pre>';
do {

    $followers = $tweet->get('followers/list', array('screen_name' => 'my_screen_name', 'cursor' => $cursor));

    print_r($followers);

    if (isset($followers->errors)) { // the API returns an "errors" array, not "error"
        break;
    }

    foreach ($followers->users as $user) {

        $followersQ = mysql_query("SELECT * FROM followers WHERE tw_id = '".$user->id."'") or die(mysql_error());
        $num_rows = mysql_num_rows($followersQ);

        if ($num_rows == 0) {
            $followersQ2 = "INSERT INTO followers
                                (screen_name, name, tw_id)
                            VALUES
                                ('".mysql_real_escape_string($user->screen_name)."',
                                 '".mysql_real_escape_string($user->name)."',
                                 '".$user->id."')";
            $followersR = mysql_query($followersQ2) or die(mysql_error());
            echo 'done one set<br>';
        }

    }

    $cursor = $followers->next_cursor_str;

} while ($cursor != 0);
echo '</pre>';

?>

The above code calls the Twitter followers/list endpoint and gets the first 20 users. It then gets a cursor, moves on to the next page, and repeats. Only it seems that after about 80 users it gives me the lovely:

[errors] => Array
    (
        [0] => stdClass Object
            (
                [message] => Rate limit exceeded
                [code] => 88
            )

    )

I could manually get the next cursor, wait 15 minutes for the rate limit to go down, call the function again with the cursor, get the next 80 items, then get that key and repeat, but I want to set up a script that can call it over and over.

I feel I'm doing something wrong, either with my function where I call OAuth, or outside of it somewhere. Can somebody point me in the right direction?

Thank you.

Kenton de Jong
  • You will need to limit your rate with `sleep()`. You get 15 requests per 15 minutes, so you can sleep for a minute after each request, or blast through your requests and then sleep for 15 minutes. It also looks like there is header info sent for determining when you can go again: https://dev.twitter.com/docs/rate-limiting/1.1 – cmorrissey Sep 03 '13 at 19:50
  • That's not a bad idea. It would take a few hours though. If nothing else, I'll do that. Thanks! – Kenton de Jong Sep 03 '13 at 23:17
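The throttling suggested in the comments above could be sketched like this, assuming the same TwitterOAuth client as in the question; `throttle_delay` and `fetch_all_followers` are hypothetical helper names, and the function is only defined here, not wired to a real connection:

```php
<?php
// Pure helper: delay between calls that stays inside a rate-limit window,
// e.g. 15 requests per 900 seconds -> sleep 60 seconds between requests.
function throttle_delay($requests_per_window, $window_seconds) {
    return (int) ceil($window_seconds / $requests_per_window);
}

// Walks every follower page, sleeping between requests so the
// 15-requests-per-15-minutes limit on followers/list is never hit.
function fetch_all_followers($connection, $screen_name) {
    $cursor = '-1';
    $users  = array();
    do {
        $page = $connection->get('followers/list', array(
            'screen_name' => $screen_name,
            'cursor'      => $cursor,
        ));

        if (isset($page->errors)) {     // code 88 = rate limit exceeded
            sleep(15 * 60);             // wait out the window...
            continue;                   // ...and retry the same cursor
        }

        $users  = array_merge($users, $page->users);
        $cursor = $page->next_cursor_str;
        sleep(throttle_delay(15, 15 * 60));
    } while ($cursor != 0);
    return $users;
}
```

With ~900 followers at 20 per page, this is roughly 45 requests, so expect the run to span a few rate-limit windows either way.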

2 Answers


This is a much faster way, but there are limitation concerns here as well:

1. Make a request to get all follower ids, paging with 5,000 ids per page: https://developer.twitter.com/en/docs/twitter-api/v1/accounts-and-users/follow-search-get-users/api-reference/get-followers-ids

2. Loop over the ids and send each 100 ids as a comma-separated string to get their info: https://developer.twitter.com/en/docs/twitter-api/v1/accounts-and-users/follow-search-get-users/api-reference/get-users-lookup

3. Now you can get 1,500 user objects instead of 300 user objects every 15 minutes.

But you also need to set a timer every 15 requests in case the followers list is longer than 1,500.

taha
Islam Ahmed
    I ended up doing this, but I modified it to use https://dev.twitter.com/docs/api/1.1/get/users/show instead so I could get both the user name and screen name. Your concept worked great, but I just needed some more data. Doing my approach, I had to set a loop and a timer, much as Relequestual had said, but in the end, it worked, so thank you :) – Kenton de Jong Sep 05 '13 at 20:32
    You're welcome :) ... Unfortunately Twitter's limits are too strict and we have to work around them :D – Islam Ahmed Sep 06 '13 at 17:53
  • Am I missing something? Doesn't this just change the bottleneck from step 1 to step 2? There is already a call to get followers 200 at a time, and step 2 is 100 at a time. – pete Jan 20 '17 at 22:01
    @pete This answer was 3 years ago, I don't really know if twitter updated their APIs to solve this issue or not. But this was the only solution back in 2013. – Islam Ahmed Jan 21 '17 at 19:09
    In 2017 - users/lookup has a limit of 900 per 15 minutes with 100 results per response for 90,000 every 15 minutes. The bottleneck is now followers/ids with 15 request per 15 minutes returning 5,000 ids at 75,000. – dotcomly Oct 23 '17 at 18:03
  • this is still true in 2018 - though if you need screen_names (and not twitter user ids) you'll need to do the `get/users/show` method instead of `get/users/lookup`. This yields 900 user screen_names every 15 minutes instead of 15 every 15 mins (when using `/get/friends/list`). – Marc Maxmeister Feb 07 '18 at 16:53

I don't think there is any way round the limitations imposed. Even Tweetbot has this limitation, as it's a limitation Twitter imposes. You could store a note of the current status in the database and set a cron job to run every 15 minutes, which would run a group of requests again. It will take time, but it could notify you via email when it's finished. That's what services like SocialBro do. You'd cache those results in your database, of course.
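A minimal sketch of the cron-job idea, assuming the TwitterOAuth client from the question. The caller (a script scheduled every 15 minutes) would load the saved cursor from the database, call `run_batch`, and save the returned cursor back; the function names and the persistence details are hypothetical:

```php
<?php
// Pure helper: decide where the next cron run should start.
// A stored cursor of 0 means the previous walk finished, so
// the next run starts over from the beginning (-1).
function next_run_cursor($cursor) {
    return ($cursor == 0) ? '-1' : (string) $cursor;
}

// One cron run: spend at most one rate-limit window's worth of
// requests (15 for followers/list), then hand the cursor back.
function run_batch($connection, $stored_cursor) {
    $cursor = next_run_cursor($stored_cursor);
    for ($i = 0; $i < 15 && $cursor != 0; $i++) {
        $page = $connection->get('followers/list', array(
            'screen_name' => 'my_screen_name',
            'cursor'      => $cursor,
        ));
        if (isset($page->errors)) {
            break; // window exhausted early; resume from $cursor next run
        }
        // ... INSERT any new rows into the followers table here ...
        $cursor = $page->next_cursor_str;
    }
    return $cursor; // caller persists this for the next cron run
}
```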

Relequestual