YouTube returns '429 Too Many Requests' to my Perl script, while any web browser on the same computer downloads YouTube webpages without a problem.
A search for a YouTube 'rate limiting' timeout value was unsuccessful, though there are lots of references to youtube-dl issues of a similar nature.
At this moment I am puzzled why the web browser works fine but the Perl script gets '429 Too Many Requests' when retrieving even a single webpage.
I did a mass 'verification' of bookmarked videos stored in a DB file more than 24 hours ago. As of this moment, '429 Too Many Requests'
is still the answer the Perl script gets from the YouTube server.
An attempt to spoof the user agent in the Perl script did not change the outcome:
$ua->agent('Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/81.0.4044.92 Safari/537.36');
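One thing worth checking (a sketch on my part, not a confirmed fix): with HTTP::Tiny the spoofed agent only matters if it is set on the same object that performs the request. Passing it as the `agent` attribute at construction time makes that explicit:

```perl
use strict;
use warnings;
use HTTP::Tiny;

# The same user-agent string as above, split for readability.
my $agent = 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) '
          . 'AppleWebKit/537.36 (KHTML, like Gecko) '
          . 'Chrome/81.0.4044.92 Safari/537.36';

# Set the agent at construction time, then reuse this object for get().
my $ua = HTTP::Tiny->new( agent => $agent );

# Confirm the agent actually took effect before making any request.
print $ua->agent, "\n";
```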
Any clues toward a better understanding of the problem are welcome.
Is the rate-limiting timeout on the YouTube server (e.g. Retry-After: 3600) against DDoS attacks documented anywhere? It would be nice to have a reference to this information.
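In the meantime, the server's own answer can be inspected: HTTP::Tiny puts response headers, with lowercased names, into `$response->{headers}`, so the Retry-After value (when present) can be read directly. A minimal sketch, assuming a numeric Retry-After; the helper name `retry_after_seconds` is my own:

```perl
use strict;
use warnings;

# Hypothetical helper: extract a numeric Retry-After value from an
# HTTP::Tiny response hashref. Header names are lowercased by HTTP::Tiny.
# Retry-After may also be an HTTP-date; that form is ignored here.
sub retry_after_seconds {
    my ($response) = @_;
    my $value = $response->{headers}{'retry-after'};
    return unless defined $value && $value =~ /^\d+$/;
    return $value;
}

# Example with a canned 429 response shaped like HTTP::Tiny output:
my $response = {
    success => 0,
    status  => 429,
    reason  => 'Too Many Requests',
    headers => { 'retry-after' => '3600' },
};

my $wait = retry_after_seconds($response);
print defined $wait
    ? "server asks us to wait $wait seconds\n"
    : "no usable Retry-After header\n";
```

If the header is present, sleeping at least that long before retrying is the polite behavior the server is requesting.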
Code snippet used to retrieve the webpage for a YouTube playlist URL:
use strict;
use warnings;
use feature 'say';

use HTTP::Tiny;
use Getopt::Long qw(GetOptions);
use Pod::Usage   qw(pod2usage);    # pod2usage() was called but never imported
use Data::Dumper;

my %opt;

$opt{url} = 'https://www.youtube.com/watch?v=XdTdu1MxDpE&list=UUgDFVgTnw_W5DftgN2NQApQ';

GetOptions(
    'url|u=s' => \$opt{url},
    'debug|d' => \$opt{debug},
    'help|?'  => \$opt{help},
    'man'     => \$opt{man}
) or pod2usage(2);

pod2usage(1) if $opt{help};
pod2usage(-exitval => 0, -verbose => 2) if $opt{man};

my $response = HTTP::Tiny->new->get($opt{url});

say Dumper($response) if $opt{debug};

if ( $response->{success} ) {
    say $response->{content};
} else {
    say "Failed: $response->{status} $response->{reason}";
}
__END__
=head1 NAME
.......
Output:
Failed: 429 Too Many Requests