I have a website that hosts videos from a client. On the website the files load externally via an m3u8 link.
The client would now like to have those videos on a Roku channel.
If I simply use the m3u8 link from the site, it gives an error, because the generated URL is tied to a cookie, so a visitor has to click the link themselves for a fresh one to be generated.
What I would like, if possible (and I have not seen this asked here), is to scrape the HTML page and just return the link to the Roku via a PHP script on the website.
I know how to get titles and such using pure PHP, but I am having problems returning the m3u8 link.
I do have code to show that I am not looking for handouts and am actually trying.
This is what I have used to get the title, for example.
Note: I would also like to know if it is possible to have one PHP script that takes the page URL as a parameter, so I do not have to make a separate script for each video with the URL typed in.
<?php
$html = file_get_contents('http://example.com'); //get the html returned from the url
$movie_doc = new DOMDocument();
libxml_use_internal_errors(TRUE); //disable libxml errors

if(!empty($html)){ //if any html is actually returned
    $movie_doc->loadHTML($html);
    libxml_clear_errors(); //remove errors for yucky html
    $movie_xpath = new DOMXPath($movie_doc);

    //get all the titles
    $movie_row = $movie_xpath->query('//title');
    if($movie_row->length > 0){
        foreach($movie_row as $row){
            echo $row->nodeValue . "<br/>";
        }
    }
}
?>
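Here is a minimal sketch of the m3u8 side, under a few assumptions of mine: the script is called with the page URL in a GET parameter I've named url (e.g. playlist.php?url=http://example.com/video-page), and the m3u8 link appears somewhere in the page, either in a <source>/<video> tag or embedded in inline JavaScript. This is not the site's confirmed markup, so the XPath and regex will likely need adjusting.

<?php
// Read the page URL from the query string so one script works for every video.
// "url" is just the parameter name I chose for this sketch.
$page = isset($_GET['url']) ? $_GET['url'] : '';
if($page === '' || !filter_var($page, FILTER_VALIDATE_URL)){
    http_response_code(400);
    exit('Missing or invalid url parameter');
}

$html = file_get_contents($page); //fetch the raw html of the video page
if(empty($html)){
    http_response_code(502);
    exit('Could not fetch page');
}

$doc = new DOMDocument();
libxml_use_internal_errors(TRUE); //disable libxml errors for yucky html
$doc->loadHTML($html);
libxml_clear_errors();
$xpath = new DOMXPath($doc);

//first try the structured route: a <source> or <video> tag whose src contains ".m3u8"
$nodes = $xpath->query('//source[contains(@src, ".m3u8")]/@src | //video[contains(@src, ".m3u8")]/@src');
if($nodes->length > 0){
    header('Content-Type: text/plain');
    echo $nodes->item(0)->nodeValue;
    exit;
}

//fallback: the link is often buried in inline JavaScript, so scan the raw html
//for anything that looks like an m3u8 URL
if(preg_match('~https?://[^\'"\s]+\.m3u8[^\'"\s]*~i', $html, $m)){
    header('Content-Type: text/plain');
    echo $m[0];
    exit;
}

http_response_code(404);
echo 'No m3u8 link found';
?>

One thing to keep in mind: since the site ties the m3u8 URL to a cookie, the link this script returns is whichever one the page generated for the PHP request, so whether the Roku can actually play it depends on whether the site checks that cookie again when the stream itself is requested.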