I'm writing a scraper using FSharp.Collections.ParallelSeq and a retry computation expression. I would like to retrieve the HTML for multiple pages in parallel and retry any request that fails.
For example:
open System
open FSharp.Collections.ParallelSeq

type RetryBuilder(max) =
    member x.Return(a) = a                  // Enable 'return'
    member x.Delay(f) = f                   // Gets wrapped body and returns it (as it is)
                                            // so that the body is passed to 'Run'
    member x.Zero() = failwith "Zero"       // Support if .. then
    member x.Run(f) =                       // Gets function created by 'Delay'
        let rec loop(n) =
            if n = 0 then failwith "Failed"      // Number of retries exceeded
            else try f() with _ -> loop(n-1)
        loop max

let retry = RetryBuilder(4)

let getHtml (url : string) = retry {
    Console.WriteLine("Get Url")
    return 0
}

// A property/field?
let GetHtmlForAllPages =
    let pages = { 1 .. 10 }
    let allHtml =
        pages
        |> PSeq.map (fun x -> getHtml ("http://somesite.com/" + x.ToString()))
        |> Seq.toArray
    allHtml

[<EntryPoint>]
let main argv =
    let htmlForAllPages = GetHtmlForAllPages
    0 // return an integer exit code
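(The getHtml above is just a stub so the example stays self-contained. In the real scraper it will do something closer to the rough sketch below; the download helper and the use of WebClient are placeholders for whatever HTTP client I end up with, not part of the repro.)

// Rough sketch of the eventual getHtml -- 'download' and WebClient are
// placeholders; the point is only that the request runs inside 'retry'.
let download (url : string) =
    use client = new System.Net.WebClient()
    client.DownloadString(url)

let getHtmlReal (url : string) = retry {
    return download url
}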
When I try to interact with GetHtmlForAllPages from main, the code seems to hang. Stepping through the code shows me that PSeq.map begins work on the first four values of pages.

What's going on that causes the retry computation expression to never start/complete? Is there some weird interplay between PSeq and retry?
The code works as expected if I make GetHtmlForAllPages a function and invoke it. What is going on when GetHtmlForAllPages is a field?
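For reference, this is roughly what I mean by the working variant; only the declaration of GetHtmlForAllPages and the call site in main change:

// Works: GetHtmlForAllPages is now a function taking unit, and the PSeq
// pipeline only runs when main explicitly calls it.
let GetHtmlForAllPages () =
    let pages = { 1 .. 10 }
    pages
    |> PSeq.map (fun x -> getHtml ("http://somesite.com/" + x.ToString()))
    |> Seq.toArray

[<EntryPoint>]
let main argv =
    let htmlForAllPages = GetHtmlForAllPages ()
    0 // return an integer exit code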