3

I am trying to download data from a school course catalog site. I have 64 URLs in the variable urlBook. I have successfully written code that downloads a collection of courses from a single URL and turns them into a single subject object, using the completion-handler approach. I don't really know how I should collect all the subjects from the 64 URLs and eventually turn them into a catalog object (which contains a list of subject objects).

I have read many articles and posts on asynchronous and synchronous processing, but it's still confusing to me. I would really appreciate simple and straightforward code to help me solve this problem. Thank you!

let urlBook = getUrlFromBook()

func fetchClassInfo(url: URL, completion: @escaping ([clase]) -> Void) {
    let task = URLSession.shared.dataTask(with: url) { (data, response, error) in
        let jsonDecoder = JSONDecoder()
        if let data = data,
            let collection: [clase] = try? jsonDecoder.decode([clase].self, from: data) {
            completion(collection)
        } else {
            print("Either no data was returned, or data was not properly decoded.")
            //completion(nil)
        }
    }
    task.resume()
}

fetchClassInfo(url: urlBook.book[0]) { (clase) in
    let newSubject = makeNewSubject(subjectIndex: 0, collectionOfCourse: clase)
    var masterCatalog = catalog(subjectCollection: [])
    masterCatalog.addSubject(newSubject: newSubject)
}
Jay Gatsby
  • So let me get this right: you have a bunch of URLs and you want to make requests to all of them? – ERP Jun 18 '20 at 01:11
  • Also, do you want the network requests to be concurrent (that is, all on different threads and in parallel) or serial, one after the other? – ERP Jun 18 '20 at 01:13
  • Yes, I have a collection of them, and I am trying to access all of them and then pack all the data into one big object (called catalog) for my app. There is no preference as to whether it should be concurrent or not. I just get frustrated working with closures and completion handlers. My first attempt was to wrap the fetchClassInfo call in a for loop to make all the network requests, but it failed. – Jay Gatsby Jun 18 '20 at 05:18
  • I am hoping to get some clear structure and insight on how to make multiple network requests, add the returned objects (subjects) to a list, and then make a new catalog object that has one field, "var subjectCollection: [subject]". – Jay Gatsby Jun 18 '20 at 05:20

3 Answers

4

You can basically build logic like the function below. It takes a list of URLs and returns a list of Subject values in its completion handler. You can modify the models, etc., as you need. In this function, DispatchGroup is used to wait for all requests to complete before calling the completion handler, and the serial DispatchQueue prevents a data race when appending subjects to the array.

func downloadUrls(urls: [URL], completion: @escaping ([Subject]) -> Void) {
    var subjectCollection: [Subject] = []    
    let urlDownloadQueue = DispatchQueue(label: "com.urlDownloader.urlqueue")
    let urlDownloadGroup = DispatchGroup()

    urls.forEach { (url) in
        urlDownloadGroup.enter()
    
        URLSession.shared.dataTask(with: url, completionHandler: { (data, response, error) in
            guard let data = data, 
                let subject = try? JSONDecoder().decode(Subject.self, from: data) else {                    
                // handle error                    
                urlDownloadQueue.async {
                    urlDownloadGroup.leave()
                }
                return
            }
        
            urlDownloadQueue.async {
                subjectCollection.append(subject)
                urlDownloadGroup.leave()
            }
        }).resume()
    }

    urlDownloadGroup.notify(queue: DispatchQueue.global()) {
        completion(subjectCollection)
    }
}
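For the question's setup, the call site could look roughly like the sketch below. Note that urlBook.book, catalog(subjectCollection:), and addSubject(newSubject:) are taken from the question's own code, and it is assumed here that the decoded Subject is the type addSubject expects.

let urlBook = getUrlFromBook()

downloadUrls(urls: urlBook.book) { subjects in
    // All 64 requests have finished at this point.
    var masterCatalog = catalog(subjectCollection: [])
    subjects.forEach { masterCatalog.addSubject(newSubject: $0) }

    // Hop back to the main queue before touching any UI.
    DispatchQueue.main.async {
        // use masterCatalog here
    }
}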
Omer Faruk Ozturk
2

This is the skeleton of a function that downloads all the URLs concurrently:

func downloadAllUrls(urls: [String]){
    let dispatchGroup = DispatchGroup()
    for url in urls {
        dispatchGroup.enter()
        // Here is where you have to do your get call to server 
        // and when finished call dispatchGroup.leave()
    }
    dispatchGroup.notify(queue: .main) {
        // Do what ever you want after all calls are finished
    }
}

It uses DispatchGroup to get notified when all requests have finished, and then you can do whatever you want with the results. The get call inside the loop can be your own custom asynchronous network call; mine is. A minimal way to fill in that placeholder with URLSession is sketched below.
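For completeness, here is a minimal sketch of how that placeholder could be filled with plain URLSession. The Subject model and the results array are assumptions carried over from the question and the other answer, not part of this skeleton.

func downloadAllUrls(urls: [String], completion: @escaping ([Subject]) -> Void) {
    let dispatchGroup = DispatchGroup()
    let syncQueue = DispatchQueue(label: "downloadAllUrls.results") // serializes appends
    var results: [Subject] = []

    for urlString in urls {
        guard let url = URL(string: urlString) else { continue }
        dispatchGroup.enter()

        // The "get call to server" from the skeleton, done with URLSession.
        URLSession.shared.dataTask(with: url) { data, _, _ in
            syncQueue.async {
                if let data = data,
                   let subject = try? JSONDecoder().decode(Subject.self, from: data) {
                    results.append(subject)
                }
                dispatchGroup.leave() // always balance enter()
            }
        }.resume()
    }

    dispatchGroup.notify(queue: .main) {
        // All calls are finished here.
        completion(results)
    }
}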

ERP
  • Let me know if you want to know how to implement the Server.get call that I used @Jay Gatsby – ERP Jun 18 '20 at 05:56
1

I would suggest looking at the Promises approach, since it handles asynchronous tasks out of the box. There are many implementations of Promises in Swift. You can also try Apple's Combine framework, but it is a little more complex and works only from iOS 13 (a Combine sketch follows after the list below).

For instance, here is a sample with my PromiseQ Swift package:

Promise.all( paths.map { fetch($0) } ) // Download all paths concurrently
.then { dataArray in
    // Parse all data to array of results
    let results:[YourClass] = dataArray.compactMap { try? JSONDecoder().decode(YourClass.self, from: $0) }
}

Where:

  • paths: [String] - an array of HTTP paths to download.
  • fetch(_ path: String) -> Promise<Data> - a helper function that downloads the data for a path and returns a promise.
  • YourClass - your class to parse the data into.
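Since Combine was mentioned as an alternative, here is a minimal Combine sketch (iOS 13+) of the same idea. Subject and urlBook.book are carried over from the question as assumptions and are not part of this answer.

import Combine
import Foundation

// Fetch and decode every URL concurrently, then collect all results into one array.
func downloadAll(urls: [URL]) -> AnyPublisher<[Subject], Error> {
    let publishers = urls.map { url in
        URLSession.shared.dataTaskPublisher(for: url)
            .map(\.data)
            .decode(type: Subject.self, decoder: JSONDecoder())
            .eraseToAnyPublisher()
    }
    return Publishers.MergeMany(publishers)
        .collect() // emits once, after every request has completed
        .eraseToAnyPublisher()
}

// Usage: keep the cancellable alive for as long as the request should run.
var cancellables = Set<AnyCancellable>()

downloadAll(urls: urlBook.book)
    .receive(on: DispatchQueue.main)
    .sink(receiveCompletion: { completion in
        if case .failure(let error) = completion { print(error) }
    }, receiveValue: { subjects in
        // build the catalog from subjects here
    })
    .store(in: &cancellables)

One caveat with this sketch: if any single request fails, the whole merged publisher fails, so per-URL errors would need to be handled (for example with replaceError or catch on each inner publisher) if partial results are acceptable.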
iUrii