
I have been trying to integrate a PyTorch model developed in Python into iOS. The example I have looked at is from this GitHub repo.

I used the same d2go model in my own application. One thing I noticed is that if the model inference code isn't wrapped in a DispatchQueue.global() block as shown below,

DispatchQueue.global().async {
    guard let outputs = self.inferencer.module.detect(image: &pixelBuffer) else {
        return
    }
    // ... use `outputs` ...
}

I get an error like Thread 1: EXC_BAD_ACCESS (code=1, address=0x7ffeeb4e0000). Even when the call is wrapped in the DispatchQueue block above, if the model takes too long to run the inference I get an error like Thread 4: EXC_BAD_ACCESS (code=1, address=0x7ff159bed010).

I'm not sure how threading works in such scenarios. I am running the code when a button is pressed in the new SwiftUI framework.
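Roughly, the call is wired up like the sketch below. This is only an illustration: DetectionViewModel, ObjectDetector and makePixelBuffer are placeholder names, not the actual d2go demo API, and the output type is assumed.

import SwiftUI
import UIKit

// Sketch only: `ObjectDetector` stands in for the demo's inferencer and
// `makePixelBuffer` for however the image is converted into the buffer
// that detect(image:) expects.
final class DetectionViewModel: ObservableObject {
    @Published var outputs: [NSNumber] = []      // assumed shape of detect(image:)'s result
    let inferencer = ObjectDetector()            // placeholder for the demo's inferencer

    func runInference(on image: UIImage) {
        DispatchQueue.global(qos: .userInitiated).async {
            guard var pixelBuffer = makePixelBuffer(from: image) else { return }   // hypothetical helper
            guard let outputs = self.inferencer.module.detect(image: &pixelBuffer) else { return }
            DispatchQueue.main.async {
                // Hop back to the main queue before touching published UI state.
                self.outputs = outputs
            }
        }
    }
}

struct DetectionView: View {
    @StateObject private var viewModel = DetectionViewModel()
    @State private var inputImage = UIImage()    // illustrative input

    var body: some View {
        Button("Detect") {
            viewModel.runInference(on: inputImage)
        }
    }
}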

Any intuition on why this might happen? I have tried the above on simulators.

calveeen

1 Answer


You should probably declare pixelBuffer in the same scope, i.e. inside the dispatch block.
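Something along these lines, as a rough sketch (makePixelBuffer stands in for however you currently build the buffer):

DispatchQueue.global().async {
    // The buffer now lives inside the closure, so nothing on another thread
    // can release or mutate it while detect(image:) is still using it.
    guard var pixelBuffer = makePixelBuffer(from: image) else { return }   // hypothetical helper
    guard let outputs = self.inferencer.module.detect(image: &pixelBuffer) else {
        return
    }
    // ... hand `outputs` back to the main queue for any UI updates ...
}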

Arik Segal
• Nope, that crashed as well. I am wondering if it is because the model I am using is 78 MB, whereas the d2go model is only 2 MB large. – calveeen Jul 07 '21 at 07:03