
AVAssetReader is fantastic, but I can only see how to use it with a local asset: a file or, I guess, a composition.

So:

import AVFoundation

assetReader = try AVAssetReader(asset: self.asset)  // asset here is file-based
...
assetReader.addOutput(readerOutput)

and so on.

Say you have an arriving stream instead (for example, one of Apple's sample HLS .m3u8 streams: https://developer.apple.com/streaming/examples/ ).

In fact, can AVAssetReader be used for streams? Or only local files?

I just plain cannot find this explained anywhere. (Maybe it's obvious if you're more familiar with it. :/ )

Fattie

1 Answer


It's not obvious. Patching together the header file comments for both AVAssetReader and AVComposition gives the strong impression of an API designed only for local assets, although the language does not explicitly rule out non-local assets.

From the AVAssetReader header file:

Instances of AVAssetReader read media data from an instance of AVAsset, whether the asset is file-based or represents an assembly of media data from multiple sources, as is the case with AVComposition.

and from AVComposition:

An AVComposition combines media data from multiple local file-based sources in a custom temporal arrangement, in order to present or process media data from multiple sources together. All local file-based audiovisual assets are eligible to be combined, regardless of container type.

If you're interested in video only, and don't mind processing as part of playback, you can capture frames from a remote asset by adding an AVPlayerItemVideoOutput to your AVPlayerItem. If you're interested in audio, you're up a creek.
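Here's a rough sketch of what that looks like. The URL is a placeholder, and the BGRA pixel format and display-link polling are just one reasonable setup, not the only one:

import AVFoundation
import UIKit

// Pulls frames out of a (possibly remote) player item via AVPlayerItemVideoOutput.
final class FrameGrabber: NSObject {
    private let player: AVPlayer
    private let output: AVPlayerItemVideoOutput

    init(url: URL) {
        let item = AVPlayerItem(url: url)
        // BGRA is just one convenient pixel format choice.
        output = AVPlayerItemVideoOutput(pixelBufferAttributes: [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
        ])
        item.add(output)
        player = AVPlayer(playerItem: item)
        super.init()

        // Poll once per screen refresh (iOS; on macOS you'd use CVDisplayLink).
        let link = CADisplayLink(target: self, selector: #selector(tick))
        link.add(to: .main, forMode: .common)
        player.play()
    }

    @objc private func tick() {
        let time = player.currentTime()
        guard output.hasNewPixelBuffer(forItemTime: time),
              let pixelBuffer = output.copyPixelBuffer(forItemTime: time,
                                                       itemTimeForDisplay: nil)
        else { return }
        // pixelBuffer is a CVPixelBuffer: hand it to Metal, Core Image, etc.
        _ = pixelBuffer
    }
}

// Placeholder URL -- substitute your own stream.
let grabber = FrameGrabber(url: URL(string: "https://example.com/stream.m3u8")!)

Each successful copyPixelBuffer(forItemTime:itemTimeForDisplay:) call gives you the frame currently being displayed, so you're processing at playback rate, not faster.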

Rhythmic Fistman
  • Got it. Fantastic, thanks. I think then really it's only for files. Yeah, my goal is fairly "simple": (1) get a streamed-in (like .ts) stream of H264; (2) separate it into the various (say, 2 or 3) streams; each stream gets hardware decoded (so, basically VTDecompression). Really, that's it! I fear that AVAssetReader, while it would be much easier, is probably not the answer. – Fattie Dec 15 '18 at 23:12
  • I've come to the conclusion the only way to do it really is (A) get the stream, (B) basically using this process .. stackoverflow.com/questions/29525000 .. package the stream ready for VTDecompression, (C) send it to VTDecompression, and that's it, use the raw frames. Really, "B" is the touchy part, I guess. I must admit I'm a bit stumped on how to do merely "A"! :) But there are a couple of examples around, such as github/Avois, so it's coming together :O I was hoping for a cheap, easy solution with AVAssetReader :) – Fattie Dec 15 '18 at 23:14
  • p.s. what I said is not quite true: some remote assets _do_ work with `AVAssetReader`, but you have to wrap them in an `AVComposition` first (rough sketch after these comments). A remote mp4 did work for me, but an m3u8 did not. This fact could simplify your (B) a lot. (A) is not so bad: m3u8 files are just text files pointing (eventually) to mp4s (or something else). There could be some pointers in https://github.com/shogo4405/HaishinKit.swift – Rhythmic Fistman Dec 17 '18 at 22:11
  • Fascinating, I will probe into that. Fantastic. Interesting thing: in my case, I've gone down the path of compiling ffmpeg from scratch into an iOS app, and then .. just writing my own, in ffmpeg, from scratch. – Fattie Dec 17 '18 at 22:52
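
For reference, a rough sketch of the AVComposition trick from that exchange. The remote mp4 URL is a placeholder and the output settings are just an example; as noted above, this worked for a remote mp4 but not for an m3u8:

import AVFoundation

func readRemoteMP4(at url: URL) throws {
    let remoteAsset = AVURLAsset(url: url)

    // Wrapping the remote asset in a composition is what persuades
    // AVAssetReader to accept it.
    let composition = AVMutableComposition()
    try composition.insertTimeRange(
        CMTimeRange(start: .zero, duration: remoteAsset.duration), // may block while loading
        of: remoteAsset,
        at: .zero)

    let reader = try AVAssetReader(asset: composition)

    guard let videoTrack = composition.tracks(withMediaType: .video).first else { return }
    let output = AVAssetReaderTrackOutput(
        track: videoTrack,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String:
                         kCVPixelFormatType_32BGRA])
    reader.addOutput(output)
    reader.startReading()

    // Pull decoded frames one CMSampleBuffer at a time.
    while let sampleBuffer = output.copyNextSampleBuffer() {
        _ = sampleBuffer // process the frame here
    }
}

// Placeholder URL -- substitute a real remote mp4.
try readRemoteMP4(at: URL(string: "https://example.com/movie.mp4")!)

Once the composition wrapper is in place, everything downstream is ordinary AVAssetReaderTrackOutput reading, which is why it could simplify step (B) above.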