The Video Toolbox framework (VideoToolbox.framework) provides direct access to hardware video encoding and decoding on iOS and OS X.
Questions tagged [video-toolbox]
117 questions
3
votes
1 answer
Decoding frames with VTDecompressionSessionDecodeFrame fails with 12909 error
I'm trying to decode CMSampleBuffers so I can analyze their pixel data.
I keep getting error 12909 when I call VTDecompressionSessionDecodeFrame. This is all very new to me - any ideas where the problem might be?
Here's my code:
func…

msmialko
- 1,439
- 2
- 20
- 35
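The error in the title, -12909, is kVTVideoDecoderBadDataErr, which usually points to malformed NAL units or a format description that doesn't match the stream. A minimal decode-session sketch (the `formatDescription` and `sampleBuffer` are assumed to be supplied by the caller):

```swift
import VideoToolbox

// Sketch: create a decompression session and feed it one CMSampleBuffer.
// `formatDescription` and `sampleBuffer` are assumed to exist already.
func decode(sampleBuffer: CMSampleBuffer,
            formatDescription: CMVideoFormatDescription) {
    var callback = VTDecompressionOutputCallbackRecord(
        decompressionOutputCallback: { _, _, status, _, imageBuffer, _, _ in
            // imageBuffer is the decoded CVPixelBuffer on success.
            guard status == noErr, let pixelBuffer = imageBuffer else { return }
            print("decoded frame:", pixelBuffer)
        },
        decompressionOutputRefCon: nil)

    var session: VTDecompressionSession?
    guard VTDecompressionSessionCreate(
            allocator: kCFAllocatorDefault,
            formatDescription: formatDescription,
            decoderSpecification: nil,
            imageBufferAttributes: nil,
            outputCallback: &callback,
            decompressionSessionOut: &session) == noErr,
          let session = session else { return }

    // -12909 (kVTVideoDecoderBadDataErr) here usually means the NAL units
    // in the buffer are not AVCC length-prefixed, or don't match the SPS/PPS
    // baked into the format description.
    let status = VTDecompressionSessionDecodeFrame(
        session, sampleBuffer: sampleBuffer,
        flags: [], frameRefcon: nil, infoFlagsOut: nil)
    print("decode status:", status)
}
```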
3
votes
1 answer
How to save encoded CMSampleBuffer samples to mp4 file on iOS
I'm using the VideoToolbox framework to retrieve data from an AVCaptureSession and encode it to H.264 and AAC.
I'm at a point where I:
obtain data using the delegate method func captureOutput(_ captureOutput: AVCaptureOutput, didOutput sampleBuffer:…

Grzegorz Aperliński
- 848
- 1
- 10
- 23
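Already-encoded CMSampleBuffers can be written to an .mp4 without re-encoding by giving AVAssetWriterInput a nil outputSettings plus a source format hint. A sketch, assuming the format description and buffers come from the VTCompressionSession output callback:

```swift
import AVFoundation

// Sketch: pass-through writing of encoded H.264 sample buffers to an .mp4.
// `outputURL`, `formatDescription`, and the buffers are assumed inputs.
final class MP4Writer {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput

    init(outputURL: URL, formatDescription: CMFormatDescription) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        // nil outputSettings + sourceFormatHint = append compressed samples as-is.
        input = AVAssetWriterInput(mediaType: .video,
                                   outputSettings: nil,
                                   sourceFormatHint: formatDescription)
        input.expectsMediaDataInRealTime = true
        writer.add(input)
    }

    func append(_ sampleBuffer: CMSampleBuffer) {
        if writer.status == .unknown {
            writer.startWriting()
            writer.startSession(atSourceTime:
                CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
        }
        if input.isReadyForMoreMediaData {
            input.append(sampleBuffer)
        }
    }

    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```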
3
votes
4 answers
How to get CVPixelBuffer handle from UnsafeMutablePointer in Swift?
I got a decoded AVFrame whose format shows 160/videotoolbox_vld. After googling some articles (here) and viewing the FFmpeg source code (here, and here), the CVBuffer handle should be at AVFrame.data[3]. But the CVBuffer I got seems invalid, any…

Chen OT
- 3,486
- 2
- 24
- 46
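With FFmpeg's videotoolbox hwaccel, the decoded AVFrame carries the CVPixelBufferRef in data[3]; in Swift that slot is an opaque pointer that must be reinterpreted, not dereferenced. A sketch, assuming FFmpeg's headers are bridged so that AVFrame is visible from Swift:

```swift
import CoreVideo

// Sketch: extract the CVPixelBuffer that FFmpeg's videotoolbox hwaccel
// stores in AVFrame.data[3]. `frame` is assumed to be an
// UnsafeMutablePointer<AVFrame> whose format is AV_PIX_FMT_VIDEOTOOLBOX.
func pixelBuffer(from frame: UnsafeMutablePointer<AVFrame>) -> CVPixelBuffer? {
    guard let raw = frame.pointee.data.3 else { return nil }
    // data[3] is the CVPixelBufferRef itself, not a plane pointer,
    // so reinterpret the pointer value rather than reading through it.
    return unsafeBitCast(raw, to: CVPixelBuffer.self)
}
```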
3
votes
0 answers
iOS VTDecompressionSessionDecodeFrame error -12909 when decoding HEVC
I'm having trouble streaming raw H.265 over RTSP using VTDecompressionSessionDecodeFrame. The three main steps I take are the following:
OSStatus status = CMVideoFormatDescriptionCreateFromHEVCParameterSets(kCFAllocatorDefault, 3,…

Mench
- 31
- 4
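A -12909 from an HEVC decode often traces back to the format-description step quoted above; a sketch of building one from the VPS/SPS/PPS, assuming the three parameter-set byte arrays have been parsed out of the RTSP stream with Annex B start codes stripped:

```swift
import VideoToolbox

// Sketch: build a CMVideoFormatDescription from HEVC VPS/SPS/PPS.
// `vps`, `sps`, `pps` are assumed raw parameter-set payloads (no start codes).
func makeHEVCFormatDescription(vps: [UInt8], sps: [UInt8],
                               pps: [UInt8]) -> CMVideoFormatDescription? {
    var formatDescription: CMVideoFormatDescription?
    let status = vps.withUnsafeBufferPointer { vpsPtr in
        sps.withUnsafeBufferPointer { spsPtr in
            pps.withUnsafeBufferPointer { ppsPtr in
                CMVideoFormatDescriptionCreateFromHEVCParameterSets(
                    allocator: kCFAllocatorDefault,
                    parameterSetCount: 3,
                    parameterSetPointers: [vpsPtr.baseAddress!,
                                           spsPtr.baseAddress!,
                                           ppsPtr.baseAddress!],
                    parameterSetSizes: [vps.count, sps.count, pps.count],
                    nalUnitHeaderLength: 4,   // 4-byte HVCC length prefix
                    extensions: nil,
                    formatDescriptionOut: &formatDescription)
            }
        }
    }
    return status == noErr ? formatDescription : nil
}
```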
3
votes
0 answers
Decode h264 video stream to get image buffer
I followed this post to decode my H.264 video stream frames.
My data frames are as below:
My code:
NSString * const naluTypesStrings[] =
{
    @"0: Unspecified (non-VCL)",
    @"1: Coded slice of a non-IDR picture (VCL)", // P frame
    @"2:…

Nhat Dinh
- 3,378
- 4
- 35
- 51
3
votes
0 answers
FFmpeg enable videotoolbox on iOS
I built FFmpeg for iOS with VideoToolbox enabled, but when decoding video with the function avcodec_decode_video2, it doesn't use VideoToolbox and is very slow on iPhone.
I use the example demuxing_decoding here:…

bigbangvn
- 51
- 8
3
votes
1 answer
VTCompressionSessionCreate always crashes
It crashes when I try to create a VTCompressionSessionRef using VTCompressionSessionCreate. Can anyone tell me why?
dispatch_sync(aQueue, ^{
// Create the compression session
OSStatus status = VTCompressionSessionCreate(NULL,…

user5725770
- 31
- 3
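A common cause of a crash at VTCompressionSessionCreate is a bad out-parameter or a callback pointer that captures Swift context. A minimal creation sketch with the arguments spelled out (dimensions are placeholders):

```swift
import VideoToolbox

// Sketch: create an H.264 compression session. The output callback is a
// C function pointer, so it must not capture Swift context; per-instance
// state travels through `refcon` instead.
let outputCallback: VTCompressionOutputCallback = { _, _, status, _, sampleBuffer in
    guard status == noErr, let sampleBuffer = sampleBuffer else { return }
    print("encoded sample, bytes:", CMSampleBufferGetTotalSampleSize(sampleBuffer))
}

var session: VTCompressionSession?
let status = VTCompressionSessionCreate(
    allocator: kCFAllocatorDefault,
    width: 1280, height: 720,          // placeholder dimensions
    codecType: kCMVideoCodecType_H264,
    encoderSpecification: nil,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: outputCallback,
    refcon: nil,
    compressionSessionOut: &session)   // must point at a real variable
assert(status == noErr && session != nil)
```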
3
votes
1 answer
VideoToolbox examples on iOS?
I've seen various articles indicating that iOS 8 can do hardware encoding of H.264.
However, I'm having trouble finding any real code examples of this, and I'm not sure where to start.
Ideally, I want to be able to create h264…

Stefan Kendall
- 66,414
- 68
- 253
- 406
3
votes
2 answers
CMVideoFormatDescriptionCreateFromH264ParameterSets in Swift
The CoreMedia/Video Toolbox API uses a lot of pointers, which confuse me in Swift!
The SPS and PPS data come from my H.264 stream, and I'm simply trying to create a VFD for them.
I have tried the following and expected it to work, but I get a…

ZENUAV
- 31
- 4
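The pointer gymnastics can be contained with `withUnsafeBufferPointer`; a sketch, assuming `sps` and `pps` are `[UInt8]` NAL payloads with start codes already stripped:

```swift
import VideoToolbox

// Sketch: wrap SPS/PPS byte arrays into the pointer-of-pointers shape that
// CMVideoFormatDescriptionCreateFromH264ParameterSets expects.
// `sps` and `pps` are assumed raw NAL payloads without start codes.
func makeH264FormatDescription(sps: [UInt8],
                               pps: [UInt8]) -> CMVideoFormatDescription? {
    var formatDescription: CMVideoFormatDescription?
    let status = sps.withUnsafeBufferPointer { spsPtr in
        pps.withUnsafeBufferPointer { ppsPtr in
            CMVideoFormatDescriptionCreateFromH264ParameterSets(
                allocator: kCFAllocatorDefault,
                parameterSetCount: 2,
                parameterSetPointers: [spsPtr.baseAddress!, ppsPtr.baseAddress!],
                parameterSetSizes: [sps.count, pps.count],
                nalUnitHeaderLength: 4,   // AVCC 4-byte length prefix
                formatDescriptionOut: &formatDescription)
        }
    }
    return status == noErr ? formatDescription : nil
}
```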
3
votes
1 answer
Using VTCompressionSession as in WWDC2014
The documentation on this library is essentially non-existent, so I really need your help here.
Goal: I need H.264 encoding (preferably with both audio and video, but video alone is fine, and I'll spend a few days getting audio to work too)…

dcheng
- 1,827
- 1
- 11
- 20
3
votes
1 answer
How to set MaxH264SliceBytes property of VTCompressionSession
iOS VTCompressionSession has a kVTCompressionPropertyKey_MaxH264SliceBytes property. However, I cannot set it on my VTCompressionSession; doing so returns a -12900 error code…

Alpha
- 61
- 4
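-12900 is kVTPropertyNotSupportedErr: the session was created fine, but the active encoder doesn't implement that key (the iOS hardware H.264 encoder is known to reject it). A sketch that sets the property and checks the result, assuming `session` already exists:

```swift
import VideoToolbox

// Sketch: attempt to cap H.264 slice size; `session` is assumed to be an
// existing VTCompressionSession.
func setMaxSliceBytes(_ session: VTCompressionSession, bytes: Int) {
    let status = VTSessionSetProperty(
        session,
        key: kVTCompressionPropertyKey_MaxH264SliceBytes,
        value: NSNumber(value: bytes))
    if status == kVTPropertyNotSupportedErr {   // -12900
        // The active (hardware) encoder doesn't support this key;
        // there is no supported fallback on such devices.
        print("MaxH264SliceBytes not supported by this encoder")
    }
}
```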
2
votes
1 answer
kVTVideoDecoderBadDataErr when using VTDecompressionSessionDecodeFrame and H264 NAL units
My code can successfully extract all the NAL units of an H.264 stream that is packed into an AVI file. I can also parse the SPS, PPS, and NAL unit types 1 and 5. I then extract a whole GOP (group of pictures), starting with the SPS and PPS,…

Lupurus
- 3,618
- 2
- 29
- 59
2
votes
0 answers
Hardware problem? iPhone 12 spends more time in VTCompressionSessionEncodeFrame and AVAssetWriter than iPhone XS
This question is similar to
[https://developer.apple.com/forums/thread/127613]
In my demo, the average execution time of VTCompressionSessionEncodeFrame on an iPhone 12 is 10 ms, while an iPhone XS takes only 6 ms. If I decrease the frequency of calling that…

xu yang
- 21
- 1
2
votes
1 answer
Is there a way to stream a Metal Texture to Youtube on iOS?
I was looking into ReplayKit here https://developer.apple.com/videos/play/wwdc2018/601/ and was interested in whether it would be possible to stream the result of a Metal render pass to YouTube. The use case involves giving the user…

synchronizer
- 1,955
- 1
- 14
- 37
2
votes
0 answers
iOS - fails to decode HEVC (H.265) stream if resolution is over 1080p
I am using Apple's VideoToolbox API to decode an HEVC stream, with an AVSampleBufferDisplayLayer for rendering the decoded frames.
I can successfully decode frames if the source resolution is 1080p (1920 × 1080) or less.
If the…

user3001078
- 21
- 1