  • I want to develop an iOS SDK for Unity/Cocos/SpriteKit game apps to record video. The main approach is to synthesize a video from screenshots.

  • Right now I am testing with the iOS project exported from a Unity game on an iOS 7.0 device, and I try to use the OpenGL function glReadPixels() to read the pixel buffer, but I always get a black image, which has troubled me for weeks. I tried some solutions, such as setting the context and setting the EAGLView drawing properties before glReadPixels(), but they did not work. Some suggest making sure glReadPixels() is executed before presentRenderbuffer, but how? I cannot modify the code generated by Unity.

  • Solutions like IOSurface are deprecated on iOS 9.0 and seem to work only for UIView elements. RenderTexture and Application.CaptureScreenshot only work inside a Unity project, not in my iOS SDK.

So can anyone give me some suggestions for taking screenshots on iOS for Unity/Cocos/SpriteKit games? Here is my current code:

int w = 320; // size.width / viewport[2]
int h = 480; // size.height / viewport[3]
NSInteger myDataLength = w * h * 4;

// allocate a buffer and read the pixels into it
GLubyte *buffer = (GLubyte *) malloc(myDataLength);

glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

// OpenGL returns the image bottom-up, so flip it vertically in place
GLubyte temp = 0;
int index1 = 0;
int index2 = 0;
for (int y = 0; y < h / 2; y++) {
    index1 = y * w * 4;
    index2 = (h - 1 - y) * w * 4;
    for (int x = 0; x < w * 4; x++) {
        temp = buffer[index1 + x];
        buffer[index1 + x] = buffer[index2 + x];
        buffer[index2 + x] = temp;
    }
}

// make a data provider with the pixel data
// (with a NULL release callback, buffer must stay alive as long as imageRef does)
CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer, myDataLength, NULL);

// prep the ingredients
int bitsPerComponent = 8;
int bitsPerPixel = 32;
int bytesPerRow = 4 * w;
CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();

// make the CGImage; reuse colorSpaceRef and specify the alpha info
// so the 32-bit RGBA data is interpreted correctly
CGImageRef imageRef = CGImageCreate(w, h, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, kCGBitmapByteOrderDefault | kCGImageAlphaLast, provider, NULL, NO, kCGRenderingIntentDefault);

// myImage is always nil when debugging
UIImage *myImage = [UIImage imageWithCGImage:imageRef scale:[UIScreen mainScreen].scale orientation:UIImageOrientationUp];
CGDataProviderRelease(provider);
CGColorSpaceRelease(colorSpaceRef);

1 Answer


Some suggest making sure glReadPixels() is executed before presentRenderbuffer, but how? I cannot modify the code generated by Unity.

What they are saying is that you should wrap that piece of code in a simple function, for example void takeScreenShot() { /* your code here */ }, and then turn that function into a plugin. The Unity manual's guide on building plugins for iOS covers how to do that.

After converting it into a plugin, create a simple script in Unity named UnityCam and attach it to the main Camera. Inside the UnityCam script, call the takeScreenShot() function from your iOS plugin. You should ONLY call it from the OnPostRender() callback, not from Start() or Update().
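
For concreteness, here is a minimal sketch of such a UnityCam script. It assumes your iOS plugin exports a C function named takeScreenShot(); adjust the DllImport binding to whatever your plugin actually exposes.

using System.Runtime.InteropServices;
using UnityEngine;

// Attach to the main Camera. Sketch only: it assumes the native side
// exposes extern "C" void takeScreenShot() in the iOS plugin.
public class UnityCam : MonoBehaviour
{
#if UNITY_IOS && !UNITY_EDITOR
    // Bind to the statically linked iOS plugin.
    [DllImport("__Internal")]
    private static extern void takeScreenShot();
#else
    // Editor/other-platform stub so the script still compiles.
    private static void takeScreenShot() { Debug.Log("takeScreenShot()"); }
#endif

    private bool capture;

    void Update()
    {
        // Request a capture from anywhere (here: on tap/click),
        // but do not call the native function yet.
        if (Input.GetMouseButtonDown(0))
            capture = true;
    }

    // OnPostRender runs after this camera finishes rendering,
    // so glReadPixels() on the native side sees real pixels instead of black.
    void OnPostRender()
    {
        if (capture)
        {
            capture = false;
            takeScreenShot();
        }
    }
}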

If that doesn't work, there is a new, hidden and undocumented Unity API for recording video on iOS. Download Unity 5.4 first, then use the code below to record video on iOS. Unity is working on recording for Android too, but since this question is about iOS, the code below should do it. Note that it only works on iOS 9 and above.

using UnityEngine.Apple.ReplayKit;

void startRecordingOnIOS()
{
    if (ReplayKit.APIAvailable)
    {
        if (ReplayKit.isRecording)
        {
            ReplayKit.StopRecording();
        }
        ReplayKit.StartRecording(true); // true to record the microphone, false to record without it
    }
    else
    {
        //Fallback: use the glReadPixels code from your question
    }
}

void stopRecordingOnIOS()
{
    if (ReplayKit.APIAvailable)
    {
        if (ReplayKit.isRecording)
        {
            ReplayKit.StopRecording();
        }
    }
    else
    {
        //Fallback: use the glReadPixels code from your question
    }
}

void viewRecordedVideoOnIOS()
{
    if (ReplayKit.APIAvailable)
    {
        if (ReplayKit.recordingAvailable)
        {
            ReplayKit.Preview();
        }
    }
    else
    {
        //Fallback: use the glReadPixels code from your question
    }
}

void discardRecordedVideoOnIOS()
{
    if (ReplayKit.APIAvailable)
    {
        if (ReplayKit.recordingAvailable)
        {
            ReplayKit.Discard();
        }
    }
    else
    {
        //Fallback: use the glReadPixels code from your question
    }
}
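
For reference, here is a self-contained sketch of how the ReplayKit calls above might be driven in a scene. The ReplayKitDemo class and OnGUI buttons are only illustrative; a real project would hook the methods up to its own UI.

using UnityEngine;
using UnityEngine.Apple.ReplayKit;

// Sketch only: drives the ReplayKit recording with simple OnGUI buttons.
public class ReplayKitDemo : MonoBehaviour
{
    void OnGUI()
    {
        if (!ReplayKit.APIAvailable) // requires iOS 9 or newer
        {
            GUILayout.Label("ReplayKit not available on this device");
            return;
        }

        if (!ReplayKit.isRecording && GUILayout.Button("Start recording"))
            ReplayKit.StartRecording(true); // true = record the microphone

        if (ReplayKit.isRecording && GUILayout.Button("Stop recording"))
            ReplayKit.StopRecording();

        if (ReplayKit.recordingAvailable && GUILayout.Button("Preview"))
            ReplayKit.Preview(); // shows the iOS preview/share sheet
    }
}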
  • Thanks for your patience. When a Unity project is released to iOS, I open it in Xcode and add my recording SDK. The fact is that I can't modify the code that comes from Unity 5 or Cocos etc. I just want to build a general iOS SDK for games that may be developed with engines like Unity3D or Cocos. – Jack Mei Jun 03 '16 at 02:08
  • @JackMei It's really hard to understand your comment. If you are developing a recording API for Unity, my answer is enough to get you started: use the built-in API from Unity, and if it is not available on that device, use the plugin from your question. If you want anyone to be able to use your SDK in Unity, you have to make it into a plugin so that they can call it from the Unity side. You are getting a black screen because you are not calling it from the `OnPostRender()` function in Unity. All of this is mentioned in my answer. – Programmer Jun 03 '16 at 02:48
  • Really, thanks for your answer. I know it works, but I need to develop a general iOS SDK for Unity/Cocos/SpriteKit games. I am developing a recording API in Xcode, not inside Unity. For the game developers on my team, I will provide an interface for the iOS project exported from one of the three engines. Do you understand what I mean? – Jack Mei Jun 06 '16 at 02:01