I am currently using the WebCamTexture class to display live camera feed until the user takes a snapshot, and then I use said snapshot as a Texture in my app.
Here's the code I'm using at the moment:
private WebCamTexture cameraTexture;
private Texture2D snapshot;
public UITexture webCamTexture;

// Copy the current camera frame into a Texture2D
snapshot = new Texture2D(cameraTexture.width, cameraTexture.height);
snapshot.SetPixels(cameraTexture.GetPixels());
snapshot.Apply();
webCamTexture.mainTexture = snapshot;
Note: the UITexture class comes from NGUI; it's only used to display the texture in the scene.
On Android devices there is no problem. However, on iOS devices (tested on iPad 2 and iPad 3), the texture instantly becomes blurry when I set it. Is this a focus problem?
I've tried a couple of things, mainly waiting for the end of the frame before taking the snapshot and calling cameraTexture.Pause() before reading the pixels, to no avail.
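For reference, the end-of-frame attempt looks roughly like this (a sketch; the coroutine name TakeSnapshot and starting it from a button handler are my assumptions, not part of the original code):

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical coroutine illustrating the attempt described above:
// wait until the frame has finished rendering, pause the camera feed,
// then copy its pixels into the snapshot texture.
private IEnumerator TakeSnapshot()
{
    yield return new WaitForEndOfFrame();

    cameraTexture.Pause();

    snapshot = new Texture2D(cameraTexture.width, cameraTexture.height);
    snapshot.SetPixels(cameraTexture.GetPixels());
    snapshot.Apply();

    webCamTexture.mainTexture = snapshot;
}
```

Started with StartCoroutine(TakeSnapshot()) from the capture button, this still produces the same blurry result on iOS.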
Why does the iOS texture become blurry?