
I am currently using the WebCamTexture class to display the live camera feed until the user takes a snapshot, and then I use that snapshot as a Texture in my app.

Here's the code I'm using at the moment:

private WebCamTexture cameraTexture;
private Texture2D snapshot;
public UITexture webCamTexture;

snapshot = new Texture2D(cameraTexture.width, cameraTexture.height);
snapshot.SetPixels(cameraTexture.GetPixels());
snapshot.Apply();
webCamTexture.mainTexture = snapshot;

Note: The UITexture class comes from NGUI; it's only used to display the texture in the scene.

On Android devices, there appears to be no problem. However, when I use this on iOS devices (tested on an iPad 2 and an iPad 3), the texture instantly becomes blurry when I set it. Is this a focus problem?

I've tried a couple of things, mainly waiting for the end of the frame before taking the shot and calling cameraTexture.Pause() before getting the pixels, to no avail.
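For reference, this is roughly what that attempt looked like (a coroutine sketch using the same fields as the snippet above):

```csharp
private IEnumerator TakeSnapshot()
{
    // Wait until rendering for this frame is finished before reading pixels
    yield return new WaitForEndOfFrame();

    cameraTexture.Pause(); // freeze the live feed before copying

    snapshot = new Texture2D(cameraTexture.width, cameraTexture.height);
    snapshot.SetPixels(cameraTexture.GetPixels());
    snapshot.Apply();
    webCamTexture.mainTexture = snapshot;

    cameraTexture.Play(); // resume the live feed
}
```

The result is the same: the snapshot is still blurry on iOS.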

Why does the iOS texture become blurry?

Bypp
  • What is the requested size of `cameraTexture`? – Sergey Krusch Jun 10 '14 at 02:35
  • The size I use is not set by code, I just apply the texture to a prefab I made, which varies depending on the resolution of the platform I use. It's not full screen, if that's what you wanted to know. – Bypp Jun 10 '14 at 13:04
  • Done some further testing. Apparently the maximum resolution supported by Unity is 1280x720; I don't know if this matters, because the device I've tested on has only a 1024x768 resolution... Anyone? – Bypp Jun 10 '14 at 19:50
  • Well, iPad has 2 cameras. 720p is the front camera resolution. – Sergey Krusch Jun 10 '14 at 20:41
  • Unsure if resolution truly is my problem here... The live camera feed looks clear when I display it, the problem occurs when I get/set the pixels, then and only then it becomes blurry. – Bypp Jun 11 '14 at 12:53
  • I haven't worked with cameras, unfortunately. But there's no one to answer. So, I'll continue guessing :D Try to do `cameraTexture.EncodeToPNG()`, save it to file and see if it's blurred. – Sergey Krusch Jun 11 '14 at 13:18

2 Answers


I had the same problem, but setting mipmaps to false in the Texture2D constructor seemed to fix it for me.

I.e.:

snapshot = new Texture2D(webcamTexture.width, webcamTexture.height, TextureFormat.ARGB32, false);

Unfortunately, I don't have an explanation for why it works.
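Applied to the snippet from the question, the full capture would look roughly like this (assuming the question's field names):

```csharp
// Last constructor argument disables mipmap generation, so the UI
// samples the full-resolution texture instead of a lower mip level.
snapshot = new Texture2D(cameraTexture.width, cameraTexture.height,
                         TextureFormat.ARGB32, false);
snapshot.SetPixels(cameraTexture.GetPixels());
snapshot.Apply();
webCamTexture.mainTexture = snapshot;
```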

  • Mipmapping creates lower-resolution versions of the texture and fades between them depending on how far the rendering camera sits from the quad being drawn. Since you are using NGUI, the UI may be rendered as quads far from the camera, so the texture is displayed at a mipmap level with less detail. – Chris Apr 25 '16 at 23:08

I believe that, since it works fine on Android, the problem is related to how the images are captured. When you initialize a WebCamTexture, Unity captures from the camera as if it were recording a video, and while recording video you cannot use focus, anti-shake, or ISO settings. So when you take a photo on iOS this way, you are actually just grabbing frames from a feed that is not really intended for sharp photos. The plugin CameraCaptureKit (https://www.assetstore.unity3d.com/en/#!/content/56673) extends WebCamTexture and solves this by using two streams: one for the preview while waiting for the user to capture a still image, and a second high-resolution one for actually taking the photo. Either implementing the same approach yourself by integrating some native Objective-C code as an iOS plugin, or using something like CameraCaptureKit, could help fix your issue.

Chris