I am writing an app using the Camera2 API, which should show a preview from the camera and take a picture. Currently my code works as follows:
1. When the Camera Fragment is instantiated, wait for `TextureView.SurfaceTextureListener.onSurfaceTextureAvailable` to be called
2. In the ViewModel, get the available and suitable picture and preview sizes from `CameraCharacteristics`, and emit the found preview size to the Fragment with LiveData
3. The Fragment observes the preview size LiveData and calls `setDefaultBufferSize` with the new size on its `TextureView`'s `SurfaceTexture`
4. When the new size is set, the capture session is created and a repeating preview request is set, so the `TextureView` starts to show the image from the camera
5. To avoid disrupting other camera apps, everything camera-related is cleared after the Fragment's `onPause`, and steps 1-4 are followed again after `onResume`
The `Surface` instance is shared between the Fragment and the camera logic classes: the shared variable is initialized with it in `TextureView.SurfaceTextureListener.onSurfaceTextureAvailable` and is set to null when `TextureView.SurfaceTextureListener.onSurfaceTextureDestroyed` is called.
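For reference, here is a condensed sketch of that flow (not my actual project code — `CameraViewModel`, `chooseSizes`, `startPreviewSession`, `releaseCamera`, and the layout/view IDs are placeholder names standing in for my ViewModel and camera logic):

```kotlin
import android.graphics.SurfaceTexture
import android.os.Bundle
import android.view.Surface
import android.view.TextureView
import android.view.View
import androidx.fragment.app.Fragment
import androidx.fragment.app.viewModels

// Condensed sketch of the flow above; CameraViewModel and its methods are placeholders.
class CameraFragment : Fragment(R.layout.fragment_camera), TextureView.SurfaceTextureListener {

    private val cameraViewModel: CameraViewModel by viewModels()
    private lateinit var textureView: TextureView
    private var previewSurface: Surface? = null   // shared with the camera logic classes

    override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
        super.onViewCreated(view, savedInstanceState)
        textureView = view.findViewById(R.id.texture_view)
        textureView.surfaceTextureListener = this

        // Step 3: apply the preview size chosen by the ViewModel, then start the session (step 4)
        cameraViewModel.previewSize.observe(viewLifecycleOwner) { size ->
            textureView.surfaceTexture?.setDefaultBufferSize(size.width, size.height)
            previewSurface?.let { cameraViewModel.startPreviewSession(it) }
        }
    }

    // Step 1: wait for the SurfaceTexture; step 2 (size lookup via CameraCharacteristics)
    // happens inside the ViewModel
    override fun onSurfaceTextureAvailable(surface: SurfaceTexture, width: Int, height: Int) {
        previewSurface = Surface(surface)
        cameraViewModel.chooseSizes(width, height)
    }

    override fun onSurfaceTextureDestroyed(surface: SurfaceTexture): Boolean {
        previewSurface = null
        return true
    }

    override fun onSurfaceTextureSizeChanged(surface: SurfaceTexture, width: Int, height: Int) = Unit
    override fun onSurfaceTextureUpdated(surface: SurfaceTexture) = Unit

    // Step 5: release everything after onPause; steps 1-4 run again after onResume
    override fun onPause() {
        cameraViewModel.releaseCamera()
        super.onPause()
    }
}
```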
This works fine on several devices from popular brands running modern Android versions, but the app also has to work on a particular generic Chinese tablet with Android 6 ("CameraManager: Using legacy camera HAL"), and there I face a problem.
- When the camera is instantiated and the preview is started, I see that the preview size is 640x480 (so the image is stretched), even though the size passed to `setDefaultBufferSize` is 1280x720
- Logcat is also full of continuous `Surface::setBuffersUserDimensions(this=0x7f55fb5200,w=640,h=480)` messages
- I've found on SO that on some Samsung devices with Android 5 some resolutions may not really be available for Camera2, but here, when I close the app and open it again, the preview resolution is 1280x720 as needed
- So my guess is that I may call `setDefaultBufferSize` too early during the first Camera Fragment setup, and the needed resolution is only "picked up" when the view is recreated after the app was minimized
- I also tried to call `setDefaultBufferSize` in a lambda passed to `TextureView.post` (see the sketch after this list), and it solved the problem except for the case when I have to ask for the user's permissions on the Camera Fragment (i.e. when the user opens the camera for the first time), so the Fragment is paused a few times to show the permission pop-ups. However, without `TextureView.post` the call to `setDefaultBufferSize` still happens on the main thread, so I guess the delay introduced by `TextureView.post` was the game changer here
- Also, in the `setDefaultBufferSize` docs I see: "The new default buffer size will take effect the next time the image producer requests a buffer to fill. For Canvas this will be the next time Surface.lockCanvas is called. For OpenGL ES, the EGLSurface should be destroyed (via eglDestroySurface), made not-current (via eglMakeCurrent), and then recreated (via eglCreateWindowSurface) to ensure that the new default size has taken effect." It seems to me that this may be related to my case