Human joys and sorrows are not interlinked. — Lu Xun
- Android Camera Series (1): SurfaceView + Camera
- Android Camera Series (2): TextureView + Camera
- Android Camera Series (3): GLSurfaceView + Camera
This series walks through Camera operations in Android development: preview approaches, video recording, and more. The project keeps code coupling low, so readers can take something useful from it (and copy it easily :) ); for me it is also a good opportunity to consolidate what I know.
In this chapter we use GLSurfaceView to preview the Camera. Building on the result of Android Camera Series (1): SurfaceView + Camera, the Camera is already wrapped, so CameraManager can be used as-is.
I. Using GLSurfaceView
GLSurfaceView is essentially a SurfaceView that additionally manages an EGL context and a render thread internally, so we can use the OpenGL API directly to transform the image, for example to apply black-and-white, beauty, and other complex filter effects.
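As a quick illustration of what such a filter looks like, here is a hypothetical grayscale fragment shader for the external camera texture; the uniform and varying names are assumptions for this sketch, not code taken from lib-camera:
// Hypothetical grayscale fragment shader for an external (OES) camera texture.
// Luma weights follow the BT.601 convention.
static final String GRAYSCALE_FRAGMENT_SHADER =
        "#extension GL_OES_EGL_image_external : require\n" +
        "precision mediump float;\n" +
        "varying vec2 vTextureCoord;\n" +
        "uniform samplerExternalOES sTexture;\n" +
        "void main() {\n" +
        "    vec4 color = texture2D(sTexture, vTextureCoord);\n" +
        "    float gray = dot(color.rgb, vec3(0.299, 0.587, 0.114));\n" +
        "    gl_FragColor = vec4(vec3(gray), color.a);\n" +
        "}\n";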
- Define a custom CameraGLSurfaceView that extends GLSurfaceView.
- Implement the SurfaceTexture.OnFrameAvailableListener interface and request a render pass for every frame in the onFrameAvailable callback; in other words, we draw the Camera preview ourselves.
- GLSurfaceView exposes the Renderer drawing interface. We define CameraSurfaceRenderer to implement it and set this custom renderer when the GLSurfaceView is initialized. In the onSurfaceCreated callback we create the external-texture-backed SurfaceTexture and set the OnFrameAvailableListener to receive Camera frame callbacks.
- Implement our custom CameraCallback interface to listen for Camera state changes.
- Be sure to implement onResume and onPause and call them from the corresponding Activity lifecycle methods (see the Activity sketch after the full class below). This is the root cause of most Camera bugs.
public class CameraGLSurfaceView extends GLSurfaceView implements SurfaceTexture.OnFrameAvailableListener, CameraCallback {
private static final String TAG = CameraGLSurfaceView.class.getSimpleName();
private Context mContext;
private SurfaceTexture mSurfaceTexture;
private CameraHandler mCameraHandler;
private boolean hasSurface; // whether the preview surface currently exists
private CameraManager mCameraManager;
private int mRatioWidth = 0;
private int mRatioHeight = 0;
private int mGLSurfaceWidth;
private int mGLSurfaceHeight;
private CameraSurfaceRenderer mRenderer;
public CameraGLSurfaceView(Context context) {
super(context);
init(context);
}
public CameraGLSurfaceView(Context context, AttributeSet attrs) {
super(context, attrs);
init(context);
}
private void init(Context context) {
mContext = context;
mCameraHandler = new CameraHandler(this);
mCameraManager = new CameraManager(context);
mCameraManager.setCameraCallback(this);
setEGLContextClientVersion(2);
mRenderer = new CameraSurfaceRenderer(mCameraHandler);
setRenderer(mRenderer);
setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
}
public SurfaceTexture getSurfaceTexture() {
return mSurfaceTexture;
}
private void setAspectRatio(int width, int height) {
if (width < 0 || height < 0) {
throw new IllegalArgumentException("Size cannot be negative.");
}
mRatioWidth = width;
mRatioHeight = height;
requestLayout();
}
@Override
protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
super.onMeasure(widthMeasureSpec, heightMeasureSpec);
int width = MeasureSpec.getSize(widthMeasureSpec);
int height = MeasureSpec.getSize(heightMeasureSpec);
if (0 == mRatioWidth || 0 == mRatioHeight) {
setMeasuredDimension(width, height);
} else {
if (width < height * mRatioWidth / mRatioHeight) {
setMeasuredDimension(width, width * mRatioHeight / mRatioWidth);
} else {
setMeasuredDimension(height * mRatioWidth / mRatioHeight, height);
}
}
}
@Override
public void onFrameAvailable(SurfaceTexture surfaceTexture) {
requestRender();
}
/**
* Connects the SurfaceTexture to the Camera preview output, and starts the preview.
*/
private void handleSetSurfaceTexture(SurfaceTexture st) {
Logs.i(TAG, "handleSetSurfaceTexture.");
mSurfaceTexture = st;
hasSurface = true;
mSurfaceTexture.setOnFrameAvailableListener(this);
openCamera();
}
/**
 * Stores the GL surface size reported by the renderer and updates the aspect ratio.
 *
 * @param width  GL surface width in pixels
 * @param height GL surface height in pixels
 */
private void handleSurfaceChanged(int width, int height) {
Logs.i(TAG, "handleSurfaceChanged.");
mGLSurfaceWidth = width;
mGLSurfaceHeight = height;
setAspectRatio();
}
/**
 * Opens the camera and starts the preview.
 */
public void onResume() {
super.onResume();
if (hasSurface) {
// When the activity is paused but not stopped, the surface still exists, so
// surfaceCreated() will not be called again; the camera has to be (re)opened here.
openCamera();
}
}
/**
 * Stops the preview and closes the camera.
 */
public void onPause() {
super.onPause();
closeCamera();
}
public void onDestroy() {
mCameraHandler.invalidateHandler();
}
/**
 * Opens the camera.
 */
private void openCamera() {
if (mSurfaceTexture == null) {
Logs.e(TAG, "mSurfaceTexture is null.");
return;
}
if (mCameraManager.isOpen()) {
Logs.w(TAG, "Camera is opened!");
return;
}
mCameraManager.openCamera();
if (mCameraManager.isOpen()) {
mCameraManager.startPreview(mSurfaceTexture);
}
}
private void closeCamera() {
mCameraManager.releaseCamera();
queueEvent(() -> mRenderer.notifyPausing());
mSurfaceTexture = null;
}
@Override
public void onOpen() {
}
@Override
public void onOpenError(int error, String msg) {
}
@Override
public void onPreview(int previewWidth, int previewHeight) {
Logs.i(TAG, "onPreview " + previewWidth + " " + previewHeight);
queueEvent(() -> mRenderer.setCameraPreviewSize(previewWidth, previewHeight));
setAspectRatio();
}
@Override
public void onPreviewError(int error, String msg) {
}
@Override
public void onClose() {
}
private void setAspectRatio() {
int previewWidth = mCameraManager.getPreviewWidth();
int previewHeight = mCameraManager.getPreviewHeight();
if (mGLSurfaceWidth > mGLSurfaceHeight) {
setAspectRatio(previewWidth, previewHeight);
} else {
setAspectRatio(previewHeight, previewWidth);
}
}
/**
* Handles camera operation requests from other threads. Necessary because the Camera
* must only be accessed from one thread.
* <p>
* The object is created on the UI thread, and all handlers run there. Messages are
* sent from other threads, using sendMessage().
*/
static class CameraHandler extends Handler {
public static final int MSG_SET_SURFACE_TEXTURE = 0;
public static final int MSG_SURFACE_CHANGED = 1;
private WeakReference<CameraGLSurfaceView> mWeakGLSurfaceView;
public CameraHandler(CameraGLSurfaceView view) {
mWeakGLSurfaceView = new WeakReference<>(view);
}
/**
* Drop the reference to the activity. Useful as a paranoid measure to ensure that
* attempts to access a stale Activity through a handler are caught.
*/
public void invalidateHandler() {
mWeakGLSurfaceView.clear();
}
@Override
public void handleMessage(@NonNull Message msg) {
super.handleMessage(msg);
int what = msg.what;
CameraGLSurfaceView view = mWeakGLSurfaceView.get();
if (view == null) {
return;
}
switch (what) {
case MSG_SET_SURFACE_TEXTURE:
view.handleSetSurfaceTexture((SurfaceTexture) msg.obj);
break;
case MSG_SURFACE_CHANGED:
view.handleSurfaceChanged(msg.arg1, msg.arg2);
break;
default:
throw new RuntimeException("unknown msg " + what);
}
}
}
/**
* Renderer object for our GLSurfaceView.
* <p>
* Do not call any methods here directly from another thread -- use the
* GLSurfaceView#queueEvent() call.
*/
static class CameraSurfaceRenderer implements GLSurfaceView.Renderer {
private CameraGLSurfaceView.CameraHandler mCameraHandler;
private final float[] mSTMatrix = new float[16];
private FullFrameRect mFullScreen;
// width/height of the incoming camera preview frames
private boolean mIncomingSizeUpdated;
private int mIncomingWidth;
private int mIncomingHeight;
private int mTextureId = -1;
private SurfaceTexture mSurfaceTexture;
public CameraSurfaceRenderer(CameraGLSurfaceView.CameraHandler cameraHandler) {
mCameraHandler = cameraHandler;
mTextureId = -1;
mIncomingSizeUpdated = false;
mIncomingWidth = mIncomingHeight = -1;
}
/**
* Notifies the renderer thread that the activity is pausing.
* <p>
* For best results, call this *after* disabling Camera preview.
*/
public void notifyPausing() {
if (mSurfaceTexture != null) {
Logs.d(TAG, "renderer pausing -- releasing SurfaceTexture");
mSurfaceTexture.release();
mSurfaceTexture = null;
}
if (mFullScreen != null) {
mFullScreen.release(false); // assume the GLSurfaceView EGL context is about
mFullScreen = null; // to be destroyed
}
mIncomingWidth = mIncomingHeight = -1;
}
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
Logs.i(TAG, "onSurfaceCreated. " + Thread.currentThread().getName());
// Set up the texture blitter that will be used for on-screen display. This
// is *not* applied to the recording, because that uses a separate shader.
mFullScreen = new FullFrameRect(
new Texture2dProgram(Texture2dProgram.ProgramType.TEXTURE_EXT));
mTextureId = mFullScreen.createTextureObject();
// Create a SurfaceTexture, with an external texture, in this EGL context. We don't
// have a Looper in this thread -- GLSurfaceView doesn't create one -- so the frame
// available messages will arrive on the main thread.
mSurfaceTexture = new SurfaceTexture(mTextureId);
mCameraHandler.sendMessage(mCameraHandler.obtainMessage(CameraHandler.MSG_SET_SURFACE_TEXTURE, mSurfaceTexture));
}
@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
gl.glViewport(0, 0, width, height);
mCameraHandler.sendMessage(mCameraHandler.obtainMessage(CameraHandler.MSG_SURFACE_CHANGED, width, height));
}
@Override
public void onDrawFrame(GL10 gl) {
if (mSurfaceTexture == null) return;
mSurfaceTexture.updateTexImage();
if (mIncomingWidth <= 0 || mIncomingHeight <= 0) {
return;
}
if (mIncomingSizeUpdated) {
mFullScreen.getProgram().setTexSize(mIncomingWidth, mIncomingHeight);
mIncomingSizeUpdated = false;
}
mSurfaceTexture.getTransformMatrix(mSTMatrix);
mFullScreen.drawFrame(mTextureId, mSTMatrix);
}
public void setCameraPreviewSize(int width, int height) {
mIncomingWidth = width;
mIncomingHeight = height;
mIncomingSizeUpdated = true;
}
}
}
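For completeness, here is a minimal sketch of how an Activity could drive the view's lifecycle methods; the Activity class, layout, and view id are hypothetical names:
import android.os.Bundle;
import androidx.appcompat.app.AppCompatActivity;

public class CameraPreviewActivity extends AppCompatActivity {
    private CameraGLSurfaceView mCameraView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_camera_preview); // hypothetical layout
        mCameraView = findViewById(R.id.camera_gl_surface_view); // hypothetical id
    }

    @Override
    protected void onResume() {
        super.onResume();
        mCameraView.onResume(); // re-opens the camera if the surface already exists
    }

    @Override
    protected void onPause() {
        mCameraView.onPause(); // closes the camera before the surface is torn down
        super.onPause();
    }

    @Override
    protected void onDestroy() {
        mCameraView.onDestroy(); // drops the CameraHandler's reference to the view
        super.onDestroy();
    }
}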
1. Timing of Camera operations
Unlike SurfaceView and TextureView, GLSurfaceView offers no place to obtain a SurfaceTexture. The Renderer interface does have an onSurfaceCreated callback, but it hands us no SurfaceTexture; instead we have to create an external texture ID ourselves and use it to receive the Camera preview data. How that external texture ends up displayed on the GLSurfaceView is not covered in this chapter; we will come back to it once we set up a complete OpenGL environment in a later article.
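For reference, creating that external texture ID typically looks like the sketch below. This is an assumption about what FullFrameRect#createTextureObject() does internally, written with standard GLES20/GLES11Ext calls rather than code copied from the project:
import android.opengl.GLES11Ext;
import android.opengl.GLES20;

final class OesTextureHelper {
    /** Creates an OES texture; must be called on the GL thread (e.g. in onSurfaceCreated). */
    static int createExternalTexture() {
        int[] textures = new int[1];
        GLES20.glGenTextures(1, textures, 0);
        int textureId = textures[0];
        // Camera frames are sampled through GL_TEXTURE_EXTERNAL_OES, not GL_TEXTURE_2D.
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
        GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        return textureId;
    }
}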
All of the Renderer callbacks run on a dedicated thread. That is exactly why we define a separate class instead of letting CameraGLSurfaceView implement the interface directly: it keeps the renderer callbacks isolated from the other interface methods. Looking at the source of GLSurfaceView#setRenderer, you can see that it starts a GLThread, and all Renderer callbacks run on that thread:
public void setRenderer(Renderer renderer) {
...
mRenderer = renderer;
mGLThread = new GLThread(mThisWeakRef);
mGLThread.start();
}
So we have to hand the freshly created external texture to the UI thread through a Handler; once the UI thread receives the SurfaceTexture it opens the camera. Remember to open the camera in onResume as well:
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
mFullScreen = new FullFrameRect(
new Texture2dProgram(Texture2dProgram.ProgramType.TEXTURE_EXT));
// Create the external texture ID
mTextureId = mFullScreen.createTextureObject();
// Create a SurfaceTexture backed by that external texture in this EGL context
mSurfaceTexture = new SurfaceTexture(mTextureId);
// Hand the SurfaceTexture over to the UI thread
mCameraHandler.sendMessage(mCameraHandler.obtainMessage(CameraHandler.MSG_SET_SURFACE_TEXTURE, mSurfaceTexture));
}
We also override surfaceDestroyed and close the camera both there and in onPause.
Note that surfaceDestroyed is a SurfaceHolder.Callback method and runs on the UI thread.
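A minimal sketch of that override, assuming we override GLSurfaceView's own SurfaceHolder.Callback implementation and simply delegate to the closeCamera() helper shown above:
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
    // Runs on the UI thread; release the camera before the surface goes away.
    closeCamera();
    super.surfaceDestroyed(holder);
}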
2. Sizing the GLSurfaceView
- As with SurfaceView and TextureView, we set the GLSurfaceView's size and aspect ratio in the onPreview callback.
- In the onSurfaceChanged callback we set the size of the GL viewport; occasional preview distortion is usually caused by not calling glViewport here.
@Override
public void onSurfaceChanged(GL10 gl, int width, int height) {
gl.glViewport(0, 0, width, height);
mCameraHandler.sendMessage(mCameraHandler.obtainMessage(CameraHandler.MSG_SURFACE_CHANGED, width, height));
}
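As a concrete example of the math in onMeasure: with a 1280×720 preview in portrait orientation, setAspectRatio(720, 1280) is used. On a hypothetical 1080×2340 surface, height * mRatioWidth / mRatioHeight = 2340 * 720 / 1280 ≈ 1316, which exceeds the width of 1080, so the view is measured as 1080 × (1080 * 1280 / 720) = 1080 × 1920, i.e. exactly 9:16 with no distortion.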
II. Conclusion
This article covered the basic usage and key code for Camera + GLSurfaceView. There is not much new material in this chapter: it walks through how to preview Camera data with a GLSurfaceView. We deliberately did not dig into the OpenGL APIs, or into how the Renderer interface actually gets the frames rendered onto the GLSurfaceView. Later chapters will cover using OpenGL on Android; hopefully things will click when you revisit this one.
The package structure of the lib-camera library is as follows:
Package | Description
---|---
camera | Camera operation helpers, covering both Camera and Camera2, plus the various preview views
encoder | MediaCodec video recording, supporting both ByteBuffer and Surface input
gles | OpenGL ES helpers
permission | Runtime permission handling
util | Utility classes
Each package can be used on its own, keeping coupling to a minimum and making it easy to reuse whatever you need.
GitHub: https://github.com/xiaozhi003/AndroidCamera, Gitee mirror: https://gitee.com/xiaozhi003/android-camera
References:
- https://github.com/afei-cn/CameraDemo
- https://github.com/saki4510t/UVCCamera
- https://github.com/google/grafika