I get the callback on my own (non-main) callback thread, continuously, since this is 1920 x 1088 (yes, 88), 30 fps video:
```java
@Override
public void onYuvDataReceived(MediaFormat mediaFormat, ByteBuffer byteBuffer,
                              final int width, final int height) {
```
From the mediaFormat I can guess the colorFormat: COLOR_FormatYUV420SemiPlanar or COLOR_FormatYUV420Planar (I'd like to support both, but at least one of them).
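To keep the two layouts straight for myself, here is a minimal plain-Java sketch of where the planes should start for each colorFormat, assuming stride == width and no padding (class and method names are mine, not from any API):

```java
// Sketch: plane offsets inside a width x height YUV 4:2:0 frame,
// assuming stride == width and no row padding.
final class YuvLayout {

    // Planar (I420 / COLOR_FormatYUV420Planar): full Y plane,
    // then the U plane, then the V plane.
    static int[] planarOffsets(int width, int height) {
        int ySize = width * height;
        int chromaSize = ySize / 4;
        return new int[] { 0, ySize, ySize + chromaSize }; // Y, U, V
    }

    // Semi-planar (NV12/NV21 / COLOR_FormatYUV420SemiPlanar): full Y plane,
    // then a single interleaved chroma plane (UV or VU order).
    static int[] semiPlanarOffsets(int width, int height) {
        int ySize = width * height;
        return new int[] { 0, ySize }; // Y, interleaved chroma
    }

    // 4:2:0 is 12 bits per pixel in both cases.
    static int frameSize(int width, int height) {
        return width * height * 3 / 2;
    }
}
```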
Now I want to draw these frames, preferably on a TextureView, but possibly a SurfaceView. Also, converting to RGB/Bitmap is inefficient; in my case it can even take 60+ ms per frame (and this is 30 fps video...), so I should stick to some "native way" or "GPU way" (right?).
WebRTC way

I found the WebRTC lib very helpful; it contains some breadcrumbs for the rendering case, but I couldn't achieve "video": only the first frame is drawn (correctly, no problem):
```java
int rowStrideY = width;
int rowStrideU = width / 2;
int rowStrideV = width / 2; // TODO depends on the colorFormat

int basicOffset = byteBuffer.remaining() / 6;
int offsetY = 0;
int offsetU = basicOffset * 4;
int offsetV = basicOffset * 5;

ByteBuffer i420ByteBuffer = byteBuffer.duplicate();
i420ByteBuffer.position(offsetY);
final ByteBuffer dataY = i420ByteBuffer.slice();
i420ByteBuffer.position(offsetU);
final ByteBuffer dataU = i420ByteBuffer.slice();
i420ByteBuffer.position(offsetV);
final ByteBuffer dataV = i420ByteBuffer.slice();

JavaI420Buffer javaI420Buffer = JavaI420Buffer.wrap(width, height,
        dataY, rowStrideY,
        dataU, rowStrideU,
        dataV, rowStrideV,
        () -> { JniCommon.nativeFreeByteBuffer(i420ByteBuffer); });

VideoFrame frame = new VideoFrame(javaI420Buffer, 0, System.currentTimeMillis());
surfaceViewRenderer.onFrame(frame);
// turnOffYuv(); // with this, no crash, but only the first frame is drawn
```
(parts of this were taken from other answers)
The SurfaceViewRenderer throws when it is fed the second/next frame:
```
Fatal Exception: SurfaceViewRendererEglRenderer
Process: thats.my.package, PID: 12970
java.lang.IllegalStateException: buffer is inaccessible
    at java.nio.DirectByteBuffer.slice(DirectByteBuffer.java:159)
    at org.webrtc.JavaI420Buffer.getDataY(JavaI420Buffer.java:118)
    at org.webrtc.VideoFrameDrawer$YuvUploader.uploadFromBuffer(VideoFrameDrawer.java:114)
    at org.webrtc.VideoFrameDrawer.drawFrame(VideoFrameDrawer.java:221)
    at org.webrtc.EglRenderer.renderFrameOnRenderThread(EglRenderer.java:664)
    at org.webrtc.EglRenderer.lambda$im8Sa54i366ODPy-soB9Bg4O-w4(Unknown Source:0)
    at org.webrtc.-$$Lambda$EglRenderer$im8Sa54i366ODPy-soB9Bg4O-w4.run(Unknown Source:2)
    at android.os.Handler.handleCallback(Handler.java:883)
    at android.os.Handler.dispatchMessage(Handler.java:100)
    at org.webrtc.EglRenderer$HandlerWithExceptionCallback.dispatchMessage(EglRenderer.java:103)
    at android.os.Looper.loop(Looper.java:214)
    at android.os.HandlerThread.run(HandlerThread.java:67)
```
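My guess is that the exception comes from the backing memory being freed or recycled while EglRenderer is still reading it on its own render thread: onFrame() is asynchronous, and the code above only wraps slices of the decoder's own byteBuffer. What I mean by giving the renderer its own copy, as a plain java.nio sketch (the WebRTC wrapping itself is left out, and this is only my assumption about the cause):

```java
import java.nio.ByteBuffer;

// Sketch: deep-copy the incoming frame into a freshly allocated direct buffer
// before slicing Y/U/V views out of it, so the consumer thread owns the memory
// even after the producer recycles the original buffer.
final class FrameCopy {
    static ByteBuffer copyToDirect(ByteBuffer src) {
        ByteBuffer owned = ByteBuffer.allocateDirect(src.remaining());
        owned.put(src.duplicate()); // duplicate() so src's position is untouched
        owned.rewind();
        return owned;
    }
}
```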
Some sources suggest creating a VideoTrack, adding a sink, etc., but I didn't manage to set one up myself, and I'm a bit confused by (and scared of) the native methods in that code.
OpenGL ES way

Being a bit scared of WebRTC's native internals, for my purpose I turned to another, more "pure" way: OpenGL ES with a GLSurfaceView. I found this renderer, made some adjustments, and got colored, stretched video, mostly dark, but very smooth...
In onCreateView in the Fragment:

```java
mGLSurfaceView = rootView.findViewById(R.id.GLSurfaceView);
mGLSurfaceView.setEGLContextClientVersion(2);
mRenderer = new NV21Renderer();
mGLSurfaceView.setRenderer(mRenderer);
mGLSurfaceView.setPreserveEGLContextOnPause(true);
mGLSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
```
Frame passing:

```java
@Override
public void onYuvData(MediaFormat mediaFormat, byte[] data, int dataSize, int width, int height) {
    // data[] here is NV21
    // YuvImage yuvImage = new YuvImage(data, ImageFormat.NV21, width, height, null);
    // mediaFormat contains the "raw" colorFormat
    mGLSurfaceView.queueEvent(new Runnable() {
        @Override
        public void run() {
            mRenderer.onPreviewFrame(data);
            mGLSurfaceView.requestRender();
        }
    });
}
```
The renderer (these GLES20 calls are my first lines of OpenGL ever):
```java
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.IntBuffer;
import java.nio.ShortBuffer;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import timber.log.Timber;

public class NV21Renderer implements GLSurfaceView.Renderer {

    public static final int recWidth = 1920;
    public static final int recHeight = 1088;

    private static final int LENGTH = recWidth * recHeight;
    private static final int LENGTH_4 = recWidth * recHeight / 4;
    private static final int U_INDEX = LENGTH;
    private static final int V_INDEX = LENGTH + LENGTH_4;

    private int[] yTextureNames;
    private int[] uTextureNames;
    private int[] vTextureNames;

    private final FloatBuffer mVertices;
    private final ShortBuffer mIndices;
    private int mProgramObject;
    private int mPositionLoc;
    private int mTexCoordLoc;

    private int yTexture;
    private int uTexture;
    private int vTexture;

    private final ByteBuffer yBuffer;
    private final ByteBuffer uBuffer;
    private final ByteBuffer vBuffer;

    byte[] yData = new byte[LENGTH];
    byte[] uData = new byte[LENGTH_4];
    byte[] vData = new byte[LENGTH_4];

    private boolean surfaceCreated = false;
    private boolean dirty = false; // to suppress drawing the first frame while there is no data

    public NV21Renderer() {
        mVertices = ByteBuffer.allocateDirect(mVerticesData.length * 4)
                .order(ByteOrder.nativeOrder()).asFloatBuffer();
        mVertices.put(mVerticesData).position(0);
        mIndices = ByteBuffer.allocateDirect(mIndicesData.length * 2)
                .order(ByteOrder.nativeOrder()).asShortBuffer();
        mIndices.put(mIndicesData).position(0);
        yBuffer = ByteBuffer.allocateDirect(LENGTH);
        uBuffer = ByteBuffer.allocateDirect(LENGTH_4);
        vBuffer = ByteBuffer.allocateDirect(LENGTH_4);
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        Timber.d("onSurfaceCreated");
        GLES20.glEnable(GLES20.GL_TEXTURE_2D);
        GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
        final String vShaderStr = vertexShader;
        final String fShaderStr = fragmentShader;
        IntBuffer frameBuffer = IntBuffer.allocate(1);
        IntBuffer renderBuffer = IntBuffer.allocate(1);
        GLES20.glGenFramebuffers(1, frameBuffer);
        GLES20.glGenRenderbuffers(1, renderBuffer);
        GLES20.glActiveTexture(GLES20.GL_ACTIVE_TEXTURE);
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, frameBuffer.get(0));
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
        GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, renderBuffer.get(0));
        GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_DEPTH_COMPONENT16,
                recWidth, recHeight);
        IntBuffer parameterBufferHeigth = IntBuffer.allocate(1);
        IntBuffer parameterBufferWidth = IntBuffer.allocate(1);
        GLES20.glGetRenderbufferParameteriv(GLES20.GL_RENDERBUFFER,
                GLES20.GL_RENDERBUFFER_WIDTH, parameterBufferWidth);
        GLES20.glGetRenderbufferParameteriv(GLES20.GL_RENDERBUFFER,
                GLES20.GL_RENDERBUFFER_HEIGHT, parameterBufferHeigth);
        GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
                GLES20.GL_RENDERBUFFER, renderBuffer.get(0));
        if (GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER) != GLES20.GL_FRAMEBUFFER_COMPLETE) {
            Timber.w("gl framebuffer status != framebuffer complete %s",
                    GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER));
        }
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
        mProgramObject = loadProgram(vShaderStr, fShaderStr);
        mPositionLoc = GLES20.glGetAttribLocation(mProgramObject, "a_position");
        mTexCoordLoc = GLES20.glGetAttribLocation(mProgramObject, "a_texCoord");
        GLES20.glEnable(GLES20.GL_TEXTURE_2D);
        yTexture = GLES20.glGetUniformLocation(mProgramObject, "y_texture");
        yTextureNames = new int[1];
        GLES20.glGenTextures(1, yTextureNames, 0);
        int yTextureName = yTextureNames[0];
        GLES20.glEnable(GLES20.GL_TEXTURE_2D);
        uTexture = GLES20.glGetUniformLocation(mProgramObject, "u_texture");
        uTextureNames = new int[1];
        GLES20.glGenTextures(1, uTextureNames, 0);
        int uTextureName = uTextureNames[0];
        GLES20.glEnable(GLES20.GL_TEXTURE_2D);
        vTexture = GLES20.glGetUniformLocation(mProgramObject, "v_texture");
        vTextureNames = new int[1];
        GLES20.glGenTextures(1, vTextureNames, 0);
        int vTextureName = vTextureNames[0];
        surfaceCreated = true;
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        Timber.d("onSurfaceChanged width: " + width + " height: " + height
                + " surfaceCreated: " + surfaceCreated + " dirty: " + dirty);
        GLES20.glActiveTexture(GLES20.GL_ACTIVE_TEXTURE);
        GLES20.glViewport(0, 0, width, height);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
    }

    @Override
    public final void onDrawFrame(GL10 gl) {
        Timber.d("onDrawFrame surfaceCreated: " + surfaceCreated + " dirty: " + dirty);
        if (!surfaceCreated || !dirty) return;
        // Clear the color buffer
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
        // Use the program object
        GLES20.glUseProgram(mProgramObject);
        // Load the vertex position
        mVertices.position(0);
        GLES20.glVertexAttribPointer(mPositionLoc, 3, GLES20.GL_FLOAT, false, 5 * 4, mVertices);
        // Load the texture coordinate
        mVertices.position(3);
        GLES20.glVertexAttribPointer(mTexCoordLoc, 2, GLES20.GL_FLOAT, false, 5 * 4, mVertices);
        GLES20.glEnableVertexAttribArray(mPositionLoc);
        GLES20.glEnableVertexAttribArray(mTexCoordLoc);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, yTextureNames[0]);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, recWidth, recHeight,
                0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, yBuffer);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, yTextureNames[0]);
        GLES20.glUniform1i(yTexture, 0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, uTextureNames[0]);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, recWidth / 2, recHeight / 2,
                0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, uBuffer);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glActiveTexture(GLES20.GL_TEXTURE1 + 2);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, uTextureNames[0]);
        GLES20.glUniform1i(uTexture, 2);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, vTextureNames[0]);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, recWidth / 2, recHeight / 2,
                0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, vBuffer);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glActiveTexture(GLES20.GL_TEXTURE1 + 1);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, vTextureNames[0]);
        GLES20.glUniform1i(vTexture, 1);
        GLES20.glDrawElements(GLES20.GL_TRIANGLES, 6, GLES20.GL_UNSIGNED_SHORT, mIndices);
        dirty = false;
    }

    private int loadShader(int type, String shaderSrc) {
        int shader;
        int[] compiled = new int[1];
        shader = GLES20.glCreateShader(type);
        if (shader == 0) {
            return 0;
        }
        GLES20.glShaderSource(shader, shaderSrc);
        GLES20.glCompileShader(shader);
        GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, compiled, 0);
        if (compiled[0] == 0) {
            Timber.d("loadShader %s", GLES20.glGetShaderInfoLog(shader));
            GLES20.glDeleteShader(shader);
            return 0;
        }
        return shader;
    }

    private int loadProgram(String vertShaderSrc, String fragShaderSrc) {
        int vertexShader;
        int fragmentShader;
        int programObject;
        int[] linked = new int[1];
        vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertShaderSrc);
        if (vertexShader == 0) {
            return 0;
        }
        fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER, fragShaderSrc);
        if (fragmentShader == 0) {
            GLES20.glDeleteShader(vertexShader);
            return 0;
        }
        programObject = GLES20.glCreateProgram();
        if (programObject == 0) {
            return 0;
        }
        GLES20.glAttachShader(programObject, vertexShader);
        GLES20.glAttachShader(programObject, fragmentShader);
        GLES20.glLinkProgram(programObject);
        GLES20.glGetProgramiv(programObject, GLES20.GL_LINK_STATUS, linked, 0);
        if (linked[0] == 0) {
            Timber.e("Error linking program: %s", GLES20.glGetProgramInfoLog(programObject));
            GLES20.glDeleteProgram(programObject);
            return 0;
        }
        GLES20.glDeleteShader(vertexShader);
        GLES20.glDeleteShader(fragmentShader);
        return programObject;
    }

    public void onPreviewFrame(byte[] data) {
        System.arraycopy(data, 0, yData, 0, LENGTH);
        yBuffer.put(yData);
        yBuffer.position(0);
        System.arraycopy(data, U_INDEX, uData, 0, LENGTH_4);
        uBuffer.put(uData);
        uBuffer.position(0);
        System.arraycopy(data, V_INDEX, vData, 0, LENGTH_4);
        vBuffer.put(vData);
        vBuffer.position(0);
        dirty = true;
    }

    private static final String vertexShader =
            "attribute vec4 a_position;\n" +
            "attribute vec2 a_texCoord;\n" +
            "varying vec2 v_texCoord;\n" +
            "void main() {\n" +
            "    gl_Position = a_position;\n" +
            "    v_texCoord = a_texCoord;\n" +
            "}\n";

    private static final String fragmentShader =
            "#ifdef GL_ES\n" +
            "precision highp float;\n" +
            "#endif\n" +
            "varying vec2 v_texCoord;\n" +
            "uniform sampler2D y_texture;\n" +
            "uniform sampler2D u_texture;\n" +
            "uniform sampler2D v_texture;\n" +
            "void main(void) {\n" +
            "    float r, g, b, y, u, v;\n" +
            // We put the Y value of each pixel into the R, G, B components via GL_LUMINANCE,
            // that's why we pull it from the R component; we could also use G or B.
            // See https://stackoverflow.com/questions/12130790/yuv-to-rgb-conversion-by-fragment-shader/17615696#17615696
            // and https://stackoverflow.com/questions/22456884/how-to-render-androids-yuv-nv21-camera-image-on-the-background-in-libgdx-with-o
            "    y = texture2D(y_texture, v_texCoord).r;\n" +
            // Since we use GL_LUMINANCE, each component is on its own map
            "    u = texture2D(u_texture, v_texCoord).r - 0.5;\n" +
            "    v = texture2D(v_texture, v_texCoord).r - 0.5;\n" +
            // These numbers are just YUV-to-RGB conversion constants
            "    r = y + 1.13983 * v;\n" +
            "    g = y - 0.39465 * u - 0.58060 * v;\n" +
            "    b = y + 2.03211 * u;\n" +
            // Finally set the RGB color of the pixel
            "    gl_FragColor = vec4(r, g, b, 1.0);\n" +
            "}\n";

    private static final float[] mVerticesData = {
            -1.f,  1.f, 0.0f, // Position 0
             0.0f, 0.0f,      // TexCoord 0
            -1.f, -1.f, 0.0f, // Position 1
             0.0f, 1.0f,      // TexCoord 1
             1.f, -1.f, 0.0f, // Position 2
             1.0f, 1.0f,      // TexCoord 2
             1.f,  1.f, 0.0f, // Position 3
             1.0f, 0.0f       // TexCoord 3
    };

    private static final short[] mIndicesData = {0, 1, 2, 0, 2, 3};
}
```
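To rule out the conversion constants themselves, I redid the fragment shader's math on the CPU. A tiny plain-Java sketch with the same coefficients (mid-gray with neutral chroma should stay mid-gray), just for sanity-checking, not part of the renderer:

```java
// Sketch: the same YUV -> RGB math as in the fragment shader,
// on normalized [0..1] inputs.
final class YuvMath {
    static float[] toRgb(float y, float u, float v) {
        u -= 0.5f;
        v -= 0.5f;
        float r = y + 1.13983f * v;
        float g = y - 0.39465f * u - 0.58060f * v;
        float b = y + 2.03211f * u;
        return new float[] { r, g, b };
    }
}
```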
On some Android 10 devices it works; the video is very dark and stretched, with the colors mostly green and pink/red.
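One suspicion I have about the colors: onPreviewFrame above copies U from U_INDEX and V from V_INDEX as if the chroma were planar (I420), but NV21 keeps a single interleaved VU plane after Y. If that is what I am being fed, the chroma would need de-interleaving first, roughly like this (plain-Java sketch, class and method names are mine):

```java
// Sketch: split NV21's interleaved VU plane (V first, then U, per pair)
// into separate U and V arrays so they can be uploaded as two textures.
final class Nv21Chroma {
    static void deinterleave(byte[] nv21, int width, int height,
                             byte[] uOut, byte[] vOut) {
        int ySize = width * height;
        int pairs = ySize / 4;             // one VU pair per 2x2 pixel block
        for (int i = 0; i < pairs; i++) {
            vOut[i] = nv21[ySize + 2 * i];     // V comes first in NV21
            uOut[i] = nv21[ySize + 2 * i + 1];
        }
    }
}
```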
But on Android 13 (a Pixel) I always get

signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x00000000

thrown in onSurfaceCreated. Is it misconfigured somewhere...?
So: how do I draw YUV/NV21 on screen, going from plain byte arrays/buffers to a picture/video?
P.S. The YUV stream/callback itself is fine: I can encode it with e.g. H.264 and put it into an mp4 file or a stream output with no problem, or inspect single frames via JPEGs generated with YuvImage. I just want to draw it in real time, as a "preview"...