Preparation 0: Integrate the Qiniu RTN SDK into the AR Demo
Download the Qiniu RTN SDK to the current directory QNRTC-Android:

```shell
git clone git@github.com:pili-engineering/QNRTC-Android.git
```
Download ARCore to the current directory arcore-android-sdk:

```shell
git clone git@github.com:google-ar/arcore-android-sdk.git
```
Copy the corresponding Qiniu RTN SDK files into the hello_ar_java project:

- Copy the file QNRTC-Android/releases/qndroid-rtc-1.2.0.jar to arcore-android-sdk/samples/hello_ar_java/app/libs/ (the libs directory needs to be created manually).
- Copy the four folders armeabi, armeabi-v7a, arm64-v8a, and x86 under QNRTC-Android/releases/ into the arcore-android-sdk/samples/hello_ar_java/app/src/main/jniLibs folder (the jniLibs directory needs to be created manually).

Open the arcore-android-sdk/samples/hello_ar_java project with Android Studio and modify a few configurations:

- To make the project reference the libraries added in the two steps above, open the app/build.gradle file and add the line `implementation fileTree(include: ['*.jar'], dir: 'libs')` inside the dependencies block.
- To make real-time calls, the app needs permission to use the network. Open the AndroidManifest.xml file and add the following permission declarations inside the manifest tag: `<uses-permission android:name="android.permission.INTERNET"/>` and `<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>`
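As a sketch, the build.gradle change above sits inside the existing dependencies block of the hello_ar_java sample (the sample's other dependencies vary by ARCore version and are elided here; only the added line comes from the steps above):

```groovy
// app/build.gradle of hello_ar_java
dependencies {
    // Added: pick up qndroid-rtc-1.2.0.jar from app/libs/
    implementation fileTree(include: ['*.jar'], dir: 'libs')
    // ... the sample's existing dependencies remain unchanged ...
}
```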
Introduction to core classes

The hello_ar_java demo uses four renderer classes, responsible for:

- Background rendering (the original camera preview image)
- Object and shadow rendering (the Android model and its shadow)
- Plane rendering (planes detected by the AR system)
- Point cloud rendering (point clouds detected by the AR system)
Preparation 1: Establish a basic real-time audio and video call environment
```java
public class HelloArActivity extends AppCompatActivity
        implements GLSurfaceView.Renderer, QNRoomEventListener {

    private boolean mPublished = false; // Whether local publishing has succeeded
    ...
    @Override
    public void onJoinedRoom() {
        mRTCManager.publish(); // After joining the room successfully, try to publish
    }

    @Override
    public void onLocalPublished() {
        mPublished = true; // After publishing succeeds, set the flag to true
    }
    ...
}
```
```java
protected void onCreate(Bundle savedInstanceState) {
    ...
    QNRTCSetting setting = new QNRTCSetting();
    setting.setExternalVideoInputEnabled(true);  // Enable the external video import function
    mRTCManager.setRoomEventListener(this);      // Set the room event listener
    mRTCManager.initialize(this, setting);       // Initialize the Qiniu RTN SDK
    mRTCManager.joinRoom(###Your Room Token###); // Join the specified room via the Room Token
}
```
Preparation 2: Establish a basic AR environment
Use GLSurfaceView and Renderer to prepare for drawing the AR image
```java
public class HelloArActivity extends AppCompatActivity
        implements GLSurfaceView.Renderer, QNRoomEventListener {

    /** Callback when the surface has been created */
    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) { }
    ...
    /** Callback when the surface size changes */
    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) { }
    ...
    /** Callback for drawing each frame */
    @Override
    public void onDrawFrame(GL10 gl) { }
}
```
```java
surfaceView = findViewById(R.id.surfaceview); // Get the GLSurfaceView from the layout XML
...
surfaceView.setRenderer(this); // Set the Renderer
```
Create Session
```java
protected void onResume() {
    ...
    session = new Session(/* context= */ this); // Initialize the AR system
    ...
    // Start the AR session and try to open the camera; if the camera is
    // occupied, resume() throws the checked CameraNotAvailableException
    try {
        session.resume();
    } catch (CameraNotAvailableException e) {
        session = null;
        return;
    }
}
```
Use OpenGL shaders to draw the AR-enhanced picture on the display surface. The drawing involves:

- the original camera preview data
- the array of detected planes
- the array of detected point clouds
- plane touch events
```java
// Draws the background (camera preview)
private final BackgroundRenderer backgroundRenderer = new BackgroundRenderer();
// Draws the virtual object
private final ObjectRenderer virtualObject = new ObjectRenderer();
// Draws the virtual object's shadow
private final ObjectRenderer virtualObjectShadow = new ObjectRenderer();
// Draws detected planes
private final PlaneRenderer planeRenderer = new PlaneRenderer();
// Draws the point cloud
private final PointCloudRenderer pointCloudRenderer = new PointCloudRenderer();

public void onDrawFrame(GL10 gl) {
    frame = session.update(); // Acquire the camera's raw data frame (a blocking call)

    // Handle one tap per frame: if a plane was tapped, place the Android model there.
    handleTap(frame, camera);
    ...
    // Draw the camera preview data as the background image.
    backgroundRenderer.draw(frame);
    ...
    // Visualize tracked points.
    PointCloud pointCloud = frame.acquirePointCloud();
    pointCloudRenderer.update(pointCloud);
    pointCloudRenderer.draw(viewmtx, projmtx); // Draw the point cloud
    ...
    // Visualize detected planes.
    planeRenderer.drawPlanes(
        session.getAllTrackables(Plane.class), camera.getDisplayOrientedPose(), projmtx);
    ...
    // Update and draw the model and its shadow.
    virtualObject.updateModelMatrix(anchorMatrix, scaleFactor);
    virtualObjectShadow.updateModelMatrix(anchorMatrix, scaleFactor);
    virtualObject.draw(viewmtx, projmtx, colorCorrectionRgba, coloredAnchor.color);       // The Android model
    virtualObjectShadow.draw(viewmtx, projmtx, colorCorrectionRgba, coloredAnchor.color); // Its shadow
}
```
```java
public void onDrawFrame(GL10 gl) {
    ...
    if (mPublished) { // Import AR data only after the Qiniu RTN stream has been published successfully
        // Read the AR-enhanced image back from the GPU
        GLES20.glReadPixels(0, 0, mSurfaceWidth, mSurfaceHeight,
                GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, mBufferRGBA);
        // Convert RGBA to NV21 (the algorithm is not expanded here for reasons of space)
        mBufferNV21 = RGBAToNV21(mBufferRGBA, mSurfaceWidth, mSurfaceHeight);
        // Feed the NV21 frame to the SDK through the "external video data import" function
        mRTCManager.inputVideoFrame(mBufferNV21, mSurfaceWidth, mSurfaceHeight, 0, frame.getTimestamp());
    }
}
```
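The RGBA-to-NV21 conversion elided above can be sketched as follows. This is a minimal CPU-side example, assuming full-range BT.601 coefficients and even width/height; the class name `RgbaToNv21` is illustrative, not part of either SDK, and a real implementation would also need to flip the image vertically, since `glReadPixels` returns rows bottom-up.

```java
public class RgbaToNv21 {
    /**
     * Converts a tightly packed RGBA buffer to NV21 (full Y plane followed by
     * interleaved V/U at quarter resolution). Assumes even width and height.
     * Illustrative sketch only; not part of the Qiniu RTN SDK.
     */
    public static byte[] rgbaToNv21(byte[] rgba, int width, int height) {
        byte[] nv21 = new byte[width * height * 3 / 2];
        int yIndex = 0;
        int uvIndex = width * height; // VU pairs start after the Y plane
        for (int j = 0; j < height; j++) {
            for (int i = 0; i < width; i++) {
                int p = (j * width + i) * 4;
                int r = rgba[p] & 0xFF;
                int g = rgba[p + 1] & 0xFF;
                int b = rgba[p + 2] & 0xFF;
                // Full-range BT.601 luma, fixed-point (coefficients * 256)
                int y = (77 * r + 150 * g + 29 * b) >> 8;
                nv21[yIndex++] = (byte) Math.max(0, Math.min(255, y));
                // One V/U pair per 2x2 block (4:2:0); NV21 order is V then U
                if ((j & 1) == 0 && (i & 1) == 0) {
                    int v = ((128 * r - 107 * g - 21 * b) >> 8) + 128;
                    int u = ((-43 * r - 85 * g + 128 * b) >> 8) + 128;
                    nv21[uvIndex++] = (byte) Math.max(0, Math.min(255, v));
                    nv21[uvIndex++] = (byte) Math.max(0, Math.min(255, u));
                }
            }
        }
        return nv21;
    }
}
```

Doing this per frame on the CPU is costly at high resolutions; production code often moves the conversion into a shader or native code, but the layout produced here is what `inputVideoFrame` expects for NV21.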
Summary