AR + real-time audio and video calls: seamless integration of the virtual and the real

2018/11/16 11:24

In the middle of this year, Google launched ARCore, which seamlessly blends the real and the digital to enrich our physical world. With it, developers can build AR applications on the Android platform more quickly and conveniently. Armed with AR technology, products can find novel application scenarios, and even open up entirely new product lines.

There are already many AR-based products on the market. For example, the IKEA Place application provides a new way to shop for furniture online: point your phone camera at the corner where you want to place a piece of furniture, select the item you want, then drag and rotate it into position to see whether it fits your space.

The following figure shows IKEA Place in use. This chair seems to fit quite nicely :)

 IKEA Place Schematic Diagram

If AR is combined with other technologies, will there be more exciting application scenarios?

Qiniu real-time audio and video cloud (hereinafter "Qiniu RTN") is built on the widely standardized WebRTC technology stack, with full platform coverage: it supports major browsers such as Chrome, Safari, and Firefox, as well as Android, iOS, and Windows clients. The Qiniu real-time streaming network has more than 180 data centers worldwide with powerful link acceleration, and its rich node coverage ensures acceleration for customers wherever they are. With an average end-to-end latency as low as 200 ms, it provides the foundation for latency-sensitive, highly interactive scenarios such as one-on-one chat, chat rooms, video conferencing, and online education, all of which are a natural fit for Qiniu RTN.

In this article, we will use Google's official hello_ar_java example to integrate AR into real-time audio and video calls, applying the new "external audio and video data import" feature of the 1.1.0+ Qiniu RTN SDK.

The following animation shows the effect:

 (animation: article_20181115_02)

Preparation 0: Integrate the Qiniu RTN SDK into the AR demo

Before we actually start coding, we need to set up the project and environment.

Download the Qiniu RTN SDK into the current directory as QNRTC-Android:

```
git clone git@github.com:pili-engineering/QNRTC-Android.git
```

Download ARCore into the current directory as arcore-android-sdk:

```
git clone git@github.com:google-ar/arcore-android-sdk.git
```

Copy the relevant Qiniu RTN SDK files into the hello_ar_java project:

  1. Copy the file QNRTC-Android/releases/qndroid-rtc-1.2.0.jar into arcore-android-sdk/samples/hello_ar_java/app/libs/ (you need to create the libs directory yourself)
  2. Copy the four ABI folders armeabi, armeabi-v7a, arm64-v8a, and x86 under QNRTC-Android/releases/ into arcore-android-sdk/samples/hello_ar_java/app/src/main/jniLibs (you need to create the jniLibs directory yourself)
  3. Open the arcore-android-sdk/samples/hello_ar_java project in Android Studio and adjust a few configurations
    • To make the project reference the libraries added in the two steps above, open app/build.gradle and add the line implementation fileTree(include: ['*.jar'], dir: 'libs') to the dependencies block
    • To make real-time calls, the app needs permission to use the network: open AndroidManifest.xml and add the following permission declarations inside the manifest tag
      • <uses-permission android:name="android.permission.INTERNET"/>
      • <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>

Introduction to core classes

Before the actual coding and code walkthrough, let's briefly introduce the core classes involved:

QNRTCManager: the Qiniu RTN SDK core class, providing low-latency real-time audio and video call capability

Session: the ARCore core class, managing AR system state, including camera capture, point-cloud tracking, plane detection, and other capabilities

GLSurfaceView & Renderer: the view class and renderer interface provided by the Android system, responsible for on-screen display and rendering respectively

BackgroundRenderer & ObjectRenderer & PlaneRenderer & PointCloudRenderer: rendering classes provided in the demo, responsible for the following parts

  • Background rendering (the raw camera preview image)
  • Object and shadow rendering (the Android robot model and its shadow)
  • Plane rendering (planes detected by the AR system)
  • Point-cloud rendering (point clouds detected by the AR system)

Preparation 1: Establish a basic real-time audio and video call environment

First, we need to implement the real-time audio and video room event listener QNRoomEventListener. It has many methods; the following shows only those needed for this simple example. The complete interface description can be found in the SDK documentation.

```java
public class HelloArActivity extends AppCompatActivity
        implements GLSurfaceView.Renderer, QNRoomEventListener {

    private boolean mPublished = false; // whether local publishing has succeeded

    ...

    @Override
    public void onJoinedRoom() {
        mRTCManager.publish(); // after joining the room successfully, try to publish
    }

    @Override
    public void onLocalPublished() {
        mPublished = true; // mark success once publishing completes
    }

    ...
}
```

At the end of the onCreate method, initialize the real-time audio and video call environment and join the designated room. For how to obtain a Room Token, refer to the Qiniu documentation.

```java
@Override
protected void onCreate(Bundle savedInstanceState) {
    ...
    QNRTCSetting setting = new QNRTCSetting();
    setting.setExternalVideoInputEnabled(true);  // enable the external video import feature
    mRTCManager.setRoomEventListener(this);      // set the room event listener
    mRTCManager.initialize(this, setting);       // initialize the Qiniu RTN SDK
    mRTCManager.joinRoom(###Your Room Token###); // join the specified room via Room Token
}
```

Preparation 2: Establish a basic AR environment

Use GLSurfaceView & Renderer to prepare for drawing the AR image

Declare the GLSurfaceView.Renderer interface on the Activity class. As shown below, in this demo we need to implement the three corresponding methods, whose meanings are described in the comments.

```java
public class HelloArActivity extends AppCompatActivity
        implements GLSurfaceView.Renderer, QNRoomEventListener {

    /** Callback invoked when the display surface has been created */
    public void onSurfaceCreated(GL10 gl, EGLConfig config) { }
    ...
    /** Callback invoked when the display surface size changes */
    public void onSurfaceChanged(GL10 gl, int width, int height) { }
    ...
    /** Callback invoked to draw each frame */
    public void onDrawFrame(GL10 gl) { }
}
```

After implementing the Renderer, we need to provide a surface on which it can render and display. GLSurfaceView provides exactly this.

The following sample code retrieves the GLSurfaceView from the layout XML file and sets the Renderer:

```java
surfaceView = findViewById(R.id.surfaceview); // retrieve the GLSurfaceView from the layout XML
...
surfaceView.setRenderer(this);                // set the Renderer
```

Create Session

Session is the main entry class of the AR system; it must be initialized and started before any AR operation:

```java
@Override
protected void onResume() {
    session = new Session(/* context= */ this); // initialize the AR system
    ...
    session.resume(); // start the AR session and try to open the camera;
                      // throws CameraNotAvailableException if the camera is occupied
}
```

Use OpenGL shaders to draw the AR-enhanced image on the display surface

After the AR session starts, each camera frame provides the following information:

  • The raw camera preview data
  • An array of detected planes
  • An array of detected point clouds
  • Plane tap events

In the onDrawFrame method we can process each of these accordingly. For example, when a plane tap event occurs, we place an Android model at the tapped location, while also drawing the detected planes and point clouds.

```java
// Draws the background (raw camera preview)
private final BackgroundRenderer backgroundRenderer = new BackgroundRenderer();
// Draws the object
private final ObjectRenderer virtualObject = new ObjectRenderer();
// Draws the object's shadow
private final ObjectRenderer virtualObjectShadow = new ObjectRenderer();
// Draws planes
private final PlaneRenderer planeRenderer = new PlaneRenderer();
// Draws point clouds
private final PointCloudRenderer pointCloudRenderer = new PointCloudRenderer();

public void onDrawFrame(GL10 gl) {
    frame = session.update(); // acquire the camera's raw data frame (blocking call)

    // Handle one tap per frame: if a plane was tapped, place an Android model there.
    handleTap(frame, camera);
    ...
    // Draw the camera preview data as the background image.
    backgroundRenderer.draw(frame);
    ...
    // Visualize tracked points.
    PointCloud pointCloud = frame.acquirePointCloud();
    pointCloudRenderer.update(pointCloud);
    pointCloudRenderer.draw(viewmtx, projmtx); // draw the point clouds
    ...
    // Visualize planes.
    planeRenderer.drawPlanes(
        session.getAllTrackables(Plane.class), camera.getDisplayOrientedPose(), projmtx);
    ...
    // Update and draw the model and its shadow.
    virtualObject.updateModelMatrix(anchorMatrix, scaleFactor);
    virtualObjectShadow.updateModelMatrix(anchorMatrix, scaleFactor);
    virtualObject.draw(viewmtx, projmtx, colorCorrectionRgba, coloredAnchor.color);
    virtualObjectShadow.draw(viewmtx, projmtx, colorCorrectionRgba, coloredAnchor.color);
}
```

Technology combination: publish the AR-enhanced image to the real-time audio and video cloud

We now have both the basic real-time audio and video call and the AR-enhanced image; all that remains is to combine them.

Since the Session occupies the device camera once started, the Qiniu RTN SDK cannot capture video itself. This is where the "external audio and video data import" feature provided by the latest version comes in.

Before publishing the stream, we need to capture the AR-enhanced image and convert it, because the "external video data import" feature of the current Qiniu RTN Android SDK only accepts data in NV21 format.
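For reference, NV21 is a YUV 4:2:0 layout: a full-resolution Y (luma) plane followed by a half-resolution plane of interleaved V/U (chroma) bytes, so one frame occupies width × height × 3/2 bytes. The following is a minimal sketch of the layout arithmetic; the helper class and its names are our own illustration, not part of the SDK:

```java
// Minimal helper illustrating the NV21 memory layout:
// a full-resolution Y plane followed by a half-resolution interleaved V/U plane.
public class Nv21Layout {

    // Total bytes needed for a width x height NV21 frame.
    public static int frameSize(int width, int height) {
        int yPlane = width * height;      // one luma byte per pixel
        int vuPlane = width * height / 2; // one V and one U byte per 2x2 pixel block
        return yPlane + vuPlane;
    }

    // Byte offset at which the interleaved V/U plane begins.
    public static int vuOffset(int width, int height) {
        return width * height;
    }
}
```

For a 1280×720 frame this gives 1,382,400 bytes in total, with the V/U data starting at byte 921,600.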

The following sample code is added at the end of the onDrawFrame method. It reads the surface contents of the GLSurfaceView, performs the necessary format conversion, and then publishes the data:

```java
public void onDrawFrame(GL10 gl) {
    ...
    if (mPublished) { // import AR data only after the Qiniu RTN stream is published
        // Read the AR-enhanced image data back from the GPU.
        GLES20.glReadPixels(0, 0, mSurfaceWidth, mSurfaceHeight,
                GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, mBufferRGBA);
        // Convert RGBA to NV21 (the algorithm is not expanded here for reasons of space).
        mBufferNV21 = RGBAToNV21(mBufferRGBA, mSurfaceWidth, mSurfaceHeight);
        // Publish the AR-enhanced image as NV21 data via "external video data import".
        mRTCManager.inputVideoFrame(mBufferNV21, mSurfaceWidth, mSurfaceHeight,
                0, frame.getTimestamp());
    }
}
```
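The demo deliberately leaves RGBAToNV21 unexpanded. As one possible implementation (a sketch of our own, not the SDK's code), a full-range BT.601 RGB-to-YCbCr conversion can be combined with a vertical flip, since glReadPixels returns rows bottom-up while NV21 consumers expect top-down:

```java
// Hypothetical helper: converts a tightly packed RGBA buffer (as filled by
// glReadPixels with GL_RGBA / GL_UNSIGNED_BYTE) into an NV21 byte array.
// Flips vertically because glReadPixels returns rows bottom-up.
public class RgbaToNv21 {

    public static byte[] convert(byte[] rgba, int width, int height) {
        byte[] nv21 = new byte[width * height * 3 / 2];
        int vuOffset = width * height; // start of the interleaved V/U plane

        for (int y = 0; y < height; y++) {
            int srcRow = (height - 1 - y) * width; // vertical flip
            for (int x = 0; x < width; x++) {
                int i = (srcRow + x) * 4;
                int r = rgba[i] & 0xFF;
                int g = rgba[i + 1] & 0xFF;
                int b = rgba[i + 2] & 0xFF;

                // Full-range BT.601 luma, fixed-point coefficients scaled by 256.
                int luma = (77 * r + 150 * g + 29 * b) >> 8;
                nv21[y * width + x] = (byte) clamp(luma);

                // One V/U pair per 2x2 block, sampled at even rows and columns.
                if ((y & 1) == 0 && (x & 1) == 0) {
                    int u = ((-43 * r - 85 * g + 128 * b) >> 8) + 128;
                    int v = ((128 * r - 107 * g - 21 * b) >> 8) + 128;
                    int vuIndex = vuOffset + (y >> 1) * width + x;
                    nv21[vuIndex] = (byte) clamp(v);     // NV21 stores V first,
                    nv21[vuIndex + 1] = (byte) clamp(u); // then U
                }
            }
        }
        return nv21;
    }

    private static int clamp(int value) {
        return value < 0 ? 0 : (value > 255 ? 255 : value);
    }
}
```

In the demo this would back the RGBAToNV21 call: convert mBufferRGBA with mSurfaceWidth/mSurfaceHeight and pass the result to inputVideoFrame. A production version would typically move this per-pixel work off the render thread, or do the conversion on the GPU.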

Summary

Using the "external audio and video data import" feature of the 1.1.0+ Qiniu RTN SDK, you can easily combine AR with real-time audio and video communication. The program above, running on the Qiniu RTN SDK and the corresponding RTN network, can support low-latency audio and video calls with up to 20 simultaneous participants. We believe the combination of AR and real-time audio and video communication will bring many more application scenarios in the near future.

Limited-time free gift promotion

Starting October 30, Qiniu real-time audio and video cloud runs a monthly limited-time free gift promotion: the audio-only, SD, HD, and ultra-HD tiers each receive 5,000 free minutes, worth a total of 770 yuan if fully used.
