Raw Video Data

Introduction

During the video transmission process, you can pre- and post-process the captured video data to achieve the desired playback effect.

Agora provides the raw data function for you to process the video data per your application scenario. This function enables you to pre-process captured video frames before sending them to the encoder, or to post-process decoded video frames.

Sample project

Agora provides an open-source sample project on GitHub. You can view the source code on GitHub or download the project to try it out.

Implementation

Before using the raw data function, ensure that you have implemented the basic real-time communication functions in your project. See Start a Video Call or Start Interactive Live Video Streaming for details.

The Agora Java SDK does not provide a raw video data interface, but the Agora C++ SDK provides the IVideoFrameObserver class to capture and modify raw video data. Therefore, you must call Agora's C++ interface from Java via the JNI (Java Native Interface). Because the Video SDK for Java encapsulates the Video SDK for C++, you can include the .h files from the C++ SDK and call its methods directly.

Follow these steps to implement the raw video data function in your project:

  1. Use the JNI and C++ interface files to generate a shared library in the project, and use Java to call the raw video data interface of the Agora C++ SDK.
  2. Before joining a channel, call the registerVideoFrameObserver method to register a video frame observer, passing in your implementation of the IVideoFrameObserver class.
  3. After you successfully register the observer, the SDK captures every video frame and sends the captured raw video data via the onCaptureVideoFrame, onPreEncodeVideoFrame, or onRenderVideoFrame callbacks.
  4. Process the captured raw video data according to your needs, then send the processed data back to the SDK via the callbacks mentioned in step 3.

When you use SDK v3.0.1 or later, note that the SDK no longer guarantees that the callback functions in IVideoFrameObserver are reported on the same thread; it only guarantees the sequence of these callbacks. If you use OpenGL to perform image enhancement on the raw video data, you need to actively switch the OpenGL context in the IVideoFrameObserver callbacks to adapt to this multi-threaded scenario; otherwise, the image enhancement does not take effect. A sketch of such a context switch follows.
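As an illustration, here is a minimal sketch of such a context switch, assuming you create and own an EGLDisplay, EGLSurface, and EGLContext elsewhere with the standard EGL14 APIs. The class and method names below are illustrative, not part of the Agora SDK:

import android.opengl.EGL14;
import android.opengl.EGLContext;
import android.opengl.EGLDisplay;
import android.opengl.EGLSurface;

// Binds your GL context to whichever thread the SDK delivers the callback on.
public class EglContextBinder {
    private final EGLDisplay eglDisplay; // from eglGetDisplay + eglInitialize
    private final EGLSurface eglSurface; // e.g., a pbuffer surface for offscreen processing
    private final EGLContext eglContext; // the context that owns your GL resources

    public EglContextBinder(EGLDisplay display, EGLSurface surface, EGLContext context) {
        this.eglDisplay = display;
        this.eglSurface = surface;
        this.eglContext = context;
    }

    // Call this at the start of each IVideoFrameObserver callback, before any GL call.
    public void makeCurrentOnThisThread() {
        if (!eglContext.equals(EGL14.eglGetCurrentContext())) {
            if (!EGL14.eglMakeCurrent(eglDisplay, eglSurface, eglSurface, eglContext)) {
                throw new IllegalStateException(
                        "eglMakeCurrent failed: 0x" + Integer.toHexString(EGL14.eglGetError()));
            }
        }
    }
}

Note that an EGL context can be current on only one thread at a time, so release it on the previous thread (eglMakeCurrent with EGL14.EGL_NO_CONTEXT) before binding it on another.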

Call the Agora C++ API in a Java project

The following diagram shows the basic flow of calling the Agora C++ API in a Java project:


  • The Java project loads the .so library built from the C++ interface file (.cpp file) via the Java interface file.
  • The Java interface file generates a .h file with the javac -h command. The C++ interface file includes this generated file.
  • The C++ interface file calls C++ methods of the .so library in the Agora Android SDK by including the Agora Android SDK header file.

API call sequence

The following diagram shows how to implement the raw video data function in your project:


The registerVideoFrameObserver method and the onCaptureVideoFrame, onPreEncodeVideoFrame, and onRenderVideoFrame callbacks are all C++ APIs.

Sample code

Create a JNI interface

Create a Java interface file and a C++ interface file that communicate through the JNI, and build the C++ interface file into a .so library.

  1. Create a Java interface file to call the C++ API. The interface file should declare the corresponding Java methods for calling C++. Refer to the MediaPreProcessing.java file in the sample project for the implementation.

// The Java interface file declares the corresponding Java methods for calling C++.
package io.agora.advancedvideo.rawdata;

import java.nio.ByteBuffer;

public class MediaPreProcessing {
    static {
        // Loads the C++ .so library. Build the C++ interface file to generate the .so library.
        // The name of the .so library depends on the library name generated by building the C++ interface file.
        System.loadLibrary("apm-plugin-raw-data");
    }

    // Define the Java methods that correspond to the C++ API
    public interface ProgressCallback {
        // Get the captured video frame
        void onCaptureVideoFrame(int videoFrameType, int width, int height, int bufferLength, int yStride, int uStride, int vStride, int rotation, long renderTimeMs);
        // Get the pre-encoded video frame
        void onPreEncodeVideoFrame(int videoFrameType, int width, int height, int bufferLength, int yStride, int uStride, int vStride, int rotation, long renderTimeMs);
        // Get the video frame rendered by the SDK
        void onRenderVideoFrame(int uid, int videoFrameType, int width, int height, int bufferLength, int yStride, int uStride, int vStride, int rotation, long renderTimeMs);
        // Get the recorded audio frame
        void onRecordAudioFrame(int audioFrameType, int samples, int bytesPerSample, int channels, int samplesPerSec, long renderTimeMs, int bufferLength);
        // Get the playback audio frame
        void onPlaybackAudioFrame(int audioFrameType, int samples, int bytesPerSample, int channels, int samplesPerSec, long renderTimeMs, int bufferLength);
        // Get the playback audio frame before mixing
        void onPlaybackAudioFrameBeforeMixing(int uid, int audioFrameType, int samples, int bytesPerSample, int channels, int samplesPerSec, long renderTimeMs, int bufferLength);
        // Get the mixed audio frame
        void onMixedAudioFrame(int audioFrameType, int samples, int bytesPerSample, int channels, int samplesPerSec, long renderTimeMs, int bufferLength);
    }

    public static native void setCallback(ProgressCallback callback);

    public static native void setVideoCaptureByteBuffer(ByteBuffer byteBuffer);

    public static native void setAudioRecordByteBuffer(ByteBuffer byteBuffer);

    public static native void setAudioPlayByteBuffer(ByteBuffer byteBuffer);

    public static native void setBeforeAudioMixByteBuffer(ByteBuffer byteBuffer);

    public static native void setAudioMixByteBuffer(ByteBuffer byteBuffer);

    public static native void setVideoDecodeByteBuffer(int uid, ByteBuffer byteBuffer);

    public static native void releasePoint();
}
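The name passed to System.loadLibrary() must match the library name defined in the native build script: with the CMake example later on this page, the build produces libapm-plugin-raw-data.so, which Java loads as "apm-plugin-raw-data".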

  2. Run the following command to generate a .h file from the Java interface file:

# JDK 10 or later
# -h writes the generated header to the given directory (here, the current directory)
javac -h . MediaPreProcessing.java

# JDK 9 or earlier
javac MediaPreProcessing.java
javah -jni MediaPreProcessing

  3. Create a C++ interface file that implements the methods to be called from Java. The C++ interface file exports the corresponding methods from the C++ SDK based on the generated .h file (io_agora_advancedvideo_rawdata_MediaPreProcessing.h). Refer to the io_agora_advancedvideo_rawdata_MediaPreProcessing.cpp file in the sample project for the implementation.
The JNI defines the C++ data types that map to the Java types. Refer to the JNI specification for the complete mapping.

// Includes needed by this snippet (header names follow the Agora Android SDK and the NDK)
#include <jni.h>
#include <map>
#include <utility>
#include <android/log.h>
#include "IAgoraRtcEngine.h"
#include "IAgoraMediaEngine.h"

using namespace std;

// Global variables

jobject gCallBack = nullptr;
jclass gCallbackClass = nullptr;
// Method IDs at the Java level
jmethodID captureVideoMethodId = nullptr;
jmethodID renderVideoMethodId = nullptr;
// Method ID for onPreEncodeVideoFrame; retrieved in setCallback like the other two
jmethodID preEncodeVideoMethodId = nullptr;
void *_javaDirectPlayBufferCapture = nullptr;
map<int, void *> decodeBufferMap;

static JavaVM *gJVM = nullptr;

// Implement the IVideoFrameObserver class and related callbacks
class AgoraVideoFrameObserver : public agora::media::IVideoFrameObserver
{
public:
    AgoraVideoFrameObserver()
    {
    }

    ~AgoraVideoFrameObserver()
    {
    }

    // Get video frame data from the VideoFrame object, copy it to the ByteBuffer,
    // and call the Java method via the method ID
    void getVideoFrame(VideoFrame &videoFrame, _jmethodID *jmethodID, void *_byteBufferObject,
                       unsigned int uid)
    {
        if (_byteBufferObject == nullptr)
        {
            return;
        }

        int width = videoFrame.width;
        int height = videoFrame.height;
        size_t widthAndHeight = (size_t) videoFrame.yStride * height;
        size_t length = widthAndHeight * 3 / 2;

        // AttachThreadScoped is a helper from the sample project that attaches
        // the current thread to the JVM and detaches it on destruction
        AttachThreadScoped ats(gJVM);
        JNIEnv *env = ats.env();

        memcpy(_byteBufferObject, videoFrame.yBuffer, widthAndHeight);
        memcpy((uint8_t *) _byteBufferObject + widthAndHeight, videoFrame.uBuffer,
               widthAndHeight / 4);
        memcpy((uint8_t *) _byteBufferObject + widthAndHeight * 5 / 4, videoFrame.vBuffer,
               widthAndHeight / 4);

        if (uid == 0)
        {
            env->CallVoidMethod(gCallBack, jmethodID, videoFrame.type, width, height, length,
                                videoFrame.yStride, videoFrame.uStride,
                                videoFrame.vStride, videoFrame.rotation,
                                videoFrame.renderTimeMs);
        } else
        {
            env->CallVoidMethod(gCallBack, jmethodID, uid, videoFrame.type, width, height,
                                length,
                                videoFrame.yStride, videoFrame.uStride,
                                videoFrame.vStride, videoFrame.rotation,
                                videoFrame.renderTimeMs);
        }
    }

    // Copy video frame data from the ByteBuffer back to the VideoFrame object
    void writebackVideoFrame(VideoFrame &videoFrame, void *byteBuffer)
    {
        if (byteBuffer == nullptr)
        {
            return;
        }

        int width = videoFrame.width;
        int height = videoFrame.height;
        size_t widthAndHeight = (size_t) videoFrame.yStride * height;

        memcpy(videoFrame.yBuffer, byteBuffer, widthAndHeight);
        memcpy(videoFrame.uBuffer, (uint8_t *) byteBuffer + widthAndHeight, widthAndHeight / 4);
        memcpy(videoFrame.vBuffer, (uint8_t *) byteBuffer + widthAndHeight * 5 / 4,
               widthAndHeight / 4);
    }

public:
    // Implement the onCaptureVideoFrame callback
    virtual bool onCaptureVideoFrame(VideoFrame &videoFrame) override
    {
        // Get the captured video frame
        getVideoFrame(videoFrame, captureVideoMethodId, _javaDirectPlayBufferCapture, 0);
        __android_log_print(ANDROID_LOG_DEBUG, "AgoraVideoFrameObserver", "onCaptureVideoFrame");
        // Send the video frame back to the SDK
        writebackVideoFrame(videoFrame, _javaDirectPlayBufferCapture);
        return true;
    }

    // Implement the onRenderVideoFrame callback
    virtual bool onRenderVideoFrame(unsigned int uid, VideoFrame &videoFrame) override
    {
        __android_log_print(ANDROID_LOG_DEBUG, "AgoraVideoFrameObserver", "onRenderVideoFrame");
        map<int, void *>::iterator it_find;
        it_find = decodeBufferMap.find(uid);

        if (it_find != decodeBufferMap.end())
        {
            if (it_find->second != nullptr)
            {
                // Get the video frame rendered by the SDK
                getVideoFrame(videoFrame, renderVideoMethodId, it_find->second, uid);
                // Send the video frame back to the SDK
                writebackVideoFrame(videoFrame, it_find->second);
            }
        }
        return true;
    }

    // Implement the onPreEncodeVideoFrame callback
    virtual bool onPreEncodeVideoFrame(VideoFrame &videoFrame) override
    {
        // Get the pre-encoded video frame
        getVideoFrame(videoFrame, preEncodeVideoMethodId, _javaDirectPlayBufferCapture, 0);
        __android_log_print(ANDROID_LOG_DEBUG, "AgoraVideoFrameObserver", "onPreEncodeVideoFrame");
        // Send the video frame back to the SDK
        writebackVideoFrame(videoFrame, _javaDirectPlayBufferCapture);
        return true;
    }
};

...

// AgoraVideoFrameObserver object
static AgoraVideoFrameObserver s_videoFrameObserver;
// rtcEngine object
static agora::rtc::IRtcEngine *rtcEngine = nullptr;

// Set up the C++ interface
#ifdef __cplusplus
extern "C" {
#endif

int __attribute__((visibility("default")))
loadAgoraRtcEnginePlugin(agora::rtc::IRtcEngine *engine)
{
    __android_log_print(ANDROID_LOG_DEBUG, "agora-raw-data-plugin", "loadAgoraRtcEnginePlugin");
    rtcEngine = engine;
    return 0;
}

void __attribute__((visibility("default")))
unloadAgoraRtcEnginePlugin(agora::rtc::IRtcEngine *engine)
{
    __android_log_print(ANDROID_LOG_DEBUG, "agora-raw-data-plugin", "unloadAgoraRtcEnginePlugin");
    rtcEngine = nullptr;
}

...

// For each native method declared in the Java interface file, use the JNI to export a corresponding C++ implementation.
// The Java_io_agora_advancedvideo_rawdata_MediaPreProcessing_setCallback method corresponds to the setCallback method in the Java interface file.
JNIEXPORT void JNICALL Java_io_agora_advancedvideo_rawdata_MediaPreProcessing_setCallback
        (JNIEnv *env, jclass, jobject callback)
{
    if (!rtcEngine) return;

    env->GetJavaVM(&gJVM);
    // Create an AutoPtr instance that uses the IMediaEngine class as the template
    agora::util::AutoPtr<agora::media::IMediaEngine> mediaEngine;
    // The AutoPtr instance calls the queryInterface method to get a pointer to the IMediaEngine instance from the IID.
    // The AutoPtr instance accesses the pointer to the IMediaEngine instance via the arrow operator
    // and calls registerVideoFrameObserver via the IMediaEngine instance.
    mediaEngine.queryInterface(rtcEngine, agora::INTERFACE_ID_TYPE::AGORA_IID_MEDIA_ENGINE);
    if (mediaEngine)
    {
        // Register the video frame observer
        int code = mediaEngine->registerVideoFrameObserver(&s_videoFrameObserver);

        ...

    }

    if (gCallBack == nullptr)
    {
        gCallBack = env->NewGlobalRef(callback);
        gCallbackClass = env->GetObjectClass(gCallBack);
        // Get the method IDs of the callback functions
        captureVideoMethodId = env->GetMethodID(gCallbackClass, "onCaptureVideoFrame",
                                                "(IIIIIIIIJ)V");
        renderVideoMethodId = env->GetMethodID(gCallbackClass, "onRenderVideoFrame",
                                               "(IIIIIIIIIJ)V");
        // onPreEncodeVideoFrame has the same signature as onCaptureVideoFrame
        preEncodeVideoMethodId = env->GetMethodID(gCallbackClass, "onPreEncodeVideoFrame",
                                                  "(IIIIIIIIJ)V");

        __android_log_print(ANDROID_LOG_DEBUG, "setCallback", "setCallback done successfully");
    }

    ...

}

// C++ implementation of the setVideoCaptureByteBuffer method in the Java interface file
JNIEXPORT void JNICALL
Java_io_agora_advancedvideo_rawdata_MediaPreProcessing_setVideoCaptureByteBuffer
        (JNIEnv *env, jclass, jobject bytebuffer)
{
    _javaDirectPlayBufferCapture = env->GetDirectBufferAddress(bytebuffer);
}

// C++ implementation of the setVideoDecodeByteBuffer method in the Java interface file
JNIEXPORT void JNICALL
Java_io_agora_advancedvideo_rawdata_MediaPreProcessing_setVideoDecodeByteBuffer
        (JNIEnv *env, jclass, jint uid, jobject byteBuffer)
{
    if (byteBuffer == nullptr)
    {
        decodeBufferMap.erase(uid);
    } else
    {
        void *_javaDirectDecodeBuffer = env->GetDirectBufferAddress(byteBuffer);
        decodeBufferMap.insert(make_pair(uid, _javaDirectDecodeBuffer));
        __android_log_print(ANDROID_LOG_DEBUG, "agora-raw-data-plugin",
                            "setVideoDecodeByteBuffer uid: %u, _javaDirectDecodeBuffer: %p",
                            uid, _javaDirectDecodeBuffer);
    }
}

...

#ifdef __cplusplus
}
#endif
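In the GetMethodID calls above, the signature string encodes the Java method's parameter and return types: I stands for int, J for long, and V for a void return. "(IIIIIIIIJ)V" therefore matches onCaptureVideoFrame(int, int, int, int, int, int, int, int, long), and onRenderVideoFrame uses one extra leading I for its uid parameter.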

  4. Build the C++ interface file via the NDK to generate a .so library, and load it in the Java interface file with the System.loadLibrary() method, as shown above. In an Android Studio project, reference the CMake script from the module's build.gradle file via externalNativeBuild. See the following CMake example:

cmake_minimum_required(VERSION 3.4.1)

add_library( # Sets the name of the library.
             apm-plugin-raw-data

             # Sets the library as a shared library.
             SHARED

             # Provides a relative path to your source file(s).
             src/main/cpp/io_agora_advancedvideo_rawdata_MediaPreProcessing.cpp)

find_library( # Sets the name of the path variable.
              log-lib

              # Specifies the name of the NDK library that
              # you want CMake to locate.
              log)

target_link_libraries( # Specifies the target library.
                       apm-plugin-raw-data

                       # Links the target library to the log library
                       # included in the NDK.
                       ${log-lib})

Implement the raw video data function in a Java project

  1. Implement the ProgressCallback interface defined in the Java interface file to receive the raw video data passed up from C++.

// Implement the ProgressCallback interface in Java
public class MediaDataObserverPlugin implements MediaPreProcessing.ProgressCallback {

    ...

    // Get the captured video frame
    @Override
    public void onCaptureVideoFrame(int videoFrameType, int width, int height, int bufferLength, int yStride, int uStride, int vStride, int rotation, long renderTimeMs) {
        byte[] buf = new byte[bufferLength];
        byteBufferCapture.limit(bufferLength);
        byteBufferCapture.get(buf);
        byteBufferCapture.flip();

        for (MediaDataVideoObserver observer : videoObserverList) {
            observer.onCaptureVideoFrame(buf, videoFrameType, width, height, bufferLength, yStride, uStride, vStride, rotation, renderTimeMs);
        }

        byteBufferCapture.put(buf);
        byteBufferCapture.flip();

        if (beCaptureVideoShot) {
            beCaptureVideoShot = false;

            getVideoSnapshot(width, height, rotation, bufferLength, buf, captureFilePath, yStride, uStride, vStride);
        }
    }

    // Get the pre-encoded video frame
    @Override
    public void onPreEncodeVideoFrame(int videoFrameType, int width, int height, int bufferLength, int yStride, int uStride, int vStride, int rotation, long renderTimeMs) {
        byte[] buf = new byte[bufferLength];
        byteBufferCapture.limit(bufferLength);
        byteBufferCapture.get(buf);
        byteBufferCapture.flip();

        for (MediaDataVideoObserver observer : videoObserverList) {
            observer.onPreEncodeVideoFrame(buf, videoFrameType, width, height, bufferLength, yStride, uStride, vStride, rotation, renderTimeMs);
        }

        byteBufferCapture.put(buf);
        byteBufferCapture.flip();

        if (beCaptureVideoShot) {
            beCaptureVideoShot = false;

            getVideoSnapshot(width, height, rotation, bufferLength, buf, captureFilePath, yStride, uStride, vStride);
        }
    }

    // Get the video frame rendered by the SDK
    @Override
    public void onRenderVideoFrame(int uid, int videoFrameType, int width, int height, int bufferLength, int yStride, int uStride, int vStride, int rotation, long renderTimeMs) {
        for (MediaDataVideoObserver observer : videoObserverList) {
            ByteBuffer tmp = decodeBufferList.get(uid);
            if (tmp != null) {
                byte[] buf = new byte[bufferLength];
                tmp.limit(bufferLength);
                tmp.get(buf);
                tmp.flip();

                observer.onRenderVideoFrame(uid, buf, videoFrameType, width, height, bufferLength, yStride, uStride, vStride, rotation, renderTimeMs);

                tmp.put(buf);
                tmp.flip();

                if (beRenderVideoShot) {
                    if (uid == renderVideoShotUid) {
                        beRenderVideoShot = false;

                        getVideoSnapshot(width, height, rotation, bufferLength, buf, renderFilePath, yStride, uStride, vStride);
                    }
                }
            }
        }
    }

    ...
}
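Note that byteBufferCapture and the per-uid decode buffers must be direct ByteBuffers, because the C++ side obtains their addresses with GetDirectBufferAddress, which returns null for ordinary heap buffers. A minimal allocation sketch; the size is an assumption (one I420 frame at a maximum resolution of 1920 x 1080, that is, width x height x 3 / 2 bytes):

import java.nio.ByteBuffer;

public class VideoBufferFactory {
    // One I420 frame at the assumed maximum capture resolution of 1920 x 1080:
    // 1920 * 1080 * 3 / 2 = 3,110,400 bytes
    public static final int VIDEO_BUFFER_SIZE = 1920 * 1080 * 3 / 2;

    // Must be a direct buffer so the native side can access it without copying
    public static ByteBuffer newVideoBuffer() {
        return ByteBuffer.allocateDirect(VIDEO_BUFFER_SIZE);
    }
}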

  2. Call the setCallback method. The setCallback method calls the registerVideoFrameObserver C++ method via the JNI to register a video frame observer.

@Override
public void onActivityCreated(@Nullable Bundle savedInstanceState) {
    super.onActivityCreated(savedInstanceState);
    // Get the MediaDataObserverPlugin instance
    mediaDataObserverPlugin = MediaDataObserverPlugin.the();
    // Register the video frame observer
    MediaPreProcessing.setCallback(mediaDataObserverPlugin);
    MediaPreProcessing.setVideoCaptureByteBuffer(mediaDataObserverPlugin.byteBufferCapture);
    mediaDataObserverPlugin.addVideoObserver(this);
}
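A sketch of the matching teardown when the fragment is destroyed. releasePoint() is the native method declared in MediaPreProcessing.java above; removeVideoObserver is assumed to be the counterpart of addVideoObserver in the sample's MediaDataObserverPlugin:

@Override
public void onDestroy() {
    super.onDestroy();
    if (mediaDataObserverPlugin != null) {
        // Assumed counterpart of addVideoObserver in the sample project
        mediaDataObserverPlugin.removeVideoObserver(this);
    }
    // Release the global references and buffer addresses held on the native side
    MediaPreProcessing.releasePoint();
}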

  3. Implement the onCaptureVideoFrame, onRenderVideoFrame, and onPreEncodeVideoFrame callbacks. Get the video frame from each callback and process it as needed.

// Get the captured video frame
@Override
public void onCaptureVideoFrame(byte[] data, int frameType, int width, int height, int bufferLength, int yStride, int uStride, int vStride, int rotation, long renderTimeMs) {
    Log.e(TAG, "onCaptureVideoFrame0");
    // Only process the frame when the blur switch is on
    if (!blur) {
        return;
    }
    Bitmap bitmap = YUVUtils.i420ToBitmap(width, height, rotation, bufferLength, data, yStride, uStride, vStride);
    Bitmap bmp = YUVUtils.blur(getContext(), bitmap, 4);
    // Write the processed data back into the frame buffer
    System.arraycopy(YUVUtils.bitmapToI420(width, height, bmp), 0, data, 0, bufferLength);
}

// Get the video frame rendered by the SDK
@Override
public void onRenderVideoFrame(int uid, byte[] data, int frameType, int width, int height, int bufferLength, int yStride, int uStride, int vStride, int rotation, long renderTimeMs) {
    if (!blur) {
        return;
    }
    Bitmap bmp = YUVUtils.blur(getContext(), YUVUtils.i420ToBitmap(width, height, rotation, bufferLength, data, yStride, uStride, vStride), 4);
    System.arraycopy(YUVUtils.bitmapToI420(width, height, bmp), 0, data, 0, bufferLength);
}

// Get the pre-encoded video frame
@Override
public void onPreEncodeVideoFrame(byte[] data, int frameType, int width, int height, int bufferLength, int yStride, int uStride, int vStride, int rotation, long renderTimeMs) {
    Log.e(TAG, "onPreEncodeVideoFrame0");
}
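YUVUtils is a helper class from the sample project. For a self-contained example of in-place processing that does not depend on it, the sketch below converts a frame to grayscale by neutralizing the chroma planes; it assumes the I420 buffer layout produced by the C++ copy code above, where the Y plane occupies yStride x height bytes and the U and V planes fill the rest of the buffer:

// Turn an I420 frame grayscale in place: leave the Y (luma) plane untouched and
// overwrite every U/V (chroma) byte with 128, the neutral chroma value.
public static void toGrayscaleI420(byte[] data, int height, int yStride) {
    int ySize = yStride * height; // Y plane size, matching the C++ copy layout
    java.util.Arrays.fill(data, ySize, data.length, (byte) 128); // U and V planes
}

Calling this on the data array inside onCaptureVideoFrame grayscales the frame before it is encoded and sent.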

API reference

  • registerVideoFrameObserver
  • onCaptureVideoFrame
  • onPreEncodeVideoFrame
  • onRenderVideoFrame

See also

Refer to Raw Audio Data if you want to implement the raw audio data function in your project.
