Channel quality

Customer satisfaction with your IoT SDK-integrated app depends on the quality of the video and audio it provides. The quality of audiovisual communication through your app is affected by the following factors:

  • Bandwidth of network connection: Bandwidth is the volume of information that an Internet connection can handle per unit of time. When the available bandwidth is not sufficient to transmit the amount of data necessary to provide the desired video quality, your users see jerky or frozen video along with audio that cuts in and out.

  • Stability of network connection: Network connections are often unstable with the network quality going up and down. Users get temporarily disconnected and come back online after an interruption. These issues lead to a poor audiovisual experience for your users unless your app is configured to respond to these situations and take remedial actions.

  • Hardware quality: The camera and microphone used to capture video and audio must be of sufficiently good quality. If the user's hardware does not capture the audiovisual information in suitably high definition, it limits the quality of audio and video that is available to the remote user.

  • Video and audio settings: The sharpness, smoothness, and overall quality of the video are directly linked to the frame rate, bit rate, and other video settings. Similarly, the audio quality depends on the sample rate, bit rate, number of channels, and other audio parameters. If you do not choose proper settings, the audio and video transmitted are of poor quality. On the other hand, if the settings are too demanding, the available bandwidth quickly gets choked, leading to a suboptimal experience for your users.

  • Echo: Echo is produced when your audio signal is played by a remote user through a speakerphone or an external device. This audio is captured by the remote user's microphone and sent back to you. Echo negatively affects audio quality, making speech difficult to understand.

  • Multiple users in a channel: When multiple users engage in real-time audio and video communication in a channel, the available bandwidth is quickly used up due to several incoming audio and video streams. The device performance also deteriorates due to the excessive workload required to decode and render multiple video streams.

This page shows you how to use IoT SDK features to account for these factors and ensure optimal audio and video quality in your app.

Understand the tech

To provide the best audio and video quality in your app:

  • Choose an audio codec to optimize bit rate and quality

    In audio and video streaming, the choice of an encoding algorithm affects both quality and bit rate. Your goal is to lower the required bit rate while maintaining quality at the desired level. IoT SDK offers the following built-in audio codecs:

    • Opus
    • G722
    • G711A (PCMA)
    • G711U (PCMU)

    You specify an audio codec when you join a channel. Depending on your audio quality requirements, you also set the sampling rate and the number of channels. If your app needs a custom encoding algorithm, you disable use of a built-in codec and implement your own encoder.

  • Adjust sending bit rate in real time

    To optimize data transmission and avoid network congestion, best practice is to adjust the sending bit rate in real time according to changes in network conditions.

    You configure Bandwidth Estimation (BWE) before joining a channel to set the minimum, maximum, and starting bit rate values according to the actual bandwidth and bit rate needs. When the network bandwidth changes, IoT SDK triggers an event to prompt your app to adjust the sending bit rate in real time. The bit rate returned by the callback is the maximum recommended encoding bit rate for the video encoder.

  • Change audio and video streaming status

    Network traffic can be reduced by suspending data transmission when the audio or video feed is not required by the receiver. After a user successfully connects to a channel and starts audio and video streaming, they can suspend sending streams to a specific connection, or to all connections, to flexibly manage the transmission status of audio and video streams. Similarly, a user may suspend receiving a specific stream, or all streams, as required.

    When a user changes the transmission status of the local audio or video stream, the IoT SDK triggers a corresponding callback on the remote side, notifying remote users that the stream has been paused or resumed.

  • Request key frames

    In video transmission, a frame containing complete image information is known as a key frame. Subsequent frames, known as delta frames, only include modifications to the previous frame. When a video streaming client experiences network congestion, the data loss in delta frames leads to visual inconsistencies in subsequent frames. IoT SDK enables you to request a key frame from the sender to resolve such issues. A fresh key frame resets the video stream to a known state. This feature allows the client to resynchronize with the video stream and resume playback without visual artifacts or distortion.

  • Log files

    IoT SDK provides configuration options that you use to customize the location, content, and size of log files containing key data about IoT SDK operation. When you set up logging, IoT SDK writes information messages, warnings, and errors regarding activities such as initialization, configuration, connection, and disconnection to log files. Log files are useful in detecting and resolving channel quality issues.

The following figure shows the workflow you need to implement to ensure channel quality in your app:

Ensure Channel Quality

Prerequisites

To follow this procedure, you must have:

  • Android Studio 4.1 or higher.
  • Android SDK API Level 24 or higher.
  • A mobile device that runs Android 4.1 or higher.
  • An Agora account and project.

  • A computer with Internet access.

    Ensure that no firewall is blocking your network communication.

Project setup

To create the environment necessary to implement channel quality best practices into your app, open the SDK quickstart for IoT SDK project you created previously.

Implement best practice to optimize channel quality

This section shows you how to integrate channel quality optimization features of IoT SDK into your app, step-by-step.

Implement the user interface

To enable app users to mute and unmute audio and video, add checkboxes to the user interface. To do this, open /app/res/layout/activity_main.xml and add the following lines before </RelativeLayout>:


<CheckBox
    android:id="@+id/MuteLocalAudio"
    android:text="Mute local audio"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:layout_below="@id/JoinButton"
    android:layout_alignStart="@id/JoinButton" />

<CheckBox
    android:id="@+id/MuteLocalVideo"
    android:text="Mute local video"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:layout_below="@id/MuteLocalAudio"
    android:layout_alignStart="@id/JoinButton" />

Handle the system logic

Import the necessary Android classes and access the UI elements.

  1. Import supporting libraries

    In MainActivity.java, add the following to the list of import statements:


    import java.lang.Math;
    import android.widget.CheckBox;
    import android.widget.CompoundButton;

  2. Define variables to access the checkboxes

    In /app/java/com.example.<projectname>/MainActivity, add the following declarations to the MainActivity class:


    private CheckBox checkMuteLocalAudio;
    private CheckBox checkMuteLocalVideo;

Implement channel quality features

To implement channel quality features, take the following steps:

  1. Configure the IoT SDK log file

    To customize the location and content of the log file, add the following code to setupAgoraEngine() before agoraEngine.init(appId, agoraRtcEvents, options):


    // Configure log file location and logging level
    options.logCfg.logPath = getExternalFilesDir(null).getPath() + "/iotlog";
    options.logCfg.logLevel = AgoraRtcService.LogLevel.RTC_LOG_WARNING;

  2. Specify the audio codec, sampling rate and the number of channels

    You set the audio codec, sampling rate, and the number of channels in ChannelOptions that you pass to the agoraEngine.joinChannel method. To set these parameters, modify the following lines in the joinChannel(View view) method according to your requirements:


    channelOptions.audioCodecOpt.audioCodecType
            = AgoraRtcService.AudioCodecType.AUDIO_CODEC_TYPE_OPUS;
    channelOptions.audioCodecOpt.pcmSampleRate = 16000;
    channelOptions.audioCodecOpt.pcmChannelNum = 1;

    Choose from the OPUS, G722, G711A, and G711U built-in audio codecs in the IoT SDK.

    To use your own audio codec, set channelOptions.audioCodecOpt.audioCodecType to AUDIO_CODEC_DISABLED. When you call agoraEngine.sendAudioData, set the dataType parameter of AudioFrameInfo to your encoding format. This setting transmits the audio encoding format as is to the receiving end. When you disable use of a built-in audio codec, IoT SDK does not process the audio. The receiving end obtains the encoded audio data and the encoding format through the onAudioData callback and decodes it using a custom decoder.
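
    The following sketch illustrates this flow end to end. It is a minimal sketch, not the definitive API: the AudioFrameInfo construction, the AUDIO_DATA_TYPE_GENERIC constant, and the customEncode() helper are assumptions to verify against the IoT SDK API reference:

    // Disable the built-in codecs so the SDK passes your encoded audio through as is
    channelOptions.audioCodecOpt.audioCodecType
            = AgoraRtcService.AudioCodecType.AUDIO_CODEC_DISABLED;

    // Encode a captured PCM frame with your own encoder (customEncode is a
    // hypothetical helper), describe the format, and send the result
    byte[] encodedAudio = customEncode(pcmFrame);
    AgoraRtcService.AudioFrameInfo frameInfo = new AgoraRtcService.AudioFrameInfo();
    frameInfo.dataType = AgoraRtcService.AudioDataType.AUDIO_DATA_TYPE_GENERIC; // assumed constant
    agoraEngine.sendAudioData(connectionId, encodedAudio, frameInfo);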

  3. Configure bandwidth estimation parameters

    You configure bandwidth estimation parameters before joining a channel. For example, based on the definition levels of a webcam (see the table in the Reference section), the minimum value can be set to 400 kbps and the maximum value to 4200 kbps. The starting value lies between the minimum and the maximum. If the initial encoding definition is SD, the starting bit rate can be set to 500 kbps.

    To specify these parameters, add the following code to the joinChannel(View view) method before agoraEngine.joinChannel:


    // Configure Bandwidth Estimation: min, max, and starting bit rates in bps
    agoraEngine.setBweParam(connectionId, 400000, 4200000, 500000);

  4. Respond to target bit rate changes

    When network bandwidth changes, IoT SDK triggers the onTargetBitrateChanged callback to prompt the app to adjust the sending bit rate. The targetBps value returned by the callback is the maximum recommended encoding bit rate of the video encoder.

    In this example, you use the definition levels of a webcam (see the Reference section) to switch resolution depending on the reported target bit rate. To do this, replace the onTargetBitrateChanged method under agoraRtcEvents with the following:


    int curTargetBitrate, diffTargetBitrate, lastTargetBitrate = 500;
    int lowBitrateL1 = 400, highBitrateL1 = 800;
    int lowBitrateL2 = 1130, highBitrateL2 = 2260, highBitrateL3 = 4160;
    int curResolutionLevel = 1, lastResolutionLevel = 1;

    @Override
    public void onTargetBitrateChanged(int connId, int targetBps) {
        // Adjust the sending bit rate in real time.
        // Grade the current bit rate in steps of 100 kbps, rounding down
        curTargetBitrate = targetBps / 100000 * 100;
        diffTargetBitrate = Math.abs(curTargetBitrate - lastTargetBitrate);

        // If the difference between the detected bandwidth and the current
        // encoding bit rate is 100 kbps or more, adjust the encoding parameters
        if ((diffTargetBitrate >= 100) && ((lowBitrateL1 < curTargetBitrate)
                && (curTargetBitrate < highBitrateL3))) {
            if ((lowBitrateL1 < curTargetBitrate) && (curTargetBitrate < highBitrateL1)) {
                curResolutionLevel = 1; // Set resolution to L1
            } else if ((lowBitrateL2 < curTargetBitrate) && (curTargetBitrate < highBitrateL2)) {
                curResolutionLevel = 2; // Set resolution to L2
            } else {
                curResolutionLevel = 3; // Set resolution to L3
            }

            // Adjust the encoding resolution
            if (curResolutionLevel != lastResolutionLevel) {
                // setEncoderResolution(curResolutionLevel);
            }

            // Set the encoding bit rate
            // setEncoderBitrate(curTargetBitrate);

            lastResolutionLevel = curResolutionLevel;
            lastTargetBitrate = curTargetBitrate;
        }
    }

  5. Manage audio and video streaming status

    When a user taps a checkbox, you pause or resume sending the local audio or video stream. To do this, add the following code to the onCreate method in the MainActivity class:


    checkMuteLocalAudio = findViewById(R.id.MuteLocalAudio);
    checkMuteLocalVideo = findViewById(R.id.MuteLocalVideo);

    checkMuteLocalAudio.setOnCheckedChangeListener(new CompoundButton.OnCheckedChangeListener() {
        @Override
        public void onCheckedChanged(CompoundButton buttonView, boolean isChecked) {
            if (isJoined) {
                agoraEngine.muteLocalAudio(connectionId, checkMuteLocalAudio.isChecked());
            }
        }
    });

    checkMuteLocalVideo.setOnCheckedChangeListener(new CompoundButton.OnCheckedChangeListener() {
        @Override
        public void onCheckedChanged(CompoundButton buttonView, boolean isChecked) {
            if (isJoined) {
                agoraEngine.muteLocalVideo(connectionId, checkMuteLocalVideo.isChecked());
            }
        }
    });

    To pause and resume playing remote audio or video streams, call agoraEngine.muteRemoteAudio(connectionId, remoteUid, isMuted) or agoraEngine.muteRemoteVideo(connectionId, remoteUid, isMuted).
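
    For example, to temporarily stop playing both streams from one remote user and resume them later (a minimal sketch; remoteUid is assumed to hold the remote user's ID):

    // Pause both incoming streams from one remote user
    agoraEngine.muteRemoteAudio(connectionId, remoteUid, true);
    agoraEngine.muteRemoteVideo(connectionId, remoteUid, true);

    // ...later, resume receiving them
    agoraEngine.muteRemoteAudio(connectionId, remoteUid, false);
    agoraEngine.muteRemoteVideo(connectionId, remoteUid, false);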

  6. Handle muting and unmuting notifications of remote streams

    When a remote user mutes or unmutes their audio or video stream, you receive notification of these changes. In this example, you inform the user of these events by displaying a message. To do this, replace the onUserMuteAudio and onUserMuteVideo methods under agoraRtcEvents with the following:


    @Override
    public void onUserMuteAudio(int connId, int uid, boolean muted) {
        // A remote user paused or resumed sending the audio stream
        showMessage("Remote user " + uid + (muted ? " muted" : " unmuted") + " audio");
    }

    @Override
    public void onUserMuteVideo(int connId, int uid, boolean muted) {
        // A remote user paused or resumed sending the video stream
        showMessage("Remote user " + uid + (muted ? " muted" : " unmuted") + " video");
    }

  7. Request and send key frames

    When you call agoraEngine.sendVideoData to send a video frame, you specify whether the frame is a key frame by setting the frameType parameter of VideoFrameInfo.
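
    For example, a sender might tag each encoded frame before transmission (a minimal sketch; the VideoFrameType constant names and the isKeyFrame and encodedFrame variables are assumptions to verify against the API reference):

    // Mark the encoded frame as a key frame or a delta frame, then send it.
    // VIDEO_FRAME_KEY and VIDEO_FRAME_DELTA are assumed constant names.
    AgoraRtcService.VideoFrameInfo frameInfo = new AgoraRtcService.VideoFrameInfo();
    frameInfo.frameType = isKeyFrame
            ? AgoraRtcService.VideoFrameType.VIDEO_FRAME_KEY
            : AgoraRtcService.VideoFrameType.VIDEO_FRAME_DELTA;
    agoraEngine.sendVideoData(connectionId, encodedFrame, frameInfo);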

    When the sender does not send a key frame for a long time or the key frame is lost or damaged during transmission, IoT SDK triggers the onKeyFrameGenReq callback to advise the sender to generate a key frame. In this example, you show a message when you receive the onKeyFrameGenReq callback. Add the following code to the MainActivity class:


    @Override
    public void onKeyFrameGenReq(String channel, int remote_uid, byte stream_id) {
        // Set a flag for the encoder to generate a key frame
        showMessage("Frame loss detected.");
    }

    If the receiver encounters an error when decoding frames, it calls the agoraEngine.requestVideoKeyFrame method to request that a remote user generate a fresh key frame.
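
    For instance, in the receiver's decode path (a minimal sketch; decodeVideoFrame() is a hypothetical placeholder for your decoder call, and the requestVideoKeyFrame parameter list is assumed by analogy with onKeyFrameGenReq):

    // Ask the remote sender for a fresh key frame when decoding fails
    if (!decodeVideoFrame(encodedFrame)) {
        agoraEngine.requestVideoKeyFrame(connectionId, remoteUid, streamId);
    }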

Test your implementation

To ensure that you have implemented channel quality features into your app:

  1. Generate a temporary token in Agora Console.

  2. In your browser, navigate to the Agora Muting web demo and update App ID, Channel, and Token with the values for your temporary token, then click Join.

  3. In Android Studio, open app/java/com.example.<projectname>/MainActivity, and update appId, channelName, and token with the values for your temporary token.

  4. Connect a physical Android device to your development device.

  5. In Android Studio, click Run app. A moment later, you see the project installed on your device. If this is the first time you run the project, grant microphone and camera access to your app.

    When the app starts, it sets the log file location and logging level according to your preference.

  6. Press Join. Your app does the following:

    • Sets the audio codec, sampling rate and the number of channels
    • Sets bandwidth estimation parameters
    • Starts streaming audio and video
    • Listens for notification of changes in network bandwidth to adjust video resolution and bit rate accordingly
    • Listens for key frame requests and notifies the encoder to send a key frame
  7. Check and uncheck the Mute local audio and Mute local video boxes.

    Audio and video are muted/unmuted in the web demo app.

  8. Press the Mute Audio and Mute Video buttons in the web demo app. You see messages in your Android app informing you of these events.

Reference

This section contains additional information that completes the content on this page, or points you to documentation that explains other aspects of this product.

The table below shows the recommended sampling rates and corresponding data sizes for Opus, G722, and G711 audio data.

Audio format | Sampling rate (Hz) | Audio data size (bytes)
------------ | ------------------ | -----------------------
G711         | 8000               | 320
G722         | 16000              | 640
OPUS         | 16000              | 640
OPUS         | 48000              | 1920
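
The sizes in this table match 20 ms of 16-bit mono PCM at each sampling rate; the following standalone sketch checks the arithmetic (the 20 ms frame interpretation is an inference from the numbers, not a documented rule):

// Bytes in a 20 ms frame of 16-bit mono PCM: sampleRate * 0.02 s * 2 bytes per sample
static int pcmFrameBytes(int sampleRateHz) {
    return sampleRateHz / 1000 * 20 * 2;
}
// pcmFrameBytes(8000)  -> 320
// pcmFrameBytes(16000) -> 640
// pcmFrameBytes(48000) -> 1920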

Definition levels of a webcam

Gear     | Resolution    | Frame rate (fps) | Bit rate range (kbps)
-------- | ------------- | ---------------- | ---------------------
SD       | L1: 640*360   | 15               | 400 - 800
HD       | L2: 1280*720  | 15               | 1130 - 2260
Ultra HD | L3: 1920*1080 | 15               | 1130 - 4160