
You are looking at the Interactive Live Streaming v3.x docs. The newest version is Interactive Live Streaming 4.x.

Start Interactive Live Audio Streaming

Use this guide to quickly start interactive live audio streaming with the Agora Voice SDK for Android.

Understand the tech

The following figure shows the workflow for adding Interactive Live Streaming Premium functionality to your app.

(Figure: workflow for adding Interactive Live Streaming Premium)

As shown in the figure, the workflow for adding Interactive Live Streaming Premium in your project is as follows:

  1. Set the client role. Each user in an Interactive Live Streaming Premium channel is either a host or an audience member. Hosts publish streams to the channel, and audience members subscribe to them.

  2. Retrieve a token. A token is the credential that authenticates a user when your app client joins a channel. In a test or production environment, your app client retrieves tokens from a server in your security infrastructure.

  3. Join a channel. Call joinChannel to create and join a channel. App clients that pass the same channel name join the same channel.

  4. Publish and subscribe to audio in the channel. After joining a channel, app clients with the role of host can publish audio. For an audience member to send audio, call setClientRole to switch the client role.

For an app client to join a channel, you need the following information:

  • The App ID: A randomly generated string provided by Agora for identifying your app. You can get the App ID from Agora Console.
  • The user ID: The unique identifier of a user. You need to specify the user ID yourself, and ensure that it is unique in the channel.
  • A token: In a test or production environment, your app client retrieves tokens from a server in your security infrastructure. For this page, you use a temporary token with a validity period of 24 hours that you retrieve from Agora Console.
  • The channel name: A string that identifies the channel for the live stream.
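Since you must guarantee that a user ID you specify is unique in the channel, one approach is to generate a random nonzero 32-bit uid. The helper below is an illustration only, not part of the Agora SDK; uniqueness within a channel ultimately remains your app's responsibility (for example, by mapping your own account IDs to uids on your server).

```java
import java.security.SecureRandom;

// Hypothetical helper, not part of the Agora SDK: generate a random nonzero
// 32-bit user ID. The SDK treats uids as unsigned 32-bit integers, so any
// Java int except 0 is a candidate; 0 is reserved to let the SDK assign the
// uid itself.
public class UidGenerator {
    private static final SecureRandom RANDOM = new SecureRandom();

    public static int randomUid() {
        int uid;
        do {
            uid = RANDOM.nextInt();
        } while (uid == 0); // 0 tells the SDK to assign the uid
        return uid;
    }

    public static void main(String[] args) {
        // Print the uid as the unsigned value the SDK reports.
        System.out.println("Generated uid: " + (randomUid() & 0xFFFFFFFFL));
    }
}
```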

Prerequisites

Before proceeding, ensure that your development environment meets the following requirements:

  • Java Development Kit.
  • Android Studio 3.0 or later.
  • Android SDK API Level 16 or higher.
  • A valid Agora account.
  • An active Agora project with an App ID and a temporary token. For details, see Get Started with Agora.
  • A computer with access to the internet. If your network has a firewall, follow the instructions in Firewall Requirements.
  • A mobile device that runs Android 4.1 or later.

Set up the development environment

  1. For new projects, in Android Studio, create a Phone and Tablet Android project with an Empty Activity.

    After creating the project, Android Studio automatically starts Gradle sync. Ensure that the sync succeeds before you continue.

  2. Integrate the Voice SDK into your project with Maven Central. For more integration methods, see Other approaches to integrate the SDK.

    a. In /Gradle Scripts/build.gradle(Project: <projectname>), add the following lines to add the Maven Central dependency:


    buildscript {
        repositories {
            ...
            mavenCentral()
        }
        ...
    }

    allprojects {
        repositories {
            ...
            mavenCentral()
        }
    }

    The way to add the Maven Central dependency can be different if you set dependencyResolutionManagement in your Android project.
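For example, projects created with recent versions of Android Studio often declare repositories centrally in settings.gradle. In that case (a sketch, assuming the default project template), add mavenCentral() there instead:

```groovy
// settings.gradle — a sketch, assuming your project declares repositories
// centrally instead of in the project-level build.gradle.
dependencyResolutionManagement {
    repositoriesMode.set(RepositoriesMode.FAIL_ON_PROJECT_REPOS)
    repositories {
        google()
        mavenCentral()
    }
}
```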

    b. In /Gradle Scripts/build.gradle(Module: <projectname>.app), add the following lines to integrate the Agora Voice SDK into your Android project:


    ...
    dependencies {
        ...
        // For x.y.z, fill in a specific SDK version number. For example, 3.5.0 or 3.7.0.2.
        // Get the latest version number through the release notes.
        implementation 'io.agora.rtc:voice-sdk:x.y.z'
    }

  3. Add permissions for network and device access.

    In /app/Manifests/AndroidManifest.xml, add the following permissions after </application>:


    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
    <uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
    <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
    <uses-permission android:name="android.permission.BLUETOOTH" />
    <!-- Add the following permission on devices running Android 12.0 or later -->
    <uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />

  4. To prevent obfuscating the code in the Agora SDK, add the following line to /Gradle Scripts/proguard-rules.pro:


    -keep class io.agora.**{*;}

Implement the basic interactive live streaming

This section introduces how to use the Agora Voice SDK to start interactive live audio streaming. The following figure shows the API call sequence.

(Figure: API call sequence of the interactive live audio streaming)

1. Create the UI

Create the user interface (UI) for the audio streaming in the layout file of your project. Skip to Import Classes if you already have a UI in your project.

You can also refer to the xml files under the layout path in the OpenLive-Voice-Only-Android demo project.

Example for creating the UI

<?xml version="1.0" encoding="UTF-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:keepScreenOn="true"
    tools:context=".ui.LiveRoomActivity">

    <RelativeLayout
        android:layout_width="match_parent"
        android:layout_height="match_parent">

        <TextView
            android:id="@+id/room_name"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:layout_alignParentTop="true"
            android:layout_centerHorizontal="true"
            android:layout_marginTop="6dp"
            android:textColor="@color/dark_black"
            android:textSize="16sp"
            android:textStyle="bold" />

        <io.agora.propeller.ui.AGLinearLayout
            android:id="@+id/bottom_container"
            android:layout_width="match_parent"
            android:layout_height="wrap_content"
            android:layout_alignParentBottom="true"
            android:layout_alignParentLeft="true"
            android:layout_alignParentStart="true"
            android:orientation="vertical">

            <ImageView
                android:id="@+id/bottom_action_end_call"
                android:layout_width="54dp"
                android:layout_height="54dp"
                android:layout_gravity="center_horizontal"
                android:onClick="onEndCallClicked"
                android:scaleType="center"
                android:src="@drawable/btn_endcall" />

            <RelativeLayout
                android:id="@+id/bottom_action_container"
                android:layout_width="match_parent"
                android:layout_height="54dp"
                android:gravity="center_vertical"
                android:orientation="horizontal">

                <ImageView
                    android:id="@id/switch_broadcasting_id"
                    android:layout_width="54dp"
                    android:layout_height="match_parent"
                    android:layout_alignParentLeft="true"
                    android:layout_alignParentStart="true"
                    android:scaleType="center"
                    android:src="@drawable/btn_request_broadcast" />

                <LinearLayout
                    android:layout_width="wrap_content"
                    android:layout_height="match_parent"
                    android:layout_centerInParent="true"
                    android:orientation="horizontal">

                    <ImageView
                        android:id="@id/switch_speaker_id"
                        android:layout_width="54dp"
                        android:layout_height="match_parent"
                        android:onClick="onSwitchSpeakerClicked"
                        android:scaleType="center"
                        android:src="@drawable/btn_speaker" />

                    <ImageView
                        android:id="@id/mute_local_speaker_id"
                        android:layout_width="54dp"
                        android:layout_height="match_parent"
                        android:onClick="onVoiceMuteClicked"
                        android:scaleType="center"
                        android:src="@drawable/btn_mute" />

                </LinearLayout>

            </RelativeLayout>
        </io.agora.propeller.ui.AGLinearLayout>

        <EditText
            android:id="@+id/msg_list"
            android:layout_width="match_parent"
            android:layout_height="match_parent"
            android:layout_above="@id/bottom_container"
            android:layout_below="@id/room_name"
            android:layout_marginBottom="8dp"
            android:layout_marginTop="6dp"
            android:enabled="true"
            android:focusable="false"
            android:gravity="start|top"
            android:inputType="none"
            android:scrollbars="vertical" />

    </RelativeLayout>
</FrameLayout>

2. Import Classes

Import the following classes in the activity file of your project:


import io.agora.rtc.IRtcEngineEventHandler;
import io.agora.rtc.RtcEngine;

3. Get the device permission

Call the checkSelfPermission method to access the microphone of the Android device when launching the activity.


private static final int PERMISSION_REQ_ID_RECORD_AUDIO = 22;

// Ask for Android device permissions at runtime.
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_voice_chat_view);

    // Initialize RtcEngine and join the channel after getting the permission.
    if (checkSelfPermission(Manifest.permission.RECORD_AUDIO, PERMISSION_REQ_ID_RECORD_AUDIO)) {
        initAgoraEngineAndJoinChannel();
    }
}

private void initAgoraEngineAndJoinChannel() {
    initializeAgoraEngine();
    joinChannel();
}

public boolean checkSelfPermission(String permission, int requestCode) {
    Log.i(LOG_TAG, "checkSelfPermission " + permission + " " + requestCode);
    if (ContextCompat.checkSelfPermission(this, permission)
            != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(this,
                new String[]{permission},
                requestCode);
        return false;
    }
    return true;
}

4. Initialize RtcEngine

Create and initialize the RtcEngine object before calling any other Agora APIs.

In the strings.xml file, replace agora_app_id with your App ID. Call the create method and pass in the App ID to initialize the RtcEngine object.

You can also listen for callback events, such as when the local user joins the channel. Do not implement UI operations in these callbacks.


private RtcEngine mRtcEngine;
private final IRtcEngineEventHandler mRtcEventHandler = new IRtcEngineEventHandler() {
    // Listen for the onJoinChannelSuccess callback.
    // This callback occurs when the local user successfully joins the channel.
    @Override
    public void onJoinChannelSuccess(String channel, final int uid, int elapsed) {
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                Log.i("agora", "Join channel success, uid: " + (uid & 0xFFFFFFFFL));
            }
        });
    }

    // Listen for the onUserOffline callback.
    // This callback occurs when the host leaves the channel or drops offline.
    @Override
    public void onUserOffline(final int uid, int reason) {
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                Log.i("agora", "User offline, uid: " + (uid & 0xFFFFFFFFL));
                onRemoteUserLeft();
            }
        });
    }
};

...

// Initialize the RtcEngine object.
private void initializeAgoraEngine() {
    try {
        mRtcEngine = RtcEngine.create(getBaseContext(), getString(R.string.agora_app_id), mRtcEventHandler);
    } catch (Exception e) {
        Log.e(TAG, Log.getStackTraceString(e));
        throw new RuntimeException("NEED TO check rtc sdk init fatal error\n" + Log.getStackTraceString(e));
    }
}
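The callbacks above log the uid as `(uid & 0xFFFFFFFFL)`. Agora user IDs are unsigned 32-bit integers, while Java's `int` is signed, so large uids would print as negative numbers unless masked into a `long`. A standalone sketch of the conversion:

```java
// Demonstrates why the event handler masks the uid with 0xFFFFFFFFL:
// Agora uids are unsigned 32-bit values, but Java ints are signed.
public class UidMaskDemo {
    public static long toUnsigned(int uid) {
        // Masking with 0xFFFFFFFFL widens to long and drops sign extension.
        return uid & 0xFFFFFFFFL;
    }

    public static void main(String[] args) {
        int uid = -10; // how Java represents the unsigned uid 4294967286
        System.out.println(toUnsigned(uid)); // prints 4294967286
    }
}
```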

5. Set the channel profile

After initializing the RtcEngine object, call the setChannelProfile method to set the channel profile as LIVE_BROADCASTING.

One RtcEngine object uses one profile only. If you want to switch to another profile, release the current RtcEngine object with the destroy method and create a new one before calling the setChannelProfile method.


private void setChannelProfile() {
    mRtcEngine.setChannelProfile(Constants.CHANNEL_PROFILE_LIVE_BROADCASTING);
}

6. Set the user role

A live-streaming channel has two user roles: BROADCASTER and AUDIENCE, and the default role is AUDIENCE. After setting the channel profile to LIVE_BROADCASTING, your app may use the following steps to set the client role:

  1. Allow the user to set the role as BROADCASTER or AUDIENCE.
  2. Call the setClientRole method and pass in the user role set by the user.

Note that in a live stream, only the host can be heard. If you want to switch the user role after joining the channel, call the setClientRole method.


public void onClickJoin(View view) {
    // Show a dialog box to choose a user role.
    AlertDialog.Builder builder = new AlertDialog.Builder(this);
    builder.setMessage(R.string.msg_choose_role);
    builder.setNegativeButton(R.string.label_audience, new DialogInterface.OnClickListener() {
        @Override
        public void onClick(DialogInterface dialog, int which) {
            MainActivity.this.forwardToLiveRoom(Constants.CLIENT_ROLE_AUDIENCE);
        }
    });
    builder.setPositiveButton(R.string.label_broadcaster, new DialogInterface.OnClickListener() {
        @Override
        public void onClick(DialogInterface dialog, int which) {
            MainActivity.this.forwardToLiveRoom(Constants.CLIENT_ROLE_BROADCASTER);
        }
    });
    AlertDialog dialog = builder.create();

    dialog.show();
}

// Get the user role and channel name specified by the user.
// The channel name is used when joining the channel.
public void forwardToLiveRoom(int cRole) {
    final EditText v_room = (EditText) findViewById(R.id.room_name);
    String room = v_room.getText().toString();

    Intent i = new Intent(MainActivity.this, LiveRoomActivity.class);
    i.putExtra("CRole", cRole);
    i.putExtra("CName", room);

    startActivity(i);
}

// The role set by the user.
private int mRole;

private void setClientRole() {
    // Read the role passed from MainActivity and pass it to the SDK.
    mRole = getIntent().getIntExtra("CRole", 0);
    mRtcEngine.setClientRole(mRole);
}

7. Join a channel

After setting the user role, you can call the joinChannel method to join a channel. In this method, set the following parameters:

  • token: Pass a token that identifies the role and privileges of the user. You can set it to one of the following values:
    • A temporary token generated in Agora Console. A temporary token is valid for 24 hours. For details, see Get a Temporary Token. When joining a channel, ensure that the channel name is the same as the one you used to generate the temporary token.
    • A token generated at your server. This applies to scenarios with high-security requirements. For details, see Generate a Token from Your Server. When joining a channel, ensure that the channel name and uid are the same as those you used to generate the token.
    If your project has enabled the App Certificate, ensure that you provide a token, and do not set token to "".
  • channelName: The name of the channel that you want to join.
  • uid: The ID of the local user, an integer that must be unique in the channel. If you set uid to 0, the SDK assigns a user ID for the local user and returns it in the onJoinChannelSuccess callback.
    Once a user joins the channel, they subscribe to the audio streams of all other users in the channel by default, which generates usage and affects billing. If you do not want to subscribe to a specific stream, or to any remote stream, call the corresponding mute methods.

For more details on the parameter settings, see joinChannel.


private String mRoomName;

private void joinChannel() {
    // Join a channel with a token.
    mRoomName = getIntent().getStringExtra("CName");
    mRtcEngine.joinChannel(YOUR_TOKEN, mRoomName, "Extra Optional Data", 0);
}

8. Leave the channel

Call the leaveChannel method to leave the current channel according to your scenario, for example, when the streaming ends, when you need to close the app, or when your app runs in the background.


@Override
protected void onDestroy() {
    super.onDestroy();
    if (!mCallEnd) {
        leaveChannel();
    }
    RtcEngine.destroy();
}

private void leaveChannel() {
    // Leave the current channel.
    mRtcEngine.leaveChannel();
}

Sample code

You can find the complete code logic in the OpenLive-Voice-Only-Android demo project.

Run the project

Run the project on your Android device. When the audio streaming starts, all audience members can hear the host in the app.

See also

Other approaches to integrate the SDK

In addition to integrating the Agora Voice SDK for Android through Maven Central, you can also import the SDK into your project by manually copying the SDK files.

  1. Go to SDK Downloads, download the latest version of the Agora Voice SDK, and extract the files from the downloaded SDK package.
  2. Copy the following files or subfolders from the libs folder of the downloaded SDK package to the corresponding path in your project.

     File or subfolder        Path in your project
     agora-rtc-sdk.jar file   /app/libs/
     arm64-v8a folder         /app/src/main/jniLibs/
     armeabi-v7a folder       /app/src/main/jniLibs/
     include folder           /app/src/main/jniLibs/
     x86 folder               /app/src/main/jniLibs/
     x86_64 folder            /app/src/main/jniLibs/

     If you use the armeabi architecture, copy files from the armeabi-v7a folder to the armeabi folder of your project. Contact support@agora.io if you encounter any incompatibility issue.
