Live video broadcasting has seen a range of uses, from live shopping to live concerts. There are many aspects to building a scalable, high-quality live video streaming app: maintaining low latency, balancing load, and managing an audience of thousands of users is challenging enough on its own, let alone while also maintaining cross-platform compatibility.
There’s a really easy way to make this happen using the Agora React Native SDK. In this article, we’ll build a live broadcasting app that can have multiple broadcasters and host thousands of users by using the magic of the Agora Video SDK. We’ll go over the structure, setup, and execution of the app before diving into how it works. You can get a live broadcast going in a few simple steps within a matter of minutes.
We’ll be using the Agora RTC SDK for React Native for the example below. I’m using v3.4.6 at the time of writing.
Go to https://console.agora.io/ to create an account and log in to the dashboard. You can follow this guide for reference: https://www.agora.io/en/blog/how-to-get-started-with-agora
Navigate to the Project List tab under the Project Management tab, and create a new project by clicking the blue Create button.
Retrieve the App ID for the new project. If you selected App ID with a token, obtain a temporary token as well; you can find a link to generate temporary tokens on the project's edit page. The temporary token will be used to authorize your requests while you're developing the application.
Note: Token authentication is recommended for all RTE apps running in production environments. For more information about token-based authentication in the Agora platform, see this guide: https://docs.agora.io/en/Video/token?platform=All%20Platforms
This is the structure of our application:
```
.
├── android
├── components
│   ├── Permission.ts
│   └── Style.ts
├── ios
├── App.tsx
└── index.js
```
You'll need to have the LTS version of Node.js and NPM installed.

1. Run `npm install` to install the app dependencies in the unzipped directory.
2. Open `./App.tsx` and enter the App ID that we obtained from the Agora Console (`appId: '<YourAppIDHere>'`). If you're using tokens, enter your token and channel name as well.
3. If you're building for iOS, run `cd ios && pod install`. You can then open the `ios/<projectName>.xcworkspace` file to open your project in Xcode and build the app. (The iOS simulator does not support the camera. Use a physical device instead.)
4. Run `npm run android` to start the app. Wait a few minutes for the app to build.

That's it. You should have a video call going between the two devices. The app uses `test` as the channel name.
In the `Permission.ts` file, we export a function to request camera and microphone permissions from the OS on Android.
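Here's a minimal sketch of what that helper can look like, using React Native's built-in `PermissionsAndroid` API. The function name `requestCameraAndAudioPermission` is the one assumed in the snippets below; the helper in your copy of the project may differ slightly:

```ts
import { PermissionsAndroid } from 'react-native';

/**
 * Requests camera and microphone permissions from the OS on Android.
 */
export default async function requestCameraAndAudioPermission(): Promise<void> {
  try {
    const granted = await PermissionsAndroid.requestMultiple([
      PermissionsAndroid.PERMISSIONS.CAMERA,
      PermissionsAndroid.PERMISSIONS.RECORD_AUDIO,
    ]);
    if (
      granted['android.permission.CAMERA'] === PermissionsAndroid.RESULTS.GRANTED &&
      granted['android.permission.RECORD_AUDIO'] === PermissionsAndroid.RESULTS.GRANTED
    ) {
      console.log('You can use the camera and mic');
    } else {
      console.log('Permission denied');
    }
  } catch (err) {
    console.warn(err);
  }
}
```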
The `App.tsx` file contains the core logic of our video call.
We start by writing the import statements. Next, we have some constants for our App ID, token, and channel name.
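Roughly, the top of `App.tsx` looks like this sketch. The placeholder values are the ones you filled in during setup; `token` can be `null` if your project uses App ID–only authentication:

```tsx
import React, { Component } from 'react';
import { Platform, ScrollView, Text, TouchableOpacity, View } from 'react-native';
import RtcEngine, {
  ChannelProfile,
  ClientRole,
  RtcLocalView,
  RtcRemoteView,
  VideoRenderMode,
} from 'react-native-agora';
import requestCameraAndAudioPermission from './components/Permission';
import styles from './components/Style';

// Fill these in with the values from the Agora Console.
const appId = '<YourAppIDHere>';
const token: string | null = '<YourTokenHere>'; // or null if your project uses App ID only
const channelName = 'test';
```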
We define an interface for our application state containing `isHost` (a Boolean value to switch between audience and broadcaster; a host can both send and receive streams, whereas an audience member can only receive streams), `joinSucceed` (a Boolean value to store whether we've connected successfully), and `peerIds` (an array to store the UIDs of other users in the channel).
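In TypeScript, that state shape can be expressed as:

```ts
interface State {
  isHost: boolean;      // Broadcaster (true) or audience (false)
  joinSucceed: boolean; // True once we've joined the channel successfully
  peerIds: number[];    // UIDs of the other users in the channel
}
```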
We define a class-based component with an `_engine` variable that will store the instance of the `RtcEngine` class, which provides the methods our application invokes to manage the live stream.
In the constructor, we set our state variables and request permission for the camera and the mic on Android. When the component is mounted, we call the `init` function, which initializes the RTC engine using the App ID. It also enables video by calling the `enableVideo` method on our engine instance. We set `channelProfile` to Live Broadcasting and `clientRole` based on our `isHost` state variable value.
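Putting that together, a sketch of the component skeleton, constructor, and `init` function might look like this (assuming the imports, constants, and `State` interface shown above):

```tsx
export default class App extends Component<{}, State> {
  _engine?: RtcEngine;

  constructor(props: {}) {
    super(props);
    this.state = { isHost: true, joinSucceed: false, peerIds: [] };
    if (Platform.OS === 'android') {
      // Camera and mic permissions must be requested at runtime on Android
      requestCameraAndAudioPermission().then(() => {
        console.log('requested!');
      });
    }
  }

  componentDidMount() {
    this.init();
  }

  init = async () => {
    // Create the engine instance with our App ID and enable video
    this._engine = await RtcEngine.create(appId);
    await this._engine.enableVideo();
    // Live Broadcasting profile; the client role follows the isHost state
    await this._engine.setChannelProfile(ChannelProfile.LiveBroadcasting);
    await this._engine.setClientRole(
      this.state.isHost ? ClientRole.Broadcaster : ClientRole.Audience
    );
    // Event listeners are registered here as well (see below)
  };
}
```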
The `init` function also adds event listeners for various events in the live broadcast. For example, the `UserJoined` event gives us the UID of a user when they join the channel; we store this UID in our state. (If there are users connected to the channel before we joined, a `UserJoined` event is fired for each of them once we successfully join the channel.)
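A sketch of those listeners, registered inside `init`: alongside `UserJoined`, the SDK also emits `UserOffline` and `JoinChannelSuccess` events, which we can use to prune the peer list and flip `joinSucceed`:

```ts
// Inside init(), after creating the engine:
this._engine.addListener('UserJoined', (uid, elapsed) => {
  console.log('UserJoined', uid, elapsed);
  const { peerIds } = this.state;
  if (peerIds.indexOf(uid) === -1) {
    // Store the new user's UID so we can render their video feed
    this.setState({ peerIds: [...peerIds, uid] });
  }
});

this._engine.addListener('UserOffline', (uid, reason) => {
  console.log('UserOffline', uid, reason);
  // Remove the user who left from our list of peers
  this.setState({ peerIds: this.state.peerIds.filter((id) => id !== uid) });
});

this._engine.addListener('JoinChannelSuccess', (channel, uid, elapsed) => {
  console.log('JoinChannelSuccess', channel, uid, elapsed);
  this.setState({ joinSucceed: true });
});
```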
Next, we have the `toggleRole` function, which switches roles between audience and broadcaster, and `startCall` and `endCall` to start and end the call. The `toggleRole` function updates the state and calls the `setClientRole` function with a role argument based on the state. The `joinChannel` method takes in a token, channel name, optional info, and an optional UID. (If you set the UID to 0, the SDK automatically assigns one.)
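Sketched out, those three functions might look like this (passing 0 as the UID to `joinChannel` lets the SDK assign one automatically):

```ts
toggleRole = async () => {
  // Flip between broadcaster and audience, then inform the engine
  this.setState(
    (state) => ({ isHost: !state.isHost }),
    async () => {
      await this._engine?.setClientRole(
        this.state.isHost ? ClientRole.Broadcaster : ClientRole.Audience
      );
    }
  );
};

startCall = async () => {
  // Join with a token (or null), channel name, optional info, and UID 0
  await this._engine?.joinChannel(token, channelName, null, 0);
};

endCall = async () => {
  await this._engine?.leaveChannel();
  this.setState({ peerIds: [], joinSucceed: false });
};
```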
We define the render function to display buttons for starting and ending the call and to show our local video feed as well as the remote users' video feeds. We also define the `_renderVideos` function, which renders those video feeds.
To display the local user's video feed, we use the `<RtcLocalView.SurfaceView>` component, which takes in `channelId` and `renderMode` (which can be used to fit the video inside a view or zoom to fill the view) as props. To display a remote user's video feed, we use the `<RtcRemoteView.SurfaceView>` component from the SDK, which takes in the UID of the remote user along with `channelId` and `renderMode`. We map over the remote users' UIDs in the `peerIds` array to display a video for each.
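Here's a sketch of that render logic. The style names (`max`, `fullView`, `video`) are placeholders for whatever `Style.ts` defines (see the next section):

```tsx
render() {
  const { joinSucceed, isHost } = this.state;
  return (
    <View style={styles.max}>
      <Text>{isHost ? 'Broadcaster' : 'Audience'}</Text>
      <TouchableOpacity onPress={this.toggleRole}>
        <Text>Toggle Role</Text>
      </TouchableOpacity>
      <TouchableOpacity onPress={joinSucceed ? this.endCall : this.startCall}>
        <Text>{joinSucceed ? 'End Call' : 'Start Call'}</Text>
      </TouchableOpacity>
      {joinSucceed && this._renderVideos()}
    </View>
  );
}

_renderVideos = () => {
  const { isHost, peerIds } = this.state;
  return (
    <View style={styles.fullView}>
      {/* Local feed: only broadcasters publish video */}
      {isHost && (
        <RtcLocalView.SurfaceView
          style={styles.video}
          channelId={channelName}
          renderMode={VideoRenderMode.Hidden}
        />
      )}
      <ScrollView horizontal={true}>
        {/* One remote view per peer UID */}
        {peerIds.map((uid) => (
          <RtcRemoteView.SurfaceView
            key={uid}
            style={styles.video}
            uid={uid}
            channelId={channelName}
            renderMode={VideoRenderMode.Hidden}
          />
        ))}
      </ScrollView>
    </View>
  );
};
```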
The `Style.ts` file contains the styling for the components.
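For completeness, a minimal `Style.ts` along these lines would satisfy the style names assumed in the render sketch above (`Dimensions` is used to size each feed relative to the screen):

```ts
import { Dimensions, StyleSheet } from 'react-native';

const dimensions = {
  width: Dimensions.get('window').width,
  height: Dimensions.get('window').height,
};

export default StyleSheet.create({
  max: { flex: 1 },
  fullView: { width: dimensions.width, flex: 1 },
  video: { width: dimensions.width, height: dimensions.height / 3 },
});
```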
That’s how easy it is to build a live video broadcasting app. You can refer to the Agora React Native API Reference to see methods that can help you quickly add features like muting the camera and mic, setting video profiles, audio mixing, and much more.
If you’re deploying your app to production, you can read more about how to use tokens in this blog.
And I invite you to join the Agora Developer Slack Community. Feel free to ask any React Native questions in the #react-native-help-me channel.