In today’s multimedia-driven web, integrating live audio and video streaming into web applications is increasingly commonplace. However, developers often encounter restrictions with standard video integrations that lack flexibility or fail to meet specific user interface requirements. The Agora Web SDK provides a powerful set of tools for real-time engagement, allowing not only for easy embedding of live voice and video functionalities but also for complete customization of video elements using JavaScript/TypeScript.
Whether you are a beginner eager to explore real-time video functionalities, or an advanced developer looking to create highly customized video interfaces, this guide is tailored for you. The Agora Web SDK offers convenient APIs that can automatically embed a video element within a specified <div/>, but there are scenarios where developers need a bit more control over the video tracks.
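For context, that convenience path is a one-liner: the SDK creates and manages its own <video/> element inside a container you specify. A quick sketch, where the 'video-container' id is a placeholder for any element in your page:

// The SDK injects and manages its own <video/> element inside this container.
videoTrack.play('video-container');

This is great for getting started quickly, but the SDK owns the element, which is exactly the control we want to take back in this guide.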
In this guide we’ll dive into how to manipulate audio and video tracks created by the Agora Web SDK, customize video player controls, and integrate interactive elements to enhance user engagement.
Before we dive into the code, ensure you have a project set up with an HTML file linked to a JavaScript file. This guide focuses solely on integrating the Agora Web SDK and setting up the custom video element.
If you haven’t already, you need to add the Agora Web SDK to your project. Open your terminal, navigate to your project directory, and run the following command:
npm install agora-rtc-sdk-ng
The Agora Web SDK makes it straightforward to manage audio and video tracks separately, giving you the flexibility to create a custom video element tailored to your application’s needs. Below, I’ll walk you through the steps to set up and control a video stream.
To initialize the camera and microphone with Agora’s Web SDK, create an asynchronous function that calls .createMicrophoneAndCameraTracks(). Wrap the call in a try-catch block to handle any errors that might occur during initialization. Call this initialization function when your application is ready to start processing video; in this example, we’ll wait for the DOM to fully load.
span class="hljs-keyword">import AgoraRTC from 'agora-rtc-sdk-ng';
document.addEventListener('DOMContentLoaded', async () => {
try {
// Create audio and video tracks with specific configurations
const [audioTrack, videoTrack] = await AgoraRTC.createMicrophoneAndCameraTracks({
audioConfig: 'music_standard',
videoConfig: '360p_7'
});
// Proceed to creating and configuring the video element
setupVideoElement(audioTrack, videoTrack, true);
} catch (error) {
console.error('Failed to initialize media tracks:', error);
}
});
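The preset strings above ('music_standard', '360p_7') are convenient, but the encoderConfig fields also accept a full configuration object when you need precise control. A minimal sketch; the exact numbers are illustrative, not recommendations:

const [audioTrack, videoTrack] = await AgoraRTC.createMicrophoneAndCameraTracks(
  { encoderConfig: 'music_standard' },
  {
    encoderConfig: {
      width: 640,      // capture/encode width in pixels
      height: 360,     // capture/encode height in pixels
      frameRate: 15,   // frames per second
      bitrateMax: 400, // cap the encoder bitrate (Kbps)
    },
  }
);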
Once we have the tracks, we can set up our <video/> element. We can’t pass individual tracks to a <video/> element, so we’ll need a MediaStream object, which represents a stream of audio/video data. In our case, it acts as a wrapper around the camera track so we can hand it off to the <video/> element.
After initializing the camera track using the Agora SDK, we need to get the raw MediaStreamTrack and use it to create a new MediaStream. Then we connect the pipes by setting the MediaStream as the srcObject of the <video/> element.
function setupVideoElement(audioTrack, videoTrack, isRemote) {
  const videoFromStream = document.createElement('video');
  videoFromStream.id = isRemote ? 'remote-video-stream' : 'local-video-stream';
  videoFromStream.setAttribute('playsinline', 'true'); // avoid fullscreen on mobile browsers
  videoFromStream.setAttribute('webkit-playsinline', 'true'); // for playback in WebKit browsers

  // Set the source object of the video element to include our tracks.
  // Only remote users include audio: playing the local microphone back
  // through the speakers would cause an echo.
  const tracks = isRemote
    ? [audioTrack.getMediaStreamTrack(), videoTrack.getMediaStreamTrack()]
    : [videoTrack.getMediaStreamTrack()];
  videoFromStream.srcObject = new MediaStream(tracks);

  videoFromStream.controls = false; // we choose not to display controls
  videoFromStream.height = 300; // set the height in pixels
  videoFromStream.width = 500; // set the width in pixels

  // Automatically play the video once the browser reports it is ready
  videoFromStream.addEventListener('loadedmetadata', () => {
    videoFromStream.play();
  });

  // Append the video element to the document's body
  document.body.appendChild(videoFromStream);
}
Even though we set the MediaStream as the srcObject, we can’t immediately call play() on the <video/> element; we have to wait for the browser to let us know when it’s ready. Add a listener for the loadedmetadata event to start playback. Once the <video/> element is set up, we append it to the DOM.
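One caveat worth handling: in modern browsers, play() returns a promise, and autoplay policies can reject it, especially once an audio track is attached. A hedged variant of the listener above:

videoFromStream.addEventListener('loadedmetadata', async () => {
  try {
    await videoFromStream.play();
  } catch (err) {
    // Autoplay was blocked; wait for a user gesture before retrying playback.
    console.warn('Playback blocked until the user interacts with the page:', err);
  }
});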
This example creates a custom video element for the local user, so we only play the video track; you wouldn’t want to play the local audio, as it would cause an echo. That’s what the isRemote flag controls: for remote users, the audio track is added to the tracks array before the MediaStream is created. Keeping both tracks in a single stream matters, because if you play them separately the playback might not stay synchronized.
const tracks = [audioTrack.getMediaStreamTrack(), videoTrack.getMediaStreamTrack()];
videoFromStream.srcObject = new MediaStream(tracks);
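For remote users, those tracks come from the SDK’s subscribe flow. Here’s a minimal sketch, assuming you’ve already created a client and joined a channel; the mode and codec values are common defaults, not requirements:

const client = AgoraRTC.createClient({ mode: 'rtc', codec: 'vp8' });

client.on('user-published', async (user, mediaType) => {
  // Subscribe to each track the remote user publishes
  await client.subscribe(user, mediaType);
  // Once both remote tracks are available, reuse our custom element setup
  if (user.audioTrack && user.videoTrack) {
    setupVideoElement(user.audioTrack, user.videoTrack, true);
  }
});

Remote tracks expose the same getMediaStreamTrack() method as local ones, which is why setupVideoElement works unchanged for both cases.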
That’s it! Not so difficult, right? With this knowledge, you’re ready to connect your Agora tracks to <video/> elements, enabling you to offer enhanced interactive experiences to your users.
Here are a few next steps to keep you going on your journey with Agora: