
AI face recognition (JavaScript)

How to integrate AI facial expression detection to control your video – JavaScript

Learn how to add AI facial expression detection to your videos using JavaScript.

Yohann Martzolff

June 25, 2024

Today, with the rise of AI, new possibilities are emerging that allow us to take video control to the next level. One such innovative feature is AI facial expression detection. By integrating this technology into your videos, you can create a more engaging and responsive user experience.


In this blog, we will learn how to integrate AI facial expression detection into your videos using JavaScript.


We use JavaScript here because it can be easily integrated into virtually any web application. Additionally, its adaptability to different platforms and its ability to interact with HTML and CSS empower you to create rich, interactive web experiences that cater to diverse user needs.


To learn how to do the integration with React, head to this blog.

1. Build the application’s structure

Start by creating a simple index.html file
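A minimal version might look like this (the title is a placeholder; index.js is loaded with type="module" so it can use the ES imports added in the next steps):

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <title>AI facial expression detection</title>
  </head>
  <body>
    <!-- index.js is loaded as a module so it can use ES imports -->
    <script type="module" src="index.js"></script>
  </body>
</html>
```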


And an index.js file
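For now, a single line is enough to verify that the script is wired up:

```javascript
// index.js
alert("Hello World!");
```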


Start your application by running the npx http-server command and open the URL it serves (by default http://localhost:8080) to see your application running. A “Hello World!” alert should appear.


Note: npx ships with npm (version 5.2 and later). If it’s missing on your computer, you can install it by running npm i -g npx in a terminal.

2. Add the external JS scripts

Import the MediaPipe script by adding these lines at the top of index.js
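With MediaPipe’s Tasks Vision package, one common approach is an ES module import straight from a CDN (the CDN and version pin below are assumptions; check MediaPipe’s docs for the current release):

```javascript
// index.js — ES import, which is why index.html loads this file with type="module"
import {
  FaceLandmarker,
  FilesetResolver,
} from "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@0.10.3";
```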


Import the api.video-player-sdk script by adding the following
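One way to load it is a classic script tag in index.html (the unpkg URL is an assumption); the SDK then exposes a global PlayerSdk constructor:

```html
<!-- index.html, before the index.js module script -->
<script src="https://unpkg.com/@api.video/player-sdk"></script>
```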


Note: We’re using api.video’s player in this article, but feel free to use your favorite one and adapt the callbacks accordingly.

3. Add a video player

Add a <div /> tag to index.html. It will be targeted by the api.video player SDK to display the video you're going to control with your facial expressions.
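For example (the target id is a placeholder you can rename, as long as it matches the selector used in index.js):

```html
<div id="target"></div>
```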


Target the element in index.js with the following code.
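A sketch of the wiring (the #target selector matches the div’s id, and YOUR_VIDEO_ID is a placeholder for one of your api.video video ids):

```javascript
// Attach the api.video player to the <div id="target"> element
const player = new PlayerSdk("#target", {
  id: "YOUR_VIDEO_ID", // placeholder: use one of your own api.video video ids
});
```

PlayerSdk exposes play() and pause() methods, which the facial expression detection will call later on.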


Run your application with npx http-server and you should see a video that you can play, pause, and so on.

4. Record your facial expressions

To control the player with your facial expressions, you need to stream your face through your webcam and pass this streamed data to the trained machine-learning model.


Add a <video /> tag to index.html. Apply a display: "none" inline style to hide the element, and an autoplay attribute so the webcam stream’s current time keeps updating.
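Something like the following (the webcam id is a placeholder used to target the element from index.js):

```html
<video id="webcam" autoplay style="display: none"></video>
```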


Note: You can remove the display: "none" style and customize your CSS as you like; hiding the element is purely cosmetic.


5. Load the machine-learning model

Before we can use the machine-learning model, we must wait for it to finish loading. These models can be large, so downloading everything needed to run them can take a moment.


Add and run a new function in index.js
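A sketch of that function, following MediaPipe’s documented pattern (the CDN version and hosted model URL are assumptions; check MediaPipe’s docs for the current paths):

```javascript
let faceLandmarker;

const createFaceLandmarker = async () => {
  // Load the WASM backend files that the tasks-vision package needs
  const filesetResolver = await FilesetResolver.forVisionTasks(
    "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@0.10.3/wasm"
  );
  // Download the face landmarker model and instantiate the detector
  faceLandmarker = await FaceLandmarker.createFromOptions(filesetResolver, {
    baseOptions: {
      modelAssetPath:
        "https://storage.googleapis.com/mediapipe-models/face_landmarker/face_landmarker/float16/1/face_landmarker.task",
    },
    outputFaceBlendshapes: true, // expression scores, not only landmark positions
    runningMode: "VIDEO",
    numFaces: 1,
  });
};

createFaceLandmarker();
```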


6. Start detecting the webcam stream

Once the machine-learning model has been loaded and the faceLandmarker class has been instantiated, you can start detecting the stream.


Add the following code to index.js
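A sketch of the loop (the #webcam id is the placeholder used for the hidden video element; getUserMedia asks the user for camera access):

```javascript
const webcam = document.getElementById("webcam"); // the hidden <video> element

// Ask for webcam access and pipe the stream into the <video> element
navigator.mediaDevices.getUserMedia({ video: true }).then((stream) => {
  webcam.srcObject = stream;
  webcam.addEventListener("loadeddata", predict);
});

let lastVideoTime = -1;

function predict() {
  // Only run the model when the webcam has delivered a new frame
  if (faceLandmarker && webcam.currentTime !== lastVideoTime) {
    lastVideoTime = webcam.currentTime;
    const results = faceLandmarker.detectForVideo(webcam, performance.now());
    console.log(results.faceBlendshapes?.[0]?.categories);
  }
  requestAnimationFrame(predict);
}
```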


Check your application, and open your browser console. You should see a new console log appear on every webcam frame.


Each one of them logs an array of objects of type:
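Each entry describes one blendshape (an elementary facial movement); the values below are illustrative:

```javascript
// One entry of results.faceBlendshapes[0].categories (illustrative values)
const category = {
  index: 44, // position of this blendshape in the model's output list
  score: 0.93, // confidence between 0 and 1
  categoryName: "mouthSmileLeft", // machine-readable name
  displayName: "", // human-readable name, often empty
};
```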


7. Use the facial expression detection results

You now have access to a list of facial expression detection results. Use them however you like; in this case, to control your video player.


Let’s assume that you want to play the video when you smile, and pause it when you pucker your lips (a kiss face).


Check the facial expression detection results on each frame, and play or pause your video depending on the score of the expression you’re looking for, by replacing your console.log with the following.
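One way to keep this logic testable is a small pure helper (the 0.6 threshold is an arbitrary choice, and the blendshape names follow MediaPipe’s ARKit-style naming):

```javascript
// Decide what the player should do from the blendshape scores.
// Returns "play", "pause", or null when no expression is confident enough.
function detectExpression(categories, threshold = 0.6) {
  const score = (name) =>
    categories.find((c) => c.categoryName === name)?.score ?? 0;

  // Smiling with both mouth corners raised → play
  if (
    score("mouthSmileLeft") > threshold &&
    score("mouthSmileRight") > threshold
  ) {
    return "play";
  }
  // Puckered lips (kiss face) → pause
  if (score("mouthPucker") > threshold) {
    return "pause";
  }
  return null;
}
```

In the detection loop, the console.log then becomes something like `const action = detectExpression(results.faceBlendshapes?.[0]?.categories ?? []); if (action === "play") player.play(); else if (action === "pause") player.pause();`.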


Check your application, and try to smile at your webcam. It should trigger the player’s play callback and play the video. Then, try to do a kiss (😗) in front of your webcam, and the video should pause.

8. Display the results

It can be helpful to have a visual representation of the detection results.


Display these results as a list and highlight the detected facial expressions by adding a <ul /> tag to index.html
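For example (the results id is a placeholder):

```html
<ul id="results"></ul>
```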


And the code below to index.js
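A sketch of the rendering (the #results id and the 0.6 cutoff are assumptions; call renderResults from the detection loop with the current categories array):

```javascript
const resultsList = document.getElementById("results");

// Rebuild the list on every frame and highlight confident detections
function renderResults(categories) {
  resultsList.innerHTML = "";
  for (const category of categories) {
    const li = document.createElement("li");
    li.textContent =
      `${category.categoryName}: ${(category.score * 100).toFixed(1)}%`;
    if (category.score > 0.6) {
      li.style.color = "green"; // more than 60% confidence
    }
    resultsList.appendChild(li);
  }
}
```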


Now, you should see a list of results displayed in your application, with a green highlight on the ones with more than 60% confidence.

9. Code samples

Check the full index.html and index.js code below
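As a reminder of the overall structure, index.html ends up looking roughly like this (the CDN URL and element ids are the assumptions used throughout; index.js combines the imports, model loading, detection loop, player control, and rendering from steps 2–8):

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <title>AI facial expression detection</title>
  </head>
  <body>
    <!-- Hidden webcam feed used as input for the model -->
    <video id="webcam" autoplay style="display: none"></video>
    <!-- api.video player target -->
    <div id="target"></div>
    <!-- Detection results list -->
    <ul id="results"></ul>
    <script src="https://unpkg.com/@api.video/player-sdk"></script>
    <script type="module" src="index.js"></script>
  </body>
</html>
```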





We hope this blog helped you understand the coding behind integrating AI facial expression detection into your videos.


For any questions or doubts you may have, chat with us on our website. If you would like to try out api.video for your videos, sign up for a free sandbox account here.
