Video Insights

Video Insights is in Public Beta. Learn more about Twilio's beta product support here.

Video Insights brings self-service tooling to the Twilio Console to provide analytics and aggregations for observing your application, discovering trends, and troubleshooting rooms and participants.

Some of the key capabilities are as follows:

  • Dashboard - Usage and quality metrics across all your rooms and participants.
  • Detected issues - Rooms and participants are tagged based on quality metrics observed during the call.
  • Quality metric graphs - Per-interval quality metric graphs for all participants.

Video Insights is free for Group, Peer-to-Peer, and WebRTC Go Room developers and is available for any applications built with Twilio Video’s JavaScript, iOS, and/or Android SDKs.

Visit the Video Insights Dashboard in the Twilio Console to get started.


The Video Insights Dashboard provides a bird’s-eye view of usage and quality metrics across all the rooms and participants associated with a given account SID. The dashboard provides aggregations that help teams move from reacting to end-user complaints to observing quality and troubleshooting proactively. It also allows you to segment your participants by device and software characteristics such as browser or operating system, track week-over-week growth, and better understand day-of-week or time-of-day usage.

Video Insights Dashboard Overview

All insights are tied to the account_sid that is associated with your Video Rooms. If you have subaccounts, each subaccount will have a unique Insights Dashboard.

Rooms Graph

The first graph provides the number of rooms that your account has created over time, with a distribution of rooms tagged as potentially having degraded quality. Please refer to the Detected Issues section for the current list of issues Twilio can detect and the associated thresholds used.

You can use this graph to monitor the distribution of rooms being tagged and then quickly dive into the rooms with issues. See below:

Video Insights Room Graph

Participants Graph

The participant graph allows you to segment your participants by the characteristics of their device and software setup, and filter for those tagged with issues to discover trends that you can act on.

Video Insights Participant Graph

You can segment the area graph by:

  • Device Manufacturer - The device manufacturers used by the participants in the time filter.
  • Operating System - The operating systems used by the participants in the time filter.
  • Browser - The browsers used by the participants in the time filter.
  • Twilio SDK - The Twilio SDKs used by the participants in the time filter.

Twilio parses the user agent to determine the Device Manufacturer, OS, and Browser. “Unknown” for any of these values means the available user agent did not contain that information.

You can filter the graph by:

  • Participants with issues - Participants tagged with one or more issues. See Detected Issues for the current list and the associated thresholds.
  • Participants without issues - Participants not tagged with any Detected Issues.

Minutes Graph

The minutes graph provides the number of participant and recording minutes over time, along with historical data from the previous period, to observe week-over-week growth and better understand day-of-week or time-of-day usage.

The minutes provided are aggregated from participant durations measured in seconds, and thus are not a 1:1 match with the minutes you will be billed for. Learn more about how billing works.
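To illustrate why the two can diverge, here is a minimal sketch assuming billing rounds each participant’s duration up to the next whole minute (the rounding rule here is an assumption; check the billing documentation for the actual rules):

```javascript
// Insights aggregates raw participant durations in seconds, while
// billing (assumed here to round each participant up to the next
// whole minute) counts whole participant-minutes.
function insightsMinutes(durationsSeconds) {
  return durationsSeconds.reduce((sum, s) => sum + s, 0) / 60;
}

function billedMinutesIfRoundedUp(durationsSeconds) {
  return durationsSeconds.reduce((sum, s) => sum + Math.ceil(s / 60), 0);
}

// Two participants: 90 s and 30 s in the room.
insightsMinutes([90, 30]);          // 2 minutes of aggregated time
billedMinutesIfRoundedUp([90, 30]); // 3 participant-minutes
```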



Rooms

The Rooms view allows you to find your Room(s) of interest from the past seven days for Group and P2P Rooms, and from the past two days for WebRTC Go Rooms. You can search by SID or unique name, or filter by date and time, room type, and/or rooms that have Detected Issues.

Insights Rooms view

Room Summary

The Room Summary provides metadata about the Room, surfaces any Detected Issues, and lists the room’s participants.

Insights Room Summary view

Participant Summary

The Participant Summary provides details to help you troubleshoot issues and assess the media quality for individual users. For each participant, you are shown relevant characteristics (e.g., OS, browser, SDK version) as well as per-interval quality metrics (bitrate, packet loss, round trip time) for the duration of their time in the Room.

Video Insights Participant Summary

Quality Metrics Graphs

Quality metrics are provided on a per-room, per-participant basis and are displayed in 10-second intervals. On the send (publishing) side, metrics for bitrate, packet loss, and round trip time (RTT) are provided on a per-track basis to assess the quality of the participant’s outgoing audio and video. On the receive (subscribing) side, the total inbound bitrate for the connection is provided to help you diagnose when a participant has low downlink bandwidth.

A few things to note:

  • Quality metrics are sourced client-side from the WebRTC getStats() function and then processed
  • You may see multiple video tracks for a given participant. This will happen if
    • you implement the behavior of unpublishing / publishing tracks when an end-user turns their camera off, or
    • an end-user shares their screen
  • If you are using simulcast, the layers are merged to provide one metric per track. Please refer to the sections below for how these are calculated.
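As a rough sketch of where these per-interval numbers come from, the snippet below computes a track’s send bitrate from two stats snapshots taken one interval apart. The snapshot objects are simplified stand-ins for WebRTC outbound-rtp stats entries; Insights’ actual processing of the client-reported stats happens on Twilio’s side.

```javascript
// Compute a track's send bitrate (in kbps) from two stats snapshots
// taken `intervalSeconds` apart. `bytesSent` mirrors the field of the
// same name on WebRTC outbound-rtp stats; the objects here are mocks.
function sendBitrateKbps(prev, curr, intervalSeconds) {
  const deltaBits = (curr.bytesSent - prev.bytesSent) * 8;
  return deltaBits / intervalSeconds / 1000;
}

// 125,000 bytes sent over a 10-second interval -> 100 kbps
sendBitrateKbps({ bytesSent: 0 }, { bytesSent: 125000 }, 10);
```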
Send Tab (outgoing metrics)


Bitrate

Bitrate refers to the rate at which data (measured in bits) is transferred between two endpoints in a given period of time. Low uplink bandwidth can lead to video freezing, video downscaling, frame-rate drops, and choppy audio. In addition, exceeding the available bandwidth of a receiver can overwhelm their network, degrade media quality, and potentially lead to dropped connections.

As mentioned above, bitrate is provided on a per track basis on the send side. Please refer to the Minimum Bandwidth Requirements documentation to better understand the bitrates required for the associated track characteristics.

If you have simulcast enabled, the bitrate shown for video tracks is the sum of the bitrates of all the layers.
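For example, a hypothetical helper summing simulcast layer bitrates (the layer objects are illustrative, not an SDK type):

```javascript
// Sum the send bitrates of all simulcast layers of one video track.
function mergedBitrateKbps(layers) {
  return layers.reduce((sum, layer) => sum + layer.bitrateKbps, 0);
}

// Three layers at 150, 600, and 1200 kbps -> 1950 kbps total
mergedBitrateKbps([
  { bitrateKbps: 150 },
  { bitrateKbps: 600 },
  { bitrateKbps: 1200 },
]);
```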

Packet Loss

Packet loss is the measurement of packets that are expected but never arrive. High packet loss can result in frequent video freezing, video frame rate drops, and choppy audio. Packet loss is usually caused by overloaded routers or high CPU load on the machine.

If you have simulcast enabled, packet loss for video tracks is calculated by summing the packets lost and packets sent across all layers and then applying the formula: (packets lost / (packets lost + packets sent)) * 100. This means it’s possible that the individual streams forwarded by the SFU to subscribers have varying levels of packet loss.
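The merging formula above can be sketched as follows (the layer objects are illustrative):

```javascript
// Merge simulcast layers into one packet loss percentage:
// (lost / (lost + sent)) * 100, with lost and sent summed over layers.
function mergedPacketLossPercent(layers) {
  const lost = layers.reduce((sum, l) => sum + l.packetsLost, 0);
  const sent = layers.reduce((sum, l) => sum + l.packetsSent, 0);
  return lost + sent === 0 ? 0 : (lost / (lost + sent)) * 100;
}

// 5 packets lost out of 200 total across two layers -> 2.5%
mergedPacketLossPercent([
  { packetsLost: 5, packetsSent: 95 },
  { packetsLost: 0, packetsSent: 100 },
]);
```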

Round trip time

Round trip time is the time a packet of data takes to travel from sender to receiver and back. High round trip time causes lag in media playback, occasional video freezing, and drops in video frame rate. It can lead to end-users talking over one another. High round trip time is usually attributed to slow or overloaded networks.

If you have simulcast enabled, the round trip time (RTT) displayed is the maximum round trip time of all the individual layers. This means it’s possible that certain streams forwarded by the SFU, depending on the receiving participant’s downlink bandwidth, will experience a lower round trip time than what is displayed.
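Likewise, a sketch of taking the maximum RTT across layers (layer objects are illustrative):

```javascript
// The displayed RTT for a simulcast video track is the worst
// (maximum) RTT among its individual layers.
function mergedRoundTripTimeMs(layers) {
  return Math.max(...layers.map((layer) => layer.roundTripTimeMs));
}

// Layers at 40, 120, and 85 ms -> 120 ms displayed
mergedRoundTripTimeMs([
  { roundTripTimeMs: 40 },
  { roundTripTimeMs: 120 },
  { roundTripTimeMs: 85 },
]);
```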

Receive Tab (incoming metrics)


Bitrate

The bitrate provided on the receive side is the total inbound bitrate received over the peer connection. A consistently low incoming bitrate can lead to video freezing, video downscaling, frame-rate drops, and choppy audio, regardless of the quality of the publisher’s media.

If you are using Group rooms, the Selective Forwarding Unit (SFU) acts as the peer connection, so this metric will represent the total bitrate received from all participants for all subscribed tracks. If you are using P2P rooms, you will see the inbound bitrate received for each peer connection.

The configured maxSubscriptionBitrate can impact this metric.
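For reference, in the JavaScript SDK the subscription bitrate cap is configured through the bandwidthProfile connect option. A sketch (the room name and cap value here are illustrative):

```javascript
// Connect options sketch for twilio-video.js: cap the total downlink
// bitrate for subscribed tracks at 2,400,000 bits per second.
const connectOptions = {
  name: 'my-room',
  bandwidthProfile: {
    video: {
      maxSubscriptionBitrate: 2400000, // bits per second
    },
  },
};

// Usage (requires a valid access token):
// const { connect } = require('twilio-video');
// const room = await connect(token, connectOptions);
```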


Participant Characteristics

The Participant Summary provides characteristics about each participant that may be helpful for diagnosing issues. Some useful characteristics to look for are as follows:

  • Operating System - The operating system and associated version used by the Participant.
  • Browser - The browser and associated version used by the Participant.
  • Device - The device manufacturer and model used by the Participant.
  • Twilio SDK - The Twilio SDK and associated version used by the Participant.
  • Tracks Published - The number of tracks published by the Participant.

Detected Issues

Video Insights processes metrics and events and surfaces any potential issues detected by Twilio. The issues detected are not exhaustive; they are meant to provide a mechanism for easily identifying common issues and addressing them more proactively. To start, Twilio is focusing on quality-related tagging of participants based on their published tracks (outgoing media). If any participant in a room is tagged with issues, the room is classified as having issues as well. The current list of detected issues is as follows:


Detected Issue | High Threshold | Issue Metadata
Participant Packet Loss | Cumulative packet loss >= 5% | Whether it was detected on the outgoing audio, video, or both
Participant Round Trip Time | Average round trip time > 300 ms | Whether it was detected on the outgoing audio, video, or both

A few things to note:

  • If packet loss or round trip time is detected on any of the outgoing tracks, then the participant is tagged as having an issue.
  • It is possible that participants’ metrics will exceed the thresholds for portions of their time in a video call but they will not be tagged with an issue. Tagging is based on the entire duration of the track to avoid over-tagging in cases where there are brief blips of poor quality but the overall experience was acceptable. You can visit the Participant Summaries to further analyze the per-interval metrics over the course of a participant’s time in the room.
  • Packet loss and round trip time are objective metrics that can be used to predict when an end-user may have had a degraded quality of experience. Quality of experience is subjective, so it’s possible that we detect an issue even though the end-user was happy with the experience (or vice versa).
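Put together, the tagging described above amounts to something like the following sketch. The field names are illustrative; Twilio computes these values over the full duration of each published track.

```javascript
// Tag a published track with issues using the documented thresholds:
// cumulative packet loss >= 5%, or average round trip time > 300 ms.
function detectIssues(track) {
  const issues = [];
  const total = track.packetsLost + track.packetsSent;
  const lossPercent = total === 0 ? 0 : (track.packetsLost / total) * 100;
  if (lossPercent >= 5) issues.push('Participant Packet Loss');
  if (track.averageRoundTripTimeMs > 300) {
    issues.push('Participant Round Trip Time');
  }
  return issues;
}

// 10% cumulative loss and 350 ms average RTT -> both issues detected
detectIssues({ packetsLost: 10, packetsSent: 90, averageRoundTripTimeMs: 350 });
```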

Detected Issues are used throughout the Video Insights product:

  • Providing the distribution of rooms with issues in the Dashboard
  • Providing the ability to filter by participants with issues in the Dashboard
  • Providing the ability to filter by rooms with issues in the Rooms view
  • Aggregating and surfacing any detected issues in the Room Summary

Data Retention Policy

The data retention policy for Video Insights is as follows:

Room Type | Insights Dashboard | Rooms and Participants
Group and P2P Rooms | 14 days* | 7 days
WebRTC Go Rooms | 2 days | 2 days

* Filtering is available up to 7 days in the past. Historical data from the previous week is also available, but not filterable.
