Video Source APIs - iOS 4.x

This page is for reference only. We are no longer onboarding new customers to Programmable Video. Existing customers can continue to use the product until December 5, 2024.

We recommend migrating your application to the API provided by our preferred video partner, Zoom. We've prepared this migration guide to assist you in minimizing any service disruption.

In this guide, we will show you how to use the VideoSource APIs to share video in a Room. These APIs allow you to choose the built-in camera(s), or any other source of content that is available to your application (or extension).


The VideoSource APIs describe producers and consumers of video content. A TVIVideoSource produces content for a TVILocalVideoTrack. Sources have the following properties.

  • VideoSources produce VideoFrames, and deliver them to VideoSinks.
  • VideoSources receive format requests, and deliver requests to VideoSinks.
  • The recommended maximum frame size is 1920x1080.
  • The recommended maximum frame rate is 30 frames per second.
  • The recommended pixel format is NV12.
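
As a sketch, these recommendations could be expressed as a VideoFormat. This assumes the SDK's Swift names for VideoFormat and PixelFormat; NV12 corresponds to the bi-planar YUV 4:2:0 pixel format.

```swift
import CoreMedia
import TwilioVideo

// A VideoFormat matching the recommended maximums above.
let recommendedFormat = VideoFormat()
recommendedFormat.dimensions = CMVideoDimensions(width: 1920, height: 1080)
recommendedFormat.frameRate = 30
// NV12: bi-planar YUV 4:2:0.
recommendedFormat.pixelFormat = .formatYUV420BiPlanarFullRange
```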

A TVIVideoSink consumes content from a TVIVideoSource. Sinks have the following properties.

  • VideoSinks handle format requests from VideoSources.
  • VideoSinks consume frames from VideoSources.

In the next section we will show you how to use the CameraSource API.

Using the CameraSource API

A TVICameraSource is a TVIVideoSource that produces content from the built-in cameras. This is probably the first kind of video that you want to share, so it is a good place to begin.

Create a CameraSource and a LocalVideoTrack

First we want to create a TVICameraSource, and use that source to create a TVILocalVideoTrack.

guard let cameraSource = CameraSource() else {
    // Unable to initialize a camera source.
    return
}
let videoTrack = LocalVideoTrack(source: cameraSource)

Capture from a Device

Now that we've set up our Track and Source, it's time to start producing frames from one of the built-in cameras. Let's use a TVICameraSource utility method to help us discover a front-facing AVCaptureDevice.

guard let frontCamera = CameraSource.captureDevice(position: .front) else {
    // The device does not have a front camera.
    return
}

// Start capturing with the device that we discovered.
cameraSource.startCapture(device: frontCamera)

In this example, TVICameraSource automatically determines the best format to capture in. Typically, 640x480 at 30 frames per second is used as the default value.

Connect to a Room with a LocalVideoTrack

Next, we want to connect to a Room with the TVILocalVideoTrack we created earlier.

let connectOptions = ConnectOptions(token: accessToken) { (builder) in
    builder.roomName = "my-room"
    if let localVideoTrack = self.localVideoTrack {
        builder.videoTracks = [localVideoTrack]
    }
}
room = TwilioVideoSDK.connect(options: connectOptions, delegate: self)

Select a new Device

While you can select a single device at start time, TVICameraSource also supports switching devices while it is already running. For example, you could switch from a front facing device to a rear facing device.

guard let rearCamera = CameraSource.captureDevice(position: .back) else {
    // The device does not have a rear camera.
    return
}

// Switch to the rear facing device while capture is running.
cameraSource.selectCaptureDevice(rearCamera)

Unpublishing Video and Stopping Capture

At some point after connecting to a Room, you might decide that you want to stop sharing video from the camera. Start with unpublishing the Track.

// Unpublish the Track. We will no longer be sharing video in the Room.
if let participant = room?.localParticipant,
    let videoTrack = self.localVideoTrack {
    participant.unpublishVideoTrack(videoTrack)
}

Finally, we will stop the source and destroy the objects.

// Stop capturing from the device.
cameraSource.stopCapture { (error) in
    if let theError = error {
        print("Error stopping capture:", theError as Any)
    }
    self.cameraSource = nil
    self.localVideoTrack = nil
}

Selecting a Device Format

An AVCaptureDevice can produce video in many possible formats. TVICameraSource offers utility methods to discover formats that are suitable for video streaming. Consider executing the following code on your iOS device:

// Assume that we discovered "frontDevice" earlier.

let formats = CameraSource.supportedFormats(captureDevice: frontDevice)

When this code is run on an iPhone X with iOS 12.4, the following formats are returned.

Dimensions Frame Rate Pixel Format
192 x 144 30 420f
352 x 288 30 420f
480 x 360 30 420f
640 x 480 30 420f
960 x 540 30 420f
1280 x 720 30 420f
1920 x 1080 30 420f
1920 x 1440 30 420f
3088 x 2320 30 420f

Once you've determined which format you would like to use, you can provide it when starting capture.

// Formats are ordered by increasing dimensions. Start with the smallest size.
cameraSource.startCapture(device: frontDevice,
                          format: formats.firstObject as! VideoFormat,
                          completion: nil)

In some applications, it may be important to change formats at runtime with as little disruption to the camera feed as possible.

// Select another format for the front facing camera.
cameraSource.selectCaptureDevice(frontDevice,
                                 format: formats.lastObject as! VideoFormat,
                                 completion: nil)

Making a Format Request

Device formats afford quite a lot of flexibility, but there are some cases that AVCaptureDevice does not support out of the box. For example, what if you wanted to:

  1. Produce square video.
  2. Produce video that fills a portrait iPhone X / XR / XS screen.

These are both cases where you want to publish video in a different aspect ratio or size than AVCaptureDevice can produce. That is okay, because format requests are here to help with this problem.

let frontDevice = CameraSource.captureDevice(position: .front)!
let formats = CameraSource.supportedFormats(captureDevice: frontDevice)

// We match 640x480 directly, since it is known to be supported by all devices.
var preferredFormat: VideoFormat?
for format in formats {
    let theFormat = format as! VideoFormat
    if theFormat.dimensions.width == 640,
        theFormat.dimensions.height == 480 {
        preferredFormat = theFormat
        break
    }
}

guard let captureFormat = preferredFormat else {
    // The preferred format could not be found.
    return
}

// Request cropping to 480x480.
let croppingRequest = VideoFormat()
let dimension = captureFormat.dimensions.height
croppingRequest.dimensions = CMVideoDimensions(width: dimension,
                                               height: dimension)

cameraSource.requestOutputFormat(croppingRequest)
cameraSource.startCapture(device: frontDevice,
                          format: captureFormat,
                          completion: nil)

The following diagram shows the effect of a format request on frames produced by TVICameraSource.

Take a look at the iOS QuickStart Example to learn more about using TVICameraSource.

Tracking Orientation Changes

The TVICameraSource provides flexibility in how it tracks video orientation for capture and preview. By default, the TVICameraSource monitors -[UIApplication statusBarOrientation] for orientation changes. With the addition of the UIWindowScene APIs in iOS 13, TVICameraSource now has a property, TVICameraSourceOptions.orientationTracker, which allows you to specify how the TVICameraSource should track orientation changes.

The orientationTracker property accepts an object that implements the TVICameraSourceOrientationTracker protocol. A default implementation, TVIUserInterfaceTracker, is provided with the SDK. TVIUserInterfaceTracker monitors for changes in UIInterfaceOrientation at the application or scene level. For example, if you wish to track orientation changes based on a scene, you would provide the scene to track when creating the TVICameraSourceOptions.

// Track the orientation of the key window's scene.
let options = CameraSourceOptions { (builder) in
    if let keyScene = UIApplication.shared.keyWindow?.windowScene {
        builder.orientationTracker = UserInterfaceTracker(scene: keyScene)
    }
}
let camera = CameraSource(options: options, delegate: self)

You will also need to forward UIWindowScene events from your UIWindowSceneDelegate to keep TVIUserInterfaceTracker up to date as the scene changes.

// Forward UIWindowScene events
func windowScene(_ windowScene: UIWindowScene,
                 didUpdate previousCoordinateSpace: UICoordinateSpace,
                 interfaceOrientation previousInterfaceOrientation: UIInterfaceOrientation,
                 traitCollection previousTraitCollection: UITraitCollection) {
    UserInterfaceTracker.sceneInterfaceOrientationDidChange(windowScene)
}

You can also manually control how orientation is tracked. For example, you might decide to use UIDevice instead of UIScene to determine the orientation of the camera. To do this, you would create your own implementation of TVICameraSourceOrientationTracker which invokes the - (void)trackerOrientationDidChange:(AVCaptureVideoOrientation)orientation callback method when the device's orientation changes.
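
Such a device-based tracker might be sketched as follows. The protocol member names here (a weak delegate and a current orientation) are assumptions based on the description above, not confirmed SDK signatures; adapt them to the TVICameraSourceOrientationTracker protocol as declared in the SDK headers.

```swift
import AVFoundation
import TwilioVideo
import UIKit

// A sketch of an orientation tracker driven by UIDevice rather than UIScene.
class DeviceOrientationTracker: NSObject, CameraSourceOrientationTracker {
    weak var delegate: CameraSourceOrientationDelegate?

    // Map the device orientation to a capture orientation. Note that
    // landscape device and video orientations are mirrored.
    var orientation: AVCaptureVideoOrientation {
        switch UIDevice.current.orientation {
        case .landscapeLeft: return .landscapeRight
        case .landscapeRight: return .landscapeLeft
        case .portraitUpsideDown: return .portraitUpsideDown
        default: return .portrait
        }
    }

    override init() {
        super.init()
        UIDevice.current.beginGeneratingDeviceOrientationNotifications()
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(orientationChanged),
                                               name: UIDevice.orientationDidChangeNotification,
                                               object: nil)
    }

    @objc private func orientationChanged() {
        // Invoke the callback so the CameraSource can rotate its output.
        delegate?.trackerOrientationDidChange(orientation)
    }
}
```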

Writing a VideoSource

VideoSources are real-time producers of content. Importantly, in order to optimize for low latency, delivery of individual frames is not guaranteed. The video pipeline continuously monitors network and device conditions and may respond by:

  • Reducing the number of bits allocated to the encoder.
  • Downscaling the video to a smaller size.
  • Dropping video frames at input.
  • Cropping (minimal, to ensure pixel alignment).
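
A custom source might be sketched as follows. This example forwards CMSampleBuffers (for example, from a ReplayKit extension) into the pipeline; the VideoSource and VideoSink member names follow the protocol descriptions earlier in this guide, and the `deliver(sampleBuffer:)` helper is a hypothetical name of our own.

```swift
import CoreMedia
import TwilioVideo

// A minimal sketch of a custom VideoSource.
class SampleBufferSource: NSObject, VideoSource {
    weak var sink: VideoSink?

    // Screen content: ask the pipeline to preserve detail over frame rate.
    var isScreencast: Bool { return true }

    func requestOutputFormat(_ outputFormat: VideoFormat) {
        // Forward the request so the pipeline can crop or scale our frames.
        sink?.onVideoFormatRequest(outputFormat)
    }

    // Hypothetical entry point: call this with each captured buffer.
    func deliver(sampleBuffer: CMSampleBuffer) {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
              let frame = VideoFrame(timestamp: CMSampleBufferGetPresentationTimeStamp(sampleBuffer),
                                     buffer: imageBuffer,
                                     orientation: .up) else {
            return
        }
        sink?.onVideoFrame(frame)
    }
}
```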

Sample Code

If you would like to implement your own VideoSource, or learn about advanced usage of TVICameraSource, then an excellent place to begin is with our sample code.
