
Screen Capture

In this guide we'll show you how to share your iOS screen with other Participants connected to a Room using ReplayKit.framework. By using ReplayKit.framework, you have the ability to consume CMSampleBuffer objects that are produced by ReplayKit and convert them into VideoFrame objects that can be shared via a LocalVideoTrack.

There are two main ways that this can be accomplished depending on what screen content you wish to share. If you only want to share your application's screens, you can use the In-App Capture method. All of the code required is contained inside your application. If you want to share screens outside of your application, such as the Home Screen or other applications, you can use the Broadcast Extension method. Both methodologies are implemented in our ReplayKit Example application and should be used for reference.

VideoSource

No matter which method you choose, you will need to implement an object that conforms to the VideoSource protocol. This object is then provided when you create a LocalVideoTrack object. Your VideoSource object is responsible for consuming the CMSampleBuffer objects provided by ReplayKit and providing VideoFrame objects to its VideoSink.

The ReplayKitVideoSource class in our example application shows an implementation of this. Each time ReplayKit.framework produces a CMSampleBuffer object, it is passed to the processFrame() method. The ReplayKitVideoSource object handles the work of converting the frame into a VideoFrame object and passes it off to the VideoSink.

For example, creating a LocalVideoTrack to transmit video frames produced by ReplayKit using ReplayKitVideoSource looks like the following:

var screenTrack: LocalVideoTrack?
var videoSource: ReplayKitVideoSource?

...

videoSource = ReplayKitVideoSource(isScreencast: true,
                                   telecineOptions: ReplayKitVideoSource.TelecineOptions.disabled)

screenTrack = LocalVideoTrack(source: videoSource!,
                              enabled: true,
                              name: "Screen")

Once configured, you can share screenTrack as you would any other LocalVideoTrack in your Room. You can provide it when you join the Room via the videoTracks ConnectOptions property, or publish it later via your LocalParticipant object.
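As a sketch of both approaches, the snippet below assumes you already hold a valid access token in `accessToken` and that `screenTrack` was created as shown above; the variable names are illustrative, not part of the SDK.

```swift
import TwilioVideo

// Option 1: provide the track up front when connecting to the Room.
let connectOptions = ConnectOptions(token: accessToken) { builder in
    builder.videoTracks = [screenTrack!]
}
let room = TwilioVideoSDK.connect(options: connectOptions, delegate: self)

// Option 2: publish the track after you have already joined a Room.
room.localParticipant?.publishVideoTrack(screenTrack!)
```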

In-App Capture Method

When you only want to share the screens of your application, you can use the In-App Capture method. All of the ReplayKit.framework code required resides in your application. To capture your application's screens, you use the RPScreenRecorder class. To begin recording, call the startCapture() method. The startCapture method takes a block that handles the incoming sampleBuffer objects. In this block you can pass the ReplayKit-produced video frame off to your VideoSource consumer.

// Start recording the screen.
let recorder = RPScreenRecorder.shared()

// We are only using ReplayKit to capture the screen.
// Use a LocalAudioTrack to capture the microphone for sharing audio in the room.
recorder.isMicrophoneEnabled = false
// Use a LocalVideoTrack with a CameraSource to capture the camera for sharing camera video in the room.
recorder.isCameraEnabled = false

recorder.startCapture(handler: { (sampleBuffer, type, error) in
    if error != nil {
        print("Capture error: ", error as Any)
        return
    }

    switch type {
    case RPSampleBufferType.video:
        self.videoSource?.processFrame(sampleBuffer: sampleBuffer)
        break
    case RPSampleBufferType.audioApp:
        // This example does not capture audio generated by the application.
        break
    case RPSampleBufferType.audioMic:
        // We use `TVIDefaultAudioDevice` to capture and playback audio for conferencing.
        break
    }

}) { (error) in
    if error != nil {
        print("Screen capture error: ", error as Any)
    } else {
        print("Screen capture started.")
    }
}

When you are done sharing your screen, call the stopCapture() method of RPScreenRecorder and remove your LocalVideoTrack using the methods on your LocalParticipant object.
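A minimal sketch of the teardown, assuming `recorder`, `room`, and `screenTrack` are the objects created earlier in this guide:

```swift
// Stop consuming samples from ReplayKit.
recorder.stopCapture { error in
    if let error = error {
        print("Stop capture error: ", error)
    }
}

// Stop sharing the track with other Participants in the Room.
if let track = self.screenTrack {
    room.localParticipant?.unpublishVideoTrack(track)
}
```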

For further guidance on this approach, check out the ViewController class in the ReplayKitExample example application.

Broadcast Extension Method

With ReplayKit, it is possible to share not only your application's screens but also the screens of other applications on the device. This configuration requires that you create a ReplayKit broadcast extension that is included with your application. In your ReplayKit broadcast extension, you create a subclass of RPBroadcastSampleHandler. This class is responsible for handling the samples produced by ReplayKit. The SampleHandler class in the ReplayKitExample shows an example implementation.

To start the ReplayKit broadcast, your application must present the RPBroadcastActivityViewController which allows the user to select your broadcast extension as the broadcasting service. Once the user starts the broadcast, the broadcastStarted() method will be invoked. The broadcastStarted() method in the example SampleHandler class then creates a ReplayKitVideoSource and an associated LocalVideoTrack, and connects to a Room. Each time ReplayKit generates a video frame, the processSampleBuffer() method is invoked. As in the In-App Capture Method above, the sampleBuffer is provided to the ReplayKitVideoSource for handling.
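The flow above can be sketched as a minimal RPBroadcastSampleHandler subclass. The connection step is elided here; see the SampleHandler class in the ReplayKitExample for a complete implementation.

```swift
import ReplayKit
import TwilioVideo

class SampleHandler: RPBroadcastSampleHandler {

    var videoSource: ReplayKitVideoSource?
    var screenTrack: LocalVideoTrack?

    override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?) {
        // Create the source and track, then connect to a Room with your access token.
        videoSource = ReplayKitVideoSource(isScreencast: true,
                                           telecineOptions: .disabled)
        screenTrack = LocalVideoTrack(source: videoSource!,
                                      enabled: true,
                                      name: "Screen")
        // Connect to the Room here, providing screenTrack via ConnectOptions.
    }

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        // Only video samples are forwarded to the VideoSource in this example.
        if sampleBufferType == .video {
            videoSource?.processFrame(sampleBuffer: sampleBuffer)
        }
    }
}
```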

It is important to note that a ReplayKit broadcast extension must operate with limited memory. It is highly recommended that you use Group Rooms and set the isAutomaticSubscriptionEnabled ConnectOptions property to false when connecting to a Room within the extension.
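For reference, the recommended settings look like this when building ConnectOptions inside the extension; `accessToken` and the Room name are placeholders you would supply yourself:

```swift
import TwilioVideo

let connectOptions = ConnectOptions(token: accessToken) { builder in
    // Avoid subscribing to remote media, which would consume scarce
    // memory inside the broadcast extension.
    builder.isAutomaticSubscriptionEnabled = false
    builder.videoTracks = [screenTrack!]
    builder.roomName = "my-room" // Placeholder Room name.
}
let room = TwilioVideoSDK.connect(options: connectOptions, delegate: self)
```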

For further guidance on this approach, check out the ReplayKitExample example application.
