Configuring Audio, Video Input and Output devices - iOS
This page is for reference only. We are no longer onboarding new customers to Programmable Video. Existing customers can continue to use the product until December 5, 2024.
We recommend migrating your application to the API provided by our preferred video partner, Zoom. We've prepared this migration guide to assist you in minimizing any service disruption.
In this guide, we'll show you how to configure audio and video input and output devices in your Twilio Video application. Controlling input and output devices lets you build a better end-user experience.
Selecting a specific Video Input
The CameraSource class captures frames from an AVCaptureDevice and provides them to a LocalVideoTrack.
Add selected video source to local track
// The default initializer will succeed if the front camera is available.
if let camera = CameraSource(delegate: self),
   let videoTrack = LocalVideoTrack(source: camera) {

    // VideoView is a VideoRenderer and can be added to any VideoTrack.
    let renderer = VideoView(frame: view.bounds)

    // Add the renderer to the video track.
    videoTrack.addRenderer(renderer)

    self.localVideoTrack = videoTrack
    self.camera = camera
    self.view.addSubview(renderer)
} else {
    print("Couldn't create CameraSource or LocalVideoTrack")
}
// Select between the front and back camera.
func flipCamera() {
    var newDevice: AVCaptureDevice?

    if let camera = self.camera, let captureDevice = camera.device {
        if captureDevice.position == .front {
            newDevice = CameraSource.captureDevice(position: .back)
        } else {
            newDevice = CameraSource.captureDevice(position: .front)
        }

        if let newDevice = newDevice {
            camera.selectCaptureDevice(newDevice) { (captureDevice, videoFormat, error) in
                if let error = error {
                    print("Error selecting capture device.\ncode = \((error as NSError).code) error = \(error.localizedDescription)")
                }
            }
        }
    }
}
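If your app needs to pick from more than the front and back wide-angle cameras, you can enumerate the available devices with AVFoundation's discovery session before handing one to `selectCaptureDevice(_:completion:)`. The sketch below uses only Apple's AVCaptureDevice API; the specific device types listed are examples and should be adjusted to what your app supports.

```swift
import AVFoundation

// Enumerate the built-in cameras so a specific AVCaptureDevice can be
// passed to CameraSource.selectCaptureDevice(_:completion:).
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera, .builtInTelephotoCamera],
    mediaType: .video,
    position: .unspecified)

for device in discovery.devices {
    print("\(device.localizedName), position: \(device.position.rawValue)")
}
```

On devices with multiple rear cameras this lists each one separately, so you can select, for example, the telephoto camera rather than toggling only between `.front` and `.back`.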
Optionally set VideoEncodingMode in ConnectOptions
VideoEncodingMode represents the modes of behavior of the videoEncodingMode property in ConnectOptions. The videoEncodingMode API is mutually exclusive with the existing codec management APIs EncodingParameters.maxVideoBitrate and preferredVideoCodecs. The default value is unset.
auto - In this mode, the SDK selects the video codecs and manages encodings automatically.
let connectOptions = ConnectOptions(token: accessToken) { (builder) in
    builder.roomName = "my-room"
    builder.videoEncodingMode = .auto
}

var room = TwilioVideoSDK.connect(options: connectOptions, delegate: self)
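For comparison, here is a sketch of the manual codec-management path that videoEncodingMode is mutually exclusive with. The Vp8Codec and EncodingParameters types come from the TwilioVideo SDK; the bitrate value is illustrative. Do not set these properties together with builder.videoEncodingMode.

```swift
// Manual codec management: prefer VP8 and cap the outgoing video bitrate.
// These builder properties must NOT be combined with builder.videoEncodingMode.
let connectOptions = ConnectOptions(token: accessToken) { (builder) in
    builder.roomName = "my-room"
    builder.preferredVideoCodecs = [Vp8Codec()]
    // Bitrates are in kilobits per second; 0 leaves a bitrate unconstrained.
    builder.encodingParameters = EncodingParameters(audioBitrate: 0, videoBitrate: 1200)
}
```

Choose one approach or the other: either let the SDK manage encodings with `.auto`, or take full control with preferred codecs and encoding parameters.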
Selecting specific Audio routes
Typically, the audio input and output route is chosen by the end user in Control Center. By default, TwilioVideo will manage the application's AVAudioSession and configure it for video conferencing use cases. If you wish to modify audio behavior, including the session configuration, you can create your own DefaultAudioDevice and provide it as an option before connecting to a Room.
// Override the device before creating any Rooms or Tracks.
self.audioDevice = DefaultAudioDevice()
TwilioVideoSDK.audioDevice = self.audioDevice

let localAudioTrack = LocalAudioTrack()

let connectOptions = ConnectOptions(token: accessToken) { (builder) in
    builder.roomName = "my-room"

    if let audioTrack = localAudioTrack {
        builder.audioTracks = [audioTrack]
    }
}

var room = TwilioVideoSDK.connect(options: connectOptions, delegate: self)
To configure the AVAudioSession, DefaultAudioDevice executes the kDefaultAVAudioSessionConfigurationBlock by default. You can alter the AVAudioSession configuration for your application by providing a block to the DefaultAudioDevice. For example, the input and output audio routes can be overridden by changing the AVAudioSession configuration in the block that you provide to DefaultAudioDevice.
The following example demonstrates how to configure the AVAudioSession for voice chat scenarios. This prefers the device's receiver and bottom microphone, like a voice call in the Phone app.
// Change the audio route after connecting to a Room.
self.audioDevice.block = {
    do {
        // Apply the SDK's default configuration first.
        DefaultAudioDevice.DefaultAVAudioSessionConfigurationBlock()

        let audioSession = AVAudioSession.sharedInstance()
        try audioSession.setMode(.voiceChat)
    } catch let error as NSError {
        print("Fail: \(error.localizedDescription)")
    }
}
self.audioDevice.block()
To change microphone orientations, you must use your own AVAudioEngineDevice audio device that uses the kAudioUnitSubType_RemoteIO componentSubType of the AudioComponentDescription.
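As an illustration of the RemoteIO requirement above, this is a minimal sketch of the AudioComponentDescription such a custom audio device would use when locating its I/O audio unit. The constants come from Apple's AudioToolbox framework; the variable names are our own.

```swift
import AudioToolbox

// Describe the RemoteIO audio unit that a custom AVAudioEngineDevice-style
// device instantiates for microphone input and speaker output on iOS.
var ioUnitDescription = AudioComponentDescription(
    componentType: kAudioUnitType_Output,
    componentSubType: kAudioUnitSubType_RemoteIO,  // required for mic orientation control
    componentManufacturer: kAudioUnitManufacturer_Apple,
    componentFlags: 0,
    componentFlagsMask: 0)

// Locate the matching audio component before creating an audio unit instance.
let ioComponent = AudioComponentFindNext(nil, &ioUnitDescription)
```

With the RemoteIO unit in place, the session-level data source APIs shown below can steer capture to a specific built-in microphone.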
The next Objective-C example demonstrates how to select the back microphone. This might be useful if your subject is behind the phone, and being captured with the back camera.
// Change this to `AVAudioSessionOrientationFront` if you wish to use the front microphone.
NSString *microphone = AVAudioSessionOrientationBack;

typeof(self) __weak weakSelf = self;
self.audioDevice.block = ^{
    kTVIDefaultAVAudioSessionConfigurationBlock();
    [weakSelf setMicrophoneInUse:microphone];
};
self.audioDevice.block();
- (void)setMicrophoneInUse:(NSString *)nextDataSource {
    NSError *theError = nil;
    BOOL result = YES;

    AVAudioSession *session = [AVAudioSession sharedInstance];
    result = [session setActive:YES error:&theError];

    // Get the set of available inputs. If there are no audio accessories attached,
    // there will be only one available input -- the built-in microphone.
    NSArray *inputs = [session currentRoute].inputs;

    // Locate the Port corresponding to the built-in microphone.
    AVAudioSessionPortDescription *builtInMicPort = nil;
    for (AVAudioSessionPortDescription *port in inputs) {
        if ([port.portType isEqualToString:AVAudioSessionPortBuiltInMic]) {
            builtInMicPort = port;
            break;
        }
    }

    if ([builtInMicPort.preferredDataSource.orientation isEqualToString:nextDataSource]) {
        return;
    }

    // Loop over the built-in mic's data sources and attempt to locate the specified microphone.
    AVAudioSessionDataSourceDescription *theDataSource = nil;
    for (AVAudioSessionDataSourceDescription *source in builtInMicPort.dataSources) {
        if ([source.orientation isEqualToString:nextDataSource]) {
            theDataSource = source;
            break;
        }
    }

    if (theDataSource) {
        theError = nil;
        if ([theDataSource.orientation isEqualToString:AVAudioSessionOrientationBack]) {
            result = [theDataSource setPreferredPolarPattern:AVAudioSessionPolarPatternSubcardioid error:&theError];
            if (!result) {
                NSLog(@"setPreferredPolarPattern failed for AVAudioSessionPolarPatternSubcardioid");
            }
        } else if ([theDataSource.orientation isEqualToString:AVAudioSessionOrientationFront]) {
            result = [theDataSource setPreferredPolarPattern:AVAudioSessionPolarPatternCardioid error:&theError];
            if (!result) {
                NSLog(@"setPreferredPolarPattern failed for AVAudioSessionPolarPatternCardioid");
            }
        }

        // Set a preference for the chosen data source.
        theError = nil;
        result = [builtInMicPort setPreferredDataSource:theDataSource error:&theError];
        if (!result) {
            // An error occurred. Handle it!
            NSLog(@"setPreferredDataSource failed");
        }
    }

    // Make sure the built-in mic is selected for input. This will be a no-op if the
    // built-in mic is already the current input Port.
    theError = nil;
    result = [session setPreferredInput:builtInMicPort error:&theError];
    if (!result) {
        // An error occurred. Handle it!
        NSLog(@"setPreferredInput failed");
    }
}
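If you later want to abandon a custom route and return to the SDK's defaults, you can restore the default configuration block. A minimal Swift sketch, assuming the same self.audioDevice property used in the earlier examples:

```swift
// Restore the SDK's default AVAudioSession configuration for video
// conferencing, then re-apply it immediately.
self.audioDevice.block = DefaultAudioDevice.DefaultAVAudioSessionConfigurationBlock
self.audioDevice.block()
```

This mirrors the pattern above: the block is both stored (so the SDK can re-run it when the session is interrupted) and invoked once to take effect right away.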