
Configuring Audio and Video Input and Output Devices - iOS 4.x


Warning

This page is for reference only. We are no longer onboarding new customers to Programmable Video. Existing customers can continue to use the product until December 5, 2024.

We recommend migrating your application to the API provided by our preferred video partner, Zoom. We've prepared this migration guide to assist you in minimizing any service disruption.

In this guide, we'll show you how to configure audio and video input and output devices in your Twilio Video application. Controlling input and output devices lets you build a better end-user experience.


Selecting a specific Video Input


The TVICameraSource class captures from an AVCaptureDevice, and provides frames to a TVILocalVideoTrack.

Add selected video source to local track


// The default initializer will succeed if the front camera is available.
if let camera = CameraSource(delegate: self),
   let videoTrack = LocalVideoTrack(source: camera) {

    // TVIVideoView is a TVIVideoRenderer and can be added to any TVIVideoTrack.
    let renderer = VideoView(frame: view.bounds)

    // Add the renderer to the video track.
    videoTrack.addRenderer(renderer)

    self.localVideoTrack = videoTrack
    self.camera = camera
    self.view.addSubview(renderer)
} else {
    print("Couldn't create CameraSource or LocalVideoTrack")
}

// Select between the front and back camera.
func flipCamera() {
    var newDevice: AVCaptureDevice?

    if let camera = self.camera, let captureDevice = camera.device {
        if captureDevice.position == .front {
            newDevice = CameraSource.captureDevice(position: .back)
        } else {
            newDevice = CameraSource.captureDevice(position: .front)
        }

        if let newDevice = newDevice {
            camera.selectCaptureDevice(newDevice) { (captureDevice, videoFormat, error) in
                if let error = error {
                    print("Error selecting capture device.\ncode = \((error as NSError).code) error = \(error.localizedDescription)")
                }
            }
        }
    }
}


Selecting specific Audio routes


Typically, the end user chooses the audio input and output route in Control Center. By default, TwilioVideo manages the application's AVAudioSession and configures it for video conferencing use cases. If you wish to modify audio behavior, including the session configuration, you can create your own TVIDefaultAudioDevice and provide it as an option before connecting to a Room.


// Override the device before creating any Rooms or Tracks.
self.audioDevice = DefaultAudioDevice()
TwilioVideoSDK.audioDevice = self.audioDevice

let localAudioTrack = LocalAudioTrack()
let connectOptions = ConnectOptions(token: accessToken) { (builder) in
    builder.roomName = "my-room"

    if let audioTrack = localAudioTrack {
        builder.audioTracks = [ audioTrack ]
    }
}
let room = TwilioVideoSDK.connect(options: connectOptions, delegate: self)

To configure the AVAudioSession, TVIDefaultAudioDevice executes kDefaultAVAudioSessionConfigurationBlock by default. You can alter the AVAudioSession configuration for your application by providing your own block to TVIDefaultAudioDevice. For example, the input and output audio routes can be overridden by altering the AVAudioSession configuration in the block that you provide.

The following example demonstrates how to configure AVAudioSession for voice chat scenarios. This configuration prefers the device's receiver and bottom microphone, like a voice call in the Phone app.


// Change the audio route after connecting to a Room.
self.audioDevice.block = {
    do {
        // Apply the SDK's default configuration first, then override the mode.
        DefaultAudioDevice.DefaultAVAudioSessionConfigurationBlock()

        let audioSession = AVAudioSession.sharedInstance()
        try audioSession.setMode(.voiceChat)
    } catch let error as NSError {
        print("Failed to set audio session mode: \(error.localizedDescription)")
    }
}
self.audioDevice.block()
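To return to the SDK's default video chat behavior later (for example, routing audio back to the speaker), one approach is to reassign the default configuration block. This is a sketch that assumes the same audioDevice property used above:

```swift
// Restore the default AVAudioSession configuration (video chat, speaker output).
self.audioDevice.block = DefaultAudioDevice.DefaultAVAudioSessionConfigurationBlock
self.audioDevice.block()
```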

To change microphone orientations, you must use your own audio device, such as AVAudioEngineDevice, whose AudioComponentDescription uses the kAudioUnitSubType_RemoteIO componentSubType.
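As a sketch of what that requirement looks like in practice, a custom audio device would locate its I/O audio unit with a description like the one below. AVAudioEngineDevice here refers to the example implementation mentioned above, not an SDK class:

```swift
import AudioToolbox

// Sketch: the AudioComponentDescription a custom audio device would use
// to find the RemoteIO audio unit for microphone input and speaker output.
var ioUnitDescription = AudioComponentDescription(
    componentType: kAudioUnitType_Output,
    componentSubType: kAudioUnitSubType_RemoteIO,
    componentManufacturer: kAudioUnitManufacturer_Apple,
    componentFlags: 0,
    componentFlagsMask: 0)

// Passing nil starts the search at the beginning of the component list.
let remoteIOComponent = AudioComponentFindNext(nil, &ioUnitDescription)
```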

The next Objective-C example demonstrates how to select the back microphone. This can be useful if your subject is behind the phone and is being captured with the back camera.


// Change this to `AVAudioSessionOrientationFront` if you wish to use the front microphone.
NSString *microphone = AVAudioSessionOrientationBack;

typeof(self) __weak weakSelf = self;
self.audioDevice.block = ^ {
    kTVIDefaultAVAudioSessionConfigurationBlock();
    [weakSelf setMicrophoneInUse:microphone];
};
self.audioDevice.block();

- (void)setMicrophoneInUse:(NSString *)nextDataSource {
    NSError *theError = nil;
    BOOL result = YES;

    AVAudioSession *session = [AVAudioSession sharedInstance];

    result = [session setActive:YES error:&theError];

    // Get the set of available inputs. If there are no audio accessories attached, there will be
    // only one available input: the built-in microphone.
    NSArray *inputs = [session currentRoute].inputs;

    // Locate the port corresponding to the built-in microphone.
    AVAudioSessionPortDescription *builtInMicPort = nil;
    for (AVAudioSessionPortDescription *port in inputs) {
        if ([port.portType isEqualToString:AVAudioSessionPortBuiltInMic]) {
            builtInMicPort = port;
            break;
        }
    }

    if ([builtInMicPort.preferredDataSource.orientation isEqualToString:nextDataSource]) {
        return;
    }

    // Loop over the built-in mic's data sources and attempt to locate the specified microphone.
    AVAudioSessionDataSourceDescription *theDataSource = nil;
    for (AVAudioSessionDataSourceDescription *source in builtInMicPort.dataSources) {
        if ([source.orientation isEqual:nextDataSource]) {
            theDataSource = source;
            break;
        }
    }

    if (theDataSource) {
        theError = nil;
        if ([theDataSource.orientation isEqualToString:AVAudioSessionOrientationBack]) {
            result = [theDataSource setPreferredPolarPattern:AVAudioSessionPolarPatternSubcardioid error:&theError];
            if (!result) {
                NSLog(@"Failed to set AVAudioSessionPolarPatternSubcardioid");
            }
        } else if ([theDataSource.orientation isEqualToString:AVAudioSessionOrientationFront]) {
            result = [theDataSource setPreferredPolarPattern:AVAudioSessionPolarPatternCardioid error:&theError];
            if (!result) {
                NSLog(@"Failed to set AVAudioSessionPolarPatternCardioid");
            }
        }

        // Set a preference for the chosen data source.
        theError = nil;
        result = [builtInMicPort setPreferredDataSource:theDataSource error:&theError];
        if (!result) {
            // An error occurred. Handle it!
            NSLog(@"setPreferredDataSource failed");
        }
    }

    // Make sure the built-in mic is selected for input. This is a no-op if the built-in mic is
    // already the current input port.
    theError = nil;
    result = [session setPreferredInput:builtInMicPort error:&theError];
    if (!result) {
        // An error occurred. Handle it!
        NSLog(@"setPreferredInput failed");
    }
}

