Flex SDK for iOS: Quickstart
The Aircore Flex SDK for iOS allows you to quickly add high-quality, real-time voice or video chat to your iOS app. This lets you avoid dealing with the complexities of building a real-time social infrastructure on your own.
You can choose to add audio, video, or both.
Before starting
Minimum versions
Xcode: 13.2.1
iOS: 13.0
Limitations
For M1 and M2 Macs, you must run Xcode using Rosetta.
As of Xcode 14, bitcode is deprecated. With older versions of Xcode, turn off bitcode in your project's build settings to use this SDK.
Install the SDK
Open the Xcode workspace for your app.
Click File, then click Add Packages.
Copy and paste this URL into the search bar:
https://github.com/aircoreio/aircore-media-ios
Select the Up to Next Major Version dependency rule.
Click Add Package.
If you see the Choose Package Products dialog, click Add Package.
Xcode automatically links the framework to your project.
Set up your app
Enable background audio
We recommend adding the "Audio" and "Voice Over IP" background modes to your application's project settings. For more details, see Enabling Background Audio in Apple's documentation.
Request permissions
If your app uses the microphone or camera, you must tell the user why the app needs access:
If your app uses the microphone, set NSMicrophoneUsageDescription.
If your app uses the camera, set NSCameraUsageDescription.
For more info, see the Apple Developer page Requesting authorization to capture and save media.
Configure the engine
Engine.sharedInstance is the central location for configuring this SDK.
Set up the engine once per launch, before you use the framework:
To set the user agent, call the setUserAgent() method. This allows for better server-side logging and client information tracking.
To set the log directory, call the setLogDirPath() method. You can send logs to us for troubleshooting.
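The setup steps above can be sketched as follows. The argument labels and value types here are illustrative assumptions, not confirmed signatures, so check the API reference:

```swift
import AircoreMedia

// One-time engine setup, before any other use of the framework.
// NOTE: the arguments below are illustrative assumptions.
let engine = Engine.sharedInstance
engine.setUserAgent("MyApp/1.2.3")                       // Identifies your client in server-side logs
engine.setLogDirPath(NSTemporaryDirectory() + "aircore") // Where the SDK writes log files
```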
Authenticate
In this guide, we use a publishable API key for a quick setup. To get your API key:
Create an app in the Developer Console.
Copy your publishable API key and use it to create channels in the next section.
For an overview of API keys, see Authentication. To use a secret API key with this SDK, see More options.
Create and use a channel
A channel is a connection to a real-time session with other users. See Channels for a general overview.
Joining a channel allows your app to:
Publish a local stream to other users
Receive remote streams from other users
Local and remote streams can contain audio, video, or both.
Create a channel
Engine.sharedInstance is a singleton that produces channel objects.
To start using a channel:
Call the createChannel() method and keep the resulting channel object:
import AircoreMedia

let key = "PUBLISHABLE_API_KEY"

// Keep this object around!
let channel = Engine.sharedInstance.createChannel(
    publishableAPIKey: key,
    userID: "user_id",
    channelID: "channel_id"
)

// Check if the channel was successfully created
guard let channel = channel else { fatalError() } // Handle error case

// Store the channel somewhere safe
channelStorage.channel = channel
To connect to the channel and start receiving audio, call its join() method:
channelStorage.channel.join()
To set the volume for the channel, change the outputVolume property. You can do this at any time.
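For example, assuming outputVolume is a normalized value from 0.0 to 1.0 (the exact range is not stated in this guide, so check the API reference):

```swift
// Halve the playback volume for everything received on this channel
channel.outputVolume = 0.5

// Silence the channel without leaving it
channel.outputVolume = 0.0
```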
Leave a channel
When you no longer need a channel, call its leave() method. You can call this method at any time, except when the channel is already terminated.
Leaving destroys all objects for local and remote streams, releasing system resources used by the channel.
The local channel instance also terminates in these cases:
All references to the channel go out of scope
An unrecoverable error occurs
To see why a channel terminated, check its terminationCause property.
You cannot rejoin a terminated channel. To return to the session with the same users, create a new channel.
Track the status of a channel
The joinState property shows a channel's connection status.
To track changes to this property, use the joinStateDidChange notification:
let channel: Channel // A channel you created earlier

NotificationCenter.default.addObserver(
    self,
    selector: #selector(onChannelJoinStateChange(_:)),
    name: ChannelNotification.joinStateDidChange,
    object: channel
)

@objc func onChannelJoinStateChange(_ notif: NSNotification) {
    // Get the join state from the notification and respond accordingly
    guard let state = notif.userInfo?[ChannelNotification.Key.newJoinState]
        as? Channel.JoinState else { return }
    switch state {
    case .notJoined:
        // The initial state. The channel has not yet connected.
        // No interaction with other users is possible.
        // NOTE: A notification for .notJoined is not sent.
        break
    case .joining:
        // The channel is connecting for the first time.
        break
    case .joined:
        // The channel is connected and automatically plays remote streams.
        break
    case .rejoining:
        // The channel connected and then disconnected.
        // It is now reconnecting.
        break
    case .terminated:
        // The channel has permanently disconnected. You can check the
        // termination cause here.
        // NOTE: You can't reuse the channel, but you can detect if the channel
        // terminated unexpectedly and either create a new channel or show an
        // error to the user.
        let terminationCause = channel.terminationCause
        _ = terminationCause
    }
}
You can use the join state to build your user experience and UI. See More options for examples.
Publish audio
Users share audio within a channel as streams. To publish audio to other users, you create a local stream.
You can also use the same local stream object to publish video.
Use a channel's createLocalStream() method to create a local stream. The channel's join state must be .joined or .rejoining.
import AircoreMedia

// ... Create and join a channel

// Make sure the channel is connected
let channelJoinState = channel.joinState
guard channelJoinState == .joined || channelJoinState == .rejoining else { return }

// Default parameters
let params = LocalStreamParams()
let localStream = channel.createLocalStream(params: params)
guard let localStream = localStream else { fatalError() } // Handle error case
Use start() to publish audio from the microphone:
// Start the local stream
localStream.start()

// You can choose to keep the local stream or access it from the channel
Use muteAudio() to stop capturing and publishing audio:
// Mute the audio
localStream.muteAudio(true)

// Unmute the audio
localStream.muteAudio(false)
To create a muted stream, see More options.
Stop the stream:
// To stop sending local audio, stop the local stream
localStream.stop()
To add metadata to a local stream, see More options.
Get notifications for local streams
The LocalStreamNotification interface has notifications for muting, voice activity, and connection state for local streams.
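Observing these follows the same NotificationCenter pattern used for channel join state. The notification name connectionStateDidChange below is a hypothetical example, not a confirmed API name; check the API reference for the actual names:

```swift
// Hypothetical sketch: observe local stream state changes.
NotificationCenter.default.addObserver(
    self,
    selector: #selector(onLocalStreamChange(_:)),
    name: LocalStreamNotification.connectionStateDidChange, // hypothetical name
    object: localStream
)
```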
Receive audio
When your app connects to a channel, it receives audio streams from other users. These are called remote streams.
The SDK automatically connects to remote streams and plays the audio through the active output device. You don't have to do any setup for your app to play the audio from remote streams.
For tasks such as getting metadata and muting a stream, you can use remote stream objects. You can also use remote stream objects to receive video.
Check remote streams in a channel
To see a list of active remote streams within a channel, check its remoteStreams property. You can use this property to show active streams in your UI.
Each item in the list is a RemoteStream object, which represents a stream from one user.
To get notifications for added and removed streams, see More options.
Check the connection to a remote stream
To see if a remote stream is connected, check the connectionState property.
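Combining the remoteStreams and connectionState properties, a UI refresh might look like the sketch below. This assumes RemoteStream's connectionState exposes a .connected case, as the local stream API does; check the API reference:

```swift
// List the remote streams that are currently connected
for remoteStream in channel.remoteStreams {
    if remoteStream.connectionState == .connected {
        // Show this participant as active in your UI
    }
}
```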
To get notifications for remote stream connection status, see More options.
Mute audio from a remote stream
To mute the incoming audio from a remote stream, use its muteAudio() method. Your app still receives the audio but doesn't play it.
Track local audio muting
To track local muting of an incoming remote stream with muteAudio(), you can use:
The localAudioMuteStateDidChange notification
The localAudioMuted property
Track remote audio muting
The publisher can also mute a remote stream. To track when the publisher mutes audio that your app is receiving, you can use:
The remoteAudioMuteStateDidChange notification
The remoteAudioMuted property
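A sketch of observing publisher-side mutes. The notification's namespace (RemoteStreamNotification) and its posted object are assumptions based on the channel notification pattern; check the API reference for the exact names:

```swift
// Assumed: the notification is posted with the RemoteStream as its object
NotificationCenter.default.addObserver(
    forName: RemoteStreamNotification.remoteAudioMuteStateDidChange, // assumed namespace
    object: remoteStream,
    queue: .main
) { _ in
    // Read the current value from the documented property
    let isMuted = remoteStream.remoteAudioMuted
    // Update your mute indicator accordingly
    _ = isMuted
}
```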
Check for voice activity
To check if a remote stream has audible speech, use the voiceActivity property.
The voiceActivityStateDidChange notification shows changes to this property.
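For example, to drive a speaking indicator (speakingIndicator here is a hypothetical view in your own UI):

```swift
// Highlight the participant while their stream has audible speech
speakingIndicator.isHidden = !remoteStream.voiceActivity
```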
Check termination cause
To see why a remote stream ended, check the terminationCause property.
Terminated streams are removed from the channel's remoteStreams set and destroyed when your app no longer holds a reference to them.
Preview the camera
Preview the camera with default settings
To start, create a Camera object:
import AircoreMedia

let camera = Camera()
Next, create a Preview object:
let camera = Camera()
let preview = Camera.Preview(camera: camera)
Use the setRenderTarget() method to show the preview in a UIView:
// Initialize the UIView on main thread...
let cameraView = UIView()
preview.setRenderTarget(cameraView)

// Start previewing the camera
preview.start { error in
    if let error = error {
        // Handle error
        print("Error starting preview: \(error.localizedDescription)")
        return
    }
}

// Stop previewing the camera
preview.stop()
When you start the preview, the app automatically requests camera permission.
Preview the camera with custom settings
You can also preview the camera with custom settings. You can set the camera position, resolution, and frame rate at any time.
Create the Camera and Preview objects, then set the render target:
let camera = Camera()
let preview = Camera.Preview(camera: camera)

// Initialize the UIView on main thread...
let cameraView = UIView()
preview.setRenderTarget(cameraView)
To choose the front or rear camera, use setCameraPosition().
// Custom configuration for camera
camera.setCameraPosition(.front)
To set the resolution, use setPreferredResolution() to select a preset quality and aspect ratio. Use setPreferredFrameRate() to select a preset frame rate.
// Set resolution and frame rate with presets
let resolutionPreset = Camera.ResolutionPreset(quality: .high, aspectRatio: .aspect3x4)
camera.setPreferredResolution(preset: resolutionPreset)
camera.setPreferredFrameRate(preset: .highest)
Start the preview:
// Start previewing the camera
preview.start { error in
    if let error = error {
        // Handle error
        print("Error starting preview: \(error.localizedDescription)")
        return
    }
}
You can also use custom values with setPreferredResolution() and setPreferredFrameRate():
// You can update the configuration for the camera after starting
camera.setCameraPosition(.back)

// Set custom resolution and frame rate
camera.setPreferredResolution(CGSize(width: 360, height: 360))
camera.setPreferredFrameRate(15)
The resolution and frame rate are preferred values, which are reference points for the actual values.
The SDK may change these values at any point to optimize the stream. For example, the SDK adapts the video quality based on network bandwidth. For more info, see Network and device adaptation.
Move the camera preview to a new view
You can only render the preview for a camera to a single view. To move the preview, call the Preview object's setRenderTarget() method on a new view. You can do this at any time.
You can change the preview object for a camera by starting a new preview. In this case, the old preview stops without notification.
// Assume initialization on main thread
let view = UIView()
preview.setRenderTarget(view)
let newView = UIView()
preview.setRenderTarget(newView)
Camera errors
Starting the camera can fail under various conditions. When you call the start() method on a camera preview, the completion handler optionally contains a CameraError. Check the error and respond accordingly.
The authorizationDenied error means that the user denied access to the camera.
Not all errors mean that the operation failed. For example, the startCancelled error means that the stop() method cancelled the start() method before it finished.
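A sketch of handling these cases, assuming CameraError is an enum delivered directly to the start() completion handler (check the API reference for the handler's exact error type):

```swift
preview.start { error in
    guard let error = error else { return } // Preview started successfully
    switch error {
    case .authorizationDenied:
        // The user denied camera access; explain how to enable it in Settings
        break
    case .startCancelled:
        // stop() cancelled start() before it finished; usually safe to ignore
        break
    default:
        // Handle any other failure, e.g. by showing an error to the user
        break
    }
}
```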
Publish video
To publish video, you can use the same LocalStream objects that you do for audio.
To review creating local streams, see Publish audio.
Add video to a local stream
To set the video source, attach a Camera object to a local stream:
let camera = Camera()
localStream.videoSource = camera
Unlike audio, the video for local streams is muted by default.
To start publishing, use the local stream's muteVideo() method to unmute the video.
If the user has not provided camera permission, the app automatically requests it when unmuting video.
localStream.start()

// ...

// Check that the LocalStream has connected before trying to control the video mute state
let connectionState = localStream.connectionState
guard connectionState == .connecting || connectionState == .connected else { return }

// Unmute video (start publishing video)
localStream.muteVideo(false)

// Mute video (stop publishing video)
localStream.muteVideo(true)
Channels have the allowPublishVideo setting, which is true by default. If this value is false, unmuting fails silently.
If you remove the video source from the local stream by setting it to nil, the video is muted and publishing video stops.
If you change the video source to a different object, the local stream immediately publishes the new video source if the video mute state is unmuted.
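For example:

```swift
// Stop publishing video by removing the source (video becomes muted)
localStream.videoSource = nil

// Swap to a different camera; if video is unmuted, the new source
// is published immediately
let rearCamera = Camera()
rearCamera.setCameraPosition(.back)
localStream.videoSource = rearCamera
```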
Create a local stream with video
You can also set the video source and video mute state before creating a local stream. To do this, modify the LocalStreamParams object.
let params = LocalStreamParams()
let camera = Camera()
params.videoSource = camera
// Initially unmute video when LocalStream starts
params.initialVideoMute = false
let localStream = channel.createLocalStream(params: params)
You can change the video source on a local stream at any time. However, you can only change the video mute state after the stream connects.
Start a local stream with audio only
Local streams share audio only by default:
let localStream = channel.createLocalStream()
assert(localStream.audioMuted == false)
assert(localStream.videoMuted == true)
assert(localStream.videoSource == nil)
Track video performance for local streams
To track video performance for a local stream, use the videoHealthDidChange notification.
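A sketch of observing it. The namespace (LocalStreamNotification) matches the interface named earlier in this guide, but the notification's userInfo contents are not documented here, so this example only reacts to the change:

```swift
NotificationCenter.default.addObserver(
    forName: LocalStreamNotification.videoHealthDidChange,
    object: localStream,
    queue: .main
) { _ in
    // Re-read whatever video health state your UI displays
}
```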
Receive video
To receive video, you can use the same RemoteStream objects that you do for audio.
Remote stream objects automatically manage decoding and rendering video to a UIView.
Display video from a remote stream
To display the video from a remote stream, use its setRenderTarget() method.
import AircoreMedia
// Initialize on main thread
let view = UIView()
// Set remote stream video render target
remoteStream.setRenderTarget(to: view)
let newView = UIView()
// Change render target
remoteStream.setRenderTarget(to: newView)
// Stop rendering video
remoteStream.setRenderTarget(to: nil)
Mute video from a remote stream
As with audio, you can mute the incoming video from a remote stream using muteVideo(). The publisher can also mute the video remotely.
Track local video muting
To track local muting of an incoming remote stream with muteVideo(), you can use:
The localVideoMuteStateDidChange notification
The localVideoMuted property
Track remote video muting
To track when the publisher mutes video that your app is receiving, you can use:
The
remoteVideoMuteStateDidChange
notificationThe
remoteVideoMuted
property
Track video performance for remote streams
To track video performance for a remote stream, use its video health notifications. See the API reference for details.
More info
To continue building your app, see More options.
See the full API reference.
Download our sample app.