Larix Broadcaster SDK for iOS
Build live streaming iOS apps with our SDK
Softvelum Larix Broadcaster gives you extended capabilities for creating content on mobile devices.
It's a freeware app; however, there are cases when you might want to build your own applications.
Softvelum provides SDKs for that case, and Larix Broadcaster SDK for iOS is one of them.
Why Larix SDK?
- RTMP, SRT, RTSP, RIST: the Larix streaming library provides the ability to stream your media content via all of them. It's the best set of streaming protocols available for your mobile streaming scenarios.
- The SDK package includes source code for your reference, with a full set of comments within the code. You can find out more about the SDK package below.
- WebRTC with WHIP signaling based on Pion.
- Testing is a crucial part of our product life cycle. Take a look at the Larix Broadcaster connectivity tests for iOS as an example. Besides that, our SDK customers get updates only after we make sure our free apps haven't shown any issues on a larger device fleet. So it's a double quality check for our subscribed customers.
- The SDK is available on a subscription basis. You pay as long as you need our software updates and technical support.
What is included into Larix Broadcaster iOS SDK?
The Larix Broadcaster iOS SDK package includes the following elements.
- Larix Broadcaster application source code with comments for every important component and code fragment.
- Binary libraries: libmbl (Softvelum closed-source streaming library), libsrt (official SRT Alliance library) and librist (official RIST Forum library).
- Larix Screencaster application source code.
- ios-larix-demo from our GitHub shows a minimal example of an app which can be built with the streaming library.
Step-by-step guide explains the building process.
With all of that, you will be able to create your own apps based on Larix apps, as well as add streaming capabilities to your existing apps.
Notice on NDI support: we do not provide NDI as part of our SDKs by default. Contact us if you want to add NDI streaming capabilities to your application.
High-level overview of Larix Broadcaster
The diagram below shows the key components of our free Larix Broadcaster app and the system components it uses. You can download the full-size picture here.
Click on an element to see its details below the diagram.
Yellow elements represent Larix application source code which is available for editing.
Red elements represent LarixLib Swift components.
Blue elements represent iOS components which are used by Larix code or libmbl library.
Green elements represent libmbl binary library components which perform content streaming.
Purple elements represent third-party libraries for SRT and RIST streaming.
Main class for the application. Creates the main view and an [AudioSession](#AudioSession) instance.
Main view with camera preview, UI controls and handling of streaming events.
Handles operations related to AVFoundation and binds them to the Larix streamer library.
The base ``Streamer`` class mostly contains simple functions to interact with StreamingEngineProxy, plus abstract public functions. It has a ``StreamInternal`` child with common capture session functionality, which in turn has two subclasses: `StreamerSingleCam`, handling a regular `AVCaptureSession` for one camera, and `StreamerMultiCam`, handling an `AVCaptureMultiCamSession` with simultaneous back and front cameras. It also performs image manipulation (rotation, combining images) using CoreImage.
Use `StreamerBuilder` to create Streamer instance.
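As a rough illustration of the builder pattern described above, the sketch below wires a delegate and configuration into `StreamerBuilder`. The property names (`delegate`, `videoConfig`, `audioConfig`) and the `VideoConfig`/`AudioConfig` types are assumptions; check the Larix Broadcaster source code in the SDK package for the actual API.

```swift
import AVFoundation

// Hypothetical sketch: exact builder properties may differ between SDK revisions.
// StreamerBuilder, Streamer and StreamerAppDelegate come from the Larix library.
func makeStreamer(delegate: StreamerAppDelegate) -> Streamer? {
    let builder = StreamerBuilder()
    builder.delegate = delegate          // receives capture and connection callbacks
    builder.videoConfig = VideoConfig()  // resolution, fps, bitrate, codec (assumed type)
    builder.audioConfig = AudioConfig()  // sample rate, channels, bitrate (assumed type)
    return builder.build()               // StreamerSingleCam or StreamerMultiCam instance
}
```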
Facade for the Larix streaming library; all operations are performed through this class. The library is written in Objective-C, so you should use a bridging header to use it from a Swift application.
Internal class responsible for compression of audio/video frames. After compression, the data is placed into a circular buffer and then wrapped into packets for each protocol.
Writes compressed data to a file using AVAssetWriter.
Sends data to the server over the RTMP protocol. It uses the Network library (`nw_connection`) at the transport level.
Sends data to the server over the RTSP protocol. It also uses the Network library (`nw_connection`) at the transport level.
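For reference, the transport layer mentioned above is Apple's Network framework; `nw_connection` is its C API, and `NWConnection` is the Swift equivalent. The sketch below shows a plain TCP connection to a placeholder RTMP endpoint; the Larix library manages all of this internally.

```swift
import Network

// Illustrative use of Apple's Network framework (the Swift face of nw_connection).
// Host and port are placeholders; 1935 is the default RTMP port.
let connection = NWConnection(host: "example.com", port: 1935, using: .tcp)
connection.stateUpdateHandler = { state in
    if case .ready = state {
        print("transport connected")  // the RTMP handshake would start here
    }
}
connection.start(queue: .main)
```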
Wraps data in 188-byte MPEG2TS packets.
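To make the fixed-size framing concrete, here is a toy sketch of MPEG-2 TS packetization: every packet is exactly 188 bytes and begins with the 0x47 sync byte. Real TS headers also carry PUSI, scrambling and adaptation-field flags, and stuffing is normally done via the adaptation field; this simplified version only demonstrates the fixed framing.

```swift
import Foundation

// Toy MPEG-2 TS packetizer: 188-byte packets, 4-byte header, trailing
// stuffing bytes for the final short packet. Not the library's actual code.
func packetize(payload: [UInt8], pid: UInt16) -> [[UInt8]] {
    let packetSize = 188, headerSize = 4
    var packets: [[UInt8]] = []
    var offset = 0
    var continuity: UInt8 = 0
    while offset < payload.count {
        var packet: [UInt8] = [
            0x47,                       // sync byte, always first
            UInt8((pid >> 8) & 0x1F),   // PID, high 5 bits
            UInt8(pid & 0xFF),          // PID, low 8 bits
            0x10 | (continuity & 0x0F)  // payload-only flag + continuity counter
        ]
        let end = min(offset + packetSize - headerSize, payload.count)
        packet.append(contentsOf: payload[offset..<end])
        while packet.count < packetSize { packet.append(0xFF) }  // stuffing
        packets.append(packet)
        offset = end
        continuity = (continuity &+ 1) & 0x0F
    }
    return packets
}
```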
Sends MPEG2TS data over the SRT protocol using the SRT library.
Sends MPEG2TS data over the RIST protocol using the RIST library.
An object that communicates to the system how you intend to use audio in the application.
An object that manages capture activity and coordinates the flow of data from input devices (camera, microphone) to capture outputs.
A capture session that supports simultaneous capture from multiple cameras.
A device that provides audio or video input for capture sessions and offers controls for hardware-specific capture features.
Subclass of `CALayer` that you use to display video as it's captured by an input device. You don't need to create it explicitly; call `Streamer.createPreviewLayer(parentView:)` to create a preview for either single- or multi-camera capture.
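A minimal sketch of attaching the preview, assuming `createPreviewLayer(parentView:)` sizes and installs the layer itself as described above; verify the exact contract in the shipped sample code.

```swift
import UIKit

// Hypothetical view controller: the streamer builds and attaches the
// preview layer; no manual AVCaptureVideoPreviewLayer setup is needed.
class PreviewViewController: UIViewController {
    var streamer: Streamer?  // created earlier via StreamerBuilder

    override func viewDidLoad() {
        super.viewDidLoad()
        // Returns PreviewLayerSingleCam or PreviewLayerMultiCam
        // depending on the active capture mode.
        streamer?.createPreviewLayer(parentView: view)
    }
}
```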
SRT library by SRT Alliance for streaming over SRT protocol.
RIST library by RIST Forum for streaming over RIST protocol.
Provides Swift bridging for the Objective-C libmbl library. Import LarixCore in your Swift files to get access to libmbl types; no need to use a bridging header.
Utility class to check camera/microphone permissions.
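The underlying checks map to standard AVFoundation permission APIs. The sketch below shows the plain-AVFoundation equivalent of verifying camera and microphone access before capture starts; it does not use the SDK's utility class itself.

```swift
import AVFoundation

// Request camera and microphone access with stock AVFoundation calls.
// Remember to add NSCameraUsageDescription and NSMicrophoneUsageDescription
// to Info.plist, or the app will terminate on the first request.
func requestCaptureAccess(completion: @escaping (Bool) -> Void) {
    AVCaptureDevice.requestAccess(for: .video) { cameraGranted in
        AVCaptureDevice.requestAccess(for: .audio) { micGranted in
            completion(cameraGranted && micGranted)
        }
    }
}
```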
Wrapper around AVAudioSession. Call AudioSession.start prior to Streamer initialization with audioConfig; otherwise Streamer.startCapture will fail. Also provides the `AudioSessionStateObserver` protocol to handle audio session changes.
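The required startup order can be sketched as below. The `start(with:)` signature and the `AudioConfig` type are assumptions; only the ordering requirement comes from the note above.

```swift
// Hypothetical startup-order sketch for the Larix SDK.
func startStreaming(builder: StreamerBuilder, audioConfig: AudioConfig) {
    AudioSession.start(with: audioConfig)  // 1. activate the audio session first
    let streamer = builder.build()         // 2. then create the streamer
    streamer?.startCapture()               // 3. capture can now succeed
}
```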
Contanter for AVCaptureVideoPreviewLayer, for ``StreamerSingleCam``. Call ``Streamer.createPreviewLayer`` for single-camera capture to obtain **PreviewLayerSingleCam**.
Contanter with two instances of AVCaptureVideoPreviewLayer for ``StreamerMultiCam`` with layout depending on **MultiCamConfig** . Call ``Streamer.createPreviewLayer`` in streamer in mult-camera capture mode to obtain **PreviewLayerMultiCam**.
Represents a VU meter to measure audio level. You can use ``processAudioSampleBuffer`` from ``StreamerAppDelegate`` to obtain the audio buffer and pass it to ``AudioLevelMeter.putBuffer``.
Performs compression of audio/video frames and writes them to a file using AVAssetWriter. Use ``StreamingEngineProxy.setFileWritingMode`` to switch between ``StreamRecorder`` and ``StreamRecorderCompressor``.
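A sketch of feeding the meter from the delegate callback described above. The callback and method signatures are assumptions; `StreamerAppDelegate` likely has further requirements, so treat this as an outline only.

```swift
import AVFoundation

// Hypothetical: route captured audio buffers into the VU meter.
class LevelMonitor: StreamerAppDelegate {
    let meter = AudioLevelMeter()

    func processAudioSampleBuffer(_ sampleBuffer: CMSampleBuffer) {
        meter.putBuffer(sampleBuffer)  // meter derives the current audio level
    }
}
```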
Audio converter objects convert between various linear PCM audio formats, as well as between linear PCM and compressed formats.
A compression session supports the compression of a sequence of video frames.
Larix Talkback and Larix Player SDK
Notice that Talkback functionality in Larix Broadcaster requires two SDKs: Larix Broadcaster SDK and Larix Player SDK. If you don't have the Player SDK, you will not be able to use Talkback functionality after building your Larix-based app.
The playback library included by default in Broadcaster SDK package is a stub which does not provide any playback output.
Once you subscribe to the Player SDK, you can replace the stub with the library from the Player package and build the Larix app with the Talkback feature set.
- The Larix SDK FAQ answers the most popular questions about our mobile streaming technologies, as well as common questions about SDK purchase and usage.
- The SDK releases history page shows currently available SDKs and their revision history. It helps customers who previously purchased the SDK decide whether they should subscribe to get updates.
- The documentation reference page lists articles and videos about app setup and usage.
- Contact our helpdesk regarding our mobile technologies.
Get Larix SDK
To obtain the Larix Broadcaster SDK, contact us for more details.
Download free apps
You can always test all features in action via our apps.
This product uses SRT library distributed under MPL-2.0 license.
This product uses librist library distributed under BSD 2-clause "Simplified" license.
iOS Larix Broadcaster uses Scrollable Segmented Control distributed under MIT license.
This product includes software developed by the OpenSSL Project for use in the OpenSSL Toolkit.