IOStream

open class IOStream : NSObject

The IOStream class is the foundation of an RTMPStream.

  • The enumeration defines the state an IOStream client is in.

    See more

    Declaration

    Swift

    public enum ReadyState : Equatable
  • The lockQueue.

    Declaration

    Swift

    public let lockQueue: DispatchQueue
  • The offscreen rendering object.

    Declaration

    Swift

    public var screen: Screen { get }
  • Specifies the adaptive bitrate strategy.

    Declaration

    Swift

    public var bitrateStrategy: any IOStreamBitRateStrategyConvertible { get set }
  • Specifies whether audio monitoring is enabled.

    Declaration

    Swift

    public var isMonitoringEnabled: Bool { get set }
  • Specifies whether the device torch is turned on (true) or off (false).

    Declaration

    Swift

    public var torch: Bool { get set }
  • Specifies the frame rate of a device capture.

    Declaration

    Swift

    public var frameRate: Float64 { get set }
  • Specifies whether mixing of multiple audio tracks is enabled. For example, this makes it possible to mix .appAudio and .micAudio from ReplayKit. Warning: If you might use this feature, set it to true from the start, before attaching devices or appending buffers.

    Declaration

    Swift

    public var isMultiTrackAudioMixingEnabled: Bool { get set }
  • Specifies the sessionPreset for the AVCaptureSession.

    Declaration

    Swift

    @available(tvOS 17.0, *)
    public var sessionPreset: AVCaptureSession.Preset { get set }
  • Specifies the video orientation for stream.

    Declaration

    Swift

    public var videoOrientation: AVCaptureVideoOrientation { get set }
  • Specifies the audio mixer settings.

    Declaration

    Swift

    public var audioMixerSettings: IOAudioMixerSettings { get set }
  • Specifies the video mixer settings.

    Declaration

    Swift

    public var videoMixerSettings: IOVideoMixerSettings { get set }
  • Specifies the audio compression properties.

    Declaration

    Swift

    public var audioSettings: AudioCodecSettings { get set }
  • Specifies the video compression properties.

    Declaration

    Swift

    public var videoSettings: VideoCodecSettings { get set }
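    A minimal sketch of adjusting both compression settings is shown below; the property names used here (bitRate, videoSize) are assumptions about the AudioCodecSettings and VideoCodecSettings types and may differ between versions.

    import HaishinKit

    let stream = IOStream()
    // Assumed property names; adjust to the actual settings API of your version.
    stream.audioSettings.bitRate = 64 * 1000                          // 64 kbps audio
    stream.videoSettings.videoSize = .init(width: 1280, height: 720)  // 720p
    stream.videoSettings.bitRate = 1_000_000                          // 1 Mbps video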
  • The audio input formats.

    Declaration

    Swift

    public var audioInputFormats: [UInt8 : AVAudioFormat] { get }
  • The video input formats.

    Declaration

    Swift

    public var videoInputFormats: [UInt8 : CMFormatDescription] { get }
  • Specifies the sound transform used to control audio playback.

    Declaration

    Swift

    public var soundTransform: SoundTransform { get set }
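    A minimal sketch of adjusting playback audio; the volume and pan properties are assumptions about the SoundTransform type.

    // Assumed properties on SoundTransform; names may differ by version.
    stream.soundTransform.volume = 0.5  // assumed range 0.0 (mute) ... 1.0 (full)
    stream.soundTransform.pan = 0.0     // assumed range -1.0 (left) ... 1.0 (right)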
  • The number of frames per second being displayed.

    Declaration

    Swift

    @objc
    public internal(set) dynamic var currentFPS: UInt16 { get }
  • Specifies the delegate.

    Declaration

    Swift

    public weak var delegate: (any IOStreamDelegate)?
  • Specifies the view.

    Declaration

    Swift

    public var view: (any IOStreamView)? { get set }
  • The current state of the stream.

    Declaration

    Swift

    public var readyState: ReadyState { get set }
  • Creates an object.

    Declaration

    Swift

    override public init()
  • Attaches the camera device.

    Declaration

    Swift

    @available(tvOS 17.0, *)
    public func attachCamera(_ device: AVCaptureDevice?, track: UInt8 = 0, configuration: IOVideoCaptureConfigurationBlock? = nil)
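    A minimal sketch of attaching the default back camera to video track 0; the device lookup is illustrative, and a configuration block can additionally be passed to adjust the capture unit once it is attached.

    import AVFoundation
    import HaishinKit

    let stream = IOStream()
    // Passing nil instead of a device detaches the camera from the track.
    let backCamera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)
    stream.attachCamera(backCamera, track: 0)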
  • Returns the IOVideoCaptureUnit for the specified track.

    Declaration

    Swift

    @available(tvOS 17.0, *)
    public func videoCapture(for track: UInt8) -> IOVideoCaptureUnit?
  • Attaches the audio device.

    On macOS, you can perform multi-microphone capture as shown below. Unfortunately, only one microphone appears to be available on iOS.

    stream.isMultiTrackAudioMixingEnabled = true

    var audios = AVCaptureDevice.devices(for: .audio)
    if !audios.isEmpty {
        stream.attachAudio(audios.removeFirst(), track: 0)
    }
    if !audios.isEmpty {
        stream.attachAudio(audios.removeFirst(), track: 1)
    }
    

    Declaration

    Swift

    @available(tvOS 17.0, *)
    public func attachAudio(_ device: AVCaptureDevice?, track: UInt8 = 0, configuration: IOAudioCaptureConfigurationBlock? = nil)
  • Returns the IOAudioCaptureUnit for the specified track.

    Declaration

    Swift

    @available(tvOS 17.0, *)
    public func audioCapture(for track: UInt8) -> IOAudioCaptureUnit?
  • Appends a CMSampleBuffer.

    Declaration

    Swift

    public func append(_ sampleBuffer: CMSampleBuffer, track: UInt8 = 0)

    Parameters

    sampleBuffer

    The sample buffer to append.

    track

    Track number used for mixing.
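    A minimal sketch of appending ReplayKit buffers to separate tracks, tying this method to the isMultiTrackAudioMixingEnabled property above; the track numbers are illustrative, and in practice an RTMPStream would stand in for the plain IOStream used here to keep the sketch self-contained.

    import CoreMedia
    import HaishinKit
    import ReplayKit

    final class SampleHandler: RPBroadcastSampleHandler {
        private let stream: IOStream = {
            let stream = IOStream()
            // Per the warning on isMultiTrackAudioMixingEnabled, enable mixing up front.
            stream.isMultiTrackAudioMixingEnabled = true
            return stream
        }()

        override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
            switch sampleBufferType {
            case .video:
                stream.append(sampleBuffer)
            case .audioApp:
                stream.append(sampleBuffer, track: 0)
            case .audioMic:
                stream.append(sampleBuffer, track: 1)
            @unknown default:
                break
            }
        }
    }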

  • Appends an AVAudioBuffer.

    Declaration

    Swift

    public func append(_ audioBuffer: AVAudioBuffer, when: AVAudioTime, track: UInt8 = 0)

    Parameters

    audioBuffer

    The audio buffer to append.

    when

    The audio time to append.

    track

    Track number used for mixing.
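    A minimal sketch of forwarding PCM buffers from an AVAudioEngine input tap into this method; the engine setup is illustrative.

    import AVFoundation
    import HaishinKit

    let stream = IOStream()
    let engine = AVAudioEngine()
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)
    input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, when in
        // Forward each captured buffer, together with its timestamp, to track 0.
        stream.append(buffer, when: when, track: 0)
    }
    try? engine.start()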

  • Registers a video effect.

    Declaration

    Swift

    public func registerVideoEffect(_ effect: VideoEffect) -> Bool
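    A minimal sketch of registering a custom effect; the execute(_:) override point is an assumption about the VideoEffect class and may differ between versions.

    import CoreImage
    import HaishinKit

    final class MonochromeEffect: VideoEffect {
        private let filter = CIFilter(name: "CIColorMonochrome")

        // Assumed override point: an execute method that receives and returns a CIImage per frame.
        override func execute(_ image: CIImage) -> CIImage {
            guard let filter else { return image }
            filter.setValue(image, forKey: kCIInputImageKey)
            filter.setValue(CIColor(red: 0.75, green: 0.75, blue: 0.75), forKey: kCIInputColorKey)
            filter.setValue(1.0, forKey: kCIInputIntensityKey)
            return filter.outputImage ?? image
        }
    }

    _ = stream.registerVideoEffect(MonochromeEffect())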
  • Unregisters a video effect.

    Declaration

    Swift

    public func unregisterVideoEffect(_ effect: VideoEffect) -> Bool
  • Adds an observer.

    Declaration

    Swift

    public func addObserver(_ observer: any IOStreamObserver)
  • Removes an observer.

    Declaration

    Swift

    public func removeObserver(_ observer: any IOStreamObserver)
  • Configures the AVCaptureSession within the given block.

    Declaration

    Swift

    @available(tvOS 17.0, *)
    public func configuration(_ lambda: (_ session: AVCaptureSession) throws -> Void) rethrows
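    A minimal sketch of tweaking the underlying AVCaptureSession inside the block; the property toggled here is only illustrative.

    stream.configuration { session in
        // Example: let the app manage the AVAudioSession itself (iOS).
        session.automaticallyConfiguresApplicationAudioSession = false
    }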
  • A handler that is called just before the stream's readyState changes.

    Warning

    Please do not call this method yourself.

    Declaration

    Swift

    open func readyStateWillChange(to readyState: ReadyState)
  • A handler that is called after the stream's readyState has changed.

    Warning

    Please do not call this method yourself.

    Declaration

    Swift

    open func readyStateDidChange(to readyState: ReadyState)
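    A minimal sketch of a subclass observing state changes through this hook; the framework invokes it, so the override only reacts and never calls it directly. The logging is illustrative.

    import HaishinKit

    final class LoggingStream: IOStream {
        override func readyStateDidChange(to readyState: ReadyState) {
            super.readyStateDidChange(to: readyState)
            // React to the new state, e.g. update UI or metrics.
            print("readyState is now \(readyState)")
        }
    }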