Migrate Streaming SDK to Live SDK.
- Live SDK is the successor of the Streaming SDK. It is built on Apple's Metal GPU API instead of OpenGL, which the previous version used.
- Streaming SDK will no longer receive major updates. Because Apple has already deprecated OpenGL, developers still using Streaming SDK should migrate as soon as possible.
- Live SDK is written in Swift. Objective-C is fully supported, but Swift is recommended.
In Live SDK, the `prepare` method is used to hand the captureDevice and the preview to the manager. You can call the `prepare` or `attach` methods to change the captureDevice at any time, even while broadcasting. To change the configuration of a broadcast, use `setConfig` and `modifyConfig`.
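A rough sketch of this flow is shown below; the `manager` instance, the `previewView`, and the parameter labels are assumptions for illustration, not the exact Live SDK signatures.

```swift
// Sketch only: `manager` stands for the Live SDK broadcast manager instance
// and `previewView` for the preview view; parameter labels are assumptions.
let camera = manager.listBackDevices().first!
manager.prepare(captureDevice: camera, preview: previewView)

// The capture device can be changed at any time, even while broadcasting,
// e.g. switching to a front camera:
if let front = manager.listFrontDevices().first {
    manager.attachCamera(front)
}
```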
`STSLiveBroadcastConfig` is the successor of `STSStreamingPrepareConfig`. The following properties are renamed or redefined (see the sketch after this list):

- `targetOutputSize` is redefined to `videoSize`.
- `outputImageOrientation` is renamed to `videoOrientation`.
- `fitAllCamera` and `maxVideoHeight` are removed. If `videoSize` is set to zero, a recommended value depending on `videoOrientation` and the profile in the current liveInfo will be set on `startLive`. For example, starting a 720p broadcast in portrait orientation results in Size(w: 720, h: 1280).
- `captureDevicePosition` is removed. Use `attachCamera` to set the captureDevice directly; you can use `listFrontDevices` and `listBackDevices` to get the available devices.
- `flipFrontCameraOutputHorizontally` is renamed to `mirrored`.
- `audioEnabled` is redefined to `muted`.
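A rough sketch of a migrated configuration is shown below; only the property names come from the list above, while the initializer, the orientation value, and the shape of the `setConfig` call are assumptions.

```swift
// Sketch only: property names follow the migration list above; the
// initializer and surrounding calls are assumptions.
var config = STSLiveBroadcastConfig()

// Leave videoSize at zero to let the SDK pick a recommended size based on
// videoOrientation and the profile of the current liveInfo
// (e.g. 720p + portrait -> 720 x 1280).
config.videoSize = .zero
config.videoOrientation = .portrait

config.mirrored = true   // was flipFrontCameraOutputHorizontally
config.muted = false     // was audioEnabled (meaning is inverted)

manager.setConfig(config)
```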
Filters in Streaming SDK were implemented with OpenGL ES, which is no longer supported in Live SDK. Live SDK still provides `STSBroadcastBeautyFaceEffect` for the beauty effect. If you have a customized filter based on OpenGL ES, you have to reimplement it with Core Image:
```swift
// Pass-through filter: returns the input frame unchanged.
func execute(_ image: CIImage, info: CMSampleBuffer?) -> CIImage {
    return image
}
```
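A reimplemented filter might look like the sketch below, which applies a sepia tone inside the same `execute` hook; the hook signature comes from the snippet above, while the `CISepiaTone` filter is only an example of what a former OpenGL ES effect could be replaced with.

```swift
import CoreImage

// Sketch of a custom Core Image filter using the execute hook shown above.
func execute(_ image: CIImage, info: CMSampleBuffer?) -> CIImage {
    // Apply a sepia tone as a stand-in for the old OpenGL ES effect.
    let sepia = CIFilter(name: "CISepiaTone")!
    sepia.setValue(image, forKey: kCIInputImageKey)
    sepia.setValue(0.8, forKey: kCIInputIntensityKey)
    // Fall back to the original frame if the filter produces no output.
    return sepia.outputImage ?? image
}
```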