Android WebRTC + SRS/ZLM Video Call (4): Publishing WebRTC Streams from Android to SRS/ZLMediaKit



From the anxiety diary of a developer pushing thirty

Continuing from the previous chapter, this post records how Android uses WebRTC to publish a stream to SRS or ZLMediaKit. If you want high-quality real-time streaming from an Android device, don't miss these three tools: WebRTC, SRS, and ZLMediaKit!

WebRTC is an open-source technology for P2P real-time communication built on standard web technologies, with solid support for both publishing and playing real-time streams. SRS (originally Simple-RTMP-Server, now Simple Realtime Server) and ZLMediaKit are both excellent media servers, and both ship with rich built-in WebRTC support.

Combining the three, you can quickly publish audio and video streams from an Android device with high quality and low latency. On top of that, WebRTC's P2P capabilities can make your streaming service more robust by reducing its dependence on server bandwidth and easing server load.

For more on WebRTC, SRS, ZLMediaKit, and real-time streaming in general, keep an eye on my column; I'll keep recording things from a beginner's perspective so we can discuss and learn from each other.

Code ideas suggested by ChatGPT



Hands-on

From the AI's answers, it's clear that fairly complex logic is still a stretch for it at the moment, but it's much more convenient than Googling or searching Baidu on your own. Since we can't copy-paste directly, we'll just follow the AI's outline and wrap the code ourselves.

The cn.zhoulk:ZLMediaKit-Android:x.y.z dependency the AI suggested doesn't exist, the GitHub repository it pointed to returns 404, and even the webrtc-android-codelab project link and the WebRTC official documentation link it gave won't open. So we drop all of that and write everything from our own experience.

Android code


1. Create the project

Here we create a Compose Activity, and along the way get a feel for how smooth it is to build UI with Compose in Kotlin.

2. Add dependencies

First, add the following dependencies to your build.gradle file:

dependencies {
    //...
    // WebRTC
    implementation 'org.webrtc:google-webrtc:1.0.32006'
    // Networking: https://github.com/liangjingkanji/Net
    implementation "org.jetbrains.kotlinx:kotlinx-coroutines-core:1.6.0" // coroutines (pick your version)
    implementation 'org.jetbrains.kotlinx:kotlinx-coroutines-android:1.6.0'
    implementation 'com.squareup.okhttp3:okhttp:4.10.0' // Net requires OkHttp 4+
    implementation 'com.github.liangjingkanji:Net:3.5.3'
    implementation 'com.google.code.gson:gson:2.9.1'
    implementation 'com.github.getActivity:GsonFactory:5.2'
    implementation "org.jetbrains.kotlin:kotlin-stdlib:1.7.0"
    implementation 'com.github.liangjingkanji:Serialize:1.2.3'
    // Runtime-permission framework
    implementation 'com.github.getActivity:XXPermissions:18.0'
}

Since WebRTC publishing/playing and the SDP exchange all depend on the network, we add networking dependencies plus a permission-request framework. Note the settings.gradle configuration: add Aliyun's Maven repositories so the dependencies can actually be resolved. If they still won't download, you can pull my demo (WebRTC_Compose_Demo); I'll upload the WebRTC AAR along with it later.

pluginManagement {
    repositories {
        google()
        mavenCentral()
        gradlePluginPortal()
    }
}
dependencyResolutionManagement {
    repositoriesMode.set(RepositoriesMode.FAIL_ON_PROJECT_REPOS)
    repositories {
        google()
        mavenCentral()
        maven { url 'https://jitpack.io' }
        maven { url 'https://maven.aliyun.com/repository/public/' }
        maven { url 'https://maven.aliyun.com/repository/google/' }
        maven { url 'https://maven.aliyun.com/repository/jcenter' }
    }
}
rootProject.name = "WebRTC-Compose-Demo"
include ':app'

Remember to add the corresponding permissions:

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools">

    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
    <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
    <uses-permission android:name="android.permission.INTERNET" />

    <uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    <uses-permission android:name="android.permission.CAMERA" />

    <uses-feature android:name="android.hardware.camera" />
    <uses-feature android:name="android.hardware.camera2.full" />
    <uses-feature android:name="android.hardware.camera2.autofocus" />

    <application
        android:allowBackup="true"
        android:dataExtractionRules="@xml/data_extraction_rules"
        android:fullBackupContent="@xml/backup_rules"
        android:icon="@mipmap/ic_launcher"
        android:label="@string/app_name"
        android:networkSecurityConfig="@xml/network_security_config"
        android:requestLegacyExternalStorage="true"
        android:supportsRtl="true"
        android:theme="@style/Theme.WebRTCComposeDemo"
        tools:targetApi="31">

        <meta-data
            android:name="ScopedStorage"
            android:value="true" />

        <activity
            android:name=".MainActivity"
            android:exported="true"
            android:label="@string/app_name"
            android:theme="@style/Theme.WebRTCComposeDemo">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />

                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>
    </application>

</manifest>

3. Build the WebRTC utility class

We wrap the WebRTC publish/play logic into a utility class for later use. Comments in the code are sparse; for details, see this excellent write-up (WebRTC简介).

/**
 * Created by 玉念聿辉.
 * Use: WebRTC publish/play utility class
 * Date: 2023/5/9
 * Time: 11:23
 */
class WebRTCUtil(context: Context) : PeerConnection.Observer,
    SdpObserver {
    private val context: Context
    private var eglBase: EglBase? = null
    private var playUrl: String? = null
    private var peerConnection: PeerConnection? = null
    private var surfaceViewRenderer: SurfaceViewRenderer? = null
    private var peerConnectionFactory: PeerConnectionFactory? = null
    private var audioSource: AudioSource? = null
    private var videoSource: VideoSource? = null
    private var localAudioTrack: AudioTrack? = null
    private var localVideoTrack: VideoTrack? = null
    private var captureAndroid: VideoCapturer? = null
    private var surfaceTextureHelper: SurfaceTextureHelper? = null
    private var isShowCamera = false
    private var isPublish = false // true = publish (push), false = play (pull)
    private var reConnCount = 0
    fun create(
        eglBase: EglBase?,
        surfaceViewRenderer: SurfaceViewRenderer?,
        playUrl: String?,
        callBack: WebRtcCallBack?
    ) {
        create(eglBase, surfaceViewRenderer, false, playUrl, callBack)
    }

    fun create(
        eglBase: EglBase?,
        surfaceViewRenderer: SurfaceViewRenderer?,
        isPublish: Boolean,
        playUrl: String?,
        callBack: WebRtcCallBack?
    ) {
        this.eglBase = eglBase
        this.surfaceViewRenderer = surfaceViewRenderer
        this.callBack = callBack
        this.playUrl = playUrl
        this.isPublish = isPublish
        init()
    }

    fun create(
        eglBase: EglBase?,
        surfaceViewRenderer: SurfaceViewRenderer?,
        isPublish: Boolean,
        isShowCamera: Boolean,
        playUrl: String?,
        callBack: WebRtcCallBack?
    ) {
        this.eglBase = eglBase
        this.surfaceViewRenderer = surfaceViewRenderer
        this.callBack = callBack
        this.playUrl = playUrl
        this.isPublish = isPublish
        this.isShowCamera = isShowCamera
        init()
    }

    private fun init() {
        peerConnectionFactory = getPeerConnectionFactory(context)
        // Create the WebRTC PeerConnection via PeerConnectionFactory.createPeerConnection()
        Logging.enableLogToDebugOutput(Logging.Severity.LS_NONE)
        peerConnection = peerConnectionFactory!!.createPeerConnection(config, this)

        // Play (pull): receive-only transceivers
        if (!isPublish) {
            peerConnection!!.addTransceiver(
                MediaStreamTrack.MediaType.MEDIA_TYPE_AUDIO,
                RtpTransceiver.RtpTransceiverInit(RtpTransceiver.RtpTransceiverDirection.RECV_ONLY)
            )
            peerConnection!!.addTransceiver(
                MediaStreamTrack.MediaType.MEDIA_TYPE_VIDEO,
                RtpTransceiver.RtpTransceiverInit(RtpTransceiver.RtpTransceiverDirection.RECV_ONLY)
            )
        }
        // Publish (push): send-only transceivers
        else {
            peerConnection!!.addTransceiver(
                MediaStreamTrack.MediaType.MEDIA_TYPE_AUDIO,
                RtpTransceiver.RtpTransceiverInit(RtpTransceiver.RtpTransceiverDirection.SEND_ONLY)
            )
            peerConnection!!.addTransceiver(
                MediaStreamTrack.MediaType.MEDIA_TYPE_VIDEO,
                RtpTransceiver.RtpTransceiverInit(RtpTransceiver.RtpTransceiverDirection.SEND_ONLY)
            )

            // Enable WebRTC's acoustic echo cancellation and noise suppression
            WebRtcAudioUtils.setWebRtcBasedAcousticEchoCanceler(true)
            WebRtcAudioUtils.setWebRtcBasedNoiseSuppressor(true)

            // Add the audio track
            audioSource = peerConnectionFactory!!.createAudioSource(createAudioConstraints())
            localAudioTrack = peerConnectionFactory!!.createAudioTrack(AUDIO_TRACK_ID, audioSource)
            localAudioTrack!!.setEnabled(true)
            peerConnection!!.addTrack(localAudioTrack)

            // Add the video track (camera capture)
            if (isShowCamera) {
                captureAndroid = CameraUtil.createVideoCapture(context)
                surfaceTextureHelper =
                    SurfaceTextureHelper.create("CameraThread", eglBase!!.eglBaseContext)
                videoSource = peerConnectionFactory!!.createVideoSource(false)
                captureAndroid!!.initialize(
                    surfaceTextureHelper,
                    context,
                    videoSource!!.capturerObserver
                )
                captureAndroid!!.startCapture(VIDEO_RESOLUTION_WIDTH, VIDEO_RESOLUTION_HEIGHT, FPS)
                localVideoTrack = peerConnectionFactory!!.createVideoTrack(VIDEO_TRACK_ID, videoSource)
                localVideoTrack!!.setEnabled(true)
                if (surfaceViewRenderer != null) {
                    val videoSink = ProxyVideoSink()
                    videoSink.setTarget(surfaceViewRenderer)
                    localVideoTrack!!.addSink(videoSink)
                }
                peerConnection!!.addTrack(localVideoTrack)
            }
        }
        peerConnection!!.createOffer(this, MediaConstraints())
    }

    fun destroy() {
        if (callBack != null) {
            callBack = null
        }
        if (peerConnection != null) {
            peerConnection!!.dispose()
            peerConnection = null
        }
        if (surfaceTextureHelper != null) {
            surfaceTextureHelper!!.dispose()
            surfaceTextureHelper = null
        }
        if (captureAndroid != null) {
            captureAndroid!!.dispose()
            captureAndroid = null
        }
        if (surfaceViewRenderer != null) {
            surfaceViewRenderer!!.clearImage()
        }
        if (peerConnectionFactory != null) {
            peerConnectionFactory!!.dispose()
            peerConnectionFactory = null
        }
    }

    /**
     * Configure audio constraints.
     * @return
     */
    private fun createAudioConstraints(): MediaConstraints {
        val audioConstraints = MediaConstraints()
        audioConstraints.mandatory.add(
            MediaConstraints.KeyValuePair(AUDIO_ECHO_CANCELLATION_CONSTRAINT, "true")
        )
        audioConstraints.mandatory.add(
            MediaConstraints.KeyValuePair(AUDIO_AUTO_GAIN_CONTROL_CONSTRAINT, "false")
        )
        audioConstraints.mandatory.add(
            MediaConstraints.KeyValuePair(AUDIO_HIGH_PASS_FILTER_CONSTRAINT, "false")
        )
        audioConstraints.mandatory.add(
            MediaConstraints.KeyValuePair(AUDIO_NOISE_SUPPRESSION_CONSTRAINT, "true")
        )
        return audioConstraints
    }

    /**
     * Exchange the SDP with the media server over HTTP
     */
    private fun openWebRtc(sdp: String?) {
//        // Equivalent implementation using the Net coroutine library:
//        scopeNet {
//            val mediaType: MediaType = "application/json".toMediaTypeOrNull()!!
//            var mBody: RequestBody? = RequestBody.create(mediaType, sdp!!)
//            var result = Post<SdpBean>(playUrl!!) {
//                body = mBody
//                addHeader("Content-Type", "application/json")
//            }.await()
//            reConnCount = 0
//            setRemoteSdp(result.sdp)
//            FLogUtil.e(TAG, "SDP exchange: ${Gson().toJson(result)}")
//        }.catch {
//            FLogUtil.e(TAG, "SDP exchange failed: $it")
//            reConnCount++
//            if (reConnCount < 50) {
//                Timer().schedule(300) { // retry after 300 ms
//                    openWebRtc(sdp)
//                }
//            }
//        }

        reConnCount++
        val client: OkHttpClient = OkHttpClient.Builder()
            .connectTimeout(60, TimeUnit.SECONDS)
            .hostnameVerifier { _, _ -> true }.build()
        val mediaType: MediaType = "application/json".toMediaTypeOrNull()!!
        val body = RequestBody.create(mediaType, sdp!!)
        val request: Request = Request.Builder()
            .url(playUrl!!)
            .method("POST", body)
            .addHeader("Content-Type", "application/json")
            .build()
        val call = client.newCall(request)
        call.enqueue(object : Callback {
            override fun onFailure(call: Call, e: IOException) {
                FLogUtil.e(TAG, "SDP exchange reConnCount:$reConnCount failed: ${e.message}")
                // Cap the retries so an unreachable server doesn't make us loop forever
                if (reConnCount < 50) {
                    Timer().schedule(300) { // retry after 300 ms
                        openWebRtc(sdp)
                    }
                }
            }

            @Throws(IOException::class)
            override fun onResponse(call: Call, response: Response) {
                val result = response.body!!.string()
                FLogUtil.e(TAG, "SDP exchange: $result")
                val sdpBean = Gson().fromJson(result, SdpBean::class.java)
                if (sdpBean != null && !TextUtils.isEmpty(sdpBean.sdp)) {
                    if (sdpBean.code == 400) {
                        Timer().schedule(300) { // retry after 300 ms
                            openWebRtc(sdp)
                        }
                    } else {
                        reConnCount = 0
                        setRemoteSdp(sdpBean.sdp)
                        if (callBack != null) callBack!!.onSuccess()
                    }
                }
            }
        })
    }

    fun setRemoteSdp(sdp: String?) {
        if (peerConnection != null) {
            val remoteSpd = SessionDescription(SessionDescription.Type.ANSWER, sdp)
            peerConnection!!.setRemoteDescription(this, remoteSpd)
        }
    }

    interface WebRtcCallBack {
        fun onSuccess()
        fun onFail()
    }

    private var callBack: WebRtcCallBack? = null

    init {
        this.context = context.applicationContext
    }

    /**
     * Create the PeerConnectionFactory
     */
    private fun getPeerConnectionFactory(context: Context): PeerConnectionFactory {
        // Initialize once, with the internal tracer and the H.264 High Profile field trial enabled
        val initializationOptions: PeerConnectionFactory.InitializationOptions =
            PeerConnectionFactory.InitializationOptions.builder(context)
                .setEnableInternalTracer(true)
                .setFieldTrials("WebRTC-H264HighProfile/Enabled/")
                .createInitializationOptions()
        PeerConnectionFactory.initialize(initializationOptions)

        // Use the default video encoder/decoder factories
        val encoderFactory: VideoEncoderFactory = DefaultVideoEncoderFactory(
            eglBase!!.eglBaseContext,
            false,
            true
        )
        val decoderFactory: VideoDecoderFactory =
            DefaultVideoDecoderFactory(eglBase!!.eglBaseContext)

        // Build the factory. Note: do not call PeerConnectionFactory.initialize() a second
        // time here; doing so would discard the field trials set above.
        return PeerConnectionFactory.builder()
            .setOptions(PeerConnectionFactory.Options())
            .setAudioDeviceModule(JavaAudioDeviceModule.builder(context).createAudioDeviceModule())
            .setVideoEncoderFactory(encoderFactory)
            .setVideoDecoderFactory(decoderFactory)
            .createPeerConnectionFactory()
    }

    // Use Unified Plan: Plan B cannot express recv-only audio/video transceivers
    private val config: PeerConnection.RTCConfiguration
        get() {
            val rtcConfig: PeerConnection.RTCConfiguration =
                PeerConnection.RTCConfiguration(ArrayList())
            // Disable CPU-overuse detection (prevents automatic resolution scaling)
            rtcConfig.enableCpuOveruseDetection = false
            rtcConfig.sdpSemantics = PeerConnection.SdpSemantics.UNIFIED_PLAN
            return rtcConfig
        }

    override fun onCreateSuccess(sdp: SessionDescription) {
        if (sdp.type == SessionDescription.Type.OFFER) {
            // Set the offer as the local description, then send its SDP to the server
            peerConnection!!.setLocalDescription(this, sdp)
            if (!TextUtils.isEmpty(sdp.description)) {
                reConnCount = 0
                openWebRtc(sdp.description)
            }
        }
    }

    override fun onSetSuccess() {}
    override fun onCreateFailure(error: String?) {}
    override fun onSetFailure(error: String?) {}
    override fun onSignalingChange(newState: PeerConnection.SignalingState?) {}
    override fun onIceConnectionChange(newState: PeerConnection.IceConnectionState?) {}
    override fun onIceConnectionReceivingChange(receiving: Boolean) {}
    override fun onIceGatheringChange(newState: PeerConnection.IceGatheringState?) {}
    override fun onIceCandidate(candidate: IceCandidate?) {
        peerConnection!!.addIceCandidate(candidate)
    }
    override fun onIceCandidatesRemoved(candidates: Array<IceCandidate?>?) {
        peerConnection!!.removeIceCandidates(candidates)
    }
    override fun onAddStream(stream: MediaStream?) {}
    override fun onRemoveStream(stream: MediaStream?) {}
    override fun onDataChannel(dataChannel: DataChannel?) {}
    override fun onRenegotiationNeeded() {}
    override fun onAddTrack(receiver: RtpReceiver, mediaStreams: Array<MediaStream?>?) {
        val track: MediaStreamTrack = receiver.track()!!
        if (track is VideoTrack) {
            val remoteVideoTrack: VideoTrack = track
            remoteVideoTrack.setEnabled(true)
            if (surfaceViewRenderer != null && isShowCamera) {
                val videoSink = ProxyVideoSink()
                videoSink.setTarget(surfaceViewRenderer)
                remoteVideoTrack.addSink(videoSink)
            }
        }
    }

    companion object {
        private const val TAG = "WebRTCUtil"
        const val VIDEO_TRACK_ID = "ARDAMSv0"
        const val AUDIO_TRACK_ID = "ARDAMSa0"
        private const val VIDEO_RESOLUTION_WIDTH = 1280
        private const val VIDEO_RESOLUTION_HEIGHT = 720
        private const val FPS = 30
        private const val AUDIO_ECHO_CANCELLATION_CONSTRAINT = "googEchoCancellation"
        private const val AUDIO_AUTO_GAIN_CONTROL_CONSTRAINT = "googAutoGainControl"
        private const val AUDIO_HIGH_PASS_FILTER_CONSTRAINT = "googHighpassFilter"
        private const val AUDIO_NOISE_SUPPRESSION_CONSTRAINT = "googNoiseSuppression"
    }
}
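The openWebRtc() call above is, at heart, just an HTTP POST of the local offer SDP that gets answered with the remote SDP. Since the request shape differs between the two servers and is easy to get wrong, here is a small language-neutral sketch in Python of what goes over the wire. The URLs, port, and JSON field names below are from my recollection of the two servers' HTTP APIs, not something this demo verifies, so treat them as assumptions and check them against your server version:

```python
import json

def build_zlm_push_request(host, app, stream, offer_sdp):
    """ZLMediaKit: POST the raw offer SDP as the body of the webrtc API endpoint.
    (The demo sends it with Content-Type: application/json, which ZLM accepts.)"""
    url = f"http://{host}/index/api/webrtc?app={app}&stream={stream}&type=push"
    return url, offer_sdp, {"Content-Type": "application/json"}

def build_srs_publish_request(host, app, stream, offer_sdp):
    """SRS: the native /rtc/v1/publish/ API expects a JSON wrapper, not raw SDP."""
    url = f"http://{host}:1985/rtc/v1/publish/"
    body = json.dumps({
        "api": url,
        "streamurl": f"webrtc://{host}/{app}/{stream}",
        "sdp": offer_sdp,
    })
    return url, body, {"Content-Type": "application/json"}

def parse_answer(response_text):
    """Both servers reply with JSON: code == 0 on success, plus the answer SDP."""
    data = json.loads(response_text)
    if data.get("code") not in (0, None):
        raise RuntimeError(f"server rejected offer: {data}")
    return data["sdp"]
```

This is why the Kotlin code above can use one code path for both servers when targeting ZLMediaKit-style URLs; pointing it at SRS's native API would require wrapping the SDP in the JSON body first.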

4. MainActivity: drawing the publish-preview page

The page is simple: an input field plus a start-publishing button, with a live camera preview underneath. Since Compose has no replacement for SurfaceViewRenderer yet, we keep the XML view and load it with AndroidView. The implementation:

<?xml version="1.0" encoding="utf-8"?>
<org.webrtc.SurfaceViewRenderer xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/surface_view"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content">
</org.webrtc.SurfaceViewRenderer>

/**
 * Created by 玉念聿辉.
 * Use: WebRTC demo
 * Date: 2023/5/9
 * Time: 11:23
 */
class MainActivity : ComponentActivity() {
    private val permissionArray = arrayOf(
        Manifest.permission.RECORD_AUDIO,
        Manifest.permission.CAMERA,
        Manifest.permission.WRITE_EXTERNAL_STORAGE,
        Manifest.permission.READ_EXTERNAL_STORAGE
    )
    private var mEglBase: EglBase = EglBase.create()
    private var webRtcUtil1: WebRTCUtil? = null
    private var pushUrl =
        mutableStateOf("https://192.168.1.172/index/api/webrtc?app=live&stream=test&type=push")
    private var surfaceViewRenderer1: SurfaceViewRenderer? = null

    /**
     * Start publishing
     */
    private fun doPush() {
        if (TextUtils.isEmpty(pushUrl.value)) {
            Toast.makeText(this@MainActivity, "Push URL is empty!", Toast.LENGTH_SHORT).show()
            return
        }
        if (webRtcUtil1 != null) {
            webRtcUtil1!!.destroy()
        }
        webRtcUtil1 = WebRTCUtil(this@MainActivity)
        webRtcUtil1!!.create(
            mEglBase,
            surfaceViewRenderer1,
            isPublish = true,
            isShowCamera = true,
            playUrl = pushUrl.value,
            callBack = object : WebRTCUtil.WebRtcCallBack {
                override fun onSuccess() {}
                override fun onFail() {}
            })
    }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        FLogUtil.init(this, true) // initialize the logging util
        app = this // initialize the Net library
        getPermissions() // request runtime permissions
        webRtcUtil1 = WebRTCUtil(this@MainActivity)

        setContent {
            WebRTCComposeDemoTheme {
                Column(modifier = Modifier.fillMaxWidth()) {
                    // Push-URL input field and the publish button
                    Row(
                        modifier = Modifier.fillMaxWidth(),
                        verticalAlignment = Alignment.CenterVertically
                    ) {
                        Box(
                            modifier = Modifier.weight(1f)
                        ) {
                            TextField(
                                value = pushUrl.value,
                                onValueChange = {
                                    pushUrl.value = it
                                },
                                textStyle = TextStyle(
                                    color = Color(0xFF000000),
                                    fontSize = 14.sp
                                ), colors = TextFieldDefaults.textFieldColors(
                                    backgroundColor = Color(0x00FFFFFF),
                                    disabledIndicatorColor = Color.Transparent,
                                    errorIndicatorColor = Color.Transparent,
                                    focusedIndicatorColor = Color.Transparent,
                                    unfocusedIndicatorColor = Color.Transparent
                                ),
                                keyboardOptions = KeyboardOptions(keyboardType = KeyboardType.Text),
                                placeholder = {
                                    Text(
                                        text = "Enter the push URL",
                                        style = TextStyle(
                                            fontSize = 14.sp,
                                            color = Color(0xffc0c4cc)
                                        )
                                    )
                                }
                            )
                        }

                        Box(
                            modifier = Modifier
                                .clickable {
                                    doPush() // start publishing
                                }
                                .width(80.dp)
                                .background(
                                    Color.White,
                                    RoundedCornerShape(5.dp)
                                )
                                .border(
                                    1.dp,
                                    Color(0xFF000000),
                                    shape = RoundedCornerShape(5.dp)
                                )
                                .padding(5.dp),
                            contentAlignment = Alignment.Center
                        ) {
                            Text(text = "Publish")
                        }
                    }

                    // Publish preview area
                    Box(
                        modifier = Modifier
                            .fillMaxWidth()
                            .height(300.dp)
                    ) {
                        surfaceViewRenderer1 = mSurfaceViewRenderer(mEglBase, webRtcUtil1!!)
                        AndroidView({ surfaceViewRenderer1!! }) { videoView ->
                            CoroutineScope(Dispatchers.Main).launch {
                                // Scale the SurfaceViewRenderer to the video's aspect ratio ("width-height")
                                val screenSize = "480-640"
                                val screenSizeS: Array<String> =
                                    screenSize.split("-").toTypedArray()
                                val finalScreenSizeD =
                                    screenSizeS[0].toInt() / (screenSizeS[1].toInt() * 1.0)
                                var vto = videoView.viewTreeObserver
                                vto.addOnPreDrawListener {
                                    var width: Int = videoView.measuredWidth
                                    var height: Int = (finalScreenSizeD * width).toInt()
                                    // Apply the computed height once the width is known
                                    var layoutParams = videoView.layoutParams
                                    layoutParams.height = height
                                    videoView.layoutParams = layoutParams
                                    true
                                }
                            }
                        }
                    }
                }
            }
        }
    }

    /**
     * Request runtime permissions
     */
    private fun getPermissions() {
        XXPermissions.with(this@MainActivity)
            .permission(permissionArray)
            .request(object : OnPermissionCallback {
                override fun onGranted(@NonNull permissions: List<String>, allGranted: Boolean) {
                    if (!allGranted) {
                        Toast.makeText(this@MainActivity, "Please grant the required permissions, or the app may not work properly!", Toast.LENGTH_LONG)
                            .show()
                        return
                    }
                }

                override fun onDenied(@NonNull permissions: List<String>, doNotAskAgain: Boolean) {
                    if (doNotAskAgain) {
                        Toast.makeText(this@MainActivity, "Permission permanently denied; please grant it manually", Toast.LENGTH_LONG)
                            .show()
                        // Permanently denied: jump to the app's permission settings page
                        XXPermissions.startPermissionActivity(this@MainActivity, permissions)
                    } else {
                        Toast.makeText(this@MainActivity, "Failed to obtain permissions", Toast.LENGTH_LONG).show()
                    }
                }
            })
    }
}

/**
 * Since Compose has no replacement for SurfaceViewRenderer yet, we keep the XML view here
 */
@Composable
fun mSurfaceViewRenderer(mEglBase: EglBase, webRtcUtil1: WebRTCUtil): SurfaceViewRenderer {
    val context = LocalContext.current
    val surfaceViewRenderer = remember {
        SurfaceViewRenderer(context).apply {
            id = R.id.surface_view
        }
    }
    // Makes the SurfaceViewRenderer follow the lifecycle of this composable
    val lifecycleObserver = rememberMapLifecycleObserver(surfaceViewRenderer, mEglBase, webRtcUtil1)
    val lifecycle = LocalLifecycleOwner.current.lifecycle
    DisposableEffect(lifecycle) {
        lifecycle.addObserver(lifecycleObserver)
        onDispose {
            lifecycle.removeObserver(lifecycleObserver)
        }
    }
    return surfaceViewRenderer
}

@Composable
fun rememberMapLifecycleObserver(
    surfaceViewRenderer: SurfaceViewRenderer,
    mEglBase: EglBase, webRtcUtil1: WebRTCUtil
): LifecycleEventObserver =
    remember(surfaceViewRenderer) {
        LifecycleEventObserver { _, event ->
            when (event) {
                Lifecycle.Event.ON_CREATE -> {
                    surfaceViewRenderer.init(mEglBase.eglBaseContext, null)
                    surfaceViewRenderer.setScalingType(RendererCommon.ScalingType.SCALE_ASPECT_FILL)
                    surfaceViewRenderer.setEnableHardwareScaler(true)
                    surfaceViewRenderer.setZOrderMediaOverlay(true)
                }
                Lifecycle.Event.ON_START -> {
                }
                Lifecycle.Event.ON_RESUME -> {
                }
                Lifecycle.Event.ON_PAUSE -> {
                }
                Lifecycle.Event.ON_STOP -> {
                }
                Lifecycle.Event.ON_DESTROY -> {
                    webRtcUtil1.destroy()
                }
                else -> throw IllegalStateException()
            }
        }
    }

That completes a simple WebRTC publishing demo. Let's run the code and check the result.
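If the push appears to succeed and you want to confirm the stream actually reached the server, ZLMediaKit exposes a getMediaList HTTP API (protected by the secret from its config.ini) that lists every active stream. Here's a small sketch of checking it; the endpoint path, secret parameter, and response fields are from my memory of ZLM's API docs, so verify them against your deployment:

```python
import json

def media_list_url(host: str, secret: str) -> str:
    # ZLMediaKit admin API: lists all streams the server currently holds
    return f"http://{host}/index/api/getMediaList?secret={secret}"

def stream_is_online(response_body: str, app: str, stream: str) -> bool:
    """True if app/stream shows up in a getMediaList response."""
    data = json.loads(response_body)
    if data.get("code") != 0:
        return False
    return any(
        item.get("app") == app and item.get("stream") == stream
        for item in data.get("data") or []
    )
```

After tapping Publish in the demo, fetching that URL should (if my reading of the API is right) return an entry whose app and stream match the push URL, typically with an rtc schema among the registered protocols.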

That wraps up chapter four. Next time I'll cover how Android plays streams from SRS and ZLMediaKit. Sorry for taking up your spare time, and thanks for bearing with me.

THE END

Thanks for reading.
Edited by 玉念聿辉
