- * The user ID. In the local user's callback, uid is 0. In the remote users' callback, uid is the user ID of a remote user whose instantaneous volume is the highest.
+ * The user ID. In the local user's callback, uid = 0. In the remote users' callback, uid is the user ID of a remote user whose instantaneous volume is one of the three highest.
   */
  uid?: number;
  /**
- * The volume of the user. The value ranges between 0 (the lowest volume) and 255 (the highest volume). If the local user enables audio capturing and calls muteLocalAudioStream and set it as true to mute, the value of volume indicates the volume of locally captured audio signal.
+ * The volume of the user. The value ranges between 0 (lowest volume) and 255 (highest volume).
- * 1: The streaming server and CDN server are being connected.
+ * 1: The SDK is connecting to Agora's streaming server and the CDN server.
   */
  RtmpStreamPublishStateConnecting = 1,
  /**
   * 2: The RTMP or RTMPS streaming publishes. The SDK successfully publishes the RTMP or RTMPS streaming and returns this state.
   */
  RtmpStreamPublishStateRunning = 2,
  /**
- * 3: The RTMP or RTMPS streaming is recovering. When exceptions occur to the CDN, or the streaming is interrupted, the SDK tries to resume RTMP or RTMPS streaming and returns this state. If the SDK successfully resumes the streaming, RtmpStreamPublishStateRunning(2) returns. If the streaming does not resume within 60 seconds or server errors occur, RtmpStreamPublishStateFailure(4) returns. If you feel that 60 seconds is too long, you can also actively try to reconnect.
+ * 3: The RTMP or RTMPS streaming is recovering. When exceptions occur to the CDN, or the streaming is interrupted, the SDK tries to resume RTMP or RTMPS streaming and returns this state. If the SDK successfully resumes the streaming, RtmpStreamPublishStateRunning(2) returns.
+ * If the streaming does not resume within 60 seconds or server errors occur, RtmpStreamPublishStateFailure(4) returns. You can also reconnect to the server by calling the stopRtmpStream method.
   */
  RtmpStreamPublishStateRecovering = 3,
  /**
- * 4: The RTMP or RTMPS streaming fails. After a failure, you can troubleshoot the cause of the error through the returned error code.
+ * 4: The RTMP or RTMPS streaming fails. See the errCode parameter for the detailed error information.
   */
  RtmpStreamPublishStateFailure = 4,
  /**
- * 5: The SDK is disconnecting from the Agora streaming server and CDN. When you call stopRtmpStream to stop the Media Push normally, the SDK reports the Media Push state as RtmpStreamPublishStateDisconnecting and RtmpStreamPublishStateIdle in sequence.
+ * 5: The SDK is disconnecting from the Agora streaming server and CDN. When you call stopRtmpStream to stop the streaming normally, the SDK reports the streaming state as RtmpStreamPublishStateDisconnecting and RtmpStreamPublishStateIdle in sequence.

- * 11: The user role is not host, so the user cannot use the CDN live streaming function. Check your application code logic.
+ * 11: The user role is not host, so the user cannot use the CDN live streaming function. Check your app code logic.
   */
  RtmpStreamPublishErrorNotBroadcaster = 11,
  /**
- * 13: The updateRtmpTranscoding method is called to update the transcoding configuration in a scenario where there is streaming without transcoding. Check your application code logic.
+ * 13: The updateRtmpTranscoding or setLiveTranscoding method is called to update the transcoding configuration in a scenario where there is streaming without transcoding. Check your application code logic.
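To make the state and error semantics above concrete, a minimal handler sketch follows. It assumes the RtmpStreamPublishState and RtmpStreamPublishErrorType enums and the stopRtmpStream/startRtmpStreamWithoutTranscoding methods shown in this diff are exported by the react-native-agora package these files belong to; the restart-on-failure policy is purely illustrative, not the SDK's prescribed behavior.

```ts
import {
  IRtcEngine,
  RtmpStreamPublishErrorType,
  RtmpStreamPublishState,
} from 'react-native-agora';

// Illustrative only: react to the recovery/failure states documented above.
export function handleRtmpState(
  engine: IRtcEngine,
  url: string,
  state: RtmpStreamPublishState,
  errCode: RtmpStreamPublishErrorType
) {
  switch (state) {
    case RtmpStreamPublishState.RtmpStreamPublishStateRecovering:
      // The SDK retries on its own for up to 60 seconds.
      console.log(`Recovering stream to ${url}...`);
      break;
    case RtmpStreamPublishState.RtmpStreamPublishStateFailure:
      // Inspect errCode, then tear down and (optionally) restart manually.
      console.warn(`Stream to ${url} failed, errCode=${errCode}`);
      engine.stopRtmpStream(url);
      // Assumes the stream was started without transcoding; use the
      // transcoding variant otherwise.
      engine.startRtmpStreamWithoutTranscoding(url);
      break;
    default:
      break;
  }
}
```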
ts/Private/AgoraMediaBase.ts (+3 -2)
@@ -1,5 +1,6 @@
  import './extension/AgoraMediaBaseExtension';
  import { EncodedVideoFrameInfo } from './AgoraBase';
+
  /**
   * The capture type of the custom video source.
   */
@@ -910,7 +911,7 @@ export class UserAudioSpectrumInfo {
  export interface IAudioSpectrumObserver {
  /**
   * Gets the statistics of a local audio spectrum.
- * After successfully calling registerAudioSpectrumObserver to implement the onLocalAudioSpectrum callback in IAudioSpectrumObserver and calling enableAudioSpectrumMonitor to enable audio spectrum monitoring, the SDK will trigger the callback as the time interval you set to report the received remote audio data spectrum.
+ * After successfully calling registerAudioSpectrumObserver to implement the onLocalAudioSpectrum callback in IAudioSpectrumObserver and calling enableAudioSpectrumMonitor to enable audio spectrum monitoring, the SDK will trigger the callback as the time interval you set to report the received remote audio data spectrum.
   *
   * @param data The audio spectrum data of the local user. See AudioSpectrumData.
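For context, wiring up the observer described above might look like the following sketch. The registerAudioSpectrumObserver and enableAudioSpectrumMonitor method names come from the doc text in this diff; the package name, the millisecond interval unit, and the AudioSpectrumData export are assumptions based on other Agora 4.x SDKs.

```ts
import { createAgoraRtcEngine, AudioSpectrumData } from 'react-native-agora';

const engine = createAgoraRtcEngine();
engine.initialize({ appId: '<your app id>' });

// Receive the local audio spectrum at the interval you configure below.
engine.registerAudioSpectrumObserver({
  onLocalAudioSpectrum: (data: AudioSpectrumData) => {
    console.log('local spectrum:', data);
    return true;
  },
});

// Interval is assumed to be in milliseconds; check enableAudioSpectrumMonitor.
engine.enableAudioSpectrumMonitor(500);
```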
   * Occurs each time the SDK receives a video frame captured by the local camera.
- * After you successfully register the video frame observer, the SDK triggers this callback each time it receives a video frame. In this callback, you can get the video data captured by the local camera. You can then pre-process the data according to your scenarios. Once the pre-processing is complete, you can directly modify videoFrame in this callback, and set the return value to true to send the modified video data to the SDK. The video data that this callback gets has not been pre-processed, and is not watermarked, cropped, rotated or beautified. If the video data type you get is RGBA, the SDK does not support processing the data of the alpha channel.
+ * After you successfully register the video frame observer, the SDK triggers this callback each time it receives a video frame. In this callback, you can get the video data captured by the local camera. You can then pre-process the data according to your scenarios. After pre-processing, you can send the processed video data back to the SDK through this callback. The video data that this callback gets has not been pre-processed, and is not watermarked, cropped, rotated or beautified. If the video data type you get is RGBA, the SDK does not support processing the data of the alpha channel.
   *
   * @param videoFrame The video frame. See VideoFrame. The default value of the video frame data format obtained through this callback is as follows: macOS: YUV 420; Windows: YUV 420.
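A registration sketch for this callback is shown below. It assumes the observer is registered through the media engine (engine.getMediaEngine().registerVideoFrameObserver) and that onCaptureVideoFrame takes the single videoFrame parameter documented above; the exact callback signature and the amount of raw data actually delivered can differ between SDK versions, so treat this as an outline rather than a reference implementation.

```ts
import { createAgoraRtcEngine, VideoFrame } from 'react-native-agora';

const engine = createAgoraRtcEngine();
engine.initialize({ appId: '<your app id>' });

engine.getMediaEngine().registerVideoFrameObserver({
  onCaptureVideoFrame: (videoFrame: VideoFrame) => {
    // The frame arrives unprocessed: no watermark, crop, rotation, or
    // beautification has been applied yet.
    console.log(`captured frame ${videoFrame.width}x${videoFrame.height}`);
    return true; // hand the (possibly modified) frame back to the SDK
  },
});
```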
ts/Private/IAgoraMediaEngine.ts (+2 -1)
@@ -13,6 +13,7 @@ import {
    IVideoEncodedFrameObserver,
    IVideoFrameObserver,
  } from './AgoraMediaBase';
+
  /**
   * The channel mode.
   */
@@ -126,7 +127,7 @@ export abstract class IMediaEngine {
   * Call this method before joining a channel.
   *
   * @param enabled Whether to enable the external audio source: true: Enable the external audio source. false: (Default) Disable the external audio source.
- * @param sampleRate The sample rate (Hz) of the external audio source which can be set as 8000, 16000, 32000, 44100, or 48000.
+ * @param sampleRate The sample rate (Hz) of the external audio which can be set as 8000, 16000, 32000, 44100, or 48000.
   * @param channels The number of channels of the external audio source, which can be set as 1 (Mono) or 2 (Stereo).
   * @param sourceNumber The number of external audio sources. The value of this parameter should be larger than 0. The SDK creates a corresponding number of custom audio tracks based on this parameter value and names the audio tracks starting from 0. In ChannelMediaOptions, you can set publishCustomAudioSourceId to the audio track ID you want to publish.
   * @param localPlayback Whether to play the external audio source: true: Play the external audio source. false: (Default) Do not play the external source.
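A call using the parameters documented above might look like the sketch below. The setExternalAudioSource name and parameter order are taken from this diff; getMediaEngine(), the publishCustomAudioTrack option, and the joinChannel call in the comment are assumptions based on the wider Agora 4.x API surface.

```ts
import { createAgoraRtcEngine } from 'react-native-agora';

const engine = createAgoraRtcEngine();
engine.initialize({ appId: '<your app id>' });

// Enable one external (custom) audio source before joining a channel:
// 48 kHz, stereo, one custom track (track IDs start from 0), no local playback.
engine.getMediaEngine().setExternalAudioSource(true, 48000, 2, 1, false);

// Later, publish custom track 0 through ChannelMediaOptions, for example:
// engine.joinChannel('<token>', '<channel>', 0, {
//   publishCustomAudioTrack: true,   // assumed field name
//   publishCustomAudioSourceId: 0,   // documented above
// });
```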
@@ -1038,15 +1039,15 @@ export class ChannelMediaOptions {
   */
  publishCameraTrack?: boolean;
  /**
- * Whether to publish the video captured by the second camera: true: Publish the video captured by the second camera. false: (Default) Do not publish the video captured by the second camera.
+ * @ignore
   */
  publishSecondaryCameraTrack?: boolean;
  /**
   * Whether to publish the audio captured by the microphone: true: (Default) Publish the audio captured by the microphone. false: Do not publish the audio captured by the microphone.
   */
  publishMicrophoneTrack?: boolean;
  /**
- * @ignore
+ * Whether to publish the video captured from the screen: true: Publish the video captured from the screen. false: (Default) Do not publish the video captured from the screen. This parameter applies to Android and iOS only.
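A short sketch of ChannelMediaOptions in use follows. It only sets fields that appear in this diff (publishCameraTrack, publishMicrophoneTrack) plus a clientRoleType field whose name and enum are assumptions; the joinChannel method name and signature may differ by SDK version (some versions call it joinChannelWithOptions).

```ts
import {
  ChannelMediaOptions,
  ClientRoleType,
  createAgoraRtcEngine,
} from 'react-native-agora';

const engine = createAgoraRtcEngine();
engine.initialize({ appId: '<your app id>' });

// Publish the camera and microphone; all other options keep their defaults.
const options: ChannelMediaOptions = {
  publishCameraTrack: true,
  publishMicrophoneTrack: true,
  clientRoleType: ClientRoleType.ClientRoleBroadcaster, // assumed field/enum name
};

// '<token>' and '<channel>' are placeholders.
engine.joinChannel('<token>', '<channel>', 0, options);
```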
- * By default, this callback is disabled. You can enable it by calling enableAudioVolumeIndication. Once this callback is enabled and users send streams in the channel, the SDK triggers the onAudioVolumeIndication callback according to the time interval set in enableAudioVolumeIndication. The SDK triggers two independent onAudioVolumeIndication callbacks simultaneously, which separately report the volume information of the local user who sends a stream and the remote users (up to three) whose instantaneous volume is the highest. Once this callback is enabled, if the local user calls the muteLocalAudioStream method to mute, the SDK continues to report the volume indication of the local user. If a remote user whose volume is one of the three highest in the channel stops publishing the audio stream for 20 seconds, the callback excludes this user's information; if all remote users stop publishing audio streams for 20 seconds, the SDK stops triggering the callback for remote users.
+ * By default, this callback is disabled. You can enable it by calling enableAudioVolumeIndication. Once this callback is enabled and users send streams in the channel, the SDK triggers the onAudioVolumeIndication callback according to the time interval set in enableAudioVolumeIndication. The SDK triggers two independent onAudioVolumeIndication callbacks simultaneously, which separately report the volume information of the local user who sends a stream and the remote users (up to three) whose instantaneous volume is the highest. Once this callback is enabled, if the local user calls the muteLocalAudioStream method for muting, the SDK continues to report the volume indication of the local user. In the callbacks triggered, the volume information about the local user is 0. If a remote user whose volume is one of the three highest in the channel stops publishing the audio stream for 20 seconds, the callback excludes this user's information; if all remote users stop publishing audio streams for 20 seconds, the SDK stops triggering the callback for remote users.
   *
   * @param connection The connection information. See RtcConnection.
- * @param speakers The volume information of the users. See AudioVolumeInfo. An empty speakers array in the callback indicates that no remote user is in the channel or is sending a stream.
+ * @param speakers The volume information of the users, see AudioVolumeInfo. An empty speakers array in the callback indicates that no remote user is in the channel or is sending a stream.
   * @param speakerNumber The total number of users. In the callback for the local user, if the local user is sending streams, the value of speakerNumber is 1. In the callback for remote users, the value range of speakerNumber is [0,3]. If the number of remote users who send streams is greater than or equal to three, the value of speakerNumber is 3.
   * @param totalVolume The volume of the speaker. The value range is [0,255]. In the callback for the local user, totalVolume is the volume of the local user who sends a stream. In the callback for remote users, totalVolume is the sum of the volume of all remote users (up to three) whose instantaneous volume is the highest.
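The sketch below shows how the callback and enableAudioVolumeIndication fit together. The parameter names mirror the doc text above; the enableAudioVolumeIndication(interval, smooth, reportVad) shape, the registerEventHandler entry point, and the concrete values are assumptions for illustration.

```ts
import { createAgoraRtcEngine } from 'react-native-agora';

const engine = createAgoraRtcEngine();
engine.initialize({ appId: '<your app id>' });

// Ask for volume reports roughly every 200 ms.
engine.enableAudioVolumeIndication(200, 3, false);

engine.registerEventHandler({
  onAudioVolumeIndication: (connection, speakers, speakerNumber, totalVolume) => {
    // speakers is empty when no remote user is in the channel or sending a stream.
    speakers.forEach((s) => {
      // uid is 0 in the local user's callback; volume is in [0, 255].
      console.log(`uid=${s.uid} volume=${s.volume}`);
    });
    console.log(`speakerNumber=${speakerNumber} totalVolume=${totalVolume}`);
  },
});
```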
- * When the state of Media Push changes, the SDK triggers this callback and reports the URL address and the current state of the Media Push. This callback indicates the state of the Media Push. When exceptions occur, you can troubleshoot issues by referring to the detailed error descriptions in the error code parameter.
+ * Occurs when the media push state changes.
+ * When the media push state changes, the SDK triggers this callback and reports the URL address and the current state of the media push. This callback indicates the state of the media push. When exceptions occur, you can troubleshoot issues by referring to the detailed error descriptions in the error code parameter.
   *
- * @param url The URL address where the state of the Media Push changes.
- * @param state The current state of the Media Push. See RtmpStreamPublishState.
- * @param errCode The detailed error information for the Media Push. See RtmpStreamPublishErrorType.
+ * @param url The URL address where the state of the media push changes.
+ * @param state The current state of the media push. See RtmpStreamPublishState.
+ * @param errCode The detailed error information for the media push. See RtmpStreamPublishErrorType.
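Subscribing to this callback might look like the sketch below; it assumes registerEventHandler accepts a partial event handler object, as in other Agora 4.x SDKs. A recovery policy such as the earlier handleRtmpState sketch could be invoked from inside the callback.

```ts
import {
  createAgoraRtcEngine,
  RtmpStreamPublishState,
} from 'react-native-agora';

const engine = createAgoraRtcEngine();
engine.initialize({ appId: '<your app id>' });

engine.registerEventHandler({
  // url, state, and errCode mirror the three parameters documented above.
  onRtmpStreamingStateChanged: (url, state, errCode) => {
    console.log(`media push to ${url}: state=${state} errCode=${errCode}`);
    if (state === RtmpStreamPublishState.RtmpStreamPublishStateFailure) {
      // errCode tells you what went wrong; stop the failed stream here.
      engine.stopRtmpStream(url);
    }
  },
});
```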
ts/Private/IAudioDeviceManager.ts (+2 -1)
@@ -1,5 +1,6 @@
  import './extension/IAudioDeviceManagerExtension';
  import { AudioDeviceInfo } from './IAgoraRtcEngine';
+
  /**
   * The maximum length of the device ID.
   */
@@ -77,7 +78,7 @@ export abstract class IAudioDeviceManager {
   * Sets the audio capture device.
   * You can call this method to change the audio route currently being used, but this does not change the default audio route. For example, if the default audio route is microphone, you call this method to set the audio route as bluetooth earphones before joining a channel and then start a device test, the SDK conducts the device test on the bluetooth earphones. After the device test is completed and you join a channel, the SDK still uses the microphone for audio capturing.
   *
- * @param deviceId The ID of the audio capture device. You can get the media player ID by calling enumerateRecordingDevices. Connecting or disconnecting the audio device does not change the value of deviceId. The maximum length is MaxDeviceIdLengthType.
+ * @param deviceId The ID of the audio capture device. You can get the device ID by calling enumerateRecordingDevices. Connecting or disconnecting the audio device does not change the value of deviceId. The maximum length is MaxDeviceIdLengthType.
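Selecting a capture device from the enumeration result might look like this sketch. The enumerateRecordingDevices name and the AudioDeviceInfo type come from this diff; getAudioDeviceManager(), setRecordingDevice, and the deviceId field are assumptions based on the broader Agora 4.x desktop API, and the whole flow applies to desktop platforms only.

```ts
import { createAgoraRtcEngine } from 'react-native-agora';

const engine = createAgoraRtcEngine();
engine.initialize({ appId: '<your app id>' });

// Pick the first reported audio capture device, if any.
const adm = engine.getAudioDeviceManager();
const first = adm.enumerateRecordingDevices()[0];
if (first?.deviceId) {
  adm.setRecordingDevice(first.deviceId);
}
```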