[AUTO] Generate code by terra #980

Merged 1 commit on Apr 13, 2023
42 changes: 22 additions & 20 deletions ts/Private/AgoraBase.ts
@@ -1,5 +1,6 @@
import './extension/AgoraBaseExtension';
import { RenderModeType, VideoSourceType } from './AgoraMediaBase';

/**
* The channel profile.
*/
@@ -1241,11 +1242,11 @@ export class CodecCapInfo {
/**
* @ignore
*/
codec_type?: VideoCodecType;
codecType?: VideoCodecType;
/**
* @ignore
*/
codec_cap_mask?: number;
codecCapMask?: number;
}

/**
@@ -2241,11 +2242,11 @@ export enum RemoteVideoDownscaleLevel {
*/
export class AudioVolumeInfo {
/**
* The user ID.In the local user's callback, uid is 0.In the remote users' callback, uid is the user ID of a remote user whose instantaneous volume is the highest.
* The user ID.In the local user's callback, uid = 0.In the remote users' callback, uid is the user ID of a remote user whose instantaneous volume is one of the three highest.
*/
uid?: number;
/**
* The volume of the user. The value ranges between 0 (the lowest volume) and 255 (the highest volume). If the local user enables audio capturing and calls muteLocalAudioStream and set it as true to mute, the value of volume indicates the volume of locally captured audio signal.
* The volume of the user. The value ranges between 0 (lowest volume) and 255 (highest volume).
*/
volume?: number;
/**
@@ -2389,23 +2390,24 @@ export enum RtmpStreamPublishState {
*/
RtmpStreamPublishStateIdle = 0,
/**
* 1: The streaming server and CDN server are being connected.
* 1: The SDK is connecting to Agora's streaming server and the CDN server.
*/
RtmpStreamPublishStateConnecting = 1,
/**
* 2: The RTMP or RTMPS streaming publishes. The SDK successfully publishes the RTMP or RTMPS streaming and returns this state.
*/
RtmpStreamPublishStateRunning = 2,
/**
* 3: The RTMP or RTMPS streaming is recovering. When exceptions occur to the CDN, or the streaming is interrupted, the SDK tries to resume RTMP or RTMPS streaming and returns this state.If the SDK successfully resumes the streaming, RtmpStreamPublishStateRunning(2) returns.If the streaming does not resume within 60 seconds or server errors occur, RtmpStreamPublishStateFailure(4) returns. If you feel that 60 seconds is too long, you can also actively try to reconnect.
* 3: The RTMP or RTMPS streaming is recovering. When exceptions occur to the CDN, or the streaming is interrupted, the SDK tries to resume RTMP or RTMPS streaming and returns this state.If the SDK successfully resumes the streaming, RtmpStreamPublishStateRunning(2) returns.
* If the streaming does not resume within 60 seconds or server errors occur, RtmpStreamPublishStateFailure(4) returns. You can also reconnect to the server by calling the stopRtmpStream method.
*/
RtmpStreamPublishStateRecovering = 3,
/**
* 4: The RTMP or RTMPS streaming fails. After a failure, you can troubleshoot the cause of the error through the returned error code.
* 4: The RTMP or RTMPS streaming fails. See the errCode parameter for the detailed error information.
*/
RtmpStreamPublishStateFailure = 4,
/**
* 5: The SDK is disconnecting from the Agora streaming server and CDN. When you call stopRtmpStream to stop the Media Push normally, the SDK reports the Media Push state as RtmpStreamPublishStateDisconnecting and RtmpStreamPublishStateIdle in sequence.
* 5: The SDK is disconnecting from the Agora streaming server and CDN. When you call stopRtmpStream to stop the streaming normally, the SDK reports the streaming state as RtmpStreamPublishStateDisconnecting and RtmpStreamPublishStateIdle in sequence.
*/
RtmpStreamPublishStateDisconnecting = 5,
}
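
The states above arrive through the onRtmpStreamingStateChanged callback documented further down in IAgoraRtcEngine.ts. A minimal sketch of reacting to them, assuming an already-initialized engine and import paths resolved against this repo's ts directory; the helper name is illustrative only:

```typescript
import { RtmpStreamPublishState } from './ts/Private/AgoraBase';
import { IRtcEngine } from './ts/Private/IAgoraRtcEngine';

// Sketch only: react to Media Push state changes reported by the SDK.
function handlePublishState(
  engine: IRtcEngine,
  url: string,
  state: RtmpStreamPublishState
): void {
  switch (state) {
    case RtmpStreamPublishState.RtmpStreamPublishStateRunning:
      console.log(`Media Push is live at ${url}`);
      break;
    case RtmpStreamPublishState.RtmpStreamPublishStateRecovering:
      // The SDK retries for up to 60 seconds. To give up earlier, stop the
      // stream here and publish again, as the enum doc above suggests.
      engine.stopRtmpStream(url);
      break;
    case RtmpStreamPublishState.RtmpStreamPublishStateFailure:
      console.warn(`Media Push failed for ${url}; inspect errCode`);
      break;
    default:
      break;
  }
}
```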
@@ -2415,7 +2417,7 @@ export enum RtmpStreamPublishState {
*/
export enum RtmpStreamPublishErrorType {
/**
* 0: The RTMP or RTMPS streaming has not started or has ended.
* 0: The RTMP or RTMPS streaming publishes successfully.
*/
RtmpStreamPublishErrorOk = 0,
/**
@@ -2427,19 +2429,19 @@ export enum RtmpStreamPublishErrorType {
*/
RtmpStreamPublishErrorEncryptedStreamNotAllowed = 2,
/**
* 3: Timeout for the RTMP or RTMPS streaming.
* 3: Timeout for the RTMP or RTMPS streaming. Try to publish the streaming again.
*/
RtmpStreamPublishErrorConnectionTimeout = 3,
/**
* 4: An error occurs in Agora's streaming server.
* 4: An error occurs in Agora's streaming server. Try to publish the streaming again.
*/
RtmpStreamPublishErrorInternalServerError = 4,
/**
* 5: An error occurs in the CDN server.
*/
RtmpStreamPublishErrorRtmpServerError = 5,
/**
* 6: The RTMP or RTMPS streaming publishes too frequently.
* 6: The RTMP or RTMPS streaming publishing requests are too frequent.
*/
RtmpStreamPublishErrorTooOften = 6,
/**
@@ -2459,11 +2461,11 @@ export enum RtmpStreamPublishErrorType {
*/
RtmpStreamPublishErrorFormatNotSupported = 10,
/**
* 11: The user role is not host, so the user cannot use the CDN live streaming function. Check your application code logic.
* 11: The user role is not host, so the user cannot use the CDN live streaming function. Check your app code logic.
*/
RtmpStreamPublishErrorNotBroadcaster = 11,
/**
* 13: The updateRtmpTranscoding method is called to update the transcoding configuration in a scenario where there is streaming without transcoding. Check your application code logic.
* 13: The updateRtmpTranscoding or setLiveTranscoding method is called to update the transcoding configuration in a scenario where there is streaming without transcoding. Check your application code logic.
*/
RtmpStreamPublishErrorTranscodingNoMixStream = 13,
/**
@@ -2475,25 +2477,25 @@ export enum RtmpStreamPublishErrorType {
*/
RtmpStreamPublishErrorInvalidAppid = 15,
/**
* 16: Your project does not have permission to use streaming services. Refer to Media Push to enable the Media Push permission.
* @ignore
*/
RtmpStreamPublishErrorInvalidPrivilege = 16,
/**
* 100: The streaming has been stopped normally. After you stop the media push, the SDK returns this value.
* 100: The streaming has been stopped normally. After you call stopRtmpStream to stop streaming, the SDK returns this value.
*/
RtmpStreamUnpublishErrorOk = 100,
}
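
Codes 3 and 4 above explicitly invite another publish attempt, while most of the remaining codes point at configuration or permission problems. A hedged sketch of turning the errCode reported by onRtmpStreamingStateChanged into a retry decision, using only members defined above; the helper name is an assumption:

```typescript
import { RtmpStreamPublishErrorType } from './ts/Private/AgoraBase';

// Sketch only: decide whether another publish attempt is worthwhile.
function shouldRetryPublish(errCode: RtmpStreamPublishErrorType): boolean {
  switch (errCode) {
    // The docs above suggest trying to publish again for these two codes.
    case RtmpStreamPublishErrorType.RtmpStreamPublishErrorConnectionTimeout:
    case RtmpStreamPublishErrorType.RtmpStreamPublishErrorInternalServerError:
      return true;
    // Configuration or permission problems: retrying will not help.
    case RtmpStreamPublishErrorType.RtmpStreamPublishErrorInvalidAppid:
    case RtmpStreamPublishErrorType.RtmpStreamPublishErrorNotBroadcaster:
    case RtmpStreamPublishErrorType.RtmpStreamPublishErrorFormatNotSupported:
      return false;
    default:
      return false;
  }
}
```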

/**
* Events during the Media Push.
* Events during the media push.
*/
export enum RtmpStreamingEvent {
/**
* 1: An error occurs when you add a background image or a watermark image in the Media Push.
* 1: An error occurs when you add a background image or a watermark image in the media push.
*/
RtmpStreamingEventFailedLoadImage = 1,
/**
* 2: The streaming URL is already being used for Media Push. If you want to start new streaming, use a new streaming URL.
* 2: The streaming URL is already being used for CDN live streaming. If you want to start new streaming, use a new streaming URL.
*/
RtmpStreamingEventUrlAlreadyInUse = 2,
/**
@@ -3155,7 +3157,7 @@ export class VideoCanvas {
*/
sourceType?: VideoSourceType;
/**
* The ID of the media player. You can get the media player ID by calling getMediaPlayerId .
* The ID of the media player. You can get the Device ID by calling getMediaPlayerId .
*/
mediaPlayerId?: number;
/**
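
For the VideoCanvas.mediaPlayerId field touched in the last hunk, a short sketch of the intended flow: create a media player, read its ID with getMediaPlayerId, and reference that ID from the canvas. The helper name and the assumption of an already-initialized engine are illustrative only:

```typescript
import { VideoCanvas } from './ts/Private/AgoraBase';
import { RenderModeType, VideoSourceType } from './ts/Private/AgoraMediaBase';
import { IRtcEngine } from './ts/Private/IAgoraRtcEngine';

// Sketch only: build a VideoCanvas that renders a media player's video.
function mediaPlayerCanvas(engine: IRtcEngine): VideoCanvas {
  const player = engine.createMediaPlayer();
  return {
    uid: 0,
    sourceType: VideoSourceType.VideoSourceMediaPlayer,
    // The ID documented above, obtained from the player itself.
    mediaPlayerId: player.getMediaPlayerId(),
    renderMode: RenderModeType.RenderModeFit,
  };
}
```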
5 changes: 3 additions & 2 deletions ts/Private/AgoraMediaBase.ts
@@ -1,5 +1,6 @@
import './extension/AgoraMediaBaseExtension';
import { EncodedVideoFrameInfo } from './AgoraBase';

/**
* The capture type of the custom video source.
*/
@@ -910,7 +911,7 @@ export class UserAudioSpectrumInfo {
export interface IAudioSpectrumObserver {
/**
* Gets the statistics of a local audio spectrum.
* After successfully calling registerAudioSpectrumObserver to implement the onLocalAudioSpectrum callback in IAudioSpectrumObserver and calling enableAudioSpectrumMonitor to enable audio spectrum monitoring, the SDK will trigger the callback as the time interval you set to report the received remote audio data spectrum.
* After successfully calling registerAudioSpectrumObserver to implement the onLocalAudioSpectrumcallback in IAudioSpectrumObserver and calling enableAudioSpectrumMonitor to enable audio spectrum monitoring, the SDK will trigger the callback as the time interval you set to report the received remote audio data spectrum.
*
* @param data The audio spectrum data of the local user. See AudioSpectrumData .
*
@@ -979,7 +980,7 @@ export enum VideoFrameProcessMode {
export interface IVideoFrameObserver {
/**
* Occurs each time the SDK receives a video frame captured by the local camera.
* After you successfully register the video frame observer, the SDK triggers this callback each time it receives a video frame. In this callback, you can get the video data captured by the local camera. You can then pre-process the data according to your scenarios.Once the pre-processing is complete, you can directly modify videoFrame in this callback, and set the return value to true to send the modified video data to the SDK.The video data that this callback gets has not been pre-processed, and is not watermarked, cropped, rotated or beautified.If the video data type you get is RGBA, the SDK does not support processing the data of the alpha channel.
* After you successfully register the video frame observer, the SDK triggers this callback each time it receives a video frame. In this callback, you can get the video data captured by the local camera. You can then pre-process the data according to your scenarios.After pre-processing, you can send the processed video data back to the SDK through this callback.The video data that this callback gets has not been pre-processed, and is not watermarked, cropped, rotated or beautified.If the video data type you get is RGBA, the SDK does not support processing the data of the alpha channel.
*
* @param videoFrame The video frame. See VideoFrame .The default value of the video frame data format obtained through this callback is as follows:macOS: YUV 420Windows: YUV 420
*
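
A hedged sketch of the registration flow that the onLocalAudioSpectrum doc above describes: register an IAudioSpectrumObserver, then call enableAudioSpectrumMonitor so the SDK starts reporting at the chosen interval. The helper name and the 100 ms interval are illustrative assumptions:

```typescript
import {
  AudioSpectrumData,
  IAudioSpectrumObserver,
} from './ts/Private/AgoraMediaBase';
import { IRtcEngine } from './ts/Private/IAgoraRtcEngine';

// Sketch only: assumes `engine` is already initialized.
function watchLocalSpectrum(engine: IRtcEngine): void {
  const observer: IAudioSpectrumObserver = {
    onLocalAudioSpectrum: (data: AudioSpectrumData) => {
      console.log('local audio spectrum', data);
    },
  };
  engine.registerAudioSpectrumObserver(observer);
  // Report spectrum data every 100 ms (interval is in milliseconds).
  engine.enableAudioSpectrumMonitor(100);
}
```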
3 changes: 2 additions & 1 deletion ts/Private/IAgoraMediaEngine.ts
@@ -13,6 +13,7 @@ import {
IVideoEncodedFrameObserver,
IVideoFrameObserver,
} from './AgoraMediaBase';

/**
* The channel mode.
*/
@@ -126,7 +127,7 @@ export abstract class IMediaEngine {
* Call this method before joining a channel.
*
* @param enabled Whether to enable the external audio source:true: Enable the external audio source.false: (Default) Disable the external audio source.
* @param sampleRate The sample rate (Hz) of the external audio source which can be set as 8000, 16000, 32000, 44100, or 48000.
* @param sampleRate The sample rate (Hz) of the external audio which can be set as 8000, 16000, 32000, 44100, or 48000.
* @param channels The number of channels of the external audio source, which can be set as 1 (Mono) or 2 (Stereo).
* @param sourceNumber The number of external audio sources. The value of this parameter should be larger than 0. The SDK creates a corresponding number of custom audio tracks based on this parameter value and names the audio tracks starting from 0. In ChannelMediaOptions , you can set publishCustomAudioSourceId to the audio track ID you want to publish.
* @param localPlayback Whether to play the external audio source:true: Play the external audio source.false: (Default) Do not play the external source.
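
A minimal sketch of the setExternalAudioSource call documented above, passing the documented parameters positionally (enabled, sampleRate, channels, sourceNumber, localPlayback); per the doc, call it before joining a channel. The helper name and the chosen values are assumptions:

```typescript
import { IRtcEngine } from './ts/Private/IAgoraRtcEngine';

// Sketch only: one external 48 kHz stereo audio source, no local playback.
function enableExternalAudio(engine: IRtcEngine): void {
  const mediaEngine = engine.getMediaEngine();
  // enabled, sampleRate (Hz), channels, sourceNumber, localPlayback
  mediaEngine.setExternalAudioSource(true, 48000, 2, 1, false);
}
```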
1 change: 1 addition & 0 deletions ts/Private/IAgoraMediaPlayer.ts
@@ -14,6 +14,7 @@ import {
PlayerStreamInfo,
} from './AgoraMediaPlayerTypes';
import { IMediaPlayerSourceObserver } from './IAgoraMediaPlayerSource';

/**
* This class provides media player functions and supports multiple instances.
*/
Expand Down
1 change: 1 addition & 0 deletions ts/Private/IAgoraMediaPlayerSource.ts
@@ -7,6 +7,7 @@ import {
PlayerUpdatedInfo,
SrcInfo,
} from './AgoraMediaPlayerTypes';

/**
* Provides callbacks for media players.
*/
1 change: 1 addition & 0 deletions ts/Private/IAgoraMediaRecorder.ts
@@ -3,6 +3,7 @@ import {
IMediaRecorderObserver,
MediaRecorderConfiguration,
} from './AgoraMediaBase';

/**
* Used for recording audio and video on the client.
* IMediaRecorder can record the following:
1 change: 1 addition & 0 deletions ts/Private/IAgoraMusicContentCenter.ts
@@ -1,5 +1,6 @@
import './extension/IAgoraMusicContentCenterExtension';
import { IMediaPlayer } from './IAgoraMediaPlayer';

/**
* @ignore
*/
25 changes: 13 additions & 12 deletions ts/Private/IAgoraRtcEngine.ts
@@ -118,6 +118,7 @@ import {
import { RtcConnection } from './IAgoraRtcEngineEx';
import { ILocalSpatialAudioEngine } from './IAgoraSpatialAudio';
import { IAudioDeviceManager } from './IAudioDeviceManager';

/**
* Media device types.
*/
@@ -1038,15 +1039,15 @@ export class ChannelMediaOptions {
*/
publishCameraTrack?: boolean;
/**
* Whether to publish the video captured by the second camera:true: Publish the video captured by the second camera.false: (Default) Do not publish the video captured by the second camera.
* @ignore
*/
publishSecondaryCameraTrack?: boolean;
/**
* Whether to publish the audio captured by the microphone:true: (Default) Publish the audio captured by the microphone.false: Do not publish the audio captured by the microphone.
*/
publishMicrophoneTrack?: boolean;
/**
* @ignore
* Whether to publish the video captured from the screen:true: Publish the video captured from the screen.false: (Default) Do not publish the video captured from the screen.This parameter applies to Android and iOS only.
*/
publishScreenCaptureVideo?: boolean;
/**
@@ -1366,10 +1367,10 @@ export interface IRtcEngineEventHandler {

/**
* Reports the volume information of users.
* By default, this callback is disabled. You can enable it by calling enableAudioVolumeIndication . Once this callback is enabled and users send streams in the channel, the SDK triggers the onAudioVolumeIndication callback according to the time interval set in enableAudioVolumeIndication. The SDK triggers two independent onAudioVolumeIndication callbacks simultaneously, which separately report the volume information of the local user who sends a stream and the remote users (up to three) whose instantaneous volume is the highest.Once this callback is enabled, if the local user calls the muteLocalAudioStream method to mute, the SDK continues to report the volume indication of the local user.If a remote user whose volume is one of the three highest in the channel stops publishing the audio stream for 20 seconds, the callback excludes this user's information; if all remote users stop publishing audio streams for 20 seconds, the SDK stops triggering the callback for remote users.
* By default, this callback is disabled. You can enable it by calling enableAudioVolumeIndication . Once this callback is enabled and users send streams in the channel, the SDK triggers the onAudioVolumeIndication callback according to the time interval set in enableAudioVolumeIndication. The SDK triggers two independent onAudioVolumeIndication callbacks simultaneously, which separately report the volume information of the local user who sends a stream and the remote users (up to three) whose instantaneous volume is the highest.Once this callback is enabled, if the local user calls the muteLocalAudioStream method for muting, the SDK continues to report the volume indication of the local user. In the callbacks triggered, the volume information about the local user is 0 If a remote user whose volume is one of the three highest in the channel stops publishing the audio stream for 20 seconds, the callback excludes this user's information; if all remote users stop publishing audio streams for 20 seconds, the SDK stops triggering the callback for remote users.
*
* @param connection The connection information. See RtcConnection .
* @param speakers The volume information of the users. See AudioVolumeInfo . An empty speakers array in the callback indicates that no remote user is in the channel or is sending a stream.
* @param speakers The volume information of the users, see AudioVolumeInfo . An empty speakers array in the callback indicates that no remote user is in the channel or is sending a stream.
* @param speakerNumber The total number of users.In the callback for the local user, if the local user is sending streams, the value of speakerNumber is 1.In the callback for remote users, the value range of speakerNumber is [0,3]. If the number of remote users who send streams is greater than or equal to three, the value of speakerNumber is 3.
* @param totalVolume The volume of the speaker. The value range is [0,255].In the callback for the local user, totalVolume is the volume of the local user who sends a stream.In the callback for remote users, totalVolume is the sum of the volume of all remote users (up to three) whose instantaneous volume is the highest.
*/
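
A short sketch of consuming this callback: enable the volume indication, register an event handler, and read the uid and volume fields of each AudioVolumeInfo entry. The 200 ms interval and smoothing factor of 3 are illustrative values, not requirements:

```typescript
import { AudioVolumeInfo } from './ts/Private/AgoraBase';
import { IRtcEngine, IRtcEngineEventHandler } from './ts/Private/IAgoraRtcEngine';
import { RtcConnection } from './ts/Private/IAgoraRtcEngineEx';

// Sketch only: log the loudest speakers reported by the SDK.
function watchSpeakerVolumes(engine: IRtcEngine): void {
  const handler: IRtcEngineEventHandler = {
    onAudioVolumeIndication: (
      connection: RtcConnection,
      speakers: AudioVolumeInfo[],
      speakerNumber: number,
      totalVolume: number
    ) => {
      speakers.forEach((s) => console.log(`uid ${s.uid} volume ${s.volume}`));
      console.log(`speakers: ${speakerNumber}, total volume: ${totalVolume}`);
    },
  };
  engine.registerEventHandler(handler);
  // interval (ms), smoothing factor (3 is typical), reportVad
  engine.enableAudioVolumeIndication(200, 3, false);
}
```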
@@ -2048,12 +2049,12 @@
): void;

/**
* Occurs when the state of Media Push changes.
* When the state of Media Push changes, the SDK triggers this callback and reports the URL address and the current state of the Media Push. This callback indicates the state of the Media Push. When exceptions occur, you can troubleshoot issues by referring to the detailed error descriptions in the error code parameter.
* Occurs when the media push state changes.
* When the media push state changes, the SDK triggers this callback and reports the URL address and the current state of the media push. This callback indicates the state of the media push. When exceptions occur, you can troubleshoot issues by referring to the detailed error descriptions in the error code parameter.
*
* @param url The URL address where the state of the Media Push changes.
* @param state The current state of the Media Push. See RtmpStreamPublishState .
* @param errCode The detailed error information for the Media Push. See RtmpStreamPublishErrorType .
* @param url The URL address where the state of the media push changes.
* @param state The current state of the media push. See RtmpStreamPublishState .
* @param errCode The detailed error information for the media push. See RtmpStreamPublishErrorType .
*/
onRtmpStreamingStateChanged?(
url: string,
@@ -2062,10 +2063,10 @@
): void;

/**
* Reports events during the Media Push.
* Reports events during the media push.
*
* @param url The URL for Media Push.
* @param eventCode The event code of Media Push. RtmpStreamingEvent
* @param url The URL of media push.
* @param eventCode The event code of media push. See RtmpStreamingEvent .
*/
onRtmpStreamingEvent?(url: string, eventCode: RtmpStreamingEvent): void;
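
For example, a hedged sketch of reacting to RtmpStreamingEventUrlAlreadyInUse by republishing to a different URL, as the enum doc earlier in this PR suggests; the helper name and fallback-URL handling are assumptions:

```typescript
import { RtmpStreamingEvent } from './ts/Private/AgoraBase';
import { IRtcEngine } from './ts/Private/IAgoraRtcEngine';

// Sketch only: if the URL is already in use, republish to a fallback URL.
function handleStreamingEvent(
  engine: IRtcEngine,
  url: string,
  eventCode: RtmpStreamingEvent,
  fallbackUrl: string
): void {
  if (eventCode === RtmpStreamingEvent.RtmpStreamingEventUrlAlreadyInUse) {
    engine.stopRtmpStream(url);
    engine.startRtmpStreamWithoutTranscoding(fallbackUrl);
  }
}
```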

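
Looking back at the ChannelMediaOptions fields changed earlier in this file, a minimal sketch of joining a channel as a host that publishes microphone audio and screen-capture video; the channel name and helper are illustrative:

```typescript
import { ChannelProfileType, ClientRoleType } from './ts/Private/AgoraBase';
import { ChannelMediaOptions, IRtcEngine } from './ts/Private/IAgoraRtcEngine';

// Sketch only: host that publishes microphone audio and screen-capture video.
function joinAsScreenSharingHost(engine: IRtcEngine, token: string): void {
  const options: ChannelMediaOptions = {
    channelProfile: ChannelProfileType.ChannelProfileLiveBroadcasting,
    clientRoleType: ClientRoleType.ClientRoleBroadcaster,
    publishMicrophoneTrack: true,
    publishCameraTrack: false,
    publishScreenCaptureVideo: true, // Android and iOS only, per the doc above
  };
  engine.joinChannel(token, 'demo_channel', 0, options);
}
```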
1 change: 1 addition & 0 deletions ts/Private/IAgoraRtcEngineEx.ts
@@ -23,6 +23,7 @@ import {
LeaveChannelOptions,
StreamFallbackOptions,
} from './IAgoraRtcEngine';

/**
* Contains connection information.
*/
1 change: 1 addition & 0 deletions ts/Private/IAgoraSpatialAudio.ts
@@ -1,5 +1,6 @@
import './extension/IAgoraSpatialAudioExtension';
import { RtcConnection } from './IAgoraRtcEngineEx';

/**
* The spatial position of the remote user or the media player.
*/
3 changes: 2 additions & 1 deletion ts/Private/IAudioDeviceManager.ts
@@ -1,5 +1,6 @@
import './extension/IAudioDeviceManagerExtension';
import { AudioDeviceInfo } from './IAgoraRtcEngine';

/**
* The maximum length of the device ID.
*/
@@ -77,7 +78,7 @@ export abstract class IAudioDeviceManager {
* Sets the audio capture device.
* You can call this method to change the audio route currently being used, but this does not change the default audio route. For example, if the default audio route is microphone, you call this method to set the audio route as bluetooth earphones before joinging a channel and then start a device test, the SDK conducts device test on the bluetooth earphones. After the device test is completed and you join a channel, the SDK still uses the microphone for audio capturing.
*
* @param deviceId The ID of the audio capture device. You can get the media player ID by calling enumerateRecordingDevices . Connecting or disconnecting the audio device does not change the value of deviceId.The maximum length is MaxDeviceIdLengthType .
* @param deviceId The ID of the audio capture device. You can get the Device ID by calling enumerateRecordingDevices . Connecting or disconnecting the audio device does not change the value of deviceId.The maximum length is MaxDeviceIdLengthType .
*
* @returns
* 0: Success.< 0: Failure.
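
A hedged sketch of the device-selection flow described above, assuming getAudioDeviceManager is available on the engine in this build: enumerate the capture devices and pass one deviceId to setRecordingDevice. The helper name is illustrative:

```typescript
import { IRtcEngine } from './ts/Private/IAgoraRtcEngine';

// Sketch only: pick the first enumerated capture device and make it current.
function useFirstRecordingDevice(engine: IRtcEngine): void {
  const adm = engine.getAudioDeviceManager();
  const devices = adm.enumerateRecordingDevices();
  if (devices.length > 0 && devices[0].deviceId) {
    adm.setRecordingDevice(devices[0].deviceId);
  }
}
```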