Feature Description
The audio/video SDK provides a raw video data feature for pre-encoding and post-decoding processing: during the audio/video pipeline, you can modify captured and received video frames before they are published to the channel for remote users to subscribe to, achieving special playback effects.
Raw video data falls into three categories: raw data captured by the local camera, local preview data (that is, unencoded data after cropping or watermarking), and raw video data from remote users.
- Pre-encoding processing: process each raw video frame provided by the SDK (for example, video data captured by the local camera, or local preview data) yourself before it is encoded.
- Post-decoding processing: process each raw video frame provided by the SDK (for example, video data received from remote users) yourself after it is decoded.
The Native SDK provides the IVideoFrameObserver class for getting and modifying raw video data.
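For orientation, the following is a minimal sketch of that interface as the sample below uses it. The exact declarations live in the SDK's media engine header; the field order, default implementations, and enum value shown here are assumptions and may differ between SDK versions.
class IVideoFrameObserver
{
public:
    struct VideoFrame
    {
        VIDEO_FRAME_TYPE type; // pixel format of the frame, e.g. YUV 420
        int width;             // frame width in pixels
        int height;            // frame height in pixels
        int yStride;           // stride of the Y plane in bytes
        int uStride;           // stride of the U plane in bytes
        int vStride;           // stride of the V plane in bytes
        void* yBuffer;         // Y plane data
        void* uBuffer;         // U plane data
        void* vBuffer;         // V plane data
        int rotation;          // clockwise rotation of the frame: 0, 90, 180, or 270
        int64_t renderTimeMs;  // timestamp of the frame
    };
    // Return true to hand the (possibly modified) frame back to the SDK pipeline.
    virtual bool onCaptureVideoFrame(VideoFrame& videoFrame) = 0;
    virtual bool onRenderVideoFrame(const char* uid, VideoFrame& videoFrame) = 0;
    virtual bool onPreEncodeVideoFrame(VideoFrame& videoFrame) { return true; }
    // Optional preferences: pixel format, rotation, and mirroring of delivered frames.
    virtual VIDEO_FRAME_TYPE getVideoFormatPreference() { return FRAME_TYPE_YUV420; }
    virtual bool getRotationApplied() { return false; }
    virtual bool getMirrorApplied() { return false; }
};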
Implementation
Steps
Before getting raw video data, ensure that you have implemented the basic real-time audio/video functions in your project.
- Before joining a channel, call the registerVideoFrameObserver method to register a video frame observer, and implement an IVideoFrameObserver class in that method.
- After successful registration, the SDK delivers each captured raw video frame through the onCaptureVideoFrame, onPreEncodeVideoFrame, or onRenderVideoFrame callback.
- Process the received video data as your scenario requires, then return the processed video data to the SDK through the same callbacks (see the sketch after this list).
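As a concrete illustration of such per-frame processing, here is a minimal observer sketch that renders the local capture in grayscale by overwriting the chroma planes. It assumes frames arrive as I420 (YUV 420 planar) data, which is what getVideoFormatPreference selects in the full sample below.
#include <cstring> // memset
class GrayscaleVideoFrameObserver : public ar::media::IVideoFrameObserver
{
public:
    virtual bool onCaptureVideoFrame(VideoFrame& videoFrame) override
    {
        // In I420, each chroma plane has height / 2 rows of stride bytes.
        // Writing 128 (neutral chroma) to both planes produces a grayscale image.
        memset(videoFrame.uBuffer, 128, videoFrame.uStride * videoFrame.height / 2);
        memset(videoFrame.vBuffer, 128, videoFrame.vStride * videoFrame.height / 2);
        return true;
    }
    virtual bool onRenderVideoFrame(const char* uid, VideoFrame& videoFrame) override
    {
        return true; // leave remote frames untouched
    }
};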
API Call Sequence
The following figure shows the API call sequence for using raw video data:
Obtain the VideoFrame object through the callbacks, modify the object, and return it to the SDK.
Sample Code
class ARMediaDataPluginVideoFrameObserver : public ar::media::IVideoFrameObserver
{
public:
    // Back-reference to the Objective-C plugin that owns the video delegate.
    ARMediaDataPlugin *mediaDataPlugin;
    // One-shot flags: capture the next frame as a screenshot when set.
    BOOL getOneDidCaptureVideoFrame = false;
    BOOL getOneWillRenderVideoFrame = false;
    // uid of the remote user whose rendered frame should be captured.
    NSString *videoFrameUid = @"-1";
    // Wrap the C++ VideoFrame in an Objective-C object. The plane buffers are
    // shared by pointer, not copied, so the delegate edits the SDK's frame in place.
    ARVideoRawData* getVideoRawDataWithVideoFrame(VideoFrame& videoFrame)
    {
        ARVideoRawData *data = [[ARVideoRawData alloc] init];
        data.type = videoFrame.type;
        data.width = videoFrame.width;
        data.height = videoFrame.height;
        data.yStride = videoFrame.yStride;
        data.uStride = videoFrame.uStride;
        data.vStride = videoFrame.vStride;
        data.rotation = videoFrame.rotation;
        data.renderTimeMs = videoFrame.renderTimeMs;
        data.yBuffer = (char *)videoFrame.yBuffer;
        data.uBuffer = (char *)videoFrame.uBuffer;
        data.vBuffer = (char *)videoFrame.vBuffer;
        return data;
    }
    // Write the (possibly modified) frame metadata back to the SDK's VideoFrame;
    // the pixel buffers themselves have already been modified in place.
    void modifiedVideoFrameWithNewVideoRawData(VideoFrame& videoFrame, ARVideoRawData *videoRawData)
    {
        videoFrame.width = videoRawData.width;
        videoFrame.height = videoRawData.height;
        videoFrame.yStride = videoRawData.yStride;
        videoFrame.uStride = videoRawData.uStride;
        videoFrame.vStride = videoRawData.vStride;
        videoFrame.rotation = videoRawData.rotation;
        videoFrame.renderTimeMs = videoRawData.renderTimeMs;
    }
    virtual bool onCaptureVideoFrame(VideoFrame& videoFrame) override
    {
        // Skip processing unless the plugin exists and capture observation (bit 0) is enabled.
        if (!mediaDataPlugin || ((mediaDataPlugin.observerVideoType >> 0 & 1) == 0)) return true;
        @autoreleasepool {
            ARVideoRawData *newData = nil;
            if ([mediaDataPlugin.videoDelegate respondsToSelector:@selector(mediaDataPlugin:didCapturedVideoRawData:)]) {
                ARVideoRawData *data = getVideoRawDataWithVideoFrame(videoFrame);
                newData = [mediaDataPlugin.videoDelegate mediaDataPlugin:mediaDataPlugin didCapturedVideoRawData:data];
                modifiedVideoFrameWithNewVideoRawData(videoFrame, newData);
                // ScreenShot: convert one captured frame to a UIImage on request.
                if (getOneDidCaptureVideoFrame) {
                    getOneDidCaptureVideoFrame = false;
                    [mediaDataPlugin yuvToUIImageWithVideoRawData:newData];
                }
            }
        }
        return true;
    }
    virtual bool onRenderVideoFrame(const char* uid, VideoFrame& videoFrame) override
    {
        // Skip processing unless the plugin exists and render observation (bit 1) is enabled.
        if (!mediaDataPlugin || ((mediaDataPlugin.observerVideoType >> 1 & 1) == 0)) return true;
        @autoreleasepool {
            ARVideoRawData *newData = nil;
            if ([mediaDataPlugin.videoDelegate respondsToSelector:@selector(mediaDataPlugin:willRenderVideoRawData:ofUid:)]) {
                ARVideoRawData *data = getVideoRawDataWithVideoFrame(videoFrame);
                newData = [mediaDataPlugin.videoDelegate mediaDataPlugin:mediaDataPlugin willRenderVideoRawData:data ofUid:[NSString stringWithUTF8String:uid]];
                modifiedVideoFrameWithNewVideoRawData(videoFrame, newData);
                // ScreenShot: convert one rendered frame of the requested uid to a UIImage.
                if (getOneWillRenderVideoFrame && [videoFrameUid isEqualToString:[NSString stringWithUTF8String:uid]]) {
                    getOneWillRenderVideoFrame = false;
                    videoFrameUid = @"-1";
                    [mediaDataPlugin yuvToUIImageWithVideoRawData:newData];
                }
            }
        }
        return true;
    }
    virtual bool onPreEncodeVideoFrame(VideoFrame& videoFrame) override
    {
        // Skip processing unless the plugin exists and pre-encode observation (bit 2) is enabled.
        if (!mediaDataPlugin || ((mediaDataPlugin.observerVideoType >> 2 & 1) == 0)) return true;
        @autoreleasepool {
            ARVideoRawData *newData = nil;
            if ([mediaDataPlugin.videoDelegate respondsToSelector:@selector(mediaDataPlugin:willPreEncodeVideoRawData:)]) {
                ARVideoRawData *data = getVideoRawDataWithVideoFrame(videoFrame);
                newData = [mediaDataPlugin.videoDelegate mediaDataPlugin:mediaDataPlugin willPreEncodeVideoRawData:data];
                modifiedVideoFrameWithNewVideoRawData(videoFrame, newData);
            }
        }
        return true;
    }
    // Tell the SDK which pixel format to deliver frames in.
    virtual VIDEO_FRAME_TYPE getVideoFormatPreference() override
    {
        return VIDEO_FRAME_TYPE(mediaDataPlugin.videoFormatter.type);
    }
    // Whether the SDK should apply the frame's rotation info before delivery.
    virtual bool getRotationApplied() override
    {
        return mediaDataPlugin.videoFormatter.rotationApplied;
    }
    // Whether the SDK should mirror the frame before delivery.
    virtual bool getMirrorApplied() override
    {
        return mediaDataPlugin.videoFormatter.mirrorApplied;
    }
};
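On the Objective-C side, the delegate receives the wrapped frame and can edit the pixel buffers in place. The following is a hypothetical delegate implementation: the method signature is taken from the respondsToSelector: check in onCaptureVideoFrame above, while the darkening loop is purely illustrative.
// Hypothetical app-side delegate method. The buffers are shared with the SDK,
// so edits made here are picked up when the frame is passed back.
- (ARVideoRawData *)mediaDataPlugin:(ARMediaDataPlugin *)mediaDataPlugin
            didCapturedVideoRawData:(ARVideoRawData *)videoRawData {
    // Illustrative effect: halve every luma sample to darken the local video.
    int lumaBytes = videoRawData.yStride * videoRawData.height;
    for (int i = 0; i < lumaBytes; i++) {
        videoRawData.yBuffer[i] = (char)(((unsigned char)videoRawData.yBuffer[i]) >> 1);
    }
    return videoRawData;
}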
API Reference
registerVideoFrameObserver
onCaptureVideoFrame
onRenderVideoFrame
onPreEncodeVideoFrame
Development Notes
The raw data interfaces used in this article are C++ interfaces. If you develop on iOS or macOS, follow the steps below to register and deregister the video frame observer.
- (void)registerVideoRawDataObserver:(ObserverVideoType)observerType {
    ar::rtc::IRtcEngine* rtc_engine = (ar::rtc::IRtcEngine*)self.rtcKit.getNativeHandle;
    ar::util::AutoPtr<ar::media::IMediaEngine> mediaEngine;
    mediaEngine.queryInterface(rtc_engine, ar::AR_IID_MEDIA_ENGINE);
    // Record the requested frame types; register with the media engine only on
    // the first request.
    NSInteger oldValue = self.observerVideoType;
    self.observerVideoType |= observerType;
    if (mediaEngine && oldValue == 0)
    {
        mediaEngine->registerVideoFrameObserver(&s_videoFrameObserver);
        s_videoFrameObserver.mediaDataPlugin = self;
    }
}
- (void)deregisterVideoRawDataObserver:(ObserverVideoType)observerType {
    ar::rtc::IRtcEngine* rtc_engine = (ar::rtc::IRtcEngine*)self.rtcKit.getNativeHandle;
    ar::util::AutoPtr<ar::media::IMediaEngine> mediaEngine;
    mediaEngine.queryInterface(rtc_engine, ar::AR_IID_MEDIA_ENGINE);
    // Clear the requested frame types; unregister once nothing is being observed.
    self.observerVideoType &= ~observerType;
    if (mediaEngine && self.observerVideoType == 0)
    {
        mediaEngine->registerVideoFrameObserver(NULL);
        s_videoFrameObserver.mediaDataPlugin = nil;
    }
}
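A hypothetical call site follows. ObserverVideoTypeVideoCaptured is assumed here to be the bit-0 flag value checked by onCaptureVideoFrame above; substitute the constants your SDK version actually defines.
// Register before joining a channel; deregister when frame processing is no longer needed.
[mediaDataPlugin registerVideoRawDataObserver:ObserverVideoTypeVideoCaptured];
// ... join the channel and process frames via the delegate ...
[mediaDataPlugin deregisterVideoRawDataObserver:ObserverVideoTypeVideoCaptured];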
Related Documents
If you also want to implement the raw audio data feature in your project, see Raw Audio Data.