WebRTC Audio/Video Calls: A Custom RTCVideoCapturer Camera

In a previous post we implemented live video calling with WebRTC against an ossrs (SRS) server. In practice, however, the methods provided by RTCCameraVideoCapturer offer no way to adjust camera settings such as the torch or exposure, so we have to define our own RTCVideoCapturer and capture the frames ourselves.
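For example, once you hold the AVCaptureDevice yourself, toggling the torch is straightforward. A minimal sketch (the helper function below is ours, for illustration only):

// Illustrative helper (not part of the original code): toggle the torch on a
// capture device you own, e.g. the back camera used by the custom capturer.
static void SetTorchEnabled(AVCaptureDevice *device, BOOL enabled) {
    if (!device.hasTorch) {
        return; // most front cameras have no torch
    }
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        device.torchMode = enabled ? AVCaptureTorchModeOn : AVCaptureTorchModeOff;
        [device unlockForConfiguration];
    } else {
        NSLog(@"Could not lock device for torch configuration: %@", error);
    }
}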


For the iOS WebRTC + ossrs live video calling setup, see:
https://blog.csdn.net/gloryFlow/article/details/132262724

Here we build the custom RTCVideoCapturer.

1. Classes Needed for a Custom Camera

You need to be familiar with the following classes:

  • AVCaptureSession
    AVCaptureSession is the iOS object that manages and coordinates the flow of data from capture inputs to capture outputs.

  • AVCaptureDevice
    AVCaptureDevice represents a hardware capture device, such as a camera.

  • AVCaptureDeviceInput
    AVCaptureDeviceInput captures input data from an AVCaptureDevice.

  • AVCaptureMetadataOutput
    AVCaptureMetadataOutput is a capture output that processes timed metadata (such as detected faces) produced by an AVCaptureSession.

  • AVCaptureVideoDataOutput
    AVCaptureVideoDataOutput is a capture output that delivers video frames for processing.

  • AVCaptureVideoPreviewLayer
    AVCaptureVideoPreviewLayer is a preview layer that displays the video as the camera captures it.
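Before diving into the capturer, here is a minimal sketch of how these classes fit together (illustrative only; the full capturer below performs the same wiring with much more configuration):

// device -> input -> session -> output, plus a preview layer for display.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input && [session canAddInput:input]) {
    [session addInput:input];
}
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
if ([session canAddOutput:output]) {
    [session addOutput:output];
}
AVCaptureVideoPreviewLayer *preview = [AVCaptureVideoPreviewLayer layerWithSession:session];
[session startRunning];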

2. Custom RTCVideoCapturer Camera Capture

For the custom camera, we add an AVCaptureVideoDataOutput to the AVCaptureSession:

self.dataOutput = [[AVCaptureVideoDataOutput alloc] init];
[self.dataOutput setAlwaysDiscardsLateVideoFrames:YES];
self.dataOutput.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey : @(needYuvOutput ? kCVPixelFormatType_420YpCbCr8BiPlanarFullRange : kCVPixelFormatType_32BGRA)};
self.dataOutput.alwaysDiscardsLateVideoFrames = YES;
[self.dataOutput setSampleBufferDelegate:self queue:self.bufferQueue];
if ([self.session canAddOutput:self.dataOutput]) {
    [self.session addOutput:self.dataOutput];
} else {
    NSLog(@"Could not add video data output to the session");
    return nil;
}
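Whichever pixel format you request here (NV12 when needYuvOutput is set, BGRA otherwise), it must be one that WebRTC's RTCCVPixelBuffer can wrap. A quick way to verify at runtime, using the same supportedPixelFormats API the capturer relies on later:

// Log the pixel formats RTCCVPixelBuffer can wrap, to validate the
// AVCaptureVideoDataOutput settings above against them.
for (NSNumber *format in [RTCCVPixelBuffer supportedPixelFormats]) {
    FourCharCode code = format.unsignedIntValue;
    NSLog(@"WebRTC-supported pixel format: %c%c%c%c",
          (char)(code >> 24), (char)(code >> 16), (char)(code >> 8), (char)code);
}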

We then implement the AVCaptureVideoDataOutputSampleBufferDelegate callback:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection;

In this callback we extract the CVPixelBufferRef from the CMSampleBufferRef, wrap the (optionally processed) pixel buffer in an RTCCVPixelBuffer, build an RTCVideoFrame from it, and hand the frame to WebRTC by calling capturer:didCaptureVideoFrame: on the localVideoSource.

The complete code is as follows.

SDCustomRTCCameraCapturer.h

#import <AVFoundation/AVFoundation.h>
#import <Foundation/Foundation.h>
#import <WebRTC/WebRTC.h>

@protocol SDCustomRTCCameraCapturerDelegate <NSObject>

- (void)rtcCameraVideoCapturer:(RTCVideoCapturer *)capturer didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer;

@end

@interface SDCustomRTCCameraCapturer : RTCVideoCapturer

@property(nonatomic, weak) id<SDCustomRTCCameraCapturerDelegate> delegate;

// Capture session that is used for capturing. Valid from initialization to dealloc.
@property(nonatomic, strong) AVCaptureSession *captureSession;

// Returns list of available capture devices that support video capture.
+ (NSArray<AVCaptureDevice *> *)captureDevices;

// Returns list of formats that are supported by this class for this device.
+ (NSArray<AVCaptureDeviceFormat *> *)supportedFormatsForDevice:(AVCaptureDevice *)device;

// Returns the most efficient supported output pixel format for this capturer.
- (FourCharCode)preferredOutputPixelFormat;

// Starts the capture session asynchronously and notifies callback on completion.
// The device will capture video in the format given in the `format` parameter. If the pixel format
// in `format` is supported by the WebRTC pipeline, the same pixel format will be used for the
// output. Otherwise, the format returned by `preferredOutputPixelFormat` will be used.
- (void)startCaptureWithDevice:(AVCaptureDevice *)device
                        format:(AVCaptureDeviceFormat *)format
                           fps:(NSInteger)fps
             completionHandler:(nullable void (^)(NSError *))completionHandler;

// Stops the capture session asynchronously and notifies callback on completion.
- (void)stopCaptureWithCompletionHandler:(nullable void (^)(void))completionHandler;

// Starts the capture session asynchronously.
- (void)startCaptureWithDevice:(AVCaptureDevice *)device
                        format:(AVCaptureDeviceFormat *)format
                           fps:(NSInteger)fps;

// Stops the capture session asynchronously.
- (void)stopCapture;

#pragma mark - Custom camera

@property (nonatomic, readonly) dispatch_queue_t bufferQueue;
@property (nonatomic, assign) AVCaptureDevicePosition devicePosition; // default AVCaptureDevicePositionFront
@property (nonatomic, assign) AVCaptureVideoOrientation videoOrientation;
@property (nonatomic, assign) BOOL needVideoMirrored;
@property (nonatomic, strong, readonly) AVCaptureConnection *videoConnection;
@property (nonatomic, copy) NSString *sessionPreset;  // default 640x480
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;
@property (nonatomic, assign) BOOL bSessionPause;
@property (nonatomic, assign) int iExpectedFPS;
@property (nonatomic, readwrite, strong) NSDictionary *videoCompressingSettings;

- (instancetype)initWithDevicePosition:(AVCaptureDevicePosition)iDevicePosition
                        sessionPresset:(AVCaptureSessionPreset)sessionPreset
                                   fps:(int)iFPS
                         needYuvOutput:(BOOL)needYuvOutput;

- (void)setExposurePoint:(CGPoint)point inPreviewFrame:(CGRect)frame;
- (void)setISOValue:(float)value;
- (void)startRunning;
- (void)stopRunning;
- (void)rotateCamera;
- (void)rotateCamera:(BOOL)isUseFrontCamera;
- (void)setWhiteBalance;
- (CGRect)getZoomedRectWithRect:(CGRect)rect scaleToFit:(BOOL)bScaleToFit;

@end

SDCustomRTCCameraCapturer.m

#import "SDCustomRTCCameraCapturer.h"
#import <UIKit/UIKit.h>
#import "EFMachineVersion.h"//#import "base/RTCLogging.h"
//#import "base/RTCVideoFrameBuffer.h"
//#import "components/video_frame_buffer/RTCCVPixelBuffer.h"
//
//#if TARGET_OS_IPHONE
//#import "helpers/UIDevice+RTCDevice.h"
//#endif
//
//#import "helpers/AVCaptureSession+DevicePosition.h"
//#import "helpers/RTCDispatcher+Private.h"typedef NS_ENUM(NSUInteger, STExposureModel) {STExposureModelPositive5,STExposureModelPositive4,STExposureModelPositive3,STExposureModelPositive2,STExposureModelPositive1,STExposureModel0,STExposureModelNegative1,STExposureModelNegative2,STExposureModelNegative3,STExposureModelNegative4,STExposureModelNegative5,STExposureModelNegative6,STExposureModelNegative7,STExposureModelNegative8,
};static char * kEffectsCamera = "EffectsCamera";static STExposureModel currentExposureMode;@interface SDCustomRTCCameraCapturer ()<AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate, AVCaptureMetadataOutputObjectsDelegate>
@property(nonatomic, readonly) dispatch_queue_t frameQueue;
@property(nonatomic, strong) AVCaptureDevice *currentDevice;
@property(nonatomic, assign) BOOL hasRetriedOnFatalError;
@property(nonatomic, assign) BOOL isRunning;
// Will the session be running once all asynchronous operations have been completed?
@property(nonatomic, assign) BOOL willBeRunning;

@property (nonatomic, strong) AVCaptureDeviceInput *deviceInput;
@property (nonatomic, strong) AVCaptureVideoDataOutput *dataOutput;
@property (nonatomic, strong) AVCaptureMetadataOutput *metaOutput;
@property (nonatomic, strong) AVCaptureStillImageOutput *stillImageOutput;
@property (nonatomic, readwrite) dispatch_queue_t bufferQueue;
@property (nonatomic, strong, readwrite) AVCaptureConnection *videoConnection;
@property (nonatomic, strong) AVCaptureDevice *videoDevice;
@property (nonatomic, strong) AVCaptureSession *session;

@end

@implementation SDCustomRTCCameraCapturer {
    AVCaptureVideoDataOutput *_videoDataOutput;
    AVCaptureSession *_captureSession;
    FourCharCode _preferredOutputPixelFormat;
    FourCharCode _outputPixelFormat;
    RTCVideoRotation _rotation;
    float _autoISOValue;
#if TARGET_OS_IPHONE
    UIDeviceOrientation _orientation;
    BOOL _generatingOrientationNotifications;
#endif
}

@synthesize frameQueue = _frameQueue;
@synthesize captureSession = _captureSession;
@synthesize currentDevice = _currentDevice;
@synthesize hasRetriedOnFatalError = _hasRetriedOnFatalError;
@synthesize isRunning = _isRunning;
@synthesize willBeRunning = _willBeRunning;

- (instancetype)init {
    return [self initWithDelegate:nil captureSession:[[AVCaptureSession alloc] init]];
}

- (instancetype)initWithDelegate:(__weak id<RTCVideoCapturerDelegate>)delegate {
    return [self initWithDelegate:delegate captureSession:[[AVCaptureSession alloc] init]];
}

// This initializer is used for testing.
- (instancetype)initWithDelegate:(__weak id<RTCVideoCapturerDelegate>)delegate
                  captureSession:(AVCaptureSession *)captureSession {
    if (self = [super initWithDelegate:delegate]) {
        // Create the capture session and all relevant inputs and outputs. We need
        // to do this in init because the application may want the capture session
        // before we start the capturer for e.g. AVCapturePreviewLayer. All objects
        // created here are retained until dealloc and never recreated.
        if (![self setupCaptureSession:captureSession]) {
            return nil;
        }
        NSNotificationCenter *center = [NSNotificationCenter defaultCenter];
#if TARGET_OS_IPHONE
        _orientation = UIDeviceOrientationPortrait;
        _rotation = RTCVideoRotation_90;
        [center addObserver:self
                   selector:@selector(deviceOrientationDidChange:)
                       name:UIDeviceOrientationDidChangeNotification
                     object:nil];
        [center addObserver:self
                   selector:@selector(handleCaptureSessionInterruption:)
                       name:AVCaptureSessionWasInterruptedNotification
                     object:_captureSession];
        [center addObserver:self
                   selector:@selector(handleCaptureSessionInterruptionEnded:)
                       name:AVCaptureSessionInterruptionEndedNotification
                     object:_captureSession];
        [center addObserver:self
                   selector:@selector(handleApplicationDidBecomeActive:)
                       name:UIApplicationDidBecomeActiveNotification
                     object:[UIApplication sharedApplication]];
#endif
        [center addObserver:self
                   selector:@selector(handleCaptureSessionRuntimeError:)
                       name:AVCaptureSessionRuntimeErrorNotification
                     object:_captureSession];
        [center addObserver:self
                   selector:@selector(handleCaptureSessionDidStartRunning:)
                       name:AVCaptureSessionDidStartRunningNotification
                     object:_captureSession];
        [center addObserver:self
                   selector:@selector(handleCaptureSessionDidStopRunning:)
                       name:AVCaptureSessionDidStopRunningNotification
                     object:_captureSession];
    }
    return self;
}

//- (void)dealloc {
//  NSAssert(
//      !_willBeRunning,
//      @"Session was still running in RTCCameraVideoCapturer dealloc. Forgot to call stopCapture?");
//  [[NSNotificationCenter defaultCenter] removeObserver:self];
//}

+ (NSArray<AVCaptureDevice *> *)captureDevices {
#if defined(WEBRTC_IOS) && defined(__IPHONE_10_0) && \
    __IPHONE_OS_VERSION_MIN_REQUIRED >= __IPHONE_10_0
    AVCaptureDeviceDiscoverySession *session = [AVCaptureDeviceDiscoverySession
        discoverySessionWithDeviceTypes:@[ AVCaptureDeviceTypeBuiltInWideAngleCamera ]
                              mediaType:AVMediaTypeVideo
                               position:AVCaptureDevicePositionUnspecified];
    return session.devices;
#else
    return [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
#endif
}

+ (NSArray<AVCaptureDeviceFormat *> *)supportedFormatsForDevice:(AVCaptureDevice *)device {
    // Support opening the device in any format. We make sure it's converted to a format we
    // can handle, if needed, in the method `-setupVideoDataOutput`.
    return device.formats;
}

- (FourCharCode)preferredOutputPixelFormat {
    return _preferredOutputPixelFormat;
}

- (void)startCaptureWithDevice:(AVCaptureDevice *)device
                        format:(AVCaptureDeviceFormat *)format
                           fps:(NSInteger)fps {
    [self startCaptureWithDevice:device format:format fps:fps completionHandler:nil];
}

- (void)stopCapture {
    [self stopCaptureWithCompletionHandler:nil];
}

- (void)startCaptureWithDevice:(AVCaptureDevice *)device
                        format:(AVCaptureDeviceFormat *)format
                           fps:(NSInteger)fps
             completionHandler:(nullable void (^)(NSError *))completionHandler {
    _willBeRunning = YES;
    [RTCDispatcher
        dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                      block:^{
        RTCLogInfo("startCaptureWithDevice %@ @ %ld fps", format, (long)fps);
#if TARGET_OS_IPHONE
        dispatch_async(dispatch_get_main_queue(), ^{
            if (!self->_generatingOrientationNotifications) {
                [[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
                self->_generatingOrientationNotifications = YES;
            }
        });
#endif
        self.currentDevice = device;
        NSError *error = nil;
        if (![self.currentDevice lockForConfiguration:&error]) {
            RTCLogError(@"Failed to lock device %@. Error: %@",
                        self.currentDevice,
                        error.userInfo);
            if (completionHandler) {
                completionHandler(error);
            }
            self.willBeRunning = NO;
            return;
        }
        [self reconfigureCaptureSessionInput];
        [self updateOrientation];
        [self updateDeviceCaptureFormat:format fps:fps];
        [self updateVideoDataOutputPixelFormat:format];
        [self.captureSession startRunning];
        [self.currentDevice unlockForConfiguration];
        self.isRunning = YES;
        if (completionHandler) {
            completionHandler(nil);
        }
    }];
}

- (void)stopCaptureWithCompletionHandler:(nullable void (^)(void))completionHandler {
    _willBeRunning = NO;
    [RTCDispatcher
        dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                      block:^{
        RTCLogInfo("Stop");
        self.currentDevice = nil;
        for (AVCaptureDeviceInput *oldInput in [self.captureSession.inputs copy]) {
            [self.captureSession removeInput:oldInput];
        }
        [self.captureSession stopRunning];
#if TARGET_OS_IPHONE
        dispatch_async(dispatch_get_main_queue(), ^{
            if (self->_generatingOrientationNotifications) {
                [[UIDevice currentDevice] endGeneratingDeviceOrientationNotifications];
                self->_generatingOrientationNotifications = NO;
            }
        });
#endif
        self.isRunning = NO;
        if (completionHandler) {
            completionHandler();
        }
    }];
}

#pragma mark iOS notifications

#if TARGET_OS_IPHONE
- (void)deviceOrientationDidChange:(NSNotification *)notification {
    [RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                                 block:^{
        [self updateOrientation];
    }];
}
#endif

#pragma mark AVCaptureVideoDataOutputSampleBufferDelegate

//- (void)captureOutput:(AVCaptureOutput *)captureOutput
//    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
//           fromConnection:(AVCaptureConnection *)connection {
//  NSParameterAssert(captureOutput == _videoDataOutput);
//
//  if (CMSampleBufferGetNumSamples(sampleBuffer) != 1 || !CMSampleBufferIsValid(sampleBuffer) ||
//      !CMSampleBufferDataIsReady(sampleBuffer)) {
//    return;
//  }
//
//  CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
//  if (pixelBuffer == nil) {
//    return;
//  }
//
//#if TARGET_OS_IPHONE
//  // Default to portrait orientation on iPhone.
//  BOOL usingFrontCamera = NO;
//  // Check the image's EXIF for the camera the image came from as the image could have been
//  // delayed as we set alwaysDiscardsLateVideoFrames to NO.
//
//    AVCaptureDeviceInput *deviceInput =
//        (AVCaptureDeviceInput *)((AVCaptureInputPort *)connection.inputPorts.firstObject).input;
//    usingFrontCamera = AVCaptureDevicePositionFront == deviceInput.device.position;
//
//    switch (_orientation) {
//    case UIDeviceOrientationPortrait:
//      _rotation = RTCVideoRotation_90;
//      break;
//    case UIDeviceOrientationPortraitUpsideDown:
//      _rotation = RTCVideoRotation_270;
//      break;
//    case UIDeviceOrientationLandscapeLeft:
//      _rotation = usingFrontCamera ? RTCVideoRotation_180 : RTCVideoRotation_0;
//      break;
//    case UIDeviceOrientationLandscapeRight:
//      _rotation = usingFrontCamera ? RTCVideoRotation_0 : RTCVideoRotation_180;
//      break;
//    case UIDeviceOrientationFaceUp:
//    case UIDeviceOrientationFaceDown:
//    case UIDeviceOrientationUnknown:
//      // Ignore.
//      break;
//  }
//#else
//  // No rotation on Mac.
//  _rotation = RTCVideoRotation_0;
//#endif
//
//    if (self.delegate && [self.delegate respondsToSelector:@selector(rtcCameraVideoCapturer:didOutputSampleBuffer:)]) {
//        [self.delegate rtcCameraVideoCapturer:self didOutputSampleBuffer:sampleBuffer];
//    }
//}

#pragma mark - AVCaptureSession notifications

- (void)handleCaptureSessionInterruption:(NSNotification *)notification {
    NSString *reasonString = nil;
#if TARGET_OS_IPHONE
    NSNumber *reason = notification.userInfo[AVCaptureSessionInterruptionReasonKey];
    if (reason) {
        switch (reason.intValue) {
            case AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableInBackground:
                reasonString = @"VideoDeviceNotAvailableInBackground";
                break;
            case AVCaptureSessionInterruptionReasonAudioDeviceInUseByAnotherClient:
                reasonString = @"AudioDeviceInUseByAnotherClient";
                break;
            case AVCaptureSessionInterruptionReasonVideoDeviceInUseByAnotherClient:
                reasonString = @"VideoDeviceInUseByAnotherClient";
                break;
            case AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableWithMultipleForegroundApps:
                reasonString = @"VideoDeviceNotAvailableWithMultipleForegroundApps";
                break;
        }
    }
#endif
    RTCLog(@"Capture session interrupted: %@", reasonString);
}

- (void)handleCaptureSessionInterruptionEnded:(NSNotification *)notification {
    RTCLog(@"Capture session interruption ended.");
}

- (void)handleCaptureSessionRuntimeError:(NSNotification *)notification {
    NSError *error = [notification.userInfo objectForKey:AVCaptureSessionErrorKey];
    RTCLogError(@"Capture session runtime error: %@", error);
    [RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                                 block:^{
#if TARGET_OS_IPHONE
        if (error.code == AVErrorMediaServicesWereReset) {
            [self handleNonFatalError];
        } else {
            [self handleFatalError];
        }
#else
        [self handleFatalError];
#endif
    }];
}

- (void)handleCaptureSessionDidStartRunning:(NSNotification *)notification {
    RTCLog(@"Capture session started.");
    [RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                                 block:^{
        // If we successfully restarted after an unknown error,
        // allow future retries on fatal errors.
        self.hasRetriedOnFatalError = NO;
    }];
}

- (void)handleCaptureSessionDidStopRunning:(NSNotification *)notification {
    RTCLog(@"Capture session stopped.");
}

- (void)handleFatalError {
    [RTCDispatcher
        dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                      block:^{
        if (!self.hasRetriedOnFatalError) {
            RTCLogWarning(@"Attempting to recover from fatal capture error.");
            [self handleNonFatalError];
            self.hasRetriedOnFatalError = YES;
        } else {
            RTCLogError(@"Previous fatal error recovery failed.");
        }
    }];
}

- (void)handleNonFatalError {
    [RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                                 block:^{
        RTCLog(@"Restarting capture session after error.");
        if (self.isRunning) {
            [self.captureSession startRunning];
        }
    }];
}

#if TARGET_OS_IPHONE

#pragma mark - UIApplication notifications

- (void)handleApplicationDidBecomeActive:(NSNotification *)notification {
    [RTCDispatcher dispatchAsyncOnType:RTCDispatcherTypeCaptureSession
                                 block:^{
        if (self.isRunning && !self.captureSession.isRunning) {
            RTCLog(@"Restarting capture session on active.");
            [self.captureSession startRunning];
        }
    }];
}
#endif  // TARGET_OS_IPHONE

#pragma mark - Private

- (dispatch_queue_t)frameQueue {
    if (!_frameQueue) {
        _frameQueue =
            dispatch_queue_create("org.webrtc.cameravideocapturer.video", DISPATCH_QUEUE_SERIAL);
        dispatch_set_target_queue(_frameQueue,
                                  dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0));
    }
    return _frameQueue;
}

- (BOOL)setupCaptureSession:(AVCaptureSession *)captureSession {
    NSAssert(_captureSession == nil, @"Setup capture session called twice.");
    _captureSession = captureSession;
#if defined(WEBRTC_IOS)
    _captureSession.sessionPreset = AVCaptureSessionPresetInputPriority;
    _captureSession.usesApplicationAudioSession = NO;
#endif
    [self setupVideoDataOutput];
    // Add the output.
    if (![_captureSession canAddOutput:_videoDataOutput]) {
        RTCLogError(@"Video data output unsupported.");
        return NO;
    }
    [_captureSession addOutput:_videoDataOutput];
    return YES;
}

- (void)setupVideoDataOutput {
    NSAssert(_videoDataOutput == nil, @"Setup video data output called twice.");
    AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    // `videoDataOutput.availableVideoCVPixelFormatTypes` returns the pixel formats supported by the
    // device with the most efficient output format first. Find the first format that we support.
    NSSet<NSNumber *> *supportedPixelFormats = [RTCCVPixelBuffer supportedPixelFormats];
    NSMutableOrderedSet *availablePixelFormats =
        [NSMutableOrderedSet orderedSetWithArray:videoDataOutput.availableVideoCVPixelFormatTypes];
    [availablePixelFormats intersectSet:supportedPixelFormats];
    NSNumber *pixelFormat = availablePixelFormats.firstObject;
    NSAssert(pixelFormat, @"Output device has no supported formats.");
    _preferredOutputPixelFormat = [pixelFormat unsignedIntValue];
    _outputPixelFormat = _preferredOutputPixelFormat;
//  videoDataOutput.videoSettings = @{(NSString *)kCVPixelBufferPixelFormatTypeKey : pixelFormat};
//    [videoDataOutput setAlwaysDiscardsLateVideoFrames:YES];
    videoDataOutput.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
    videoDataOutput.alwaysDiscardsLateVideoFrames = YES;
    [videoDataOutput setSampleBufferDelegate:self queue:self.frameQueue];
    _videoDataOutput = videoDataOutput;
//    // Set video orientation and mirroring.
//    AVCaptureConnection *connection = [videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
//    [connection setVideoOrientation:AVCaptureVideoOrientationPortrait];
//    [connection setVideoMirrored:NO];
}

- (void)updateVideoDataOutputPixelFormat:(AVCaptureDeviceFormat *)format {
//  FourCharCode mediaSubType = CMFormatDescriptionGetMediaSubType(format.formatDescription);
//  if (![[RTCCVPixelBuffer supportedPixelFormats] containsObject:@(mediaSubType)]) {
//    mediaSubType = _preferredOutputPixelFormat;
//  }
//
//  if (mediaSubType != _outputPixelFormat) {
//    _outputPixelFormat = mediaSubType;
//    _videoDataOutput.videoSettings =
//        @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(mediaSubType) };
//  }
}

#pragma mark - Private, called inside capture queue

- (void)updateDeviceCaptureFormat:(AVCaptureDeviceFormat *)format fps:(NSInteger)fps {
    NSAssert([RTCDispatcher isOnQueueForType:RTCDispatcherTypeCaptureSession],
             @"updateDeviceCaptureFormat must be called on the capture queue.");
    @try {
        _currentDevice.activeFormat = format;
        _currentDevice.activeVideoMinFrameDuration = CMTimeMake(1, fps);
    } @catch (NSException *exception) {
        RTCLogError(@"Failed to set active format!\n User info:%@", exception.userInfo);
        return;
    }
}

- (void)reconfigureCaptureSessionInput {
    NSAssert([RTCDispatcher isOnQueueForType:RTCDispatcherTypeCaptureSession],
             @"reconfigureCaptureSessionInput must be called on the capture queue.");
    NSError *error = nil;
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:_currentDevice error:&error];
    if (!input) {
        RTCLogError(@"Failed to create front camera input: %@", error.localizedDescription);
        return;
    }
    [_captureSession beginConfiguration];
    for (AVCaptureDeviceInput *oldInput in [_captureSession.inputs copy]) {
        [_captureSession removeInput:oldInput];
    }
    if ([_captureSession canAddInput:input]) {
        [_captureSession addInput:input];
    } else {
        RTCLogError(@"Cannot add camera as an input to the session.");
    }
    [_captureSession commitConfiguration];
}

- (void)updateOrientation {
    NSAssert([RTCDispatcher isOnQueueForType:RTCDispatcherTypeCaptureSession],
             @"updateOrientation must be called on the capture queue.");
#if TARGET_OS_IPHONE
    _orientation = [UIDevice currentDevice].orientation;
#endif
}

- (instancetype)initWithDevicePosition:(AVCaptureDevicePosition)iDevicePosition
                        sessionPresset:(AVCaptureSessionPreset)sessionPreset
                                   fps:(int)iFPS
                         needYuvOutput:(BOOL)needYuvOutput {
    self = [super init];
    if (self) {
        self.bSessionPause = YES;
        self.bufferQueue = dispatch_queue_create("STCameraBufferQueue", NULL);
        self.session = [[AVCaptureSession alloc] init];
        self.videoDevice = [self cameraDeviceWithPosition:iDevicePosition];
        _devicePosition = iDevicePosition;

        NSError *error = nil;
        self.deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:self.videoDevice
                                                                 error:&error];
        if (!self.deviceInput || error) {
            NSLog(@"create input error");
            return nil;
        }

        self.dataOutput = [[AVCaptureVideoDataOutput alloc] init];
        [self.dataOutput setAlwaysDiscardsLateVideoFrames:YES];
        self.dataOutput.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey : @(needYuvOutput ? kCVPixelFormatType_420YpCbCr8BiPlanarFullRange : kCVPixelFormatType_32BGRA)};
        self.dataOutput.alwaysDiscardsLateVideoFrames = YES;
        [self.dataOutput setSampleBufferDelegate:self queue:self.bufferQueue];

        self.metaOutput = [[AVCaptureMetadataOutput alloc] init];
        [self.metaOutput setMetadataObjectsDelegate:self queue:self.bufferQueue];

        self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
        self.stillImageOutput.outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};
        if ([self.stillImageOutput respondsToSelector:@selector(setHighResolutionStillImageOutputEnabled:)]) {
            self.stillImageOutput.highResolutionStillImageOutputEnabled = YES;
        }

        [self.session beginConfiguration];
        if ([self.session canAddInput:self.deviceInput]) {
            [self.session addInput:self.deviceInput];
        } else {
            NSLog(@"Could not add device input to the session");
            return nil;
        }
        if ([self.session canSetSessionPreset:sessionPreset]) {
            [self.session setSessionPreset:sessionPreset];
            _sessionPreset = sessionPreset;
        } else if ([self.session canSetSessionPreset:AVCaptureSessionPreset1920x1080]) {
            [self.session setSessionPreset:AVCaptureSessionPreset1920x1080];
            _sessionPreset = AVCaptureSessionPreset1920x1080;
        } else if ([self.session canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
            [self.session setSessionPreset:AVCaptureSessionPreset1280x720];
            _sessionPreset = AVCaptureSessionPreset1280x720;
        } else {
            [self.session setSessionPreset:AVCaptureSessionPreset640x480];
            _sessionPreset = AVCaptureSessionPreset640x480;
        }
        if ([self.session canAddOutput:self.dataOutput]) {
            [self.session addOutput:self.dataOutput];
        } else {
            NSLog(@"Could not add video data output to the session");
            return nil;
        }
        if ([self.session canAddOutput:self.metaOutput]) {
            [self.session addOutput:self.metaOutput];
            self.metaOutput.metadataObjectTypes = @[AVMetadataObjectTypeFace].copy;
        }
        if ([self.session canAddOutput:self.stillImageOutput]) {
            [self.session addOutput:self.stillImageOutput];
        } else {
            NSLog(@"Could not add still image output to the session");
        }

        self.videoConnection = [self.dataOutput connectionWithMediaType:AVMediaTypeVideo];
        if ([self.videoConnection isVideoOrientationSupported]) {
            [self.videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
            self.videoOrientation = AVCaptureVideoOrientationPortrait;
        }
        if ([self.videoConnection isVideoMirroringSupported]) {
            [self.videoConnection setVideoMirrored:YES];
            self.needVideoMirrored = YES;
        }

        if ([_videoDevice lockForConfiguration:NULL] == YES) {
//            _videoDevice.activeFormat = bestFormat;
            _videoDevice.activeVideoMinFrameDuration = CMTimeMake(1, iFPS);
            _videoDevice.activeVideoMaxFrameDuration = CMTimeMake(1, iFPS);
            [_videoDevice unlockForConfiguration];
        }
        [self.session commitConfiguration];

        NSMutableDictionary *tmpSettings = [[self.dataOutput recommendedVideoSettingsForAssetWriterWithOutputFileType:AVFileTypeQuickTimeMovie] mutableCopy];
        if (!EFMachineVersion.isiPhone5sOrLater) {
            NSNumber *tmpSettingValue = tmpSettings[AVVideoHeightKey];
            tmpSettings[AVVideoHeightKey] = tmpSettings[AVVideoWidthKey];
            tmpSettings[AVVideoWidthKey] = tmpSettingValue;
        }
        self.videoCompressingSettings = [tmpSettings copy];
        self.iExpectedFPS = iFPS;
        [self addObservers];
    }
    return self;
}

- (void)rotateCamera {
    if (self.devicePosition == AVCaptureDevicePositionFront) {
        self.devicePosition = AVCaptureDevicePositionBack;
    } else {
        self.devicePosition = AVCaptureDevicePositionFront;
    }
}

- (void)rotateCamera:(BOOL)isUseFrontCamera {
    if (isUseFrontCamera) {
        self.devicePosition = AVCaptureDevicePositionFront;
    } else {
        self.devicePosition = AVCaptureDevicePositionBack;
    }
}

- (void)setExposurePoint:(CGPoint)point inPreviewFrame:(CGRect)frame {
    BOOL isFrontCamera = self.devicePosition == AVCaptureDevicePositionFront;
    float fX = point.y / frame.size.height;
    float fY = isFrontCamera ? point.x / frame.size.width : (1 - point.x / frame.size.width);
    [self focusWithMode:self.videoDevice.focusMode exposureMode:self.videoDevice.exposureMode atPoint:CGPointMake(fX, fY)];
}

- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point {
    NSError *error = nil;
    AVCaptureDevice *device = self.videoDevice;
    if ([device lockForConfiguration:&error]) {
        device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
        [self setISOValue:0];
        //device.exposureTargetBias
        // Setting (focus/exposure)PointOfInterest alone does not initiate a (focus/exposure) operation.
        // Call -set(Focus/Exposure)Mode: to apply the new point of interest.
        if (focusMode != AVCaptureFocusModeLocked && device.isFocusPointOfInterestSupported && [device isFocusModeSupported:focusMode]) {
            device.focusPointOfInterest = point;
            device.focusMode = focusMode;
        }
        if (exposureMode != AVCaptureExposureModeCustom && device.isExposurePointOfInterestSupported && [device isExposureModeSupported:exposureMode]) {
            device.exposurePointOfInterest = point;
            device.exposureMode = exposureMode;
        }
        device.subjectAreaChangeMonitoringEnabled = YES;
        [device unlockForConfiguration];
    }
}

- (void)setWhiteBalance {
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance]) {
            [captureDevice setWhiteBalanceMode:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance];
        }
    }];
}

- (void)changeDeviceProperty:(void (^)(AVCaptureDevice *))propertyChange {
    AVCaptureDevice *captureDevice = self.videoDevice;
    NSError *error;
    if ([captureDevice lockForConfiguration:&error]) {
        propertyChange(captureDevice);
        [captureDevice unlockForConfiguration];
    } else {
        NSLog(@"Error while setting device property: %@", error.localizedDescription);
    }
}

- (void)setISOValue:(float)value exposeDuration:(int)duration {
    // Clamp the requested ISO into the active format's supported range.
    float currentISO = (value < self.videoDevice.activeFormat.minISO) ? self.videoDevice.activeFormat.minISO : value;
    currentISO = (currentISO > self.videoDevice.activeFormat.maxISO) ? self.videoDevice.activeFormat.maxISO : currentISO;
    NSError *error;
    if ([self.videoDevice lockForConfiguration:&error]) {
        [self.videoDevice setExposureModeCustomWithDuration:AVCaptureExposureDurationCurrent ISO:currentISO completionHandler:nil];
        [self.videoDevice unlockForConfiguration];
    }
}

- (void)setISOValue:(float)value {
//    float newVlaue = (value - 0.5) * (5.0 / 0.5); // mirror [0,1] to [-8,8]
    NSLog(@"%f", value);
    NSError *error = nil;
    if ([self.videoDevice lockForConfiguration:&error]) {
        [self.videoDevice setExposureTargetBias:value completionHandler:nil];
        [self.videoDevice unlockForConfiguration];
    } else {
        NSLog(@"Could not lock device for configuration: %@", error);
    }
}

- (void)setDevicePosition:(AVCaptureDevicePosition)devicePosition {
    if (_devicePosition != devicePosition && devicePosition != AVCaptureDevicePositionUnspecified) {
        if (_session) {
            AVCaptureDevice *targetDevice = [self cameraDeviceWithPosition:devicePosition];
            if (targetDevice && [self judgeCameraAuthorization]) {
                NSError *error = nil;
                AVCaptureDeviceInput *deviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:targetDevice error:&error];
                if (!deviceInput || error) {
                    NSLog(@"Error creating capture device input: %@", error.localizedDescription);
                    return;
                }
                _bSessionPause = YES;
                [_session beginConfiguration];
                [_session removeInput:_deviceInput];
                if ([_session canAddInput:deviceInput]) {
                    [_session addInput:deviceInput];
                    _deviceInput = deviceInput;
                    _videoDevice = targetDevice;
                    _devicePosition = devicePosition;
                }
                _videoConnection = [_dataOutput connectionWithMediaType:AVMediaTypeVideo];
                if ([_videoConnection isVideoOrientationSupported]) {
                    [_videoConnection setVideoOrientation:_videoOrientation];
                }
                if ([_videoConnection isVideoMirroringSupported]) {
                    [_videoConnection setVideoMirrored:devicePosition == AVCaptureDevicePositionFront];
                }
                [_session commitConfiguration];
                [self setSessionPreset:_sessionPreset];
                _bSessionPause = NO;
            }
        }
    }
}

- (void)setSessionPreset:(NSString *)sessionPreset {
    if (_session && _sessionPreset) {
//        if (![sessionPreset isEqualToString:_sessionPreset]) {
        _bSessionPause = YES;
        [_session beginConfiguration];
        if ([_session canSetSessionPreset:sessionPreset]) {
            [_session setSessionPreset:sessionPreset];
            _sessionPreset = sessionPreset;
        }
        [_session commitConfiguration];
        self.videoCompressingSettings = [[self.dataOutput recommendedVideoSettingsForAssetWriterWithOutputFileType:AVFileTypeQuickTimeMovie] copy];
//        [self setIExpectedFPS:_iExpectedFPS];
        _bSessionPause = NO;
//        }
    }
}

- (void)setIExpectedFPS:(int)iExpectedFPS {
    _iExpectedFPS = iExpectedFPS;
    if (iExpectedFPS <= 0 || !_dataOutput.videoSettings || !_videoDevice) {
        return;
    }
    CGFloat fWidth = [[_dataOutput.videoSettings objectForKey:@"Width"] floatValue];
    CGFloat fHeight = [[_dataOutput.videoSettings objectForKey:@"Height"] floatValue];
    AVCaptureDeviceFormat *bestFormat = nil;
    AVFrameRateRange *bestFrameRateRange = nil;
    for (AVCaptureDeviceFormat *format in [_videoDevice formats]) {
        CMFormatDescriptionRef description = format.formatDescription;
        if (CMFormatDescriptionGetMediaSubType(description) != kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) {
            continue;
        }
        CMVideoDimensions videoDimension = CMVideoFormatDescriptionGetDimensions(description);
        if ((videoDimension.width == fWidth && videoDimension.height == fHeight) ||
            (videoDimension.height == fWidth && videoDimension.width == fHeight)) {
            for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
                if (range.maxFrameRate >= bestFrameRateRange.maxFrameRate) {
                    bestFormat = format;
                    bestFrameRateRange = range;
                }
            }
        }
    }
    if (bestFormat) {
        CMTime minFrameDuration;
        if (bestFrameRateRange.minFrameDuration.timescale / bestFrameRateRange.minFrameDuration.value < iExpectedFPS) {
            minFrameDuration = bestFrameRateRange.minFrameDuration;
        } else {
            minFrameDuration = CMTimeMake(1, iExpectedFPS);
        }
        if ([_videoDevice lockForConfiguration:NULL] == YES) {
            _videoDevice.activeFormat = bestFormat;
            _videoDevice.activeVideoMinFrameDuration = minFrameDuration;
            _videoDevice.activeVideoMaxFrameDuration = minFrameDuration;
            [_videoDevice unlockForConfiguration];
        }
    }
}

- (void)startRunning {
    if (![self judgeCameraAuthorization]) {
        return;
    }
    if (!self.dataOutput) {
        return;
    }
    if (self.session && ![self.session isRunning]) {
        if (self.bufferQueue) {
            dispatch_async(self.bufferQueue, ^{
                [self.session startRunning];
            });
        }
        self.bSessionPause = NO;
    }
}

- (void)stopRunning {
    if (self.session && [self.session isRunning]) {
        if (self.bufferQueue) {
            dispatch_async(self.bufferQueue, ^{
                [self.session stopRunning];
            });
        }
        self.bSessionPause = YES;
    }
}

- (CGRect)getZoomedRectWithRect:(CGRect)rect scaleToFit:(BOOL)bScaleToFit {
    CGRect rectRet = rect;
    if (self.dataOutput.videoSettings) {
        CGFloat fWidth = [[self.dataOutput.videoSettings objectForKey:@"Width"] floatValue];
        CGFloat fHeight = [[self.dataOutput.videoSettings objectForKey:@"Height"] floatValue];
        float fScaleX = fWidth / CGRectGetWidth(rect);
        float fScaleY = fHeight / CGRectGetHeight(rect);
        float fScale = bScaleToFit ? fmaxf(fScaleX, fScaleY) : fminf(fScaleX, fScaleY);
        fWidth /= fScale;
        fHeight /= fScale;
        CGFloat fX = rect.origin.x - (fWidth - rect.size.width) / 2.0f;
        CGFloat fY = rect.origin.y - (fHeight - rect.size.height) / 2.0f;
        rectRet = CGRectMake(fX, fY, fWidth, fHeight);
    }
    return rectRet;
}

- (BOOL)judgeCameraAuthorization {
    AVAuthorizationStatus authStatus = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
    if (authStatus == AVAuthorizationStatusRestricted || authStatus == AVAuthorizationStatusDenied) {
        return NO;
    }
    return YES;
}

- (AVCaptureDevice *)cameraDeviceWithPosition:(AVCaptureDevicePosition)position {
    AVCaptureDevice *deviceRet = nil;
    if (position != AVCaptureDevicePositionUnspecified) {
        NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
        for (AVCaptureDevice *device in devices) {
            if ([device position] == position) {
                deviceRet = device;
            }
        }
    }
    return deviceRet;
}

- (AVCaptureVideoPreviewLayer *)previewLayer {
    if (!_previewLayer) {
        _previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
    }
    return _previewLayer;
}

- (void)snapStillImageCompletionHandler:(void (^)(CMSampleBufferRef imageDataSampleBuffer, NSError *error))handler {
    if ([self judgeCameraAuthorization]) {
        self.bSessionPause = YES;
        NSString *strSessionPreset = [self.sessionPreset mutableCopy];
        self.sessionPreset = AVCaptureSessionPresetPhoto;
        // Changing the preset briefly blacks out the preview.
        [NSThread sleepForTimeInterval:0.3];
        dispatch_async(self.bufferQueue, ^{
            [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:[self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
                self.bSessionPause = NO;
                self.sessionPreset = strSessionPreset;
                handler(imageDataSampleBuffer, error);
            }];
        });
    }
}

BOOL updateExporureModel(STExposureModel model) {
    if (currentExposureMode == model) return NO;
    currentExposureMode = model;
    return YES;
}

- (void)test:(AVCaptureExposureMode)model {
    if ([self.videoDevice lockForConfiguration:nil]) {
        if ([self.videoDevice isExposureModeSupported:model]) {
            [self.videoDevice setExposureMode:model];
        }
        [self.videoDevice unlockForConfiguration];
    }
}

- (void)setExposureTime:(CMTime)time {
    if ([self.videoDevice lockForConfiguration:nil]) {
        if (@available(iOS 12.0, *)) {
            self.videoDevice.activeMaxExposureDuration = time;
        } else {
            // Fallback on earlier versions
        }
        [self.videoDevice unlockForConfiguration];
    }
}

- (void)setFPS:(float)fps {
    if ([_videoDevice lockForConfiguration:NULL] == YES) {
//            _videoDevice.activeFormat = bestFormat;
        _videoDevice.activeVideoMinFrameDuration = CMTimeMake(1, fps);
        _videoDevice.activeVideoMaxFrameDuration = CMTimeMake(1, fps);
        [_videoDevice unlockForConfiguration];
    }
}

- (void)updateExposure:(CMSampleBufferRef)sampleBuffer {
    CFDictionaryRef metadataDict = CMCopyDictionaryOfAttachments(NULL, sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    NSDictionary *metadata = [[NSMutableDictionary alloc] initWithDictionary:(__bridge NSDictionary *)metadataDict];
    CFRelease(metadataDict);
    NSDictionary *exifMetadata = [[metadata objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy];
    float brightnessValue = [[exifMetadata objectForKey:(NSString *)kCGImagePropertyExifBrightnessValue] floatValue];
    if (brightnessValue > 2 && updateExporureModel(STExposureModelPositive2)) {
        [self setISOValue:500 exposeDuration:30];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
        [self setFPS:30];
    } else if (brightnessValue > 1 && brightnessValue < 2 && updateExporureModel(STExposureModelPositive1)) {
        [self setISOValue:500 exposeDuration:30];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
        [self setFPS:30];
    } else if (brightnessValue > 0 && brightnessValue < 1 && updateExporureModel(STExposureModel0)) {
        [self setISOValue:500 exposeDuration:30];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
        [self setFPS:30];
    } else if (brightnessValue > -1 && brightnessValue < 0 && updateExporureModel(STExposureModelNegative1)) {
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:40];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    } else if (brightnessValue > -2 && brightnessValue < -1 && updateExporureModel(STExposureModelNegative2)) {
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:35];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    } else if (brightnessValue > -2.5 && brightnessValue < -2 && updateExporureModel(STExposureModelNegative3)) {
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:30];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    } else if (brightnessValue > -3 && brightnessValue < -2.5 && updateExporureModel(STExposureModelNegative4)) {
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:25];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    } else if (brightnessValue > -3.5 && brightnessValue < -3 && updateExporureModel(STExposureModelNegative5)) {
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:20];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    } else if (brightnessValue > -4 && brightnessValue < -3.5 && updateExporureModel(STExposureModelNegative6)) {
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 250 exposeDuration:15];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    } else if (brightnessValue > -5 && brightnessValue < -4 && updateExporureModel(STExposureModelNegative7)) {
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 200 exposeDuration:10];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    } else if (brightnessValue < -5 && updateExporureModel(STExposureModelNegative8)) {
        [self setISOValue:self.videoDevice.activeFormat.maxISO - 150 exposeDuration:5];
        [self test:AVCaptureExposureModeContinuousAutoExposure];
    }
//    NSLog(@"current brightness %f min iso %f max iso %f min exposure %f max exposure %f", brightnessValue, self.videoDevice.activeFormat.minISO, self.videoDevice.activeFormat.maxISO, CMTimeGetSeconds(self.videoDevice.activeFormat.minExposureDuration), CMTimeGetSeconds(self.videoDevice.activeFormat.maxExposureDuration));
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (!self.bSessionPause) {
        if (self.delegate && [self.delegate respondsToSelector:@selector(rtcCameraVideoCapturer:didOutputSampleBuffer:)]) {
            [self.delegate rtcCameraVideoCapturer:self didOutputSampleBuffer:sampleBuffer];
        }
//        if (self.delegate && [self.delegate respondsToSelector:@selector(captureOutput:didOutputSampleBuffer:fromConnection:)]) {
//            //[connection setVideoOrientation:AVCaptureVideoOrientationPortrait];
//            [self.delegate captureOutput:captureOutput didOutputSampleBuffer:sampleBuffer fromConnection:connection];
//        }
    }
//    [self updateExposure:sampleBuffer];
}

- (void)captureOutput:(AVCaptureOutput *)output didOutputMetadataObjects:(NSArray<__kindof AVMetadataObject *> *)metadataObjects fromConnection:(AVCaptureConnection *)connection {
    AVMetadataFaceObject *faceObject = nil;
    for (AVMetadataObject *object in metadataObjects) {
        if (AVMetadataObjectTypeFace == object.type) {
            faceObject = (AVMetadataFaceObject *)object;
        }
    }
    static BOOL hasFace = NO;
    if (!hasFace && faceObject.faceID) {
        hasFace = YES;
    }
    if (!faceObject.faceID) {
        hasFace = NO;
    }
}

#pragma mark - Notifications

- (void)addObservers {
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(appWillResignActive) name:UIApplicationWillResignActiveNotification object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(appDidBecomeActive) name:UIApplicationDidBecomeActiveNotification object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(dealCaptureSessionRuntimeError:)
                                                 name:AVCaptureSessionRuntimeErrorNotification
                                               object:self.session];
}

- (void)removeObservers {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

- (void)appWillResignActive {
    RTCLogInfo("SDCustomRTCCameraCapturer appWillResignActive");
}

- (void)appDidBecomeActive {
    RTCLogInfo("SDCustomRTCCameraCapturer appDidBecomeActive");
    [self startRunning];
}

- (void)dealCaptureSessionRuntimeError:(NSNotification *)notification {
    NSError *error = [notification.userInfo objectForKey:AVCaptureSessionErrorKey];
    RTCLogError(@"SDCustomRTCCameraCapturer dealCaptureSessionRuntimeError error: %@", error);
    if (error.code == AVErrorMediaServicesWereReset) {
        [self startRunning];
    }
}

#pragma mark - Dealloc

- (void)dealloc {
    DebugLog(@"SDCustomRTCCameraCapturer dealloc");
    [self removeObservers];
    if (self.session) {
        self.bSessionPause = YES;
        [self.session beginConfiguration];
        [self.session removeOutput:self.dataOutput];
        [self.session removeInput:self.deviceInput];
        [self.session commitConfiguration];
        if ([self.session isRunning]) {
            [self.session stopRunning];
        }
        self.session = nil;
    }
}

@end
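With the capturer owning the session, device-level controls become one-line calls from the app layer. A hypothetical usage sketch (tapPoint and previewView are stand-ins, not names from the post):

// Hypothetical usage of the custom capturer's camera controls.
SDCustomRTCCameraCapturer *capturer = (SDCustomRTCCameraCapturer *)self.videoCapturer;
[capturer setWhiteBalance];   // continuous auto white balance
[capturer setISOValue:0.5];   // exposure target bias
[capturer setExposurePoint:tapPoint inPreviewFrame:self.previewView.frame];
[capturer rotateCamera];      // toggle front/back camera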

3. Implementing WebRTC Video Calling with ossrs

For the iOS WebRTC + ossrs live video calling setup, see:
https://blog.csdn.net/gloryFlow/article/details/132262724

The changes required are listed below.

Use SDCustomRTCCameraCapturer in createVideoTrack:

- (RTCVideoTrack *)createVideoTrack {
    RTCVideoSource *videoSource = [self.factory videoSource];
    self.localVideoSource = videoSource;
    // On the simulator, capture from a video file instead of the camera.
    if (TARGET_IPHONE_SIMULATOR) {
        if (@available(iOS 10, *)) {
            self.videoCapturer = [[RTCFileVideoCapturer alloc] initWithDelegate:self];
        } else {
            // Fallback on earlier versions
        }
    } else {
        self.videoCapturer = [[SDCustomRTCCameraCapturer alloc] initWithDevicePosition:AVCaptureDevicePositionFront sessionPresset:AVCaptureSessionPreset1920x1080 fps:20 needYuvOutput:NO];
//        self.videoCapturer = [[SDCustomRTCCameraCapturer alloc] initWithDelegate:self];
    }
    RTCVideoTrack *videoTrack = [self.factory videoTrackWithSource:videoSource trackId:@"video0"];
    return videoTrack;
}

When the local video needs to be rendered on screen, call startCaptureLocalVideo to attach a rendering view to localVideoTrack. The renderer is an RTCMTLVideoView:

self.localRenderer = [[RTCMTLVideoView alloc] initWithFrame:CGRectZero];
self.localRenderer.delegate = self;

- (void)startCaptureLocalVideo:(id<RTCVideoRenderer>)renderer {
    if (!self.isPublish) {
        return;
    }
    if (!renderer) {
        return;
    }
    if (!self.videoCapturer) {
        return;
    }
    [self setDegradationPreference:RTCDegradationPreferenceMaintainResolution];
    RTCVideoCapturer *capturer = self.videoCapturer;
    if ([capturer isKindOfClass:[SDCustomRTCCameraCapturer class]]) {
        AVCaptureDevice *camera = [self findDeviceForPosition:self.usingFrontCamera ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack];
        SDCustomRTCCameraCapturer *cameraVideoCapturer = (SDCustomRTCCameraCapturer *)capturer;
        [cameraVideoCapturer setISOValue:0.0];
        [cameraVideoCapturer rotateCamera:self.usingFrontCamera];
        self.videoCapturer.delegate = self;
        AVCaptureDeviceFormat *formatNilable = [self selectFormatForDevice:camera];
        if (!formatNilable) {
            return;
        }
        DebugLog(@"formatNilable:%@", formatNilable);
        NSInteger fps = [self selectFpsForFormat:formatNilable];
        CMVideoDimensions videoVideoDimensions = CMVideoFormatDescriptionGetDimensions(formatNilable.formatDescription);
        float width = videoVideoDimensions.width;
        float height = videoVideoDimensions.height;
        DebugLog(@"videoVideoDimensions width:%f,height:%f", width, height);
        [cameraVideoCapturer startRunning];
//        [cameraVideoCapturer startCaptureWithDevice:camera format:formatNilable fps:fps completionHandler:^(NSError *error) {
//            DebugLog(@"startCaptureWithDevice error:%@", error);
//        }];
        [self changeResolution:width height:height fps:(int)fps];
    }
    if (@available(iOS 10, *)) {
        if ([capturer isKindOfClass:[RTCFileVideoCapturer class]]) {
            RTCFileVideoCapturer *fileVideoCapturer = (RTCFileVideoCapturer *)capturer;
            [fileVideoCapturer startCapturingFromFileNamed:@"beautyPicture.mp4" onError:^(NSError * _Nonnull error) {
                DebugLog(@"startCaptureLocalVideo startCapturingFromFileNamed error:%@", error);
            }];
        }
    } else {
        // Fallback on earlier versions
    }
    [self.localVideoTrack addRenderer:renderer];
}
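The changeResolution:height:fps: method called above is declared in WebRTCClient.h but its implementation is not shown in this post. With RTCVideoSource it essentially reduces to adaptOutputFormatToWidth:height:fps:. A minimal sketch, assuming the method simply forwards to the local video source:

// A sketch (assumption, not the post's original implementation): ask WebRTC
// to scale/drop frames so the outgoing stream matches the requested
// resolution and frame rate.
- (BOOL)changeResolution:(int)width height:(int)height fps:(int)fps {
    if (!self.localVideoSource) {
        return NO;
    }
    [self.localVideoSource adaptOutputFormatToWidth:width height:height fps:fps];
    return YES;
}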

The captured camera frames arrive through - (void)rtcCameraVideoCapturer:(RTCVideoCapturer *)capturer didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer;

The concrete code is as follows:

- (void)rtcCameraVideoCapturer:(RTCVideoCapturer *)capturer didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
    // WebRTC expects the capture timestamp in nanoseconds.
    int64_t timeStampNs = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1000000000;
    // The capture connection is already set to portrait and mirrored, so no
    // extra rotation is applied here.
    RTCVideoFrame *rtcVideoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                                                rotation:RTCVideoRotation_0
                                                             timeStampNs:timeStampNs];
    [self.localVideoSource capturer:capturer didCaptureVideoFrame:rtcVideoFrame];
}

The complete WebRTCClient code is as follows.

WebRTCClient.h

#import <Foundation/Foundation.h>
#import <WebRTC/WebRTC.h>
#import <UIKit/UIKit.h>
#import "SDCustomRTCCameraCapturer.h"#define kSelectedResolution @"kSelectedResolutionIndex"
#define kFramerateLimit 30.0@protocol WebRTCClientDelegate;
@interface WebRTCClient : NSObject@property (nonatomic, weak) id<WebRTCClientDelegate> delegate;/**connect工厂*/
@property (nonatomic, strong) RTCPeerConnectionFactory *factory;/**是否push*/
@property (nonatomic, assign) BOOL isPublish;/**connect*/
@property (nonatomic, strong) RTCPeerConnection *peerConnection;/**RTCAudioSession*/
@property (nonatomic, strong) RTCAudioSession *rtcAudioSession;/**DispatchQueue*/
@property (nonatomic) dispatch_queue_t audioQueue;/**mediaConstrains*/
@property (nonatomic, strong) NSDictionary *mediaConstrains;/**publishMediaConstrains*/
@property (nonatomic, strong) NSDictionary *publishMediaConstrains;/**playMediaConstrains*/
@property (nonatomic, strong) NSDictionary *playMediaConstrains;/**optionalConstraints*/
@property (nonatomic, strong) NSDictionary *optionalConstraints;/**RTCVideoCapturer摄像头采集器*/
@property (nonatomic, strong) RTCVideoCapturer *videoCapturer;/**local语音localAudioTrack*/
@property (nonatomic, strong) RTCAudioTrack *localAudioTrack;/**localVideoTrack*/
@property (nonatomic, strong) RTCVideoTrack *localVideoTrack;/**remoteVideoTrack*/
@property (nonatomic, strong) RTCVideoTrack *remoteVideoTrack;/**RTCVideoRenderer*/
@property (nonatomic, weak) id<RTCVideoRenderer> remoteRenderView;/**localDataChannel*/
@property (nonatomic, strong) RTCDataChannel *localDataChannel;/**localDataChannel*/
@property (nonatomic, strong) RTCDataChannel *remoteDataChannel;/**RTCVideoSource*/
@property (nonatomic, strong) RTCVideoSource *localVideoSource;- (instancetype)initWithPublish:(BOOL)isPublish;- (void)startCaptureLocalVideo:(id<RTCVideoRenderer>)renderer;- (void)addIceCandidate:(RTCIceCandidate *)candidate;- (void)answer:(void (^)(RTCSessionDescription *sdp))completionHandler;- (void)offer:(void (^)(RTCSessionDescription *sdp))completionHandler;- (void)setRemoteSdp:(RTCSessionDescription *)remoteSdp completion:(void (^)(NSError *error))completion;- (void)setRemoteCandidate:(RTCIceCandidate *)remoteCandidate;- (BOOL)changeResolution:(int)width height:(int)height fps:(int)fps;- (NSArray<NSString *> *)availableVideoResolutions;#pragma mark - switchCamera
- (void)switchCamera:(id<RTCVideoRenderer>)renderer;#pragma mark - Hiden or show Video
- (void)hidenVideo;- (void)showVideo;#pragma mark - Hiden or show Audio
- (void)muteAudio;- (void)unmuteAudio;- (void)speakOff;- (void)speakOn;#pragma mark - 设置视频码率BitrateBps
- (void)setMaxBitrate:(int)maxBitrate;- (void)setMinBitrate:(int)minBitrate;#pragma mark - 设置视频帧率
- (void)setMaxFramerate:(int)maxFramerate;- (void)close;@end@protocol WebRTCClientDelegate <NSObject>// 处理美颜设置
- (RTCVideoFrame *)webRTCClient:(WebRTCClient *)client didCaptureSampleBuffer:(CMSampleBufferRef)sampleBuffer;- (void)webRTCClient:(WebRTCClient *)client didDiscoverLocalCandidate:(RTCIceCandidate *)candidate;
- (void)webRTCClient:(WebRTCClient *)client didChangeConnectionState:(RTCIceConnectionState)state;
- (void)webRTCClient:(WebRTCClient *)client didReceiveData:(NSData *)data;@end
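The didCaptureSampleBuffer: method above is the hook where an app can run a beauty filter on each captured frame before it is sent. A minimal pass-through sketch of a conforming implementation (illustrative; real filtering code would replace the marked comment):

// A pass-through sketch of the beauty-filter hook (assumption: a real
// implementation processes pixelBuffer through a filter before wrapping it).
- (RTCVideoFrame *)webRTCClient:(WebRTCClient *)client didCaptureSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // ... apply beauty/filter processing to pixelBuffer here ...
    RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
    int64_t timeStampNs = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1000000000;
    return [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                        rotation:RTCVideoRotation_0
                                     timeStampNs:timeStampNs];
}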

WebRTCClient.m

#import "WebRTCClient.h"@interface WebRTCClient ()<RTCPeerConnectionDelegate, RTCDataChannelDelegate, RTCVideoCapturerDelegate, SDCustomRTCCameraCapturerDelegate>@property (nonatomic, assign) BOOL usingFrontCamera;@end@implementation WebRTCClient- (instancetype)initWithPublish:(BOOL)isPublish {self = [super init];if (self) {self.isPublish = isPublish;self.usingFrontCamera = YES;RTCMediaConstraints *constraints = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:self.publishMediaConstrains optionalConstraints:self.optionalConstraints];RTCConfiguration *newConfig = [[RTCConfiguration alloc] init];newConfig.sdpSemantics = RTCSdpSemanticsUnifiedPlan;newConfig.continualGatheringPolicy = RTCContinualGatheringPolicyGatherContinually;self.peerConnection = [self.factory peerConnectionWithConfiguration:newConfig constraints:constraints delegate:nil];[self createMediaSenders];[self createMediaReceivers];// srs not support data channel.// self.createDataChannel()[self configureAudioSession];self.peerConnection.delegate = self;}return self;
}- (void)createMediaSenders {if (!self.isPublish) {return;}NSString *streamId = @"stream";// AudioRTCAudioTrack *audioTrack = [self createAudioTrack];self.localAudioTrack = audioTrack;RTCRtpTransceiverInit *audioTrackTransceiver = [[RTCRtpTransceiverInit alloc] init];audioTrackTransceiver.direction = RTCRtpTransceiverDirectionSendOnly;audioTrackTransceiver.streamIds = @[streamId];[self.peerConnection addTransceiverWithTrack:audioTrack init:audioTrackTransceiver];// VideoRTCRtpEncodingParameters *encodingParameters = [[RTCRtpEncodingParameters alloc] init];encodingParameters.maxBitrateBps = @(6000000);
//    encodingParameters.bitratePriority = 1.0;RTCVideoTrack *videoTrack = [self createVideoTrack];self.localVideoTrack = videoTrack;RTCRtpTransceiverInit *videoTrackTransceiver = [[RTCRtpTransceiverInit alloc] init];videoTrackTransceiver.direction = RTCRtpTransceiverDirectionSendOnly;videoTrackTransceiver.streamIds = @[streamId];// 设置该属性后,SRS服务无法播放视频画面
//    videoTrackTransceiver.sendEncodings = @[encodingParameters];[self.peerConnection addTransceiverWithTrack:videoTrack init:videoTrackTransceiver];[self setDegradationPreference:RTCDegradationPreferenceBalanced];[self setMaxBitrate:6000000];
}- (void)createMediaReceivers {if (!self.isPublish) {return;}if (self.peerConnection.transceivers.count > 0) {RTCRtpTransceiver *transceiver = self.peerConnection.transceivers.firstObject;if (transceiver.mediaType == RTCRtpMediaTypeVideo) {RTCVideoTrack *track = (RTCVideoTrack *)transceiver.receiver.track;self.remoteVideoTrack = track;}}
}- (void)configureAudioSession {[self.rtcAudioSession lockForConfiguration];@try {RTCAudioSessionConfiguration *configuration =[[RTCAudioSessionConfiguration alloc] init];configuration.category = AVAudioSessionCategoryPlayAndRecord;configuration.categoryOptions = AVAudioSessionCategoryOptionDefaultToSpeaker;configuration.mode = AVAudioSessionModeDefault;BOOL hasSucceeded = NO;NSError *error = nil;if (self.rtcAudioSession.isActive) {hasSucceeded = [self.rtcAudioSession setConfiguration:configuration error:&error];} else {hasSucceeded = [self.rtcAudioSession setConfiguration:configurationactive:YESerror:&error];}if (!hasSucceeded) {DebugLog(@"Error setting configuration: %@", error.localizedDescription);}} @catch (NSException *exception) {DebugLog(@"configureAudioSession exception:%@", exception);}[self.rtcAudioSession unlockForConfiguration];
}- (RTCAudioTrack *)createAudioTrack {/// enable google 3A algorithm.NSDictionary *mandatory = @{@"googEchoCancellation": kRTCMediaConstraintsValueTrue,@"googAutoGainControl": kRTCMediaConstraintsValueTrue,@"googNoiseSuppression": kRTCMediaConstraintsValueTrue,};RTCMediaConstraints *audioConstrains = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:mandatory optionalConstraints:self.optionalConstraints];RTCAudioSource *audioSource = [self.factory audioSourceWithConstraints:audioConstrains];RTCAudioTrack *audioTrack = [self.factory audioTrackWithSource:audioSource trackId:@"audio0"];return audioTrack;
}- (RTCVideoTrack *)createVideoTrack {RTCVideoSource *videoSource = [self.factory videoSource];self.localVideoSource = videoSource;// 如果是模拟器if (TARGET_IPHONE_SIMULATOR) {if (@available(iOS 10, *)) {self.videoCapturer = [[RTCFileVideoCapturer alloc] initWithDelegate:self];} else {// Fallback on earlier versions}} else{self.videoCapturer = [[SDCustomRTCCameraCapturer alloc] initWithDevicePosition:AVCaptureDevicePositionFront sessionPresset:AVCaptureSessionPreset1920x1080 fps:20 needYuvOutput:NO];//        self.videoCapturer = [[SDCustomRTCCameraCapturer alloc] initWithDelegate:self];}RTCVideoTrack *videoTrack = [self.factory videoTrackWithSource:videoSource trackId:@"video0"];return videoTrack;
}- (void)addIceCandidate:(RTCIceCandidate *)candidate {[self.peerConnection addIceCandidate:candidate];
}- (void)offer:(void (^)(RTCSessionDescription *sdp))completion {if (self.isPublish) {self.mediaConstrains = self.publishMediaConstrains;} else {self.mediaConstrains = self.playMediaConstrains;}RTCMediaConstraints *constrains = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:self.mediaConstrains optionalConstraints:self.optionalConstraints];DebugLog(@"peerConnection:%@",self.peerConnection);__weak typeof(self) weakSelf = self;[weakSelf.peerConnection offerForConstraints:constrains completionHandler:^(RTCSessionDescription * _Nullable sdp, NSError * _Nullable error) {if (error) {DebugLog(@"offer offerForConstraints error:%@", error);}if (sdp) {[weakSelf.peerConnection setLocalDescription:sdp completionHandler:^(NSError * _Nullable error) {if (error) {DebugLog(@"offer setLocalDescription error:%@", error);}if (completion) {completion(sdp);}}];}}];
}- (void)answer:(void (^)(RTCSessionDescription *sdp))completion {RTCMediaConstraints *constrains = [[RTCMediaConstraints alloc] initWithMandatoryConstraints:self.mediaConstrains optionalConstraints:self.optionalConstraints];__weak typeof(self) weakSelf = self;[weakSelf.peerConnection answerForConstraints:constrains completionHandler:^(RTCSessionDescription * _Nullable sdp, NSError * _Nullable error) {if (error) {DebugLog(@"answer answerForConstraints error:%@", error);}if (sdp) {[weakSelf.peerConnection setLocalDescription:sdp completionHandler:^(NSError * _Nullable error) {if (error) {DebugLog(@"answer setLocalDescription error:%@", error);}if (completion) {completion(sdp);}}];}}];
}- (void)setRemoteSdp:(RTCSessionDescription *)remoteSdp completion:(void (^)(NSError *error))completion {[self.peerConnection setRemoteDescription:remoteSdp completionHandler:completion];
}- (void)setRemoteCandidate:(RTCIceCandidate *)remoteCandidate {[self.peerConnection addIceCandidate:remoteCandidate];
}- (void)setMaxBitrate:(int)maxBitrate {if (self.peerConnection.senders.count > 0) {RTCRtpSender *videoSender = self.peerConnection.senders.firstObject;RTCRtpParameters *parametersToModify = videoSender.parameters;for (RTCRtpEncodingParameters *encoding in parametersToModify.encodings) {encoding.maxBitrateBps = @(maxBitrate);}[videoSender setParameters:parametersToModify];}
}

- (void)setDegradationPreference:(RTCDegradationPreference)degradationPreference {
    // e.g. RTCDegradationPreferenceMaintainResolution
    for (RTCRtpSender *sender in self.peerConnection.senders) {
        if (sender.track && [kRTCMediaStreamTrackKindVideo isEqualToString:sender.track.kind]) {
            RTCRtpParameters *parameters = sender.parameters;
            parameters.degradationPreference = [NSNumber numberWithInteger:degradationPreference];
            [sender setParameters:parameters];
        }
    }
}

- (void)setMinBitrate:(int)minBitrate {
    if (self.peerConnection.senders.count > 0) {
        RTCRtpSender *videoSender = self.peerConnection.senders.firstObject;
        RTCRtpParameters *parametersToModify = videoSender.parameters;
        for (RTCRtpEncodingParameters *encoding in parametersToModify.encodings) {
            encoding.minBitrateBps = @(minBitrate);
        }
        [videoSender setParameters:parametersToModify];
    }
}

- (void)setMaxFramerate:(int)maxFramerate {
    if (self.peerConnection.senders.count > 0) {
        RTCRtpSender *videoSender = self.peerConnection.senders.firstObject;
        RTCRtpParameters *parametersToModify = videoSender.parameters;
        // Note: older SDK builds do not expose maxFramerate; update to a recent WebRTC version if this does not compile.
        for (RTCRtpEncodingParameters *encoding in parametersToModify.encodings) {
            encoding.maxFramerate = @(maxFramerate);
        }
        [videoSender setParameters:parametersToModify];
    }
}

#pragma mark - Private
- (AVCaptureDevice *)findDeviceForPosition:(AVCaptureDevicePosition)position {
    NSArray<AVCaptureDevice *> *captureDevices = [SDCustomRTCCameraCapturer captureDevices];
    for (AVCaptureDevice *device in captureDevices) {
        if (device.position == position) {
            return device;
        }
    }
    return captureDevices[0];
}

- (NSArray<NSString *> *)availableVideoResolutions {
    NSMutableSet<NSArray<NSNumber *> *> *resolutions = [[NSMutableSet<NSArray<NSNumber *> *> alloc] init];
    for (AVCaptureDevice *device in [SDCustomRTCCameraCapturer captureDevices]) {
        for (AVCaptureDeviceFormat *format in [SDCustomRTCCameraCapturer supportedFormatsForDevice:device]) {
            CMVideoDimensions resolution = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
            NSArray<NSNumber *> *resolutionObject = @[ @(resolution.width), @(resolution.height) ];
            [resolutions addObject:resolutionObject];
        }
    }
    NSArray<NSArray<NSNumber *> *> *sortedResolutions = [[resolutions allObjects]
        sortedArrayUsingComparator:^NSComparisonResult(NSArray<NSNumber *> *obj1, NSArray<NSNumber *> *obj2) {
            NSComparisonResult cmp = [obj1.firstObject compare:obj2.firstObject];
            if (cmp != NSOrderedSame) {
                return cmp;
            }
            return [obj1.lastObject compare:obj2.lastObject];
        }];
    NSMutableArray<NSString *> *resolutionStrings = [[NSMutableArray<NSString *> alloc] init];
    for (NSArray<NSNumber *> *resolution in sortedResolutions) {
        NSString *resolutionString = [NSString stringWithFormat:@"%@x%@", resolution.firstObject, resolution.lastObject];
        [resolutionStrings addObject:resolutionString];
    }
    return [resolutionStrings copy];
}

- (int)videoResolutionComponentAtIndex:(int)index inString:(NSString *)resolution {
    if (index != 0 && index != 1) {
        return 0;
    }
    NSArray<NSString *> *components = [resolution componentsSeparatedByString:@"x"];
    if (components.count != 2) {
        return 0;
    }
    return components[index].intValue;
}

- (AVCaptureDeviceFormat *)selectFormatForDevice:(AVCaptureDevice *)device {
    RTCVideoCapturer *capturer = self.videoCapturer;
    if ([capturer isKindOfClass:[SDCustomRTCCameraCapturer class]]) {
        SDCustomRTCCameraCapturer *cameraVideoCapturer = (SDCustomRTCCameraCapturer *)capturer;
        NSArray *availableVideoResolutions = [self availableVideoResolutions];
        NSString *selectedIndexStr = [[NSUserDefaults standardUserDefaults] objectForKey:kSelectedResolution];
        int selectedIndex = 0;
        if (selectedIndexStr && selectedIndexStr.length > 0) {
            selectedIndex = [selectedIndexStr intValue];
        }
        NSString *videoResolution = [availableVideoResolutions objectAtIndex:selectedIndex];
        DebugLog(@"availableVideoResolutions:%@, videoResolution:%@", availableVideoResolutions, videoResolution);

        NSArray<AVCaptureDeviceFormat *> *formats = [SDCustomRTCCameraCapturer supportedFormatsForDevice:device];
        int targetWidth = [self videoResolutionComponentAtIndex:0 inString:videoResolution];
        int targetHeight = [self videoResolutionComponentAtIndex:1 inString:videoResolution];
        AVCaptureDeviceFormat *selectedFormat = nil;
        int currentDiff = INT_MAX;

        // Pick the supported format closest to the target resolution; on a tie,
        // prefer the capturer's preferred output pixel format.
        for (AVCaptureDeviceFormat *format in formats) {
            CMVideoDimensions dimension = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
            FourCharCode pixelFormat = CMFormatDescriptionGetMediaSubType(format.formatDescription);
            int diff = abs(targetWidth - dimension.width) + abs(targetHeight - dimension.height);
            if (diff < currentDiff) {
                selectedFormat = format;
                currentDiff = diff;
            } else if (diff == currentDiff && pixelFormat == [cameraVideoCapturer preferredOutputPixelFormat]) {
                selectedFormat = format;
            }
        }
        return selectedFormat;
    }
    return nil;
}

- (NSInteger)selectFpsForFormat:(AVCaptureDeviceFormat *)format {
    Float64 maxSupportedFramerate = 0;
    for (AVFrameRateRange *fpsRange in format.videoSupportedFrameRateRanges) {
        maxSupportedFramerate = fmax(maxSupportedFramerate, fpsRange.maxFrameRate);
    }
//    return fmin(maxSupportedFramerate, kFramerateLimit);
    return 20;
}

- (void)startCaptureLocalVideo:(id<RTCVideoRenderer>)renderer {
    if (!self.isPublish) {
        return;
    }
    if (!renderer) {
        return;
    }
    if (!self.videoCapturer) {
        return;
    }
    [self setDegradationPreference:RTCDegradationPreferenceMaintainResolution];

    RTCVideoCapturer *capturer = self.videoCapturer;
    if ([capturer isKindOfClass:[SDCustomRTCCameraCapturer class]]) {
        AVCaptureDevice *camera = [self findDeviceForPosition:self.usingFrontCamera ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack];
        SDCustomRTCCameraCapturer *cameraVideoCapturer = (SDCustomRTCCameraCapturer *)capturer;
        [cameraVideoCapturer setISOValue:0.0];
        [cameraVideoCapturer rotateCamera:self.usingFrontCamera];
        self.videoCapturer.delegate = self;

        AVCaptureDeviceFormat *formatNilable = [self selectFormatForDevice:camera];
        if (!formatNilable) {
            return;
        }
        DebugLog(@"formatNilable:%@", formatNilable);

        NSInteger fps = [self selectFpsForFormat:formatNilable];
        CMVideoDimensions videoVideoDimensions = CMVideoFormatDescriptionGetDimensions(formatNilable.formatDescription);
        float width = videoVideoDimensions.width;
        float height = videoVideoDimensions.height;
        DebugLog(@"videoVideoDimensions width:%f,height:%f", width, height);

        [cameraVideoCapturer startRunning];
//        [cameraVideoCapturer startCaptureWithDevice:camera format:formatNilable fps:fps completionHandler:^(NSError *error) {
//            DebugLog(@"startCaptureWithDevice error:%@", error);
//        }];
        [self changeResolution:width height:height fps:(int)fps];
    }

    if (@available(iOS 10, *)) {
        if ([capturer isKindOfClass:[RTCFileVideoCapturer class]]) {
            RTCFileVideoCapturer *fileVideoCapturer = (RTCFileVideoCapturer *)capturer;
            [fileVideoCapturer startCapturingFromFileNamed:@"beautyPicture.mp4" onError:^(NSError * _Nonnull error) {
                DebugLog(@"startCaptureLocalVideo startCapturingFromFileNamed error:%@", error);
            }];
        }
    } else {
        // Fallback on earlier versions
    }
    [self.localVideoTrack addRenderer:renderer];
}

- (void)renderRemoteVideo:(id<RTCVideoRenderer>)renderer {
    // Remote video is only rendered on the playing side.
    if (self.isPublish) {
        return;
    }
    self.remoteRenderView = renderer;
}

- (RTCDataChannel *)createDataChannel {
    RTCDataChannelConfiguration *config = [[RTCDataChannelConfiguration alloc] init];
    RTCDataChannel *dataChannel = [self.peerConnection dataChannelForLabel:@"WebRTCData" configuration:config];
    if (!dataChannel) {
        return nil;
    }
    dataChannel.delegate = self;
    self.localDataChannel = dataChannel;
    return dataChannel;
}

- (void)sendData:(NSData *)data {
    RTCDataBuffer *buffer = [[RTCDataBuffer alloc] initWithData:data isBinary:YES];
    [self.remoteDataChannel sendData:buffer];
}

#pragma mark - switchCamera

- (void)switchCamera:(id<RTCVideoRenderer>)renderer {
    self.usingFrontCamera = !self.usingFrontCamera;
    [self startCaptureLocalVideo:renderer];
}

#pragma mark - Change resolution

- (BOOL)changeResolution:(int)width height:(int)height fps:(int)fps {
    // Create and keep a source if none exists yet, so the adaptation actually takes effect.
    if (!self.localVideoSource) {
        self.localVideoSource = [self.factory videoSource];
    }
    [self.localVideoSource adaptOutputFormatToWidth:width height:height fps:fps];
    return YES;
}

#pragma mark - Hide or show Video

- (void)hidenVideo {
    [self setVideoEnabled:NO];
    self.localVideoTrack.isEnabled = NO;
}

- (void)showVideo {
    [self setVideoEnabled:YES];
    self.localVideoTrack.isEnabled = YES;
}

- (void)setVideoEnabled:(BOOL)isEnabled {
    [self setTrackEnabled:[RTCVideoTrack class] isEnabled:isEnabled];
}

- (void)setTrackEnabled:(Class)track isEnabled:(BOOL)isEnabled {
    // Toggle every sender track of the given class (RTCVideoTrack / RTCAudioTrack).
    for (RTCRtpTransceiver *transceiver in self.peerConnection.transceivers) {
        RTCMediaStreamTrack *senderTrack = transceiver.sender.track;
        if (senderTrack && [senderTrack isKindOfClass:track]) {
            senderTrack.isEnabled = isEnabled;
        }
    }
}

#pragma mark - Mute or unmute Audio

- (void)muteAudio {
    [self setAudioEnabled:NO];
    self.localAudioTrack.isEnabled = NO;
}

- (void)unmuteAudio {
    [self setAudioEnabled:YES];
    self.localAudioTrack.isEnabled = YES;
}

- (void)speakOff {
    __weak typeof(self) weakSelf = self;
    dispatch_async(self.audioQueue, ^{
        [weakSelf.rtcAudioSession lockForConfiguration];
        @try {
            NSError *error;
            [weakSelf.rtcAudioSession setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker error:&error];
            NSError *ooapError;
            [weakSelf.rtcAudioSession overrideOutputAudioPort:AVAudioSessionPortOverrideNone error:&ooapError];
            DebugLog(@"speakOff error:%@, ooapError:%@", error, ooapError);
        } @catch (NSException *exception) {
            DebugLog(@"speakOff exception:%@", exception);
        }
        [weakSelf.rtcAudioSession unlockForConfiguration];
    });
}

- (void)speakOn {
    __weak typeof(self) weakSelf = self;
    dispatch_async(self.audioQueue, ^{
        [weakSelf.rtcAudioSession lockForConfiguration];
        @try {
            NSError *error;
            [weakSelf.rtcAudioSession setCategory:AVAudioSessionCategoryPlayAndRecord withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker error:&error];
            NSError *ooapError;
            [weakSelf.rtcAudioSession overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:&ooapError];
            NSError *activeError;
            [weakSelf.rtcAudioSession setActive:YES error:&activeError];
            DebugLog(@"speakOn error:%@, ooapError:%@, activeError:%@", error, ooapError, activeError);
        } @catch (NSException *exception) {
            DebugLog(@"speakOn exception:%@", exception);
        }
        [weakSelf.rtcAudioSession unlockForConfiguration];
    });
}

- (void)setAudioEnabled:(BOOL)isEnabled {
    [self setTrackEnabled:[RTCAudioTrack class] isEnabled:isEnabled];
}

- (void)close {
    RTCVideoCapturer *capturer = self.videoCapturer;
    if ([capturer isKindOfClass:[SDCustomRTCCameraCapturer class]]) {
        SDCustomRTCCameraCapturer *cameraVideoCapturer = (SDCustomRTCCameraCapturer *)capturer;
        [cameraVideoCapturer stopRunning];
    }
    [self.peerConnection close];
    self.peerConnection = nil;
    self.videoCapturer = nil;
}

#pragma mark - RTCPeerConnectionDelegate

/** Called when the SignalingState changed. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didChangeSignalingState:(RTCSignalingState)stateChanged {
    DebugLog(@"peerConnection didChangeSignalingState:%ld", (long)stateChanged);
}

/** Called when media is received on a new stream from remote peer. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didAddStream:(RTCMediaStream *)stream {
    DebugLog(@"peerConnection didAddStream");
    if (self.isPublish) {
        return;
    }

    NSArray *videoTracks = stream.videoTracks;
    if (videoTracks && videoTracks.count > 0) {
        RTCVideoTrack *track = videoTracks.firstObject;
        self.remoteVideoTrack = track;
    }

    if (self.remoteVideoTrack && self.remoteRenderView) {
        id<RTCVideoRenderer> remoteRenderView = self.remoteRenderView;
        RTCVideoTrack *remoteVideoTrack = self.remoteVideoTrack;
        [remoteVideoTrack addRenderer:remoteRenderView];
    }

    /**
     if let audioTrack = stream.audioTracks.first {
         print("audio track found")
         audioTrack.source.volume = 8
     }
     */
}

/** Called when a remote peer closes a stream.
 *  This is not called when RTCSdpSemanticsUnifiedPlan is specified.
 */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didRemoveStream:(RTCMediaStream *)stream {
    DebugLog(@"peerConnection didRemoveStream");
}

/** Called when negotiation is needed, for example ICE has restarted. */
- (void)peerConnectionShouldNegotiate:(RTCPeerConnection *)peerConnection {
    DebugLog(@"peerConnection peerConnectionShouldNegotiate");
}

/** Called any time the IceConnectionState changes. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didChangeIceConnectionState:(RTCIceConnectionState)newState {
    DebugLog(@"peerConnection didChangeIceConnectionState:%ld", (long)newState);
    if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didChangeConnectionState:)]) {
        [self.delegate webRTCClient:self didChangeConnectionState:newState];
    }
    if (RTCIceConnectionStateConnected == newState || RTCIceConnectionStateChecking == newState) {
        [self setDegradationPreference:RTCDegradationPreferenceMaintainResolution];
    }
}

/** Called any time the IceGatheringState changes. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didChangeIceGatheringState:(RTCIceGatheringState)newState {
    DebugLog(@"peerConnection didChangeIceGatheringState:%ld", (long)newState);
}

/** New ice candidate has been found. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didGenerateIceCandidate:(RTCIceCandidate *)candidate {
    DebugLog(@"peerConnection didGenerateIceCandidate:%@", candidate);
    if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didDiscoverLocalCandidate:)]) {
        [self.delegate webRTCClient:self didDiscoverLocalCandidate:candidate];
    }
}

/** Called when a group of local Ice candidates have been removed. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didRemoveIceCandidates:(NSArray<RTCIceCandidate *> *)candidates {
    DebugLog(@"peerConnection didRemoveIceCandidates:%@", candidates);
}

/** New data channel has been opened. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didOpenDataChannel:(RTCDataChannel *)dataChannel {
    DebugLog(@"peerConnection didOpenDataChannel:%@", dataChannel);
    self.remoteDataChannel = dataChannel;
}

/** Called when signaling indicates a transceiver will be receiving media from
 *  the remote endpoint.
 *  This is only called with RTCSdpSemanticsUnifiedPlan specified.
 */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didStartReceivingOnTransceiver:(RTCRtpTransceiver *)transceiver {
    DebugLog(@"peerConnection didStartReceivingOnTransceiver:%@", transceiver);
}

/** Called when a receiver and its track are created. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didAddReceiver:(RTCRtpReceiver *)rtpReceiver streams:(NSArray<RTCMediaStream *> *)mediaStreams {
    DebugLog(@"peerConnection didAddReceiver");
}

/** Called when the receiver and its track are removed. */
- (void)peerConnection:(RTCPeerConnection *)peerConnection didRemoveReceiver:(RTCRtpReceiver *)rtpReceiver {
    DebugLog(@"peerConnection didRemoveReceiver");
}

#pragma mark - RTCDataChannelDelegate

/** The data channel state changed. */
- (void)dataChannelDidChangeState:(RTCDataChannel *)dataChannel {
    DebugLog(@"dataChannelDidChangeState:%@", dataChannel);
}

/** The data channel successfully received a data buffer. */
- (void)dataChannel:(RTCDataChannel *)dataChannel didReceiveMessageWithBuffer:(RTCDataBuffer *)buffer {
    if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didReceiveData:)]) {
        [self.delegate webRTCClient:self didReceiveData:buffer.data];
    }
}

#pragma mark - RTCVideoCapturerDelegate

- (void)capturer:(RTCVideoCapturer *)capturer didCaptureVideoFrame:(RTCVideoFrame *)frame {
//    DebugLog(@"capturer:%@ didCaptureVideoFrame:%@", capturer, frame);
//    RTCVideoFrame *aFilterVideoFrame;
//    if (self.delegate && [self.delegate respondsToSelector:@selector(webRTCClient:didCaptureVideoFrame:)]) {
//        aFilterVideoFrame = [self.delegate webRTCClient:self didCaptureVideoFrame:frame];
//    }
//
//    // The pixel buffer can be reached via ((RTCCVPixelBuffer *)frame.buffer).pixelBuffer.
//    // Buffers created with the C APIs must be released manually with CVPixelBufferRelease,
//    // otherwise memory usage keeps climbing.
//
//    if (!aFilterVideoFrame) {
//        aFilterVideoFrame = frame;
//    }
//
//    [self.localVideoSource capturer:capturer didCaptureVideoFrame:aFilterVideoFrame];
}

- (void)rtcCameraVideoCapturer:(RTCVideoCapturer *)capturer didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
    int64_t timeStampNs = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1000000000;
    // RTCVideoRotation_0 assumes an already-upright buffer; use _90/_180/_270 to match
    // the device orientation if the rendered image comes out rotated.
    RTCVideoFrame *rtcVideoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                                                rotation:RTCVideoRotation_0
                                                             timeStampNs:timeStampNs];
    [self.localVideoSource capturer:capturer didCaptureVideoFrame:rtcVideoFrame];
}

#pragma mark - Lazy

- (RTCPeerConnectionFactory *)factory {
    if (!_factory) {
        RTCInitializeSSL();
        RTCDefaultVideoEncoderFactory *videoEncoderFactory = [[RTCDefaultVideoEncoderFactory alloc] init];
        RTCDefaultVideoDecoderFactory *videoDecoderFactory = [[RTCDefaultVideoDecoderFactory alloc] init];
        _factory = [[RTCPeerConnectionFactory alloc] initWithEncoderFactory:videoEncoderFactory decoderFactory:videoDecoderFactory];
    }
    return _factory;
}

- (dispatch_queue_t)audioQueue {
    if (!_audioQueue) {
        _audioQueue = dispatch_queue_create("cn.dface.webrtc", NULL);
    }
    return _audioQueue;
}

- (RTCAudioSession *)rtcAudioSession {
    if (!_rtcAudioSession) {
        _rtcAudioSession = [RTCAudioSession sharedInstance];
    }
    return _rtcAudioSession;
}

- (NSDictionary *)mediaConstrains {
    if (!_mediaConstrains) {
        _mediaConstrains = [[NSDictionary alloc] initWithObjectsAndKeys:
                            kRTCMediaConstraintsValueFalse, kRTCMediaConstraintsOfferToReceiveAudio,
                            kRTCMediaConstraintsValueFalse, kRTCMediaConstraintsOfferToReceiveVideo,
                            kRTCMediaConstraintsValueTrue, @"IceRestart",
                            nil];
    }
    return _mediaConstrains;
}

- (NSDictionary *)publishMediaConstrains {
    if (!_publishMediaConstrains) {
        _publishMediaConstrains = [[NSDictionary alloc] initWithObjectsAndKeys:
                                   kRTCMediaConstraintsValueFalse, kRTCMediaConstraintsOfferToReceiveAudio,
                                   kRTCMediaConstraintsValueFalse, kRTCMediaConstraintsOfferToReceiveVideo,
                                   kRTCMediaConstraintsValueTrue, @"IceRestart",
                                   kRTCMediaConstraintsValueTrue, @"googCpuOveruseDetection",
                                   kRTCMediaConstraintsValueFalse, @"googBandwidthLimitedResolution",
//                                 @"2160", kRTCMediaConstraintsMinWidth,
//                                 @"3840", kRTCMediaConstraintsMinHeight,
//                                 @"0.25", kRTCMediaConstraintsMinAspectRatio,
//                                 @"1", kRTCMediaConstraintsMaxAspectRatio,
                                   nil];
    }
    return _publishMediaConstrains;
}

- (NSDictionary *)playMediaConstrains {
    if (!_playMediaConstrains) {
        _playMediaConstrains = [[NSDictionary alloc] initWithObjectsAndKeys:
                                kRTCMediaConstraintsValueTrue, kRTCMediaConstraintsOfferToReceiveAudio,
                                kRTCMediaConstraintsValueTrue, kRTCMediaConstraintsOfferToReceiveVideo,
                                kRTCMediaConstraintsValueTrue, @"IceRestart",
                                kRTCMediaConstraintsValueTrue, @"googCpuOveruseDetection",
                                kRTCMediaConstraintsValueFalse, @"googBandwidthLimitedResolution",
//                              @"2160", kRTCMediaConstraintsMinWidth,
//                              @"3840", kRTCMediaConstraintsMinHeight,
//                              @"0.25", kRTCMediaConstraintsMinAspectRatio,
//                              @"1", kRTCMediaConstraintsMaxAspectRatio,
                                nil];
    }
    return _playMediaConstrains;
}

- (NSDictionary *)optionalConstraints {
    if (!_optionalConstraints) {
        _optionalConstraints = [[NSDictionary alloc] initWithObjectsAndKeys:
                                kRTCMediaConstraintsValueTrue, @"DtlsSrtpKeyAgreement",
                                kRTCMediaConstraintsValueTrue, @"googCpuOveruseDetection",
                                nil];
    }
    return _optionalConstraints;
}

- (void)dealloc {
    [self.peerConnection close];
    self.peerConnection = nil;
}

@end

At this point, the custom RTCVideoCapturer camera for WebRTC video is complete.
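To make the wiring concrete, here is a minimal usage sketch of the pieces above. It only uses methods that appear in this article (initWithDevicePosition:sessionPresset:fps:needYuvOutput:, rotateCamera:, startRunning, stopRunning); the surrounding controller that conforms to SDCustomRTCCameraCapturerDelegate is assumed for illustration.

// Minimal usage sketch (assumes a controller conforming to
// SDCustomRTCCameraCapturerDelegate, like the WebRTC client above).
SDCustomRTCCameraCapturer *capturer =
    [[SDCustomRTCCameraCapturer alloc] initWithDevicePosition:AVCaptureDevicePositionFront
                                               sessionPresset:AVCaptureSessionPreset1920x1080
                                                          fps:20
                                                needYuvOutput:NO];
capturer.delegate = self;     // receives CMSampleBufferRef callbacks
[capturer rotateCamera:YES];  // switch to the front camera
[capturer startRunning];      // begin pushing frames into the pipeline

// ...when the call ends:
[capturer stopRunning];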

Other
For setting up the ossrs service, see: https://blog.csdn.net/gloryFlow/article/details/132257196
For implementing iOS audio/video calls against ossrs, see: https://blog.csdn.net/gloryFlow/article/details/132262724
For the WebRTC call issue where high resolutions show no picture, see: https://blog.csdn.net/gloryFlow/article/details/132240952
For modifying the bitrate in the SDP, see: https://blog.csdn.net/gloryFlow/article/details/132263021
For GPUImage beauty filters in video calls, see: https://blog.csdn.net/gloryFlow/article/details/132265842
For RTC streaming of local or album videos, see: https://blog.csdn.net/gloryFlow/article/details/132267068

IV. Summary

WebRTC audio/video calls with a custom RTCVideoCapturer camera: the core idea is to obtain the CVPixelBufferRef captured by the camera, wrap the processed CVPixelBufferRef into an RTCVideoFrame, and feed it to WebRTC by calling didCaptureVideoFrame on the localVideoSource. This covers a lot of ground, so some descriptions may be imprecise; corrections are welcome.
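If a filter step is needed (for example the GPUImage beauty filter linked above), the commented-out capturer:didCaptureVideoFrame: body shows where it would slot in. Below is a hedged sketch of that hook; the webRTCClient:didCaptureVideoFrame: delegate method is taken from those comments and is an assumption about the surrounding delegate protocol, not a confirmed API.

// Sketch: route each captured frame through an optional filter delegate before
// handing it to the RTCVideoSource. Returning nil from the delegate means "no filter".
- (void)capturer:(RTCVideoCapturer *)capturer didCaptureVideoFrame:(RTCVideoFrame *)frame {
    RTCVideoFrame *filteredFrame = nil;
    if ([self.delegate respondsToSelector:@selector(webRTCClient:didCaptureVideoFrame:)]) {
        filteredFrame = [self.delegate webRTCClient:self didCaptureVideoFrame:frame];
    }
    if (!filteredFrame) {
        filteredFrame = frame; // fall back to the unfiltered frame
    }
    // Any CVPixelBufferRef created manually inside the filter must be released
    // with CVPixelBufferRelease, otherwise memory usage keeps growing.
    [self.localVideoSource capturer:capturer didCaptureVideoFrame:filteredFrame];
}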

https://blog.csdn.net/gloryFlow/article/details/132308673

A learning log: steady progress, every day.
