iOS: captureOutput:didOutputSampleBuffer:fromConnection is NOT called
Question
I want to pull frames from the live-feed of AVCaptureSession and I am using Apple's AVCam as a test case. Here is the link to AVCam:
https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html
I found that that captureOutput:didOutputSampleBuffer:fromConnection
is NOT called and I would like to know why or what I am doing wrong.
Here is what I did:
(1) I made AVCamViewController the delegate
@interface AVCamViewController () <AVCaptureFileOutputRecordingDelegate, AVCaptureVideoDataOutputSampleBufferDelegate>
(2) I created an AVCaptureVideoDataOutput object and added it to the session
AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
if ([session canAddOutput:videoDataOutput])
{
    [session addOutput:videoDataOutput];
}
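For reference, adding the output alone is not enough: if no sample-buffer delegate is set, no callbacks are delivered at all. The following is a minimal sketch of a fuller configuration; the queue name and the BGRA pixel-format choice are illustrative, not part of the original code.

```objc
AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];

// Drop frames rather than queue them if the delegate falls behind.
[videoDataOutput setAlwaysDiscardsLateVideoFrames:YES];

// Request uncompressed BGRA frames (convenient for Core Graphics processing).
[videoDataOutput setVideoSettings:@{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)}];

// The delegate and a serial queue MUST be set before frames will be delivered.
dispatch_queue_t videoQueue = dispatch_queue_create("video data queue", DISPATCH_QUEUE_SERIAL);
[videoDataOutput setSampleBufferDelegate:self queue:videoQueue];

if ([session canAddOutput:videoDataOutput])
{
    [session addOutput:videoDataOutput];
}
```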
(3) I added the delegate method and tested it by logging a random string
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"I am called");
}
The test application works but captureOutput:didOutputSampleBuffer:fromConnection is not called.
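(As an aside: when the delegate does fire, it hands over a CMSampleBufferRef. A minimal sketch of reading the frame, beyond just logging, might look like this; the log message is illustrative.)

```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // A video buffer carries its pixels as a CVImageBuffer.
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL)
    {
        return; // e.g. an audio sample buffer from an audio connection
    }

    // Lock the base address before touching pixel data.
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    NSLog(@"Got a %zu x %zu frame", width, height);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
}
```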
(4) I read on SO that the session variable in AVCaptureSession *session = [[AVCaptureSession alloc] init];
being local in viewDidLoad is a possible reason why the delegate is not called and I made it an instance variable of the AVCamViewController class, yet it is not called.
Here is the viewDidLoad method I am testing with (taken from AVCam); I added the AVCaptureVideoDataOutput towards the end of the method:
- (void)viewDidLoad
{
    [super viewDidLoad];

    // Create the AVCaptureSession
    session = [[AVCaptureSession alloc] init];
    [self setSession:session];

    // Setup the preview view
    [[self previewView] setSession:session];

    // Check for device authorization
    [self checkDeviceAuthorizationStatus];

    // In general it is not safe to mutate an AVCaptureSession or any of its inputs, outputs, or connections from multiple threads at the same time.
    // Why not do all of this on the main queue?
    // -[AVCaptureSession startRunning] is a blocking call which can take a long time. We dispatch session setup to the sessionQueue so that the main queue isn't blocked (which keeps the UI responsive).
    dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
    [self setSessionQueue:sessionQueue];

    dispatch_async(sessionQueue, ^{
        [self setBackgroundRecordingID:UIBackgroundTaskInvalid];

        NSError *error = nil;
        AVCaptureDevice *videoDevice = [AVCamViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];
        AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        if (error)
        {
            NSLog(@"%@", error);
        }
        if ([session canAddInput:videoDeviceInput])
        {
            [session addInput:videoDeviceInput];
            [self setVideoDeviceInput:videoDeviceInput];
            dispatch_async(dispatch_get_main_queue(), ^{
                // Why are we dispatching this to the main queue?
                // Because AVCaptureVideoPreviewLayer is the backing layer for AVCamPreviewView and UIView can only be manipulated on the main thread.
                // Note: As an exception to the above rule, it is not necessary to serialize video orientation changes on the AVCaptureVideoPreviewLayer’s connection with other session manipulation.
                [[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] setVideoOrientation:(AVCaptureVideoOrientation)[self interfaceOrientation]];
            });
        }

        AVCaptureDevice *audioDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
        AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];
        if (error)
        {
            NSLog(@"%@", error);
        }
        if ([session canAddInput:audioDeviceInput])
        {
            [session addInput:audioDeviceInput];
        }

        AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
        if ([session canAddOutput:movieFileOutput])
        {
            [session addOutput:movieFileOutput];
            AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
            if ([connection isVideoStabilizationSupported])
                [connection setEnablesVideoStabilizationWhenAvailable:YES];
            [self setMovieFileOutput:movieFileOutput];
        }

        AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
        if ([session canAddOutput:stillImageOutput])
        {
            [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
            [session addOutput:stillImageOutput];
            [self setStillImageOutput:stillImageOutput];
        }

        AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
        [videoDataOutput setSampleBufferDelegate:self queue:sessionQueue];
        if ([session canAddOutput:videoDataOutput])
        {
            NSLog(@"Yes I can add it");
            [session addOutput:videoDataOutput];
        }
    });
}
- (void)viewWillAppear:(BOOL)animated
{
    dispatch_async([self sessionQueue], ^{
        [self addObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:SessionRunningAndDeviceAuthorizedContext];
        [self addObserver:self forKeyPath:@"stillImageOutput.capturingStillImage" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:CapturingStillImageContext];
        [self addObserver:self forKeyPath:@"movieFileOutput.recording" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:RecordingContext];
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];

        __weak AVCamViewController *weakSelf = self;
        [self setRuntimeErrorHandlingObserver:[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionRuntimeErrorNotification object:[self session] queue:nil usingBlock:^(NSNotification *note) {
            AVCamViewController *strongSelf = weakSelf;
            dispatch_async([strongSelf sessionQueue], ^{
                // Manually restarting the session since it must have been stopped due to an error.
                [[strongSelf session] startRunning];
                [[strongSelf recordButton] setTitle:NSLocalizedString(@"Record", @"Recording button record title") forState:UIControlStateNormal];
            });
        }]];

        [[self session] startRunning];
    });
}
Can someone please tell me why this happens, and suggest how to fix it?
Accepted Answer
I've done a lot of experimenting with this and I think I probably have the answer. I have similar but different code that's written from the ground up rather than being copied from Apple's samples (which are a bit old now).
I think it's the section...
AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:movieFileOutput])
{
    [session addOutput:movieFileOutput];
    AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    if ([connection isVideoStabilizationSupported])
        [connection setEnablesVideoStabilizationWhenAvailable:YES];
    [self setMovieFileOutput:movieFileOutput];
}
From my experiments, this is the thing that causes your problem. In my code, when this section is present, captureOutput:didOutputSampleBuffer:fromConnection is not called. I think the video system EITHER gives you a series of sample buffers OR records a compressed, optimised movie file to disk, not both. (At least on iOS.) I guess this makes sense/is not surprising, but I have not seen it documented anywhere!
Also, at one point, I seemed to be getting errors and/or the buffer callback not occurring when I had the microphone on. Again undocumented, these were error -11800 (unknown error). But I cannot always reproduce that.