What goes wrong when making a UIImage from a CMSampleBuffer?

Problem description:

This is a fairly popular question, but I still haven't found a solution to my problem. I am capturing frames from the iPhone's front camera and making a UIImage from each CMSampleBuffer, like this:

func captureOutput(captureOutput: AVCaptureOutput, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef, fromConnection connection: AVCaptureConnection) { 
    let uiImage = imageFromSampleBufferDep(sampleBuffer) 
    ... 
    UIImageWriteToSavedPhotosAlbum(uiImage!, self, "image:didFinishSavingWithError:contextInfo:", nil) 
    ... 
} 

func imageFromSampleBufferDep(sampleBuffer: CMSampleBuffer) -> UIImage? { 
    let imageBuffer: CVImageBufferRef = CMSampleBufferGetImageBuffer(sampleBuffer)! 
    CVPixelBufferLockBaseAddress(imageBuffer, 0) 
    let address = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0) 
    let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer) //1924 
    let width = CVPixelBufferGetWidth(imageBuffer) //1280 
    let height = CVPixelBufferGetHeight(imageBuffer) //720 

    let colorSpace = CGColorSpaceCreateDeviceRGB() 

    let context = CGBitmapContextCreate(address, width, height, 8, bytesPerRow, colorSpace, CGImageAlphaInfo.NoneSkipFirst.rawValue) 
    let imageRef = CGBitmapContextCreateImage(context) 

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0) 
    var resultImage : UIImage? 
    if context != nil { 
        resultImage = UIImage(CGImage: imageRef!) 
    } 

    return resultImage 
} 

When capturing frames I get this error:

<Error>: CGBitmapContextCreate: invalid data bytes/row: should be at least 5120 for 8 integer bits/component, 3 components, kCGImageAlphaNoneSkipFirst. 

I tried assigning 5120 to bytesPerRow directly, but in that case I got gray, inverted pictures (attached).

How can I fix this?


You are assuming that the pixel buffer you get is single-plane and in an RGB color space. The camera content will not be in that format. First you need to get the pixel format type using CVPixelBufferGetPixelFormatType. You can also look at using CVPixelBufferGetPlaneCount to find out the number of planes. To get better color matching you should also consider using 'CGColorSpaceRef colorSpace; colorSpace = (CGColorSpaceRef)CVBufferGetAttachment(pixBuffer, kCVImageBufferCGColorSpaceKey, NULL);' – SheffieldKevin
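For reference, a minimal sketch of the inspection suggested above, using the same Swift 2-era APIs as the question code (the format constants named in the comments are only typical values, not guaranteed for every device):

import AVFoundation

func inspectSampleBuffer(sampleBuffer: CMSampleBuffer) {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

    // Four-character pixel format code, e.g. kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
    // (a common camera default) or kCVPixelFormatType_32BGRA.
    let formatType = CVPixelBufferGetPixelFormatType(imageBuffer)

    // Plane count: 2 for the bi-planar YCbCr formats, 0 for packed formats like 32BGRA.
    let planeCount = CVPixelBufferGetPlaneCount(imageBuffer)

    print("pixel format: \(formatType), planes: \(planeCount)")
}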


Could you give me another hint? I did those things: 'var pixelBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!; var planesNum = CVPixelBufferGetPlaneCount(pixelBuffer); let colorSpace = CVBufferGetAttachment(pixelBuffer, kCVImageBufferCGColorSpaceKey, nil) as! CGColorSpaceRef' In this case colorSpace returns nil. Am I right that I then have to swap imageBuffer for pixelBuffer? – albertpod


What is the purpose of the plane count? Where does it have to be used? – albertpod

It turns out I was doing a silly thing that caused the error above.

Following the convert CMSampleBufferRef to UIImage question, I did this:

var videoCaptureOutput = AVCaptureVideoDataOutput() 

videoCaptureOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey:Int(kCVPixelFormatType_32BGRA)]
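With the output forced to 32BGRA, the pixel buffer is packed (no planes) and bytesPerRow becomes roughly 4 * width, so a bitmap context can be created from the base address directly. A rough sketch of the conversion under that assumption, again with Swift 2-era APIs (the BGRA byte-order/alpha flags shown are what usually matches; adjust them if the colors still look swapped):

import UIKit
import AVFoundation

func imageFromSampleBuffer(sampleBuffer: CMSampleBuffer) -> UIImage? {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

    CVPixelBufferLockBaseAddress(imageBuffer, 0)
    defer { CVPixelBufferUnlockBaseAddress(imageBuffer, 0) }

    let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer) // packed BGRA, no planes
    let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer) // now about 4 * width
    let width = CVPixelBufferGetWidth(imageBuffer)
    let height = CVPixelBufferGetHeight(imageBuffer)

    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let bitmapInfo = CGBitmapInfo.ByteOrder32Little.rawValue |
        CGImageAlphaInfo.PremultipliedFirst.rawValue

    guard let context = CGBitmapContextCreate(baseAddress, width, height, 8,
        bytesPerRow, colorSpace, bitmapInfo) else { return nil }
    guard let imageRef = CGBitmapContextCreateImage(context) else { return nil }

    return UIImage(CGImage: imageRef)
}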