CoreML memory leak in iOS 14.5

In my app, I use VNImageRequestHandler with a custom MLModel for object detection.

The app works fine on iOS versions before 14.5.

When iOS 14.5 came out, it broke everything.

  1. Whenever try handler.perform([visionRequest]) throws an error (Error Domain=com.apple.vis Code=11 "encountered unknown exception" UserInfo={NSLocalizedDescription=encountered unknown exception}), the pixelBuffer's memory is retained and never released, which fills up the AVCaptureOutput buffer pool so no new frames arrive (see the capture-delegate sketch after this list).
  2. I had to change the code as shown below. By copying the pixelBuffer into another variable I fixed the problem of new frames not arriving, but the memory leak still happens.
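
For context, the frames come from an AVCaptureVideoDataOutput delegate roughly like the sketch below (the class name ObjectDetector and the setup details are placeholders, not my real code). The capture output hands out sample buffers from a small fixed pool, so if one of them is retained and never released, the delegate eventually stops receiving new frames:

import AVFoundation

// Rough sketch of how frames reach predictWithPixelBuffer(sampleBuffer:).
// ObjectDetector is a placeholder for the class that owns that method.
extension ObjectDetector: AVCaptureVideoDataOutputSampleBufferDelegate {
  func captureOutput(_ output: AVCaptureOutput,
                     didOutput sampleBuffer: CMSampleBuffer,
                     from connection: AVCaptureConnection) {
    // The sample buffer (and its pixel buffer) belongs to a small fixed pool.
    // If it is retained and never released, the pool starves and this callback
    // stops firing, which is the "no new frames" symptom described above.
    predictWithPixelBuffer(sampleBuffer: sampleBuffer)
  }
}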

Because of the memory leak, the app crashes after a while.

Note that before iOS 14.5, detection worked perfectly and try handler.perform([visionRequest]) never threw any errors.

Here is my code:

private func predictWithPixelBuffer(sampleBuffer: CMSampleBuffer) {
  guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
    return
  }
  
  // Get additional info from the camera.
  var options: [VNImageOption : Any] = [:]
  if let cameraIntrinsicMatrix = CMGetAttachment(sampleBuffer, key: kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix, attachmentModeOut: nil) {
    options[.cameraIntrinsics] = cameraIntrinsicMatrix
  }
  
  autoreleasepool {
    // Because of iOS 14.5, there is a bug where, when the Vision request fails,
    // the pixel buffer's memory is leaked, so the AVCaptureOutput buffer pool
    // fills up and no new frames are delivered. As a temporary workaround we copy
    // the pixel buffer into a new buffer, but this also increases memory usage a
    // lot. Need to find a better way.
    var clonePixelBuffer: CVPixelBuffer? = pixelBuffer.copy()
    let handler = VNImageRequestHandler(cvPixelBuffer: clonePixelBuffer!, orientation: orientation, options: options)
    print("[DEBUG] detecting...")
    
    do {
      try handler.perform([visionRequest])
    } catch {
      delegate?.detector(didOutputBoundingBox: [])
      failedCount += 1
      print("[DEBUG] detect failed (failedCount)")
      print("Failed to perform Vision request: (error)")
    }
    clonePixelBuffer = nil
  }
}
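
Note that copy() is not a built-in CVPixelBuffer API; it is a custom extension in my project. A minimal deep-copy sketch of such a helper (the real implementation may differ, and buffer attributes are omitted here) looks roughly like this:

import CoreVideo
import Foundation

extension CVPixelBuffer {
  func copy() -> CVPixelBuffer {
    var cloneOut: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault,
                        CVPixelBufferGetWidth(self),
                        CVPixelBufferGetHeight(self),
                        CVPixelBufferGetPixelFormatType(self),
                        nil,                       // attributes omitted in this sketch
                        &cloneOut)
    guard let clone = cloneOut else {
      fatalError("CVPixelBufferCreate failed")
    }

    CVPixelBufferLockBaseAddress(self, .readOnly)
    CVPixelBufferLockBaseAddress(clone, [])
    defer {
      CVPixelBufferUnlockBaseAddress(clone, [])
      CVPixelBufferUnlockBaseAddress(self, .readOnly)
    }

    // Copy row by row because the clone's bytes-per-row padding can differ
    // from the source buffer's.
    func copyPlane(src: UnsafeRawPointer, srcStride: Int,
                   dst: UnsafeMutableRawPointer, dstStride: Int, rows: Int) {
      var src = src
      var dst = dst
      for _ in 0..<rows {
        memcpy(dst, src, min(srcStride, dstStride))
        src += srcStride
        dst += dstStride
      }
    }

    if CVPixelBufferIsPlanar(self) {
      for plane in 0..<CVPixelBufferGetPlaneCount(self) {
        copyPlane(src: CVPixelBufferGetBaseAddressOfPlane(self, plane)!,
                  srcStride: CVPixelBufferGetBytesPerRowOfPlane(self, plane),
                  dst: CVPixelBufferGetBaseAddressOfPlane(clone, plane)!,
                  dstStride: CVPixelBufferGetBytesPerRowOfPlane(clone, plane),
                  rows: CVPixelBufferGetHeightOfPlane(self, plane))
      }
    } else {
      copyPlane(src: CVPixelBufferGetBaseAddress(self)!,
                srcStride: CVPixelBufferGetBytesPerRow(self),
                dst: CVPixelBufferGetBaseAddress(clone)!,
                dstStride: CVPixelBufferGetBytesPerRow(clone),
                rows: CVPixelBufferGetHeight(self))
    }
    return clone
  }
}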

Has anyone run into the same problem? If so, how did you solve it?
