There seem to be many questions similar to the problem I am hitting:
Found AVCaptureMetadataOutput setMetadataObjectTypes unsupported types
There is also an Apple bug related to AVFoundation:
https://forums.developer.apple.com/thread/86810#259270
But none of these seem to answer my question.
My code runs fine in Swift 3, but only errors in Swift 4. Applying the solutions from the links above makes no difference at all.
Code:
import UIKit
import AVFoundation

class BarCodeScanViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {
    weak var delegate: FlowControllerDelegate?
    var captureSession: AVCaptureSession = AVCaptureSession()
    var previewLayer: AVCaptureVideoPreviewLayer = AVCaptureVideoPreviewLayer()

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = UIColor.black
        captureSession = AVCaptureSession()

        guard let videoCaptureDevice = AVCaptureDevice.default(for: .video) else { return }
        let videoInput: AVCaptureDeviceInput
        do {
            videoInput = try AVCaptureDeviceInput(device: videoCaptureDevice)
        } catch {
            return
        }

        if (captureSession.canAddInput(videoInput)) {
            captureSession.addInput(videoInput)
        } else {
            failed()
            return
        }

        // let captureMetadataOutput = AVCaptureMetadataOutput()
        let metadataOutput = AVCaptureMetadataOutput()
        if captureSession.canAddOutput(metadataOutput) {
            captureSession.addOutput(metadataOutput)
            // Check status of camera permissions
            metadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
            // metadataOutput.metadataObjectTypes = [AVMetadataObject.ObjectType.upce]
            metadataOutput.metadataObjectTypes = [.ean8, .ean13, .pdf417, .upce]
        } else {
            failed()
            return
        }

        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer.frame = view.layer.bounds
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)
        captureSession.startRunning()
    }

    func failed() {
        let ac = UIAlertController(title: "Scanning not supported", message: "Your device does not support scanning a code from an item. Please use a device with a camera.", preferredStyle: .alert)
        ac.addAction(UIAlertAction(title: "OK", style: .default))
        present(ac, animated: true)
        // captureSession = nil
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        if (captureSession.isRunning == false) {
            captureSession.startRunning()
        }
    }

    override func viewWillDisappear(_ animated: Bool) {
        if captureSession.isRunning == true {
            captureSession.stopRunning()
        }
        super.viewWillDisappear(animated)
    }

    func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [Any]!, from connection: AVCaptureConnection!) {
        captureSession.stopRunning()
        if let metadataObject = metadataObjects.first {
            guard let readableObject = metadataObject as? AVMetadataMachineReadableCodeObject else { return }
            guard let stringValue = readableObject.stringValue else { return }
            AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))
            found(code: stringValue)
        }
        dismiss(animated: true)
    }

    func found(code: String) {
        print(code)
    }

    override var prefersStatusBarHidden: Bool {
        return true
    }

    override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
        return .portrait
    }
}
When I build this code in Xcode 8 with Swift 3, it works fine. When I run it in Xcode 9 with Swift 4, it crashes on the line that adds the media types:
metadataOutput.metadataObjectTypes = [.ean8, .ean13, .pdf417, .upce]
In both cases I am building to an iOS 11 device that has never had a beta on it.
I tried using "__" to see if it was the Apple bug mentioned above. If I comment out that line, the code runs, but it does not capture anything.
Could Apple have introduced yet another bug? Is anyone else having the same problem?
Any help would be appreciated.
Thanks
More information for clarity:
Leevi Graham is correct: Apple changed the stack without properly documenting it, which made this look like a bug.
To clarify the comments that helped me:
The delegate callback has changed from:
func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [Any]!, from connection: AVCaptureConnection!)
to
func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection)
However, the real problem I was running into was that you no longer set a long list of types for metadataObjectTypes. You can now simply set all the available types:
metadataOutput.metadataObjectTypes = metadataOutput.availableMetadataObjectTypes
So...
this was actually an API issue. Several radar issues were filed about it, and Apple has since updated its AVFoundation documentation to address it.
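Putting the two changes together, here is a minimal Swift 4 sketch of the updated setup (the class name and the preview-layer wiring from the original question are omitted for brevity; only the parts affected by the API change are shown). Note that metadataObjectTypes is assigned only after the output has been added to the session, since availableMetadataObjectTypes is empty before that point.

```swift
import UIKit
import AVFoundation

class ScannerViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {
    let captureSession = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device),
              captureSession.canAddInput(input) else { return }
        captureSession.addInput(input)

        let output = AVCaptureMetadataOutput()
        guard captureSession.canAddOutput(output) else { return }
        // Add the output to the session BEFORE setting metadataObjectTypes;
        // availableMetadataObjectTypes is only populated once the output is attached.
        captureSession.addOutput(output)
        output.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
        output.metadataObjectTypes = output.availableMetadataObjectTypes

        captureSession.startRunning()
    }

    // Swift 4 delegate signature: non-optional, strongly typed parameters.
    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        guard let readable = metadataObjects.first as? AVMetadataMachineReadableCodeObject,
              let code = readable.stringValue else { return }
        print(code)
    }
}
```

Because the new delegate method has a different selector, the old captureOutput(_:didOutputMetadataObjects:from:) implementation compiles but is silently never called in Swift 4, which is why the scanner appears to run without capturing anything.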