coreml

Unable to load CoreML model using MLModel.compileModel

霸气de小男生 submitted on 2019-12-11 06:35:46
Question: The CoreML model can't be loaded. The first line succeeds, but the second line fails with the error "The file couldn't be saved." The model does exist and modelUrl is correct. The same issue is reported here: Unable to load CoreML model using MLModel.compileModel(at:). Does anyone know the cause?

var modelUrl = NSBundle.MainBundle.GetUrlForResource("SentimentPolarity", "mlmodel");
var compiledModelUrl = MLModel.CompileModel(modelUrl, out var error);

Answer 1: I found the answer just now
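The question's code is Xamarin C# and the answer is cut off above, so here is a minimal Swift sketch of the equivalent native compile-then-load flow. "SentimentPolarity" is taken from the question; the function name and error handling are illustrative:

import CoreML

// Sketch: compile a bundled .mlmodel at run time, then load the compiled model.
func loadCompiledModel() throws -> MLModel {
    guard let modelUrl = Bundle.main.url(forResource: "SentimentPolarity",
                                         withExtension: "mlmodel") else {
        throw NSError(domain: "Example", code: 1,
                      userInfo: [NSLocalizedDescriptionKey: "Model not found in bundle"])
    }
    // Produces a compiled .mlmodelc directory in a temporary location.
    let compiledUrl = try MLModel.compileModel(at: modelUrl)
    return try MLModel(contentsOf: compiledUrl)
}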

How to access elements inside MLMultiArray in CoreML

心已入冬 submitted on 2019-12-11 05:33:09
Question: I have initialized an MLMultiArray using initWithDataPointer, as shown in the code below:

float count = 512 * 384;
double *tempBuffer = malloc(count * sizeof(double));
NSError *error = NULL;
NSArray *shape = [NSArray arrayWithObjects:[NSNumber numberWithInt:1], [NSNumber numberWithInt:512], [NSNumber numberWithInt:384], nil];
NSArray *stride = [NSArray arrayWithObjects:[NSNumber numberWithInt:1], [NSNumber numberWithInt:1], [NSNumber numberWithInt:1], nil];
MLMultiArray *mlMultiArray = [[MLMultiArray
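The question is cut off, but since it asks how to read elements back out of an MLMultiArray, here is a hedged Swift sketch (the question itself is Objective-C) showing access by multidimensional subscript and, for bulk reads, via the data pointer and strides; the [1, 512, 384] shape is taken from the question:

import CoreML

do {
    // Sketch: create a [1, 512, 384] MLMultiArray and read elements back.
    let array = try MLMultiArray(shape: [1, 512, 384], dataType: .double)

    // Element access by multidimensional index (clear, but relatively slow).
    array[[0, 10, 20] as [NSNumber]] = 3.14
    let value = array[[0, 10, 20] as [NSNumber]].doubleValue

    // Faster bulk access through the underlying buffer, using the strides.
    let strides = array.strides.map { $0.intValue }
    let pointer = array.dataPointer.bindMemory(to: Double.self, capacity: array.count)
    let sameValue = pointer[0 * strides[0] + 10 * strides[1] + 20 * strides[2]]
    print(value, sameValue)
} catch {
    print("Failed to create MLMultiArray: \(error)")
}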

Is there a way to make CoreML model available for iOS11+ at source level

核能气质少年 submitted on 2019-12-06 03:12:14
I have a CoreML model in my application. At run time, the prediction feature should be disabled on iOS 8-10 and active on iOS 11. To be able to compile, I have added the following to every class that uses CoreML:

@available(iOS 11.0, *)

But the .mlmodel regenerates its Swift code at every rebuild, discarding all annotations, which creates compile errors such as: 'MLModel' is only available on iOS 11.0 or newer. Is there a way in Xcode 9 to make the mlmodel iOS 11 only?

EDIT: This bug was fixed in Xcode 9 beta 4. The workaround is no longer needed.

Update 07/25/17: Apple have just introduced a new API for compiling
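One way to sidestep the generated code's availability problem is to bypass the generated class entirely and load the compiled model through MLModel in a wrapper you own. This is a hedged sketch, not necessarily the thread's accepted workaround, and it assumes the .mlmodel's automatic code generation is avoided (for example by bundling a pre-compiled .mlmodelc instead); "SentimentPolarity" is a placeholder model name:

import CoreML

// Hand-written wrapper: the @available annotation lives in a file Xcode never regenerates.
@available(iOS 11.0, *)
final class SentimentPredictor {
    private let model: MLModel

    init?() {
        // Load the compiled .mlmodelc bundled with the app.
        guard let url = Bundle.main.url(forResource: "SentimentPolarity",
                                        withExtension: "mlmodelc"),
              let model = try? MLModel(contentsOf: url) else { return nil }
        self.model = model
    }
}

// Call sites stay compilable on older deployment targets via an availability check.
func makePredictorIfAvailable() -> AnyObject? {
    if #available(iOS 11.0, *) {
        return SentimentPredictor()
    }
    return nil   // iOS 8-10: prediction feature stays disabled
}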

Text detection in images

风格不统一 submitted on 2019-12-03 20:35:18
I am using the sample code below for text detection in images (not handwritten) using CoreML and Vision: https://github.com/DrNeuroSurg/OCRwithVisionAndCoreML-Part2 It uses a machine learning model that supports only uppercase letters and numbers, whereas in my project I need uppercase, lowercase, numbers and a few special characters (such as : , -). I have no Python experience to make the required changes and generate the needed .mlmodel file from training data (which I also don't have for my requirement). Below is the link describing how to create a .mlmodel: http://www
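For context on the pipeline such samples use, here is a hedged Swift sketch of the Vision side only: detecting per-character boxes so that each crop can then be classified by whatever character model you end up training. The request and observation types are standard Vision API; the surrounding function is illustrative:

import Vision

// Sketch: find character-level boxes with Vision; classifying each crop with a
// CoreML character model is a separate step and is not shown here.
func detectCharacterBoxes(in image: CGImage,
                          completion: @escaping ([CGRect]) -> Void) {
    let request = VNDetectTextRectanglesRequest { request, _ in
        guard let observations = request.results as? [VNTextObservation] else {
            completion([])
            return
        }
        // characterBoxes are normalized rects (0...1) within the image.
        let boxes = observations.flatMap { $0.characterBoxes ?? [] }
                                .map { $0.boundingBox }
        completion(boxes)
    }
    request.reportCharacterBoxes = true   // required to get per-character boxes

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}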

Error installing coremltools

情到浓时终转凉″ submitted on 2019-12-03 20:18:21
Question: I am looking at Apple's Core ML iOS framework. I have read that coremltools must be installed to create your own models. I installed pip with: sudo python /Users/administrator/Downloads/get-pip.py Following the coremltools installation documentation, I downloaded the coremltools package and then tried to install it from https://pypi.python.org/pypi/coremltools When I installed coremltools on my Mac, I got the following error. Please suggest how to solve it so that I can work with coremltools. MyMacbook:~

How to convert a UIImage to a CVPixelBuffer [duplicate]

帅比萌擦擦* submitted on 2019-12-03 05:51:00
Question: This question already has answers here: Convert Image to CVPixelBuffer for Machine Learning Swift (3 answers). Closed 2 years ago. Apple's new CoreML framework has a prediction function that takes a CVPixelBuffer. In order to classify a UIImage, a conversion must be made between the two. Conversion code I got from an Apple engineer:

// image has been defined earlier
var pixelbuffer: CVPixelBuffer? = nil
CVPixelBufferCreate(kCFAllocatorDefault, Int(image.size.width), Int(image.size
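The quoted snippet is cut off above; a complete version of this commonly used conversion looks roughly like the sketch below (a 32BGRA buffer filled by drawing the image through Core Graphics; the pixel format and size handling may need to match what your model expects):

import UIKit
import CoreVideo

// Sketch: render a UIImage into a newly created CVPixelBuffer.
func pixelBuffer(from image: UIImage) -> CVPixelBuffer? {
    let width = Int(image.size.width)
    let height = Int(image.size.height)
    let attrs = [kCVPixelBufferCGImageCompatibilityKey: kCFBooleanTrue,
                 kCVPixelBufferCGBitmapContextCompatibilityKey: kCFBooleanTrue] as CFDictionary
    var buffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                     kCVPixelFormatType_32BGRA, attrs, &buffer)
    guard status == kCVReturnSuccess, let pixelBuffer = buffer else { return nil }

    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

    guard let cgImage = image.cgImage,
          let context = CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer),
                                  width: width, height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue |
                                              CGBitmapInfo.byteOrder32Little.rawValue)
    else { return nil }

    // Draw the image into the buffer's memory.
    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
    return pixelBuffer
}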

How to transform vision framework coordinate system into ARKit?

耗尽温柔 submitted on 2019-12-02 19:29:47
I am using ARKit (with SceneKit) to add a virtual object (e.g. a ball). I am tracking a real-world object (e.g. a foot) with the Vision framework and receiving its updated position in the vision request completion handler:

let request = VNTrackObjectRequest(detectedObjectObservation: lastObservation, completionHandler: self.handleVisionRequestUpdate)

I want to replace the tracked real-world object with a virtual one (for example, replace the foot with a cube), but I am not sure how to convert the boundingBox rect (which we receive in the vision request completion) into a SceneKit node, as the coordinate systems are
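The question is cut off, but the usual conversion path is: map the normalized Vision bounding box into view coordinates (flipping the y axis), then hit-test through the ARSCNView to get a world position for the node. A hedged Swift sketch; sceneView and boxNode are assumed to already exist in your scene:

import ARKit
import Vision

// Sketch: move a SceneKit node to the world position under the tracked object.
func updateNode(for observation: VNDetectedObjectObservation,
                in sceneView: ARSCNView, node boxNode: SCNNode) {
    // Vision bounding boxes are normalized (0...1) with the origin at the
    // bottom-left; flip y and scale to the view's coordinate space.
    let box = observation.boundingBox
    let viewRect = CGRect(x: box.minX * sceneView.bounds.width,
                          y: (1 - box.maxY) * sceneView.bounds.height,
                          width: box.width * sceneView.bounds.width,
                          height: box.height * sceneView.bounds.height)
    let center = CGPoint(x: viewRect.midX, y: viewRect.midY)

    // Hit-test into the AR scene to find a 3D point behind that screen point.
    guard let result = sceneView.hitTest(center, types: [.existingPlaneUsingExtent,
                                                         .featurePoint]).first else { return }
    let t = result.worldTransform.columns.3
    boxNode.position = SCNVector3(x: t.x, y: t.y, z: t.z)
}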

error: Cannot subscript a value of type '[String : Any]' with an index of type 'UIImagePickerController.InfoKey' [duplicate]

喜夏-厌秋 submitted on 2019-12-01 21:29:40
This question already has an answer here: Cannot subscript a value of type '[String : Any]' with an index of type 'UIImagePickerController.InfoKey' (8 answers). I am trying to rebuild Apple's sample app for image detection via CoreML, but I get the error: Cannot subscript a value of type '[String : Any]' with an index of type 'UIImagePickerController.InfoKey'

extension ImageClassificationViewController: UIImagePickerControllerDelegate, UINavigationControllerDelegate {
    func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String: Any]) {
        picker.dismiss
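This error typically comes from the Swift 4.2 / iOS 12 SDK change in which the image picker's info dictionary became keyed by UIImagePickerController.InfoKey instead of String. A sketch of the updated delegate method (the view controller class comes from the question's project; the body past the dismiss call is illustrative):

import UIKit

extension ImageClassificationViewController: UIImagePickerControllerDelegate,
                                             UINavigationControllerDelegate {
    // Swift 4.2+ signature: the info dictionary is keyed by InfoKey, not String.
    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        picker.dismiss(animated: true)
        // Subscript with the typed key instead of a raw string key.
        guard let image = info[.originalImage] as? UIImage else { return }
        // ... pass `image` to the CoreML classifier here ...
        _ = image
    }
}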