Question
I want to capture an image in ARKit and send it as a byte array to a TCP server.
This is my code:
@IBOutlet weak var sceneView: ARSCNView!
@IBAction func sendButtonAction(_ sender: Any) {
let captureImage:UIImage = self.sceneView.snapshot()
}
I can get an image with snapshot(), but I don't know how to convert it to a byte array (containing the pixel R, G, B data).
I tried converting the UIImage to binary data like this:
let imageData: NSData = UIImagePNGRepresentation(captureImage)! as NSData
but this is not what I need, because imageData's size is different every time I snapshot (PNG is a compressed format, so its length varies with the image content) :(
My goal is to get a byte (UInt8) array from the captured image whose size is exactly width * height * 3 (R, G, B) bytes.
If you have any ideas on how to solve this, please help me.
Answer 1:
I'm a little confused about what exactly isn't working for you with UIImagePNGRepresentation; sorry if I'm missing something, or maybe you could clarify.
For one thing, PNG encodes RGBA, so you might want to try UIImageJPEGRepresentation instead, since JPEG doesn't support an alpha channel.
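As a minimal sketch of that suggestion (assuming a UIImage named captureImage, as in the question): JPEG encoding drops the alpha channel, though note the resulting Data is still compressed, so its length will still vary from frame to frame.

```swift
// Encode as JPEG (no alpha channel) at 80% quality; the choice of
// 0.8 here is an arbitrary example value, not from the question.
if let jpegData = UIImageJPEGRepresentation(captureImage, 0.8) {
    let bytes = [UInt8](jpegData) // byte array suitable for a TCP payload
    // send `bytes` to the server...
}
```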
And if you're trying to get an actual NSMutableArray of bytes, see whether this old answer helps: https://stackoverflow.com/a/29734175/8895191
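Since the question asks for exactly width * height * 3 bytes of raw R, G, B data, here is a minimal sketch of one way to get that: render the UIImage into an RGBA bitmap of known layout via CGContext, then strip the alpha byte. The function name rgbBytes(from:) is my own, not an API.

```swift
import UIKit

// Render the image into a known RGBA8888 bitmap, then drop the alpha
// channel so the result is exactly width * height * 3 bytes (R, G, B).
func rgbBytes(from image: UIImage) -> [UInt8]? {
    guard let cgImage = image.cgImage else { return nil }
    let width = cgImage.width
    let height = cgImage.height
    let bytesPerRow = width * 4
    var rgba = [UInt8](repeating: 0, count: height * bytesPerRow)
    guard let context = CGContext(data: &rgba,
                                  width: width,
                                  height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: bytesPerRow,
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
    else { return nil }
    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))

    // Copy R, G, B and skip every fourth (alpha) byte.
    var rgb = [UInt8]()
    rgb.reserveCapacity(width * height * 3)
    for pixel in stride(from: 0, to: rgba.count, by: 4) {
        rgb.append(rgba[pixel])     // R
        rgb.append(rgba[pixel + 1]) // G
        rgb.append(rgba[pixel + 2]) // B
    }
    return rgb
}
```

Unlike the PNG/JPEG representations, this array's size is fixed for a given snapshot resolution, which matches the requirement in the question. One caveat: the bitmap uses premultiplied alpha, so fully transparent pixels (unlikely in an ARSCNView snapshot) would come out black.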
Source: https://stackoverflow.com/questions/48364722/how-to-capture-image-in-arkit-and-send-binary-data