How can you track motion using the iPhone's camera?

Submitted by 落爺英雄遲暮 on 2019-12-17 17:28:58

Question


I saw that someone has made an app that tracks your feet using the camera, so that you can kick a virtual football on your iPhone screen.

How could you do something like this? Does anyone know of any code examples or other information about using the iPhone camera for detecting objects and tracking them?


Answer 1:


I just gave a talk at SecondConf where I demonstrated the use of the iPhone's camera to track a colored object using OpenGL ES 2.0 shaders. The post accompanying that talk, including my slides and sample code for all demos, can be found here.

The sample application I wrote, whose code can be downloaded from here, is based on an example produced by Apple for demonstrating Core Image at WWDC 2007. That example is described in Chapter 27 of the GPU Gems 3 book.

The basic idea is that you can use custom GLSL shaders to process images from the iPhone camera in real time, determining which pixels match a target color within a given threshold. Those pixels then have their normalized X, Y coordinates embedded in their red and green color components, while all other pixels are marked as black. The color of the whole frame is then averaged to obtain the centroid of the colored object, which you can track as it moves across the view of the camera.
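To make the averaging step concrete, here is a minimal CPU-side sketch of the same idea in Swift. It is not the GLSL shader from the linked sample, just a hypothetical illustration: pixels within a threshold of the target color contribute their normalized coordinates, and averaging those contributions yields the object's centroid.

```swift
import Foundation

// One RGB pixel, components in 0.0...1.0.
struct Pixel {
    var r, g, b: Double
}

// Hypothetical CPU analogue of the shader pass described above: find pixels
// close to `target`, treat their normalized x,y as values to average, and
// return the centroid of the matched region (or nil if nothing matched).
func trackColor(frame: [[Pixel]], target: Pixel, threshold: Double) -> (x: Double, y: Double)? {
    let height = frame.count
    guard height > 0, let width = frame.first?.count, width > 0 else { return nil }

    var sumX = 0.0, sumY = 0.0
    var matched = 0
    for (row, line) in frame.enumerated() {
        for (col, p) in line.enumerated() {
            // Simple Manhattan distance in RGB space as the color match test.
            let distance = abs(p.r - target.r) + abs(p.g - target.g) + abs(p.b - target.b)
            if distance < threshold {
                sumX += Double(col) / Double(max(width - 1, 1))
                sumY += Double(row) / Double(max(height - 1, 1))
                matched += 1
            }
        }
    }
    guard matched > 0 else { return nil }
    // The shader version averages every pixel (non-matches contribute black,
    // i.e. zero) and corrects for the matched fraction; summing only the
    // matches and dividing by their count is equivalent.
    return (sumX / Double(matched), sumY / Double(matched))
}
```

The GPU version gets this averaging almost for free by repeatedly downsampling the frame, which is why it can run per-frame on 2010-era hardware.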

While this doesn't address the case of tracking a more complex object like a foot, it should be possible to write similar shaders that pick out such a moving object.

As an update to the above, in the two years since I wrote this I've now developed an open source framework that encapsulates OpenGL ES 2.0 shader processing of images and video. One of the recent additions to that is a GPUImageMotionDetector class that processes a scene and detects any kind of motion within it. It will give you back the centroid and intensity of the overall motion it detects as part of a simple callback block. Using this framework to do this should be a lot easier than rolling your own solution.
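The motion-detection idea can be sketched on the CPU as simple frame differencing. This is not the GPUImageMotionDetector implementation itself (that runs as GPU shader passes and reports results through a callback block); it is a hypothetical, simplified Swift analogue showing what "centroid and intensity of the overall motion" means: grayscale frames are compared, and the pixels that changed are summarized.

```swift
import Foundation

// Simplified sketch of motion detection by frame differencing. `previous`
// and `current` are row-major grayscale frames with values in 0.0...1.0.
// Returns the normalized centroid of the changed pixels plus an intensity
// (the fraction of the frame that changed), or nil if nothing moved.
func detectMotion(previous: [[Double]], current: [[Double]],
                  threshold: Double = 0.1) -> (x: Double, y: Double, intensity: Double)? {
    let height = current.count
    guard height > 0, previous.count == height,
          let width = current.first?.count, width > 0 else { return nil }

    var sumX = 0.0, sumY = 0.0
    var moving = 0
    for row in 0..<height {
        for col in 0..<width {
            // A pixel counts as "moving" if its brightness changed enough.
            if abs(current[row][col] - previous[row][col]) > threshold {
                sumX += Double(col) / Double(max(width - 1, 1))
                sumY += Double(row) / Double(max(height - 1, 1))
                moving += 1
            }
        }
    }
    guard moving > 0 else { return nil }
    return (sumX / Double(moving), sumY / Double(moving),
            Double(moving) / Double(width * height))
}
```

In the real framework you would not write this loop yourself; you would attach the detector to a camera input and read the centroid and intensity out of its callback each frame.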




Answer 2:


I have had some success tracking faces and eyes using OpenCV on the iPhone. Here's a good place to start: http://niw.at/articles/2009/03/14/using-opencv-on-iphone/en

I guess the trick is finding a cascade (a description of what the camera should be looking for) that describes a foot; I'm not really sure whether one exists, though.



Source: https://stackoverflow.com/questions/3933716/how-can-you-track-motion-using-the-iphones-camera
