
kinect as a webcam

那年仲夏 submitted on 2019-12-06 16:17:59
Question: I need to get my Kinect device onto the list of cameras in Skype. I tried this one, http://www.e2esoft.cn/kinect/, but it doesn't work. Maybe it's because I use OpenNI version 2.x. I installed it as a PrimeSense device, as I'm not allowed to use the Microsoft SDK. Maybe I should write my own driver, but I can't find any source that covers webcam driver writing. Answer 1: Here is another alternative that uses OpenNI drivers: http://www.softpedia.com/get/Internet/WebCam/Kinect-Virtual-Camera.shtml

What does Joint.Position refer to?

生来就可爱ヽ(ⅴ<●) submitted on 2019-12-06 16:16:38
I'm trying to detect certain positions or gestures of the user using the Kinect and the Kinect SDK 1.8. For this purpose I intend to use the information provided by the Joint.Position attribute. However, I can't make sense of the values I'm reading. What do they mean? Are they the distance between the joint and the sensor? What unit are they measured in? Joint.Position is a 3D vector with X, Y, and Z properties. Each of these properties represents the distance in meters (along the X, Y, and Z axes) between the joint and the origin of the coordinate system used. In Microsoft
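
Given that convention, with all three components in meters from the coordinate-system origin at the sensor, the straight-line distance of a joint from the sensor is just the length of the vector. A minimal sketch of that calculation, in Python for illustration (the SDK itself is used from C#):

```python
import math

def distance_from_sensor(x, y, z):
    """Euclidean distance (in meters) of a joint from the camera-space
    origin, given the X, Y, Z components of Joint.Position."""
    return math.sqrt(x * x + y * y + z * z)

# A joint 0.3 m to the side, 0.4 m above, and 1.2 m in front of the sensor:
print(round(distance_from_sensor(0.3, 0.4, 1.2), 3))  # 1.3
```

So the individual components are per-axis offsets, and the joint-to-sensor distance the questioner asks about is derived from all three together.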

Access Kinect RGB image data from ZigJS

◇◆丶佛笑我妖孽 submitted on 2019-12-06 14:41:18
Question: I have ZigJS running in the browser and everything is working well, but I want to record the Kinect webcam images in order to play them back as a recorded video. I've looked through the documentation at http://zigfu.com/apidoc/ but cannot find anything related to the RGB information. However, this SO answer leads me to believe it is possible: "We also support serialization of the depth and RGB image into canvas objects in the browser." Is it possible to capture the RGB image data from ZigJS

Saving raw depth-data

Deadly submitted on 2019-12-06 13:47:48
I am trying to save my Kinect's raw depth data, and I don't want to use Kinect Studio because I need the raw data for further calculations. I am using the Kinect v2 and the Kinect SDK. My problem is that I only get a low frame rate for the saved data, about 15-17 FPS. Here is my frame reader (in further steps I also want to save the color stream): frameReader = kinectSensor.OpenMultiSourceFrameReader(FrameSourceTypes.Depth); frameReader.MultiSourceFrameArrived += Reader_MultiSourceFrameArrived; Here is the event handler: void Reader_MultiSourceFrameArrived(object sender, MultiSourceFrameArrivedEventArgs e) { var reference
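
A common cause of a low save rate is doing the disk write inside the frame-arrived handler itself, which blocks the next frame. A hedged sketch of the usual fix, shown in Python rather than the questioner's C# (the function names here are illustrative, not SDK API): copy the frame bytes into a queue and let a background thread do the writing.

```python
import queue
import threading

# Decouple frame capture from disk I/O with a producer/consumer queue,
# so slow writes don't stall the capture callback.
frame_queue = queue.Queue(maxsize=60)

def writer(path):
    with open(path, "wb") as f:
        while True:
            data = frame_queue.get()
            if data is None:          # sentinel: stop the writer thread
                break
            f.write(data)

def on_frame_arrived(depth_bytes):
    # Copy the frame buffer and return immediately; never write here.
    try:
        frame_queue.put_nowait(bytes(depth_bytes))
    except queue.Full:
        pass  # drop a frame rather than block the capture side

t = threading.Thread(target=writer, args=("depth.raw",), daemon=True)
t.start()
for _ in range(3):                    # simulate three arriving frames
    on_frame_arrived(b"\x00\x01" * 4)
frame_queue.put(None)
t.join()
```

The same pattern in C# would use a `BlockingCollection` or channel with a dedicated writer task, keeping `Reader_MultiSourceFrameArrived` as short as possible.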

How to? Creating a webm video from kinect data for three.js example webgl_kinect

我是研究僧i submitted on 2019-12-06 10:28:08
Question: http://mrdoob.github.com/three.js/examples/webgl_kinect.html How does one create a Kinect webm movie to play in the above example? Specifically, how does one do it from a Kinect .oni file or a Kinect point cloud? Approach, language, and OS are not important. Thanks. Answer 1: Alongside kinect.webm you'll find kinect.nfo, where I wrote some notes with links on how it was recorded. Source: https://stackoverflow.com/questions/15457765/how-to-creating-a-webm-video-from-kinect-data-for-three-js-example-webgl-kinect

How to do Joint tracking in Kinect with a scaled Image

做~自己de王妃 submitted on 2019-12-06 09:56:59
Question: I am trying to do some joint tracking with the Kinect (just putting an ellipse inside my right hand). Everything works fine for a default 640x480 image; I based my approach on this Channel 9 video. My code, updated to use the new CoordinateMapper class, is here ... CoordinateMapper cm = new CoordinateMapper(this.KinectSensorManager.KinectSensor); ColorImagePoint handColorPoint = cm.MapSkeletonPointToColorPoint(atualSkeleton.Joints[JointType.HandRight].Position, ColorImageFormat.RgbResolution640x480Fps30);
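
MapSkeletonPointToColorPoint returns coordinates in the 640x480 color frame, so when the image on screen is rendered at a different size, the point has to be scaled by the ratio of the render size to the frame size before positioning the ellipse. A small sketch of that scaling, in Python for illustration (`scale_point` is a hypothetical helper, not an SDK call):

```python
def scale_point(color_x, color_y, render_w, render_h,
                frame_w=640, frame_h=480):
    """Map a point from 640x480 color-frame coordinates into the
    coordinates of an image rendered at render_w x render_h."""
    return (color_x * render_w / frame_w,
            color_y * render_h / frame_h)

# A hand mapped to (320, 240) in the 640x480 frame lands at the center
# of a 1280x960 render:
print(scale_point(320, 240, 1280, 960))  # (640.0, 480.0)
```

In WPF the same effect is often achieved by letting the layout system scale both the image and the overlay canvas together, so the 640x480 coordinates stay valid.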

Emgu CV and the official Microsoft Kinect SDK?

拟墨画扇 submitted on 2019-12-06 07:32:41
Question: Emgu CV currently allows the use of the Kinect with the OpenNI drivers. I've also seen that there is an mssdk-openni bridge application that lets Kinects running on the official Microsoft SDK emulate OpenNI-driven Kinects. Has anyone been successful in getting a Kinect running on the Microsoft SDK to work with Emgu CV, either with the mssdk-openni bridge or directly? Are there any tips on getting it running smoothly, or pitfalls to avoid? Answer 1: Yeah. I've simply installed the SDK and

Kinect Grip Gesture for Click

三世轮回 submitted on 2019-12-06 06:22:17
I'm using the Kinect v2.0. I need to perform a click using the grip gesture. Is there a way to handle the grip gesture in v2.0 like AddHandPointerGripHandler in v1.8? In the Microsoft Kinect SDK v2.0, the Body class includes two properties: Body.HandRightState and Body.HandLeftState. Both are instances of the HandState enumeration, which specifies whether the hand is: Closed (you can detect this to trigger the grip gesture); Lasso (the hand is closed in a fist, except for a finger pointing upward); NotTracked (the hand state is not tracked); Open (the hand is open); or Unknown. If you
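
Since v2.0 exposes only the raw HandState per frame and no grip event, a click has to be derived from state transitions yourself. A minimal sketch of one way to do it, in Python with integer stand-ins for the enumeration values (the detector class is hypothetical, not part of the SDK): fire once when the hand goes from Open to Closed, and let NotTracked/Unknown frames pass through without resetting, so brief tracking dropouts don't produce spurious clicks.

```python
# Integer stand-ins for the SDK's HandState enumeration members.
OPEN, CLOSED, LASSO, NOT_TRACKED, UNKNOWN = range(5)

class GripClickDetector:
    def __init__(self):
        self._was_open = False

    def update(self, hand_state):
        """Return True exactly once per Open -> Closed transition."""
        clicked = self._was_open and hand_state == CLOSED
        if hand_state == OPEN:
            self._was_open = True
        elif hand_state == CLOSED:
            self._was_open = False
        # NotTracked/Unknown/Lasso frames leave the latch unchanged.
        return clicked

detector = GripClickDetector()
states = [OPEN, OPEN, UNKNOWN, CLOSED, CLOSED, OPEN, CLOSED]
print([detector.update(s) for s in states])
# [False, False, False, True, False, False, True]
```

The same latch, fed from Body.HandRightState each frame in C#, gives one click per grip rather than one per frame the hand stays closed.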

body joints angle using kinect (checking time interval)

心不动则不痛 submitted on 2019-12-06 06:18:38
Question: As you can see in the image (link below), when the left hand is raised it shows an angle. What I want is this: a person should hold his arm for 5 seconds in the position shown in the image, and if the person changes the arm position within those 5 seconds (that is, if the angle goes below 70 or above 80), a message should be displayed telling them to put the arm back in the same position, and the timer restarts. http://postimage.org/image/hpfl41nzp/ MainWindow.xaml file: <Window x:Class="shoulder_joint.MainWindow" xmlns="http:/
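
The two pieces needed here are the joint angle itself and a timer that restarts whenever the angle leaves the 70-80 band. A sketch of both, in Python for illustration (the question's Kinect code is C#; `joint_angle` and `HoldTimer` are hypothetical helpers, not SDK members):

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by points a-b-c, each an
    (x, y, z) camera-space position."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(p * q for p, q in zip(v1, v2))
    n1 = math.sqrt(sum(p * p for p in v1))
    n2 = math.sqrt(sum(q * q for q in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

class HoldTimer:
    """Succeed once the angle stays inside [lo, hi] for `hold` seconds;
    reset (so a message can be shown) whenever it leaves the band."""
    def __init__(self, lo=70.0, hi=80.0, hold=5.0):
        self.lo, self.hi, self.hold = lo, hi, hold
        self._start = None

    def update(self, angle, now):
        if self.lo <= angle <= self.hi:
            if self._start is None:
                self._start = now
            return now - self._start >= self.hold
        self._start = None          # out of range: restart the timer
        return False

print(round(joint_angle((1, 0, 0), (0, 0, 0), (0, 1, 0)), 1))  # 90.0
```

Feeding `update` the current angle and timestamp every skeleton frame yields False (show the "put your arm back" message after a reset) until the pose has been held the full 5 seconds.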

generate a point cloud from a given depth image-matlab Computer Vision System Toolbox

限于喜欢 submitted on 2019-12-06 04:34:17
I am a beginner in MATLAB, and I have purchased the Computer Vision System Toolbox. I have been given 400 depth images (.PNG files), and I would like to create a point cloud for each image. I looked at the Computer Vision System Toolbox documentation, and there is an example of converting a depth image to a point cloud (http://uk.mathworks.com/help/vision/ref/depthtopointcloud.html): [xyzPoints,flippedDepthImage] = depthToPointCloud(depthImage,depthDevice) depthDevice = imaq.VideoDevice('kinect',2) But the thing I don't understand is that it requires a Kinect camera and connection. I am not
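
depthToPointCloud needs a live depthDevice, but the underlying back-projection is plain pinhole geometry, so saved PNGs can be converted offline without a Kinect attached, in MATLAB or anything else. A sketch of the math, in Python for illustration; the intrinsics fx, fy, cx, cy are assumed placeholder values, not calibrated ones, and depth is assumed to be in millimeters:

```python
def depth_to_points(depth, fx=367.0, fy=367.0, cx=256.0, cy=212.0):
    """Back-project a depth image (2D list, mm) into (x, y, z) points
    in meters using the pinhole camera model."""
    points = []
    for v, row in enumerate(depth):
        for u, d in enumerate(row):
            if d == 0:                # 0 conventionally means "no reading"
                continue
            z = d / 1000.0
            points.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
    return points

# A 1x1 "image" whose only pixel sits at the principal point:
print(depth_to_points([[1000]], cx=0.0, cy=0.0))  # [(0.0, 0.0, 1.0)]
```

The same per-pixel loop over the loaded PNG, written in MATLAB with the depth camera's actual intrinsics, reproduces the offline conversion the toolbox example does with a connected device.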