augmented-reality

Augmented Faces API – How are facial landmarks generated?

大憨熊 submitted on 2019-12-05 04:28:12
I'm an IT student and would like to know (understand) more about the Augmented Faces API in ARCore. I just saw the ARCore 1.7 release and the new Augmented Faces API. I get the enormous potential of this API, but I didn't see any questions or articles on the subject, so I'm questioning myself. Here are some assumptions/questions that come to mind about this release. Assumption: the ARCore team is using machine learning (like Instagram and Snapchat) to generate landmarks all over the face, probably HOG face detection. Questions: How does ARCore generate 468 points all over the …
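Google does not document the landmark model behind Augmented Faces, but a publicly available model with the same 468-point output is MediaPipe Face Mesh, which gives a feel for what such an API returns. The sketch below is only an illustration of that, not ARCore's internal pipeline; it assumes the mediapipe and opencv-python packages are installed and uses a hypothetical local file face.jpg.

# Sketch: producing a 468-point face mesh with MediaPipe Face Mesh (a public
# model that, like ARCore's Augmented Faces, outputs a dense 3D landmark mesh).
# Assumes `pip install mediapipe opencv-python` and a local file "face.jpg".
import cv2
import mediapipe as mp

image = cv2.imread("face.jpg")                      # BGR image from disk
rgb = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)        # MediaPipe expects RGB input

with mp.solutions.face_mesh.FaceMesh(static_image_mode=True,
                                     max_num_faces=1) as face_mesh:
    results = face_mesh.process(rgb)

if results.multi_face_landmarks:
    landmarks = results.multi_face_landmarks[0].landmark
    print(len(landmarks))                           # 468 normalized (x, y, z) points
    nose_tip = landmarks[1]                         # index 1 is commonly used as the nose tip
    print(nose_tip.x, nose_tip.y, nose_tip.z)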

Augmented Reality on Mobile phones

南笙酒味 submitted on 2019-12-05 04:17:53
I am interested in implementing an augmented reality application for mobile phones using the Adobe Flash Platform. Can you please let me know whether any sources are available for me to find out how to start? I'm not sure what software I need to use to implement AR, but as far as I understand I need: the ARToolKit Marker Generator to create the marker matching the 3D image, then FLARToolKit to analyze the image from the marker, and Papervision3D to create an object that shares the same space as the marker. Can you please let me know if what I wrote is correct and advise me how to start the implementation …
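The pipeline described (generate a marker, detect it in the camera image, then anchor a 3D object to it) is the same one every marker-based AR toolkit implements. Purely as a language-neutral illustration of the detection step, here is a hedged sketch using OpenCV's ArUco module in Python rather than FLARToolKit; the file name and dictionary choice are assumptions, and the ArUco API differs slightly between OpenCV versions.

import cv2

frame = cv2.imread("camera_frame.png")              # placeholder camera image
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
# On OpenCV >= 4.7 the call is cv2.aruco.ArucoDetector(dictionary).detectMarkers(gray)
corners, ids, rejected = cv2.aruco.detectMarkers(gray, dictionary)

if ids is not None:
    # The marker corners give the 2D anchor; a pose estimated from them is what
    # a 3D engine (Papervision3D in the Flash stack) would use to place the model.
    cv2.aruco.drawDetectedMarkers(frame, corners, ids)
    print("found markers:", ids.flatten().tolist())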

Setting ARKit Orientation via Swift

萝らか妹 submitted on 2019-12-05 02:54:18
Question: I am developing an ARKit app with OpenGL, so I am working directly with ARKit and not using SceneKit. By default, ARKit is set to landscape orientation, but I have been unable to track down any documentation or examples for rotating to portrait. The SceneKit example works in portrait, but the Metal example only works in landscape. Is it possible to change the ARKit tracking orientation? Answer 1: I was able to solve this in the application logic by multiplying the camera matrix by a quaternion that is rotated …
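The fix described in the answer amounts to rotating the camera transform by 90 degrees about the roll (z) axis before handing it to the renderer. The answer's actual Swift/quaternion code is cut off above, so what follows is only a numpy sketch of the math, expressed as a plain rotation matrix rather than a quaternion; the identity view_matrix is a stand-in for the matrix ARKit supplies.

# Sketch: compensate for portrait orientation by rotating the camera transform
# 90 degrees about the viewing (roll) axis before passing it to OpenGL.
import numpy as np

def roll_rotation(angle_rad):
    """4x4 rotation about the z axis (the camera's roll axis)."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return np.array([[c, -s, 0, 0],
                     [s,  c, 0, 0],
                     [0,  0, 1, 0],
                     [0,  0, 0, 1]], dtype=np.float32)

view_matrix = np.eye(4, dtype=np.float32)        # stand-in for ARKit's camera matrix
portrait_view = roll_rotation(-np.pi / 2) @ view_matrix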

Get Started with OpenCV image recognition

扶醉桌前 submitted on 2019-12-04 22:10:16
I am trying to make an app for image recognition with OpenCV. I want to implement something like this, but I don't know how I should do it. Can anyone give me any help on where I should begin? I have downloaded OpenCV for iOS from here. I have a hard copy of an image as an example, which I want to scan through the camera, and I have imported the images (markers) into the project. Now, when I scan the image through the camera, it should overlay the markers on the image, and when I tap/select a marker it should show the info for that marker. Here is my image: it's just an example I have taken (Square, Circle …
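One common way to recognize a known printed image in a camera frame with OpenCV is local-feature matching. The sketch below shows the idea in Python with ORB features and a brute-force matcher (the iOS framework exposes equivalent calls in C++/Objective-C); the file names and the match thresholds are made-up placeholders, not values from the question.

# Sketch: recognize a known marker image in a camera frame with ORB features.
import cv2

marker = cv2.imread("marker.png", cv2.IMREAD_GRAYSCALE)
frame = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(marker, None)
kp2, des2 = orb.detectAndCompute(frame, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

# Many low-distance matches suggest the marker is in view; the matched keypoint
# locations in `frame` tell you where to overlay the marker's information.
good = [m for m in matches if m.distance < 50]
print("marker detected:", len(good) > 20)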

Augmented Reality movement

偶尔善良 submitted on 2019-12-04 19:34:52
Hey guys, I have a problem related to augmented reality; please help. PROBLEM: I need to show a label on the camera overlay for each object, and the labels must move with the view. For example, I have four objects, say four houses in the four directions (south, north, east, west). When I look north, I should see only the single label for that house, and on that label there will be some information, say the house name. How do I do it so that when I turn toward the south, the north label starts to move aside and the south label starts to come onto the screen? With the solution I have done so far, when I look north I am still able to see all four …
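The usual approach is to compute the compass bearing from the user to each object and draw only the labels whose bearing falls inside the camera's horizontal field of view around the current device heading, sliding them horizontally in proportion to the angular difference. A minimal Python sketch of that visibility test follows; the coordinates, heading, and 60-degree field of view are made-up example values.

# Sketch: show a label only when the bearing to its object lies within the
# camera's horizontal field of view around the current compass heading.
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from point 1 to point 2, in degrees clockwise from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def visible(obj_bearing, device_heading, fov_deg=60):
    diff = (obj_bearing - device_heading + 180) % 360 - 180   # signed angle difference
    return abs(diff) <= fov_deg / 2

user = (28.6139, 77.2090)                                     # example positions
houses = {"north house": (28.6239, 77.2090), "south house": (28.6039, 77.2090)}
heading = 0.0                                                 # device currently facing north
for name, pos in houses.items():
    b = bearing_deg(*user, *pos)
    print(name, "on screen" if visible(b, heading) else "off screen")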

Visual Odometry (a.k.a. Egomotion Estimation) with OpenCV

為{幸葍}努か submitted on 2019-12-04 19:26:57
Question: I'm planning to implement an application with augmented reality features. For one of the features I need egomotion estimation. Only the camera is moving, in a space with fixed objects (nothing, or only small parts, will be moving, so they can be ignored). So I searched and read a lot and stumbled upon OpenCV. Wikipedia explicitly states that it could be used for egomotion, but I cannot find any documentation about it. Do I need to implement the egomotion algorithm myself with …
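OpenCV does ship the building blocks for monocular visual odometry even though there is no single "egomotion" function: track features between frames, estimate the essential matrix, and decompose it into a relative rotation and translation. A hedged two-frame sketch in Python follows; the frame file names and the intrinsic matrix K are placeholders for calibrated values, and monocular VO only recovers translation up to an unknown scale.

# Sketch: minimal two-frame egomotion estimate with OpenCV.
import cv2
import numpy as np

prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)
K = np.array([[700, 0, 320], [0, 700, 240], [0, 0, 1]], dtype=np.float64)  # placeholder intrinsics

# Track sparse corners from the previous frame into the current one.
p0 = cv2.goodFeaturesToTrack(prev, maxCorners=500, qualityLevel=0.01, minDistance=7)
p1, status, _ = cv2.calcOpticalFlowPyrLK(prev, curr, p0, None)
p0, p1 = p0[status.flatten() == 1], p1[status.flatten() == 1]

# Essential matrix with RANSAC, then decompose into relative rotation/translation.
E, inliers = cv2.findEssentialMat(p0, p1, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, p0, p1, K, mask=inliers)
print("rotation:\n", R, "\ntranslation direction:\n", t)   # direction only; scale is unknown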

How to capture live camera frames in RGB with DirectShow

可紊 submitted on 2019-12-04 14:51:36
Question: I'm implementing live video capture through DirectShow for live processing and display (an augmented reality app). I can access the pixels easily enough, but it seems I can't get the SampleGrabber to provide RGB data. The device (an iSight, running VC++ Express in VMware) only reports MEDIASUBTYPE_YUY2. After extensive Googling, I still can't figure out whether DirectShow is supposed to provide built-in color-space conversion for this sort of thing. Some sites report that there is no YUV<- …
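Whether or not the DirectShow graph inserts a color-space converter, one pragmatic fallback is to accept the YUY2 samples and convert them to RGB yourself on the CPU. The sketch below shows that conversion with OpenCV in Python purely to illustrate the pixel layout (packed 4:2:2, two bytes per pixel); in the actual app the raw buffer would come from the SampleGrabber callback, and the zero-filled buffer here is just a stand-in.

# Sketch: convert a raw YUY2 buffer to BGR with OpenCV.
import numpy as np
import cv2

width, height = 640, 480
raw = np.zeros(width * height * 2, dtype=np.uint8)       # placeholder YUY2 buffer (2 bytes/pixel)

yuy2 = raw.reshape(height, width, 2)                     # packed 4:2:2 as a 2-channel image
bgr = cv2.cvtColor(yuy2, cv2.COLOR_YUV2BGR_YUY2)         # 8-bit BGR, ready for processing/display
print(bgr.shape)                                         # (480, 640, 3)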

OpenGL + OpenCV Augmented Reality on Android

北城余情 submitted on 2019-12-04 14:35:46
Question: I am using the Vuforia SDK to do AR on Android. For a project I want to track a ball, but with Vuforia this is not possible, so I tried to use OpenCV color detection to track the ball. I adapted this solution for my project. For now, I can track and calculate the center point (in screen coordinates) of the ball with OpenCV. How can I use OpenCV screen coordinates to do OpenGL AR on Android? Where do I start? Answer 1: I combined OpenCV tracking (using the CamShift algorithm) with OpenGL here: http:/ …
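The CamShift approach the answer mentions boils down to: build a hue histogram of the ball once, back-project it into every new frame, and let CamShift follow the peak; the resulting screen-space centre is what you then map to normalized device coordinates for the OpenGL overlay. A hedged Python sketch of the tracking part follows; the camera index and the initial ball window are assumed values, not from the question.

# Sketch: CamShift colour tracking of a ball, yielding a screen-space centre.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                                 # any camera / video source
ok, frame = cap.read()
x, y, w, h = 300, 200, 50, 50                             # assumed initial ball window
track_window = (x, y, w, h)

# Hue histogram of the ball region, used as the tracking model.
hsv_roi = cv2.cvtColor(frame[y:y+h, x:x+w], cv2.COLOR_BGR2HSV)
roi_hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)
term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
    rot_rect, track_window = cv2.CamShift(back_proj, track_window, term_crit)
    cx, cy = rot_rect[0]                                  # pixel centre to feed the GL overlay
    print("ball centre:", cx, cy)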

Saving an Image to the Photos Folder in a HoloLens App

流过昼夜 submitted on 2019-12-04 12:01:43
I'm attempting to capture a photo inside my HoloLens app. It seems to be working, but it saves the image to an obscure place that I can't access or view. I want to save it to my Pictures library, described here I think. Or where should I save the image so I can see it in my Photos on the HoloLens? filePath = C:/Data/Users/JanikJoe/AppData/Local/Packages/HoloToolkit-Unity_pzq3xp76mxafg/LocalState\CapturedImage10.02831_n.jpg filePath2 = C:/Data/Users/DefaultAccount/AppData/Local/DevelopmentFiles/HoloToolkit-UnityVS.Debug_x86.janik/Data\CapturedImage10.02831_n.jpg using UnityEngine; using UnityEngine …

Google Cardboard - are there any iPhone / iOS starter projects for the Cardboard VR kit?

十年热恋 submitted on 2019-12-04 11:46:16
Question: I'm looking at the Google Cardboard kit, an inexpensive VR setup that uses Android devices to play 3D VR games. I see that they have an Android demo project, but is there any iOS or Objective-C port of the Cardboard project? If not, are there any other VR projects for iOS that can be modified to work with the Cardboard kit? Answer 1: June 2015 update: Google has added iOS support to their official Unity SDK. Even though the Unity plugin bundles a pre-compiled iOS Cardboard binary, there doesn't …