augmented-reality

How to track image anchors after initial detection in ARKit 1.5?

Submitted by 妖精的绣舞 on 2019-12-02 18:55:52
I'm trying ARKit 1.5 with image recognition and, as we can read in the code of the sample project from Apple: "Image anchors are not tracked after initial detection, so create an animation that limits the duration for which the plane visualization appears." An ARImageAnchor doesn't have a center: vector_float3 the way ARPlaneAnchor does, and I cannot find out how to track the detected image anchors. I would like to achieve something like in this video, that is, to have a fixed image, button, label, or whatever staying on top of the detected image, and I don't understand how I can achieve this. Here is

How to improve accuracy of accelerometer and compass sensors?

Submitted by 微笑、不失礼 on 2019-12-02 18:37:06
I am creating an augmented reality app that simply displays a text view when the phone is facing a point of interest (whose GPS position is stored on the phone). The text view is painted at the point of interest's location on the screen. It works OK; the problem is that the compass and accelerometer are very "variant", and the text view is constantly moving up and down, left and right, because of the inaccuracy of the sensors. Is there a way to solve this? Our problem is the same. I also had the same problem when I created a simple augmented reality project. The solution is to use exponential smoothing or moving
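The exponential smoothing the answer mentions is language-agnostic; here is a minimal low-pass filter sketch in Python, assuming sensor readings arrive one at a time as floats (the alpha value and the sample stream below are purely illustrative):

```python
def make_low_pass(alpha):
    """Return a filter that exponentially smooths a stream of readings.

    alpha in (0, 1]: smaller values smooth more aggressively but lag more
    behind real movement, so tune it against how "jumpy" the sensor is.
    """
    state = {"value": None}

    def smooth(reading):
        if state["value"] is None:
            state["value"] = reading  # seed the filter with the first sample
        else:
            state["value"] = alpha * reading + (1 - alpha) * state["value"]
        return state["value"]

    return smooth

# Feed noisy compass headings through the filter; the spike to 14.0
# is damped instead of yanking the text view across the screen.
smooth = make_low_pass(alpha=0.15)
for raw in [10.0, 10.2, 14.0, 10.1, 9.9]:
    print(round(smooth(raw), 3))
```

On a device you would call the filter from the sensor callback and position the text view from the smoothed value rather than the raw one.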

How to start writing an augmented reality application

Submitted by 99封情书 on 2019-12-02 18:23:22
I have been looking at creating an augmented reality application. Can anyone suggest a preferred technology platform to start writing an application of this kind? I would like this to be a desktop application and not a mobile application. Therefore I want to use a webcam with object recognition. Thanks! FLARToolKit is another good place to look. It's free and uses Flash + ActionScript 3. gotoandlearn.com has a couple of good video tutorials on how to use the library; I'd give you links to them but Stack Overflow says I'm not special enough for more than 1 URL. Your best bet is probably a cell

create opencv camera matrix for iPhone 5 solvepnp

Submitted by |▌冷眼眸甩不掉的悲伤 on 2019-12-02 17:46:11
I am developing an application for the iPhone using OpenCV. I have to use the method solvePnPRansac: http://opencv.willowgarage.com/documentation/cpp/camera_calibration_and_3d_reconstruction.html For this method I need to provide a camera matrix:

  | fx  0  cx |
  |  0 fy  cy |
  |  0  0   1 |

where cx and cy represent the center pixel positions of the image and fx and fy represent the focal lengths, but that is all the documentation says. I am unsure what to provide for these focal lengths. The iPhone 5 has a focal length of 4.1 mm, but I do not think that this value is usable as is. I checked another
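The standard conversion from a physical focal length to the pixel-unit fx/fy that OpenCV expects is fx = f_mm × image_width_px / sensor_width_mm (and likewise for fy). A minimal Python sketch of building the matrix — note that the iPhone 5 sensor dimensions (~4.54 mm × 3.42 mm) and the 3264×2448 still-image size used below are approximate assumptions, not values from the question:

```python
def camera_matrix(f_mm, sensor_w_mm, sensor_h_mm, img_w_px, img_h_px):
    """Build a 3x3 pinhole camera matrix from physical lens/sensor specs.

    fx, fy rescale the focal length from millimetres into pixel units;
    cx, cy assume the principal point sits at the image centre.
    """
    fx = f_mm * img_w_px / sensor_w_mm
    fy = f_mm * img_h_px / sensor_h_mm
    cx, cy = img_w_px / 2.0, img_h_px / 2.0
    return [[fx, 0.0, cx],
            [0.0, fy, cy],
            [0.0, 0.0, 1.0]]

# Approximate iPhone 5 figures: 4.1 mm lens, ~4.54x3.42 mm sensor, 8 MP stills.
K = camera_matrix(4.1, 4.54, 3.42, 3264, 2448)
print(K)
```

This only gives a starting estimate; a proper chessboard calibration (cv2.calibrateCamera) will produce more accurate values for solvePnPRansac.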

Fast, very lightweight algorithm for camera motion detection?

Submitted by 自闭症网瘾萝莉.ら on 2019-12-02 16:50:52
I'm working on an augmented reality app for iPhone that involves a very processor-intensive object recognition algorithm (pushing the CPU to 100%, it can get through maybe 5 frames per second), and in an effort both to save battery power and to make the whole thing less "jittery", I'm trying to come up with a way to only run that object recognizer when the user is actually moving the camera around. My first thought was to simply use the iPhone's accelerometer / gyroscope, but in testing I found that very often people would move the iPhone at a consistent enough attitude and velocity that there
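One lightweight alternative to sensor-based gating is simple frame differencing: downsample two consecutive grayscale frames and compare their mean absolute pixel difference against a threshold, running the expensive recognizer only when the scene has actually changed. A pure-Python sketch (the flattened-list frame format and the threshold value are illustrative assumptions):

```python
def mean_abs_diff(prev, curr):
    """Mean absolute per-pixel difference of two equal-length grayscale frames."""
    assert len(prev) == len(curr) and len(prev) > 0
    return sum(abs(a - b) for a, b in zip(prev, curr)) / len(prev)

def camera_is_moving(prev, curr, threshold=8.0):
    """Gate the expensive recognizer: True only when the scene changed enough."""
    return mean_abs_diff(prev, curr) > threshold

# Toy 8x8 frames, flattened to lists of grayscale intensities.
still = [100] * 64
moved = [120] * 64   # uniform brightness shift of 20 per pixel
print(camera_is_moving(still, still))  # unchanged scene
print(camera_is_moving(still, moved))  # large change
```

On a phone this comparison would run on heavily downsampled frames (e.g. 32×32), which costs almost nothing per frame compared with the recognizer itself; global brightness changes can trigger false positives, so the threshold needs tuning.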

React Native Augmented Reality (AR)

Submitted by 不想你离开。 on 2019-12-02 15:15:26
I'm investigating a couple of iOS/Android mobile apps for clients at the moment that involve augmented reality: the ability to preview what a piece of their furniture would look like in your home, and a digital dressing room to preview what clothes will look like on yourself. We primarily use React Native to build mobile apps (one of the projects would involve building the feature into an existing React Native app). Can anyone share their experience with React Native and AR? Share any links to frameworks/components that may help get started? Or simply lead some discussion around where to start? It

Face filter implementation like MSQRD/SnapChat [closed]

Submitted by ﹥>﹥吖頭↗ on 2019-12-02 13:52:40
I want to develop live face filters like the MSQRD/Snapchat live filters, but was not able to find out how I should proceed: should I use an augmented reality framework and detect the face, or use Core Image to detect the face and process accordingly? Please let me know if anyone has an idea of how to implement this. I would recommend going with Core Image and CIDetector. https://developer.apple.com/library/ios/documentation/GraphicsImaging/Conceptual/CoreImaging/ci_detect_faces/ci_detect_faces.html It has been available since iOS 5 and it has great documentation. Creating a face detector example:

Android: Unable to detect vertical plane

Submitted by 旧街凉风 on 2019-12-02 12:43:48
Question: I am trying to detect a vertical plane, like a wall, so that I can add an image view on it. But the vertical plane is not found. The default config for a session should find both horizontal and vertical planes, but I am unable to find a vertical plane. How can I find a vertical plane in an Android application? Please help me. Answer 1: Firstly, you need an appropriate vertical surface for tracking. A wall with a solid color (with no distinguishing features on it) is a very bad instance. The most

Android: Unable to detect vertical plane

Submitted by ε祈祈猫儿з on 2019-12-02 07:19:39
I am trying to detect a vertical plane, like a wall, so that I can add an image view on it. But the vertical plane is not found. The default config for a session should find both horizontal and vertical planes, but I am unable to find a vertical plane. How can I find a vertical plane in an Android application? Please help me. Firstly, you need an appropriate vertical surface for tracking. A wall with a solid color (with no distinguishing features on it) is a very bad instance. The most robust approach for tracking a vertical surface is a well-lit brick wall, or a wall with pictures on it,

Why does ARKit app stop working after a few days?

Submitted by 你说的曾经没有我的故事 on 2019-12-02 03:56:48
I developed a simple ARKit app in Unity for iOS. It works great, but there is a strange problem: after several days it stops working. So when I tap the app icon on the iPhone, it opens the app for a millisecond and instantly quits. If I reinstall the app, it works perfectly as before. Why is this happening? Is there any way to prevent it? I use a "Personal Team" in Xcode; can that be the reason? Thank you in advance! Unlike on Android, you can't install just any app on an iOS device. It has to be signed by Apple first. However, when you're developing an app, you probably want to test it before