google-glass

Detecting whether Glassware was launched via voice command or the touch menu

妖精的绣舞 submitted on 2019-12-01 09:20:00
Question: Is it possible to have different behavior when Glassware is launched via the "OK Glass" voice command versus the touch menu? Specifically, we are trying to prompt voice recognition when the Glassware is launched with the "OK Glass" voice command, and otherwise go directly into the Glassware when it is launched from the touch menu. Or, is there a way for an app to know which way it was launched? We are trying to emulate what the Google Play Music Glassware does. Answer 1: The GDK does not yet provide a way to do…

How to install Speech to text in Google Glass?

我只是一个虾纸丫 submitted on 2019-12-01 07:28:29
Question: I developed an app using Android 4.1.2 and speech to text works well, but when I tried it on Google Glass it does not work (ActivityNotFoundException). This is my speech-to-text code: Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH); intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL, "en-US"); try { startActivityForResult(intent, requestCode); } catch (ActivityNotFoundException a) { Toast t = Toast.makeText(getApplicationContext(), "Opps! Your device…
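One likely bug in the code above, independent of the Glass question: RecognizerIntent.EXTRA_LANGUAGE_MODEL expects one of the LANGUAGE_MODEL_* constants, not a locale string like "en-US" (the locale belongs in EXTRA_LANGUAGE). A hedged sketch of a corrected activity, assuming names like SpeechActivity and SPEECH_REQUEST are illustrative placeholders:

```java
import android.app.Activity;
import android.content.ActivityNotFoundException;
import android.content.Intent;
import android.os.Bundle;
import android.speech.RecognizerIntent;
import android.widget.Toast;
import java.util.List;

public class SpeechActivity extends Activity {
    private static final int SPEECH_REQUEST = 0; // arbitrary request code

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        // Use a language-model constant here, not a locale string.
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, "en-US");
        try {
            startActivityForResult(intent, SPEECH_REQUEST);
        } catch (ActivityNotFoundException e) {
            // Thrown when no activity on the device handles the intent.
            Toast.makeText(this, "Speech recognition is not available",
                    Toast.LENGTH_SHORT).show();
        }
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        if (requestCode == SPEECH_REQUEST && resultCode == RESULT_OK) {
            List<String> results =
                    data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
            String spokenText = results.get(0); // best hypothesis is first
            // ...use spokenText...
        }
        super.onActivityResult(requestCode, resultCode, data);
    }
}
```

This block depends on the Android framework, so it runs on a device or emulator rather than a plain JVM; whether ACTION_RECOGNIZE_SPEECH resolves at all still depends on the Glass software version in use.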

How to run a Google Glass GDK sample on the device?

左心房为你撑大大i submitted on 2019-12-01 06:45:37
Question: I am starting to develop a Google Glass app with the GDK. I used Eclipse with the Android SDK Manager to install the GDK. After that, I imported the Google Glass sample project following the steps mentioned here: Click File > New Project > Android Sample Project. Select Glass Development Kit as the build target and click Next. Select the Timer sample and click Finish. In the Eclipse Package Explorer, right-click the sample and select Run as > Android Application with Glass connected to your development system. The Google Glass (version XE12) is connected, but nothing shows up. Then I followed another tutorial online…
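When the IDE's deploy step silently does nothing, the first thing to rule out is whether adb can see the device at all (on Glass, debugging must be enabled under Settings > Device info > Turn on debug). A hedged sketch of the command-line checks; the APK path and package/activity names are illustrative placeholders, not the sample's real names:

```shell
# List attached devices; Glass should appear here once debug is enabled.
adb devices

# Install the built APK manually if Eclipse's deploy step silently fails.
adb install -r bin/TimerSample.apk

# Launch the main activity directly (names are placeholders).
adb shell am start -n com.example.timer/.TimerActivity

# Watch the log for install or launch errors.
adb logcat
```

If `adb devices` shows nothing, the problem is the USB driver or the debug setting, not the project itself.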

Built-in ScrollView that scrolls with head motion

。_饼干妹妹 submitted on 2019-12-01 04:32:42
Speaking "ok glass" brings up a command list that automatically scrolls based on the user's head motion. Is there a built-in UI element in the GDK that implements this? Or will I have to write my own code that uses sensors? I tried reimplementing parts of this. It's not as shiny as the google one, but those could serve as a starting point: https://github.com/pscholl/glass_snippets/blob/master/lib/src/main/java/de/tud/ess/HeadListView.java https://github.com/pscholl/glass_snippets/blob/master/lib/src/main/java/de/tud/ess/HeadScrollView.java I went through the GDK's Developer Guides at https:/

Accessing rear facing camera on Glass

六眼飞鱼酱① submitted on 2019-12-01 00:28:16
Question: I looked through the API, Stack Overflow, and Google, and didn't find anything. Is there a way in the API to access the camera that faces the eye? I would like to be able to tell if the user's eye is open or closed. Is this possible with this version of the GDK? Is that what the built-in wink-to-take-a-picture app is doing? Answer: AFAIK, you cannot currently determine open/closed; however, there are some eye gestures that are supported, assuming you have the latest version of the Glass hardware. From here, you can see the supported gestures: BLINK("BLINK", 3), DOFF("DOFF", 6), DON("DON", 5), DOUBLE…

Menu Item for “OPEN_URI” not present in menuItems return

狂风中的少年 submitted on 2019-12-01 00:24:12
Question: I have a card that is inserted into my timeline by the Mirror API. The card has 3 options: SCAN, REPLY, DELETE. Expected -> Barcode Test [SCAN, REPLY, DELETE]. Received -> Barcode Test [REPLY, DELETE]. Only the REPLY and DELETE menu items are returned. If I change 'OPEN_URI' to 'CUSTOM', the item is returned, but it does not do what I want, which is to open my android.scan (this is present on my device). I followed steps similar to those here and in the Mirror API docs about creating the menuItems: https://developers.google.com/glass/v1/reference/timeline#menuItems Opening GDK Glassware through Mirror API
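One common cause of an OPEN_URI menu item being dropped from the returned menuItems is a missing payload: in the timeline reference, OPEN_URI opens the URI supplied in the item's payload field, so an item without one has nothing to open. A hedged sketch of the timeline item JSON, where the scanner URI is an illustrative placeholder:

```json
{
  "text": "Barcode Test",
  "menuItems": [
    { "action": "OPEN_URI", "payload": "zxing://scan/" },
    { "action": "REPLY" },
    { "action": "DELETE" }
  ]
}
```

REPLY and DELETE are built-in actions that need no payload, which would explain why those two survive while the OPEN_URI item disappears.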

Simulate Touch Controls Through Code

こ雲淡風輕ζ submitted on 2019-11-30 23:47:19
Question: I'm trying to make it possible to navigate through my Google Glass application using head gestures. I'm able to recognize head gestures like looking to the right, left, and up, and each has its own method that runs when the gesture is recognized. Now I need to simulate the corresponding touch gestures inside each method, so the system will think I'm swiping left or right, which will let me navigate through the cards with head gestures. Does anyone have any idea how to actually achieve this? Edit: I created a quick hello-world application to play with. I added my head-gesture…
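Rather than injecting synthetic touch events, it is often simpler to drive the UI directly from the gesture callbacks: the GDK's CardScrollView can be animated to a neighboring position, which looks the same to the user as a swipe. A hedged sketch under that assumption; the HeadNavigator class and the onLookLeft/onLookRight callback names mirror the question's setup and are illustrative:

```java
import com.google.android.glass.widget.CardScrollView;

/** Translates recognized head gestures into card-scroller navigation. */
public class HeadNavigator {
    private final CardScrollView cardScroller;

    public HeadNavigator(CardScrollView cardScroller) {
        this.cardScroller = cardScroller;
    }

    /** Called when a look-right gesture is recognized: acts like a forward swipe. */
    public void onLookRight() {
        int next = cardScroller.getSelectedItemPosition() + 1;
        if (next < cardScroller.getAdapter().getCount()) {
            cardScroller.animate(next, CardScrollView.Animation.NAVIGATION);
        }
    }

    /** Called when a look-left gesture is recognized: acts like a backward swipe. */
    public void onLookLeft() {
        int prev = cardScroller.getSelectedItemPosition() - 1;
        if (prev >= 0) {
            cardScroller.animate(prev, CardScrollView.Animation.NAVIGATION);
        }
    }
}
```

This sidesteps event injection entirely; truly synthesizing touch input for another window would need the INJECT_EVENTS permission, which ordinary apps cannot hold.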
