Get Google Glass on your Nexus (or in that case Android)


I guess we are all pretty interested in Google Glass, all the more so since the Glass Explorer prototypes started making their way to the dev community. The other day, while reading an idea gist about what might be worth implementing in Glass for Wikimedia, I was pointed to the Glass Viewer in the Google Glass developer portal (Playground).

And today I found a way to run Glass apps on my Nexus 4.
Thanks to Michael Evans (who pointed this out) and Google, we now have working factory images for the XE5 root.

The build process is also documented in detail.
And then we have another, very useful dump from AndroidPolice.

Now, how would you like it if these were repackaged as installable APKs that you can try on your Android device?


Google Glass's build process is fairly conservative: the apps don't use hidden APIs often, and when they do, they use reflection. This makes it relatively easy to repackage the Glass APKs for other devices.
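To make that concrete, here is a minimal sketch of that reflection pattern: call a hidden framework method if the device has it, otherwise fall back to a default instead of crashing. The method name below is purely hypothetical; this is not the actual Glass source, just the shape of the pattern that makes repackaging painless.

    import java.lang.reflect.Method;

    import android.view.ViewConfiguration;

    public class HiddenApiFallback {

        // Hypothetical example: look up a hidden timeout method via reflection.
        // On devices (or builds) that don't have it, the catch block kicks in
        // and we simply return a sane default.
        public static int hiddenTimeoutOrDefault(int defaultMs) {
            try {
                Method m = ViewConfiguration.class.getMethod("getHiddenTimeout");
                Object result = m.invoke(null);   // assuming a static method here
                return (Integer) result;
            } catch (Exception e) {
                // Method missing or inaccessible on this device: use the fallback.
                return defaultMs;
            }
        }
    }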

Modifications to the base APK

The uses-library element in AndroidManifest.xml is removed, as it refers to unused code.
com/google/glass/hidden/HiddenViewConfiguration.smali is patched to always return 0xffff instead of calling the nonexistent View.getDeviceTapTimeout.
An instructional video (don_doff_background.mov, 8MB) is removed to save space.
All required native libraries are shipped with the APK, as are all the Glass fonts.
For the camera, instead of calling Camera.open() to get the rear-facing camera, Camera.open(0) is called to get the first camera, as the Nexus 7 doesn't have a rear camera (see the sketch after this list).
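For context, here is roughly what that camera change amounts to at the Java level. The actual patch is done in smali bytecode, so this is only an illustrative equivalent, not the shipped code:

    import android.hardware.Camera;

    public class CameraCompat {

        // Glass expects a rear camera and just calls Camera.open().
        // The repackaged APK opens the first available camera instead,
        // so devices without a rear camera (like the Nexus 7) still work.
        public static Camera openAnyCamera() {
            if (Camera.getNumberOfCameras() > 0) {
                return Camera.open(0);   // first camera, whichever way it faces
            }
            return Camera.open();        // original behaviour: rear camera (or null)
        }
    }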

So, without further ado: Install

Download the APK:
Setup: http://goo.gl/32FXv (this one has been modified so that, instead of scanning a barcode, it uses the existing Google account for setup and then force closes).
Install it just like any other boring APK. None of the Google Glass apps need system privileges. I do not recommend installing these as system APKs, as the Glass apps will attempt to reboot the phone after a force close.

It works on both the Nexus 4 and the Note 2 (not tested by me).

Everything, including the raw image, the APKs, and the build process, is on GitHub.
I'll update this post with the links very soon.

Till then, Go-Glass!!




