
VR For Everyone: When you get to play and teach WebVR

This blog post is pretty late to the party, but recently I have been having quite a bit of fun tinkering with WebVR. What started for me with MozVR, and was then standardized into WebVR, changed completely when A-Frame came into the picture!

So let me back up a little. WebVR started as an entry point for browsers into VR, letting us get VR content directly from the browser. Just imagine writing a game or an environment in JavaScript and HTML as a web page, opening it on your phone or Oculus Rift, and being transported into virtual reality! How cool is that? No need for other software, no platform restrictions. As long as you have a browser and an internet connection, you are good to go! However, that didn't pan out so well at first because of how you had to create the content: the only way to properly use WebVR was to write raw WebGL or use libraries like three.js. All of this changed when Mozilla released A-Frame on 16th December 2015. From the A-Frame team:

A-Frame makes it easy for web developers to create virtual reality experiences that work across desktop, iPhone, Android, and the Oculus Rift.

So it abstracts away all the technical difficulties and makes it much easier to learn. How easy, you ask? Well, that's what this blog post is all about.
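To give a sense of just how little code a scene takes, here is a minimal A-Frame "hello world" (roughly the standard starter scene, not one of our class demos; the script URL just points at whichever A-Frame release you pick):

<html>
  <head>
    <!-- Pull in A-Frame itself; any release works, this version is only an example -->
    <script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
  </head>
  <body>
    <!-- A scene with a few primitives and a sky; position is "x y z" in meters -->
    <a-scene>
      <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-cylinder position="1 0.75 -3" radius="0.5" height="1.5" color="#FFC65D"></a-cylinder>
      <a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4"></a-plane>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>

Open that single HTML file in a browser (or on a phone inside Google Cardboard) and you are looking around a 3D scene, with no WebGL or three.js code in sight.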
I had the awesome opportunity to go and teach (and learn myself alongside) a group of kids at "The Renaissance Charter High School for Innovation" as part of the NYC City TechImmersion program and their Enrichment Week Program. We started Day 1 with students ranging from kids who had a little experience with JavaScript to kids who had never programmed! And what they came up with in 3 days was not only surprising but awesome!

So straight into some action

This was the first day. Everyone was just trying to get their hands around some of the concepts.

Matthew Boyle helping students understand how orientation works in real life
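In A-Frame that real-world idea maps directly onto the rotation component, which takes pitch, yaw and roll in degrees around the x, y and z axes. A tiny illustrative snippet (not the actual exercise, just the attribute in isolation):

<a-scene>
  <!-- rotation="x y z" in degrees: tilt 30 around x (pitch), turn 45 around y (yaw) -->
  <a-box position="0 1 -3" rotation="30 45 0" color="#4CC3D9"></a-box>
</a-scene>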



And


By the second day, we started seeing kids make things like


And


Which slowly ramped up to this




Oh and also don't forget



My task, after letting them live-code along with me, was to have them build something like this, which I live-coded and projected (it is still hard to follow unless you know the concepts).



And voilà!
This is what one of the students came up with!




And all of this in just two days!
The beginning of the third day was even more awesome. Matthew and Sean, the other two instructors (who were learning alongside the students), showed something they had made!




Which was super cool. Then we had them work on another task, which produced something like this!




Finally, the class:


All of this from students, some of them coding for the first time! They were able to pick up A-Frame, the basic concepts of programming, and how VR works, and they built demo environments which they could see (and did see) right then on their own mobile phones with the help of Google Cardboard (provided to them courtesy of Mozilla!).

This was an awesome experience for me, and an eye-opener on how we can get people interested and be more inclusive in the way we use and treat technology. I really loved it, and it was a refreshing change from my normal conference talks or demos: here you constantly have to adapt your approach to what the kids will find interesting!

I had some materials prepared beforehand (and made some more impromptu), so if anyone wants to have a look at them, here they are!

The Whole Syllabus + Handbook Slide: Here
Making Your First VR App! (Used as an opener for 2nd day): Here

All the code demos as well as the materials are linked from the slides, hosted on JSFiddle and CodePen. Some of the demos were acting weirdly on CodePen, so I had to duplicate them on JSFiddle. The slides have links to all of them.

A small collection can be found here too: http://codepen.io/rabimba/

The way we did it was: I had them fork a pen, make the changes I asked for in their own copy, and show it to me. The demos are based on different concepts, each building on functionality you can reuse towards building your ultimate application :D
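As a made-up example of the kind of change a forked pen involved (not one of the actual class tasks), a typical step was taking an entity from an earlier demo and giving it a behaviour, for instance a looping spin using the <a-animation> element A-Frame shipped with at the time:

<!-- Same box as in the starter scene, now spinning continuously around the y axis -->
<a-box position="-1 0.5 -3" color="#4CC3D9">
  <a-animation attribute="rotation" to="0 360 0" dur="4000" repeat="indefinite"></a-animation>
</a-box>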

Happy virtual coding, people! And do drop a note if you have any comments or think this can be useful!

Update: It seems the Mozilla VR team covered this! Read more about it in their blog post and also in their weekly update.

Blog: https://blog.mozvr.com/fun-webvr-times-at-innovation-high/ 
