OpenIoT Summit: Two days with IoT, old phones and WebVR


OpenIoT Summit 2016 Report



I recently had the pleasure of presenting at the OpenIoT Summit 2016 by the Linux Foundation, where I represented Mozilla as a Mozilla TechSpeaker.

The event was quite nice and had a lot of tracks. A short description from their website:

Introducing the new OpenIoT Summit…
Billions of devices, trillions of dollars of opportunity. Building on successive waves of Web, mobile, and cloud, and powered by a revolution of cheap powerful hardware that is ever connected, the promise of the Internet of Things has finally arrived. It’s here, it’s real and it is creating untold opportunity.
…
Unlike existing IoT events, IoT Summit is for technologists and by technologists. Experts from the world’s leading companies and most important open source projects will present the information you need to lead successful IoT developments so that you can bring smart connected products and solutions to market.
OpenIoT Summit is the only IoT event focused on the development of IoT solutions. OpenIoT Summit is a technical event created to serve the unique needs of system architects, firmware developers, software developers and application developers in this emerging IoT ecosystem.
OpenIoT Summit delivers the knowledge you need to deliver smart connected products and solutions.
The agenda included some really interesting talks by people from a wide range of technologies. You can have a look at the diverse mix of talks they had here. Some of the ones I really liked were:
  • "Google ProjectARA Power Management Challenges" by Patrick Titiano, Baylibre
  • Keynote: Towards IoT Convergence - Bryan Che, General Manager, Cloud Product Strategy, Red Hat
  • AllJoyn 101: Make Smarter Devices - Ivan Judson, Microsoft
  • An IoT OS Security Architecture That is so Boring That You Can Sleep Soundly at Night - Ismo Puustinen, Intel Germany
They also had a keynote presentation by Linus Torvalds, at a time when we were frantically checking code in to GitHub.
A glimpse of the keynote
Mozilla Tech Evangelist and manager Dietrich Ayala was my co-speaker for the first talk. And being the awesome speaker he is, he eventually owned the talk completely!

You can see him giving the talk in the following video. I was still checking in code at the time -_-


We also got an article and a mention later. Yay!
Dietrich completely rocks!

There were a lot of stalls showcasing very interesting projects.


The next day I gave my other talk, on WebVR, which literally had people scratching their heads. You can see the talk below.

If you want to see more code and some live coding demos, please head over to my other post, where I had an awesome time teaching teenagers about WebVR.
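To give a flavour of what a basic WebVR demo looked like at the time: a minimal scene built with A-Frame, Mozilla's WebVR framework, is just a few lines of HTML (the CDN version below is illustrative, not the exact one used in the talk):

```html
<!-- Minimal A-Frame scene: a box and a sky, viewable in VR-capable browsers -->
<html>
  <head>
    <script src="https://aframe.io/releases/0.2.0/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <!-- A box placed 3 meters in front of the default camera -->
      <a-box position="0 1 -3" color="#4CC3D9"></a-box>
      <!-- A solid-color sky surrounding the scene -->
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```

Opening this file in a WebVR-enabled browser gives you a scene you can look around in, which is roughly where the live demos started.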

Conclusion: This was a unique experience. I met a lot of people involved in many different projects, all of whom seemed very interested in what Mozilla is doing in the IoT and VR space. I picked up a lot of ideas I could adopt from other projects, primarily AllJoyn; Brillo and Weave are also things I will look out for. I unfortunately had to leave on the 6th, so I missed all the sessions that day, but I thoroughly enjoyed my experience here.
