
GeekXTalk: Connected Devices and some Techno Indians

The poster for the event
It was back in 2006 when I first set foot in Techno India. That was supposed to be the college where I would learn to be an Electrical Engineer. It was also the college where I spent a good number of days of my life, shaping up to be what I am today.

First, just a little preamble. The idea of a tech-fest (technical festival) was very new in 2006-07. Even the word was synonymous with the event IIT Bombay organized yearly. Techno India was a really new college on the block, mostly afraid of taking any new steps, but with students who wanted to reach the sky and push the envelope beyond the horizon. Some of them formed the science club Geekonix and started Edge, the annual tech-fest, against all odds. But today we are not going to talk about that.

Fast forward 10 years, and those students, now at various stages of their lives, decided that it was again time to do something to kindle and guide students' interest in new technologies. And hence GeekXTalks, a talk series, was born. That is what we are going to talk about today.


After my talk at GraphicalWeb, while chatting with a few others, I was invited to talk about IoT (Connected Devices) at the second chapter of the series. I had a co-presenter with me: Ankan Roybardhan, an RF System Integration Engineer in the Wireless Design Org. at Apple. And I was representing Mozilla (yay!) as a Mozilla TechSpeaker.



The event took place at the Techno India Salt Lake campus. It was a holiday, so we had reservations about how many students would attend; our estimate was around 80. So we were pleasantly surprised to see the room completely filled up and people standing!

Almost everybody got a seat in the end

The Talk

I had planned a more hands-on and involved talk around "Web of Things" and how it all connects, but I decided against it after a little chat with the students and getting to know their interests. So I fell back on one of the talks I had delivered at Open Source Bridge in Portland and, with Dietrich (Platform Manager, Mozilla Corp.), at the Linux Foundation OpenIoT Summit 2016. You can see the video of the talk below.





TL;DR: Connected Devices / the Internet of Things are nothing new or fancy. They are our everyday devices with sensors that generate data, and we come up with ways to make sense of that data. But do we really need expensive proprietary devices to do that? Don't we already have a device within our grasp that does all of this, without vendor lock-in? Can we not use it? Can we not create a framework that lets you utilize that device in an open way, using open web technologies?


We need one device to rule it all
In this talk we take on the hype about IoT: the expensive devices, the ‘another’ new IoT device that talks to… only its own application, the awesome window blind that can learn your moods, or the overly complicated Raspberry Pi that you have to code to get your hack going. We show a simple yet reasonable way to get started with IoT: taking the old Android/Firefox OS phone or device that may be collecting dust on a shelf and turning it into something really interesting that can do tasks for you, tasks your expensive “smart home” product would otherwise have done with even more expensive proprietary hardware. We talk about what it means to us as open web citizens to have control over our IoT devices, how we can re-use old phones to get the same functionality, and show a sample code framework that you can already use to achieve this!
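To make the idea a bit more concrete, here is a minimal sketch of what "old phone as an open IoT device" can look like in plain JavaScript. This is an illustration of the concept only, not the framework from the talk: the function name, device nickname, and payload shape are all my own assumptions. On the phone you would wire something like this to a real sensor event and POST the payload with fetch(); here we just build the payload.

```javascript
// Illustrative sketch: package a phone sensor reading as a JSON
// payload that any open-web IoT hub could consume. The names and
// payload shape here are assumptions for illustration, not part of
// the framework shown in the talk.
function makeReading(deviceId, sensor, value, timestamp) {
  return JSON.stringify({
    device: deviceId, // e.g. a nickname for the old phone
    sensor: sensor,   // e.g. "ambient-light", "accelerometer"
    value: value,     // the raw sensor value
    ts: timestamp,    // when the reading was taken (ms since epoch)
  });
}

// Build a sample payload for a hypothetical ambient-light reading.
const payload = makeReading("old-nexus", "ambient-light", 143, 1480000000000);
console.log(payload);
```

Because the payload is plain JSON over HTTP, any server, dashboard, or other phone can consume it with standard web tooling, which is exactly the vendor-lock-in-free property the talk argues for.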
And of course this is the device

And how can a talk be complete without some swag?


Once we had that out of the way (and the pizza, courtesy of Mozilla), Ankan got to work explaining the scope of work for an electrical, electronics, or even mechanical engineer in the vast space that is IoT. We were overwhelmed by the diversity of our audience: apart from Computer Science and Information Technology undergraduates, we had students from electrical and electronics engineering asking us about the scope of work in the field and how they can contribute. That prompted us to show a few of our hobby projects involving wearable devices.




And at the end, of course, Ankan was swamped with questions about careers at Apple and higher education.



However, if any of you are interested in the Web of Things talk (what was initially planned), here are the slides. Leave a comment with any questions you might have and I will be glad to answer.



That's how we ended GeekXTalk Part 2.

And if you have any other queries, you can find me on Twitter at @rabimba
