
Immersive Technology Conference in Houston


At the end of last year, I was invited to speak at the Immersive Technology Conference, which took place at the University of Houston. It was a two-day event, and my talk was the first one. That is a slot I simultaneously hate and crave: even though it is stressful, it is the only one that lets me actually listen to and enjoy the talks that come after mine. Otherwise, I keep fretting over my own talk and can't concentrate on anything else.


The conference line-up was pretty good. I enjoyed the talks by Ann McNamara, Ron Dagdag, Brian Dornbos, and Angel Muniz. The only problem was that the two rooms ran conflicting sessions, so I had to prioritize and choose. I also missed the talk by Chris Gerty from NASA because it clashed with my own slot.

The event also had a few expo stalls, and I managed to try my first HoloLens experience there. Admittedly, I look a bit foolish in the glasses.
This was also my first conference as a Mozilla TechSpeaker where I started showing live demos of WebXR applications running from my phone. As you will see, the experience isn't flawless and the demos often crash, but they worked. I am still ironing out the kinks, and the process already went much more smoothly at OSCON.
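For anyone curious about the mechanics of such demos: a lot of the on-stage fragility comes down to whether the phone's browser supports an immersive session at all. Below is a minimal, illustrative sketch (not my actual demo code) of the capability check the WebXR Device API offers; the function name and fallback behavior are assumptions for the example.

```typescript
// Minimal WebXR capability check before launching a live demo.
// Illustrative sketch only; real demos typically build on A-Frame or three.js.

async function startDemo(): Promise<void> {
  // navigator.xr is undefined in browsers without the WebXR Device API,
  // which is one common way a live demo dies on stage.
  const xr = (navigator as any).xr;
  if (!xr) {
    console.warn("WebXR not available; falling back to a plain 3D view.");
    return;
  }

  // Ask whether an immersive VR session can actually be started here.
  const supported: boolean = await xr.isSessionSupported("immersive-vr");
  if (!supported) {
    console.warn("immersive-vr not supported; staying in inline mode.");
    return;
  }

  // Session requests must come from a user gesture (e.g. a button tap),
  // another easy thing to trip over when presenting.
  const session = await xr.requestSession("immersive-vr");
  session.addEventListener("end", () => console.log("XR session ended"));
}
```

Checking support up front and degrading to an inline view keeps a demo from hard-crashing when a device or browser turns out not to cooperate.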
If you want to watch the talk, this is how it went:


ITC (Immersive Technology Conference) is the only conference I know of in Houston that is completely focused on VR, AR, and Mixed Reality. That allowed me to meet a lot of like-minded people working in the immersive space whom I didn't know about before. Overall, I liked the conference. I got to know and meet a lot of very passionate folks from HoustonVR, and I look forward to keeping in touch and sharing VR awesomeness with them.

PS: Do check out the other talks from the conference; Ann McNamara's and Ron Dagdag's in particular gave me a few ideas I want to experiment with later on.
