
A day at SpaceCityJS and the Houston Hackathon

Recently I had the opportunity to attend SpaceCityJS in Houston. SpaceCityJS is a conference focused on JavaScript and the web platform, organized by folks from Houston-based developer groups like houston.js and Node.js Houston.

It was a little different from the conferences I generally attend: a bit more open, but still very developer oriented. Here is a short timeline of how the day went.

Event date: 14th May, 2016
Place: TMCx, Houston


The day started with registration and breakfast at the TMC Innovation Center in Houston. Over cups of coffee, Evan Morikawa from the Nylas N1 team took the stage and talked about how they used Electron and React to build N1. He also emphasized how much Chrome DevTools helped in the process, with performance profiling being key to the buttery-smooth experience we get from N1.
Evan Morikawa on how they built N1 using Electron and React
Then Kent Dodds took the stage to talk about how to manage open source projects effectively and efficiently. He covered a lot of best practices and tools, from Travis CI, GitHub, and ESLint to working with the community. What stood out most for me: "your project is not open source until it has a license", and you should always have one. Thanks for clarifying that for everyone, Kent.

After that, Kirsten Hunter talked about how you can work with the Fitbit API and play with your health data. She was followed by Collin Estes from NASA, who talked in depth about how NASA moved from legacy .NET applications to open source software like Node.js.

Lou Huang talked about how participating in civic hackathons can make an impact, and of course about their awesome Streetmix app.

It is always fun and interesting to hear from other developers how they use open source. Some points that really stuck with me:
"So many ways to contribute to #opensource & only one is coding" - Chris Oakman

"Open source is going to take us to Mars" - Collin Estes
After that I headed out to Houston Hackathon 

Where I managed to cook up this


It was a nice experience overall, and I got to build another app with WebRTC too! Yay!

Conclusion: This was a refreshing experience. The conference was relatively cozy and many people knew each other, giving it the sense of a close-knit "community". It also seemed like a good place for government officials to meet and talk; that sense was bolstered at the Houston Hackathon, where I saw quite a few officials on the judging panel. Overall I liked my experience there, and it was refreshing to hear what startups like Nylas as well as agencies like NASA are using open source for.

