
The curious case of the MTA Fare and the birth of "MTABonus"

Finally, yesterday my 11th Firefox OS app got published in the marketplace. With a pinch of surprise as well: instead of taking weeks to get approved, it took only a day. But then again, I am not complaining. I get lucky like this from time to time (though why didn't I get this lucky with the DVLUP submission... but that's a story for some other day) and quite often hit roadblocks too.

So without further ado, this is how the "App" finally ended up looking:

History: When I arrived in New York almost three months back, apart from the amazing city itself, what amazed and delighted me was the transportation. Unlike most cities in the United States, New York City (there is a reason for those italics) has a very comprehensive and functional public transport system, and one that people use extensively. The famous NYC subway and the buses are run by the Metropolitan Transportation Authority. The idea is that you buy a "MetroCard" and refill it with a certain amount (or get an unlimited weekly/monthly pass). Every ride you take deducts the ride amount from the card. Not much different from what I had before with DART (Dallas Area Rapid Transit). But this comes with a twist: they award a certain bonus percentage on refills above a threshold. After a couple of recharges this got me thinking, and it was an easy calculation to figure out how much and how many recharges I would need before the accumulated bonus added up to a free ride. But then again, what's the fun of doing that by hand? So one night, being high on "Pure water from Genté Springs", I quickly jotted down my calculations into this app (yeah, I am too lazy, hence I just went ahead and used the FxOSstub).
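The core calculation the app does can be sketched in a few lines. This is only an illustrative sketch: the fare, refill amount, and bonus rate below are placeholder assumptions for the example, not official MTA figures.

```javascript
// Illustrative sketch of the "free ride" calculation:
// how many refills of a given amount until the accumulated
// bonus covers the price of one full ride.
// NOTE: fare and bonusRate values are assumptions, not MTA's actual numbers.
function refillsForFreeRide(refillAmount, fare, bonusRate) {
  var bonus = 0;
  var refills = 0;
  while (bonus < fare) {
    bonus += refillAmount * bonusRate; // bonus credited on each refill
    refills += 1;
  }
  return refills;
}

// Example: $20 refills, a $2.50 fare, and a 5% bonus (all assumed values)
console.log(refillsForFreeRide(20, 2.50, 0.05)); // → 3
```

With those assumed numbers, each $20 refill earns $1.00 of bonus, so after three refills the $3.00 of accumulated bonus covers a $2.50 ride.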

Hence it was born.
So why this blog post all of a sudden? I never really blogged about any of my other apps, not even about V.Translator, which passed the 14K download mark this week. Well, call it the effect of the spring water, and a wish to vent about how easy it is to build an open web app. Really, it is so easy that I sometimes worry that a fair number of poorly written/optimized apps will make their way into the marketplace. Just like this one, if it had more functionality.
It is also a reminder to myself that I should not cut corners next time. This was different from most of my other app publications, which were fueled by ego (why can't I do it?), API experimentation, or app porting.

So, the confession: where did I cut corners?
  • You see the upper left menu button? It actually doesn't do anything. I disabled (read: commented out) the whole menu, but out of sheer laziness left the button in. Mental Poke 1: never do that again.
  • See how there aren't any landscape screenshots? Because I disabled landscape orientation. Not because I couldn't handle it; the app template itself is perfectly responsive and suitable for all sizes, but I didn't want to deal with scrolling, hence...
  • See in the second screenshot how the Feedback button marginally overlaps the text? It's just a minor CSS overlap, which would be negligible if the user could scroll. But the user can't, hence an annoying UX fail which I have to fix later *sigh*
Finally, if you want to test drive the app

