
I am an Optimist.

I want to be an optimist

Being a software engineer involves a lot of pessimism: a lot of thought in the software engineering process revolves around things that might go wrong, that are going wrong, that have gone wrong, or that went wrong.

Over time, that professional pessimism has been bleeding into my personal life, and I often find myself thinking more about what could go wrong than about what is going right.

There are times, though, when I want to be an optimist against the odds, and that even applies to the world of mobile gadgets.

I want Phonebloks to succeed.



This is a great idea on paper. I hope that they can manage a level of integration that makes the size, weight, price and shape competitive with integrated solutions. I hope that going through a common backplane doesn't hurt battery life or system speed. I hope that hardware drivers can be made portable enough. I hope they can resolve the issues of antenna performance on a variable-geometry device, especially for big coils like NFC or Qi. I hope that carriers will be willing to provide end-user support for those devices.

I want Firefox OS to succeed.



This is another great idea on paper. I hope they can succeed where many other companies have failed. Looking at the ZTE Open, I hope they can provide a user experience on that class of hardware that rivals what other systems offer on devices costing nearly ten times as much. I hope they can transparently handle the high packet loss and connection drops that are the reality of mobile networks.

I want network interoperability to be a reality.

2G GSM interoperability was a mess. It took a while for phones to be available with all 4 common bands, and that didn't stay relevant for long, as UMTS was already getting deployed.

UMTS was a bigger mess, and it took 5 bands to cover the US and Europe (and most of the world with the notable exception of Japan). Now that we have phones that cover all 5 bands (e.g. Galaxy Nexus, Nexus 4), LTE is getting deployed.

LTE is an even bigger mess. It takes 5 bands just to cover the common US and European GSM carriers alone, and without digging much the list quickly grows to over a dozen. The Nexus 7 manages to support 4 bands in GSM, 5 bands in UMTS and 7 bands in LTE (and the LTE bands aren't the same in all variants). Other devices don't seem to be so lucky and have fewer connectivity options.
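To make the interoperability problem concrete: a device works on a network only if its supported bands overlap with the carrier's, and it roams well only if it covers all of them. A minimal sketch of that set-containment view, using made-up band lists (the numbers below are illustrative, not an authoritative spec of any real device or carrier):

```python
# Band interoperability as set overlap.
# These band numbers are hypothetical examples, not real device/carrier data.

device_lte_bands = {1, 2, 3, 4, 5, 7, 17}   # bands a hypothetical phone supports
carrier_lte_bands = {3, 7, 20}              # bands a hypothetical carrier uses

# Usable at all: at least one band in common.
usable = bool(device_lte_bands & carrier_lte_bands)

# Full coverage: every carrier band is supported by the device.
full_coverage = carrier_lte_bands <= device_lte_bands

print(usable)         # True: bands 3 and 7 overlap
print(full_coverage)  # False: band 20 is missing
```

With a dozen or more LTE bands per region, the "full coverage" test is exactly what keeps failing across device variants.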

I hope that we can quickly reach a point where LTE devices can cover a dozen different bands or more, so that they can be used while traveling.
