
Sony Vaio E Series Touchpad Right Click doesn't work in Ubuntu 12.04 LTS / Elementary OS

For the past six months I had barely touched my own laptop, thanks to the awesome server I was working on and the work laptop that T J Watson was so kind to provide me. My personal laptop seemed too slow and unresponsive compared to them.

Well, after all, I had a server with 16 Xeon processors (yes, each with 10 cores) and a little more than 26 GB of RAM to go with it. The laptop was nothing compared to that, but it was a decent T430 with an awesome keyboard and a decent processor and RAM. But since I was approaching the end of my internship, and the end of this vanity trip, I thought it better to get back to my own gear and polish up my Vaio (SVE15118FNB) a little before going back to it.

So I *again* followed my standard sanitation procedure and cleaned, formatted, and dual-booted the whole machine. And for a change, I thought I'd give Elementary OS a try.

The whole installation process was silky smooth and went without a hitch. The first time I booted the laptop it had problems with the graphics, though. Those vanished quickly once I installed the AMD proprietary drivers (yeah, I have a discrete graphics card). But then I faced a new, weird issue.

I have generally never had issues with touchpad drivers, and even this time I didn't notice anything at first. But then I realized: the right click doesn't work. It simply doesn't register as a right click; it registers as a left click. No matter what you do or what settings you change, this doesn't get fixed. Even installing the synaptiks touchpad configuration tool won't do you any good. An external mouse works perfectly well, though.
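If you want to confirm the misbehavior yourself before patching anything, you can watch the raw button events (a quick diagnostic sketch; the touchpad's device id will differ on your machine):

xinput list
xinput test <touchpad-id>

With a healthy driver, pressing the right button shows up as button 3 in the output of xinput test; on the broken setup, both physical buttons report button 1.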

So after days of head-banging I finally came up with a solution, and I am packaging it with this post.
The patched module was built against kernel 3.2.0-24-generic-pae, so I can't promise it will work on any other kernel.
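You can check which kernel you are running before you start:

uname -r

If it prints something other than 3.2.0-24-generic-pae, treat the steps below as a template rather than a guaranteed recipe; DKMS may still manage to build the module against a nearby kernel, but no promises.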

  1. Download the patch from here
  2. sudo apt-get install dkms build-essential
  3. cd ~/Downloads (I'm assuming the downloaded file is there)
  4. tar jxvf psmouse-3.2.0-24-generic-pae.tar.bz2
  5. sudo mv psmouse-3.2.0-24-generic-pae /usr/src (DKMS looks for module sources under /usr/src by default)
  6. cd /usr/src
  7. sudo chmod -R a+rx psmouse-3.2.0-24-generic-pae
  8. sudo dkms add -m psmouse -v 3.2.0-24-generic-pae
  9. sudo dkms build -m psmouse -v 3.2.0-24-generic-pae
  10. sudo dkms install -m psmouse -v 3.2.0-24-generic-pae
  11. sudo modprobe -r psmouse
  12. sudo modprobe psmouse
  13. sudo dkms status
If everything goes all right, the output should look something like this:

fglrx-updates, 13.350.1, 3.2.0-51-generic, x86_64: installed
fglrx-updates, 13.350.1, 3.2.0-70-generic, x86_64: installed
psmouse, 3.2.0-24-generic-pae, 3.2.0-70-generic, x86_64: installed (original_module exists)

(it will obviously look a little different for you)

Now you should be able to use your hardware right-click button. A restart is not required.
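If you want to double-check that the patched module is the one actually loaded, these two commands are handy (a quick verification sketch; the exact paths and messages will differ on your system):

modinfo psmouse | grep filename
dmesg | tail

The first should point at the DKMS-built psmouse.ko under /lib/modules/, and the tail of dmesg should show psmouse re-detecting the touchpad after the modprobe.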


