The Linux kernel has a module named uinput that allows userspace programs to simulate input devices like mice, keyboards and joysticks. I used this neat mechanism to turn my Android phone into a touchpad for my computer.
Here's how it works. On the phone, an application broadcasts touchscreen events over the network. You can get the application's source here. On the PC, a script listens for those events and translates them into mouse events, which it injects into the system via uinput; the X server then sees them and moves the mouse pointer accordingly. That part is written in Python and requires the python-uinput module (which in turn requires libsuinput). You can get its source here.
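The PC side boils down to creating a virtual pointer device and forwarding movement deltas into it. Here's a minimal sketch of that loop using python-uinput; the wire format (two signed 16-bit deltas per UDP packet on port 5555) is an assumption for illustration, not the app's actual protocol:

    import socket
    import struct

    import uinput  # python-uinput, wraps libsuinput

    # Create a virtual relative-pointer device; to the kernel (and thus
    # to the X server) it looks like a real mouse.
    device = uinput.Device([
        uinput.REL_X,
        uinput.REL_Y,
        uinput.BTN_LEFT,
    ])

    # Listen for events from the phone. The packet layout below is a
    # made-up example, not the app's real protocol.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", 5555))

    while True:
        data, _addr = sock.recvfrom(4)
        dx, dy = struct.unpack("!hh", data)
        device.emit(uinput.REL_X, dx, syn=False)  # batch with the Y event
        device.emit(uinput.REL_Y, dy)             # syn=True flushes both

Running something like this requires the uinput kernel module to be loaded and write access to /dev/uinput.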
As always, there's room for improvement. I used my phone's trackball as the left mouse button, but for phones or tablets that have no trackball it would probably make sense to use some regions of the screen as buttons. It would also be interesting to add multitouch gestures for scrolling and zooming. Another thing that comes to mind is to use the phone as a 3D controller for Google Earth. Four degrees of freedom should be doable with the touchscreen alone via two-finger gestures, and the phone's other sensors, like the accelerometer and gyroscope (my Nexus One lacks a gyroscope, but newer phones have one), might get us all six.
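For example, the screen-regions-as-buttons idea could look something like this; the 480-pixel width and the 10% strip are placeholder numbers of my choosing, not anything the app defines:

    import uinput

    def handle_touch_down(device, x, screen_width=480):
        # Hypothetical layout: treat the leftmost 10% of the screen as a
        # left-button zone rather than part of the movement area.
        if x < screen_width * 0.10:
            device.emit_click(uinput.BTN_LEFT)  # press and release in one call
            return True   # consumed: don't turn this touch into motion
        return False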
Thanks for the write-up! Just installed it on my DX and laptop with ease. I'll be using this over the next few weeks at home and on the road! Thanks!