I have previously described how to turn an Android device into a mouse under Linux, using the uinput module. There's no reason we should limit ourselves to a simple mouse. Some applications benefit from a multi-axis controller, which we can simulate in a similar way. One such application is Google Earth. Here's a demonstration video:
And here's how it's done. Just as before, an application running on the tablet listens for touch events and broadcasts them on the network, while a Python script running on the computer listens for those broadcasts and translates them into simulated controller events that Google Earth understands. Here's the tablet application's source code, and here's just the APK if you don't want to compile it yourself. The Python script that runs on the computer is here.
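The computer side boils down to a UDP socket feeding a python-uinput device. Here's a minimal sketch of that idea; note that the packet layout (three little-endian floats) and the parse_packet helper are illustrative assumptions, not the actual wire format of TouchscreenBroadcaster3D — see the linked script for the real thing:

```python
import socket
import struct

PORT = 20125  # the port the tablet application broadcasts on

def parse_packet(data):
    # Hypothetical layout: three little-endian floats (dx, dy, dz).
    # The real application's format may well differ.
    return struct.unpack('<fff', data[:12])

def main():
    import uinput  # python-uinput; needs the uinput kernel module loaded
    # Create a virtual device with three relative axes.
    device = uinput.Device([uinput.REL_X, uinput.REL_Y, uinput.REL_Z])
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(('', PORT))
    while True:
        data, _addr = sock.recvfrom(1024)
        dx, dy, dz = parse_packet(data)
        # Batch the axis events, then sync on the last one.
        device.emit(uinput.REL_X, int(dx), syn=False)
        device.emit(uinput.REL_Y, int(dy), syn=False)
        device.emit(uinput.REL_Z, int(dz))

if __name__ == '__main__':
    main()
```

Note that the uinput device is only created once main() runs, which is why nothing shows up under /dev/input/ until events start arriving in the real script.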
There are some additional hoops you have to jump through to get it running.
- The Python part requires python-uinput, which in turn requires libsuinput.
- Remember to unblock UDP port 20125 on your computer's firewall, as that's the port that the tablet application uses to broadcast touchscreen events (the tablet and the computer obviously have to be on the same network).
- Run android3dmouse.py as root.
- Run the TouchscreenBroadcaster3D application on your Android device and touch the screen (the simulated input device only appears on the computer after the first events arrive).
- Look in /dev/input/ and figure out what the newly created device is called (it's going to be the eventN file with the highest number).
- As root, execute the following command to let Google Earth, which runs as a regular user, access the device: chmod 666 /dev/input/eventN
- Open /opt/google/earth/free/drivers.ini with a text editor and paste in the following after the "SETTINGS" line (after the curly brace), remembering to replace eventN with the proper device name:
SpaceNavigator/sensitivityX = 80
SpaceNavigator/sensitivityY = 80
SpaceNavigator/sensitivityZ = 30
SpaceNavigator/sensitivityPitch = 0.5
SpaceNavigator/sensitivityYaw = 0.5
SpaceNavigator/sensitivityRoll = 0.5
SpaceNavigator/device = /dev/input/eventN
SpaceNavigator/zeroX = 0.0
SpaceNavigator/zeroY = 0.0
SpaceNavigator/zeroZ = 0.0
SpaceNavigator/zeroPitch = 0.0
SpaceNavigator/zeroYaw = 0.0
SpaceNavigator/zeroRoll = 0.0
SpaceNavigator/gutterValue = 0
- X will have happily grabbed the new device, so it now moves the mouse cursor; execute the following command to make X release it: xinput set-int-prop "python-uinput-mouse" "Device Enabled" 8 0
- Finally, run Google Earth.
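The device-hunting and permission steps above can be sketched as a small shell helper. The chmod and xinput invocations are taken straight from the list; the newest_event function name is my own, and it relies on the post's observation that the new device is the highest-numbered eventN file:

```shell
# Pick the highest-numbered eventN from a list of device paths.
newest_event() {
    # version-sort so event10 ranks above event9
    printf '%s\n' "$@" | sort -V | tail -n 1
}

# Usage (as root, after touching the tablet screen):
#   EVDEV=$(newest_event /dev/input/event*)
#   chmod 666 "$EVDEV"    # let Google Earth open the device
#   xinput set-int-prop "python-uinput-mouse" "Device Enabled" 8 0
newest_event /dev/input/event2 /dev/input/event10   # prints /dev/input/event10
```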
You should now be able to pan, zoom and rotate via multi-touch gestures on your tablet's (or phone's) screen. (Yes, one more degree of freedom would be nice, but I haven't figured out a good way to do it yet.)
If you can't get this mess to work, note that I'm not the first to do something like this: there's a Google Summer of Code project by Reese Butler that works in a similar manner but uses a slightly different control scheme. The Android part is on the Market and the computer part can be found here. Maybe you'll have more luck with that one.