
2014-06-12

External notification light for your phone

Among other interesting things, Android 4.3 introduced a proper way of accessing notifications from an app. I used this API to make an external notification light for my phone:



It uses Adafruit's Trinket plus a Bluetooth serial board from dx.com to talk to the phone, and it's powered via USB. Here's a diagram of the connections:



The Arduino sketch running on the Trinket is very simple: it listens on the serial line connected to the Bluetooth board and turns the LED on and off depending on what character is received. You can see the sketch here.

On the Android side, the code is also pretty simple. There's an intent filter in the manifest that registers a listener, which gets called whenever a notification is posted or dismissed. In the listener we check which notifications are active and whether they requested the notification light to be turned on (your phone might not even have a notification light, but the information is still there). Then we connect to the Trinket via Bluetooth and tell it to turn the LED on or off. You can see the code here. There is no UI and the Bluetooth address is hardwired.
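
Since the Bluetooth board simply exposes a serial port, you can exercise the same protocol from a PC for testing. Here's a minimal Python sketch using pyserial; the port name and the on/off characters are assumptions for illustration, not necessarily what the linked sketch expects:

import time
import serial  # pyserial

# Hypothetical serial port for the paired Bluetooth board, and an assumed
# protocol where '1' turns the LED on and '0' turns it off.
port = serial.Serial("/dev/rfcomm0", 9600, timeout=1)
port.write(b"1")  # notification active: LED on
time.sleep(2)
port.write(b"0")  # notification dismissed: LED off
port.close()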

There is no special permission in the manifest to let the app access notifications; instead, you grant access via a checkbox in the security section of your phone's settings:



As usual, there's room for improvement. For example, if the phone fails to connect to the device via Bluetooth (because it's out of range or powered off), it should probably retry after a while. We could also use an RGB LED and send color and timing information, instead of just an on/off state, to better replicate the behavior of the phone's own notification light.

2013-01-07

USB audio dock for Android

If you want to get audio from your Android device to your speakers in a transparent (not application-specific) way, there have traditionally been two options: Bluetooth and the 3.5 mm jack (I guess MHL and HDMI are also valid options on some devices). Android 4.1 introduced another way: audio over USB. This has an additional benefit: you don't have to plug in a separate cable for the phone to charge. The problem is, I'm not aware of any actual products, like audio docks, that make use of this feature. So I decided to make one myself.



I used a simple USB dock and a Raspberry Pi. The Raspberry Pi's audio output is connected to my living room amplifier. Below I describe the steps necessary to make this work.

Obviously you'll need a dock that matches your phone's physical shape (or you can skip the actual dock and just use a USB cable). The Raspberry Pi could be replaced by any computer running Linux. The Pi is a good fit here because of its small size. Then again it is a bad fit because its audio components are low quality (this could be worked around by using HDMI as the audio output or connecting an external sound card via USB, but I haven't tested these options).

The good thing about Android's audio over USB feature is that it uses a standard protocol, so if you connect your phone to a modern Linux box, it just shows up as an audio input. Therefore we only need to do two things here: enable the audio over USB feature after the device is plugged in and route the sound from the input to the output that goes to the speakers. We'll use a simple Python script to send the necessary magic over USB and PulseAudio for the audio routing.

I used a Raspbian "wheezy" image on the Pi; if you're using another distribution (or not using a Raspberry Pi), some adjustments may be necessary.

First, install git and PulseAudio:
sudo apt-get update
sudo apt-get install pulseaudio git
Use git to install a newer version of PyUSB than is available in Debian repositories:
git clone https://github.com/walac/pyusb
cd pyusb
sudo python setup.py install
Next we configure PulseAudio. Edit /etc/default/pulseaudio to start the daemon at boot and allow module loading:
PULSEAUDIO_SYSTEM_START=1
DISALLOW_MODULE_LOADING=0
In /etc/pulse/system.pa change the line that says:
load-module module-native-protocol-unix
to:
load-module module-native-protocol-unix auth-anonymous=1
(This will allow us to use the pactl command without worrying about authentication.)

Add the following line to /etc/pulse/daemon.conf:
resample-method = trivial
(I don't know why this is necessary. It wasn't needed on my regular PC, but audio wasn't working without it on the Pi.)

Then create a file named /etc/udev/rules.d/dock.rules and put the following line in it:
ACTION=="add",SUBSYSTEM=="usb",ATTR{idVendor}=="04e8",ATTR{idProduct}=="685c",RUN+="/home/pi/phone_docked.sh %s{idVendor} %s{idProduct}"
This will run /home/pi/phone_docked.sh every time my phone is connected. Unfortunately, the USB vendor and product IDs of my specific phone are hardcoded here (and passed on to the script as parameters), so you'll need to change them to match your phone (you can look up the IDs by running lsusb with the phone connected). I don't know how to write a udev rule that triggers for any Android phone; it's probably more feasible to write a rule that matches Android phones along with some other devices, then try to talk to each one as if it were an Android phone and handle it gracefully when the device doesn't respond.

The phone_docked.sh script does two things: first it runs the Python script (android-usb-audio.py) that enables audio over USB on the phone (passing along the vendor and product IDs), then it loads a PulseAudio module that routes the audio from the phone to the default output. Here's what the script looks like (put this in /home/pi/phone_docked.sh):
#!/bin/bash

/home/pi/android-usb-audio.py $1 $2
(sleep 3s ; pactl load-module module-loopback source=`pactl list sources short | grep alsa_input.usb | cut -f 1`) &
As you can see, the second part is not very elegant: it just waits 3 seconds when it should really wait for the PulseAudio source to show up. It also assumes there are no other USB audio sources.
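
A more robust version could poll PulseAudio until the source actually appears instead of sleeping a fixed time. Here's a sketch of that idea in Python (the timeout and matching substring are my choices, not part of the original script):

import subprocess
import time

def wait_for_usb_source(timeout=15.0):
    # Poll "pactl list sources short" until a USB capture source shows up.
    deadline = time.time() + timeout
    while time.time() < deadline:
        out = subprocess.check_output(["pactl", "list", "sources", "short"])
        for line in out.decode().splitlines():
            if "alsa_input.usb" in line:
                return line.split("\t")[0]  # the source index
        time.sleep(0.5)
    return None

source = wait_for_usb_source()
if source is not None:
    subprocess.call(["pactl", "load-module", "module-loopback",
                     "source=" + source])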

Finally, here's the Python script that sends the necessary USB magic telling the phone to stream audio over USB. It gets the USB vendor and product IDs from command line parameters (put this in /home/pi/android-usb-audio.py):
#!/usr/bin/env python

import usb.core
import time
import sys

# Find the phone by the vendor and product IDs passed on the command line.
dev = usb.core.find(idVendor=int(sys.argv[1], 16), idProduct=int(sys.argv[2], 16))

# ACCESSORY_GET_PROTOCOL (51): read the AOA protocol version (2 bytes,
# little-endian). Audio requires version 2; we should really check that here.
mesg = dev.ctrl_transfer(0xc0, 51, 0, 0, 2)
time.sleep(1)

# SET_AUDIO_MODE (58 = 0x3a) with value 1: request audio over USB
# (2-channel, 16-bit PCM).
dev.ctrl_transfer(0x40, 0x3a, 1, 0, "")

# ACCESSORY_START (53): switch the device into accessory mode.
dev.ctrl_transfer(0x40, 53, 0, 0, "")
(The magic numbers come from the Android Open Accessory protocol.)

Give the two scripts executable permissions:
chmod 755 /home/pi/android-usb-audio.py
chmod 755 /home/pi/phone_docked.sh
And that's it, reboot, connect your phone and enjoy audio coming from your speakers. (I know what you're thinking: all this work just to avoid plugging in the 3.5 mm jack??)

Since the phone will charge from the Pi's USB port, you should use a power supply that can handle both the Pi and the extra 500 mA the phone will draw.

I'm not demonstrating it here, but while the phone is connected, you can also send HID commands like play/pause/next/previous to it. This way you could make some physical (or web) controls for the dock and pass them through to the device.

While you're at it, you could plug a Bluetooth USB dongle into the Raspberry Pi and make your dock also accept audio via Bluetooth. There are tutorials on the web showing how to do that.

2012-12-11

Face tracking robot

Android 4.0 introduced an API for face detection. It's really simple to use: you only have to set up a listener, and it gets called each time a face is detected in the frame captured by the camera. I used it to make a Lego Mindstorms NXT robot that turns to look at you as you move around. Here's what it looks like:



The way it works is that an application on the phone checks whether a face is detected, then communicates with the robot over Bluetooth and tells it to turn left or right, depending on the position of the face in the frame. Here's the application's source code if you want to take a look. I don't have building plans for the robot itself, but any robot with tank-like steering would work.
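
The steering decision itself is just a comparison with a dead zone around the frame's center. Here's a Python sketch of the idea (the function name and thresholds are illustrative, not the app's actual code):

def steer(face_center_x, frame_width, dead_zone=0.15):
    # Normalize the face position to -1..1, where 0 is the frame center.
    offset = (face_center_x - frame_width / 2.0) / (frame_width / 2.0)
    if abs(offset) < dead_zone:
        return "stop"  # face is close enough to center
    return "right" if offset > 0 else "left"

print(steer(500, 640))  # face on the right side of the frame -> "right"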

A cool way to demo this robot would be to use it for video chats or Google+ hangouts, but I don't think there's a way for two different applications to access the camera at the same time, so the face tracking functionality would have to be integrated into a video chat app. (Or I guess I could put a second phone on the robot.)



2012-08-21

Subway buddies

Have you ever wondered whether you're meeting the same people every day on your commute? Over half a million people ride the Warsaw Metro daily, and if I arrive at the station at roughly the same time every day, surely there must be others with similar schedules. But even after almost a year of taking the same route, I couldn't recognize any familiar faces. So I decided it was time for a more scientific approach.

A surprisingly large number of people leave their phone's Bluetooth in a discoverable state (which I find strange, because on most phones I've seen, the default behavior is to become discoverable only for 120 seconds or so while pairing). I figured that if I scanned for Bluetooth devices every time I rode the subway and saved the results, I could then check whether I ever saw the same device on different days. So that's what I did: I made an application for my phone that performs a Bluetooth scan and logs every device it sees in a SQLite database.

That still wasn't ideal, because I had to trigger each scan manually (I didn't want to scan continuously, since that would quickly drain the battery, and I'd also get lots of false positives from my neighbors, coworkers and so on). Happily, there is a solution. Living in a city, just about the only time I lose cell phone signal is when I get on the subway, so losing signal is a good condition to wait for to trigger the scan (there will be occasional false positives, but they don't hurt much). Obviously, this only works if your commute involves riding the subway and not, say, a bus.
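
The logging side boils down to a single table. Here's a minimal Python sketch of the schema and insert that the query below implies (the exact column types are my guess):

import sqlite3
import time

db = sqlite3.connect("sightings.db")
db.execute("CREATE TABLE IF NOT EXISTS sighting (addr TEXT, sightingtime INTEGER)")

def log_sighting(addr):
    # Record a device address together with a Unix timestamp.
    db.execute("INSERT INTO sighting VALUES (?, ?)", (addr, int(time.time())))
    db.commit()

log_sighting("00:11:22:33:44:55")  # a made-up device address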

Here's the source code of the application if you'd like to play with it.

So are there any strangers that happen to commute at the same time I do? After just two weeks of the experiment, with the help of a SQL query:

select addr, count(*), max(sightingtime)-min(sightingtime) y, group_concat(sightingtime) from sighting group by addr having y > 3600 order by y desc;

I found two people that I had shared a subway ride with on two different days. Unfortunately, I can't say much about them, except that one had an LG phone and the other a Samsung.

2012-02-21

Electricity meter fun

At the place I live, the electricity meter looks like this:



The fancy display shows total kilowatt-hours and maximum power consumption, but it doesn't show current consumption. Instead, it has this blinking red light: the faster it blinks, the higher the power consumption. To get an actual figure, I'd have to count how many times it blinked, measure the time and then do some arithmetic. And that would be pedestrian. Instead I wrote an application for my phone that does the measurements (with the phone's camera) and displays the current power consumption:



Here's the source code if anyone's interested.
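
For reference, the arithmetic is straightforward. Meters like this are marked with an impulse constant (for example 1000 impulses per kWh); given that, the current power follows from the time between blinks. A quick Python sketch (the constant here is an assumption; use whatever is printed on your meter):

IMPULSES_PER_KWH = 1000.0  # printed on the meter, e.g. "1000 imp/kWh"

def power_watts(seconds_between_blinks):
    # Each blink represents 1000 / IMPULSES_PER_KWH watt-hours of energy,
    # so scale by 3600 seconds to get an hourly (i.e. watt) figure.
    wh_per_blink = 1000.0 / IMPULSES_PER_KWH
    return wh_per_blink * 3600.0 / seconds_between_blinks

print(power_watts(3.6))  # one blink every 3.6 s -> 1000 W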

2011-10-17

Control Google Earth with an Android tablet

I have previously described how to turn an Android device into a mouse under Linux, using the uinput module. There's no reason we should limit ourselves to a simple mouse. Some applications benefit from a multi-axis controller, which we can simulate in a similar way. One such application is Google Earth. Here's a demonstration video:

And here's how it's done. Just as before, an application running on the tablet listens for touch events and broadcasts them on the network, and a Python script running on the computer listens for those events and translates them into simulated controller events that Google Earth understands. Here's the tablet application's source code, and here's just the APK if you don't want to compile it yourself. The Python script that runs on the computer is here.
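
The heart of the computer-side script is a virtual multi-axis device created through uinput, which Google Earth then treats like a SpaceNavigator. Here's a minimal python-uinput sketch of that idea (the axis ranges are arbitrary; the real android3dmouse.py may differ):

import uinput

# Expose six absolute axes, SpaceNavigator-style: translation plus rotation.
events = [
    uinput.ABS_X + (-350, 350, 0, 0),
    uinput.ABS_Y + (-350, 350, 0, 0),
    uinput.ABS_Z + (-350, 350, 0, 0),
    uinput.ABS_RX + (-350, 350, 0, 0),
    uinput.ABS_RY + (-350, 350, 0, 0),
    uinput.ABS_RZ + (-350, 350, 0, 0),
]
device = uinput.Device(events, name="python-uinput-mouse")

# Nudge the view: deflect the Z axis (zoom), then return it to center.
device.emit(uinput.ABS_Z, 100)
device.emit(uinput.ABS_Z, 0)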

There are some additional hoops you have to jump through to get it running.

  • The Python part requires python-uinput, which in turn requires libsuinput.
  • Remember to unblock UDP port 20125 on your computer's firewall, as that's the port that the tablet application uses to broadcast touchscreen events (the tablet and the computer obviously have to be on the same network).
  • Run android3dmouse.py as root.
  • Run the TouchscreenBroadcaster3D application on your Android device and touch the screen (the simulated mouse device only appears on the computer after the first events are emitted).
  • Look in /dev/input/ and figure out what the newly created device is called (it's going to be the eventN file with the highest number).
  • As root, execute the following command to let Google Earth running as a regular user access the device: chmod 666 /dev/input/eventN
  • Open /opt/google/earth/free/drivers.ini with a text editor and paste in the following after the "SETTINGS" line (after the curly brace), remembering to replace eventN with the proper device name:
    SpaceNavigator/sensitivityX = 80
    SpaceNavigator/sensitivityY = 80
    SpaceNavigator/sensitivityZ = 30
    SpaceNavigator/sensitivityPitch = 0.5
    SpaceNavigator/sensitivityYaw = 0.5
    SpaceNavigator/sensitivityRoll = 0.5
    SpaceNavigator/device = /dev/input/eventN
    SpaceNavigator/zeroX = 0.0
    SpaceNavigator/zeroY = 0.0
    SpaceNavigator/zeroZ = 0.0
    SpaceNavigator/zeroPitch = 0.0
    SpaceNavigator/zeroYaw = 0.0
    SpaceNavigator/zeroRoll = 0.0
    SpaceNavigator/gutterValue = 0
    
  • X will happily grab the new device and start moving the mouse cursor with it, so execute the following command to make X let it go: xinput set-int-prop "python-uinput-mouse" "Device Enabled" 8 0
  • Finally, run Google Earth.

You should now be able to pan, zoom and rotate via multi-touch gestures on your tablet's (or phone's) screen. (Yes, one more degree of freedom would be nice, but I haven't figured out a good way to do it yet.)

If you can't get this mess to work: I'm not the first one to do something like this. There's a Google Summer of Code project by Reese Butler that works in a similar manner but uses a slightly different control scheme. The Android part is on the Market and the computer part can be found here. Maybe you'll have more luck with that one.

2011-08-19

Android phone as a touchpad for a PC

The Linux kernel has a module named uinput that allows userspace programs to simulate input devices like mice, keyboards and joysticks. I used this neat mechanism to turn my Android phone into a touchpad for my computer.



Here's how it works. On the phone there's an application that broadcasts touchscreen events over the network. You can get the application's source here. On the PC there's a script running that listens for the events and translates them into mouse events that get injected into the system via uinput, at which point the X server sees them and moves the mouse pointer accordingly. That part is written in Python and requires the python-uinput module (which in turn requires libsuinput). You can get its source here.
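
In outline, the PC side creates a virtual mouse via uinput and forwards whatever arrives over UDP. A minimal python-uinput sketch (the port number and packet format here are made up for illustration; the real app's wire format may differ):

import socket
import uinput

# A virtual mouse: relative X/Y movement plus a left button.
device = uinput.Device([uinput.REL_X, uinput.REL_Y, uinput.BTN_LEFT])

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", 12345))  # hypothetical port

while True:
    data, _ = sock.recvfrom(64)
    # Hypothetical packet format: "dx dy" as two whitespace-separated ints.
    dx, dy = (int(v) for v in data.split())
    device.emit(uinput.REL_X, dx, syn=False)
    device.emit(uinput.REL_Y, dy)  # emits and syncs in one go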

As always, there's room for improvement. I used my phone's trackball as the left mouse button, but for phones or tablets without a trackball, it would probably make sense to use some regions of the screen as buttons. It would also be interesting to add multitouch gestures for scrolling and zooming. Another thing that comes to mind is using the phone as a 3D controller for Google Earth. Four degrees of freedom should be doable with the touchscreen alone via two-finger gestures, and maybe we could use the phone's other sensors, like the accelerometer and gyroscope (which my Nexus One lacks, but newer phones have), to get all six.

2011-03-30

Pivot screen fun

I used to think pivot monitors were only good for playing Pac-Man in portrait position. But why restrict ourselves to just the landscape and portrait positions?



Since the monitor only knows two states, I had to resort to a little trick. I attached my phone to the back of the monitor and wrote an application that broadcasts orientation data from the phone's accelerometer over the network. On the computer, another application receives the data and rotates the teapot accordingly. (Here's the source code for the phone application and the teapot application if you're interested.)
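
The key computation on the receiving end is recovering the tilt angle from gravity as reported by the accelerometer. A small Python sketch of the idea (the names and axis convention are illustrative, not the actual code):

import math

def tilt_degrees(ax, ay):
    # With the phone flat against the back of the monitor, gravity projects
    # onto the accelerometer's x/y plane; atan2 recovers the rotation angle.
    return math.degrees(math.atan2(ax, ay))

print(tilt_degrees(0.0, 9.81))  # upright -> 0 degrees
print(tilt_degrees(9.81, 0.0))  # rotated onto its side -> 90 degrees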

What I really wanted to do was to make the desktop rotate with all the windows. I think it should be possible with Compiz, but I'm pretty sure it would prove difficult enough for me to lose interest halfway through. So, a teapot for now.

2011-02-17

Morse Code Reader

I had wanted to make a Morse code reading application for some time. Here's my first attempt; it listens to the signal through the phone's mic and translates it to text:



I'm sure there are all kinds of advanced signal processing techniques that I should be using, but for now I went with the simplest, most naive approach. And it sort of works, except when it doesn't. I keep an exponential moving average of the signal's volume and decide that the signal is currently "high" when the average is above a threshold and "low" when it's below it; the threshold is set based on the recent peak volume. Then I try to work out the lengths of a dot and a dash, interpret the input accordingly, and finally translate the dots and dashes into letters. The application adapts to changing volume and speed within reasonable limits (it should work up to about 40 WPM).
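
Here's a rough Python sketch of the thresholding stage described above (the smoothing factor, threshold ratio and peak decay are illustrative; the app's real constants may differ):

def keying(volumes, alpha=0.3, ratio=0.5):
    # Smooth the volume with an exponential moving average and compare it
    # against a threshold derived from a slowly decaying recent peak.
    avg, peak = 0.0, 1e-9
    for v in volumes:
        avg = alpha * v + (1 - alpha) * avg
        peak = max(avg, peak * 0.999)  # let old peaks fade out
        yield avg > ratio * peak       # True = signal "high"

samples = [0.1] * 20 + [1.0] * 20 + [0.1] * 20
print(list(keying(samples)))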



I was hesitant about putting the application on Android Market, as I only tested it on my phone and it quite often fails at translating Morse code. Then again, it's not like I have anything to lose by getting 1-star reviews and insulting comments, so why not.

Here's a Market link.

2010-11-12

NXT Remote Control

Two of my favorite toys right now are my Lego Mindstorms set and my Android phone. And thanks to the Bluetooth protocol I've found a way to combine them. Behold!



Since the video is quite blurry, here's a better view of the application's interface:



The third control scheme (not pictured), which I like to call the tank mode, allows you to control each motor separately and works best if your phone has true multitouch support (not just pinch-to-zoom).

Now, I'm not saying my application is unique; there are at least two others on the Android Market that do the same thing, including an official one from Lego. But hey, it's mine!

If you'd like to try it, below is a QR code that you can scan from your phone (or if you're reading this on your phone, just tap the code). And here's the application's AppBrain page. (It's free of course.)



Finally if you'd like to see the source code, it's over here at Google Code.