SoundTrak: 3D input tracking technology
- Contributed to establishing the data pipeline from the audio interface to a Mac. Wrote scripts to generate a real-time visualization of high-rate streaming finger-position data in 3D.
- Wrote the system evaluation section of the paper for the IMWUT submission.
- Highlights: Significantly reduced the time and cost of system evaluation by hacking a MakerBot 3D printer and controlling it programmatically to gather ground-truth data.
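The ground-truth idea above can be sketched as follows. This is a minimal illustration, not the actual evaluation code: `GroundTruthPath` and the grid parameters are my own invention, and it assumes a printer that accepts plain G-code over a serial link (MakerBots actually speak the S3G/X3G protocol, so a translation layer such as GPX would sit in between).

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Locale;

// Sketch: generate G-code moves that sweep the print head through a grid of
// known (x, y, z) positions, so the head can carry the ring/finger proxy and
// serve as a ground-truth position source. Class and parameter names are
// hypothetical; a real MakerBot would need G-code translated to X3G.
public class GroundTruthPath {
    // One absolute move per grid point, at a fixed feed rate (mm/min).
    public static List<String> gridMoves(double stepMm, int nx, int ny, int nz) {
        List<String> cmds = new ArrayList<>();
        cmds.add("G90");  // absolute positioning
        cmds.add("G21");  // units in millimeters
        for (int k = 0; k < nz; k++)
            for (int j = 0; j < ny; j++)
                for (int i = 0; i < nx; i++)
                    cmds.add(String.format(Locale.US,
                            "G1 X%.1f Y%.1f Z%.1f F1200",
                            i * stepMm, j * stepMm, k * stepMm));
        return cmds;
    }
}
```

Pausing at each grid point while logging the tracker's output would then give matched pairs of estimated and true positions for computing tracking error.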
The small size of wearable devices limits the efficiency and scope of possible user interactions, as inputs are typically constrained to two dimensions: the touchscreen surface.
To build a system that actively tracks finger location in three-dimensional space, is small enough to be integrated into commercial smartwatches, and can easily be extended to larger displays.
We developed SoundTrak, an active acoustic sensing technique that enables a user to interact with wearable devices in the surrounding 3D space by continuously tracking the finger's position with high resolution. The user wears a ring with an embedded miniature speaker sending an acoustic signal at a specific frequency (e.g., 11 kHz), which is captured by an array of miniature, inexpensive microphones on the target wearable device. A novel algorithm is designed to localize the finger's position in 3D space by extracting phase information from the received acoustic signals.
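The phase-based ranging step can be sketched roughly as below. This is my own reconstruction under standard acoustic-ranging assumptions, not the paper's actual algorithm: `PhaseTracker`, the windowed I/Q demodulation, and all constants are illustrative. The real system combines phase from the microphone array to solve for 3D position; the snippet only shows how one microphone's phase maps to a change in path length.

```java
// Sketch (illustrative, not the paper's code): estimate the carrier phase at
// one microphone by I/Q demodulation, then convert a phase change between
// frames into a path-length change.
public class PhaseTracker {
    // Correlate a frame of samples with cos/sin at the carrier frequency f
    // (sample rate fs) and return the phase in radians, in (-pi, pi].
    public static double phase(double[] frame, double fs, double f) {
        double i = 0, q = 0;
        for (int n = 0; n < frame.length; n++) {
            double w = 2 * Math.PI * f * n / fs;
            i += frame[n] * Math.cos(w);
            q -= frame[n] * Math.sin(w);
        }
        return Math.atan2(q, i);
    }

    // Path-length change (meters) implied by a phase change, assuming the
    // change stays within half a wavelength between consecutive frames.
    public static double deltaDistance(double dPhase, double f, double c) {
        double lambda = c / f;  // wavelength: ~3.1 cm at 11 kHz in air
        return dPhase * lambda / (2 * Math.PI);
    }
}
```

Because an 11 kHz tone has a wavelength of only about 3 cm, even small phase changes correspond to millimeter-scale motion, which is what makes high-resolution tracking plausible.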
Under review for IMWUT 2017 (Interactive, Mobile, Wearable and Ubiquitous Technologies)
Cool tech from the system evaluation
SoundTrak has been one of the most technologically challenging and fruitful projects I have done in a while. Research is fast paced, so I didn't have much time to experiment, and we needed some clever solutions to problems well known among researchers. The main challenges were:
1. Visualization of high-rate streaming location-tracking data
This seems like a trivial thing to do, right? Well, when a large number of interdependent systems are involved, things get messy. In the current version of the SoundTrak system, the hardware sends acoustic data to a Mac via a C library suited to transferring audio data. The SoundTrak algorithms are written in Java, so a script forwards the data from C to Java. After all the processing in Java comes the visualization part. After countless hours of searching for a Java library that can chart 3D position data, we had no luck, so I decided to send the data out of Java and visualize it in Matlab.

Just when we thought we had found the solution, another big hurdle appeared: the high sampling rate of the data. Matlab couldn't render the data as fast as Java was sending it. It was time for a clever hack. After hours of tuning parameters, we settled on Matlab's plot3 function, but instead of plotting individual data points, we buffered the data in batches and plotted each batch. The buffer was kept large enough that there was no perceptible delay in the real-time visualization. After hours of head scratching and code debugging, here is the working, almost-real-time visualization.
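The buffering trick described above can be sketched like this (hypothetical `BatchBuffer` class; the `Consumer` stands in for whatever transport actually ships points to Matlab):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Sketch of the batching idea: instead of handing every 3D point to the
// plotter as it arrives, buffer points and flush them in fixed-size batches
// so the visualizer can keep up with the sample rate.
public class BatchBuffer {
    private final List<double[]> buffer = new ArrayList<>();
    private final int batchSize;
    private final Consumer<List<double[]>> flusher;

    public BatchBuffer(int batchSize, Consumer<List<double[]>> flusher) {
        this.batchSize = batchSize;
        this.flusher = flusher;
    }

    // Add one (x, y, z) point; flush automatically when the batch fills.
    public void add(double[] point) {
        buffer.add(point);
        if (buffer.size() >= batchSize) flush();
    }

    // Hand off a copy of the pending points and clear the buffer.
    public void flush() {
        if (buffer.isEmpty()) return;
        flusher.accept(new ArrayList<>(buffer));
        buffer.clear();
    }
}
```

The batch size trades latency for throughput: large enough that the plotter is called rarely, small enough that the display still feels real-time.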