gaurav38/HackDuke13
Title: SmartWatch-driven Gesture Interface for Mobile Devices
*****

Team Members:
*******************
Karthik Balasubramanian
Gaurav Saraf
Gaurav Patwardhan
Hemant Gupta

Software Used:
*******************
Pebble SDK 2.0 beta 2
Pebble iOS and Android apps v2.0 beta 2
Eclipse with Android PebbleKit
Gesture Recognition Toolkit by Nick Gillian

Hardware Used:
*******************
Pebble SmartWatch

Project Idea:
*************
The Pebble is a SmartWatch that pairs with a smartphone running Android or iOS. It serves primarily as a notification device, with a screen and vibrator for output and buttons and an accelerometer for user input. Since Pebble owners wear the watch almost constantly, the data its sensors generate can act as a fingerprint of the user's activities. The accelerometer has always been part of the Pebble's hardware, and the public was recently given access to it through a set of APIs. We took this opportunity to classify different hand gestures using the accelerometer, and we present a feasible prototype. There are many user scenarios where this information can enhance the user experience, for example:
	-	Interactive games played on tablets and smartphones
	-	'Wrist flick' actions that change the phone's status, such as switching from silent to loud mode or replying to a call with a predefined message
	-	Easier on-stage presentations, changing slides with natural hand gestures

Methodology used:
*********************
1. Create a Pebble watchapp that resides on the SmartWatch and captures the accelerometer data. The watchapp parses this data and sends it to an Android smartphone over Bluetooth.
2. Create an Android app that receives this data and forwards it to a classifier to recognize the gesture.
3. Create a classifier that receives the data, predicts a gesture from the trained models, and sends the output back to the Android phone.
4. The Android phone takes the appropriate action based on the predicted gesture.
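The four steps above can be sketched end to end. The snippet below is a minimal Python illustration, not the actual hackathon code (which was a C watchapp plus a Java Android app): the 'x,y,z' wire format, the window size, and the classify() placeholder are all assumptions.

```python
from collections import deque

# Hypothetical sketch of steps 1-4: parse raw accelerometer readings,
# group them into fixed-size windows, and classify each window.
# WINDOW_SIZE and the 'x,y,z' text format are assumptions, not the
# real protocol used in the project.

WINDOW_SIZE = 25  # accelerometer samples per gesture window (assumed)

def parse_sample(raw):
    """Parse one 'x,y,z' accelerometer reading received over Bluetooth."""
    x, y, z = (int(v) for v in raw.split(","))
    return (x, y, z)

def classify(window):
    """Placeholder for the trained gesture classifier."""
    # Trivial stand-in rule: large net x-axis movement -> 'flick'.
    total_x = sum(s[0] for s in window)
    return "flick" if abs(total_x) > 1000 else "none"

def run_pipeline(raw_stream):
    """Collect samples into windows and emit one gesture label per window."""
    window = deque(maxlen=WINDOW_SIZE)
    gestures = []
    for raw in raw_stream:
        window.append(parse_sample(raw))
        if len(window) == WINDOW_SIZE:
            gestures.append(classify(list(window)))
            window.clear()
    return gestures
```

In the real system, step 4 maps each emitted label to a phone-side action, such as muting the ringer.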

Challenges faced:
*******************
1. The multiple firmware versions of the Pebble Android app and the Pebble SmartWatch app cost us an unexpected delay of a couple of hours.
2. Because the Pebble uses Bluetooth low-energy mode for efficiency, the data stream we received was not continuous: the watch periodically went into sleep cycles, which disrupted gesture recognition. We solved this by reducing the sleep cycle to the bare minimum, at the cost of slightly higher battery usage.
3. Data points had to be generated at a very fast rate to capture the complete picture; otherwise an aliasing-like effect could cause a gesture to be classified as something else. Since the processor is low-performance and must also handle sending the data over Bluetooth, we had to strike a fine balance between collecting and sending the data.
4. Collecting data and training models for accurate gesture recognition is a tough task. It is critical to capture the temporal aspects of the accelerometer data. We tried a variety of sequence models, such as HMMs and frame-based SVMs, as well as other classifiers such as Naive Bayes. In the end, we achieved the best performance with a Dynamic Time Warping (DTW) model.
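DTW works well here because it aligns two sequences that trace the same gesture at different speeds, which is exactly the temporal variation that frame-based classifiers struggle with. The sketch below is a standalone Python illustration of the idea, not the GRT implementation we actually used; the template labels and sequences are made up.

```python
def dtw_distance(a, b):
    """Dynamic Time Warping distance between two (x, y, z) sequences."""
    inf = float("inf")
    n, m = len(a), len(b)
    # cost[i][j]: best alignment cost of the first i samples of a
    # against the first j samples of b.
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Euclidean distance between the two 3-axis samples.
            d = sum((p - q) ** 2 for p, q in zip(a[i - 1], b[j - 1])) ** 0.5
            cost[i][j] = d + min(cost[i - 1][j],      # stretch a
                                 cost[i][j - 1],      # stretch b
                                 cost[i - 1][j - 1])  # advance both
    return cost[n][m]

def classify_gesture(sample, templates):
    """Return the label whose template is nearest to the sample under DTW."""
    return min(templates, key=lambda label: dtw_distance(sample, templates[label]))
```

Because DTW warps the time axis, a flick performed slowly (more samples) still lands closer to the flick template than to an oscillating shake template, so no fixed frame rate has to be assumed at classification time.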
