Methods: We collected smartphone accelerometer readings and transdermal alcohol content (TAC) readings from 19 subjects participating in an alcohol consumption field study. The TAC readings served as the ground truth when training the system to make classifications. Both the TAC sensors and the smartphone accelerometers produced noisy readings, which we cleaned with the MATLAB Signal Processing Toolbox. We then segmented the data into 10-second windows and extracted features known to change when humans lose control of their center of mass (i.e., become intoxicated).
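As a rough illustration of the windowing and feature extraction step, the Python sketch below segments cleaned tri-axial accelerometer data into 10-second windows and computes a few simple sway-related statistics per window. The sampling rate, feature choices, and function names are assumptions made for illustration only; the study's actual pipeline was implemented with MATLAB.

```python
# Illustrative sketch (not the study's MATLAB pipeline): segment cleaned
# tri-axial accelerometer data into 10-second windows and compute a few
# gait/sway statistics. Sampling rate and features are assumed values.
import numpy as np

SAMPLE_RATE_HZ = 40                      # assumed accelerometer sampling rate
WINDOW_SEC = 10                          # 10-second windows, as in the study
WINDOW_LEN = SAMPLE_RATE_HZ * WINDOW_SEC

def segment_windows(acc: np.ndarray) -> np.ndarray:
    """Split an (n_samples, 3) accelerometer array into non-overlapping
    10-second windows, dropping any trailing partial window."""
    n_windows = acc.shape[0] // WINDOW_LEN
    return acc[: n_windows * WINDOW_LEN].reshape(n_windows, WINDOW_LEN, 3)

def window_features(window: np.ndarray) -> np.ndarray:
    """Simple sway-related statistics for one window (illustrative only)."""
    magnitude = np.linalg.norm(window, axis=1)   # overall acceleration magnitude
    return np.array([
        magnitude.mean(),                        # average movement intensity
        magnitude.std(),                         # variability (sway)
        np.abs(np.diff(magnitude)).mean(),       # jerkiness between samples
        *window.std(axis=0),                     # per-axis variability
    ])

def extract_features(acc: np.ndarray) -> np.ndarray:
    """Return an (n_windows, n_features) matrix for one recording."""
    return np.stack([window_features(w) for w in segment_windows(acc)])
```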
Results: We experimented with feature extraction methods borrowed from sound recognition tasks and show that they provide a significant improvement on this task (up to an 8% absolute gain in accuracy in our case). Finally, we built and trained several classifiers to label each window as a “sober walk” or an “intoxicated walk”, the best of which achieved a test accuracy of 75.04%.
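A minimal sketch of the classification step is shown below, assuming per-window features and TAC-derived binary labels: the data are split into training and test sets, a classifier is fit, and test accuracy is reported. The random forest model, split ratio, and hyperparameters are illustrative assumptions, not the study's actual configuration.

```python
# Illustrative classification step: fit a model on per-window features and
# TAC-derived labels, then measure held-out accuracy. The random forest and
# its settings are placeholders, not the classifiers evaluated in the study.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def train_window_classifier(features: np.ndarray, labels: np.ndarray):
    """features: (n_windows, n_features); labels: 1 = intoxicated walk, 0 = sober walk."""
    X_train, X_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.2, stratify=labels, random_state=0
    )
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_train, y_train)
    test_acc = accuracy_score(y_test, clf.predict(X_test))
    return clf, test_acc
```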
Conclusions and Implications: This result has promising implications for classifying noisy accelerometer data collected in the field and offers multiple avenues for improvement. We plan to use our classifiers to build a free, reliable, and widely adoptable mobile application that tracks intoxication in real time, enabling the development of effective real-time, mobile-based interventions that can later be delivered through the application to reduce unnecessary alcohol-related injury and death. The results and the application will also benefit future studies as new sensor-bearing technologies become widely adopted.