Abstract: Smartphone-Based Intelligent System: Using AI and Motion Sensors for Real-Time Intervention during Heavy Alcohol Consumption Events (Society for Social Work and Research 23rd Annual Conference - Ending Gender Based, Family and Community Violence)

515P Smartphone-Based Intelligent System: Using AI and Motion Sensors for Real-Time Intervention during Heavy Alcohol Consumption Events

Schedule:
Saturday, January 19, 2019
Continental Parlors 1-3, Ballroom Level (Hilton San Francisco)
* noted as presenting author
Danielle Madden, PhD, Postdoctoral Scholar, University of Southern California
Jackson Killian, Doctoral Student, University of Southern California
John Clapp, PhD, Professor, University of Southern California
Background and Purpose: Excessive alcohol consumption is an avoidable health risk, yet it accounts for a significant share of deaths and injuries on college campuses each year. Recent work has shown that weekly mobile-based interventions can effectively reduce alcohol consumption among students. However, few studies have investigated delivering mobile interventions in real time during drinking events, where interventions could reduce risks such as drunk driving, alcohol poisoning, and violence. To address this gap, we built an intelligent system capable of passively tracking smartphone accelerometer data to identify heavy drinking events in real time.

Methods: We collected smartphone accelerometer readings and transdermal alcohol content (TAC) readings from 19 subjects participating in an alcohol consumption field study. The TAC readings served as the ground truth when training the system to make classifications. Both the TAC sensors and the smartphone accelerometers produced noisy readings, which were cleaned with the MATLAB Signal Processing Toolbox. We then segmented the data into 10-second windows and extracted features known to change when humans lose control of their center of mass (i.e., become intoxicated).
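The windowing and feature-extraction step described above can be sketched as follows. This is an illustrative Python/NumPy sketch, not the study's actual pipeline: the 40 Hz sampling rate, the synthetic signal, and the specific features (mean, standard deviation, RMS, dominant frequency bin) are assumptions standing in for the gait-related feature set used in the study.

```python
import numpy as np

def window_signal(accel, fs=40, window_s=10):
    """Split a 1-D acceleration signal into fixed-length windows.

    fs (sampling rate) is an assumed value; the 10-second window
    length follows the description in the Methods section.
    """
    n = fs * window_s
    n_windows = len(accel) // n
    # Drop any trailing partial window, then reshape to (windows, samples)
    return accel[:n_windows * n].reshape(n_windows, n)

def extract_features(window):
    """Simple per-window features (mean, std, RMS, dominant frequency
    bin) -- hypothetical stand-ins for the study's feature set."""
    spectrum = np.abs(np.fft.rfft(window - window.mean()))
    dom_freq_bin = np.argmax(spectrum)
    return np.array([window.mean(), window.std(),
                     np.sqrt(np.mean(window ** 2)), dom_freq_bin])

# Example: 60 s of synthetic accelerometer data at an assumed 40 Hz
rng = np.random.default_rng(0)
t = np.arange(2400)
signal = np.sin(2 * np.pi * 2 * t / 40) + 0.1 * rng.standard_normal(2400)

windows = window_signal(signal)                               # shape (6, 400)
features = np.vstack([extract_features(w) for w in windows])  # shape (6, 4)
```

Each row of `features` then corresponds to one 10-second window and becomes one training example for the classifiers described in the Results.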

Results: We experimented with feature extraction methods borrowed from sound recognition tasks and showed that they provide a significant improvement on this task (up to an 8% absolute accuracy gain in our case). Finally, we built and trained several classifiers to label each window as a "sober walk" or an "intoxicated walk," the best of which achieved a test accuracy of 75.04%.
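The classifier-comparison step can be illustrated with a minimal scikit-learn sketch. The model choices (logistic regression, random forest) and the synthetic feature data are assumptions for illustration only; the abstract does not specify which classifiers were trained.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: rows are per-window feature vectors,
# labels 0 = "sober walk", 1 = "intoxicated walk".
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 4))
y = (X[:, 0] + 0.5 * rng.standard_normal(500) > 0).astype(int)

# Hold out a test set so accuracy reflects generalization, not fit
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Train several candidate classifiers and compare test accuracy
models = {
    "logistic": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(random_state=0),
}
scores = {name: accuracy_score(y_te, m.fit(X_tr, y_tr).predict(X_te))
          for name, m in models.items()}
best = max(scores, key=scores.get)
```

Selecting the best model by held-out test accuracy mirrors how the reported 75.04% figure was obtained: each candidate is trained on the same windows and evaluated on windows it has never seen.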

Conclusions and Implications: This result has promising implications for making classifications on noisy accelerometer data in the field and also offers multiple avenues for improvement. We plan to use our classifiers to build a free, reliable, and widely adoptable mobile application that tracks intoxication in real time. Such an application would enable the development of effective real-time mobile-based interventions, delivered through the application itself, to reduce unnecessary alcohol-related injury and death. The results and application will also benefit future studies as new sensor-bearing technologies become widely adopted.