Type: Full Paper
Time: Wed 10:30 am - 11:00 am
Session: Session 3 ‒ Best Papers
We propose a novel pervasive system to recognise human daily activities from a wearable device. The system is designed in the form of reading glasses, named 'Smart Glasses', integrating a 3-axis accelerometer and a first-person view camera. Our aim is to classify a user's activities of daily living (ADLs) based on both vision and head-motion data. This ego-activity recognition system not only allows caretakers to track a specific person (such as a patient or an elderly person), but also has the potential to remind or warn people with cognitive impairments of hazardous situations. We present the following contributions in this paper: a feature extraction method for accelerometer and video data; a classification algorithm integrating both locomotive activities (body motions) and stationary activities (with little or no motion); and a novel multi-scale dynamic graphical model structure for structured classification over time. We collect, train, and validate our system on a large dataset containing 20 hours of ADL data, covering 12 daily activities under different environmental settings. Our method improves the classification performance (F-measure) of conventional approaches from 43.32% (video features) and 66.02% (acceleration features), an average improvement of 20-40%, to 84.45%, with an overall accuracy of 90.04% on realistic ADLs.
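To make the accelerometer side of the pipeline concrete, here is a minimal sketch of window-level feature extraction from a 3-axis accelerometer, one ingredient of the fused feature vector described above. The function name, window size, and choice of statistics (per-axis mean, standard deviation, and energy) are illustrative assumptions, not the paper's exact feature set.

```python
# Hypothetical sketch: per-window statistical features from a 3-axis
# accelerometer stream. The actual paper's feature set may differ.
import math

def accel_features(window):
    """Compute per-axis mean, standard deviation, and energy for a
    window of (x, y, z) accelerometer samples."""
    feats = []
    for axis in range(3):
        samples = [s[axis] for s in window]
        n = len(samples)
        mean = sum(samples) / n
        var = sum((v - mean) ** 2 for v in samples) / n
        energy = sum(v * v for v in samples) / n
        feats.extend([mean, math.sqrt(var), energy])
    return feats

# A stationary activity yields near-zero variance, while a locomotive
# activity (simulated here with oscillating readings) does not:
still = [(0.0, 0.0, 9.8)] * 32
walking = [(math.sin(i), math.cos(i), 9.8 + math.sin(i)) for i in range(32)]
```

In a full system, a vector like this would be concatenated with video-derived features before classification, which is the kind of fusion the abstract's 84.45% F-measure result is attributed to.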