With the increasing availability of wearable devices such as Google Glass, Microsoft SenseCam, Samsung Galaxy Gear, Autographer, MeCam, and LifeLogger, there has been a recent upsurge of interest in lifelogging. Lifelogging is the activity of recording and documenting portions of one's life, typically automatically using wearable devices. It can potentially enable many interesting applications, ranging from lifestyle and behavior analysis to health monitoring and memory rehabilitation for dementia patients. While much progress has been made in the hardware design of lifelogging devices, in this work we build an egocentric video dataset dubbed LENA (Life-logging EgoceNtric Activities), which comprises egocentric videos of 13 fine-grained activity categories recorded in diverse situations and environments using Google Glass. Activities in LENA can also be grouped into 5 top-level categories to serve the varied needs of activity-analysis research. We evaluate state-of-the-art activity recognition on LENA in detail and analyze the performance of popular descriptors for egocentric activity recognition. Several descriptors (HOF, HOG, MBH, Trajectory, and a combined descriptor) are computed for both the top-level and fine-grained categories using a MATLAB simulation.
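To illustrate the kind of descriptor the abstract mentions, the sketch below computes a minimal HOG-style (Histogram of Oriented Gradients) feature in plain Python. This is a simplified illustration, not the exact pipeline of the referenced papers (which operate on dense trajectories in video and are implemented in MATLAB): gradients are taken with central differences, each cell accumulates a magnitude-weighted orientation histogram, and each cell histogram is L2-normalized. All names (`hog_descriptor`, `cell_size`, `bins`) are illustrative choices, not from the source.

```python
import math

def hog_descriptor(image, cell_size=4, bins=9):
    """Minimal HOG-like descriptor sketch.

    image: 2-D list of grayscale intensities (floats).
    Returns a flat list: one `bins`-length histogram per cell.
    """
    h, w = len(image), len(image[0])
    # Image gradients via central differences (left as zero at the borders).
    gx = [[0.0] * w for _ in range(h)]
    gy = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx[y][x] = image[y][x + 1] - image[y][x - 1]
            gy[y][x] = image[y + 1][x] - image[y - 1][x]
    descriptor = []
    for cy in range(0, h - cell_size + 1, cell_size):
        for cx in range(0, w - cell_size + 1, cell_size):
            # One orientation histogram per cell, weighted by gradient magnitude.
            hist = [0.0] * bins
            for y in range(cy, cy + cell_size):
                for x in range(cx, cx + cell_size):
                    mag = math.hypot(gx[y][x], gy[y][x])
                    # Unsigned orientation in [0, pi).
                    ang = math.atan2(gy[y][x], gx[y][x]) % math.pi
                    hist[min(int(ang / math.pi * bins), bins - 1)] += mag
            # L2-normalize each cell histogram.
            norm = math.sqrt(sum(v * v for v in hist)) or 1.0
            descriptor.extend(v / norm for v in hist)
    return descriptor
```

HOF (Histograms of Optical Flow) and MBH (Motion Boundary Histograms) follow the same cell-histogram idea, but bin optical-flow vectors and flow derivatives, respectively, instead of image gradients.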

Reference Paper 1: Activity Recognition in Egocentric Life-logging Videos
Authors: Sibo Song, Vijay Chandrasekhar, Ngai-Man Cheung, Sanath Narayan, Liyuan Li, and Joo-Hwee Lim
Source: Asian Conference on Computer Vision (ACCV)
Reference Paper 2: Egocentric Activity Recognition Using HOG, HOF, and MBH Features
Authors: K. P. Sanal Kumar and R. Bhavani
To request the source code for academic purposes, fill in the REQUEST FORM or contact +91 7904568456 via WhatsApp; a fee is applicable.

SIMULATION VIDEO DEMO