With the rapid growth of egocentric video from wearable devices, the need for instant video event detection is emerging. Unlike conventional video event detection, it demands real-time detection and immediate video recording under the tight computational budget of wearable devices (e.g., Google Glass). Prior work on video event detection analyzes video content offline, and such visual analysis is too time-consuming for this setting. Observing that wearable devices are usually equipped with sensors, we propose a novel approach for instant event detection in egocentric videos that leverages sensor-based motion context. We first compute statistics of the sensor data as features. We then predict the user's current motion context with a hierarchical model, and select the corresponding ranking model to estimate an importance score for the current timestamp. With importance scores available in real time, the camera on the wearable device can dynamically record micro-videos without wasting power or storage. In addition, we collected a challenging daily-life dataset, EDS (Egocentric Daily-life Videos with Sensor Data), which contains both egocentric videos and sensor data recorded with Google Glass by multiple subjects. We evaluate our system on the EDS dataset, and the results show that our method outperforms the baselines.
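The pipeline described above (windowed sensor statistics → hierarchical motion-context prediction → context-specific importance ranking → conditional recording) can be sketched as follows. This is a minimal illustration, assuming scikit-learn-style models exposing a `predict` method; the feature set, the coarse/fine split of the hierarchy, the `RECORD_THRESHOLD` trigger, and the `camera.start_micro_video` call are hypothetical placeholders, not the paper's exact design.

```python
import numpy as np

def sensor_features(window):
    """Statistics of one sensor-data window (e.g., accelerometer and
    gyroscope axes); window has shape (n_samples, n_axes)."""
    return np.concatenate([
        window.mean(axis=0),   # per-axis mean
        window.std(axis=0),    # per-axis standard deviation
        window.min(axis=0),    # per-axis minimum
        window.max(axis=0),    # per-axis maximum
    ])

class HierarchicalMotionContext:
    """Two-level hierarchy (an assumed structure): a coarse classifier
    (e.g., stationary vs. moving) routes to a fine-grained classifier."""

    def __init__(self, coarse_clf, fine_clfs):
        self.coarse_clf = coarse_clf  # any model with .predict()
        self.fine_clfs = fine_clfs    # dict: coarse label -> fine model

    def predict(self, feats):
        coarse = self.coarse_clf.predict([feats])[0]
        return self.fine_clfs[coarse].predict([feats])[0]

RECORD_THRESHOLD = 0.5  # illustrative trigger; the real policy is not specified

def process_timestamp(window, context_model, rankers, camera):
    """Score the current timestamp and trigger recording if important."""
    feats = sensor_features(window)
    context = context_model.predict(feats)
    score = rankers[context].predict([feats])[0]  # context-specific ranker
    if score > RECORD_THRESHOLD:
        camera.start_micro_video()  # hypothetical camera API
    return score
```

Keeping the per-timestamp work to simple window statistics plus two lightweight model evaluations is what makes the score available in real time on-device.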