Thermal information surrounding a person is a rich source for understanding and identifying personal activities. Different daily activities naturally emit distinct thermal signatures from both the human body and surrounding objects; these signatures have both spatial and temporal components as objects move and thermal energy dissipates, for example, when drinking a cold beverage or smoking a cigarette. We present HeatSight, a wearable system that captures the thermal environment of the wearer and uses machine learning to infer human activity from the thermal, spatial, and temporal information in that environment. We achieve this by embedding five low-power thermal sensors in a pentahedron configuration that captures a wide view of the wearer's body and the objects they interact with. We also design a battery-saving mechanism that selectively powers only those sensors necessary for detection. With HeatSight, we unlock thermal sensing as an egocentric modality for future interaction research.
Foodtrk: Track meals and snacks with food pictures and a questionnaire for research