The goal of this work is to create a simplified, robust sensor set for detecting seated activity, to determine how to intervene based on behavior inferred from those activities, and to respond accordingly. A key design goal for the sensing environment is that it be modular, simple, unobtrusive, and easy to install, yet robust. To that end, we followed an iterative “build and test” process, reducing the number and complexity of sensors while retaining recognition accuracy.
We developed a technique for recognizing seated postures, and our results included a robust, inexpensive system with near-real-time prediction of seated postures. This work helps interpret posture as an indicator of fidgeting, which is related to seat discomfort, and covers other basic seated activities: left leg crossed, right leg crossed, leaning left, leaning right, leaning back, leaning forward, seated upright, and slouching. The sensor package can also serve as one of the data-gathering mechanisms in the car.
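The text does not specify the classifier used for near-real-time posture prediction, so the following is only an illustrative sketch. It assumes each sensor reading is reduced to a small feature vector and classified with a nearest-centroid rule; the posture labels come from the list above, while the feature dimensions and centroid values are hypothetical.

```python
import math

# Posture labels from the text (hypothetical feature encoding).
POSTURES = ["upright", "lean_left", "lean_right", "lean_back",
            "lean_forward", "left_leg_crossed", "right_leg_crossed", "slouch"]

def nearest_centroid(sample, centroids):
    """Return the posture whose centroid is closest to the sample
    in Euclidean distance. `centroids` maps posture -> feature vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

# Example with two made-up centroids over a 3-dimensional feature vector:
centroids = {"upright": (1.0, 1.0, 0.0), "lean_left": (2.0, 0.2, 0.1)}
print(nearest_centroid((0.9, 1.1, 0.0), centroids))  # closest to "upright"
```

A nearest-centroid rule is cheap enough to run per sensor frame, which is consistent with the near-real-time requirement; the actual system may of course use a different model.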
Location information is valuable for locating a person in an emergency, and it gives clinicians otherwise unavailable insight into their patients’ usage patterns and activity levels. Using audio and light signatures from our multi-sensor platform, eWatch, we successfully determined subjects’ location among eight location types: office, hallway, conference room, elevator, bathroom, atrium, library, and outside.
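The passage names the sensing modalities (audio and light) but not the features or classifier, so the sketch below is an assumption: it compresses each window into a crude two-dimensional signature (audio RMS, mean light level) and labels it by a k-nearest-neighbour vote over previously labeled examples. The eWatch system's actual feature set and model may differ.

```python
import math

# The eight location types named in the text.
LOCATIONS = ["office", "hallway", "conference room", "elevator",
             "bathroom", "atrium", "library", "outside"]

def signature(audio, light):
    """Crude signature for one sensing window: audio RMS and mean light.
    (Assumed features; not the documented eWatch feature set.)"""
    rms = math.sqrt(sum(a * a for a in audio) / len(audio))
    return (rms, sum(light) / len(light))

def classify(sig, examples, k=3):
    """Majority vote among the k labeled examples nearest to `sig`.
    `examples` is a list of (signature, location) pairs."""
    nearest = sorted(examples, key=lambda e: math.dist(sig, e[0]))[:k]
    votes = [label for _, label in nearest]
    return max(set(votes), key=votes.count)

# Toy labeled data: offices are quiet and moderately lit, elevators loud
# and dim, outdoors bright (all values invented for illustration).
examples = [((0.10, 300.0), "office"), ((0.12, 310.0), "office"),
            ((0.90, 50.0), "elevator"), ((0.85, 60.0), "elevator"),
            ((0.20, 900.0), "outside")]
sig = signature([0.1, -0.1, 0.1, -0.1], [295, 305])
print(classify(sig, examples))  # quiet, ~300-lux window -> "office"
```

Even this two-feature signature separates acoustically and optically distinct places; distinguishing all eight location types would presumably require richer spectral and temporal features.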