Smart reasoning systems to analyse human behaviour
Organisers: Rianne Kaptein and Wessel Kraaij (TNO)
Schedule: Wednesday 27th August, 10:50 - 12:30, Pomonazaal
An ever-growing range of sensory devices has reached the market: Kinect sensors that capture body posture, cameras that read facial expressions, body sensors that measure heart rate or skin conductance, computer logging that records on-screen activity, and smartphone sensors such as microphones, accelerometers, and 'fingerprints' of Wi-Fi access points. Virtually every aspect of a person's daily life can now be monitored. Lifelogging is the practice of tracking personal data generated by our own behavioural activities, such as exercising, sleeping, and eating. The Quantified Self movement goes a step further and uses the collected data to improve health and lifestyle.
User-centric sensing and reasoning techniques can improve the efficiency and acceptability of applications for physical and mental well-being and well-working. Making such applications adaptive and intuitive, so that they provide personalised information and coaching at the right moment, requires context information such as location, behaviour, mood, activity, and weather conditions. It also requires advanced distributed reasoning algorithms that can cope with the heterogeneity of sensed context information, with partial context information, and with resource limitations when processing the collected data.
Well-being applications at work and at home are expected to help people continue contributing to society, the marketplace, and the economy: for example, by allowing elderly people to live at home, by enabling work from different locations and thus a better combination of work and private life, by monitoring and preventing work-related stress, or by improving lifestyle, and thereby health, increasing the productivity of employees. Furthermore, these applications may help to contain the rising costs of stress-related diseases, burnout, chronic disease, and ill health.
|Time|Presenters|Title|
|---|---|---|
|10:50-11:10|Marianna Munafò, Daniela Palomba and Marco Casarotti (Inside S.r.l.)|Biofeedback protocols for stress management in high-performance work environments|
|11:10-11:30|Matei Capatu, Georg Regal, Johann Schrammel, Elke Mattheiss, Mark Kramer, Nikolaos Batalas and Manfred Tscheligi (CURE - Center for Usability Research & Engineering)|Context- and Time-Triggered In-Situ Questionnaires on a Mobile Phone|
|11:30-11:50|Tengqi Ye, Rami Albatal and Cathal Gurrin (Dublin City University)|Real-time Lifelogging for Behavioural Analysis using Google Glass|
|11:50-12:10|Yorick Holkamp and John Schavemaker (Delft University of Technology)|A Comparison of Human and Machine Learning-based Accuracy for Valence Classification of Subjects in Video Fragments|
|12:10-12:30|John Schavemaker, Saskia Koldijk, Leon Wiertz, Suzan Verberne, Maya Sappelli, Rianne Kaptein and Erik Boertjes (TNO)|Fishualization: a group feedback display|