Automatic Multimodal Affect Detection for Research and Clinical Use
This project is a subaward with the University of Pittsburgh.

For behavioral science, automated coding of affective behavior from multimodal (visual, acoustic, and verbal) input will provide researchers with powerful tools to examine basic questions in emotion, psychopathology, and interpersonal processes. For clinical use, automated measurement will help clinicians assess vulnerability and protective factors, as well as response to treatment, for a wide range of disorders.

Investigators

Principal Investigator, ORI

Jeff Cohn
Principal Investigator, University of Pittsburgh
Project Start Date

08/01/2017

Project End Date

04/30/2024

Funding Agency

National Institute of Mental Health (NIMH)

Current Status

Active, not recruiting