Interpret and represent conversational multimodal behavioral data in an intuitive, educational format using data visualization techniques.
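As an illustration of this contribution (not the thesis's actual feedback interface), the following minimal Python sketch plots two hypothetical per-second behavioral tracks, smile intensity and loudness, so a user can scan a mock interview at a glance; the feature arrays are synthetic placeholders standing in for real extracted data.

```python
"""Sketch: visualizing multimodal behavioral features over a mock interview.

Assumes per-second feature tracks (smile intensity, loudness) have already
been extracted; the arrays below are synthetic placeholders.
"""
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)
seconds = np.arange(120)  # a two-minute mock interview, sampled once per second
smile = np.clip(0.5 + 0.2 * np.sin(seconds / 10) + rng.normal(0, 0.05, 120), 0, 1)
loudness = np.clip(0.4 + rng.normal(0, 0.1, 120), 0, 1)

fig, (ax1, ax2) = plt.subplots(2, 1, sharex=True, figsize=(8, 4))
ax1.plot(seconds, smile, color="tab:orange")
ax1.set_ylabel("Smile intensity")
ax2.plot(seconds, loudness, color="tab:blue")
ax2.set_ylabel("Loudness")
ax2.set_xlabel("Time (s)")
fig.suptitle("Behavioral features across a mock interview (synthetic data)")
plt.tight_layout()
plt.show()
```

Aligning all feature tracks on a shared time axis is one simple way to make raw multimodal measurements legible to a non-expert reviewing their own performance.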
To validate the research hypothesis, MACH (My Automated Conversation coacH), an embodied 3D character, was designed. MACH is able to “see”, “hear”, and “respond” in real time through a webcam and a microphone on an ordinary laptop. The experiment was contextualized for job interviews: MACH played the role of the interviewer, asked interview questions, and, at the end, provided feedback. The effectiveness of MACH was assessed through a weeklong trial with 90 MIT undergraduates. Students who interacted with MACH were rated by human experts as having improved in overall interview performance and in expressing excitement about the job, and were more likely to be recommended for the position, while the ratings of students in the control groups did not improve.
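To make the “see” channel concrete, here is a minimal sketch, assuming OpenCV, of how an ordinary laptop webcam can feed frames to a per-frame analysis step in real time; this is not MACH's actual pipeline, and analyze_frame is a hypothetical placeholder for the facial-feature extraction a real system would perform.

```python
"""Minimal sketch of a real-time webcam capture loop (assumes OpenCV)."""
import cv2


def analyze_frame(frame) -> None:
    """Placeholder: a real system would run a face tracker or expression
    classifier (e.g., smile detection) on each frame here."""
    pass


cap = cv2.VideoCapture(0)  # default laptop webcam
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        analyze_frame(frame)
        cv2.imshow("webcam", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to stop
            break
finally:
    cap.release()
    cv2.destroyAllWindows()
```

A production coach would add the “hear” channel (microphone capture and prosody analysis) and drive the 3D character's responses from the combined feature stream.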
Findings from this thesis could open up new interaction possibilities for helping people with public speaking, social-communicative difficulties, language learning, or even dating!