Rapid Response

Role: Team member, advising on the use of EDA as a measure of participant response to simulations.

 

Project Description

The Rapid Response Training System (RRTS) was designed to train nurses to recognize the signs and symptoms of rapid deterioration in patients. The RRTS simulates the daily duties of a nurse, including visiting multiple patients (typically four times a day), gathering the patients' vital signs, and reporting them in an Electronic Health Record (EHR) system. To provide this experience, our system is set up in a dual-screen configuration. A large-screen display shows the virtual hospital environment, including the patient and the various instruments present in a general ward, in a life-size view. The second display presents the EHR used to record the quantitative and qualitative vital signs gathered while interacting with the virtual patient.

 

Virtual humans used in the RRTS have been modeled after real-life patients who underwent rapid deterioration over the course of a nurse's shift. The signs and symptoms of deterioration were carefully modeled and animated in our virtual patients with the help of medical experts. Several modes of interaction with the patient are provided: participants can ask a set of pre-defined, medically relevant questions (via a dialogue box) or use one of the instruments in the patient's environment to measure the patient's vital signs, and then record their observations in the simulated EHR system. In this research, the RRTS served as a rich experimental platform for empirically examining how factors associated with the appearance of the virtual human can affect users' emotional states and responses.
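To make the measure-then-chart interaction loop concrete, the sketch below models a simulated EHR entry in Python. It is an illustrative assumption only, not the actual RRTS implementation; names such as VitalSigns, EHRRecord, and record_observation are hypothetical.

```python
# Illustrative sketch of the "measure vital signs, then chart them in the
# simulated EHR" loop described above. Class and field names are
# hypothetical, not the actual RRTS code.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class VitalSigns:
    heart_rate_bpm: int
    resp_rate_bpm: int
    systolic_bp_mmhg: int
    spo2_percent: float
    temperature_c: float

@dataclass
class EHRRecord:
    patient_id: str
    entries: List[dict] = field(default_factory=list)

    def record_observation(self, visit: int, vitals: VitalSigns, notes: str = "") -> None:
        """Chart one visit's quantitative vitals plus qualitative observations."""
        self.entries.append({
            "visit": visit,                     # e.g., one of four visits per simulated shift
            "time": datetime.now().isoformat(),
            "vitals": vitals,
            "notes": notes,                     # qualitative findings from dialogue or examination
        })

# Example: charting the second of four visits during a simulated shift.
ehr = EHRRecord(patient_id="virtual-patient-01")
ehr.record_observation(
    visit=2,
    vitals=VitalSigns(112, 24, 92, 93.5, 38.4),
    notes="Patient reports shortness of breath; appears pale.",
)
```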

 

Realistic versus stylized depictions of virtual humans in simulated interpersonal situations, and their ability to elicit emotional responses in users, have been an open question for artists and researchers alike. We empirically evaluated the effects of a near visually realistic versus a non-realistic, stylized appearance of virtual humans on the emotional responses of participants in a medical virtual reality system designed to educate users in recognizing the signs and symptoms of patient deterioration. In a between-subjects experiment protocol, participants interacted with one of three different appearances of a virtual patient, namely visually realistic, cartoon-shaded, and charcoal-sketch-like conditions, in a mixed reality simulation. Emotional impact was measured via a combination of quantitative objective measures gathered using skin Electrodermal Activity (EDA) sensors and quantitative subjective measures such as the Differential Emotion Survey (DES IV), the Positive and Negative Affect Schedule (PANAS), and a Social Presence questionnaire. The emotional states of the participants were analyzed across four distinct time steps during which the medical condition of the virtual patient deteriorated (an emotionally stressful interaction), and were contrasted with a baseline affective state. Objective EDA results showed that in all three conditions, male participants exhibited greater levels of arousal than female participants. We found that negative affect levels were significantly lower in the visually realistic condition than in the stylized appearance conditions. Furthermore, in the emotional dimensions of interest-excitement, surprise, anger, fear, and guilt, participants in all conditions responded similarly. However, in the social-emotional constructs of shyness, presence, perceived personality, and enjoyment-joy, participants responded differently in the visually realistic condition as compared to the cartoon and sketch conditions. Our study suggests that virtual human appearance can affect not only critical emotional reactions in affective interpersonal training scenarios, but also users' perceptions of the personality and social characteristics of virtual interlocutors.
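As a simple illustration of the kind of baseline-contrasted EDA analysis described above, the sketch below computes a baseline-corrected mean skin conductance level for each of four time steps. This is a hypothetical NumPy example, not the study's actual pipeline; the sampling rate, window boundaries, and the mean-level metric are assumptions.

```python
# Illustrative sketch: baseline-corrected mean EDA per deterioration time step.
# NOT the study's actual analysis; sampling rate and windows are assumptions.
import numpy as np

FS_HZ = 4  # assumed EDA sampling rate (samples per second)

def mean_eda(signal: np.ndarray, start_s: float, end_s: float) -> float:
    """Mean skin conductance level over a window given in seconds."""
    return float(np.mean(signal[int(start_s * FS_HZ):int(end_s * FS_HZ)]))

def arousal_per_step(eda: np.ndarray, baseline: tuple, steps: list) -> list:
    """Baseline-corrected mean EDA for each time step of the interaction."""
    base = mean_eda(eda, *baseline)
    return [mean_eda(eda, start, end) - base for (start, end) in steps]

# Example with synthetic data: a 20-minute recording, a 2-minute baseline,
# and four equal segments of increasing patient deterioration.
rng = np.random.default_rng(0)
eda_trace = np.cumsum(rng.normal(0, 0.01, 20 * 60 * FS_HZ)) + 5.0  # microsiemens
baseline_window = (0, 120)
time_steps = [(120 + i * 270, 120 + (i + 1) * 270) for i in range(4)]
print(arousal_per_step(eda_trace, baseline_window, time_steps))
```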

 

Publications

  1. Volonte, M., Babu, S.V., Chaturvedi, H., Newsome, N., Ebrahimi, E., Roy, T., Daily, S.B., Fasolino, T. (to appear). Effects of Virtual Human Appearance Fidelity on Emotion Contagion in Affective Inter-Personal Simulations. Proceedings of IEEE Virtual Reality, Greenville, SC.

  2. Newsome, N., Chaturvedi, H., Babu, S.V., Luo, J., Ebrahimi, E., Roy, T., Daily, S.B., Fasolino, T. (2015, March). Comparative Evaluation of Stylized versus Realistic Representation of Virtual Humans on Users’ Emotional Responses in Simulated Interpersonal Experiences. Presented at the meeting of the IEEE International Conference on Virtual Reality, Arles, France.

  3. Wu, Y., Babu, S. V., Armstrong, R., Bertrand, J. W., Luo, J., Roy, T., Daily, S.B., Dukes, L.C., Hodges, L.F., Fasolino, T. (2014). Effects of virtual human animation on emotion contagion in simulated inter-personal experiences. IEEE Transactions on Visualization and Computer Graphics, 20(4), 626-635.