Physical-Virtual Patient Simulator
Figure: Examples of the physical-virtual patient simulator showing a stroke patient (left), a
measles patient (center), and a sepsis patient highlighting the technology underneath (right).
Interdisciplinary collaborators:
- Dr. Greg Welch, Dr. Gerd Bruder, Dr. Jason Hochreiter, Dr. Nahal Norouzi, Ryan Schubert (Institute for Simulation and Training, and Computer Science – University of Central Florida, Orlando FL)
- Dr. Laura Gonzalez (SentinelU, GA)
- Dr. Mindi Anderson, Dr. Desiree Diaz (College of Nursing – University of Central Florida, Orlando FL)
- Dr. Juan Cendan (College of Medicine – University of Central Florida, Orlando FL)
- Dr. Shiva Kalidindi (Nemours Hospital, Orlando FL)
Innovation Description:
In collaboration with a team of computer scientists and healthcare educators from the College of Nursing and the College of Medicine at the University of Central Florida, I developed a new class of Physical-Virtual Patient Simulators (PVPS). The PVPS consists of a mobile frame to which projectors, infrared cameras, infrared lights, speakers, tactile units (for pulse), and heater units (for temperature) are attached. The patient’s 3D shape is created using a semi-transparent acrylic shell, and the imagery is rear-projected onto the surface of the physical shell. The shell is covered with a silicone material that diffuses the imagery on the surface, allows for automated touch detection, and feels like skin in terms of texture and temperature. The virtual patient is modeled as a 3D character and programmed in the Unity game engine to respond visually and verbally via animations. The patient’s visuals (e.g., facial expressions, speech, blinking, pupil reactions, joint movements) are sent to the projectors and rear-projected onto the physical shell. Localized speakers output the patient’s speech and body sounds (e.g., heart sounds, lung sounds), and the pulse is driven by an audio file sent to an acoustic haptic device. The temperature is controlled using multiple small heaters mounted on the frame. The infrared cameras detect touch and send a signal to the simulation graphics engine to update the imagery in real time, allowing healthcare providers to perform touch-based assessments (e.g., capillary refill, tugging the lips/eyelids, testing sensation on one side of the patient).
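To make the touch-sensing pipeline concrete, below is a minimal illustrative sketch in Python with OpenCV of how infrared touch detection of this general kind can work: a baseline IR image is subtracted from the live frame, bright blobs are treated as touches, and their positions are forwarded to the graphics engine. This is not the actual PVPS code: the UDP endpoint, the send_touch_event helper, and the homography-based mapping to surface coordinates are simplifying assumptions (the real system maps touches onto a non-parametric 3D shell; see the Hochreiter et al. 2016 article below). The pulse_waveform function likewise only suggests how an audio signal could drive the acoustic haptic pulse unit.

import json
import socket

import cv2
import numpy as np

# Hypothetical UDP endpoint of the simulation graphics engine (e.g., a Unity scene).
ENGINE_ADDR = ("127.0.0.1", 9000)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def detect_touches(ir_frame, baseline, threshold=40, min_area=25):
    """Return centroids (in camera pixels) of blobs where a finger changes the IR image."""
    diff = cv2.absdiff(ir_frame, baseline)  # change relative to an empty-scene baseline
    _, mask = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))  # remove speckle
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    touches = []
    for c in contours:
        if cv2.contourArea(c) >= min_area:  # ignore tiny noise blobs
            m = cv2.moments(c)
            touches.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return touches

def send_touch_event(xy_camera, homography):
    """Map a camera-space touch to surface (u, v) coordinates and notify the engine.

    A homography is a planar simplification; the PVPS shell is a curved 3D surface,
    which requires a per-surface calibration instead.
    """
    pt = cv2.perspectiveTransform(np.array([[xy_camera]], dtype=np.float32), homography)[0][0]
    sock.sendto(json.dumps({"u": float(pt[0]), "v": float(pt[1])}).encode(), ENGINE_ADDR)

def pulse_waveform(bpm=80.0, seconds=1.0, rate=44100):
    """Synthesize a low-frequency beat signal an acoustic haptic device can render as a pulse."""
    t = np.arange(int(seconds * rate)) / rate
    beat_phase = (t * bpm / 60.0) % 1.0  # position within each beat, 0..1
    return np.sin(2 * np.pi * 60.0 * t) * np.exp(-20.0 * beat_phase)

In the actual simulator, detected touches would trigger the corresponding Unity animations in real time (e.g., blanching and refill of the skin imagery for a capillary-refill check).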
Impact:
We simulated healthy patients as well as multiple medical conditions including stroke, sepsis, child abuse, measles, burns, and others, and we conducted multiple studies using the PVPS with nursing and medical students. We found that nursing students learned better when they interacted with a physical-virtual patient than with a mannequin; they also had a higher sense of urgency and perceived higher authenticity. Across multiple studies, participants showed high acceptance of the technology, and it increased their confidence, sense of presence, and perception of realism (see the publications below for details). In the future, this technology could transition from basic research to being a core simulator that supplements existing simulations, especially for conditions that are otherwise hard to simulate. It can also represent diverse patients, enabling healthcare trainees to experience a variety of conditions in a short period of time while remaining immersed, without having to learn a new technology’s interface, since the interaction is no different from interacting with a human.
Funding: National Science Foundation
Patents, Publications, Presentations:
2021
Gregory Welch, Gerd Bruder, Salam Daher, Jason Hochreiter, Mindi Anderson, Laura Gonzalez, Desiree Diaz
Physical-Virtual Patient System Patent
US Patent Application 20210248926, 2021.
@patent{welch2021physical,
title = {Physical-Virtual Patient System},
author = {Gregory Welch and Gerd Bruder and Salam Daher and Jason Hochreiter and Mindi Anderson and Laura Gonzalez and Desiree Diaz},
url = {https://www.freepatentsonline.com/y2021/0248926.html},
year = {2021},
date = {2021-08-12},
urldate = {2021-08-12},
number = {20210248926},
issue = {US Patent App. 16/786,342},
abstract = {A patient simulation system for healthcare training is provided. The system includes one or more interchangeable shells comprising a physical anatomical model of at least a portion of a patient's body, the shell adapted to be illuminated from within the shell to provide one or more dynamic images viewable on the outer surface of the shells; wherein the system comprises one or more imaging devices enclosed within the shell and adapted to render the one or more dynamic images on an inner surface of the shell and viewable on the outer surface of the shells; one or more interface devices located about the patient shells to receive input and provide output; and one or more computing units in communication with the image units and interface devices, the computing units adapted to provide an interactive simulation for healthcare training. In other embodiments, the shell is adapted to be illuminated from outside the shell.},
keywords = {},
pubstate = {published},
tppubtype = {patent}
}
2020
Laura Gonzalez, Salam Daher, Gregory Welch
Neurological Assessment Using a Physical-Virtual Patient (PVP). Journal Article
In: Simulation and Gaming, vol. 51, iss. 6, pp. 802-818, 2020.
@article{gonzalez2020neurological,
title = {Neurological Assessment Using a Physical-Virtual Patient (PVP).},
author = {Laura Gonzalez and Salam Daher and Gregory Welch},
url = {https://journals.sagepub.com/doi/pdf/10.1177/1046878120947462},
doi = {10.1177/1046878120947462},
year = {2020},
date = {2020-08-12},
urldate = {2020-08-12},
journal = {Simulation and Gaming},
volume = {51},
issue = {6},
pages = {802-818},
abstract = {Background. Simulation has revolutionized teaching and learning. However, traditional manikins are limited in their ability to exhibit emotions, movements, and interactive eye gaze. As a result, students struggle with immersion and may be unable to authentically relate to the patient. Intervention. We developed a new type of patient simulator called the Physical-Virtual Patients (PVP) which combines the physicality of manikins with the richness of dynamic visuals. The PVP uses spatial Augmented Reality to rear project dynamic imagery (e.g., facial expressions, ptosis, pupil reactions) on a semi-transparent physical shell. The shell occupies space and matches the dimensions of a human head. Methods. We compared two groups of third semester nursing students (N=59) from a baccalaureate program using a between-participant design, one group interacting with a traditional high-fidelity manikin versus a more realistic PVP head. The learners had to perform a neurological assessment. We measured authenticity, urgency, and learning. Results. Learners had a more realistic encounter with the PVP patient (p=0.046), they were more engaged with the PVP condition compared to the manikin in terms of authenticity of encounter and cognitive strategies. The PVP provoked a higher sense of urgency (p=0.002). There was increased learning for the PVP group compared to the manikin group on the pre and post-simulation scores (p=0.027). Conclusion. The realism of the visuals in the PVP increases authenticity and engagement which results in a greater sense of urgency and overall learning.},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
Salam Daher, Jason Hochreiter, Ryan Schubert, Laura Gonzalez, Juan Cendan, Mindi Anderson, Desiree Diaz, Gregory Welch
Physical-Virtual Patient: A new patient simulator Journal Article
In: Simulation in Healthcare: Journal of the Society for Simulation in Healthcare, vol. 15, iss. 2, pp. 115-121, 2020.
@article{daher2020physical,
title = {Physical-Virtual Patient: A new patient simulator},
author = {Salam Daher and Jason Hochreiter and Ryan Schubert and Laura Gonzalez and Juan Cendan and Mindi Anderson and Desiree Diaz and Gregory Welch },
url = {https://journals.lww.com/simulationinhealthcare/fulltext/2020/04000/the_physical_virtual_patient_simulator__a_physical.9.aspx},
doi = {10.1097/SIH.0000000000000409},
year = {2020},
date = {2020-04-01},
urldate = {2020-04-01},
journal = {Simulation in Healthcare: Journal of the Society for Simulation in Healthcare},
volume = {15},
issue = {2},
pages = {115-121},
abstract = {Introduction: We introduce a new type of patient simulator referred to as the Physical-Virtual Patient Simulator (PVPS). The PVPS combines the tangible characteristics of a human-shaped physical form with the flexibility and richness of a virtual patient. The PVPS can exhibit a range of multisensory cues, including visual cues (eg, capillary refill, facial expressions, appearance changes), auditory cues (eg, verbal responses, heart sounds), and tactile cues (eg, localized temperature, pulse).
Methods: We describe the implementation of the technology, technical testing with healthcare experts, and an institutional review board–approved pilot experiment involving 22 nurse practitioner students interacting with a simulated child in 2 scenarios: sepsis and child abuse. The nurse practitioners were asked qualitative questions about ease of use and the cues they noticed.
Results: Participants found it easy to interact with the PVPS and had mixed but encouraging responses regarding realism. In the sepsis scenario, participants reported the following cues leading to their diagnoses: temperature, voice, mottled skin, attitude and facial expressions, breathing and cough, vitals and oxygen saturation, and appearance of the mouth and tongue. For the child abuse scenario, they reported the skin appearance on the arms and abdomen, perceived attitude, facial expressions, and inconsistent stories.
Conclusions: We are encouraged by the initial results and user feedback regarding the perceived realism of visual (eg, mottling), audio (eg, breathing sounds), and tactile (eg, temperature) cues displayed by the PVPS, and ease of interaction with the simulator.(Sim Healthcare 15:115–121, 2020)},
keywords = {},
pubstate = {published},
tppubtype = {article}
}
2019
Salam Daher, Jason Hochreiter, Ryan Schubert, Gerd Bruder, Laura Gonzalez, Juan Cendan, Mindi Anderson, Desiree Diaz, Gregory Welch
Matching vs. Non-Matching Visuals and Shape for Embodied Virtual Healthcare Agents Conference
IEEE Virtual Reality, Osaka, Japan, 2019, ISBN: 978-1-7281-1377-7.
@conference{daher2019matching,
title = {Matching vs. Non-Matching Visuals and Shape for Embodied Virtual Healthcare Agents},
author = {Salam Daher and Jason Hochreiter and Ryan Schubert and Gerd Bruder and Laura Gonzalez and Juan Cendan and Mindi Anderson and Desiree Diaz and Gregory Welch},
url = {https://ieeexplore.ieee.org/document/8797814},
doi = {10.1109/VR.2019.8797814},
isbn = {978-1-7281-1377-7},
year = {2019},
date = {2019-03-23},
urldate = {2019-03-23},
booktitle = {IEEE Virtual Reality},
pages = {886-887},
address = {Osaka, Japan},
abstract = {Embodied virtual agents serving as patient simulators are widely used in medical training scenarios, ranging from physical patients to virtual patients presented via virtual and augmented reality technologies. Physical-virtual patients are a hybrid solution that combines the benefits of dynamic visuals integrated into a human-shaped physical form that can also present other cues, such as pulse, breathing sounds, and temperature. Sometimes in simulation the visuals and shape do not match. We carried out a human-participant study employing graduate nursing students in pediatric patient simulations comprising conditions associated with matching/non-matching of the visuals and shape.},
keywords = {},
pubstate = {published},
tppubtype = {conference}
}
2018
Salam Daher, Laura Gonzalez, Jason Hochreiter, Nahal Norouzi, Gerd Bruder, Gregory Welch
Touch-Aware Intelligent Physical-Virtual Agents for Healthcare Simulation Conference
ACM Intelligent Virtual Agents, Sydney, Australia, 2018.
@conference{daher2018physical,
title = {Touch-Aware Intelligent Physical-Virtual Agents for Healthcare Simulation},
author = {Salam Daher and Laura Gonzalez and Jason Hochreiter and Nahal Norouzi and Gerd Bruder and Gregory Welch},
url = {https://dl.acm.org/doi/abs/10.1145/3267851.3267876},
doi = {10.1145/3267851.3267876},
year = {2018},
date = {2018-11-05},
urldate = {2018-11-05},
booktitle = {ACM Intelligent Virtual Agents},
pages = {99-106},
address = {Sydney, Australia},
abstract = {Conventional Intelligent Virtual Agents (IVAs) focus primarily on the visual and auditory channels for both the agent and the interacting human: the agent displays a visual appearance and speech as output, while processing the human's verbal and non-verbal behavior as input. However, some interactions, particularly those between a patient and healthcare provider, inherently include tactile components. We introduce an Intelligent Physical-Virtual Agent (IPVA) head that occupies an appropriate physical volume; can be touched; and via human-in-the-loop control can change appearance, listen, speak, and react physiologically in response to human behavior. Compared to a traditional IVA, it provides a physical affordance, allowing for more realistic and compelling human-agent interactions. In a user study focusing on the neurological assessment of a simulated patient showing stroke symptoms, we compared the IPVA head with a high-fidelity touch-aware mannequin that has a static appearance. Various measures of the human subjects indicated greater attention, affinity for, and presence with the IPVA patient, all factors that can improve healthcare training.},
keywords = {},
pubstate = {published},
tppubtype = {conference}
}
Jason Hochreiter, Salam Daher, Gerd Bruder, Gregory Welch
Cognitive and Touch Performance Effects of Mismatched 3D Physical and Visual Perceptions Conference
IEEE Virtual Reality, Reutlingen, Germany, 2018, ISBN: 978-1-5386-3365-6.
@conference{hochreiter2018cognitive,
title = {Cognitive and Touch Performance Effects of Mismatched 3D Physical and Visual Perceptions},
author = {Jason Hochreiter and Salam Daher and Gerd Bruder and Gregory Welch},
url = {https://ieeexplore.ieee.org/document/8446574},
doi = {10.1109/VR.2018.8446574},
isbn = {978-1-5386-3365-6},
year = {2018},
date = {2018-03-18},
urldate = {2018-03-18},
booktitle = {IEEE Virtual Reality},
pages = {1-386},
address = {Reutlingen, Germany},
abstract = {While research in the field of augmented reality (AR) has produced many innovative human-computer interaction techniques, some may produce physical and visual perceptions with unforeseen negative impacts on user performance. In a controlled human-subject study we investigated the effects of mismatched physical and visual perception on cognitive load and performance in an AR touching task by varying the physical fidelity (matching vs. non-matching physical shape) and visual mechanism (projector-based vs. HMD-based AR) of the representation. Participants touched visual targets on four corresponding physical-visual representations of a human head. We evaluated their performance in terms of touch accuracy, response time, and a cognitive load task requiring target size estimations during a concurrent (secondary) counting task. After each condition, participants completed questionnaires concerning mental, physical, and temporal demands; stress; frustration; and usability. Results indicated higher performance, lower cognitive load, and increased usability when participants touched a matching physical head-shaped surface and when visuals were provided by a projector from underneath.},
keywords = {},
pubstate = {published},
tppubtype = {conference}
}
2017
Salam Daher
Optical See-Through vs. Spatial Augmented Reality Simulators for Medical Applications Presentation
IEEE Virtual Reality, March 1, 2017.
@misc{daher2017optical1,
title = {Optical See-Through vs. Spatial Augmented Reality Simulators for Medical Applications},
author = {Salam Daher},
note = {PDF available upon request},
year = {2017},
date = {2017-03-01},
urldate = {2017-03-01},
howpublished = {IEEE Virtual Reality},
keywords = {},
pubstate = {published},
tppubtype = {presentation}
}
Salam Daher
Optical See-Through vs. Spatial Augmented Reality Simulators for Medical Applications Conference
IEEE Virtual Reality, Los Angeles, CA, 2017.
@conference{daher2017optical2,
title = {Optical See-Through vs. Spatial Augmented Reality Simulators for Medical Applications},
author = {Salam Daher},
note = {PDF available upon request},
year = {2017},
date = {2017-03-01},
urldate = {2017-03-01},
booktitle = {IEEE Virtual Reality},
address = {Los Angeles, CA},
keywords = {},
pubstate = {published},
tppubtype = {conference}
}
2016
Jason Hochreiter, Salam Daher, Arjun Nagendran, Laura Gonzalez, Gregory Welch
Optical Touch Sensing on Non-Parametric Rear-Projection Surfaces for Interactive Physical-Virtual Experiences Journal Article
In: Presence: Teleoperators and Virtual Environments, 2016.
@article{hochreiter2016optical,
title = {Optical Touch Sensing on Non-Parametric Rear-Projection Surfaces for Interactive Physical-Virtual Experiences},
author = {Jason Hochreiter and Salam Daher and Arjun Nagendran and Laura Gonzalez and Gregory Welch},
year = {2016},
date = {2016-07-01},
urldate = {2016-07-01},
journal = {Presence: Teleoperators and Virtual Environments},
keywords = {},
pubstate = {published},
tppubtype = {article}
}