Motivation

Cadaver dissections, plasticized models, two-dimensional (2-D) diagrams, and didactic lectures: these are the primary tools of traditional anatomy education. However, these methods have limitations and are not always sufficient to help students build a visual-spatial understanding of anatomical structures. For example, 2-D diagrams lose an entire dimension of detail and do not allow for the observation of the motion of muscles and nerves in three-dimensional (3-D) space (Figure 1). Virtual reality (VR) applications for supplemental anatomy education have been one of several new trends to emerge in education technology.1
VR is a strong 3-D platform on which students can practice direct manipulation of complex anatomical structures.2 In addition, VR provides a highly motivational learning environment while supporting the learning of those with lower visuospatial ability.3 To fill a gap in traditional anatomy education and build on VR as an education technology, our research explores the efficacy of VR as a platform for anatomy education.
Visualizing the routes of neuronal signaling through 3-D space is particularly difficult with traditional teaching methods.4 The relationship between nerves and muscles is complex, and nerve severance produces varied effects on the mechanics of a leg depending on where a nerve is damaged. Currently, the field of veterinary medicine is limited to understanding the underlying mechanics of muscle systems rather than observing nerves and their muscles in motion. For our creative research project, we built “InNervate VR,” a VR program that allows a user to interact with and innervate (supply nerves to) a canine thoracic limb. The user is able to sever certain nerves on the limb and view how the nerve damage affects the leg’s range of motion. This tool explores the possibilities of VR and seeks to improve upon existing methods of higher-level anatomy education.
Figure 1. 2-D anatomical diagrams do not support a 3-D understanding of anatomical structures nor allow for the observation of motion.
Background
Since its invention, VR has been noted for its potential in education and training. Many case studies have tested the usability of VR specifically for anatomy education. VR can provide engaging and intuitive environments for learning visually and spatially complex topics such as human anatomy, biochemistry, and molecular biology. One of the key reasons VR can surpass traditional methods of instruction is that it enables users to move beyond “real-world” experiences by interacting with or altering virtual objects in ways that would otherwise be difficult or impossible.5,6
While there is much evidence supporting VR as a viable supplement to anatomy education, most studies evaluating the feasibility of VR educational tools conclude that it is not yet ready to replace traditional methods of teaching.7 The downsides of VR for training and education include cost, acceptance in the medical community, and technological limitations such as achieving realism. Furthermore, many studies comparing traditional education methods with new media-based methods conclude that while users who tested the new media methods gave overall positive feedback on the experience, they did not score significantly higher on evaluation exams than users who followed traditional education methods.8 Most VR anatomy programs take advantage of the medium’s 3-D spatialization of anatomical structures, but their interaction is limited to switching visibility, adding and subtracting layers, manipulating the translation of organs, bones, muscles, and other structures, and viewing labels to learn about the spatial relationships of the anatomical structure.9,10,11,12
A literature review of existing studies on VR teaching tools shows that the most successful VR programs are those that take advantage of the medium’s unique capabilities to illuminate a topic that is difficult to teach using traditional methods. Through our research, we seek to push the level of dynamic interaction beyond what previous VR anatomy applications permitted by allowing the user to change the animation and movement of the 3-D models in their environment (Figure 2).
Exhibition
Our creative work exhibition demonstrated a user’s experience in two distinct environments—the external environment and the internal environment (Figure 3).
The first component of our exhibit, the external environment, is everything a user sees when they enter the Visualization Immersion Reality Lab (VIRL) at Texas A&M University. This stage of our exhibit is purely functional and includes all the necessary systems and components to utilize VR:
- HTC Vive headset, sensors, and controllers
- Structure built around the system delineating the area in which the user can move freely
- PC and associated systems that run the application on the headset
The second component of our exhibit, the internal environment, is the appearance of the exhibit through the lens of the HTC Vive headset. When using a VR application, the user’s physical world is completely replaced with a rendered, computer-generated scene. Because the environment is entirely computer-generated, we had complete agency in designing the user’s experience within the application. The main focus of our exhibit is the interaction between the user and the thoracic limb model, and we made our design decisions to facilitate this interaction.
The application’s environment is a simple room, largely devoid of defining features. This simple layout reduces the distractions that the newness of VR can cause for some users. To direct the user’s attention to the learning target, the canine forelimb has the most saturated colors in the environment and sits atop a visually distinct pedestal structure in the center of the room. During the interaction, the user is able to move around the space and view the model around the pedestal’s full circumference. In addition to the physical components of our virtual environment, a user interface is needed to help the user understand the different ways they can interact with the environment around them.
User Interface
We present our user interface (UI) on two floating panels situated on the left and right sides of the leg, pushed slightly farther back in space than the leg. This way, the user never loses sight of the leg while utilizing the interface. The most essential interaction within our program is the user’s ability to define a location along a nerve, sever the nerve at that location, and observe how the nerve damage affects the leg’s range of movement. Our solution for facilitating this interaction is to provide the user with a “Play Animation” button paired with a “Reset” button. When pressed, the “Play Animation” button activates the thoracic limb to move through its full, healthy range of motion and come to a stop after two cycles. Each time the user damages a nerve, they can press “Play Animation” to view the resulting motion. If they would like to undo all damage and return the leg to a healthy state, they can press the “Reset” button. To cut a nerve, the user simply points at a spot along a nerve and presses the trigger of their hand controller. To better aid the user in understanding the results of their interactions, we have implemented visual feedback that turns the nerves and affected muscles red when damaged.
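The core of this interaction can be summarized in a short sketch. The Python below is illustrative only, with hypothetical class and method names; the actual application implements this behavior inside its VR engine. Pointing at a nerve and pulling the trigger corresponds to sever(), while the two UI buttons correspond to play_animation() and reset().

```python
from dataclasses import dataclass, field

@dataclass
class Muscle:
    name: str
    damaged: bool = False              # rendered red when True

@dataclass
class Nerve:
    name: str
    muscles: list                      # the muscles this nerve innervates
    cuts: list = field(default_factory=list)

    def sever(self, location):
        """Record a cut where the user pointed and pulled the trigger."""
        self.cuts.append(location)
        for m in self.muscles:
            m.damaged = True           # visual feedback: turn red

class ThoracicLimb:
    def __init__(self, nerves):
        self.nerves = nerves

    def play_animation(self, cycles=2):
        """Animate the gait cycle; denervated muscles no longer contract."""
        active = [m.name for n in self.nerves
                  for m in n.muscles if not m.damaged]
        print(f"Playing {cycles} cycles with active muscles: {active}")

    def reset(self):
        """The 'Reset' button: undo all damage and restore healthy motion."""
        for n in self.nerves:
            n.cuts.clear()
            for m in n.muscles:
                m.damaged = False

# Example: sever a (hypothetically simplified) musculocutaneous nerve.
biceps = Muscle("biceps brachii")
musculocutaneous = Nerve("musculocutaneous", muscles=[biceps])
limb = ThoracicLimb([musculocutaneous])
musculocutaneous.sever(location=0.4)   # trigger pull at a cut marker
limb.play_animation()                  # the biceps no longer contracts
limb.reset()                           # restore the healthy limb
```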
We provide the user with a third UI element: a menu listing the available nerves. Selecting a nerve shows small markers along its model, revealing to the user where cuts can be made. Another key UI element is a slider that lets the user decrease or increase the opacity of the leg’s muscle models, allowing a better view of any canine nerves obscured within the muscles.
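As a sketch of how the opacity slider might work, the handler below clamps the slider position to the range [0, 1] and applies it uniformly as the alpha of every muscle material. The Material class here is a stand-in for the engine’s material type, not the application’s actual API.

```python
class Material:
    """Stand-in for an engine material; tracks only an alpha value."""
    def __init__(self, alpha=1.0):
        self.alpha = alpha

def on_opacity_slider_changed(value, muscle_materials):
    # Clamp the slider position to [0, 1] and apply it uniformly, so the
    # nerves obscured inside the muscles emerge as the alpha decreases.
    alpha = max(0.0, min(1.0, value))
    for material in muscle_materials:
        material.alpha = alpha
```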
AR Findings

Collaborative research between the Department of Veterinary Medicine and Biomedical Sciences and the Department of Visualization is an ongoing endeavor, and tools such as InNervate VR fall under an umbrella of creative works known as Creative Anatomy. Within this sphere of development, InNervate VR has a related prototype augmented reality application, InNervate AR. Augmented reality (AR) is technology that superimposes computer-generated images over the user’s view of the real world. AR typically displays its imagery on hand-held devices such as phones, unlike the less accessible headsets used by VR. A frequent critique of InNervate AR was the application’s limited ability to compare a damaged limb with a healthy limb exhibiting a full range of motion. This feedback led us to add an option in InNervate VR that lets the user toggle a second limb on or off, one that always plays the healthy range-of-motion animation. The toggle allows a user to view the damaged limb side by side with a healthy limb for comparison. The prior AR work also shaped our planned interaction through the addition of rotation features in the user interface. Even though a user is free to move around our thoracic limb asset while in the VR application, testing of the AR application showed that many users prefer to remain relatively stationary. By providing users with the option to rotate the leg, we ensure that they can view the limb in 3-D even if they choose to stand in place. Both the previous AR work and InNervate VR seek to fill a similar deficit in anatomy education, and their comparison may influence future research under Creative Anatomy.
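A minimal sketch of these two controls, again with hypothetical names, might look like the following; the actual application wires them to the engine’s visibility and transform systems.

```python
class ExhibitControls:
    def __init__(self):
        self.healthy_limb_visible = False   # second, always-healthy limb
        self.rotation_deg = 0.0             # yaw applied to the limb model

    def toggle_healthy_limb(self):
        # Show or hide the side-by-side healthy limb for comparison.
        self.healthy_limb_visible = not self.healthy_limb_visible

    def rotate(self, delta_deg):
        # Let a stationary user turn the model instead of walking around it.
        self.rotation_deg = (self.rotation_deg + delta_deg) % 360.0
```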
Reflection
During our public presentation of InNervate VR, users agreed that we did not include a sufficient introduction to our virtual environment. Users who were less familiar with VR platforms did not know what to do upon entering the environment; they almost always needed verbal instructions from us explaining the interactions available. Similarly, users who were unfamiliar with anatomy were unsure of exactly what they were observing on the canine leg.
To remedy both issues, we implemented a tutorial section in our VR application. The tutorial introduces users to the space by encouraging them to click on certain buttons in a pre-defined order. This not only reveals the interactions possible in InNervate VR but also allows new VR users to become comfortable with the SteamVR hand controllers. Our tutorial also includes a labeling section that provides the names of each muscle, bone, and nerve on our canine thoracic limb. We hope this labeling module will serve as a teaching tool for non-anatomy students as well as a refresher for students with some anatomy experience. Most major changes we implemented after receiving feedback were related to user interaction. If we were to modify our VR program, we would solidify our desired user interaction at an earlier stage of development. Because user interaction is critical to the success of VR, it would have benefited us to pinpoint potential interaction problems, such as the minimal introduction to the virtual environment, earlier.
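The tutorial’s fixed button order can be modeled as a simple step sequence; the sketch below uses hypothetical step names and advances only when the expected button is pressed.

```python
class Tutorial:
    def __init__(self, steps=("Play Animation", "Reset", "Nerve Menu")):
        self.steps = list(steps)   # hypothetical, pre-defined button order
        self.index = 0             # next expected step

    def on_button_pressed(self, button_name):
        # Advance only on the expected button; ignore all other presses.
        if self.index < len(self.steps) and button_name == self.steps[self.index]:
            self.index += 1
        return self.index >= len(self.steps)   # True once complete
```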
We believe InNervate VR is a promising addition to the higher-level anatomy classroom. Ideally, it will be implemented as a lab activity: students first receive traditional lectures covering innervation in the canine thoracic limb, and afterward they enter our virtual environment to truly visualize the complex concept of innervation. Not only will the students’ interaction with the canine limb solidify their understanding of their anatomy lectures, but the application will also let them view innervation in its true 3-D form. InNervate VR will help students better understand concepts for their exams and reinforce anatomy students’ holistic understanding of innervation, which will carry on to support them in their professions. Feedback from both anatomy professors and students supports that InNervate VR illustrates a topic that is traditionally challenging for students to comprehend. The professors also indicated that we achieved our goal of accurately representing the movement and texture of the muscles that make up our limb. As demand for visualization tools in the medical community increases, and as the technology becomes more affordable and accessible, InNervate VR and any InNervate projects that follow it will greatly enhance the teaching of innervation and other spatially complex topics.
Acknowledgments
We would like to thank our principal investigator, Dr. Jinsil Seo, our graduate advisor Margaret Cook, and our 3-D technical advisors Caleb Kicklighter and Austin Payne for their guidance throughout this research. We would also like to thank the Department of Veterinary Medicine and Biomedical Sciences for their continued collaboration with the Department of Visualization.
Thank you also to the Department of Visualization and the Visualization Immersion Reality Lab (VIRL) for providing the necessary equipment and resources to carry out this research.
References
C. Moro, Z. Štromberga, A. Raikos, and A. Stirling, “The effectiveness of virtual and augmented reality in health sciences and medical anatomy,” Anatomical Sciences Education 10, no. 6 (2017), https://doi.org/10.1002/ase.1696.
S. Jang, J. M. Vitale, R. W. Jyung, and J. B. Black, “Direct manipulation is better than passive viewing for learning anatomy in a three-dimensional virtual reality environment,” Computers and Education 106 (2017): 150–65, https://doi.org/10.1016/j.compedu.2016.12.009.
K. Stepan, J. Zeiger, S. Hanchuk, A. Del Signore, R. Shrivastava, S. Govindaraj, and A. Iloreta, “Immersive virtual reality as a teaching tool for neuroanatomy,” International Forum of Allergy and Rhinology 7, no. 10 (2017): 1006–13, https://doi.org/10.1002/alr.21986.
N. Heise, H. A. Hall, B. A. Garbe, C. M. Eitel, and T. R. Clapp, “A Virtual Learning Modality for Neuroanatomical Education,” The FASEB Journal (April 20, 2018).
H. Hoffman and D. Vu, “Virtual Reality: Teaching Tool of the Twenty-First Century?” Academic Medicine 72, no. 12 (1997): 1076–81.
H. S. Maresky, A. Oikonomou, I. Ali, N. Ditkofsky, M. Pakkal, and B. Ballyk, “Virtual reality and cardiac anatomy: Exploring immersive three-dimensional cardiac imaging, a pilot study in undergraduate medical anatomy education,” Clinical Anatomy 32, no. 2 (March 2019): 238–243, https://doi.org/10.1002/ca.23292.
S. A. Azer and S. Azer, “3D Anatomy Models and Impact on Learning: A Review of the Quality of the Literature,” Health Professions Education 2, no. 2 (2016): 80–98, https://doi.org/10.1016/j.hpe.2016.05.002.
A. M. Codd and B. Choudhury, “Virtual Reality Anatomy: Is It Comparable with Traditional Methods in the Teaching of Human Forearm Musculoskeletal Anatomy?” Anatomical Sciences Education 4, no. 3 (2011): 119–25.
M. Fairén, M. Farrés, J. Moyés, and E. Insa, “Virtual Reality to Teach Anatomy,” EUROGRAPHICS (2017).
Stepan, “Immersive virtual reality as a teaching tool,” 1006–13.
Jang, “Direct manipulation,” 150–65.
D. T. Nicholson, C. Chalk, W. R. J. Funnell, and S. J. Daniel, “Can virtual reality improve anatomy education? A randomised controlled study of a computer-generated three-dimensional anatomical ear model,” Medical Education 40, no. 11 (2006): 1081–8, https://doi.org/10.1111/j.1365-2929.2006.02611.x.
Amber May Ackley ’19

Amber May Ackley ’19 is a Visualization major with a minor in Computer Science from Plano, Texas, who attended Liberty High School. Amber’s passion for art and computer graphics has led her to pursue new media projects such as the educational anatomy application InNervate VR as an Undergraduate Research Scholar. After graduation, she plans to pursue a career as a Rigging Artist in order to further explore biological motion and performance in CG characters.
Karla I. Chang Gonzalez ’19

Karla I. Chang Gonzalez ’19 is a Visualization major from El Paso, Texas, with a passion for storytelling and technical problem solving. During her undergraduate career, Karla served as President of TAMU ACM SIGGRAPH and volunteered at two national SIGGRAPH conferences. She also worked as a graphic designer and editor for Explorations during her freshman year of college. Karla’s long-term career goal is to work on animated feature films.