Turning CT into 3D
Using the Microsoft HoloLens for teaching and research at Heidelberg University emerged as the logical next step, somewhere between Prof. Dr. Eberhard Scholz’s innovative spirit and one of our inspiring lectures.
Creating a lifelike holographic heart is, from a data scientist’s point of view, an easy-peasy mission. Myriads of computed tomography scans, routinely performed on patients with congenital heart defects, ventricular fibrillation and other conditions, provide seemingly endless data that just needs to be tapped. From a UX perspective, however, modelling the object is merely part of the process. After all, using the HoloLens for medical training only makes sense if we concurrently implement certain workflows that do not yet exist off the shelf.
How do we create a solution that is more than software, one that actually helps students and researchers work, teach and learn better? Can open AR standards be established? These and other questions guide us in the collaborative HoloHeart project.
A holographic heart
In a semi-automated workflow, we strip from the CT data everything that does not belong to the heart: bones, muscles, blood vessels and so on are of no interest to us. A bit of code here, a bit of math there, and hundreds of two-dimensional slices turn into a heart visible in all its 3D glory.
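The slice-by-slice cleanup can be sketched as a simple intensity window applied to the CT volume. The synthetic data and the Hounsfield-unit thresholds below are illustrative assumptions, not the project’s actual pipeline:

```python
import numpy as np

# Synthetic stand-in for a stack of CT slices in Hounsfield units (HU);
# a real pipeline would load a DICOM series instead.
rng = np.random.default_rng(42)
volume = rng.integers(-1000, 1500, size=(100, 64, 64))  # slices x rows x cols

# Assumed HU window for contrast-enhanced cardiac tissue; the exact
# thresholds here are an illustration, not a clinically validated range.
HU_MIN, HU_MAX = 150, 600
mask = (volume >= HU_MIN) & (volume <= HU_MAX)

# Everything outside the window is discarded; the surviving voxels are
# the raw material for a 3D surface reconstruction (e.g. marching cubes).
heart_voxels = np.where(mask, volume, HU_MIN - 1)

print(f"kept {int(mask.sum())} of {mask.size} voxels")
```

In practice the windowing step is only the crude first pass; the semi-automated part of the workflow is the manual correction of regions the thresholds cannot separate.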
Using a secure network, the Mixed Reality Toolkit (MRTK) and the newly available Model Loader, we have created a process in which the application can be loaded directly from the headset’s menu instead of relying on cloud-based infrastructure.
I will be holding you forever
It was love at first sight when Prof. Dr. Eberhard Scholz first experienced the HoloHeart. HoloLens on, systems on, and after a few “clicks” it appears: magnificent, colorful and very impressive. A heart in the middle of the room, within reach. The wearer can enlarge it, shrink it, rotate it, turn it upside down. Soon, individual segments will be able to be pulled out and viewed in isolation. In front of them stands the model of a real organ, based on CT data from a living person.
I'll keep it shining everywhere I go
Soon it will be possible to follow the lecturer handling the ventricle, aorta and arteries via laptop, tablet or smartphone. Particularly elusive anatomical structures, such as complex heart defects, become tangible, even in times of social distancing. The hands-on model helps students understand what is otherwise difficult to grasp, for example how close together high-risk areas on the heart lie. This is relevant not only for students, but also for interdisciplinary exchange, where positional relationships are evaluated and surgical interventions discussed.
For HoloLens wearers, reality does not disappear in the process. They can interact with the model in space without losing a sense of their physical surroundings. That is where the action is!
Simulations and 4D
As of December 2020, we are refining the multiplayer session, which will enable multiple HoloLens wearers to view the model simultaneously and make manipulations of the heart, such as extracting, rotating and scaling individual segments, visible to everyone.
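The idea behind the multiplayer session can be illustrated with a toy message format: each manipulation is serialized and broadcast, and every participant applies the same change to their local copy of the scene. The names and JSON structure below are hypothetical; the actual project builds on MRTK and Unity, not on this sketch:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical message describing one manipulation of a heart segment.
@dataclass
class TransformUpdate:
    segment: str      # e.g. "left_ventricle" (illustrative name)
    position: tuple   # x, y, z in meters
    rotation: tuple   # quaternion x, y, z, w
    scale: float

def encode(update: TransformUpdate) -> str:
    """Serialize a manipulation for broadcast over the network."""
    return json.dumps(asdict(update))

def apply_update(scene: dict, message: str) -> None:
    """Apply a received manipulation to a participant's local scene state."""
    update = json.loads(message)
    scene[update["segment"]] = update

# One wearer scales the aorta; every other headset replays the same message.
scene = {}
message = encode(TransformUpdate("aorta", (0.0, 1.2, 0.5), (0, 0, 0, 1), 1.5))
apply_update(scene, message)
```

The design choice this illustrates is state replication: rather than streaming video, only small transform messages travel over the network, so each headset renders the shared model locally.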
Ideas for creating new added value abound, for example virtually visualizing the propagation of excitation: comparable to a three-dimensional map, our HoloHeart could represent the spread of electrical excitation through the heart and simulate incisions.
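In its simplest form, excitation spread can be modeled as a wavefront expanding over a grid of excitable cells, with non-conducting tissue blocking the path. The grid, the scar line and the unit conduction speed below are toy assumptions for illustration only:

```python
from collections import deque
import numpy as np

# Toy 2D tissue map: 1 = excitable tissue, 0 = non-conducting (e.g. scar).
tissue = np.ones((5, 5), dtype=int)
tissue[2, 1:4] = 0  # hypothetical line of scar blocking conduction

def activation_times(tissue, start):
    """Breadth-first wavefront: the time step at which each cell is excited;
    -1 marks cells the excitation never reaches."""
    times = np.full(tissue.shape, -1)
    times[start] = 0
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < tissue.shape[0] and 0 <= nc < tissue.shape[1]
                    and tissue[nr, nc] == 1 and times[nr, nc] == -1):
                times[nr, nc] = times[r, c] + 1
                queue.append((nr, nc))
    return times

times = activation_times(tissue, (0, 0))  # excitation starts at one corner
```

Coloring the holographic model by these activation times would yield exactly the kind of three-dimensional “map” described above, and removing cells from the grid corresponds to simulating an incision.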
Integrating the fourth dimension, time, would also be highly valuable: the contraction of the muscle, the squeezing of the chambers and finally the outflow of blood. A model that holographically represents the heartbeat from start to finish, encompassing all four dimensions of visual perception.
In this project, we work with technologies at the cutting edge. Our HoloHeart has the potential to create great benefits for medicine, benefits that will need to be opened up to other specialties as well. You’re my brain, you’re my wits… well. Presumably other organs are also suitable as holograms, though not as song lyrics.