Development: Unity Engine, Visual Studio, C#, TFS (Team Foundation Server)
Duration: 4 Months
Learning Heart
A study aid and teaching tool that can be used individually or collaboratively by multiple HoloLens users. Users can grasp the material like never before with this immersive experience.
When SphereGen's Mixed Reality Studio got its first project, our team was relatively small (I was the only developer), so the majority of the work for this project fell to me.
These tasks included:
Working with the doctors, professors, and SMEs at St. George's University to understand the desired app:
Visual Appearance (Realistic vs. Stylized vs. Cartoony vs. Simplified)
User Experience (Passive vs. Active)
Depth of features and mechanics
Acquiring/recording audio
Creating animations and transitions for state changes
Scripting all functionality
Implementing all of the above, as well as:
Voice Command Recognition (see the sketch after this list)
Gesture Controlled Manipulation
Sharing / Collaboration between multiple users
So much more!
Documentation (User Manual)
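As a small taste of that implementation work: at the time, Unity exposed HoloLens speech input through the UnityEngine.Windows.Speech API. Below is a minimal sketch of voice command recognition using that API; the keywords and the handler are placeholder assumptions, not the app's actual command set.

```csharp
using UnityEngine;
using UnityEngine.Windows.Speech;

// Minimal HoloLens voice command sketch using Unity's KeywordRecognizer.
// "Show Heart" / "Hide Heart" are illustrative placeholders.
public class VoiceCommands : MonoBehaviour
{
    private KeywordRecognizer recognizer;

    private void Start()
    {
        recognizer = new KeywordRecognizer(new[] { "Show Heart", "Hide Heart" });
        recognizer.OnPhraseRecognized += args =>
            Debug.Log("Heard voice command: " + args.text);
        recognizer.Start();
    }

    private void OnDestroy()
    {
        // Release the underlying speech resources when this object goes away.
        recognizer.Dispose();
    }
}
```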
The aspects of this project I spent the most time on were implementing gesture-controlled manipulation and sharing/collaboration between multiple users.
Gesture Controlled Manipulation of Objects
Incorporating gesture controls for manipulation was easy initially. What took time was finding the best implementation and then refining the control users had.
After some time, I felt brave enough to think my implementation was solid. That was until one of the testers did something completely unexpected: they walked to the other side of the object they were about to manipulate and turned around. The manipulations were now completely off; nothing was working as expected. This abnormal tester broke my spirit.
As more time passed and more implementations were tried and scrapped, I finally came across one that was foolproof. That's right, this tester wouldn't stump me this time! They could stand anywhere they wanted, rotate/look any way they wanted, and it would all be handled properly.
The tester was glad things were working as expected, but of course, as a tester, their job is to try to break things. In their attempt to break the functionality, they augmented their height. I still don't know how they acquired a ladder, but everything held! Nothing broke!
That was, of course, until they tried the new manipulations I had recently reworked. I had not taken into consideration the orientation of the manipulated object (and any children or parents it might have) relative to the user's orientation. Let's just say my spirit was broken again.
The allotted time I had for this aspect of the project was running out. If I didn't figure something out soon, I'd have to drop it for now and hope to continue working on it once everything else was completed.
Fortunately, an idea came to mind during a conversation I had the following weekend. I was so eager to try it I didn't wait for Monday. After returning home that night, I got right to work on my "goober" and once I was done, I eagerly awaited my return to the office the following morning.
Soon after arriving, I deployed my solution to the HoloLens and tested everything myself. After some small adjustments and tuning, I felt confident that my manipulation woes were over. The tester gave it a try and couldn't find a way to break my implementation.
My final implementation (scale/rotate) takes only the user's (manipulator's) position and orientation into account; a code sketch follows these steps:
Have an empty object designated for manipulation.
When manipulation begins, move the object to the exact position of the object to be manipulated.
Parent the empty object to the user (manipulator).
Zero out the empty object's local rotation and normalize the scale to one.
Parent the object to be manipulated to the empty object.
Apply manipulations (scale/rotation) to the empty object's local values.
Unparent the object being manipulated from the empty object and reparent it to its original parent.
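Here is a minimal Unity C# sketch of that sequence. This is a sketch under assumptions, not the project's actual source: the names (ManipulationPivot, BeginManipulation, and so on) are made up, and the rotation/scale deltas are assumed to come from whatever gesture handler drives the manipulation.

```csharp
using UnityEngine;

// Illustrative sketch of the pivot-reparenting steps described above.
public class ManipulationPivot : MonoBehaviour
{
    [SerializeField] private Transform user; // the manipulator, e.g. the HoloLens camera

    private Transform pivot;          // the designated empty object
    private Transform target;         // the object being manipulated
    private Transform originalParent; // restored when manipulation ends

    private void Awake()
    {
        // Step 1: an empty object designated for manipulation.
        pivot = new GameObject("ManipulationPivot").transform;
    }

    public void BeginManipulation(Transform objectToManipulate)
    {
        target = objectToManipulate;
        originalParent = target.parent;

        // Step 2: move the empty object to the target's exact position.
        pivot.position = target.position;

        // Step 3: parent the empty object to the user (manipulator).
        pivot.SetParent(user, worldPositionStays: true);

        // Step 4: zero its local rotation and normalize its scale, so all
        // further manipulation happens in the user's frame of reference.
        pivot.localRotation = Quaternion.identity;
        pivot.localScale = Vector3.one;

        // Step 5: parent the target to the empty object.
        target.SetParent(pivot, worldPositionStays: true);
    }

    public void UpdateManipulation(Quaternion rotationDelta, float scaleDelta)
    {
        // Step 6: apply manipulations to the empty object's local values;
        // the target follows along as its child.
        pivot.localRotation = rotationDelta * pivot.localRotation;
        pivot.localScale *= scaleDelta;
    }

    public void EndManipulation()
    {
        // Step 7: reparent the target to its original parent.
        target.SetParent(originalParent, worldPositionStays: true);
        target = null;
    }
}
```

Because the pivot's local rotation is zeroed relative to the user, every manipulation is applied in the user's own frame of reference, which is why the user's position and facing no longer matter.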
My favorite aspects of this implementation are that it vastly cut down on the math/calculations I had to do, and it's easy to explain.
This implementation can be seen in the Learning Heart videos on this page.
This implementation for manipulation was the foundation for SphereGen's Mixed Reality Framework.
It's been vastly overhauled, updated, and improved since this first incarnation, and its current state can be seen in SphereGen's more recent Mixed Reality Apps.