Here are details, descriptions, media, and links for many of the projects built during the Reality Virtually Hackathon in October 2016.
You can also visit the list of award winners.
Let it Snow!
"Let it Snow!" is a fanciful mixed reality experience that transposes the magic of winter onto unlikely environments.
We created the winter wonderland effect with a Unity particle system that simulates the falling snow. To complete the scenario, we experimented with the HoloLens' spatial mapping function to identify horizontal surfaces in the real world that catch an accumulation of virtual flakes, all while a serenade by a holographic snowman emanates in spatial audio.
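The surface test itself is simple geometry: a mesh triangle counts as horizontal when its normal points close enough to world-up. A minimal Python sketch of the idea (the project would do this in Unity against the HoloLens spatial mesh; the tilt threshold here is an illustrative assumption):

```python
import math

UP = (0.0, 1.0, 0.0)  # world "up" direction

def is_horizontal(normal, max_tilt_degrees=15.0):
    """Classify a surface as horizontal (snow-catching) if its unit
    normal (x, y, z) points within max_tilt_degrees of world-up."""
    cos_tilt = sum(n * u for n, u in zip(normal, UP))
    return cos_tilt >= math.cos(math.radians(max_tilt_degrees))

# A flat tabletop (normal straight up) catches snow; a wall does not.
```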
Team: Adrian Sas, K. Rio Saxton, Nikolaos Vlavianos
StudyVR: A Kinesthetic Learning Framework for the HTC Vive
When you were in school, were you ever bored by a concept that you couldn't apply to real life? Most children learn best through doing, and StudyVR is our solution to engage the next generation of kinesthetic learners.
StudyVR is a learning tool that helps grade-school kids connect with their science classes on the HTC Vive, using kinesthetic, hands-on interaction with a room-scale virtual reality system to engage children on a deeper cognitive level.
Team: Sean Bailey, Valentina Feldman, Brendan Luu, Fatma Ozen, Tyler Roach
HoloCaptions
Those with hearing impairments cannot rely on the audio cues typically used in everyday communication. HoloCaptions provides real-life closed captioning by combining speech-to-text APIs with facial recognition on the Microsoft HoloLens.
Team: Tess Gauthier, Johan Ospina, David Tran
Into the Mountains: A VR Language Immersion Experience
This program is an immersive language-learning adventure. A mysterious orb hosts you through beautiful, striking levels as you discover who a mysterious child is, how she is connected to you, and where to find her. The experience is entirely in your chosen language, with no translations, and as you progress you will be surprised by how many words and phrases you begin to recognize through full immersion and repetition.
Team: Alfredo Barzola, Katy Hamilton, Kiera Johnson, Matt Maes, Viraj Rai
The project places virtual LEGO bricks in the user's field of vision to show the next step of the building process. Today, almost everything that is built, from bridges to furniture to rockets, is created from 2D drawings. Augmented reality presents an opportunity to superimpose building instructions on the item being built, improving efficiency across several industries and enabling more effective assembly instructions.
Team: Gabe Evans, Jarrett Linowes, Tim Marquart, Jonathan Schlueter, Jonathan Dyssel Stets
WaypointRx: Filling prescriptions with a HoloLens
The WaypointRx team created a demo to help anyone who fills prescriptions (especially non-pharmacists) reduce fulfillment errors and avoid the harmful or deadly mistake of dispensing the wrong medication. Our augmented reality application guides the user through filling a prescription: it draws a clear, visualized path to the correct medication, shows a 3D model of the expected pill, and clearly displays the medication's information and the required quantities. In our demo setup, untrained users located medications much more quickly and with substantially fewer errors.
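The visualized path can be modeled as a shortest route through the pharmacy floor plan. A hedged sketch of that idea (the grid layout, coordinates, and `shortest_path` helper are illustrative, not the demo's actual code):

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first search over a pharmacy floor grid.
    grid[r][c] == 1 means blocked (shelving); 0 means walkable.
    Returns the list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk the predecessor chain back to the start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None
```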
Team: Umar Arshad, Paul Katsen, Varun Mani, Sara Remsen, Jan-Erik Steel
Real Talk
Real Talk uses virtual reality to create an immersive experience for children ages 9 to 12 who are learning to speak another language. Players learn English in a contextual, real-world environment as they interact with characters in the game. They compose and speak their responses, offering youth a fun and interactive way to learn.
Team: Krystian Babilinski, Mark Kabban, Jeffrey King, Ovetta Sampson, Zach Schiller
KickALZ
KickALZ is a VR experience created for people suffering from Alzheimer's disease, which combines aspects of music, reminiscence, and sensory therapies. Users enter a landscape and proceed on a journey where they plant and engage with memory flowers. Voice dialogue is encouraged, and Watson Voice Recognition is used to generate a sentiment analysis that can be used as a diagnostic tool. At the end of the journey, the user is able to relax in their personal memory garden, viewing photos and messages from their loved ones.
Team: Jacob Hamman, Alice Kra, Akshay Mohan, Meredith Wilson, Dig Vijay Kumar Yarlagadda
A portable therapeutic AR application for amputees, for the treatment of phantom limb pain.
Team: Julien Bouvier-Volaille, Jingro Guo, Mehdi Lefouili, Lindsay Lin
MOLECULVR
MOLECULVR is a tangible chemistry learning game for the HTC Vive. With MOLECULVR, users can reach into a virtual periodic table, grab atoms of elements, and combine them to create molecules. The molecules appear in their geometric VSEPR (ball-and-stick) shapes and can demonstrate real-world concepts that cannot be shown in a traditional classroom or lab setting. We believe MOLECULVR is the first step toward a platform for virtual reality education.
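Picking the 3D arrangement for a snapped-together molecule reduces to a VSEPR lookup on the central atom. A simplified Python sketch (the table covers only a few common geometries, and the `shape_of` helper is illustrative, not the game's actual code):

```python
# Map (bonding domains, lone pairs) on the central atom to the VSEPR
# shape, as a snap-together game might do to pick a 3D template.
VSEPR_SHAPES = {
    (2, 0): "linear",
    (3, 0): "trigonal planar",
    (2, 1): "bent",
    (4, 0): "tetrahedral",
    (3, 1): "trigonal pyramidal",
    (2, 2): "bent",
}

def shape_of(bonding_domains, lone_pairs):
    return VSEPR_SHAPES.get((bonding_domains, lone_pairs), "unknown")

# Water (H2O): 2 bonds + 2 lone pairs -> bent.
# Methane (CH4): 4 bonds, no lone pairs -> tetrahedral.
```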
Team: Taylor Gates, Jacqueline Hom, Mariangely Iglesias Pena, Ryan Lee, Michelle Wantuch
Focus
We can all relate to the experience of sitting down to make progress on a task - whether reading a book, writing a proposal, or clearing through email - only to find ourselves wandering away from the work at hand. 'Focus' is a mixed reality application built for HoloLens that promotes positive attention and focus. The user begins by dynamically wrapping their work in a 'zone of focus.' If the user becomes distracted from their task, their attention is gently guided back to their prioritized activity.
Team: Bodhi Connolly, Liz Cormack, Aaron Faucher, Yan Liu, Akshay Mohan, Leon Zhang
Fizz Filter
Fizz Filter is an augmented reality application that intelligently transforms your entire environment based on spatial understanding. It makes informed decisions about object placement in the space. Its commercial potential is significant: for instance, it allows a user to furnish an entire room in seconds. Other uses include social sharing of fun "filters" and relaxation.
Team: Lisa van Acquoij, Yannick Boers, Andrew Dupuis, Yasmeen Roumie, Jinny Yan
Block-E
Advancements in augmented reality have shattered limits in every industry, especially entertainment and education. Block-E aims to change the way we play and learn with blocks. It creates holographic blocks similar to LEGO and gives the user a very simple interface to create anything. Unlike physical LEGO bricks, the user can pick blocks of any color and shape and has access to an infinite number of them. Best of all, you can't step on holographic blocks.
Team: Aman Jha, Peter Fan, Jacob Nazarenko
Data Tree Modeler
Our modeler takes data structures in code and visualizes them as a tree, eliminating complexity by relating the information to something we see every day. We inherently understand the tree structure because we have evolved to understand objects and structures in nature. The modeler also maps meaning onto the tree: just as some roots are filament-like and some very thick and strong, so are data structures, an effect we render by varying the opacity of the trace code with its use. And just as a plant's leaves slowly turn from green (young, new) to brown (old, dead) as it ages, the user sees the same aging in the tree modeler. Zooming in and out gives the depth of information the person is looking to obtain.
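The two visual mappings described above, age to leaf color and usage to root opacity, are plain linear interpolations. A minimal Python sketch (the RGB endpoints and scaling are illustrative choices, not the modeler's actual values):

```python
def leaf_color(age, max_age):
    """Interpolate linearly from green (young) to brown (old) as RGB."""
    t = min(max(age / max_age, 0.0), 1.0)
    green = (34, 139, 34)
    brown = (139, 69, 19)
    return tuple(round(g + t * (b - g)) for g, b in zip(green, brown))

def trace_opacity(use_count, max_uses):
    """Heavily used code paths render as thicker, more opaque roots."""
    return min(use_count / max_uses, 1.0)
```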
Team: Ethan Anderson, Tim Besard, Aravind Elangovan, Sarthak Giri, and Adam Sauer
Provides real-time respiratory feedback during guided meditations in a VR environment.
Team: Tim Gallati, Sourabh Jain, Shea Rembold, Shylo Shepherd, and Angelica Tinga
We built a model of Greenland from real data that have been collected via satellite and airborne missions. We move around the surface of Greenland in a plane, the way that the majority of data are collected, pointing out important features along the way. We also take you below the ice, so you can discover a place that hasn't seen the atmosphere in over 100,000 years!
Team: Alexandra Boghosian, Shaashwat Sharma, Qiushuo Wang, Michael Wissner
VR Story Tellers
The project's inception was at the MIT Media Lab Reality Virtually Hackathon, where our team brainstormed and came up with the idea of turning text into a virtual reality experience in real time. You enter your story on our website, and the app generates viewable VR content: real-life characters (images), objects placed in proper locations against scene backgrounds, coupled with background sound based on the characters, actions, and mood of the story's plot.

We built a website where you enter your own story as plain text. When you click submit, we analyze it to identify the main objects, their descriptions, the background, and the overall mood of the story using machine learning and natural language processing; we used the machine learning APIs of Microsoft Cognitive Services and the IBM Watson Tone Analyzer. We combine the outputs of the APIs into a single JSON file, which is stored in the cloud (Amazon Web Services). Unity picks up the file and analyzes it to pull in assets and related animations based on the objects, plus background sounds based on the mood of the plot. The objects are placed relative to each other with the appropriate background, and background music plays in real time.

The entire experience is viewable on Google Cardboard. We chose Cardboard because it is one of the cheapest VR devices available that still gives a good experience, so more users can access the app.
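The merge step can be sketched as follows; the field names (`characters`, `objects`, `mood`) and the entity format are illustrative assumptions, not the actual schema the team stored on AWS:

```python
import json

def build_scene(entities, mood):
    """Merge hypothetical outputs of an entity extractor and a tone
    analyzer into a single scene description for Unity to load.
    `entities` is a list of {"name", "type"} dicts; `mood` is a string."""
    scene = {
        "characters": [e["name"] for e in entities if e["type"] == "person"],
        "objects": [e["name"] for e in entities if e["type"] == "object"],
        "mood": mood,  # drives the background-music choice
    }
    return json.dumps(scene)
```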
Team: Adrian Babilinski, Biswaraj Kar, Nabanita De, Pat Pataranutaporn, Yuta Toga
The project utilizes the HoloLens's spatial mapping, voice recognition, gesture control, and, most importantly, 3D spatial audio to create a multi-textured environment of audio and colors. Think of it as an AR audio sequencer where your location determines what you hear. Beats you place stay locked in their location, allowing you to freely walk around your song and experience it from different vantage points, or to make pockets of sound that you can only hear from one specific place.
This is being transformed into SoundScapeAR in future builds. Check out our development at www.SoundScapeAR.com.
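Position-dependent hearing of this kind amounts to per-beat distance attenuation. A minimal Python sketch of the falloff (the linear curve and 3-meter radius are illustrative assumptions; audio engines typically offer configurable rolloff):

```python
import math

def audible_gain(listener, beat, radius=3.0):
    """Simple linear distance falloff: a placed beat is only heard
    within `radius` meters of its locked world position.
    `listener` and `beat` are (x, y, z) positions in meters."""
    d = math.dist(listener, beat)
    if d >= radius:
        return 0.0
    return 1.0 - d / radius
```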
Team: Costas Frost, Max Harper, Jameson Nash, Scott Niejadlik, Matt Silverstein
As a team we are interested in how bias informs people's reactions to questions and real-world events. This project explores the possibility of using VR to visualize information in an immersive and three-dimensional way.
Since 1994, the Pew Research Center has been tracking the partisanship of American voters. It has uncovered a worrying trend of growing ideological consistency and political polarization. This VR application is an experiment to see if virtual reality can influence individuals' perceptions of the political proclivities of other people as well as their own. It uses an adapted version of Pew's Ideological Consistency Scale with an additional element of personal relevance.
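Scoring such a scale is straightforward: each item contributes -1 for the liberal response and +1 for the conservative one, and the sum shows how uniformly a respondent leans. A simplified Python sketch (the thresholds and labels are illustrative; Pew's actual instrument and this project's adapted version differ in detail):

```python
def consistency_score(responses):
    """Sum a Pew-style ideological consistency questionnaire where each
    answer is -1 (liberal), 0 (neither), or +1 (conservative)."""
    assert all(r in (-1, 0, 1) for r in responses)
    return sum(responses)

def label(score, n_items):
    """Bucket a score into a coarse consistency label."""
    if score <= -(n_items // 2):
        return "consistently liberal"
    if score >= n_items // 2:
        return "consistently conservative"
    return "mixed"
```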
Team: Justin Chin, Barry Dineen, Tom Gorham, Keith Hartwig, Jim Moffet
HoloCity
HoloCity allows users to experience a real-time simulation of housing and transportation through a HoloLens. Different colors and sizes of LEGO blocks represent various types of zoning, such as commercial, residential, government, and nature. The flow and density of traffic in an area is visualized through the color and scale of drawn magnetic fields. Other properties, such as capacity, walkability between vehicles and buildings, and many more attributes, can be easily presented and understood. This has direct applications for governments in city planning, for communities envisioning future projects, and for individuals looking to find the perfect neighborhood to call home.
Team: Poseidon Ho, Han-Chih Kuo, Ran Li, Riaz Munshi, Bolin Zhu