MIT Reality Hack 2020
A new year, a new hackathon. Reality Virtually Hack underwent a major redesign for 2020 and rebranded as MIT Reality Hack. This year, the MIT Reality team aimed to top last year by giving participants access to the latest and greatest XR headsets: Nreal Light, HoloLens 2, Project North Star, Varjo XR-1, and HP Reverb. I got a chance to try a few of them.
The Nreal Light is probably the closest to what most people envision an XR headset to be: it looks the most like a pair of normal glasses. It works best tethered to a phone for the extra computing power, but it can also operate standalone. The computing hardware is encased near the top of the frames. At first, I thought it would obstruct my view to some degree, but surprisingly, it did not.
Project North Star looks like a headset that was initially built at an XR hackathon. It is definitely weird for me to wear, but it has some of the best hand tracking capabilities around. I remember using a Leap Motion for projects during my days in college, and the hand tracking has improved a lot since then, especially tracking two hands at once.
There was also some interesting new software available to the participants this year. Verb-collective is a high-level tool that composes simple scripted actions (push, open, grow, shrink, or walk) into more complex ones; it is already on the Unity Asset Store. There is also VisionLib, a platform for object tracking. And finally, Normcore is a library that makes it easier to add multi-user experiences to Unity projects.
You can see how the participants used all of this software and hardware here.
And now, I want to highlight the WebXR projects from the hackathon:
Draft360 is a collaborative XR tool that allows users to work together on XR ideas, whether in XR or using non-XR tools. Users can upload photos, record audio, or draw directly on the existing XR environment. Draft360 won the Best AR award and the Best of Tools award, and the team will be presenting at AWE in May.
Planet Protector is an educational XR experience that teaches users about waste management and its impact on the planet. The first part of the experience is an arcade-style game where the user attempts to sort waste into different categories. The experience then goes on to explain the current state of recycling and what the user can do to help.
Space Changer is a WebXR app that helps users with home renovations by modifying a surface's color and texture in real time. The app works by detecting the corner boundaries of surfaces and overlaying controls to change the color and texture of each detected surface.
To-Gather aims to foster communities and fight social isolation by using AR markers to tell a story. Each marker contains a snippet of the story, and each marker is held by a different person, so users have to find all of the markers, and the people holding them, to complete the story.
ProgressivelyEnhancedIndoorNavigationWebXR is an AR app that helps people navigate indoors by combining campus maps with indoor maps. This way, the user can determine the best way to reach a specific room on a specific floor in a specific building on campus. The AR app also provides virtual markers to guide the user to their location.
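All of these projects are built on the WebXR Device API, which lets a web page ask the browser for an AR or VR session. As a minimal sketch of the first step every such app takes, the helper below feature-detects immersive AR support; `navigatorLike` is a hypothetical parameter standing in for the browser's `navigator` object so the function can be exercised outside a browser:

```javascript
// Sketch: check whether the current environment can run an immersive AR session.
// In a real page you would call supportsImmersiveAR(navigator) and, on success,
// request a session with navigator.xr.requestSession('immersive-ar').
async function supportsImmersiveAR(navigatorLike) {
  // The WebXR Device API is exposed as navigator.xr; absent on older browsers.
  if (!navigatorLike.xr) return false;
  // isSessionSupported resolves to true only if the mode can be granted.
  return navigatorLike.xr.isSessionSupported('immersive-ar');
}
```

Apps like ProgressivelyEnhancedIndoorNavigationWebXR use exactly this kind of check to fall back to a flat 2D map when AR is unavailable, which is what "progressively enhanced" refers to.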
As you can see, MIT Reality Hack offers a great opportunity to try out XR, and I strongly encourage you to apply for the next iteration. It is a great environment to learn XR skills even if you do not have any experience: there are workshops that provide a crash course in the different technologies, as well as community talks for a deeper dive into a specific one. And throughout the entire hackathon, there are mentors to guide you along the way.
If you are curious about WebXR and related technologies, feel free to follow the @SamsungInternet team.
See you next year for Reality Hack 2021.
By Winston Chen on February 24, 2020.