MIT Reality Hack 2022

Project Overview

Context: MIT Reality Hack is an annual community-run AR/VR hackathon hosted at MIT. The event brings together thought leaders, brand mentors, creators, participants, students, and technology enthusiasts for tech workshops, talks, discussions, fireside chats, collaboration, hacking, and more.

Harbit: At MIT Reality Hack I led a team building Harbit, a seamless experience that helps users build and maintain good habits on Snapchat's augmented reality Spectacles, using the habit-formation method of pointing and calling.

Role: Team Lead & UX Engineer

Timeline: 2.5 days (March 2022)

Tools: Lens Studio (index-finger pointing detection, multi-object detection models, and event-based 3D animations); Snap Spectacles (AR glasses); Blender (3D animations)

Research Summary

Every habit starts with a cue:

The "Habit Cycle"

How can we become more aware of these cues?

Pointing and Calling (also known as the Shisa Kanko 指差喚呼 method)

Pointing and calling is a method of increasing awareness in order to reduce workplace errors. When used in environments that demand hyper-awareness of one's surroundings, the method can reduce mistakes by almost 85%.

Let's try applying this method using augmented reality...

User Persona

User Journey

Pointing and calling flow

*Replaced step four with an AR-based solution

Positive feedback for good habit cues flow

*Possible to recreate using available SnapAR libraries and tools (Banana is the cue)

Negative feedback for bad habit cues flow

*Possible to recreate using available SnapAR libraries and tools (Chocolate is the cue)
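The two feedback flows above share the same shape: a detected object is classified as a good or bad habit cue and mapped to positive or negative feedback. A minimal, platform-agnostic sketch in plain JavaScript (the cue lists and function names are illustrative, not taken from the actual lens; in the real project the labels would come from the ML model):

```javascript
// Illustrative cue lists from the flows above (banana = good cue,
// chocolate = bad cue); a real lens would get labels from its ML model.
const GOOD_CUES = ["banana"];
const BAD_CUES = ["chocolate"];

// Classify a detected object label as a good cue, bad cue, or neither.
function classifyCue(label) {
  if (GOOD_CUES.includes(label)) return "good";
  if (BAD_CUES.includes(label)) return "bad";
  return "none";
}

// Map the classification to the feedback animation to play.
function feedbackFor(label) {
  const kind = classifyCue(label);
  if (kind === "good") return "positive-feedback-animation";
  if (kind === "bad") return "negative-feedback-animation";
  return null; // no cue, no feedback
}

console.log(feedbackFor("banana"));    // "positive-feedback-animation"
console.log(feedbackFor("chocolate")); // "negative-feedback-animation"
```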

Product Demo

Current MVP

  • Index finger pointing detection to trigger events
  • Object detection with machine learning model (bottle & cup)
  • Animated feedback (good & bad habit cues)

Future Iterations

  • Track user actions
  • Measure user progress
  • Custom progression planning
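As a sketch of how the first two future items might fit together, tracking user actions and measuring progress, here is a minimal streak counter in plain JavaScript. All names and structure are hypothetical; none of this exists in the current MVP:

```javascript
// Hypothetical progress tracker: records each completed habit action
// and measures progress as a streak of consecutive days.
function createTracker() {
  return { completions: [] }; // ISO date strings, e.g. "2022-03-26"
}

function recordAction(tracker, isoDate) {
  if (!tracker.completions.includes(isoDate)) {
    tracker.completions.push(isoDate);
  }
}

// Count consecutive days ending at the most recent completion.
function currentStreak(tracker) {
  const days = [...tracker.completions].sort();
  let streak = 0;
  for (let i = days.length - 1; i >= 0; i--) {
    const expected = new Date(days[days.length - 1]);
    expected.setUTCDate(expected.getUTCDate() - streak);
    if (days[i] === expected.toISOString().slice(0, 10)) streak++;
    else break;
  }
  return streak;
}

const t = createTracker();
["2022-03-24", "2022-03-25", "2022-03-26"].forEach((d) => recordAction(t, d));
console.log(currentStreak(t)); // 3
```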

Try it out: GitHub Repo