Far Sight


Time
2025

Program
MIT Reality Hack 2025 Winner
Best Social Impact with Qualcomm Technologies

Team
Shengtao Shen, Sheldon McLeod, Brennan Worth McGarey, Ming Jin Yong

GitHub Repo:
https://github.com/worth12/RH-2025

DevPost:
https://devpost.com/software/farsight-caxyd8

MIT Reality Hack 2025 Winners:
https://www.mitrealityhack.com/2025-tracks-prizes

Inspiration
The devastating fires in Los Angeles have left a lasting impact on the community, destroying homes, displacing families, and claiming lives. The sheer scale of the destruction, combined with the intensity and unpredictability of these fires, highlights the need for innovative tools to assist first responders. Firefighters often find themselves entering burning buildings with little to no visibility, toxic air, and the ever-present risk of structural collapse. Sadly, the search for survivors—whether they be people or pets—can be delayed by a lack of reliable information about what lies ahead. These heartbreaking realities inspired us to design a rover that can serve as their eyes, ears, and environmental gauge, providing real-time data to ensure rescue operations are as safe and effective as possible.

What It Does
Our rover is designed to scout ahead in dangerous environments, giving firefighters an advanced view of the situation. Equipped with the Qualcomm RB3 Vision Development Kit, it runs an object detection model capable of identifying people, pets, and fire hotspots. Additionally, the rover features an Arduino setup with air quality and temperature sensors to monitor environmental conditions. All collected data is streamed to a Meta Quest 3 headset via Meta SDKs, where firefighters can view a Heads-Up Display (HUD) that overlays real-time information, including:

  • The location of individuals or pets in need of rescue
  • The presence of fire and its intensity
  • Air quality and temperature data for situational awareness

This comprehensive system ensures firefighters are equipped with actionable insights before entering hazardous areas.
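
As a rough sketch of the kind of update the headset receives, each rover message could be serialized as a small JSON payload before transmission. The field names, units, and values below are illustrative assumptions, not the project's actual wire format:

  import json, time

  # Hypothetical telemetry message from the rover to the headset.
  # Field names and units are illustrative; the real format may differ.
  update = {
      "timestamp": time.time(),
      "detections": [
          {"label": "person", "confidence": 0.91, "bbox": [120, 64, 88, 160]},
          {"label": "fire", "confidence": 0.87, "bbox": [300, 40, 140, 120]},
      ],
      "environment": {
          "temperature_c": 63.5,    # from the Arduino temperature sensor
          "air_quality_raw": 412,   # raw reading from the air quality sensor
      },
  }

  payload = json.dumps(update).encode("utf-8")  # ready to send to the HUD

A compact single-message format like this keeps the HUD logic simple: the headset only has to parse one message type and map each field to an on-screen element.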

How We Built It
  • Hardware Integration: We started with the Qualcomm RB3 Vision Development Kit to leverage its advanced image processing capabilities for object detection. The Arduino was incorporated to handle environmental sensing, using air quality and temperature sensors to measure key metrics.
  • Object Detection Model: We trained and optimized a deep learning model to detect people, pets, and fire patterns. This model was deployed on the RB3 kit, enabling real-time processing on the rover (a sketch of this inference loop follows the list).
  • Data Streaming and HUD: Using the Meta SDKs, we developed a system to stream data from the rover to a Meta Quest 3 headset. The information is displayed on an intuitive HUD, allowing firefighters to see relevant details overlaid on their surroundings (see the streaming sketch after the list).
  • Rover Design: The rover was built with mobility in mind, enabling it to navigate tight spaces and the challenging terrain common in fire rescue scenarios.
  • Testing: We simulated various scenarios to ensure the rover’s reliability, including obstacle navigation, object detection accuracy, and seamless data streaming.
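
To make the detection step concrete, here is a minimal sketch of the kind of inference loop that could run on the RB3, assuming a quantized TensorFlow Lite object detection model with SSD-style outputs. The model filename, label set, and camera index are hypothetical; the actual deployment may rely on Qualcomm's own tooling instead:

  import cv2
  import numpy as np
  import tflite_runtime.interpreter as tflite

  LABELS = ["person", "pet", "fire"]  # hypothetical label set

  # "farsight_detector.tflite" is a placeholder model filename.
  interpreter = tflite.Interpreter(model_path="farsight_detector.tflite")
  interpreter.allocate_tensors()
  inp = interpreter.get_input_details()[0]
  outs = interpreter.get_output_details()

  cap = cv2.VideoCapture(0)  # camera index assumed to be 0

  while True:
      ok, frame = cap.read()
      if not ok:
          break
      # Resize the frame to the model's expected input shape (uint8 model assumed).
      h, w = int(inp["shape"][1]), int(inp["shape"][2])
      x = cv2.resize(frame, (w, h)).astype(np.uint8)[None, ...]
      interpreter.set_tensor(inp["index"], x)
      interpreter.invoke()
      # Typical SSD-style output ordering: boxes, class ids, scores.
      boxes = interpreter.get_tensor(outs[0]["index"])[0]
      classes = interpreter.get_tensor(outs[1]["index"])[0]
      scores = interpreter.get_tensor(outs[2]["index"])[0]
      detections = [
          {"label": LABELS[int(c)], "confidence": float(s), "bbox": b.tolist()}
          for b, c, s in zip(boxes, classes, scores)
          if s > 0.5 and int(c) < len(LABELS)
      ]
      # detections are merged with sensor readings and streamed to the headset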
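
On the environmental side, one simple arrangement is for the RB3 to read the Arduino's sensor output over USB serial and forward it to the headset over UDP. The serial device, line format, and headset address below are assumptions for illustration, not the project's actual configuration:

  import json
  import socket
  import serial  # pyserial

  HEADSET_ADDR = ("192.168.1.50", 9000)  # hypothetical Quest 3 address/port
  sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

  # The Arduino is assumed to print "temperature,air_quality" lines over serial.
  arduino = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)

  while True:
      line = arduino.readline().decode("ascii", errors="ignore").strip()
      if not line:
          continue
      try:
          temperature, air_quality = (float(v) for v in line.split(","))
      except ValueError:
          continue  # skip malformed lines
      message = {"temperature_c": temperature, "air_quality_raw": air_quality}
      sock.sendto(json.dumps(message).encode("utf-8"), HEADSET_ADDR)

Keeping the Arduino's job to printing plain text lines, with parsing and networking handled on the RB3, keeps the microcontroller firmware minimal and easy to debug.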

Future Development and Applications
While our rover focuses on assisting firefighters, the true value of this technology lies in its versatile locating and HUD system. By separating the sensor and data display technologies from the specific locomotion method, this solution can be adapted to various platforms. Future iterations could include drones for aerial reconnaissance, robot dogs for increased mobility in uneven terrain, or even wearable devices for human first responders. Beyond firefighting, this system could be applied to disaster relief, search and rescue operations, and industrial safety inspections, making it a valuable tool across multiple domains.

Contact

Email: yuhan_wang@gsd.harvard.edu
LinkedIn: https://www.linkedin.com/in/yuhan-wang-095874264