Rubber Ducky
Time
2026

Program
MIT Media Lab Tangible Media Group

Team
Lingdong Huang, Haoheng Tang, Kai Zhang

Prize
HARD MODE 2026
MIT AI Hardware Hackathon
Anthropic Prize Winner

Programmers talk to rubber ducks to reason about code. Our AI-powered mechanical duck listens—and then fixes your code itself, pecking at the keyboard with its oversized bill.

We built it to stress-test LLMs in the physical world. Because the bill can’t hit single keys precisely, the duck constantly makes and corrects errors while replanning its movements. We gave it a grumpy personality—it quacks, complains, and begrudgingly types through its Sisyphean task.

This project pulls “vibe coding” into the physical world, asking: if humans can continuously adapt, re-plan, and correct actions in real time, can AI do the same?

Mechanically, the duck’s neck uses a four-bar linkage with two degrees of freedom—extension and pecking—driven by two servos. To cover the keyboard, we optimized servo ranges through simulation and solved inverse kinematics using a compact circle-intersection method, ensuring stable, non-flipping motion. The result is a dexterous neck controlled entirely on a microcontroller.
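The circle-intersection approach mentioned above can be sketched for a generic two-link planar neck (this is an illustrative reconstruction, not the project's firmware: the link lengths, joint names, and the `elbow_up` branch flag are assumptions). The elbow joint must lie on both a circle of radius l1 about the base and a circle of radius l2 about the target, so intersecting the two circles yields the joint angles in closed form; always choosing the same intersection branch is what keeps the linkage from flipping.

```python
import math

def ik_two_link(x, y, l1, l2, elbow_up=True):
    """Planar 2-DOF inverse kinematics by circle intersection.

    The elbow sits on both a circle of radius l1 about the base
    (origin) and a circle of radius l2 about the target (x, y).
    Returns joint angles (theta1, theta2) in radians.
    """
    d = math.hypot(x, y)
    if d > l1 + l2 or d < abs(l1 - l2):
        raise ValueError("target out of reach")
    # distance along the base->target line to the chord between
    # the two circle intersections, and the half-chord length
    a = (l1 ** 2 - l2 ** 2 + d ** 2) / (2 * d)
    h = math.sqrt(max(l1 ** 2 - a ** 2, 0.0))
    # chord midpoint on the base->target line
    mx, my = a * x / d, a * y / d
    # pick one intersection branch consistently so the
    # linkage never flips between solutions mid-motion
    sign = 1.0 if elbow_up else -1.0
    ex = mx + sign * h * (-y) / d
    ey = my + sign * h * x / d
    theta1 = math.atan2(ey, ex)
    theta2 = math.atan2(y - ey, x - ex) - theta1
    return theta1, theta2
```

Because the solution is closed-form (two square roots and two arctangents, no iteration), it is cheap enough to run every control tick on a microcontroller.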

On the software side, we coordinated multiple components (mechanics, audio, sensing, AI) through an HTTP-based system: a central server assigns tasks, and distributed modules execute and report back—bringing the duck to life as a quacking, pecking, collaborative machine.
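The hub-and-modules pattern described above can be sketched with nothing but the standard library (a minimal illustration, not the project's actual server: the endpoint paths `/task` and `/report` and the task fields are assumptions). A central HTTP server hands out tasks; each module polls for one, executes it, and posts a status report back.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

TASKS = [{"module": "neck", "action": "peck", "key": "q"}]
REPORTS = []

class Hub(BaseHTTPRequestHandler):
    def do_GET(self):
        # a module polls /task for its next assignment
        body = json.dumps(TASKS.pop(0) if TASKS else {}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def do_POST(self):
        # a module reports completion back to the hub
        n = int(self.headers.get("Content-Length", 0))
        REPORTS.append(json.loads(self.rfile.read(n)))
        self.send_response(204)
        self.end_headers()

    def log_message(self, *args):  # silence request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Hub)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

# one "module" round-trip: fetch a task, execute it, report back
task = json.loads(urllib.request.urlopen(f"{base}/task").read())
report = urllib.request.Request(
    f"{base}/report",
    data=json.dumps({"task": task, "status": "done"}).encode(),
    headers={"Content-Type": "application/json"},
)
urllib.request.urlopen(report)
server.shutdown()
```

Because every component speaks plain HTTP, the mechanics, audio, sensing, and AI modules can live on different machines (or microcontrollers) and still coordinate through one hub.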

Contact

Email: yuhan_wang@gsd.harvard.edu
LinkedIn: https://www.linkedin.com/in/yuhan-wang-095874264
Instagram: @rabourackey
TEL: 8577563864