The Problem
"You’re sitting at a table. You're hungry. The food is right in front of you… but you can’t lift your hand to eat."
For millions of people around the world living with upper limb disabilities, this is their daily reality. Eating—a basic human necessity—becomes a dependency. We wanted to change that.
Meet EatAble
EatAble is a voice-controlled robotic assistant designed to restore independence. Users simply SAY what they want to eat, and the robot DETECTS, GRASPS, and gently FEEDS them.

Voice Controlled
No complex controllers. Just speak naturally to your assistant.
Smart Detection
Computer vision identifies food items for precise grasping.
The Technology
We built EatAble using the Reachy Mini robot, powered by AMD hardware. We integrated ElevenLabs for voice interaction and Hugging Face models for object detection, creating a seamless loop of "Listen, Understand, Serve."
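Below is a minimal sketch of that "Listen, Understand, Serve" loop. It is not the project's actual code: the "Understand" step uses a real Hugging Face zero-shot object detector (OWL-ViT via the transformers pipeline), while listen() and serve() are hypothetical placeholders standing in for the ElevenLabs voice integration and the Reachy Mini arm control.

```python
# Sketch of the "Listen, Understand, Serve" loop (illustrative only).
from PIL import Image
from transformers import pipeline

# Zero-shot detection lets the user name arbitrary foods at runtime.
detector = pipeline(
    "zero-shot-object-detection", model="google/owlvit-base-patch32"
)


def listen() -> str:
    """Placeholder for speech-to-text: return the food the user asked for."""
    # In the real system this would transcribe microphone audio
    # (e.g. via ElevenLabs) and extract the requested food item.
    return "strawberry"


def serve(box: dict) -> None:
    """Placeholder for robot control: grasp the item inside the detected box."""
    # In the real system this would map image coordinates to a grasp pose
    # and command the Reachy Mini arm to pick up and offer the food.
    print(f"Grasping item at {box}")


def feed_once(camera_frame: Image.Image) -> None:
    food = listen()  # Listen: what does the user want?
    detections = detector(camera_frame, candidate_labels=[food])  # Understand
    if not detections:
        print(f"Couldn't see any {food} on the table.")
        return
    best = max(detections, key=lambda d: d["score"])
    serve(best["box"])  # Serve: hand the food to the user


if __name__ == "__main__":
    frame = Image.open("table.jpg")  # hypothetical camera snapshot
    feed_once(frame)
```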
See it in action
Watch how EatAble is helping restore dignity and independence.
Acknowledgements
A huge thanks to the AMD Robotics Hackathon organizers and sponsors (AMD, Hugging Face, ElevenLabs, and Gurupass) for an amazing event.

"It’s not just a robot. It’s a companion."
Tihado - Engineering for Impact.