More Brilliant Hacks from Hack-A-Bot 2025 Powered by IMX500: Project AURA & Sign Language Interpreter

Building on the energy and innovation that defined Hack-A-Bot 2025, we're taking a closer look at some of the standout projects that emerged from the 24-hour challenge at The University of Manchester. Nearly 300 students, from fields ranging from neuroscience to software engineering, came together for the weekend to turn ideas into real, practical solutions that could make a difference in people's lives. A key technology behind these projects was the Raspberry Pi AI Camera, integrated with Sony's IMX500 sensor, which enabled AI-driven solutions with powerful computer vision and real-time processing.

Sign Language Interpreter

What if you could teach a camera to understand sign language? That’s exactly what one Hack-A-Bot team set out to do, leveraging the Raspberry Pi AI Camera with Sony’s IMX500 sensor to tackle real-time sign language letter interpretation.

To create their own dataset, the team meticulously photographed and labelled gestures, then used the data to train a custom TensorFlow Lite classifier. This lightweight classifier was deployed directly onto the IMX500, so the entire recognition process could run on-device, showcasing the power of tailored edge AI for accessibility.
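
For readers curious about what that pipeline can look like, here is a minimal sketch, not the team's actual code, of training a small letter classifier in Keras and exporting it to TensorFlow Lite. The dataset layout, image size, network architecture, and 26-letter label set are all illustrative assumptions, and deploying the result onto the IMX500 itself additionally requires converting and packaging the model with Sony's IMX500 tooling.

```python
# Minimal sketch (assumptions: folder-per-letter dataset, 96x96 RGB images,
# 26 classes). Not the team's actual code.
import tensorflow as tf

IMG_SIZE = (96, 96)
NUM_CLASSES = 26  # one class per signed letter (assumed)

# Load the hand-labelled gesture photos from per-letter folders.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "sign_dataset/train", image_size=IMG_SIZE, batch_size=32)

# A small CNN keeps the model light enough for edge deployment.
model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (3,)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES),
])
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"])
model.fit(train_ds, epochs=10)

# Export to TensorFlow Lite; running on the IMX500 itself requires a
# further conversion step with Sony's model packaging tools.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
with open("sign_classifier.tflite", "wb") as f:
    f.write(converter.convert())
```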

Here’s what the team had to say about their journey: “Working with the IMX500 Raspberry Pi AI Camera was surprisingly simple and intuitive. The built-in AI capabilities were powerful, as well as really fun to explore and understand. The hackathon itself was a really good event, incredibly well organized and very enjoyable for everyone. The Sony team's support and energy throughout the event made the whole experience even better.”

Built With:
  • Raspberry Pi 3 Model B
  • Raspberry Pi AI Camera with IMX500 Sensor
  • TensorFlow
  • 3D printer

Project AURA: Architectural User-responsive Reactive Assembly

AURA reimagines how we interact with physical environments. Centered on a scale model of the University of Manchester’s Engineering Building, the project allows users to control the model through simple hand gestures.

Utilizing a Raspberry Pi 5 and a Raspberry Pi AI Camera with Sony’s IMX500 sensor, the system interprets movements like swipes and T-poses via a locally running TensorFlow Lite model. The IMX500’s edge AI capabilities handle real-time image analysis, and detected gestures trigger commands to a Raspberry Pi Pico, which actuates servos and motors to dynamically alter the model's form.
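
To make the gesture-to-motion loop concrete, here is a hedged sketch of what the Pi 5 side of such a system might look like: reading classification results produced on the IMX500 via picamera2's IMX500 helper, then forwarding simple commands to the Pico over USB serial. The model path, gesture label list, serial port, and newline-terminated command protocol are all illustrative assumptions, not the team's implementation.

```python
# Sketch of the Pi 5 side (assumptions: a packaged .rpk gesture model,
# a label list, and a Pico listening on /dev/ttyACM0).
import numpy as np
import serial
from picamera2 import Picamera2
from picamera2.devices import IMX500

LABELS = ["swipe_left", "swipe_right", "t_pose", "idle"]  # assumed labels

imx500 = IMX500("/usr/share/imx500-models/gesture_model.rpk")  # assumed path
picam2 = Picamera2(imx500.camera_num)
picam2.start(picam2.create_preview_configuration())

pico = serial.Serial("/dev/ttyACM0", 115200, timeout=1)  # Pico over USB

while True:
    # Inference runs on the sensor; results arrive in the frame metadata.
    metadata = picam2.capture_metadata()
    outputs = imx500.get_outputs(metadata)
    if outputs is None:
        continue
    scores = np.ravel(outputs[0])
    gesture = LABELS[int(np.argmax(scores))]
    if gesture != "idle":
        # Simple text protocol; the Pico maps each command to servo moves.
        pico.write((gesture + "\n").encode())
```

Because the classification happens on the sensor itself, the Pi 5's CPU stays free for orchestration, which is part of what makes this kind of responsive installation practical on modest hardware.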

The result is an immersive, touch-free interactive model that demonstrates how edge AI can power smart architecture and responsive environments.

Built With:
  • Raspberry Pi 5
  • Raspberry Pi AI Camera with IMX500 Sensor
  • Raspberry Pi Pico
  • Servo Motors & DC Motor

The Raspberry Pi AI Camera, powered by the capabilities of Sony’s IMX500 sensor, was central to the success of both projects, providing the advanced computer vision and real-time processing needed to bring these ideas to life. We extend our congratulations to the Project AURA and Sign Language Interpreter teams for their dedication.