
Accelerate your Linux and Edge AI product development

Foundries.io is teaming up with Edge Impulse (a Qualcomm company) at The Things Conference 2025 to demonstrate how our combined technologies for AI-powered IoT can help you accelerate your Linux and Edge AI product development.

Schedule a meeting with us

Workshop

From Voice to Action: Deploying Embedded Linux and Edge AI with FoundriesFactory and Edge Impulse

Join this hands-on workshop presented by Foundries.io and Edge Impulse to learn how to build, deploy, and manage secure AI-powered embedded Linux devices — from data collection to over-the-air deployment.

We'll guide you through using FoundriesFactory, a DevOps platform for embedded Linux, to design, develop, and manage connected devices with CI/CD pipelines, OTA updates, and robust security features — all streamlined for real-world deployment.

On the AI side, you'll explore how to use the Edge Impulse platform to collect voice data, train a compact neural network model to recognize basic color-related words (e.g., “red,” “blue,” “green”), and deploy that model into your Linux-based device.

The result? A complete project where the embedded device recognizes voice commands and controls a smart light bulb accordingly — showcasing a practical application of voice-enabled edge AI.

Whether you're an embedded developer exploring AI or an ML engineer entering the world of Linux devices, this workshop gives you a complete pipeline from development to deployment — using real hardware and production-ready tooling.
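To give a feel for the inference side of that pipeline, here is a minimal sketch, assuming the Edge Impulse Linux Python SDK (`edge_impulse_linux`) and a hypothetical `set_bulb_color()` helper standing in for whatever smart-bulb integration the workshop uses; the actual workshop code and model will differ.

```python
# Minimal sketch (not the workshop code): run an Edge Impulse keyword-spotting
# model on a Linux device and map recognized color keywords to a smart bulb.
import sys
from edge_impulse_linux.audio import AudioImpulseRunner


def set_bulb_color(color: str) -> None:
    # Hypothetical placeholder for the smart-bulb integration
    # (e.g. MQTT, Zigbee bridge, or vendor API).
    print(f"Setting bulb color to {color}")


def main(model_path: str, threshold: float = 0.8) -> None:
    # model_path points to the .eim model file exported from Edge Impulse.
    with AudioImpulseRunner(model_path) as runner:
        model_info = runner.init()
        print("Loaded model:", model_info["project"]["name"])

        # classifier() streams microphone audio and yields classification
        # results per window; you may need to pass a device_id for your mic.
        for res, _audio in runner.classifier():
            scores = res["result"]["classification"]
            label, score = max(scores.items(), key=lambda kv: kv[1])
            if label in ("red", "green", "blue") and score >= threshold:
                set_bulb_color(label)


if __name__ == "__main__":
    main(sys.argv[1])
```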


Raul Munoz ‐ Technical Marketing Manager, Foundries.io


Jim Bruges ‐ Developer Relations Engineer, Edge Impulse

Product demos

At this year's The Things Conference, Foundries.io will showcase how our FoundriesFactory™ platform enables secure, scalable, and production-ready development for edge AI devices — live and hands-on.

We'll feature two demo stations, each powered by the Qualcomm® Dragonboard™ RB3 Gen 2 Vision Kit, both running the Linux microPlatform and managed remotely by the same FoundriesFactory.

Voice Recognition + Smart Light Control

Each station will demonstrate a voice-controlled system, built using Edge Impulse, capable of recognizing color-related keywords (e.g., “red,” “blue,” “green”) and changing the color of a smart light bulb accordingly, in real time.

A special trigger word will activate an OTA (over-the-air) update via FoundriesFactory, remotely switching both stations to a new visual AI application — showcasing the power of dynamic fleet management and secure remote software delivery.

Rotating Visual AI Applications (via OTA)

Each station can be seamlessly switched between the following applications:

  1. Face Detection (Qualcomm AI Stack Demo)

    This demo uses real-time face detection powered by GStreamer and Qualcomm's AI Stack. The system identifies human faces from the camera feed with low latency and highlights them in the video stream.

  2. People Recognition with Edge Impulse

    A continuously evolving demo that recognizes known individuals using a custom ML model trained on Edge Impulse. This model has improved with data collected at past events — highlighting how iterative learning at the edge can improve accuracy over time.

  3. Video Super-Resolution (Qualcomm AI Stack Demo)

    This application demonstrates real-time upscaling of low-resolution video using AI. It enhances video clarity and sharpness, showing how AI can transform video feeds on the edge device itself.

Tickets

With the promo code FOUNDRIES30, you can get a 30% discount on your Explore Ticket and have the chance to meet both teams and experience all demos in person. Don't miss out on this opportunity!

Redeem it now


Foundries.io and Edge Impulse booth location: B02