Mapping with a Robotic Quadruped

SLAM navigation and real-time mapping, running live on a quadruped robot platform.

A major 5G infrastructure vendor, looking to showcase its connectivity solutions for robotics applications, needed a compelling way to demonstrate autonomous navigation and mapping capabilities to stakeholders and decision-makers. The goal was a live, interactive demonstration that made the technology tangible: a production-grade robotics stack deployed on a commercial quadruped, which non-technical audiences could interact with directly, without hiding what the system was actually doing. The demo also needed to be reliable enough to run in a live lab setting, in front of an audience.

A tech company needed a live, interactive robotics demo to make autonomous navigation tangible for stakeholders.

SLAM

Robot Navigation

Foxglove

ROS2

Autonomous Navigation

Human-Robot Interaction

What We Built

Focus built a full autonomous navigation and mapping stack on a Unitree GO2 quadruped, running ROS 2 Jazzy with RTAB-Map SLAM and the Nav2 navigation stack, containerized in Docker on the robot's onboard compute. The system supported both Gazebo simulation and real-robot operation from a single codebase, enabling efficient development and validation before live deployment. Participants interacted through a Foxglove Studio interface connected via a WebSocket bridge, allowing them to reset the map, trigger recovery behaviors, and send click-to-goal navigation commands in real time. As the robot moved, the map it was building appeared live on screen, giving any observer a direct window into how SLAM-based autonomy works in practice.
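Click-to-goal navigation hinges on one small but essential conversion: turning a click on the rendered map image into coordinates in the robot's map frame. A minimal sketch of that conversion, assuming a standard occupancy-grid layout (resolution in meters per cell, origin at the world position of cell (0, 0), image rows growing downward while grid rows grow upward); the function name and parameters here are illustrative, not taken from the actual codebase:

```python
def pixel_to_map(px, py, resolution, origin_x, origin_y, image_height):
    """Convert an image-pixel click (px right, py down) into
    map-frame (x, y) coordinates, given an occupancy grid whose
    origin is the world position of cell (0, 0)."""
    # Image rows grow downward; occupancy-grid rows grow upward,
    # so flip the vertical axis before scaling by the resolution.
    wx = origin_x + px * resolution
    wy = origin_y + (image_height - 1 - py) * resolution
    return wx, wy

# Example: a 400-row map at 5 cm/cell with origin (-10, -10):
x, y = pixel_to_map(100, 50, 0.05, -10.0, -10.0, 400)
```

The resulting (x, y) pair is what gets stamped into the goal pose that the navigation stack consumes.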


Impact

Stakeholders experienced autonomous navigation firsthand, turning an abstract technology proposition into a tangible demo that drove real engagement.

Production-Grade Autonomy Stack

ROS 2 Jazzy, RTAB-Map SLAM, and the Nav2 navigation stack ran fully containerized on the robot's onboard compute, delivering a reliable, self-contained system capable of operating live in front of an audience.

Simulation-to-Reality in One Codebase

A single codebase supported both Gazebo simulation and real-robot deployment, allowing the team to develop and validate behaviors in simulation before switching to live hardware without any reconfiguration.
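One common way a single codebase supports both worlds is a launch-time flag that swaps only the environment-specific pieces (clock source, sensor topics) while the SLAM and Nav2 configuration stays identical. A hypothetical sketch of that pattern; the parameter and topic names are illustrative, not taken from the project:

```python
def nav_stack_params(use_sim: bool) -> dict:
    """Return per-environment parameters for the navigation stack.
    Everything SLAM/Nav2-specific is shared; only the clock source
    and sensor topic change between simulation and hardware."""
    common = {"slam": "rtabmap", "navigation": "nav2"}
    if use_sim:
        # Gazebo publishes a simulated clock and simulated sensors.
        env = {"use_sim_time": True, "scan_topic": "/gazebo/scan"}
    else:
        # On hardware, wall-clock time and the robot's own sensors.
        env = {"use_sim_time": False, "scan_topic": "/go2/scan"}
    return {**common, **env}
```

Keeping the delta between simulation and hardware this small is what makes "validate in Gazebo, then flip to the real robot" possible without reconfiguration.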

Interactive Control for Non-Technical Audiences

A Foxglove Studio interface let participants send click-to-goal commands, reset the map, and trigger recovery behaviors in real time, making autonomous navigation something anyone in the room could directly engage with.
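Under the hood, a click-to-goal command becomes a pose whose heading is encoded as a quaternion. A minimal sketch of the yaw-to-quaternion conversion a click handler would perform before publishing the goal (an illustrative helper, not the project's actual code):

```python
import math

def yaw_to_quaternion(yaw: float) -> tuple:
    """Encode a planar heading (radians) as a z-axis rotation
    quaternion (x, y, z, w), the form a ROS goal pose expects."""
    # A rotation of `yaw` about z has half-angle sine/cosine in z and w.
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

# Example: face "map east" (yaw = 0) vs "map north" (yaw = pi/2).
identity = yaw_to_quaternion(0.0)
north = yaw_to_quaternion(math.pi / 2.0)
```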

A live demo that closed the gap between technical capability and business understanding, making quadruped autonomy tangible for the people who need to act on it.