Founder Spotlight: Zhaopeng Chen is Building Smarter, More Agile Robots

For most of his career, Zhaopeng Chen built robots for places most people never see. As a researcher at the DLR’s Institute of Robotics and Mechatronics in Germany, he worked on robotic systems for space stations and missions that could go from the Moon to Mars, even as far as a comet. Yet when he looked at robots on Earth, he saw a different picture: a field that sounded futuristic, but was still tiny compared with cars and smartphones, and held back in his view by one core issue – robots still lacked true intelligence.
Chen had been fascinated by science fiction stories about robots since childhood, started building robotic hands and arms during his studies in China, and went on to complete a doctorate at DLR, where he worked on the team that developed a widely acclaimed humanoid hand. Over time, the question that stuck with him was not just how far he could push the technology, but how to bring that level of precision and sensing into real factories and hospitals.
In 2018, together with fellow DLR researchers, he founded Agile Robots as a spin-off to bridge artificial intelligence and robotics and to tackle complex industrial and medical tasks that conventional automation still struggles to handle.
Today, Agile Robots is a Munich-based provider of next-generation automation solutions that combines sensitive robot arms, cobots, mobile platforms, a dexterous anthropomorphic hand, and the new industrial humanoid Agile ONE with an AI-driven software stack. Its systems are deployed in sectors such as automotive, consumer electronics, and healthcare, where they perform tasks ranging from precision assembly and quality inspection to surgical assistance and teleoperated procedures in infectious disease settings.
In this conversation, Chen talks about his career journey in robotics, the idea behind Physical AI and robotic workers, what it takes to turn cutting-edge research into production-ready systems on factory floors, and his advice for founders and operators building at the intersection of robotics and AI.
Where did your interest in robotics begin, and how did it turn into a career?
Ever since I was a kid, I’ve loved science fiction books and movies that imagine a future full of robots. That curiosity stayed with me, so when I went to university, I set my sights on robotics, even though joining the robotics research institute there was not easy. In my third year, I started working on real projects, and I have been building robotic systems ever since, including robotic hands and arms. I was involved end-to-end, from mechanical design and electronics to control algorithms and eventually AI.
Later, I joined the DLR’s Institute of Robotics and Mechatronics in Munich and spent several years working on aerospace robotics. That experience, and the people I worked with there, shaped how I think about high-reliability systems and precision.
What made you leave space robotics and start Agile Robots, and what problem were you trying to solve?
I spent many years at DLR working on robotic systems for outer space. From a technical perspective, it was very exciting. But at some point, I started asking myself when this kind of technology would really change everyday life on Earth. When you look at the numbers, robotics is still very small compared with industries like automotive or smartphones, and one big reason, in my view, is that most robots still lack real intelligence. They are very good tools, but they are not yet the kind of flexible, collaborative co-workers many people imagine when they think about robots.
Around 2018, I felt that the technology and the real-world demand were finally starting to line up. That was the moment when my colleagues and I decided to spin out of DLR and start Agile Robots. The core idea was simple to describe but hard to execute: bring artificial intelligence and robotics together in a way that makes robots truly intelligent in the physical world.
For us, that means embodied intelligence. The robot needs a “body” that can feel and see what is happening around it, and a “brain” that can learn from real applications. We work on both sides. On the hardware side, we build sensitive, safe robots that can operate close to people. On the software and AI side, we develop models that learn from many different tasks and environments so the robot can understand instructions, adapt to variations and handle a much wider range of work than traditional automation. That combination is what Agile Robots was created to solve.
How would you explain the core idea behind Agile Robots to someone who is not a roboticist?
Most robotic systems you see today are really mechatronic systems. If there is no intelligence, they are not truly robots in my definition. They are devices or tools. At Agile Robots, our long-term goal is AGI (artificial general intelligence) in robotics, which means robots that can generalize across many tasks and environments rather than being locked into a single narrow use case.
To get there, we combine three things. First is embodied intelligence, the physical robots with rich sensing that can safely interact with the world. Second is a large robot model or foundation model that captures skills and knowledge across tasks. Third is data that comes directly from real industrial applications. When you combine those three elements, you can go one step beyond traditional automation and start to automate a much broader range of processes. Our robots already use methods like large language models and vision language models, and they have full-body sensitivity, so they can understand natural language instructions, identify and manipulate objects, and respond to touch and pressure.
Agile Robots builds both hardware and software. What does your technology stack look like in practice?
I often say our company is like an iceberg. The hardware, the robots you see in videos or at trade fairs, is only the part above the surface. Underneath are our software and AI systems, and that is where a large part of the value is created. We develop force-controlled, multisensing robots for industry and medicine, and we also work on humanoid technologies like a five-finger dexterous hand, where each finger is essentially a small robot of its own.

The dexterous hand is built from five identical, modularly designed robotic fingers.
On the hardware side, an example is our Yu 5 Industrial cobot. It is designed to work safely next to people and to take on everyday jobs on the line, such as inspecting parts, loading and unloading machines, doing final assembly, dispensing material, and helping with packaging.
Our software platform, AgileCore, connects everything. It abstracts hardware from different vendors, provides drag-and-drop programming, dashboards for process monitoring, and a software development kit so developers can add their own skills. On top of that, we have AgileAI, our AI assistant. With AgileAI, operators can use natural language to define tasks, get step-by-step guidance for connecting peripherals, and use vision language models to help the robot understand what it is seeing and what it should do next.
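To make the idea of a skill-based platform more concrete, here is a minimal Python sketch of what defining and registering a reusable robot skill could look like on a layer that abstracts the hardware. Every name in it (the Pose and Robot classes, pick_part, SKILL_REGISTRY) is a hypothetical placeholder invented for this illustration, not the actual AgileCore or AgileAI API.

```python
# Hypothetical sketch only: the classes and functions below are invented for
# illustration and do not correspond to the real AgileCore or AgileAI SDK.
from dataclasses import dataclass


@dataclass
class Pose:
    """A simple target pose: position in metres, orientation as roll/pitch/yaw in radians."""
    x: float
    y: float
    z: float
    roll: float
    pitch: float
    yaw: float


class Robot:
    """Stand-in for a vendor-abstracted robot handle exposed by a platform layer."""

    def move_to(self, pose: Pose, speed: float = 0.1) -> None:
        print(f"Moving to {pose} at {speed} m/s")

    def close_gripper(self, max_force_n: float = 20.0) -> None:
        print(f"Closing gripper, force limited to {max_force_n} N")


def pick_part(robot: Robot, grasp_pose: Pose) -> None:
    """A reusable 'skill': approach from above, then grasp with a force limit."""
    approach = Pose(grasp_pose.x, grasp_pose.y, grasp_pose.z + 0.10,
                    grasp_pose.roll, grasp_pose.pitch, grasp_pose.yaw)
    robot.move_to(approach)
    robot.move_to(grasp_pose, speed=0.05)   # slow, sensitive final approach
    robot.close_gripper(max_force_n=15.0)   # limit force for delicate parts


# A registry like this is where a language-driven assistant could map an
# instruction such as "pick the part" onto a concrete skill.
SKILL_REGISTRY = {"pick_part": pick_part}

if __name__ == "__main__":
    pick_part(Robot(), Pose(0.4, 0.1, 0.05, 3.14, 0.0, 0.0))
```

The point of the sketch is the division of labour: skills encapsulate motion and force limits, while a higher-level assistant only has to choose which skill to run and with what parameters.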
What are some real-world applications where your robots are already having an impact?
We focus mainly on three markets: 3C (computer, communication and consumer electronics) manufacturing, automotive, and medical, and we’re increasingly involved in energy and related industries as well. On the industrial side, our cobots and arms are used for applications like quality inspection, machine tending, assembly, dispensing, and packaging.
Yu 5, for example, can inspect parts with its camera and force sensing, keep machines running at night, and take over fine assembly or dispensing steps that require consistent, precise motion.
In healthcare, we develop surgical robots that relay tactile feedback to the surgeon, so they can actually feel what is happening at the tool tip. We have also built teleoperated systems that can, for example, take throat swabs in infectious disease settings so medical staff can keep a safer distance.
AI opens up new types of use cases, too. At the Automatica trade fair in Munich, we used AgileAI and AgileCore to control a mobile robot that performed server maintenance tasks. Visitors could type something like, “Check whether all hard disks are present” or “Please replace this drive,” and the system would find the right component, carry out the task, and then report back. For me, that is a nice illustration of how Physical AI can handle more complex, real-world workflows, not just simple pick-and-place.
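As a purely illustrative sketch, the snippet below shows the overall shape of such a language-to-action loop: a typed request is matched to a skill, a vision step locates the relevant components, and the result is reported back. All of the functions and names here (detect_components, check_disks, SKILLS, and so on) are assumptions made for this example; they do not describe how the trade fair demo or Agile Robots’ software is actually implemented.

```python
# Purely illustrative sketch: none of these functions describe Agile Robots' actual
# software; they only show the shape of a language -> vision -> action -> report loop.
from typing import Callable, Optional


def detect_components(image_bytes: bytes, query: str) -> list[dict]:
    """Placeholder for a vision-language model call that locates objects matching a query."""
    # A real system would call a detector or VLM here; this returns a canned result.
    return [{"label": "hard_disk", "slot": 3, "present": False}]


class FakeCamera:
    def capture(self) -> bytes:
        return b""  # stand-in for a real camera image


def check_disks(robot, camera) -> str:
    detections = detect_components(camera.capture(), "hard disk slots")
    missing = [d["slot"] for d in detections if not d["present"]]
    return f"Missing disks in slots: {missing}" if missing else "All hard disks are present."


def replace_drive(robot, camera) -> str:
    # A real skill would combine motion planning with force-controlled insertion.
    return "Drive replaced (simulated)."


# A tiny "planner": map a typed request onto a registered skill.
SKILLS: dict[str, Callable[..., str]] = {
    "check whether all hard disks are present": check_disks,
    "please replace this drive": replace_drive,
}


def handle_request(text: str, robot=None, camera: Optional[FakeCamera] = None) -> str:
    skill = SKILLS.get(text.strip().lower().rstrip("?."))
    return skill(robot, camera) if skill else "Sorry, I do not know that task yet."


if __name__ == "__main__":
    print(handle_request("Check whether all hard disks are present", camera=FakeCamera()))
```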

What have you learned from taking robotics out of the lab and into high-volume production environments?
I have been building robots for almost two decades, so on the technology side, there are fewer surprises for me now. What I learned very quickly, though, is that having an impressive demo is not the same as having a product that actually works in a factory. Turning cutting-edge technology into something that survives on a production line is a challenging process.
That is why I tell my team not to put ourselves on a pedestal. At heart, we are craftsmen. The most important work happens at the customer site, on the line, where you see why a human operator can still complete certain tasks that a robot struggles with. When we started working with Foxconn, our team from Germany literally lived on their campus for an extended period. We lined up with the workers for lunch and spent countless hours on the shop floor studying how they actually work and where our robots fell short. Only after going through that did our technology start to become a product that customers recognized and trusted.
Partnerships also matter. In the early days, when we were a small and unknown company in Germany, it was difficult to convince suppliers that we were serious when we talked about larger volumes. This is one of the many areas where HSG helped tremendously. When suppliers checked who our investors were, their attitude changed.
You often talk about Physical AI and robotic workers. How do you see the future of robotics and intelligent systems?
For me, the next industrial revolution is Physical AI. By that I mean intelligent, autonomous, and flexible robots that can perceive, understand, and act in the physical world. Our vision of robotic workers is not limited to machines that look like humans, although some will. It is about robots that have both a “brain” and “hands” – they can feel forces and touch, see their surroundings, understand language, and then work safely alongside people. In our lab, for example, we show a robot that can feel when a nail at the end of its arm just touches skin without piercing it, but it can still pop a balloon.
Our new humanoid, Agile ONE, is one step in that direction. It is designed as an industrial co-worker that can move autonomously between workstations, use cameras, LiDAR, and speech recognition to understand what is happening around it, and rely on dexterous hands with integrated force and tactile sensing to handle both delicate and more demanding tasks. Behind it is a robotic foundation model that learns from real industrial data, simulations, and human teleoperation.
Agile ONE is not meant to work alone. It is part of a larger system with our arms, cobots, hands, and mobile platforms, all connected through our software platform. The idea is that the whole production system can keep learning and adapting over time, instead of being a fixed, one-off automation project.
What advice would you share with founders and operators who are building with robotics and AI today?
The first thing is to be honest about how hard it is. Robotics sounds futuristic, but in reality, it is a lot of craftsmanship, and the market will not forgive you for ignoring practical details. You have to spend time where your robots are deployed, understand the work better than anyone else, and treat feedback from operators very seriously. Technology alone is not enough.
Second, focus and persistence matter more than big slogans. Whatever I do, I try to give it everything, and I always think through the worst-case scenario so I can prepare for it. That applies just as much to extreme sports as to building a company. If you keep working tirelessly toward something that seems almost impossible at the beginning, you will find that help appears along the way. For us, that meant combining German precision engineering with the scale and speed of the Chinese robotics market and then staying committed for many years.