Overview

Robots are increasingly expected to operate in unstructured and dynamic environments, where they must perform complex and diverse tasks. This places new demands on the robustness and interpretability of robot perception.

In this workshop, we aim to examine the state of the art in robot perception and to discuss what is still missing to achieve these properties. We will discuss how perception in modular pipelines and in end-to-end learning approaches, e.g., using foundation models such as VLMs or VLAs, can support robustness and interpretability. The workshop's scope is not limited to visual perception; we also want to explore how other modalities (e.g., tactile perception) can contribute to answering these questions. Lastly, we will ask how robustness and interpretability can be assessed for robot systems.

Through invited talks, roundtable discussions, and spotlight/poster presentations of contributed extended abstracts, the workshop provides an opportunity to identify open challenges, assess the promises and limitations of current approaches, and chart new directions for achieving robust and interpretable perception in robotics.



Speakers

  • Margarita Chli

    ETH Zurich and University of Cyprus

    She is a Professor of Robotic Vision and Director of the Vision for Robotics Lab at the University of Cyprus, and a Visiting Professor at ETH Zurich. Her research has pioneered vision-based autonomous flight and collaborative monocular SLAM for drone swarms. She is the recipient of an ERC Consolidator Grant and has delivered invited keynote talks at venues including the World Economic Forum in Davos, TEDx, and ICRA.

  • Xiaolong Wang

    UC San Diego

    He is an Assistant Professor in the ECE department at the University of California, San Diego, and a Visiting Professor at NVIDIA Research. His research focuses on the intersection between computer vision and robotics. His specific interest lies in representation learning with videos and physical robotic interaction data. These comprehensive representations are utilized to facilitate the learning of human-like robot skills, with the goal of generalizing the robot to interact effectively with a wide range of objects and environments in the real physical world.

  • Kostas Alexis

    Norwegian University of Science and Technology

    He is a Full Professor at the Department of Engineering Cybernetics of the Norwegian University of Science and Technology (NTNU) at Trondheim, Norway. His research goal is to contribute towards establishing true navigational and operational autonomy for robotics.

  • Yu Xiang

    University of Texas at Dallas

    He is an Assistant Professor in the Department of Computer Science at the University of Texas at Dallas. His research lies at the intersection of robotics and computer vision, with a focus on enabling intelligent systems to perceive, understand, and act in complex 3D environments.

  • Georgia Chalvatzaki

    TU Darmstadt

    She is a Full Professor for Interactive Robot Perception & Learning at the Computer Science Department of the Technical University of Darmstadt and Hessian.AI. Her research focuses on robot learning for mobile manipulation in assistive robotics, advancing embodied AI through methods at the intersection of machine learning and classical robotics.

  • Alberto Rodriguez

    Boston Dynamics

He is the Director of Robot Behavior for Atlas at Boston Dynamics; previously, he was an Associate Professor in the Department of Mechanical Engineering at MIT, where he led the MCube lab. His research focuses on robotic manipulation, emphasizing contact mechanics, tactile sensing, and data-driven control methods. His work integrates modeling, planning, and learning to advance manipulation skills in both academic and industrial robotics.

Schedule

This is a preliminary version of the schedule and may be subject to change.

Time   Description
8:50   Opening Remarks by the Workshop Organizers
9:00   Topic 1: Perception in Navigation
9:00   Margarita Chli (ETH Zurich and University of Cyprus): TBD (requested: Robustness and Interpretability in Modular Approaches for Navigation)
9:30   Xiaolong Wang (UC San Diego): TBD (requested: Robustness and Interpretability in End-to-end Learning Approaches for Navigation)
10:00  Spotlight Talks
10:15  Coffee Break and Poster Session
11:00  Kostas Alexis (Norwegian University of Science and Technology): TBD (requested: Robustness and Interpretability in Application-Oriented Approaches for Navigation)
11:30  Roundtable Discussion
12:00  Lunch
13:00  Topic 2: Perception in Manipulation
13:00  Yu Xiang (University of Texas at Dallas): TBD (requested: Robustness and Interpretability in Modular Approaches for Manipulation)
13:30  Georgia Chalvatzaki (TU Darmstadt): TBD (requested: Robustness and Interpretability in End-to-end Learning Approaches for Manipulation)
14:00  Spotlight Talks
14:15  Coffee Break and Poster Session
15:00  Alberto Rodriguez (Boston Dynamics): TBD (requested: Robustness and Interpretability in Application-Oriented Approaches for Manipulation)
15:30  Roundtable Discussion
16:00  Closing Remarks

Call for Papers

We invite the submission of field reports or extended abstracts on the following topics of interest:

  • Robust perception for navigation in unstructured and dynamic environments
  • Robust perception for manipulation in unstructured and everyday environments
  • Perception in end-to-end learning architectures for robotic navigation and manipulation
  • Interpretability and robustness of perception in end-to-end learning robot systems
  • Uncertainty quantification for robot perception methods
  • Introspection and interpretability of perception methods in robot systems
  • Tactile or visuo-tactile perception for robust contact-rich manipulation in unstructured environments
  • Lessons learned from robot perception in integrated robot systems, incl. informative failure cases
  • Datasets and benchmarks for robustness and interpretability of perception in real-world robot systems

All submitted papers will be reviewed on the basis of technical quality, relevance, significance, and clarity. The page limit is 4 pages, including references. We also accept submissions of previously presented work that has since been extended, as well as work being published as part of the ICRA 2026 main conference. Upon acceptance, you will be able to present your submission in the poster session; some papers will be selected for oral spotlight presentations. All accepted submissions will be made available on this website (non-archival).

Please submit your extended abstracts following the ICRA 2026 format guidelines. For details, see the following links:

Please send your submission to TBD.

Important Dates

Extended Abstract Submission Deadline: April 7, 2026

Decision Notification: May 8, 2026

Final Version: May 22, 2026

Workshop Date: June 1, 2026

Organizers