DIFusio@

Second week of March: 4 talks by robotics researchers


In the context of the course Robótica Inteligente II of the KISA master's programme, four researchers will give invited talks during the second week of March:

March 6: Silvia Rossi (in person, 15:00)
March 7: Concha Monje (online, 15:00, ehu.webex.com/meet/elazkano)
March 9: Oier Mees (in person, 15:00)
March 9: Zerrin Yumak (online, 17:00, ehu.webex.com/meet/elazkano)

___________________________________

Silvia Rossi (DIETI, University of Naples Federico II, Italy)

Talk: Behavioral Adaptation and Transparency in HRI

Abstract: To effectively collaborate with people, robots are expected to detect and profile the users they are interacting with and modify and adapt their behavior according to the learned models. Many research challenges for personal robotics are related to the need for a high degree of personalization of the robot's behavior to the specific user’s needs and preferences. It is crucial to investigate how robots can better adjust to their human interaction partners and how the behavior of a robot can be personalized based on the user’s characteristics such as personality and cognitive profile, and their dynamic changes. However, in practice, it is not just a matter of performing an interaction task correctly, but mainly of performing such tasks in a way that is socially acceptable to humans. The more robots become adaptive and flexible, the more their behaviors need to be transparent. When interacting with complex intelligent artefacts, humans inevitably formulate expectations to understand and predict their behaviors. Indeed, robots’ behaviors should be self-explanatory so that users can be confident in their knowledge of what these systems are doing and why. In this talk, we will present and discuss several topics connected to adaptivity in robot behavior and the need for legibility and transparency.

---
Concepción Alicia Monje (UC3M, Madrid)

Talk: Fractional-order control of soft robots

Abstract: The emerging field of soft robotics presents itself today as an innovative way to create and apply robotic technologies in our daily lives. Despite being a relatively new field, it has great potential to change the way we relate to robots and the way we use them. The soft nature of these robots makes them complex systems that are difficult to model, and only with robust control of these models can we ensure that they perform their tasks efficiently and safely. In this talk we will address the control problem of soft robots using fractional-order control, and we will experimentally demonstrate the effectiveness of this type of controller.
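
As brief background for readers unfamiliar with the term (a general illustration, not material from the announcement nor necessarily the specific controller discussed in the talk), a common fractional-order controller is the PI^λD^μ structure, which generalizes the classical PID by allowing non-integer integration and differentiation orders:

\[
  C(s) = K_p + \frac{K_i}{s^{\lambda}} + K_d\, s^{\mu}, \qquad \lambda,\ \mu \in (0, 2)
\]

For λ = μ = 1 this reduces to the ordinary PID; the two extra tuning parameters are typically used to gain robustness (for example, keeping the phase margin flat against gain variations), which is what makes this family attractive for systems whose models are uncertain, such as soft robots.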

---
Oier Mees (University of Freiburg, Germany)

Talk: Interactive Language Instructable Robot Learning

Abstract: Despite considerable progress in robot learning, and contrary to the expectations of the general public, the vast majority of robots deployed in the real world today remain restricted to a narrow set of preprogrammed behaviors for specific tasks in controlled industrial settings. As robots become ubiquitous across human-centered environments, the need for "generalist" robots grows: how can we scale robot learning systems to autonomously acquire general-purpose knowledge that allows them to perform a wide range of everyday tasks in unstructured environments based on arbitrary instructions from the user?
In my work, I have focused on addressing the challenging problem of relating human language to a robot's perception and actions by introducing techniques that leverage self-supervision and structural priors from uncurated data to enable sample-efficient learning of language-conditioned manipulation and navigation tasks.

---
Zerrin Yumak (Utrecht University, Netherlands)

Talk: AI-driven Virtual Humans with Non-verbal Communication Skills

Abstract: With recent advancements in computer graphics, 3D virtual humans have reached a very high level of appearance realism. They appear in a range of applications such as games, chatbots for customer service and finance, simulations for education and healthcare, remote communication and (social) VR. Yet their interactivity and movement are limited. As humans, we are very receptive to non-verbal behaviors when we engage in social and emotional interactions. For us to interact with virtual humans naturally, they should also be equipped with non-verbal communication skills such as facial expressions, gestures and gaze. As they appear in more interactive applications, the demand to automatically generate their behavior on the fly will increase. This talk will discuss the role of AI and machine learning algorithms in generating the behavior and movement of virtual humans. I will present the state of the art and some of our research work on interactive virtual humans.

