Posts Tagged ‘Robotics’

Eye fixations and observer motion and their role in 3D perception

Friday, September 9th, 2011

Jan-Olof Eklundh
Centre for Autonomous Systems,
KTH Royal Institute of Technology,
Sweden

A human observer is constantly active, moving about in the world and shifting gaze to fixate different parts of the surrounding scene. Sometimes these movements are performed with the intent to acquire more information about an object or part of the scene that is of interest, but often there is no direct relation between the observer's motions and actions and what he or she is looking at. Nevertheless, the motions and fixations performed always provide valuable information about the 3D structure of the world, and not least about what could constitute objects of interest.

These observations suggest that artificial systems – ‘seeing robots’ – should use similar approaches to perceive and act in their environment. In the talk I’ll discuss the role of monocular and binocular fixation in 3D perception by a robot vision system, and how fixation and body actions influence the ‘understanding’ of the world. It will be shown that problems such as figure-ground segmentation, which seem difficult in computer vision, become easier when observer actions are taken into account. I’ll also show some examples of how we’ve implemented mechanisms of these types.
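One way observer motion can simplify figure-ground segmentation is through motion parallax: when the observer translates, nearby points shift more in the image than distant ones. The following is an illustrative sketch of this principle only, not the speaker's implementation; the function name, pinhole-camera model, and all numbers are assumptions chosen for the example.

```python
# Illustrative sketch: depth from motion parallax under a lateral
# observer translation, assuming a simple pinhole camera model.

def depth_from_parallax(image_shift_px: float, baseline_m: float,
                        focal_px: float) -> float:
    """Recover depth Z = f * T / d.

    A point at depth Z shifts by d = f * T / Z pixels when the
    observer translates laterally by baseline T, so nearer points
    shift more than distant ones.
    """
    if image_shift_px <= 0:
        raise ValueError("zero or negative parallax implies no depth cue")
    return focal_px * baseline_m / image_shift_px

# A fixated foreground object shifts much more than the background,
# so simply thresholding the measured shift separates figure from ground.
shifts_px = {"object": 40.0, "background": 4.0}   # illustrative values
depths_m = {name: depth_from_parallax(d, baseline_m=0.1, focal_px=800.0)
            for name, d in shifts_px.items()}
# object: 0.1 * 800 / 40 = 2.0 m; background: 0.1 * 800 / 4 = 20.0 m
```

The large depth gap recovered from a small observer movement is what makes the segmentation problem easier than it would be from a single static image.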

Jan-Olof Eklundh: “Eye fixation and observer motion and their role in 3D perception” from eSMCs on Vimeo.

On the role of embodiment in the emergence of cognition

Friday, September 9th, 2011

Rolf Pfeifer
AI Lab,
University of Zürich,
Switzerland

Traditionally, robotics, artificial intelligence, and neuroscience have focused on the study of the control or neural system itself. Recently there has been increasing interest in the notion of embodiment across all disciplines dealing with intelligent behavior, including psychology, philosophy, and linguistics. In an embodied perspective, cognition is conceived as emergent from the interaction of brain, body, and environment, or more generally from the relation between physical and informational (neural, control) processes. It can be shown – and this is one of the underlying assumptions of the eSMCs project – that through embodied interaction with the environment, in particular through sensory-motor coordination, information structure is induced in the sensory data, thus facilitating categorization, perception, and learning. The patterns thus induced depend jointly on the morphology, the material characteristics, the actions, and the environment.

Because biological systems are mostly “soft”, a new engineering discipline, “soft robotics”, has taken shape over the last few years. I will discuss the far-reaching implications of embodiment, in particular of having a soft body, for our view of the mind and human behavior in general. Cognition is no longer centralized in the brain but distributed throughout the organism, and functionality is “outsourced” to the morphological and material properties of the organism, which requires an understanding of processes of self-organization. Because in “soft” systems part of the functionality resides in the morphology and materials, there is no longer a clear separation between the controller and the to-be-controlled, which implies that we need to fundamentally rethink the notion of control. These ideas will be illustrated with case studies from biology – humans and animals – and robotics, and will be summarized as a set of four “messages” for embodied systems.

Rolf Pfeifer: “On the role of embodiment in the emergence of cognition” from eSMCs on Vimeo.