Real-time model based visual servoing tasks on a humanoid robot

  • Amine Abou Moughlbay*
  • Enric Cervera
  • Philippe Martinet

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

7 Citations (Scopus)

Abstract

Several model-based techniques have been used to perform various domestic service tasks on humanoid robots (through teleoperation, learning, etc.). In these cases, however, it is often more suitable to study the interaction between the robot and its environment using sensor-based control. In this paper we present the integration of real-time visual servoing techniques for self-localization and various manipulation tasks on a humanoid robot in closed loop. Real-time model-based tracking techniques are used to apply 3D visual servoing tasks on the Nao humanoid robot. The elementary tasks used by the robot to perform a concrete scenario are detailed together with their corresponding control laws. Experimental results are presented for the following tasks: self-localization of the robot while walking, head servoing for the visibility task, and detection, tracking, and manipulation of objects in the environment.
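The paper's specific control laws are not reproduced in this abstract, but model-based visual servoing schemes of this kind are typically built on the classic law v = −λ L⁺ (s − s*), which drives the feature error e = s − s* to zero. A minimal sketch of that law follows; the function name, gain value, and test features are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def visual_servoing_velocity(s, s_star, L, lam=0.5):
    """Classic visual servoing control law: v = -lambda * L^+ (s - s*).

    s, s_star : current and desired visual feature vectors, shape (k,)
    L         : interaction matrix (image Jacobian), shape (k, 6)
    lam       : positive control gain
    Returns the 6-DOF camera velocity twist (vx, vy, vz, wx, wy, wz).
    """
    error = s - s_star                      # feature error e = s - s*
    return -lam * np.linalg.pinv(L) @ error  # exponential decay of e
```

With a square, well-conditioned interaction matrix the error decays exponentially at rate λ; in practice L is estimated from the tracked model pose and the pseudoinverse handles redundant or deficient feature sets.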

Original language: English
Title of host publication: Intelligent Autonomous Systems 12 - Proceedings of the 12th International Conference, IAS 2012
Publisher: Springer Verlag
Pages: 321-333
Number of pages: 13
Edition: VOL. 1
ISBN (Print): 9783642339257
DOIs
Publication status: Published - 2013
Externally published: Yes
Event: 12th International Conference on Intelligent Autonomous Systems, IAS 2012 - Jeju Island, Korea, Republic of
Duration: 26 Jun 2012 – 29 Jun 2012

Publication series

Name: Advances in Intelligent Systems and Computing
Number: VOL. 1
Volume: 193 AISC
ISSN (Print): 2194-5357

Conference

Conference: 12th International Conference on Intelligent Autonomous Systems, IAS 2012
Country/Territory: Korea, Republic of
City: Jeju Island
Period: 26/06/12 – 29/06/12
