Course "Cognitive and Developmental Robotics", January-April 2016

This space gathers the resources used for the course

Developmental and cognitive robotics: modeling sensorimotor, cognitive and social development processes in humans and robots

By PY Oudeyer and M Lopes, with the participation of M Quinson for the lab sessions (TD).

(ENS Rennes computer science magistère and Master in Cognitive Science, Univ. Bordeaux)

Dates:
11 January: lecture, PY Oudeyer
25 January: lecture, PY Oudeyer
1 February: lecture, PY Oudeyer
8 February: lecture, M Lopes
7 March: lecture, M Lopes
14 March: lecture, M Lopes
Master in Cognitive Science: 14/15 March, lab sessions (TD)

Syllabus

1a) Overview of developmental and cognitive robotics
1b) Developmental robotics, the engineering side: life long multi task learning, developmental constraints, curiosity, maturation
1c) Developmental robotics, the modeling side: modeling motor development, modeling speech and language development, modeling the emergence of languages in populations of individuals

2a) Machine learning applied to robotics: introduction to regression, optimization and RL for basic robotic tasks (typically learning to optimize a motor primitive to achieve a predefined task); learning by demonstration
2b) HRI: user-centered design and evaluation, applications in assistive robotics, tools for managing verbal and non-verbal communication

Presentations and slides

Lecture 1 (PY Oudeyer): “Introduction to developmental and cognitive robotics”
Powerpoint: https://flowers.inria.fr/coursENSRennes15.pptx
Pdf: https://flowers.inria.fr/coursENSRennes15.pdf

Lecture 2 (PY Oudeyer): “Modeling learning and development processes: active learning”

Powerpoint: https://flowers.inria.fr/coursENSRennes15Cours2.pptx
Pdf: https://flowers.inria.fr/coursENSRennes15Cours2.pdf
Vidéo: https://www.youtube.com/watch?list=PL8W4iBcZa2ElG_Q38ihjPdINjgkVXt0Uu&v=bkv83GKYpkI

Lecture 3 (PY Oudeyer): “Modeling the dynamics of language formation”

Powerpoint: https://flowers.inria.fr/coursENSRennes15Cours3.pptx
Pdf: https://flowers.inria.fr/coursENSRennes15Cours3.pdf

Lecture 4 (PY Oudeyer): “The open-source robotics platform Poppy: science, education and art”

Powerpoint: https://flowers.inria.fr/coursENSRennes15Cours4.pptx
Pdf: https://flowers.inria.fr/coursENSRennes15Cours4.pdf

Lecture 5 (M Lopes): “Social Learning”

Pdf: https://flowers.inria.fr/coursENSRennes15Cours5.pdf

Lecture 6 (M Lopes): “Models of Decision Making and Exploration in Animals”

Pdf: https://flowers.inria.fr/coursENSRennes15Cours6.pdf
(B Busch): “Acting for comfort in human-robot interaction”
Pdf: https://flowers.inria.fr/coursENSRennes15Cours6b.pdf
(B Clement): “Multi-Armed Bandits for Intelligent Tutoring Systems”
Pdf: https://flowers.inria.fr/coursENSRennes15Cours6c.pdf

Lab sessions (TD)/Projects and evaluation (ENS Rennes)

TD1: Discovering programming on the Poppy Humanoid platform

TD1 (fev 16)V2.pdf (63.0 KB)

TD2 (March 2016): Exploration algorithms and learning of sensorimotor models: the Explauto library

TD2(march16).pdf (73.0 KB)
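
The core loop that TD2 studies, sampling motor commands and recording their sensory outcomes ("motor babbling") as training data for a sensorimotor model, can be sketched independently of the Explauto library itself. A minimal toy sketch in plain Python; the 2-joint arm and all names are illustrative and are not Explauto's API:

```python
import math
import random

def forward_kinematics(angles, lengths=(0.5, 0.5)):
    """Toy 2-joint planar arm: joint angles (radians) -> hand (x, y)."""
    x = y = total = 0.0
    for a, l in zip(angles, lengths):
        total += a
        x += l * math.cos(total)
        y += l * math.sin(total)
    return x, y

# Random motor babbling: sample motor commands uniformly and record the
# observed sensory outcomes; a forward model can later be regressed on them.
random.seed(0)
observations = []
for _ in range(100):
    m = [random.uniform(-math.pi, math.pi) for _ in range(2)]
    s = forward_kinematics(m)
    observations.append((m, s))

# Sanity check: every reached point lies within the arm's total length.
assert all(math.hypot(*s) <= sum((0.5, 0.5)) + 1e-9 for _, s in observations)
```

Explauto replaces this uniform sampling with smarter strategies (e.g. goal babbling and curiosity-driven exploration), which is what the lab session explores.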

Projects and evaluation (M2 Cognitive Science)

Two activities introducing the use of exploration in robots and machine learning (14/15 March).

14 March
9.00 - 12.00 Introduction to visual programming and the parallel with textual programming:
TD objective: manipulate and program the Thymio and Poppy-Ergo_Jr robots

  • Half an hour on VPL for Thymio, with an example of its use in schools (please install VPL on your computer)
  • An hour and a half on Snap! for Poppy-Ergo_Jr (to use Snap! with the Ergo, import this file --> pypot-snap-bloc.xml)
  • One hour on translating Snap! functions into Python functions (no installation required)

13.00 - 16.00 Discovery and introduction to Explauto: an autonomous exploration library (based on TD2)
To prepare for the session, please have installed:

You can refer to this documentation for installing Anaconda and the notebooks.

16.00 - 18.00 final lecture

15 March - Room George Boole 2

9.00 - 12.00 Introduction to Practical Machine Learning

  • Crash course on the KNIME program, which applies machine learning algorithms to your data without any programming (drag & drop only) (60 min)
  • 15min break
  • Hands-on tutorial for solving a real-life machine learning problem (Kaggle challenge, to be announced) with KNIME (60min)
  • 15min break
  • Tips & tricks for practical machine learning (data preprocessing, visualization, good & bad features) (30 min)

Please prepare: download KNIME Analytics Platform 3.1.1 (latest version) plus all free extensions for your target platform (Win/Lin/Mac), 64-bit, from here: https://www.knime.org/downloads/overview (you don't need to give them your name, and for the email address you can enter a fake address).

13.00 - 17.00 Student presentations

Evaluation (15 March): Students from the Cognitive Science course will be evaluated on their presentation of a research paper. Each student has to choose one of the articles below and prepare a presentation (with slides), followed by questions. The goal is to convey pedagogically the main ideas, methods and findings of the article, and potentially to provide a critical view of it (e.g. explaining its strengths and limits). The grade will be based on clarity, the capacity to express a critical view, and the ability to answer questions about the article pedagogically. Presentation time: 15 min + 10 min of questions.

Student presentations:

  • Barrouillet
    Wicaksono, H., & Sammut, C. [A Learning Framework for Tool Creation by a Robot]

  • Demangeat
    Loeg, [Optimal is not enough]

  • Pointreau
    Puglisi, A., Baronchelli, A., & Loreto, V. (2008). Cultural route to the emergence of linguistic categories. Proceedings of the National Academy of Sciences, 105(23), 7936-7940

  • Chenot
    Steels, L. and Belpaeme, T. (2005) Coordinating Perceptually Grounded Categories through Language. A Case Study for Colour. Behavioral and Brain Sciences, 28(4):469-489.

  • Cinquin
    Carpenter, M., Call, J., & Tomasello, M. (2002). Understanding “prior intentions” enables two-year-olds to imitatively learn a complex task. Child Development, 73(5), 1431–1441.

  • Mazon
    Csibra, G., & Gergely, G. (2007). “Obsessed with goals”: Functions and mechanisms of teleological interpretation of actions in humans. Acta Psychologica, 124, 60–78.

  • Roy
    Kidd, C., Piantadosi, S.T., & Aslin, R.N. (2014). The Goldilocks effect in infant auditory cognition. Child Development, 85(5):1795-804.

  • Guemann
    Markant, D.B., Settles, B., and Gureckis, T.M. (2015) Self-directed learning favors local, rather than global, uncertainty. Cognitive Science

  • Jahanpour
    Active Learning of Object and Body Models with Time Constraints on a Humanoid Robot. A Ribes, J Cerquides, Y Demiris, RL de Mántaras.

  • Desprez
    Exploring affordances and tool use on the iCub
    V. Tikhanoff, U. Pattacini, L. Natale, G. Metta

  • Poumarat-Marquant
    Stoytchev, A. (2005, April). Behavior-grounded representation of tool affordances. In Robotics and Automation, 2005. ICRA 2005. Proceedings of the 2005 IEEE International Conference on (pp. 3060-3065). IEEE.

  • Matthias
    Teaching Robots the Use of Human Tools from Demonstration with Non-Dexterous End-Effectors, Wenbin Li and Mario Fritz, Humanoids 2015

Article list

Moulin-Frier, C., Nguyen, S.M., Oudeyer, P-Y. (2014). Self-organization of early vocal development in infants and machines: the role of intrinsic motivation. Frontiers in Psychology (Cognitive Science), 4(1006).

A Computational Model of Social-Learning Mechanisms, Manuel Lopes, Francisco S. Melo, Ben Kenward and Jose Santos-Victor. Adaptive Behaviour, 467(17), 2009.

Puglisi, A., Baronchelli, A., & Loreto, V. (2008). Cultural route to the emergence of linguistic categories. Proceedings of the National Academy of Sciences, 105(23), 7936-7940.
PNAS2008.pdf (756.0 KB)

Csibra, G., & Gergely, G. (2007). “Obsessed with goals”: Functions and mechanisms of teleological interpretation of actions in humans. Acta Psychologica, 124, 60–78.

Carpenter, M., Call, J., & Tomasello, M. (2002). Understanding “prior intentions” enables two-year-olds to imitatively learn a complex task. Child Development, 73(5), 1431–1441.

Steels, L. and Belpaeme, T. (2005) Coordinating Perceptually Grounded Categories through Language. A Case Study for Colour. Behavioral and Brain Sciences, 28(4):469-489.

Pieter Abbeel, Andrew Ng, “Apprenticeship learning via inverse reinforcement learning.” In 21st International Conference on Machine Learning (ICML). 2004

Calinon, S., Guenter, F. and Billard, A. (2007). On Learning, Representing and Generalizing a Task in a Humanoid Robot. IEEE Transactions on Systems, Man and Cybernetics, Part B, 37:2, 286-298.

Ugur, E., Nagai, Y., Sahin, E., & Oztop, E. (2015). Staged development of robot skills: Behavior formation, affordance learning and imitation with motionese. Autonomous Mental Development, IEEE Transactions on, 7(2), 119-139.
Staged Development of Robot Skills Behavior Formation, Affordance Learning and Imitation.pdf (2.1 MB)

Active Learning of Object and Body Models with Time Constraints on a Humanoid Robot
A Ribes, J Cerquides, Y Demiris, RL de Mántaras
http://www.iiia.csic.es/~mantaras/TAMD.pdf

A.D. Dragan and S.S. Srinivasa, “Generating Legible Motion”. Robotics: Science and Systems (R:SS), 2013.

Kidd, C., Piantadosi, S.T., & Aslin, R.N. (2014). The Goldilocks effect in infant auditory cognition. Child Development, 85(5):1795-804.

Nelson, J. D., McKenzie, C. R. M., Cottrell, G. W., & Sejnowski, T. J. (2010). Experience Matters: Information Acquisition Optimizes Probability Gain. Psychological Science, 21(7), 960–969.

Markant, D.B., Settles, B., and Gureckis, T.M. (2015) Self-directed learning favors local, rather than global, uncertainty. Cognitive Science

Wicaksono, H., & Sammut, C. A Learning Framework for Tool Creation by a Robot.

Teaching Robots the Use of Human Tools from Demonstration with Non-Dexterous End-Effectors
Wenbin Li and Mario Fritz, Humanoids 2015
07363586.pdf (1.2 MB)

Stoytchev, A. (2005, April). Behavior-grounded representation of tool affordances. In Robotics and Automation, 2005. ICRA 2005. Proceedings of the 2005 IEEE International Conference on (pp. 3060-3065). IEEE.
https://smartech.gatech.edu/bitstream/handle/1853/20655/StoytchevICRA2005.pdf

Exploring affordances and tool use on the iCub
V. Tikhanoff, U. Pattacini, L. Natale, G. Metta
http://www.poeticon.eu/publications/1335_Tikhanoff_etal2013.pdf

Regarding TD1, part IV: playing with a cube. Here is the tutorial in question.
When I try to insert a cube with the following code:

# let's create a Cube !

io = poppy._controllers[0].io  # poppy: a PoppyHumanoid connected to the V-REP simulator

name = 'cube'
position = [0.2, 0, 0.8] # X, Y, Z
sizes = [0.1, 0.1, 0.1] # in meters
mass = 0.5 # in kg
io.add_cube(name, position, sizes, mass)

the following error occurs:

    /usr/lib/python3.5/site-packages/pypot/vrep/io.py in _inject_lua_code(self, lua_code)
    265     def _inject_lua_code(self, lua_code):
    266         """ Sends raw lua code and evaluate it wihtout any checking! """
--> 267         msg = (ctypes.c_ubyte * len(lua_code)).from_buffer_copy(lua_code)
    268         self.call_remote_api('simxWriteStringStream', 'my_lua_code', msg)
    269 

TypeError: a bytes-like object is required, not 'str'

It seems this problem has already shown up in another form here. A patch has been proposed. Apparently it is the injection of a snippet of Lua code into V-REP that causes the problem... (and I find it surprising that a str object is not "bytes-like"...)
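
The failure is reproducible outside pypot: in Python 3, `str` is unicode text and does not expose the buffer protocol that `from_buffer_copy` requires, whereas encoded bytes do. A minimal sketch (the Lua string below is just an illustrative payload, not pypot's actual code):

```python
import ctypes

lua_code = "simSetStringSignal('my_signal', 'hello')"

# Python 3 str is rejected by ctypes, exactly as in the traceback above:
try:
    (ctypes.c_ubyte * len(lua_code)).from_buffer_copy(lua_code)
    raised = False
except TypeError:
    raised = True  # "a bytes-like object is required, not 'str'"

# Encoding the string to bytes first makes the same call succeed,
# which is essentially what the proposed patch does:
raw = lua_code.encode("utf-8")
msg = (ctypes.c_ubyte * len(raw)).from_buffer_copy(raw)
```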

At least two of us students are trying this lab and running into the same error; another, however, does not get it.

Is there a particular configuration (e.g. a specific Python version) under which we could have a near-guarantee that this code works?

edit: I opened an issue on GitHub: github.com/poppy-project/pypot/issues/149
re-edit: This bug has been fixed; a new release of pypot without it should be out within about a week.

To complete the answer: this version is already available from the master branch on GitHub: https://github.com/poppy-project/pypot
It is simply not yet available via pip.

This bug only affects Python 3.*; with Python 2.7 there should be no problem.
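
The version split explains it: Python 2's `str` is already a byte string, while Python 3's `str` must be encoded first. A hypothetical shim along these lines (a sketch of the idea, not the actual pypot patch) accepts both:

```python
import ctypes

def to_ubyte_buffer(lua_code):
    """Illustrative helper: build a ctypes ubyte buffer from str or bytes.

    On Python 3, str (unicode) is encoded to bytes first; bytes pass
    through untouched, as does Python 2 str (which is already bytes).
    """
    if isinstance(lua_code, str):
        lua_code = lua_code.encode("utf-8")
    return (ctypes.c_ubyte * len(lua_code)).from_buffer_copy(lua_code)

msg = to_ubyte_buffer("simSetStringSignal('sig', 'ok')")
assert bytes(msg) == b"simSetStringSignal('sig', 'ok')"
```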