
General


The second NeuTouch International Summer School is a one-week virtual summer school focused on touch for robotics. The event will include talks from leading international experts in the field, discussions, lectures, and hands-on tutorials.


We offer two registration tracks, observer and active; places in the active track are limited. To apply for the active track, please submit a 1-page abstract outlining your prior experience with touch and your motivation.
 
The event will be held online, primarily through Zoom; we will send the access link to subscribers before the start of each event.

Program


The event will be held from 20 to 24 September. Each day will be split into a morning of lectures, discussions, and invited talks, and an afternoon of hands-on tutorials.

Day 1 - Monday 20th September

Topic of the Day: Tactile-based full-body control
Time (CEST) | Speaker | Title
08.45 - 09.00

Welcome and Overview

09.00 - 10.15
Professor Gordon Cheng

Tactile-based full-body control

10.15 - 11.00
Simon Müller-Cleve, Luca Lach

Presentations by PhD-students

11.00 - 11.30

break

11.30 - 12.30
Dr. Lorenzo Natale

Tactile perception on the iCub humanoid robot

Afternoon Session:
14.00 - 16.00
Luca Lach

Hands-on tutorial: Reinforcement-learning based force control for stable grasping

In this tutorial, you will model a force controller for the TIAGo robot. You will be provided with a simulation setup of the robot in a typical grasping scenario. First, you will model the control behaviour yourself, e.g. as a PID controller. Later you can train a Reinforcement Learning agent on this task and compare the performance of the two approaches.
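As a rough illustration of the classical half of this tutorial, a discrete PID loop on a scalar grip force might look like the sketch below. The class name, gains, and the toy plant model are illustrative stand-ins, not code from the TIAGo setup:

```python
class PIDForceController:
    """Discrete PID controller for a scalar grip force (illustrative)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target_force, measured_force):
        # Error between desired and sensed contact force
        error = target_force - measured_force
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        # Command, e.g. a gripper velocity or motor torque
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Toy closed loop: the measured force responds to the command through a
# crude first-order plant model (a stand-in for the simulated robot)
pid = PIDForceController(kp=2.0, ki=0.5, kd=0.05, dt=0.01)
force = 0.0
for _ in range(1000):
    command = pid.update(target_force=5.0, measured_force=force)
    force += 0.01 * command
```

In the tutorial, the RL agent is then trained on the same task, so the two controllers can be compared against the same force setpoint.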

Day 2 - Tuesday 21st September

Topic of the day: Tactile sensing and active perception
Time (CEST) | Speaker | Title
09.00 - 10.15
Dr. Robert Haschke

Slip-Detection and Tactile-Based Grasping

10.15 - 12.30
Dr. Robert Haschke

Hands-on tutorial: Slip Classification

We will experiment with and evaluate various neural network architectures to detect and classify slippage.
To this end, we will work on pre-recorded real-world data from the tactile sensors in our multi-fingered robot hands.
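As a far simpler stand-in for the neural architectures used in the tutorial, the core detection idea — slip shows up as extra high-frequency energy in the tactile signal — can be sketched as a single logistic "neuron" on a variance feature. The data here is synthetic; all names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_window(slip):
    """Synthetic stand-in for one pre-recorded tactile time window:
    slip adds high-frequency noise on top of a steady contact signal."""
    window = rng.normal(0.5, 0.02, size=32)
    if slip:
        window += rng.normal(0.0, 0.2, size=32)
    return window

X = np.array([make_window(i % 2 == 1) for i in range(200)])
y = np.array([i % 2 for i in range(200)], dtype=float)

# One discriminative feature: the variance within each window, standardized
feat = X.var(axis=1)
feat = (feat - feat.mean()) / feat.std()

# A single logistic-regression "neuron" trained by gradient descent
w, b = 0.0, 0.0
for _ in range(200):
    p = 1.0 / (1.0 + np.exp(-(w * feat + b)))
    w -= 0.5 * ((p - y) * feat).mean()
    b -= 0.5 * (p - y).mean()

pred = (w * feat + b) > 0.0
accuracy = (pred == (y == 1)).mean()
```

The real tutorial data comes from multi-fingered robot hands and calls for richer temporal models, but the variance feature already captures why vibration-like signatures make slippage detectable.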

Afternoon Session:
14.00 - 14.45
Jakub Cmíral, Pablo Salazar

Presentations by PhD-students

14.45 - 15.00

break

15.00 - 16.15
Professor Wenzhen Yuan

Making sense of the physical world with high-resolution tactile sensing

Day 3 - Wednesday 22nd September

Topic of the day: Tactile shape exploration
Time (CEST) | Speaker | Title
09.00 - 10.15
Professor Nathan Lepora

SoftBOTS – Soft Biomimetic Optical Tactile Sensing

10.15 - 11.00
Nicola Piga, Gabriele Caddeo

Presentations by PhD-students

11.00 - 11.30

break

11.30 - 12.30
Dr. Chiara Bartolozzi

Neuromorphic touch for robotics

Afternoon Session:
14.00 - 16.00
Simon Frederik Müller-Cleve

Guided tutorial on how to find and follow edges with the humanoid robot iCub.

The tutorial deals with finding and following edges and is based on the humanoid robot iCub (IIT). For this, we provide a Docker container in which the simulation environment is already set up. We use Gazebo with YARP as the simulation environment. We will write our code in C++.
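The tutorial itself is written in C++ against YARP and Gazebo; purely to illustrate the underlying idea — estimate the local edge orientation from a tactile image, then step along it — here is a language-agnostic sketch on a synthetic taxel image (all names are illustrative):

```python
import numpy as np

def edge_direction(tactile):
    """Estimate the local edge orientation in a tactile pressure image:
    the edge runs perpendicular to the mean pressure gradient."""
    gy, gx = np.gradient(tactile.astype(float))
    return np.arctan2(gy.sum(), gx.sum()) + np.pi / 2

# Synthetic 8x8 taxel image: pressure on the left half only -> vertical edge
img = np.zeros((8, 8))
img[:, :4] = 1.0

theta = edge_direction(img)
step = np.array([np.cos(theta), np.sin(theta)])  # unit step along the edge
```

An edge follower repeats this estimate at each new fingertip pose, moving the sensor a small step along the estimated direction before sensing again.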

Day 4 - Thursday 23rd September

Day dedicated to the projects
Time (CEST) | Title
09.00 - 12.00

Continuation work on tutorial tasks

Day 5 - Friday 24th September

Topic of the day: Soft robotics and physics-based manipulation
Time (CEST) | Speaker | Title
09.00 - 10.15
Professor Oliver Brock

The Role and Peculiarities of (Tactile) Sensing in Soft Robotics

10.15 - 11.00
TBA

Presentations by PhD-students

11.00 - 11.30

break

11.30 - 12.30
Professor Tony Prescott

Active Touch Sensing in Mammals and Robots

Afternoon Session:
14.00 - 16.00
Students

Presentation of tutorial results


Speakers:


20th September 2021

Tactile-based full-body control

In this talk, I will explain the rich interactions that have been made possible with a large-area e-skin. I will demonstrate both upper-body and lower-body robot interaction with challenging environments. In doing so, I show that the performance of robots can improve substantially with e-skin and that robots can be kept from damaging themselves and the environment. Furthermore, I show a new method that reduces the impact forces of walking robots while at the same time providing them with better balancing ability.

Gordon Cheng

Professor, Technical University of Munich

Gordon Cheng has made pioneering contributions in humanoid robotics, neuroengineering, and artificial intelligence for the past 20 years. Since 2010, he has held the Chair for Cognitive Systems, which he also founded. The Chair for Cognitive Systems is part of the Department of Electrical and Computer Engineering at the Technical University of Munich (TUM), Munich, Germany.
Gordon Cheng is also the Head of the Center of Competence Neuro-Engineering in the Department of Electrical and Computer Engineering. Since 2016, he has been the Program Director of the Elite Master of Science program in Neuroengineering (MSNE) of the Elite Network of Bavaria, a unique study program in Germany.


Tactile perception on the iCub humanoid robot

Robots can actively interact with the environment to learn about objects and their properties, using their sensory systems. Among these, the sense of touch is particularly interesting for roboticists, as it has great potential to advance how robots perceive and manipulate objects.

In this talk, I will review our work on tactile perception with the iCub robot. I will describe the technology of the iCub tactile system and how it has been designed to cover the body of the robot and customized to be mounted on the hands. I will also present our work on tactile object manipulation and active perception.

 

Lorenzo Natale

Principal Investigator, Istituto Italiano di Tecnologia

Lorenzo Natale is a Tenured Senior Researcher at the Italian Institute of Technology. He received his degree in Electronic Engineering (with honours) and Ph.D. in Robotics from the University of Genoa. He was later a postdoctoral researcher at the MIT Computer Science and Artificial Intelligence Laboratory. He was an invited professor at the University of Genoa, where he taught the courses Natural and Artificial Systems and Anthropomorphic Robotics for students of the Bioengineering curriculum. Since 2020 he has been a visiting professor at the University of Manchester.

Lorenzo Natale has contributed to the development of various humanoid platforms. He was one of the main contributors to the design and development of the iCub platform, its software architecture and the YARP middleware. His research interests range from vision and tactile sensing to software architectures for robotics.

He has been principal investigator and co-principal investigator in several EU funded projects. He was general chair of IEEE ARSO 2018 and served as Program Chair of ICDL-Epirob 2014 and HAI 2017. He is Specialty Chief Editor for the Humanoid Robotics Section of Frontiers in Robotics and AI, associate editor for IEEE-Transactions on Robotics and IEEE Robotics and Automation Letters. He is Ellis Fellow and Core Faculty of the Ellis Genoa Unit.



21st September 2021

Making sense of the physical world with high-resolution tactile sensing

In this talk, I will introduce the development of a high-resolution tactile sensor, GelSight, and how it can help robots understand and interact with the physical world. GelSight is a vision-based tactile sensor that measures the geometry of the contact surface with a spatial resolution of around 25 micrometers, and it also measures the shear forces and torques at the contact surface. With this high-resolution information, a robot can easily detect the precise shape and texture of the object surfaces being contacted and therefore recognize them. Beyond recognition, it can help robots get more information from contact, such as understanding different physical properties of objects and assisting manipulation tasks. This talk will cover our previous work on both object property perception and slip detection with GelSight sensors. I will also discuss our recent development in tactile sensor simulation and how we envision this work will help the community with research in tactile sensing.

Wenzhen Yuan

Professor, Carnegie Mellon University, Robotics Institute

Wenzhen Yuan is an assistant professor in the Robotics Institute at Carnegie Mellon University and the director of the CMU RoboTouch Lab. She is a pioneer in high-resolution tactile sensing for robots, and she also works in multi-modal robot perception, soft robots, robot manipulation, and haptics. Yuan received her Master of Science and PhD degrees from MIT and Bachelor of Engineering degree from Tsinghua University. She also worked as a Postdoctoral researcher at Stanford University.


Slip-Detection and Tactile-Based Grasping

Slip detection is a major prerequisite for successful object grasping and manipulation: early detection of incipient slip enables a robot hand to stabilize a grasp and thus not lose the object. For manipulation, the notion of slip is an important feedback signal to monitor the intended motion of an object in hand. Starting from different sensor modalities provided by the broad range of different hardware implementations, this talk will introduce various approaches to slip detection, based on shear-force detection, friction cone estimation, or vibration detection.
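As a minimal illustration of the friction-cone idea mentioned in the abstract, a Coulomb model predicts slip once the tangential force leaves the cone set by the friction coefficient. This is a generic textbook sketch with illustrative names, not code from the talk:

```python
def in_friction_cone(f_normal, f_tangential, mu):
    """Coulomb model: the contact holds (no slip) while the tangential
    force stays inside the cone |f_t| <= mu * f_n."""
    return abs(f_tangential) <= mu * f_normal

def required_grip_force(f_tangential, mu, margin=1.2):
    """Minimum normal (grip) force to resist a tangential load,
    inflated by a safety margin."""
    return margin * abs(f_tangential) / mu

# Example: holding a 0.5 kg object against gravity with mu = 0.6
load = 0.5 * 9.81                        # tangential load from gravity (N)
grip = required_grip_force(load, mu=0.6)
holds = in_friction_cone(grip, load, mu=0.6)
```

In practice the friction coefficient must itself be estimated from tactile data, which is part of what makes slip prediction hard.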

Robert Haschke

Head of Robotics Group, Bielefeld University

Robert Haschke received the diploma and PhD in Computer Science from the University of Bielefeld, Germany, in 1999 and 2004, working on the theoretical analysis of oscillating recurrent neural networks. Since then, his work has focused more on robotics, still employing neural methods wherever possible. Robert is currently heading the Robotics Group within the Neuroinformatics Group, striving to enrich the dexterous manipulation skills of our two bimanual robot setups through interactive learning. His fields of research include neural networks, cognitive bimanual robotics, grasping and manipulation with multi-fingered dexterous hands, tactile sensing, and software integration.



22nd September 2021

SoftBOTS – Soft Biomimetic Optical Tactile Sensing

Reproducing the capabilities of the human sense of touch in machines is an important step in enabling robot manipulation to have the ease of human dexterity. A combination of robotic technologies will be needed, including soft robotics, biomimetics and high-resolution tactile sensing. This combination is considered here as a SoftBOT (Soft Biomimetic Optical Tactile) sensor. Here I review the BRL TacTip as a prototypical example of such a sensor. Topics include the relation between artificial skin morphology and the transduction principles of human touch, the nature and benefits of tactile shear sensing, 3D printing for fabrication and integration into robot hands, the application of AI to tactile perception and control, and the recent step-change in capabilities due to deep learning. This talk consolidates those advances from the past decade to indicate a path for robots to reach human-like dexterity.

Nathan Lepora

Professor, University of Bristol

Nathan F. Lepora received the B.A. degree in Mathematics and the Ph.D. degree in Theoretical Physics from the University of Cambridge, UK. He is currently a Professor of Robotics and AI with the University of Bristol, UK. He leads the Tactile Robotics Group that researches dexterous robots with a human-like sense of touch and manual intelligence, an area undergoing rapid and exciting progress. He is a recipient of a Leverhulme Research Leadership Award on ‘A Biomimetic Forebrain for Robot Touch’. His research group won the ‘Contributions in Soft Robotics Research’ category in the 2016 Soft Robotics Competition. He also writes books on science and technology, most recently ‘Robots!’ in the acclaimed Findout! series by Dorling-Kindersley.


Neuromorphic touch for robotics

Neuromorphic sensing takes inspiration from biology to encode sensory signals in a way that captures important information about the external world. Neuromorphic vision sensors are known as event-driven cameras and react only to changes in the light impinging on the sensitive areas. They feature extremely low latency and high temporal resolution coupled with high compression and dynamic range. Lately, a similar concept has been applied to pressure sensors, used as tactile sensors. These are extremely useful in any application that entails the physical interaction of an artificial device with the environment, such as robotics and prosthetics. In this talk, we will discuss the principles of neuromorphic event-driven sensing, how they are applied to the tactile domain, and how robotics can benefit from them.
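The change-driven sampling described here is often called send-on-delta encoding: an event fires only when the input has moved a fixed amount since the last event. A minimal sketch (illustrative; real neuromorphic sensors implement this in analog circuitry):

```python
def send_on_delta(signal, threshold):
    """Event-driven (send-on-delta) encoding: emit an event only when the
    input has changed by a fixed amount since the last emitted event."""
    events = []
    ref = signal[0]
    for t, x in enumerate(signal):
        # Emit ON events while the signal has risen by >= threshold
        while x - ref >= threshold:
            ref += threshold
            events.append((t, +1))
        # Emit OFF events while the signal has fallen by >= threshold
        while ref - x >= threshold:
            ref -= threshold
            events.append((t, -1))
    return events

# A fast ramp produces a stream of ON events; a constant input produces none
ramp = [0.1 * i for i in range(11)]   # 0.0 .. 1.0
flat = [0.5] * 11
ramp_events = send_on_delta(ramp, threshold=0.25)
flat_events = send_on_delta(flat, threshold=0.25)
```

This is what gives event-driven sensors their adaptive temporal resolution: fast transients generate dense event streams, while slowly varying or static inputs generate almost nothing.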

Chiara Bartolozzi

Principal Investigator, Istituto Italiano di Tecnologia

Chiara Bartolozzi is a Researcher at the Italian Institute of Technology. She earned a degree in Engineering at the University of Genova (Italy) and a Ph.D. in Neuroinformatics at ETH Zurich, developing analog subthreshold circuits for emulating biophysical neuronal properties in silicon and modelling selective attention on hierarchical multi-chip systems.

She is currently leading the Event-Driven Perception for Robotics group, with the aim of applying the "neuromorphic" engineering approach to the design of robotic platforms as enabling technology towards the design of autonomous machines.

This goal is pursued by inducing a paradigm shift in robotics, based on the emerging concept of Event-Driven (ED) sensing and processing. Similarly to their biological counterpart, and differently from traditional robotic sensors, ED sensory systems sample their input signal at fixed (and relative) amplitude changes, intrinsically adapting to the dynamics of the sensory signal: temporal resolution is extremely high for fast transitory signals and decreases for slower inputs.

This approach naturally leads to better robots that acquire, transmit and process information only when needed, optimising the use of resources and leading to real-time, low-cost operation.

Chiara has participated in a number of EU-funded projects. She is currently coordinating the European Training Network NeuTouch, where 15 PhD students are studying how touch perception works in humans and animals in order to develop artificial touch perception systems for robots and hand prostheses. As leader of the educational activities of the coordination and support action NEUROTECH, she is co-organising the Neuromorphic Colloquium, a series of online events to build up educational material for the next generation of neuromorphic researchers.

She is an IEEE member, actively supporting the CAS and RAS societies. In 2020, she co-chaired AICAS 2020, on circuits and systems for efficient embedded AI.



24th September 2021

The Role and Peculiarities of (Tactile) Sensing in Soft Robotics

Soft robotics represents a conceptual departure from "hard" robotics in the sense that a) functionality previously specified in controllers and perception algorithms is now realized directly in robot hardware and b), as a result of the hardware's new, extended role, many of the existing approaches to sensing cannot be applied any longer as they interfere with this new functionality. In this presentation, I will discuss the core "philosophical" principles underlying soft robotics and their consequences for sensing and sensing technologies. Just like the functionality of a robot can be distributed across hardware and software, sensing can also be distributed across an implicit portion (achieved by hardware but without specific sensors) and an explicit portion (as in traditional robotics realized through dedicated sensing hardware). The distribution of sensing requirements across explicit and implicit parts greatly decreases the complexity of sensing, but it also reduces the need for sensing to achieve competent control. To achieve this, so-called "computational sensors" are a key concept. These sensors measure relatively generic properties of an object/robot and then extract from the generic sensor information an application-specific signal through computation. I will argue that this greatly simplifies sensing for many applications and can even be beneficial in traditional robotics. To conclude I will argue that sensing in soft robotics differs conceptually from sensing in "hard" robotics as a) it does not always have to be explicit and b) it can better exploit physical properties of the system to facilitate sensing with computational sensors.


Oliver Brock

Professor, TU Berlin, Robotics and Biology Lab

Oliver Brock is the Alexander-von-Humboldt Professor of Robotics in the School of Electrical Engineering and Computer Science at the Technische Universität Berlin, a German "University of Excellence". He received his Ph.D. from Stanford University in 2000 and held postdoctoral positions at Rice University and Stanford University. He was an Assistant and Associate Professor in the Department of Computer Science at the University of Massachusetts Amherst before moving back to Berlin in 2009. The research of Brock's lab, the Robotics and Biology Laboratory, focuses on robot intelligence, mobile manipulation, interactive perception, grasping, manipulation, soft material robotics, interactive machine learning, deep learning, motion generation, and the application of algorithms and concepts from robotics to computational problems in structural molecular biology. Oliver Brock directs the Research Center of Excellence "Science of Intelligence". He is an IEEE Fellow and was president of the Robotics: Science and Systems Foundation from 2012 until 2019.


Image credits are © Kopf & Kragen


Active Touch Sensing in Mammals and Robots

Active sensing is the control of the sensory apparatus so as to maximize information gain in relation to current goals. In biological systems, active sensing generally involves tight feedback loops between sensor and effector systems; these have been widely studied in invertebrate and vertebrate model systems including humans. The understanding of active sensing, and of related perceptual processes (active perception), is also being advanced through the development of computational and physical models. Robots, in particular, can provide compelling platforms for evaluating hypotheses about the biology of active sensing, since they face the same challenge of operating in the physical world and make transparent the mechanisms required to implement closed-loop active sensing control. This talk reviews the current understanding of active touch sensing in mammals, focusing on vibrissal (whisker) touch and human cutaneous (fingertip) touch, and explores how robotic modeling is casting light on some of the principles and processes underlying biological active touch.

Tony Prescott

Professor, University of Sheffield, Director of Sheffield Robotics

Tony Prescott is Professor of Cognitive Robotics at the University of Sheffield and the co-creator of award-winning animal-like robots; he has also worked to develop brain-inspired control systems for humanoid robots. His background mixes psychology and brain theory with robotics and artificial intelligence, and his research aims at answering questions about the human condition by creating synthetic entities with capacities such as perception, memory, emotion, and a sense of self. He is the co-founder and current Director of Sheffield Robotics, a cross-institutional robotics research institute with over two hundred active researchers. He is also a co-founder of Consequential Robotics, a UK start-up developing new kinds of assistive and companion robots, including the pet-like robot MiRo. Tony has published over two hundred refereed articles and journal papers and has co-edited several books, including the Scholarpedia of Touch and the Handbook of Living Machines. He regularly writes and speaks on societal and ethical issues in robotics and artificial intelligence. His research has been covered by major news media including the BBC, CNN, the Discovery Channel, Science Magazine, and New Scientist.



Organization Committee

Ella Janotte

Istituto Italiano di Tecnologia

Ella is a PhD student enrolled at the University of Groningen but based in Genova at the Istituto Italiano di Tecnologia. She is a fellow in the NeuTouch project and investigates multi-transducer neuromorphic touch, trying to apply biological coding strategies to pressure sensor readouts.

Luca Lach

Istituto Italiano di Tecnologia

Luca is a PhD student at PAL Robotics and Bielefeld University as a research fellow of the NeuTouch EU project. His main focus lies on reliable grasping and object manipulation by using tactile information on mobile robots. Using the TIAGo robot, he is exploring classical control approaches as well as machine learning for reliable force control.

Simon Frederik Müller-Cleve

University of Groningen

Simon is an engineer who studied mechatronics and bio-mechatronics for his B.Sc. and M.Sc. Alongside his studies, he gained experience working in different laboratories with carbon fibre, additive manufacturing (3D printing), and robotics. In the later years of his studies he became strongly drawn to neuromorphic engineering and, following his interests, started his Ph.D. at the Istituto Italiano di Tecnologia (IIT) in the Event-Driven Perception for Robotics (EDPR) group led by Chiara Bartolozzi.

Chiara Bartolozzi

Istituto Italiano di Tecnologia

Chiara Bartolozzi is a Researcher at the Italian Institute of Technology. She is currently leading the Neuromorphic Systems and Interfaces group, with the aim of applying the "neuromorphic" engineering approach to the design of robotic platforms as enabling technology towards the design of autonomous machines. Chiara is project coordinator for Neutouch, an MSCA Innovative Training Network.

Robert Haschke

Bielefeld University

Robert Haschke received the diploma and PhD in Computer Science from the University of Bielefeld, Germany, in 1999 and 2004, working on the theoretical analysis of oscillating recurrent neural networks. Since then, his work has focused more on robotics, still employing neural methods wherever possible. Robert is currently heading the Robotics Group within the Neuroinformatics Group, striving to enrich the dexterous manipulation skills of our two bimanual robot setups through interactive learning. His fields of research include neural networks, cognitive bimanual robotics, grasping and manipulation with multi-fingered dexterous hands, tactile sensing, and software integration.

Lorenzo Natale

Istituto Italiano di Tecnologia

Lorenzo Natale is a Researcher at the Italian Institute of Technology, where he leads the Humanoid Sensing and Perception group. He was an invited professor at the University of Genoa, where he taught the courses Natural and Artificial Systems and Anthropomorphic Robotics for students of the Bioengineering curriculum. Since 2020 he has been a visiting professor at the University of Manchester.