You are asked to work on real robotic data to understand the control parameters of a lightweight robotic arm.
Why are robotic hands so clumsy? To address this problem, we investigate the use of tactile sensors for robotic in-hand manipulation. In particular, based on a number of tactile sensors (the BioTac, the DLR tactile sensor, and the iCub tactile sensor), we develop algorithms that use such sensing to manipulate objects. Put plainly: writing, unscrewing a light bulb, or handing over a knife with a robotic hand.
In machine learning, we investigate methods to map high-dimensional, non-linear data within a control process. Even though most of our data come from the above fields of research, the methods we employ and develop are general; in particular, we combine deep belief networks with time-series learning.
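To make the time-series side of this concrete, here is a minimal, hypothetical sketch (all dimensions and weights invented for illustration, not taken from our models): an Elman-style recurrent layer that maps a high-dimensional input sequence to a sequence of hidden states, the basic building block of such sequence learners.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_forward(xs, W_in, W_rec, b):
    """Run a simple recurrent layer over a sequence of input vectors."""
    h = np.zeros(W_rec.shape[0])
    states = []
    for x in xs:
        # each new hidden state mixes the current input with the previous state
        h = np.tanh(W_in @ x + W_rec @ h + b)
        states.append(h)
    return np.array(states)

n_in, n_hidden, T = 16, 8, 10            # assumed toy sizes
xs = rng.normal(size=(T, n_in))          # a toy sensor sequence
W_in = rng.normal(scale=0.1, size=(n_hidden, n_in))
W_rec = rng.normal(scale=0.1, size=(n_hidden, n_hidden))
b = np.zeros(n_hidden)

H = rnn_forward(xs, W_in, W_rec, b)
print(H.shape)                           # one hidden vector per time step
```

In practice such a layer would be trained, e.g. by backpropagation through time; the sketch only shows the forward mapping.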
Surface electromyography (sEMG) has been used for prosthetic control since the 1960s. We go a step further: on the one hand, we optimise the conditioning of the sEMG signal and find new ways of relating it to limb movement; on the other, we explore different channels for controlling prosthetic and assistive robotic devices, including central-nervous-system implants.
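A hedged sketch of one such relation, on entirely synthetic data (the channel count, window length, and signal model are invented; real pipelines differ): rectify and smooth each raw sEMG channel into an envelope, then relate the envelopes to a grip-force-like movement signal with ridge regression.

```python
import numpy as np

rng = np.random.default_rng(1)

def emg_envelope(raw, win=50):
    """Rectify and moving-average one raw sEMG channel."""
    kernel = np.ones(win) / win
    return np.convolve(np.abs(raw), kernel, mode="same")

# Simulate 4 sEMG channels whose amplitude scales with a nonnegative,
# grip-force-like target signal (muscle activity modulates noise amplitude).
T, n_ch = 2000, 4
force = np.abs(np.sin(np.linspace(0, 8 * np.pi, T)))
raw = rng.normal(size=(T, n_ch)) * (1.0 + force)[:, None]

X = np.column_stack([emg_envelope(raw[:, i]) for i in range(n_ch)]
                    + [np.ones(T)])      # intercept column

# Ridge regression: w = (X^T X + lam I)^{-1} X^T y
lam = 1e-2
w = np.linalg.solve(X.T @ X + lam * np.eye(n_ch + 1), X.T @ force)
pred = X @ w
print(np.corrcoef(pred, force)[0, 1])    # envelopes track the force profile
```

The point of the sketch is only the two-stage structure (signal conditioning, then a learned map to movement); any of the non-linear methods above can replace the linear regressor.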
In biomechanics, we create detailed models of the kinematic and dynamic properties of human arms, hands, fingers, and legs. These models are needed to understand which properties of human movement are intrinsic---caused by muscles, tendons, ligaments, and bones---and which are controlled by the nervous system. The resulting models are used in the construction and control of novel robotic systems, including prosthetic hands and robotic arms and legs.
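As a minimal illustration of what a kinematic model is (link lengths invented for illustration, not measured human data): planar forward kinematics of a three-joint finger, mapping joint angles to the fingertip position.

```python
import numpy as np

def fingertip(angles, lengths):
    """Planar forward kinematics: each joint adds its angle to the chain."""
    pos = np.zeros(2)
    total = 0.0
    for a, l in zip(angles, lengths):
        total += a                        # accumulated orientation
        pos += l * np.array([np.cos(total), np.sin(total)])
    return pos

lengths = [0.05, 0.03, 0.02]              # proximal, middle, distal (m), assumed
straight = fingertip([0.0, 0.0, 0.0], lengths)          # extended finger
curled = fingertip([np.pi/6, np.pi/4, np.pi/3], lengths)  # flexed finger
print(straight, curled)
```

Real hand models add dynamics, joint-axis placement, and soft-tissue constraints on top of such a chain, but the joint-angles-to-position mapping is the same idea.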
Limb rehabilitation and assistive robotics are paramount applications of the techniques developed in biomimetic robotics. We focus on human-computer interfaces that help the disabled regain lost limb functionality. In our view, both rehabilitation and prosthetics rely on re-establishing the sensorimotor loop with the missing limb. This works in both directions: feed-forward control by detecting the user's intention to move, and sensory feedback by transducing digital readings into sensations.
Axel von Arnim (fortiss): software architect
Human Brain Project - Neurorobotics
Moritz August (TUM): MSc candidate
deep autoencoder networks
Adrià Puigdomènech Badia (TUM): student
convolutional neural networks
Justin Bayer (TUM): PhD candidate
time series learning
Nutan Chen (TUM): PhD candidate
Rémy Degenne (TUM): MSc candidate
CNNs for one-class learning
Frederik Diehl (TUM): student assistant
tactile force learning
Hannes Höppner (DLR): PhD candidate
hannes.hoeppner@dlr.de, +49 8153 28-1062
Rachel Hornung (DLR): PhD candidate
Haris Jabbar (TUM): student
grip force modelling
Sören Jentzsch (fortiss): PhD candidate
Human Brain Project: spiking networks
Maximilian Karl (TUM): MSc candidate
fast robust PCA
Daniela Korhammer (TUM): PhD candidate
ML-based movement models
Artur Lohrer (TUM): research associate
tactile sensor fusion
Marvin Ludersdorfer (TUM): MSc candidate
Saahil Ognawala (TUM): MSc candidate
recurrent neural networks
Christian Osendorfer (TUM): PhD candidate
unsupervised learning, deep networks
Thomas Rückstieß (TUM): PhD candidate
reinforcement learning and design
Patrick van der Smagt: Director of BRML labs
fortiss, An-Institut der Technischen Universität München
Professor for Biomimetic Robotics and Machine Learning, TUM
Chairman of Assistenzrobotik e.V.
smagt@brml.org, +49 89 289-25793
Benedikt Staffler (MPI/TUM): PhD candidate
CNNs for the connectome
Georg Stillfried (DLR): PhD candidate
kinematics of the human hand
Michael Strohmayr (DLR): postdoc
the DLR artificial skin
Sebastian Urban (TUM): PhD candidate
learning skin data
surban@tum.de, +49 89 289-25794
Holger Urbanek (DLR): PhD candidate
holger.urbanek@dlr.de, +49 8153 28-2450
Jörn Vogel (DLR): PhD candidate
BCI robot control
joern.vogel@dlr.de, +49 8153 28-2166
Your name could be here: want to join our team? Check out the positions on the left.
Below are our 15 most recent publications. If you need more, follow the link. And note: all downloadable PDFs are for personal use only. Please do not redistribute.