In this project, a model of ultrasonic motors is to be developed and a robust controller designed for it. The stability and robustness of the controller are to be analysed, the motor parameters identified, and the control scheme implemented and evaluated on a 1-DoF testbed.
You will work with real robotic data to identify the control parameters of a light-weight robotic arm.
Why are robotic hands so clumsy? To address this problem, we investigate the use of tactile sensors for robotic in-hand manipulation. In particular, based on several tactile sensors (the BioTac, the DLR tactile sensor, and the iCub tactile sensor), we develop algorithms that use such sensing to manipulate objects. In plain terms: writing, unscrewing a light bulb, or handing over a knife with a robotic hand.
We exploit deep learning, deep recurrent neural networks, Gaussian Processes, random forests, and more for the representation of sensor and movement data. This includes tactile and vision applications, multimodal sensory integration, BMI data, etc.
We create detailed models of the kinematic and dynamic properties of human arms, hands, fingers, and legs. To this end we use recurrent neural networks, GP-LVMs, and other machine-learning methods to describe the kinematics, statics, and stiffness of human movement. At the same time, biomechanical and computational neural models complete the picture.
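As a minimal sketch of learning kinematics from data, the following example (hypothetical link lengths and names; a stand-in for the methods above) fits a Gaussian-process regressor to the forward kinematics of a planar two-link "finger", mapping joint angles to fingertip position:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical planar two-link finger, link lengths 4 cm and 3 cm.
link1, link2 = 0.04, 0.03

def fingertip(q):
    """Analytic forward kinematics: joint angles (n, 2) -> tip positions (n, 2)."""
    x = link1 * np.cos(q[:, 0]) + link2 * np.cos(q[:, 0] + q[:, 1])
    y = link1 * np.sin(q[:, 0]) + link2 * np.sin(q[:, 0] + q[:, 1])
    return np.stack([x, y], axis=1)

rng = np.random.default_rng(1)
q = rng.uniform(0.0, np.pi / 2, size=(300, 2))   # sampled joint angles [rad]
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(q, fingertip(q))

q_test = np.array([[0.3, 0.6]])
pred = gp.predict(q_test)[0]
true = fingertip(q_test)[0]
print("max abs error [m]:", np.abs(pred - true).max())
```

On real motion-capture data the target function is not known analytically; the GP (or an RNN) then serves as the learned kinematic model itself.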
Limb rehabilitation and assistive robotics are paramount applications of the techniques developed in biomimetic robotics. We focus on human-computer interfaces that help disabled users regain lost limb function. In our view, both rehabilitation and prosthetics rely on re-establishing the sensori-motor loop with the missing limb. This loop runs in both directions: feed-forward control by detecting the user's intention to move, and sensory feedback by transducing digital readings into sensations.
Axel von Arnim (fortiss): software architect
Human Brain Project - Neurorobotics
Benedikt Staffler (MPI/TUM): PhD candidate
CNNs for the connectome
Christian Osendorfer (TUM): PhD candidate
unsupervised learning, deep networks
Daniela Korhammer (TUM): PhD candidate
ML-based movement models
Dhahanjay Shah (TUM): MSc candidate
machine learning for tactile sensing
Frederik Diehl (TUM): student assistant
tactile force learning
Georg Stillfried (DLR): PhD candidate
kinematics of the human hand
Hannes Höppner (DLR): PhD candidate
hannes.hoeppner@dlr.de, +49 8153 28-1062
Holger Urbanek (DLR): PhD candidate
holger.urbanek@dlr.de, +49 8153 28-2450
Jörn Vogel (DLR): PhD candidate
BCI robot control
joern.vogel@dlr.de, +49 8153 28-2166
Justin Bayer (TUM): PhD candidate
time series learning
Lucia Seitz (TUM): BSc candidate
brain implant data regression
Mark Hartenstein (TUM): MSc candidate
Markus Kühne (TUM): PhD candidate
MR-compatible haptic interfaces
Marvin Ludersdorfer (fortiss): PhD candidate
Max Fiedler (TUM): MSc candidate
unsupervised learning for tactile sensing
Maximilian Karl (TUM): PhD candidate
Michael Strohmayr (DLR): postdoc
the DLR artificial skin
Nutan Chen (TUM): PhD candidate
Patrick van der Smagt: Director of BRML labs
fortiss, An-Institut der Technischen Universität München
Professor for Biomimetic Robotics and Machine Learning, TUM
Chairman of Assistenzrobotik e.V.
smagt@brml.org, +49 89 289-25793
Philip Häusser (TUM): PhD candidate
deep learning for vision
Rachel Hornung (DLR): PhD candidate
Sebastian Urban (TUM): PhD candidate
learning skin data
surban@tum.de, +49 89 289-25794
Sören Jentzsch (fortiss): PhD candidate
Human Brain Project: spiking networks
Thomas Rückstieß (TUM): PhD candidate
reinforcement learning and design
Your name could be here. Want to join our team? Check out the positions on the left.
Below are our 15 most recent publications; follow the link if you need more. Note: all downloadable PDFs are for personal use only. Please do not redistribute.