Chaitanya Chawla

chaitanya.chawla [at] tum [dot] de

I'm an ECE undergraduate student in the School of Computation, Information and Technology at the Technical University of Munich, where I have worked with Prof. Darius Burschka and Prof. Dongheui Lee. My work focuses on extracting and learning skill abstractions across humans and robots.

I am currently collaborating with Prof. Jean Oh and Tanmay Shankar at Carnegie Mellon University, where I am working on extracting abstract representations of agent-object interactions.

I have also been a part-time working student at Roboverse Reply GmbH, building applications for the Boston Dynamics Spot robot.

CV  /  GitHub  /  LinkedIn



 

Research

I am broadly interested in deep learning methods applied to robotics. I hope to build intelligent agents that can generalize to a diverse range of real-world tasks.

Translating Agent-Environment Interactions across Humans and Robots
 
T. Shankar, C. Chawla, A. Hassan, J. Oh
 
In Submission to the International Conference on Robotics and Automation (ICRA), 2024
 
Paper / Website / Code / Video
 

We developed an unsupervised approach for learning temporal abstractions of skills that incorporate agent-environment interactions, i.e., representations of patterns of object motion or of changes in environment state. Despite being completely unsupervised, our approach learns semantically meaningful skill segments across both robot and human demonstrations.

Robot-Agnostic Framework for One-Shot Intrinsic Feature Extraction
 
C. Chawla, A. Costinescu, D. Burschka
 
In Preparation for IEEE Transactions on Knowledge and Data Engineering
 
Report / Presentation / Code
 

We developed an algorithmic framework to extract different intrinsic features from human demonstrations. The features we are studying include interactions with objects along the trajectory, interactions with the background environment (e.g., wiping or writing), and the type of motion within a trajectory segment (e.g., shaking, rotating, or transporting).

Visual Teleoperation using Learning from Demonstration
 
C. Chawla, D. Lee
 
Independent Research Project
 
Report / Presentation / Code
 

We presented a method for learning human motions via a Learning-from-Demonstration approach. Using Dynamic Movement Primitives, we teleoperated a Franka Panda arm along the learned trajectories.


 


Projects

Face Recognition using Autoencoders and PCA
 
C. Chawla, K. Brenner
 
Course Project
 
Paper
 
Comparing different methods, including autoencoders and PCA, for feature representation in face reconstruction

Candy Throwing Robot for Halloween 2023!
 
C. Chawla
 
Two-hour project on Halloween Eve, Bot Intelligence Group
 
Distributing candy during Halloween at the Robotics Institute, Carnegie Mellon University
 


Teaching Assistantships

At TU Munich:

  • Robotic Control Laboratory (EI06931) with Dr. Stefan Sosnowski
  • Mathematical Analysis (MA9411) with Dr. Dominik Meidner
  • Physics for Electrical Engineers (PH9009) with Prof. Christian Back
  • Bridge Course for Mathematics (MA9001) with Dr. Dominik Meidner

Last updated: Jan 2023

    Imitation is the sincerest form of flattery