Tianshu Kuai

I am a first-year CS PhD student at the Université de Montréal and Mila, advised by Prof. Noam Aigerman.

I received my master's degree from the Department of Computer Science at the University of Toronto, where I was advised by Prof. Igor Gilitschenski. Before that, I graduated from the Engineering Science program at the University of Toronto, majoring in Robotics. During my undergraduate studies, I was fortunate to work with Prof. Steven L. Waslander on 3D LiDAR object detection. I also did a research internship at Samsung AI Center Toronto in 2023, working with Dr. Alex Levinshtein.

My current research focuses on geometry processing using deep learning. I am also interested in related topics in 3D vision and graphics. Previously, I worked on non-rigid 3D reconstruction, 3D scene editing, and autonomous driving perception.

Email  /  CV  /  Google Scholar  /  Github  /  LinkedIn  /  Twitter


Research
Towards Unsupervised Blind Face Restoration using Diffusion Prior
Tianshu Kuai, Sina Honari, Igor Gilitschenski, Alex Levinshtein
WACV 2025
project page / arXiv
CAMM: Building Category-Agnostic and Animatable 3D Models from Monocular Videos
Tianshu Kuai, Akash Karthikeyan, Yash Kant, Ashkan Mirzaei, Igor Gilitschenski
CVPRW 2023
project page / paper / arXiv / code / data
Self-Supervised Image-to-Point Distillation via Semantically Tolerant Contrastive Loss
Anas Mahmoud, Jordan S.K. Hu, Tianshu Kuai, Ali Harakeh, Liam Paull, Steven L. Waslander
CVPR 2023
paper / arXiv / code
Point Density-Aware Voxels for LiDAR 3D Object Detection
Jordan S.K. Hu, Tianshu Kuai, Steven L. Waslander
CVPR 2022
paper / arXiv / code

Work Experience
Research Intern
Samsung AI Center Toronto
May 2023 - April 2024 | Toronto, ON
Machine Learning Research Intern
Qualcomm
May 2020 - May 2021 | Toronto, ON

Academic Service
  • Reviewer: CVPR 2023, ECCV 2024, WACV (2024, 2025), AAAI 2025

Misc
  • I help maintain a curated list of papers on NeRF editing at awesome-nerf-editing. Contributions to expand and improve the collection are welcome.