Faculty

At Westlake, we welcome talented people, outstanding scholars, research fellows, and young scientists from all backgrounds. We expect to have a community of 300 assistant, associate, and full professors (including chair professors), 600 research, teaching, technical support and administrative staff, and 900 postdoctoral fellows by 2026.

Tao Lin, Ph.D.

School of Engineering

Artificial Intelligence and Data Science (AI)

Contact

Website: https://lins-lab.github.io

"Simplicity is the ultimate sophistication." -- Leonardo da Vinci

Biography

In Winter 2022, I will join Westlake University as a tenure-track Assistant Professor. Prior to that, I defended my Ph.D. thesis in June 2022 as a doctoral student at MLO, École Polytechnique Fédérale de Lausanne (EPFL), where I was fortunate to be supervised by Prof. Martin Jaggi and Prof. Babak Falsafi. I received a Master of Science degree from EPFL and a Bachelor of Engineering degree from Zhejiang University (ZJU).

History

2022

Assistant Professor, School of Engineering, Westlake University

Ph.D. degree, École Polytechnique Fédérale de Lausanne (EPFL)

2017

Master's degree, École Polytechnique Fédérale de Lausanne (EPFL)

2014

Bachelor's degree, Zhejiang University

Research

My recent research interests lie at the intersection of optimization and generalization for deep learning:

  • leveraging theoretical and empirical understanding (e.g., of the loss landscape and training dynamics)

  • to design efficient and robust methods (for both learning and inference)

  • for centralized deep learning and collaborative deep learning (distributed and/or decentralized),

  • under imperfect environments (e.g., noisy, heterogeneous, or hardware-constrained); a minimal sketch of one such collaborative setting follows this list.
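
To make the collaborative, heterogeneous setting above concrete, here is a minimal sketch of local SGD with periodic model averaging on a toy least-squares problem. It is written in plain Python/NumPy under illustrative assumptions (the data model, worker count, and step sizes are hypothetical) and is not the lab's actual code; see the publications below, in particular nos. 2 and 11, for the methods themselves.

# Minimal, hypothetical sketch: local SGD with periodic model averaging
# across workers whose data distributions differ (a simple stand-in for
# "heterogeneous" collaborative learning). All constants are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_workers, dim, local_steps, rounds, lr = 4, 10, 5, 50, 0.05

# Heterogeneous data: each worker sees the shared signal plus its own drift.
shared_w = rng.normal(size=dim)
workers = []
for _ in range(n_workers):
    X = rng.normal(size=(100, dim))
    y = X @ (shared_w + 0.5 * rng.normal(size=dim))  # worker-specific shift
    workers.append((X, y))

w_global = np.zeros(dim)
for _ in range(rounds):
    local_models = []
    for X, y in workers:
        w = w_global.copy()
        for _ in range(local_steps):              # a few local SGD steps
            idx = rng.integers(0, len(X), size=10)
            grad = X[idx].T @ (X[idx] @ w - y[idx]) / len(idx)
            w -= lr * grad
        local_models.append(w)
    w_global = np.mean(local_models, axis=0)      # periodic averaging

print("distance to the shared signal:", np.linalg.norm(w_global - shared_w))

Each worker runs several cheap local steps before models are averaged, which saves communication; the worker-specific drift illustrates why heterogeneity biases naive averaging, the issue several of the papers below address.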

We are actively looking for postdoctoral researchers, prospective Ph.D. students, research assistants/interns, and visiting scholars to join our lab. For detailed position descriptions, please see our website: https://lins-lab.github.io/openings/

Representative Publications

1. Hao Zhao, Yuejiang Liu, Alexandre Alahi, Tao Lin#. "On Pitfalls of Test-time Adaptation." ICML 2023.

2. Liangze Jiang*, Tao Lin*#. "Test-Time Robust Personalization for Federated Learning." ICLR 2023.

3. Thijs Vogels*, Lie He*, Anastasia Koloskova, Tao Lin, Sai Praneeth Karimireddy, Sebastian U. Stich, Martin Jaggi. "RelaySum for Decentralized Deep Learning on Heterogeneous Data." NeurIPS 2021.

4. Tao Lin#, Sai Praneeth Karimireddy, Sebastian U. Stich, Martin Jaggi. "Quasi-Global Momentum: Accelerating Decentralized Deep Learning on Heterogeneous Data." ICML 2021.

5. Lingjing Kong*, Tao Lin*#, Anastasia Koloskova, Martin Jaggi, Sebastian U. Stich. "Consensus Control for Decentralized Deep Learning." ICML 2021.

6. Tao Lin*#, Lingjing Kong*, Sebastian U. Stich, Martin Jaggi. "Ensemble Distillation for Robust Model Fusion in Federated Learning." NeurIPS 2020.

7. Mengjie Zhao*, Tao Lin*, Fei Mi, Martin Jaggi, Hinrich Schütze. "Masking as an Efficient Alternative to Finetuning for Pretrained Language Models." EMNLP 2020.

8. Tao Lin*#, Lingjing Kong*, Sebastian U. Stich, Martin Jaggi. "Extrapolation for Large-batch Training in Deep Learning." ICML 2020.

9. Tao Lin#, Sebastian U. Stich, Luis Barba, Daniil Dmitriev, Martin Jaggi. "Dynamic Model Pruning with Feedback." ICLR 2020.

10. Anastasia Koloskova*, Tao Lin*, Sebastian U. Stich, Martin Jaggi. "Decentralized Deep Learning with Arbitrary Communication Compression." ICLR 2020.

11. Tao Lin#, Sebastian U. Stich, Kumar Kshitij Patel, Martin Jaggi. "Don't Use Large Mini-Batches, Use Local SGD." ICLR 2020.

12. Tian Guo, Tao Lin, Nino Antulov-Fantulin. "Exploring Interpretable LSTM Neural Networks over Multi-Variable Data." ICML 2019.

13. Tao Lin*#, Tian Guo*, Karl Aberer. "Hybrid Neural Networks for Learning the Trend in Time Series." IJCAI 2017.
