"It has been exciting joining Westlake University, where colleagues are conducting innovative research in different areas of natural and mathematical sciences. I have been enjoying my work with my lab members, and wish that our research will contribute to this great environment."
Biography
Dr. Yue Zhang is a tenured full professor at Westlake University. From September 2018 to June 2022, he was a tenured associate professor at Westlake University, and from July 2012 to August 2018, he was an assistant professor at the Singapore University of Technology and Design (SUTD). Before joining SUTD, he worked as a postdoctoral research associate at the University of Cambridge. He received a Ph.D. from the University of Oxford in December 2009 and an MSc from the University of Oxford in October 2006, working on statistical machine translation from Chinese to English by parsing. He received his undergraduate degree in Computer Science from Tsinghua University, China. He has served as a reviewer for top journals such as Computational Linguistics, Transactions of the Association for Computational Linguistics (action editor), IEEE Transactions on Big Data (associate editor), ACM Transactions on Asian and Low-Resource Language Information Processing (associate editor) and the Journal of Artificial Intelligence Research. He was PC co-chair of EMNLP 2022, IALP 2017 and CCL 2020, and an area chair of ACL 2017/18/19/20, COLING 2014/18, NAACL 2015/19, EMNLP 2015/17/19/20, EACL 2021 and IJCAI 2021.
History
2022
Tenured full professor, Westlake University
The 2022 Faculty Award for Excellence in Mentoring Students, Westlake University
2021
The 2021 Faculty Award for Excellence in Research, Westlake University
2018
Associate professor, Westlake University
2012
Assistant professor, Singapore University of Technology and Design (SUTD)
2010
Postdoctoral research associate, University of Cambridge
2006
Ph.D. degree, University of Oxford
2005
Master's degree, University of Oxford
1999
Bachelor's degree, Tsinghua University
Research
Yue Zhang leads the Text Intelligence Lab, which conducts research on language technologies. The main goal is to develop robust, open-domain human language understanding and synthesis technologies and to apply them to downstream tasks. Current work falls into three tiers. At the bottom tier, the lab investigates machine learning algorithms for fundamental language representations, including syntax, semantics and world knowledge, with a current focus on deep learning and transfer learning. At the middle tier, the lab studies fundamental NLP tasks such as syntactic and semantic analysis and text synthesis for Chinese and English, as well as information extraction tasks involving entities, relations, events and sentiment. At the top tier, the lab explores text mining applications that build on these language technologies, including financial technology based on text understanding, biomedical NLP and educational applications.
Representative Publications
1. Guangsheng Bao, Zhiyang Teng and Yue Zhang. 2023. Target-Side Augmentation for Document-Level Machine Translation. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL).
2. Linyi Yang, Yingpeng Ma and Yue Zhang. 2023. Measuring Consistency in Text-based Financial Forecasting Models. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL).
3. Xuefeng Bai, Yulong Chen and Yue Zhang. 2022. Graph Pre-training for AMR Parsing and Generation. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL).
4. Leyang Cui, Sen Yang and Yue Zhang. 2022. Investigating Non-local Features for Neural Constituency Parsing. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL).
5. Cunxiang Wang, Pai Liu and Yue Zhang. 2021. Can Generative Pre-trained Language Models Serve As Knowledge Bases for Closed-book QA? In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics (ACL).
6. Yafu Li, Yongjing Yin, Yulong Chen and Yue Zhang. 2021. On Compositional Generalization of Neural Machine Translation. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics (ACL).
7. Wenyu Du, Zhouhan Lin, Yikang Shen, Timothy J. O’Donnell, Yoshua Bengio and Yue Zhang. 2020. Exploiting Syntactic Structure for Better Language Modeling: A Syntactic Distance Approach. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (ACL).
8. Chen Jia, Xiaobo Liang and Yue Zhang. 2019. Cross-Domain NER using Cross-Domain Language Modeling. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL).