Yichun Yin (尹伊淳)

Research Interests

Large model pre-training, neuro-symbolic systems


Education: Ph.D. (Class of 2018), School of Electronics Engineering and Computer Science, Peking University. Advisor: Ming Zhang.


Biography

Yichun Yin received his Ph.D. from the School of Electronics Engineering and Computer Science at Peking University. He is currently a Senior Researcher at Huawei Noah's Ark Lab, where he leads the research and development of the PanGu large language models. His work has been published at top venues in artificial intelligence, including ACL, EMNLP, and NeurIPS, and his papers have been cited more than 3,000 times. As a primary author of TinyBERT, he has deep expertise in model compression and optimization. His current research focuses on large language models, covering model architecture, analysis and optimization, and efficient training. With a solid academic background and strong technical ability, he continues to advance the field of large language models, and his work has had a significant impact in both academia and industry.


Selected Publications


Yu Pan, Ye Yuan, Yichun Yin, Zenglin Xu, Lifeng Shang, Xin Jiang, Qun Liu, 2023. Reusing Pretrained Models by Multi-linear Operators for Efficient Training.


Yichun Yin, Cheng Chen, Lifeng Shang, Xin Jiang, Xiao Chen, Qun Liu, 2021. AutoTinyBERT: Automatic Hyper-parameter Optimization for Efficient Pre-trained Language Models.


Jianhao Shen, Yichun Yin, Lin Li, Lifeng Shang, Xin Jiang, Ming Zhang, Qun Liu, 2021. Generate & Rank: A Multi-task Framework for Math Word Problems.


Xiaoqi Jiao, Yichun Yin, Lifeng Shang, Xin Jiang, Xiao Chen, Linlin Li, Fang Wang, Qun Liu, 2020. TinyBERT: Distilling BERT for Natural Language Understanding. (The most cited paper at EMNLP 2020)


Yichun Yin, Chenguang Wang, Ming Zhang, 2020. PoD: Positional Dependency-Based Word Embedding for Aspect Term Extraction.


Yichun Yin, Lifeng Shang, Xin Jiang, Xiao Chen, Qun Liu, 2019. Dialog State Tracking with Reinforced Data Augmentation.


Yichun Yin, Yangqiu Song, Ming Zhang, 2017. Document-Level Multi-Aspect Sentiment Classification as Machine Comprehension.


Yichun Yin, Furu Wei, Li Dong, Kaimeng Xu, Ming Zhang, Ming Zhou, 2016. Unsupervised Word and Dependency Path Embeddings for Aspect Term Extraction.