Dr. Lingpeng Kong

Ph.D., Carnegie Mellon University
Assistant Professor

Tel: (+852) 2857 8271
Fax: (+852) 3585 1012
Email: lpk
Homepage: https://ikekonglp.github.io/
Dr. Lingpeng Kong is an Assistant Professor in the Department of Computer Science at the University of Hong Kong (HKU). His research tackles core problems in natural language processing (NLP) by designing representation learning algorithms that exploit linguistic structure. His work lies at the intersection of deep learning and structured prediction, with applications in syntactic parsing, speech recognition, social media analysis, and machine translation. Before joining HKU, he was a research scientist at Google DeepMind from 2017 to 2020. He obtained his Ph.D. from Carnegie Mellon University in 2017, co-advised by Noah Smith and Chris Dyer.

Research Interests

Artificial Intelligence, Machine Learning, Natural Language Processing

Selected Publications

  • Adhiguna Kuncoro*, Lingpeng Kong*, Daniel Fried, Dani Yogatama, Laura Rimell, Chris Dyer, Phil Blunsom, Syntactic Structure Distillation Pretraining For Bidirectional Encoders, Transactions of the Association for Computational Linguistics, September 2020. (*equal contribution) (TACL 2020)
  • Lingpeng Kong, Cyprien de Masson d'Autume, Wang Ling, Lei Yu, Zihang Dai, Dani Yogatama, A Mutual Information Maximization Perspective of Language Representation Learning, In International Conference on Learning Representations, Ethiopia, April 2020. (ICLR 2020)
  • Cyprien de Masson d'Autume, Sebastian Ruder, Lingpeng Kong, Dani Yogatama, Episodic Memory in Lifelong Language Learning, In Advances in Neural Information Processing Systems, Vancouver, Canada, December 2019. (NeurIPS 2019)
  • Lingpeng Kong, Gabor Melis, Wang Ling, Lei Yu, Dani Yogatama, Variational Smoothing in Recurrent Neural Network Language Models, In International Conference on Learning Representations, New Orleans, Louisiana, May 2019. (ICLR 2019)
  • Adhiguna Kuncoro, Miguel Ballesteros, Lingpeng Kong, Chris Dyer, Graham Neubig, Noah A. Smith, What Do Recurrent Neural Network Grammars Learn About Syntax?, In Proceedings of the Conference of the European Chapter of the Association for Computational Linguistics, Valencia, Spain, January 2017. [Outstanding Paper Award] (EACL 2017)
  • Adhiguna Kuncoro, Miguel Ballesteros, Lingpeng Kong, Chris Dyer, and Noah A. Smith, Distilling an Ensemble of Greedy Dependency Parsers into One MST Parser, In Proceedings of the Conference on Empirical Methods in Natural Language Processing, Austin, TX, November 2016. (EMNLP 2016)
  • Lingpeng Kong, Chris Dyer, Noah A. Smith, Segmental Recurrent Neural Networks, In Proceedings of the International Conference on Learning Representations, Puerto Rico, May 2016. (ICLR 2016)
  • Dani Yogatama, Lingpeng Kong, and Noah A. Smith, Bayesian Optimization of Text Representations, In Proceedings of the Conference on Empirical Methods in Natural Language Processing, Lisbon, Portugal, September 2015. (EMNLP 2015)
  • Lingpeng Kong, Alexander M. Rush, and Noah A. Smith, Transforming Dependencies into Phrase Structures, In Proceedings of the Conference of the North American Chapter of the Association for Computational Linguistics, Denver, CO, May 2015. (NAACL 2015)
  • Lingpeng Kong, Nathan Schneider, Swabha Swayamdipta, Archna Bhatia, Chris Dyer, and Noah A. Smith, A Dependency Parser for Tweets, In Proceedings of the Conference on Empirical Methods in Natural Language Processing, Doha, Qatar, October 2014. (EMNLP 2014)