Christopher D. Manning

Christopher David Manning (born September 18, 1965) is a computer scientist and applied linguist whose research in natural language processing, artificial intelligence, and machine learning is considered highly influential. He is the current Director of the Stanford Artificial Intelligence Laboratory (SAIL).

Manning has been described as “the leading researcher in natural language processing”,[1] and is well known for co-developing GloVe word vectors; the bilinear or multiplicative form of attention, now widely used in artificial neural networks including the transformer; tree-structured recursive neural networks; and approaches to and systems for textual entailment. His main educational contributions are his textbooks Foundations of Statistical Natural Language Processing (1999) and Introduction to Information Retrieval (2008), and his course CS224N: Natural Language Processing with Deep Learning, which is available online. Manning also pioneered the development of well-maintained open-source computational linguistics software packages, including CoreNLP, Stanza, and GloVe.[2][3][4][5]
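
A sketch of the first two of these contributions, in the notation of the original papers (h_t and \bar{h}_s are decoder and encoder hidden states, W_a a learned matrix, X_{ij} a word co-occurrence count, f a weighting function, and w_i, \tilde{w}_j, b_i, \tilde{b}_j learned vectors and biases): the multiplicative (bilinear) attention score is

  \mathrm{score}(h_t, \bar{h}_s) = h_t^\top W_a \bar{h}_s ,

and the GloVe objective fits word vectors to the logarithm of co-occurrence counts,

  J = \sum_{i,j=1}^{V} f(X_{ij}) \left( w_i^\top \tilde{w}_j + b_i + \tilde{b}_j - \log X_{ij} \right)^2 .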

Manning is the Thomas M. Siebel Professor in Machine Learning and a professor of Linguistics and Computer Science at Stanford University. He received a BA (Hons) degree in mathematics, computer science, and linguistics from the Australian National University (1989) and a PhD in linguistics from Stanford (1994) under the guidance of Joan Bresnan.[6][7] He was an assistant professor at Carnegie Mellon University (1994–96) and a lecturer at the University of Sydney (1996–99) before returning to Stanford as an assistant professor, where he was promoted to associate professor in 2006 and to full professor in 2012. He was elected an AAAI Fellow in 2010,[8] served as President of the Association for Computational Linguistics in 2015, and received an honorary doctorate from the University of Amsterdam in 2023.[9] In 2024, Manning was awarded the IEEE John von Neumann Medal “for advances in computational representation and analysis of natural language”.[1]

Manning's linguistic work includes his dissertation Ergativity: Argument Structure and Grammatical Relations (1996), a monograph Complex Predicates and Information Spreading in LFG (1999),[10] and his work developing Universal Dependencies,[11] whose design criteria are informally known as Manning's Law, named after him.
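
As an illustration of how this work surfaces in the software mentioned above, the following minimal sketch uses the Stanza package to produce a Universal Dependencies analysis of a sentence (it assumes Stanza is installed via pip and that the English models can be downloaded; the example sentence is invented):

  import stanza

  # One-time download of the English models (assumes network access).
  stanza.download('en')

  # Build a pipeline: tokenization, POS tagging, lemmatization, dependency parsing.
  nlp = stanza.Pipeline('en')
  doc = nlp("Manning wrote the textbook.")

  # Each word carries a Universal Dependencies POS tag and a relation to its head.
  for sent in doc.sentences:
      for word in sent.words:
          head = sent.words[word.head - 1].text if word.head > 0 else "ROOT"
          print(word.text, word.upos, word.deprel, head)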

Manning's PhD students include Dan Klein, Sepandar Kamvar, Richard Socher, and Danqi Chen.[7] In 2021, he joined AIX Ventures,[12] a venture capital fund that invests in artificial intelligence startups, as an Investing Partner.

  1. ^ a b "Christopher D. Manning". IEEE. Retrieved 27 October 2024.
  2. ^ "Christopher D Manning - AD Scientific Index 2022". www.adscientificindex.com. Retrieved 22 February 2022.
  3. ^ "Christopher Manning". CIFAR. Retrieved 22 February 2022.
  4. ^ "Laying the foundation for today's generative AI". Stanford. Retrieved 27 October 2024.
  5. ^ "Stanford NLP Group". Retrieved 23 April 2023.
  6. ^ Manning, Christopher. "Christopher Manning". The Stanford Natural Language Processing Group. Retrieved 24 May 2022.
  7. ^ a b Manning, Christopher. "Christopher Manning and Ph.D. Students' Dissertations". The Stanford Natural Language Processing Group. Retrieved 24 May 2022.
  8. ^ "Elected AAAI Fellows". AAAI. Retrieved 6 January 2024.
  9. ^ "UvA honorary doctorates for psychiatrist Vikram Patel and computer scientist Christopher Manning". Retrieved 23 April 2023.
  10. ^ "Complex Predicates and Information Spreading in LFG". Retrieved 23 April 2023.
  11. ^ de Marneffe, Marie-Catherine; Manning, Christopher D.; Nivre, Joakim; Zeman, Daniel (13 July 2021). "Universal Dependencies". Computational Linguistics. 47 (2): 255–308. doi:10.1162/coli_a_00402. ISSN 0891-2017. S2CID 219304854. Retrieved 22 February 2022.
  12. ^ "AIX Ventures - An AI Fund". AIX Ventures. Retrieved 13 January 2023.