Christopher David Manning (born September 18, 1965) is a computer scientist and applied linguist whose research in natural language processing, artificial intelligence, and machine learning is considered highly influential. He is the current Director of the Stanford Artificial Intelligence Laboratory (SAIL).
Manning has been described as “the leading researcher in natural language processing”,[1] well known for co-developing GloVe word vectors; the bilinear or multiplicative form of attention, now widely used in artificial neural networks including the transformer; tree-structured recursive neural networks; and approaches to and systems for textual entailment. His main educational contributions are his textbooks Foundations of Statistical Natural Language Processing (1999) and Introduction to Information Retrieval (2008), and his course CS224N Natural Language Processing with Deep Learning, which is available online. Manning also pioneered the development of well-maintained open-source computational linguistics software packages, including CoreNLP, Stanza, and GloVe.[2][3][4][5]
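As a brief illustration of the bilinear form (a sketch following the notation of Luong, Pham, and Manning, 2015), the attention score between a decoder hidden state $h_t$ and an encoder hidden state $\bar{h}_s$ is computed through a learned weight matrix $W_a$:

$$\mathrm{score}(h_t, \bar{h}_s) = h_t^{\top} W_a \bar{h}_s$$

Because the score is linear in each of the two vectors separately, it is bilinear; setting $W_a = I$ recovers the simpler dot-product score $h_t^{\top} \bar{h}_s$.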
Manning is the Thomas M. Siebel Professor in Machine Learning and a professor of Linguistics and Computer Science at Stanford University. He received a BA (Hons) degree majoring in mathematics, computer science, and linguistics from the Australian National University (1989) and a PhD in linguistics from Stanford (1994), under the guidance of Joan Bresnan.[6][7] He was an assistant professor at Carnegie Mellon University (1994–96) and a lecturer at the University of Sydney (1996–99) before returning to Stanford as an assistant professor. At Stanford, he was promoted to associate professor in 2006 and to full professor in 2012. He was elected an AAAI Fellow in 2010.[8] He served as President of the Association for Computational Linguistics in 2015 and received an honorary doctorate from the University of Amsterdam in 2023. Manning was awarded the IEEE John von Neumann Medal “for advances in computational representation and analysis of natural language” in 2024.[9][1]
Manning's linguistic work includes his dissertation Ergativity: Argument Structure and Grammatical Relations (1996), a monograph Complex Predicates and Information Spreading in LFG (1999),[10] and his work developing Universal Dependencies,[11] which made him the namesake of Manning's Law.
Manning's PhD students include Dan Klein, Sepandar Kamvar, Richard Socher, and Danqi Chen.[7] In 2021, he joined AIX Ventures,[12] a venture capital fund that invests in artificial intelligence startups, as an investing partner.