Geoffrey Hinton

Geoffrey Everest Hinton is an English-Canadian cognitive psychologist and computer scientist, most noted for his work on artificial neural networks. Since 2013 he has divided his time between Google (Google Brain) and the University of Toronto. In 2017, he co-founded and became the Chief Scientific Advisor of the Vector Institute in Toronto. With David Rumelhart and Ronald J. Williams, Hinton co-authored a highly cited 1986 paper that popularized the backpropagation algorithm for training multi-layer neural networks, although they were not the first to propose the approach. Hinton, together with Yoshua Bengio and Yann LeCun, is viewed as a leading figure in the deep learning community; the three are sometimes referred to as the “Godfathers of AI” or the “Godfathers of Deep Learning”, and were jointly awarded the 2018 Turing Award for their work on deep learning. The dramatic image-recognition milestone of AlexNet, designed by his student Alex Krizhevsky for the 2012 ImageNet challenge, helped to revolutionize the field of computer vision.

Hinton received his Ph.D. in artificial intelligence in 1978 for research supervised by Christopher Longuet-Higgins at the University of Edinburgh. After his Ph.D. he worked at the University of Sussex, the University of California, San Diego, and Carnegie Mellon University. He was the founding director of the Gatsby Charitable Foundation Computational Neuroscience Unit at University College London, and is currently a professor in the computer science department at the University of Toronto. He holds a Canada Research Chair in Machine Learning and is an advisor for the Learning in Machines & Brains program at the Canadian Institute for Advanced Research. Hinton joined Google in March 2013 when his company, DNNresearch Inc., was acquired.
While Hinton was a professor at Carnegie Mellon University (1982–1987), he, David E. Rumelhart, and Ronald J. Williams applied the backpropagation algorithm to multi-layer neural networks. Their experiments showed that such networks can learn useful internal representations of data. During the same period, Hinton co-invented Boltzmann machines with David Ackley and Terry Sejnowski. His other contributions to neural network research include distributed representations, time delay neural networks, mixtures of experts, Helmholtz machines, and products of experts. Notable former Ph.D. students and postdoctoral researchers from his group include Richard Zemel, Brendan Frey, Radford M. Neal, Ruslan Salakhutdinov, Ilya Sutskever, Yann LeCun, and Zoubin Ghahramani.
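The idea described above can be sketched in a few lines: errors at the output are propagated backward through the layers to compute gradient-descent updates, and the hidden units come to encode a useful internal representation of the input. The sketch below, using NumPy, trains a tiny two-layer network on XOR; the task, network sizes, and hyperparameters are illustrative assumptions, not the original 1986 experiments.

```python
import numpy as np

# Illustrative sketch of backpropagation on a small two-layer network.
# The XOR task, layer sizes, and learning rate are assumptions for the
# demo, not a reconstruction of the original experiments.
rng = np.random.default_rng(0)

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])       # XOR targets

W1 = rng.normal(size=(2, 4))                 # input -> hidden weights
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))                 # hidden -> output weights
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)                 # hidden "internal representation"
    return h, sigmoid(h @ W2 + b2)

_, out0 = forward(X)
initial_loss = float(np.mean((out0 - y) ** 2))

lr = 1.0
for _ in range(5000):
    h, out = forward(X)
    # Backward pass: propagate error derivatives layer by layer
    # (mean-squared-error loss, sigmoid units).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent weight updates.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

_, out = forward(X)
final_loss = float(np.mean((out - y) ** 2))
```

XOR is the classic demonstration because no single-layer network can separate its inputs; after training, the hidden activations `h` form the learned internal representation that makes the problem linearly separable for the output unit.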

Awards

  • 1998: elected a Fellow of the Royal Society (FRS)

  • 2001: the first winner of the Rumelhart Prize

  • 2001: awarded an Honorary Doctorate from the University of Edinburgh

  • 2005: recipient of the IJCAI Award for Research Excellence, a lifetime-achievement award

  • 2011: Herzberg Canada Gold Medal for Science and Engineering

  • 2013: awarded an Honorary Doctorate from the Université de Sherbrooke

  • 2016: elected a foreign member of the National Academy of Engineering

  • 2016: IEEE/RSE Wolfson James Clerk Maxwell Award

  • 2016: BBVA Foundation Frontiers of Knowledge Award

  • 2018: Turing Award (with Yann LeCun and Yoshua Bengio)

  • 2018: appointed a Companion of the Order of Canada