Nir Shavit (Hebrew: ניר שביט) is an Israeli computer scientist. He is a professor in the Computer Science Department at Tel Aviv University and a professor of electrical engineering and computer science at the Massachusetts Institute of Technology.

Nir Shavit
Alma mater: Technion, Hebrew University of Jerusalem
Known for: Software transactional memory, wait-free algorithms
Awards: Gödel Prize, Dijkstra Prize
Scientific career
Fields: Computer science (concurrent and parallel computing)
Thesis: (1990)
Website: www.cs.tau.ac.il/~shanir/

Nir Shavit received B.Sc. and M.Sc. degrees in computer science from the Technion - Israel Institute of Technology in 1984 and 1986, and a Ph.D. in computer science from the Hebrew University of Jerusalem in 1990. He is a co-author of the book The Art of Multiprocessor Programming, a winner of the 2004 Gödel Prize in theoretical computer science for his work on applying tools from algebraic topology to model shared-memory computability, and a winner of the 2012 Dijkstra Prize for the introduction and first implementation of software transactional memory. He is a past program chair of the ACM Symposium on Principles of Distributed Computing (PODC) and the ACM Symposium on Parallelism in Algorithms and Architectures (SPAA).

He heads the Computational Connectomics Group at MIT's Computer Science and Artificial Intelligence Laboratory. His research focuses on techniques for designing, implementing, and reasoning about multiprocessors, and in particular on the design of concurrent data structures for multi-core machines.

He co-founded the company Neural Magic together with Alexander Matveev. The company claims to use highly sparse neural networks to make deep learning so computationally efficient that GPUs are not needed, with a claimed speed-up of 175x for certain use cases.[2]

Recognition

Shavit was named an ACM Fellow in 2013.[1]

References

  1. ACM Names Fellows for Computing Advances that Are Transforming Science and Society Archived 2014-07-22 at the Wayback Machine, Association for Computing Machinery, accessed 2013-12-10.
  2. "The Future of Deep Learning is Sparse. - Neural Magic". 12 July 2019.

External links