
Stockfish is a free and open-source[3] UCI chess engine, available for various desktop and mobile platforms. It is developed by Marco Costalba, Joona Kiiski, Gary Linscott and Tord Romstad, with many contributions from a community of open-source developers.[4]

Developer(s): Marco Costalba, Joona Kiiski, Gary Linscott, Tord Romstad[1]
Initial release: November 2, 2008
Stable release: 10 / November 29, 2018
Written in: C++
Operating system: Microsoft Windows
Type: Chess engine
License: GNU GPLv3[2]
DroidFish is a free Android chess program that bundles the Stockfish engine.

Stockfish is consistently ranked first or near the top of most chess-engine rating lists and is the strongest open-source chess engine in the world.[5][6][7] It won the unofficial world computer chess championships in season 6 (2014), season 9 (2016), season 11 (2018), season 12 (2018), and season 13 (2018). It finished runner-up in season 5 (2013), season 7 (2014) and season 8 (2015). Stockfish is derived from Glaurung, an open-source engine by Romstad.



Stockfish can use up to 512 CPU threads in multiprocessor systems. The maximal size of its transposition table is 128 GB. Stockfish implements an advanced alpha–beta search and uses bitboards. Compared to other engines, it is characterized by its great search depth, due in part to more aggressive pruning and late move reductions.[8][9]
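Alpha–beta search prunes branches of the game tree that cannot affect the final minimax value, which is what allows such deep searches in the first place. The core idea can be sketched as follows; this is a minimal, illustrative negamax formulation in Python, not Stockfish's actual C++ implementation, which adds transposition tables, bitboard move generation, and many pruning and reduction heuristics on top of it:

```python
def alphabeta(node, depth, alpha, beta, evaluate, children):
    """Negamax alpha-beta search: the score is always from the
    perspective of the side to move at `node`."""
    kids = children(node)
    if depth == 0 or not kids:
        return evaluate(node)
    best = float("-inf")
    for child in kids:
        # Negamax: the opponent's best score is the negation of ours.
        score = -alphabeta(child, depth - 1, -beta, -alpha, evaluate, children)
        best = max(best, score)
        alpha = max(alpha, score)
        if alpha >= beta:
            break  # Beta cutoff: the opponent already has a better option.
    return best

# Toy game tree: lists are internal nodes, integers are leaf evaluations.
tree = [[3, 5], [2, 9]]
value = alphabeta(tree, 4, float("-inf"), float("inf"),
                  lambda n: n if isinstance(n, int) else 0,
                  lambda n: n if isinstance(n, list) else [])
```

In the toy tree above the second subtree is cut off after its first leaf (2), because the first subtree already guarantees the root a value of 3; more aggressive pruning schemes extend this idea by also cutting branches that are merely unlikely, rather than provably unable, to matter.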

Stockfish supports Chess960, one of the features inherited from Glaurung. Syzygy tablebase support, previously available in a fork maintained by Ronald de Man, was integrated into Stockfish in 2014.[10] In 2018, support for the 7-man Syzygy tablebases was added shortly after they became available.


Since 2013, Stockfish has been developed using a distributed testing framework named Fishtest, where volunteers are able to donate CPU time for testing improvements to the program.[11][12][13]

Changes to the game-playing code are accepted or rejected based on the results of tens of thousands of games played on the framework against an older "reference" version of the program, using sequential probability ratio testing. Tests on the framework are verified using the chi-squared test, and only results that are statistically significant are deemed reliable and used to revise the code.
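A sequential probability ratio test stops a match as soon as the accumulated evidence favors one hypothesis strongly enough, rather than after a fixed number of games. The sketch below uses a common normal approximation of the log-likelihood ratio over a win/draw/loss record; Fishtest's actual implementation is a generalized SPRT computed over game-pair statistics, so this is an illustrative simplification, not the real test:

```python
import math

def sprt_bounds(alpha=0.05, beta=0.05):
    """Lower (reject) and upper (accept) thresholds for the LLR."""
    return math.log(beta / (1 - alpha)), math.log((1 - beta) / alpha)

def llr(wins, draws, losses, elo0, elo1):
    """Approximate log-likelihood ratio of H1 (true strength elo1)
    versus H0 (true strength elo0), given a win/draw/loss record."""
    n = wins + draws + losses
    w, d = wins / n, draws / n
    score = w + d / 2                   # mean score per game
    var = w + d / 4 - score * score     # per-game score variance
    s0 = 1 / (1 + 10 ** (-elo0 / 400))  # expected score under H0
    s1 = 1 / (1 + 10 ** (-elo1 / 400))  # expected score under H1
    return (s1 - s0) * (2 * score - s0 - s1) * n / (2 * var)
```

A test would play games until `llr` crosses the upper bound (the patch is accepted as stronger) or the lower bound (it is rejected); with alpha = beta = 0.05 the bounds are roughly ±2.94.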

As of June 2018, the framework had used a total of more than 1,200 years of CPU time to play more than 840 million chess games.[14] In the twelve months after Fishtest's inception, Stockfish gained roughly 120 Elo points, propelling it to the top of all major rating lists.[15][16] With Stockfish 7, Fishtest author Gary Linscott was added to the official list of authors in acknowledgement of his contribution to Stockfish's strength.

Competition results

Participation in TCEC

In 2013, Stockfish finished runner-up in both TCEC Season 4 and Season 5, losing the Superfinal 23–25 first to Houdini 3 and then to Komodo 1142. Season 5 was notable in that the winning Komodo team accepted the award posthumously on behalf of the program's creator, Don Dailey, who succumbed to an illness during the final stage of the event. In his honor, the version of Stockfish released shortly after that season was named "Stockfish DD".[17]

On 30 May 2014, Stockfish 170514 (a development version of Stockfish 5 with tablebase support) convincingly won TCEC Season 6, scoring 35.5–28.5 against Komodo 7x in the Superfinal.[18] Stockfish 5 was released the following day.[19] In TCEC Season 7, Stockfish again reached the Superfinal but lost to Komodo 30.5–33.5.[18] In TCEC Season 8, despite losses on time caused by buggy code, Stockfish once more qualified for the Superfinal, but lost the ensuing 100-game match to Komodo 46.5–53.5.[18]

Stockfish 8 won TCEC Season 9 in 2016, defeating Houdini 5 by a score of 54.5–45.5.[20] Stockfish finished third in Season 10 and went on to win Seasons 11 (59–41 against Houdini 6.03),[21] 12 (60–40 against Komodo 12.1.1)[22] and 13 (55–45 against Komodo 2155.00)[23] convincingly.[24]

Stockfish versus Nakamura

Stockfish's strength relative to the best human chess players was most apparent in a handicap match with grandmaster Hikaru Nakamura (rated 2798) in August 2014. In the first two games of the match, Nakamura had the assistance of an older version of Rybka; in the next two, he received White with pawn odds but no assistance. Nakamura was the world's fifth-best human player at the time, while Stockfish was denied the use of its opening book and endgame tablebases. Stockfish won each half of the match 1.5–0.5. Both of Stockfish's wins arose from positions in which Nakamura, as is typical of his playing style, pressed for a win instead of acquiescing to a draw.[25]

An artificial-intelligence approach designed by Jean-Marc Alliot of the Institut de recherche en informatique de Toulouse ("Toulouse Computer Science Research Institute"), which compares grandmasters' moves against those of Stockfish, rated Magnus Carlsen as the best player of all time, as of all World Chess Champions he had the highest probability of playing the moves that Stockfish suggested.[26]
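Alliot's actual method is more elaborate than a simple tally (it models the cost of each deviation from the engine's choice), but the underlying idea of measuring agreement between a player's moves and an engine's first choices can be illustrated with a purely hypothetical sketch:

```python
def move_match_rate(player_moves, engine_moves):
    """Fraction of positions where the player's move matched the
    engine's preferred move. Moves are given in algebraic notation."""
    assert len(player_moves) == len(engine_moves)
    agree = sum(p == e for p, e in zip(player_moves, engine_moves))
    return agree / len(player_moves)

# Hypothetical example: the player matched the engine on 2 of 3 moves.
rate = move_match_rate(["e4", "Nf3", "Bb5"], ["e4", "Nf3", "Bc4"])
```

A higher agreement rate across a player's career games would, under this simplified view, indicate play closer to the engine's.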

Computer chess tournament

In November 2017, Chess.com held an open tournament of the ten strongest chess engines, leading to a "Superfinal" between the two finalists, Stockfish and Houdini. In the 20-game Superfinal, Stockfish beat Houdini by a score of 10.5–9.5. Five games were decisive and the other 15 were drawn; of the decisive games, Stockfish won three (one as Black) and Houdini won two (both as Black). The average game length was 199.5 ply (about 100 moves).[27][28] The tournament used a variety of time controls, and the engines were allocated equal computing resources: each had its own dedicated AWS virtualized instance of a hyperthreaded Intel Xeon 2.90 GHz (two processors, each with 18 cores) with 60 GB of RAM, running on a Windows-based server.[27]

Stockfish also won the 2018 edition of this tournament, again defeating Houdini in the final, this time with a score of 120/200. During the bonus-games segment of the tournament, Stockfish lost a game to the neural-network engine Leela Chess Zero (also known as "Lc0") despite Leela playing with a one-pawn handicap. It is unclear whether Lc0 was running on the same hardware as Stockfish or whether it was the Google Cloud version.[29]

In internal tests by the developer of Leela Chess Zero (who is also one of the developers of Stockfish), Stockfish won 76 games, Leela Chess Zero won 16 games, and there were 308 draws.[30]

Stockfish versus AlphaZero

In December 2017, Stockfish 8 was used as a benchmark to evaluate AlphaZero, developed by Google's DeepMind division, with each engine running on different hardware. AlphaZero was trained through self-play for a total of nine hours, reaching Stockfish's level after just four.[31][32] Stockfish was allocated 64 threads and a hash size of 1 GB, while AlphaZero ran on four application-specific TPUs. Each program was given one minute of thinking time per move.
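The match conditions map onto standard UCI settings: Threads and Hash are engine options (Hash is specified in megabytes), and a fixed time per move is requested with `go movetime` in milliseconds. A UCI session configuring an engine this way might look like the following sketch (exact output lines vary by engine):

```text
uci
setoption name Threads value 64
setoption name Hash value 1024
isready
position startpos
go movetime 60000
```

After `go movetime 60000` the engine searches for 60 seconds and replies with a `bestmove` line; as Romstad notes below, a fixed time per move disables an engine's time-management heuristics entirely.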

In 100 games from the normal starting position, AlphaZero won 25 games as White, won 3 as Black, and drew the remaining 72, with no losses.[33] AlphaZero also played twelve 100-game matches against Stockfish starting from twelve popular openings, finishing with 290 wins, 886 draws and 24 losses, a point score of 733:467.[34][note 1] The research has not been peer-reviewed, and Google declined to comment until it is published.[33]
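The 733:467 point score follows from the standard scoring of chess matches, one point per win and half a point per draw:

```python
# AlphaZero's totals over the twelve 100-game matches.
wins, draws, losses = 290, 886, 24

alphazero_points = wins + 0.5 * draws    # wins plus half the draws
stockfish_points = losses + 0.5 * draws  # Stockfish scores its wins plus half the draws
games = wins + draws + losses            # 12 matches of 100 games each
```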

In response to this, Stockfish developer Tord Romstad commented, "The match results by themselves are not particularly meaningful because of the rather strange choice of time controls and Stockfish parameter settings: The games were played at a fixed time of 1 minute/move, which means that Stockfish has no use of its time management heuristics (lot of effort has been put into making Stockfish identify critical points in the game and decide when to spend some extra time on a move; at a fixed time per move, the strength will suffer significantly). The version of Stockfish used is one year old, was playing with far more search threads than has ever received any significant amount of testing, and had way too small hash tables for the number of threads. I believe the percentage of draws would have been much higher in a match with more normal conditions."[36]

Grandmaster Hikaru Nakamura also showed skepticism of the significance of the outcome, stating "I don't necessarily put a lot of credibility in the results simply because my understanding is that AlphaZero is basically using the Google super computer and Stockfish doesn't run on that hardware; Stockfish was basically running on what would be my laptop. If you wanna have a match that's comparable you have to have Stockfish running on a super computer as well."[36]


Release versions and development versions are available as C++ source code and as precompiled binaries for Microsoft Windows, macOS, 32- and 64-bit Linux, and Android.

Stockfish has been a very popular engine across platforms. On the desktop, it is the default chess engine bundled with the Internet Chess Club interface programs BlitzIn and Dasher. On mobile, it has been bundled with the Stockfish app, SmallFish and DroidFish. Other Stockfish-compatible graphical user interfaces (GUIs) include Fritz, Arena, Stockfish for Mac, and PyChess.[37][38] As of March 2017, Stockfish is the AI used by Lichess,[39] a popular online chess site.


  1. ^ The academic paper on this sequence of games does not provide the computer resources allocated to each engine.[35]


  1. ^ "Stockfish/src/uci.cpp". Retrieved 18 March 2016.
  2. ^ Cite error: The named reference About was invoked but never defined (see the help page).
  3. ^ Chabris, Christopher. "The Real Kings of Chess Are Computers". Wall Street Journal. Retrieved 18 September 2015.
  4. ^ Eade, James (2016). Chess for Dummies. Hoboken, New Jersey: John Wiley & Sons. p. 476. ISBN 9781119280033. OCLC 960819719. Retrieved 2 January 2017.
  5. ^ "CEGT Best Versions 40/20 (AMD 4200+)". Chess Engines Grand Tournament. 29 June 2014. Archived from the original on 8 September 2012. Retrieved 1 July 2014.
  6. ^ "CCRL 40/40". Computer Chess Rating Lists. 29 June 2014. Archived from the original on 2 October 2011. Retrieved 1 July 2014.
  7. ^ "IPON Rating List". 6 June 2014. Retrieved 1 July 2014.
  8. ^ Kaufman, Larry (24 November 2013). "Stockfish depth vs. others; challenge". Retrieved 8 March 2014.
  9. ^ Kislik, Erik (6 June 2014). "IM Erik Kislik analyzes the TCEC Superfinal in-depth". Retrieved 7 June 2014.
  10. ^ "Stockfish development versions". Archived from the original on 11 November 2014. Retrieved 1 February 2015.
  11. ^ "Stockfish Testing Framework". Retrieved 7 March 2014.
  12. ^ "Get Involved". Retrieved 8 March 2014.
  13. ^ Costalba, Marco (1 May 2013). "Fishtest Distributed Testing Framework". Retrieved 18 April 2014.
  14. ^ "Stockfish Testing Framework - Users". Retrieved 14 June 2018.
  15. ^ "Fast GM Rating List".
  16. ^ "CCRL Rating List". Archived from the original on 2014-05-30.
  17. ^ "Stockfish Blog on Stockfish DD".
  18. ^ a b c "TCEC Season Archive". Retrieved 9 January 2015.
  19. ^ Costalba, Marco (31 May 2014). "Stockfish 5". Retrieved 19 June 2014.
  20. ^ "Stockfish is the TCEC Season 9 Grand Champion". Chessdom. Retrieved 5 December 2016.
  21. ^ "TCEC Season 11 Superfinal 2018". Retrieved 2018-11-18.
  22. ^ "TCEC Season 12 Superfinal 2018". Retrieved 2018-11-18.
  23. ^ "TCEC Season 13 Superfinal 2018". Retrieved 2018-11-18.
  24. ^ "Stockfish convincingly wins TCEC Season 11". Chessdom. Retrieved 18 April 2018.
  25. ^
  26. ^ "When artificial intelligence evaluates chess champions". Science Daily. CNRS. 25 April 2017.
  27. ^ a b Chess.com announces computer chess championship.
  28. ^ Stockfish wins computer championship.
  29. ^ Leela vs Stockfish, CCCC bonus games, 1-0
  30. ^ LC0 vs. Stockfish
  31. ^ Knapton, Sarah; Watson, Leon (6 December 2017). "Entire human chess knowledge learned and surpassed by DeepMind's AlphaZero in four hours". Retrieved 6 December 2017.
  32. ^ Vincent, James (6 December 2017). "DeepMind's AI became a superhuman chess player in a few hours, just for fun". The Verge. Retrieved 6 December 2017.
  33. ^ a b "'Superhuman' Google AI claims chess crown". BBC News. 6 December 2017. Retrieved 7 December 2017.
  34. ^ "DeepMind's AlphaZero crushes chess". 6 December 2017. Retrieved 13 December 2017.
  35. ^ Silver, David; Hubert, Thomas; Schrittwieser, Julian; Antonoglou, Ioannis; Lai, Matthew; Guez, Arthur; Lanctot, Marc; Sifre, Laurent; Kumaran, Dharshan; Graepel, Thore; Lillicrap, Timothy; Simonyan, Karen; Hassabis, Demis (5 December 2017). "Mastering Chess and Shogi by Self-Play with a General Reinforcement Learning Algorithm". arXiv:1712.01815 [cs.AI].
  36. ^ a b "AlphaZero: Reactions From Top GMs, Stockfish Author". 8 December 2017. Retrieved 13 December 2017.
  37. ^ Using the Stockfish Engine, Stockfish Support.
  38. ^ ChessEngines, PyChess Github.
  39. ^ Lichess uses Stockfish announcement.
