Stockfish 16.1 is here!

March 7, 2024:
CCRL Blitz:
Stockfish 16.1 64-bit: 3746
Stockfish 16 64-bit: 3761
I don't understand how Stockfish 16.1 has a lower rating than Stockfish 16 when its detailed score against all the other engines is better than the previous version's. Perhaps, despite its strength, this version draws more often than the previous one. Or explain it to me, thanks.
CCRL March 9, 2024:
Stockfish 16.1 64-bit score: 62.6%
Stockfish 16 64-bit score: 65.9%
@wonderk said in #40:
> Stockfish 16.1 is 27 Elo points stronger than Stockfish 16. But on the CCRL 40/15 and CCRL Blitz sites (playing against other engines), it's weaker than Stockfish 16.

CCRL is not showing Stockfish 16.1 as statistically significantly weaker than 16. Error bars are important and should not be ignored. If the difference is 3 Elo and the error bars are ±17, it means close to nothing.
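For anyone who wants to sanity-check such list numbers themselves: the standard logistic rating model converts an average score into an implied Elo difference. A minimal sketch, keeping in mind that the two engines faced different opponent pools, so the two figures are not directly comparable to each other:

```python
import math

def elo_from_score(score: float) -> float:
    """Elo difference implied by an average score (0 < score < 1),
    using the standard logistic rating model."""
    return -400.0 * math.log10(1.0 / score - 1.0)

# Scores quoted from the CCRL lists in this thread:
print(round(elo_from_score(0.626)))  # 16.1's edge over its pool -> 89
print(round(elo_from_score(0.659)))  # 16's edge over its pool   -> 114
```

With error bars of roughly ±17 Elo on each list entry, a gap of this size between two entries from different opponent pools tells you very little.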

@Toscani said in #48:
> Dev versions have Elo changes too.

While Abrok has been a trusted source of development builds for a long time, we recommend downloading our official dev builds from GitHub:

@farhad5269 said in #49:
> Can I ask you something? Does normal Stockfish have access to the tablebase? Or does it, from the start of an 8-piece position, use only its own calculation?

By default Stockfish doesn't have access to tablebases. Tablebases must be downloaded separately and the user has to manually tell the engine where they are using the `SyzygyPath` option.

Tablebases are nice to have, but they can take up a lot of storage space (from a few MB to a few TB) and do not increase the engine's Elo by a significant amount.
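To make the `SyzygyPath` step concrete: a GUI configures the option over UCI before any search starts. A minimal sketch that just builds the command sequence a GUI would send (the tablebase directory is a hypothetical placeholder):

```python
def syzygy_setup_commands(tb_dir: str) -> list[str]:
    """UCI commands a GUI would send to point Stockfish at Syzygy tablebases."""
    return [
        "uci",                                        # handshake; the reply lists options incl. SyzygyPath
        f"setoption name SyzygyPath value {tb_dir}",  # tell the engine where the tablebase files live
        "isready",                                    # wait until the engine has indexed the tables
    ]

for cmd in syzygy_setup_commands("/data/syzygy"):  # hypothetical directory
    print(cmd)
```

Multiple directories can be supplied in one `SyzygyPath` value, separated by the platform's path separator (`:` on Unix, `;` on Windows).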

@Toscani said in #50:
> I don't know, so I asked a similar question to the AI:
> "Stockfish 16.1 now uses 2 NNUEs. Does the engine still need access to an opening book or an EGTB? For the initial position, does it use the NNUE or does it calculate the first move?"
> AI answers:
> The AI has links so you can confirm the answer.

This is nonsense.
Start Stockfish and type `uci`. The "EvalFile" option is the large neural network; "EvalFileSmall" is the small one. So there are two NNUEs, and both are optimized. What were the smallest and largest positions each was trained on? Were they trained on opening positions or endgames?
Was the small NN trained on positions with fewer than 16 pieces?
Was the large NN trained on positions with more than 16 pieces?
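On the switching question: the actual rule lives in Stockfish's evaluation code, and roughly speaking the small net is consulted when a quick material count already says the position is lopsided, not by a fixed piece-count cutoff. A hypothetical sketch of that idea (the piece values and threshold here are made up, not Stockfish's):

```python
# Hypothetical sketch of dual-network selection; the real logic and
# constants are in Stockfish's evaluate.cpp.
PIECE_VALUES = {"P": 1, "N": 3, "B": 3, "R": 5, "Q": 9}
SMALL_NET_THRESHOLD = 9  # made-up value, in pawns

def material_balance(fen: str) -> int:
    """Crude material count from a FEN: white minus black, in pawns."""
    board = fen.split()[0]
    score = 0
    for ch in board:
        if ch.upper() in PIECE_VALUES:
            value = PIECE_VALUES[ch.upper()]
            score += value if ch.isupper() else -value
    return score

def pick_network(fen: str) -> str:
    """Use the small net when material says the game is already decided."""
    if abs(material_balance(fen)) > SMALL_NET_THRESHOLD:
        return "EvalFileSmall"
    return "EvalFile"

print(pick_network("rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR w KQkq - 0 1"))
```

A balanced start position goes to the large net; a position where one side is a few pieces up would go to the cheaper small net.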

- Abstracting positions and generating candidate moves
- Tapered scaling: weighting of the different criteria
- NN structure: four layers
- NNUE to replace the standard evaluation
- Training data: converted from Leela training data
- Lc0 self-play training
- Why are EPD files (last positions of opening lines) and a 3-man EGTB still required?
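On "tapered scaling": classical (pre-NNUE) evaluation keeps a middlegame score and an endgame score for each criterion and blends them by game phase. A generic sketch of the idea, not Stockfish's actual code or weights:

```python
MAX_PHASE = 24  # common convention: N/B = 1, R = 2, Q = 4 phase points, both sides summed

def tapered_eval(mg_score: int, eg_score: int, phase: int) -> int:
    """Blend middlegame and endgame scores by remaining material phase.
    phase == MAX_PHASE at the start position, 0 with only kings and pawns."""
    phase = max(0, min(phase, MAX_PHASE))
    return (mg_score * phase + eg_score * (MAX_PHASE - phase)) // MAX_PHASE

print(tapered_eval(100, 40, 24))  # pure middlegame -> 100
print(tapered_eval(100, 40, 0))   # pure endgame    -> 40
print(tapered_eval(100, 40, 12))  # halfway         -> 70
```

As material comes off the board, the weighting slides smoothly from the middlegame term toward the endgame term instead of jumping at an arbitrary cutoff.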

How is Stockfish NNUE Trained?
How do I get Stockfish 16.1 on my phone? It doesn't exist in the Google Play Store for Android, and I really want to see how the neurons fire in Stockfish's mind for any given position.