tailuge.github.io/chess-o-tron/public/openings/openingtree.html
analyse wins and losses in the openings of your last 100 games on lichess
all your openings are belong to us
This is very cool. Congratulations!
The idea is novel and the execution is spectacular.
If only you'd implemented candlestick charts of rating history as well.
Very cool!
Yes, cool feature, as usual :)
I'm very inspired by this application, because combining the API with the browser is a very flexible way to make sense of the game history (either creating an opening book as in the OP, or charts, which is my whimsy).
I already have the candlestick charts in my offline standalone interface, but that is very cumbersome (start the interface, download the PGN, compile the charts; it takes ages).
I'm talking about charts like this:
smartchessguiapp.github.io/chartpage.html
I have my own rough attempt at aggregating the history in 100-game blocks:
smartchessguiapp.github.io/ligames.html
However, with this I'm running into the problem that even if I wait 2 seconds between requests, the server denies service after a few automated requests. I have to find a mechanism to cache the game history and only fetch the pages that are missing, roughly as sketched below.
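Something along these lines is what I have in mind (untested sketch; the endpoint and its parameters here are just placeholders, not necessarily what my page actually calls):

// Untested sketch: cache each downloaded page in localStorage so a re-run
// only requests the pages that are still missing.
async function getPage(username, page) {
  const key = 'games:' + username + ':' + page;
  const cached = localStorage.getItem(key);
  if (cached) return JSON.parse(cached); // already downloaded on an earlier run
  // assumed paged games endpoint; the real URL and parameters may differ
  const url = 'https://lichess.org/api/user/' + username + '/games?nb=100&page=' + page;
  const response = await fetch(url);
  if (!response.ok) throw new Error('HTTP ' + response.status);
  const data = await response.json();
  localStorage.setItem(key, JSON.stringify(data)); // reuse next time
  return data;
}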
@sakkozik it is possible to fetch multiple pages and stay under the lichess rate limit; you can copy the code that does it. E.g. from your last 300 games I would say you need to work on your d4 lines :) Clicking on the leaf nodes takes you to the games the data is drawn from.
tailuge.github.io/chess-o-tron/public/openings/openingtree.html?player=sakkozik&filter=&pages=1%2C2%2C3&colour=white&trim=true
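The gist of it is simply to fetch the pages one after another with a pause in between. A simplified sketch (not the exact code in the repo, and the endpoint here is only a placeholder):

// Simplified sketch: request pages strictly sequentially, pausing between
// requests so the overall request rate stays low.
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));

async function fetchPages(username, pages) {
  const results = [];
  for (const page of pages) {
    // placeholder for the paged games endpoint
    const url = 'https://lichess.org/api/user/' + username + '/games?nb=100&page=' + page;
    const response = await fetch(url);
    results.push(await response.json());
    await delay(1500); // wait 1.5 s before requesting the next page
  }
  return results;
}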
I added a transmogrify button, for kicks.
Unfortunately the same error occurs with your code. After loading 15 pages the server responds with status 429 Too Many Requests and the script hangs.
Console log:
smartchessguiapp.github.io/imagepages/toomanyreqerrpage.html
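At the very least I should check the status code so the loop fails loudly instead of hanging. A quick sketch of what I mean:

// Sketch: stop the download cleanly when the server rate-limits us,
// instead of leaving the script hanging.
async function fetchPageOrStop(url) {
  const response = await fetch(url);
  if (response.status === 429) {
    throw new Error('429 Too Many Requests - aborting the download');
  }
  if (!response.ok) throw new Error('HTTP ' + response.status);
  return response.json();
}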
nice!
please add an option to filter variants (including crazyhouse) thanks
1500 games! I'm not surprised they limit total fetches per minute to stop DoS attacks by @sakkozik :)
@mathace I will look into it, wondering if what I wrote can cope with crazyhouse move syntax.
My problem is that the whole thing is not transparent. The only thing the lila README says is that you should wait 1 second between requests (you wait 1.5 seconds, I wait 2 seconds). It is not clear what rate per minute is allowed, whether it depends on content length or on the number of requests, or what other limitations there are. If the limits, no matter how harsh, were known, you could program for them and wait patiently for your result. As things stand, you can never know when your script will hang.
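Until the limits are documented, the most defensive thing I can think of is to back off hard whenever a 429 arrives and then retry. A sketch (whether the server actually sends a Retry-After header is an assumption on my part, and the 60-second fallback is just a guess, not a documented limit):

// Sketch for coping with undocumented limits: on 429, back off and retry.
async function politeFetch(url, attempts = 3) {
  for (let i = 0; i < attempts; i++) {
    const response = await fetch(url);
    if (response.status !== 429) return response; // not rate limited, use it
    // honour Retry-After if present (assumption), otherwise wait a full minute
    const retryAfter = Number(response.headers.get('Retry-After')) || 60;
    await new Promise(resolve => setTimeout(resolve, retryAfter * 1000));
  }
  throw new Error('still rate limited after ' + attempts + ' attempts');
}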