COW database & SQLite useful scripts
-
- Posts: 6
- Joined: Wed Jul 13, 2022 10:12 am
COW database & SQLite useful scripts
Hello all,
I discovered that COW uses the SQLite database format, so I made some simple queries for people to play with.
Maybe somebody has some other useful SQL queries that we can use.
I think Query-3 is handy for creating a custom-made Ebook:
First, create a PGN file from your mainbase with the opening you want to cover.
Create a new Ebook and import the PGN file.
If needed, import additional PGN files into the newly created Ebook.
Run Query-3 to import the already existing position evaluations.
Export the end positions to an EPD file.
Analyse the EPD file with an engine.
Backsolve the complete file and the new Ebook is complete.
Additionally, you may want to import the new Ebook into your mainbase and backsolve that as well.
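After running Query-3 you can check how much engine work is left before the EPD export step. This is only a small sketch (Python with the built-in sqlite3 module, untested against a real COW book); the table/column names and the initial value -32768 are taken from the queries below, and the path is just an example.

import sqlite3

# count the positions in the new Ebook that still carry the initial value,
# i.e. the ones that got no evaluation from the main book and still need the engine
con = sqlite3.connect(r"c:\COW-Bases\ChessU.4MD")
remaining = con.execute(
    "SELECT COUNT(*) FROM tMainPosition WHERE fnumeric_assessment = -32768"
).fetchone()[0]
print(remaining, "positions still need an evaluation")
con.close()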
My mainbook so far, which contains all openings: https://wetransfer.com/downloads/bca995 ... 758/fb170b
This book is completely analysed with Stockfish 15 at depth 25 or more.
I encourage other people to make their books available as well, so we can create one big analysed Ebook.
Q U E R I E S :
Always back up your existing files before executing the queries; you never know! (A small backup sketch follows after the queries.)
blahblahblah -- garbage command to prevent the complete script running by accident
-- in both bases the table names and fields are identical; we use main. to refer to our mainbase
-- ChessU is the file that will be updated
-- main is the COW database with the correct position assessments and comments
-- just select a Query and run the selected query lines
-- Query-1 : DISPLAY THE RECORDS IN BOTH FILES
attach database 'c:\COW-Bases\ChessU.4MD' as ChessU;
select fboard, fnumeric_assessment from ChessU.tmainPosition;
--limit 5; -- displays all records/fields from ChessU
select fboard, fnumeric_assessment from main.tmainPosition;
--limit 5; -- displays all records/fields from the main database
-- Query-2 : displays all records that have the same position in both bases
-- optionally: where the value = -32768 (the initial value)
attach database 'c:\COW-Bases\ChessU.4MD' as ChessU;
SELECT ChessU.tmainPosition.fboard,
ChessU.tMainPosition.fnumeric_assessment,
main.tMainPosition.fnumeric_assessment
FROM ChessU.tMainPosition,
main.tMainPosition
WHERE ChessU.tMainPosition.fboard = main.tMainPosition.fboard
--AND ChessU.tmainPosition.fnumeric_assessment = -32768 -- optional
;
-- Query-3 : update ChessU.tmainPosition.fnumeric_assessment
-- with the value in the main book
attach database 'c:\COW-Bases\ChessU.4MD' as ChessU;
UPDATE ChessU.tMainPosition
SET fnumeric_assessment =
( SELECT main.tMainPosition.fnumeric_assessment
FROM main.tMainPosition
WHERE ChessU.tMainPosition.fboard = main.tMainPosition.fboard
)
;
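-- positions with no match in the main book were set to NULL by the update above; give them the initial value back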
UPDATE ChessU.tMainPosition
SET fnumeric_assessment = -32768
WHERE fnumeric_assessment is NULL
;
-- Query-4 : change/delete comments from a database
attach database 'c:\COW-Bases\ChessU.4MD' as ChessU;
UPDATE ChessU.tMainPosition
SET fcomments = '' ; -- blanks all comments
Vacuum ChessU;
-- Query-5 : change/delete comments from main database
-- and clean up the database
-- *******************************************************************************
-- ** Be careful : this deletes all comments in your main base !!!!!!!!!!!!!!  *
-- *******************************************************************************
UPDATE tMainPosition
SET fcomments = '' ;
vacuum main;
-- Query-6 : Clean up your main database; removes empty/deleted records
-- attached databases can't be cleaned this way
vacuum main;
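To go with the backup warning above, here is a minimal sketch (Python, standard library only) that copies both book files before you touch them. The file names are just examples; adjust the paths to your own books.

import shutil
from datetime import datetime
from pathlib import Path

stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
# example paths -- replace with your own ChessU book and main book
for book in (Path(r"c:\COW-Bases\ChessU.4MD"), Path(r"c:\COW-Bases\MainBook.4MD")):
    backup = book.with_name(book.stem + "-" + stamp + book.suffix + ".bak")
    shutil.copy2(book, backup)          # copy2 keeps the file timestamps as well
    print("backed up", book, "->", backup)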
Re: COW database & SQLite useful scripts
I don't get it.
Can you explain it all again? What is this used for? Why not let the tool do it for you?
-
- Posts: 6
- Joined: Wed Jul 13, 2022 10:12 am
Re: COW database & SQLite useful scripts
Hello reivax,
Query-3 : how else do you create a specific opening Ebook, say the Caro-Kann, with all the numeric assessments of your standard book in it?
I see no option to import the numeric assessments of the standard book into a newly created book after importing PGN files into that new book.
Query-4 : say you get an Ebook from somebody else; how do you delete all the comments before importing it into your standard database?
Query-6 : I haven't seen an option in COW to compress the database; the size should shrink after deleting comments/moves, but maybe I am overlooking something.
So these queries could help you with these questions; maybe you will use them, maybe you won't. Or maybe you will get ideas of your own for other queries.
Next, can somebody help me with the following problem I have after importing a PGN game with position evaluations in it? For example:
The sample PGN game below would create the following comment in the position after 1. d4 Nf6 2. Bg5 c5 3. Bxf6 gxf6 4. e3 Qb6 5. Nc3 cxd4 :
-0.51/23 Deep Fritz 14 x64/Clare
-0.02/24 Houdini 4 w32
My problem is how to strip these two (or more) lines down to:
-051 23
-002 24
and then sort the lines by the depth value:
-002 24
-051 23
Once that is done I should be able to create a script or query to adjust the position evaluation in the Ebook automatically (a rough sketch follows after the sample game). Any help would be appreciated; this could save us all lots of time when creating Ebooks with position evaluations. Any ideas are welcome.
[Event "sample-pgn"]
[Date "2022.06.28"]
[White ""]
[Black "?"]
1. d4 Nf6 2. Bg5 c5 3. Bxf6 gxf6 4. e3 Qb6 5. Nc3 Qxb2 (5... cxd4 6. Qxd4 Qxb2 {-0.33/25 Deep Fritz 13/Clare})
(5... cxd4 6. exd4 Qxb2 {-0.51/23 Deep Fritz 14 x64/Clare}) (5... cxd4 6. exd4
Qxb2 {-0.02/24 Houdini 4 w32}) *
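Here is a rough attempt at the "script or query" described above. It is only a sketch and untested against a real COW book: it assumes the comment lines really look like "-0.51/23 Deep Fritz 14 x64/Clare", that fnumeric_assessment is stored in centipawns (so -0.51 becomes -51), and that the table/column names are the ones used in the queries earlier in this thread. Back up the book before trying it.

import re
import sqlite3

EVAL_LINE = re.compile(r"^\s*([+-]?\d+\.\d+)/(\d+)")    # "<score>/<depth> engine name"

def best_eval(comment):
    """Return (score_in_centipawns, depth) of the deepest engine line, or None."""
    evals = []
    for line in comment.splitlines():
        m = EVAL_LINE.match(line)
        if m:
            score_cp = round(float(m.group(1)) * 100)   # -0.02 -> -2, -0.51 -> -51
            evals.append((int(m.group(2)), score_cp))   # keyed by depth for sorting
    if not evals:
        return None
    depth, score_cp = max(evals)                        # the deepest analysis wins
    return score_cp, depth

con = sqlite3.connect(r"c:\COW-Bases\ChessU.4MD")       # work on a copy first!
rows = con.execute(
    "SELECT fboard, fcomments FROM tMainPosition "
    "WHERE fcomments IS NOT NULL AND fcomments <> ''"
).fetchall()
updated = 0
for fboard, fcomments in rows:
    found = best_eval(fcomments)
    if found:
        score_cp, depth = found
        con.execute(
            "UPDATE tMainPosition SET fnumeric_assessment = ? WHERE fboard = ?",
            (score_cp, fboard),
        )
        updated += 1
con.commit()
con.close()
print(updated, "positions updated from engine comments")

With the sample game above, the position after 5... cxd4 6. exd4 Qxb2 would get -2 (from the depth-24 Houdini line) rather than -51, which is the sorting-by-depth behaviour described.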
-
- Posts: 179
- Joined: Wed Jan 23, 2019 12:31 am
Re: COW database & SQLite useful scripts
Your computer skills are way beyond mine. I assume one cannot run an SQL query from within COW. Please correct me if I am wrong.
I believe Mike released a video that showed how to do something similar to what you are attempting to do. The process was too complicated for me at the time.
I experimented with a file of PGN games. These had not been previously analyzed and evaluated. From within COW, I had COW identify and name 1,000 leaf nodes. I'm sure there are many more, so this step is incomplete. I went to the first leaf node. It had a result, which automatically provided an evaluation. I believe that evaluation automatically backsolved. With the f3 button (actually the Fn button and the f3 button), I advanced to the next leaf node. As before, COW evaluated the position based on the result. I repeated the process. When I came to a position that had no evaluation, I ran the engine, stopped, and entered an evaluation. Then I jumped to the next leaf node.
I kept this up until I grew tired, well before the one-thousandth leaf node. It's a tedious process. However, I can see that it would be worthwhile if the process could be automated. I think Mike's video explained how to do that, but I haven't been able to lay my hands on it again.
Sometimes players lose on time in a winning position. This is especially important as some chess databases include speed games. While most game results give a valid evaluation of the final position, some do not. I don't think this would be a concern if I could restrict the game by rating and type. Strong players playing with a large time increment will not resign an equal or better position.
-
- Posts: 179
- Joined: Wed Jan 23, 2019 12:31 am
Re: COW database & SQLite useful scripts
I found that valuable video that I had previously mentioned:
https://www.youtube.com/watch?v=UetSrXJfUMg
It covers a lot of ground. I have to admit that I never backsolved from the starting position before. Some of my databases are very large and I did not want to get bogged down. To test this feature, I just now backsolved from the starting position with a data set with 133,000 positions and it did not take too long.
I don't know if Mike's video fully addresses this thread, but it seems to go a long way.
-
- Posts: 179
- Joined: Wed Jan 23, 2019 12:31 am
Re: COW database & SQLite useful scripts
I followed Mike's video and exported leaf node positions as an EPD file. Then I had COW evaluate them as shown in the video. As this was a large file, it took a long time and I aborted the process. As I understand the video, Mike is using this feature to evaluate all final positions to confirm that results (win, lose, or draw) are based upon the position - not running out of time.
My problem is a little different. I start with a select group of games - games from higher rated players, say 2400 and above. I will trust the result. However, some final positions end with a comment instead of a result. I wonder if there is a way to have COW automatically evaluate only those games that lack a final outcome. This might only be one game in 30, but it is enough to throw off position evaluations (from backsolving).
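I do not know of a COW option for exactly this, but one workaround is to pre-filter the PGN file yourself so that only the games without a final result go through the engine/EPD step. Below is a minimal sketch (plain text handling, no chess library); the file names are placeholders, and the split assumes each game starts with an [Event ...] tag, as standard PGN does.

import re

def split_games(pgn_text):
    """Split a PGN file into individual games on the [Event "..."] tag."""
    starts = [m.start() for m in re.finditer(r'^\[Event ', pgn_text, re.MULTILINE)]
    starts.append(len(pgn_text))
    return [pgn_text[a:b] for a, b in zip(starts, starts[1:])]

with open("all_games.pgn", encoding="utf-8", errors="replace") as f:
    games = split_games(f.read())

# keep only the games whose Result tag is "*" (no win/loss/draw recorded)
unfinished = [g for g in games if '[Result "*"]' in g]

with open("needs_evaluation.pgn", "w", encoding="utf-8") as f:
    f.write("\n".join(unfinished))

print(len(unfinished), "of", len(games), "games have no final result")

The resulting needs_evaluation.pgn could then be evaluated separately, so the backsolved evaluations in the full book are not thrown off by the handful of games without an outcome.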
-
- Posts: 179
- Joined: Wed Jan 23, 2019 12:31 am
Re: COW database & SQLite useful scripts
FreeRepublic wrote: ↑Wed Jul 20, 2022 5:53 pm
As this was a large file, it took a long time and I aborted the process.
It would be nice to have a pause and resume feature. I think my computer would overheat if I performed this action in one session.
-
- Posts: 179
- Joined: Wed Jan 23, 2019 12:31 am
Re: COW database & SQLite useful scripts
Using Mike's video:
https://www.youtube.com/watch?v=UetSrXJfUMg
I was able to evaluate all final positions and backsolve. By restricting the choice of variation and only using the games of highly rated players, I was able to limit the database to about 400 games. When I analyzed with the engine, I set the defaults as Mike showed, and further reduced the analyze minimum depth to 16 (from 32). That last step sped up the process.
To recap, the end positions of 400 games between strong players (2400 and above) were evaluated. Backsolving indicates better and lesser moves all the way back to move 1, based upon the final position and minimax decisions along the way.
The evaluated database provides a great place to begin the learning process. One can start to play through the lines. Any improvements will be captured automatically, further improving the quality of the database. Improvements can come from your own ideas, new games, IM/GM commentary, or your engine.
The process takes a lot of computational horsepower which is taxing with my computer and larger collections of PGN games. A resume/pause button would help. For the moment, I might try letting my computer go into sleep mode and cool off. I have found that COW is very good about waking from a deep sleep.
-
- Posts: 1
- Joined: Mon Nov 18, 2024 3:27 am
Re: COW database & SQLite useful scripts
FreeRepublic wrote: ↑Thu Jul 21, 2022 12:44 pm
Using Mike's video:
https://www.youtube.com/watch?v=UetSrXJfUMg
I was able to evaluate all final positions and backsolve. By restricting the choice of variation and only using the games of highly rated players, I was able to limit the database to about 400 games. When I analyzed with the engine, I set the defaults as Mike showed, and further reduced the analyze minimum depth to 16 (from 32). That last step sped up the process.
To recap, the end positions of 400 games between strong players (2400 and above) were evaluated. Backsolving indicates better and lesser moves all the way back to move 1, based upon the final position and minimax decisions along the way.
The evaluated database provides a great place to begin the learning process. One can start to play through the lines. Any improvements will be captured automatically, further improving the quality of the database. Improvements can come from your own ideas, new games, IM/GM commentary, or your engine.
The process takes a lot of computational horsepower, which is taxing with my computer and larger collections of PGN games. A resume/pause button would help. For the moment, I might try letting my computer go into sleep mode and cool off. I have found that COW is very good about waking from a deep sleep.
Cowsql adds a network protocol to SQLite that allows several instances of your application to operate together as a highly available cluster without relying on external databases. The name "cowsql" roughly refers to the "pets vs. cattle" concept, because it is generally acceptable to delete or rebuild a specific node in an application that uses cowsql for data storage.