Text vs Sqlite Benchmark
Jun 16, 2013 23:49:35 GMT 1
Post by 2lss on Jun 16, 2013 23:49:35 GMT 1
I need a good method for storing some data (for another program), so I made this program to compare plain text files vs. an SQLite database.
Basically it just enters random strings of data, in the form of a table, into either a plain text file or an SQLite database. From there you can read the files back to get an idea of which method is faster. The downside is that you have to use the 'time' shell command, because BaCon's SECONDS function only returns whole seconds.
EDIT - Per vovchik's suggestion, I have added TIMER for benchmarking. No need to use the 'time' shell command.
From what I see, reading a text file and reading an SQLite database of the same dimensions take about the same amount of time; when writing, however, the SQLite database is much slower than the text file. Overall, I think the moral of the story is that if you're reading from the SQLite database you're OK, but writing large amounts of data may take a while.
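For anyone who wants to reproduce the comparison without the BaCon program, here is a minimal Python sketch of the same idea (this is a stand-in, not the original txtvsdb code; the file layout and table schema are assumptions):

```python
# Sketch of the benchmark: write a table of random strings to a plain
# text file and to an SQLite database, then time the writes and reads.
import os
import random
import sqlite3
import string
import tempfile
import time

ROWS, COLS, SLEN = 100, 10, 5  # like -r 100 -c 10 -s 5

def rand_cell():
    return "".join(random.choices(string.ascii_letters, k=SLEN))

table = [[rand_cell() for _ in range(COLS)] for _ in range(ROWS)]

tmp = tempfile.mkdtemp()
txt_path = os.path.join(tmp, "table.txt")
db_path = os.path.join(tmp, "table.db")

# --- plain text file: one space-separated row per line ---
t0 = time.perf_counter()
with open(txt_path, "w") as f:
    for row in table:
        f.write(" ".join(row) + "\n")
txt_write = time.perf_counter() - t0

t0 = time.perf_counter()
with open(txt_path) as f:
    txt_table = [line.split() for line in f]
txt_read = time.perf_counter() - t0

# --- SQLite: one TEXT column per table column ---
cols = ", ".join(f"c{i} TEXT" for i in range(COLS))
placeholders = ", ".join("?" * COLS)
con = sqlite3.connect(db_path)
con.execute(f"CREATE TABLE t ({cols})")

t0 = time.perf_counter()
for row in table:
    con.execute(f"INSERT INTO t VALUES ({placeholders})", row)
    con.commit()  # commit per row: each INSERT is its own transaction
db_write = time.perf_counter() - t0

t0 = time.perf_counter()
db_table = [list(r) for r in con.execute("SELECT * FROM t")]
db_read = time.perf_counter() - t0
con.close()

# Sanity check: both round-trips preserve the table exactly.
assert txt_table == table and db_table == table
print(f"text   write {txt_write:.4f}s  read {txt_read:.4f}s")
print(f"sqlite write {db_write:.4f}s  read {db_read:.4f}s")
```

Note that this sketch commits every INSERT separately, which is typically what makes SQLite writes so much slower than the text file; wrapping all the inserts in a single transaction usually narrows the gap considerably.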
Make sure to use the patched version of the SQLite BaCon include file. I had to change a few things to suppress the column headings.
Examples:
./txtvsdb -r 100 -c 10 -s 5 --write-text
Write a table to a text file that has 100 rows, 10 columns, and each string is 5 random characters long.
./txtvsdb -r 100 -c 10 -s 5 --write-db
Same as above, but writes to an SQLite database instead.
./txtvsdb --read-text -v
Reads the contents of the text file and stores the values into an associative array. The -v flag causes the values to be printed.
./txtvsdb --read-db
Same as above, but with an SQLite database and no output.
Any thoughts or suggestions welcome ;D
EDIT - There is no way for me to update the attachment. See post five for the new version.