I'm writing a dictionary-style script that will hold tens of thousands of entries. Unfortunately my target server doesn't allow me to install mysqltcl, so I'm settling for storing the data in a .csv file (if you have any ideas for data storage that don't require dependencies, I'd like to hear them too). However, scanning a text file line by line sounds really inefficient, so I was wondering whether anyone has written such scripts before and has tips or examples on how to reduce unnecessary server load and, more importantly, seek times.
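For reference, the straightforward version I'm worried about looks something like this (just a sketch; the one-"key,value"-pair-per-line `data.csv` layout is my own example, not a fixed format):

```tcl
# Naive lookup: scan the whole CSV from the top until the key matches.
# Assumes data.csv holds one "key,value" pair per line.
proc lookup_naive {key} {
    set f [open data.csv r]
    set result ""
    while {[gets $f line] >= 0} {
        set fields [split $line ,]
        ;# First comma-separated field is the key.
        if {[lindex $fields 0] eq $key} {
            set result [lindex $fields 1]
            break
        }
    }
    close $f
    return $result
}
```

With tens of thousands of lines this walks half the file on an average hit, and the whole file on a miss.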
I was thinking about dividing my data into multiple files. Each entry has a unique identifier as its search key, so the data can easily be split into logical pieces of only a couple of thousand lines each. I'm a really lazy coder, so if you have existing code that does this kind of search across multiple files, please paste it here ;)
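To make the idea concrete, here is roughly what I have in mind (a sketch only; the `data_<prefix>.csv` naming and sharding by the key's first character are assumptions I haven't settled on):

```tcl
# Sharded lookup: pick the file by the first character of the key,
# then scan only that shard (a couple of thousand lines instead of
# the whole data set). Assumes shards named data_<prefix>.csv with
# one "key,value,..." entry per line.
proc lookup_sharded {key} {
    set prefix [string tolower [string index $key 0]]
    set path "data_$prefix.csv"
    if {![file exists $path]} {
        return ""
    }
    set f [open $path r]
    set result ""
    while {[gets $f line] >= 0} {
        set fields [split $line ,]
        ;# First field is the unique identifier.
        if {[lindex $fields 0] eq $key} {
            set result [lrange $fields 1 end]
            break
        }
    }
    close $f
    return $result
}
```

Is this the sensible way to do it, or is there a smarter scheme for mapping keys to files?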