Processing large amounts of data

Help for those learning Tcl or writing their own scripts.
antsukka
Voice
Posts: 2
Joined: Sun Jan 03, 2010 11:26 am

Processing large amounts of data

Post by antsukka »

I'm writing a dictionary-style script that is going to have tens of thousands of entries. Sadly, my target server doesn't allow me to install mysqltcl, so I'm settling for putting my data into a .csv file (if you have any ideas for data storage that don't require extra dependencies, I'd like to hear them too). However, going through a text file line by line sounds really inefficient, so I was wondering if anyone has written such scripts before and has tips or examples for reducing unnecessary server load and, more importantly, seek times.
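One way to avoid scanning the file on every lookup would be to read it into memory once at startup and do all lookups against a Tcl dict. This is only a sketch under assumptions: one `key,value` entry per line in a hypothetical `dict.csv`, and no commas inside values (real CSV parsing would need something like tcllib's csv package).

```tcl
# Load a key,value CSV into a dict once; later lookups are in-memory
# hash lookups instead of file scans. Requires Tcl 8.5+ for [dict].
proc loadDict {path} {
    set data [dict create]
    set fh [open $path r]
    while {[gets $fh line] >= 0} {
        if {$line eq ""} { continue }
        ;# split on the first comma only, so values may contain commas
        set idx [string first "," $line]
        set key [string range $line 0 [expr {$idx - 1}]]
        set val [string range $line [expr {$idx + 1}] end]
        dict set data $key $val
    }
    close $fh
    return $data
}

# Return the value for key, or "" if it is not present.
proc lookup {data key} {
    if {[dict exists $data $key]} {
        return [dict get $data $key]
    }
    return ""
}
```

Tens of thousands of short entries should fit comfortably in memory this way; the cost is paid once when the script loads.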

I was thinking about dividing my data into multiple files. After all, the entries have unique identifiers as search keys and can therefore easily be split into logical pieces that would each be only a couple of thousand lines long. I'm a really lazy coder, so if you have existing code that searches across multiple files like this, please paste it here ;)
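The multi-file idea could be sketched roughly like this: shard entries by the first character of the key (so a lookup only ever opens one small file). The directory layout and file names here (`data/a.csv`, `data/b.csv`, ...) are assumptions for illustration, not an existing convention.

```tcl
# Map a key to the shard file it would live in, e.g. "apple" -> data/a.csv.
proc shardPath {dir key} {
    set c [string tolower [string index $key 0]]
    return [file join $dir "$c.csv"]
}

# Scan only the one shard file a key can be in; return its value or "".
proc shardLookup {dir key} {
    set path [shardPath $dir $key]
    if {![file exists $path]} { return "" }
    set fh [open $path r]
    set result ""
    while {[gets $fh line] >= 0} {
        set idx [string first "," $line]
        if {[string range $line 0 [expr {$idx - 1}]] eq $key} {
            set result [string range $line [expr {$idx + 1}] end]
            break
        }
    }
    close $fh
    return $result
}
```

Each lookup then scans at most a couple of thousand lines instead of the whole data set, at the cost of keeping the shards sorted out when you add entries.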