Hello, I have modified the URL title script to log URLs posted to a chan and output them to an HTML file. It works sometimes: it will log a few links, but then for some strange reason it overwrites the whole file with a blank file and starts over. It usually logs 3-4 URLs before restarting. Here is the part I modified (the rest of the script seems to work well). I want each URL posted to be inserted at the top of the file.
if {[string match "*://*" $word]} {
    # read the existing log file into a list of lines
    set ulf [open /home/peakoil/public_html/urls.html r]
    set ulogdata [read -nonewline $ulf]
    close $ulf
    set ulines [split $ulogdata "\n"]
    # reopen for writing (this truncates the file), put the new link at the top, write everything back
    set ulf [open /home/peakoil/public_html/urls.html w]
    set urlline "<a href=\"$word\">$urtitle</a><br>"
    set ulines [linsert $ulines 0 $urlline]
    puts $ulf [join $ulines "\n"]
    close $ulf
}
If you're going to modify other people's scripts, you'd best learn how to debug them as well. Try adding putcmdlog lines so you can see what's going on.
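For example, something like this (a rough sketch against your block above, not tested, assuming the same $word, $urtitle and file path) would show in the bot's command log whether the read is coming back empty before the write truncates the file:

if {[string match "*://*" $word]} {
    set ulf [open /home/peakoil/public_html/urls.html r]
    set ulogdata [read -nonewline $ulf]
    close $ulf
    # debug: how much came back from the read?
    putcmdlog "urllog: read [string length $ulogdata] bytes from urls.html"
    set ulines [split $ulogdata "\n"]
    set urlline "<a href=\"$word\">$urtitle</a><br>"
    set ulines [linsert $ulines 0 $urlline]
    # debug: how many lines are about to be written back?
    putcmdlog "urllog: writing [llength $ulines] lines back"
    set ulf [open /home/peakoil/public_html/urls.html w]
    puts $ulf [join $ulines "\n"]
    close $ulf
}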
And opening the same file twice is quite inefficient.
You can open the file in read/write mode and read each line one by one. If a line matches your current URL, exit the loop (and do nothing). If you reach the end of the file (meaning you never saw the URL), just append it to the file.
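Something along these lines (a rough sketch of that idea, not tested, assuming the same $word, $urtitle and file path as in your code):

if {[string match "*://*" $word]} {
    # open once in read/write mode; reading to the end leaves the pointer at EOF for the append
    set ulf [open /home/peakoil/public_html/urls.html r+]
    set seen 0
    while {[gets $ulf line] >= 0} {
        if {[string match "*$word*" $line]} {
            # URL is already logged, do nothing
            set seen 1
            break
        }
    }
    if {!$seen} {
        puts $ulf "<a href=\"$word\">$urtitle</a><br>"
    }
    close $ulf
}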
OK, let me clear some things up. The code you saw is mine; I added it into a script that grabs the page title of a URL posted to a chan. The script already had URL logging, which works fine; however, it outputs the URLs to an eggdrop log file. I want the URLs to be visible on a website, so I came up with that code. I want each URL posted to be inserted at the top of the file, not appended at the bottom.
OK, sorry, I didn't look at your code closely enough.
Perhaps your variable is too big for the system; you could try writing each $uline with a foreach loop instead.
I know this problem happens with the exec command when the string is too long, so try it that way.
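In other words, for the write part, something like this (just a guess at the cause, a sketch assuming the same $ulines list and file path from your code):

set ulf [open /home/peakoil/public_html/urls.html w]
# write the lines one at a time instead of joining them into one big string
foreach uline $ulines {
    puts $ulf $uline
}
close $ulf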