
search a file delete a line

Help for those learning Tcl or writing their own scripts.
topdawg_b
Voice
Posts: 32
Joined: Sun Dec 07, 2008 7:13 pm

search a file delete a line

Post by topdawg_b »

I have a text file laid out like this:

word1 info about word 1
word2 info about word 2
word3 info about word 3

The file is called trigger.txt.

I want to delete a line: search for word2, and if a text line begins with word2, delete that whole line, leaving the file as

word1 info about word 1
word3 info about word 3

What is the most efficient way of doing this?
vigilant
Halfop
Posts: 48
Joined: Thu Jan 05, 2006 12:06 am

Post by vigilant »

You delete it by using the lsearch command to find the line, writing the remaining lines to a new file, and then renaming that file to replace the original.
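Something like this minimal sketch (the filename and the hardcoded "word2" keyword are just for illustration):

```tcl
# sketch: copy every line we want to keep into a temp file,
# then rename the temp file over the original
set in [open "triggers.txt" r]
set out [open "triggers.txt.tmp" w]
while {[gets $in line] >= 0} {
 # skip any line whose first word is "word2"
 if {![string match -nocase "word2 *" $line]} {
  puts $out $line
 }
}
close $in
close $out
file rename -force "triggers.txt.tmp" "triggers.txt"
```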
Anser Quraishi
Website: http://www.anserq.com
topdawg_b
Voice
Posts: 32
Joined: Sun Dec 07, 2008 7:13 pm

Post by topdawg_b »

I know how to use the lsearch command, but I'm not sure what you mean by renaming it and replacing it.
This is what I have so far; $thestb would hold the "word2" from the example I gave.

Code:

set thestb [lindex $text 1]
set in [open "triggers.txt" r]
set data [read $in]
set line [split $data \n]
close $in
set here [lsearch [string tolower $line] "$thestb *"]

thanks for the help
topdawg_b
Voice
Posts: 32
Joined: Sun Dec 07, 2008 7:13 pm

Post by topdawg_b »

This is what I came up with while I was waiting. I'm sure there are errors even though it seems to work; please advise on improper syntax. Thanks.
test.ins has 3 lines in it:
line 1
line 2
line 3
The command test:del test.ins 1 turns the
file becomes
line 1
line 3

Code:

proc test:del {file num} {
 set out [open $file r]
 set data [read $out]
 set line [split $data \n]
 close $out
 set out [open $file w]
 set x 0
 while {[lindex $line $x] != ""} {
  if {$x != $num} {puts $out "[lindex $line $x]\r"}
  incr x
 }
 close $out
}
nml375
Revered One
Posts: 2860
Joined: Fri Aug 04, 2006 2:09 pm

Post by nml375 »

There are numerous approaches to do this.

Going with your first approach, you've determined the list offset of the wanted line. Having that, creating the new file is a mere matter of using lreplace to remove that item from the list, using join with a custom separator (the inverse of split) to convert the new list back to a string, and writing it to the file.
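For instance, a minimal sketch of that, reusing $thestb and triggers.txt from the earlier post (assuming $thestb is already lowercase):

```tcl
# read the file into a list of lines
set in [open "triggers.txt" r]
set lines [split [read $in] \n]
close $in

# find the offset of the line starting with the keyword (case-insensitive)
set here [lsearch [string tolower $lines] "$thestb *"]
if {$here >= 0} {
 # drop that one element, join the list back into a string, rewrite the file
 set lines [lreplace $lines $here $here]
 set out [open "triggers.txt" w]
 puts -nonewline $out [join $lines \n]
 close $out
}
```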

A somewhat different approach would be to read the file and convert it into a list (as before), but rather than using lsearch, simply use foreach to iterate through the whole list and test each item for the keyword, writing any non-matching line to the new file as you go. This provides more powerful tools for the matching, but uses more resources while running.
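A minimal sketch of the foreach variant (again assuming $thestb holds the keyword; note this version also drops empty lines):

```tcl
# read the file into a list of lines
set in [open "triggers.txt" r]
set lines [split [read $in] \n]
close $in

# rewrite the file, keeping only lines whose first word is not the keyword
set out [open "triggers.txt" w]
foreach l $lines {
 if {$l ne "" && ![string equal -nocase [lindex [split $l] 0] $thestb]} {
  puts $out $l
 }
}
close $out
```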
NML_375
game_over
Voice
Posts: 29
Joined: Thu Apr 26, 2007 7:22 am

Post by game_over »

Code:

proc test:del {file num criteria} { ;# criteria is the keyword, e.g. "word2"
 set out [open $file r]
 set data [read $out]
 set line [split $data \n]
 close $out
 set out1 [open $file w] ;# use a different handle name for writing than for reading
 foreach delline $line {
  if {[lindex $delline 0] == "$criteria" && [lindex $delline 0] != ""} {
   putlog "$delline" ;# the effect you see in the log is shown at !!!!! 1 !!!!!! below
   puts $out1 "$delline"
  }
 }
 close $out1
}
!!!!! 1 !!!!!! ->
word1 info about word 1
word3 info about word 3
topdawg_b
Voice
Posts: 32
Joined: Sun Dec 07, 2008 7:13 pm

Post by topdawg_b »

shouldn't this line
if {[lindex $delline 0] == "$criteria" && [lindex $delline 0] != ""} {
putlog "$delline"
puts $out1 "$delline"
}
actually be
if {[lindex $delline 0] != "$criteria" && [lindex $delline 0] != ""} {
putlog "$delline"
puts $out1 "$delline"
}
in order to write the non-deleted lines to the file?

are there any limits to using
set data [read $out]
set line [split $data \n]

would this same method work if the file had 5000 lines?
nml375
Revered One
Posts: 2860
Joined: Fri Aug 04, 2006 2:09 pm

Post by nml375 »

The only limit is memory, as read will try to read as much data from the file at once as possible.

Although not a limit for the script itself, time might be a limit for your eggdrop: while this script is processing, no other actions will be taken, possibly causing your eggdrop to ping timeout, etc.

For huge data sources, I'd suggest something like below. Keep in mind that this will only process one line per second, so filtering a huge file will take considerable time; however, since it's driven by timers, your eggdrop will remain responsive meanwhile. It also implements a simple file lock to prevent multiple filterings at once.

Code:

proc StartFilter {File Pattern} {
 if {[info exists ::FilterLockfile] && $::FilterLockfile == 1} {return 0}
 set ::FilterLockfile 1
 set fIdRead [open "$File" "RDONLY"]
 # pick a temp filename that doesn't exist yet (temp-path is eggdrop's global setting)
 while {[file exists [set tmpfile [file join ${::temp-path} [randstring 8]]]]} {}
 set fIdWrite [open "$tmpfile" "WRONLY CREAT"]
 fconfigure $fIdRead -blocking 0
 ProcessFile $fIdRead $fIdWrite $Pattern [list file rename -force -- $tmpfile $File]
 return 1
}

proc ProcessFile {ReadFId WriteFId Pattern Cleanup} {
 if {[gets $ReadFId string] == -1 && [eof $ReadFId]} {
  # end of file: close both handles, rename the temp file over the original, release the lock
  close $ReadFId
  close $WriteFId
  eval $Cleanup
  set ::FilterLockfile 0
 } else {
  # keep the line unless its first word matches the pattern
  if {![string equal -nocase $Pattern [lindex [split $string] 0]]} {
   puts $WriteFId $string
  }
  utimer 1 [list ProcessFile $ReadFId $WriteFId $Pattern $Cleanup]
 }
}
NML_375