I am reading a file (in) as input and writing the lines that match a desired pattern to a different file (out). Since I produce the output inside a loop, I am unable to remove the duplicate items.
set in [open filename_1 r]
set data [read $in]
close $in
set data [split $data "\n"]
foreach line $data {
    if {[string match "X*" $line] == 1} {
        set new [list $line]
        foreach nets $new {
            set newnets [lrange $nets 1 4]
            lsort -unique $newnets
            puts $newnets
        }
    }
}
I'd suggest you create a new, separate list and add the desired rows there. That lets you use lsearch to check whether the current row already exists in that list (if it does, just skip it).
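Here's a minimal sketch of that idea applied to your loop. I'm assuming, like you do, that each line is itself a valid Tcl list and that elements 1 through 4 are the fields you want to keep:

set in [open filename_1 r]
set data [split [read $in] "\n"]
close $in

set seen {}
foreach line $data {
    if {[string match "X*" $line]} {
        set fields [lrange $line 1 4]
        # keep the row only if we haven't collected it before
        if {[lsearch -exact $seen $fields] == -1} {
            lappend seen $fields
            puts $fields
        }
    }
}

Note that lsearch scans the whole list on every lookup, so this is quadratic in the number of unique rows; for a line-by-line filter like yours that's usually fine.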
One thing I noticed, though: building a one-element list with set new [list $line] and then looping over it with foreach is pointless, since $new always holds exactly one item, so that inner loop only ever runs once. The whole thing can be done more compactly:
set newdata [lsort -unique [lsearch -all -inline -glob $data "X*"]]
# you do some data-mangling, so we'll do it here too; I'm assuming each line is a valid Tcl list on its own
foreach line $newdata {
    puts [lrange $line 1 4]
}
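If the duplicates should be judged on the extracted fields rather than on the whole line, and you want the result written to your out file, a variation along these lines should work (filename_out is just a placeholder for whatever your output file is called):

set in [open filename_1 r]
set data [split [read $in] "\n"]
close $in

# collect fields 1-4 of every matching line, then drop duplicate rows
set rows {}
foreach line [lsearch -all -inline -glob $data "X*"] {
    lappend rows [lrange $line 1 4]
}
set rows [lsort -unique $rows]

# write the de-duplicated rows to the output file
set out [open filename_out w]
foreach row $rows {
    puts $out $row
}
close $out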