pranjal_ccna961 Voice
Joined: 18 Feb 2009 Posts: 5
Posted: Mon Feb 23, 2009 12:27 am    Post subject: how to remove duplicate names in the output file
Hi,
I am reading an input file (in), matching the desired pattern, and writing the results to a different file (out). Since the output is produced inside a loop, I am unable to remove the duplicate items.
| Code: | set in [open filename_1 r]
set data [read $in]
close $in
set data [split $data "\n"]
foreach line $data {
    if {[string match "X*" $line] == 1} {
        set new [list $line]
        foreach nets $new {
            set newnets [lrange $nets 1 4]
            lsort -unique $newnets
            puts $newnets
        }
    }
} |
Can anyone help me with this?
nml375 Revered One
Joined: 04 Aug 2006 Posts: 2857
Posted: Mon Feb 23, 2009 12:42 pm    Post subject:
I'd suggest you create a new, separate list and add the desired rows to it. That lets you use lsearch to check whether the current data already exists in that list (and skip it if it does); see the sketch after the quoted code below.
One thing I noticed, though: the code below is pretty pointless, as the list in $new will always contain a single item, so that inner loop will only ever run once.
| Code: | set new [list $line]
foreach nets $new { |
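Roughly, a minimal (untested) sketch of the separate-list idea, with illustrative variable names rather than anything from your script:
| Code: | # rows we have already printed
set seen [list]
foreach line $data {
    if {[string match "X*" $line]} {
        set nets [lrange $line 1 4]
        ;# only print the row if it is not already in the list
        if {[lsearch -exact $seen $nets] == -1} {
            lappend seen $nets
            puts $nets
        }
    }
} |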
_________________ NML_375, idling at #eggdrop@IrcNET |
nml375 Revered One
Joined: 04 Aug 2006 Posts: 2857
Posted: Mon Feb 23, 2009 12:49 pm    Post subject:
Or, actually, since you use lists and glob-style matching, this could probably be made a lot simpler...
| Code: | set newdata [lsort -unique [lsearch -all -inline -glob $data "X*"]]
# You do some data-mangling, so we'll do it here... I'm assuming each line is a valid Tcl list on its own
foreach line $newdata {
    puts [lrange $line 1 4]
} |
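Note that this drops duplicates based on the whole line. If two lines only become identical after trimming them down to elements 1 through 4, you could (untested sketch) deduplicate after the lrange instead:
| Code: | set trimmed [list]
foreach line [lsearch -all -inline -glob $data "X*"] {
    lappend trimmed [lrange $line 1 4]
}
# drop duplicates among the trimmed rows, then print them
foreach row [lsort -unique $trimmed] {
    puts $row
} |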
_________________ NML_375, idling at #eggdrop@IrcNET |