topdawg_b (Voice)
Joined: 07 Dec 2008 | Posts: 32
Posted: Fri Dec 12, 2008 5:28 pm | Post subject: search a file, delete a line
I have a text file, designed like this:
word1 info about word 1
word2 info about word 2
word3 info about word 3
The file is called trigger.txt.
I want to delete a line: for example, seek out word2, and if a text line begins with word2, delete that whole line, leaving the file as:
word1 info about word 1
word3 info about word 3
What is the most efficient way of doing this?
vigilant (Halfop)
Joined: 05 Jan 2006 | Posts: 48
Posted: Fri Dec 12, 2008 10:57 pm
You delete it by using the lsearch function to find the line, writing the remaining lines to a new file, and then renaming that file to replace the original. _________________ Anser Quraishi
Website: http://www.anserq.com
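A minimal sketch of that search/rewrite/rename approach, assuming the trigger.txt layout from the first post (the proc name filterfile and the .tmp suffix are made up for illustration, not from the thread):

```tcl
# Remove the first line starting with $word from $fname by writing
# the remaining lines to a temporary file and renaming it over the
# original. Returns 1 if a line was removed, 0 otherwise.
proc filterfile {fname word} {
    set in [open $fname r]
    set lines [split [read $in] \n]
    close $in
    ;# find the first line whose first word matches (glob pattern)
    set idx [lsearch -glob $lines "$word *"]
    if {$idx < 0} { return 0 }
    set lines [lreplace $lines $idx $idx]
    set out [open "$fname.tmp" w]
    puts -nonewline $out [join $lines \n]
    close $out
    file rename -force "$fname.tmp" $fname
    return 1
}
```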
topdawg_b (Voice)
Joined: 07 Dec 2008 | Posts: 32
Posted: Fri Dec 12, 2008 11:49 pm
I know how to use the lsearch command, but I'm not sure what you mean by renaming it and replacing it.
This is what I have so far; $thestb would hold the "word2" from the example I gave:
| Code: |
set thestb [lindex $text 1]
set in [open "triggers.txt" r]
set data [read $in]
close $in
set line [split $data \n]
set here [lsearch [string tolower $line] "$thestb *"] |
Thanks for the help.
topdawg_b (Voice)
Joined: 07 Dec 2008 | Posts: 32
Posted: Sat Dec 13, 2008 7:00 am
This is what I came up with while I was waiting. I am sure there are errors even though it seems to work; please advise on improper syntax. Thanks.
Where test.ins has 3 lines in it:
line 1
line 2
line 3
Command: test:del file.ins 1
The file becomes:
line 1
line 3
| Code: |
proc test:del {file num} {
    set in [open $file r]
    set data [read $in]
    close $in
    set lines [split $data \n]
    set out [open $file w]
    set x 0
    foreach item $lines {
        ;# skip the line being deleted, and the empty element
        ;# left by the file's trailing newline
        if {$x != $num && $item ne ""} {
            puts $out $item
        }
        incr x
    }
    close $out
}
|
nml375 (Revered One)
Joined: 04 Aug 2006 | Posts: 2857
Posted: Sat Dec 13, 2008 9:17 pm
There are numerous ways to do this.
Going with your first approach, you've determined the list offset of the wanted line. Having that, creating the new file is a mere matter of using lreplace to remove that item from the list, using join with a custom separator (similar to split, but in reverse) to convert the new list back to a string, and writing it to the new file.
A somewhat different approach would be to read the file and convert it into a list (as before), but rather than using lsearch, simply use foreach to iterate through the whole list and test each item for the keyword, writing any non-matching line to the new file as you go. This gives you more powerful matching tools, but uses more resources while running. _________________ NML_375, idling at #eggdrop@IrcNET
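A sketch of that second, foreach-based approach, assuming the keyword should match the first word of each line; the proc name filterbyword and the .tmp suffix are illustrative, not from the thread:

```tcl
# Copy $fname, dropping every line whose first word equals $word
# (case-insensitive), then rename the copy over the original.
# Note: blank lines are dropped as a side effect.
proc filterbyword {fname word} {
    set in [open $fname r]
    set lines [split [read $in] \n]
    close $in
    set out [open "$fname.tmp" w]
    foreach item $lines {
        if {$item eq ""} { continue }
        if {![string equal -nocase $word [lindex [split $item] 0]]} {
            puts $out $item
        }
    }
    close $out
    file rename -force "$fname.tmp" $fname
}
```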
game_over (Voice)
Joined: 26 Apr 2007 | Posts: 29
Posted: Mon Dec 15, 2008 8:19 am
| Code: | proc test:del {file num criteria} { ;# criteria must be the word to match, e.g. "word2"
    set out [open $file r]
    set data [read $out]
    set line [split $data \n]
    close $out
    set out1 [open $file w] ;# use a different handle name ($out1) for writing
    foreach delline $line {
        if {[lindex $delline 0] == "$criteria" && [lindex $delline 0] != ""} {
            putlog "$delline" ;# if you see this, the effect in the log is !!!!! 1 !!!!!!
            puts $out1 "$delline"
        }
    }
    close $out1
} |
!!!!! 1 !!!!!! ->
word1 info about word 1
word3 info about word 3
topdawg_b (Voice)
Joined: 07 Dec 2008 | Posts: 32
Posted: Tue Dec 16, 2008 9:04 am
Shouldn't this line
| Quote: |
if {[lindex $delline 0] == "$criteria" && [lindex $delline 0] != ""} {
    putlog "$delline" ;# if you see this, the effect in the log is !!!!! 1 !!!!!!
    puts $out1 "$delline"
}
|
actually be
| Quote: |
if {[lindex $delline 0] != "$criteria" && [lindex $delline 0] != ""} {
    putlog "$delline" ;# if you see this, the effect in the log is !!!!! 1 !!!!!!
    puts $out1 "$delline"
}
|
in order to put the non-deleted lines into the file?
Are there any limits to using
set data [read $out]
set line [split $data \n]
Would this same method work if the file had 5000 lines?
nml375 (Revered One)
Joined: 04 Aug 2006 | Posts: 2857
Posted: Tue Dec 16, 2008 3:45 pm
The only limit is memory, as read will try to read as much data from the file at once as possible.
Although not a limit for the script itself, time might be a limit for your eggdrop: while this script is processing, no other actions will be taken, possibly causing your eggdrop to ping timeout, etc.
For huge data sources, I'd suggest something like the code below. Keep in mind that it only processes one line per second, so filtering a huge file will take considerable time; however, since it's driven by timers, your eggdrop will remain responsive meanwhile. It also implements a simple file lock to prevent multiple filterings at once.
| Code: |
proc StartFilter {File Pattern} {
    if {[info exists ::FilterLockfile] && $::FilterLockfile == 1} {return 0}
    set ::FilterLockfile 1
    set fIdRead [open "$File" "RDONLY"]
    ;# pick an unused random filename in the eggdrop temp dir
    ;# (temp-path is a global config variable, so it needs the :: prefix here)
    while {[file exists [set tmpfile [file join ${::temp-path} [randstring 8]]]]} {}
    set fIdWrite [open "$tmpfile" "WRONLY CREAT"]
    fconfigure $fIdRead -blocking 0
    ProcessFile $fIdRead $fIdWrite $Pattern [list file rename -force -- $tmpfile $File]
    return 1
}
proc ProcessFile {ReadFId WriteFId Pattern Cleanup} {
    if {[gets $ReadFId string] == -1 && [eof $ReadFId]} {
        close $ReadFId
        close $WriteFId
        eval $Cleanup
        set ::FilterLockfile 0
    } else {
        if {![string equal -nocase $Pattern [lindex [split $string] 0]]} {
            puts $WriteFId $string
        }
        utimer 1 [list ProcessFile $ReadFId $WriteFId $Pattern $Cleanup]
    }
}
|
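A hypothetical invocation, assuming the eggdrop temp-path setting points at a writable directory and a randstring helper proc exists (neither is defined in this post):

```tcl
# Filter trigger.txt in the background: drop every line whose
# first word is "word2", one line per second via utimer, without
# blocking the bot.
StartFilter "trigger.txt" "word2"
```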
_________________ NML_375, idling at #eggdrop@IrcNET |