RSSNews.tcl - by perpleXa --- Help!

Support & discussion of released scripts, and announcements of new releases.
SWAT
Voice
Posts: 2
Joined: Sun Oct 09, 2005 3:15 pm

Post by SWAT »

This is my RSSnews file. It works great (with private notices and all), but I noticed that the script doesn't use the 'database files' anymore. It didn't create new ones and it didn't update the existing ones. Can anyone help me fix this? Below is my edited script.

Code:

array unset feed

set feed(3DGamers) {
  URL=http://www.3dgamers.com:80/feeds/netscape/3dgamers.rss   /* The location of the news, in format http://domain.com:port/file.xml (port value is optional). */
  DATABASE=scripts/dbase/rssnews/.3dgamers                     /* The file where the news are saved. */
  CHANNELS=#CeTa		                               /* On which channels is the feed activated? Set to ALL for all channels. */
  POSTNEWS=0                                                   /* Post news if there are new ones? */
  POSTLIMIT=3                                                  /* max. posted news */
  PUBLIMIT=3                                                   /* How many news are posted on pub triggers? */
  MSGLIMIT=3                                                  /* How many news are posted on msg triggers? */
  /* Thats the design used for automated output. */
  POSTLAYOUT=\00300,14\[\00309\002!\002\00300\] \00300www.\002\003093D\00300gamers\002.com \[\00309\002!\002\00300\] <news> \[\00309\002!\002\00300\] <link> \[\00309\002!\002\00300\]\003
  /* And this one is for use on trigger events. */
  TRIGLAYOUT=\00300,14\[\00309\002!\002\00300\] \00300www.\002\003093D\00300gamers\002.com \[\00309\002!\002\00300\] <news> \[\00309\002!\002\00300\] <link> \[\00309\002!\002\00300\]\003
}

## ---- End of Setup -------------------------------------------
## -------------------------------------------------------------

if {[package vcompare [info tclversion] 8.4] < 0} {
  putlog "You don't have TCL 8.4, you have to upgrade to version 8.4 or higher to use [file tail [info script]]."
  return;
}

package require http

namespace eval rss {
  variable protect 60
  variable timeout 20
  variable pubbind {!news}
  variable msgbind {!news}
  variable v_major 4
  variable v_minor 5
  variable v_build 0251
  variable version $v_major.$v_minor.$v_build
  variable client "Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.7) Gecko/20040803 Firefox/0.9.3"
  bind PUB -|- $pubbind [namespace current]::public
  bind MSG -|- $msgbind [namespace current]::private
  bind TIME -|- {?0 * * * *} [namespace current]::check
  namespace export public private check
}

proc rss::check {args} {
  global feed
  variable client
  variable timeout
  putquick "PING :[clock seconds]" -next
  foreach id [array names feed] {
    set url "0"
    set database "0"
    set channels "0"
    set postnews "1"
    set postlimit "3"
    set publimit "3"
    set msglimit "10"
    set postlayout {\00314(\00307<publisher>\00314)\00307 <news> \00314<\00307<link>\00314>\003}
    foreach line [split $feed($id) \n] {
      regsub -all -- {/\*.*?\*/} $line {} line
      regexp -nocase -- {^\s*URL=(.+?)\s*$} $line tmp url
      regexp -nocase -- {^\s*DATABASE=(.+?)\s*$} $line tmp database
      regexp -nocase -- {^\s*CHANNELS=(.+?)\s*$} $line tmp channels
      regexp -nocase -- {^\s*POSTNEWS=(.+?)\s*$} $line tmp postnews
      regexp -nocase -- {^\s*POSTLIMIT=(.+?)\s*$} $line tmp postlimit
      regexp -nocase -- {^\s*PUBLIMIT=(.+?)\s*$} $line tmp publimit
      regexp -nocase -- {^\s*MSGLIMIT=(.+?)\s*$} $line tmp msglimit
      regexp -nocase -- {^\s*POSTLAYOUT=(.+?)\s*$} $line tmp postlayout
    }
    if {($url == 0) || ($database == 0) || ($channels == 0)} {
      putlog "RSS: Warning: Couldn't load configuration for the \[$id\] feed."
      continue
    }
    if {$postnews == 0} {
      continue
    }
    if {![file isdirectory [file dirname $database]]} {
      file mkdir [file dirname $database]
    }
    set count 0
    set data {}
    http::config -useragent $client
    catch {http::geturl $url -command "[namespace current]::check:data {$database} {$channels} {$postlimit} {$postlayout}" -timeout [expr $timeout * 1000]}
  }
}

proc rss::check:data {database channels postlimit postlayout token} {
  upvar 0 $token state
  if {![string equal -nocase $state(status) "ok"]} {
    return 0
  }
  set latestnews "iddqd"
  if {[file exists $database]} {
    set temp [open $database r+]
    set latestnews [gets $temp]
    if {[string length $latestnews] <= 1} {
      set latestnews "iddqd"
    }
    close $temp
  }
  set data [http::data $token]
  http::cleanup $token
  set publisher [publisher $data]
  set data [parse $data]
  set temp [open $database w+]
  set postlayout [join $postlayout { }]
  foreach {item} $data {
    regsub -all -- {<id>} $postlayout [lindex $item 0] output
    regsub -all -- {<publisher>} $output $publisher output
    regsub -all -- {<link>} $output [lindex $item 1] output
    regsub -all -- {<news>} $output [lindex $item 2] output
    regsub -all -- {<upfirstchar\s(.*?)>} [clean $output] {[upfirstchar "\1"]} output
    puts $temp [decode [subst $output]]
  }
  close $temp
  set count 0
  set temp [open $database r+]
  while {![eof $temp]} {
    gets $temp headline
    if {([string equal -nocase $latestnews $headline]) || ([string equal -nocase $latestnews "iddqd"]) || ($count == $postlimit)} {
      break
    }
    incr count
    msg $channels $headline
  }
  close $temp
}

proc rss::news {target id type} {
  global feed
  variable client
  variable timeout
  if {$type == 2} {
    set msgtype PRIVMSG
  } else {
    set msgtype NOTICE
  }
  set url "0"
  set publimit "3"
  set msglimit "10"
  set triglayout "\00314\[\00307<id>\00314\]\00307 <news> \00314<\00307<link>\00314>\003"
  foreach item [split $feed($id) \n] {
    regsub -all -- {/\*.*?\*/} $item {} item
    regexp -nocase -- {^\s*URL=(.+?)\s*$} $item tmp url
    regexp -nocase -- {^\s*PUBLIMIT=(.+?)\s*$} $item tmp publimit
    regexp -nocase -- {^\s*MSGLIMIT=(.+?)\s*$} $item tmp msglimit
    regexp -nocase -- {^\s*TRIGLAYOUT=(.+?)\s*$} $item tmp triglayout
  }
  if {($url == 0)} {
    putquick "$msgtype $target :Warning: Couldn't load configuration for the \[$id\] feed."
    return 0
  }
  if {$type == 1} {
    set limit $msglimit
  } elseif {$type == 2} {
    set limit $publimit
  } else {
    return 0
  }
  http::config -useragent $client
  catch {http::geturl $url -timeout [expr $timeout * 1000]} token
  if {[regexp -nocase -- {^couldn\'t\sopen\ssocket:\s+?(.*)$} $token tmp state(status)]} {
    putquick "$msgtype $target :Warning: Couldn't connect to the \[$id\] feed ($state(status))."
    return 0
  }
  upvar 0 $token state
  if {![string equal -nocase $state(status) "ok"]} {
    putquick "$msgtype $target :Warning: Couldn't connect to the \[$id\] feed (connection $state(status))."
    return 0
  }
  set data [http::data $token]
  http::cleanup $token
  set publisher [publisher $data]
  set data [parse $data]
  set count 0
  set triglayout [join $triglayout { }]
  foreach {item} $data {
    incr count
    regsub -all -- {<id>} $triglayout [lindex $item 0] output
    regsub -all -- {<publisher>} $output $publisher output
    regsub -all -- {<link>} $output [lindex $item 1] output
    regsub -all -- {<news>} $output [lindex $item 2] output
    regsub -all -- {<upfirstchar\s(.*?)>} [clean $output] {[upfirstchar "\1"]} output
    set output [decode [subst $output]]
    if {$type == 2} {
      if {[regexp -- {c} [getchanmode $target]]} {
        set output [stripcodes c $output]
      }
    }
    puthelp "$msgtype $target :$output"
    if {($count == $limit)} {
      break
    }
  }
}

proc rss::publisher {content} {
  set publisher {n/a}
  regsub -all -- {\n+|\s+|\t+} $content { } content
  regsub -all -- {([\\&])} $content {\\\1} content
  regexp -nocase -- {<title>(.+?)</title>} $content tmp publisher
  return $publisher
}

proc rss::parse {content} {
  regsub -all -- {\n+|\s+|\t+} $content { } content
  regsub -all -- {([\\&])} $content {\\\1} content
  set item 0
  set news ""
  while {[regexp -nocase -- {<item(\s[^>]*?)?>(.+?)</item>} $content -> & value]} {
    incr item
    set title {n/a}
    regexp -nocase -- {<title>(.+?)</title>} $value -> title
    regexp -nocase -- {\<\!\[CDATA\[(.*?)\]\]\>} $title -> title
    set link {n/a}
    regexp -nocase -- {<link>(.+?)</link>} $value -> link
    regexp -nocase -- {\<\!\[CDATA\[(.*?)\]\]\>} $link -> link
    regsub -nocase -- {<item.*?>.+?</item>} $content {} content
    lappend news "$item {$link} {$title}"
  }
  return [lsort -integer -unique -index 0 $news]
}

proc rss::decode {content} {
  if {![regexp -- & $content]} {
    return $content
  }
  set escapes {
    &nbsp; \x20 &quot; \x22 &amp; \x26 &apos; \x27 &ndash; \x2D &lt; \x3C &gt; \x3E &tilde; \x7E &euro; \x80 &iexcl; \xA1
    &cent; \xA2 &pound; \xA3 &curren; \xA4 &yen; \xA5 &brvbar; \xA6 &sect; \xA7 &uml; \xA8 &copy; \xA9 &ordf; \xAA &laquo; \xAB
    &not; \xAC &shy; \xAD &reg; \xAE &hibar; \xAF &deg; \xB0 &plusmn; \xB1 &sup2; \xB2 &sup3; \xB3 &acute; \xB4 &micro; \xB5
    &para; \xB6 &middot; \xB7 &cedil; \xB8 &sup1; \xB9 &ordm; \xBA &raquo; \xBB &frac14; \xBC &frac12; \xBD &frac34; \xBE &iquest; \xBF
    &Agrave; \xC0 &Aacute; \xC1 &Acirc; \xC2 &Atilde; \xC3 &Auml; \xC4 &Aring; \xC5 &AElig; \xC6 &Ccedil; \xC7 &Egrave; \xC8 &Eacute; \xC9
    &Ecirc; \xCA &Euml; \xCB &Igrave; \xCC &Iacute; \xCD &Icirc; \xCE &Iuml; \xCF &ETH; \xD0 &Ntilde; \xD1 &Ograve; \xD2 &Oacute; \xD3
    &Ocirc; \xD4 &Otilde; \xD5 &Ouml; \xD6 &times; \xD7 &Oslash; \xD8 &Ugrave; \xD9 &Uacute; \xDA &Ucirc; \xDB &Uuml; \xDC &Yacute; \xDD
    &THORN; \xDE &szlig; \xDF &agrave; \xE0 &aacute; \xE1 &acirc; \xE2 &atilde; \xE3 &auml; \xE4 &aring; \xE5 &aelig; \xE6 &ccedil; \xE7
    &egrave; \xE8 &eacute; \xE9 &ecirc; \xEA &euml; \xEB &igrave; \xEC &iacute; \xED &icirc; \xEE &iuml; \xEF &eth; \xF0 &ntilde; \xF1
    &ograve; \xF2 &oacute; \xF3 &ocirc; \xF4 &otilde; \xF5 &ouml; \xF6 &divide; \xF7 &oslash; \xF8 &ugrave; \xF9 &uacute; \xFA &ucirc; \xFB
    &uuml; \xFC &yacute; \xFD &thorn; \xFE &yuml; \xFF
  }
  set content [string map $escapes $content]
  regsub -all -- {&[a-zA-Z]+?;} [clean $content] {?} content
  regsub -all -- {&#(\d{1,3});} $content {[format %c [scan \1 %d]]} content
  return [subst $content]
}

proc rss::private {nickname hostname handle arguments} {
 global feed
 variable spam
 variable protect
  set arguments [clean $arguments]
  set spewfeed [lindex $arguments 0]
  if {![validfeed $spewfeed 1]} {
    putquick "NOTICE $nickname :Please supply a valid feed: [join [lsort -dictionary [array names feed]] ",\x20"]"
    return 0
  }
  set spewfeed [validfeed $spewfeed 2]
  if {([info exists spam(flood,$spewfeed,$hostname)])} {
    set s [expr [clock seconds] - $spam(flood,$spewfeed,$hostname)]
    if {$s < $protect} {
      putquick "NOTICE $nickname :Sorry - This trigger has recently been used. It will be unlocked in [expr $protect - $s] seconds."
      return 0
    }
  }
  set spam(flood,$spewfeed,$hostname) [clock seconds]
  news $nickname $spewfeed 1
}

proc rss::public {nickname hostname handle channel arguments} {
 global feed
 variable spam
 variable protect
  set arguments [clean $arguments]
  set spewfeed [lindex $arguments 0]
  if {![validfeed $spewfeed 1]} {
    putquick "NOTICE $nickname :Please supply a valid feed: [join [lsort -dictionary -unique [array names feed]] ",\x20"]"
    return 0
  }
  set spewfeed [validfeed $spewfeed 2]
  if {([info exists spam(flood,$spewfeed,$channel)]) && (![isop $nickname $channel])} {
    set s [expr [clock seconds] - $spam(flood,$spewfeed,$channel)]
    if {$s < $protect} {
      putquick "NOTICE $nickname :Sorry - This trigger has recently been used. It will be unlocked in [expr $protect - $s] seconds."
      return 0
    }
  }
  set spam(flood,$spewfeed,$channel) [clock seconds]
  set channels 0
  foreach item [split $feed($spewfeed) \n] {
    regsub -all -- {/\*.*\*/} $item {} item
    regexp -nocase -- {^\s*CHANNELS=(.+?)\s*$} $item tmp channels
  }
  if {([lsearch -exact [string tolower $channels] [string tolower $channel]] == -1) && (![string equal -nocase $channels "ALL"])} {
    putquick "NOTICE $nickname :The \[$spewfeed\] feed is not available on this channel. ($channels)"
    return 0
  }
  news $nickname $spewfeed 1
}

proc rss::msg {channels headline} {
  if {[string equal -nocase $channels "ALL"]} {
    foreach channel [channels] {
      if {[regexp -- {c} [getchanmode $channel]] && [regexp -- {\003} $headline]} {
        lappend nocolors $channel
      } else {
        lappend colors $channel
      }
    }
  } else {
    foreach channel [channels] {
      if {[lsearch -exact [string tolower $channels] [string tolower $channel]] >= 0} {
        if {[regexp -- {c} [getchanmode $channel]] && [regexp -- {\003} $headline]} {
          lappend nocolors $channel
        } else {
          lappend colors $channel
        }
      }
    }
  }
  if {[info exists nocolors]} {
    putquick "PRIVMSG [join $nocolors {,}] :[stripcodes c $headline]"
  }
  if {[info exists colors]} {
    putquick "PRIVMSG [join $colors {,}] :$headline"
  }
}

proc rss::validfeed {keyword type} {
 global feed
  foreach id [array names feed] {
    if {[string equal -nocase $id $keyword]} {
      switch -exact -- $type {
        {1} {
          return 1
        }
        {2} {
          return $id
        }
      }
    }
  }
  return 0
}

proc rss::upfirstchar {content} {
  regsub -all -- {((^|\s)([a-z]))} [clean $content] {[string toupper "\1"]} content
  return [subst $content]
}

proc rss::clean {string} {
  regsub -all -- {([\(\)\[\]\{\}\$\"\\])} $string {\\\1} string
  return $string
}
demond
Revered One
Posts: 3073
Joined: Sat Jun 12, 2004 9:58 am
Location: San Francisco, CA
Contact:

Post by demond »

there is no need for database files to check whether something in the feed has changed or not - see my rss script for an example
connection, sharing, dcc problems? click <here>
before asking for scripting help, read <this>
use [code] tag when posting logs, code
SWAT
Voice
Posts: 2
Joined: Sun Oct 09, 2005 3:15 pm

Post by SWAT »

I'll have a look at your script later (I can't look at any more code right now, and the other script is currently working). The database files are VERY handy. If you set the script to update the text files every X minutes (for example 30 or 60) and then post the content from the text file to the channel, the data traffic would decrease. If 10 users used the RSS feed command in 10 minutes, the website would get 10 hits, and you'd have to wait longer because the website has to be contacted every time.

Example:
Without DB files: 10 users in 10 minutes = 100 kb of traffic and a waiting time of 10 seconds.
With DB files: 10 users in 10 minutes = 10 kb of traffic and a waiting time of 2 seconds.
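
For illustration, a trigger could be answered straight from the database file that rss::check:data already writes, instead of fetching the feed again. This is only a rough sketch with a hypothetical proc name, assuming the file holds one formatted headline per line:

Code:

proc rss::fromcache {target database limit} {
  # hypothetical helper: reply to a trigger from the cached database
  # file instead of contacting the website again
  if {![file exists $database]} {
    puthelp "NOTICE $target :No cached news available for this feed yet."
    return 0
  }
  set fp [open $database r]
  set count 0
  while {[gets $fp headline] >= 0 && $count < $limit} {
    puthelp "NOTICE $target :$headline"
    incr count
  }
  close $fp
  return $count
}

The periodic fetch would stay in rss::check, so the website is contacted once per interval no matter how many users pull the trigger.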
demond
Revered One
Posts: 3073
Joined: Sat Jun 12, 2004 9:58 am
Location: San Francisco, CA
Contact:

Post by demond »

my script is fundamentally different from perplexa's: it doesn't fetch feeds on user demand; it fetches them periodically, compares the feed contents in memory, and announces only updated headlines as necessary

naturally, it has the whole feed contents available at all times (in memory, not in a file), and it sends that content out on demand
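
For illustration, the in-memory variant could look roughly like this (hypothetical proc and variable names, not demond's actual code): keep the last parse result per feed in a namespace variable, announce only the links that weren't seen before, and leave the whole feed available for trigger replies:

Code:

namespace eval rss {
  variable cache   ;# cache($id) keeps the last parsed item list, in memory only
}

proc rss::remember {id items channels} {
  # hypothetical sketch: compare the fresh parse against the copy kept
  # in memory and announce only headlines whose links were not seen before
  variable cache
  if {[info exists cache($id)]} {
    set known {}
    foreach old $cache($id) { lappend known [lindex $old 1] }
    foreach item $items {
      if {[lsearch -exact $known [lindex $item 1]] == -1} {
        msg $channels "[lindex $item 2] <[lindex $item 1]>"
      }
    }
  }
  set cache($id) $items   ;# full feed stays in memory for trigger replies
}

In terms of the script posted above, rss::check:data could call something like this right after [parse $data] instead of rewriting the database file, and the triggers could read cache($id) directly.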
connection, sharing, dcc problems? click <here>
before asking for scripting help, read <this>
use [code] tag when posting logs, code
rix
Halfop
Posts: 42
Joined: Wed Sep 21, 2005 1:04 pm
Location: Estonia

Post by rix »

hey, you've got a nice script here, but there's one thing bothering me - it posts the feed to the channel at random times, and I want it to post only when the trigger is used.

already tried demond's script, but that didn't work for me :(
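
Going by the script posted above, that behaviour looks configurable per feed: when POSTNEWS is 0, rss::check skips the feed before fetching anything, so nothing is posted automatically and only the !news pub/msg triggers answer. A minimal feed block along those lines (URL, database path and channel are placeholders):

Code:

set feed(Example) {
  URL=http://www.example.com/feed.rss        /* feed location */
  DATABASE=scripts/dbase/rssnews/.example    /* keeps rss::check from logging a config warning */
  CHANNELS=#mychannel                        /* where the !news trigger may be used */
  POSTNEWS=0                                 /* 0 = never post automatically, trigger only */
  PUBLIMIT=3                                 /* headlines per !news in the channel */
  MSGLIMIT=3                                 /* headlines per !news via /msg */
  TRIGLAYOUT=<news> - <link>
}

As a side effect, with POSTNEWS=0 the database file is never created or updated, since only rss::check:data writes it - which would also explain the behaviour described in the first post, where POSTNEWS is set to 0.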