gencha Voice
Joined: 10 Feb 2007 Posts: 15
Posted: Thu Nov 15, 2007 7:08 pm Post subject:
!local works great now
Thanks again
speechles Revered One

Joined: 26 Aug 2006 Posts: 1398 Location: emerald triangle, california (coastal redwoods)
Posted: Thu Nov 15, 2007 8:58 pm Post subject:
| gencha wrote: | This was my group's request:
I'll install your fix and see what it does for me.
Thanks for the quick reply |
AHA, I see now exactly what is causing it, thanks. This is how all group replies should look now.
Groups now uses double lookups (google groups/usenet groups) to ensure accuracy. Get it here again, or from any v1.96 link above.. have a fun
gencha Voice
Joined: 10 Feb 2007 Posts: 15
Posted: Fri Nov 16, 2007 7:33 am Post subject:
Excellent, thanks again.
speechles Revered One

Joined: 26 Aug 2006 Posts: 1398 Location: emerald triangle, california (coastal redwoods)
Posted: Thu Nov 22, 2007 11:33 am Post subject:
!local now uses double lookups (local maps/global maps) to ensure accuracy. !google weather had issues; those have all been resolved (at the moment google is omitting conditions; when they add these back the bot will work with them as well).
Get the current script here or at any of the v1.96 links above. Most important, remember, have a fun.
gencha Voice
Joined: 10 Feb 2007 Posts: 15
Posted: Wed Nov 28, 2007 7:15 am Post subject:
What I encountered quite often is that URLs (especially image links) are displayed with spaces.
Having them replaced with %20 would be great.
Astur Voice
Joined: 23 Nov 2007 Posts: 16
Posted: Fri Nov 30, 2007 4:29 pm Post subject:
Is there a possibility to disable some modules? I don't need all of this stuff.
I just need google search, google image search, youtube, locate and wikipedia.
I didn't find anything to disable the other stuff, and when I tried to delete it the script stopped working ^^ (I'm new to tcl and eggdrops).
Can anyone help me?
speechles Revered One

Joined: 26 Aug 2006 Posts: 1398 Location: emerald triangle, california (coastal redwoods)
Posted: Fri Nov 30, 2007 8:24 pm Post subject:
| Astur wrote: | Is there a possibility to disable some modules? I don't need all of this stuff.
I just need google search, google image search, youtube, locate and wikipedia.
I didn't find anything to disable the other stuff, and when I tried to delete it the script stopped working ^^ (I'm new to tcl and eggdrops).
Can anyone help me? |
| Code: | # number of search results/image links to return, 'define:' is always 1 as some defs are huge
variable search_results 4
variable image_results 4
variable local_results 4
variable group_results 3
variable news_results 3
variable print_results 3
variable video_results 4
variable youtube_results 5
variable locate_results 1
variable gamespot_results 3
variable trans_results 1
variable daily_results 4
variable gamefaq_results 20
variable blog_results 3
variable ebay_results 3
variable popular_results 10
variable rev_results 1
variable wiki_results 1
variable wikimedia_results 1
variable recent_results 10
variable mininova_results 3
variable ign_results 3
variable myspacevids_results 3
variable trends_results 20 |
Setting any of the above to 0 effectively disables that trigger completely in every regard; even !help <trigger> will say it's disabled. Try typing !help all after you've edited and saved the config, directly after you've issued your rehash/restart to your bot. You can then clearly see whether your triggers are truly disabled or not.
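To picture the disable behaviour, here is a short hypothetical Tcl sketch; the variable names mirror the config above, but the gating proc is an illustration of the idea, not the script's actual code:

```tcl
namespace eval incith::google {
    # example settings mirroring the config above (values are illustrative)
    variable youtube_results 5
    variable gamespot_results 0
}

# Hypothetical gate: a trigger whose *_results count is below 1 is treated
# as disabled everywhere, including by !help.
proc trigger_enabled {name} {
    return [expr {[set incith::google::${name}_results] >= 1}]
}
```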
| gencha wrote: | What I encountered quite often is that URLs (especially image links) are displayed with spaces.
Having them replaced with %20 would be great. |
I'll look into this shortly; it's easily fixed, as the script already has a urlencode function that should be correcting links such as this. I'll take a peek and see if I can stumble upon the flaw.
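For reference, a minimal percent-encoding helper in Tcl could look like the sketch below; this illustrates the idea only and is not the script's own urlencode function:

```tcl
# Minimal percent-encoder: leaves URL-safe characters alone and encodes
# everything else (including spaces) as %XX, so "my image.jpg" becomes
# "my%20image.jpg".
proc urlencode {text} {
    set out ""
    foreach char [split $text ""] {
        if {[regexp {^[A-Za-z0-9._~:/-]$} $char]} {
            append out $char
        } else {
            # encode the UTF-8 bytes of the character as %XX hex pairs
            foreach byte [split [encoding convertto utf-8 $char] ""] {
                binary scan $byte c value
                append out [format %%%02X [expr {$value & 0xFF}]]
            }
        }
    }
    return $out
}
```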
QQleQ Voice
Joined: 20 Nov 2006 Posts: 14
Posted: Mon Dec 03, 2007 12:41 pm Post subject:
Tcl error [incith::google::public_message]: invalid command name "stripcodes"
I also have that, while I am running tcl 8.4 and the right eggdrop version..
speechles Revered One

Joined: 26 Aug 2006 Posts: 1398 Location: emerald triangle, california (coastal redwoods)
Posted: Mon Dec 03, 2007 7:15 pm Post subject:
| QQleQ wrote: | Tcl error [incith::google::public_message]: invalid command name "stripcodes"
I also have that, while I am running tcl 8.4 and the right eggdrop version.. |
Your eggdrop needs a version recent enough to have stripcodes built in; eggdrop 1.6.18 supports it. stripcodes is only used by wikipedia/wikimedia: because wikipedia/wikimedia can't presently establish proper character encodings 100% of the time, stripcodes is needed to keep problematic characters from triggering colors, bold, underline, etc. As soon as I complete wikipedia/wikimedia it will support encodings correctly, and these stripcodes calls will become irrelevant and be removed. In the meantime, you can safely edit the stripcodes calls out if you don't wish to upgrade (albeit with possible strange bolding, coloring, or underlining of wiki results in some languages)..
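For those who can't upgrade, a stand-in along these lines could replace the missing built-in. This is a hedged sketch: the real stripcodes command in newer eggdrops takes flag arguments this simple version omits.

```tcl
# Strip mIRC control codes: bold (\x02), color (\x03 + optional digits),
# plain/reset (\x0f), reverse (\x16) and underline (\x1f).
proc strip_mirc_codes {text} {
    # color codes: \x03 followed by an optional fg[,bg] number pair
    regsub -all {\x03(\d{1,2}(,\d{1,2})?)?} $text "" text
    # single-character attribute toggles
    regsub -all {[\x02\x0f\x16\x1f]} $text "" text
    return $text
}
```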
QQleQ Voice
Joined: 20 Nov 2006 Posts: 14
Posted: Tue Dec 04, 2007 11:58 am Post subject:
The whole wikipedia search makes it hang like crazy..
Every now and then, especially when wiki requests come in a flood, it goes:
TCL error [incith::google::public_message]: couldn't open socket: host is unreachable
and does that for every line (it takes a minute between two requests),
eventually making the eggdrop jump servers or ping timeout.
Sometimes the wikipedia search works just fine.
speechles Revered One

Joined: 26 Aug 2006 Posts: 1398 Location: emerald triangle, california (coastal redwoods)
Posted: Tue Dec 04, 2007 7:05 pm Post subject:
| QQleQ wrote: | The whole wikipedia search makes it hang like crazy..
Every now and then, especially when wiki requests come in a flood, it goes:
TCL error [incith::google::public_message]: couldn't open socket: host is unreachable
and does that for every line (it takes a minute between two requests),
eventually making the eggdrop jump servers or ping timeout.
Sometimes the wikipedia search works just fine. |
It sounds like a bandwidth problem to me, as that error only happens when a bandwidth shortage occurs. With wikipedia/wikimedia there are three possible page loads, and only the first checks for socket/timeout errors. Only the first check is made because the bot assumes it will have acceptable bandwidth to load the additional pages afterwards. The error detection is mostly there to catch bad user input, as only the 1st page load comes from user input; the bot itself then decides which links to traverse/display based on results in that 1st page, which is why the assumption is made.
To make it clearer, keep in mind each of these page loads has a 15-second timeout. This means that if no error occurs on the first page load, but during the 2nd and 3rd ones you run short of bandwidth, you have to wait for those timeouts to expire. And since no error catching is done on the 2nd and 3rd loads, your error happens only after you've waited 30/45 seconds. This is intentional, as some pages are too long to load entirely and some sites are too slow to complete it all any faster; the moral is, the ends of pages cannot otherwise be searched without leaving the timeout at 15 seconds. I may at some point incorporate additional error catches just for those rare instances when the bot is being hammered and has run out of bandwidth... then it can return a custom message of your choosing as the reply; you can be nasty or nice this way.
I tried to make it work exactly as it would within a real web-browser, so it can be seamless to the user. If you've used the multi-language country switch, the #toc table-of-contents, or any of the kewl #sub-tagging features, you would notice how much work went into keeping it simple yet powerful.
If it isn't a bandwidth problem, can you provide some of the queries to wikipedia that are causing the issue? Maybe I can replicate it, and solve it.
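The timeout-and-catch pattern being described could be sketched like this (a hypothetical helper using Tcl's stock http package; the actual script's fetch routine differs):

```tcl
package require http

# Fetch one page with the 15-second timeout discussed above. Errors from
# http::geturl (e.g. "couldn't open socket") are caught instead of bubbling
# up as a Tcl error; the caller simply gets an empty string back.
proc fetch_page {url} {
    if {[catch {http::geturl $url -timeout 15000} token]} {
        return ""
    }
    set body ""
    if {[http::status $token] eq "ok"} {
        set body [http::data $token]
    }
    http::cleanup $token
    return $body
}
```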
QQleQ Voice
Joined: 20 Nov 2006 Posts: 14
Posted: Tue Dec 04, 2007 11:32 pm Post subject:
I found out that the problem might occur when the user uses control codes in their query,
for instance:
!wiki amster1dam
speechles Revered One

Joined: 26 Aug 2006 Posts: 1398 Location: emerald triangle, california (coastal redwoods)
Posted: Sun Dec 16, 2007 6:17 pm Post subject:
Merry Christmas.. below is your gift, finally.. heh | Code: | # number of lines you want your wiki* results to span; the more lines, the more
# of the wiki article or section you will see. some get cut short, if so raise this.
# this affects both wikipedia and wikimedia results.
#
variable wiki_lines 2
...snipped older parts not relevant...
# enable encoding conversion, set this to 1 to enable.
# with this enabled it will follow the format of encoding conversions listed
# below. these will affect both input and output and will follow country switch.
#
variable encoding_conversion_input 1
variable encoding_conversion_output 1
# encoding conversion lookups
# here is where you can correct language encoding problems by pointing their
# abbreviation towards an encoding. if you want more, feel free to add more.
# this is a somewhat poor example below, there are possibly hundreds of additions
# that need to be added to this section. enjoy and merry christmas ;P
#
variable encode_strings {
en:utf-8
com:utf-8
sr:iso8859-5
ru:cp1251
ar:iso8859-6
} |
As you can see, some new features have found their way into the script. Mainly, for those times you're finding wikipedia cutting everything off entirely too short, you can now adjust it and give it more lines.
Also, the encoding feature desired the most is now up for beta test. You can adjust it using the two switches:
- encoding_conversion_input - set to 1 causes input to be converted from the encoding
- encoding_conversion_output - set to 1 causes output to be converted to the encoding
Setting both, of course, will do both. This will only work if the countrycode you use is defined in the encode_strings table directly below the switches. This is the big ol' table I've referenced needing before, which atm isn't very big at all. It will require several additions, so for now feel free to add to this table and help test the script. Consider this to be v1.9.7, finally..
Get the new script HERE (v1.9.7) or at the v1.9.7 link on the very first page of this thread.
Keep in mind the encodings are beta. I converse only in English, so there are bound to be some unforeseen problems, such as regexp/regsub fixes and slight realignment regarding the placement of the input/output encoding sections. If you want this script to work correctly with your country's languages, it's up to you to provide some input, helpful feedback, and participation in this thread so accommodations can be made in the script to add your country correctly. Otherwise that language will "never" work corre... *trails off in mid sentence, gets up, and walks off*
Last edited by speechles on Sun Dec 16, 2007 7:30 pm; edited 9 times in total
speechles Revered One

Joined: 26 Aug 2006 Posts: 1398 Location: emerald triangle, california (coastal redwoods)
Posted: Sun Dec 16, 2007 6:27 pm Post subject:
For those curious, below are the segments of code for the input/output encoding conversions I've used. | Code: | # this is my input encoding hack; this will convert input before it goes
# out to be queried.
if {$incith::google::encoding_conversion_input > 0 && $country != ""} {
set encoding_found [lindex [split [lindex $incith::google::encode_strings [lsearch -glob $incith::google::encode_strings "$country:*"]] :] 1]
if {$encoding_found != "" && [lsearch -exact [encoding names] $encoding_found] != -1} {
set input [encoding convertfrom $encoding_found $input]
}
}
...snipped out query building/fetch html routines here...
# this is the output encoding hack.
if {$incith::google::encoding_conversion_output > 0} {
set encoding_found [lindex [split [lindex $incith::google::encode_strings [lsearch -glob $incith::google::encode_strings "$country:*"]] :] 1]
if {$encoding_found != "" && [lsearch -exact [encoding names] $encoding_found] != -1} {
set input [encoding convertto $encoding_found $input]
}
} |
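A quick standalone illustration of the country-to-encoding lookup those two snippets share, using the same "code:encoding" list format as the encode_strings config variable above:

```tcl
# The encode_strings list maps a country/language code to a Tcl encoding
# name, one "code:encoding" pair per element, just like the config variable.
set encode_strings {
    en:utf-8
    ru:cp1251
}

set country ru
# find the "ru:*" element, then split off the encoding after the colon
set match [lindex $encode_strings [lsearch -glob $encode_strings "$country:*"]]
set encoding_found [lindex [split $match ":"] 1]
# encoding_found is now "cp1251"; an unknown country yields an empty string,
# which the guards in the snippets above treat as "no conversion"
```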
Last edited by speechles on Thu Dec 20, 2007 12:29 am; edited 2 times in total
Zircon Op
Joined: 21 Aug 2006 Posts: 191 Location: Montreal