
UNOFFICIAL incith-google 2.1x (Nov 30, 2012)

Support & discussion of released scripts, and announcements of new releases.
gencha
Voice
Posts: 15
Joined: Sat Feb 10, 2007 2:52 pm

Post by gencha »

!local works great now
Thanks again
speechles
Revered One
Posts: 1398
Joined: Sat Aug 26, 2006 10:19 pm
Location: emerald triangle, california (coastal redwoods)

Post by speechles »

gencha wrote:This was my groups request:
19:02:11 <@gencha> !groups titan
19:02:15 <@TTB005> 8,130 Results | kern/37299: Contribution: Support for so @ http://groups.google.com/group/muc.list ... 6c9daff3aa | <tr><td> Features teen titan porn tee @ http://groups.google.com/group/gg3-myle ... porn">Teen Titan Porn</a> <nobr>Group: <a href="/group/gg3-mylexus-5 | Titan - Tactics @
19:02:16 <@TTB005> http://groups.google.com/group/rec.game ... d705b5259d
I'll install your fix and see what it does for me.
Thanks for the quick reply :)
AHA, I see now exactly what is causing it, thanks. This is how all group replies should look now:
sample from irc wrote:<speechles> !groups titan
<sp33chy> 8,140 Results | Titan????????? @ http://groups.google.com/group/ktfs-member | ?? ????? @ http://groups.google.com/group/fc-titan | kern/37299: Contribution: Support for so @ http://groups.google.com/group/muc.list ... 6c9daff3aa

<speechles> !groups marijuana
<sp33chy> 8,660 Results | rec.drugs.cannabis @ http://groups.google.com/group/rec.drugs.cannabis | alt.politics.marijuana @ http://groups.google.com/group/alt.politics.marijuana | Swiss study has some surprises on marij @ http://groups.google.com/group/uk.rec.d ... 2e685f7764

<speechles> !groups .nl marihuana
<sp33chy> 2.100 Resultaten | reggae,raggamuffin(halldance),hip-hop co @ http://groups.google.nl/group/rastapan | Marihuana... @ http://groups.google.nl/group/pl.sci.me ... d88fda1813 | Marihuana... @ http://groups.google.nl/group/pl.sci.me ... d88fda1813

<speechles> !groups works now perfectly
<sp33chy> 36,400 Results | xbox 360 wireless connection fails after @ http://groups.google.com/group/xbox-360 ... ter-update | Did the WTC attack ruin a perfectly go @ http://groups.google.com/group/rec.arts ... 2434f1ec45 | More deserved MUD for vista... @
<sp33chy> http://groups.google.com/group/microsof ... 4b70bf9add
Groups now uses double lookups (google groups/usenet groups) to ensure accuracy. Get it Here again, or at any v1.96 link above.. have a fun :lol:
gencha
Voice
Posts: 15
Joined: Sat Feb 10, 2007 2:52 pm

Post by gencha »

Excellent, thanks again.
speechles
Revered One
Posts: 1398
Joined: Sat Aug 26, 2006 10:19 pm
Location: emerald triangle, california (coastal redwoods)

Post by speechles »

!local now uses double lookups (local maps/global maps) to ensure accuracy. !google weather had issues; those have now all been resolved (at the moment google is omitting conditions; when they add these back, the bot will work with them as well).

Get the current script here or at any of the v1.96 links above. Most important, remember, have a fun. :lol:
gencha
Voice
Posts: 15
Joined: Sat Feb 10, 2007 2:52 pm

Post by gencha »

What I encountered quite often is that URLs (especially image links) are displayed with spaces.
Having those spaces replaced with %20 would be great.
Astur
Voice
Posts: 16
Joined: Fri Nov 23, 2007 10:27 am

Post by Astur »

Is it possible to disable some modules? I don't need all of this stuff.

I just need google search, google image search, youtube, locate and wikipedia.

I didn't find anything to disable the other stuff, so I tried deleting it, but the script no longer worked after that ^^ (I'm new to Tcl and eggdrops).

Can anyone help me?
speechles
Revered One
Posts: 1398
Joined: Sat Aug 26, 2006 10:19 pm
Location: emerald triangle, california (coastal redwoods)

Post by speechles »

Astur wrote:Is it possible to disable some modules? I don't need all of this stuff.

I just need google search, google image search, youtube, locate and wikipedia.

I didn't find anything to disable the other stuff, so I tried deleting it, but the script no longer worked after that ^^ (I'm new to Tcl and eggdrops).

Can anyone help me?

Code: Select all

    # number of search results/image links to return, 'define:' is always 1 as some defs are huge
    variable search_results 4
    variable image_results 4
    variable local_results 4
    variable group_results 3
    variable news_results 3
    variable print_results 3
    variable video_results 4
    variable youtube_results 5
    variable locate_results 1
    variable gamespot_results 3
    variable trans_results 1
    variable daily_results 4  
    variable gamefaq_results 20
    variable blog_results 3
    variable ebay_results 3
    variable popular_results 10
    variable rev_results 1
    variable wiki_results 1
    variable wikimedia_results 1
    variable recent_results 10
    variable mininova_results 3
    variable ign_results 3
    variable myspacevids_results 3
    variable trends_results 20
Setting any of the above to 0 effectively disables that trigger completely in every regard; even !help <trigger> will say it's disabled. Try typing !help all after you've edited/saved the config and rehashed/restarted your bot. You can then clearly see whether your triggers are truly disabled or not.
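As a rough illustration of the idea (the proc and message text below are hypothetical, not the script's actual internals), a result count of 0 can gate a trigger before it does any work:

```tcl
# Hypothetical sketch, not the script's actual code: a result count of 0
# makes the trigger bail out before any lookup happens, disabling it.
namespace eval incith::google {
    variable group_results 0
}

proc groups_trigger {query} {
    if {$incith::google::group_results < 1} {
        return "that trigger is disabled"
    }
    # ...otherwise perform the lookup and return up to that many results...
    return "results for $query"
}
```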
gencha wrote:What I encountered quite often is that URLs (especially image links) are displayed with spaces.
Having those spaces replaced with %20 would be great.
I'll look into this shortly; it's easily fixed, as the script already has a urlencode function that should be correcting links like this. I'll take a peek and see if I can stumble upon the flaw. ;)
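For reference, such a helper could look roughly like this (a hedged sketch only; the script's real urlencode function may differ): unreserved characters pass through, and everything else, including spaces, becomes %XX per byte of its UTF-8 form.

```tcl
# Hypothetical percent-encoding helper (the script's actual urlencode may
# differ): spaces become %20, and other unsafe characters are encoded as
# %XX for each byte of their UTF-8 representation.
proc urlencode {str} {
    set out ""
    foreach ch [split $str ""] {
        if {[string match {[a-zA-Z0-9._~/:-]} $ch]} {
            append out $ch
        } else {
            foreach byte [split [encoding convertto utf-8 $ch] ""] {
                scan $byte %c code
                append out [format %%%02X [expr {$code & 0xFF}]]
            }
        }
    }
    return $out
}
```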
QQleQ
Voice
Posts: 14
Joined: Mon Nov 20, 2006 10:05 pm

Post by QQleQ »

Tcl error [incith::google::public_message]: invalid command name "stripcodes"

I also get that, even though I'm running Tcl 8.4 and the right eggdrop version.. :(
speechles
Revered One
Posts: 1398
Joined: Sat Aug 26, 2006 10:19 pm
Location: emerald triangle, california (coastal redwoods)

Post by speechles »

QQleQ wrote:Tcl error [incith::google::public_message]: invalid command name "stripcodes"

I also get that, even though I'm running Tcl 8.4 and the right eggdrop version.. :(
This is the debut eggdrop version that supports stripcodes; eggdrop 1.6.18 also supports it. stripcodes is only used by wikipedia/wikimedia. Because wikipedia/wikimedia isn't presently capable of establishing proper character encodings 100% of the time, stripcodes is needed to keep problematic characters from triggering colors, bold, underline, etc... As soon as I complete wikipedia/wikimedia it will support encodings correctly, and these stripcodes will be irrelevant and removed. In the meantime, you can safely edit the stripcodes out if you don't wish to upgrade (albeit with possible strange bolding, coloring, or underlining of wiki results in some languages)..
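For anyone stuck on an older eggdrop, a pure-Tcl stand-in is another option. This is only a sketch under assumptions: the real stripcodes takes strip-flags selecting what to remove, while this simplified version ignores them and strips everything.

```tcl
# Hedged stand-in for eggdrop's built-in stripcodes, defined only when the
# real command is missing. It ignores the strip-flags argument and removes
# mIRC color codes (\003 plus optional digits) and bold/plain/reverse/
# underline control characters.
if {[info commands stripcodes] eq ""} {
    proc stripcodes {flags text} {
        regsub -all {\003(\d{1,2}(,\d{1,2})?)?} $text "" text
        regsub -all {[\002\017\026\037]} $text "" text
        return $text
    }
}
```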
QQleQ
Voice
Posts: 14
Joined: Mon Nov 20, 2006 10:05 pm

Post by QQleQ »

The whole wikipedia search makes it hang like crazy..
Every now and then, especially when wiki requests are flooded, it goes:
TCL error [incith::google::public_message]: couldn't open socket: host is unreachable
and does that for every line (takes a minute between two requests),
and eventually makes the eggdrop jump servers or ping timeout.

Sometimes the wikipedia search works just fine.
speechles
Revered One
Posts: 1398
Joined: Sat Aug 26, 2006 10:19 pm
Location: emerald triangle, california (coastal redwoods)

Post by speechles »

QQleQ wrote:The whole wikipedia search makes it hang like crazy..
Every now and then, especially when wiki requests are flooded, it goes:
TCL error [incith::google::public_message]: couldn't open socket: host is unreachable
and does that for every line (takes a minute between two requests),
and eventually makes the eggdrop jump servers or ping timeout.

Sometimes the wikipedia search works just fine.
It sounds like a bandwidth problem to me, as that error only happens when a bandwidth shortage occurs. With wikipedia/wikimedia there are three possible page loads, and only the first checks for socket/timeout errors. Only the first check is made because the bot assumes it will have acceptable bandwidth to load the additional pages afterwards. The error detection is mostly there to catch bad user input, as only the 1st page load comes from user input; the bot itself then decides which links to traverse/display based off results in that 1st page, which is why the assumption is made.

To make it clearer, keep in mind each of these page loads has a 15-second timeout. This means that if no error occurs on the first page load, but during the 2nd and 3rd ones you run short of bandwidth, you have to wait for those timeouts to expire. And since no error catching is done on the 2nd and 3rd loads, your error happens after you've waited 30/45 seconds. This is intentional, as some pages are too long to load entirely and some sites are too slloooooooow to complete it all in 10 seconds. The moral is, the ends of pages cannot otherwise be searched without leaving it at 15 seconds. I may at some point incorporate additional error catches just for those rare instances when the bot is being hammered and has run out of bandwidth... Then it can return a custom message of your choosing as the reply; you can be nasty or nice this way. :D
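To picture one of those page loads, here is an illustrative sketch using Tcl's standard http package (the proc name and error handling are assumptions, not the script's actual routines): a 15-second timeout is passed to the fetch, and http::status reports "timeout" when it expires, which is exactly the kind of failure only the first page load checks for.

```tcl
package require http

# Illustrative sketch (not the script's actual code) of a single page load
# with a 15-second timeout. http::status returns "ok", "timeout", or
# "error"; anything other than "ok" is raised to the caller here.
proc fetch_page {url} {
    set token [http::geturl $url -timeout 15000]
    set status [http::status $token]
    if {$status ne "ok"} {
        http::cleanup $token
        return -code error "fetch failed: $status"
    }
    set html [http::data $token]
    http::cleanup $token
    return $html
}
```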

I tried to make it work exactly as it would within a real web browser, so it is seamless to the user. If you've used the multi-language country switch, the #toc table-of-contents, or any of the kewl #sub-tagging features, you'd notice how much work went into keeping it simple yet powerful.

If it isn't a bandwidth problem, can you provide some of the queries to wikipedia that are causing the issue? Maybe I can replicate it, and solve it.
QQleQ
Voice
Posts: 14
Joined: Mon Nov 20, 2006 10:05 pm

Post by QQleQ »

I found out that the problem might occur when the user uses control codes in their query.

for instance:

!wiki amster1dam
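One way to guard against that would be sanitizing the query before the lookup. The helper below is a hypothetical sketch (not the script's code): it strips mIRC control codes so a query with an embedded color code still reaches wikipedia as plain text.

```tcl
# Hypothetical input sanitizer: strip mIRC control codes (color plus its
# optional digits, bold, plain, reverse, underline, CTCP \001) from a
# query before building the wiki request, so embedded codes can't break it.
proc clean_query {input} {
    regsub -all {\003(\d{1,2}(,\d{1,2})?)?} $input "" input
    regsub -all {[\001\002\017\026\037]} $input "" input
    return [string trim $input]
}
```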
speechles
Revered One
Posts: 1398
Joined: Sat Aug 26, 2006 10:19 pm
Location: emerald triangle, california (coastal redwoods)

Post by speechles »

Merry Christmas..below is your gift, finally.. heh

Code: Select all

    # amount of lines you want your wiki* results to span, the more lines the more
    # of the wiki article or section you will see, some get cut short if so raise this.
    # this affects both wikipedia and wikimedia results.
    #
    variable wiki_lines 2

    ...snipped older parts not relevant...

    # enable encoding conversion, set this to 1 to enable.
    # with this enabled it will follow the format of encoding conversions listed
    # below. these will affect both input and output and will follow country switch.
    #
    variable encoding_conversion_input 1
    variable encoding_conversion_output 1

    # encoding conversion lookups
    # here is where you can correct language encoding problems by pointing their
    # abbreviation towards an encoding. if you want more, feel free to add more.
    # this is a somewhat poor example below, there are possibly hundreds of additions
    # that need to be added to this section. enjoy and merry christmas ;P
    #
    variable encode_strings {
      en:utf-8
      com:utf-8
      sr:iso8859-5
      ru:cp1251
      ar:iso8859-6
    }
As you can see, some new features have found their way into the script. Mainly, for those times you're finding wikipedia cutting everything off entirely too short, you can now adjust it and give it more lines.

Also, the encoding feature desired the most is now up for beta test. You can adjust it using the two switches:
- encoding_conversion_input - set to 1 causes input to be converted from that encoding
- encoding_conversion_output - set to 1 causes output to be converted to that encoding
Setting both, of course, will do both. This will only work if the country code you use is defined in the encode_strings table directly below it. This is the big ol' table I've mentioned needing before, which atm isn't very big at all. It will require several additions, but for now feel free to add to this table and help test this script. Consider this to be v1.9.7, finally..

Get the new script HERE (v1.9.7) or at the v1.9.7 link on the very first page of this thread.
Keep in mind the encodings are beta. I converse only in English, so there are bound to be some unforeseen problems, such as regexp/regsub fixes and slight realignment of the input/output encoding sections. If you want this script to work correctly with your country's languages, it's up to you to provide some input, helpful feedback, and participation in this thread so accommodations can be made in the script to add your country correctly. Otherwise that language will "never" work corre... *trails off in mid-sentence, gets up, and walks off* :wink:
Last edited by speechles on Sun Dec 16, 2007 7:30 pm, edited 9 times in total.
speechles
Revered One
Posts: 1398
Joined: Sat Aug 26, 2006 10:19 pm
Location: emerald triangle, california (coastal redwoods)

Post by speechles »

For those curious, below are the segments of code for the input/output encoding conversions I've used.

Code: Select all

      # this is my input encoding hack, this will convert input before it goes
      # out to be queried.
      if {$incith::google::encoding_conversion_input > 0 && $country != ""} {
        set encoding_found [lindex [split [lindex $incith::google::encode_strings [lsearch -glob $incith::google::encode_strings "$country:*"]] :] 1]
        if {$encoding_found != "" && [lsearch -exact [encoding names] $encoding_found] != -1} {
          set input [encoding convertfrom $encoding_found $input]
        }
      }
      
      ...snipped out query building/fetch html routines here...
           
      # this is the output encoding hack.
      if {$incith::google::encoding_conversion_output > 0} {
        set encoding_found [lindex [split [lindex $incith::google::encode_strings [lsearch -glob $incith::google::encode_strings "$country:*"]] :] 1]
        if {$encoding_found != "" && [lsearch -exact [encoding names] $encoding_found] != -1} {
          set input [encoding convertto $encoding_found $input]
        }
      }
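Isolated for clarity, the encode_strings lookup used in both snippets above works like this (a standalone sketch; the helper name is hypothetical): find the "country:encoding" entry by glob match, split on ":", and take the encoding name. An unknown country yields an empty string, so the conversion is simply skipped.

```tcl
# Standalone sketch of the encode_strings lookup: lsearch -glob finds the
# "country:encoding" entry, split/lindex pull out the Tcl encoding name.
# A country with no entry returns "", which disables the conversion.
set encode_strings {
    en:utf-8
    sr:iso8859-5
    ru:cp1251
}

proc lookup_encoding {table country} {
    set entry [lindex $table [lsearch -glob $table "$country:*"]]
    return [lindex [split $entry :] 1]
}
```

For example, lookup_encoding $encode_strings ru picks out cp1251, exactly the value fed to encoding convertfrom/convertto in the hacks above.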
Last edited by speechles on Thu Dec 20, 2007 12:29 am, edited 2 times in total.
Zircon
Op
Posts: 191
Joined: Mon Aug 21, 2006 4:22 am
Location: Montreal

Post by Zircon »

Hi there

Thanks for this very nice script. I added fr:iso8859-1 for French encoding. When I type !wiki montréal, it gives the desired result, but with this link at the end: http://fr.wikipedia.org/wiki/Montr%C3%A9al

Are we supposed to have these characters in the link?

By the way, !wiki montreal gives http://fr.wikipedia.org/wiki/Montreal