
Strip Html

digitaldj
Voice
Posts: 23
Joined: Tue Jan 07, 2003 9:07 am

Strip Html

Post by digitaldj »

Is there a way to strip HTML codes out of the output when using http::geturl?
strikelight
Owner
Posts: 708
Joined: Mon Oct 07, 2002 10:39 am
Contact:

Re: Strip Html

Post by strikelight »

digitaldj wrote: Is there a way to strip HTML codes out of the output when using http::geturl?
Not with http::geturl itself, no. But after you have retrieved the contents of the page, you can use Tcl commands like regsub to do the job for you.

For starters:
regsub -all {<[^>]*>} $data {} data
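For a fuller picture, here is a minimal sketch of how that regsub can follow an http::geturl call (the URL is only a placeholder and error handling is omitted):

package require http

# Fetch the page and pull the body out of the token.
set tok [http::geturl "http://www.example.com/"]
set data [http::data $tok]
http::cleanup $tok

# Strip anything that looks like an HTML tag, as suggested above.
regsub -all {<[^>]*>} $data {} text
puts $text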
digitaldj
Voice
Posts: 23
Joined: Tue Jan 07, 2003 9:07 am

Post by digitaldj »

Ah, nice. Thanks.
ClubCX
Voice
Posts: 35
Joined: Mon Nov 19, 2001 8:00 pm
Location: Bournemouth, UK
Contact:

Post by ClubCX »

Another alternative is to use Lynx to filter out the HTML and format the page. It's not recommended for high-use web scripts, though.
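As a rough illustration of that approach, a Tcl script could shell out to lynx -dump, which renders the page as plain text (this assumes the lynx binary is installed and on the PATH; the URL is only a placeholder):

# -dump prints the rendered page to stdout; -nolist drops the trailing link list.
if {[catch {exec lynx -dump -nolist "http://www.example.com/"} text]} {
    puts "lynx failed: $text"
} else {
    puts $text
}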
ppslim
Revered One
Posts: 3914
Joined: Sun Sep 23, 2001 8:00 pm
Location: Liverpool, England

Post by ppslim »

Lynx uses ncurses, from what I understand. That would make a mess of the output you receive, or it might simply not work.

The regsub method is far quicker.
GodOfSuicide
Master
Posts: 463
Joined: Mon Jun 17, 2002 8:00 pm
Location: Austria

Post by GodOfSuicide »

Look at efnet.tcl from DAWG-TCL.