chrometweaks.org

What FREE web page hosting service would you advise me to use for my personal webpage?



My first question: what FREE web page hosting service would you advise me to use for my personal webpage? Thanks in advance for any answers. My second question concerns this new, updated copy of spiders.txt for includes/spiders.txt.

Just copy the text and paste it into your file. I searched for over three hours because I was getting osCsid in some search engine results, and compiled a list of only the active spiders. Some of them are new, as a couple of new versions have appeared recently.

Hope it helps.

Dated: 7/19/2004.

$Id: spiders.txt,v 1.2 2003/05/05 17:58:17 dgw_ Exp $.

Almaden.ibm.com

Appie 1.1

Arachnoidea

Architext

ArchitextSpider

Ask jeeves

Asterias2.0

Augurfind

BackRub/2.1



Steeler/1.3

Szukacz

T-h-u-n-d-e-r-s-t-o-n-e

Teoma

Teoma_agent1

UK Searcher Spider

WebCrawler

Winona

ZyBorg

Turnitinbot

Ultraseek

Vagabondo

Voilabot

W3c_validator

Zao/0

Zyborg/1.0

This post has been edited by 211655: 20 July 2004, 22:26.

Comments

I would like to know the answer too. Does anyone here know the answer to that question? I'll do some investigation and get back to you if I discover a good answer. You should email the people at iPage, as they could probably assist you.

Comment #1

Here's a somewhat more inclusive file. Steve, does the case really matter? In application_top.php, all the case is converted.

$Id: spiders.txt,v 1.2 2003/05/05 17:58:17 dgw_ Exp $.

Acoon.

Ah-ha.com.

Ahoy.

Almaden.ibm.com.

Altavista-intranet.

Ananzi.

Anthill.

Appie 1.1.

Arachnoidea.

Arachnophilia.

Arale.

Araneo.

Aranha.

Architext.

ArchitextSpider.

Aretha.

Arks.

Ask jeeves.

Asterias2.0.

Atn worldwide.

Atomz.

Augurfind.

Backrub.



Slurp/2.0.

Slurp/2.0j.

Slurp/3.0.

Slurp/si.

Snooper.

Speedfind.

Steeler/1.3.

Suke.

Suntek.

Supersnooper.

Surfnomore.

Sven.

Szukacz.

T-h-u-n-d-e-r-s-t-o-n-e.

T-rex.

Tach black widow.

Tarantula.

Tcl w3.

Templeton.

Teoma.

Teoma_agent1.

Teomaagent.

Teomatechnologies.

The peregrinator.

The world wide web wanderer.

The world wide web worm.

Titan.

Titin.

Tkwww.

Toutatis.

Turnitinbot.

Ucsd.

Udmsearch.

UK Searcher Spider.

Ultraseek.

Url check.

Vagabondo.

Valkyrie.

Verticrawl.

Victoria.

Vision-search.

Voilabot.

Voyager.

Vscooter/2.0 G.R.A.B. V1.1.0.

W3c_validator.

W3m2.

W3mir.

Walhello appie.

Wallpaper.

Web core / roots.

Web hopper.

Web wombat.

Webbandit.

Webcatcher.

Webcopy.

WebCrawler.

Webfoot.

Weblayers.

Weblinker.

Weblog monitor.

Webmirror.

Webquest.

Webreaper.

Webs.

Websnarf.

Webstolperer.

Webvac.

Webwalk.

Webwalker.

Webwatch.

Webwombat.

Webzinger.

Wget.

Whatuseek winona.

Whizbang.

Whowhere.

Wild ferret.

Winona.

Wire.

Wired digital.

Wwwc.

Xget.

Xift.

Yahoo.

Yahoo! Slurp.

YahooSeeker/1.1.

Yandex.

Zao/0.

Zyborg.

Zyborg/1.0.

Jack.

Comment #2

Yesterday I found this file on another forum.

Is this file ready to use, or is it just a list of possible spiders?

$Id: spiders.txt,v 1.2 2003/05/05 17:58:17 dgw_ Exp $.

'antibot',.

'appie',.

'architext',.

'bjaaland',.

'digout4u',.

'echo',.

'fast\-webcrawler',.

'ferret',.

'googlebot',.

'gulliver',.

'harvest',.

'htdig',.

'ia_archiver',.

'jeeves',.

'jennybot',.

'linkwalker',.

'lycos_',.

'mercator',.

'moget',.

'muscatferret',.

'myweb',.

'netcraft',.

'nomad',.

'petersnews',.

'scooter',.

'slurp',.

'unlost_web_crawler',.

'voila',.

'^voyager\/',  # Add ^ and \/ to avoid to exclude voyager and amigavoyager browser.

'webbase',.

'weblayers',.

'wisenutbot'.

);.

@RobotsSearchIDOrder_list2 = (.

'acme\.spider',.

'ahoythehomepagefinder',.

'alkaline',.

'arachnophilia',.

'aretha',.

'ariadne',.

'arks',.

'aspider',.

'atn\.txt',.

'atomz',.

'auresys',.

'backrub',.

'bigbrother',.

'blackwidow',.

'blindekuh',.

'bloodhound',.

'brightnet',.

'bspider',.

'cactvschemistryspider',.

'calif[^r]',.

'cassandra',.

'cgireader',.

'checkbot',.

'churl',.

'cmc',.

'collective',.

'combine',.

'conceptbot',.

'coolbot',.

'core',.

'cosmos',.

'cruiser',.

'cusco',.

'cyberspyder',.

'deweb',.

'dienstspider',.

'digger',.

'diibot',.

'directhit',.

'dnabot',.

'download_express',.

'dragonbot',.

'dwcp',.

'e\-collector',.

'ebiness',.

'eit',.

'elfinbot',.

'emacs',.

'emcspider',.

'esther',.

'evliyacelebi',.

'nzexplorer',.

'fdse',.

'felix',.

'fetchrover',.

'fido',.

'finnish',.

'fireball',.

'[^a]fish',.

'fouineur',.

'francoroute',.

'freecrawl',.

'funnelweb',.

'gama',.

'gazz',.

'gcreep',.

'getbot',.

'geturl',.

'golem',.

'grapnel',.

'griffon',.

'gromit',.

'hambot',.

'havindex',.

'hometown',.

'htmlgobble',.

'hyperdecontextualizer',.

'iajabot',.

'ibm',.

'iconoclast',.

'ilse',.

'imagelock',.

'incywincy',.

'informant',.

'infoseek',.

'infoseeksidewinder',.

'infospider',.

'inspectorwww',.

'intelliagent',.

'irobot',.

'iron33',.

'israelisearch',.

'javabee',.

'jbot',.

'jcrawler',.

'jobo',.

'jobot',.

'joebot',.

'jubii',.

'jumpstation',.

'katipo',.

'kdd',.

'kilroy',.

'ko_yappo_robot',.

'labelgrabber\.txt',.

'larbin',.

'legs',.

'linkidator',.

'linkscan',.

'lockon',.

'logo_gif',.

'macworm',.

'magpie',.

'marvin',.

'mattie',.

'mediafox',.

'merzscope',.

'meshexplorer',.

'mindcrawler',.

'momspider',.

'monster',.

'motor',.

'mwdsearch',.

'netcarta',.

'netmechanic',.

'netscoop',.

'newscan\-online',.

'nhse',.

'northstar',.

'occam',.

'octopus',.

'openfind',.

'orb_search',.

'packrat',.

'pageboy',.

'parasite',.

'patric',.

'pegasus',.

'perignator',.

'perlcrawler',.

'phantom',.

'piltdownman',.

'pimptrain',.

'pioneer',.

'pitkow',.

'pjspider',.

'pka',.

'plumtreewebaccessor',.

'poppi',.

'portalb',.

'puu',.

'python',.

'raven',.

'rbse',.

'resumerobot',.

'rhcs',.

'road_runner',.

'robbie',.

'robi',.

'robofox',.

'robozilla',.

'roverbot',.

'rules',.

'safetynetrobot',.

'search_au',.

'searchprocess',.

'senrigan',.

'sgscout',.

'shaggy',.

'shaihulud',.

'sift',.

'simbot',.

'site\-valet',.

'sitegrabber',.

'sitetech',.

'slcrawler',.

'smartspider',.

'snooper',.

'solbot',.

'spanner',.

'speedy',.

'spider_monkey',.

'spiderbot',.

'spiderline',.

'spiderman',.

'spiderview',.

'spry',.

'ssearcher',.

'suke',.

'suntek',.

'sven',.

'tach_bw',.

'tarantula',.

'tarspider',.

'techbot',.

'templeton',.

'teoma_agent1',.

'titin',.

'titan',.

'tkwww',.

'tlspider',.

'ucsd',.

'udmsearch',.

'urlck',.

'valkyrie',.

'verticrawl',.

'victoria',.

'visionsearch',.

'vwbot',.

'w3index',.

'w3m2',.

'wallpaper',.

'wanderer',.

'wapspider',.

'webbandit',.

'webcatcher',.

'webcopy',.

'webfetcher',.

'webfoot',.

'weblinker',.

'webmirror',.

'webmoose',.

'webquest',.

'webreader',.

'webreaper',.

'websnarf',.

'webspider',.

'webvac',.

'webwalk',.

'webwalker',.

'webwatch',.

'whatuseek',.

'whowhere',.

'wired\-digital',.

'wmir',.

'wolp',.

'wombat',.

'worm',.

'wwwc',.

'wz101',.

'xget',.

# Other robots reported by users.

'aport',.

'awbot',.

'baiduspider',.

'bobby',.

'boris',.

'bumblebee',.

'cscrawler',.

'daviesbot',.

'exactseek',.

'ezresult',.

'gigabot',.

'gnodspider',.

'henrythemiragorobot',.

'internetseer',.

'justview',.

'linkbot',.

'metager\-linkchecker', # Must be before linkchecker.

'linkchecker',.

'microsoft_url_control',.

'msiecrawler',.

'msnbot',.

'nagios',.

'nederland\.zoek',.

'perman',.

'pompos',.

'psbot',.

'rambler',.

'redalert',.

'shoutcast',.

'slysearch',.

'surveybot',.

'turnitinbot',.

'turtlescanner',  # Must be before turtle.

'turtle',.

'ultraseek',.

'webclipping\.com',.

'webcompass',.

'yandex',.

'zealbot',.

'zyborg',.

'robot',.

'crawl',.

'spider'.

);.

# RobotsHashIDLib.

'acme\.spider','Acme.Spider',.

'ahoythehomepagefinder','Ahoy! The Homepage Finder',.

'alkaline','Alkaline',.

'appie','Walhello appie',.

'arachnophilia','Arachnophilia',.

'architext','ArchitextSpider',.

'aretha','Aretha',.

'ariadne','ARIADNE',.

'arks','arks',.

'aspider','ASpider (Associative Spider)',.

'atn\.txt','ATN Worldwide',.

'atomz','Atomz.com Search Robot',.

'auresys','AURESYS',.

'backrub','BackRub',.

'bigbrother','Big Brother',.

'bjaaland','Bjaaland',.

'blackwidow','BlackWidow',.

'blindekuh','Die Blinde Kuh',.

'bloodhound','Bloodhound',.

'brightnet','bright.net caching robot',.

'bspider','BSpider',.

'cactvschemistryspider','CACTVS Chemistry Spider',.

'calif[^r]','Calif',.

'cassandra','Cassandra',.

'cgireader','Digimarc Marcspider/CGI',.

'checkbot','Checkbot',.

'churl','churl',.

'cmc','CMC/0.01',.

'collective','Collective',.

'combine','Combine System',.

'conceptbot','Conceptbot',.

'coolbot','CoolBot',.

'core','Web Core / Roots',.

'cosmos','XYLEME Robot',.

'cruiser','Internet Cruiser Robot',.

'cusco','Cusco',.

'cyberspyder','CyberSpyder Link Test',.

'deweb','DeWeb(c) Katalog/Index',.

'dienstspider','DienstSpider',.

'digger','Digger',.

'diibot','Digital Integrity Robot',.

'directhit','Direct Hit Grabber',.

'dnabot','DNAbot',.

'download_express','DownLoad Express',.

'dragonbot','DragonBot',.

'dwcp','DWCP (Dridus\' Web Cataloging Project)',.

'e\-collector','e-collector',.

'ebiness','EbiNess',.

'eit','EIT Link Verifier Robot',.

'elfinbot','ELFINBOT',.

'emacs','Emacs-w3 Search Engine',.

'emcspider','ananzi',.

'esther','Esther',.

'evliyacelebi','Evliya Celebi',.

'nzexplorer','nzexplorer',.

'fdse','Fluid Dynamics Search Engine robot',.

'felix','Felix IDE',.

'ferret','Wild Ferret Web Hopper #1, #2, #3',.

'fetchrover','FetchRover',.

'fido','fido',.

'finnish','Hmhkki',.

'fireball','KIT-Fireball',.

'[^a]fish','Fish search',.

'fouineur','Fouineur',.

'francoroute','Robot Francoroute',.

'freecrawl','Freecrawl',.

'funnelweb','FunnelWeb',.

'gama','gammaSpider, FocusedCrawler',.

'gazz','gazz',.

'gcreep','GCreep',.

'getbot','GetBot',.

'geturl','GetURL',.

'golem','Golem',.

'googlebot','Googlebot (Google)',.

'grapnel','Grapnel/0.01 Experiment',.

'griffon','Griffon',.

'gromit','Gromit',.

'gulliver','Northern Light Gulliver',.

'hambot','HamBot',.

'harvest','Harvest',.

'havindex','havIndex',.

'hometown','Hometown Spider Pro',.

'htdig','ht://Dig',.

'htmlgobble','HTMLgobble',.

'hyperdecontextualizer','Hyper-Decontextualizer',.

'iajabot','iajaBot',.

'ibm','IBM_Planetwide',.

'iconoclast','Popular Iconoclast',.

'ilse','Ingrid',.

'imagelock','Imagelock',.

'incywincy','IncyWincy',.

'informant','Informant',.

'infoseek','InfoSeek Robot 1.0',.

'infoseeksidewinder','Infoseek Sidewinder',.

'infospider','InfoSpiders',.

'inspectorwww','Inspector Web',.

'intelliagent','IntelliAgent',.

'irobot','I, Robot',.

'iron33','Iron33',.

'israelisearch','Israeli-search',.

'javabee','JavaBee',.

'jbot','JBot Java Web Robot',.

'jcrawler','JCrawler',.

'jeeves','Jeeves',.

'jobo','JoBo Java Web Robot',.

'jobot','Jobot',.

'joebot','JoeBot',.

'jubii','The Jubii Indexing Robot',.

'jumpstation','JumpStation',.

'katipo','Katipo',.

'kdd','KDD-Explorer',.

'kilroy','Kilroy',.

'ko_yappo_robot','KO_Yappo_Robot',.

'labelgrabber\.txt','LabelGrabber',.

'larbin','larbin',.

'legs','legs',.

'linkidator','Link Validator',.

'linkscan','LinkScan',.

'linkwalker','LinkWalker',.

'lockon','Lockon',.

'logo_gif','logo.gif Crawler',.

'lycos_','Lycos',.

'macworm','Mac WWWWorm',.

'magpie','Magpie',.

'marvin','marvin/infoseek',.

'mattie','Mattie',.

'mediafox','MediaFox',.

'merzscope','MerzScope',.

'meshexplorer','NEC-MeshExplorer',.

'mindcrawler','MindCrawler',.

'moget','moget',.

'momspider','MOMspider',.

'monster','Monster',.

'motor','Motor',.

'muscatferret','Muscat Ferret',.

'mwdsearch','Mwd.Search',.

'myweb','Internet Shinchakubin',.

'nagios','Nagios monitoring checker',.

'netcarta','NetCarta WebMap Engine',.

'netcraft','Netcraft Web Server Survey',.

'netmechanic','NetMechanic',.

'netscoop','NetScoop',.

'newscan\-online','newscan-online',.

'nhse','NHSE Web Forager',.

'nomad','Nomad',.

'northstar','The NorthStar Robot',.

'occam','Occam',.

'octopus','HKU WWW Octopus',.

'openfind','Openfind data gatherer',.

'orb_search','Orb Search',.

'packrat','Pack Rat',.

'pageboy','PageBoy',.

'parasite','ParaSite',.

'patric','Patric',.

'pegasus','pegasus',.

'perignator','The Peregrinator',.

'perlcrawler','PerlCrawler 1.0',.

'phantom','Phantom',.

'piltdownman','PiltdownMan',.

'pimptrain','Pimptrain.com\'s robot',.

'pioneer','Pioneer',.

'pitkow','html_analyzer',.

'pjspider','Portal Juice Spider',.

'pka','PGP Key Agent',.

'plumtreewebaccessor','PlumtreeWebAccessor',.

'poppi','Poppi',.

'portalb','PortalB Spider',.

'puu','GetterroboPlus Puu',.

'python','The Python Robot',.

'raven','Raven Search',.

'rbse','RBSE Spider',.

'resumerobot','Resume Robot',.

'rhcs','RoadHouse Crawling System',.

'road_runner','Road Runner: The ImageScape Robot',.

'robbie','Robbie the Robot',.

'robi','ComputingSite Robi/1.0',.

'robofox','RoboFox',.

'robozilla','Robozilla',.

'roverbot','Roverbot',.

'rules','RuLeS',.

'safetynetrobot','SafetyNet Robot',.

'scooter','Scooter (AltaVista)',.

'search_au','Search.Aus-AU.COM',.

'searchprocess','SearchProcess',.

'senrigan','Senrigan',.

'sgscout','SG-Scout',.

'shaggy','ShagSeeker',.

'shaihulud','Shai\'Hulud',.

'sift','Sift',.

'simbot','Simmany Robot Ver1.0',.

'site\-valet','Site Valet',.

'sitegrabber','Open Text Index Robot',.

'sitetech','SiteTech-Rover',.

'slcrawler','SLCrawler',.

'slurp','Inktomi Slurp',.

'smartspider','Smart Spider',.

'snooper','Snooper',.

'solbot','Solbot',.

'spanner','Spanner',.

'speedy','Speedy Spider',.

'spider_monkey','spider_monkey',.

'spiderbot','SpiderBot',.

'spiderline','Spiderline Crawler',.

'spiderman','SpiderMan',.

'spiderview','SpiderView(tm)',.

'spry','Spry Wizard Robot',.

'ssearcher','Site Searcher',.

'suke','Suke',.

'suntek','suntek search engine',.

'sven','Sven',.

'tach_bw','TACH Black Widow',.

'tarantula','Tarantula',.

'tarspider','tarspider',.

'techbot','TechBOT',.

'templeton','Templeton',.

'teoma_agent1','TeomaTechnologies',.

'titin','TitIn',.

'titan','TITAN',.

'tkwww','The TkWWW Robot',.

'tlspider','TLSpider',.

'ucsd','UCSD Crawl',.

'udmsearch','UdmSearch',.

'urlck','URL Check',.

'valkyrie','Valkyrie',.

'verticrawl','Verticrawl',.

'victoria','Victoria',.

'visionsearch','vision-search',.

'^voyager\/','Voyager',.

'vwbot','VWbot',.

'w3index','The NWI Robot',.

'w3m2','W3M2',.

'wallpaper','WallPaper',.

'wanderer','the World Wide Web Wanderer',.

'wapspider','',.

'wisenutbot','WISENutbot (Looksmart)',.

'yandex', 'Yandex bot',.

'zealbot','ZealBot',.

'zyborg','Zyborg (Looksmart)',..

Comment #3

No, it is not ready for use. It looks like it may have been taken from a section of code. You would need to strip out everything except the names.

Jack.

Comment #4

Maybe a stupid question, but to a newbie like me it isn't.

What is this file for, please?

What does it allow/deny?

Thanks.

Comment #5

Qihun, that list is from AWStats, a great open-source web log analyzer.

Yet another spider I just discovered is 'nutch'.

'seeker', which I have in my list, is a new Yahoo shopping bot that is sort of their version of Froogle. The seeker spider has some bugs - it can't handle dynamic queries. It doesn't seem to be too active yet.

Jack_mcs, application_top downcases the user agent string, but NOT the strings it reads from spiders.txt. That's why any uppercase in spiders.txt will result in a non-match.

Keith, 2.2-MS2 has a "Prevent Spider Sessions" feature in Admin under Sessions. The idea is to stop search engine spiders from starting a session, so that links they save don't include the osCsid and they don't start adding everything in your store to a cart!

The way it works is this - every web connection includes a "user agent" string. MOST spiders have a meaningful and somewhat unique string. If you have the feature enabled, application_top reads spiders.txt into an array and, for each string in the list, looks to see whether that string is found in a downcased copy of the user agent.

For example, Googlebot sends this user agent:

"Googlebot/2.1 (+http://www.google.com/bot.html)"

Spiders.txt contains the string 'googlebot', which will be found in the user agent. If found, a session won't be started and Google won't save URLs with session IDs.

Because this is a substring search, if spiders.txt includes 'spider', then any user agent containing 'spider' (in any case) will match, so you don't need more than one string containing 'spider'. You do want to be careful not to be too general, as some browsers might have user agents that could be confused with a spider.
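The matching just described can be sketched like this (a Python illustration of the logic only - osCommerce does this in PHP inside application_top.php, and the function names here are invented for the example):

```python
# Sketch of the spiders.txt check: lowercase the user agent, then look
# for each spiders.txt entry as a plain substring.

def is_spider(user_agent, spider_entries):
    """True if any spiders.txt entry occurs in the lowercased user agent."""
    ua = user_agent.lower()
    return any(entry in ua for entry in spider_entries if entry)

# Entries as if read from spiders.txt. Note they must already be
# lowercase, because only the user agent is downcased.
entries = ["googlebot", "slurp", "spider"]

print(is_spider("Googlebot/2.1 (+http://www.google.com/bot.html)", entries))  # True
print(is_spider("Mozilla/5.0 (Windows NT 10.0; Win64; x64)", entries))        # False
```

When the check succeeds, the store skips starting a session, so the spider never sees an osCsid in any URL it saves.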

Comment #6

@Keith

It prevents search engines from creating sessions and getting session IDs in the URL.

@Jack

Spiders.txt should be all lowercase, with no duplicates. Once 'scooter' is found the loop exits, but if it was not Scooter, this will check five extra times for something you already know is not there.

Scooter   <- loop exits here



Scooter/1.0

Scooter/2.0 G.R.A.B. V1.1.0

Scooter/2.0 G.R.A.B. V1.1.0

Scooter/2.0 G.R.A.B. X2.0
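The early-exit behavior being described can be sketched as follows (Python for illustration; the real loop is PHP in application_top.php, and `first_match` is an invented name):

```python
# Why duplicate entries only cost time: the scan stops at the first
# matching entry, so extra 'scooter' variants are either never reached
# (when Scooter visits) or are wasted comparisons (for everyone else).

def first_match(user_agent, spider_entries):
    """Index of the first spiders.txt entry found in the user agent, or -1."""
    ua = user_agent.lower()
    for i, entry in enumerate(spider_entries):
        if entry in ua:
            return i  # loop exits here
    return -1

entries = ["scooter", "scooter/1.0", "scooter/2.0 g.r.a.b. v1.1.0"]
print(first_match("Scooter/2.0 G.R.A.B. X2.0", entries))  # 0 - later variants never checked
print(first_match("Mozilla/5.0", entries))                # -1 - after 3 comparisons, 2 redundant
```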

Comment #7

Ahh, sure enough. I looked at the code again and see I didn't go far enough. My entries are always converted to lowercase in my spiders file, but when you find one on the web all of the entries usually begin with uppercase, leading one to assume it doesn't matter. Thanks for pointing that out.

Jack.

Comment #8

Got my WebProNews ezine today. They have a link to a spider list. It can be found at:

http://www.psychedelix.com/agents.html

Regards,.

Donna Gordon.

UKGoods.com.

Comment #9

Keep in mind that every line you add to spiders.txt is more processing that every page load has to go through. Rather than trying to list every spider in existence, my philosophy is to have the list reflect the spiders likely to visit my site (and which I might care about). I watch the logs to see if there is any unwanted activity.

I think there is still some redundancy in the current list - for example, "seventwentyfour" is actually "linkwalker".

Comment #10


Comment #11

I think you should take a few minutes and study the code that reads spiders.txt. You will find that if a string in that file is not all lowercase, it will never match. Also, since the code looks for the key as any substring of the ident field, a lot of these lines are redundant.

Here is what I am using at the moment, after studying my logs. I have eliminated redundant lines and improved the chances of matches where version numbers are used. Many of these spiders I never see in the logs, but I left them in.

almaden.ibm.com

appie 1.1

architext

asterias

atomz

augurfind

bannana_bot

bdcindexer

crawl

docomo

frooglebot

gaisbot

geobot

gigabot

googlebot

grub

gulliver

henrythemiragorobot

ia_archiver

iconsurf

infoseek

kit_fireball

lachesis

linkwalker

lycos_spider

mantraagent

mercator

msnbot

moget/

muscatferret

naverbot

naverrobot

ncsa beta

netresearchserver

ng/

npbot

obot

osis-project

polybot

pompos

psbot

scooter

seeker

seventwentyfour

sidewinder

slurp

spider

steeler/

szukacz

t-h-u-n-d-e-r-s-t-o-n-e

teoma

turnitinbot

tutorgig

ultraseek

vagabondo

voilabot

w3c_validator

websitepulse

zao/

zyborg
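The redundancy elimination described above (lowercase everything, drop duplicates, and drop any entry that already contains a shorter entry as a substring, since the shorter one matches first anyway) can be sketched like this. `prune` is a hypothetical helper for cleaning a spiders list, not part of osCommerce:

```python
# Hypothetical clean-up pass for a spiders.txt list: lowercase entries,
# drop duplicates, and drop any entry that contains a shorter kept
# entry as a substring.

def prune(entries):
    # Sort unique lowercased entries shortest-first, so substrings are
    # kept before the longer variants they would make redundant.
    unique = sorted({e.strip().lower() for e in entries if e.strip()}, key=len)
    kept = []
    for entry in unique:
        if not any(shorter in entry for shorter in kept):
            kept.append(entry)
    return kept

raw = ["Scooter", "Scooter/1.0", "Spider", "spider_monkey", "scooter"]
print(prune(raw))  # ['spider', 'scooter'] - version-numbered variants are redundant
```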

Comment #12


This question was taken from a support group/message board and re-posted here so others can learn from it.