09-30-2003, 01:25 PM | #1
Green Mole
Join Date: Sep 2003
Posts: 10
Using a text file for URLs
My configuration:
W2K, Apache 1.3, PHP

How do I force spider.php to use a text file containing a list of websites to crawl? Does the list need to be in a specific format? For example: http:www.mywebpage.com ???

Thank you
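For reference, a minimal sketch of one way to do this, assuming your copy of spider.php can be run from the shell and accepts either a URL or a path to a URL list as its argument (check the README for your PhpDig version; the file name urls.txt and the paths below are just placeholders):

List one full URL per line in a plain text file, including the http:// prefix:

http://www.mywebpage.com/
http://www.anothersite.com/

Then, from the phpdig admin directory, pass that file to spider.php on the command line:

cd /path/to/phpdig/admin
php -f spider.php /path/to/urls.txt

If your version of spider.php only accepts a single URL as its argument, a small PHP wrapper that reads the text file and calls the spider once per URL is a workable fallback:

<?php
// Fallback sketch: feed spider.php one URL at a time from a text file.
// The file name and paths are assumptions -- adjust them to your install.
$lines = file('/path/to/urls.txt');
foreach ($lines as $line) {
    $url = trim($line);
    if ($url == '') { continue; }  // skip blank lines
    // escapeshellarg() keeps odd characters from breaking the shell call
    passthru('php -f spider.php ' . escapeshellarg($url));
}
?>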
10-01-2003, 12:53 AM | #2
Purple Mole
Join Date: Sep 2003
Location: Kassel, Germany
Posts: 119
__________________
-Roland- :: Test PhpDig 1.6.2 here :: - :: Test-Search for (little) Intelligent Php-Dig Fuzzy ::
Similar Threads
Thread | Thread Starter | Forum | Replies | Last Post |
SEF URL's? | raustin | Mod Requests | 0 | 07-22-2008 11:39 AM |
QUESTION: How-to Spider Multiple URL's, not just one at a time. | 2wheelin | How-to Forum | 4 | 06-13-2004 11:42 PM |
Spidering multiple URL's | 2wheelin | Mod Requests | 0 | 05-22-2004 06:51 PM |
Problem with indexing from text file | bloodjelly | Troubleshooting | 9 | 04-19-2004 04:56 PM |
indexing from command line with text file | Wayne McBryde | Troubleshooting | 8 | 01-12-2004 06:56 PM |