03-23-2007, 07:14 AM | #1
Green Mole
Join Date: Mar 2007
Posts: 2
Crawler speed improvement (also affects the page limit)
I had a problem where phpdigExplore() returned too many duplicate links. This caused the spider to check hundreds of duplicate URLs, which slowed crawling considerably, and the 1000-page limit was reached quite fast.
Finally I added the following code at the end of phpdigExplore(): PHP Code:
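The original snippet did not survive in this archived copy of the post. As a hypothetical reconstruction of the idea described above (deduplicating the link list before phpdigExplore() returns it), a minimal sketch might look like this; the helper name `phpdigDedupeLinks` is an assumption, not part of phpDig:

```php
<?php
// Hypothetical sketch, not the poster's original code.
// Deduplicate the URLs collected by phpdigExplore() so the spider
// does not queue the same link repeatedly and burn through the
// configured page limit on duplicates.
function phpdigDedupeLinks(array $links): array
{
    // array_unique() keeps the first occurrence of each URL;
    // array_values() reindexes the result from 0.
    return array_values(array_unique($links));
}

// Example: three links, one of them a duplicate.
$links = [
    'http://example.com/a',
    'http://example.com/b',
    'http://example.com/a',
];
print_r(phpdigDedupeLinks($links)); // two unique URLs remain
```

In phpDig itself, the equivalent one-liner would run on the link array just before the `return` statement at the end of phpdigExplore().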