Spidering a directory - timeout after 10 documents
Hi, here is my config:

+ PhpDig 1.8.0
+ PHP Version 4.2.3
+ safe_mode: off
+ MySQL 3.23.49
+ Server API: CGI
+ System: Linux infong 2.4.21 #1 SMP Wed Jul 30 09:58:54 CEST 2003 i686 unknown
+ Directories admin/temp, includes & text_content permissions set to 777.
+ Database holds the following tables: engine, excludes, keywords, logs, sites, spider & tempspider.
+ Not behind any firewall.

config.php:

define('SPIDER_MAX_LIMIT',300);     //max recurse levels in spider
define('SPIDER_DEFAULT_LIMIT',300); //default value
define('RESPIDER_LIMIT',300);       //recurse limit for update
define('PHPDIG_DEFAULT_INDEX',true);

I am trying to spider a directory called "glossar" with 24 documents, starting the spider with a search depth of 1. After about one minute the process stops and only 10 documents have been spidered. What can I do?

Thanks for your support.
tams (Hamburg, Germany)
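One thing that may be worth checking (a guess, not something the post confirms): under the CGI Server API, PHP's max_execution_time often defaults to 30 seconds, and a spider run that dies after roughly a minute with only 10 of 24 documents indexed looks more like the script hitting PHP's time limit than like any of the SPIDER_* settings, which are recursion limits rather than timeouts. A minimal sketch of how the limit could be lifted for the spidering script, assuming the host allows it (both calls are standard PHP, and set_time_limit works here because safe_mode is off):

<?php
// Sketch only: lift PHP's per-script execution time limit so a long
// spider run is not aborted part-way through. Whether your host or
// CGI setup honours these overrides is an assumption.
ini_set('max_execution_time', 0);  // 0 = no limit
set_time_limit(0);                 // same effect, set per script
?>

Alternatively the same value can be raised in php.ini or, on Apache, via php_value max_execution_time in .htaccess, if the host permits that.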
It sounds like your settings are all correct. I found that the first time I spidered my site, the process seemed to hang like that. However, I cleared all the tables and started over, and the second time the spidering worked fine.
Try doing that, and let us know how it goes. :)
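If you want to clear the tables by hand rather than through the admin pages, something along these lines would empty the tables named in your post before a fresh run. This is only a sketch for PHP 4 with the old mysql_* functions; the host, user, password and database name are placeholders to be replaced with the values from your own PhpDig configuration.

<?php
// Sketch only: empty the PhpDig tables so the next spider run starts clean.
// Credentials and database name below are placeholders, not PhpDig defaults.
$db = mysql_connect('localhost', 'db_user', 'db_pass');
mysql_select_db('phpdig_db', $db);

$tables = array('engine', 'excludes', 'keywords', 'logs',
                'sites', 'spider', 'tempspider');
foreach ($tables as $table) {
    mysql_query("DELETE FROM $table", $db);
}
mysql_close($db);
?>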
I respidered and everything is OK now.
Thank you. t.