06-13-2004, 09:22 PM | #1
Green Mole
Join Date: Apr 2004
Location: Cali
Posts: 10
Why does phpDig rely on creating txt files for descriptions?
I'm sure someone else has probably asked this question (so if there's a link to the answer please forgive me and let me know what it is) but why does PHP dig rely on creating txt files in the text_content folder rather than simply storing the information as a text or medium-text field in the mysql database?
Maybe it's just a no-brainer, but I now have about 10,500 URLs in one phpDig database, and search response time has ground to a near halt. So I was thinking maybe that's part of the problem. If anyone knows the answer to this question, please let me know. Thanks, and sorry for the bother.
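For illustration, here's roughly what I mean (just a sketch of the idea; the phpdig_page_text table and its columns are made up for this example and are not part of phpDig's actual schema):

<?php
// Rough sketch: keep each page's text in a MEDIUMTEXT column instead
// of a file under text_content/. NOTE: phpdig_page_text and its
// columns are invented for illustration; not phpDig's real schema.
mysql_connect('localhost', 'user', 'pass');
mysql_select_db('phpdig');

mysql_query("CREATE TABLE IF NOT EXISTS phpdig_page_text (
                 spider_id INT NOT NULL PRIMARY KEY,
                 page_text MEDIUMTEXT NOT NULL
             )");

// At index time, store the page text in the database...
$id        = 123;                             // the page's spider id
$page_text = 'text extracted from the page';  // placeholder content
$text      = mysql_real_escape_string($page_text);
mysql_query("REPLACE INTO phpdig_page_text (spider_id, page_text)
             VALUES ($id, '$text')");

// ...and at display time, fetch it with the result row instead of
// opening a file in text_content/.
$res = mysql_query("SELECT page_text FROM phpdig_page_text
                    WHERE spider_id = $id");
$row = mysql_fetch_assoc($res);
echo $row['page_text'];
?>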
06-14-2004, 12:09 AM | #2
Green Mole
Join Date: Mar 2004
Posts: 22
The search is not done using files, except for exact match.
Actually, the files are only used for the text snippets shown with each result. By the time phpDig opens a file, the search itself is already done and it knows which file to open. To speed up your search, try this thread.
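To picture it, the flow looks roughly like this (only a simplified sketch; the table names and the spider_id-based file naming are my approximation of phpDig's internals, so check your own install for the exact schema):

<?php
// Simplified sketch of the two-step flow described above.
mysql_connect('localhost', 'user', 'pass');
mysql_select_db('phpdig');

// Step 1: the search itself runs entirely in MySQL against the index
// tables -- no files are touched to find the matching pages.
// (Table/column names here are approximations, not guaranteed exact.)
$res = mysql_query("SELECT DISTINCT e.spider_id
                    FROM phpdig_engine e, phpdig_keywords k
                    WHERE k.keyword = 'example'
                      AND e.key_id = k.key_id");

// Step 2: only for the results actually displayed, open the matching
// text file to cut a short snippet around the hit.
while ($row = mysql_fetch_assoc($res)) {
    $file = 'text_content/' . $row['spider_id'] . '.txt';
    if (is_file($file)) {
        $text = file_get_contents($file);
        echo substr($text, 0, 200), "...\n"; // crude snippet
    }
}
?>

So with 10,500 URLs, the slowdown is more likely in the MySQL side of step 1 than in the file reads of step 2.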
Similar Threads
Thread | Thread Starter | Forum | Replies | Last Post |
indexing *.txt files | skunkone | Troubleshooting | 2 | 11-08-2004 08:03 AM |
PhpDig Ignoring Something in robots.txt | Destroyer X | Troubleshooting | 2 | 06-18-2004 02:57 PM |
robots.txt versus robotsxx.txt | Charter | IPs, SEs, & UAs | 0 | 03-11-2004 07:00 PM |
Searching *.txt files | rafarspd | Troubleshooting | 4 | 12-05-2003 10:01 AM |
phpDig ignores robots.txt | Dragonfly | Troubleshooting | 1 | 09-12-2003 07:54 AM |