I am going to run some scripts on my server that index websites' index pages. I want to run about 10 of these scripts at once, around the clock, and each script will fetch only a site's index page, not any other page. How much server load will these scripts generate? Will a single server be enough, or will I need separate servers?
Do you have a dedicated server, or are you on shared hosting? There's a good chance these scripts could cause a steadily high server load.
Depends on how your program is written and how much crawling you want it to do. BTW, using curl for this kind of job tends to put less load on the server.
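As a rough illustration of what that could look like, here's a minimal sketch assuming Python with the pycurl binding; the URL and timeout values are placeholders, not recommendations:

```python
# Minimal sketch: fetch only a site's index page with pycurl.
# Assumes `pip install pycurl`; URL and timeouts are placeholder values.
import pycurl
from io import BytesIO

def fetch_index_page(url):
    buf = BytesIO()
    c = pycurl.Curl()
    c.setopt(c.URL, url)
    c.setopt(c.WRITEDATA, buf)        # collect the response body
    c.setopt(c.FOLLOWLOCATION, True)  # follow redirects to the real index page
    c.setopt(c.CONNECTTIMEOUT, 10)    # don't hang on dead hosts
    c.setopt(c.TIMEOUT, 30)
    try:
        c.perform()
        return buf.getvalue().decode('utf-8', errors='replace')
    finally:
        c.close()

html = fetch_index_page('http://example.com/')
```

Since curl reuses a single connection per fetch and you only pull one page per site, the per-request overhead stays small compared to a full crawler.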
Some time back I used the Fluid Dynamics search engine on a shared hosting account. I had it set to index one site at a time and store all external links to review later, and I never had any problems. 10 scripts at a time might be a much bigger load, though.
If you are only going to access one site at a time with one thread, it is not going to put much load on the server. Then again, without knowing exactly what you are doing, it is hard to say. I suppose you could get one physical server, run your scripts, and see what the load is; if that is not good enough, go with a beefier machine or spread the work across more than one machine.
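If you want actual numbers rather than guesses, you can sample the load average while the scripts run. A minimal sketch, assuming a Unix-like server where Python's os.getloadavg() is available (the interval and output format are arbitrary choices):

```python
# Minimal sketch: log the 1/5/15-minute load averages while the indexers run.
# os.getloadavg() is Unix-only; sample interval is an arbitrary choice.
import os
import time

def watch_load(interval=5):
    while True:
        one_min, five_min, fifteen_min = os.getloadavg()
        print(f"load avg: 1m={one_min:.2f} 5m={five_min:.2f} 15m={fifteen_min:.2f}")
        time.sleep(interval)

if __name__ == '__main__':
    watch_load()
```

Run it alongside your 10 scripts for a day; if the 15-minute average stays well below the number of CPU cores, one box is plenty.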
It will index only one page per second: it fetches a site's index page, stores it in the database, and then moves on to the index page of the next site.
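At that rate the loop itself is trivial. A minimal sketch, assuming Python with SQLite for storage; the site list, database file, and table name are illustrative assumptions, not part of the original setup:

```python
# Minimal sketch: one index page per second, stored in SQLite.
# The sites list, database file, and table name are illustrative assumptions.
import sqlite3
import time
import urllib.request

sites = ['http://example.com/', 'http://example.org/']  # placeholder site list

conn = sqlite3.connect('index_pages.db')
conn.execute('CREATE TABLE IF NOT EXISTS pages (url TEXT PRIMARY KEY, html TEXT)')

for url in sites:
    # Fetch only the index page; nothing else on the site is crawled.
    with urllib.request.urlopen(url, timeout=30) as resp:
        html = resp.read().decode('utf-8', errors='replace')
    conn.execute('INSERT OR REPLACE INTO pages (url, html) VALUES (?, ?)', (url, html))
    conn.commit()
    time.sleep(1)  # throttle to one page per second

conn.close()
```

A loop like this spends almost all of its time waiting on the network and the 1-second sleep, so CPU load should stay very low even with 10 copies running.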