Hi experts, I just want to ask a question. Say I want to write a program that automatically crawls an entire wiki site and extracts some useful information for me. Is that possible, and which language would be the best choice for it?
Can you guys shed some more light on this for me? The wiki is just an example for illustration purposes. It would be a real relief if you could give me some keywords, e.g. whether something similar has already been done, so I can save time exploring it. BTW, a million thanks.
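(For anyone searching later: the keywords are "web crawler" / "web scraping". A minimal sketch of the core building block in Python, using only the standard library; the sample HTML below is made up for illustration, and in a real crawler you would fetch each page with `urllib.request.urlopen(url).read()` and feed the bytes to the parser.)

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag seen while parsing a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Made-up sample page; a real crawler would download this instead.
sample_html = """
<html><body>
  <a href="/wiki/Page_one">Page one</a>
  <a href="/wiki/Page_two">Page two</a>
</body></html>
"""

parser = LinkExtractor()
parser.feed(sample_html)
print(parser.links)  # a crawler would queue these URLs and visit them next
```

The crawl loop is then: keep a queue of URLs to visit and a set of URLs already seen, pop one, download it, extract its links, and push the unseen ones back onto the queue.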
Thank you! However, as I said, that was only an example. The Wikipedia database is available and free to download, but if we want to collect information from other sites, like this digital forum, what should I start with, and where?
It seems something is still unclear here. My question is about software that will automatically visit all the pages of a website and collect the useful information for me. Say there is a text file, and in this text file there are some important numbers that I need. Could anyone give me any hints, or tell me where I should start?
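(A minimal sketch of that last step, pulling the important numbers out of a downloaded text file, assuming they are plain integers or decimals mixed into free text; the sample string is made up for illustration.)

```python
import re

def extract_numbers(text):
    """Return every integer or decimal found in the text, as floats."""
    # Non-capturing group so findall returns the whole match, e.g. "12.5".
    return [float(m) for m in re.findall(r"-?\d+(?:\.\d+)?", text)]

sample = "Revision 3 changed the value from 12.5 to 14, see note 7."
print(extract_numbers(sample))  # -> [3.0, 12.5, 14.0, 7.0]
```

In a real program the `sample` string would come from reading the fetched file, e.g. `open("data.txt").read()`, and the regular expression would be adapted to whatever the "important numbers" actually look like.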