I've found a brief explanation of keyword tunneling: "it is how the subject matter on another site affects your site". As I understand it, it's the topical relationship between a page and the page it links to.
I think the term he is talking about is anchor text tunneling, i.e., linking using anchor text. See this link.
This is what I uncovered: Focused crawlers are programs designed to selectively retrieve Web pages relevant to a specific domain for use by domain-specific search engines. Tunneling is a heuristic-based method that solves a global optimization problem. In this paper, a content block algorithm is used to enhance the focused crawler's ability to traverse tunnels. The novel algorithm avoids both granularity that is too coarse (evaluating the whole page) and granularity that is too fine (based only on link context). A comprehensive experiment was conducted, and the results clearly show that this approach outperforms both the Best-First and anchor-text algorithms in harvest ratio and efficiency.

Crawling the Web to build collections of documents related to pre-specified topics became an active area of research during the late 1990s, crawler technology having been developed for use by search engines. Now, Web crawling is being seriously considered as an important strategy for building large-scale digital libraries. This paper covers some of the crawl technologies that might be exploited for collection building. For example, to make such collection-building crawls more effective, focused crawling was developed, in which the goal was to make a "best-first" crawl of the Web. The authors use powerful crawler software to implement a focused crawl, but use tunneling to overcome some of the limitations of a pure best-first approach. Tunneling has been described by others as not only prioritizing links from pages according to the page's relevance score, but also estimating the value of each link and prioritizing those as well. The authors add to this mix by devising a tunneling focused crawling strategy that evaluates the current crawl direction on the fly to determine when to terminate a tunneling activity. Results indicate that a combination of focused crawling and tunneling could be an effective tool for building digital libraries.
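To make the idea concrete, here is a minimal sketch of a best-first focused crawl with tunneling. It is not the algorithm from either paper above; it's a toy illustration where the "web" is a hard-coded graph, page relevance scores are given constants, and the tunnel-termination rule is simply a maximum number of consecutive off-topic hops (`MAX_TUNNEL_DEPTH`) — all of these names and values are assumptions for the example.

```python
import heapq

# Toy in-memory "web": page -> (relevance score in [0, 1], outgoing links).
# In a real crawler the score would come from a topic classifier on fetched
# HTML and the links from parsing that HTML; both are stubbed here.
WEB = {
    "seed": (0.9, ["a", "b"]),
    "a":    (0.8, ["c"]),
    "b":    (0.1, ["d"]),   # off-topic page: a tunnel starts here
    "c":    (0.7, []),
    "d":    (0.85, []),     # relevant page reachable only through "b"
}

RELEVANCE_THRESHOLD = 0.5   # pages scoring at least this count as on-topic
MAX_TUNNEL_DEPTH = 2        # give up a tunnel after this many off-topic hops

def focused_crawl(seed):
    """Best-first crawl: off-topic pages are still expanded (tunneling),
    but a tunnel is terminated after MAX_TUNNEL_DEPTH consecutive
    off-topic hops."""
    # heapq is a min-heap, so priorities are negated relevance scores.
    frontier = [(-1.0, seed, 0)]    # (priority, page, current tunnel depth)
    visited, harvest = set(), []
    while frontier:
        _, page, tunnel = heapq.heappop(frontier)
        if page in visited:
            continue
        visited.add(page)
        score, links = WEB[page]
        if score >= RELEVANCE_THRESHOLD:
            harvest.append(page)
            tunnel = 0              # a relevant page resets the tunnel
        else:
            tunnel += 1             # off-topic hop: tunnel gets deeper
            if tunnel > MAX_TUNNEL_DEPTH:
                continue            # terminate this tunneling activity
        for link in links:
            if link not in visited:
                # The link's value is estimated here by its target's score;
                # a real crawler would use anchor text or link context.
                heapq.heappush(frontier, (-WEB[link][0], link, tunnel))
    return harvest

print(focused_crawl("seed"))  # prints ['seed', 'a', 'c', 'd']
```

Note that a pure best-first crawler would stop at the off-topic page "b" and never reach "d"; tunneling is what lets the crawl pass through "b" to harvest it.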
scorpionagency I agree with this great strategy, but it really feels like it would be a lot of work. True or false? (Focused crawlers are programs designed to selectively retrieve Web pages relevant to a specific domain for use by domain-specific search engines.)
Not really; I think the majority of the work is in the programming. Once that's done, it's all a click of a button, or even fully automated. I would assume maintenance to be minimal once a stable program is released. The exception is if enhancements or upgrades are continually being made to the same program (then there would be a higher rate of conflicts requiring maintenance).