How to scan thousands of domains for PageRank at a time

Discussion in 'Products & Tools' started by xtmx, Aug 29, 2012.

  1. #1
    I developed a Java program to scan unlimited URLs for PageRank. The first 3,000 URLs went by at a rate of 5-6 URLs per second (it finished in just over five minutes). Then it started giving me errors. The program didn't crash, but I could tell that Google was no longer accepting my requests.

    Is there a way to bypass these limitations? Or should I distribute the program as is, and recommend that users scan only 3,000 URLs at a time?
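    If the errors are Google throttling your request rate (a guess on my part; Google never documented a limit for toolbar PR queries), the simplest mitigation is to slow the scanner down. Here's a minimal fixed-delay throttle sketch in Java; the 500 ms interval is an assumption, not a known safe value:

```java
// Minimal fixed-delay throttle: spaces requests at least minIntervalMs apart.
// The caller invokes acquire() before each PR lookup.
public class Throttle {
    private final long minIntervalMs;
    private long lastRequest = 0;

    // e.g. new Throttle(500) for ~2 requests/second -- the 500 ms figure
    // is an assumed safe pace, not a documented Google limit.
    public Throttle(long minIntervalMs) {
        this.minIntervalMs = minIntervalMs;
    }

    // Blocks until at least minIntervalMs have passed since the previous call.
    public synchronized void acquire() throws InterruptedException {
        long now = System.currentTimeMillis();
        long wait = lastRequest + minIntervalMs - now;
        if (wait > 0) {
            Thread.sleep(wait);
        }
        lastRequest = System.currentTimeMillis();
    }
}
```

    You'd call `throttle.acquire()` before each PR request; it trades speed for not getting cut off partway through a list.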
     
    xtmx, Aug 29, 2012 IP
  2. free2watchmovies (Active Member)

    #2
    Doesn't Scrapebox have this option, to scan the URLs of a website for PR pages?
     
    free2watchmovies, Aug 10, 2013 IP
  3. cipango (Active Member)

    #3
    Use proxies; I don't see any other way around it. In Scrapebox, with 20-30 proxies I can PR-check a list of 200k URLs without problems.
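    The general idea is easy to roll yourself if you're building the tool in Java anyway. A minimal round-robin rotator sketch (Scrapebox's own rotation logic isn't public; this just spreads requests so no single IP trips the limit, and any proxy addresses you feed it are your own):

```java
import java.net.InetSocketAddress;
import java.net.Proxy;
import java.util.ArrayList;
import java.util.List;

// Round-robin proxy rotation: each PR request goes out through the next
// proxy in the list, so no single IP sends enough requests to get blocked.
public class ProxyRotator {
    private final List<Proxy> proxies = new ArrayList<>();
    private int next = 0;

    // hostPorts entries look like "host:port".
    public ProxyRotator(List<String> hostPorts) {
        for (String hp : hostPorts) {
            String[] parts = hp.split(":");
            proxies.add(new Proxy(Proxy.Type.HTTP,
                    new InetSocketAddress(parts[0], Integer.parseInt(parts[1]))));
        }
    }

    // Returns proxies in strict round-robin order.
    public synchronized Proxy nextProxy() {
        Proxy p = proxies.get(next);
        next = (next + 1) % proxies.size();
        return p;
    }
}
```

    You'd pass the returned `Proxy` to `url.openConnection(rotator.nextProxy())` for each request; combining rotation with a small per-request delay works even better.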
     
    cipango, Aug 27, 2013 IP
  4. shahjahan-the-warrior (Member)

    #4
    Always use private proxies over public ones, though. Good luck with the tool. Let us know how it goes, as I'm also looking for such a tool.
     
    shahjahan-the-warrior, Sep 10, 2013 IP