I run a website which lists expiring domains. For every domain, I used to check the number of links and pages indexed in Google, and people liked this information a lot because it allowed them to pick the best domains for re-registration. But it also means that for 2,000-3,000 domains every day I had to query Google for "site:domain.com" and "link:domain.com". Google does not allow issuing that many site: and link: queries daily, so I used proxies from InstantProxy.com. However, it appears that a few days ago Google banned all the IP ranges of these proxies and they became unusable. Therefore, I would like to ask if you know of any good proxies for such queries, or maybe another method of getting, in bulk, the number of links and pages Google has indexed for a domain?
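For what it's worth, the mechanical part of this (rotating through a proxy list and pulling the count out of Google's "About N results" line) is simple to script. Here is a minimal sketch in Python; the proxy addresses are placeholders, and the regex assumes the English-language results page, which Google can and does change without notice:

```python
import itertools
import re

# Placeholder proxies -- substitute whatever working proxies you end up with.
PROXIES = ["1.2.3.4:8080", "5.6.7.8:3128"]
proxy_pool = itertools.cycle(PROXIES)

def next_proxy():
    """Round-robin through the proxy pool, one proxy per query."""
    return next(proxy_pool)

def parse_result_count(html):
    """Extract the approximate count from Google's 'About 1,230 results'
    stats line. Assumes the English UI; returns 0 if the line is missing."""
    m = re.search(r"About ([\d,]+) results", html)
    return int(m.group(1).replace(",", "")) if m else 0
```

You would fetch each "site:domain.com" page through `next_proxy()` (e.g. via the `proxies=` argument of a `requests.get` call) and feed the HTML to `parse_result_count`. The hard part, as your ban shows, is keeping the proxy list alive, not the scraping itself.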
Hello, you can try to find a public proxy, for example on https://hidester.com/proxylist/. Make sure to select SOCKS4/5 proxies, which work best; it's what I usually do.
This is a nice service you provide for your visitors. However, with that many sites to update each day, Google starts getting suspicious. Like the others said, you need to try and control your timeouts; Google considers this flooding, or a robot searching. Have you ever tried to set up your own proxy, or tried using a few free VPN accounts? SoftEther is out of Japan and has a lot of free VPN servers you can connect to.
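On the point about controlling timeouts: the simplest way to look less like a robot is to randomize the gap between queries rather than firing them at a fixed rate. A small sketch (the base/jitter values here are illustrative, not tuned against Google's actual limits):

```python
import random
import time

def polite_sleep(base=5.0, jitter=10.0):
    """Sleep for base + a random extra of up to `jitter` seconds,
    so successive queries don't arrive at a machine-regular interval.
    Returns the delay actually used, for logging."""
    delay = base + random.uniform(0, jitter)
    time.sleep(delay)
    return delay
```

Calling `polite_sleep()` between each "site:" query spreads 2,000-3,000 lookups over the day instead of hammering them through in minutes, which is what tends to trip the flood detection.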
If you haven't found a solution through public proxies, I suggest you try private proxies. You may take a look at Blazing proxies; they offer both public and private proxies.
That's really an interesting issue. I do not know whether FoxyProxy is also blocked, but you may want to give that service a try. It is not free, but it has worked very well for us in the past.
Most public proxies won't work with Google, so do not even try them. MOZ offers a paid API to perform many queries on domains. You can try that, or use a VPN and switch servers every few minutes.