Seems their servers are having excessive trouble/load today. I'm getting 999 Rate Limit Exceeded all over the place on sites that use Yahoo's API, including one I use internally for backlink tracking. I know for sure I haven't reached their 5,000-query limit.
I'm finding this too. http://tools.seobook.com/link-tools/backlinks/backlinks.php, which I use for checking backlinks, seems to be broken; I'm guessing it's because of this. Does anyone else know of a tool that checks total backlinks and gives you the URLs they are from?
Seems they are still having trouble. Three days later I'm still seeing 999 Rate Limit Exceeded, and I know I'm nowhere near the 5,000 limit. I don't see anywhere to even report this to Yahoo. Perhaps they just don't care. By the looks of it, nobody around here uses Yahoo APIs anyway.
Yeah, still seeing the same problem here too. Really quite annoying; it was the best method I had for checking backlinks.
Create 20 Yahoo accounts and 20 different API keys, and pick one at random on every call. That solved my problem. Oh yeah, and cache the results.
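The rotate-and-cache idea above might look something like this minimal sketch. The key values, cache directory, and `cachedApiCall` helper are made up for illustration; the point is just picking a random key per call and only hitting the API on a cache miss:

```php
<?php
// Sketch only: key rotation plus a simple file cache, assuming you have
// registered several Yahoo application IDs. All names here are hypothetical.
$apiKeys = array('appid-one', 'appid-two', 'appid-three');

function cachedApiCall($url, $apiKeys, $cacheDir = '/tmp/yahoo-cache', $ttl = 3600)
{
    if (!is_dir($cacheDir)) {
        mkdir($cacheDir, 0777, true);
    }
    $cacheFile = $cacheDir . '/' . md5($url);

    // Serve from cache while the entry is fresh, so repeated checks of the
    // same URL never touch the API at all.
    if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return file_get_contents($cacheFile);
    }

    // Cache miss: pick one of the keys at random for this call.
    $key = $apiKeys[array_rand($apiKeys)];
    $result = file_get_contents($url . '&appid=' . urlencode($key));
    if ($result !== false) {
        file_put_contents($cacheFile, $result);
    }
    return $result;
}
```

Caching won't help with the 999 error itself, but it cuts the number of live calls dramatically, which keeps you well under any per-IP limit.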
I made a second Yahoo API account for another domain with a different IP, but I still have the same problem. Other backlink checkers seem to work again. Can anyone help?
I tried that (multiple API keys), but they all give the 999 error. It didn't produce any better results for me. =(
You could have 3,000,000,000 people using one API key; the rate limit is based on IP address, not key. Something is seriously wrong with their servers. It's been over a week now, and I had to disable the API calls. I used the same call from my computer here (development environment) and it worked fine. I wonder if they messed something up and it's as if five calls got you banned for 24 months, lol.
This was driving me nuts. However, I did fix it in my app *and* figured out what on earth made the difference. See my Yahoo Search API 999 error story here: http://www.affiliatesonfire.com/web-hosting/half-an-ip-address-and-the-fix-for-the-yahoo-search-api-999-error To make a (very) long story short, setting the User-Agent fixed it! This is definitely a new problem. I suspect they added this as a requirement across the APIs and haven't caught the documentation up yet.
Yeah, it needs a user agent now. Thanks to whoever figured that out. If you're using file_get_contents you can do this:

```php
ini_set('user_agent', 'Your Bot');
```
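If you'd rather not change the global ini setting, a per-request alternative is to attach the User-Agent via a stream context. This is a sketch; the `fetchWithUserAgent` helper name and the "Your Bot" string are just placeholders:

```php
<?php
// Sketch: per-request User-Agent via a stream context, so only this call
// is affected rather than every file_get_contents in the process.
function fetchWithUserAgent($url, $userAgent = 'Your Bot')
{
    $context = stream_context_create(array(
        'http' => array('header' => 'User-Agent: ' . $userAgent . "\r\n"),
    ));
    return file_get_contents($url, false, $context);
}
```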
And if this doesn't work, try this:

```php
function getPage($url)
{
    if (function_exists('curl_init')) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        @curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_setopt($ch, CURLOPT_USERAGENT, $_SERVER['HTTP_USER_AGENT']);
        $retval = curl_exec($ch);
        curl_close($ch);
        return $retval;
    }
    return file_get_contents($url);
}
```
Lexiseek, I'll take the thanks for figuring it out. Somebody else may have as well, but I found it by trial and error after many hours working through different possibilities as to why it had stopped working. I had searched forums etc. for a solution but found no other reference to one. Todis, I like your pass-through of the user agent from the script's browser; neat. My script will be running from a cron job at some point, so I'm filling in a user agent directly:

```php
curl_setopt($session, CURLOPT_USERAGENT, "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)");
```

I did find a UA requirement in the documentation of the Yahoo Shopping API. They even suggest faking common browser UAs. Go figure. From my experimentation the Search API appears to take anything, as long as it's not blank. /aof