Shawn, our site www.dealgain.com had a weight of 7 before your update. At that time, our site's Google PR was 0. Now its Google PageRank is 1, but after your change to the weight formula, our weight became zero. I know that you treat a site as PR 1 whether its PageRank is 0 or unassigned. May I request that you give some weight to sites with PR 1 over sites with PR 0? Thanks Shawn. Jag
I'm surprised you don't have more weight based on the number of pages your site has. Are they indexed by G?
Well now... didn't the pooh hit the fan - until I found this thread. At the end of the day, all is good if the impressions remain the same. Two things would help stem the questions, though: a quick notice (on the relevant page) or an email announcing the weight changes. The other thing that would help would be removing the limit on the number of active ads in a single account. Perhaps allow grouping of ads or similar, so admins/users can spot pending ads quickly too. Having to shunt weight from account to account is getting more and more painful as I add sites to the co-op. Anyhoo, nice to see the cap removed, and hopefully no reduction in impressions. Cheers, JL
I still think it is too early to judge the current impressions... the question is, did they go up or down after 3-5 days? Also, I do agree that this will attract more sites to the COOP -- I'll be adding another few sometime next week for one client and probably a few more for another client. Cygnus
Ha! I freaked out when I checked our weight this evening. Something on the ad-network page would have been nice. I kept hitting the validate link, thinking there was no way we could have lost 90% of our weight like that. Whoever was asking if a 100K -> 10K drop was normal: that's about the same percentage we dropped, from about 63K to 6.5K.
It's not actually a new formula... the actual weight displayed (for everyone) was dropped by a factor of ten, and the caps were removed.
Ahh, it's so weird looking at weight right now. Hard to tell how well you're doing relative to other people.
Is this statement true? Methinks not. My guess would be:

Old formula: pow(base_PR, A) x min(page_count, B) - where B was 4000?
New formula: pow(base_PR, A) x log(page_count, C)

rather than, as you are suggesting:

pow(base_PR, A) x page_count / 10

This would explain the overall drop in weight. But this is just from the tidbits Shawn has provided us - I haven't spent any time trying to figure out if this is correct or trying to determine the constants. I don't doubt it's a bit more complex than this - how about it, Shawn, is my take on it even remotely close?
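For anyone who wants to play with that guess, here's a minimal Python sketch of the two formulas. To be clear, everything in it is speculative: the constants A, B, and C are placeholders I made up, and neither formula is anything Shawn has confirmed.

[code]
import math

# Placeholder constants -- made up for illustration, not confirmed values.
A = 2       # assumed exponent applied to PR
B = 4000    # guessed page-count cap in the old formula
C = 10      # guessed log base in the new formula

def old_weight(base_pr, page_count):
    # Guessed old formula: pow(base_PR, A) x min(page_count, B)
    return pow(base_pr, A) * min(page_count, B)

def new_weight(base_pr, page_count):
    # Guessed new formula: pow(base_PR, A) x log(page_count, C)
    return pow(base_pr, A) * math.log(page_count, C)
[/code]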
My post was based on something way less scientific. I track all my referred accounts, and after the change I noticed that every single one under the cap had changed by that factor - my own accounts included. Obviously the cap was lifted, as Shawn has stated, and it's evident in my own accounts. That's enough to give me, and most people, a fair idea of what a good weight is in the new system. Oh well.
Well - I didn't mean to be too geeky; I have as much info as the next guy. Just going off what Shawn said: 1) the cap has been removed, and 2) it's a logarithmic scale. I also assumed that PR contributes in a non-linear fashion, and I'm assuming that's still the case. So I think it's just a coincidence that you're seeing the same scaling across all the domains you monitor - I'm guessing they all have similar page counts or something? Compare a couple of sites like:

Site1: 5,000 pages, PR4
Site2: 50,000 pages, PR4

These sites would have had the same weight before; now they would be very different. Big sites win, smaller sites lose - by how much depends on how many big sites are in the co-op (a rough worked example is below). I'm not stating any of this as fact - I could well be talking rubbish.
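Putting rough numbers on that Site1/Site2 comparison - same caveat as before: the formulas and constants are my guesses from the earlier post, nothing official:

[code]
import math

A, B, C = 2, 4000, 10   # same made-up constants as before

def old_weight(base_pr, pages):
    return pow(base_pr, A) * min(pages, B)

def new_weight(base_pr, pages):
    return pow(base_pr, A) * math.log(pages, C)

for name, pages in [("Site1", 5000), ("Site2", 50000)]:
    # Both sites are PR4; only the page counts differ.
    print(name, old_weight(4, pages), round(new_weight(4, pages), 1))

# Old formula: both sites hit the 4000-page cap, so both score 16 x 4000 = 64000.
# New formula: Site1 gets 16 x log10(5000) = ~59.2, Site2 gets 16 x log10(50000) = ~75.2.
# Identical before; now the bigger site comes out ahead.
[/code]

So under the log guess, ten times the pages buys you only about 27% more weight - which would fit a "big sites win, but not linearly" picture.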
You're probably right, and it most likely is a coincidence. Sometimes it's easier for me to function if I keep things simple in my own mind.
Well, as I see it, it's stupid: if a website has over 40K pages, Google's bots hardly ever go into the inside pages, while for a website that has 10 pages, Google keeps track of them all. I'm guessing this from the "Cache" option in Google. So the cap should be 4K, but what do I know...
There are a lot of sites out there where Google will regularly crawl past 40k pages... on a daily basis (especially if they contain unique content). It isn't about the big G, though; it's about ad impressions and what you can do with them.
Huh? The only part of that I really got was: so if you have a site that has 100,000 different products, each with its own page, it's stupid? Someone should tell those stupid fools at Amazon... they have 50 million pages: http://www.google.com/search?q=site:www.amazon.com DMOZ has about 10 million pages: http://www.google.com/search?q=site:dmoz.org Are you saying that if those sites joined the co-op (if only) their weight should be capped? Also - I think you'll find that PageRank influences how deep Google will crawl a site.