Does 60,000 Webpages = 60,000 Visitors/Month?

Discussion in 'General Marketing' started by wmsolutions, May 15, 2009.

  1. exodus

    exodus Well-Known Member

    Messages:
    1,900
    Likes Received:
    35
    Best Answers:
    0
    Trophy Points:
    165
    #21
    OP: advice for when you're making a video: clear the desktop. You have a lot of "personal" stuff on there. ;)
     
    exodus, May 15, 2009 IP
  2. wmsolutions

    wmsolutions Active Member

    Messages:
    29
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    56
    #22
    Hey guys,

    WOW... you've been busy since my last post... excellent points and, lol, great jabs. :D
    I would have replied earlier, but it took me a programmer in India and some English-speaking connections of his to decode:

    sillyasstoofugginlazytodorealworkandmakerealmoneyhalfwittedcrap

    Exodus... thanks for the advice! Actually, I keep really busy... develop a lot of folders in the process, and almost all of those folders are really just archived versions of my earlier batches of desktop folders. I delete the oldest ones if a few months have passed and I still didn't need anything from them. :p


    Ok... brass tacks... I guess it's my turn?

    First, the MOWG isn't a script or program; it's a method, involving about 15 pieces of software that have been referenced/bundled for ease, and an internal forum for helping folks along. As you can imagine, everyone is at a different level of understanding, so we tend to have to start easy, and let folks bid up the level of discussion by asking about what they know. Our job is to keep up with their levels of understanding, much like in this thread here.

    Secondly, we are not an article generation system as such, but you're on the right track. What's different is that we can change anything in the source code of a page... anything. The point is not to test one article at a time or one set of keyphrases at a time, because that would take forever to make solid dents, and they wouldn't last.

    Folks pay professionals $950 to get a webpage written in such a way as to compete for a standard keyphrase. And that page might last a month or so before others take it over. So you have to hope to make your $950 back every couple of months from each such page.

    Here's what the MOWG Method can test... all at once:

    Different header/footer images
    Different .css stylesheets applied to each type of tag, such as <td> tags
    Different page backgrounds
    Different layout of images vs. text on a single page
    Different arrangements of bullet-points, to see which arrangement gets the most attention
    Different videos and AdSense placements
    ... and so forth.
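    To give a feel for how quickly those combinations multiply, here's a minimal sketch in Python (every option list below is a hypothetical placeholder, not MOWG internals) that enumerates all combinations of a few page elements at once:

    from itertools import product

    # Hypothetical option lists; the idea above covers headers, CSS,
    # backgrounds, layouts, videos, AdSense placements, and more.
    header_images = ["header-a.png", "header-b.png"]
    css_sheets    = ["clean.css", "bold.css", "minimal.css"]
    backgrounds   = ["#ffffff", "#f4f1e8"]
    adsense_spots = ["upper-left", "upper-right"]

    variants = list(product(header_images, css_sheets, backgrounds, adsense_spots))
    print(len(variants))  # 2 * 3 * 2 * 2 = 24 combinations, all testable at once

    for header, css, bg, spot in variants[:3]:
        print(f"header={header} css={css} bg={bg} adsense={spot}")

    Four small option lists already yield 24 full-page variants; a dozen lists would yield thousands, which is why testing one element at a time takes years.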

    Everyone talks about 'which is the best-selling color for your page background?'... or 'should the menu buttons be arranged with 'contact' at the top or bottom?'... or 'should that image on your page be larger or smaller?'... and 'should the AdSense stream be on the upper-left side, or the upper-right side?'

    Well... if you try to test combinations of all of these things one at a time, it would take years to learn the best combinations. And that is what has made internet marketers great: they DID these tests. For years. One at a time.

    But we can do them faster... which gets to the next question, of course: what of Google Slap?

    I'll get to how we are different when it comes to that next... because a lot of you are jibing and asking, which is great...
     
    wmsolutions, May 15, 2009 IP
  3. wmsolutions

    wmsolutions Active Member

    Messages:
    29
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    56
    #23
    Ok...

    Google looks at a bunch of things when reviewing your webpages, as you well know.

    Google looks at 3 things in particular when determining whether a webpage is unique or a dupe. The overall aim is this: 'a user should have a unique experience when landing on any webpage.'

    Let's go through it:

    1) Google looks at the coding itself on your site (<here>not here</and here>), with the purpose of discovering what is generally your website template and standardized feeds... RSS, AdSense, email capture blocks, videos, etc. The main point here is that much of this coding will expectedly be the same from page to page: you're usually going to use the same header... CSS styles, menu layout, page background colors, etc... from page to page. Google is discovering what your main frame is like. If Google didn't differentiate the coding from the content itself, then it would probably consider all of your pages duplicates of a top-level/folder page.

    Here, a website owner doesn't have to worry about anything. Set up your template, menu, header/footer, and have at it. Usually no issues will arise from a strict coding review.


    2) Google looks at the coding-to-content ratio of your webpages. This is because coding falls into clear categories, while content is what's meant to be 'seen/heard' by visitors. Now, coding that represents content, if drawn from other domains, will almost certainly be duplicate content. Some of that is fine... general use of RSS and AdSense, and mainstream videos. But the question becomes how much content is on the page. If the content ratio is very low compared to the coding, then your webpage could be a doorway page... and not just that, but you could be trying to create 1,000's of doorways that really add no 'new' value or combination for the end-user. From page to page on your site, the visitor would generally have the 'same total experience.' This is spamming, almost without question.

    Mostly, you see this when pages from a website open with a common header saying 'check out our great deals on ____ below!' and then all you see are AdSense streams. And you see page after page where the header says pretty much the same thing... and there's really no content. Just coding.

    It's simply too hard to create unique user experiences with pages like this.
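    As a rough illustration of the coding-to-content idea, here's a minimal sketch that strips markup and compares visible-text length against total HTML length. The ratio measure and the sample page are my own illustration; Google's real criteria and thresholds are not public.

    from html.parser import HTMLParser

    class TextExtractor(HTMLParser):
        """Collects only the text a visitor would actually read."""
        def __init__(self):
            super().__init__()
            self.chunks = []
            self.skip = 0
        def handle_starttag(self, tag, attrs):
            if tag in ("script", "style"):
                self.skip += 1  # script/style contents are coding, not content
        def handle_endtag(self, tag):
            if tag in ("script", "style") and self.skip:
                self.skip -= 1
        def handle_data(self, data):
            if not self.skip:
                self.chunks.append(data)

    def content_ratio(html):
        extractor = TextExtractor()
        extractor.feed(html)
        text = "".join(extractor.chunks).strip()
        return len(text) / max(len(html), 1)

    page = ("<html><head><style>/* imagine 2,000 lines of CSS here */</style>"
            "</head><body><h1>Check out our great deals on hats below!</h1>"
            "<script>/* ad network boilerplate */</script></body></html>")
    print(f"content ratio: {content_ratio(page):.2f}")

    On a real doorway page (a one-line header plus nothing but ad blocks), that ratio collapses toward zero, which is exactly the pattern described above.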


    3) This is the most hotly debated, least understood, most troublesome element of the whole equation: Google rates the content of your website... the content itself, compared to the content of other pages (again, we're just talking about Google looking for dupe pages; not other assessments like keyword density by shingle counts, meta tag optimization, or anything else).

    So... what constitutes duplicate content?

    Well, I might have a page with a running theme, such as 'quick brown fox'... so I keep speaking about 'quick brown fox' throughout my pages. Of course, this does not make my pages duplicates. One would expect a set of keywords or a theme to be carried from page to page, and indeed, the more unique pages you have on your website that talk about a given theme, the more total content you have to offer on that theme, and the more relevant you become during searches for that theme, vis-a-vis other websites. Tons of pages are welcome, welcome, welcome when it comes to high rankings on a range of keyphrases that lead to what your content offers.

    But let's get further into it... which I'll do in a minute... I've been trying to suck the inside of my coffee cup dry for the past 3 minutes... "this is America, and I am very thirsty!" (who said that? I don't remember, but boy, it stuck).
     
    wmsolutions, May 15, 2009 IP
  4. wmsolutions

    wmsolutions Active Member

    Messages:
    29
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    56
    #24
    Ok... websites of course come in all sizes and shapes (pardon the pun!)

    Some are article-driven... others are product-driven... others give charts of data.

    Part of what Google's algorithms have to do is discern what to expect, in each case.

    For instance, if you sell 'yellow hats' in different sizes for different prices, and you want to target geographical regions with different deals (shipping costs vary), then you might have one page offering:

    Yellow Hat - 'small' - 'Price: $16.50' - 'clasp-fitted' (or velcro) - 'Wisconsin Residents: 10% off for a limited time only!'

    Then, another page might be almost identical. The content would still be laid out well, probably in a clear table structure, and the info would probably be populated automatically for this page, like the last one.

    So it might say:

    Yellow Hat - 'medium' - 'Price: $15.50' - 'clasp-fitted' - 'Rhode Island Residents: 8% off for a limited time only!'

    Aside from that, the rest of the content (and definitely the coding) would be the same. The image is probably the same image (it's not so common to produce a different image for each size of yellow hat), and feeds from the exact same source location in the images folder... same with the thumbnail; same with the blow-up.

    Yet, in this case, Google will probably get it. It will probably NOT consider these duplicate pages... even though the content is quite similar.
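    For illustration, a product series like that is usually populated from a template plus a data table. Here's a minimal sketch mirroring the yellow-hat example; the template string and field names are hypothetical, not any real store's system:

    # Hypothetical template and data rows; each row becomes one near-identical page.
    TEMPLATE = ("Yellow Hat - '{size}' - 'Price: ${price}' - '{fastener}' - "
                "'{state} Residents: {discount}% off for a limited time only!'")

    rows = [
        {"size": "small",  "price": "16.50", "fastener": "clasp-fitted",
         "state": "Wisconsin", "discount": 10},
        {"size": "medium", "price": "15.50", "fastener": "clasp-fitted",
         "state": "Rhode Island", "discount": 8},
    ]

    for row in rows:
        print(TEMPLATE.format(**row))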

    In another case, perhaps you have a website showing historic gold prices in ounces/grams... going back hour by hour or day by day... each page might show a chart (24 entries: one per hour), and you have a different page for each day going back in history for perhaps a year.

    Very often the entries will match up exactly. What does Google note, though?
    The content... yes. Google has to grasp the point of the charts. It can't call the pages duplicates just because accurate numbers repeat... it would be strange to be de-indexed because your information is accurate.

    But part of what else Google notes is the organization of the folders. For instance, in this case, the main folder structure might be date-based, with a different folder name being something like:

    /gold/pricing/march-14/2008/
    /gold/pricing/january-12/2007/

    ... and so on. Folder structure certainly counts for organizing information in a way that helps Google determine what is happening, and how to treat it for searches. Solid structure is a must.

    And when it comes to articles... blurbs... general introductions... etc. How does that go?

    Well, it's very common to have a different pitch page for anything you're trying to sell or present... you might have one devoted to 'lowest price'... another to 'top quality'... another to 'unique/best in class'... etc. Much of the content will be similar, but the focus can be different.

    When Google assesses such cases for similarity/diversity, it has to go by the wording... and specifically, the grammar.

    Example:

    'the quick brown fox jumped over the lazy dog'

    First, we can drop the two usages of the word 'the'; stop-words like that get tossed for Google's assessment.

    Then we have this:

    'quick brown fox jumped over lazy dog'

    So... three adjectives ('quick', 'brown', 'lazy'), two nouns ('fox', 'dog'), and a verb-plus-preposition pair ('jumped over').

    Each word can be seen as a stand-alone, except for the phrase 'jumped over', because there are only so many combinations that can typically happen here, from a grammar standpoint:

    jumped in... jumped out... jumped through... jumped on... and a handful of others.

    So the CASES to be considered here leave us with 6 items that COULD be modified:

    'quick' 'brown' 'fox' 'jumped over' 'lazy' 'dog' (yes, we could treat 'jumped' and 'over' separately, but that is less conservative and not advisable).

    With those 6 cases... say that was your whole webpage (we're keeping it short to make it easier to study... it wouldn't make sense as a real stand-alone webpage).

    With 6 cases, if we modified one case for a synonym, what is the PERCENT modification?

    1/6 ≈ 16.7% difference in the 'user experience' when hitting this variant page. That's not good enough to pass Google's appraisal of differentiation.

    What we need is at least 30% total variance, and it has to be spread evenly, and it has to follow grammatical rules.

    'quick brown fox jumped over lazy dog'

    and

    'quick brown fox jumped over tired dog'

    are duplicate pages.



    But... if we change at least two words, we're on the margin... it's still too tight, but the user experience is indeed changing, as is the focus of our keyphrase expansion:

    'quick brown fox jumped over lazy dog'

    vs.

    'rapid light-brown fox jumped over lazy dog'

    is getting closer (2/6 ≈ 33%). It's still too close to be comfortable with.

    This is much, much better... let's change 3 entries. That's a 50% differential (3/6):

    'quick brown fox jumped over lazy dog'

    vs.

    'fast brown fox hopped around exhausted dog'

    ... NOW we're safe. These are NOT duplicates. Google will see these as unique user experiences.

    Completely different varieties can look like this:

    'speedy dark-brown fox leapt over sleeping mutt'
    'rapid light-brown fox sprang over tired canine'
    'fast brown fox flew over exhausted pooch'

    ... and so on. These are NOT duplicates of one another according to Google, because the user experience (defined by % difference of the content on the page) is different enough from page to page.
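    To make that counting concrete, here's a minimal sketch of the unit-by-unit comparison described above: stop-words dropped, verb + preposition grouped into one case, and the difference reported as changed cases over total cases. The verb-phrase list and the comparison rule are my own illustrative assumptions, not Google's actual algorithm.

    STOP_WORDS = {"the"}
    VERB_PHRASES = {("jumped", "over"), ("hopped", "around"), ("leapt", "over"),
                    ("sprang", "over"), ("flew", "over")}

    def units(sentence):
        # Drop stop-words, then group verb + preposition into a single case.
        words = [w for w in sentence.lower().split() if w not in STOP_WORDS]
        out, i = [], 0
        while i < len(words):
            if i + 1 < len(words) and (words[i], words[i + 1]) in VERB_PHRASES:
                out.append(words[i] + " " + words[i + 1])
                i += 2
            else:
                out.append(words[i])
                i += 1
        return out

    def percent_difference(a, b):
        ua, ub = units(a), units(b)
        changed = sum(x != y for x, y in zip(ua, ub)) + abs(len(ua) - len(ub))
        return changed / max(len(ua), len(ub))

    base = "the quick brown fox jumped over the lazy dog"
    print(percent_difference(base, "the quick brown fox jumped over the tired dog"))
    # ~0.17 -> one case in six: a duplicate by the 30% rule above
    print(percent_difference(base, "the fast brown fox hopped around the exhausted dog"))
    # 0.5 -> three cases in six: safely different by the same rule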

    Translated to practical terms...? You could be talking about 'lowest cost'... 'highest quality'... 'top calibre service'... 'rare (or unique) item'... etc.

    Ultimately, different combinations of various nouns, pronouns, adjectives, adverbs, prepositional phrases, etc. allow you to target whole new keywords with... distinct user experiences.

    Think 'lazy dog'... well, when I picture a 'lazy dog', I might picture a dog asleep on the kitchen floor. But a sleeping dog might not be lazy at all. He might have been catching frisbees all day at the park, and he's just exhausted. So when I type 'lazy dog' into a search, your page could come up. BUT... if someone else types 'exhausted dog' into a search, your page will probably NOT come up... there are plenty of other pages talking about 'exhausted dogs'... specifically by word.

    Another way to see this is if both come up. Well, the searcher is going to tend to aim for the link that uses the wording that searcher put into the search box. So if they see 'tired' and 'exhausted'... but they typed 'exhausted'... then there is a higher chance they'll click on the link with 'exhausted' in it, instead of the link with 'tired.' Most folks miss the majority of the 100's of people searching for them nearly EVERY SECOND... of every hour... of every day.

    The idea is to broaden your keyphrase content by working with Google to provide unique user experiences, not to throw up a ton of webpages that are basically duplicates of each other and get smacked around (!)


    Summing it up... this will be my last bit below in a couple minutes...

    Thanks for your patience: your questions are good and I did want to give them that respect!
     
    wmsolutions, May 15, 2009 IP
  5. wmsolutions

    wmsolutions Active Member

    Messages:
    29
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    56
    #25
    Ok.

    What we figured out was how to create not just 1,000's of webpage combinations, but how to work through the content of an entire page and create literally 10,000,000's of variants at a crack.

    But that's crazy, because most of the combinations are too similar (and most folks don't have enough storage online for that many pages... and the processing it would take to create them would be just nuts).

    From the millions of variants, we figured out how to isolate the few tens of thousands (like 60,000) that are actually unique from one another by literal % of content similarity.

    Then we figured out how to develop a massive folder structure that Google can follow, understand, and find value with... and create the pages into the 'right' folders.

    Finally... we automated almost ALL of this process. We can't automate everything, because yes - there would be 'gibberish' on many of the pages. But we can treat words in different cases very easily: 'this word can change in the 3rd paragraph, but must stay the same in the first paragraph due to grammar'... etc.

    So our page variants:

    ARE unique by user-defined % similarity. Each page differs from every other page by a % that the user decides. So only the final pages that are at LEAST that different are produced, and nothing that is too similar (see the sketch after this list).

    ARE NOT gibberish-productions. Each word can be treated in various cases. It changes when it's fine to change, and stays the same when it isn't.

    ARE dropped into a massive folder structure that auto-populates to sort the variants produced, such that Google sees comprehensive organization throughout the page series.

    ARE able to be layered, such that pitches... explanations... articles... etc. are all unique and fit only with their unique IB/OB related variants.

    ARE able to be made so COMPLETELY different that nothing in the source coding is the same at all, except by coincidence... there are only so many ways to say <table>, border="1", <td>... etc.
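    As a sketch of that first point (user-defined % similarity), here's one way a pairwise filter could work: greedily keep a variant only if it differs enough from every variant already kept. The word-level difference measure and the 0.35 threshold are illustrative assumptions, not MOWG's actual code.

    def word_difference(a, b):
        # Fraction of word positions that changed between two variants.
        wa, wb = a.split(), b.split()
        changed = sum(x != y for x, y in zip(wa, wb)) + abs(len(wa) - len(wb))
        return changed / max(len(wa), len(wb))

    def filter_variants(variants, min_difference=0.35):
        kept = []
        for candidate in variants:
            if all(word_difference(candidate, page) >= min_difference
                   for page in kept):
                kept.append(candidate)
        return kept

    raw = [
        "quick brown fox jumped over lazy dog",
        "quick brown fox jumped over tired dog",       # ~14% different: dropped
        "fast brown fox hopped around exhausted dog",  # ~57% different: kept
        "speedy dark-brown fox leapt over sleeping mutt",
    ]
    for page in filter_variants(raw):
        print(page)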

    It's not a script; it's a whole method using about 15 pieces of software that we put together. The ways to use the software, and what can be done with it, are really only limited by imagination; it can be used for a great number of other applications. As users get into it, it'll be neat to see what folks come up with in the way of new apps! :)

    I do hope you found it interesting or useful when trying to assess the value of what we're working on these days!

    We did make the Tutorial free... most of that is the intro stuff, of course; the strategies and depth come in the related forum. The Tutorial had to start somewhere, though, and it does cover a ton of programming and optimization concepts to kick things off and give folks a bearing with this whole approach.

    Ok... hope this answered some of your questions! :)
     
    wmsolutions, May 15, 2009 IP
  6. Scoty

    Scoty Active Member

    Messages:
    620
    Likes Received:
    8
    Best Answers:
    0
    Trophy Points:
    60
    #26
    Sem-Advance, as I've said, I don't want to use generated content, and I'd expect anyone with a bit of sense who did to "release" it over months instead of in just one night.

    Damn, I kept up for a few posts, but it's too much to read.
     
    Scoty, May 16, 2009 IP
  7. EMO_Ralez

    EMO_Ralez Peon

    Messages:
    386
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    0
    #27
    60k pages WITHOUT backlinks is 0 traffic.

    You can use plugins like auto social bookmarkers and ping tools to notify search engines of your content; the content has to be keyword-rich and SEO'd.

    Only then will you receive GROWING traffic.
     
    EMO_Ralez, May 16, 2009 IP
  8. wmsolutions

    wmsolutions Active Member

    Messages:
    29
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    56
    #28
    Hi guys,

    Scoty - sorry, I realize I didn't clarify this earlier... the pages you create aren't online. They are created on your own computer, in a massive folder structure (that sits in one single folder for ease). From there, you can absolutely decide your uploading strategy... all at once, folder by folder, or a few pages from EACH folder every day or two. Whatever is most sensible for your website.

    Ok... you already realize that uploading a ton of webpages at once is not cool in Google's eyes. If you create a Google Sitemap to upload along with them (which everyone greedily does, and it's understandable)... then yes, you're probably right to worry. Generally, you should just let Google find your webpages on its own, instead of 'demanding' that it index your webpages, which is basically what a Google Sitemap is all about.

    On the other hand, Google does understand bulk additions when you're adding, say, a new product line (10,000 new items from a new manufacturer). Google usually indexes lots of new pages at once for product lines, charts, etc.

    However, with articles, general pitches, blurbs, etc., Google doesn't like it when you suddenly produce a ton of articles ("why did you take the time to write 5,000 new articles, but during that whole period, you didn't take the time to upload any of them for someone's benefit?"). That's an issue that flies in the teeth of common sense.

    So we have various strategies for how to handle uploading pages, depending on what you're selling. What I don't see from 'article generators' is any explanation of strategy, nor do I see any concept of creating sub-folders. For instance, you can create 60,000 webpages that are solidly optimized and structure them in a massive folder structure. Then, just upload a new folder daily. That way, Google sees bites of quality, structured info appear. And because you're doing this every day or every 2-3 days, Google also sees that the content on your website changes/updates/grows at that pace. This tells Google that your site is live and updated often. Very, very cool. Google keeps coming back and indexing you higher every couple of days.
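    As a sketch of that 'one folder per day' idea, assuming a plain FTP host (the hostname, credentials, paths, and state file below are all hypothetical), something like this could be run once daily from cron or Task Scheduler:

    import os
    from ftplib import FTP

    LOCAL_ROOT = "generated-site"   # the big local folder full of sub-folders
    STATE_FILE = "uploaded.txt"     # remembers which folders already went up

    def next_folder():
        done = set()
        if os.path.exists(STATE_FILE):
            done = set(open(STATE_FILE).read().split())
        for name in sorted(os.listdir(LOCAL_ROOT)):
            if name not in done:
                return name
        return None  # everything has been uploaded

    def upload_folder(name):
        ftp = FTP("ftp.example.com")       # hypothetical host
        ftp.login("user", "password")      # hypothetical credentials
        ftp.mkd(name)
        local = os.path.join(LOCAL_ROOT, name)
        for fname in os.listdir(local):
            with open(os.path.join(local, fname), "rb") as fh:
                ftp.storbinary(f"STOR {name}/{fname}", fh)
        ftp.quit()
        with open(STATE_FILE, "a") as fh:
            fh.write(name + "\n")  # mark this folder as done

    folder = next_folder()
    if folder:
        upload_folder(folder)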

    EMO (is that from EmoCorp? Absolutely cool payment processor!) - you're right about the lack of backlinks, of course, especially if you're not submitting a Google Sitemap. Somehow, Google has to be able to navigate through your webpages. That much is simple to design in our system... we thought of that. Creating backlinks that naturally walk up to the next level of pages is not hard; it's a simple, self-automating process, page by page.

    Then you can still upload folders selectively, a couple per day (like 10-20 pages daily, or whatever you want), and the backlinks will be intact for whatever is uploaded.


    Scoty - lol... too much to read, right?

    Well heck, man... reading is really important; academics are a simple 'must' when it comes to internet marketing. If you have trouble reading, you gotta develop that cranium muscle a bit. ;)

    Concentrated reading is like weightlifting... it's tough while you do it, but afterward, *all those new concepts are in your head*, and everything is easier to understand and categorize (and use!)...

    Mark Twain said:
    "The brain is like a rubber band.
    Once stretched, it never again returns to its original shape."



    Hope this helps shed some light! :)
     
    wmsolutions, May 16, 2009 IP
  9. Aaron Bennett

    Aaron Bennett Peon

    Messages:
    3
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #29
    Good Information,

    One thing to clarify....

    The difference between this and auto-generated pages is fairly simple...

    Auto-generated pages do not allow the developer to be in control of what the end result will be. Because of that, these massive "thesauruses" just plug in random information wherever the software believes it's appropriate, and it usually isn't... It ends up putting a garbled mess in front of the clients, so even if you do get the traffic (before Google yanks your site), your bounce rate will be through the roof...

    Long story short... You actually have total control, from the copywriting all the way down to the tens of thousands (if you choose) of variations that are possible... The idea is to work with the search engines, rather than cheating the system, and the only way to do that is to still allow total human control over the webpage development, while incorporating software only to automate items that just plain take time...

    Control was the key when we implemented this, and that is why we have not seen issues with people being yanked from the system.

    That is why we are seeing good, solid leads from project implementation...

    Hope this clarifies...

    Kind Regards,
    Aaron Bennett
    Lead Strategist - MOWG System
    mowg-affiliate.com
     
    Aaron Bennett, May 18, 2009 IP
  10. Sem-Advance

    Sem-Advance Notable Member

    Messages:
    6,179
    Likes Received:
    296
    Best Answers:
    0
    Trophy Points:
    230
    #30
    mowg-affiliate.com

    Now you need to learn why site design is a very important facet in being able to sell something.....

    That site is plain triple thuggly....

    makes my sites look good and I stink at design....

    Nice to see you two tag-teaming your own thread... but if your program were such a money maker, why would you need to make long, drawn-out, convoluted posts??? When you could actually be making $10,000s and actually proving you can :rolleyes:
     
    Sem-Advance, May 18, 2009 IP
  11. ChaosTrivia

    ChaosTrivia Active Member

    Messages:
    2,093
    Likes Received:
    40
    Best Answers:
    0
    Trophy Points:
    65
    #31
    Nice self promotion.
    Google indexes 2-3 pages a day. 60,000 pages would mean at least 20,000 days.
    One has to be a patient guy........
     
    ChaosTrivia, May 18, 2009 IP
  12. copper12

    copper12 Peon

    Messages:
    1,850
    Likes Received:
    25
    Best Answers:
    0
    Trophy Points:
    0
    #32
    Show the sites you have done.
     
    copper12, May 18, 2009 IP
  13. copper12

    copper12 Peon

    Messages:
    1,850
    Likes Received:
    25
    Best Answers:
    0
    Trophy Points:
    0
    #33
    No answer, huh?

    I didn't think so.
     
    copper12, May 19, 2009 IP
  14. itouch resume services

    itouch resume services Peon

    Messages:
    32
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #34
    I'd be interested to see whether Google still accepts the site after a few months, or whether it's immediate traffic followed by... nothing? How long has the site been up, and has it passed the 3-month mark yet with a stable level of traffic? Great food for thought though :)
     
    itouch resume services, May 19, 2009 IP
  15. contentboss

    contentboss Peon

    Messages:
    3,241
    Likes Received:
    54
    Best Answers:
    0
    Trophy Points:
    0
    #35
    Doorway pages are like... 2002 technology. And using translation out of English and back isn't going to help. In fact, it tends to trip the anti-Markov filters nowadays, as you should know.

    A site with 60,000 pages that google regards as 'similar' will certainly get a viewing from a human at the Plex. Who will then act appropriately, including banning the entire domain.

    And all it takes is one spam report from a competitor to trigger that same investigation and a ban.

    At $749 a pop, this could indeed become an expensive lesson for some.

    Still, good luck with it.
     
    contentboss, May 19, 2009 IP
  16. wmsolutions

    wmsolutions Active Member

    Messages:
    29
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    56
    #36
    Hey, guys -

    Sorry for the delay... been busy with programming. We almost have our Excel add-in completed... it completely automates all the work involved in Chapters 10, 17 and 18. It is now possible to do a number of smaller jobs in a single day, or a couple of large ones - with no real work. Just some button pressing.

    So... good points and questions!

    Let me work with some of the bigger concepts:


    1) Remember, perhaps 100 people search for what you sell or offer... nearly every SECOND of every day. The biggest reason they don't all see your site is because your site is outranked by others, for the majority of those searches. The biggest reason for that gets right down to the very, exact words on your webpages.

    For instance, if my page is about 'quick brown fox', and someone types in 'speedy brown fox'... I am related; Google gets that. But I'm probably on page 342 of the results... mainly because 341 or so guys have 'speedy brown fox' on their website. Exact wording ranks higher.

    Another reason they rank higher is because their website is 'more about' the topic than mine. That means they have more unique, Google-friendly pages dealing with the theme. If you want to hit the top of a lot of related searches, you have to have more related pages about the theme than the top 10 guys. That's not a perfect answer, but it's a big measure according to Google's algorithms.

    I hear this often:
    "Don't you have to have backlinks to get ranked for searches?"

    As far as INTERNAL backlinks... or basic page navigation... definitely. Google can't index what it can't find, you know.

    EXTERNAL - Google could find your pages through backlinks on other sites (and of course, there's the Google Sitemap submission as a third way, or Google Analytics, or the Google Toolbar). But this is NOT necessary. Here's how you prove that to yourself: go do a search for something less-than-mainstream. Go for a somewhat long-tail search... 3 words. Maybe 4. Look for front-page results from companies you've never heard of.

    Now, do a search on Google for: "link:domain.com" (without the quotes) to see how many external links there are to that site. Many Google results have no known external links, according to Google! And there - that answers the question about whether external backlinks are necessary. They're not - but folks try to sell you on that so they can justify how they got to the top of searches for their own company name or product name. Usually, they are 'more about their thing' than anyone else anyway, and would be at the top regardless.

    Google ranks a lot of things about your site. Each is a weighted factor of a total equation. You can make up for a low weight in one factor by doubling up on another. And Google tends to lift you when you do at least one factor well, according to them.

    Another question I hear often:

    "Google only wants/expects to index 5 or 8 of your webpages daily. So getting 60,000 webpages indexed takes... how many years?" ;)


    Ah... now THAT is a big misunderstanding:

    Google visits your site as often as it's come to learn you update it (constantly, daily, weekly, monthly). A small site with few changes gets visited infrequently. When you add more pages, it takes a few days (or a month!) for them to appear on Google.

    But... what about large sites, with pages that update OFTEN...?

    Do a search for: "site:cafepress.com" (without the quotes) on Google. How many pages does Google tell you is on that website?

    Mine says 'about 20,000,000' (you know... give or take some 45,000 pages)

    Now: I'll bet that's not what you're seeing for a number... you're seeing somewhat more or less. Perhaps a million more or less.

    So how often does Google update the index on that site?

    Answer:

    REFRESH THAT PAGE EVERY 5 OR SO SECONDS... AND WATCH THE NUMBERS.

    They change in real-time, don't they? They always did!

    Google is constantly re-indexing, dropping, adding new appearing pages, etc... every SECOND... on that site. They drop/add 50,000 or more pages EVERY FEW SECONDS.

    That's how seriously large websites are handled by Google.

    Go find some more: ebay.com, overstock.com, facebook.com... anything huge.
    And watch the same thing happen.

    Kinda puts a different look on just what Google does in the way of indexing, eh?

    So, how long do you think it took Google to index those 20,000,000-ish pages from CafePress the first time around?
    80 years...? (I doubt it, lol !)
     
    wmsolutions, May 22, 2009 IP
  17. wmsolutions

    wmsolutions Active Member

    Messages:
    29
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    56
    #37
    "What makes webpages unique according to Google?"

    By definition, webpages are unique if their content is dissimilar to other pages (on the same site or on other sites).
    You could take this to suggest that Google wants to limit how many pitches you can give... or how many descriptions of parts of your program you can write in different words... or ways to rephrase.

    But that's not it. Google likes that kind of diversity. If someone types 'rapid brown fox' and you have a great title and meta description about your main thing, it can catch visitors' attention after all. They might like it, because it's speaking their language.

    You also know to test - tweak variables, etc. and see the effects it has on folks. Test new headlines... new wording in the opening paragraphs... new page background colors, etc.

    So Google polices content mainly with a basic rule: the content on one page (the wording) must be at least 30% different from the content on any other page of your website. So... EACH of your pages must use wording at least 30% different from ALL OTHER pages on your website.

    So this means, for example, that if you have 150 words on a page, then you'd better make sure that 125-ish of them are variables. That means 125 columns of up to 30 synonyms each, for example. And make sure you have the grammar under control on your model page, and use grammar for the variants that makes sense. Yes, you do this one by one. No, you can't automate it... because yes, it would come out garbled. It takes a couple hours of concentration to get this right. But this is THE hardest part of our process.

    Next, you hit a button that creates all the raw permutations. Then, you hit another button that ASKS YOU what the similarity (difference) needs to be between pages. So you enter, say, 65% for similarity (which is the same thing as 35% difference). And our software will go through ALL the permutations, and ONLY pull up a list of the pages that are at least 35% different from each other.

    You could go for 10% similarity (90% difference) if you want! That's why this project isn't about 'whether it works'... but instead, 'in what ways does it work, and in what ways doesn't it?'

    It WILL work on various levels, especially because you can make everything in the source code variable... different headers, menus, fonts, background colors, table widths/heights, css sheets, feeds, etc.

    Finally, you hit a third button that asks you a few simple questions, then creates the folder structure and file names automatically. Again, remember: these ARE 'unique' pages. All unique against every other one that's included in the filtration (!!)

    It seems to me that Article Generators are primitive, or too user-friendly to be flexible.
    They chop out the very choices you need to be able to make for the sake of excessive simplification.
    They don't do these things (or at least, not well):

    1) Filter results such that you wind up with pages that are a user-defined % different from one another. Not just that the results are all different against a single page, but that the results are ALL unique against each other.

    2) Allow you to develop a massive folder structure - dumping tons of pages into a single folder is a no-no.

    3) They might use a massive folder structure. But how are the folder names generated? Numerically? Redundantly? If Google doesn't 'understand' the common-sense approach to the folder structure, this is a bad idea. We let you decide how you name the folders - usually by using variable keyword combinations that will form your title and meta description tags. That way, the URL (without keyword stuffing), folder structure and filename... and title... and meta tags... and keyword density... and alt tags, title tags, etc. are all jibing (see the sketch below). That's a *BIG* deal.
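    Here's a minimal sketch of that kind of keyword-driven folder and file naming (the slug rules and layout are my own illustration, not MOWG's actual scheme):

    import os
    import re

    def slug(phrase):
        # Lowercase, and collapse anything non-alphanumeric into hyphens.
        return re.sub(r"[^a-z0-9]+", "-", phrase.lower()).strip("-")

    def page_path(root, theme, variant_phrase):
        # Folder name and file name both carry the keyword combination,
        # so the URL agrees with the title/meta tags built from it.
        folder = os.path.join(root, slug(theme), slug(variant_phrase))
        return os.path.join(folder, slug(variant_phrase) + ".html")

    print(page_path("site", "Yellow Hats", "lowest cost clasp-fitted"))
    # site/yellow-hats/lowest-cost-clasp-fitted/lowest-cost-clasp-fitted.html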


    Then you hit a button that auto-generates the coding for the final step, which is going to hard-code all those finalized pages (and place them into the auto-generated folder structure).
    Now you have your few 1,000 webpages (or many 1,000's of webpages) on your own computer, to use in whatever way you wish. From the time you have the variants done, the rest of this process only takes a couple hours to finish. Then do another run. Then another.

    Each can be for a client who will pay up to $2,500 for manual jobs... at least, that's what we charge per project we do for folks.

    So... the question is: 'what kind of strategies are there for all these pages?
    Do we just upload them and cross our fingers, or what?'

    Answer: HECK, no... the LAST thing we would consider is simply putting 60,000 unfiltered webpages onto a domain that runs a daily business and then instantly submitting a sitemap. That's not a good idea... there are 'right' ways to do this. That's NOT one of them.

    Well, the MOWGT intros all this stuff to you. But if you ran with what you learn in there on your own, you'd fall flat and get Google Slapped. The MOWGT does not teach strategy - it teaches the functional basics. You learn NOT to create this project on a regular business website, right? Right. You also learn NOT to submit a Google Sitemap with your earliest projects, either.
     
    wmsolutions, May 22, 2009 IP
  18. wmsolutions

    wmsolutions Active Member

    Messages:
    29
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    56
    #38
    But what CAN or COULD you do...?

    That's the stuff we talk about and grow on in the MOWG Forums, for paid Members (no subscription; the whole MOWG package comes with lifetime access). There is a ton to this, and as more Members get going, there will be all kinds of new applications forming during the next year. It's going to be wild when there are 20 or 50 or more folks in there... all trained over this stuff... discussing it... helping each other with coding issues... sharing experiences, and so forth.

    1) Break the 60,000 webpages across a folder structure with 6 main folders. These will be for 6 domains. Then run a 60%-similarity/40%-difference filtering process on each 10,000-page batch. Imagine THAT possibility.


    2) Filter out 60,000. Get 6,000. Upload 300. Now, every 3rd day, OVERWRITE (not add to, but single-batch-overwrite) those 300 pages with the next 300... then the next... then the next. Point: Google will start to constantly re-index your site. You're there, updating your webpages constantly. Google loves that - do it well and consistently, and your rank shoots up! Then... grow: from 300 pages to 500... do that for a couple weeks... then grow again! The point is that you don't have to manually create the updates to the pages... you have huge folders of quality, unique variants for rewriting your pages on the fly!

    3) Get daring, for a short while so you can learn what new keywords draw your biggest traffic. Aim to upload 60,000 pages. Have interlaced navigation. Let Google discover your pages in spurts. It'll start indexing... and with that, you get a rush of keyword analysis on your visitor tracking system (!!) ... after you do that for a week, and you have lots of new keyword analysis, then PRECLUDE Google from getting touchy: simply pull your pages back down before Google develops an issue with your site. Then selectively put up the pages that BEST grab the keyword combinations you found drew the most traffic!

    4) (This is not advised. It's just for general training over what Google Slap is like for folks who are afraid of the implications...)

    Be as bold as you want:
    Create a website. Upload 60,000 webpages overnight. Google might index 24,000 of them (average) for two or three months before de-listing your website.

    Well, who cares? If what you had to sell was solid, then for the price of a domain name ($10?), you got to sell like mad for 2-3 months, making $1,000's or $10,000's (not bad for a simple $10 investment into each business)!


    Let me put it like an offer - if Google came to you and said:

    "We'll let you index 24,000 webpages for 2-3 months. After that, that's it... we're going to delist you. You'll still have a fully functioning website. The email addresses you captured are still yours. The sales are yours. All the directory listings you put out there are yours, and the forum listings are yours. But... you won't get new customers from us on THAT domain name after the 2-3 months are up. Keep in mind that you could make $10,000 or more in sales during this time."

    Well - I guarantee you my answer to that is simple... heck, yes! Where's the problem?!

    My question would be: "Can I do this with another domain, too?"

    Their answer would be 'yes - as many as you want.'


    So... for those of you who are thinking about how "traumatic" it would be if Google de-listed a domain name you created yesterday for $10, 2-3 months after letting you index a ton of pages... that's about as painful as it will ever get.

    Note: if you ask, "What if all those pages point at my main business site? Won't Google de-list that site also?" The answer is NO. Google's TOS goes over this. They almost NEVER do that, because they have no idea what the relationship is between the sites. They know that program owners can't be responsible for all their affiliates' actions. And they know that everyone is getting advice from anyone out there, which they innocently employ, thinking it's a great idea.

    All Google ever does for first/second-time offending websites: it delists your site and alerts you that you tripped over an algorithm somewhere, and they suggest you 'undo your last change and resubmit your website.'

    Fine... take down that last batch of folders that probably caused the issue. Then resubmit. You're back and running on that domain within a day or two. No problems.

    Taking it further... invest $10 per week (or per day!) to get new domain names. Put them on the same server. Google doesn't care, nor want to investigate, whether they're owned by the same person. Hosting companies put 1,000's of folks on each shared server, you know.

    Then CASCADE-CREATE these massive projects. Create a whole slew of websites that simply try to make back the $10 expense of each domain name (yet they'll each average $20,000 or so before Google delists them??)

    If Google changes some algorithms, we'll change our process to suit. We're not trying to 'BEAT' Google. We're trying to work with them, to the full extent of our elbow room.

    Having a forum full of folks who make this an obsession over this next year will be WILD.

    For every 10,000 individuals who write up one webpage each... there's a guy out there generating 10,000 Google-friendly webpages at once. Get in on THAT action. You'll never rank high anymore with manually-written webpages and no affiliates. No one wins that way now. There's FAR more auto-generated content on the web than manually-written content; learn how to USE that to your own advantage. And grow with that new way of internet marketing.

    What better way can there possibly be to advertise, than to saturate Google with your offers...? Find a way to do THAT. And that's what we're finding multiple ways to do and discuss and develop upon as we go.

    The MOWG is powerful, but still in its infancy. It's *wild* to imagine where we'll be in a year...

    Ok...!

    If you haven't gotten the Tutorial yet, get it... check it over. It's fully free to access for 2 days. After that, it's just $8 as a reference tool. And by then, you'll know whether you want to buy it or not. No problem; no risk; no spending money on something you weren't sure about. You have at least 48 hours to review a 10-hour Tutorial completely without charge.

    I hope this helps, and have a great morning/evening/night! :)
     
    wmsolutions, May 22, 2009 IP
  19. Darkness

    Darkness Peon

    Messages:
    377
    Likes Received:
    10
    Best Answers:
    0
    Trophy Points:
    0
    #39
    Basically, you're taking your content and generating the same content using synonyms, aren't you?
     
    Darkness, May 22, 2009 IP
  20. lindamood1

    lindamood1 Active Member

    Messages:
    1,705
    Likes Received:
    5
    Best Answers:
    0
    Trophy Points:
    78
    #40
    No, it's not like that. More pages doesn't automatically mean more visits, but you can plan more keywords for each page, and that helps you increase traffic.
     
    lindamood1, May 22, 2009 IP