Hi there, could you please help me with a question I have in mind? What if I put a network of sites online with low-quality articles (not duplicate, just low quality), and then start updating each and every site with new, rewritten content? Will that help, or will I get sandboxed?
I think the bots can't tell what's low quality, only what's duplicate. What is that old saying? One man's trash is another man's treasure... LOL
I dunno... what IS that old saying? You'll shoot your eye out? Can't teach an old dog new tricks? I give up...
Don't do that. It's not about quantity but quality. Build the first site, optimize it, and then move on to the second. Getting off to a good start is essential when you launch a website.
In my opinion it would be much better to concentrate on one site rather than building ten sites with duplicate content. That's not going to help the cause.