Hello, I am planning to create a social networking site that dynamically scrapes snippets from other sites/forums/blogs. The problem is that many pages on this site (including the homepage) will have their content (90%+ of it) update and change almost constantly, so every time you hit refresh on my homepage you'll most likely see different dynamic content (snippets, feeds, news, entries, posts, bookmarks, ...). My question is: will Google penalize/ban my website because its content is different on almost every refresh?
I wouldn't have thought so; other social networking sites don't seem to have a problem. I would, however, consider how you'll link to archived content so that Google can index everything within your site (i.e. some kind of permalink).
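To illustrate the permalink idea, here's a minimal sketch (assuming Flask and an in-memory store; the slug format and route are just placeholders): each snippet gets a stable URL that always serves the same archived content, even though the homepage keeps changing.

```python
# Minimal sketch: a permanent page per snippet so crawlers can index
# archived content even while the homepage changes on every refresh.
from flask import Flask, abort

app = Flask(__name__)

# Hypothetical in-memory store; a real site would use a database.
snippets = {
    "2012-05-01-example-snippet": {
        "title": "Example snippet",
        "body": "Archived content that never changes once published.",
    },
}

@app.route("/snippet/<slug>")
def snippet(slug):
    entry = snippets.get(slug)
    if entry is None:
        abort(404)
    # The permalink always serves the same archived content,
    # so Google has something stable to crawl and rank.
    return "<h1>%s</h1><p>%s</p>" % (entry["title"], entry["body"])
```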
I'm not sure you want to scrape and update data in real time; this should really be a scheduled task, run hourly or daily.
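Something along these lines is what I mean by a scheduled task. This is only a rough sketch: the source URL, the CSS selector, and the save_snippet helper are all placeholders for whatever your site actually needs, and a cron job would do the same thing as the sleep loop.

```python
# Minimal sketch of an hourly scheduled scrape instead of scraping
# on every page view. URLs, selectors, and storage are assumptions.
import time

import requests
from bs4 import BeautifulSoup

SOURCES = ["http://example.com/feed-page"]  # hypothetical source URLs

def save_snippet(title):
    # Placeholder for writing the snippet to your own database.
    print("would store:", title)

def scrape_once():
    for url in SOURCES:
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        # Pull whatever snippet you need; this selector is made up.
        for post in soup.select(".post-title"):
            save_snippet(post.get_text(strip=True))

if __name__ == "__main__":
    while True:
        scrape_once()
        time.sleep(60 * 60)  # run hourly; a cron entry would work just as well
```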
Scraped content is only part of my content; the rest of the dynamic content is user-created. Scraping will be in real time from other sites I own, and scheduled for sites I don't own.
Google will not punish you for refreshed content on your website. All of the search engines love new and fresh content.
Don't try to trick the search engines; they are very smart nowadays. Just don't overdo the optimization and keep the number of links on a page under 100, and all shall be fine.
External links on any of my site's pages will never come anywhere near 100. I'm not so sure about internal links, but I don't think that should be a problem, should it? Also, each new entry will have its own permalink and its own page, so at some point (I don't know how successful this site is going to be) I'd expect hundreds to thousands of new pages being added to the site daily.
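With that many new permalink pages appearing every day, I'm thinking of regenerating a sitemap so Google can discover them without crawling the ever-changing homepage. A rough sketch of what I have in mind (the domain and URL pattern are just placeholders):

```python
# Minimal sketch: write a sitemap listing newly added permalink pages.
# Domain, URL pattern, and slug list are assumptions for illustration.
from datetime import date
from xml.sax.saxutils import escape

def write_sitemap(slugs, path="sitemap.xml"):
    today = date.today().isoformat()
    with open(path, "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for slug in slugs:
            f.write("  <url>\n")
            f.write("    <loc>http://example.com/snippet/%s</loc>\n" % escape(slug))
            f.write("    <lastmod>%s</lastmod>\n" % today)
            f.write("  </url>\n")
        f.write("</urlset>\n")

write_sitemap(["2012-05-01-example-snippet", "2012-05-01-another-entry"])
```

A single sitemap file is limited to 50,000 URLs, so at thousands of new pages a day I'd eventually split it into multiple files under a sitemap index.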