This is a 3-hour seminar Google is giving at Catholic University of America in Washington, D.C. on Monday, September 18, from 1:00-4:00 p.m. for $30.00:

What You'll Learn
- Understanding how search engines work
- Effectively getting website pages into the indexes of search engines
- Getting your site appropriately listed when users type in relevant keywords

Who Should Attend
Web staff of all experience levels and others who want to better understand how to tweak sites to get them more optimally listed in search engines.

Seminar Outline
- Understanding search engines and search engine results (25 minutes)
- Building crawlable, indexable, and well-ranked sites (45 minutes)
- Tools, tricks, and troubleshooting (35 minutes)
- Break (10 minutes)
- Webspam: what it is, how to avoid it, how to report it, etc. (10 minutes)
- Live site evaluations (45 minutes)
- Questions & Answers (45 minutes)

Level of Class: All levels (beginner to experienced)

Class Format: The seminar will be a combination of lecture and discussion.

http://www.searchenginejournal.com/?p=3767
I attended this seminar, given by Adam Lasnik, who is in charge of search engine marketing at Google. A quick look around the room suggested about 300 participants, two-thirds of whom were government employees, along with a number of AOL people and the rest of us. Adam made it clear that he was there to serve government employees. Unless I state otherwise, the following information applies to Google only:

- A 301 redirect is the only redirect Googlebot doesn't penalize.
- Adam suggested making your 404 a regular site page so that the visitor still has navigation links.
- "Nofollow" is OK.
- JavaScript is problematic for Googlebot to read, e.g., drop-down menus. Use CSS for drop-downs. The bot will probably be able to read small JavaScripts.
- The bot has trouble with frames; iframes are OK. Includes are server-side and no problem for the bot.
- "Stay away from subdomains." I'm quoting Adam. The bot has major problems with subdomains. For example, it may think the pages are duplicates because they have the same URL, and it will move on and not index the site, or penalize it.
- Googlebot is clueless about dynamic pages with id= in the URL.
- The job description of Googlebot is: crawling, indexing, ranking.
- The bot cannot get past logins. Adam strongly suggested not using logins, because Google's research has shown that users will leave the site even if the login is free. If you must use a login, provide free summaries for visitors to click on, to encourage them to continue with the login.
- Use a 301 redirect for two of the following: http://www.domain.com, http://domain.com, http://www.domain.com/index.html, so that the bot doesn't penalize you for having duplicate sites. It doesn't matter which two. This will consolidate your PR. (A small sketch of this appears after these notes.)
- Adam said we give PR more weight than Google does. He said PR is only one factor.
- No more than 100 outbound links on a page.
- A page should not be too long, because many visitors still use dial-up and will not wait around for the page to load.
- Use alt tags for visually impaired readers and for the bot. The bot cannot read words on images. Templates that use images for text cannot be read by the bot. The bot cannot read Flash. The bot is happiest with a text site. Adam said sometimes you have to give up good looks for function.
- "Noindex" is not reliable. Test to make sure the command is working. "You'd be surprised," said Adam. He preferred a robots.txt command.
- I think he said navigation links at the bottom of the page are helpful to the bot.
- Don't use "click here" as link text, because the bot interprets "click here" to mean "go to Adobe." This has to do with Adobe's historical use of "click here."
- Automatic link builders don't work. Google sees buying and selling links, and reciprocal links, as an advertising business model. Links need to be quality links to count; otherwise you are wasting your money. A quality link is any link that "enhances the user's experience." A link carries more weight if it is in content rather than on a links page, but if it is a quality link on a links page, it is still good. One quality link is better than hundreds of low-quality links.
- copyscape.com shows where your copy is being used.
- Link building is critical. Blog and forum links don't count, but by placing links on blogs and forums, the readers may add your link to their sites, which may count if the link is relevant to your site.
- Google Analytics uses first-party cookies. Google does not keep your specific stats, but does aggregate them.
- Yahoo! is best for finding your backlinks. Google has chosen not to put resources into this function.
- RSS feeds are third-party links, so they don't increase your PR.
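Two of these notes are easy to show concretely: the 301 redirect that consolidates http://domain.com and http://www.domain.com into one site, and the 404 that behaves like a regular page with navigation. Here is a minimal sketch in Python with Flask; the host name and the 404 markup are placeholders, and the same idea can be set up in whatever server or framework a site actually runs on.

```python
from flask import Flask, redirect, request

app = Flask(__name__)

CANONICAL_HOST = "www.domain.com"  # placeholder: the one host you want indexed


@app.before_request
def redirect_to_canonical_host():
    # Permanently (301) redirect the bare domain to the www host so the
    # bot sees a single site instead of two "duplicate" sites.
    if request.host != CANONICAL_HOST:
        canonical_url = request.url.replace(request.host, CANONICAL_HOST, 1)
        return redirect(canonical_url, code=301)


@app.errorhandler(404)
def page_not_found(error):
    # Serve the 404 as a regular site page so the visitor still gets
    # navigation links instead of a dead end (placeholder markup).
    return (
        '<h1>Page not found</h1>'
        '<p><a href="/">Home</a> | <a href="/sitemap">Site map</a></p>',
        404,
    )
```

The particulars don't matter; the point is the 301 status code (the one redirect the bot doesn't penalize) and a 404 page that still looks and navigates like the rest of the site.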
More notes from the seminar:

- H1, H2, and H3 tags help PR.
- "Rewrites" are OK.
- Content management systems are problematic because of duplication, and they are easily spammed.
- Age per se is not a factor in PR, but the older the site, the better the odds that you have better content and more quality backlinks.
- Don't use software packages that check the position of your keywords, etc. throughout the day. This ties up Google's resources and you will be penalized.
- URL keywords are a factor in PR. The bot likes descriptive URLs. Adam recommended using dashes because the bot finds it hard to read combined words. He cautioned not to spam the URL. The bot likes five or fewer relevant words separated by dashes; mortgage-porn-disease-attorney.com for a lollipop site will annoy the bot and you may be dropped from the index. Google is putting resources into controlling spam on all levels. (A small slug-building sketch follows these notes.)
- The Title and Description meta tags are the most important, followed by anchor text. Adam emphasized that the Title tag should be a shorter version of the Description tag rather than one or two words. Use keywords and a description of the page. The bot uses both the Title and the URL to return results (not necessarily the only factors). Try to make the title what your visitor would most likely use to search.
- With anchor text, remember that the bot has a hard time with combined words. If the bot is having a hard time, it will move on. If you annoy it, it will penalize you.
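To make the descriptive-URL advice concrete, here is a small sketch of a slug builder in Python. The stopword list and the five-word cap are just illustrations of the guidance above, not anything Google publishes.

```python
import re

def make_slug(title, max_words=5):
    """Build a short, dash-separated URL slug from a page title."""
    # Keep only letters, digits, and spaces, then split into words so that
    # nothing ends up as a hard-to-read combined word.
    words = re.sub(r"[^a-z0-9\s]", "", title.lower()).split()
    # Drop filler words that add nothing to the URL (illustrative list).
    stopwords = {"a", "an", "and", "of", "the", "to", "for", "in"}
    keywords = [w for w in words if w not in stopwords]
    # Five or fewer relevant words, separated by dashes.
    return "-".join(keywords[:max_words])

print(make_slug("Understanding How Search Engines Work"))
# -> understanding-how-search-engines-work
```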
Hi, it's a good tutorial. I like it and will save it on my computer for further reading. I also want to ask you one thing: does submission to directories really matter in Google?
Thanks. I updated it with minor changes since you looked at it. The topic of Directories didn't come up during the seminar.
This was a surprise to me! If subdomains are to be avoided, then it is a problem for many sites. I think Google should not penalize a site for having subdomains, because it can tell from the main domain whether it is looking at a duplicate site or a subdomain. I am totally lost here about what to do, because my site does require subdomains: I target different cities with each subdomain, and I do not see a better way of doing this than subdomains. I do not want to be penalized by Google for duplicate content; each subdomain will have different content in it, but the main core will be the same, like the layout, wording, etc. This is a big issue for me. Can anyone clarify whether I should worry or not? Also, I could not understand the point about no more than 100 outbound links. Please, can someone elaborate on this?
You should not link to other sites more than 100 times on a page. Please keep in mind that these are only Google's rules. There are other search engines that may have different rules, and as you said, there may be a good reason for you not to follow Google's guidelines. The bot may not be able to tell that your subdomain is different. In your case, maybe it can. The ultimate judgment calls are yours. You have to weigh this and that and make the tradeoffs.
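If you ever want a rough check of that 100-link guideline, counting the links on a page that point off your own host is easy to script. This is just a sketch using Python's standard library; the URL at the bottom is a placeholder, and a real SEO tool would be more thorough.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse
from urllib.request import urlopen


class LinkCollector(HTMLParser):
    """Collect the href of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)


def count_outbound_links(page_url):
    """Return the number of links pointing to a host other than the page's own."""
    html = urlopen(page_url).read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(html)
    page_host = urlparse(page_url).netloc
    return sum(
        1
        for href in collector.hrefs
        if urlparse(href).netloc not in ("", page_host)
    )


print(count_outbound_links("http://www.domain.com/"))  # placeholder URL
```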