You can create a search engine friendly website; only some effort is required. Creating a search engine friendly website does not necessarily mean that you will get a top listing for a particular keyword or keyword phrase. It is, however, a necessary first step if you want to rank anywhere near the first few pages of the search engine results. A site that is not search engine ready may not even appear in the results for any query.

A. Add Text to Your Images, Flash and Videos

For example, you should describe each picture in the "alt" text for the image. For those who don't know what I'm saying, images are put into a web page using HTML code like the following:

<img src="name-of-image.jpg" alt="Picture of a search engine ready website">

B. Create Relevant Title Tags

C. Validate HTML Code

D. Use Straight HTML Navigation Links on Your Website

E. Remove Hidden Text
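To illustrate points B and D, here is a minimal sketch (the page title and link URLs are made-up examples):

```html
<head>
  <!-- A title that describes this specific page, limited to a keyword or two -->
  <title>Search Engine Friendly Design - Example Widgets Co.</title>
</head>
<body>
  <!-- Plain HTML links that spiders can follow, instead of script-driven menus -->
  <ul>
    <li><a href="/about.html">About Us</a></li>
    <li><a href="/products.html">Products</a></li>
    <li><a href="/contact.html">Contact</a></li>
  </ul>
</body>
```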
Flash on your site can really attract visitors, but it is not spider-friendly. Robots will not index a site's Flash content. http://www.seguesites.com/?pageID=21
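One common workaround is to put alternative HTML content inside the <object> element that embeds the movie: spiders (and visitors without the Flash plug-in) see the fallback text instead. A sketch, with a made-up file name:

```html
<object type="application/x-shockwave-flash" data="intro.swf" width="400" height="300">
  <!-- Fallback content: indexable text shown when the Flash movie cannot load -->
  <p>We build search engine friendly websites.
     <a href="/services.html">See our services</a>.</p>
</object>
```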
You know, I could just point you to the SEO thread in my signature here, but I'm not going to.

This is not used for search engine optimization, but for accessibility. The alt attribute of the IMG element is used to identify what the image is for those who cannot see the image, such as those who are using a screen reader, a text-based browser such as Lynx, or even dialup users who have images disabled. Search engine spiders do pick it up, but it's really not worth much overall from an SEO point of view (unless you're trying to rank well in an image search).

Relevant to what? The TITLE element (<title></title>) is used to identify the title of the Web page, nothing more. Thankfully we can also use it to associate the page with our Web sites, which is an acceptable use. If you're going to include a keyword or two, just limit it to that: one or two keywords. But at the end of the day, remember that you're making Web sites for your visitors, not your bank account or portfolio.

As much as the Web developer in me hates to say this, it's just not important to validate your code from a search engine's perspective. If you don't believe me, read this recent blog entry by Google employee Matt Cutts (you can find other blog posts of his which say the exact same thing). Don't get me wrong, you should validate your code, since it not only helps future-proof your site should the standards and specifications change, but it'll also help prevent your sites from breaking when browsers are updated (the fiasco over Mozilla Firefox 2.0.0.8 breaking source-code-order layouts such as the Holy Grail of CSS aside, of course).

As for the last two points, I cannot stress those enough. The former is more of an accessibility requirement than anything, while the latter is purely within the sole domain of SEO.
Oh, and KNEB, if you need to use an XML sitemap to get your Web pages crawled and indexed by the search engine spiders, your site has much bigger problems and should be rebuilt from the ground up (note: I am not talking about you or your sites personally; I am using the words "you" and "your" in the general, non-specific sense). And for those who think that a sitemap will help them get spidered or indexed faster, it won't. Search engines will do that when they feel like it. A sitemap is just an accessibility and usability tool, nothing more.
Here are some effective techniques to create a search engine friendly website:

1. First, use your targeted keywords (business keywords) well in your home page content, and describe the services you provide on the home page itself rather than only on other pages.

2. Use the image alt attribute on your site logo to describe the image, use the H1 tag for the main heading, and hyperlink your best keywords via anchor text on the home page.

3. Create an HTML sitemap for visitors and an XML sitemap for search engine robots.

4. Publish new articles weekly about your business or the products you provide, and generate an RSS feed for those articles for RSS readers. If your site is frequently updated with articles, information, or news, search engines will pick it up easily and it will stand out in their indexes.

Good luck.
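For point 3, the XML sitemap for robots follows the sitemaps.org protocol. A minimal sketch, with a made-up URL and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want listed -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

The HTML sitemap for visitors, by contrast, is just an ordinary page of plain links.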
As much as possible, avoid using scripts and iframes. Avoid duplicate content. Check your links to find out which are active and which are not, and fix any dead links.
Because iframes (and frames in general) cause accessibility issues, since some people (and search engine spiders) cannot access the content contained in them, and also because search engine spiders cannot crawl text that is dynamically generated by client-side scripts (such as JavaScript).
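For example, a link written out by client-side script is invisible to a spider that does not execute JavaScript, while the equivalent plain HTML link is crawlable:

```html
<!-- Spiders that don't run JavaScript never see this link -->
<script type="text/javascript">
  document.write('<a href="/products.html">Products</a>');
</script>

<!-- The same link in plain HTML can be crawled and followed -->
<a href="/products.html">Products</a>
```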
As usual, Dan is on point. This is the best post in this whole thread, and there's not much more to add. +rep
I think you forgot to add unique content. This will help your site get crawled by search engine bots.
I suggest you read http://forums.digitalpoint.com/showthread.php?t=413544 to learn more about SEO.