I agree with Ankita, but off-page techniques depend heavily on your on-page work. For off-page activity you can do directory submission, forum posting, blog commenting and social bookmarking. One more thing: create accounts on different social sites. That helps a lot in increasing traffic to your site.
These are good ways to get traffic:
1. Social bookmarking
2. Directory submission
3. RSS feed submission
4. Forum posting and commenting
5. Blog posting and commenting
6. Press release submission
7. Social networking submission
8. Podcast posting
9. Atom posting
10. Classified ads posting
11. Video posting
12. Article submission
Hi, first of all download the PageSpeed add-on for Google Chrome on your PC, then check the page speed of your website. The add-on suggests what to do to decrease the page load time, and the most crucial issues are shown in red. Fix those first and the load time should improve. Enjoy.
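If you prefer to check from a script rather than the browser add-on, here is a minimal sketch that calls Google's PageSpeed Insights API. The endpoint is real, but the response fields I read (lighthouseResult, categories, audits) are assumptions that may need adjusting against the current API documentation, the example URL is a placeholder, and for regular use you would add an API key.

# Minimal sketch: ask the PageSpeed Insights API for a page's performance
# score and its failing audits. Response field names are assumptions and
# may need adjusting; the page URL below is a placeholder.
import json
import urllib.parse
import urllib.request

PAGE_URL = "http://www.example.com/"  # hypothetical page to test
API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

query = urllib.parse.urlencode({"url": PAGE_URL, "strategy": "desktop"})
with urllib.request.urlopen(f"{API}?{query}") as resp:
    data = json.load(resp)

# Lighthouse reports performance as a 0-1 score; multiply by 100 for display.
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Performance score for {PAGE_URL}: {score * 100:.0f}/100")

# Low-scoring audits are roughly what the add-on highlights in red.
for audit in data["lighthouseResult"]["audits"].values():
    if audit.get("score") is not None and audit["score"] < 0.5:
        print("Fix:", audit.get("title", "(untitled audit)"))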
Your site is really too big. Try to get it down as much as possible to optimize load times. Google doesn't like pages that load very slowly: it reduces value for the user and increases bounce rate. I can crawl your site for you if you want, to see how Google views your page and to find any HTML errors. Just let me know.
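If you want a rough idea of page weight yourself before trimming anything, here is a minimal sketch (standard library only) that downloads a page, reports the size of the HTML, and lists the images and scripts it references; the URL is a placeholder, not a real site.

# Minimal sketch: estimate page weight by measuring the HTML and listing
# the images/scripts it pulls in. Standard library only; the URL below is
# a placeholder.
from html.parser import HTMLParser
import urllib.request

PAGE_URL = "http://www.example.com/"  # hypothetical page to inspect

class AssetCollector(HTMLParser):
    """Collect the src attributes of <img> and <script> tags."""
    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        if tag in ("img", "script"):
            for name, value in attrs:
                if name == "src" and value:
                    self.assets.append(value)

with urllib.request.urlopen(PAGE_URL) as resp:
    html = resp.read()

print(f"HTML size: {len(html) / 1024:.1f} KB")

collector = AssetCollector()
collector.feed(html.decode("utf-8", errors="replace"))
print(f"Referenced images/scripts: {len(collector.assets)}")
for asset in collector.assets:
    print("  ", asset)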
Off-page SEO includes directory submission, social bookmarking, article submission, forum posting and classified ads. With these we can get more backlinks and more traffic to the website.
In off-page SEO, directory submission, article submission, social bookmarking, forum posting, blog commenting and blog creation are all important.
Take the following robots.txt file for example:

User-agent: *
Disallow: /cgi-bin/

The above two lines, when inserted into a robots.txt file, inform all robots (since the wildcard asterisk "*" character was used) that they are not allowed to access anything in the cgi-bin directory and its descendants. That is, they are not allowed to access cgi-bin/whatever.cgi or even a file or script in a subdirectory of cgi-bin, such as /cgi-bin/anything/whichever.cgi.

If you have a particular robot in mind, such as the Google image search robot, which collects images on your site for the Google Image search engine, you may include lines like the following:

User-agent: Googlebot-Image
Disallow: /

This means that the Google image search robot, "Googlebot-Image", should not try to access any file in the root directory "/" or any of its subdirectories. This effectively bans it from getting any file from your entire website.

You can have multiple Disallow lines for each user agent (i.e., for each spider). Here is an example of a longer robots.txt file:

User-agent: *
Disallow: /images/
Disallow: /cgi-bin/

User-agent: Googlebot-Image
Disallow: /

The first block of text disallows all spiders from the images directory and the cgi-bin directory. The second block disallows the Googlebot-Image spider from every directory.

It is also possible to exclude a spider from indexing a particular file. For example, if you don't want Google's image search robot to index a particular picture, say mymugshot.jpg, you can add the following:

User-agent: Googlebot-Image
Disallow: /images/mymugshot.jpg

Remember to add the trailing slash ("/") if you are indicating a directory. If you simply add

User-agent: *
Disallow: /privatedata

the rule is a plain prefix match, so it blocks both a file named /privatedata and everything under a /privatedata/ directory.
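To check how such a file affects a given crawler, Python's standard urllib.robotparser module can answer "may this user agent fetch this URL?". Here is a minimal sketch using the longer example above; the host name is a placeholder.

# Minimal sketch: test robots.txt rules with Python's standard
# urllib.robotparser. The rules mirror the longer example above;
# www.example.com is a placeholder host.
import urllib.robotparser

rules = """
User-agent: *
Disallow: /images/
Disallow: /cgi-bin/

User-agent: Googlebot-Image
Disallow: /
""".splitlines()

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules)  # or set_url("http://www.example.com/robots.txt") then read()

# A generic crawler may fetch normal pages, but nothing under /cgi-bin/ or /images/.
print(parser.can_fetch("*", "http://www.example.com/index.html"))        # True
print(parser.can_fetch("*", "http://www.example.com/cgi-bin/test.cgi"))  # False

# Googlebot-Image is banned from the whole site by its own block.
print(parser.can_fetch("Googlebot-Image", "http://www.example.com/pic.jpg"))  # False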