
Blogs not indexing... Does hosting affect the indexing of your blog?

Discussion in 'Google' started by zebno, Sep 24, 2009.

  1. #1
    A couple of my blogs have not been getting indexed for the last 3 weeks. Last month I purchased an established domain with good PR; in the past the domain was used for a website in the same niche I want to blog about. I installed WordPress and started writing unique content on it. The first week it didn't show any results in Google, which I took as normal because I had changed the host around that time, so I thought I'd give it some time. To get my blogs indexed, a fellow DP member suggested adding a robots.txt file

    User-agent: *
    Disallow:
    Sitemap: YOUR SITEMAP URL

    to the root directory (which I did), submitted a few of the posts to Digg and Facebook to get it indexed faster, and submitted the sitemap to Webmaster Tools as well.

    Google Webmaster Tools shows 14 of 20 pages indexed, but there are still no results when I search for the blog in Google.

    In the search results it shows the same domain with a different extension: instead of www.myblogurl.org it shows the .com (another website). My blog is still not indexed in Google, not even the main page.

    Do I need to change the host? What else do I need to do to get it indexed? I am really curious about it now; it has been more than 3 weeks and Google is not picking up my blogs.


    Your valuable Suggestions???
     
    zebno, Sep 24, 2009 IP
  2. axlarry

    axlarry Notable Member

    #2
    No, hosting won't be a problem. Just wait and build more links to your blog.
     
    axlarry, Sep 24, 2009 IP
  3. touches93

    touches93 Well-Known Member

    #3
    It might be that you purchased a penalized or banned domain.
     
    touches93, Sep 24, 2009 IP
  4. longtime

    longtime Well-Known Member

    #4
    I have this kind of problem too. My site is "indexed", but it doesn't show up when I search. Probably have to wait as axlarry said.
     
    longtime, Sep 24, 2009 IP
  5. bermuda

    bermuda Peon

    #5
    Sometimes the blogs are regularly visited by Google and the pages are indexed, but you may not find them. Gradually, as the site gets older, more and more pages will be seen.
     
    bermuda, Sep 24, 2009 IP
  6. theapparatus

    theapparatus Peon

    #6
    Actually, we had a case where hosting was a problem: for some reason we never did figure out, the robots.txt file for all of the sites on one specific server could not be read by Google, and they started delisting the sites. Apache was set up correctly for *.txt files, there was nothing in the error logs, and the web server logs actually showed the Google spiders reading the robots.txt files correctly. They just never went any further than that.

    It was a single cPanel server, way back when, that we had leased from (I believe, but don't quote me) Rackshack. We finally canceled it and moved the sites to another datacenter, and they indexed fine.

    The sites indexed fine with Yahoo and MSN but not with Google. It was the strangest thing....

    Anyway, the domain in question would be helpful so we can see what's occurring. We may see something that you may have missed.
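
    If anyone wants to rule out that kind of fetch problem on their own server, one quick sanity check is to pull and parse the robots.txt the same way a well-behaved crawler would. A minimal sketch in Python 3 (www.example.org is just a placeholder domain; this only tells you the file is reachable and parseable from the outside, not what Google itself is doing):

    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("http://www.example.org/robots.txt")  # placeholder, swap in your own domain
    rp.read()  # fetches and parses the file the way a polite crawler would

    # True means the rules, as served, allow Googlebot to crawl the front page
    print(rp.can_fetch("Googlebot", "http://www.example.org/"))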
     
    theapparatus, Sep 24, 2009 IP
  7. axlarry

    axlarry Notable Member

    #7
    Correct me if I'm wrong, but robots.txt is more of a blocker. You only need the file if you have restricted files/folders. If Google or other robots can't find it, they will simply try to crawl all your pages. You don't need the file if you don't have restricted content.
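
    For example, the only real difference between an "allow everything" file and a blocking file is the Disallow line. An allow-everything file looks like

    User-agent: *
    Disallow:

    while a blocking file lists the folders you want crawlers to stay out of, e.g. (the /private/ folder is just an illustration):

    User-agent: *
    Disallow: /private/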

    IMO, as long as you have good uptime with your hosting, you're fine. What I mean is that IP location etc. won't be a problem.

    The best thing you can do is go to your Webmaster Tools and check whether there are any errors you need to fix. Other than that, like I said, just wait and build more quality links.
     
    axlarry, Sep 24, 2009 IP
    theapparatus likes this.
  8. Sxperm

    Sxperm Notable Member

    #8
    Googlebot has its own pattern when it crawls each site/page. You have to add content regularly and wait until the bot learns the pattern in which your content is added. If you add one piece of unique content each day, it will come, crawl, and cache once a day. :) The more frequently you add unique content, the more frequent the visits from Googlebot. Please don't forget that I emphasized "unique content", not just scraped content.

    Hosting could come into play if your host is not reliable. As I said, the bots record a site's pattern to set their own. If your site has frequent downtime, it is possible the bots will not come and crawl as regularly as they should.
     
    Sxperm, Sep 24, 2009 IP
  9. markhutch

    markhutch Peon

    #9
    My guess is that Google detected that the content of that domain had changed and then discounted the PR because it is under new ownership. I'm no expert, but I would guess that Google does not approve of transferring PR between different owners of the same domain name, since the new owner has not proven they are as trustworthy as the previous one. As an example, CNN.com is currently a PR 10 rated website on Google. If CNN sold their website to someone else, my guess would be that Google would not continue to give that same domain name its highest rating if it was turned over to a completely different company.
     
    Last edited: Sep 24, 2009
    markhutch, Sep 24, 2009 IP
  10. aamigallery

    aamigallery Guest

    #10
    If your site is hosted on the same IP as another site that has been banned or penalized by Google, then you are bound to get penalized. It's good to host the site on a different C-class IP for better SEO.
     
    aamigallery, Sep 25, 2009 IP
  11. omfans

    omfans Peon

    #11
    Sure, try to find out which sites are hosted on the same IP.
     
    omfans, Sep 25, 2009 IP
  12. zebno

    zebno Well-Known Member

    #12
    The domain was not penalized or banned; it was a PR 3 domain. It is still PR 3, but the new content is not being indexed in Google.
     
    Last edited: Sep 25, 2009
    zebno, Sep 25, 2009 IP
  13. zebno

    zebno Well-Known Member

    #13
    Thanks to all, I just figured out the mistake. :) In the WordPress admin settings, the blog visibility was set to "block search engines"; I just changed it.
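
    For anyone who runs into the same thing later: when that option is ticked, WordPress adds a robots meta tag to every page, something like this (the exact markup can vary by version):

    <meta name='robots' content='noindex,nofollow' />

    So even a perfect robots.txt and sitemap won't help until it is switched off.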
     
    Last edited: Sep 25, 2009
    zebno, Sep 25, 2009 IP
  14. theapparatus

    theapparatus Peon

    #14
    Actually, that's not totally correct. You're right that robots.txt tells spiders to ignore certain directories and pages, and that if it's not there, they'll index the entire site. In our case, since the spiders couldn't read that server's robots.txt files, that's as far as they went; they didn't spider any pages within those sites. And since they weren't spidering, the sites started getting delisted.

    zebno, glad you got it figured out. That's one of the reasons why letting folks know what the URL is matters: one of us might have spotted the meta tag.
     
    theapparatus, Sep 25, 2009 IP
    zebno likes this.