Basically, Google has indexed the URL of my server admin page. The format is https://www.example.com:2082. None of the options in my Google Webmaster Tools will remove it, and I don't have a way to edit any of the files in that folder to serve a 404, add a robots.txt, etc. Any ideas?
That is a bit of an awkward page to index; I'm not sure why they would. In Google Webmaster Tools, you should be able to remove the URL through Tools --> Remove URLs. Enter the URL and wait, and they should handle it. Hope I helped.
This might help: http://www.google.com/support/webmasters/bin/answer.py?answer=35301#exclude_website
You can only remove pages with the http prefix that way, not https, and for them to qualify they need to return a 404 error or be disallowed with the robots.txt file. All of those require being able to add a robots.txt file, a meta tag, or an .htaccess to fake a 404 error, which I cannot do. robots.txt is not an option, because this is on a shared solution and I do not have FTP access to the portion of the server where the control panel lives.
Check this: http://www.google.nl/search?q=remove+https+from+google I think the best solution, if you do not have root access to your website, is to use the "noindex" meta tag. P.S. How can you access Google Webmaster Tools if you don't have FTP access to the main directory of your website? How did you verify your website without uploading a special Google key HTML page?
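For reference, a "noindex" directive is just a meta tag placed in the page's <head> — though as noted, this only works if you can actually edit the page:

```html
<!-- Tells crawlers not to include this page in their index -->
<meta name="robots" content="noindex">
```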
Reading > You. I can't add a meta tag anywhere, because I do not have FTP access to it. Also, Google Webmaster Tools requires you to verify on the HTTP portion of your site, which I do have access to. The link in question is HTTPS AND on a different port. Can you read either?
Did you know that Google's Remove Content tool in Webmaster Tools is not HTTPS- or WWW-specific? Meaning, if you use the tool to remove all your HTTPS pages from the Google index, it will also remove your HTTP version. And if you remove the WWW version, it will also remove the non-WWW version from Google. A webmaster learned that the hard way, and we see his story in a Google Groups thread. In short, he tried to remove all his HTTPS pages from the Google index, only to find that his whole site was removed. It is a good thing Google has an easy way to reinclude content within 90 days. JohnMu of Google basically apologized for the misunderstanding; he said: (http://www.seroundtable.com/archives/017596.html) Also check this: http://groups.google.com/group/Goog.../browse_thread/thread/5bff0c899c31197b/?pli=1 Hope it helps.
Matt recently suggested these two options: one, using an .htaccess file to password-protect the pages; two, using the URL removal tool in Webmaster Tools. Try them.
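For anyone else reading, the password-protection option is a standard HTTP Basic Auth setup in .htaccess. A minimal sketch — the file paths and realm name here are placeholders, not the OP's actual setup:

```apache
# .htaccess in the directory to protect
AuthType Basic
AuthName "Admin area"
AuthUserFile /home/example/.htpasswd
Require valid-user
```

The .htpasswd file itself is created with Apache's `htpasswd` utility. Of course, this still requires write access to that directory, which the OP says he doesn't have.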
You need to get the FTP details to access your server, or at least send the files to the responsible person. You can use both a robots.txt file to disallow indexing of the https URLs and passwords to protect access to those same URLs via .htaccess. You can also use the meta tags noindex, noarchive and nofollow to do the same.
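A minimal robots.txt for this would look like the below — though note that crawlers treat each host and port combination separately, so it would have to be served from the same port (2082) as the admin page, not from the main site:

```
User-agent: *
Disallow: /
```

This blocks all compliant crawlers from the entire host, which is reasonable here since the whole port-2082 host is the control panel.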