I have someone doing this to me. All they had to do was figure out the mod_rewrite structure behind my main content pages and link to the dynamic versions (domain.com/page?id=x) instead of the static URLs I have indexed and publicly viewable (domain.com/article-keywords/). The results? Well, I've seen pages go supplemental over this, plus a pretty good drop in my static URLs being included in the index and an increase in dynamic URLs and supplementals. Not cool. The guy has a bunch of spam domains that link to my pages and nobody else's. Has anyone encountered this cut-throat move?
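For the curious, my setup is roughly this (simplified, with the details changed). The keyword URL carries the article id and gets rewritten internally to the script, but the script still answers the raw dynamic URL directly:

    RewriteEngine On

    # a pretty URL like /article-keywords-42/ is rewritten to the script
    RewriteRule ^[a-z0-9-]+-([0-9]+)/$ /page?id=$1 [L]

    # the catch: a direct request for /page?id=42 works just as well,
    # so the engines end up seeing two URLs for every article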
If I were you, I'd write the person and ask them to stop. Next, I'd write the host(s) of the offending domain(s) with an explanation of the problem and ask them to deal with it. Most hosts are quick to "fix" unscrupulous client behavior.
Why not change your dynamic URLs to new ones (perhaps less guessable ones) and update your rewrite rules to match? Then make sure that all the old dynamic URLs return 404 errors. That should sort it out eventually... Also, make sure that your .htaccess file is not readable! That's making the job too easy for him.
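A rough sketch of what that could look like in .htaccess (the "mx" parameter name and the URL pattern are just examples, your script has to be updated to read the new parameter and ignore the old one, and the [R=404] flag needs Apache 2.x):

    RewriteEngine On

    # rewrite the public URLs to a renamed, less guessable parameter
    RewriteRule ^[a-z0-9-]+-([0-9]+)/$ /page?mx=$1 [L]

    # anything still arriving with the old id= parameter gets a 404
    RewriteCond %{QUERY_STRING} (^|&)id= [NC]
    RewriteRule ^page$ - [R=404,L]

    # and make sure .htaccess itself can't be fetched
    <Files .htaccess>
        Order allow,deny
        Deny from all
    </Files>

Cryo.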
Thanks Cryo. That is what I had in mind. But the biggest part of the post was about my SHOCK. I can't believe I had never heard of this before. I think it is an error on the engines' part that this sort of thing can hurt you.
You could do a rewrite that prevents the original structure from being used. If you're using WordPress, there's a plugin called Permalink Redirect that makes sure each page is only accessible at its correct URL. As for the shock: I can imagine, it sucks having people do this to you. On the other hand, consider it a compliment: they apparently don't know how to beat you using "normal" methods.
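Outside WordPress you can fake the same behaviour with plain mod_rewrite. A sketch, assuming pretty URLs that embed the article id as in the earlier examples; the "article" slug below is a stand-in, since you can't rebuild the real keyword slug from the id in .htaccess alone (a script-level redirect, which is what the plugin does, can):

    RewriteEngine On

    # 301 any *externally requested* dynamic URL to its pretty form.
    # THE_REQUEST holds only the original request line, so the internal
    # rewrite coming from the pretty URL can't re-trigger this (no loop).
    RewriteCond %{THE_REQUEST} \?id=([0-9]+) [NC]
    RewriteRule ^page$ /article-%1/? [R=301,L]

The trailing "?" strips the query string off the redirect target, so the spammer's links all collapse onto one canonical URL.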
Yep - it's Google Bowling - too many links acquired too quickly and you incur a penalty... although this looks like a variation on that. Interesting black-hat idea/concept - totally uncool though. Good luck getting it straightened out!
You can fix this simply. In your first post you mentioned this structure: (domain.com/page?id=x). Just change "id=" into something else, like "mspid=", and anyone trying to access a page via the old "id" variable gets an error. Or, even easier, change the name of the file that processes the request (page -> secretpage)...
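In rewrite terms the rename looks something like this (script and parameter names are only examples); once the old entry point is blocked, every one of those spam links dies with an error:

    RewriteEngine On

    # the pretty URLs now feed the renamed script and parameter
    RewriteRule ^[a-z0-9-]+-([0-9]+)/$ /secretpage?mspid=$1 [L]

    # the old script name is refused outright
    RewriteRule ^page$ - [R=404,L]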
Very clever, nasty move on their part... what are you going to do to resolve this page-jacking scheme of theirs?
I wonder if it would be useful to block referrals from that site, so the bots would pick up 403s whenever they followed those links to your site...
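Something like this would do it (spamdomain.com standing in for the actual offender). One caveat: search engine spiders generally don't send a Referer header when crawling a link, so this mostly stops human click-throughs rather than the indexing itself:

    RewriteEngine On

    # refuse anything arriving from the spam domains with a 403
    RewriteCond %{HTTP_REFERER} spamdomain\.com [NC]
    RewriteRule .* - [F]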