This is only a proposal!
The more Google demands quality sites and finds ways to stop spammy link building, the more competitors are tempted to use spam links to Google-kill other competing sites.
Links to our sites can bring positive rank or negative rank. If I build a network of sites to link to mine and gain high PageRank to reach the first position on the Google SERP, it's fair to be Google-killed. But if a competitor is building those links to get me banned, I should be able to protect myself.
There should be a method to publicly declare the external sites which I do not trust and from which I don't want any link juice flowing in.
Since every site has a robots.txt file, I have a little proposal to use it to declare whom I'm not trusting… in the same way I specify the paths I don't want crawled by search engines.
The robots.txt extension could be something like:
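A possible syntax, just as an illustration (the `Distrust-Referer` directive name is my own invention, not anything search engines support today):

```
User-agent: *
Disallow: /private/

# Hypothetical extension: ignore any link juice coming from these sites
Distrust-Referer: spam-farm.example
Distrust-Referer: link-network.example
```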
Some rules should be proposed to decide how to match the domain; maybe even part of a path could be added. I leave that to the specialists who know how bad-linking techniques work.
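To sketch how a crawler might interpret such declarations, here is a minimal parser assuming a hypothetical `Distrust-Referer` directive and simple domain-suffix matching (both the directive name and the matching rule are my assumptions, not a standard):

```python
from urllib.parse import urlparse

def parse_distrust(robots_txt):
    """Collect hypothetical Distrust-Referer entries from a robots.txt body."""
    distrusted = set()
    for line in robots_txt.splitlines():
        line = line.split('#', 1)[0].strip()  # drop comments
        field, sep, value = line.partition(':')
        if sep and field.strip().lower() == 'distrust-referer':
            distrusted.add(value.strip().lower())
    return distrusted

def is_distrusted(link_url, distrusted):
    """True if the linking page's host equals, or is a subdomain of, a distrusted domain."""
    host = (urlparse(link_url).hostname or '').lower()
    return any(host == d or host.endswith('.' + d) for d in distrusted)
```

A crawler applying this sketch would simply discard any inbound link whose source page matches the list before computing rank.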
"Referer" is spelled here as in the HTTP header, which is famously misspelled in the protocol itself.
What do you think about this proposal? Are there other ways to stop bad linking and avoid being Google-killed by competitors?