The number one organic result is a subdomain of Whitehouse.gov. This ‘petitions’ subdomain enables citizens to create, manage, and promote petitions to their government. If a petition gathers more than 100,000 signatures, the administration has committed to responding on the matter in question.
What immediately strikes anyone with a trained eye for search marketing is that the result from Petitions.whitehouse.gov ranks highly despite the page being disallowed by the subdomain’s robots.txt file.
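This apparent contradiction is worth unpacking: a robots.txt Disallow rule blocks crawling, but a URL can still be indexed and ranked from inbound links alone. The rule can be checked programmatically; here is a minimal sketch using Python's standard `urllib.robotparser`, with a hypothetical rule set standing in for the live file, which may differ:

```python
from urllib import robotparser

# Hypothetical rules standing in for petitions.whitehouse.gov/robots.txt;
# the live file may differ.
rules = """
User-agent: *
Disallow: /petition/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A disallowed page can still appear in search results: robots.txt
# blocks crawling, but a URL can be indexed from links alone.
print(rp.can_fetch("*", "https://petitions.whitehouse.gov/petition/example"))  # False
print(rp.can_fetch("*", "https://petitions.whitehouse.gov/"))  # True
```

That is why the result can rank while showing little or no snippet: the engine knows the URL exists, but cannot read the page behind it.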
Why is Whitehouse.gov choosing to block search engines from crawling the content of its petition pages, when these pages are created by the people and for the people to express and promote concerns to their government leaders? I cannot think of a good rationale for this. Can you?
I’ve created a petition page on petitions.whitehouse.gov asking the Obama administration to remove the robots.txt disallow from petitions on their site. Doing so would promote the transparency, and serve the conduit for democracy in action, that the web platform was created for in the first place.
Find the petition located here and pass this URL to your networks.
People may have trouble finding my new petition via search engines, which will make it harder to reach the 100,000 signatures needed to garner its due attention. Oh, the delicious irony.
More details here; I look forward to all the signatures to come.