Submission: Google Ignores Whitehouse.gov Attempt to Block Snowden Pardon Petition
An anonymous reader writes: I've been following the Edward Snowden – NSA saga for the past week or so with fascination, as I suspect some of you have as well. Last night over dinner, my wife and I were pondering what the final outcome of this might be, depending on what happens between Russia (or left-leaning Latin America) and the US in the coming days. I wondered: might there be any chance of an eventual pardon for Snowden from the White House on Obama's last day in office? There must be some discussion of whether a pardon could be in the works, right? So I consulted the Oracle of Google, searching for "pardon Edward Snowden."
The number one organic result is a subdomain of Whitehouse.gov. This 'petitions' subdomain lets citizens create, manage, and promote petitions to our government. If a petition receives more than 100,000 signatures, the administration has committed to addressing it with a response on the matter in question.
What is immediately curious to anyone with a trained eye for search marketing is that the result from Petitions.whitehouse.gov ranks highly despite the page being marked disallowed by the subdomain's robots.txt file.
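To see what a robots.txt disallow actually does, here is a minimal sketch using Python's standard urllib.robotparser, with an illustrative rule (not the actual petitions.whitehouse.gov file). Note that robots.txt only tells well-behaved crawlers not to fetch a page; Google can still list a disallowed URL in results based on links pointing to it, which is why such a page can rank despite the block.

```python
# Sketch of how a compliant crawler consults robots.txt.
# The rules below are hypothetical, for illustration only.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /petition/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A well-behaved crawler checks each URL before fetching it:
print(parser.can_fetch("*", "https://petitions.whitehouse.gov/petition/pardon-edward-snowden"))  # False
print(parser.can_fetch("*", "https://petitions.whitehouse.gov/"))  # True
```

A `Disallow` rule matches by path prefix, so everything under /petition/ is off-limits to crawling, while the rest of the site remains fetchable.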
Why is Whitehouse.gov choosing to block search engines from indexing content of their petition pages, when these pages are created by the people and for the people to express and promote concerns to their government leaders? I cannot think of a good rationale for this. Can you?
I've created a petition page on petitions.whitehouse.gov asking the Obama administration to remove the robots.txt disallow from petitions on its site. Doing so would promote the transparency, and the conduit for democracy in action, that the web platform was created to serve in the first place.
Find the petition located here and pass this URL to your networks.
People may have trouble finding my new petition via search engines, which will make it harder to reach the 100,000 signatures needed to garner its due attention. Oh, the delicious irony.
More details here; looking forward to all the /. comments.