Saturday 30 July 2011

Code of Ethics


As you know from the section of the course that discussed the history of search engines and optimization, there was a time when the optimization process was simple and involved little more than tweaking meta tags and repeating keywords within the content of a page. Accordingly, there were only a limited number of SEO companies.

Today the industry is highly competitive. Worldwide, there are now thousands of search engine optimization companies. There is also an ever-increasing number of websites, and many of them are already well optimized, making the competition much fiercer. In the past, when an SEO company optimized a client's site, most other sites in direct competition were poorly optimized or not optimized at all; the game was easier. But now, almost every one of your competitors is using the services of some SEO company to optimize their site.

As you can guess, this makes the game really tough. It also leads up to half of all SEO / SEM companies to consider unethical and illegal strategies in order to get their customers to the top of search engine result pages.

Nevertheless, this rarely gives these companies any competitive advantage as most solid and respectable companies seeking SEO services now recognize that guarantees of top-10 rankings sound, at the very least, suspicious.

Let's imagine a company called FakeCompany LTD has a website that offers some products. Apart from product advertising, this site does not contain any information of value to visitors, so the search engines do not rank it high. FakeCompany therefore hires SEO expert Mr. Doe to help with its rankings, because Mr. Doe claims he can get any site into the Top 10.

What this SEO expert does is stuff the company's pages with irrelevant keywords, create a link system with thousands of hidden links, and then implement advanced spamming techniques such as dynamic page generation and cloaking.

As a result, the rankings of the page are initially boosted. FakeCompany is happy and pays the fee to Mr. Doe.

Still, visitors coming to the site see even more of a mess than before the "optimization," and they are still unable to find any valuable content. Since they find FakeCompany's site through irrelevant keywords (that is, not the ones they were actually searching for), the conversion of visitors into customers is very low.

As a result, visitors are dissatisfied, and FakeCompany receives huge bills for traffic that hasn't converted and hasn't brought any real profit.

The search engine that allowed the spammer's site to the top of its listings is also dissatisfied, since it is losing popularity among Web surfers by serving irrelevant pages. So it invests money in developing a more advanced spider that finally cracks Mr. Doe's tricks, and FakeCompany's positions fall. Eventually, FakeCompany is excluded from the search engine's index entirely after an unsatisfied visitor or a competitor reports that it is using spam methods.

Of course, FakeCompany is unhappy with this situation and sues Mr. Doe to get its money refunded. Mr. Doe is unwilling to cooperate, or has perhaps even disappeared before the whole mess started. In the best-case scenario, FakeCompany gets its money back, but it is never able to restore its rankings and has to invest in a new website.

Nobody in this story is satisfied; still, such things do happen now and again, even in today's more sophisticated SEO environment.

Unfortunately, there is no remedy for such cases other than abandoning spam techniques entirely and following a Code of Ethics for Search Engine Optimizers, one that maintains their good reputation and withstands the crowds of unethical SEO companies waving the banner of illegal yet "effective" promotion strategies.

Adhering to a Code of Ethics (or Code of Conduct), if presented properly, may serve as an effective competitive advantage.

As we believe that such Codes should be unified across the Web, we will not invent our own. Instead, we support the one maintained by well-known industry expert Bruce Clay and his company. This collection of rules can be found at www.bruceclay.com, and we provide a copy here for your reference.

Code of Ethics

Whereas all parties are working towards presenting relevant and high quality information in an easy to use format to information seekers and, whereas SEO practitioners are being contracted to assist clients in obtaining higher rankings for client pages, we (and those linking to this page) are voluntarily adhering to the below SEO Code of Ethics:

No SEO practitioner will intentionally do harm to a client. This involves the continued use of any technology or procedure (without appropriate care) that is known to result in having the client site removed from search engine indexes or directories or rendered inoperative. Questionable adherence to standards must be addressed via the Robots Exclusion Standard.

No SEO practitioner will intentionally violate any specifically published and enforced rules of search engines or directories. Should rules and guidelines change (as they often do), the SEO practitioner will promptly take action to comply with the changes as they apply to all clients. Where rules and guidelines are unclear, the SEO practitioner will seek clarification and await approval from the appropriate search engine before continuing to utilize potentially harmful technology or procedures.

No SEO practitioner will intentionally mislead, harm or offend a consumer. All individuals utilizing a search engine to visit a site will not be misled by the information presented to or by the search engine or harmed or offended upon arrival at the client site. This includes techniques like "bait and switch" where the client page does not substantially contain and is not clearly associated with the optimized phrase or may be reasonably offensive to targeted visitors.

No SEO practitioner will intentionally violate any laws.
This involves the deliberate and continued violation of copyright, trademark, servicemark or laws related to spamming as they may exist at the state, federal or international level.

No SEO practitioner will falsely represent the content of the client site. This includes the practice of presenting different versions of Web pages to different users except where that information is altered solely to meet browser specifications and needs, sensitivity to regional factors such as language or product specific needs. In general, ALL requests for a specific URL should be served identical HTML by the Web server.

No SEO practitioner will falsely represent others work as their own. This includes the taking of work from others in whole or in part and representing this work as their own. The SEO practitioner may not make verbatim copies of the work of others (instead of authoring original work) without the prior consent of the other party.

No SEO practitioner will misrepresent their own abilities, education, training, standards of performance, certifications, trade group affiliations, technical inventory or experiences to others. This includes quantifiable statements related to project timetables, performance history, company resources (staff, equipment and proprietary products) and client lists. Guarantees will be restricted to items over which the SEO practitioner has significant and reasonable control.

No SEO practitioner will participate in a conflict of interest without prior notice to all parties involved.
This includes the practice of choosing to emphasize one client over another in competing keywords because there is more personal gain for the practitioner. All clients are treated equally and all will receive equal best effort in their Search Engine Optimization.

No SEO practitioner will set unreasonable client expectations.
This includes the practice of accepting more than a reasonable number of clients competing for the same keywords and implying that all will be in the top positions in the search engines. This also includes the implication that results can be obtained in an unreasonable amount of time given the known condition of the search engines, client site and competition.

All SEO practitioners will offer their clients both internal and external dispute resolution procedures.
This includes the publishing of address and phone numbers on primary Web pages, the inclusion of third-party dispute resolution links prominently placed within the practitioner’s website and contracts that include sections discussing dispute resolution.

All SEO practitioners will protect the confidentiality and anonymity of their clients with regards to privileged information and items implying testimonial support for the SEO practitioner. All staff of the SEO practitioner shall be bound to protect information that is not generally known as it may harm the client. The SEO practitioner will not include the publishing of testimonials and proprietary logos of client lists, press releases and other collateral discussing the client without explicit approvals.

We try to keep our sites and services compliant with this code and advise that you do the same; however, if you don't wish to use this code, you are welcome to make any custom modifications or invent your own Code, as long as it remains legal, transparent and ethical.

Thursday 28 July 2011

What is Gray-Hat SEO?


Search engine guidelines clearly define Black-Hat techniques as spamming techniques. You can recognize and avoid them in your SEO campaigns. However, there are also so-called Gray-Hat techniques, which are, for the time being, either unknown to the search engines or not explicitly restricted by them.
Gray-Hats are different in that they try to do things they believe are ill-defined by Google, without first asking permission.
Let's look at SearchSecurity.com's description of this notion: "Gray-Hat describes a cracker (or, if you prefer, hacker) who exploits a security weakness in a computer system or product in order to bring the weakness to the attention of the owners. Unlike a Black-Hat, a Gray-Hat acts without malicious intent. The goal of a Gray-Hat is to improve system and network security. However, by publicizing vulnerability, the Gray-Hat may give other crackers the opportunity to exploit it. This differs from the White-Hat who alerts system owners and vendors of vulnerability without actually exploiting it in public".
Google effectively treats Gray-Hat SEO as a risky, ill-advised approach. Here is an indirect spam definition of Gray-Hat techniques from the top search engine: "It's not safe to assume that just because a specific deceptive technique isn't included on this page, Google approves of it.
Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit… If you believe that another site is abusing Google's quality guidelines, please report that site… spam reports we receive are used to create scalable algorithms that recognize and block future spam attempts."
Now that you have seen enough definitions of Gray-Hat, you should clearly understand the danger of any spam or spam-like technique. Gray-Hat techniques should not be used: never deceive anyone, and avoid such methods at all costs.
Here we'd like to show some examples of Gray-Hat techniques:

Outdated Gray-Hat Techniques

Mild keyword stuffing
Keyword stuffing is deceptive by its very nature. Search engines recommend that site owners write quality, relevant content for visitors, not for the engines' ranking mechanisms. The main criterion for using keywords in your copy should be the question: would you still add these numerous, repetitive keywords if only human visitors would ever see them? Gray-Hats prefer to violate this guideline in a mild way: the number of keywords they place in the meaningful areas of a Web page stays close to the limit they believe is allowed.
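To make the idea of keyword density concrete, here is a minimal sketch of how such a figure is usually measured. The helper function and the sample copy are our own illustrative inventions; search engines publish no official "allowed limit."

```python
import re

def keyword_density(text, keyword):
    """Return the share of words in `text` that match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

# Illustrative sample copy, deliberately over-stuffed
page_copy = "Cheap cars. Buy cheap cars online. Our cars are the cheapest cars around."
print(f"Keyword density: {keyword_density(page_copy, 'cars'):.1%}")  # roughly 31% here
```

In White-Hat copy this figure usually ends up in the low single digits simply because the text is written for readers; deliberately pushing it toward some assumed limit is exactly the Gray-Hat behavior described above.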
Irrelevant keywords in image ALT tags
This technique means stuffing ALT attributes with keywords unrelated to the image in question. The only purpose of this fraudulent technique is to attract more traffic to the pages. As you know, any type of keyword stuffing violates search engine guidelines; the engines can track the keywords you have chosen and correlate them with the keyword profile of the page and of the site as a whole.

Advanced Gray-Hat Techniques

Cloaking
Search engines strictly forbid cloaking for the purpose of optimization. "Cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index," state the Google Webmaster Guidelines.
A legitimate example of cloaking is serving certain areas of your site for the search engine to see but not for regular users; a "members only" section is the typical case. Gray-Hat cloaking is mostly unintentional, or borders on the harmful practice of serving different pages.
Unintentional cloaking may occur when you serve different content to specific audiences or other groups. Such techniques are very risky, and we recommend you contact each search engine, present your reasoning, and give them the opportunity to approve it.
Black-Hat shadow cloaking begins when site owners use this method intentionally to influence a search engine's ranking algorithm.
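For recognition purposes only, here is a minimal sketch of what user-agent cloaking looks like in server-side code. The function and content strings are invented for illustration, and this is exactly the pattern the guidelines above forbid.

```python
def serve_page(user_agent: str) -> str:
    """Cloaking pattern (do NOT do this): branch on the crawler's user agent."""
    if "Googlebot" in user_agent or "Slurp" in user_agent:
        # Keyword-rich version shown only to crawlers
        return "<html><body>cheap cars, used cars, new cars ...</body></html>"
    # Thin, ad-heavy version shown to human visitors
    return "<html><body><img src='banner.gif'></body></html>"

# A White-Hat server returns the same markup regardless of user agent.
print(serve_page("Googlebot/2.1") == serve_page("Mozilla/5.0"))  # False -> cloaking
```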
Publishing duplicate content
We have spent a lot of time teaching people how to write proper, keyword-targeted and valuable texts. From the keyword research stage through to writing fresh content, this work demands special skills, or additional costs if you hire a professional copywriter.
Instead of relevant, interesting, and unique content, spammers rely on duplicate content, using the same few hundred words on every page or copying someone else's.
The Black-Hat technique copies the original text wholesale, while Gray-Hats prefer to mix and dilute the parts.
Gray-Hats play around the margins to trick the search engines. There is no doubt that fresh, unique content is king, and duplicate content is very, very bad.
There are cases where duplicate content is not only legitimate, but is to be expected. To learn more about legitimate types of duplicate content and how to deal with multiple versions of the same content, refer to the "Duplicate Content Issues" lesson of this training course.
Content mashup
Content mashup is related to the activity described above. Although we are again dealing with content stealing, the content is mashed together in a more sophisticated fashion this time: Gray-Hat sites using the content mashup method generate non-unique content from other Web pages.
Irrelevant links
As you can guess from the name of this technique, irrelevant links may not correspond to the topic of the website. Search engines regard these kinds of links as legitimate but will not give them much weight. That's why they count as gray and don't hurt your reputation as much.
An example of mild spamming is asking for links from every one of your clients or offering some other form of collaboration in exchange.
Off-topic link exchange
If you exchange links with a site other than one that deals with your topic, you'll be bordering on a Black-Hat technique. Whether it is a Gray-Hat or a Black-Hat exchange will depend on the number of off-topic links involved. Spammers know that a few off-topic links may be devalued but not penalized.
Mild artificial link schemes
Link schemes have already been defined in the previous lesson, devoted to Black-Hat SEO techniques. However, some kinds of artificial link schemes can be untraceable if mixed with other varieties of backlink generation. For example, links created within a narrow thematic niche may overlap or create Web rings, even without the initial purpose of manipulating search algorithms, so it may be hard for the search engines to discover the real intentions of the website owners.
Remember that link schemes created for the sole purpose of manipulating SE algorithms may be considered Gray-Hat or even Black-Hat SEO. Thus, reputation-conscious website owners should think twice before engaging in unnatural link building, especially when websites hosted on the same server, or websites linking to bad neighborhoods, take part in the link exchange game.
Paid links
Not all paid links violate search engine guidelines. "Buying and selling links is a normal part of the economy of the web when done for advertising purposes, and not for manipulation of search results." - as Google states in their Webmasters/Site owners Help (http://www.google.com/support/webmasters/bin/answer.py?answer=66736).
Link buying is ultimately the site owner's responsibility; it sits on the border between legitimate advertising and manipulative spamming, between playing by the rules and inviting punishment. Generally, link buying exploits a loophole in the search engines' defenses.
To stay White-Hat, purchased links should be hidden from the crawlers by using a robots.txt file or by adding a rel="nofollow" attribute to the <a> tag. Other methods will be punished the day the truth comes out. If you ignore this guideline, you'll drift into Gray-Hat territory, and this could play a dramatic role in your PageRank and SE rankings.
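As a practical aid, here is a minimal sketch of how a site owner might audit a page for links that lack rel="nofollow" before publishing paid placements. It uses only the Python standard library; the class name and sample markup are our own inventions.

```python
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Collect href values of <a> tags that do not carry rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.followed_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").lower()
        if "nofollow" not in rel:
            self.followed_links.append(attrs.get("href"))

# Invented sample markup
sample = '<a href="/about">About</a> <a rel="nofollow" href="http://sponsor.example">Sponsor</a>'
auditor = LinkAuditor()
auditor.feed(sample)
print(auditor.followed_links)  # ['/about'] -- any paid link listed here needs attention
```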
Remember to stay on the White-Hat side in your optimization work. Rankings should be earned honestly, and the search engines will then give your relevant website the opportunity to keep them.
Domain buying
Domain buying is another Gray-Hat technique used to get a quick ranking boost. The main idea is simple: you buy an active domain name and get the PageRank (or "link juice") that comes with that website as a bonus!
Google is aware of this practice as well and monitors such domains for a long time. If you bought the new domain name because it genuinely offers better spelling options or brand awareness, you are safe and sound.
However, the search engines can nullify the links pointing to the domain you bought. If you act legitimately, you will stay off the SE radar; once you start illegal behavior, the search engines will regard you with suspicion.
Illegal Link Baiting
Link baiting is a marketing technique that Social Media Marketing entrepreneurs encourage and use widely to promote a site, business, or brand through different social media channels. The idea has spread and become another promotion option that helps a business engage and interact with existing or potential consumers.
The concept of link baiting is creating content and tools that people want to link their websites to, or creating article content people actually want to publish.
Illegal link baiting starts when promotion becomes deceptive or irrelevant, and links or social bookmarking votes are spread through a group of colluding members, sometimes for payment as well.
Gray-Hat methods include irrelevant viral widgets and tools that spammers offer with embedded links to off-topic sites. Because Gray-Hat SEO is a risky and ill-advised method, we strongly recommend you avoid any deceptive spam techniques at all costs. Always bear in mind your competitors' future spam reports and the search engines' penalties for abusing their guidelines, penalties which can really hurt your rankings and your whole website.
If you are still doubtful about the color of your SEO techniques, consider how search engines can adapt to webmasters' behavior. According to SEOmoz (http://www.seomoz.org/article/analysis-of-link-spam-alliances-paper), SEs can apply the following methods to track and prevent spam:
  1. "The use of visitor click-through rates and time spent on websites to determine quality.
  2. Advanced text analysis to determine legitimacy and quality of content.
  3. Vision-based analysis of Web page layouts to find commonalities in spam vs. non-spam pages.
  4. Editorial reviews of suspicious sites.
  5. Deflation in rankings of sites with suspected artificial link structures.
  6. Monitoring of rates of change in the speed of link acquisition and loss.
  7. Spam reports by users (and competitors)".
To sum everything up, Gray-Hat methods border on Black-Hat techniques and may result in your website being banned. Moreover, these methods constantly lose their effectiveness as the search engines evolve and regularly update their indexing and ranking algorithms.

What you need to remember:

  1. Don’t use Gray-Hat techniques - they are risky and ill-advised SEO methods.
  2. If you go Gray, you will risk being reported to search engines by your competitors.

Tuesday 26 July 2011

Mission ImposSERPble: Establishing Click-through Rates


Google and its user experience are ever changing. For an engine that serves more than 60% of the search market, it's common to hear the question, "How many visitors can we expect if we rank [x]?" It's a fair question. It's just impossible to predict. Which is a fair answer. But, as my father says, "If you want fair, go to the Puyallup." So we inevitably hear, "Well, can you take a guess? Or give us an estimate? Anything?"
To answer that question, we turned to major studies about click-through rates, including Optify, Enquiro, and the studies released using the leaked AOL data of 2006. But those studies are old; this study is new. Ladies, Gentlemen, and Mozbot, it is our immense pleasure to present to you…
The Slingshot SEO Google CTR Study: Mission ImposSERPble
There have been a number of changes to the Google user experience since those studies/surveys were published years ago. There's a new algorithm, a new user interface, increased mobile search, and social signals. On top of that, the blended SERP is riddled with videos, news, places, images, and even shopping results.
We made this study super transparent. You can review our step-by-step process to see how we arrived at our results. This study is an ongoing project that will be compared with future SERPs and other CTR studies. Share your thoughts on the study and the research process to help us include additional factors and methods in the future.
Our client databank is made up of more than 200 major retailers and enterprise groups, and our sample set was chosen from thousands of keywords based on very strict criteria to ensure the accuracy and quality of the study results.
The study qualification criteria are as follows:
  • A keyword phrase must rank in a top-10 position (1 to 10)
  • The position must be stable for 30 days
Each keyword that we track at Slingshot was considered and every keyword that matched our strict criteria was included. From this method, we generated a sample set of exactly 324 keywords, with at least 30 in each of the top 10 ranking positions.
We are confident in the validity of this CTR data as a baseline model, since the data was generated using more than 170,000 actual user visits across 324 keywords over a 6-month period.
Data-Gathering Process
Authority Labs: Finding Stable Keywords
We currently use Authority Labs to track 10,646 keywords' daily positions in SERPs. From this, we were able to identify which keywords had stable positions for 30 days. For example, for the keyword “cars,” we observed a stable rank at position 2 for June 2011.
[Image: stable 30-day ranking for "cars" at position 2]
Google Adwords Keyword Tool: All Months Are Not Created Equal
We found the number of [Exact] and “Phrase” local monthly searches using the Google Adwords keyword tool. It is important to note that all keywords have different monthly trends. For example, a keyword like “LCD TV” would typically spike in November, just before the holiday season. If you’re looking at searches for that keyword in May, when the search volume is not as high, your monthly search average may be overstated. So we downloaded the .csv file from Adwords, which separates the search data by month for more accuracy.
[Image: Google keyword tool CSV download]
By doing this, we were able to calculate our long-tail searches for that keyword. “Phrase” – [Exact] = Long-tail.
Google Analytics: Exact and Long-Tail Visits
Under Keywords in Google Analytics, we quickly specified the date of our keywords’ stable positions. In this case, “cars” was stable in June 2011. We also needed to specify “non-paid” visits, so that we were only including organic results.
[Image: Google Analytics non-paid filter]
Next, we needed to limit our filter to visits from Google in the United States only. This was important since we were using Local Monthly Searches in Adwords, which is specific to U.S. searches.
[Image: Google Analytics phrase and exact visits]
After applying the filter, we were given our exact visits for the word “cars” and phrase visits, which included the word “cars” and every long-tail variation. Again, to get the number of long-tail visits, we simply used subtraction: Phrase – Exact = Long-Tail visits.
Calculations
We were then able to calculate the Exact and Long-Tail Click-through rate for our keyword.
EXACT CTR = Exact Visits from Google Analytics / [Exact] Local Monthly Searches from Adwords
LONG-TAIL CTR = (Phrase Visits – Exact Visits from Google Analytics) / (“Phrase” – [Exact] Local Monthly Searches from Adwords)
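Expressed as code, the two calculations look like this; the figures below are hypothetical stand-ins for the Adwords search volumes and Analytics visit counts described above.

```python
def exact_ctr(exact_visits, exact_searches):
    """Exact CTR = organic exact-match visits / [Exact] local monthly searches."""
    return exact_visits / exact_searches

def long_tail_ctr(phrase_visits, exact_visits, phrase_searches, exact_searches):
    """Long-tail CTR = (phrase - exact visits) / ("Phrase" - [Exact] searches)."""
    return (phrase_visits - exact_visits) / (phrase_searches - exact_searches)

# Hypothetical figures for a keyword stable at one position for a month
phrase_searches, exact_searches = 450_000, 301_000   # from the Adwords keyword tool
phrase_visits, exact_visits = 60_000, 55_000         # organic, US-only, from Google Analytics

print(f"Exact CTR:     {exact_ctr(exact_visits, exact_searches):.2%}")
print(f"Long-tail CTR: {long_tail_ctr(phrase_visits, exact_visits, phrase_searches, exact_searches):.2%}")
```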
Results
What was the observed CTR curve for organic U.S. results for positions #1-10 in the SERP?
Based on our sample set of 324 keywords, we observed the following curve for Exact CTR:
[Image: Google CTR curve by ranking position]
Our calculations revealed an 18.2% CTR for a No. 1 rank and 10.05% for No. 2. CTR for each position below the fold (Positions 5 and beyond) is below 4%. An interesting implication of our CTR curve is that for any given SERP, the percentage of users who click on an organic result in the top 10 is 52.32%. This makes sense and seems to be typical user behavior, as many Google users will window shop the SERP results and search again before clicking on a domain.
Degrees of Difference
[Image: CTR study comparisons]
The first thing we noticed from the results of our study was that our observed CTR curve was significantly lower than these two previous studies. There are several fundamental differences between the studies. One should not blindly compare the CTR curves between these studies, but note their differences.
Optify’s insightful and thorough study was conducted during the holiday season of December 2010. There are significant changes in Google’s rankings during the holiday season that many believe have a substantial impact on user behavior, as well as the inherent change in user intent.
The study published by Enquiro Search Solutions was conducted in 2007 using survey data and eye-tracking research. That study was the result of a business-to-business focused survey of 1,084 pre-researched and pre-selected participants. It was an interesting study because it looked directly at user behavior through eye-tracking and how attention drops off as users scroll down the page.
Long-Tail CTR: Volatile and Unpredictable
For each keyword, we found the percentage of click-through for all long-tail terms over the same period. For example, if “cars” ranks at position 2 for June 2011, how much traffic could that domain expect to receive from the keyword phrases “new cars,” “used cars,” or “affordable cars?” The reasoning is, if you rank second for “cars,” you are likely to drive traffic for those other keywords as well, even if those positions are unstable. We were hoping to find an elegant long-tail pattern, but we could not prove that long-tail CTR is directly dependent on the exact term’s position in the SERP. We did observe an average long-tail range of 1.17% to 5.80% for each position.
[Image: Google CTR data table]
Blended SERPs: The “Universal” Effect
Starting in May 2007, news, video, local, and book search engines were blended into Google SERPs, which have since included images, videos, shopping, places, real-time, and social results. But do blended SERPs have lower CTRs? Since these blended results often push high-ranking domains towards the bottom of the page, we predicted that CTR would indeed be lower for blended SERPs. However, a counter-intuitive hypothesis would suggest that because certain SERPs have these blended results inserted by Google, they are viewed as more credible results and that CTR should be higher for those blended SERPs. We analyzed our sample set and failed to show significant differences in user behavior regarding blended versus non-blended results. The effect of blended results on user behavior remains to be seen.
[Image: Google CTR blended data table]
As previously mentioned, this study will be used in comparison with future SERPs as the Slingshot SEO Research & Development team continues to track and analyze more keywords and collect additional CTR data. It is our hope that these findings will assist organic SEOs in making performance projections and in considering multiple factors when selecting keywords. We look forward to additional studies, both yours and ours, on CTRs, and we encourage you to share your findings. With multiple recent and upcoming social releases, our research team will be dedicated to examining the effects of social platforms on click-through rates, and how the organic CTR curve changes over time.

SERP Click Through Rate of Google Search Results – AOL-data.tgz – Want to Know How Many Clicks The #1 Google Position Gets?


Well, after some gentle persuasion of MySQL, the AOL-data.tgz files have surrendered some interesting, if not wholly unexpected, information about the relative strengths and click through rates of SERP positions.
The dataset contained 36,389,567 search queries with 19,434,540 clickthroughs. While we all knew the importance of the top 3 positions in the Google SERPs, this analysis further reinforces that fact:
SERP Clickthrough % of Top 10 SERP Positions
[Image: SERP click through rates of the top 10 SERP positions]
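For those curious how such per-position counts can be reproduced, here is a minimal Python sketch that tallies clickthroughs by result position. It assumes the log files are tab-separated with a header row naming the columns AnonID, Query, QueryTime, ItemRank, ClickURL; the original analysis used MySQL, and the file name below is only a placeholder.

```python
import csv
from collections import Counter

def tally_clicks(path):
    """Count clickthroughs per SERP position in one AOL query-log file."""
    clicks = Counter()
    with open(path, newline="", encoding="utf-8", errors="replace") as f:
        for row in csv.DictReader(f, delimiter="\t"):
            rank, url = row.get("ItemRank"), row.get("ClickURL")
            if rank and url:  # rows without a click leave these fields empty
                clicks[int(float(rank))] += 1
    return clicks

# counts = tally_clicks("aol-log-01.txt")  # placeholder file name
# print(counts.most_common(10))
```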

Interestingly, the #1 SERP position receives 42.3% of all clickthroughs. The #2 position only accounts for 11.92% of all clickthroughs – almost 72% fewer clickthroughs than the top position in the SERPs. Attaining the #1 position for your keywords/phrases results in roughly 3.5 times more traffic than that of your nearest rival – now that's a serious difference in both traffic and potential revenue.
A #3 placement in the SERPs results in an 8.44% clickthrough rate, almost 30% less than #2 and over 80% less than the top position on the first results page.
As we move down the page the rate of decline in clickthrough also falls. Notice that a #10 position in the SERPs receives slightly more clickthroughs than #9. This is most probably related to users glancing at the final listing as they scroll to the page navigation:
Clickthrough Analysis of SERP Pages 1-4
[Image: SERP click through rates at positions #11, #20, #21, #31 and #41]

Moving off the first SERP, the rate of decline in clickthrough picks up considerably. The clickthrough rate for listings at rank #11 drops to 0.66%. That's an almost 80% decline in clickthroughs from the #10 SERP position and shows that being on the first SERP page results in far greater SE traffic than lower listings.

Google SERP Click Through Rates – The Raw Numbers

Rank | Clickthroughs | % of Total | Delta vs. #n-1 | Delta vs. #1
Total | 19,434,540 | 100% | n/a | n/a
1 | 8,220,278 | 42.30% | n/a | n/a
2 | 2,316,738 | 11.92% | -71.82% | -71.82%
3 | 1,640,751 | 8.44% | -29.46% | -80.04%
4 | 1,171,642 | 6.03% | -28.59% | -85.75%
5 | 943,667 | 4.86% | -19.46% | -88.52%
6 | 774,718 | 3.99% | -17.90% | -90.58%
7 | 655,914 | 3.37% | -15.34% | -92.95%
8 | 579,196 | 2.98% | -11.69% | -92.95%
9 | 549,196 | 2.83% | -5.18% | -93.32%
10 | 577,325 | 2.97% | +5.12% | -92.98%
11 | 127,688 | 0.66% | -77.88% | -98.45%
12 | 108,555 | 0.56% | -14.98% | -98.68%
13 | 101,802 | 0.52% | -6.22% | -98.76%
14 | 94,221 | 0.48% | -7.45% | -98.85%
15 | 91,020 | 0.47% | -3.40% | -98.89%
16 | 75,006 | 0.39% | -17.59% | -99.09%
17 | 70,054 | 0.36% | -6.60% | -99.15%
18 | 65,832 | 0.34% | -6.03% | -99.20%
19 | 62,141 | 0.32% | -5.61% | -99.24%
20 | 58,382 | 0.30% | -6.05% | -99.29%
21 | 55,471 | 0.29% | -4.99% | -99.33%
31 | 23,041 | 0.12% | -58.46% | -99.72%
41 | 14,024 | 0.07% | -39.13% | -99.83%
Click Through Rates of Google SERPs based on AOL-data.tgz
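For reference, here is a minimal sketch of the arithmetic behind the percentage and delta columns, using the raw counts for the first few positions (truncated for brevity; tiny differences from the published deltas may come from rounding in the source).

```python
# Raw clickthrough counts for the first few positions (from the table above)
clicks = {1: 8_220_278, 2: 2_316_738, 3: 1_640_751, 4: 1_171_642, 5: 943_667}
total = 19_434_540  # total clickthroughs in the AOL dataset

for rank, c in clicks.items():
    share = c / total
    delta_prev = (c / clicks[rank - 1] - 1) if rank > 1 else None
    delta_top = (c / clicks[1] - 1) if rank > 1 else None
    prev_txt = f"{delta_prev:+.2%}" if delta_prev is not None else "n/a"
    top_txt = f"{delta_top:+.2%}" if delta_top is not None else "n/a"
    print(f"#{rank}: {share:.2%} of clicks, {prev_txt} vs. previous, {top_txt} vs. #1")
```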

The volume of clickthroughs for lower SERPs is so trivial that, for all but the highest-volume search terms, these positions will generally yield little or no benefit to site owners (obviously some niches will prove to be exceptions).
The main message from the AOL data is that the first SERP page is where the real action lies, and the #1 position reigns supreme.