Friday 23 September 2011

An SEO Checklist for New Sites

Over 160,000 new domain names were registered yesterday. 160,000! This huge volume of new sites being born isn't unique to yesterday; it happens every day (you can check out today's progress at DailyChanges.com). The sites that start out pre-optimized, and that continue optimizing immediately after publishing, will have an incredible advantage over those built without SEO in place from the get-go. Of course, there's a lot of work to be done for a new site, and it can be hard to remember everything and prioritize it. This week, per PRO member request, Rand presents an SEO checklist that SEOs can use when optimizing new sites. Have any boxes of your own to add to the checklist? Let us know in the comments below!


Video Transcription


Hi everyone. Welcome to another edition of Whiteboard Friday. This week I have a special request from one of our users to talk about an SEO checklist for new sites that aren't ranking yet. I've created a new website. I want to make sure I am doing all the right things in the right order, that I have got everything set up, and my website is not yet ranking. What are the things that I should be doing and maybe some things that I should not be doing? So, I wanted to create a brief checklist with this Whiteboard Friday, and if we find this useful, maybe we will expand it and do even more stuff with it in the future.

So, let's run through. You have a new site that you've just launched. You are setting things up for success. What do you need to worry about?

First off, accessibility. What I mean by this is users and search engines both need to be able to reach all of the pages, all the content that you've created on your website in easy ways, and you need to make sure you don't have any dumb mistakes that can harm your SEO. These are things like 404s and 500 errors and 302s instead of 301s, duplicate content, missing title tags, thin content where there is not much material on the page for the search engines to grab on to and maybe for users as well. Two tools that are great for this, first off, Google Webmaster Tools, which is completely free. You can register at Google.com/webmasters. The SEOmoz Crawl through the SEOmoz Pro Web App, also very useful when you are looking at a new site. We built a bunch of features in there that we wish Google Webmaster Tools kept track of, but they don't, and so some of those features are included in the SEOmoz Crawl, including things like 302s for example and some thin content stuff. That can be quite helpful.

Next up, keyword targeting. This makes some sense. You have to choose the right keywords to target. You don't want something like "gobbledygook" - probably an awful word for anyone to be targeting, no search volume, just a bad choice in general. We want to be asking: do these keywords have good search volume? Are users actually searching for them? You might not be able to target high value terms, because you are also looking for low difficulty when you are first launching a site. You don't necessarily want to shoot for the moon. Maybe you do on your home page, or some branded page, or a product page, but for the things you want to target and work on early on, short term - maybe some content you've got, some feature pages for the product or service you are offering - you think to yourself, I am not going to be able to target the really tough head term, but maybe a longer, more specific variation. That will be easier. So, you are looking at search volume, relevance to the website - please, by all means, make sure you have something relevant that is actually pulling in searches you care about - and low difficulty. If you have those taken care of, you have your keyword targeting.

Content quality and value. Imagine a bunch of users coming to this page and thinking to themselves, "This doesn't really answer my query," or, "Yeah, maybe this answers one portion of it, but I wish there was more detail here" - more video, more images, maybe a nice graphic that explains some things, a data set, some references to where the information came from, not just a bunch of blocks of text. Maybe they are looking for something that describes a process, something that explains it fully. If you can do that, if you can build something remarkable, where all of these people change from "Huh, what's this?" to "I am happy," "I also am happy," "This page makes me so happy I'm going to stick my tongue out" - if you can get that level of enjoyment and satisfaction from your users with the quality of the content you produce, you're going to do much better in the search engines. Search engines have some sophisticated algorithms that look at true quality and value. You can see Google has gotten so much better about putting really good stuff in the results, even sometimes when it doesn't have a lot of links or it isn't doing hardcore keyword targeting; when it is great stuff, they are doing a good job of ranking it.

Next up, design quality, user experience, and usability. This is tough. Unless you have a professional designer or a professional design background, you almost certainly need to hire someone, or go with a very simple, basic design that is very user friendly - one where, when you survey your friends, people in your industry, people in your company, people in your ecosystem, they go, "Yeah, this looks really good. I am really happy with the design. Maybe I am only giving it a six out of ten in terms of beauty, but an eight out of ten in terms of usability. I understand the content on this site. It is easy for me to find things, and they flow." There is really no point in ranking unless you are nailing these two, because you are not going to get many more customers; people are just going to be frustrated by the website. There are a few tools you can use on the Web to test these out. Five Second Test, Feedback Army, Silverback App - all of these are potentially useful for checking the usability and user experience of the site.

Social account setup. Social and SEO are coming together like never before. Google is showing +1s and things that people share by default in the search engine rankings. Bing is showing the stuff that has been shared on Facebook, and they are putting it above the rest of the content. It really, really pays to be in social, and social signals help search engines rank things better, as well as having a nice second-order effect on user and usage data, on branding, and on the impact of people seeing those sites through social sharing and potentially linking to them. So, for social account setup, at the very least you probably want these four: Facebook, Twitter, LinkedIn, and Google+. Google+ is only at about 25 million users, but it is growing very fast. LinkedIn, Twitter, and Facebook are all over 150 million users right now; I think Twitter is at 200 million, and Facebook is at 750 million. So at least have your pages set up for those. Make sure the account experience is consistent across them - the same photos, same branding, same description - so people get a good sense of who you are when they see you in the social world. You probably also want to start setting something up to monitor and track these. You might want to sign up for something like Bitly. I used to really recommend PostRank, but unfortunately, since Google bought them, they no longer track Facebook, so it is a little more frustrating. The SEOmoz Web App will start to track these for you pretty soon. Once you've got those social accounts set up, you can feel good about sharing the content you produce through them, finding connections, building up in that world, and spending the appropriate amount of time there depending on the value you are getting back from it.

Next up, link building. This is where I know a lot of people get off to the wrong start, and it is incredibly hard to recover. I actually just got an email in my inbox before we started filming Whiteboard Friday from someone who had started a new website, and he said, "I got these 300 links, and now I am not ranking anymore. I was doing great last week. For the first six weeks after I launched, I was ranking great." I did just a quick look at the backlinks, and I went, "Oh, oh no." I think this person really went down the route of "I am going to get a bunch of low quality, easy to acquire links," and for a new site in particular that is so dangerous, because Google is really on top of throwing people out of the index, or penalizing them very heavily, when their link profile looks scummy. When you don't have any trustworthy quality signals to boost you up, that's when low quality links hurt the most.

So, good things to do. Start with your business contacts and your customers. They are great places to get links from. Your customers are willing to link to you? Awesome. Get them to link to you. If the contacts you have in the business world are willing to say, "Hey, my friend Rand just launched a new website," boom, that's a great way of doing it. All your email contacts, your LinkedIn contacts, the people you know personally and professionally - ask them, "Hey, would you support me by throwing a link to me on your About page, or your blogroll, or your list of customers, or your list of vendors?" - whatever it is.

Guest posts and content. This is a great way to do positive content production and earn links back for it - finding trustworthy sites that have lots of RSS subscribers, are well renowned, can give you visibility in front of your audience, and will give you a nice link back if you contribute positively to them. I also like high quality resource lists. This would be things like the Better Business Bureau - maybe that falls a little into the directory world - or something like CrunchBase. If you are a startup in the technology world, you definitely need to have a CrunchBase listing. You might want to be on some Wikipedia lists. Granted, those are nofollow, but that's still okay; it is probably a good place to get some visibility. There might be industry-specific lists, like "heavy machine production facilities in the United States." Great, okay, I should be on that list; that's what I do. News media and blogs. Getting the press to cover you. Getting blogs in your sphere to cover you. Finding those, emailing the editors, letting them know that you are launching this new website. That's a great time to say, "Hey, this business is transforming. We're launching a new site. We're changing our branding," whatever it is. That is a press-worthy message, and you can get someone to look at you. Review sites and review blogs are great for this too. They'll say, "Oh, you've got a new application, you've got a new mobile service, maybe we'll link to you." That could be interesting.

Relevant social industry and app account links. If I contribute something to the Google Chrome store, if I contribute something to the Apple store, if I am contributing something to a design portal or design gallery, all of those kinds of industry stuff and accounts that you can get are likely worth getting your website listed on.

Social media link acquisitions. This is obvious stuff where you spend time on Twitter, on Facebook, on LinkedIn, Google+ connecting with people and over time building those relationships that will get you the links possibly through one of these other forms or just through the friendliness of them noticing and liking, and enjoying your content. That's what content marketing is all about as well.

These are great ways to start. Very safe ways to do link building. They are not short-term wins. These, almost all of them, require at least some effort, some investment of your time and energy, some creativity, some good content, some authenticity in your marketing versus a lot of the stuff that tempts people very early on. They're like, oh, sweet, you know, I have a new website. I need to get like 500 links as soon as possible, so I am going to try things like reciprocal link pages. I am just going to put up a list of reciprocal link partners, and I am going to contact a bunch of other firms. They'll all link to me and we'll all link to each other. It will be a happy marriage of links. No, it's not. It's not a wonderland.

Low quality directories. Search for "SEO friendly directory," and if a directory shows up on that list, chances are... even in Google, Google is showing you a bunch of bad stuff. Someone was asking me recently over email, "Hey, I really need some examples of sites that have done manipulative link building." I said, "Oh, it's so easy. Search for 'SEO friendly directory' and look at who has paid to be listed in those directories." They almost all have spammy, manipulative link profiles, and it is funny, because you can go to those - and I don't know why people don't do this - and try searching for the brand names that come up in those lists. None of them rank for their own brand name. Why is that? Clearly, they are killing themselves with these terrible, terrible links. So, low quality directories - really avoid them.

Article marketing, or article spinning - I talked about that a few weeks ago on Whiteboard Friday - is also a practice I would strongly recommend you avoid. I know it can work, I know there are people for whom it does work, but especially early on it can just kill you. It really can get you banned or penalized out of the engines, and you just won't rank anywhere if your link profile starts out spammy. Paid links are another obvious one.

Forums - open forums, spam scattered across the Web. "Oh, here's a guest book that's open and they forgot to add nofollow; I am going to leave a link there." "Oh look, here's a forum that accepts registration, and they forgot to turn nofollow on; anyone can leave a link." Even things like do-follow blogs and do-follow blog comments - man, it's really risky, because those are linking to bad places a lot of the time, and it is usually manipulative people with no intent to create something of value; they are merely trying to manipulate their rankings. Whenever you have a tactic like that, it attracts people who have nasty websites, and then Google looks at those and goes, "Okay, they're linking to a bunch of nasty sites. Well, I don't want to count those links, or maybe I am even going to penalize some of the people they are linking to." That really sucks. Then link farms, which essentially means setting up systems of links that point to each other across tons of domains, links that are completely artificial, that exist for no discernible human reason and are merely meant to manipulate the engines.

This type of stuff is very, very dangerous when you are early on. If you have already built up a good collection of these types of links, you are much safer. You do have some risk in those first three, six, nine months after you have launched a new site around doing wrong things on the link building front and getting yourself into a situation where you are penalized. We see a ton of that through SEOmoz Q&A. I get it in email. You see it on the Web all the time. So, be cautious around that.

Saturday 6 August 2011

The Power of CSS

In this lesson, we would like to give you a better idea of CSS and provide more examples of how it can be used for clean, lightly coded, and effective web design. First, let's define CSS. CSS stands for "cascading style sheets". The World Wide Web Consortium, also referred to as the W3C, defines CSS as follows:
"Style sheets describe how documents are presented on screens, in print, or perhaps how they are pronounced [...] By attaching style sheets to structured documents on the Web (e.g. HTML), authors and readers can influence the presentation of documents without sacrificing device-independence or adding new HTML tags."
"Cascading Style Sheets (CSS) is a style sheet mechanism that has been specifically developed to meet the needs of Web designers and users."

(http://www.w3.org/Style/)

How do style sheets work?

First, define a style. For instance, you want to define text as deep blue 12px size Verdana font in bold:
color: #5500DD;
font-size: 12px;
font-family: Verdana;
font-weight: bold;

Next, we give this style a custom name of our own (further referred to as a "class"); note the leading dot that marks a class selector:
.mystyle
{
color: #5500DD;
font-size: 12px;
font-family: Verdana;
font-weight: bold;
}

…or, alternatively, we associate it with a particular HTML tag:
h1
{
color: #5500DD;
font-size: 12px;
font-family: Verdana;
font-weight: bold;
}

Styles can be declared in the HTML document itself with the help of the <style> tag anywhere in the document. However, it's preferable to keep them out of your page, in a separate file with the ".css" extension ("mystyle.css"), to further reduce the size of your HTML file. To be able to use the styles declared by that file in your HTML document, you must provide a link to "mystyle.css" within the HEAD area of your HTML document:
<link rel="stylesheet" href="mystyle.css">
With the link described above in your HEAD area, all HTML tags that have styles defined for them in "mystyle.css" will pick up those styles when shown in a browser.

How can CSS help with optimization?

Imagine there's some HTML code used to print a heading on your page:
<strong><font color="#FF0000" size="7">Main Heading of My Site</font></strong>
Now look how the same effect can be achieved using styles:
<span class="mystyle"> Main Heading of My Site </span>
The HTML for the CSS version is half as long as the code without CSS. This, as you already know, is an advantage with search engine spiders, since the improved content-to-code ratio can give more weight to your content.
And even better:
<h1> Main Heading of My Site </h1>
(with a <link rel="stylesheet" href="mystyle.css"> in the HEAD and provided a style is declared for the H1 heading in “mystyle.css”).

CSS rollovers vs. JavaScript rollovers

Rollover menu effects are very popular. However, they commonly require JavaScript. Fortunately, rollovers can be made with CSS alone, requiring no scripting and remaining fully readable by the spiders.
CSS lets you avoid JavaScript and still emulate rollover effects gracefully with very little code, but one of the greatest benefits is providing more textual content for spiders to read. Using CSS for rollover effects instead of separate images will give you a real advantage in the search engine battle, especially if the textual links contain your key phrases.
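As a minimal sketch of the idea (the class name "navlink", the colors, and the link target are purely illustrative assumptions), a text-link rollover needs nothing more than the :hover pseudo-class:
<style type="text/css">
/* plain text links remain fully readable by spiders */
a.navlink {
color: #5500DD;
background-color: #FFFFFF;
text-decoration: none;
}
a.navlink:hover {
color: #FFFFFF;
background-color: #5500DD; /* the colors swap on mouseover, with no JavaScript involved */
}
</style>
<a class="navlink" href="products.html">Our Products</a>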

CSS static-text popup effect

Let's create a static-text popup purely through the power of CSS. It can be achieved like this:
<a href="http://www.mysite.com/css/">Links<span>Some text here Some text here Some text here </span></a>
As you can see, the "popup" text is a span element inside the hyperlink. One more thing to do is to add a rule that prevents the text from showing up when the page loads:
div#links a span {display: none;}
Images can be the element inside the hyperlink too. Here's one example:
<a href="http://www.mysite.com/css/">Links<img src="picturename.gif"></a>
To prevent the image from showing up when the page loads, add this rule:
div#links a img {height: 0; width: 0; border-width: 0;}
On the other hand, we make the image appear on mouseover thanks to this rule:
div#links a:hover img {position: absolute; top: 190px; left: 55px; height: 50px; width: 50px;}
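Putting the pieces together, here is a minimal sketch; the id "links", the URL, and the pixel offsets are only illustrative assumptions:
<div id="links">
<a href="http://www.mysite.com/css/">Links<span>Some text here Some text here</span></a>
</div>

<style type="text/css">
div#links a span {display: none;} /* the popup text is hidden when the page loads */
div#links a:hover span {display: block; position: absolute; top: 190px; left: 55px; padding: 5px; background-color: lightyellow; border: 1px dashed gray;} /* ...and revealed on mouseover */
</style>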

CSS image popup effect

Any beautiful picture you want to enlarge for your Web page visitors can be given a CSS popup effect like this:
<style type="text/css">
.thumbnail{
position: relative;
z-index: 0;
}
.thumbnail:hover{
background-color: transparent;
z-index: 50;
}
.thumbnail span{ /*CSS for enlarged image*/
position: absolute;
background-color: lightyellow;
padding: 5px;
left: -1000px;
border: 1px dashed gray;
visibility: hidden;
color: black;
text-decoration: none;
}
.thumbnail span img{ /*CSS for enlarged image*/
border-width: 0;
padding: 2px;
}
.thumbnail:hover span{ /*CSS for enlarged image on hover*/
visibility: visible;
top: 0;
left: 60px; /*position where enlarged image should offset horizontally */
}
</style>
The effect comes into play when the user moves the mouse over the thumbnail image. You can include this "image popup" code on any HTML page.
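The CSS above assumes markup along these lines (the class name comes from the style sheet; the image file names, link target, and caption are just placeholders):
<a class="thumbnail" href="photo_large.gif">
<img src="photo_small.gif" border="0">
<span><img src="photo_large.gif"><br>A short caption for the enlarged image</span>
</a>
Because the span sits inside the .thumbnail anchor (which is position: relative), the enlarged image appears 60px to the right of the thumbnail on hover.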

CSS button effect instead of JavaScript

A 3-D button is another creative CSS technique we'd like to recommend. This captivating effect makes the button appear pressed when you move the mouse over it. To achieve the effect, feel free to use the code below:
a {
display: block;
border: 1px solid;
border-color: #aaa #000 #000 #aaa;
width: 8em;
background: #fc0;
}


a:hover
{
position: relative;
top: 1px;
left: 1px;
border-color: #000 #aaa #aaa #000;
}
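Note that the rules above restyle every link on the page. In practice you would probably scope them to a class of your own; the class name "button" and the link below are just illustrative:
a.button {
display: block;
border: 1px solid;
border-color: #aaa #000 #000 #aaa;
width: 8em;
background: #fc0;
}

a.button:hover {
position: relative;
top: 1px;
left: 1px; /* shifting the link down and right on hover... */
border-color: #000 #aaa #aaa #000; /* ...and swapping the border colors makes it look pressed */
}

<a class="button" href="contact.html">Contact us</a>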

Where should CSS be used with caution?

When you code your page using CSS, you should always keep in mind that search engines don't like hidden content. Thus, you should avoid using "display: none", "visibility: hidden" or similar declarations to hide sections stuffed with keywords. Such tricks are plain spamming and are not recommended.
Another technique you should avoid is making text the same color as the background with the help of CSS. This technique is used by spammers to hide keywords from human visitors and can get your pages penalized.
One more crucial thing: use HTML tags for their intended purpose. For example, you shouldn't use CSS to turn a tag into something it isn't just for a ranking benefit; if all you need is styling, use classes instead.
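For instance, keep real headings in heading tags and use a class for text that merely needs visual emphasis; the class name "highlight" here is just an illustrative choice:
<h1>Main Heading of My Site</h1>
<span class="highlight">A phrase that only needs visual emphasis, not heading status</span>

.highlight {
color: #5500DD;
font-weight: bold;
}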
Using CSS reduces the size of your HTML code to a minimum and gives you the best opportunity to use keywords effectively in the important HTML heading tags (h1, h2, etc.). But remember: the time when CSS was invisible to search engines has gone. Today's engines can easily read cascading style sheets, and they are now aware of what they once were not!

What you should remember from this lesson:

  1. CSS is a perfect means of increasing your content-to-code ratio, lowering HTML file size, and striking a balance between a clean visual design and clean code for the spiders.
  2. You should avoid using CSS to visually hide sections stuffed with keywords, or to make text the same color as the background.

Saturday 30 July 2011

Code of Ethics


As you know from the section of the course that discussed the history of search engines and optimization, there was a time when the optimization process was simple and involved little more than tweaking meta tags and repeating keywords within the content of a page. As a result, there were only a limited number of SEO companies.

Today the industry is highly competitive. Worldwide, there are now thousands of search engine optimization companies. There is also an ever increasing number of websites, and many of them are already well optimized, which makes the competition much tougher. In the past, when an SEO company optimized a client's site, most other sites in direct competition were poorly optimized or not optimized at all; the game was easier. But now almost every competitor of yours is using the services of some SEO to optimize their site.

As you can guess, this makes the game really tough. It also leads up to half of SEO/SEM companies to consider unethical and even illegal strategies in order to get their customers to the top of the search engine result pages.

Nevertheless, this rarely gives these companies any competitive advantage as most solid and respectable companies seeking SEO services now recognize that guarantees of top-10 rankings sound, at the very least, suspicious.

Let's imagine a company called FakeCompany LTD has a website that offers some products. Apart from product advertising, the site does not contain any information of value to visitors, so the search engines do not rank it highly. So FakeCompany hires SEO expert Mr. Doe to help them with rankings, because Mr. Doe claims he can get any site into the top 10.

What this SEO expert does is stuff the company's pages with irrelevant keywords, create a link system with thousands of hidden links, and then implement some advanced spamming techniques such as dynamic page generation and cloaking.

As a result, the rankings of the page are initially boosted. FakeCompany is happy and pays the fee to Mr. Doe.

Still, visitors coming to the site see even more of a mess than before the "optimization", and they are still unable to find any valuable content. Since they find FakeCompany's site for the irrelevant keywords, that is, not for the ones they are actually seeking, the conversion of visitors into customers is very low.

As a result, customers are unsatisfied and FakeCompany receives huge bills for traffic which hasn't been converted and hasn't brought any real profit.

The search engine that allowed the spammer's site to sit at the top of its listings is also unsatisfied, since it is losing popularity among Web surfers by serving irrelevant pages. So it invests money in developing a more advanced spider that finally cracks Mr. Doe's tricks, and FakeCompany's positions fall. Eventually, FakeCompany is excluded from the search engine's index entirely, because an unsatisfied visitor or a competitor reported to the search engine that FakeCompany was using spam methods.

Of course, FakeCompany is unhappy with this situation and sues Mr. Doe to get its money refunded. Mr. Doe is unwilling to cooperate, or perhaps has even managed to disappear before the whole mess started. In the best case, FakeCompany gets its money back but is never able to restore its rankings and has to invest in a new website.

Nobody is satisfied in this story; however, such things do happen now and again, even in today's more sophisticated SEO environment.

Unfortunately, there is no solution for such cases other than abandoning spam techniques entirely and following a Code of Ethics for all search engine optimizers, one that preserves their good reputation and withstands the crowds of unethical SEO companies waving the banner of illegal yet "effective" promotion strategies.

Adhering to a Code of Ethics (or Code of Conduct), if presented properly, may serve as an effective competitive advantage.

As we believe such Codes should be unified across the Web, we will not invent our own. Instead, we support the one maintained by the well-known industry expert Bruce Clay and his company. This excellent collection of rules can be found at www.bruceclay.com, and we provide a copy here for your reference.

Code of Ethics

Whereas all parties are working towards presenting relevant and high quality information in an easy to use format to information seekers and, whereas SEO practitioners are being contracted to assist clients in obtaining higher rankings for client pages, we (and those linking to this page) are voluntarily adhering to the below SEO Code of Ethics:

No SEO practitioner will intentionally do harm to a client. This involves the continued use of any technology or procedure (without appropriate care) that is known to result in having the client site removed from search engine indexes or directories or rendered inoperative. Questionable adherence to standards must be addressed via the Robots Exclusion Standard.

No SEO practitioner will intentionally violate any specifically published and enforced rules of search engines or directories. Should rules and guidelines change (as they often do), the SEO practitioner will promptly take action to comply with the changes as they apply to all clients. Where rules and guidelines are unclear, the SEO practitioner will seek clarification and await approval from the appropriate search engine before continuing to utilize potentially harmful technology or procedures.

No SEO practitioner will intentionally mislead, harm or offend a consumer. All individuals utilizing a search engine to visit a site will not be misled by the information presented to or by the search engine or harmed or offended upon arrival at the client site. This includes techniques like "bait and switch" where the client page does not substantially contain and is not clearly associated with the optimized phrase or may be reasonably offensive to targeted visitors.

No SEO practitioner will intentionally violate any laws.
This involves the deliberate and continued violation of copyright, trademark, servicemark or laws related to spamming as they may exist at the state, federal or international level.

No SEO practitioner will falsely represent the content of the client site. This includes the practice of presenting different versions of Web pages to different users except where that information is altered solely to meet browser specifications and needs, sensitivity to regional factors such as language or product specific needs. In general, ALL requests for a specific URL should be served identical HTML by the Web server.

No SEO practitioner will falsely represent others work as their own. This includes the taking of work from others in whole or in part and representing this work as their own. The SEO practitioner may not make verbatim copies of the work of others (instead of authoring original work) without the prior consent of the other party.

No SEO practitioner will misrepresent their own abilities, education, training, standards of performance, certifications, trade group affiliations, technical inventory or experiences to others. This includes quantifiable statements related to project timetables, performance history, company resources (staff, equipment and proprietary products) and client lists. Guarantees will be restricted to items over which the SEO practitioner has significant and reasonable control.

No SEO practitioner will participate in a conflict of interest without prior notice to all parties involved.
This includes the practice of choosing to emphasize one client over another in competing keywords because there is more personal gain for the practitioner. All clients are treated equally and all will receive equal best effort in their Search Engine Optimization.

No SEO practitioner will set unreasonable client expectations.
This includes the practice of accepting more than a reasonable number of clients competing for the same keywords and implying that all will be in the top positions in the search engines. This also includes the implication that results can be obtained in an unreasonable amount of time given the known condition of the search engines, client site and competition.

All SEO practitioners will offer their clients both internal and external dispute resolution procedures.
This includes the publishing of address and phone numbers on primary Web pages, the inclusion of third-party dispute resolution links prominently placed within the practitioner’s website and contracts that include sections discussing dispute resolution.

All SEO practitioners will protect the confidentiality and anonymity of their clients with regards to privileged information and items implying testimonial support for the SEO practitioner. All staff of SEO practitioner shall be bound to protect information that is not generally known as it may harm the client. The SEO practitioner will not include the publishing of testimonials and proprietary logos of client lists, press releases and other collateral discussing the client without explicit approvals.

We try to keep our sites and services compliant with this code and advise that you do the same; however, if you don't wish to use this code, you are welcome to make any custom modifications or invent your own Code, as long as it remains legal, transparent and ethical.

Thursday 28 July 2011

What is Gray-Hat SEO?


Search engine guidelines clearly define Black-Hat techniques as spamming techniques. You can recognize and avoid them in your SEO campaigns. However, there are also so-called Gray-Hat techniques, which are as yet unknown to, or not restricted by, the search engines.
Gray-Hats are different because they try to do things they believe are ill-defined in Google's guidelines, without first asking permission.
Let's look at SearchSecurity.com's description of this notion: "Gray-Hat describes a cracker (or, if you prefer, hacker) who exploits a security weakness in a computer system or product in order to bring the weakness to the attention of the owners. Unlike a Black-Hat, a Gray-Hat acts without malicious intent. The goal of a Gray-Hat is to improve system and network security. However, by publicizing vulnerability, the Gray-Hat may give other crackers the opportunity to exploit it. This differs from the White-Hat who alerts system owners and vendors of vulnerability without actually exploiting it in public".
Google has clearly defined Gray-Hat SEO as a risky, ill-advised method. Here is the indirect spam definition of Gray-Hat techniques from the top search engine: "It's not safe to assume that just because a specific deceptive technique isn't included on this page, Google approves of it.
Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit… If you believe that another site is abusing Google's quality guidelines, please report that site… spam reports we receive are used to create scalable algorithms that recognize and block future spam attempts."
Now that you have a sufficient number of Gray-Hat definitions, you should clearly understand the danger of any spam or spam-like technique. Gray-Hat techniques should not be used. Never deceive anyone, and avoid such methods at all costs.
Here we'd like to show some examples of Gray-Hat techniques:

Outdated Gray-Hat Techniques

Mild keyword stuffing
Keyword stuffing is deceptive by its very nature. Search engines recommend that site owners write quality, relevant content for visitors, not for the engines' ranking mechanisms. The main criterion for using keywords in your copy should be the question: would you add this many repetitive keywords if you were writing for human visitors only? Gray-Hats prefer to violate this guideline in a mild way: the number of keywords they use in the meaningful areas of their Web pages stays just inside the allowed limit.
Irrelevant keywords in image ALT attributes
This technique means stuffing ALT attributes with keywords unrelated to the image. The only purpose of this fraudulent technique is to attract more traffic to the page. As you know, any type of keyword stuffing is offensive and violates the search engines' guidelines. They can track the keywords you have chosen and correlate them with the keyword profile of the Web page and of the whole site.
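To illustrate (the file name and keywords here are made up), compare a descriptive ALT attribute with a stuffed one:
<img src="red-leather-sofa.jpg" alt="Red three-seat leather sofa"> <!-- descriptive and relevant to the image -->
<img src="red-leather-sofa.jpg" alt="cheap flights, car insurance, best casino bonus"> <!-- irrelevant keyword stuffing: avoid -->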

Advanced Gray-Hat Techniques

Cloaking
Search engines strictly forbid cloaking for the purpose of optimization. "Cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index" – state Google Webmaster guidelines.
A legitimate example is serving certain areas of your site for the search engine to see but not for ordinary users; a "members only" section is a typical case. Gray-Hat cloaking is mostly unintentional, or borders on the harmful use of different pages for different visitors.
Unintentional cloaking may occur when you serve different content to particular audiences or other groups. Such techniques are very risky, and we recommend you contact each search engine, present your reasoning, and give them the opportunity to approve it.
Black-Hat shadow cloaking starts when site owners manipulate this method intentionally to influence a search engine's ranking algorithm.
Publishing duplicate content
We have spent a lot of time teaching people how to write proper, keyword-targeted, and valuable text. From the keyword research stage up to writing fresh content, this work demands special skills, or additional costs if you hire a professional copywriter.
Instead of relevant, interesting, and unique content, spammers rely on duplicate content, using the same few hundred words on every page or copying someone else's.
The Black-Hat technique copies the whole of the original text, while Gray-Hats prefer to mix and dilute parts of it. Gray-Hats play around the margins to trick the search engines. There is no doubt that fresh, unique content is king, and duplicate content is very, very bad.
There are cases where duplicate content is not only legitimate, but is to be expected. To learn more about legitimate types of duplicate content and how to deal with multiple versions of the same content, refer to the "Duplicate Content Issues" lesson of this training course.
Content mashup
Content mashup is related to the activity described above. Although we are again dealing with content theft, the content is mashed together in a more sophisticated fashion this time. Gray-Hat sites using the content mashup method generate non-unique content from other Web pages.
Irrelevant links
As you can guess from the name of this technique, irrelevant links are links that do not correspond to the topic of the website. Search engines regard these kinds of links as legal but won't give you much weight for them. That's why they count as grey and don't hurt your reputation all that much.
An example of mild spamming is asking for links from every one of your clients or offering some other form of collaboration.
Off-topic link exchange
If you exchange links with a site other than one that deals with your topic, you'll be bordering on a Black-Hat technique. Whether it is a Gray-Hat or Black-Hat exchange will depend on the number of off-topic links involved. Spammers know that a few off-topic links may be devalued, but not penalized.
Mild artificial link schemes
Link schemes were defined in the previous lesson, devoted to Black-Hat SEO techniques. However, some kinds of artificial link schemes can be untraceable when mixed with other varieties of backlink generation. For example, links created within a narrow thematic niche may overlap or create Web rings, even without any initial purpose of manipulating search algorithms, so it may be hard for the search engines to discover the real intentions of the website owners.
Remember that link schemes created for the sole purpose of manipulating SE algorithms may be considered Gray-Hat or even Black-Hat SEO. Thus, reputation-conscious website owners should think twice before getting engaged in unnatural link building - especially when websites hosted on the same server, or websites linking to bad neighborhoods take part in a link exchange game.
Paid links
Not all paid links violate search engine guidelines. "Buying and selling links is a normal part of the economy of the web when done for advertising purposes, and not for manipulation of search results." - as Google states in their Webmasters/Site owners Help (http://www.google.com/support/webmasters/bin/answer.py?answer=66736).
Link buying is ultimately the site owner's responsibility; it is the borderline technique between advertising and manipulative spamming, between what is allowed and what is punished. Generally, link buying exploits a loophole in the search engines' defenses.
To stay on the White-Hat side, purchased links should be hidden from the crawlers by using a robots.txt file or by adding a rel="nofollow" attribute to the <a> tag. Other methods will be punished the day the truth comes out. If you ignore this guideline, you'll slide into Gray-Hat territory, and that could play a dramatic role in your PageRank and SE rankings.
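As a minimal sketch (the URLs and directory name are made up for the example): either add rel="nofollow" to the paid link itself, or route paid links through an intermediate directory that is blocked in robots.txt.
<a href="http://www.sponsor-site.com/" rel="nofollow">Our sponsor</a>

# robots.txt: block the directory that holds paid-link redirects
User-agent: *
Disallow: /paid-links/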
Remember to follow the white SEO line in your optimization works. Ranks should be earned honestly, and search engines will give you the opportunity to keep them with your relevant website.
Domain buying
Domain buying is another Gray-Hat technique used to get a quick ranking boost. The main idea here is simple: you buy an active domain name and get the PageRank (or "link juice") that comes with that website as a bonus to it!
Google is aware of this practice as well and keeps an eye on purchased domains for a long time. If you bought the new domain name for legitimate reasons, such as better spelling or brand awareness, you are safe and sound.
However, the search engines can nullify the links pointing to the domain you bought. You should know that if you act legitimately, you will stay off the search engines' radar; once you start behaving illegitimately, the search engines will regard you with suspicion.
Illegal Link Baiting
Link baiting is a marketing technique that Social Media Marketing entrepreneurs encourage and use widely to promote a site, business, or brand through different social media channels. The idea has spread and become another promotion option, one that helps a business engage and interact with existing or potential consumers.
The concept of link baiting is creating content and tools that people want to link their websites to, or creating article content people actually want to publish.
Illegal link baiting starts when the promotion becomes deceptive or irrelevant, and links or social bookmarking votes are spread by a group of colluding members, often for payment.
Gray-Hat methods include irrelevant viral widgets and tools that spammers offer in exchange for links to off-topic sites. Because Gray-Hat SEO is a risky and ill-advised method, we strongly recommend you avoid any deceptive spam techniques at all costs. Always bear in mind competitors' future spam reports and the search engines' penalties for abusing their guidelines, penalties which can really hurt your rankings and your whole website.
If you are still in doubt about the color of your SEO techniques, read how search engines can adapt to webmasters' behavior. According to SEOmoz (http://www.seomoz.org/article/analysis-of-link-spam-alliances-paper), search engines can apply the following methods to track and prevent spam:
  1. "The use of visitor click-through rates and time spent on websites to determine quality.
  2. Advanced text analysis to determine legitimacy and quality of content.
  3. Vision-based analysis of Web page layouts to find commonalities in spam vs. non-spam pages.
  4. Editorial reviews of suspicious sites.
  5. Deflation in rankings of sites with suspected artificial link structures.
  6. Monitoring of rates of change in the speed of link acquisition and loss.
  7. Spam reports by users (and competitors)".
To sum everything up, Gray-Hat methods border on Black-Hat techniques and may result in your website being banned. Moreover, these methods constantly lose their effectiveness as the search engines evolve and regularly update their indexing and ranking algorithms.

What you need to remember:

  1. Don’t use Gray-Hat techniques - they are risky and ill-advised SEO methods.
  2. If you go Gray, you will risk being reported to search engines by your competitors.

Tuesday 26 July 2011

Mission ImposSERPble: Establishing Click-through Rates


Google and its user experience are ever changing. With Google holding more than 60% of the search market, it's common to hear the question, "How many visitors can we expect if we rank [x]?" It's a fair question. It's just impossible to predict. Which is a fair answer. But, as my father says, "If you want fair, go to the Puyallup." So we inevitably hear, "Well, can you take a guess? Or give us an estimate? Anything?"
To answer that question, we turned to major studies about click-through rates, including Optify, Enquiro, and the studies released using the leaked AOL data of 2006. But those studies are old; this study is new. Ladies, Gentlemen, and Mozbot, it is our immense pleasure to present to you…
The Slingshot SEO Google CTR Study: Mission ImposSERPble
There have been a number of changes to the Google user experience since those studies/surveys were published years ago. There's a new algorithm, a new user interface, increased mobile search, and social signals. On top of that, the blended SERP is riddled with videos, news, places, images, and even shopping results.
We made this study super transparent. You can review our step-by-step process to see how we arrived at our results. This study is an ongoing project that will be compared with future SERPs and other CTR studies. Share your thoughts on the study and the research process to help us include additional factors and methods in the future.
Our client databank is made up of more than 200 major retailers and enterprise groups, and our sample set was chosen from thousands of keywords based on very strict criteria to ensure the accuracy and quality of the study results.
The study qualification criteria are as follows:
  • A keyword phrase must rank in a position (1 to 10)
  • The position must be stable for 30 days
Each keyword that we track at Slingshot was considered and every keyword that matched our strict criteria was included. From this method, we generated a sample set of exactly 324 keywords, with at least 30 in each of the top 10 ranking positions.
We are confident in the validity of this CTR data as a baseline model, since the data was generated using more than 170,000 actual user visits across 324 keywords over a 6-month period.
Data-Gathering Process
Authority Labs: Finding Stable Keywords
We currently use Authority Labs to track 10,646 keywords' daily positions in SERPs. From this, we were able to identify which keywords had stable positions for 30 days. For example, for the keyword “cars,” we observed a stable rank at position 2 for June 2011.
Stable 30 day ranking - ImposSERPble
Google Adwords Keyword Tool: All Months Are Not Created Equal
We found the number of [Exact] and “Phrase” local monthly searches using the Google Adwords keyword tool. It is important to note that all keywords have different monthly trends. For example, a keyword like “LCD TV” would typically spike in November, just before the holiday season. If you’re looking at searches for that keyword in May, when the search volume is not as high, your monthly search average may be overstated. So we downloaded the .csv file from Adwords, which separates the search data by month for more accuracy.
Google keyword tool csv download - ImposSERPble
By doing this, we were able to calculate our long-tail searches for that keyword. “Phrase” – [Exact] = Long-tail.
Google Analytics: Exact and Long-Tail Visits
Under Keywords in Google Analytics, we quickly specified the date of our keywords’ stable positions. In this case, “cars” was stable in June 2011. We also needed to specify “non-paid” visits, so that we were only including organic results.
Google analytics non paid - ImposSERPble
Next, we needed to limit our filter to visits from Google in the United States only. This was important since we were using Local Monthly Searches in Adwords, which is specific to U.S. searches.
Google analytics phrase and exact - ImposSERPble
After applying the filter, we were given our exact visits for the word “cars” and phrase visits, which included the word “cars” and every long-tail variation. Again, to get the number of long-tail visits, we simply used subtraction: Phrase – Exact = Long-Tail visits.
Calculations
We were then able to calculate the Exact and Long-Tail Click-through rate for our keyword.
EXACT CTR = Exact Visits from Google Analytics / [Exact] Local Monthly Searches from Adwords
LONG-TAIL CTR = (Phrase Visits – Exact Visits from Google Analytics) / (“Phrase” – [Exact] Local Monthly Searches from Adwords)
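As a quick worked illustration with made-up numbers (these are not figures from the study): suppose a keyword had 50,000 [Exact] and 150,000 "Phrase" local monthly searches in Adwords, and Analytics showed 9,100 exact and 12,000 phrase organic visits for its stable month. Then:
EXACT CTR = 9,100 / 50,000 = 18.2%
LONG-TAIL CTR = (12,000 - 9,100) / (150,000 - 50,000) = 2,900 / 100,000 = 2.9%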
Results
What was the observed CTR curve for organic U.S. results for positions #1-10 in the SERP?
Based on our sample set of 324 keywords, we observed the following curve for Exact CTR:
Google CTR curve - ImposSERPble
Our calculations revealed an 18.2% CTR for a No. 1 rank and 10.05% for No. 2. CTR for each position below the fold (Positions 5 and beyond) is below 4%. An interesting implication of our CTR curve is that for any given SERP, the percentage of users who click on an organic result in the top 10 is 52.32%. This makes sense and seems to be typical user behavior, as many Google users will window shop the SERP results and search again before clicking on a domain.
Degrees of Difference
CTR study comparisons - ImposSERPble
The first thing we noticed from the results of our study was that our observed CTR curve was significantly lower than these two previous studies. There are several fundamental differences between the studies. One should not blindly compare the CTR curves between these studies, but note their differences.
Optify’s insightful and thorough study was conducted during the holiday season of December 2010. There are significant changes in Google’s rankings during the holiday season that many believe have a substantial impact on user behavior, as well as the inherent change in user intent.
The study published by Enquiro Search Solutions was conducted in 2007 using survey data and eye-tracking research. That study was the result of a business-to-business focused survey of 1,084 pre-researched and pre-selected participants. It was an interesting study because it looked directly at user behavior through eye-tracking and how attention drops off as users scroll down the page.
Long-Tail CTR: Volatile and Unpredictable
For each keyword, we found the percentage of click-through for all long-tail terms over the same period. For example, if “cars” ranks at position 2 for June 2011, how much traffic could that domain expect to receive from the keyword phrases “new cars,” “used cars,” or “affordable cars?” The reasoning is, if you rank second for “cars,” you are likely to drive traffic for those other keywords as well, even if those positions are unstable. We were hoping to find an elegant long-tail pattern, but we could not prove that long-tail CTR is directly dependent on the exact term’s position in the SERP. We did observe an average long-tail range of 1.17% to 5.80% for each position.
Google CTR data table - ImposSERPble
Blended SERPs: The “Universal” Effect
Starting in May 2007, news, video, local, and book search engines were blended into Google SERPs, which have since included images, videos, shopping, places, real-time, and social results. But do blended SERPs have lower CTRs? Since these blended results often push high-ranking domains towards the bottom of the page, we predicted that CTR would indeed be lower for blended SERPs. However, a counter-intuitive hypothesis would suggest that because certain SERPs have these blended results inserted by Google, they are viewed as more credible results and that CTR should be higher for those blended SERPs. We analyzed our sample set and failed to show significant differences in user behavior regarding blended versus non-blended results. The effect of blended results on user behavior remains to be seen.
Google CTR blended data table - ImposSERPble
As previously mentioned, this study will be compared against future SERPs as the Slingshot SEO Research & Development team continues to track and analyze more keywords and collect additional CTR data. It is our hope that these findings will assist organic SEOs in making performance projections and in considering multiple factors when selecting keywords. We look forward to additional studies, both yours and ours, on CTR, and we encourage you to share your findings. With multiple recent and upcoming social releases, our research team will be dedicated to examining the effect of social platforms on click-through rates and how the organic CTR curve changes over time.

SERP Click Through Rate of Google Search Results – AOL-data.tgz – Want to Know How Many Clicks The #1 Google Position Gets?


Well, after some gentle persuasion of MySQL, the AOL-data.tgz files have surrendered some interesting, if not wholly unexpected, information about the relative strengths and click-through rates of SERP positions.
The dataset contained 36,389,567 search queries with 19,434,540 clickthroughs. While we all knew the importance of the top 3 positions in the Google SERPs, this analysis further reinforces that fact:
SERP Click Through Rate of Top 10 SERP Positions

Interestingly, the #1 SERP position receives 42.3% of all clickthroughs. The #2 position accounts for only 11.92% of all clickthroughs – almost 72% fewer clickthroughs than the top position in the SERPs. Attaining the #1 position for your keywords/phrases results in nearly 4 times more traffic than your nearest rival gets – now that's a serious difference in both traffic and potential revenue.
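That roughly 72% gap falls straight out of the raw numbers in the table below:
Delta for #2 vs. #1 = (2,316,738 - 8,220,278) / 8,220,278 ≈ -71.82%
Equivalently, 11.92% / 42.30% ≈ 0.28, so position #2 receives a little over a quarter of the clicks that position #1 does.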
A #3 placement in the SERPs results in an 8.44% clickthrough rate, almost 30% less than #2 and over 80% less than the top position on the first results page.
As we move down the page the rate of decline in clickthrough also falls. Notice that a #10 position in the SERPs receives slightly more clickthroughs than #9. This is most probably related to users glancing at the final listing as they scroll to the page navigation:
Clickthrough Analysis of SERP Pages 1-4 (chart showing the click-through rates of positions #11, #20, #21, #31 and #41)

Moving off the first SERP, the rate of decline in clickthroughs picks up considerably. The clickthrough rate for listings at #11 drops to 0.66%. That's an almost 80% decline in clickthroughs from the #10 SERP position and shows that being on the first SERP page brings far greater search engine traffic than lower listings.

Google SERP Click Through Rates – The Raw Numbers

Rank | Click Throughs | % of Total | Delta vs. #n-1 | Delta vs. #1
All  | 19,434,540 | 100%   |         |
1    | 8,220,278  | 42.30% | n/a     | n/a
2    | 2,316,738  | 11.92% | -71.82% | -71.82%
3    | 1,640,751  | 8.44%  | -29.46% | -80.04%
4    | 1,171,642  | 6.03%  | -28.59% | -85.75%
5    | 943,667    | 4.86%  | -19.46% | -88.52%
6    | 774,718    | 3.99%  | -17.90% | -90.58%
7    | 655,914    | 3.37%  | -15.34% | -92.95%
8    | 579,196    | 2.98%  | -11.69% | -92.95%
9    | 549,196    | 2.83%  | -5.18%  | -93.32%
10   | 577,325    | 2.97%  | +5.12%  | -92.98%
11   | 127,688    | 0.66%  | -77.88% | -98.45%
12   | 108,555    | 0.56%  | -14.98% | -98.68%
13   | 101,802    | 0.52%  | -6.22%  | -98.76%
14   | 94,221     | 0.48%  | -7.45%  | -98.85%
15   | 91,020     | 0.47%  | -3.40%  | -98.89%
16   | 75,006     | 0.39%  | -17.59% | -99.09%
17   | 70,054     | 0.36%  | -6.60%  | -99.15%
18   | 65,832     | 0.34%  | -6.03%  | -99.20%
19   | 62,141     | 0.32%  | -5.61%  | -99.24%
20   | 58,382     | 0.30%  | -6.05%  | -99.29%
21   | 55,471     | 0.29%  | -4.99%  | -99.33%
31   | 23,041     | 0.12%  | -58.46% | -99.72%
41   | 14,024     | 0.07%  | -39.13% | -99.83%
Click Through Rates of Google SERPs based on AOL-data.tgz

The volume of clickthroughs for lower SERP positions is so trivial that, for all but the highest volume search terms, these positions will generally yield little or no benefit to site owners (obviously some niches will prove to be exceptions).
The main message from the AOL data is that page 1 of the SERPs is where the real action lies, and the #1 position reigns supreme.