
Friday 3 February 2012

Is PageRank Important?

Just Enough Knowledge to be Dangerous

One of the bigger problems with learning SEO is that many people have only a nugget of information, yet spread it far and wide without the context needed to evaluate the potential risks and rewards of any given strategy. So new SEOs end up thinking topic x is the most important, then topic y, then topic z, until someone debunks one of them. Many false facts get taken as truths when people repeat that lone nugget (found from some source or other) as established fact.

Accurate Answers Need Context

As the structure of the web changes and search engine relevancy algorithms change, so must the field of SEO. This means the right answer to a question can change frequently, and information from many years ago may no longer be correct. Does PageRank matter? When I first got into SEO it was crucially important, but over the years other pieces of the relevancy algorithms (like domain age, domain name, domain trust, domain extension, link anchor text, searcher location, search query chains, word relationships, search personalization, other user data, result re-ranking based on local inter-connectivity, input from 10,000+ remote quality raters, and even a wide array of penalties & filters) have been layered over the top of the core relevancy algorithm.

If that sounds like a lot, it is because it is!

Yes, PageRank is important for driving indexing, but for rankings it is nowhere near as important as it once was. SEO has become a much more refined art. In an October 2009 interview, Google's Amit Singhal stated:

No one should feel, if I dismantle the current search system, someone will get upset. That’s the wrong environment. When I came, I dismantled [Google cofounders] Larry and Sergey’s whole ranking system. That was the whole idea. I just said, That’s how I think it should be done, and Sergey said, Great!

Great SEO Service is Interactive

Search keeps innovating - as it must. Each layer of innovation creates new challenges and new opportunities.

Not only does SEO strategy change over time, but it also varies from site to site. A large corporate site has a different set of strengths and weaknesses than a small local business website. The best SEO advice must incorporate all of the following:
  • where you are
  • where you want to be
  • the resources you have to bridge the gap between the above 2 (domain names, brand, social relations, public relations, capital, etc.)
  • what the competition is doing
  • your strengths and weaknesses relative to your market

That is why having an interactive SEO Community is so important. It allows us to look for competitive strengths and weaknesses, and offer useful tips that fit your market, your website, and your business.

Even Search Engineers Don't Know All the Search Algorithms

The algorithms are so complex that sometimes even leading search engineers at Google are uncertain of what is going on. No single engineer can know every bit of the code: Google has made over 450 algorithm changes in a single year.

When I first wrote about a new algorithmic anomaly that I (and others) saw, I got flamed with some pretty nasty words on public SEO sites...a few of which are highlighted below:

[Screenshot: forum comments]

The above people were:

  • confident
  • rude
  • wrong

And that is part of the reason I stopped sharing as much research publicly. Sharing publicly meant...

  • spending long hours of research and writing (for free)
  • creating more competition for myself (from the people who listen to my tips and advice)
  • watching my brand get dragged through the mud by people who didn't have the experience or capacity needed to understand and evaluate what I was writing about (but who had enough time to hang out in a free forum and blast me).

Whereas if we share that sort of information in our exclusive member forums we...

  • help our customers
  • get to share information and learn from each other's experiences
  • don't get blasted by the trolls hanging out on the public forums

Google's Matt Cutts Confirmed I Was Right

In early 2008 Google's Matt Cutts (one of the top 4 search engineers working at Google) wrote about the above issue - an issue he did not know existed even AFTER he was alerted to it.

[Screenshot: Matt Cutts on the position 6 issue]

But notice that Matt would not confirm the issue until he claimed it had been corrected. So if you wanted to research that issue to better learn the relevancy algorithms, it was already gone.

SEO professionals either captured the opportunity early or missed it. And, if they waited for the official word from Google, they missed it.

Algorithmic anomalies & new penalties are often written off by the industry & then months or years later the truth is revealed.

Back to PageRank

So PageRank...is it important? Yes, primarily for:

  • determining the original source of content when duplicates of a page exist
  • selecting the initial set of results (before re-ranking them based on other factors)
  • establishing the crawl priority and crawl depth of a site
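
For context, the classic PageRank formula from the original Brin & Page paper gives a sense of what "raw PageRank" means - a simplified sketch, since Google's production version has long since diverged from it:

PR(A) = (1 - d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))

where T1...Tn are the pages linking to page A, C(T) is the number of outbound links on page T, and d is a damping factor, usually set around 0.85.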

But when determining which site ranks better than the next, link diversity is typically far more important than raw PageRank. And even though PageRank is supposed to be query independent, Google warps their view of the web graph where necessary to improve relevancy, like when localizing search results:

Q: Anything you’ve focused on more recently than freshness?

A: Localization. We were not local enough in multiple countries, especially in countries where there are multiple languages or in countries whose language is the same as the majority country.

So in Austria, where they speak German, they were getting many more German results because the German Web is bigger, the German linkage is bigger. Or in the U.K., they were getting American results, or in India or New Zealand. So we built a team around it and we have made great strides in localization. And we have had a lot of success internationally.

The above quote shows how they look at far more than PageRank and links when trying to determine relevancy.

3 Common SEO Approaches

There are 3 basic ways to approach search engine optimization:

  • a mechanical strategy, where you try to outsmart the search engines and stay ahead of their relevancy algorithms
  • a marketing-based approach, where you try to meet ranking criteria by creating the types of content that other people value and making sure you market it aggressively
  • a hybrid approach, where you take the easy mechanical wins and study general algorithmic shifts...but primarily drive your decisions based on fundamental marketing principles

Comparing the 3 Strategies

For most people the first approach is simply too complex, risky, and uncertain to be worth the effort. Not only do the sites get burned to the ground, but it happens over and over again, so it is quite hard to build momentum and a real business that keeps growing. In fact, most of the top "black hat" SEOs have "white hat" sites that help provide stable income in case anything happens to their riskier sites. Some people are great at coming up with clever hacks, but most people would be better off focusing on building their business using more traditional means.

If search engineers have access to the source code and still don't know everything, how can people outside the company know everything? They can't. Which is why we take a hybrid approach to SEO.

The approach we teach is the hybrid approach - a marketing-based strategy with some easy mechanical wins mixed in. Our customers take some of these easy wins to help differentiate their strategy from uninformed competitors, and then use marketing principles to build off of early success.

The Paradox of SEO

In using a marketing-based approach you build up many signals of trust, and many rankings, as a side effect of doing traditional marketing. If people are talking about you and like your products then you are probably going to get some free high-quality links. And this leads us to the paradox of SEO: "the less reliant your site is on Google the more Google will want to rely on your site."

If you want good information about what is working and what is not, you can use our site search to find answers to most common SEO questions, and know you are getting answers from a trustworthy source. The information we give away is of higher value than what most people sell.

Thursday 19 January 2012

10 Elements of a Perfectly Optimized Page

One area where search engines have made significant advances in recent years is how they evaluate content on a website. So what does a perfectly optimized page look like in 2012? Let’s look at 10 elements.

[Image: "perfectly optimized page" example for a béarnaise sauce recipe]

1. Title tags are still important, but it’s not a good idea to over-optimize them.

2. Descriptions still don’t appear to add much ranking value, but can help encourage clicks.

3. Header tags still need to be relevant.

4. URL still ideally mentions the keywords.

5. Content is now about semantically relevant supporting keywords, not multiple mentions of the same keyword. The example chosen is a recipe, because in order to make béarnaise sauce there are specific ingredients that are 100 percent relevant to the eventual outcome. One way of checking which keywords Google might consider relevant is to do a ‘~keyword’ (or tilde) search - searching ~recipe, for example, will highlight related terms in the results. Other ways, let’s be honest, involve nothing more than common sense and knowing your subject.

6. Video and other ‘rich’ content can be useful on a page to increase engagement levels, reduce bounce rates and also to appear alongside results as illustrated.

[Image: "apple ipad review" search result with rich content alongside]

7. Internal links need to follow the "reasonable surfer" patent. It makes sense in the "perfectly optimized page" example above to link to peppercorn sauce as an alternative to béarnaise.

8. Facebook/Twitter/other login comments are a way of sharing the content on other platforms. The direct SEO benefit may be debatable, but it never hurts to get your content in front of a large number of people. With Google Search Plus Your World, it could be that adding a Google+ login is more important than anything else.

9. User reviews add regular content to the page, and they can also be coded with microformatting instructions to add extra elements to your listings in search engine result pages (SERPs); see the markup sketch after this list.

10. Newsfeeds only share content that already exists elsewhere, but they contribute to an overall impression of the page changing on a regular basis.
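
As a hedged illustration of point 9, here is roughly what review markup can look like using schema.org microdata (the product name and rating figures are hypothetical):

<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Bearnaise Sauce Mix</span>
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5 based on
    <span itemprop="reviewCount">27</span> customer reviews
  </div>
</div>

Marked up this way, the rating can surface as star snippets in your SERP listings.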

It’s worth noting that the “perfectly optimized page” above won’t be perfect for all verticals, or for all brands – not everyone has the ability to add customer reviews to their product pages (e.g., insurance comparison sites).

Although there's no one-size-fits-all solution, hopefully the above list will give you some guidance on how to perfect your on-page SEO.

Tuesday 10 January 2012

How Google+ Uses SEO to Steal Search from Facebook and Twitter


Google's Superior SEO Strategy

Notice anything odd about your Google+ profile? Does it rank incredibly well in Google’s search results for your own name?
Colleagues note that their G+ profile now outranks other online identities that they’ve worked on for years. My own Google+ profile, just 5 months old, ranks #2 for my name. It now ranks higher than both my Twitter and Facebook profiles, even though I use those services far more often.
Profiles aren’t the only thing ranking. Individual Google+ posts frequently appear in search results as well.
[Image: Google+ domination in search results]
Ranking for people’s names is one of the Holy Grails of search, like Amazon ranking for every book in print. With 7 billion people in the world, ranking on the first page for even a small portion of these is lucrative territory.
As search and social focus more on the individual, the war over names has begun.
How has Google won so much real estate on their own search pages in such a short period of time? Do they cheat? No, not really - more on this later. Google wins by employing really smart Search Engine Optimization techniques – the same SEO practices available to any online business.
For Facebook especially, this is a sensitive issue. Facebook actively prevents Google from crawling most of its content, allowing big G to access “Fan” pages, but limiting information from regular profiles. Now that Google+ has entered the social game, this policy puts Facebook results at risk of dropping in rankings and losing search real estate.
I often work with websites and startups wanting to build SEO features into their platform. If I were to build a social media service for SEO domination from scratch, I would build it exactly like Google+.
Here's the takeaway: Use SEO to your competitive advantage, no matter your niche.

1. Incentivize Inbound Links

Not long ago, Google started displaying author photos in its search results. In order to display a photo, Google asks authors to add links from their webpages to their Google+ profile. This creates potentially millions of high-quality links from the world’s most influential online publishers, all pointing to Google+ profiles.
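A hedged sketch of what such an author link looks like (the profile ID and author name here are hypothetical):
<a href="https://plus.google.com/112233445566778899000" rel="author">Written by Jane Doe</a>
The rel="author" attribute is what tells Google to associate the page with the linked Google+ profile.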
[Image: author links pointing to Google+ profiles]
Twitter and Facebook both benefit from similar links, but never before has a social media service offered such an incentive.
Google's SEO Tactic: Require Authors to Link to their Google+ Profile

2. Internal Linking

One thing people noted about Google+ when it was released was just how easy it was to be in lots of circles, or add lots of people to your own. People who struggled on Twitter for years to build up 1000 followers suddenly found themselves in 2000 or 3000 Google+ circles, seemingly overnight.
[Image: circle counts]
Google’s strategy to connect everyone on the planet also makes for good internal linking. Following more than 1000 people may not create a practical social experience, but it creates a great SEO opportunity. The more your content is shared in other people’s streams and profiles, the more your content is crawled, indexed, and deemed important by search engines.
Google's SEO Tactic: Encourage Large Circle Counts

3. Lots of Indexable Content

My public Google+ profile contains a wealth of information, all visible to search engines, including:
  • Biographical Information
  • Full Text of Public Posts
  • Photos
  • Links to people who have added me to their circles
  • Everything I have ever +1’d
Compare that to my Twitter account – limited to 160 characters of biographical information, or my Facebook profile, which reads like an auto-generated pamphlet.
Consider how a search engine sees these pages. Take a look at the source code of any Google+ profile, or use a tool like SEO-browser (a search robot simulator) to see how many words appear on each profile.
  • Facebook – 275 Words

  • Twitter – 491 Words

  • Google+ – 2621 Words

Google structures content to provide a wealth of information for search engines to index and serve in search results.
Google's SEO Tactic: Search Engine Friendly Profiles

4. On-Page Optimization

Google+ makes it easy to share posts from others – a feature much like retweeting on Twitter or reblogging on Tumblr. These Google+ posts frequently show up in search results as their own entries.
As the title tag is one of the most important aspects of on-page optimization, Google wisely chose longer, more descriptive title tags. Compare these to the shorter title tags offered by Facebook and Twitter, which often run no longer than three unique words.
Here’s the title tag to 3 different posts, all by Rand Fishkin. Each of these posts is indexed by Google.
  • Facebook – Yesterday, I…
  • Twitter – Twitter / @randfish: Running test of Google+’s …
  • Google+ – Rand Fishkin – Google+ – Shocking how many of the folks featured in this post form…
Which do you think ranks better for a search query containing “Rand Fishkin”?
[Image: search results for "Rand Fishkin"]
Google's SEO Tactic: Descriptive Title Tags

5. User Generated Content

Every post I’ve ever written on Google+ has been public. As a result, every post has been crawled and indexed by Google search. The privacy settings on the profiles are simple, intuitive and encourage openness.
The big green button screams, “Pick me! Pick me!”
[Image: Google+ share settings]
Most Twitter posts are public by default, although the 140-character limit prevents most tweets from reaching the definition of “rich” content unless a tweet becomes famous. Facebook, in contrast, only shares posts from fan pages with Google, and not posts from regular profiles.
Google's SEO Tactic: Encourage Public Sharing

6. Show Google+ Author Profiles in Search Results

The first 5 items on this list represent SEO tactics that anyone can use, but in a way #6 belongs to Google alone. By linking to Google+ profiles in search results, they create an advantage that no other social media service can duplicate.
Is Google “cheating” by favoring its own property? Some say yes, but on the other hand, is there a more relevant result? To me, it makes more sense to connect my author profile with the website that actually hosts the content, such as my profile on SEOmoz.
[Image: rich snippets with author photos in search results]
This demonstrates the power of rich snippets. Since Google introduced author photos in search results, webmasters have scrambled to get their mug included – the idea being that rich snippets of all kinds increase click-through rates. The question is, are we increasing the CTR of our own website, or Google+?
Google's SEO Tactic: Creative Rich Snippets

What Can You Do?

Except for #6 above, these techniques are available to any online business. Google has found a way to create large amounts of search engine friendly content, and to do it at scale.
The lack of diversity this creates in Google's search results is troubling to some. Google risks turning into McGoogle, where every result and every page looks the same. With any luck, more companies will adopt strong SEO strategies to raise themselves in search.
Now that the adoption of Google+ has hit 62 million users and growing, expect to see far more Google+ in your search results soon.

Wednesday 21 December 2011

Biggest Search Events of 2011 & Predictions for 2012

Everybody's been talking about search in 2011, but what were the events that helped to shape the search landscape of the year?

We ran a poll on SEOptimise in order to find out. While the biggest search impact of 2011 might not come as much of a surprise, some other events were notable by their absence.

Out of eight possibilities, one ranked as the clear leader, with twice the votes of its nearest rival at the last count. So, without further ado, let’s look back at the most notable search events of 2011.

The Google Panda Update

Google's Panda algorithm change was all about improving the quality of search results.

This has caused lots of problems for SEOs and webmasters, with many sites suffering from huge drops in rankings and subsequent traffic as a result. There’s also been no real quick fix to this and for some sites it’s been such a long way back that they’ve had to change their whole business model in order to react!

SSL Search

Secure Sockets Layer (SSL) search allows Google users to encrypt their search queries. Google made this the "default experience" for signed-in users on Google.com in October and, as a consequence, stopped passing query data to analytics software including Google Analytics.

Users began to see "(not provided)" appearing in their Google Analytics data, indicating that the search had been encrypted and the keyword data was therefore not available.

Google have stated that overall only a single-digit percentage of keywords will be classed as “(not provided)” – however, from an SEO agency perspective, if you’ve set client targets for increases in non-branded search and are no longer able to accurately measure where visits are coming from, there will be some difficulties here, as witnessed by the search industry’s reaction to this change!

Social Signals & Integration

With Twitter and Facebook now well established, LinkedIn covering the business angle, and Google+ still emerging on to the social stage, social signals and integration are impacting our search experience.

Both Facebook and Twitter are now widely integrated into websites, giving companies a 'face' and an easy way to deal with customer feedback, both positive and negative.

LinkedIn perhaps has less of an impact on websites' search rankings, although its highly search-visible profiles offer an easy way for professionals to appear in queries relating to their own name or work experience.

But it's Google+ that holds the potential to change search drastically, providing it can gain enough traction to build a dedicated and regular user base.

The +1 button is already appearing on blogs and websites across the web, and on browser toolbars too, putting search rankings directly in the hands of Google's users for the first time.

Siri

Siri is unarguably impressive. Responding to natural, conversational questions with relevant search results, the voice-activated search function on Apple's iPhone 4S ignited a media furore when it launched.

Yahoo Site Explorer

Yahoo retired its Site Explorer service in November as part of its partnership with Bing, advising its users to head over to Bing Webmaster Tools instead. Site Explorer actually predated Google Webmaster Tools by about a year, and had become a point of reference for many web marketers.

Yahoo Site Explorer allowed a glimpse into the performance of competitors' sites, and its retirement left a genuine gap among free online services in those terms.

Google Freshness Algorithm Update

Google's Freshness update affected over a third of search results - roughly 35 percent - and is part of the real-time search trend.

It ensures that search queries relating to time-sensitive events, such as the Olympics, are more likely to yield results about recent or upcoming events than about those held a long time ago.

Between 6 and 10 percent of Google search users were expected to notice a change, with other types of content like news and reviews similarly impacted.

Microsoft-Yahoo Search Alliance

The Microsoft-Yahoo Search Alliance gave Microsoft direct access to some of Yahoo's search technologies as part of a 10-year licensing deal. Ostensibly, the alliance was part of an aim to deliver faster, more relevant results to users of both Yahoo search and Bing, with collaboration in other areas like paid search, too.

However, Google remains dominant, and the combined entity a distant second. And it seems that, unlike with Google's changes, web marketers were able to handle the transition smoothly enough that it had no negative effect on their search performance.

Predictions for 2012

So what might we see in the year ahead? Briefly, here’s what I expect:

Plenty More Privacy/Analytics Headaches

The rollout of SSL search from Google has only just started, with an increase in the number of queries affected widely anticipated. And if the ICO don’t back down on the cookie directive law, this could be only the start!

If you can only track users who opt in to allowing cookies this will have an extremely significant impact on how we measure website performance via analytics. So this is definitely the big one to look out for in 2012.

Shifting Facebook Demographics

I expect that this will be the year that teenagers leave Facebook in droves. The kind of growth this platform has seen can’t continue – and young people will be the first out of the door. Not only do they currently have to see their parents’ status updates, their parents can see theirs. No teen wants that.

Marketers are going to have to make a real effort to remain on top of this changing market and make sure they know where the teenagers go.

Unification of SEO and PR… With HR

SEO and PR have gradually become more integrated. Expect this trend to continue in 2012. What could be even more interesting will be larger companies using their employees to aid their marketing.

From Twitter, to Facebook, to YouTube – businesses will increasingly ask their employees to get involved in their online promotion. This could blur the boundaries between professional and social profiles, so firms will need to set out ground rules before using their workforce this way.

Tablets Taking Over

For so long the focus has been on mobile, but companies can’t risk missing the latest boat. Tablets are rapidly becoming the norm; eMarketer is predicting there will be 90 million tablet users by 2014.

This could help unify TV and online marketing. Research agency Sparkler found that 51 percent of all tablet use occurs while the owner is watching TV. It’s a downtime device and so in 2012 businesses need to ensure their marketing strategies take advantage of this.

Thursday 8 December 2011

Opinion: 3 Onsite SEO Myths and Facts – What Really Counts?

Before starting this article I would like to note that I am specifically talking about Google. The information below might not apply to other search engines.

Everybody who is into SEO knows that it is more than just link building and offsite techniques. Sure, links matter the most, but what about your website itself? Onsite optimization might not be the most important part of SEO according to some people, but that depends on your point of view. To me, there have always been some onsite factors that play a significant role in the eyes of Google.

Of course, according to link builders the most important thing when optimizing a website is getting links from websites with high TrustRank, and according to web designers it is proper coding. Since the people I work with and I treat SEO as a complete process, we concentrate on everything important.

However, there are some things that just don’t matter as much as others, especially when doing onsite SEO. Google changes its algorithm almost every day, so a lot of the old onsite techniques that once worked are now useless thanks to the many spammers who exploited them. So what exactly are the onsite factors that can affect your rankings?

WEBSITE TITLE

The title of your website is one of the most important things when it comes to onsite SEO and better rankings. Here is what people believe to be true and what the truth really is:

Myth

A common mistake people make when doing onsite optimization is to stuff keywords into the title of their website, thinking that will help them rank better. Keyword stuffing was a technique that was somewhat effective a long time ago, until Google realized that spammers were using it to their advantage. So Google changed their algorithm, making a website’s ranking depend more on links and less on onsite factors.

Fact

The title of your website matters a lot, and if you don’t want to get a penalty you need to keep it simple and user-friendly as well as Google-friendly. This is the place where you get to describe your whole website/business in about 10-15 words. I am not saying you should not put keywords in there. Quite the opposite – put your most important keywords in the title, but phrase them naturally rather than just stuffing them in and separating them with commas.

Tips

When writing your website title, be creative and don’t repeat anything more than once. For example, if you are selling silver and gold jewelry, writing “Silver jewelry, gold jewelry…” in your title is not a good idea. Instead use the phrase “Silver and gold jewelry”. You should know that Google doesn’t care about the order of the words, so you will get credit for each permutation.
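
To make that concrete, here is a hedged sketch (the store name is hypothetical):

<!-- keyword-stuffed title (avoid): -->
<title>Silver jewelry, gold jewelry, cheap jewelry, jewelry store</title>
<!-- natural phrasing (better): -->
<title>Silver and Gold Jewelry - My Company Inc.</title>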

URL STRUCTURE

The most obvious thing is the domain name. If your domain name is an exact match for your most competitive keyword, you are off to a strong start. However, the rest of the URL structure is also a very important onsite factor, and many people are still doing it all wrong.

Myth

Again, a very common myth is that stuffing as many keywords as possible in the name of each page will work.

Fact

A website with a better URL structure has an advantage over a website with dynamic URLs. Although dynamic URLs can be read by Google, they simply don’t have the same effect as properly structured ones.

Tips

When taking care of your URL structure, the most important thing is to name each page of your website with the most relevant keyword. Creating many pages with different names that are also your keywords will pay off better than having dynamic URLs.
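
As a hedged illustration (the domain and page names are hypothetical), compare a dynamic URL with a keyword-named static one:

Dynamic: http://www.example.com/products.php?cat=7&item=1432
Static, keyword-named: http://www.example.com/silver-and-gold-jewelry.html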

AGE

The age of a website is another factor that plays a big role when it comes to its rankings but not in a way that some people think.

Myth

A lot of people believe that a website will get better rankings with time on its own, so their strategy is to just sit there and wait. They believe that a website that is 3 years old should automatically rank better than a website that is 4 months old no matter what - that even if no offsite optimization has been done to the old website, it will still outrank a new website with a lot of backlinks.

Fact

The age of a website does matter to Google. However, your website will not rank high just because it is old. The only thing affected by site age is the amount of TrustRank the site receives from the links pointing to it. For the first two months you will most likely not rank at all in Google, because you will be in the “sandbox” where all new websites go. Then you will start receiving a bigger percentage of TrustRank as your website gets older. Four years after the creation of your website, you will start receiving 100% of the TrustRank that the links pointing to your website pass.

Tips

Just because your website will be in the sandbox for the first two months doesn’t mean you should sit and wait for the time to pass before you start building links. Instead, use the time to get some links, and enjoy the credit Google will give you for them when the trial period is over.

Conclusion

These are 3 of the most important onsite SEO factors you should focus on, but I want to touch on two other factors people still think matter: the XML sitemap and the coding. Just to be clear – this article is about which onsite factors help you get better rankings, not about what makes Google’s job easier. Of course the XML sitemap is a really great thing, and it sure helps Google crawl your website faster and therefore index it faster. However, your sitemap has nothing to do with your rankings at all, and neither does the coding of your website.

Concentrate on what is really important and don’t worry about things web designers and other charlatans tell you in order to get more money from you.

Thursday 28 July 2011

What is Gray-Hat SEO?


Search engine guidelines clearly define Black-Hat techniques as spamming techniques. You can recognize and avoid them in your SEO campaigns. However, there are also the so-called Gray-Hat techniques, which are as yet unknown to or not restricted by search engines.
Gray-Hats are different because they try to do things they believe are ill-defined by Google, without first asking permission.
Let's look at SearchSecurity.com's description of this notion: "Gray-Hat describes a cracker (or, if you prefer, hacker) who exploits a security weakness in a computer system or product in order to bring the weakness to the attention of the owners. Unlike a Black-Hat, a Gray-Hat acts without malicious intent. The goal of a Gray-Hat is to improve system and network security. However, by publicizing vulnerability, the Gray-Hat may give other crackers the opportunity to exploit it. This differs from the White-Hat who alerts system owners and vendors of vulnerability without actually exploiting it in public".
Google has clearly defined Gray-Hat SEO as a risky, ill-advised method. Here is the indirect spam definition of Gray-Hat techniques from the top search engine: "It's not safe to assume that just because a specific deceptive technique isn't included on this page, Google approves of it.
Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit… If you believe that another site is abusing Google's quality guidelines, please report that site… spam reports we receive are used to create scalable algorithms that recognize and block future spam attempts."
Now, having a sufficient number of Gray-Hat definitions, you should clearly understand the danger of any spam or spam-like technique. Gray-Hat techniques should not be used. Never deceive anyone, and avoid such methods at any cost.
Here we'd like to show some examples of Gray-Hat techniques:

Outdated Gray-Hat Techniques

Mild keyword stuffing
The keyword stuffing technique is deceptive by its very origin. Search engines recommend that site owners write quality, relevant content for visitors, not for the engines' ranking mechanisms. The main criterion for using keywords in your copy should be the question: would you use this technique (adding numerous, repetitive keywords) if only human visitors were reading? Gray-Hats prefer to violate this guideline in a mild way: the number of keywords they use in the meaningful areas of their Web pages stays close to the allowed limit.
Irrelevant keywords in image ALT tags
This technique means using ALT attributes stuffed with keywords unrelated to the specific image. The only purpose of this fraudulent technique is to attract more traffic to the pages. As you know, any type of keyword stuffing violates the search engines' guidelines. They can track the keywords you have chosen and correlate them with the keyword profile of the Web page and the whole site.

Advanced Gray-Hat Techniques

Cloaking
Search engines strictly forbid cloaking for the purpose of optimization. "Cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index" – state Google Webmaster guidelines.
A legitimate example of cloaking is serving certain areas of your site for the search engine to see but not the users; a ‘members only' section can work this way. Gray-Hat cloaking is mainly unintentional, or borders on the harmful use of different pages.
Unintentional cloaking may occur when you serve different content to dedicated audiences or some other groups. Such techniques are very risky and we recommend you contact each search engine, present your reasoning, and allow them the opportunity to approve.
Black-Hat shadow cloaking starts when site owners manipulate this method intentionally to influence a search engine's ranking algorithm.
Publishing duplicate content
We have spent a lot of time teaching people how to write proper, keyword-targeted, and valuable texts. From the keyword research stage up to fresh content writing, this work demands special skills, or additional costs if you hire a professional copywriter.
Instead of relevant, interesting, and unique content, hackers manipulate duplicate content, using the same few hundred words on every page or copying someone else's.
The Black-Hat technique copies the whole volume of the original text while Gray-Hats prefer to mix and dilute the parts.
Gray-Hats play around the margins to trick the search engines. There is no doubt that fresh, unique content is king, and duplicate content is very, very bad.
There are cases where duplicate content is not only legitimate, but is to be expected. To learn more about legitimate types of duplicate content and how to deal with multiple versions of the same content, refer to the "Duplicate Content Issues" lesson of this training course.
Content mashup
Content mashup is related to the activity described above. Although we are again dealing with content stealing, here the content is mashed in a more sophisticated fashion. Gray-Hat sites using the content mashup method generate non-unique content from other Web pages.
Irrelevant links
As you can guess from the name of this technique, irrelevant links do not correspond to the topic of the website. Search engines regard these kinds of links as legal but won't give you much weight for them. That's why they count as gray and don't hurt your reputation so much.
An example of mild spamming is asking for links from every one of your clients, or offering some other form of collaboration.
Off-topic link exchange
If you exchange links with a site other than one that deals with your topic, you'll be bordering on a Black-Hat technique. Whether it is a Gray-Hat or Black-Hat exchange will depend on the number of off-topic links involved. Spammers know that several off-topic links may be devalued but not penalized.
Mild artificial link schemes
Link schemes have already been defined in the previous lesson, devoted to Black-Hat SEO techniques. However, some kinds of artificial link schemes can be untraceable if mixed with other varieties of backlink generation. For example, links created within a narrow thematic niche may overlap or create Web rings, even without any initial purpose of manipulating search algorithms, so it may be hard for the search engines to discover the real intentions of the website owners.
Remember that link schemes created for the sole purpose of manipulating SE algorithms may be considered Gray-Hat or even Black-Hat SEO. Thus, reputation-conscious website owners should think twice before getting engaged in unnatural link building - especially when websites hosted on the same server, or websites linking to bad neighborhoods take part in a link exchange game.
Paid links
Not all paid links violate search engine guidelines. "Buying and selling links is a normal part of the economy of the web when done for advertising purposes, and not for manipulation of search results." - as Google states in their Webmasters/Site owners Help (http://www.google.com/support/webmasters/bin/answer.py?answer=66736).
Link buying is ultimately the site owner's responsibility; it sits on the border between legitimate advertising and manipulative spamming. Generally, link buying is a loophole in the search engines' defenses.
To follow White-Hat SEO, purchased links should be hidden from the crawlers by using a robots.txt file or by adding a rel="nofollow" attribute to the <a> tag. Other methods will be punished the day the truth is known. If you ignore this guideline, you'll fall into Gray-Hat territory, and this could play a dramatic role in your PageRank and SE rankings.
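A minimal sketch of a paid link disclosed the White-Hat way (the advertiser URL and anchor text are hypothetical):
<a href="http://www.example-advertiser.com/" rel="nofollow">Example Advertiser - Sponsored</a>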
Remember to follow the white SEO line in your optimization work. Rankings should be earned honestly, and search engines will give you the opportunity to keep them with your relevant website.
Domain buying
Domain buying is another Gray-Hat technique used to get a quick ranking boost. The main idea here is simple: you buy an active domain name and get the PageRank (or "link juice") that comes with that website as a bonus to it!
Google is aware of this practice as well, and watches such domains over a long period. If your new domain name is genuinely better for spelling or brand awareness, you are safe and sound.
However, the search engines can nullify the links of the domain you bought. If you act legitimately, you will stay off the SE radar; once you start illegitimate behavior, the search engines will regard you with suspicion.
Illegal Link Baiting
Link baiting is a marketing technique that Social Media Marketing entrepreneurs encourage and use widely to promote a site, business, or brand through different social media channels. The idea has spread and become another promotion option, which helps a business engage and interact with existing or potential consumers.
The concept of link baiting is creating content and tools that people want to link their websites to, or creating article content people actually want to publish.
Illegal link baiting starts when promotion becomes deceptive or irrelevant, and links or social bookmarking votes are spread by a group of members gaming the system, sometimes for payment.
Gray-Hat methods include viral widgets and tools that spammers offer with embedded links to off-topic sites. Because Gray-Hat SEO is a risky and ill-advised method, we strongly recommend you avoid any deceptive spam techniques at all costs. Always bear in mind future competitors' spam reports and the search engines' penalties for abusing their guidelines - penalties which can really hurt your rankings and your whole website.
If you still are doubtful about the color of your SEO techniques, read how search engines can adapt to the webmaster's behavior. According to SEOMoz, (http://www.seomoz.org/article/analysis-of-link-spam-alliances-paper) SEs can apply the following methods to track and prevent spam:
  1. "The use of visitor click-through rates and time spent on websites to determine quality.
  2. Advanced text analysis to determine legitimacy and quality of content.
  3. Vision-based analysis of Web page layouts to find commonalities in spam vs. non-spam pages.
  4. Editorial reviews of suspicious sites.
  5. Deflation in rankings of sites with suspected artificial link structures.
  6. Monitoring of rates of change in the speed of link acquisition and loss.
  7. Spam reports by users (and competitors)".
To sum everything up, Gray-Hat methods border on Black-Hat techniques and may result in your website being banned. Moreover, these methods constantly lose their effectiveness as the search engines evolve and regularly update their indexing and ranking algorithms.

What you need to remember:

  1. Don’t use Gray-Hat techniques - they are risky and ill-advised SEO methods.
  2. If you go Gray, you will risk being reported to search engines by your competitors.

Saturday 7 May 2011

HTML Elements (Page Areas) That Matter

Since spiders see your page as HTML code instead of what is directly visible through a browser, optimizers must gain a solid understanding of the structure of a typical HTML document.
This lesson will guide you through some HTML basics and then show which elements are critical for optimization and why.
First, we recommend that your HTML documents comply with the XHTML standard. XHTML is the strictest standard of HTML (hypertext markup language). By following this standard you ensure that your pages are easily readable for search engine spiders. You can learn more about XHTML at the official resource of the World Wide Web consortium:
http://www.w3.org/
Every HTML document that complies with the standards has two principal sections: the HEAD area and the BODY area. To illustrate this we can open the source code of any HTML page found on the Web. Open it in your browser, right-click on the page and select "view page source" or "view source".

The HEAD section is everything you see between the <head> and </head> tags. The content of this section is invisible when viewing the page in a browser. As you can see, one of the elements it includes is the title tag (between the <title> and </title> markup). This is what is shown in the title bar of the browser when the page is displayed. The title will also represent your page in the search engine results. As such, the title tag is a very important element.
The head section also includes various META tags. In the w3.org example we see the META keywords tag and the META description tag:
<meta name="keywords" content="…">
<meta name="description" content="…">
After the <head> tag is closed, the <body> tag opens. Everything that's within the body tag (i.e. between the <body> and </body> markup) is visible on the page when viewed in the browser.
In the body text of the w3.org example, we see the <h1> and <h2> tags. These are called HTML headings and range from the 1st (h1) to the 6th (h6) level. They were originally meant to mark logical levels of heading importance: "h1" being the most important heading and "h6" the least important. Browsers usually display the tags from largest to smallest: the <h1> tag is displayed with the largest font size, on down to <h6>, which displays the smallest. The search engines treat the heading tags the same way.
The link (anchor) tag is another important body element, and is delimited by <a> and </a> markup.
The image tag <img> is responsible for displaying an appropriate image whenever a browser sees it in the source code.
Schematically, an HTML document in an optimizer's eyes (as well as in the search engine's eyes) looks like this:
<head>
<title>My title goes here</title>
<meta name="keywords" content="keyword 1, keyword 2, keyword 3">
<meta name="description" content="site description">
</head>
<body>
<h1>This is the first level heading which is important to search engines</h1>
<h2>this is a kind of subheading which is also important</h2>
This is a simple text in the body of the page. This content must include a minimum of 100 words, with keyword density around 3% to 7%, maximum keyword prominence towards the beginning, middle and end of the page, and maximum keyword proximity.
<b>This text will show in bold</b>
<a href="http://www.somesite.com" title="some widget site">Link to some widget site</a>
<img src="http://mysite.com/image.jpg" alt="and this is my image" />
</body>

Let's go through all the HTML elements and get some in-depth insight into how we can optimize each of them.

The title tag

The title tag of your Web page is probably the most important HTML tag. Not only do search engines consider it when estimating your page's relevancy for certain keywords; it is also what represents your page when it shows up in the SERPs (search engine result pages). A lot depends on how attractive the title is to Web surfers and whether they are compelled to click on your link.
All search engines consider the keywords in this tag and generally give those keywords a great deal of importance in their ranking system. It is as important as your visible text copy and the links pointing to your page.
Always use your primary keywords in the title tag at least once. Try to place them at the start of the tag, i.e. make their prominence 100%. The minimum keyword prominence for the title is 60%.
Don't use several title tags on one page. Make sure the title is the first tag in the head section and that there are no tags before it. Avoid listing the same keyword multiple times in the tag; some engines may penalize for this. Avoid using the same title throughout your site. Try using a unique title tag for each Web page, and use key phrases that are thematically relevant to that page. You can use variant forms of a keyword where possible or applicable.
For instance, if you use "Designer" in your Title tag, a search on "design" will give you a match on most engines. However, words like "companies" will not always yield a match on "company" since "company" is not an exact substring of "companies".
Longer titles are generally better than shorter ones. However, the recommended word count for a title is only 5 to 9 words, with a length of up to 80 characters. Make your title interesting and appealing to searchers to convince them that they should click on it.
Moreover, you can put your company’s name in the title tag, even at the very beginning of the tag. If your company is a well-known brand, it’s essential for you to do this; if not, it’s an excellent opportunity to promote it. What is more important is that you shouldn’t stop with just your company name – definitely add one or two descriptive phrases to the tag. Those who already know your company will query for it specifically in the engines, and those who don’t will find you while seeking the products or services you sell, based on the descriptive phrases.
One more point to remember is that you should be very specific if you are working in a certain area. Your keywords should reflect the geographical region where you are primarily seeking clients. For example, if a customer looks for some goods (let’s say slippers), first they will begin by typing simply “slippers”, and after the engine returns an enormous list from all over the world, the customer will narrow the query by adding a geographical name (e.g. Utah slippers). That’s your chance to be in the Top 10 of the new results for that area.
While creating the title you can use the following approaches:
<Title>My Company Inc. Utah Slippers</Title>
<Title>My Company Inc. – Utah Footwear</Title>
<Title>My Company Inc. – Utah Slippers – Footwear in Utah</Title>
In the last example the geographical name is used twice in different variations; it is crucial not to put the same words right next to each other, as that might be considered spam by SEs. Don’t use ALL CAPS – SEs are not case sensitive now, so it won’t help; instead it will look rather crude. Initial capitals are well suited for the title tag.
The title should reveal the main idea of the visible text and thus reflect your business in the best possible way.
Here are 10 tips for title tags given by John Alexander, a prominent search engine expert:
  1. "When working with your keyword phrase, get it positioned up front so that as you build a sentence it still reads well.
  2. Try working with your one important keyword phrase up front and another secondary phrase to the rear of the title.
  3. Try writing your title to make a thought provoking statement.
  4. Try writing your title so that it asks a "thought provoking" question.
  5. Try writing a title so that it reveals a hidden truth or misconception.
  6. While in creative mode, keep your mind on what it is that your target audience really wants.
  7. Build honest titles that are related to your site content.
  8. Do NOT resort to keyword stuffing or stacking the title full of multiple phrases that do not convey an intelligent message.
  9. Do not include a lot of capitals or special characters in your title.
  10. Do not get hung up on title length. The easiest rule is to simply keep your title under 59 characters (for Lycos sake) and honestly, you can build really excellent titles in this space."

The META tags

There are two META tags that still appear to be of use to the search spiders: META keywords and META description. These tags are very unlikely to impact rankings, but they can play a weighty role in the site's click-through rate from the SERPs, so it's worth optimizing them for your keywords as well.
If you use any other META tags, place them after these two.
The META Keywords
Syntax:
<meta name="keywords" content="keyword 1, keyword 2, keyword 3, …" />
Its initial purpose was to give search engine robots an idea of what the page is about, to help with rankings. Unfortunately, as soon as this became evident, so many spammers started abusing it that spiders have now discounted the importance of this tag to at least half of its original ranking value. Most experts say this tag has no weight from the SEO perspective and does not influence your rankings.
If you still want to exploit this tag, use your main keyword phrase, a secondary keyword phrase, and a few synonyms of your keyword phrase in your keyword META tag. Make sure to focus the words in your keyword tag on that one page only, not on every single keyword that could possibly be associated with your entire website.
Remember: if you use many of the same words in your different keyword phrases, it could look as if you're spamming the engine, so be careful.
The META Description
Syntax:
<meta name="description" content="a short description of your site" />
The contents of the META description tag are what most search engines and directories will show under your title in the search result list. If you have not provided a META description tag for your Web page, the search engines try to make one for you, often using the first few words of your Web page or a text selection where the keyword phrases searched by the user appear. If the search engine makes up a description by picking up text from your page, the generated description may not do your Web page justice.

The META description tag needs to be kept brief yet informative. A description of about 25-30 words should be fine. Keywords and key phrases should be included in the META description tag, though care should be taken not to repeat them too often. Like the title tag, the META description tag should be customized for each page depending on the content theme and target keywords of that page.

Remember that even though Google doesn't consider the META description tag when determining relevancy, it often uses the contents of this tag in the snippet description of your page in the search results. So, make your description captivating and designed to attract traffic. The META description tag should be appealing to users, tempting them to click on the link to your site and visit your Web estate. Using the META description tag can help you increase the click-through rate of your page, which in turn increases the traffic you can get from any ranking position.
Below is a nice example of an informative description optimized for "weather forecast software":
"The only weather forecast software that brings long-range weather forecasts, daily horoscopes, biorhythm calculator, Web cams, and weather maps to your desktop."

The body text

The main textual content that is visible to your visitors is placed within the body tag. It still matters for some search engines when it comes to your page analysis and ranking.
Remember the importance of keyword prominence and place your keyword phrase early in the body text of the page. This may also become a means to communicate your message to prospects; some search engines retrieve the first few lines of your Web page and use them as the description of your site in the search results. So, put a number of important keywords in the first few lines in the visible part of your body text. Try to tailor the text in the beginning so that it can be used as a description of your site.
Spread your keyword phrases throughout the body of the page in natural-sounding paragraphs; try to keep the separate words of your key phrases close together for proximity's sake. Put a secondary key phrase in the middle and at the end of your body text. Have some of your keywords in bold (for this purpose, it's better to use the "<b>" tag instead of the logical <strong> formatting; you can still apply the necessary styles to this text with the following trick: <b style="font-weight:bold">).
Remember your content minimum for a page is 125 words but it's better to reach far beyond this limit.

HTML headings h1 – h6

The headings themselves are a good means of visually organizing your content. Besides that, search engines consider the headings and sub-headings of your page (especially those in bold) to be important. Take advantage of this by using H1, H2 and H3 tags instead of graphical headings, especially towards the top of your page.
Use heading tags to break up your page copy and make it easier for your visitors to read and absorb. Include your most important search keywords and phrases within the heading text; search engines will give them more relevancy weight there. Thus, you should always try to use your target keywords within the headings and sub-headings that break up the text on your page.
Page Heading incorporating most important keyword phrase 
Sub-Heading 1 incorporating most important keyword phrase
Paragraph of text incorporating other target keyword phrases 
Sub-Heading 2 incorporating next most important keyword phrase
Paragraph of text incorporating other target keyword phrases
And so on…
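
A hedged HTML sketch of that outline (the keyword phrases are hypothetical):

<h1>Silver and Gold Jewelry Handmade in Utah</h1>
<h2>Handmade Silver Jewelry</h2>
<p>Paragraph of text incorporating other target keyword phrases…</p>
<h2>Handmade Gold Jewelry</h2>
<p>Paragraph of text incorporating other target keyword phrases…</p>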

The problem with headings is that each type of browser has its own way of displaying them, which may not match your design ideas. You may apply the following workaround with the help of the style attribute:
<h1 style="font-size:10px;color:#00FFFF;font-weight:bold">This is the formatted heading</h1>
Or with the class attribute, provided the class is defined somewhere in a style sheet.
The search engine will see a first-level heading here, but the browser will show human visitors the text formatted as you need, instead of the standard level-one heading.
What should be avoided is repeating the first-level tag more than once. In other words, you shouldn't have more than one <h1> tag on your page, to indicate that your main topic is streamlined around a single definite concept.
As for the tags of other levels, it is up to you to use multiple <h2>, <h3> tags etc. on a page in order to structure information in a proper way. Just do your best not to overuse them, and keep to the quality content guidelines.

Link text

Keyword usage is important in the visible text (also called anchor text) of links pointing outside your domain (outbound links), as well as in links to internal pages. When you give your users a reference to other documents related to your theme, the words you use to refer to those documents are considered descriptive of your page's profile.
A usual link would look like this:
Click here for <a href="http://www.somesite.com/keyword-phrase.html" title="this text will appear when user mouse-overs the link">Visible link text</a>
Note: When you link to your own pages, rename these pages so that the URLs contain keywords separated with a hyphen – instead of running the keywords together. By breaking the words up in some way, you let the engines see them as individual words in a phrase. If the words are not broken up, the spiders will see the words as a single term.
Don't flood your links with keywords; it's usually enough to have up to three links per page containing your targeted terms, ideally the first three links.

ALT attributes of images

ALT attributes provide alternative wording for images. The text is displayed by browsers that can't display images or did not download the image for some reason, and is spoken aloud by talking browsers for the blind. Search engines use the text in the ALT attribute as a substitute for the anchor text if the image is a hyperlink. This makes these attributes ideal for optimization.
Example:
<img src="images/logo.gif" alt="Graphic of a weather forecast software" width="415" height="100" />
As a rule, if you insert your keyword phrase in your ALT text (as long as you are also describing the graphic), you'll have a boost in relevancy with many of the engines. Google often picks up the first ALT text on the page and uses it as the description in the search results, so pay special attention to the ALT text in your first graphic.
To avoid spamming, never remove the actual graphic description from the ALT attributes when you're populating them with your key phrase, and do not plant your key phrase into more than the first three ALT image attributes on the page and then perhaps the last one as well.

What you should remember for this lesson:

  1. The areas of a standard HTML document which matter most for the search engine spiders are: the TITLE tag, the HTML headings, the link text and the ALT attributes of images.
  2. With most of these, observe four parameters: prominence, density, proximity and frequency when populating them with your keywords. This is most important with the body text. To improve keyword significance in the body text, use your keywords in bold once or twice.
  3. While working with your keywords, keep away from any kind of keyword stuffing. After finishing, ensure that the copy still reads naturally.