Friday 16 December 2011

On-Page Optimization Not Dead: Long-Tail Keywords Increase Rankings, Conversions [STUDY]

On-page optimization for long-tail keywords can lift rankings by more than a page in search results, compared to roughly half a page when optimizing for head terms, according to a study by New York-based SEO SaaS company Conductor. The study also found that long-tail keywords converted 2.5 times better than head terms.

You may remember the uproar from last fall, when SEOmoz purported that there was a higher correlation between LDA scores and high rankings than for any other factor. Some took this to mean that on-page optimization didn't matter. It's a topic that still pops up now and again: my on-page optimization isn't working, I don't know if it's worth it... on-page optimization must be dead.

Not so, says Conductor. In their research study, The Long Tail of Search, Conductor examined the effects of on-page auditing and optimization for long-tail keywords versus optimizing for head terms or doing no on-page optimization at all. Not surprisingly, keywords with no on-page optimization saw a downward movement of more than two positions.

“Even in 2011 – often at executive prodding – many marketers are still singularly focused on the most searched terms in their industry that are also the most competitive and difficult to move up the search rankings," Conductor CEO Seth Besmertnik told Search Engine Watch. "As our study shows, we think there is huge opportunity in the long tail of search for the savvy search marketer to move up the search rankings more rapidly and convert at a rate that is 2.5 times greater than for head terms.”

Conductor’s research involved thousands of keywords studied over a period of nine months, using the data collection and on-page recommendations of their SEO platform, Searchlight. They first segmented keywords into three groups:

Keywords with shrinking on-page issues (being resolved by SEO).

Keywords with growing on-page issues (not being resolved by SEO).

Keywords with no on-page issues.


On-Page Optimization Crucial for SEO

[Chart: Average keyword rank movement for keywords (Conductor)]

Optimizing on-page elements for the keywords marketers want to rank for is critical, according to Conductor's research. On-page optimization of keywords identified by Searchlight as having on-page issues consistently produced ranking increases, averaging 11.24 positions.

Websites with identified issues but no on-page optimization saw a two-position drop. Keywords with no identified issues saw an increase of less than one position.

Long-Tail Keywords Show Greatest Rankings Increases

[Chart: Average keyword rank movement, head vs. long-tail keywords]

Recognizing that marketers commonly use the word "long-tail" in two ways, Conductor treated query volume and the number of words in a phrase as separate issues and ran two tests.

First, they excluded medium-volume keywords for the purpose of the study, focusing on those with either high (head) or low (long-tail) query volumes. In this breakdown, they found that long-tail terms were "significantly" more affected by on-page optimization, with an 11-position increase, compared to six positions for higher-volume head keywords.

[Chart: Average keyword rank movement for head vs. long-tail terms]

For the second part, they separated keywords according to the number of words in the term; head keywords were one- to two-word queries, while long-tail terms had three or more words. Again, they found that on-page optimization increased long-tail rankings more, but by a smaller margin. With this segmentation, long-tail terms rose an average of six positions and head terms an average of four.

Long-Tail Terms Convert 2.5 Times More

[Chart: Conversion rates, head vs. long-tail terms (Conductor)]

The final part of their study looked at conversion rates, examining more than 7 million visits to three major retailers. Long-tail terms (those with three or more words) converted two and a half times better than head terms. Conductor said this represents a great opportunity for marketers who may be disproportionately focused on higher-volume, one- to two-word search terms.

On-page optimization is one of many strategies SEOs and marketers can use to increase rankings and conversions. It’s also just good practice to make sure your page addresses the issues that brought visitors to the site in the first place.

How to use Twitter as a lead generation tool

Twitter is among the top three social networking sites today, offering a micro-blogging service that lets its users send, receive and share messages via tweets. Tweets are text messages confined to 140 characters and can include links. At the time of writing, Twitter users send over 110 million tweets per day. With traffic like that, who can ignore the gold mine that is Twitter?

Twitter – no shortage of business opportunities

It is only natural for internet marketers to leverage Twitter as a lead generation tool. The beauty of Twitter is that it allows easy conversation, which is a boon for anyone looking for business opportunities. Obviously, you should not try to sell directly to anyone with the prefix "@". Keeping your eyes open and staying alert to what is going on certainly helps. For example, someone might be looking for a specific service; if you have the solution, you could suggest it to them and end up making a profit.

The shortcut to generating leads on Twitter

However, it is hardly practical to browse Twitter all day trying to find leads. This is where third-party tools shorten the distance between you and your prospects. Let us take a quick look at what you can leverage:

Twitter advanced search

If you use Twitter, you are probably aware of the basic search function. It saves a tremendous amount of time when you run searches on your name, your business's name, your brands, your competition and other tags, and it helps you keep track of your online business reputation as you watch the response to your tweets.
However, the advanced search function is what you should really take advantage of, because it lets you search for your keywords by geographic location. So, if you live in Jacksonville, KY, and run an advanced search restricted to Jacksonville, KY, you can see who is saying what in that area. You are looking at real-time data you can filter and develop into business opportunities.
When you look at the advanced search form, you will realize that it is quite simple to run a search around a prospective lead you have identified. You can also use Twitter's "search operators" and build on the examples to come up with your own searches. Say you want to find out who wants your services in Lawrenceville, GA: you would create a search based on those terms and search operators.
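
For instance, here are two illustrative queries built from Twitter's search operators (the phrases and places are made up, and exact operator support can change over time):

"need a web designer" near:"Lawrenceville, GA" within:25mi
"looking for" plumber near:"Jacksonville, KY" ?

The near: and within: operators restrict results to a location and radius, and the trailing ? keeps only tweets asking a question, which is often where the warmest leads surface.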

Being responsive is the key

Twitter is full of people posting questions, opinions, requirements and just about anything else, and it is up to the marketer to make use of these tweets. If someone is looking for something, they will be pleased to have someone respond to their need; your job is to identify, filter and respond to people asking questions or complaining. This becomes even more valuable if the response comes from someone in their area. As a marketer, you can plan and set up a series of tweets that could very well convert into hot leads on a daily basis.
Remember, though, that you must take care that people don't consider you a spammer. Instead, identify your niche and target people in that segment by starting interesting conversations and being helpful, eventually generating leads.

Not just leads, but a whole network

When you work on generating business leads from Twitter, bear in mind that you can also build a community around you, one that could yield partners for different aspects of your business.

RSS feeds can be a boon

Every search that you run brings up an RSS feed you can subscribe to. You can decide which ones are worthwhile from the comfort of your RSS reader. The RSS function keeps your searches well organized, helping you respond to the ones you find lucrative and follow up.

You can also use the RSS function to republish the results from your search. If you happen to be planning an event, you can create a buzz around it this way. Create a #hashtag that is specific to your event, run a search on that tag and grab the RSS feed. You can then use Feedburner's BuzzBoost feature, under the Publicize tab of your Feedburner feed, to republish the dynamic feed on your site. It is easy.
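
BuzzBoost ultimately hands you a one-line script embed to paste into your page. It looks roughly like this (the feed name is a placeholder for whatever you called your Feedburner feed, and the exact URL format may differ from what Feedburner generates for you):

<script type="text/javascript" src="http://feeds.feedburner.com/YourEventFeed?format=sigpro"></script>

Wherever you paste it, the script prints the latest items from the feed, so the list of event tweets on your page stays fresh without any further work.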

If you would rather skip the RSS reader, use TweetDeck to watch real-time search results on your desktop.

This is only the tip of the iceberg when it comes to leveraging Twitter for lead generation. Do you have your own tips to share? Do post them in the comments section.

Thursday 15 December 2011

Conversion Tracking vs. Google Analytics Goals

Conversion tracking is a must-have. If you can, you should.

Simply knowing how many sales you generated for how much spend isn't enough. You need to know which keywords (and better, which search queries) generated the enquiries, sales, leads and phone calls you're interested in.

This means you need a tool that integrates with AdWords, either by letting you get the data out, or by putting its own data in.

Exporting to a Tool

Getting the data out means tagging your landing page URLs with query strings, so that a solution on your site can read the query string and capture whatever information you've put in it.
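
As a sketch, a tagged destination URL might look like the following; the parameter names are arbitrary, while {keyword} and {matchtype} are AdWords ValueTrack placeholders that get filled in at click time:

http://www.example.com/landing-page?source=adwords&kw={keyword}&match={matchtype}

Your on-site solution then reads kw and match out of the query string when the visitor arrives.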

The big advantage of these kinds of solutions is the range available. They can include attribution modeling, user journey paths and all kinds of exciting information. You can integrate them into CRM systems and log the keyword and user all the way through their lifecycle as a customer. Most of these systems can pull keyword data from the query strings you tag, and search query data from referral information, then pair the two in the customer's record if they convert.

The downside is that acting on that information in AdWords is a few steps removed: you need to move between systems to see the data and then take action, such as increasing or decreasing bids based on conversion rate. That inconvenience, plus the cost, is the main disadvantage of these systems.

There are two options that take the opposite approach: putting conversion data into AdWords and analyzing it there. AdWords' own conversion tracking system and Google Analytics will both do this, and both are free. Let's take a look at how they differ and why you might choose each option.

Importing Into AdWords

Importing the data straight into AdWords has one major advantage over using separate tools: the data you need to optimize from sits right next to the levers you need to pull. You can view conversion volumes and cost per conversion next to each keyword or ad, and make changes directly in the interface. If you have enough conversion volume you can also take advantage of Google's "conversion optimizer".

AdWords Conversion Tracking

AdWords' system involves a snippet of JavaScript that you place on the confirmation page shown after the user has taken the action you want to track. The users are completely anonymized: you can see the search query used, the ad seen, and the keyword that triggered the ad, but none of this can be tied to an individual conversion. With a little customization of the code you can attach a value to each conversion, if appropriate.

[Image: AdWords conversion tracking code]
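
For reference, the generated snippet looks roughly like this (the ID, label and value below are placeholders; always paste the exact code AdWords produces for your account):

<!-- Google Code for Purchase Conversion Page -->
<script type="text/javascript">
// These values identify your account and conversion action (placeholders)
var google_conversion_id = 1234567890;
var google_conversion_language = "en";
var google_conversion_format = "3";
var google_conversion_color = "ffffff";
var google_conversion_label = "AbCdEfGhIjKlMnOp";
var google_conversion_value = 10.00; // optional value per conversion
</script>
<script type="text/javascript" src="http://www.googleadservices.com/pagead/conversion.js"></script>
<noscript>
<img height="1" width="1" style="border-style:none;" alt="" src="http://www.googleadservices.com/pagead/conversion/1234567890/?value=10.00&amp;label=AbCdEfGhIjKlMnOp&amp;guid=ON&amp;script=0"/>
</noscript>

The noscript image ensures the conversion still registers for visitors with JavaScript disabled.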

There is typically a 30-day window on these conversions: any visitor who clicked on one of your Google ads in the 30 days before converting will be tracked as a converted click in AdWords. The conversion is reported against the date and time of the original click. So if visitor A searches and clicks on Monday, then comes back to purchase on Friday, the conversion happens on Friday but shows up in Monday's stats.

AdWords tracks conversions on a last-click-wins basis (among AdWords clicks only; no other traffic sources are included). So if a searcher clicks ads from several different keywords before converting, the credit will only go to the final keyword searched on. This will often lead AdWords to assign a higher weighting to your brand terms than you might expect.

AdWords includes a "Search Funnels" feature that lets you see all the AdWords touchpoints before a user converted. You'll be able to see the average time lag from click to conversion, but more helpfully you'll also be able to see the keywords that "assisted" conversions (i.e., were not the last click but were still part of the user's path). These keywords receive no credit in the main AdWords interface.

[Image: AdWords Search Funnels report]

Importing Goals From GA

If you have Google Analytics set up with goals recording conversions (or e-commerce tracking), then you can import these into AdWords directly.

Goals imported from GA give you some additional features compared to AdWords tracking, but come with some differences that you'll need to know about.

First, Google Analytics is a last-click-wins system across all traffic sources, apart from direct traffic. That means that if a user comes through an AdWords ad but later returns another way, GA won't log that as an AdWords conversion. You should expect GA goals to under-report a little compared to AdWords tracking for this reason.

Second, GA records conversions against the date and time of the visit that converted, as opposed to AdWords' method of recording them against the click that generated the conversion. This can make a bigger difference than you might think. Even in FMCG it's not unusual to see click-to-conversion lags of a few days or weeks. In retail you should expect fewer than 70 percent of people to purchase on the same day as the click, so that's quite a big difference in daily conversions between AdWords (today's conversions being attributed back to whenever the click happened) and GA (today's conversions being recorded today).

Goals imported from Google Analytics can take up to 48 hours to appear in AdWords, so you may not be able to immediately see the effect of your optimizations without going into GA directly.

Why Use One Over the Other?

These differences in behavior may indicate to you which type of conversion tracking is more suitable for you depending on your overall preferences.

  • If you want any traffic going through AdWords to be classed as an AdWords conversion, AdWords conversion tracking would be the preference.
  • If you want only last-click AdWords conversions to be tracked, import from Google Analytics.
  • If you want full attribution modeling, then do your tracking in a third party tool (I'm classing multi-channel funnels in GA as a third party tool here, since the data from there can't be imported directly into AdWords, but must be analyzed separately).

Google Analytics still holds one trump card: engagement goals. GA will let you set a threshold for certain metrics (e.g., pages viewed or time on site) and count any visit that goes over that threshold as a goal. For certain content sites these are really worthwhile. Non-bounce visits can be a great signal to optimize for if you run a site that has no other real tracking options.

[Image: Non-bounce visit goal setup in Google Analytics]

You can see that each of these methods offers different benefits and drawbacks, and in some cases different biases in the data you'll see that you need to account for. Those biases can be pretty large, so don't be surprised if different conversion tracking sources don't match, and make sure you understand why each might be tracking something in a slightly different way.

Just adding conversion tracking isn't enough; you need to have thought about which method to implement and what that choice means for you.

Tuesday 13 December 2011

Listing Dynamic Web Pages in Search Engines


Problems With Dynamic Languages

Most search engines do not like to follow JavaScript or spider deeply into dynamically driven websites. Since scripts can create millions and millions of pages on the fly, search engines need to ensure some level of quality before they will list all of those pages.

Session IDs & Cookies

Search engines do not like reading session IDs or cookies. When a search engine is handed a session ID, it will usually leave the site. If spiders indexed websites that offered them session IDs, every visit would spawn fresh URLs, which would frequently cause the spiders to overload the server. Session IDs would also make the site seem many times larger than it really is.

General Dynamic Page Listing Tips

Search engines are getting better at listing dynamic web pages. There are a few basic rules of thumb to help search engines index your dynamic website.
  • Build a linking campaign. As you gain more inbound links, search engine spiders will have more reason to trust the quality of your content and will spider deeper through your site.
  • Use short query strings. Most search engines will do well at listing dynamic pages if each query string is kept to fewer than 10 characters.
  • Minimize the number of variables. When possible, use three or fewer parameters; the fewer, the better. If you use long parameter strings and a half dozen parameters, it is a fair bet that the site will not get indexed (see the example URLs after this list).
  • If you still have problems, you can use a CGI script to take the query string out of your dynamic content, or try one of the other dynamic page workarounds listed below.
  • Other dynamic workarounds: There are multiple ways to list dynamic web pages in search engines. Common techniques are:
    • Hard coded text links
    • Bare bones pages
    • ColdFusion site maps
    • Apache mod_rewrite
    • IIS rewrite software
    • Paid inclusion
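
As an illustration (hypothetical URLs), compare a query string that spiders tend to choke on with one they can handle:

http://www.example.com/catalog?cat=12&subcat=4&color=green&sort=price&sessionid=8f3a2c
http://www.example.com/catalog?item=1234

The first URL carries five parameters, including a session ID; the second keeps a single short parameter and stands a far better chance of being indexed.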

Hard Coded Text Link

To list dynamic web pages in search engines you can capture the entire URL in a link like:
<a href="http://www.search-marketing.info/catalog.html?item=widget&amp;color=green&amp;model=6">Green widgets style 6</a>
If you use static links like the one above to reference dynamic pages, search engines will usually be able to index them. Many sites use a site map that captures the most important interior pages.
If you have enough link popularity and link into a couple of your main category pages using static links, then search engines will usually be able to index the rest of the site.

Bare Bones Pages

You can also make a bare bones static page for each dynamic page you want listed in search engines.
<html>
<head>
<title>Green Widgets style 6</title>
</head>
<body>
<!-- an SSI include pulls in the dynamic product content; include virtual is used because exec cgi cannot pass a query string -->
<!--#include virtual="/cgi-bin/myscript.pl?item=greenwidget-6" -->
</body>
</html>

ColdFusion Site Map

For ColdFusion you can build a site map page using code similar to the following.
<!--- Pull every live page from the database --->
<cfquery name="getPages" datasource="myDatasource">
SELECT pageID, pageName
FROM pages
WHERE status = 1
ORDER BY sortorder
</cfquery>
<!--- Output one crawlable link per page --->
<cfoutput query="getPages">
<a href="index.cfm?pageID=#getPages.pageID#">#getPages.pageName#</a>
</cfoutput>

Apache Mod Rewrite

For Apache servers, the mod_rewrite module offers a way to make dynamic pages look like static pages. The documentation for mod_rewrite is located on the Apache website.
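
As a minimal sketch, a rule like the following in an .htaccess file maps a static-looking URL onto a dynamic script (catalog.php and its parameters are hypothetical; adapt the pattern to your own URLs):

# Serve /widgets/green/6.html from the dynamic catalog script
RewriteEngine On
RewriteRule ^widgets/([a-z]+)/([0-9]+)\.html$ /catalog.php?color=$1&model=$2 [L]

Spiders see and index the clean /widgets/green/6.html address while the server quietly runs the same dynamic code as before.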

IIS Rewrite Software

IIS servers do not have a built-in rewrite feature like Apache mod_rewrite. You can still rewrite your URLs on IIS servers using custom-built or third-party software.

Trusted Feed Paid Inclusion

You can pay to use trusted feeds to upload dynamic content to Yahoo! and acquire traffic on a cost-per-click basis through the Overture Site Match program. I would consider this a last option for many sites, since many business models cannot support the incremental cost-per-click charges.

Thursday 8 December 2011

Opinion: 3 Onsite SEO Myths and Facts – What Really Counts?

Before starting this article I would like to note that I am specifically talking about Google. The information below might not apply to other search engines.

Everybody who is into SEO knows that it involves more than just link building and offsite techniques. Sure, links matter the most, but what about the website itself? Onsite optimization might not be the most important part of SEO according to some people, but that depends on your point of view. To me, there have always been onsite factors that play a significant role in the eyes of Google.

Of course, according to link builders the most important thing when optimizing a website is getting links from websites with high TrustRank, and according to web designers it is proper coding. Since my colleagues and I treat SEO as a complete process, we concentrate on everything that matters.

However, some things just don't matter as much as others, especially in onsite SEO. Google changes its algorithm almost every day, so a lot of old onsite techniques that once worked are now useless, thanks to the many spammers who exploited them. So what exactly are the onsite factors that can affect your rankings?

WEBSITE TITLE

The title of your website is one of the most important things when it comes to onsite SEO and better rankings. Here is what people believe to be true and what the truth really is:

Myth

A common mistake people make when doing onsite optimization is to stuff keywords into the title of their website, thinking that will help them rank better. Keyword stuffing was somewhat effective a long time ago, until Google noticed that spammers were using it to their advantage. So Google changed its algorithm, making a website's ranking depend more on links and less on onsite factors.

Fact

The title of your website matters a lot, and if you don't want a penalty you need to keep it simple and user-friendly as well as Google-friendly. This is the place where you get to describe your whole website or business in about 10-15 words. I am not saying you should not put keywords in there; quite the opposite. Put your most important keywords in the title, but work them in naturally rather than stuffing them in and separating them with commas.

Tips

When writing your website title, be creative and don't repeat anything more than once. For example, if you are selling silver and gold jewelry, writing "Silver jewelry, gold jewelry..." in your title is not a good idea. Instead use the phrase "Silver and gold jewelry". You should know that Google doesn't care about the order of the words, so you will get credit for each permutation.
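
To make that concrete, here is a hypothetical pair of title tags, one stuffed and one written as described above:

<title>silver jewelry, gold jewelry, cheap jewelry, buy jewelry online, jewelry store</title>
<title>Silver and Gold Jewelry - Handmade Rings and Necklaces | Example Shop</title>

The second covers the same keywords once each, reads naturally, and still fits the 10-15 word budget.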

URL STRUCTURE

The most obvious element is the domain name. If your domain name is an exact match for your most competitive keyword, you have a head start. However, the rest of the URL structure is also a very important onsite factor, and many people are still getting it wrong.

Myth

Again, a very common myth is that stuffing as many keywords as possible into the name of each page will work.

Fact

A website with a clean URL structure has an advantage over a website with dynamic URLs. Although Google can read dynamic URLs, they simply don't have the same effect as properly structured ones.

Tips

When taking care of your URL structure, the most important thing is to name each page of your website after its most relevant keyword. Creating many pages whose names are also your keywords will pay off better than having dynamic URLs, as in the comparison below.
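
For example (hypothetical URLs), compare:

http://www.example.com/index.php?page=37&cat=2
http://www.example.com/silver-jewelry.html

Both could serve the same page, but the second puts the target keyword right where Google can see it in the URL.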

AGE

The age of a website is another factor that plays a big role in its rankings, but not in the way some people think.

Myth

A lot of people believe that a website will get better rankings over time on its own. Their strategy is to sit and wait, because they believe a website that is 3 years old should automatically outrank a website that is 4 months old, no matter what; that even with no offsite optimization, the old website will still outrank a new website with a lot of backlinks.

Fact

The age of a website does matter to Google, but your website will not rank high just because it's old. The only thing affected by site age is the amount of TrustRank it receives from the links pointing to it. For the first two months you will most likely not rank at all in Google, because you will be in the "sandbox" where all new websites go. Then you will receive a growing percentage of TrustRank as your website ages. Four years after the creation of your website, you will receive 100% of the TrustRank passed by the links pointing to it.

Tips

Just because your website will spend its first two months in the sandbox doesn't mean you should sit and wait for the time to pass before building links. Instead, use that time to get some links, and enjoy the credit Google gives you for them once the trial period is over.

Conclusion

These are 3 of the most important onsite SEO factors you should focus on, but I want to touch on two other factors people still think matter: the XML sitemap and the coding. Just to be clear, this article is about which onsite factors help you get better rankings, not about what makes Google's job easier. Of course the XML sitemap is a really useful thing, and it surely helps Google crawl your website faster and therefore index it faster. However, your sitemap has nothing to do with your rankings at all, and neither does the coding of your website.

Concentrate on what is really important and don’t worry about things web designers and other charlatans tell you in order to get more money from you.

Monday 5 December 2011

Google Algorithm Updates: The Latest Things To Consider

Google algorithm "transparency" continues

Google has been making a big deal about wanting to be more transparent about its search algorithm lately (without revealing the secret sauce too much of course). And so far, I have to say they're making good on that promise fairly well.

Is Google being transparent enough for your liking?

We've seen plenty of algorithm announcements made by the company over the course of the year. In November, they discussed ten recent changes they had made. Here's a recap of those:

  • Cross-language information retrieval updates: For queries in languages where limited web content is available (Afrikaans, Malay, Slovak, Swahili, Hindi, Norwegian, Serbian, Catalan, Maltese, Macedonian, Albanian, Slovenian, Welsh, Icelandic), we will now translate relevant English web pages and display the translated titles directly below the English titles in the search results. This feature was available previously in Korean, but only at the bottom of the page. Clicking on the translated titles will take you to pages translated from English into the query language.
  • Snippets with more page content and less header/menu content: This change helps us choose more relevant text to use in snippets. As we improve our understanding of web page structure, we are now more likely to pick text from the actual page content, and less likely to use text that is part of a header or menu.
  • Better page titles in search results by de-duplicating boilerplate anchors: We look at a number of signals when generating a page's title. One signal is the anchor text in links pointing to the page. We found that boilerplate links with duplicated anchor text are not as relevant, so we are putting less emphasis on these. The result is more relevant titles that are specific to the page's content.
  • Length-based auto complete predictions in Russian: This improvement reduces the number of long, sometimes arbitrary query predictions in Russian. We will not make predictions that are very long in comparison either to the partial query or to the other predictions for that partial query. This is already our practice in English.
  • Extending application rich snippets: We recently announced rich snippets for applications. This enables people who are searching for software applications to see details, like cost and user reviews, within their search results. This change extends the coverage of application rich snippets, so they will be available more often.
  • Retiring a signal in Image search: As the web evolves, we often revisit signals that we launched in the past that no longer appear to have a significant impact. In this case, we decided to retire a signal in Image Search related to images that had references from multiple documents on the web.
  • Fresher, more recent results: As we announced just over a week ago, we've made a significant improvement to how we rank fresh content. This change impacts roughly 35 percent of total searches (around 6-10% of search results to a noticeable degree) and better determines the appropriate level of freshness for a given query.
  • Refining official page detection: We try hard to give our users the most relevant and authoritative results. With this change, we adjusted how we attempt to determine which pages are official. This will tend to rank official websites even higher in our ranking.
  • Improvements to date-restricted queries: We changed how we handle result freshness for queries where a user has chosen a specific date range. This helps ensure that users get the results that are most relevant for the date range that they specify.
  • Prediction fix for IME queries: This change improves how Autocomplete handles IME queries (queries which contain non-Latin characters). Autocomplete was previously storing the intermediate keystrokes needed to type each character, which would sometimes result in gibberish predictions for Hebrew, Russian and Arabic.

Friday 2 December 2011

Debunking Common SEO Misconceptions and Myths

Search Engine Optimization can provide the best ROI for your internet marketing budget. But because it is widely misunderstood, and just as commonly misused and abused, there are plenty of misconceptions about SEO that can derail your marketing efforts.

SEO misconceptions

Here are some commonly held beliefs that need some clarification:

“I don’t need on site SEO, just 100 more links”

There is simply too much misinformation about SEO out there for you to assume your on site optimization has been done right. Many times a new client will tell us it is all taken care of; then we discover their pages are keyword-stuffed, title tags are too long or duplicated, meta descriptions are missing, all the internal links say "click here", content is duplicated elsewhere in the site or stolen from another site, or pages are cloaked or hidden in an attempt to trick the search engines.

Another variation of this is the idea some webmasters have that on site optimization is unnecessary and all you need is lots and lots of backlinks. You have figured out how many links your competitor has, so it is only a matter of getting more than them, right? Nope. If it were that easy, anybody with some time could make a totally crappy site rank #1 for everything, and nobody would ever use search engines.

While inbound links to your site are among the most important things that help your website rank better, quality is becoming increasingly important to Google: raw link counts are becoming less important and social signals (Likes, +1s, Tweets, etc.) are becoming more important. Make sure you are creating good quality content, keep your site well optimized, and make it easy for readers to like your site.

“My web designer already optimized my website”

Web designers and developers often don't know, or don't care, what is important to search engines. If the site looks good and all the buttons work, then it is all good as far as they are concerned. There are web designers who know SEO, but they are rare. Unfortunately, most web design classes teach the software and visual design skills but don't bother with optimization best practices. I recently had a web design instructor tell me that he doesn't teach SEO because it "has nothing to do with web development and it doesn't work"! Um, yeah. Right. Think about that when you hire a web designer who just finished school.

Here are some things so basic to search engine optimization that you just can't call a site optimized if they have not been dealt with properly. They really should not even be considered separate from web design, since they should be part of any web developer's best practices when building a website.