Friday 27 May 2011

Introduction to PPC and Basic Concepts

Next, let’s take a look at PPC advertising, which is nowadays one of the fastest growing marketing tools. Growth in this area is the result of PPC's beneficial characteristics, such as affordability and effectiveness. Any site owner starting an SEO/SEM campaign will receive targeted traffic when using proper keywords directly relevant to the site's content. It is advisable to use PPC advertising in the period before good organic rankings come about. Another benefit of PPC is that it protects you from long organic re-indexation periods and from seasonality by providing a steady stream of traffic.
As the definition states, pay per click (PPC) is an advertising technique used on websites, advertising networks and search engines.
Advertisers bid on "keywords" that they believe their target market (people they think would be interested in their offer) would type in the search bar when they are looking for their type of product or service. For example, if an advertiser sells tennis balls, they would bid on the keyword "tennis balls", hoping a user would type those words in the search bar, see their ad, click on it and buy. Keywords should be directly relevant to site content as it is easy to lose money with improper choices.
These ads are called "sponsored links" or "sponsored ads" and appear next to and sometimes above natural or organic results on the page. The advertiser pays only when the user clicks on the ad.
Abuse of the pay per click model can result in “click fraud”. Click fraud occurs in pay per click online advertising when a person, automated script, or computer program imitates a legitimate user of a Web browser clicking on an ad, for the purpose of generating an improper charge per click. Click fraud is the subject of some controversy and increasing litigation due to the fact that advertising networks are also key beneficiaries of the fraud, whether it is intentional or not.
Use of a computer to commit this type of Internet fraud is a felony in many jurisdictions; for example, it is covered by Penal Code Section 502 in California, USA, and by the Computer Misuse Act 1990 in the United Kingdom. There have been arrests relating to click fraud where malicious clicking was used to deplete a competitor's advertising budget.
While many companies exist in this space, Google AdWords and Yahoo! Search Marketing are the largest network operators as of 2008. MSN offers their own PPC services with their MSN adCenter. There are more PPC search engines available such as Ask.com, Kanoodle, LookSmart, Miva, etc.
Depending on the search engine, minimum prices per click start at $0.01 and can be as high as $0.50. Very popular search terms can cost much more on popular engines.
Before starting PPC campaigns, make sure the following is done:
  • Analyze your business niche and learn who your competitors are (the Web CEO Keyword Suggestion tool can definitely be of great help to achieve this);
  • Pinpoint the search terms that you will use in your ad (keywords you are going to target and the negative keywords that you should omit);
  • Learn how to write good ad copy which will have a high clickthrough rate;
  • Work hard on the landing page you will link your ad to - make sure visitors do not navigate away from the sales area of your site;
  • Think ahead about a tracking solution that will help you track and analyze your PPC campaigns, prevent click fraud, etc.
Generally, to get started you will need about $50 for the first month of Google AdWords advertising, and another $50 for Yahoo! Search Marketing. Our next lesson covers these top PPC providers and the advertising process itself.
Another PPC advertising provider you might be interested in is LookSmart, with its Silver, Gold and Platinum levels of service (http://adcenter.looksmart.com). LookSmart has a built-in Conversion Tracking system that will help you measure your cost per acquisition (CPA), the total amount you spend for each conversion. LookSmart results feed CNET's Search.com, Ask, ABCsearch, Bravenet Media, Copernic, Wikia, Enhance and Kontera.
Experienced search engine marketers play with bids and keywords, starting with small sums and gradually growing their successful keywords. This technique is described in more detail in our next lessons. In general, dollars spent on keywords targeted to your site should have a chance of success. Any good Internet-based marketing business relies on fast feedback loops to help it improve. At the beginning you may want to overspend on your ads so that you can see which ones are effective. After a while, regroup your ads by cost and then slash your bid prices. Ineffective ad groups or keywords will automatically shut off, and only your effective ads will keep displaying.
To sum up, pay-per-click is a good advertising medium for small SEM businesses because, with proper keyword selection, it can be affordable for any site owner. That’s why, if you are starting a new site marketing strategy, the first advertising option to consider should be PPC search engine promotion. After this, the best course of action is to assess the mix of organic and paid search engine traffic as a whole and to implement a wide-ranging search engine marketing strategy.

What You Should Remember:

  1. If you use PPC your site gets instant and measurable targeted traffic;
  2. PPC helps to get prime exposure on Google, Yahoo! and other major search engines;
  3. You only pay for the clicks you receive;
  4. Most importantly - there is no hard limit on the amount of traffic you can receive.

Major PPC Providers (Google AdWords, Yahoo! Search Marketing, Microsoft adCenter)

As we look at the pay-per-click advertising medium, we should acknowledge that there are many different PPC search engines where one can spend money. Keep in mind that when using the larger pay per click search engines (which will charge you more for your ad campaigns), the chances are also higher that you will get good traffic and that your business model will be scalable. Smaller engines provide slower feedback loops, and some may not even provide quality traffic.
The major players are Google AdWords, Yahoo! Search Marketing (formerly Overture) and Microsoft adCenter; others include Ask, Kanoodle, LookSmart, etc.
Let's examine the top players of the pay-per-click advertising medium.
Google AdWords
Yahoo! Search Marketing
Microsoft adCenter
PPC Programs Comparison

Google AdWords - http://adwords.google.com/

PPC Advertising is an essential element of a pay-for-performance strategy. And Google AdWords is definitely your first and arguably most important stop.
AdWords is Google's flagship advertising product and its main source of revenue. AdWords offers pay-per-click (PPC) advertising and site-targeted advertising for both text and banner ads. The AdWords program includes local, national, and international distribution. Google's text advertisements are short, consisting of one title line and two content text lines.
Let's look at Google's results for "Search engine marketing" to see it in use. (You might see somewhat different results in your browser due to Google's geotargeting).

Google's results

Among the top paid results are SEM businesses, which confirms our assertion that PPC advertising is well suited to the needs of an SEM business because it offers exposure to a highly targeted audience. Costs per click for keywords like "search engine marketing" or "search engine marketing firm" are about $10 per click. Therefore, if your services cost $2,000 on average, you need a 0.5% conversion rate just to break even.
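As a quick sanity check on that arithmetic, here is a tiny JavaScript sketch you can paste into a browser console (the cost and price figures are just the hypothetical ones from the example above):

// Hypothetical figures from the example above
var costPerClick = 10;        // dollars paid for one visitor
var averageSaleValue = 2000;  // average value of one client
// Break-even conversion rate: the fraction of clicks that must turn into
// sales so that revenue covers the ad spend
var breakEvenRate = costPerClick / averageSaleValue;
console.log((breakEvenRate * 100) + "% of clicks must convert to break even"); // 0.5%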
Obviously, cost per click varies across keywords and sectors, but $50 per month for each engine, plus careful keyword selection, will be enough to give you a good start.
One vital element of a Google AdWords ad is the display URL. It is shown on your ad and needs proper attention when you create the advertisement for listing. For more advice on creating your ad, please refer to the Ad Writing Tips lesson. Where do you want your visitors to go after clicking the ad? Don't waste an opportunity by sending your customers to unfocused pages. More tips and guidelines on ad writing and a detailed explanation of how to direct visitors will be available in our next lessons.
Getting Feedback from Your Account:
When opening a Google AdWords account, many people are hesitant to spend much money. By spending too small an amount, the business is essentially ignoring the feedback loop Google has set up.
Is it better to lose 100 dollars today, or to lose it over three months before finally coming to the conclusion that you need to change? We think you would rather know what you need to change now. Not all industries will make money from AdWords; it is not a business model that works perfectly across the board.
Lower-placed ads have a lower bid price. That can save money if the top-placed ads are overpriced. Another benefit of a lower-placed ad is that a user who clicks on one is more prequalified to make a purchase: by scrolling through a number of ads, they have displayed a greater intent to buy.
These two effects often mean that listing at position 5 or 6 provides a better ROI than listing at the top. You will need to determine the profit elasticity of your market to see which ad positions return the highest overall profits.
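Here is a toy illustration of that trade-off in JavaScript. All the position-dependent numbers (cost per click, click volume, conversion rate, margin) are invented purely for the example, not real AdWords data:

// Invented figures: lower positions cost less per click and, per the
// reasoning above, their clicks may be slightly more prequalified
var profitPerSale = 50; // hypothetical margin on one conversion
var positions = [
  { pos: 1, cpc: 2.00, clicksPerDay: 100, conversionRate: 0.010 },
  { pos: 5, cpc: 0.60, clicksPerDay: 30,  conversionRate: 0.015 }
];
positions.forEach(function (p) {
  var spend  = p.cpc * p.clicksPerDay;
  var profit = p.clicksPerDay * p.conversionRate * profitPerSale - spend;
  console.log("Position " + p.pos + ": spend $" + spend.toFixed(2) +
              ", daily profit $" + profit.toFixed(2));
});
// With these invented numbers, position 5 turns a small profit while
// position 1 overspends, which is the effect described above.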
Lowering Click Price after Bidding:
After you get 10 or 20 clicks and have a decent click-through rate, you may want to slash your bids in half or by two thirds. It is often best to start your ad around the #1 or #2 position to collect feedback, and then let it fall back after you drop the ad price.
Expanding Breadth:
Ideally, your ad spend should be limited by the availability of well-targeted ads rather than by your budget. If your ad spend is limited by a budget and you are showing high for many of your search terms, lower your max bid to drop your position down to 5, 6, or 7.
That way you can show up in more search results, and people who look at the lower ad positions are more prequalified to buy.
Google AdWords is a great opportunity for your business to get customers who are searching for what you sell. When a search uses one of your keywords, your ad appears directly alongside the relevant results on the SERP. Moreover, only a highly targeted audience interested in your services will see your listings.
Demographic Targeting
The demographic targeting feature introduced by the Google AdWords program is a way to find and run your ads on sites with the right audience for your ad campaigns.
The right audience, or demographic group, is an audience that shares a particular trait or characteristic such as age, gender, income, etc. Some site publishers (e.g. social networks) ask their users to identify themselves by age and gender.
Armed with this kind of information, Google can display your AdWords ads to the demographic groups that you prefer, or prevent your ads from displaying to groups you don't want to reach.
The system will analyze the preferences you choose and create a list of available Google Network sites that are popular with that audience.
Learn more about AdWords demographic targeting.

Yahoo! Search Marketing - http://searchmarketing.yahoo.com

If you are serious about pay-per-click advertising, Yahoo! Search Marketing is a reliable top player for your advertising campaign.
Sponsor ads
Yahoo! recently upgraded its paid search advertising platform. The update, dubbed "Panama", replaces the system Yahoo! acquired from Overture (formerly GoTo.com). The new Yahoo! Search Marketing interface provides you with the new Sponsored Search.
Sponsored Search is a form of search engine marketing with new, improved features: budget control for marketing expenses, targeting of ads to a specific audience and/or customers, and easy management of tools and features. Content Match, a new feature, can extend your business's reach and increase web site traffic by featuring your ads on publisher sites and in newsletters and emails.
At the base of the new Sponsored Search is the new ranking model. With the new ranking model, an ad's rank in search results is determined by bid amount and ad quality. Thus, ads with higher quality can deliver a lower cost per click and/or may receive better placement on the results page relative to lower quality ads.
Ad quality is determined by a quality index calculated each time your ad is shown. The ad's click-through rate, relative to its position and to other ads displayed at the same time, affects the quality index. Remember that making your ads more appealing may be rewarded with a better rank or a lower cost per click.
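The exact Panama formula is not public, but a rough sketch of the idea, assuming the rank score simply grows with both the bid and a quality multiplier, might look like this:

// Two hypothetical ads: a higher quality index can outweigh a higher bid
var ads = [
  { name: "Ad A", bid: 1.00, qualityIndex: 0.4 },
  { name: "Ad B", bid: 0.70, qualityIndex: 0.9 }
];
ads.forEach(function (ad) {
  ad.score = ad.bid * ad.qualityIndex; // assumed scoring model, for illustration only
});
ads.sort(function (a, b) { return b.score - a.score; });
console.log("Top ad: " + ads[0].name); // "Ad B" wins despite the lower bid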
Yahoo! Sponsored Search lets you create ads that appear in search results on the Web's most popular destination and other sites in the Yahoo! distribution network. Their start-up wizard helps get your campaign online in five easy steps.
Generally, all pay per click solutions require you to take steps like the following to list your ad.
5 steps for listing via the Yahoo! Search Marketing service:
Step 1. Select keywords directly relevant to your site content. The editorial staff of Yahoo! Search Marketing will check your site content for correspondence with your keywords.
Step 2. Write a search listing, which consists of a title, a description of your site and what you have to offer, and a URL.
Step 3. Determine bid amounts for your search listing.
Step 4. Your listing is distributed across the web.
Step 5. Finally, your listing appears in the results of search engines.
With Yahoo! Search Marketing, as with any other pay per click provider, you reach only targeted prospects and pay only when they click on your listing. It is a powerful way to help customers who are searching for what you sell find your site.

Microsoft adCenter - https://adcenter.microsoft.com/Default.aspx

Microsoft offers its own pay-per-click ad-bidding system, Microsoft adCenter, which pairs search results with sponsored text ads from advertisers.
It helps you start your online advertising campaign and target ads to the times, places, and customers you want most. The program also has built-in tools to manage your advertising process for better results.
The keyword generation tool automatically generates keywords based on a word or website address you choose.
An add-in has also been developed for Excel 2007. Among its basic functions are building or expanding your keyword list, monitoring performance and researching trends. The keyword forecasting option gives historical and forecasted impression counts for specific keywords, provides monthly reports and shows daily traffic.
Moreover, you can optimize ad campaign performance by preparing timely reports with the help of another built-in tool. Thus you can create the advertisement for listing, define keyword match options, set your own price per click and manage the campaign to see when your clicks lead to sales.
All you need is a credit card and $5 for a one-time sign-up fee to create your Microsoft adCenter account. To register, go to https://adcenter.microsoft.com/Default.aspx.

Here is a comparison of the top PPC programs:

Google AdWords vs. Yahoo! Sponsored Search vs. Microsoft adCenter

Product Features:
  • Geo-targeting – supported by all three.
  • Demographic targeting – Google AdWords: limited to U.S. ads only; Yahoo!: yes; adCenter: yes.
  • Dynamic keyword insertion – supported by all three.
  • Keyword matching options (allow you to control how precisely a user's query must match your specified keyword) – Google AdWords: yes; Yahoo!: yes; adCenter: yes (called bidding types).
  • Reports – available in all three.
  • Ranking (important factors in how an ad is ranked) – Google AdWords: a combination of several relevance factors including CPC and clickthrough rate; Yahoo!: bid amount and ad quality (ad quality is determined by a quality index calculated each time your ad is shown); adCenter: bid, click-through rate, and relevance.
  • Minimum deposit to start – $5 for each.

Tools:
  • Keyword analysis – Google AdWords: the Enhanced Keyword Tool; Yahoo!: the ROI calculator (CPC, CPM – cost per thousand) plus a keyword research feature for keyword ideas; adCenter: the Keyword Generation Tool (a plug-in for MS Excel 2007).
  • Traffic estimation – Google AdWords: the Traffic Estimator Tool (get quick traffic estimates for new keywords without adding them to an account or using the AdWords sign-up wizard; you can see clicks per day, average CPC, cost per day and average estimated position); Yahoo!: none; adCenter: none.
  • Ad management – Google AdWords: none; Yahoo!: none; adCenter: the adCenter Desktop application to manage your ads.

More Features:
  • Ad exposure – Google AdWords: an advertising network of sites and products for increased ad exposure; Yahoo!: Content Match extends your business reach via publisher sites, newsletters and emails; adCenter: Content Ads displays content-targeted ads on Web pages.
  • Ad testing – Google AdWords: test multiple versions of your ads to determine which message works best for your customers; Yahoo!: the Sponsored Search system automatically displays the ads receiving the most clicks; adCenter: none.
  • Competitive pricing – Google AdWords: the AdWords Discounter automatically monitors your competition and lowers your CPC to one cent above theirs; Yahoo!: none; adCenter: none.
  • Campaign budgeting – Google AdWords: daily; Yahoo!: daily; adCenter: daily or monthly.
  • Ad scheduling – Google AdWords: run your ads on the days and hours you want; Yahoo!: set specific start and end dates for your campaigns and choose time-of-day / time-of-week for each campaign; adCenter: set a schedule to run your ads on the days and hours you want.
  • Learning center – available in all three.

What You Should Remember:

  1. When using the PPC advertising method, prepare your ads properly. The most important point is to select keywords directly relevant to your site content.
  2. Lower-placed ads have a lower bid price, which can save money if the top-placed ads are overpriced. If you are running your ads correctly, then the availability of well-targeted ads, not your budget, should be what limits your spending.

Thursday 26 May 2011

How Search Engines Rank Pages

Every smart search engine optimizer starts his or her career by learning to look at Web pages with the eye of a search engine spider. Once the optimizer is able to do that, he or she is halfway to full mastery.

The first thing to remember is that search engines rank "pages", not "sites". This means that you will not achieve a high ranking for your site by attempting to optimize your main page for ten different keyword phrases. However, different pages of your site WILL rank higher for different key phrases if you optimize each page for just one of them. If you can't use your keyword in the domain name, no problem – use it in the URL of some page within your site, e.g. in the file name of the page; an example follows below. This page will rise in relevance for the given keyword. All search engines show you the URLs of specific PAGES when you search – not just the root domain names.
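For instance, a page optimized for the phrase "christmas gifts" might live at a URL like this (a purely hypothetical address):

http://www.example.com/gifts/christmas-gifts.html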

Second, understand that the search engines do not see the graphics and JavaScript dynamics your page uses to captivate visitors. You can use a graphic image of written text that says you sell beautiful Christmas gifts. But it does not tell the search engine that your website is related to Christmas Gifts – unless you use an ALT attribute where you write about it.

Here's an example to illustrate.

What the visitor sees:

Beautiful Christmas Gifts!!!

What the search engine will read in this place:

<img src="http://training.webceo.com/images/assets/Stg2_St2_L5/0004.png" width="250" height="100" class="image" />

As you see there's nothing in the code which could tell the search robots that the content relates to "Christmas", "Gifts", or "Beautiful". The situation will change if we rewrite the code like this:

<img src="http://training.webceo.com/images/assets/Stg2_St2_L5/0004.png" width="250" height="100" class="image" alt="Beautiful Christmas Gifts!!!" />

As you can see we've added the ALT attribute with the value that corresponds to what the image tells your visitors. Initially, the "alt" attribute was meant to provide alternative text for an image that for some reason could not be shown by the visitor's browser. Nowadays it has acquired one more function – to bring the same message to the search engines that the image itself brings to human Web surfers.

The same concerns the usage of JavaScript. Look at these two examples:
  1. Visit our page about discounted Samsung Monitors!
  2. <script type="text/javascript"><!-- document.write("Visit our page about " + goods[Math.round(0.5 + (3.99999 * Math.random())) - 1]); //--> </script>
The first example is what visitors see; the second is the source code that produces the output (the goods array is defined elsewhere on the page). Assume the search engine spider is intelligent enough to read the script (in fact, not all spiders do); is there anything in the code that can tell it about Samsung monitors? Hardly.

As a rule, search engine spiders have a limit on how much page content they load. For instance, the Googlebot will not read more than 100 KB of your page, even if it is instructed to look for keywords at the end of your page. So if you place keywords somewhere beyond this limit, they are invisible to spiders. Therefore, you may want to acquire the good habit of not overloading the HEAD section of your page with scripts and styles. Better to link them from outside files, because otherwise they just push your important textual content further down.
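For example, a lean HEAD section that keeps styles and scripts in external files (the file names here are just placeholders) could look like this:

<head>
  <title>Beautiful Christmas Gifts</title>
  <!-- Styles and scripts live in external files so they do not push
       the page's important textual content further down in the HTML -->
  <link rel="stylesheet" type="text/css" href="styles.css" />
  <script type="text/javascript" src="scripts.js"></script>
</head>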

There are many more examples of relevancy indicators a spider considers when visiting your page, such as the proximity of important words to the beginning of the page. Here, as well, the spider does not necessarily see the same things a human visitor sees. Take, for instance, a left-hand menu pane on your Web page: people visiting your site will generally not pay attention to it first, focusing instead on the main section; the spider, however, will read your menu before passing to the main content, simply because it is closer to the beginning of the code.

Remember: during its first visit, the spider does not yet know which words your page relates to! Keep this simple truth in mind. By reading your HTML code, the spider (which is just a computer program) must be able to guess the exact words that make up the theme of your site.

Then, the spider will compress your page and create the index associated with it. To keep things simple, you can think of this index as an enumeration of all words found on your page, with several important parameters associated with each word: their proximity, frequency, etc.
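To make the idea more concrete, here is a toy JavaScript sketch (nothing like a real search engine index, just an illustration) that records each word's frequency and first position on a page:

// Build a tiny "index" for one page: word -> { frequency, first position }
var pageText = "beautiful christmas gifts and christmas decorations";
var index = {};
pageText.toLowerCase().split(/\s+/).forEach(function (word, position) {
  if (!index[word]) {
    index[word] = { frequency: 0, firstPosition: position };
  }
  index[word].frequency++;
});
console.log(index["christmas"]); // { frequency: 2, firstPosition: 1 }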

Certainly, no one really knows what the real indices look like, but the principles are as outlined here. The words that rank high in the list according to the main criteria will be considered your keywords by the spider. In reality, the parameters are quite numerous and include off-page factors as well, because the spider is able to detect the words every other page out there uses when linking to your page, and thus calculate your relevance to those terms too.

When a Web surfer queries the search engine, it pulls out all pages in its database that contain the user's query. And here the ranking begins: each page has a number of "on-page" indicators associated with it, as well as certain page-independent indicators (like PageRank). A combination of these indicators determines how well the page ranks.

It's important to keep this in mind: after you have made your page attractive for visitors, ask yourself whether you have also made it readable for the search engine spiders. In the lessons that follow, we will provide for you detailed insight into the optimization procedure; however, try to keep in mind the basics you've learned here, no matter how advanced you become.

Here's what you should remember from this lesson:

  1. Search engines rank pages, not sites.
  2. When a spider first visits a page on your website, it does not yet know the keywords for which your page is relevant; it does not know anything except your URL. Try to optimize your code to make it readable not only to visitors but also to spiders.

Saturday 21 May 2011

Submitting to Directories

There are hundreds of directories on the Web that cover every possible market, offering many valuable opportunities to get your site listed in crawler-based engines, expose your site to your audience and increase the absolute value of your pages (also known as Google PageRank). The first (and, if you succeed, maybe the only) directories to get listed in are Yahoo and DMOZ.

There are other directories, such as JoeAnt, which can also be quite useful; however, many of them are just not worth the trouble. There's a good technique for determining whether a directory can help you on your way to top rankings and traffic. When you are considering placement in a directory, check its "robots.txt" file (which we've covered in the previous section about optimizing site structure) and see whether it allows the major search engines to crawl it. If it doesn't allow crawlers to go through the directory, it is useless for you to list there.

As you remember, the robots.txt file is always in the root directory so just type the full URL of the site and add "/robots.txt" on the end to see the file.
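For example, a directory whose robots.txt looks like this shuts all crawlers out, so a listing there will do nothing for your rankings:

User-agent: *
Disallow: /

A crawler-friendly directory, on the other hand, will either allow everything or only block non-essential areas such as administration pages.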

When submitting to the directories, remember that they are search engines powered by humans: all listings within the directories are compiled by human editors. You already know how important a listing in a directory can be, so please find the time to make some of the recommended preparations for directory submission. This includes writing a description of your entire website in 25 words or less, which includes the primary keyphrase you've optimized your home page for. Review your description and make sure it doesn't read like marketing hype.

If you are going to submit your site to several directories, or even to a dozen, it is recommended to have this information prepared and saved; this will help you speed up the submission process. Keep this data ready somewhere to copy and paste from:
  • Email address
  • Website URL
  • A title for the website
  • A description for the website
When it comes to the title, use your company name or the official name of your website, as directories have strict criteria; Yahoo!, for example, will allow only these names.

As for a description Jill Whalen suggests the following: “The website description posted with your URL is a big factor in how your site will rank once it's listed in the directory. It is very important to do this right the first time. If you put too much promotional jargon in your description or make it too long, for example, the editors are sure to change it. When they do, you can bet your keywords won't appear in the final listing.

If you've created a good meta description tag for your site, start with that. Copy and paste it into the submission form, then start deleting extraneous words. Move words around until you have the shortest yet most descriptive sentence possible. If you do this correctly, chances are the editors won't change it.

Be sure the words you're using in your description appear on the pages of your website. If they don't, and the site appears to be about subjects other than what you described in your form, your description might be edited.”

For deeper insight visit - http://www.webproguide.com/articles/How-to-Submit-Your-Site-to-Directories-such-as-Yahoo-DMOZ-and-Zeal/index.php?phrase_id=2796

Submitting to Yahoo

Being listed in the Yahoo! Directory makes it easier to get indexed and rank higher in the crawler-based results of Yahoo! as well as Google and Live Search.

Yahoo! Directory offers two types of submission: "Standard", which is free, and "Yahoo! Directory Submit", which involves a submission fee and an annual fee of $299 ($600 if your site is adult-related). This guarantees your site will be reviewed by an editor within 7 days. It does not, however, guarantee inclusion, and the fee is non-refundable.

To choose the submission option go to http://dir.yahoo.com, select the appropriate category and then click "Suggest a Site" link in the top right section.

Anyone can use Standard submission to submit for free to a non-commercial category. You'll know a category is non-commercial because, when you try to submit to it, the free Standard submission option is offered in addition to the paid Yahoo! Directory Submit option described above.

If you choose the free submit, there's no guarantee that your submission will be reviewed quickly or reviewed at all.

You can have a commercial site and still try to submit for free to a non-commercial category; however, use caution when submitting. Let's say you sell weather forecast software. If you submit your site as such, chances are good it will not be accepted; but if you highlight a page that tells interesting facts about weather and weather forecasting, this information can be considered a good reason to list your site in a non-commercial category.

If accepted into a commercial category for money, you'll be re-evaluated after a year and charged the submission fee again if you want to stay in Yahoo!'s commercial area. Review the traffic you've received from Yahoo! over the past year to decide whether it is worth paying the fee again. If not, decline to be listed again and you will not be charged. Most often, you will decide to drop your listing after a year, as the category itself does not bring much traffic. Remember that the directory listing is initially important to us as a doorway to search engine listings. Once we've achieved that, we may safely let the directory listing drop, most often without a significant impact on the search traffic the site receives. The crawler-based engines will keep revisiting and listing your site on their regular schedule.

Before submission be aware of the Terms and Conditions. Here are the most important:
  • I have verified that my site does not already appear in the Yahoo! Directory and I understand that this is not the place to request a change for an existing site.
  • My site supports multiple browsers and capabilities. (For example, java only sites will not be listed).
  • I understand that if my site is added, it will be treated as any other site in Yahoo! and will receive no special consideration.
The full list of requirements can be found here https://ecom.yahoo.com/dir/reference/instructions

It is crucial to choose the category properly. Do not choose the category you would prefer your site to be seen in, but the one it really matches. If you are not certain which category to choose, try this: make a query using your site's most appropriate keyword and observe the result page with its categories.

The next step is to choose a subcategory, because if you submit to a top-level category while disregarding an appropriate subcategory, your submission becomes questionable. Additionally, don't forget about geographic regions: if your site is local by nature, this should be taken into consideration during the submission process.

Remember: the more carefully you prepare, research keywords, debug pages, write valuable content and compile your description, the better your chances of being included, regardless of whether you go through Yahoo! Directory Submit or the free Standard submission.

Submitting to DMOZ

DMOZ / ODP is a catalog of the World Wide Web compiled by volunteers. DMOZ used to be a starting point for Google's crawler; nowadays it remains a good source of reliable inbound links for sites.

Submission to DMOZ is free but on the other hand there's no guaranteed turnaround time for acceptance.

To suggest your site to DMOZ, go to http://www.dmoz.org and locate the category you want to be listed in. Then use the "Suggest URL" link that is visible on the top of the category page. Fill out the form, and the submission process is complete.

If accepted, you should see your site appear within approximately three to six weeks. If this doesn't happen, don't try to resubmit. Instead, try to get your site listed in several regional or thematic categories.

As with Yahoo, it's highly recommended that you take the time to learn more about the Open Directory before submitting, in order to maximize the amount of traffic you may receive.

 

What you should remember from this lesson:

  1. Submitting to directories provides a powerful boost to your rankings and search engine visibility, but to be successful it requires thorough preparation.
  2. If your submission isn't listed after three to six weeks, the correct technique on Yahoo! is to resubmit. With DMOZ, on the contrary, just try submitting to another category or categories.

Verifying Submission Success

When manually submitting to a search engine, it's clear whether your submission has been accepted: most commonly, you will be shown a message confirming that your page has been queued for crawling or an error message explaining why it hasn't.

Auto-submission software for the crawler-based engines, such as Web CEO, will sometimes provide an opportunity to see the submission result in a report (something like 'OK' or 'Failed') as well as the actual response pages returned by the search engine.

Directories like Yahoo!, when using paid submission, send a message from the editor with an explanation of why your submission has been accepted or rejected.

Dynamic search marketing, however, requires staying updated not only on whether your submission has been queued; it is also extremely important to find out as soon as your site has been crawled and indexed and has thus found its place in the search engine index. And if the site does not appear in the index after a period of time, something must be done. In any case, you need to know when your site has been indexed.

Here are some techniques for verifying whether your submission has been successful.

First, use the site:URL syntax to check whether your site has been indexed by the given SE. Mind, however, that some SEs do not support this syntax, so you may have to use some other method to check indexation.
Another checking method is to include a special unique word or combination of words in the page you are going to submit. The idea is that the search engine will match your page against this word when queried for it.

For this purpose we may use a randomly generated alphanumeric string like "249ej38eh234ieb32i40ly5u05" or a real word combination which is unlikely to be found on any other page on the web, e.g. "International red widgets online open the ranking contest". Include it somewhere on your page so that it can be read by a search engine spider. Don't worry; as soon as you determine you're in, you can immediately remove it from your page.
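For example, the phrase can sit in an ordinary paragraph on the page (www.example.com below is, of course, a placeholder for your own domain):

<p>International red widgets online open the ranking contest</p>

Later you can query the engine for "International red widgets online open the ranking contest", or run a quick indexation check with:

site:www.example.com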

When you are included in the index, your page will be shown in the result list for this query and – as it's a unique search term not used by anyone else – your page will appear on the top of the list.

It's simply a matter of regularly checking with the search engine by querying it for this term and looking for your page in the first results. If your page isn't there, you aren't included in the index yet.

You can make your life easier by using a combination of Web CEO Ranking Checker and Scheduler to automatically and periodically perform this check for you. Once it detects you are found among the results, it means your page has been included into the index and you may celebrate your first SEO victory.

What you should remember from this lesson:

  1. Submission verification is important because it alerts you when the first step is complete and allows you to move on to the next step in the process.
  2. For verification, include a string in your page that can be uniquely matched against your page by the search engine. Then regularly check the search results for this term.

What to Do if Your Site Has Been Penalized

We are pleased to congratulate you on a great job done studying White, Black, and Gray-Hat SEO. This final lesson is devoted to search engine penalty signs and the principles that differentiate them.

As evidence shows, websites can suffer penalties for numerous reasons. Search engines hate spammers who violate or manipulate their rules, and they continuously work to strengthen their algorithms and sift out spamming sites and pages.

The initiative taken against spammers has resulted in scalable and intelligent methodologies for fighting manipulation. Today, search engines do their best to control and remove spam with hundreds of the world's best online engineers engaged in dealing with spammers.

We can list the widely known methods Google, a leading spam fighter, applies. Their names are still buzzwords: the Google Sandbox, the supplemental index, penalties for violations of SE rules, the regular updates called "Google dances"… Google checks many factors and monitors each domain or website within its reach.

Widespread mistakes occur when your links directory expands and starts looking like a link farm (even if this was not your intention). That's why we have warned you to be very careful with link exchange programs and with whom you choose to link. Too many sites have been banned because of the enormous number of links they displayed, or for having paid or manipulative links.

Moreover, it is sometimes hard to tell whether your site or page actually has a penalty, or whether something has changed, either in the search engines' algorithms or on your site, that negatively impacted your rankings or inclusion.

This blind spot still exists for the Google and Yahoo! search engines, though Bing/Live Search/MSN has already added a special option for checking whether penalties have been applied.

Checking if your site has been banned
To check for and recover from the penalty box, Bing/Live Search/MSN offers its Webmaster Center tools (http://www.bing.com/webmaster/). As usual, you first have to add a website and then troubleshoot the crawling and indexing processes.
Google and Yahoo! users are still deprived of this option. Yahoo! offers a special contact form for sending feedback regarding the status of your site in its Search Index. If your site has disappeared from the search results, you can use this form - Yahoo Search Support.
Very often, what is initially perceived as a mistaken spam penalty with these engines is, in fact, related to accessibility issues.
That's why we suggest the following two-step checking process.
Step 1
  • Poor website availability
    Look into your website monitoring reports (e.g. from Web CEO's Monitoring tool) and check your website availability over time. If your site was down for a period, chances are the crawler failed to access your site several times and decided those pages no longer exist.
  • Changes to the site content
    Consider whether you have made any changes to the site content recently – maybe you have applied keyword stuffing (such changes could have triggered spam filters).
  • Participation in affiliate schemes
    If you are an affiliate or manage your affiliate program – make sure your site's content is unique and no one scrapes content from you. If you have stronger affiliates who use the same content but rank better, your site could be penalized for duplicate content.
  • Changes in the site structure
    Have you changed your site structure, perhaps by adding more outgoing links, changing the internal link structure, removing or adding a website section, removing pages, playing around with redirects, etc.? Those changes might not be SE-friendly.
  • The quality and quantity of your backlinks
    Carefully check how the value and number of your backlinks have changed over time. E.g. look to see whether most of your backlinks are off-topic, whether some linking sites were banned or devalued because of Gray or Black-Hat SEO, or whether there was an overnight boost in your link popularity, which may be a sign of aggressive link-building techniques.
  • Temporary changes in the SE database or ranking algorithms
    First, check sites that share similar backlink profiles and see whether they've also lost rankings. Then wait a couple of hours and come back to the SE to check your rankings. Maybe you were just seeing another Google update, or so-called "dance". When the engines update their ranking algorithms, links change in value, site importance changes, and you may suffer from ranking movements. (Read the full explanation of the Google dance notion in our "Crawler-Based Search Engines" lesson.) Refer to SE news to check whether there have been any changes to the SE's ranking algos.
If any of these suggestions are true for your site – go to Step 2 to proceed with your investigation.

Penalties are very disappointing to get. All you'll see is a sudden drop in your site's rankings; furthermore, the search engine won't send you an email telling you to stop doing something wrong. As a rule, search engines just listen to a competitor who reported you, and they take the competitor's word for it without hearing any kind of defense from you.

Step 2

This step will cover the following factors:

Check the site's presence in the search engine index (use the site:url.com query) and see whether it ranks for its own domain name. Then search for your unique brand name and for five or six relatively unique terms from the title tags of your pages. Positive answers to these checks show that you are present in the search engine's index and may simply have lost rankings.

On the other hand, negative answers in this second-step check mean a penalty has probably been applied.

Note: if even your home page is absent from the index, your site was most likely banned, while a partial presence in the index indicates a penalty. Once you've ruled these out, clear your pages of spam and remove all the potentially harmful links.

Cleaning penalized Web pages

Our next task is to check or recall what you did wrong while optimizing the pages. The search engines can help you with this job: the top engines have created tools to make the site owner's life easier, and they will inform you about wrong steps you took on your website.
First, register your site with the engine's Webmaster Tools service and brush it up properly. Remove and fix everything you can with the help of Webmaster Tools warnings and alerts.
Top Search Engines Webmaster Tools services are:
Asking for Reconsideration
Now we are ready to submit the cleaned-up pages for reconsideration. Use the following links to request re-inclusion of the clean pages:
It is recommended that you include a clear explanation of what you have done to fix all previous problems. State the reasons for your mistake: perhaps your site was hacked, or you used a problematic technique unintentionally, and so on.
To summarize: the search engines have all driven down the value of search spam and made so-called "White-Hat" tactics far more attractive. The re-inclusion/reconsideration process can take months; moreover, be prepared for rejection as well.

What you should remember:

  1. Website accessibility difficulties are often perceived as a penalty for spam. Check your pages for errors that can prevent the crawling process.
  2. If your site was banned or penalized, clean the pages of spam by using SE special tools and then use a standard re-consideration or re-inclusion request.

Google PageRank, Local Rank and Hilltop Algorithms

When evaluating websites, crawler-based search engines usually consider many factors they can find on your pages and about your pages. Most important for Google are PageRank and links. Let's look closer at the algorithms Google applies to rank Web pages.

Google PageRank

Google PageRank (further referred to as PR) is a system for ranking Web pages used by the Google search engine. It was developed by Google founders Larry Page and Sergey Brin while they were students at Stanford University. PageRank ("PageRank" written together is a trademark that belongs to Google) is the heart of Google's algorithm and makes it the most complex of all the search engines.

PageRank uses the Internet's link structure as an indication of each Web page's relevancy value. Sites considered high quality by Google receive a higher Page Rank and – as a consequence – a higher ranking in Google results (the interdependence between PageRank and site rankings in the search results is discussed later in this lesson). Further, since Google is currently the world's most popular search engine, the ranking a site receives in its search results has a significant impact on the volume of visitor traffic for that site.

You can view an approximation of the PageRank value currently assigned to each of your pages by Google if you download and install Google's toolbar for Microsoft Internet Explorer (alternatives also exist for other popular browsers). The Google toolbar will display the PageRank based on a 0 to 10 scale, however a page's true PageRank has many contributing factors and is known only to Google.

For each of your pages PageRank may be different, and the PageRanks of all the pages of your site participate in the calculation of PageRank for your domain.

For each of your pages, the PR value is almost completely dependent upon links pointing to your site, reduced, to some degree, by the total number of links to other sites on the linking page. Thus, a link to your site will have the highest amount of impact on your PR if the page linking to yours has a high PR itself and the total number of links on that page is low, ideally, just the one link to your site.

The actual formula (well, an approximate one, according to Google's official papers) for PR is as follows:

PR(A) = (1-d) + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))

where pages T1...Tn all point to page A. The parameter d is a damping factor which can be set between 0 and 1. Google usually sets d to 0.85. C(T) is defined as the number of links going out of page T.

Thus, a site with a high PR but a large number of outbound links can nullify its own impact on your PR. To increase your PageRank, get as many links as possible to your site from pages with a high PR and a low number of total links. Alternatively, obtain as many links pointing to your site as you can, no matter what their PageRank is, as long as they are ranked at all. Which variant gets the best out of the PR formula depends on each specific case.

Those of you interested in the mathematical aspect will see that the formula is cyclic: the PR of each page depends on the PR of the pages pointing to it, but we won't know what PR those pages have until the pages pointing to them have had their PR calculated, and so on. Google resolves this by implementing an iterative algorithm which starts without knowing the real PR for each page, assuming it to be 1. The algorithm then runs as many times as needed, and on each run it gets closer to an estimate of the final value.

Each time the calculation runs, the value of PageRank for each page participating in the calculation changes. When these changes become insignificant or stop after a certain number of iterations, the algorithm assumes it now has the final Page Rank values for each page.
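Here is a minimal JavaScript sketch of that iteration, applying the formula above to a tiny three-page site whose link structure is invented purely for illustration:

// Hypothetical link structure: A links to B and C; B and C link back to A
var links = { A: ["B", "C"], B: ["A"], C: ["A"] };
var d = 0.85;                  // damping factor, as in the formula above
var pr = { A: 1, B: 1, C: 1 }; // start by assuming PR = 1 for every page
for (var i = 0; i < 50; i++) { // iterate until the values settle
  var next = {};
  for (var page in pr) {
    var sum = 0;
    for (var source in links) {
      if (links[source].indexOf(page) !== -1) {
        sum += pr[source] / links[source].length; // PR(T)/C(T)
      }
    }
    next[page] = (1 - d) + d * sum;
  }
  pr = next;
}
console.log(pr); // page A, with two inbound links, ends up with the highest PR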

Real PageRanks range from 0.15 (for pages that have no inbound links at all) up to a very large number. The actual values change every time Google re-indexes and adds new pages to its database. Most experts agree that toolbar PR and real PR are related on a logarithmic scale. Here is what that means if we assume the base of the logarithm is, for instance, 10:

Toolbar PageRank (log base 10)    Real PageRank
0                                 0.15 – 10
1                                 10 – 100
2                                 100 – 1,000
3                                 1,000 – 10,000
4                                 10,000 – 100,000
5                                 100,000 – 1,000,000
6                                 1,000,000 – 10,000,000
7                                 10,000,000 – 100,000,000
8                                 100,000,000 – 1,000,000,000
9                                 1,000,000,000 – 10,000,000,000
10                                10,000,000,000 – 100,000,000,000
Although there is no evidence that the logarithm is actually based on 10, the main point is that it becomes harder and harder to move up the toolbar scale, because the gaps to overcome become larger and larger with each step. This means that for new websites, toolbar PR values between 1 and 3 may be relatively easy to acquire, but getting to 4 requires considerably more effort, and pushing up to 5 is even harder still.

As you may have figured out from the formula above, every page has at least a PR of 0.15 even if it doesn’t have any inbound links pointing to it. But this may only be in theory – there are rumors that Google applies a post-spidering phase whereby any pages that have no incoming links at all are completely deleted from the index.

Local Rank

Local Rank is an algorithm similar to PR, written by Krishna Bharat of the Hilltop project. Google applied for the patent in 2001 and received it in early 2003. In short, this algorithm re-ranks the results returned for a user's query by looking at the inter-connectivity between the results. This means that after a search is done, the PR algorithm is run among the result pages only, and the pages that have the most links from other pages in that set will rank highest.

Essentially, it's a way of making sure that links are relevant and ranking sites accordingly. Please note that this algorithm does not count links from your own site – or, to be more exact, links from the same IP address.

Assuming that it is used by Google, make sure that you first get links pointing to you from other pages that rank well (or rank at all) for the keyword that you are targeting. Directories such as Yahoo! and DMOZ would be a good place to start – they tend to rank well for a wide range of keywords. Also, keep in mind that this is about pages, not sites. The links need to be from the pages that rank well – not other pages on sites that rank well.

Hilltop

Hilltop is a patented algorithm created in 1999 by Krishna Bharat and George A. Mihaila of the University of Toronto. The algorithm is used to find documents relevant to a particular keyword topic. Hilltop operates on a special index of "expert documents".

Basically, it looks at the relationship between "Expert" and "Authority" pages. An "Expert" is a page that links to lots of other relevant documents; an "Authority" is a page that has links pointing to it from "Expert" pages. Expert pages here means pages devoted to a specific topic that link to many non-affiliated pages on that topic. Pages are defined as non-affiliated if they are authored by people from non-affiliated organizations. So, if your website has backlinks from many of the best expert pages, it will be an "Authority".

In theory, Google finds "Expert" pages and then the pages that they link to would rank well. Pages on sites like Yahoo!, DMOZ, college sites and library sites can be considered experts.

Google acquired the algorithm in February 2003.

Site Structure and PageRank

PageRank can be transmitted from page to page via links across different pages of your site as well as across all the sites in the Web. Knowing this, it’s possible to organize your link system in such a way that your content-rich pages receive and retain the highest PageRank.

The pages of your site receive PageRank from outside through inbound links. If you've got many inbound links to different pages of your site, it means PageRank enters your site at many points.

Such "PageRank entry points" can pass PageRank further on to other pages of your site.

The idea you should keep in mind is that the amount of PageRank that a page of your site is able to give to another page depends on how many links the linking page itself contains. The page only has a certain amount of PageRank, which is distributed over all the pages it links to.

The best way to obtain a good PR on all of your pages is to have a well thought-out linking structure for your site.

What this means is that every page on your site should have multiple links from your other pages coming into it. Since PR is passed on from page to page, the higher the PR a page has, the more it has to pass on. Pages with a low number of links on them will pass relatively more PR per link. However, on your own site you usually want all of your pages to benefit. Also, PR is passed back and forth between all of your pages; this means that your home page gets an additional boost because, generally, every page on your site links to your home page.
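As a rough illustration of how a page divides the PageRank it can pass on (using the damping factor from the formula above; the page's PR and link count here are hypothetical):

// A hypothetical page with PR 5 that contains 10 outgoing links
var pagePR = 5;
var outgoingLinks = 10;
var d = 0.85;
var passedPerLink = d * pagePR / outgoingLinks;
console.log(passedPerLink); // 0.425 of PR contributed to each linked page
// Halve the number of links and each remaining link passes twice as much: 0.85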

Let's look at the prototypes of site linking schemes that may be beneficial in terms of PR distribution.

1. Simple hierarchy.

Simple hierarchy

The boxes denote separate pages, and the figures in them denote the PR values calculated with the help of a simple algorithm that takes only these pages into consideration. With a site structure like this, it's pretty easy to get a high PR for your home page; however, this is an ideal situation which is difficult to recreate in real life: you will want more cross-linking than just links from all your pages to the home page.

2. Linking to external pages that return backlinks

Linking to external pages that return backlinks

This simply means creating a link directory page on your site and benefiting a bit from link exchange with external pages. Link exchanges are dealt with in the next lesson.

3. Site with inbound and outbound links

Site with inbound and outbound links

This is very similar to the first scheme; however, here there is an external site (Site A) passing its PR to your home page, which then distributes it to the child pages. You can see that both the home page's PR and that of the child pages have significantly increased. No matter how many pages your site has, your average PR will always be 1.0 at best, but a hierarchical layout can strongly concentrate the votes, and therefore the PR, into the home page.

So here are some main conclusions you should keep in mind when optimizing the link structure of your site for better PR distribution.
  • If a particular page is very important – use a hierarchical structure with the important page at the "top".
  • When a group of pages may contain outward links – increase the number of internal links to retain as much PR as possible.
  • When a group of pages do not contain outward links – the number of internal links in the site has no effect on the site's average PR. You might as well use a link structure that gives the user the best navigational experience.

How your PageRank influences your rankings

While the exact algorithm of each search engine is a closely guarded secret, search engine analysts believe that a page's ranking in the search results is some form of multiplication of page relevance (which is determined from multiple "on-page" and "off-page" factors) and PageRank. Simply put, the formula would look something like this:

Ranking = [Page Relevance] * [PageRank]

The PR logic makes sense, since the algorithm seems relatively invulnerable to spammers. Google's search results have demonstrated high relevance, and this is one of the main reasons for its resounding success. Most other major search engines have adopted this logic in their own algorithms in some form or other, varying the importance they assign to this value when ranking sites in their search engine result pages.

What you should remember from this lesson:

  1. PageRank was developed by Google to estimate the absolute (keyword-independent) importance of every page in its index. When Google pulls out the results in response to a Web surfer's query, it does something similar to multiplying the relevance of each page by the PR value. So, PageRank is really worth fighting for.
  2. PageRank depends on how many pages out there link to yours (the more, the better) and how many other links these pages contain (the less, the better).
  3. You may try to optimize the link structure of your site for better PageRank distribution. Most simply, you should create a site map, get many cross-links between your pages and organize a hierarchical link structure with the most important pages at the top.