Tuesday 13 December 2011

Listing Dynamic Web Pages in Search Engines


Problems With Dynamic Languages

Most search engines do not like to follow JavaScript links or spider deeply into dynamically driven websites. Since scripts can create millions of pages on the fly, search engines need some assurance of quality before they will list all of those pages.

Session IDs & Cookies

Search engines do not like reading session IDs or cookies. When a search engine is handed a session ID, its spider will usually leave the site. If spiders indexed websites that handed them session IDs, each visit would produce a fresh set of URLs to crawl, which would frequently overload the server. Session IDs would also make the site seem many times larger than it really is.

General Dynamic Page Listing Tips

Search engines are getting better at listing dynamic web pages. Here are a few basic rules of thumb to help search engines index your dynamic website.
  • Build a linking campaign. As you gain more inbound links, search engine spiders will have more reason to trust the quality of your content and will spider deeper through your site.
  • Use short query strings. Most search engines will do well at listing dynamic pages if each query string is kept under about 10 characters.
  • Minimize the number of variables. When possible, use three or fewer parameters; the fewer the better. If you use long parameter strings and half a dozen parameters, it is a fair bet that the site will not get indexed.
  • If you still have problems, you can use a CGI script to strip the query string out of your dynamic URLs, or try one of the other dynamic page workarounds listed below.
  • Other dynamic workarounds: There are multiple ways to list dynamic web pages in search engines. Common techniques are:
    • Hard coded text links
    • Bare bone pages
    • ColdFusion site maps
    • Apache Mod Rewrite
    • IIS Rewrite Software
    • Paid Inclusion

Hard Coded Text Link

To list dynamic web pages in search engines you can capture the entire URL in a link like:
<a href="http://www.search-marketing.info/catalog.html?item=widget&color=green&model=6">Green widgets style 6</a>
If you use static links like the one above to reference dynamic pages, search engines will usually be able to index them. Many sites use a site map that links to the most important interior pages.
If you have enough link popularity and link into a couple of your main category pages using static links, search engines will usually be able to index the rest of the site.

Bare Bones Pages

You can also make a bare bones static page for each dynamic page you want listed in search engines.
<html>
<head>
<title>Green Widgets style 6</title>
</head>
<body>
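<!-- A server side include: the CGI script below fills in this page's content when the page is requested -->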
<!--#exec cgi="myscript.pl?greenwidget-6"-->
</body>
</html>

ColdFusion Site Map

For ColdFusion you can code a site map page using code similar to the following.
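<!--- Select every page with status = 1, ordered by the sortorder column --->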
<cfquery name="getPages" datasource="myDatasource">
SELECT *
FROM pages
WHERE status = 1
ORDER BY sortorder
</cfquery>
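<!--- Output a plain text link for each row so spiders can follow every page --->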
<cfoutput query="getPages">
<a href="index.cfm?pageID=#getPages.pageID#">#getPages.pageName#</a>
</cfoutput>

Apache Mod Rewrite

For Apache servers there is a module called mod_rewrite that can make dynamic pages appear to be static pages. The documentation for mod_rewrite is available on the Apache website.
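As a rough illustration, and assuming mod_rewrite is enabled and .htaccess overrides are allowed (the file name, script name and parameters below are hypothetical examples, not taken from the Apache documentation), a rewrite rule can map a static-looking URL such as /widgets/green-6.html onto an underlying dynamic script:

# Turn on the rewrite engine for this directory
RewriteEngine On
# Rewrite /widgets/green-6.html to /catalog.cgi?item=widget&color=green&model=6
# The spider only ever sees the static-looking address
RewriteRule ^widgets/([a-z]+)-([0-9]+)\.html$ /catalog.cgi?item=widget&color=$1&model=$2 [L]

Internal links on the site then point at the static-looking URLs, so spiders crawl them like ordinary pages while the same dynamic script serves the content behind the scenes.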

IIS Rewrite Software

IIS servers do not have the built-in rewrite features that Apache Mod Rewrite provides. You can still rewrite your URLs on IIS servers using custom-built software programs.

Trusted Feed Paid Inclusion

You can pay to use trusted feeds to upload dynamic content to Yahoo! and acquire traffic on a cost-per-click basis through the Overture Site Match program. I would consider this a last option for many sites, since many business models cannot support the incremental cost-per-click charges.

Thursday 8 December 2011

Opinion: 3 Onsite SEO Myths and Facts – What Really Counts?

Before starting this article I would like to note that I am specifically talking about Google. The information below might not apply to other search engines.

Everybody who is into SEO knows that it is more than just link building and offsite techniques. Sure, links matter the most, but how about your website itself? Onsite optimization might not be the most important part of SEO according to some people but that just depends on the point of view. To me, there have always been some onsite factors that play a significant role in the eyes of Google.

Of course, according to link builders the most important thing when optimizing a website is getting links from websites with high TrustRank, while according to web designers it is proper coding. Since the people I work with and I focus on SEO as a complete process, we concentrate on everything that is important.

However, there are some things that just don’t matter as much as others, especially when doing onsite SEO. Google changes its algorithm almost every day, so a lot of the old onsite techniques that once worked are now useless, thanks to the many spammers who exploited them. So what exactly are the onsite factors that can affect your rankings?

WEBSITE TITLE

The title of your website is one of the most important things when it comes to onsite SEO and better rankings. Here is what people believe to be true and what the truth really is:

Myth

A common mistake people make when doing onsite optimization is to stuff keywords into the title of their website, thinking that will help them rank better. Keyword stuffing was a technique that was somewhat effective a long time ago, until Google noticed that spammers were using it to their advantage. So Google changed its algorithm, making a website’s ranking depend more on links and less on onsite factors.

Fact

The title of your website matters a lot, and if you don’t want to get a penalty you need to keep it simple and user-friendly as well as Google-friendly. This is the place where you get to describe your whole website or business in about 10-15 words. I am not saying you should not put keywords in there. Quite the opposite: put your most important keywords in the title, but work them in so they do not look spammy, rather than just stuffing them in and separating them with commas.

Tips

When writing your website title, be creative and don’t repeat any word more than once. For example, if you are selling silver and gold jewelry, writing “Silver jewelry, gold jewelry…” in your title is not a good idea. Instead, use the phrase “Silver and gold jewelry”. Google doesn’t care about the order of the words, so you will get credit for each permutation.

URL STRUCTURE

The most obvious element is the domain name. If your domain name is an exact match for your most competitive keyword, you’re already ahead. However, the rest of the URL structure is also a very important onsite factor, and many people are still getting it wrong.

Myth

Again, a very common myth is that stuffing as many keywords as possible in the name of each page will work.

Fact

A website with a better URL structure has an advantage over a website with dynamic URLs. Although dynamic URLs can be read by Google, they simply don’t have the same effect as properly structured ones.

Tips

When taking care of your URL structure, the most important thing is to name each page of your website after its most relevant keyword. Creating many pages whose names are your keywords will pay off better than having dynamic URLs.

AGE

The age of a website is another factor that plays a big role in its rankings, but not in the way some people think.

Myth

A lot of people believe that a website will earn better rankings over time on its own, so their strategy is to just sit and wait. They believe a website that is 3 years old should automatically rank better than one that is 4 months old, no matter what, and that even with no offsite optimization the old website will still outrank a new website with a lot of backlinks.

Fact

The age of a website does matter to Google, but your website will not rank highly just because it is old. The only thing the site’s age affects is the amount of TrustRank it receives from the links pointing to it. For the first two months you will most likely not rank in Google at all, because you will be in the “sandbox” where all new websites go. As your website gets older you will start receiving a bigger percentage of TrustRank, and four years after its creation you will receive 100% of the TrustRank that the links pointing to your website pass.

Tips

Just because your website will be in the sandbox for the first two months doesn’t mean you should sit and wait for the time to pass before starting to build links. Instead, use that time to get some links and enjoy the credit Google will give you for them once the trial period is over.

Conclusion

These are 3 of the most important onsite SEO factors you should focus on, but I want to touch on two other factors people still think matter: the XML sitemap and the coding. Just to be clear, this article is about which onsite factors help you get better rankings, not about what makes Google’s job easier. Of course the XML sitemap is a really useful thing, and it helps Google crawl your website faster and therefore index it faster. However, your sitemap has nothing to do with your rankings, and neither does the coding of your website.

Concentrate on what is really important and don’t worry about things web designers and other charlatans tell you in order to get more money from you.

Monday 5 December 2011

Google Algorithm Updates: The Latest Things To Consider

Google algorithm "transparency" continues

Google has been making a big deal about wanting to be more transparent about its search algorithm lately (without revealing the secret sauce too much of course). And so far, I have to say they're making good on that promise fairly well.

Is Google being transparent enough for your liking?

We've seen plenty of algorithmic announcements made by the company over the course of the year. In November, they discussed ten recent changes they had made. Here's a recap of those:

  • Cross-language information retrieval updates: For queries in languages where limited web content is available (Afrikaans, Malay, Slovak, Swahili, Hindi, Norwegian, Serbian, Catalan, Maltese, Macedonian, Albanian, Slovenian, Welsh, Icelandic), we will now translate relevant English web pages and display the translated titles directly below the English titles in the search results. This feature was available previously in Korean, but only at the bottom of the page. Clicking on the translated titles will take you to pages translated from English into the query language.
  • Snippets with more page content and less header/menu content: This change helps us choose more relevant text to use in snippets. As we improve our understanding of web page structure, we are now more likely to pick text from the actual page content, and less likely to use text that is part of a header or menu.
  • Better page titles in search results by de-duplicating boilerplate anchors: We look at a number of signals when generating a page's title. One signal is the anchor text in links pointing to the page. We found that boilerplate links with duplicated anchor text are not as relevant, so we are putting less emphasis on these. The result is more relevant titles that are specific to the page's content.
  • Length-based auto complete predictions in Russian: This improvement reduces the number of long, sometimes arbitrary query predictions in Russian. We will not make predictions that are very long in comparison either to the partial query or to the other predictions for that partial query. This is already our practice in English.
  • Extending application rich snippets: We recently announced rich snippets for applications. This enables people who are searching for software applications to see details, like cost and user reviews, within their search results. This change extends the coverage of application rich snippets, so they will be available more often.
  • Retiring a signal in Image search: As the web evolves, we often revisit signals that we launched in the past that no longer appear to have a significant impact. In this case, we decided to retire a signal in Image Search related to images that had references from multiple documents on the web.
  • Fresher, more recent results: As we announced just over a week ago, we've made a significant improvement to how we rank fresh content. This change impacts roughly 35 percent of total searches (around 6-10% of search results to a noticeable degree) and better determines the appropriate level of freshness for a given query.
  • Refining official page detection: We try hard to give our users the most relevant and authoritative results. With this change, we adjusted how we attempt to determine which pages are official. This will tend to rank official websites even higher in our ranking.
  • Improvements to date-restricted queries: We changed how we handle result freshness for queries where a user has chosen a specific date range. This helps ensure that users get the results that are most relevant for the date range that they specify.
  • Prediction fix for IME queries: This change improves how Autocomplete handles IME queries (queries which contain non-Latin characters). Autocomplete was previously storing the intermediate keystrokes needed to type each character, which would sometimes result in gibberish predictions for Hebrew, Russian and Arabic.

Friday 2 December 2011

Debunking Common SEO Misconceptions and Myths

Search Engine Optimization can provide the best ROI for your internet marketing budget. Because it is widely misunderstood, and just as commonly misused and abused, there are plenty of misconceptions about SEO that can derail your marketing efforts.

SEO misconceptions

Here are some commonly held beliefs that need some clarification:

“I don’t need on site SEO, just 100 more links”

There is just too much misinformation about SEO out there for you to assume your on site optimization has been done right. Many times we will have a new client tell us it is all taken care of, then we discover their pages are keyword stuffed, title tags are too long or duplicated, no meta descriptions exist, all the internal links say “click here”, content is duplicated elsewhere in the site or stolen from another site, or pages are cloaked or hidden in an attempt to trick the search engines.

Another variation of this is the idea some webmasters have that on site optimization is not necessary and all you need is lots and lots of backlinks. You have figured out how many links your competitor has, and it is only a matter of getting more than them, right? Nope – if it were that easy, anybody with some spare time could make a totally crappy site rank #1 for everything and nobody would ever use search engines.

While inbound links to your site are one of the most important factors in helping your website rank better, quality is becoming increasingly important to Google: raw link counts are becoming less important, and social signals (Likes, +1′s, Tweets, etc.) are becoming more important. Make sure you are creating good quality content, keep your site well optimized, and make it easy for readers to like your site.

“My web designer already optimized my website”

Web designers and developers often don’t know or don’t care what is important to search engines. If the site looks good and all the buttons work, then it is all good as far as they are concerned. There are web designers who know SEO, but they are fairly rare. Unfortunately, most web design classes teach the software and visual design skills, but don’t bother with optimization best practices. I recently had a web design instructor tell me that he doesn’t teach SEO because it “has nothing to do with web development and it doesn’t work”! Um, yeah. Right. Think about that when you hire a web designer who just finished school.

Here are some things that are so basic to search engine optimization that you just can’t call a site optimized if they have not been dealt with properly. These things really should not even be considered as separate from web design since they should be part of any web developer’s best practices when building a website.

Wednesday 30 November 2011

Link Wheel – Powerful & Effective Concept for Natural Link Building

With the Internet era touching new heights, drawing quality traffic to a blog or website has become absolutely necessary to increase its worth. One of the best ways to soar in the search results is to build natural links that Google will simply love. A link wheel is an outstanding link building strategy that imitates natural linking patterns on the Internet, using the power of multiplication to get links from your related niche. Link wheels can boost your link count significantly over a period of time.

Link wheel creation involves interlinking different websites around a central focal point: your parent website. The link wheel process builds links between your parent website and various Web 2.0 properties. By submitting unique and appealing content to these Web 2.0 sites, you can get at least two backlinks to your parent website. Google values such links highly because the content on these Web 2.0 sites is of good quality and related to your specific niche. This strengthens your website’s authority, which you can use to boost your search engine rankings and make your website reach far and wide.

Key Features of Link Wheel Creation Process

  • Our link building strategy makes use of only high quality, relevant websites.
  • We submit content to websites and Web 2.0 platforms with PageRank 4 to 9.
  • Content is submitted only after the client’s approval.
  • The client can choose the target URL, promoting either the home page, any of the inner pages, or both.
  • We offer cost-effective link building packages that are unmatched in the industry.

By combining the link wheel technique with our other link building packages, we can produce phenomenal results for you.

10 Recent Google Algorithm Changes To Be Aware Of

Google makes around 500 changes to its algorithm every year to make search easier and provide users with up-to-date results. Any time there is a change in the Google algorithm, it becomes news. There have been some recent changes, but there is no need to guess what they are, as Google has released the details of the changes and how they impact the search results and rankings of websites in the SERPs.

Here is a brief summary of the recent Google Algorithm changes:

1) For queries in languages for which limited content is available, Google will now translate relevant English web pages and display the translated titles directly below the English titles in the search results. For webmasters this is beneficial, as they can now reach new markets that previously couldn’t be served due to language barriers.

2) The snippets on the search results page now show more page content and less header/menu content. This shows that Google is starting to pay more attention to the actual page content than to headers and menus. Webmasters need to ensure that they are presenting the right content on the right page.

3) Google found that boilerplate links with duplicated anchor text are not as relevant, so these will be given less emphasis. Webmasters need to understand that site-wide linking (in headers, footers and blogrolls) will not lead to any better rankings.

4) Autocomplete predictions in Russian have been improved. Google will no longer make predictions that are very long in comparison either to the partial query or to the other predictions for that partial query. This is already the practice in English, so it is good news for Russian users but changes nothing for English users.

5) Users searching for software applications will see more rich snippets including cost and user reviews, within their search results. For webmasters, it would be useful to add descriptive rich snippets for software applications in order to get a higher ranking.

6) Google retired a signal in Image Search related to images that had references from multiple documents on the web. This tweak is aimed at improving the image search function, but perhaps Google is also trying to reduce the link juice flowing from sites like Flickr, Dailymotion, etc.

7) Google made a significant change to how it ranks fresh content. This change impacted around 6-10% of search results and better determines the appropriate level of freshness for a given query. It will greatly benefit news websites. Webmasters need to update their content on a regular basis in order to rank highly in Google.

8) Google will tend to rank official websites even higher on its results page. This change is intended to provide users with more relevant and authoritative results. Basically, this is good news for long-established brands: official sites will get better rankings, and industry giants get an even stronger hold on search results.

9) Date-restricted queries have been improved to ensure that users get the most relevant results for the specific date range they choose. This means that if your company’s news is in Google News, you can expect it to be given more prominence while the news is still relevant.

10) Autocomplete predictions for Hebrew, Russian and Arabic have been improved. There is nothing specific to do about this; the user experience for non-English-speaking users will simply get better.

Website owners and webmasters need to adjust their work accordingly to get their websites ranking high in search engines. Focus more on the content of the website, as it is being given more importance, and try to use variations of anchor text for your links.