Robots Exclusion Standard

Posted: December 14, 2011 in SEO

When primitive robots were first created, some of them would crash servers. The robots exclusion standard was crafted to let you tell any robot (or all of them) that you do not want some of your pages indexed or your links followed. You can do this via a meta tag on the page
<meta name="robots" content="noindex,nofollow">
or by creating a robots.txt file which tells the robots where NOT to go. The official robots exclusion protocol document is located here:
http://www.robotstxt.org/wc/exclusion.html. The robots.txt file goes in the root level of your domain, using robots.txt as the file name.

This allows all robots to index everything:
User-agent: *
Disallow:

This disallows all robots from your entire site:
User-agent: *
Disallow: /

You also can disallow a folder or a single file in the robots.txt file. This disallows a folder:
User-agent: *
Disallow: /projects/

This disallows a file:
User-agent: *
Disallow: /cheese/please.html

One problem many dynamic sites have is sending search engines multiple URLs with nearly identical content. If you have products in different sizes and colors or other small differences, it is likely you could generate lots of duplicate content which will prevent search engines from wanting to fully index your sites. If you place your variables at the start of your URLs then you can easily block all of the sorting options using only a few disallow lines.

For example:
User-agent: *
Disallow: /cart.php?size
Disallow: /cart.php?color
would block search engines from indexing any URLs that start with cart.php?size or cart.php?color. Notice that there is no trailing slash at the end of the above disallow lines. That means the engines will not index anything whose URL starts with those strings. If there were a trailing slash, search engines would only block a specific folder. If the sort options were at the end of the URL, you would either need to create an exceptionally long robots.txt file or place robots noindex meta tags inside the sort pages. You also can specify a particular user agent, such as Googlebot, instead of using the asterisk wildcard. Many bad bots will ignore your robots.txt file and/or harvest the blocked information, so you should not rely on robots.txt to keep individuals from finding confidential information.
Googlebot also supports wildcards in the robots.txt. For example, the following:
User-agent: Googlebot
Disallow: /*sort=
would stop Googlebot from reading any URL that includes the string "sort=" no matter where that string occurs in the URL.
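If you want to sanity-check your robots.txt rules before relying on them, you can test them programmatically. Below is a minimal sketch using Python's standard-library urllib.robotparser, which implements the original exclusion standard (note that it does not understand Googlebot-style wildcards such as /*sort=); the domain and paths are made up for illustration.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content, reusing the folder/file examples above.
rules = """\
User-agent: *
Disallow: /projects/
Disallow: /cheese/please.html
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A well-behaved robot checks each URL before fetching it.
print(parser.can_fetch("*", "http://example.com/projects/old.html"))  # False
print(parser.can_fetch("*", "http://example.com/index.html"))         # True
```

In a live crawler you would call parser.set_url("http://example.com/robots.txt") and parser.read() instead of parsing a hard-coded string.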

Keyword Value Pyramid

Posted: December 14, 2011 in SEO

One fatal flaw of many SEO campaigns is that people think they need to rank well for one term or a few generic terms. Generic terms may occasionally convert, but most strong-converting search terms are specific.

If you read SEO forums you often see posts about a San Diego real estate agent no longer ranking for a generic term such as ‘real estate’. Since the term is too generic for most of his target market (and his service would not be fitting for most people searching for that term), it makes sense that the search engines would not want to show his site in those search results. As search continues to evolve, it will get better at filtering out untargeted or inappropriate sites. Targeting generic terms outside of your core area means that you need to use aggressive techniques to try to rank. Problems with being too aggressive:
•  Targeting exceptionally generic terms may not add much value since the leads are not strongly qualified. Paying extra to rank for more generic terms may not be a cost that is justified unless you can resell those leads at a profit.
•  Being exceptionally aggressive raises your risk profile and makes your site more likely to fluctuate in rankings when new search algorithms are rolled out.

Keyword Value Pyramid
As you can see from the image, the more precisely we target our specific market, the greater the value we can extract from our sites. I am not suggesting you try to use ‘free online Texas holdem software download’ over and over again, but by scattering those various words throughout your copy you may be able to rank for many related phrases.

Keywords

Posted: December 14, 2011 in SEO

What are Keywords?

Keywords are phrases you want your website to be found under in search engines. They are typically two to five word phrases you expect people to search for to find your website. Corporate climates often force people to refer to things using special phrases, but keywords are not about what you call your stuff.
Keywords are what Joe Average Surfer (or your prospective site visitor) may type in a search box.

Focusing a Keyword:
When people tell you to target the word “free” they are out of their minds. The single word is too general and has too much competition. I just did a search on Yahoo! for “free” and it returned 749,000,000 results. That is over 10% of the web trying to use free as a sales pitch. Single-word keywords are usually not well targeted and are hard to rank for. Longer keyword phrases are easier to rank well for and typically have better conversion rates.

I am not saying that free should not be on your page; it is on most of mine. I am saying that keywords should define the product or idea. Free alone just does not get this done.

Keyword Phrases:
If free isn’t a keyword, then what is? Keywords are typically two to five word phrases you expect people to search for to find your website. What would you expect people to type in the browser to find your site? If you were looking for your product, what would you type? What type of problems does your product or service solve? Those answers are likely good keyword phrases.

Keyword Length:
A longer search phrase is also associated with better targeting and increased consumer desire. Some people say shorter keyword searchers are shoppers and longer keyword searchers are buyers. As you add copy to your pages, you become more likely to appear in search results for phrases related to your keywords that do not exactly match your more general keywords. Most good keyword phrases are generally 2 to 5 words.

 

Here are some Google AdWords optimization tips that will increase your click through ratio and lower your cost per click.

  • Divide your keywords into campaigns. Each campaign could refer to a specific product category or promotion. This will let you set different options for each campaign and obtain more accurate return on investment (ROI) data
  • Only target customers who speak your language. For example, if your website is only in English, then only target users who understand English
  • Only target countries where your customers are. If you find that the majority of your customers are from the U.S. and Canada, then only select those two countries for your ad to appear in.
  • Enter as many targeted keyword phrases as possible. Don’t make assumptions on whether a keyword phrase will attract clicks
  • Include misspelled keyword phrases. I find they produce the highest click through ratios as they typically have little competition
  • Break down your keyword phrases into groups of similar keywords that can be targeted using the same titles. This will produce higher click through rates and a lower cost per click
  • Use “Exact” and “Phrase” match to improve your click through ratio and lower your cost per click. For example:

[search engine optimization] – Exact match
”search engine optimization” – Phrase match
search engine optimization – Broad match
-search engine optimization – Negative match

  • Include negative keywords to avoid click throughs from people you don’t want to attract. For example, all software promoters should include keywords such as free, hack, crack, warez, etc., which are entered by people looking for free or hacked versions of the software
  • Minimize the use of “stop words” such as “in, on, at, from, and” in your ads
  • Set your initial cost per click bid as high as you can afford. This will help to get as high a click through rate as possible. After you have achieved a reasonable click through ratio, you may lower your bid. The high click through rate will help you to achieve a higher position for your bid
  • Set a high enough bid to at least get your ad into the top 8 positions, or the first page. If possible, try to bid for the top 2-3 positions to get your ads “above the fold” – visible content that does not require the user to scroll down the page. This will help you to maximize your click through ratio
  • For extremely popular and competitive keywords that cost far too much for the top positions, don’t be afraid to settle for low cost bids that give you rankings in the 40-50 range. They can and do attract click throughs
  • Enter a reasonable daily campaign budget. Make sure it is sufficient; otherwise your ads may not appear every time your keywords are searched
  • Uncheck the “content sites in Google’s network” checkbox, if you do not want your ads to appear in content-targeted pages. People have reported that content-targeted ads generally attract a lower click through and ROI ratio. Please note that by choosing not to display your ads in the content sites in Google’s ad network, your ads will not appear in the Google Directory and Google Groups, aside from other highly trafficked sites
  • Always include the main keyword phrase you’re targeting in the ad title. If the keyword phrase doesn’t fit onto one line, let it continue on the second line. When a user searches for the keyword phrase you’re targeting, it will be highlighted in bold and will help to draw attention to your ad
  • Always test 2 or more ads simultaneously. Google will automatically display more of the better performing ad. After a while, replace the lesser performing ad with a fresh one. Continue this process in your never-ending quest to improve your click through ratio and lower your cost per click
  • Test ad groups by pausing them to see what effect it has on your conversion ratio and return on investment. Add the ads back into the system by selecting the “Resume” option
  • Test ads by changing the title, description, and displayed URL. Also try changing just one word in an ad. You could be surprised by the results. I have had ads perform up to 30% better just by changing one word in the ad
  • Don’t be afraid to pause and delete ad groups and campaigns that are not producing a return on investment
  • I have seen Google AdWords ads appear in the 2 premium positions that are usually reserved for large advertisers. These positions typically produce higher click through ratios. So it can pay to bid for the top rankings
  • Optimize your return on investment by watching your average cost-per-click, not your maximum cost-per-click. Quite often you can drive more click throughs by setting a much higher maximum cost per click, yet keeping your average cost per click (CPC) low

For more Google AdWords optimization tips from Google, visit

http://adwords.google.com/support/aw/bin/static.py?hl=en&page=tips.html

Major search engines give a great deal of relevance to keywords in the TITLE tag. As such, you should always include the Title tag along with your most important keywords on every page.

In general, I have found that short keyword rich title descriptions work best, as this maximizes the keyword density.

Here are some examples of good title tags:

<TITLE>WebPosition Gold Review</TITLE>

<TITLE>Canon Digital Camera</TITLE>

<TITLE>Cheap Flights</TITLE>

And examples of poor title tags:

<TITLE>WebPosition Gold – Search Engine Optimization Software</TITLE>

<TITLE>Up To 50% Off Digital Cameras</TITLE>

<TITLE>Cheap flights, tickets, air travel and much more!</TITLE>

You can determine your page’s keyword density using the following procedure:

  • View a webpage in your web browser.
  • Right click on your mouse and select “Select All” to highlight the text. Right click again and select “Copy” to copy the text to your clipboard.
  • Open your text or Word editor. Copy the text to the document.
  • Select the “Word Count” option (Most text editors have it). In Microsoft Word, the option is under “Tools > Word Count…”
  • Run a find and replace procedure by putting your keyword phrase in both the find and replace areas. Then select “Replace All.” In Microsoft Word, the option is under “Edit > Replace…”

The program will search for your keyword phrase entered in the “Find” input box and replace it with the keyword phrase in the “Replace” input box, which in our case will be the same. It will tell you how many times the keyword has been replaced.

  • Divide the keyword replaced count by the total number of words on your page to determine your page’s keyword density. For example, if your keyword replaced count is 3 and there are 100 words on the page, your keyword density ratio is 3 percent.

Keyword density is the ratio of targeted keywords contained within the total number of indexable words within a page.

For example, if a page has 100 words in total and of those 100 words 3 words are your targeted keywords, then the keyword ratio is 3% (3 divided by 100).

In general, I suggest using a keyword density ratio in the range of 1-3%. For specific search engines, it depends on which search engine you are targeting, as different search engines have different preferences regarding keyword density.
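The manual counting procedure above is easy to automate. Here is a minimal sketch in Python; the keyword_density function name and the sample page text are my own inventions for illustration, following the same definition used above: phrase occurrences divided by the total word count.

```python
import re

def keyword_density(text, phrase):
    """Keyword density as a percentage: phrase occurrences / total words * 100."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    total, n = len(words), len(phrase_words)
    if total == 0 or n == 0:
        return 0.0
    # Slide an n-word window across the page and count exact matches.
    hits = sum(words[i:i + n] == phrase_words for i in range(total - n + 1))
    return 100.0 * hits / total

# A 100-word page in which the phrase appears 3 times -> 3% density.
page = ("cheap flights " * 3) + ("travel deal " * 47)
print(keyword_density(page, "cheap flights"))  # 3.0
```

This matches the Word-based method: the sliding-window count plays the role of the “Replace All” counter, and the regex word split plays the role of “Word Count.”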

Keyword frequency refers to the number of times a keyword, or keyword phrase, appears within a page. The theory is that the more times a keyword or keyword phrase appears within a page, the more relevant the page is for that keyword. But do not overdo it, by plastering a webpage with the targeted keyword. Search engines frown upon keyword spamming.

Search engine optimization (SEO) should play a major part in the planning of any website. The sooner you incorporate SEO into your website, the better. All too often, people only think about SEO after they have built and launched their site. By then they would have lost a lot of SEO time and may have made decisions that are detrimental to their SEO efforts.

So what part does SEO play in the planning stage of a website?

Website Domain Name

Search engines do take into account keywords in the domain name. So I highly recommend registering a domain name containing your most important keywords. However it is also important that the domain name is memorable and brandable. Please avoid domain names like, “AAA1-Cheap-Domain-Names-Registrations.com.”

Website Structure

A website should be structured for optimum usability and linkability.

The usability of a website is essential to the success of a website. It doesn’t matter how great your content is, if your visitors have problems or get frustrated trying to find the information they’re searching for. Typically, the usability of a website is managed by the web designer.

Linkability refers to the internal linking structure of a website. It is vitally important that webpages are linked in such a way that maximizes the Google PageRank of each webpage, because the rankings of each page depend on it.

This is why I recommend employing search engine optimization strategies as soon as you start planning your website.

Website Navigation

Site navigation is one area that few web designers worry about with regard to search engine optimization.

For example, some search engines don’t crawl deeper than the top two or three levels of a website. So unless you use SEO techniques to help the search engine spider go deeper than the top levels of your site, your lower level pages may never get indexed.

For example:

http://dir.yahoo.com/

Search engine spiders that don’t index more than the top 3 levels will not index pages at level 4 or deeper.

The best way to get all your pages indexed is to lead search engine spiders to all the major areas of your site. To do this, use a site map. Visit the “What Are Site Maps?” section for more information.

Website Categories

To attract the most number of targeted visitors, you must offer products, services and content that people want or need. That’s obvious right?

So you must choose the right topics to target when creating your site. Splitting your site into the wrong categories, or not knowing which categories to target, could cost you a lot of potential visitors and customers.

For example, let’s say your website sells baby products. Did you know that more people search for baby names than baby products?

With this knowledge you should create a resource to attract these people. After all, if someone is interested in baby names, there’s a good chance they will also be interested in baby products, right?

Good topics include major product categories, as well as major brand names. Some people may search for a digital camera by entering “digital camera” into a search engine. Whereas others may know of the brand they want and enter the brand name, such as “Canon digital camera.” You should create categories that cater to both these types of search engine users.

Webpage Content

The content of a page is the most important aspect of search engine optimization. It is the keywords contained within the page content that makes or breaks a page’s chance of top search engine rankings. Let’s take the two extremes.

  • A webpage has no text whatsoever, as in the case of some Flash pages. Search engines will not find any keywords to index.
  • A webpage contains lots of text, with dozens, even hundreds, of different keywords on different topics. Search engines would find it difficult to categorize the page, as there are simply too many competing keywords.

As you can see, not only is the content of a page important, but the amount of content as well.

The inherent problem with web designers is that they generally don’t understand, or even care about, the importance of creating a search engine optimized website. After all, they’re web designers, not search engine optimizers.

So it’s up to you to make sure your website designers produce a site that offers a balance of aesthetically pleasing design and search engine marketability.

Links (Also known as Hyperlinks)

After the content of a page, links are the most important aspect of search engine optimization. Some would argue that links are more important. Maybe, but we could argue that issue all day.

Search engines, such as Google, base their ranking system on the link structure of websites. In general, the more links pointing to a site, the higher it should appear in the search engines. But this isn’t always the case.

The link text is just as important as, if not more important than, the link itself. But web designers would rather use aesthetically pleasing graphic buttons than plain, simple text links. The problem is that search engines cannot associate keywords with such links, as there is no link text to analyze.