Saturday, November 28, 2009

SEO for Landing Pages

  1. Keep the inquiry form short.
  2. Contact details and phone numbers.
  3. Brand logos of services / products / clients.
  4. Show testimonials, benefits / results.
  5. Special offers / discounts.
  6. Use short sentences (good grammar).
  7. Related services / products & information.
  8. Free newsletter, information, book, or software.
  9. Live chat.
  10. Additional qualities of your product, services, team members.
  11. Bookmark this site.
  12. Why our company / services / product.

Friday, November 20, 2009

SEO SEM Check List - Avanindra

1.Valid XHTML

Validate your document through the W3C Markup Validation Service - http://validator.w3.org/

2.Title Tag

Every page needs a distinct title tag that defines what that page is about. It should also contain the keywords for that page in a logical, coherent manner, not just randomly placed in there. Build relevancy; matching title and H1 tags are a very wise idea.

This tag belongs in the head section of your page and should appear before any other tags there. It is THE MOST IMPORTANT thing you'll put on the page, so make it count and make it relevant!
* Example:
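A sketch of what that might look like, for a hypothetical page about an SEO checklist:

```html
<head>
  <title>SEO Checklist - Step by Step Guide to Better Rankings</title>
</head>
```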

3.Description

Under the title tag each page should have a clear and relevant description summarising what that page is about.
* Example:
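For instance, a meta description for the same hypothetical page might read:

```html
<meta name="description" content="A step-by-step SEO checklist covering title tags, descriptions, keywords, sitemaps and more.">
```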


4.Keywords
Every page needs the keywords tag which, as you might have guessed, contains the keywords your page is relevant to.

* Example:
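A sketch, using hypothetical keywords:

```html
<meta name="keywords" content="seo, seo checklist, search engine optimisation">
```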


5.Alternate media / Mobile version
Take a look around: people are surfing everywhere, and making your site accessible to them gives them access to your content wherever they are, without the need to be tied to a desk.


A staggering number of people are browsing the web on mobile devices: iPhones, BlackBerrys, Palms, Windows Mobile and a whole plethora of smartphones. If you've got nice clean markup and haven't bloated your site with Flash and JavaScript, the chances are it may well work on a mobile device.

You can then tell the search engine spiders about your mobile site using the link rel tag and setting the media as handheld.
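That tag might look something like this (the mobile URL is a made-up example):

```html
<link rel="alternate" media="handheld" href="http://m.example.com/">
```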


Mowser rocks: if your site shows Google AdSense, it will convert your normal AdSense ad units into 'AdSense for Mobile Content' ad units. Way cool.

A more advanced way to deliver a mobile site would be to detect any mobile phones that visit your site so you can serve them a custom experience suited to their handset.
* My mobile device detection function written in PHP - http://detectmobilebrowsers.mobi/
* Check how well your mobile site might perform - http://ready.mobi/


6. Navigational structure
A site that is easy to navigate from a human point of view is easy to navigate from a spider / search engine point of view. Think of your site as a virtual town: each page is a street, and in order for people to find that street you need to ensure you have easy-to-understand signs (links) pointing out where they are.



7.Popularity filtering / Folksonomy

Show links to your most popular content on every page, or make them a big part of your navigation, to place emphasis on the pages that are being viewed the most. Do this with sales too, and aggregate your internal search queries.


* See the Wikipedia article on Folksonomy for a better understanding of the concept: http://en.wikipedia.org/wiki/Folksonomy



8.Alt tag on images

To add relevancy to your images you should use the alt attribute to specify some short, descriptive, keyword-rich text. This will show in place of the image whilst the image is loading, and will be shown instead of the image when the visitor can't view images, either because they've got a visual impairment and are using a screen reading tool or because they disabled images in their browser settings.

Also, using relevant file names for your images will help build relevancy: image1.jpg doesn't say anything of any value, whereas seo-checklist.jpg speaks volumes!
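Putting the two together, a well-optimised image might be marked up like this (file name and alt text are hypothetical):

```html
<img src="seo-checklist.jpg" alt="SEO checklist diagram">
```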

9.Title on hyperlinks

Adding a title attribute to a hyperlink will show whatever you put inside that attribute in a little bubble when the user places their mouse over the link. This is another way of reinforcing the relevancy we're trying to build.

* Example: SEO Checklist
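In HTML, that example might look like this (the URL is hypothetical):

```html
<a href="seo-checklist.html" title="SEO Checklist">SEO Checklist</a>
```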



10.Keywords in page / file names

If you have a page all about SEO tips but you called the file page_6.htm there is no relevancy; that page would hold better value if it were named seo-tips.htm.



Try to make your file names and URLs easy to understand and read from a human point of view. /seo-tips.html is easy for the mind to understand; /page_6.php?parameter1=2&parameter2=1 is just confusing for humans.

Firefox has the most awesome search facility in the address bar; making URLs and file names easy to understand and keyword rich makes it easier for users to find you again.



11.Keywords in domain
Try to include your keywords in your domain name putting the most important keywords first if you can.

A site about SEO Tips that is hosted at www.pettravelinsurance.net is giving off very mixed signals indeed. It would be much better off hosted at www.seo-tips.com or www.seotips.com.


12.Keywords in links both internally and externally
Make sure all your links work and the keywords used in the URL, title and anchor text all tie up with the content of the page.
Remember, if you're linking to internal pages try to use the same text in the link anchor as what appears on the page in the title tag.


A quality link on your site will contain all the elements we've just covered. Here's an example of a bangingly good hyperlink:
* Example: SEO Checklist
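In HTML, a link like that might look as follows (the path is hypothetical):

```html
<a href="http://seochecklist.mobi/seo-checklist.html" title="SEO Checklist">SEO Checklist</a>
```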

As far as a search engine goes it isn't going to have any issues at all working out what that page is going to be about.

It contains keywords in the URL, keywords in the title and keywords in the anchor text. How could even an idiot fail to understand what the page on the end of that link is about!?

That link above points to a page with a title of 'SEO Checklist' and an H1 tag saying 'SEO Checklist' - everything ties up perfectly.


13.robots.txt

robots.txt is a small text file that tells search engine spiders which parts of your site you want them to view and which parts you don't want them to view. Create a plain text file called robots.txt and upload it to your main www / htdocs / public_html directory.

This is the contents of the robots.txt file I uploaded to http://seochecklist.mobi/robots.txt

User-Agent: *
Allow: /
* robotstxt.org has this page http://www.robotstxt.org/robotstxt.html which again is a perfect example of keywords in a domain and in the filename and explains robots.txt in a nutshell
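To keep spiders out of a section of your site you can add Disallow lines; a sketch (the directory name is hypothetical):

```text
User-Agent: *
Disallow: /admin/
Allow: /
```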



14.XML sitemap

XML sitemaps help search engines discover content they might not otherwise have found during their regular spiderings. Publishing a sitemap with a comprehensive list of all the pages on your site is a way to directly tell the engines what you have and where it is!
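In its simplest form, a sitemap following the sitemaps.org protocol looks something like this (the URL is hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
</urlset>
```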

Google has a good article on XML sitemaps, and after all it's traffic from Google you're most wanting, so you know you're gonna click it!


15.CONTENT IS KING!

Always write your content for humans not spiders.




16.RSS feeds

RSS stands for Really Simple Syndication and is a pretty simple way to syndicate your content so it's easy to publish on other people's sites. Each site that uses your RSS feed earns you near real-time backlinks, and feeds can be harvested by Google, Outlook, and a number of sites and applications, allowing users to connect to you through updates without having to check your site every day. I monitor a number of feeds in my Outlook; it lets me scan the headline and content to see if I want to click through to read the page behind it all.

RSS is a valuable tool for both getting links on other sites and getting your content into the inbox of your visitors.

For stats on how many times your feed is being accessed there's a free service available over at http://feedburner.com/ - take a wild and random guess who owns it now? I'll give you a clue: it begins with a G and ends in oogle. It begs the question: do they rank the sites behind the most popular feeds higher?


17.Social bookmarking tools

Social bookmarking tools will easily let your visitors bookmark and share your site. This not only makes it easier for them to come back again but adds to the number of inbound links pointing to your site, and should increase your traffic.


Adding a free tool like this makes it easy for your visitors to bookmark your site on popular sites like Twitter, Facebook and many, many more. Whilst social networking can in this case be great and a traffic bringer, it's something I've personally sacked off in favour of actually phoning and seeing people instead; I've reclaimed many hours and liberated myself from being sucked into what I feel is total time-killing junk.


This service also works on mobile devices, allowing visitors to your mobile site to share and bookmark you on the mobile versions of the most popular social networking sites.


18.Publish fresh content often (get spidered more often)

Google and the search engines love fresh content. If you publish new content often they'll be excited by it, they'll spider you more often and be prone to send you more visitors. There's nothing exciting about a site that's stagnating and was last updated in March 2001. Fresh content to spiders is like the smell of freshly baked bread to us humans: they love it, it gives them an appetite for your site and they'll want to feast on your new content.

Using dynamic methods like aggregating the most popular content and showing it is a way of emulating freshness, in that the engines will see your site update often with relevant keywords. Whilst nothing compares with freshly squeezed content, dynamic changes on the site can be viewed as a good thing.



19.Tweet when publishing content

When you publish new content, push a Tweet to Twitter; freshly published content brings in instant visitors thanks to it being Tweeted.

* Twitter API - http://apiwiki.twitter.com/


Personally I can no longer stand Twitter; I agreed with David Cameron when he said 'Too many tweets make a twat' on Absolute Radio. However, from an SEO perspective it has benefits: millions and millions of users, trending topics, hashtags and a growing number of sites and applications on mobile that let you search Twitter mean that pushing your links to it is invaluable.


20.Blog when publishing content

Most if not all blogging or web publishing platforms include an option to ping a number of sites to tell them that you've published fresh content. You can integrate your own ping service when publishing content, though it would be simpler to run a relevant blog about your site and publish a post when you've just released fresh content.
* See this post for a list of popular ping services: http://seoblackhat.com/2005/09/05/rss-ping-list/




21.Place your keywords in many tags

H1, H2, H3, LI, P, B - I don't mean stuff the keywords into every tag you can; I mean use your keywords wisely in a number of tags. H1 is the most important tag after the title tag, and having your keywords in a number of tags, in bold and in links just reinforces your relevancy.
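A sketch of keywords used across several of those tags (page and file names are hypothetical):

```html
<h1>SEO Checklist</h1>
<p>This <b>SEO checklist</b> walks through the basics of
<a href="seo-tips.html" title="SEO Tips">SEO tips</a> step by step.</p>
```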


22.Don't build your entire site in flash unless you don't want traffic!

Spiders find it hard to understand what Flash is saying. Yes, it might look damn pretty and swirly and do funky things when you move your mouse around, but if it can't be read by search engine spiders, those on screen readers or mobile devices, there's not much point in it.

Keep Flash to a minimum: use it for eye candy and funk, but don't build a whole site with the stuff. Spiders can't read it, there's no keeping of state between pages, and navigating Flash can be a nightmare sometimes.



23.Follow Webmaster Guidelines
* Google Webmaster Guidelines
* Bing / MSN Guidelines for successful indexing


24.Don't link to junk

Only ever link to external websites that are QUALITY. Many SEO folks would say never link to anything that's less than PageRank 3, but really, if it's well-written, relevant content with relevant anchor text on your page, you're onto a winner.

Simply never link to crap. The links away from your site are as valuable as the internal ones. You don't want to send traffic to a junky site, as the visitor you're sending there is going to think less of you for doing so; they'll class you as being less authoritative than they did before they clicked the duff link, and search engines will view links to junk in the same way.

Only link to relevant, high-quality pages and sites. Always use anchor text in those links that brings benefit to your page and has relevancy to the page you're linking to.



25.Publish content that is worth linking to

You need to be publishing content that is rich enough in quality that other site owners will want to link to it. If you can do that, then the long hard slog of a job that is link building will be much easier. If you publish content that is worth linking to, and make it easy for folks to share the content and link to the page through social bookmarking services like addthis.com, you're even more likely to win.



26.Avoid duplicating pages / duplicate content sucks!

Duplicate content sucks; all the search engines hate it. Near the very top of this list it mentions having UNIQUE title tags - UNIQUE is the important bit here. Make sure all your pages are different; it shows a well-organised site, and face it, as a human, if you clicked a link on a page and it took you to an identical page you'd be a little confuzzled. The same applies to the bots. Don't confuzzle your users and don't confuzzle the bots. (Confused and puzzled, if you're on the other side of the irony filter that is the Atlantic.)



27.Host all images and media files under your URL

Don't leech off other folks' bandwidth; host all the images on your site on your own server. This gives you control over the file names so you can optimise them to suit your needs, and it also makes you look more authoritative than a site that leeches its content from another source.


28.Use Google alerts to monitor your keywords and contribute where you can

Google Alerts is an invaluable free service that lets you keep an eye on what's coming up on the Google radar for a given search term. This then lets you check out that page, and if the content is relevant and there is the opportunity to publish a comment, you should do your best to add something valuable to earn a backlink.

Note the word contribute. Don't spam - no, seriously, don't spam. It's a bad habit and yields no rewards. Contributions that are valid and relevant bring something to the table, and as such the author often gets a credit via a link. Some would protest that blogs use nofollow on external links, but the real thing about getting traffic is that you want humans and real users - funny, eh, real users and humans read blogs. Contribute, get your link there regardless of nofollow, and it will bring you traffic. Contextual contributions are fine; link spam should be a shootable offence.

Another reason not to spam is Akismet, a spam filter for the WordPress blogging system. If your URL is added to that, your comments on any WordPress blog will go to trash and never be published - consider it bad karma for link spamming.
* Create a Google Alert - http://www.google.com/alerts



29.Use Google Analytics to monitor your traffic, conversions and keywords

Because if you don't, how do you measure your success? Making money is one thing; knowing what your money terms are and where the money comes from is another.


30.Use Google WebMaster Tools

It will tell you which pages have duplicate titles, which URLs are broken, and much more.


31.Use Google AdSense

It brings in bots and forces Google to browse your content in order to show relevant ads. Whilst they deny giving preference to sites that show AdSense, getting Google to come by and check out your site so they can show relevant ads is not only priceless in my mind but a revenue generator too, which you could reinvest in marketing your site.


32.Don't try anything daft or dodgy

Google is smarter than you are; they'll bust your ass if you try to blag them. They know, and they can bin you from the results, so it's better to play it safe, be honest and be in it for the long game than to try to sneakily get to the top of the results only to be blown out of the water at a later time. I refer to the point above about webmaster guidelines: it's better to follow them than break them.

Monday, November 9, 2009

Basic SEO activity

Top Search Engines
• AOL Search, Ask Jeeves, Bing (MSN), Google, Yahoo! Search


Pay Per Click Search Engines
• Google AdWords, MSN adCenter, Yahoo! Search Marketing

Meta Search Engines
• Dogpile, Excite, Mamma, MetaCrawler, WebCrawler





"Try to use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text contained in images."

Text Links
* What are text links?
* What do text links have to do with search engines?
* How do text links improve search engine rankings?
* How do I get people to link to my web site?
* What is link traffic?
Here are some examples of the underlying HTML code for a text link:
homepage
Google
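Those two links, written out in full, might look like this (the homepage file name is an assumption):

```html
<a href="index.html">homepage</a>
<a href="http://www.google.com/">Google</a>
```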
Text links help search engines understand the topic of a link destination. Search engine spiders don't recognize text contained in images. Although there are ways to help spiders understand the content of image links (as explored in the Image Links guide), text links are the preferred method.
What is link traffic?
Link traffic is traffic generated by links to your website. So getting people to link to your website isn't just about improving search engine rankings. It also generates traffic.

Anchor Text (Variations, misspellings)
* What is anchor text?
* Should I vary the anchor text?
* Should I include misspellings in my anchor text?
* Britney Spears Spelling Correction
Anchor text is the visible, clickable, descriptive text of a hyperlink, located between the opening and closing HTML anchor tags.
For example:
anchor text
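Written out in HTML (the URL is hypothetical):

```html
<a href="http://www.example.com/">anchor text</a>
```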
Should I vary the anchor text?
Yes, you should vary anchor text. This is because, as search engine optimizers learnt that anchor text helps improve search engine rankings, they inevitably built an incredible number of webpages linking to their websites using identical anchor text - a pattern the search engines can now spot and discount.
It may seem odd but it's actually a good idea to include misspellings in your anchor text, especially if the anchor text isn't easy to spell correctly.

Image Links (image maps, alt text)

* What are image links?
* What is "Alt Text"?
* What are image maps?
* Should I stay away from using image maps?
What are image links?
An image link is a graphic image that typically links to another online document, such as a webpage, PDF file, video, Excel spreadsheet, etc. Like text links, image links can also point to another location on the same page.
What is "Alt Text"?
Alt text is "alternative text" that is placed in the code for an image. It is displayed when the mouse cursor hovers over the image (fig. 1). The text is also visible if an image is unable to load (fig.2). Alt text is especially useful to people using screen readers.

What are image maps?
An image map is a graphic image containing one or more invisible clickable regions, called hotspots, allowing several hyperlinks from a single image.
For example, an image could be a map of the World with each continent split into different clickable hotspots. When a visitor clicks on one of the hotspots, they are taken to the respective online document.
Should I stay away from using image maps?
Many search engines will not crawl links inside an image map because of the potential for spam abuse. But that doesn't mean you should stay away from using image maps. To ensure search engines crawl the links contained within an image map, you should also duplicate the links as conventional text or image links somewhere else on the page and/or in a sitemap.
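A sketch of an image map with two hotspots (all file names and coordinates are made up):

```html
<img src="world-map.gif" alt="Map of the World" usemap="#world">
<map name="world">
  <area shape="rect" coords="0,0,150,100" href="europe.html" alt="Europe">
  <area shape="rect" coords="150,0,300,100" href="asia.html" alt="Asia">
</map>
```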


PageRank (increase PageRank, attribute)
* What is PageRank?
* Are all pages in a website given the same PageRank?
* How do I check my website's PageRank?
* How do I increase a webpage's PageRank?
* Can I buy PageRank?
* What type of links increase PageRank?
* What type of links don't increase PageRank?
* What is the rel="nofollow" attribute?
* What is PageRank leakage?
* Should I use rel="nofollow" to preserve PageRank?
* Are there links that can actually decrease PageRank?

What type of links increase PageRank?
PageRank is passed onto a webpage if the page with the outbound link uses HTML code that has a direct link to the link destination.
For example:
Outbound Link
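Written out in HTML (the URL is hypothetical), a direct link like this passes PageRank:

```html
<a href="http://www.example.com/page.html">Outbound Link</a>
```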
What type of links don't increase PageRank?
PageRank is not passed on if the link doesn't point directly to the link destination. Examples of links that don't point directly to the link destination include most affiliate links, traffic tracking links, links that point to a webpage on the same website as the outbound link which then redirects the user to the link destination (this is often used to hide the real URL of the website it is linking to), and so on.
Other links that don't help increase PageRank are links from webpages that either have a PageRank rating of zero (for example, new webpages), or don't have a PageRank rating because Google's spider cannot crawl and index them. These include dynamically generated pages, such as search engine results pages (including paid inclusion pages), online databases, and password protected pages.
What is the rel="nofollow" attribute?
Adding the rel="nofollow" attribute to a hyperlink instructs search engine spiders not to count the link. The attribute was introduced in 2005 to help combat blog comment spam. The organizations that support the technique include Blogger, blojsom, Blosxom, Buzznet, Flickr, Google, MSN (MSN Search, MSN Spaces), Scripting News, Six Apart (TypePad, MovableType, LiveJournal), WordPress, and Yahoo! Search.
For example the following link:
This is a great site
Becomes:
I link to this site, but I'm not endorsing it
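Written out in HTML (the URL is hypothetical), the before and after look like this:

```html
<!-- A normal link, which is counted by the search engines: -->
<a href="http://www.example.com/">This is a great site</a>

<!-- The same link with the nofollow attribute added: -->
<a href="http://www.example.com/" rel="nofollow">I link to this site, but I'm not endorsing it</a>
```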
Be wary of websites trying to deprive you of PageRank by including the rel="nofollow" attribute in their links to your site.

Are there links that can actually decrease PageRank?
Yes! Google recommend that you avoid linking to web spammers or "bad neighborhoods" on the web, as your own ranking may be affected adversely by those links.



Link Popularity (authority pages, link quality, quantity, relevancy, reputation)
* What is link popularity?
* What is link quality?
* What is link relevancy?
* What is link reputation?
* How do I check my website's link popularity?
* Why do Google, Yahoo and MSN backlink stats vary so much?
* How do I increase link popularity?
* Should I get deep links to subpages?
* What are authority pages?
* What type of links don't increase link popularity?
What is link popularity?
Link popularity is a measure of the number of links pointing to a webpage, without any regard for the quality of those links.
What is link quality?
Link quality refers to the quality of a link which takes into account a number of factors, including the importance (i.e. PageRank) of the linking webpage, the relevance of the anchor text, and whether the linking webpage is of the same topic as the link destination.
What is link relevancy?
Link relevancy refers to getting relevant sites to link to yours. Here's what the Google Webmaster Guidelines recommends, and I quote:
When your site is ready:
• Have other relevant sites link to yours
Sounds obvious, right? Top ranked sites have hundreds, if not thousands, of links to their site. How many does yours have?
What is link reputation?
Link reputation refers to the quality or "importance" of the webpage providing the link. In general, PageRank could be used as a measurement of the reputation of a webpage.
How do I check my website's link popularity?
You can actually check the link popularity of every webpage in your site. The simplest method is to go to a search engine and enter "link:" followed by the URL you wish to check.
For example:
link:http://www.yoursite.com
link:http://www.yoursite.com/subdirectory/
link:http://www.yoursite.com/subpage.html


How do I increase link popularity?

You can increase link popularity by getting more websites to link to yours. Easier said than done, right? Well, try these link building strategies. Click on a topic for more information:
• Sitemaps - XML, Google, generators, Yahoo!
• Linkbait - create content and web tools to attract links
• Link Seeding - plant links in forums, blogs, web directories
• Sponsored Links - buy advertising on other websites
• Link Exchange - exchange links with other websites
• Affiliate Links - set up a direct linking affiliate program
• Link Brokers - buy text links on other websites
Should I get deep links to subpages?
Yes. Every link to a webpage, be it the homepage or a subpage, is a good link. They all help increase PageRank, and drive traffic. Sometimes, it's more relevant for another website to link to a subpage so deep linking to subpages should always be welcomed.
What are authority pages?
Authority pages are webpages that are considered an "authority" on a particular topic. In theory, links from authority pages are considered more valuable than links from other pages.
What type of links don't increase link popularity?
Similar to PageRank, if a link doesn't point directly to the link destination, search engines don't crawl and index the link and therefore don't count the link popularity. These include dynamically generated pages, such as search engine results, some web directory links, most affiliate links, traffic tracking links, links that point to a webpage on the same domain which then redirects the user to the link destination (to hide the real URL of the link), online databases, and password protected pages.


Sitemaps (XML, Google, generators, Yahoo!)
* What are sitemaps?
* What are HTML sitemaps?
* What are XML sitemaps?
* What is Google Sitemaps?
* Which sitemap formats does Google support?
* What is a sitemap generator?
* Can you recommend a sitemap generator?
* How do I validate my Google sitemap?
* How do I submit a sitemap to Google Sitemaps?
* What are text sitemaps?
* How do I submit a sitemap to Yahoo?
* Which sitemap formats does Yahoo! support?
* How do I submit a sitemap to MSN?
* Do sitemaps guarantee my webpages will be included in a search engine?
* Do sitemaps improve search engine rankings?
* Does the position of a URL in a Sitemap influence its use?
* Do I need to remove session IDs from URLs?
* Should I include the frameset URLs or the URLs of the frame contents?
What are sitemaps?
DEFINITION
Sitemaps and Site Maps are used interchangeably.
Sitemaps are used by webmasters to inform search engines about pages on their sites that are available for crawling. Search engine spiders usually discover webpages from links within a site and from other sites, but they sometimes don't index all pages, especially if a site has a large number of pages. Sitemaps supplement this data.
What are HTML sitemaps?
NOTICE
This page primarily discusses XML and text sitemaps that are supported by the major search engines; Google, Yahoo! and MSN.
An HTML sitemap is a collection of hyperlinks, sometimes with descriptions, on one page or spread across several pages. Sitemaps help visitors find what they're looking for quickly, while at the same time helping search engine spiders crawl and index all the pages within the site.
What are XML sitemaps?
In its simplest form, an XML sitemap (see the Google Sitemap Protocol) is an XML file that lists URLs for a website. It may also include additional metadata about each URL (date of last update, how often the page is updated, and how important it is relative to other URLs in the site) to help search engines crawl and index the website more intelligently.
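A sketch of a sitemap entry with the optional metadata fields (all values are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/seo-tips.html</loc>
    <lastmod>2009-11-09</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```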


What is Google Sitemaps?
Google Sitemaps is a suite of webmaster tools offered by Google to provide webmasters a free and easy way to make their site more Google-friendly. Google Sitemaps shows you how Google crawls and indexes your site, and specific problems it might be having accessing it. You can also discover which search queries drive traffic to your site, and which version of Google is driving that traffic.
DEFINITION
RSS and Atom are XML formats for sharing content on the Web.
Which sitemap formats does Google support?
Google supports XML, RSS (Really Simple Syndication) 2.0, Atom 0.3 feeds, OAI-PMH (Open Archives Initiative Protocol for Metadata Harvesting), and a text file containing a list of URLs.
What is a sitemap generator?
You could create a sitemap by hand by following the sitemap protocol, but it would be a really tedious task, and you will probably miss a few links. A better method is to use one of the many sitemap generators available. Google offer a free sitemap generator but it is probably too complicated for most people to install. The installation instructions certainly put me off giving it a try. Luckily there are plenty of downloadable and online sitemap generator tools offered by third-parties.
Can you recommend a sitemap generator?
I've only tried a few free, downloadable sitemap generators, but here are my opinions on those I've tested:
• G-Mapper is a Google and Yahoo! sitemap generator for static websites of all sizes. The main problem with G-Mapper is that it doesn't give you the option of adding filters; you have to manually delete URLs after they've been crawled, which is a real pain.
• Gsitemap is another sitemap generator tool. The main problem with Gsitemap is that it only generates Google sitemaps. It doesn't generate sitemaps for Yahoo!.
• GSiteCrawler is the sitemap generator I ended up using mainly because it offered all of the basic features, including Google sitemap and Yahoo! URL text file generator. But what sold me were the extra features that made it easy to use. For example, the "Import Robots.txt" feature meant I could quickly filter out all unwanted directories and pages with one click.

The other nice feature is that you can edit the metadata of one or more (individual or group) URLs at the same time, which is something not available in some of the other sitemap generators I tested. GSiteCrawler is free, although it does display a nag screen asking for a donation every so often.
The above applications are all for Windows computers only. Mac users may like to try Map-IT SiteMapper ($19.95 and also available for Windows) and RAGE Google Sitemap Automator ($29.95).
I found the above sitemap generators in Google's Sitemaps Third Party Programs & Websites guide. For a full list of the third-party sitemaps programs & websites, visit the Google page.
How do I validate my Google sitemap?
You can validate your Google XML sitemap with the Google Sitemap Validator by Validome. I'm glad I did as I discovered that the links to the Sitemaps.org XML schema in the sitemaps generated by GSiteCrawler were returning 404 not found errors. As a result I switched to the XML headers in Google's Using the Sitemap Protocol guide.
How do I submit a sitemap to Google Sitemaps?
Log in, or sign up, to Google Webmaster Tools with your Google Account, and add your sitemap URLs.
What are text sitemaps?
A text sitemap is simply a text file with a list of URLs, one on each line, and is supported by Yahoo! Site Explorer. Unlike XML sitemaps, it does not allow for metadata about each URL.
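A text sitemap is as simple as it sounds; a hypothetical urllist.txt might contain:

```text
http://www.example.com/
http://www.example.com/seo-tips.html
http://www.example.com/contact.html
```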


How do I submit a sitemap to Yahoo?
Submit your sitemap to Yahoo! Site Explorer.
Which sitemap formats does Yahoo! support?
Yahoo! supports RSS 0.9, RSS 1.0, RSS 2.0, Atom 0.3, and a text file containing a list of URLs. The filename of the URL list file must be "urllist.txt". The filename for a compressed file must be "urllist.txt.gz". Site Explorer recognizes files with a .gz extension as compressed files and will decompress them before parsing.
Source: What kinds of feeds do you support?
How do I submit a sitemap to MSN?
Good question! No one seems to know. The best bet is to try submitting your sitemap URL to the Windows Live Search URL Submission form.
Do sitemaps guarantee my webpages will be included in a search engine?
No. Sitemaps do not guarantee that webpages will be included in search engines, but they should help search engine spiders do a better job of finding and indexing every page within a site.
Do sitemaps improve search engine rankings?
No. Sitemaps will not influence the way your pages are ranked by a search engine. However, a sitemap does help get your pages crawled and indexed, which gives them a better chance of being ranked.
Does the position of a URL in a Sitemap influence its use?
No.
Do I need to remove session IDs from URLs?
Yes. Session IDs in URLs may result in incomplete and redundant crawling of your site.
Should I include the frameset URLs or the URLs of the frame contents?
Include both URLs in your sitemap.
This document was written with the help of Sitemaps.org, the official sitemaps website sponsored by Google, Yahoo! and Microsoft. Visit the site for more technical information on sitemaps.


Linkbait
* What is linkbait?
* Is linkbait just about attracting links?
Linkbait is the internet marketing equivalent of fishing bait. Linkbait is simply content or web tools created specifically to entice people to link to it from their website or blog, or to tell others about it.
You might say that all content and web tools are linkable. To a certain extent that is true. But linkbait content is different: it has to be better than, or different from, the usual content in order to attract more than the usual share of links.
Examples of good Linkbait content include:
• Viral content, such as the hilarious Evolution of Dance video that's been viewed 36,758,020 (Dec. 13, 2006) times on the YouTube website.
• Controversial content
• Useful web tools, such as Alexaholic which allows you to compare the Alexa traffic rankings of up to 5 domains simultaneously. According to Yahoo!, Alexaholic has a whopping 17,493 links pointing to it (not including links from the Alexaholic domain itself), and the domain was set up as recently as February 2006.
• "How To" articles
• Top 10, 30, 50 lists
• 101 tips
• Inside knowledge
• "Cool" stuff

Is linkbait just about attracting links?
Yes and no. Links usually equal traffic, so attracting links is the main reason for creating linkbait content. But many web surfers don't actually have a website to post links to. Instead, they might bookmark your content or tell others about it via an online bookmarking service, such as Digg or Del.icio.us.


Link Seeding (link spam)
* What is link seeding?
* Where can I seed links?
* What is link spam?
What is link seeding?
Link seeding refers to the process of proactively surfing the Web and planting links on websites that allow direct links back to your site, in the hope of attracting PageRank, as well as traffic, from those sites.
Where can I seed links?
You can plant free links in web directory listings, forum posts, forum signatures, blog posts, blog comment author links, discussion boards, product testimonials, guestbooks, and wikis.
You can also submit your links to search engines, but those listings will only attract traffic. The links won't provide any PageRank because search engine result pages are dynamically generated, and Google doesn't allocate PageRank to dynamically generated webpages.
DEFINITION
Link Spam is also known as Blog Spam and Comment Spam.
What is link spam?
Link spam is a much publicized form of spam that mainly targets blog commenting systems, but also forums, guestbooks, wikis (often called Wikispam) - in fact any website that allows people to freely post links.


Link Exchange
* What is link exchange?
* Should I exchange links with other websites?
* What type of websites should I exchange links with?
* What type of websites should I avoid exchanging links with?
* How do I find websites to exchange links with?
* Can my competitor harm my Google rankings by pointing thousands of links to my site, from a penalized or banned site?
* How do I know whether link exchange partners are still linking to my website?
* What are one way links?

Link exchange is the process of exchanging links with other websites.

Should I exchange links with other websites?
That depends on a number of factors. Before you decide on starting a reciprocal link exchange campaign, you must ask yourself these questions:
• Can I find more efficient ways to attract links to my website?
• Do I have the time and resources to commit to a link exchange campaign?
• Do I want to commit to a short-term or long-term link exchange campaign?
• How many links am I looking to exchange?
• How am I going to manage all of those link exchange partners?
What type of websites should I exchange links with?
The best websites to get links from are those with a high PageRank, a topic similar to yours, and few outbound links on the page. However, this is more difficult than it may sound. High PageRank sites typically don't exchange links - they don't need to.
That means websites similar to yours become the next most desirable sites to swap links with. Again this causes problems. It doesn't make sense to exchange links with your competition. After all, you don't want to send your visitors to their website, nor do you want to help improve their PageRank, and consequently their search engine rankings!
You can still exchange links with sites that are similar to yours, but make sure they're not in direct competition with you. For example, if your website sells oil paintings, don't swap links with other sites that also sell oil paintings. Instead find art museums, water painting sites, art colleges, home decoration sites, etc., to exchange links with.
What type of websites should I avoid exchanging links with?
The main category of websites you should avoid swapping links with are what are commonly referred to in the SEO community as "bad neighborhoods" - sites which have been penalized or banned by the search engines.
There is no official source of "bad neighborhood" sites. But they do often fall into a number of categories, including:
• a PageRank rating of 2 or less -- Use the Google Toolbar to check a site's PageRank rating.
• the website is not indexed by the search engines -- To check if site has pages indexed in a particular search engine, simply go to the search engine and enter the following query into the search box:
site:www.potentiallinkexchangepartner.com
If no pages are found, then that could be an indication that the site has been penalized or banned by the search engine.
• the domain is 6 months or older -- Conduct a Whois search to find out the age of a domain. Relatively new sites can be forgiven for not having a PageRank rating or for not being indexed by the search engines. But if a domain is 6 months or older, and it has a low PageRank and isn't indexed in the search engines, then it "could" mean that the site has been penalized or banned for some reason.

Having said all that, it is difficult to conclusively say whether a site has been penalized or banned by the search engines. The pointers above are simply a guide to whether you should exchange links with a site. The bottom line is that if a site looks genuine and you are happy to exchange links with it, then by all means go ahead. Don't take the advice given here as the deciding factor.
How do I find websites to exchange links with?
There are five main methods to finding reciprocal link partners:
1. Proactively surf the Web searching for websites to exchange links with. This is not the most practical or efficient method, so it isn't really recommended. But if you really want to try it, the easiest way to find link exchange partners is to approach all the sites that link to your competition. You can find out who they are simply by going to a search engine such as Yahoo! Search or Microsoft's Bing Search and entering the following query into the search box:
link:www.yourcompetitorswebsite.com
The search engines will return a list of websites that link to your competitor's site. All you have to do is contact each site and either ask them to link to your site or, if they require a link back, a link exchange. In case you're wondering, you could also try the search in Google. The reason I didn't mention it above is because Google's filtering algorithm seems quite strict and pretty much filters out the majority of links.
2. Install a link exchange script on your site and wait for potential link exchange partners to submit their link exchange requests. This is the poor, lazy person's method of exchanging links. It's definitely worth trying if you can't afford the time, or the money to outsource the link exchange campaign.
3. Use a link exchange software program to help you find and manage link exchange partners. This is probably the most popular method of finding link exchange partners as the software does much of the grunt work for you. All you have to do is instruct the software which types of sites you're looking for, decide which of the sites discovered by the software to email a link exchange request to, and add their link if they accept your offer. My favorite link exchange software program is Arelis, which has now been incorporated into the Internet Business Promoter suite of SEO tools.
4. Use an online link exchange database to find and manage reciprocal link partners. This is similar to the software method, except the number of link exchange partners is limited to sites who have signed up to the same service. Generally I would not recommend link exchange databases.

Also be careful to avoid link exchange services which create link pages that automatically include all the other sites within a category or links to sites unrelated to yours. The chances are you will end up with link pages with the same links as all the other sites using the service. This could get the page removed or even banned by the search engines for link spam.
5. Outsource the link exchange campaign to a professional link exchange company. This is the rich, lazy person's way to link exchange heaven. Simply find a company you like and purchase the link package that suits your budget. Make sure you tell them exactly the types of websites you want to exchange links with. Also set a deadline.

Can my competitor harm my Google rankings by pointing thousands of links to my site, from a penalized or banned site?
According to Google:
"There's almost nothing a competitor can do to harm your ranking or have your site removed from our index."
Note that Google doesn't rule out the possibility that a competitor could harm your site. But as far as I know, another website cannot harm your Google rankings simply by pointing links to your site. Imagine the havoc it would cause if it were possible for someone to get a site penalized or banned in Google simply by pointing hundreds or thousands of links to it.


Sponsored Links (PageRank)
* What are Sponsored Links?
* Do Sponsored Links increase PageRank?
* Which websites offer Sponsored Links?

What are Sponsored Links?

Sponsored links are any links where the sponsor pays for placement. These advertisements are typically displayed above, and to the right of, search engine results or content. But they sometimes also appear below search results.
Google AdWords ads are labeled as "Sponsored Links", as shown here:

Yahoo Search Marketing ads use the "Sponsor Results" label:

While MSN Live Search ads use the "Sponsored Sites" heading:

Do Sponsored Links increase PageRank?
Sponsored Links such as Google AdWords, Yahoo Search Marketing and MSN ads use long and sophisticated tracking links which don't get crawled by Google. As such no PageRank is passed along the link.
However, many websites sell sponsored text links which use short links that link directly to the sponsor's website. These types of links are usually crawled by Google and therefore do attract PageRank.
Which websites offer Sponsored Links?
Google AdWords, Yahoo! Search Marketing, and MSN AdCenter all offer Sponsored Links. Link brokers, such as Text Link Ads, also offer Sponsored Links, but of a different sort. Link brokers sell simple text links that link directly to your site, and therefore attract PageRank. The linked anchor text is usually the keyword phrase you're trying to get top rankings for. However, there is no accompanying description next to the link, so they don't do as good a job of persuading people to click on your link.


Affiliate Links
* What are affiliate links?
* Do affiliate links increase PageRank?
What are affiliate links?
An affiliate link is the hyperlink code given to members of affiliate programs. The affiliate link is displayed in webpages, emails, text ads, etc., and used to track referrals.
Do affiliate links increase PageRank?
Generally no, because the majority of affiliate links point to an affiliate program script, which in turn redirects the user to the final landing page. PageRank is only passed on in direct links.
Here is an example affiliate link. If you click on it, you'll notice how it redirects you to a different website.
http://linkconnector.com/traffic_affiliate.php?lc=002143001313002778&atid=tlgaflk
In recent years some affiliate programs and affiliate networks have introduced a new type of link that links directly from affiliate websites to the merchant's site.
These "bare" links mean that affiliates link directly to the merchant's website, thereby passing on any PageRank benefits to the merchant's landing page.
This is great for affiliate program owners, but gives affiliates nothing in return. Generally I would recommend sticking with standard affiliate links, if available.


Link Brokers (RON, ROS, Text Link Ads)

* What are link brokers?
* Should I buy text links via link brokers?
* Is it better to buy text links that are displayed on one page (usually the homepage) or across an entire site?
* Is Text Link Ads a reputable link broker?
Link brokers are websites that promote the buying and selling of text links. Technically speaking, text links are leased for a monthly fee, not bought. Link brokers make money by taking a cut of the leasing fee.
Should I buy text links via link brokers?
That depends. Link brokers offer a number of benefits, including:
• a large selection of websites
• websites sortable by various variables
• relevant site information available at a glance
• only having to deal with the link broker, as opposed to several website owners
• fixed prices - no negotiating skills required
Of course, there are downsides too, including:
• some useful site information, such as traffic and pageviews, may not be available at a glance
• higher prices than you might pay if you purchased a link directly from a website
• you can't negotiate prices - they are fixed
• search engines "may" penalize sites that purchase links from sites within the network
You have to weigh up the pros and cons and decide for yourself whether you feel link brokers are a good place to buy text links. If you have limited time, or don't wish to go through all the trouble of finding sites and negotiating lower fees, then I think purchasing links via link brokers is a good idea.
The one real downside could be that search engines "may" put a penalty on sites, or outbound links from sites, within the network, thereby reducing the link popularity or PageRank benefit. Obviously, you still get the benefit of any traffic generated from the links.
DEFINITION
Run of Site (ROS) and Run of Network (RON) refers to advertisements appearing on all pages of a website, and network, respectively.
Is it better to buy text links that are displayed on one page (usually the homepage) or across an entire site?
That's a good question! Generally, the more backlinks the better so it makes sense to buy links that would be displayed on every page of a site, especially if it only costs a bit more than a single homepage link. However, this could be a waste of time, never mind money. Why? Let me explain.
Google interprets a link from page A to page B as a vote for page B, by page A, as stated in the PageRank guide. But this is an unnatural occurrence. No one would naturally "vote" for a page hundreds of times. So if Google finds hundreds (even thousands) of links from one source, it "could", and probably does, consider those links as an attempt to inflate the site's link popularity numbers, and as a result remove the PageRank benefits of all the links from the source.
If you check the link popularity of some websites that have obviously purchased a text link on a website as it appears on hundreds, even thousands, of pages, you'll probably notice that Google seems to index only a small percentage of all the links. Yahoo! Search and Microsoft's Bing Search on the other hand seem to index a far greater number of such links, as illustrated in this backlink screenshot.
As such, you may be better off simply buying one link, typically on the homepage, rather than run of site. I certainly think so.
Is Text Link Ads a reputable link broker?
Yes. Text Link Ads is one of the most reputable text link brokers around, with clients ranging from Fortune 500 firms to individual website owners. It offers thousands of text links for sale from hundreds of websites, starting from just $2 per month. Text Link Ads is a member of the Better Business Bureau and also a circle member of SEMPO (Search Engine Marketing Professional Organization). I have heard nothing but good comments about Text Link Ads.



Natural Linking (artificial linking, link triangles, www or non-www)
* What is natural linking?
* What is artificial linking?
* What are link triangles?
* Is there any difference between linking virtually hosted domains and domains on dedicated IP addresses?
* Should I link to the www or non-www version of my domain?

What is natural linking?
Natural links are links given voluntarily, as a result of website content creators trying to add value to their website.
Natural links:
• are surrounded by relevant text
• include a mixture of anchor text
• include a combination of homepage and deep links
• are usually built up slowly over time
• are from a wide variety of sources
• have a high ratio of links from subpages
• are not reciprocal

What is artificial linking?
Conversely, artificial links are irrelevant or disguised links that add value to the link destination site, instead of to the sites providing the links. These include paid links, link exchanges, and "powered by" and "hosted by" links.
Artificial links:
• are surrounded by little or no relevant text
• are surrounded by navigation links
• typically have a high ratio of identical anchor text
• often are not in context with the webpage content
• typically point to the homepage
• spring up over a short period of time
• often come from a large number of webpages in a small number of sites
• have a high ratio of links from the homepage of websites
• are often reciprocal
• are at the bottom of the page

What are link triangles?
A link triangle is a concept devised to evade the detection of reciprocal linking patterns by search engines. With a link triangle, page A links to page B, page B links to page C, and page C links to page A, as illustrated below:

It is certainly an interesting concept, and one that may well fool the search engines if implemented properly with natural linking strategies.
Is there any difference between linking virtually hosted domains and domains on dedicated IP addresses?
None whatsoever, according to Matt Cutts, head of Google's Webspam team. Matt says, quote, "Links to virtually hosted domains are treated the same as links to domains on dedicated IP addresses."
Source: Myth busting: virtual hosts vs. dedicated IP addresses

DEFINITION
URL Canonicalization is the process of picking the best URL when there are several choices.

Should I link to the www or non-www version of my domain?
I would recommend picking the www version and always using it in internal and inbound links.
Do not assume that the www and non-www versions of a domain are the same. For example, most people would probably consider these URLs to be the same:
• www.yoursite.com
• www.yoursite.com/index.html
• yoursite.com
• yoursite.com/index.html
Technically they are all different. I have often encountered the dreaded "cannot find server" error message when entering the non-www version of a domain name into a web browser, only for the site to appear when I try the www version.
Make sure your webmaster sets up your site so that it appears when someone visits either the www or non-www version of the domain.
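On an Apache server, one common way to do this is a 301 (permanent) redirect from the non-www to the www version in the site's .htaccess file. This is a sketch, assuming mod_rewrite is enabled; yoursite.com is a placeholder:

```apache
RewriteEngine On
# If the host is the bare domain, redirect permanently to the www version
RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]
```

A 301 redirect also tells search engines which version of the domain is canonical, so PageRank isn't split between the two.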
I often come across internal homepage links pointing to various URLs, including:
• index.html
• yoursite.com
• yoursite.com/index.html
• www.yoursite.com/index.html
Stick to one URL, such as www.yoursite.com, instead of all those variations above.
Source: SEO advice: url canonicalization


Broken Links (link rot, broken link testing)
* What are broken links?
* What is linkrot?
* How often should I check for broken links?
* What are the most popular broken links testing software?

Broken links are hyperlinks that, when clicked, fail to lead to their intended destination, typically another online document such as a webpage, PDF file, video, or Excel spreadsheet.
Broken links can include clickable image links and links to images.

DEFINITION
Link Rot and Linkrot are used interchangeably.

What is Link Rot?
Link Rot is the process by which links on a website become irrelevant as webpages and files being linked to disappear, change content, or redirect to a new location.
How often should I check for broken links?
Probably every day in an ideal world, but that would take up too much time. I would suggest you check for broken links at least once a month.
What are the most popular broken links testing software?
My favorite tool for finding broken links is Xenu's Link Sleuth, which runs on Windows PC and is 100% free. Link verification is carried out on normal links, images, frames, plug-ins, backgrounds, local image maps, style sheets, scripts and Java applets. It displays a continuously updated list of URLs which you can sort by different criteria.
Additional features include:
• Can re-check broken links (useful for temporary network errors)
• Supports SSL websites - https://
• Detects and reports redirected URLs
• Creates site maps
• Reports can be emailed
• Partial testing of FTP and gopher sites
If you would like to recommend a similar broken link checker, which is also free, especially one for Mac users, please post it below.
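If you'd rather script a quick check yourself, here is a minimal sketch in Python using only the standard library. It only tests HTTP status codes; dedicated tools like Xenu's Link Sleuth also verify images, frames, style sheets, and scripts:

```python
import urllib.request
import urllib.error

def link_status(url, timeout=10):
    """Return the HTTP status code for a URL, or None if unreachable."""
    try:
        # HEAD avoids downloading the whole document
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code   # 4xx/5xx responses raise HTTPError
    except (urllib.error.URLError, OSError):
        return None     # DNS failure, timeout, refused connection

def is_broken(status):
    """Treat 4xx/5xx responses and unreachable hosts as broken."""
    return status is None or status >= 400
```

Running `is_broken(link_status(url))` over a list of your site's URLs flags the dead ones; a full checker would also parse each page to discover its links.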

Thursday, October 8, 2009

SEO SEM Checklist

On-Page SEO Checklist
Always start with keyword selection, research and testing
Title tags
Meta tags
ALT tags
H1 tags
URL structure
Internal Linking
Content
Keyword density
Site maps
Usability
Track target keywords
Expect results in 12 months

Do not make common on-page SEO mistakes:
Duplicate content
URL variants on the same pages
Off-site images and content on-site
Duplicate title tags

Do not use on-page SEO spamming tactics such as:
Hidden text
Hidden links
Keyword repetition
Doorway pages
Mirror pages
Cloaking


Off-Page SEO Checklist
Always start with keyword research, testing and selection
Keywords in links
Links from high ranking publisher sites
One-way inbound links (not link exchange)
Different keywords in your link-ads from the same site
Gradual link building technology (no growth spikes)
Relevant keywords near your inbound link
Deep linking (from multiple pages to multiple pages)
Target a large list of keywords (5-500+)
Link from sites with a variety of LinkRanks
Track active link-ad keywords
Discontinue campaigns if ranking does not improve
Expect results in 30 days (MSN) 1-9 months (Google, Yahoo)



Do not make common off-page SEO mistakes:
Duplicate keywords in link-ads
Site-wide links causing link growth spikes
Using on-page SEOs to do specialist off-page SEO
Placing random links without keywords near your link-ad


Do not use off-page SEO spamming tactics such as:
Link farms (sites with 100+ outbound links per page)
Using irrelevant keywords in your link-ads
Garbage links
Link churning
Hidden inbound links

SEO SEM Glossary

advertising network: A service where ads are bought centrally through one company, and displayed on multiple Web sites that contract with that company for a share of revenue generated by ads served on their site.

algorithm: The technology that a search engine uses to deliver results to a query. Search engines utilize several algorithms in tandem to deliver a page of search results or keyword-targeted search ads.

anchor text: The clickable text part of a hyperlink. The text usually gives visitors or search engines important information on what the page being linked to is about.

click through rate (CTR): The rate (expressed in a percentage) at which users click on an ad. This is calculated by dividing the total number of clicks by the total number of ad impressions. CTR is an important metric for Internet marketers to measure the performance of an ad campaign.

content network: A group of Web sites that agree to show ads on their site, served by an ad network, in exchange for a share of the revenue generated by those ads. For example: Google AdSense or the Yahoo Publisher Network.

contextual advertising: Advertising that is targeted to a Web page based on the page's content, keywords, or category. Ads in most content networks are targeted contextually.

cost per action (CPA): A form of advertising where payment is dependent upon an action that a user performs as a result of the ad. The action could be making a purchase, signing up for a newsletter, or asking for a follow-up call. An advertiser pays a set fee to the publisher based on the number of visitors who take action. Many affiliate programs use the CPA model.

cost per click (CPC): Also called Pay per Click (PPC). A performance-based advertising model where the advertiser pays a set fee for every click on an ad. The majority of text ads sold by search engines are billed under the CPC model.

cost per thousand (CPM): An ad model that charges advertisers every time an ad is displayed to a user, whether the user clicks on the ad or not. The fee is based on every 1,000 ad impressions (M is the Roman numeral for 1,000). Most display ads, such as banner ads, are sold by CPM.
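The arithmetic behind CTR, CPC, and CPM is straightforward. As a quick sketch in Python (the example numbers are made up):

```python
def ctr(clicks, impressions):
    """Click-through rate, expressed as a percentage."""
    return 100.0 * clicks / impressions

def cpc_cost(clicks, cost_per_click):
    """Total spend under a cost-per-click model."""
    return clicks * cost_per_click

def cpm_cost(impressions, cost_per_thousand):
    """Total spend under a cost-per-thousand-impressions model."""
    return impressions / 1000.0 * cost_per_thousand

# 50 clicks from 10,000 impressions is a 0.5% CTR
print(ctr(50, 10_000))        # 0.5
# 10,000 impressions at a $2 CPM cost $20
print(cpm_cost(10_000, 2.0))  # 20.0
```

Comparing cpc_cost and cpm_cost for your expected CTR is a simple way to decide which pricing model is cheaper for a given campaign.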

geo-targeting: Delivery of ads specific to the geographic location of the searcher. Geo-targeting allows the advertiser to specify where ads will or won't be shown based on the searcher's location, enabling more localized and personalized results.

Googlebot: Google uses several user-agents to crawl and index content in the Google.com search engine. Googlebot describes all Google spiders. All Google bots begin with "Googlebot"; for example, Googlebot-Mobile: crawls pages for Google’s mobile index; Googlebot-Image: crawls pages for Google’s image index.

inbound link: An inbound link is a hyperlink to a particular Web page from an outside site, bringing traffic to that Web page. Inbound links are an important element that most search engine algorithms use to measure the popularity of a Web page.

invisible web: A term that refers to the vast amount of information on the web that isn't indexed by search engines. Coined in 1994 by Dr. Jill Ellsworth.

keyword: A word or phrase entered into a search engine in an effort to get the search engine to return matching and relevant results. Many Web sites offer advertising targeted by keywords, so an ad will only show when a specific keyword is entered.

link bait: Editorial content, often sensational in nature, posted on a Web page and submitted to social media sites in hopes of building inbound links from other sites. Or, as Matt Cutts of Google says, "something interesting enough to catch people's attention."

link building: The process of getting quality Web sites to link to your Web site, in order to improve search engine rankings. Link building techniques can include buying links, reciprocal linking, or entering barter arrangements.

meta tags: Information placed in the HTML header of a Web page, providing information that is not visible to browsers, but can be used in varying degrees by search engines to index a page. Common meta tags used in search engine marketing are title, description, and keyword tags.

pay per click (PPC): See Cost per Click (CPC).

quality score: A score assigned by search engines, calculated by measuring an ad's clickthrough rate, analyzing the relevance of the landing page, and considering other factors that determine the quality of a site, in order to reward higher-quality sites with top placement and lower bid requirements. Some factors that make up a quality score are historical keyword performance, the quality of an ad's landing page, and other undisclosed attributes. All of the major search engines now use some form of quality score in their search ad algorithm.

return on investment (ROI): The amount of money an advertiser earns from their ads compared to the amount of money the advertiser spends on their ads.

search advertising: Also called Paid Search. An advertiser bids for the chance to have their ad display when a user searches for a given keyword. These are usually text ads, which are displayed above or to the right of the algorithmic (organic) search results. Most search ads are sold by the PPC model, where the advertiser pays only when the user clicks on the ad or text link.

search engine marketing (SEM): The process of building and marketing a site with the goal of improving its position in search engine results. SEM includes both search engine optimization (SEO) and search advertising, or paid search.

search engine optimization (SEO): The process of making a site and its content highly relevant for both search engines and searchers. SEO includes technical tasks to make it easier for search engines to find and index a site for the appropriate keywords, as well as marketing-focused tasks to make a site more appealing to users. Successful search marketing helps a site gain top positioning for relevant words and phrases.

search engine results pages (SERPs): The page searchers see after they've entered their query into the search box. This page lists several Web pages related to the searcher's query, sorted by relevance. Increasingly, search engines are returning blended search results, which include images, videos, and results from specialty databases on their SERPs.

social media: A category of sites that is based on user participation and user-generated content. They include social networking sites like LinkedIn or Facebook, social bookmarking sites like Del.icio.us, social news sites like Digg or Reddit, and other sites that are centered on user interaction.

spider: A search engine spider is a program that crawls the Web, visiting Web pages to collect information to add to or update a search engine's index. The major search engines on the Web all have such a program, which is also known as a "crawler" or a "bot."

title tag: An HTML meta tag with text describing a specific Web page. The title tag should contain strategic keywords for the page, since many search engines pay special attention to the title text when indexing pages. The title tag should also make sense to humans, since it is usually the text link to the page displayed in search engine results.

universal search: Also known as blended, or federated search results, universal search pulls data from multiple databases to display on the same page. Results can include images, videos, and results from specialty databases like maps and local information, product information, or news stories.

Web 2.0: A term that refers to a supposed second generation of Internet-based services. These usually include tools that let people collaborate and share information online, such as social networking sites, wikis, communication tools, and folksonomies.


Anchor Text

Anchor text refers to the visible text for a hyperlink. For example:

<a href="http://www.seo-help.com/">This is the anchor text</a>


ATW

Abbreviation for AllTheWeb, a search engine powered by FAST.


Back Link

Any link on another page that points to the subject page. Also called inbound links or IBLs.


Bot

Abbreviation for robot (also called a spider). It refers to software programs that scan the web. Bots vary in purpose from indexing web pages for search engines to harvesting e-mail addresses for spammers.


Cloaking

Cloaking describes the technique of serving a different page to a search engine spider than what a human visitor sees. This technique is abused by spammers for keyword stuffing. Cloaking is a violation of the Terms Of Service of most search engines and could be grounds for banning.


Conversion

Conversion refers to site traffic that follows through on the goal of the site (such as buying a product on-line, filling out a contact form, registering for a newsletter, etc.). Webmasters measure conversion to judge the effectiveness (and ROI) of PPC and other advertising campaigns. Effective conversion tracking requires the use of scripting and/or cookies to track visitors' actions within a website. Log file analysis is not sufficient for this purpose.


CPC

Abbreviation for Cost Per Click. It is the base unit of cost for a PPC campaign.


CTA

Abbreviation for Content Targeted Ad(vertising). It refers to the placement of relevant PPC ads on content pages for non-search engine websites.


CTR

Abbreviation for Click Through Rate. It is a ratio of clicks per impressions in a PPC campaign.


Doorway Page

Also called a gateway page. A doorway page exists solely for the purpose of driving traffic to another page. They are usually designed and optimized to target one specific keyphrase. Doorway pages rarely are written for human visitors. They are written for search engines to achieve high rankings and hopefully drive traffic to the main site. Using doorway pages is a violation of the Terms Of Service of most search engines and could be grounds for banning.


FFA

Abbreviation for Free For All. FFA sites post large lists of unrelated links to anyone and everyone. FFA sites and the links they provide are basically useless. Humans do not use them and search engines minimize their importance in ranking formulas.


Gateway Page

Also called a doorway page. A gateway page exists solely for the purpose of driving traffic to another page. They are usually designed and optimized to target one specific keyphrase. Gateway pages rarely are written for human visitors. They are written for search engines to achieve high rankings and hopefully drive traffic to the main site. Using gateway pages is a violation of the Terms Of Service of most search engines and could be grounds for banning.


Google Dance

Up to June 2003, Google updated the index for its search engine on a roughly monthly basis. While an update is in progress, search results differ across Google's nine datacenters, and a site's position appears to "dance" as it fluctuates from minute to minute. "Google dance" is an unofficial term coined to refer to the period when Google is performing the update to its index. Google may be changing its index calculation method to allow for continuous updates (which would effectively end the roughly monthly dances).


IBL

Abbreviation for In Bound Link. Any link on another page that points to the subject page. Also called a back link.


Ink

Abbreviation for Inktomi, the back-end search engine acquired by Yahoo. The Inktomi search engine is being phased out as Yahoo built a new search engine incorporating Inktomi's technology with elements of Yahoo's other search acquisitions.


Keyword/Keyphrase

Keywords are words which are used in search engine queries. Keyphrases are multi-word phrases used in search engine queries. SEO is the process of optimizing web pages for keywords and keyphrases so that they rank highly in the results returned for search queries.


Keyword Stuffing

Keyword stuffing refers to the practice of adding superfluous keywords to a web page. The words are added for the 'benefit' of search engines and not human visitors. The words may or may not be visible to human visitors. While not necessarily a violation of search engine Terms of Service, at least when the words are visible to humans, it detracts from the impact of a page (it looks like spam). It is also possible that search engines may discount the importance of large blocks of text that do not conform to grammatical structures (i.e. lists of disconnected keywords). There is no valid reason for engaging in this practice.


Link Farm

A link farm is a group of separate, highly interlinked websites for the purposes of inflating link popularity (or PR). Engaging in a link farm is a violation of the Terms Of Service of most search engines and could be grounds for banning.


Mirror

In SEO parlance, a mirror is a near identical duplicate website (or page). Mirrors are commonly used in an effort to target different keywords/keyphrases. Using mirrors is a violation of the Terms Of Service of most search engines and could be grounds for banning.


PFI

Abbreviation for Pay For Inclusion. Many search engines offer a PFI program to assure frequent spidering / indexing of a site (or page). PFI does not guarantee that a site will be ranked highly (or at all) for a given search term. It just offers webmasters the opportunity to quickly incorporate changes to a site into a search engine's index. This can be useful for experimenting with tweaking a site and judging the resultant effects on the rankings.


Portal

Designation for websites that are either authoritative hubs for a given subject or popular content driven sites (like Yahoo) that people use as their homepage. Most portals offer significant content and offer advertising opportunities for relevant sites.


PPC

Abbreviation for Pay Per Click. An advertising model where advertisers pay only for the traffic generated by their ads.


PR

Abbreviation for PageRank - Google's trademark for their proprietary measure of link popularity for web pages. Google offers a PR viewer on their Toolbar.


Robots.txt

Robots.txt is a file which well behaved spiders read to determine which parts of a website they may visit.
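As a sketch, a minimal robots.txt (the directory names and the "BadBot" user-agent are hypothetical) placed at the site root might look like this:

```
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/

User-agent: BadBot
Disallow: /
```

Each User-agent line selects which spider the rules beneath it apply to; a well-behaved spider reads this file before requesting any other page on the site.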


Scumware

Scumware is a generic/catch-all label that applies to software that:

* Installs itself secretly, dishonestly, or without consent
* Does not allow for easy uninstallation/removal
* Monitors or tracks users' actions without their awareness or consent (aka spyware)
* Alters the behavior/default options of other programs without the user's consent or awareness (aka thiefware)


SEM

Abbreviation for Search Engine Marketing. SEM encompasses SEO and search engine paid advertising options (banners, PPC, etc.)


SEO

Abbreviation for Search Engine Optimization. SEO covers the process of

* making web pages spider friendly (so search engines can read them)
* making web pages relevant to desired keyphrases


SERP

Abbreviation for Search Engine Results Page/Positioning. This refers to the organic (excluding paid listings) search results for a given query.


Spam

In the SEO vernacular, this refers to manipulation techniques that violate search engines' Terms of Service and are designed to achieve higher rankings for a web page. Obviously, spam could be grounds for banning. Alan Perkins has published an excellent white paper on Search Engine Spam that is highly recommended. Here are some definitions of spam from the search engines themselves:

* Google
* Yahoo
* MSN


Spamdexing

Spamdexing describes efforts to spam a search engine's index. Spamdexing is a violation of the Terms Of Service of most search engines and could be grounds for banning.


Spider

Also called a bot (or robot). Spiders are software programs that scan the web. They vary in purpose from indexing web pages for search engines to harvesting e-mail addresses for spammers.




Spider Trap

A spider trap refers to either a continuous loop in which a spider keeps requesting pages and the server keeps generating data to render them, or an intentional scheme designed to identify (and "ban") spiders that do not respect robots.txt.


Splash Page

Splash pages are introduction pages to a web site that are heavy on graphics (or flash video) with no textual content. They are designed to either impress a visitor or complement some corporate branding.


Stop Word

Stop words are words that are ignored by search engines when indexing web pages and processing search queries, typically common words such as "the", "a", and "of".


www2/www3/www-xx

Google dance watchers use these terms as short-hand to refer to Google's different datacenters. You can add .google.com to the end of them to visit the data center that corresponds to the term.

Incremental CPC
Incremental CPC is the change in cost divided by the change in clicks (incremental CPC = change in cost / change in clicks). For example, if spend rises from $100 to $150 while clicks rise from 100 to 120, the incremental CPC is $50 / 20 = $2.50.

Monday, September 21, 2009

8 Most Popular and Useful Google Commands and Operators

1. Site: The Google site command will show you all the URLs within the domain you specify. This lets you see all of the pages that Google has indexed within your site. Not only will this command show you the exact URLs that are indexed, but it will also give you the total number of pages Google has indexed. Just type site:yourdomain.com into the Google search box to get this information.

2. Related: this is probably one of my favorite Google commands for link building purposes. If you type in related:yourdomain.com, the search results will show you a wide range of related websites that contain the same keywords as your site. This is helpful when it’s time to find relevant websites for link building.

3. Cache: Searching cache:yourdomain.com will show you the copy of your page that Google currently has in its cache from the last time it crawled and indexed your site.

4. Allintitle: The allintitle command, allintitle:wood working will show you all of the URL’s that contain both of those keywords in the title of a web page.

5. AllinURL: The command allinurl:wood working is similar to the allintitle command, except that it looks for and will show you results with those keywords in the URL instead of the title.

6. Link: The link command is one of the more popular and useful commands. If you search for link:petfood.com, Google will show you a good majority of the webpages that have links pointing to the specified domain.

7. Info: The info command will show you information about a particular domain. The information such as Google cache, webpages that are similar, webpages that link to your domain, webpages that you link to and webpages that actually contain your domain name listed. So this is almost a one-stop-shop for a few of the most useful Google commands.

8. Define: The define command, define:keyword, returns definitions of that keyword from around the web. Relatedly, for link building you can put quotation marks around your keyword, like this: “Iams cat food”, and a list of websites will show that contain that exact phrase.
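The commands above can be summarized as sample queries (yourdomain.com and the keywords are placeholders to swap for your own):

```
site:yourdomain.com
related:yourdomain.com
cache:yourdomain.com
allintitle:wood working
allinurl:wood working
link:yourdomain.com
info:yourdomain.com
define:widget
"Iams cat food"
```

Each line is typed directly into the Google search box; the operator must be followed immediately by its argument, with no space after the colon.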

SEO Terminology Glossary

50 Search Engine Optimization Definitions Every Advertiser Should Know: An SEO Terminology Glossary


Let’s face it: when it comes to SEO, there are many terms that sound like they come from a foreign world. Whether you are new to search engine optimization (SEO) or looking to hire an agency, it is important to understand the industry lingo. SEO-speak can get pretty technical; I often get lost myself! But learning these 50 core SEO terms will go a long way toward helping you understand how the process works.

* 301 Redirect – A means of forwarding content that has been moved permanently from an old URL to a new one automatically. It will prevent a webpage from losing traffic or PageRank that is associated with an old URL. The 301 redirect will automatically forward or redirect that traffic to the updated URL.
* 404 Error – The 404 or Not Found error message is an HTTP response code that says the client was able to communicate with the server, but the server could not find what was requested, or it was configured not to fulfill the request. A 404 error page occurs when the URL of a page is moved and not redirected or when the URL is removed altogether.
* Alexa Rank – A free web toolbar service that provides webmasters with information about a site's traffic. Alexa collects internet usage data from toolbar users, which contributes to the Alexa Rank.
* Anchor Text – The visible, clickable text of a link, commonly referred to as a hyperlink. It usually gives users a relevant, contextual description of the content at the link's destination. The words used in anchor text play a role in keyword rankings.
* Algorithm – The mathematical criteria search engines use to determine a web page's relevancy for certain search words or phrases.
* Backlinks – Links that are coming into a website from another website or web page. The number of backlinks on a webpage is an indication to search engines the popularity or importance of that website or page. Also known as incoming links, inbound links, in links, and inward links.
* Black Hat SEO – The search engine optimization practice of using unethical or unfair techniques (i.e. keyword stuffing, link spamming) to improve website rankings. Sites that practice black hat methods run the risk of being penalized or banned by Google.
* Bot – Nickname for the Googlebot, the search software used by Google to crawl, index and serve web content.
* Canonical Tag – A code snippet used to tell search engines what URL is the original or “canonical” version of your webpage. Search engines use this as a guide for what URL to crawl and index, especially when duplicate pages exist.
* Cache – The archived copy of a webpage that was indexed by a search engine. The cached version of a webpage is a copy of the page that is saved on search engine servers.
* Competing Pages – The number of pages that rank for a given keyword in search engine results pages (SERPs). Found in the upper right-hand corner of the Google SERPs (i.e. Results 1 – 10 of about 1,450,000 for example keyword).
* Crawl – An automated, computerized algorithm hosted by search engines that browses the web. The programs create a copy of each webpage for future indexing by the search engines. Also known as web crawl.
* Internal Links (Cross-Links) – A link or hyperlink from one webpage to another within the same website or domain. Usually internal links work in a hierarchical structure, with the most internal links pointing to the most important pages.
* Duplicate Content – Refers to substantive blocks of content within or across domains that are identical or exact duplicates. In many cases this is not malicious or deceptive.
* External Links – A link or hyperlink from one webpage to another on a different website or domain. External links point to your site but do not reside on your domain.
* Google Webmaster Tools (GWT) – A free service provided by Google for webmasters. It allows webmasters to understand the sites indexing and visibility on the web, as well as identify any search engine crawl errors.
* Header Tags – HTML code used to format text, define a page’s organizational structure, and simplify navigation. The tags appear as <h1> through <h6>. For SEO, the header tags are a source for keyword optimization, which can help a page rank better on some search engines.
* HTML Sitemap – A text version of your sitemap that is a bulleted outline of your site's navigation, with anchor text linking to each page. Provided on a website to aid in user navigation. It is also used to guide search engine bots through a website.
* Alt Text Attribute – The image alternate (ALT) attribute is considered a text equivalent for a non-text element. It is recognized by search engines as an alternative text description to an image, since crawlers cannot otherwise ‘read’ image content.
* Indexing – The process of search engines collecting, parsing, and storing webpage content for fast, relevant and accurate retrieval.
* Keyword Optimization – The process of incorporating target keywords or phrases into metadata, images and on-page content to improve search engine rankings for that keyword or phrase.
* Keyword Stuffing – Considered a black hat method for SEO, keyword stuffing is a technique used to increase a web page’s rankings by hiding text so it will not be seen by a visitor, but it will be crawled and indexed by search engines. Practices include hiding text behind an image, using very small text, and making the text the same color as the background. Websites that participate in keyword stuffing will be removed from the Google index.
* Linkbait – Any website activity that encourages or attracts links from other websites. It can include contests, newsworthy information, giveaways, tools, and other compelling content. It is an important practice in SEO because the backlinks to a website factor into search engine rankings.
* Link Building – The practice of increasing the number of relevant in-links on a website or specific webpage with the goal of increasing search engine rankings and website traffic.
* Metadata – Located in the header section of a web page's source code, metadata is intended to provide information about a website’s content. Includes robots, title, description and keywords.
* Meta Description – A part of the metadata, it is located within the code header and appears as the webpage description in organic search engine results. The description is used to describe the webpage content to search engine crawlers. The meta description can be up to 160 characters.
* Meta Keywords – A part of the metadata, it is used to list the keywords that are relevant to the content on your page. The meta keywords tag used to have significance for search engine rankings, but major engines now give it little or no weight.
* Meta Robots – A part of the metadata, the robots tag is used to control search engine crawling activity at the page level. For site-level control, use the robots.txt file.
* Meta Title – A part of the metadata, it is located within the code header and appears at the top of a search window. It is also the hyperlinked title in organic search results. The meta title is an important element in declaring the content on a page to crawlers and potential visitors. Each meta title is between 50-60 characters and should be unique to the content on the page.
* NoFollow – A meta tag that tells search engines that a page can be crawled and indexed, but the search engines will not follow any outgoing links, and no PageRank flows from that page.
* NoIndex – A meta tag that allows search engines to crawl a page and give it PageRank; however, the search engines do not index the page, and it will not show up in the search results.
* NOODP – A meta tag, <meta name="robots" content="noodp">, used to opt a page out of having its listing drawn from the DMOZ Open Directory Project.
* NOYDIR – A meta tag, <meta name="robots" content="noydir">, specifically used to prevent the search engines from displaying a page title and meta description taken from the Yahoo! Directory.
* Organic (Natural) Search Results – Non-paid search engine results located below all sponsored (or pay-per-click) advertising on a search engine results page.
* Page Rank (PR) – According to Google, “PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page’s value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But, Google looks at more than the sheer volume of votes, or links a page receives; it also analyzes the page that casts the vote. Votes cast by pages that are themselves ‘important’ weigh more heavily and help to make other pages ‘important’.”
* Page Rank Sculpting – The practice of guiding search engines to pages of your site that have more value than others, by adding nofollow tags, consolidated links, and other elements on pages that are of lesser value. Due to changes to the Google algorithm in 2009, it is believed that PageRank sculpting with the use of nofollow tags no longer works to help flow more PageRank to the more valuable pages of your site.
* Pay-Per-Click (PPC) – A method of advertising where the advertiser pays for each click received on their ads in the search engines.
* Rankability Score – A statistical guide used for keyword selection, calculated by dividing the search volume by the number of competing pages. The rankability score helps identify keywords that statistically show greater potential for achieving rankings quickly.
* Ranking – Using its algorithm, a search engine determines which of the pages retrieved for a search are the most relevant, then lists (ranks) each web page in order of relevancy.
* Reciprocal Link – A mutual external link between two different websites. Reciprocal linking is a common link building practice in SEO, but it is also controversial. A reciprocal link can help a website’s PageRank if the linked content is relevant; however, excessive reciprocal linking and link farms can hurt a website.
* Robots.txt – A text file that instructs the search engine bots which directory paths and pages should and shouldn’t be crawled. Any pages blocked by search engines in the robots.txt file will not pass PageRank.
* Search Engine Marketing (SEM) – An umbrella term that includes any form of internet marketing that promotes a website's visibility in search engine results pages (SERPs), including pay-per-click, search engine optimization, display advertising, and contextual advertising.
* SEO (Search Engine Optimization) – A form of internet marketing that can effectively increase organic search engine placement, site traffic and qualified leads to a website.
* Search Engine Results Page (SERP) – The results a user sees in the search engines after typing in a search query. The results consist of a series of organic listings as well as paid or sponsored search ads.
* Search Query – A basic search query is a keyword or a keyword phrase a user enters when searching on any search engine.
* Social Media – Many definitions exist, but according to Wikipedia, “social media is media designed to be disseminated through social interaction, created using highly accessible and scalable publishing techniques. Social media supports the human need for social interaction, using Internet- and web-based technologies to transform broadcast media monologues (one-to-many) into social media dialogues (many-to-many).”
* Spiders – Similar to the Googlebot, a spider automatically crawls content on the web and feeds the pages to search engines.
* White Hat SEO – The ethical and fair practice of search engine optimization on a website.
* XML Sitemap – An XML sitemap lists URLs for a site so that search engines can more intelligently crawl a website. Sitemaps include instructions for the search engines telling them the relative importance of the page (homepage being the highest), the estimated frequency of updates (daily, weekly, monthly, etc.) and most importantly the exact URL structure for your website’s pages.
* Yahoo! Site Explorer – A free tool that allows webmasters to explore web pages indexed by Yahoo! Search, including the most popular pages from any site, sitemaps, and webpage links.
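Several of the tags defined above can be sketched together in one hypothetical page (the domain, file names, and all content values are placeholders, not a real site):

```html
<html>
<head>
  <!-- Meta title: the hyperlinked headline shown in organic results -->
  <title>Red Widgets | Example Widget Store</title>
  <!-- Meta description: the snippet shown under the title in results -->
  <meta name="description" content="Hand-built red widgets with free shipping.">
  <!-- Meta robots: page-level crawl and index directives -->
  <meta name="robots" content="index, follow">
  <!-- Canonical tag: declares the preferred URL for this content -->
  <link rel="canonical" href="http://www.example.com/red-widgets/">
</head>
<body>
  <!-- Header tag: a keyword optimization opportunity -->
  <h1>Red Widgets</h1>
  <!-- Alt text attribute: the text equivalent crawlers read for an image -->
  <img src="red-widget.jpg" alt="Red widget with chrome handle">
  <!-- Anchor text: describes the link's destination to users and engines -->
  <a href="http://www.example.com/widgets/">All our widgets</a>
</body>
</html>
```

Keeping the title, description, header, and anchor text aligned around the same target phrase is what ties these individual tags into a coherent on-page optimization.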

Thursday, September 3, 2009

SEO Copywriting Example 350 Words


SEO Copywriting optimizes a page's visible text, and usually other on-page elements, for the targeted search terms. These include the Title, Description and Keywords tags, headings, and alt text.

The idea behind SEO Copywriting is that search engines want genuine content pages and not additional pages (often called "doorway pages") that are created for the sole purpose of achieving high rankings. Therefore, the engines cannot possibly view SEO copywritten pages as undesirable, and the rankings they achieve tend to be as stable as those that are achieved by other search engine optimization techniques.

Practitioners of the search engine copywriting method recommend around 250 viewable words per page, with one, or at most two, targeted search terms strategically placed within the text and other on-page elements.


SEO Copywriting strengths
One thing that can be said about search engine optimization copywriting is that it works for suitable websites and for suitable search terms (see below). SEO Copywriting can achieve rankings that tend to do well across the search engines, although no page can do equally well in all engines.

It is sometimes said by practitioners of search engine optimization copywriting that the method tends to maintain its rankings as the engines tweak and change their algorithms, whereas other methods produce less stable rankings. This can't be true. If two pages are in the top 10 search results, one getting there by the SEO copywriting method and the other by different search engine optimization techniques, they are both there because they match the engine's criteria (algorithm) quite well. When the criteria change, the match that each of them had necessarily changes. The matches could become closer to, or further from, the engine's criteria. Whether each page goes up or down in the results depends on what changes have been made to the engine's criteria. It is a matter of chance, not a matter of whether SEO copywriting was used.


SEO Copywriting weaknesses
# Competitive search terms
The technique only works for search terms that are not particularly competitive. Competitive search terms are those where many people are trying very hard to gain the top rankings for their sites. Casino, sex, insurance, health and hotels sites are among the most competitive, and there are many other topics where people fight for rankings. For medium to highly competitive search terms, other, more vigorous, methods are needed.

# Suitable sites
Not all websites are suitable for SEO Copywriting. Many simply don't have sufficient text on their pages, and adding text would spoil the design or nature of the sites. Also, some sites that do have sufficient text sometimes don't want to be forced into changing what is written on the pages, just for the sake of the search engines.

# Cost, and the limitation of targeted search terms
SEO Copywriting is a time-consuming process, and professional SEO copywriters are not cheap, so the cost of each page is significant. Since each page can target only one or two search terms, it would usually require a good number of pages to be made over in order to target all the required search terms.

# Tied to a copywriter
What happens when a website owner finds it necessary to alter the text on a page that has been worked on by a professional SEO copywriter? It can't be done without either ruining the costly SEO work and, with it, the page's rankings, or re-hiring a professional copywriter to redo the work once the changes have been made.

# Slipping in the rankings
If a page is successfully optimized by SEO Copywriting, and is ranked in the top 10 search results for its targeted search term, then the optimization was worth the cost. But what happens when someone else decides to optimize a page from a different website for the same search term? If their optimization technique is successful, and the page gets into the top 10, the #10 page will slip to #11 - and off the first page of results. Then suppose another website does the same thing...and another...and another. Sooner or later, the successful page will slip from the first page of search results. As soon as people decide to optimize their pages for the chosen search terms, existing top 10 pages are on the way down. Then what?

If the sliding pages were professionally SEO copywritten, there is nothing else that the technique can do for them, or if it can, the whole costly copywriting process must be redone. Adding one or two instances of the target search terms isn't merely a case of typing them in somewhere, because the final text still needs to read well for the site visitors. Again, the website owner is tied to a copywriter.


Summary
SEO Copywriting is good when:

* there are not many search terms to target
* the search terms are on the low to middle end of competitiveness
* money isn't a problem, or if it is your own website
* you don't mind the text on your pages being frozen (if money does matter)

Otherwise, 'search engine friendly' techniques should always be done as a first measure, and real search engine optimization should be done for the search terms for which 'search engine friendly' techniques are unsuccessful.


Example website
http://www.athena-solutions.com/services-readiness-assess.shtml
Athena IT Solutions - SEO copywriting helped this Boston-area consulting firm achieve high rankings for several popular key phrases. The firm offers assessments, and keyword research showed that people search using the phrases data warehouse assessment, data warehouse project assessment, business intelligence assessment, and business intelligence project assessment. Judicious use of the title tag and sub-navigation throughout the site helped the page achieve a first page rank on Google and Yahoo for all four phrases.

Example:
http://www.elderlawmaine.com/ckmain.htm
http://www.athena-solutions.com/services-readiness-assess.shtml
http://www.ecopywriters.com/resources/seo-copywriting-example-350-words.html