Thursday, November 30, 2006

Google's Sandbox is a Myth

By Anthony Gregory
The "Google Sandbox" argument has been around for a very long time; it is much loved and loathed by many people trying to do Search Engine Optimisation. The truth of the matter is Google's Sandbox does not exist. People doing SEO who refer back to the "sandbox" as a reason for websites not performing as they should are either naive or are dishonest.

I have recently tested the alleged "sandbox" theory on a number of sites and at every hurdle the sandbox theory failed.

In one particular test I registered a car insurance domain and slapped up a bunch of content. Now, anyone who has been in SEO circles for any amount of time will tell you that car insurance is a highly competitive industry and Google would have most certainly "sandboxed" the site. However, four months later, the site is on the first page of Google's search results for a highly competitive search term.

The site is also climbing quickly for an ultra-competitive search term and should reach the first page within a month or two.

Search Engine Optimisers who believe in the sandbox theory all too often blame the "sandbox" rather than their own poor work when a website is not ranking highly on Google. Getting a site's on-page optimisation right, and building a sufficient quantity of relevant backlinks, will get any site to rank well.

Google's search algorithm can detect if a site has been selling links, and this will obviously nullify the effect of those links.

Therefore, if you buy a whole lot of links, there is a chance that the majority of them will not pass PageRank to your site and will probably be wasted. Google's algorithm also detects a number of other techniques that search engine spammers use to manipulate the results.

If you want to create a website that performs well in Google's search engine results, create a site with lots of quality content and get relevant sites with high PageRank to link back to your site. It's that easy.

Wednesday, November 29, 2006

SEO Tools - Keyword Density


The keyword density tool is useful for helping webmasters and SEOs achieve their optimum keyword density for a set of key terms.

Keyword density is important because search engines use this information to categorize a site's theme, and to determine which terms the site is relevant to. The perfect keyword density will help achieve higher search engine positions. Keyword density needs to be balanced correctly (too low and you will not get the optimum benefit, too high and your page might get flagged for "keyword spamming").

This tool will analyze your chosen URL and return a table of keyword density values for one-, two-, or three-word key terms. In an attempt to mimic the function of search engine spiders, it will filter out common stop words (since these will probably be ignored by search engines). It will avoid filtering out stop words in the middle of a term, however (for example: "designing with CSS" would go through, even though "with" is a stop word).
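
For reference, the arithmetic behind a tool like this is simple, though individual tools differ on how they count multi-word phrases and stop words. As a rough sketch for a single-word term:

keyword density (%) = (occurrences of the term / total words on the page) x 100

For example, a 500-word page that uses "volleyball" 10 times has a keyword density of 10 / 500 x 100 = 2%.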
 

Monday, November 27, 2006

The Top Ten SEO Factors

by Mark Imre
These are what I believe to be the top 10 most important things (not necessarily in order) that you need in order to get your website found in the search engines.

There are many other factors as well, but if you follow these guidelines, you'll stand a much better chance and be off to a good start.

1. Title Meta Tag
The title tag is what displays as the headline in the SERPs (Search Engine Results Pages). It's also what displays in the top blue band of Internet Explorer when your site is displayed.

The title tag of your website should be easy to read and designed to bring in traffic. By that, I mean that your main keyword phrase should be used toward the beginning of the tag. True, there are websites being found now that do not use the phrase in the title, but the vast majority still do as of this writing.

Don't make the mistake of putting your company name first, unless you are already a household name, like NASCAR or HBO. People are likely searching for what you have to offer, not your name.

Your title tag should be written with a capital letter starting the tag, and followed by all lowercase letters, unless you're using proper nouns. Some people prefer to capitalize every word, too.
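
As a sketch of that advice, reusing the hypothetical "Oregon widgets" business this article uses later (the company name is invented):

<head>
<title>Oregon widgets - custom garden widgets and repair | Acme Widget Co.</title>
</head>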

2. Description Meta Tag
The description tag is the paragraph that people will see when your page comes up in the search results.

Your description tag should be captivating and designed to attract business. It should be easy to read, and compel the reader to act right now and follow your link. Without a description tag, search engines will frequently display the first text on your page. Is yours appropriate as a description of the page?

A proper description tag is what people will see below your title. Make proper use of punctuation, keep it readable, and work in your subject and geographical references.
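
A matching sketch of a description tag for that hypothetical page (the copy is invented):

<meta name="description" content="Oregon widgets for home and garden. Browse hand-made widgets, compare prices, and order online from our Portland showroom.">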

3. Keywords Meta Tag
The importance of Meta keyword tags fluctuates from month to month among different search engines. There is a debate in the SEO community as to whether or not they help at all on certain search engines. In fact, in the summer of 2004 it appeared as if they were losing importance altogether.

However, you'll NEVER be penalized on any search engine for using relevant, targeted keywords in moderation, and they can only help you with most, especially Yahoo.

Avoid stuffing your keyword meta tags with too many keywords.

Just use relevant tags that apply directly to the content of that particular page, and don't overdo it.
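
For the same hypothetical page, a restrained keywords tag might look like this:

<meta name="keywords" content="oregon widgets, garden widgets, widget repair, portland widgets">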

4. Alt Tags
The small yellow box that comes up when your mouse cursor is placed over an image is called the ALT tag. Every relevant image should have an alt tag with your key words or phrases mentioned in the tag.

A proper ALT tag goes after the file name and before the align attribute. The ALT tag is no longer being considered for ranking purposes by some search engines. That said, it still cannot HURT you, and will still help you with some engines. My recommendation is to continue to use them, but be sure to avoid keyword stuffing. Besides, who knows when the pendulum will swing back the other way?
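
Following the ordering the author describes, a sketch with an invented file name:

<img src="blue-widget.jpg" alt="Blue Oregon garden widget" align="left">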

5. Header Tags
The text of each page is given more weight by the search engines if you make use of header tags and then use descriptive body text below those headers. Bullet points work well too. It is not enough to merely BOLD or enlarge your text headlines.
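
A sketch of that structure (headings and copy invented):

<h1>Oregon Widgets</h1>
<h2>Hand-made garden widgets from Portland</h2>
<p>Descriptive body text about the widgets goes here, working the keyword phrase in naturally.</p>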

6. Link Text
Search engine spiders cannot read image links the way they read text links.

In addition to having image links or buttons on your web pages, you should have text links at the bottom or elsewhere. The text that the user sees when looking at the link is called the link text. A link that reads "products" does not carry as much weight with the search engines as a link that reads "Oregon widgets". Link text is very important, and is actually one of the most frequently overlooked aspects of web design that I've seen.
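
To make the contrast concrete (the URL is invented):

Weak: <a href="/products.html">products</a>
Better: <a href="/products.html">Oregon widgets</a>

Both links take the user to the same page, but only the second tells the search engine what that page is about.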

7. Site Map
Using a site map not only makes it easy for your users to see the entire structure of your website, but it also makes it easier for the search engines to spider your site. When the search engine spiders come to visit, they will follow all of the text links from your main index page. If one of those links is to a site map, then the spiders will go right to the sitemap, and consequently visit every page you have text linked to from that site map. On the site map page, try to have a sentence or two describing each page, and not just a page of links.
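
A sketch of site map entries in that style (pages and copy invented):

<a href="/widgets.html">Garden Widgets</a> - Browse our full range of hand-made Oregon widgets, with photos and prices.<br>
<a href="/repair.html">Widget Repair</a> - How our repair service works, typical costs, and turnaround times.<br>
<a href="/contact.html">Contact Us</a> - Directions, phone numbers and opening hours.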

8. Relevant Inbound Links
By relevant, I mean similar industry or subject related sites. Right now, no single strategy can get your site ranked higher faster than being linked to by dozens of other relevant websites. It used to be that the quantity of incoming links mattered most, but today, it's much better to have three highly relevant links to you from other popular related websites than 30 links from unrelated low ranked sites. If there are other businesses in your industry that you can trade links with, it will help your site enormously. Link to others, and have them link to you. It's proven, and it works. To see who's linking to you, in Google type the following...
link:yourdomain.com


9. Your Content
Not to be forgotten, of course, is the actual content of your webpage. It must be relevant, helpful information that people want to read. These days, each webpage should be laser focused on one specific product or subject in order to rank highly for that search phrase. The days of writing one webpage to appeal to dozens of search terms are long gone. Ideally, each page should have between 400 and 650 words on it. Too few, and the search engines won't consider it to be relevant enough. Too many words, and the search engine spiders may have a hard time determining the actual subject or focus of the page.

Use your keywords or phrases often, and use them at the beginning of your paragraphs wherever possible. Don't overuse them and make the page sound phony, but don't write a page about a certain subject, and not mention that subject repeatedly either. Reading it out loud to yourself is a great way to judge how natural your text sounds.

Concentrate on writing quality pages that actually appeal to the human reader. Write pages that provide the reader with exactly what they are looking for; that is, information about the exact search phrase they've entered.

10. Avoid Cheating
With all of these tidbits of information, it's tempting to think that you can stuff 100 keywords into your title, or create a page with the phrase "oregon widget company" used 100 times in headers, text links, ALT tags, bullet points, etc., but that cannot help you. In fact, it can penalize you and get your website banned from certain search engines.

Google Ranking Factors - SEO Checklist

There are "over 100 SEO factors" that Google uses to rank pages in the Google search results (SERPs). What are the search engine optimization rules?
Here is the speculation - educated guesses by SEO webmasters on top webmaster forums. Various confirmed and suspected Google Search Engine Optimization (SEO) Rules.

Sunday, November 26, 2006

Yahoo SEO Techniques

Different Search Engine Means Different Algorithm

All search engines have their own algorithms to determine the value, and therefore the positioning, of websites. While the majority of SEO work tends to concentrate on Google because of the sheer weight of searches they receive, it would be foolish to discount or ignore the other major search engines. Yahoo is considered one of the big three along with Google and MSN, and by concentrating a little more time and effort on Yahoo optimization it is quite possible to gain a good amount of traffic. With ultra-competitive keywords it may actually provide an easier way to generate search traffic than gearing all your efforts solely towards Google.

The Most Important Yahoo Optimization Factor

The first and most pertinent point is that Yahoo judges content to be the most important factor in its algorithm. It does still consider inbound links and other factors, but it favours content sites and loves sites that provide keyword-optimized content in volume. While that may make it sound easier than building the huge base of inbound links you would need for Google, Yahoo optimization presents its own challenges and unique quirks that you should consider.

Looking At Keyword Density

Because of the relevance that Yahoo places on the content within your site, the keyword once again becomes a vitally important focus of your research. While Google has been striving to promote sites that use organic content, and webmasters and SEOs have been optimizing at around 2% to 3% keyword density, Yahoo prefers a much greater density level. The danger, of course, is that giving Yahoo what it wants may cause Google to deem your content keyword-stuffed, but there is another difference between the two algorithms that can help counteract this problem.

Using Stems, Inflexions, And Variants Of Keywords

Yahoo is very heavily language based. This means that it is, strictly speaking, more aware of the nuances of written language. It will include synonyms and inflexions of a keyword when considering your keyword density; something that Google does not consider to the same extent. This means it is possible to optimize for both without diminishing your ranking with one or the other.

How To Optimize For Yahoo Without Getting Penalized By Google

Google likes a density of around 2%, and Yahoo likes a density as high as 7% or even 8%. This means you can effectively use four variations of a single keyword or phrase at a density of 2% each. This offers further advantages: with Google you are now gearing your content towards four different keywords at the level it wants, while still providing Yahoo with the much higher overall density it requires. Because you can include plurals and further stems of keywords, you can write in a much more natural tone.
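
As a rough worked example (the phrases are invented, and the extra words in the longer phrases are ignored for simplicity): on a 500-word page, use each of four variants 10 times.

"car insurance" - 10 uses, about 2%
"car insurance quotes" - 10 uses, about 2%
"cheap car insurance" - 10 uses, about 2%
"insuring your car" - 10 uses, about 2%

To Google, each individual phrase sits at a comfortable 2%; to Yahoo, which groups stems and variants together, the combined density is roughly 8%.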

Using The Near Forgotten Meta Tags

One area that a lot of SEO professionals and webmasters alike now tend to overlook is the Meta tag. However, Yahoo appears to still give consideration to the keyword and description tags in particular. This is quite rare among the major search engines, and Google certainly does not look for keywords in your Meta tags. Do not attempt to dupe Yahoo, though: only include keywords that genuinely appear on your page and are relevant to your topic.

Regular, Fresh Content Is King

You've probably heard the saying that "content is king" and this is even truer when considering Yahoo optimization. The more content you provide the better. This may mean making regular additions to your site but it will generate the kind of results you are looking for. Blogs are also a very good way to continue adding relevant content to your site that Yahoo will smile down on.

The Lazy Yahoo Bot

Compared to other search engine spiders, the Yahoo bot is a comparatively lazy animal. It doesn't crawl as often as other bots, and it certainly doesn't crawl as deep into your site to find and index all of your pages. This means you should pay extra attention to creating a legible sitemap and keeping it updated as regularly as possible. Yahoo has a sitemap submission feature similar to Google's, and using it is strongly recommended to ensure that Yahoo stays on top of the structure of your site and ranks you accordingly.

Inbound Links And Controlling Them Yourself

Inbound links are still important to Yahoo, but again a lot of emphasis is placed on content. Textual relevance seems to be one of the most important factors so having control over your inbound links and being able to determine the pages where they appear and the anchor text of each is important. Perhaps the best way to generate inbound links for Yahoo optimization is to use the article directories to your benefit.

Yahoo Optimization Conclusion

Google may be the search engine that everyone talks about and optimizes for, but ignoring Yahoo would be foolish. This is especially true because, while the Yahoo algorithm is quite different from Google's and others, it is still quite easy to optimize for both. The most important factors to remember are: use relevant Meta tags on every single page of your site; include as much content as possible and update your site with new content as often as possible; and maintain a sitemap both on your website and through the Yahoo sitemaps function.

About the author: Matt Jackson, WebWiseWords, specializes in SEO writing geared toward any search engine or any SEO guidelines. WebWiseWords can create website content pages, articles, blog entries, and more that use search engine optimization rules to help promote your site with all of the major search engines, not just Google.


Saturday, November 25, 2006

SEO Resources



Articles

About Link building

From Matt Cutts

Sets of articles

Cre8asite Forum threads

Blogs

Feel free to pick blogs from this hefty list of SEM blogs and another blog list from Rand of SEOmoz.

How to Get Free Traffic for Your Web Site


by Kevin Sinclair


Getting traffic to your web site is the most important task for a webmaster. Though paying for traffic is one option, it might not be profitable, especially if the web site does not have a direct source of revenue via the sale of products. In such cases, getting free traffic to the web site plays an important role. There are many different ways to generate free traffic. Some of the methods you can use are listed below:

1. Writing articles and submitting them to article directories. Almost all article directories provide a form field to include your copyright notice and the resource box. You should use the resource box not just to talk about yourself but also to entice readers looking for more information. The resource box for your articles should offer a freebie e-book or a report, or even an attention-grabbing headline, to attract readers to click the link and visit your web site.

You can also end the article with a question and a link to your web site to find the answer. Since webmasters looking for free content for their sites also use the articles you publish in article directories, this creates a viral effect, with your site getting traffic from varied sources. You also gain credibility as an expert: your articles display your knowledge and show readers how much you know about a specific topic.

2. Creating and distributing a free report on a specific subject also helps to get free traffic without much effort. You should include a link to your site, or a link to a squeeze page to capture readers' names and emails. Having a link to a squeeze page automatically builds your own list of readers who can be contacted later with affiliate programs on related subjects. You should send your paid traffic to download this free report and allow them to redistribute it. Giving the report away attracts readers, and allowing them to distribute it freely ensures that your web site keeps getting free traffic long after the report was first distributed.

3. You should regularly post on forums related to your site. This is perhaps the easiest and quickest way to get free traffic. You should use these forums to make useful and interesting posts, and include a signature file that allows readers to click through to your site. If you are not an expert in any niche and are unable to have a discussion around a topic, you can always post questions. Asking for help will get your site noticed and get you some good incoming links.

4. Another option to explore for free traffic is to find link partners and exchange links. This will not only get you direct click-through traffic from other sites, but it will also get you search engine traffic once those sites are indexed and your site is given credit for the inbound links.

5. If you maintain a blog on your site, submitting your blog feed URL to various RSS sites also helps your site gain exposure quickly. You should consider submitting feeds or sites to blog directories with a high PR to make sure you are getting quality incoming links, but do not ignore sites with low PR in the later stages either: if those web sites "make it big", you do not want to be left out. You should not, however, submit to millions of web sites or directories using some piece of software; mass submission may just work against your site.

6. You should look at submitting your web site to social bookmarking sites like Squidoo, del.icio.us, etc. This helps in getting good exposure for your web site.

Whatever way you choose to get visitors to your site, be careful to do it naturally, and never stop the hard work on your web site itself. After all, patience really pays!

Thursday, November 23, 2006

Frequent errors in search engine optimization

by Dragos Diaconescu

The most common errors in web optimization involve the following aspects:
Ignoring 'title' tags
This is one of the most frequent errors committed, with a direct negative effect on the positioning of your site in a search engine. Most search engines consider the 'title' tag very important because the information entered there is what is presented in the search results. Therefore, try to put in the title the keywords you consider most relevant to your site.
Irrelevant keywords
A certain category of webmasters use irrelevant keywords meant to increase traffic to the site. If you want to develop an online business, the keywords you use must be representative of your field of activity.
Spamming
This is an unethical method in which the same keywords are repeatedly used in the "title" tag, the meta tags and the content. It is used to improve the position of a site artificially, but the search engines detect these techniques and diminish the ranking of such sites. Therefore, do not use too many keywords.
Invisible text and links
This is a method that tries to mislead the search engines. Web pages are loaded with links and keywords which are hidden from the user but visible to search engines. The problem is that search engines now have the capacity to detect this technique, and its use leads, without exception, to the exclusion of your site.
Links to and from useless sites
It is well known that a high number of links leading to your site increases its importance to search engines. However, not every link that leads to your site helps you. Always try to obtain links only from sites that have a good ranking.

Errors in the HTML code
If your site has errors in its HTML code it will not be ranked well by the search engines, even if it has optimized content. The loading speed will be low and there will be incompatibilities with certain internet browsers. The best way to avoid this situation is to validate the HTML code before submitting the site to search engines.

Using non-canonical URLs

During the optimization process, make sure that every URL on your pages is canonical. This will have a positive influence on the visibility of your site in a search engine.
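
For illustration, all of the following typically resolve to the same page (example.com is a placeholder):

http://example.com/
http://www.example.com/
http://www.example.com/index.html

Pick one form - for instance http://www.example.com/ - and use it consistently in your internal links and when requesting links from other sites.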

Excessive graphics on a page

Search engines read your site much the way a text browser does. A huge amount of graphics does not lead to good site positioning: it will negatively influence the download speed of the site and will not attract visitors. Try to use keyword-rich content instead of a great number of images. Also, use ALT text to describe the images you do use.

Wednesday, November 22, 2006

Duplicate Content Penalty - How to Lose Google Ranking Fast!

by Joe Duchesne
 

Duplicate content penalty. Ever heard of it? This penalty is applied by Google and possibly other search engines when content found on your website is largely the same as what is found elsewhere on your site or on other websites across the internet.

Search engine spam has been common ever since search engines were first invented. Search engine spam describes the practice of making changes to your website that gets you listed high in search engines at the expense of readability by humans. Years ago, you could get ranked high on a search term simply by repeating it as many times as possible in a document. The primitive search engines of the past ranked the importance of a keyword simply by counting the number of times a term appeared on a page. Today's search engines are much more complex.

Google has been waging war against all kinds of search engine spam, and especially against duplicate content in all its forms. There are three main types of duplicate content that Google is concerned about.

The first is a website that simply lists the very same page hundreds or thousands of times with simply a few words changed. This is usually done to attain high ranking on a wide range of keywords. It is most often used to get ranked high on a whole bunch of keywords unrelated to your website but can sometimes be done by a site that is on topic but simply offering duplicate content.

The second type of duplicate content that Google is concerned about revolves around affiliate programs. It has been common practice for high traffic websites to establish an affiliate program. Affiliate programs themselves don't worry Google. What it doesn't like, though, is for an affiliate program to take a template and then offer it to its base of affiliates to use. Some of the higher traffic websites end up with thousands upon thousands of duplicate websites all promoting the very same things and, according to Google, not offering any real value to the internet community. A business offering this type of cookie-cutter website can easily find itself de-listed by Google, as happened to Template Monster a while back.

The third type of duplicate content is simply not included in the Google index: content that is found elsewhere on the internet at large. Google and the other major search engines are interested in gathering and cataloging as much quality, unique content as possible for human consumption. To this end, they look to minimize the amount of duplicate content they allow in their index. This is why creating a new website and simply filling it with third-party content will rarely, if ever, result in high rankings in the Google index.

The solution? Don't rely on duplicate content as your main method of driving traffic to your site. Should you avoid all duplicate content? Of course not. What kind of duplicate content is acceptable? Answering this question is easily another article in itself.


Search Engine Ranking Positions


Search engine marketing requires elements of your web site to be optimized so your site gains popularity with the search engines. There are two ways to do this: pay-per-click or naturally.



Pay-per-click search engine marketing requires you to bid and pay for specific search engine rankings. This can be advantageous in that you can quickly achieve a high response rate, which is helpful for testing ad copy and response rates. However, it is disadvantageous because you have to pay for every single click. Natural search engine rankings, on the other hand, take longer to achieve and cost less, but they are never 100% free: you still need a budget to build your backlinks and market your site.



How do you do this? How do you know what the search engines are looking for? Some top predictors include content in relation to keywords, meta keyword and meta description tags, your site's title, and its link popularity. Although meta tags are not used by most search engines anymore, it does not hurt to include them in your header. You should know that achieving high search engine rankings for more than a few words on any given web page is very difficult. This is why some sites grow so big: different pages are created for each keyword, and the site then has to optimize those pages for the specific keyword they are trying to achieve search engine rankings for.



Other than that, focus on one main keyword per page. If, for example, you own a site about "baseball cards", have subpages that target phrases like "80's baseball cards" and "Babe Ruth baseball cards". It is better to target a few extra keywords on your subpages instead of targeting just "baseball cards": those keywords are less competitive and will help you gain better rankings.




Optimizing a web page also requires that the writing on the page have between 3% and 5% keyword density. This is the percentage of times your keyword appears compared to the other words used. So, if you are creating a site about "volleyball", you will need to make sure that the term "volleyball" is included in your page's title, keywords and description. Then, for every 100 words you write, the keyword "volleyball" will need to be used at least three times. Also make sure you use header tags around your targeted keywords, and take the extra step to add alt text to your images; these keywords are usually indexed by search engines.
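
Putting the pieces of that paragraph together, a minimal sketch of such a page might look like this (all names and copy invented):

<head>
<title>Volleyball Drills and Training Tips</title>
<meta name="keywords" content="volleyball, volleyball drills, volleyball training">
<meta name="description" content="Volleyball drills and training tips for players and coaches.">
</head>
<body>
<h1>Volleyball Training</h1>
<p>Body text that works the word volleyball in naturally, roughly three times per hundred words...</p>
<img src="spike.jpg" alt="Volleyball player spiking the ball">
</body>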



One very important thing to do when you are building a strong natural search engine ranking is to work on increasing your link popularity (the number of web sites that link to your site). This can be done by adding your site to directories (you can even pay someone to do this for you), by trading links with other sites with the same or related content as yours, and by writing articles and submitting them to the various article directories that allow you to place your link in your profile.

The Most Common SEO Mistakes Big Brands Commit

Posted by randfish

In the world of big brands and sites, SEO practices become part of a laundry list of developer tasks, often far beneath the threshold of serious attention. While folks in our industry have learned the cost of ignoring the search engines' guidelines for crawling & ranking, in the world of the Fortune 1000 there are hundreds of disbelievers. Thus, it's a great time to re-visit some common SEO mistakes.

  1. Un-Spiderable Navigation
    From Flash links to Javascript calls to drop-downs and search box interfaces, there are dozens of sites that fall victim to a lack of crawling due to their spider un-friendliness.
  2. Disregard for Relevant Keywords
    Out of the Fortune 500, I'd estimate that only a scant few dozen are actually implementing proper keyword research and targeting - the rest leave it to a "creative ad writer" to determine page content and title tags.
  3. Flash & Image-Based Content
    In addition to navigation, the content that's most critical to search engines is frustratingly hidden in files that spiders can't see. Despite the promises from years ago that engines would eventually be able to spider Flash content (or read text in images), it seems we're still many years away.
  4. URL Canonicalization Problems
    With "print friendly" versions, different navigation paths in the URLs leading to the same page, and plain-old duplication for the heck of it, "canonical content" is going underappreciated.
  5. Content Distribution & Partnerships
    Along with canonical issues on their own sites, many large owners of web content license it out to dozens (sometimes hundreds) of sources. The only thing more damaging than having six versions of content on your site is having six versions of it on six other big, powerful sites.
  6. Cookie or Session-Variable Requirements
    Big sites that don't build content access systems for spiders are asking for trouble - if even a spider has to have a cookie drop to pass through, someone else will be getting your traffic.
  7. Controlled-Access to Content
    The NY Times, Economist and Salon.com don't see nearly the link popularity growth of their more generous competitors. Even when you let the spiders through, requiring membership or paid access means that far fewer visitors will bother to link.
  8. Multiple Site Creation
    Rather than launch projects behind their root domain, big companies seem to take pride in releasing 6 new websites every time their ad agency changes the campaign slogan. Somebody's never heard of the sandbox...

One of the most fascinating people I met at the recent Pubcon was a "search proxy architect" whose job is to work with big brands' unfriendly sites and create alternate pages for search engines to crawl and index. Basically, it's an advanced form of cloaking that the engines tolerate, largely because they'd rather be able to spider the content from these sites than to have it removed from the index. I had no idea the extent to which this practice is used, but apparently, this "ethical cloaking" is much, much more common than you might think. Sadly, I can't post the examples I know of, but if you've got some, feel free to share in the comments.

So, next time someone asks you about whether cloaking is white-hat or black-hat you can tell them... "it depends."

Cliches in the SEO World

Posted by randfish

Alright, this one's just for fun. We were batting around some cliches in the industry (at conferences, on blog posts, in forums) and thought that a few were worth sharing.

  • Black Hats are British
    More specifically, it seems they all have some relationship to Leeds (born there, live there, from "around there"). Maybe there's something in the water.
  • "Content is King"
    Ugghh... I imagine that most people who've spent 10 minutes in the SEO field are probably sick of hearing this, even if it's true.
  • "I'm 100% White Hat"
    Sure, I buy links, spam the occasional blog, send out cheesy link requests and hide white-on-white text at the bottom of the page, but other than that...
  • "I'm a Nefarious Black Hat"
    My sites make $10 a month on AdSense (in total), and I'm still trying to get my first site banned, but hey - it sure sounds sexier than "white hat."
  • Matt Cutts is a Government Spy
    No. Just no. He's an employee of Google - that's it.
  • Matt Cutts Once Said
    Not always valid in my book. If you don't test it yourself or have data to back up your hypothesis, the "Matt Cutts said" or "GoogleGuy said" argument does not serve as the final answer. Same goes for anyone else - just because Dave Naylor or Greg Boser or that insufferable, yellow-shoed Rand guy said or wrote something doesn't make it gospel.
  • MSN/Yahoo!/Google/Ask Sucks
    They usually "suck" because they're not sending you traffic and then, when you finally get things right on your site, they suddenly "rule."
  • My Site's Been Banned/Penalized
    9/10 times, your site has not been banned or penalized, you're just not getting the links you need to compete and the engines are getting smarter about which ones they count.
  • I Hate Web 2.0
    You probably hate it because it's poorly defined and frustratingly over-applied, not because you actually don't like UGC, RSS feeds, AJAX or shiny, clean web design.
  • Google Reps and PageRank
    In public, Googlers seem to have an obsession with using PageRank when they really mean "advanced link quality/quantity score that has very little relationship to the original algo." You'll hear them say that results go into the supplementary index because of low PageRank (of course, there's plenty of high PR pages in there that just happen to be duplicate/low quality content) or that PageRank is how they determine which source is canonical between different versions of content (I just don't believe they would be naive enough to avoid using other metrics, too).
  • I've Been Thinking of Going White Hat
    One of your big sites got banned and suddenly you're thinking that hard work and unique content and hours of labor are more attractive than 3 hours at the computer every night on your $1K-per-day churn and burns... Why am I skeptical?

From: SEOmoz Daily SEO Blog


Tuesday, November 21, 2006

Creating Sitemaps - The Proper Path for Search Engine Optimization


by Razvan


In this article you will learn the basics of sitemaps: what they are, what they are used for, how to create them, and the fundamentals of Google sitemaps, .html sitemaps and .txt sitemaps. You will learn the main functions that a sitemap can perform and its most popular uses.


In order to have a successful website, it is very important to have all the important elements of web promotion in place, and a sitemap is one of them. If you decide to take your website to the next level, it is imperative to include a sitemap.

By now you may be wondering: what is a sitemap? The answer is quite simple. A sitemap is an aid used to make navigation of a given site easier. Inside it you will find the whole structure of a website, along with its most important links and sections. Basically, it ensures that your visitors know exactly where to go when looking for certain information within your site; otherwise they might get bored and try the next similar website. But the main reason a sitemap is so important is that it helps raise your ranking within search engines, and that means more visitors, more exposure and, overall, greater Internet success.

There are different types of sitemaps: the Google sitemap, the simple .html sitemap, the .txt sitemap, and others. They are used mainly by search engines to locate your site's pages and by your visitors during their surfing.

First of all, let us discuss the Google sitemap. If you have a Google sitemap your site will benefit in many ways. First, the speed with which your website and its sections are added to the Google index will significantly increase. Also, Google will quickly become aware of any changes or updates you make to your site. There is also the very convenient fact that your website will be listed in Google and many people will find it without any problem. More importantly, with a Google sitemap you will be capable of providing your visitors with fresh new content, and all the information they require within your site, in a matter of minutes.

In order to create a Google sitemap you will need to code it in a markup language known as Extensible Markup Language, otherwise referred to as XML. The process is tedious and can be quite time-consuming, especially if your website contains hundreds of sections. The fact that sitemaps are so fiddly to code by hand is the reason many people have opted for another option: a sitemap generator. A sitemap generator is a simple program that will let you create your own personalized sitemap in a matter of minutes.
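
As a sketch, a minimal sitemap file in the XML format Google accepts (the sitemaps.org protocol; the URLs and date are placeholders) looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-11-20</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
  </url>
</urlset>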

If you are ready to create a sitemap, possibly the easiest option is to start with a simple .html sitemap. These popular sitemaps, which have been used for years, will help your visitors find their way around your site. Creating a simple .html sitemap is easy: just make a new page and put in it links to all the other pages of your site. Having a simple .html sitemap will not only help your visitors; it is also extremely helpful for search engine spiders trying to locate specific content within your site.

If you want to create a simple .txt sitemap, just type a list of the different URLs available on your website, one per line, save them in a text file, and you will then be able to submit the list to the search engines.
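
For example, the entire file is just a list like this (placeholder URLs), one URL per line:

http://www.example.com/
http://www.example.com/articles.html
http://www.example.com/contact.html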
