Wednesday, April 05, 2006

SEO Experts India

Why SEO (Search Engine Optimization):

A movie a few years back made famous the concept of building something and having people show up. And, as Hollywood would have it, it worked. But this is the real world. Or at least the real internet world, and things don't happen exactly that way.

To build a web site and have people come to it, you have to complete certain steps. Let's take a look at what is required to build it and have them come. Can they find it once you build it? Do they want it (or what it offers) when they find it? Can they get it once they find it?
First, and the most important, is having a site that people can find. What is required to have a web site that people can find? It’s a combination of search engine rankings, advertising, and linking.

Current statistics show that between 70 and 85% of an average site's traffic comes from search engines. So having excellent search engine rankings is all-important in having people come to your web site. This is attained through search engine optimization and web site design that takes into consideration factors such as keyword density, meta tags, and search engine friendly design. When your site is designed and optimized for great and relevant search engine ranking, you are well on your way to receiving your fair share of that 70%+ traffic to your web site.
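The "keyword density" mentioned above is simply the share of a page's words taken up by a target keyword. A minimal sketch in Python (the page copy and keyword here are hypothetical, for illustration only):

```python
import re
from collections import Counter

def keyword_density(text, keyword):
    """Percentage of the words in `text` that are exactly `keyword`."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return 100.0 * Counter(words)[keyword.lower()] / len(words)

# Hypothetical page copy:
page = ("We sell computers and computer accessories. "
        "Our computers ship with a full warranty.")
density = keyword_density(page, "computers")  # about 15.4% of 13 words
```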
The second important factor in getting people to your web site is your overall advertising effort. You have to advertise where your niche market is "hanging out" on the internet. And you have to advertise with copy and banners high powered enough to get your potential visitors to come visit you. Advertising is a method of driving traffic to your web site that you cannot afford to overlook.

Finally, links play a vital role in getting traffic to your web site. Links on other web sites that your niche market visits drive traffic to your web site while also raising your search engine rankings through a factor known as "link popularity". Links are an inexpensive form of advertising and driving traffic that helps you get more people to come to your web site.
The next factor to getting people to your web site is actually offering what they are looking for or what they need to solve their problems or cure their pain. You have to know who your niche market is. You have to make sure that you create the perception of need within that niche market. And then you have to make sure that what you are offering is meeting that need.
Your niche market may grow and change over time. But you have to know who it is to keep up with them and reach them through the factors mentioned above. When you know who these potential customers are, you can focus on reaching them with your marketing message and getting them to your web site.

Once you capture this niche market, you have to speak to a specific need they have that your product or service will fill. Or you have to create a need within their minds that your product or service will fill. Nobody buys anything that they don't believe that they need.
So make sure that your product or service meets those needs and service it accordingly.
The final item to deal with in this article is making your site shopable. No, this isn't a word you will find in Webster's Dictionary. But it should be.

Many internet newbies get to a web site and fall out of the ordering process because it is too difficult for them. You have to provide ease of ordering and a level of service that make shopping with you easier than going to a brick-and-mortar store or another web site. Without doing so, you can get all the traffic in the world out of your potential niche market and still never sell anything. If you don't know whether or not your shopping cart system is easy to use, try walking through placing an order yourself and see what happens. Get your friends to try it as well.

If even one person feels that the ordering process is too difficult, then you need to look into streamlining the process and/or getting an easier-to-use ordering system.
If you analyze each of these steps individually, then you will build a site to which people will come and buy.

Ten SEO (Search Engine Optimization) Tips :

SEO (Search Engine Optimization) STEP 1 :

Don't try to trick the search engines with techniques such as invisible text. This will only make your journey a difficult one. Try to work together with thy search engine brother, and HELP them to find your web site's useful content.

SEO (Search Engine Optimization) STEP 2 :

Don't fill thy "meta keywords" with useless words such as "business" if your site is about "computers". Be kind to your search engine neighbor and try not to confuse him. These days, it's virtually impossible to be placed for "business" if your site is about "computers"! Narrow your focus, and you'll soon notice how much harder it is to rank as a "jack of all trades", yet master of none!! ;)

SEO (Search Engine Optimization) STEP 3 :

Don't submit thy site more than ONCE a month! Search engines are very strict about SPAM. If you've been submitting your site every day, chances are that you have been banned by the time you read this article! If you are currently using a search engine submission/optimization provider, like Guaranteed Top-20.com, be sure not to sign up for any other service or submit the site yourself. Doing so will most likely ruin your chances for placement.

SEO (Search Engine Optimization) STEP 4 :

Don't repeat keywords right next to each other! Make sure to include each keyword only ONE TIME! Too many people make the mistake of repeating keywords over and over again.
For Example: "computers, business, computers, components, computers"
This is a HUGE mistake, and will only make sure that you end up at the BOTTOM of the list!

SEO (Search Engine Optimization) STEP 5 :

Don't use the same keyword list, and "title", for every page! Each one of your pages can get placed for a different keyword. Don't "sell your pages short". Besides, one page on your site might be seen as more relevant for a particular keyword when compared to another.

SEO (Search Engine Optimization) STEP 6 :

Don't submit dynamic pages such as ASP or Cold Fusion pages, because most search engine spiders can only read basic HTML. If your site was developed in a language other than HTML, make sure that there's AT LEAST SOME static content for the spider to read.

SEO (Search Engine Optimization) STEP 7 :

Write relevant, useful content that pertains to your web site and its keywords. Whatever you do, make sure that writing educational, informational, and useful content is your FIRST priority. Search engines cannot read your pretty web design, graphics, or FLASH. The only thing that is of use to them is your content, and the code that verifies what each page is about ("meta keywords", "description", "title", "alt tags", and the list goes on...).
Be Specific!

Tell your users what each page is about, and include the keywords a few times within the text. Do not repeat your keywords excessively, but make sure to include them!

SEO (Search Engine Optimization) STEP 8 :

Make all pages accessible to the search engine spiders within 1 or 2 levels of the home page. Even better, if you make a web page with a list of every page on your site (kind of like a "site map"), then the search engine spiders have a way to find all of your content!!
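The "1 or 2 levels from the home page" rule can be checked mechanically: treat the site as a link graph and measure click depth with a breadth-first search. A small sketch, using a made-up link graph (note how the "site map" page pulls deep pages within reach):

```python
from collections import deque

# Hypothetical link graph: each page lists the pages it links to.
links = {
    "/":                 ["/products", "/about", "/sitemap"],
    "/products":         ["/products/widgets"],
    "/about":            [],
    "/sitemap":          ["/products", "/about", "/products/widgets", "/contact"],
    "/products/widgets": [],
    "/contact":          [],
}

def click_depths(start="/"):
    """Breadth-first search: minimum clicks from `start` to every reachable page."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

depths = click_depths()
too_deep = [p for p, d in depths.items() if d > 2]  # pages spiders may miss
```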

SEO (Search Engine Optimization) STEP 9 :

Attempt to get reciprocal links between your site and other sites in the same industry. This should be easy, as it will benefit both your site AND theirs. Link popularity is one way in which some search engines judge what your site is really about. Web sites with more "link popularity" can place higher on some major search engines.

SEO (Search Engine Optimization) STEP 10 :

Be patient! You will not achieve top placement overnight. You might not even get placed at ALL for months! The major search engines can get backlogged or full, and sometimes will not even accept your submission. One thing that I've learned is that successful search engine optimization is a time consuming and laborious task. You must be patient, and you will learn the most through trial and error.

13 comments:

Unknown said...

Have you noticed anything different with Google lately? The
Webmaster community certainly has, and if recent talk on several
search engine optimization (SEO) forums is an indicator,
Webmasters are very frustrated. For approximately two years
Google has introduced a series of algorithm and filter changes
that have led to unpredictable search engine results, and many
clean (non-spam) websites have been dropped from the rankings.
Google updates used to be monthly, and then quarterly. Now with
so many servers, there seems to be several different search
engine results rolling through the servers at any time during a
quarter. Part of this is the recent Big Daddy update, which is a
Google infrastructure update as much as an algorithm update. We
believe Big Daddy is using a 64 bit architecture. Pages seem to
go from a first page ranking to a spot on the 100th page, or
worse yet to the Supplemental index. Google algorithm changes
started in November 2003 with the Florida update, which now
ranks as a legendary event in the Webmaster community. Then came
updates named Austin, Brandy, Bourbon, and Jagger. Now we are
dealing with the BigDaddy!

The algorithm problems seem to fall into 4 categories. There are
canonical issues, duplicate content issues, the Sandbox, and
supplemental page issues.

1. Canonical Issues: These occur when a search engine
treats www.yourdomain.com, yourdomain.com, and yourdomain.com/index.html
all as different websites. When Google does this, it then flags
the different copies as duplicate content and penalizes them.
Also, if the site not penalized is http://yourdomain.com, but
all of the websites link to your website using www.yourdomain.com,
then the version left in the index will have no ranking. These
are basic issues that other major search engines, such as Yahoo
and MSN, have no problem dealing with. Google is possibly the
greatest search engine in the world (ranking themselves as a 10
on a scale of 1 to 10). They provide tremendous results for a
wide range of topics, and yet they cannot get some basic indexing
issues resolved.
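The canonical problem described above (www vs. non-www vs. /index.html) is exactly why many webmasters normalize every URL to one form and redirect the rest to it. A rough Python sketch of that normalization, assuming the site standardizes on the www form:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Collapse the common duplicate forms of a homepage URL
    (missing scheme, missing www, trailing /index.html) into one form."""
    if "://" not in url:
        url = "http://" + url
    scheme, host, path, query, frag = urlsplit(url)
    if not host.startswith("www."):
        host = "www." + host
    if path in ("", "/index.html"):
        path = "/"
    return urlunsplit((scheme, host, path, query, frag))

# The three duplicate forms from the discussion above:
forms = ["yourdomain.com",
         "http://yourdomain.com/index.html",
         "http://www.yourdomain.com/"]
canonical = {canonicalize(u) for u in forms}  # collapses to a single URL
```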

2. The Sandbox: This has become one of the legends of
the search engine world. It appears that websites, or links to them,
are "sandboxed" for a period before they are given full rank in the
index, kind of like a maturing time. Some even think it is only
applied to a set of competitive keywords, because they were the
ones being manipulated the most. The Sandbox's existence is
debated, and Google has never officially confirmed it. The
hypothesis behind the Sandbox is that Google knows that someone
cannot create a 100,000 page website overnight, so they have
implemented a type of time penalty for new links and sites
before fully making the index.

3. Duplicate Content Issues: These have become a major
issue on the Internet. Because web pages drive search engine rankings,
black hat SEOs (search engine optimizers) started duplicating
entire sites' content under their own domain name, thereby
instantly producing a ton of web pages (an example of this would
be to download an Encyclopedia onto your website). As a result
of this abuse, Google aggressively attacked duplicate content
abusers with their algorithm updates. But in the process they
knocked out many legitimate sites as collateral damage. One
example occurs when someone scrapes your website. Google sees
both sites and may determine the legitimate one to be the
duplicate. About the only thing a Webmaster can do is track down
these sites as they are scraped, and submit a spam report to
Google. Another big issue with duplicate content is that there
are a lot of legitimate uses of duplicate content. News feeds
are the most obvious example. A news story is covered by many
websites because it is content the viewers want. Any filter will
inevitably catch some legitimate uses.
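A duplicate-content filter of the kind described here can be sketched with word shingles and Jaccard similarity. This is a classic near-duplicate-detection technique, not necessarily what Google actually runs, and the two page texts are made up:

```python
def shingles(text, k=3):
    """Set of overlapping k-word windows ("shingles") from the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity: |intersection| / |union| of two shingle sets."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Two made-up pages, one a near-copy of the other:
original = "the quick brown fox jumps over the lazy dog near the river bank"
scraped = "the quick brown fox jumps over the lazy dog near the river bend"
sim = jaccard(shingles(original), shingles(scraped))  # high similarity
```

Note that the similarity score alone cannot tell the filter which page is the original, which is exactly the "legitimate site flagged as the duplicate" problem described above.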

4. Supplemental Page Issues: Webmasters fondly refer to
this as Supplemental Hell. This issue has been reported on places like
WebmasterWorld for over a year, but a major shake up around
February 23rd has led to a huge outcry from the Webmaster
community. This recent shakeup was part of the ongoing BigDaddy
rollout that should finish this month. This issue is still
unclear, but here is what we know. Google has 2 indexes: the
Main index that you get when you search, and the Supplemental
index that contains pages that are old, no longer active, have
received errors, etc. The Supplemental index is a type of
graveyard where web pages go when they are no longer deemed
active. No one disputes the need for a Supplemental index. The
problem, though, is that active, recent, and clean pages have
been showing up in the Supplemental index. Like a dungeon, once
they go in, they rarely come out. This issue has been reported
with a low noise level for over a year, but the recent February
upset has led to a lot of discussion around it. There is not a
lot we know about this issue, and no one can seem to find a
common cause leading to it.

Google updates were once fairly predictable, with monthly
updates that Webmasters anticipated with both joy and angst.
Google followed a well published algorithm that gave each
website a Page Rank, which is a number given to each webpage
based on the number and rank of other web pages pointing to it.
When someone searches on a term, all of the web pages deemed
relevant are then ordered by their Page Rank.
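The published Page Rank idea the comment describes can be sketched as power iteration over a link graph. This toy version (three made-up pages, the commonly cited 0.85 damping factor) illustrates the original algorithm only, not Google's production ranking:

```python
def pagerank(links, d=0.85, iterations=50):
    """Power-iteration PageRank over a dict {page: [pages it links to]}."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - d) / n for p in pages}
        for page, outgoing in links.items():
            if outgoing:
                share = rank[page] / len(outgoing)
                for target in outgoing:
                    new[target] += d * share
            else:  # dangling page: spread its rank evenly
                for p in pages:
                    new[p] += d * rank[page] / n
        rank = new
    return rank

# Toy graph: A links to B and C, B links to C, C links back to A.
ranks = pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
```

Page C ends up with the highest rank because it receives links from both A and B, which matches the intuition that more (and better-ranked) incoming links mean a higher Page Rank.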

Google uses a number of factors such as keyword density, page
titles, meta tags, and header tags to determine which pages are
relevant. This original algorithm favored incoming links and the
anchor text of them. The more links you got with an anchor text,
the better you ranked for that keyword. As Google gained the
bulk of internet searches in the early part of the decade,
ranking well in their engine became highly coveted. Add to this
the release of Google's Adsense program, and it became very
lucrative. If a website could rank high for a popular keyword,
they could run Google ads under Adsense and split the revenue
with Google!

This combination led to an avalanche of SEO'ing like the
Webmaster world had never seen. The whole nature of links between
websites changed. Websites used to link to one another because
it was good information for their visitors. But now that link to
another website could reduce your search engine rankings, and if
it is a link to a competitor, it might boost his. In Google's
algorithm, links coming into your website boost the site's Page
Rank (PR), while links from your web pages to other sites reduce
your PR. People started creating link farms, doing reciprocal
link partnerships, and buying/selling links. Webmasters started
linking to each other for mutual ranking help or money, instead
of quality content for their visitors. This also led to the
wholesale scraping of websites. Black hat SEOs will take the
whole content of a website, put Google's ad on it, get a few
high powered incoming links, and the next thing you know they
are ranking high in Google and generating revenue from Google's
Adsense without providing any unique website content.

Worse yet, as Google tries to go after this duplicate content,
they sometimes get the real company instead of the scraper. This
is all part of the cat and mouse game that has become the Google
algorithm. Once Google realized the manipulation that was
happening, they decided to aggressively alter their algorithms
to prevent it. After all, their goal is to find the most
relevant results for their searchers. At the same time, they
also faced huge growth with the internet explosion. This has led
to a period of unstable updates, causing many top ranking
websites to disappear while many spam and scraped websites
remain. In spite of Google's efforts, every change seems to
catch more quality websites. Many spam sites and websites that
violate Google's guidelines are caught, but there is an endless
tide of more spam websites taking their place.

Some people might believe that this is not a problem. Google is
there to provide the best relevant listings for what people are
searching on, and for the most part the end user has not noticed
an issue with Google's listings. If they only drop thousands of
listings out of millions, then the results are still very good.
These problems may not be affecting Google's bottom line now,
but having a search engine that cannot be evolved without
producing unintended results will hurt them over time in several
ways.

First, as the competition from MSN and Yahoo grows, having
the best results will no longer be a given, and these drops in
quality listings will hurt them. Next, to stay competitive
Google will need to continue to change their algorithms. This
will be harder if they cannot make changes without producing
unintended results. Finally, having the Webmaster community lose
faith in them will make them vulnerable to competition.
Webmasters provide Google with two things. They are the word of
mouth experts. Also, they run the websites that use Google's
Adsense program. Unlike other monopolies, it is easy to switch
search engines. People might also criticize Webmasters for
relying on a business model that requires free search engine
traffic. Fluctuations in ranking are part of the internet
business, and most Webmasters realize this. Webmasters are
simply asking Google to fix bugs that cause unintended issues
with their sites.

Most Webmasters may blame ranking losses on Google and their
bugs. But the truth is that many Webmasters do violate some of
the guidelines that Google lays out. Most consider it harmless
to bend the rules a little, and assume this is not the reason
their websites have issues. In some cases, though, Google is
right and has just tweaked its algorithm in the right direction.
Here is an example: Google seems to be watching the
incoming links to your site to make sure they don't have the same anchor
text (this is the text used in the link on the website linking
to you). If too many links use the same anchor text, Google
discounts these links. This was originally done by some people
to inflate their rankings. Other people did it because one
anchor text usually makes sense. This is not really a black hat
SEO trick, and it is not called out in Google's guidelines, but
it has caused some websites to lose rank.
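The anchor-text filter described here would only need one simple statistic: the fraction of a site's incoming links that share a single anchor text. A hypothetical sketch (the link data is invented, and the 80% figure is just an example of a lopsided profile):

```python
from collections import Counter

def top_anchor_share(anchors):
    """Fraction of incoming links that use the single most common anchor text."""
    counts = Counter(a.lower().strip() for a in anchors)
    return counts.most_common(1)[0][1] / len(anchors)

# Hypothetical anchor texts of links pointing at one site:
incoming = ["cheap widgets"] * 8 + ["Acme Widget Co", "widgets online"]
share = top_anchor_share(incoming)  # 0.8: the lopsided pattern described above
```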

Webmasters realize that Google needs to fight spam and black
hat SEO manipulation. And to their credit, there is a Google
Engineer named Matt Cutts who has a Blog site and participates
in SEO forums to assist Webmasters. But given the revenue impact
that Google rankings have on companies, Webmasters would like to
see even more communication around the known issues, and help
with identifying future algorithm issues. No one expects Google
to reveal their algorithm or what changes they are making. Rumor
on the forum boards speculates that Google is currently looking
at items like the age of the domain name, websites on the same
IP, and frequency of fresh content. It would be nice from a
Webmaster standpoint to be able to report potential bugs to
Google, and get a response. It is in Google's best interest to
have a bug free algorithm. This will in turn provide the best
search engine results for everyone.

Unknown said...

As search gets smarter, tricks get cheaper and we get nearer to coming
full circle to an original goal of Internet search: that content is,
indeed, king. It cuts across the grain of some notions we've held in
the industry for some time, that there are shortcuts aplenty in the
hunt for better search rankings. Search experts at ICMediaDirect.com
dissuade clients from thinking that they can or should fool Google and
Yahoo. Instead, we show them solid methodology that works. Above all
else, your website needs superior content that anchors your search
engine optimization strategy.

Algorithmic innovation, much of it by Google's engineers, has
thankfully rendered search engine spamming a fruitless endeavor. This
has made the search engine a far greater tool for us to use and, in
turn, has sparked the online advertising industry.

Nevertheless, Google and other search engines closely guard their
Search secrets. Their constant tweaking compels smart webmasters to not
rely on keyword strategy alone. Your site must be prepared to weather
the ensuing fallout from any change in web-crawler browsing instigated
by their engineers in those far-off, unseen laboratories. Any quest for
keyword perfection is folly by nature, since that success is fleeting.

Search engines take link relevancy into account when ranking webpages.
As a search engine optimizer it's important for you to keep your "white
hat" on. This means, for the uninitiated, that instead of spamdexing or
deceiving the search engine through trickery, you employ ethical means
to achieve ranking. Crime doesn't pay in the search world because
deceptive sites get shown the door and no site can afford to pay that
price.

Google recently caught BMW's websites gaming the system and the
company's sites were essentially blacklisted from the results listings.
Did BMW not have enough confidence in the quality of their cars and
motorcycles or was their online optimizer getting too cute? No matter,
it's BMW's problem; one nobody needs to have.

Your white hat optimization entails shoring up whatever online
relationships you can with other sites for link exchange, promoting
your site with new ones for more link exchange, and joining web
directories. This is not an instantaneous process, like most SEO work,
but it is indispensable for your long-term online planning.

After dashing any notion of SEO trickery from our repertoire we focus
on content. Content requires actual work, anathema, perhaps, to those
seeking quick rewards with no effort, but this work pays off. You're
charged with producing fresh, relevant content that - gulp - someone
might even want to read!

I've yet to hear of a web crawling spider that purchases anything
online. Until this starts happening (hey, we rule out nothing at
ICMediaDirect.com), you should start writing (or rewriting) your
website's content with an ideal reader in mind: your customer. For
instance, if you're marketing rock-climbing gear, sell that helmet,
weave the thrill of the sport into your language and not a language of
sugar and carrots you believe will attract Google or Yahoo. Be genuine.

State the aim of your site's business by using clear and direct
language early in your text, or "up there" on the page. Indirectness or
misleading intros may not only divert the attention of people and cause
them to leave, but the same for web crawlers, too. State your business
early. Remember: simple beats complex in the SEO arena.

The goal is to win people over. It bears repeating that loading text
with a barrage of related keywords that doesn't keep with a natural
flow of language is not going to work, on any level. Did you know that
search engines can pick out the poor structural balance in language?
They actually pick out overly optimized sites. Those flagged sites can
go fish because their ranking is then shot. The lesson here is to work
with the system.

Now your content is written. After having a second (and third) person
read it over to find any inaccuracies or grammar mistakes, you
supplement its structure. It's time to optimize the text with keywords.
But not too much. While it's okay for a reader to notice repeats of a
word throughout the written content, it's too much when it distracts
from the message. Use keywords at the beginning, end, and in header
text. Review your text and see where you might re-work some more
keywords. Be creative and thorough in these efforts and do not
exceed a 20:1 prose-to-keyword ratio within the copy.
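The suggested 20:1 prose-to-keyword ratio is easy to check mechanically. A small sketch (the copy and keyword set are made up for illustration):

```python
def prose_to_keyword_ratio(text, keywords):
    """Words of copy per keyword occurrence (20.0 means a 20:1 ratio)."""
    words = text.lower().split()
    hits = sum(1 for w in words if w.strip(".,!?") in keywords)
    return len(words) / hits if hits else float("inf")

# Made-up copy: 40 words containing the keyword twice, i.e. exactly 20:1.
copy = " ".join(["word"] * 38 + ["widgets", "widgets."])
ratio = prose_to_keyword_ratio(copy, {"widgets"})
```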

Search algorithms scan text, not pictures. Therefore, it's important to
front-load relevant text onto your homepage, even to a point where you
might think there's too much text. Don't worry about that. Remember
that first time visitors to your webpage who come via search engine are
arriving because they're looking for something specific. Your homepage
isn't a billboard attracting the attention of motorists. Your objective
is to "close" on those already interested. If they're not ready to read
a couple of paragraphs, they're not ready to buy.

This isn't to say that images are unimportant. They can actually be a
tremendous search resource. You should slip in keywords when filling
out your image's alternate text description. Make them fit the keywords
of your site and try to have the description of the pictures match the
keywords, as well. This way the "Image Search" functions of search engines
can send you any number of interested visitors. This is an area that is
a) not nearly as optimized as text and b) increasing in aggregate
search numbers. A well optimized image selection for your site could
pay off serious dividends, while random descriptions for your imagery
will fritter this chance away.

Spellcheck. Use it, that's what it's there for. There is no excuse for
a poorly written website that smacks of amateurism. Sloppy efforts tend
to spook page viewers from doing business. If you're not up to writing
articles for whatever reason, you can hire a professional copywriter
for the job. ICMediaDirect.com has them; if need be, one shouldn't be
hard to find.

Any expense taken in regards to your website is a mere pittance when
compared to the disparity in value between a well-written site and a
poorly written one. Website quality is too important for a business to
trifle with. A good web copywriter is well-versed in keyword
utilization and can make your specific directives look natural and
effortless.

Search engines are getting smarter almost daily and continue to level
the playing fields in the process. Subsequently, Search Engine
Optimization has turned away from gamesmanship and towards crafting
quality websites. Those who put in extra effort will be the ones to
reap the long-term SEO benefits and, ultimately, more business.

Unknown said...

8 Best Search Engine Optimisation Resources


1. Search Engine Watch (http://searchenginewatch.com/)

This should be of no surprise to anyone who spends any time in the SEO community. Search Engine Watch, and maybe more importantly, the Search Engine Watch blog is invariably on every professional SEO's daily reading list. Danny Sullivan, the founder of Search Engine Watch, is often referred to as the person who formed the industry. Much of the SEO information that gets repeated from one SEO blog to another starts at SEW.

2. Matt Cutts Blog (http://www.mattcutts.com/blog/)

Matt's relative fame is rooted in the fact that he was one of the first 100 employees at Google, is an engineer in the search spam department, and has become a conduit of information between Google and the SEO community. Matt Cutts' blog often contains information about major updates (most of the Big Daddy update information came directly from Matt Cutts), examples of what not to do with your website, advice on how to rank well with Google, and also the normal day to day life of being a Googler.

Between Matt's blog, the numerous interviews he does with bloggers, the time he freely gives at search conferences, and the fact that he works for Google, Matt is easily the most identifiable, and authoritative, figure in the SEO industry.

3. Mike Grehan's Blog (http://www.mikegrehan.com/)

As you find more and more places with information on SEO, you will find that a lot of the information is duplicated or repackaged in some way, but it is usually the same information. Mike Grehan's blog tends to take a fresh look at SEO from a different angle.

To find out more about Mike, you can view his profile at Clickz, where he is a writer. Mike has written some extremely good articles, including Goodbye, SEO Push. Hello, Pull SEO; A Grand Plan for SEO; and SEO Jargon, Real Beef or Just Baloney?. His blog is not limited to just SEO; like many blogs it also acts as a bit of an online diary (he does A LOT of traveling to conferences), but when he does post on SEO we have always found his approach to be very professional, precise, and backed by the confidence of real experience. Since he serves as an SEO consultant to some very large corporations, we should expect nothing less.

4. Webmaster World and Some Other Forums (http://www.webmasterworld.com)

Forums are tricky – they are a great place to keep a pulse on the SEO industry, a great place to learn, and a great place to network with some very good SEOs. But they are also breeding grounds for bad information. Bad information spreads faster than quality information in SEO, and trying to distinguish on your own what is good and what is bad could lead to disaster. But forums offer what a blog cannot – an interactive community.

Knowing that all information in forums is not the gospel truth, there are a few forums that stand head and shoulders above the rest. Probably the best known would be the Webmaster World forums. Webmaster World is a very well known community with some very prominent participants.

Other forums which rise to the top time and time again would be SEOChat, V7N forums, and the Digital Point forums. Search Engine Roundtable does a good job of keeping a pulse on the search engine marketing forums.

5. Search Engine Guide (http://www.searchengineguide.com)

If you want to get a good mix of links from a variety of resources offering some of the latest information on SEO news without being totally overwhelmed (or simply don't have the time for a forum), then Search Engine Guide would be a good place to start. Search Engine Guide daily offers a nice mix of links and news from various forums, blogs, and SEO news websites. The site has grown steadily, and Robert Clough has done a good job of helping the site grow over the years.

6. A Beginner's Guide to SEO - SEOMoz (http://www.seomoz.org/beginners.php)

The title of this is fairly self-explanatory. If you are feeling overwhelmed by all the search blogs which often-times look at more advanced concepts and are just looking for a good basic guide to SEO, SEOMoz has put together a fantastic guide.

The guide is fairly comprehensive, nicely organized, and available in a variety of formats (very useful if you do not like reading online). Best of all, it is 100% free. There are a lot of books available for purchase, but the basics of SEO are all fairly well-covered in this online guide.

7. SEO Chat - SEO Tools (http://www.seochat.com/seo-tools/)

It is my opinion that all SEO Tools need to be taken with some caution. SEO tools, like forums, can be misleading. Most SEO tools will evaluate an aspect of your website according to what the tool's creator believes to be important aspects of SEO. The problem with this is that no one, other than the engineers who put the search engines together, knows how the search engines work. SEO tools can be useful, but should be used knowing that no single tool will get you a top ranking.

That being said, SEO Chat has a nice section of tools which are freely available. These are pretty much the standard set of SEO tools that you can probably find at a variety of websites. Some of the more interesting and useful tools are the multiple datacenter checks, the URL Rewriting tools, and the Spider Simulator.

8. SEO Book's Tools (http://tools.seobook.com/)

While we are on the subject of SEO Tools, if you are looking for a set of rather non-traditional SEO tools, Aaron Wall over at SEOBook.com has put together a very nice set of tools. These tools are not just your regular “check where you are in the rankings” tools – these dig a bit deeper.

One of the very nice features of SEOBook's tools is that most of them are open source code, which means you could put the code on your website and run the programs from there. Aaron is also the owner of Threadwatch.org which can offer some good information on SEO related issues.

SEO for the Practical Website Owner

Practically speaking, the average website owner is not going to spend a day poring over the latest patent filings by Google. It simply is not reasonable to expect a website owner to concern themselves with SEO at this level of detail.

The truth of the matter, however, is that most website owners can see moderate SEO success by simply keeping a pulse on the SEO industry, picking up the information that is freely flowing through the many, many resources for SEO, and applying it to their website. The science of SEO, trying to figure out, down to the finest detail, how search engines work, is complicated, but search engines have a very simple goal: to present searchers with relevant, up-to-date, quality results.

While a website owner may not want to spend hours every day reading patents or testing various SEO theories on test accounts, knowing the basics of SEO, and keeping up with the trends of the industry by paying attention to some of its finest resources, can be all that a website owner needs.

Unknown said...

Benefits of using a subdomain

What is a subdomain?
A subdomain is just what it sounds like: a sub, or second, level of a domain. A standard domain looks as follows: www.mydomain.com. A subdomain would be http://prefix.mydomain.com. Subdomains do not take on the "www".

All subdomains take the form http://subdomain.maindomain.com

That's simple enough, I think. We have all seen a subdomain before. On to the next question.
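For the curious, the prefix can be pulled apart programmatically. Here is a minimal sketch using Python's standard library; it assumes a simple two-label registered domain like mydomain.com (real suffixes such as .co.uk would need a public-suffix list), and treats "www" as not being a true subdomain, per the above.

```python
from urllib.parse import urlparse

def subdomain_of(url):
    """Return the subdomain prefix of a URL, or '' if there is none.

    Assumes a two-label registered domain (mydomain.com); proper TLD
    handling (e.g. .co.uk) would need a public-suffix list.
    """
    host = urlparse(url).hostname or ""
    labels = host.split(".")
    # Drop the registered domain (last two labels); "www" is not a
    # true subdomain for our purposes here.
    prefix = labels[:-2]
    if prefix == ["www"]:
        return ""
    return ".".join(prefix)

print(subdomain_of("http://www.mydomain.com/page"))     # no subdomain
print(subdomain_of("http://blog.mydomain.com/post/1"))  # "blog"
```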

Is it hard for a subdomain to rank well?

Subdomains rank just as well as regular domains. Having a subdomain doesn't decrease your chances of ranking on search engines. You'll just have to treat each subdomain as a new, individual site. This means work on building PR and links. As with any domain, you don't get a free ride.

What are the benefits to using a subdomain?

Let's say you had a site with many categories. If you were to submit to directories, you could submit each subdomain as its own individual category and not get penalized. The links from the subdomains to the main domain would be looked upon as one-way links from another domain.

What are the drawbacks to using subdomains?

Each subdomain would be looked at by search engines as a new site, which means you will need to do everything you would normally do for a new site. In short, creating back-links, battling the sandbox, and all other problems associated with domain development are still present. If this sounds like too much trouble, you may want to consider creating subfolders within the domain. This would look like http://www.mydomain.com/folder

If a subdomain becomes banned, will it affect the main domain?

The answer is yes. It may get the main domain banned if they are linked to each other. There is proof of main domains being banned, yet I have heard of many instances where they have not; the main domain may escape a ban under certain circumstances. The question shouldn't be whether the main domain can get banned because of a subdomain, but instead why your subdomain would get banned in the first place. If you want to experiment with untested or controversial methods, do so on a new domain.

So should I use a subdomain?

If you can afford the time and money required to develop out each subdomain, and if your main domain would become too convoluted by having too much on it, I'd say yes. If not, save yourself the heartache and extra work. Just make your site easy to navigate, use folders, and you should do just fine.

Unknown said...

The Rules For MSN

Assuming that you are following the right rules, the results you can achieve on MSN can be fast and solid. MSN does not apply the same types of aging delays that the other two engines do; thus, when you change your content, the change in results can be realized as quickly as they reindex your site and pick up your incoming links. This differs greatly from Google and Yahoo! in that those two engines age both domains and links, requiring a longer period of time before the full effects of your efforts are realized.

As an additional note on MSN, users of MSN are 48% more likely to purchase a product or service online than the average Internet user according to a comScore Media report.

So what are the rules for MSN that can help us get top rankings? As with all the major engines, there are two fundamental areas that need to be addressed to attain top rankings. The first is the onsite factors, the second is the offsite. Because they are fundamentally different we will address them separately.

Onsite SEO Factors

The problem with writing an article about the onsite factors is that by the time many of you read this some of the weight these factors hold and the optimal levels noted may well be out-of-date. Thus, rather than listing overly-specific-and-sure-to-change factors we will focus on how to know what the factors are, how to get a handle on what you need to adjust and by how much, and how to predict what will be coming down the road. And so we'll begin:

How To Know What The Factors Are:

Unfortunately, there's no one over at MSN Search calling us up weekly to let us know the specifics of their algorithm; we have to figure it out for ourselves with research, reading, and playing with test sites. From all of this there is only one conclusion that an SEO can make: the details matter. When we're discussing onsite factors this includes:

the content of the page including keyword density
the internal linking structure of the site (how the pages of your site are linked together)
the number of pages in your site and the relevancy of those pages to your main topic and phrases
the use of titles, heading tags and special formats
There are a number of lower-weight factors; however, the ones noted above, if addressed correctly, will have a very significant effect on your rankings, provided the offsite factors noted below are also addressed.

Page Content:

The content of your page must be perfect. What I mean by this is that the content must appeal to both the search engines and your visitors. In order to write properly for the visitors you must be able to write clearly and in language that is both appealing and understandable to your target market. While there is much debate about whether the keyword density of your page is important, I am certainly one who believes that it is. It only makes sense that a part of the algorithm takes into account the use of the keywords on your page. Unfortunately the optimal keyword density changes slightly with each algorithm update and also by site type and field. For this reason it would be virtually impossible for me to give you a density that will work today and forevermore. Instead, you will need a keyword density analysis tool, which you will want to run on your own site as well as the sites in the top 10, to assess what the optimal density is at this time. You may notice a variation in the densities of the top 10. This is due to the other factors, including offsite ones, which can give extra weight to even a poorly optimized site. I recommend getting your site to a keyword density close to the higher end of the top 10, but not excessive. Traditionally this percentage will fall somewhere near 3.5 to 4% for MSN.
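The density calculation itself is simple to sketch. The following is an illustrative Python version (the sample text and the exact counting rules are mine; commercial tools differ in how they handle stop words, markup, and multi-word phrases):

```python
import re

def keyword_density(text, phrase):
    """Percentage of the words in `text` accounted for by occurrences
    of `phrase`. A simplified formula for illustration only."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Slide a window of the phrase length across the word list.
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return 100.0 * hits * n / len(words)

sample = "SEO tips: good SEO starts with content. SEO matters."
print(round(keyword_density(sample, "seo"), 1))  # 3 of 9 words -> 33.3
```

Run the same function over your page and the top-10 pages for your phrase, then compare, as described above.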

Internal Linking Structure:

The way your pages link together tells the search engines what each page is about and also allows them to easily (or not-so-easily) work their way to your internal pages. If your site has an image- or script-based navigation, it is important to also use text links, either in your content, in a footer, or both. Text links are easy for a spider to follow and, perhaps more importantly, they give you the opportunity to tell the spiders what a specific page is about through the anchor text. In the case of footers, they also allow you to add more instances of the targeted phrases outside of your general content area.
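To see roughly what a spider sees in your internal links, you can extract every (href, anchor text) pair with Python's standard-library HTML parser. A minimal sketch (the sample markup is invented for illustration):

```python
from html.parser import HTMLParser

class AnchorExtractor(HTMLParser):
    """Collect (href, anchor text) pairs: roughly what a spider reads
    from your internal linking structure."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
        self._text = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []
    def handle_data(self, data):
        if self._href is not None:  # only collect text inside <a>...</a>
            self._text.append(data)
    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

page = '<p>See our <a href="/services.html">SEO services</a> page.</p>'
parser = AnchorExtractor()
parser.feed(page)
print(parser.links)  # [('/services.html', 'SEO services')]
```

Running this over your own pages quickly shows whether your anchor text actually describes the pages it points to.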

The Number Of Pages & Their Relevancy:

MSN wants to please its visitors. For this reason they want to ensure the highest likelihood that a searcher will find what they need once they get to your site, and so a larger site with unified content will rank higher than a smaller site or a site with varying content topics. (Note: this assumes that all else is equal in regards to the other ranking factors.)

When you are optimizing your site for MSN, be sure to take some time to build quality content. Do a search on your major competitors to see how large their sites are; over time you will want to build yours to the same range through general content creation or the addition of a blog or forum to your site.

Titles, Heading Tags & Special Formats:

Titles are the single most important piece of code on your entire web page, for two reasons. The first is that the title holds a very high level of weight in the algorithm. The second is that it is your window to the world. When someone runs a search, the results will generally show your page title. This means that a human visitor has to be drawn to click on your title, or ranking your site is a futile effort (this isn't about bragging rights, it's about return on investment).

Heading tags are used to specify significant portions of content. The most commonly used is the H1 tag, though there are obviously others (or they wouldn't bother numbering them, would they?). The H1 tag is given a significant amount of weight in the algorithm provided that it is not abused through overuse (it should only be used once per page). Try to keep your headings short and sweet. They're there to tell your visitor what the page is about, not your whole site.
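The two checks above (a sensible title, a single H1) are easy to automate. A rough sketch; the 65-character title cutoff is my own illustrative threshold, not anything MSN has published:

```python
import re

def audit_title_and_h1(html):
    """Flag two common on-page issues: a missing or overlong <title>,
    and more or fewer than one <h1>. Thresholds are illustrative."""
    problems = []
    m = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    title = m.group(1).strip() if m else ""
    if not title:
        problems.append("missing title")
    elif len(title) > 65:
        problems.append("title longer than ~65 characters")
    h1_count = len(re.findall(r"<h1[\s>]", html, re.I))
    if h1_count != 1:
        problems.append(f"expected one h1, found {h1_count}")
    return problems

page = "<title>Widgets | Acme</title><h1>Widgets</h1><h1>Also widgets</h1>"
print(audit_title_and_h1(page))  # ['expected one h1, found 2']
```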

Special formats are, for the purpose of this article, any text formatting that sets a group of characters or words apart from the others. This includes such things as anchor text, bold, italics, different font colors, etc. When you set content apart using special formats, MSN will read this as a part of your content that you want to draw attention to and which you obviously want your visitors to see. This will increase the weight of that content. Now don't go making all your keywords bold or the like; simply make sure to use special formats properly. Inline text links (links in the body content of your page) are a great way to increase the weight of specific text while actually helping your visitors by providing easy paths to pages they may be interested in.

Offsite SEO Factors

With MSN, the offsite factors are much simpler to deal with than on either Google or Yahoo! MSN will give you full credit for a link the day they pick it up, so link building, while time-consuming, is rewarded much more quickly on MSN. When dealing with MSN and offsite SEO, there are two main factors we must consider when finding links:

Relevancy. The site must be relevant to yours to hold any real weight.
Quality is better than quantity. Because PageRank is Google-specific, we can't use it as the grading tool for MSN; however, upon visiting a website it's generally fairly clear whether we're visiting a good site or not. Spending extra time to find quality is well rewarded. Also, one-way links, as opposed to reciprocal links, are becoming increasingly important, and I'd recommend utilizing both in your link building strategies.
You will have to begin your offsite optimization by running link checks on your competitors to see what you're up against. This is also a good place to start for potential link partners though those of you using a tool such as Total Optimizer Pro or PR Prowler will find it far faster and more effective to use these tools.

Conclusion

This entire article may seem fairly simplistic, and there's a reason for that: what we've noted above is a list of the more important areas. However, to save you frustration (and me from receiving hundreds of emails a few months from now noting that the keyword densities don't work), I've tried to keep it general. Below you'll find a list of recommended resources; these are tools and SEO resources to help keep you updated and on top of the rankings.

Next week we will be covering Yahoo!

Resources

Total Optimizer Pro - A keyword density and backlink analysis tool. This tool breaks down a variety of onsite and offsite factors giving you a full snapshot of how the top 10 got their positions.
Microsoft Press Room - Read the latest press releases from Microsoft. This may not give you the algorithm but it will tell you the direction they're going. Understand this and you'll be better equipped to deal with changes down the road.
SearchEngineWatch's MSN Forums - Read the latest news, feedback and discussions on the SearchEngineWatch forums. A great way to keep updated but beware, not everyone in there is a qualified opinion.

===========================================================================================================

Search Engine Optimization for Google

The Factors
To optimize and rank highly on Google, as with any of the major engines, specific areas need to be addressed. On Google the most important of these factors are:

Backlinks
Age
Content
How it fares in the results
Backlinks
More than on either Yahoo! or MSN, backlinks are key to attaining top rankings on Google. More importantly, Google's method for calculating the weight of backlinks is very different from either of the other two engines'. Once upon a time backlink acquisition was mainly a numbers game: if you had more links you had higher rankings, and it was basically as simple as that. Today, however, Google has an algorithm inside their algorithm for determining which links are more valuable than others. This algorithm has a number of factors itself, though some are more important than others. The key factors that determine the value of a link, in regards to its contribution to the ranking of your site, are:

The age of the links - Like domains, links gain weight with age. The longer your links have been on a web page the higher their value. Basically this means that your link building efforts today aren't going to pay off for a number of months. The weight seems to age gradually. In a month your link will hold partial weight, in two months it'll hold a bit more and so on. Links hold the majority of their weight after about 5 to 6 months.
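The gradual ramp described above can be sketched as a toy model. To be clear, nobody outside Google knows the real curve; the linear ramp and the 5.5-month full-weight point below are purely illustrative:

```python
def link_age_weight(months, full_at=5.5):
    """Toy model of link aging: weight ramps linearly from 0 at
    acquisition to full weight around 5-6 months. Purely
    illustrative; the real curve (if any) is unknown."""
    return min(1.0, max(0.0, months / full_at))

for m in (1, 3, 6):
    print(m, round(link_age_weight(m), 2))
```

The only point the sketch makes is the one in the text: a link built today contributes partially next month, more the month after, and fully only after several months.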

The location of the link - The physical location of your link on the page is an indicator to Google of its value. A link buried in the footer of a page will hold virtually no weight whereas a link near the top (i.e. where a visitor is likely to see it) will hold much more. Another location factor is how this link is situated relative to the content around it. A link that is located within content holds more weight than a link in a typical link-page or directory format with a title and description. The inline nature of the aforementioned location indicates that the link itself is more natural.

The anchor text and formatting - The linking text used is obviously important. If you are targeting a phrase such as "SEO firms" then using these two keywords in the anchor text is going to attach relevancy between your site and these keywords. Be careful though, building a thousand links using all the same anchor text is going to look suspicious. Vary your anchor text, perhaps include other keywords and you'll find your efforts rewarded. The formatting of the link is also relevant. A link that uses bold, italics, etc. is obviously meant to be seen by a visitor and is thus more highly regarded by Google.

Relevancy - The relevancy of the site linking to you is of key importance. Getting a link on a health site if you're an SEO firm is going to hold little weight whereas a link from an SEO resource site will be much more valuable.

PageRank - While the value of PageRank is arguably dropping when one considers its importance in link building, it is still a factor. A link from a PageRank 5 page is worth substantially more than a link from a PageRank 2 page.

Age
In a patent application from back in 2004 Google told SEO firms (and anyone else for that matter) that age was an important factor. Google has since become a domain name registrar which gives them access to whois data and thus they can clearly see the age of a domain, who it is registered to, where it is hosted, etc. The older your domain is the more legitimate Google sees it and thus the more likely they are to rank it. Additionally, domains that are registered for longer periods of time are also seen as more legitimate and thus will tend to rank higher.

Content
Google is more picky than either Yahoo! or MSN when it comes to content. While the phrase, "content is king," may be overused it is still relevant. The more content you have on your site the more likely someone is to find what they're looking for when they get there. Thus, the more content you have on your site the more likely Google is to believe a searcher will find what they're looking for there. This does not mean that you should grab every bit of content you can find and build a 500,000 page site about potatoes. The content needs to be relevant and preferably well written. While a search engine spider may not be able to tell if your content is truly well written it must appeal to a human visitor. The reason for this will be made more clear below.

A blog is a good option for the easy addition of relevant content provided that you can dedicate the time (generally only a few minutes per day) to post some new and interesting information on your industry.

Keyword density is not as large a factor on Google as on Yahoo! or MSN; however, it is a factor, and in the SEO "game" any factor that holds weight needs to be taken into consideration in all but the least competitive areas. While a site targeting a phrase such as "bed and breakfast in the middle of nowhere" can afford weakness in some of these areas, most of us cannot. As noted in the articles on MSN and Yahoo!, it would be unwise for me to specify an optimal keyword density here, as the optimal levels vary by site type and topic and fluctuate with the algorithm updates. Keyword densities need to be reanalyzed approximately monthly, or any time an update is noted.

How it fares in the results
How your website fares in the results is a growing factor and will only continue to gain importance as time passes. If your website appears in the results for a specific phrase yet no one clicks on it, your website will drop out of the rankings. Arguably worse, if your website is clicked but after a few seconds Google detects that the searcher has returned to the results to find a new site, your site will drop. It is for this reason that it is important to ensure that the titles you write for your website are both search engine and human friendly. You want Google to rank it highly, and you also need humans to click it or Google won't rank it highly (circular logic, I know, but valid nonetheless).

You also need to make sure that what people see when they first land on your page either is the information they are looking for or, alternatively, clearly indicates where that information can be found. This point may seem obvious simply from a usability standpoint; however, the number of sites out there that violate this basic principle is vast. As part of your SEO efforts you will want to take a look at your site from a user's standpoint or, better yet, watch real users navigate it to see if they can find what they're looking for quickly. You have about 3 seconds to get a visitor's attention, so make sure that your visitor can find what they want in that time. You may need to hire experienced web designers to bring your website up to speed, but the cost of this is lower than the cost of losing rankings and business to poor design.

Conclusion
Google has the most sophisticated algorithm of the three major engines and must be treated as such. Tricks rarely work and when they do they tend to work only for a short period of time. Build a strong site with lots of quality content that is easily navigated and will appeal to your human visitors and you're off to a good start. Optimize your keyword densities and secure quality links to your site and while it may take a bit of time to get past the aging delays, you will succeed on Google.

Recommended Resources
Total Optimizer Pro - A keyword density and backlink analysis tool. This tool breaks down a variety of onsite and offsite factors giving you a full snapshot of how the top 10 got their positions.

Google Press Releases - Read the latest press releases from Google. This may not give you the algorithm but it will tell you the direction they're going. Understand this and you'll be better equipped to deal with changes down the road.

Matt Cutts Blog - Read this blog from Google software engineer Matt Cutts. Obviously he's not about to give you the algorithm (or he wouldn't be a Google engineer, would he?) but he does give great advice and the occasional heads-up on updates. He allows comments on his blog and many of them are useful as well.

===========================================================================================================

Search Engine Optimization for Yahoo!


The Factors

To optimize and rank highly on Yahoo!, as with any of the major engines, specific areas need to be addressed. On Yahoo! the major areas are as follows:

Keyword density
Site structure
Backlinks
Aging
Keyword Density

As noted in the article on MSN, it would be unwise for me to specify a keyword density for you to target on your website. There are two reasons for this. First, if there is a delay between the writing of this article and when you read it specific numbers could well send you off in the wrong direction. Second, you will need to analyze your specific competitors to determine what the best density is in your area and for your type of website. Optimal keyword densities are no longer a one-size-fits-all calculation. Your industry and site-type will affect the optimal densities and thus, a complete analysis using a tool such as Total Optimizer Pro will be necessary.

Additionally, optimal keyword densities change on a regular basis, so you will need to periodically reanalyze your densities and compare them with others in the top 10 to ensure that your densities remain within the optimal levels. When using Total Optimizer Pro for the onsite analysis we generally aim our densities at the upper end of the top 10 results, while not aiming to be the top. Generally you will see a range that looks much like a bell curve, with a couple of sites in the very low range (0.5 to 1.0%) and a couple of sites in the very high range (5.0 to 8.0%). The rest will generally fall in the middle. Ignore those in the very low and very high ranges and target towards the upper end, though not the highest, of the remaining sites and you will be on target.
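The "drop the extremes, aim for the upper end" rule above is mechanical enough to sketch in code. The cutoffs (below 1.0% and above 5.0%) and the sample densities are illustrative values of my own, not fixed rules:

```python
def target_density(densities, low=1.0, high=5.0):
    """From a list of top-10 keyword densities (in %), ignore the
    outliers below `low` or above `high`, then aim for the upper end,
    but not the maximum, of what remains. Cutoffs are illustrative."""
    core = sorted(d for d in densities if low <= d <= high)
    if len(core) < 2:
        return None  # not enough mid-range sites to draw a target from
    # Second-highest of the remaining sites: high end, not the extreme.
    return core[-2]

top10 = [0.6, 0.9, 2.1, 2.4, 2.8, 3.0, 3.3, 3.7, 5.5, 7.2]
print(target_density(top10))  # 3.3
```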

Site Structure

On no other engine is site structure more important than on Yahoo! While having a good site structure is important for a variety of reasons, it was on Yahoo! that Beanstalk noted the most significant gains when we brought our homepage and key internal pages into compliance with W3C standards (the rest of the site will be brought into compliance as part of our complete redesign scheduled for launch on April 24th). While slight gains were noticeable on both Google and MSN, they were so minor that they may well have just been part of the ebb-and-flow of the results. On Yahoo!, however, we noted a three page jump the day the changes were picked up. No other changes to the site were performed during this time.

Site structure is important for two key reasons. First, the structure determines the order in which your page content gets seen by the search engines and thus which content will be given the highest priority. Content that occurs higher up in the code of your page (not necessarily higher in your browser) is given more weight than content lower down in the code. Second, a properly structured site will contain less code through the use of CSS, reduced or eliminated table use, etc. The reduction in code pushes the content higher up the page as far as a search engine is concerned and thus it will be given more weight.
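Both effects can be measured crudely: where the first visible text starts in the source, and what fraction of the source is visible text rather than markup. A rough sketch (the two sample layouts are invented; real pages need a proper parser, not regexes):

```python
import re

def content_stats(html):
    """Crude code-vs-content measures: the character offset where the
    first visible text appears, and the text-to-markup ratio. Enough
    to compare a nested-table layout against a CSS layout."""
    text = re.sub(r"<[^>]+>", " ", html)          # strip tags
    visible = re.sub(r"\s+", " ", text).strip()   # collapse whitespace
    m = re.search(r">\s*([^<>\s])", html)         # first char of visible text
    first_text_at = m.start(1) if m else len(html)
    ratio = len(visible) / max(len(html), 1)
    return first_text_at, round(ratio, 2)

table_layout = ("<table><tr><td><table><tr><td>Hello world"
                "</td></tr></table></td></tr></table>")
css_layout = "<div id=m>Hello world</div>"
print(content_stats(table_layout))  # content buried deeper, lower ratio
print(content_stats(css_layout))    # content near the top, higher ratio
```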

Backlinks

Like the other two major engines, having a solid backlink count from relevant sites using good anchor text practices is a major factor on Yahoo! for any reasonably competitive phrase. When it comes to calculating backlinks, Yahoo! is far more similar to Google than to MSN. The following aspects of backlinks must be taken into consideration when optimizing your website for Yahoo!:

Quality of site - as Google has attempted to do in the past with PageRank and is learning to do with TrustRank, sheer numbers aren't what will get you high rankings on Yahoo!; the quality of those links is more important. We must remember that PageRank is a Google calculation, not a Yahoo! one, and so it alone cannot determine the value of a link when we are optimizing for this engine. It can be used as a quasi-benchmark; however, when determining whether a link is a quality link on Yahoo!, we are better off to consider whether it is from a site that is ranking well on Yahoo! for the same or related phrases, whether it comes from a site that is related to ours, whether that site links to a site that is ranking well on Yahoo!, and whether the link comes from a trusted domain. For these purposes a trusted domain can be considered any domain that is over 3 years old and has a solid number of backlinks coming from a wide variety of sites, a solid number of which are non-reciprocal links.
Position of link - like all the major engines, the position of your link on the page is important. A link at or near the bottom of the page is less valuable than a link nearer to the top. Also, if your link is on a page with other links, the effect that link will have on your rankings decreases respective to the number of links on the linking page.
Anchor text - the text used to link to your site will help reinforce that the keywords in that anchor text are associated with your site/page. Also, if that anchor text is in the midst of the content, it will hold greater weight than if it is in a directory-style format above a description (i.e. like a standard links page).
Non-reciprocal links - reciprocal links are certainly still valuable on Yahoo!; however, it is important to supplement these with non-reciprocal links in the form of directory listings and other one-way links.
Aging

The bane of new websites is the aging delay. Many focus on Google's "sandbox" when they think of aging delays; however, Yahoo! employs one as well, though it is lighter and lasts for a shorter time. New sites and links are not given the same weight as sites and links that have been around for a while. The aging delay on sites has been extended over the past couple of years, though it isn't as severe as that imposed by Google. New sites can expect to find it extremely difficult to rank for competitive phrases inside of 6 months, even if everything else is in place. Adding to the difficulty is a delay on the value of links. When a new site launches it obviously has no links, and new links are subject to a delay that appears to be somewhere between 3 and 4 months before they hold their full weight.

The combination of these delays can make it very difficult for new sites to rank for competitive phrases inside of 8 to 12 months; however, because the restrictions are lighter than those imposed by Google, one can expect to see rankings for secondary, tertiary, and completely unexpected phrases far faster on Yahoo!

Conclusion

It is important to note that a very important area of consideration is coming in part four of this series. Simply optimizing your website for Yahoo! will likely not get you the traffic you're hoping for. Part three will cover optimizing your website for Google, and part four, titled "SEO For The Big Three: Tying It Together", will outline how to tie all the optimization tactics together into a concise strategy that will result in top rankings on all three major engines.

Resources

Total Optimizer Pro - A keyword density and backlink analysis tool. This tool breaks down a variety of onsite and offsite factors giving you a full snapshot of how the top 10 got their positions.
Yahoo! Press Room - Read the latest press releases from Yahoo!. This may not give you the algorithm but it will tell you the direction they're going. Understand this and you'll be better equipped to deal with changes down the road.
DigitalPoint's Yahoo! Forum - Read the latest news, feedback and discussions on the DigitalPoint forums. A great way to keep updated but beware, not everyone in there is a qualified opinion.

Unknown said...

Search Engine Friendly Design
Search engine friendly design is not a site designed specifically for a search engine. It is a user friendly Web site that can be easily found on both crawler-based and human-based search engines (web directories).
Importance of the site design:
- End users/site visitors/target audience should be primary
- Human based search engines
- Crawler based search engines
How you arrange words and place graphic images and multimedia files communicates to the search engines what is important on those pages.
5 Basic Rules of Web Design
- Easy to read
- Easy to navigate ("sense of place")
- Easy to find (externally and internally)
- Consistent in layout and design
- Quick to download
--- Easy to Use ---
Easy To Find:
- On search engines, web directories, and related sites.
- Go directly to the relevant page: people should be sent to the page they need within 4-5 clicks, preferably fewer.
- Most important information "above the fold"
- Contact information should be easily visible and easy to find (in the footer or header; never put contact info only after your copyright, on the about us page, or on a locations page)
Search Engines:
- Index Text
- Follow Links
- Measure Popularity
The first two, index text and follow links, are what all search engines do and will always do.
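Those two universal behaviors, index text and follow links, can be sketched as a toy crawler. The "web" below is just an in-memory dict standing in for real page fetching, and the pages are invented for illustration:

```python
import re
from collections import deque

# A toy "web": URL -> HTML. Stands in for real HTTP fetching.
PAGES = {
    "/": '<h1>Home</h1><a href="/about">About us</a>',
    "/about": '<p>We do SEO.</p><a href="/">Home</a>',
}

def crawl(start):
    """Index text and follow links: the two things every engine does.
    Breadth-first; returns {url: extracted_text}."""
    index, queue, seen = {}, deque([start]), {start}
    while queue:
        url = queue.popleft()
        html = PAGES.get(url, "")
        # Index text: strip tags, collapse whitespace.
        text = re.sub(r"\s+", " ", re.sub(r"<[^>]+>", " ", html)).strip()
        index[url] = text
        # Follow links: enqueue anything not yet visited.
        for link in re.findall(r'href="([^"]+)"', html):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index

print(crawl("/"))
```

Notice that only text and links drive the crawl; anything locked inside images or scripts is invisible to it, which is exactly why the text component below matters.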
TEXT COMPONENT:
- Are you using words on your pages that match what your target audience types into the search engines?
- Do you have a site navigation and URL structure that the search engines spiders can easily follow?
- Bring in an SEO early in the site design phase... this happens too late all the time...
Successful SEO depends on these components, and all are important.
What kind of text?
- The words your target audience is typing into search queries are called keywords or query words.
- When visitors view a Web page, does the content appear to be focused? Title tag, headings, breadcrumbs, cross links, intros and conclusions, product/service descriptions, and graphic images...
- Visitors should not have to perform any type of action to view the most important text of an individual web page in a browser.

Primary Text:
- Title tags
- Visible body copy
- Text at top of the page
- Text in and around hypertext links
Secondary Text:
- Meta tag content
- Alternative text (alt tags)
- Domain and file names
Optimization Tip:
The titles and headlines of your Web pages play a role in your rankings.
LINK COMPONENT:
Site and page architecture
Site Navigation Scheme (from best to worst)
-- Text links
-- Navigation buttons
-- Image maps
-- Menus (form and DHTML)
-- Flash
-- Consideration: dynamically generated URLs
-- You can use two alternative navigation methods on your site. I.e. flash and text links, etc.
-- Check types of text links, including navigation, breadcrumbs, contextual links, embedded text links (within the content), optimize your sitemap page well.
Informational Pages
-- Contain info your target audience is interested in
-- Do not contain a lot of sales hype but rather factual info
-- Are spider friendly Web pages
-- Often have a simpler layout
-- Visually match the rest of your Web sites
-- Check Difference between an informational page and a doorway page
Cross Linking
- In addition to a spider friendly nav scheme and a site map, all sites should have related, relevant cross links.
- Hierarchical (vertical)
-- Breadcrumbs
-- Categories -> Subcategories
Type of Web page
Page layout and structure
URL Structure
POPULARITY COMPONENT:
- Number of links
- Quality of links
- Number of times people click on links to your site
- How long end users visit your site
- How often people return to your site
Question: Do people continue to navigate your site, link to your site, bookmark your site, and return to your site?
Factors that Affect Popularity
- Substantial and unique content
- How other sites are linked to your site (all about anchor text)
- Site usability - what are the two biggest complaints about site design?
Other Design Considerations
- What is a splash page?
- Why don't search engines like splash pages?
- A splash page is typically either a huge graphic saying "click here" or a Flash intro with a "skip intro" link
- If you do a splash page, put text below the fold, with text nav also
Home Page Design:
- Search engine friendly (SEF) characteristics to include in your home page:
-- keyword rich text
-- At least one spider friendly nav scheme
-- Link to the most important sections on your site
-- Visible link to a site map
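The text component above boils down to one question: do the words your audience searches for actually appear in your title and visible body copy? A quick sketch of that check (the page text and keyword list are made up for illustration):

```python
import re

def keyword_counts(text, keywords):
    """Case-insensitive whole-word counts for each target keyword."""
    lower = text.lower()
    return {kw: len(re.findall(r"\b%s\b" % re.escape(kw.lower()), lower))
            for kw in keywords}

title = "Orphaned Kitten Care - Feeding and Rescue Tips"
body = "Our kitten rescue guide covers feeding schedules for an orphaned kitten."
print(keyword_counts(title + " " + body, ["kitten", "rescue", "feeding"]))
# {'kitten': 3, 'rescue': 2, 'feeding': 2}
```

A zero count for a keyword you care about means the page simply cannot rank for it; the fix is editorial, not technical.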

Unknown said...

If you've read anything about or studied Search Engine Optimization, you've come across the term "backlink" at least once. For those of you new to SEO, you may be wondering what a backlink is, and why they are important. Backlinks have become so important to the scope of Search Engine Optimization, that they have become some of the main building blocks to good SEO. In this article, we will explain to you what a backlink is, why they are important, and what you can do to help gain them while avoiding getting into trouble with the Search Engines.

What are "backlinks"? Backlinks are links on other websites that point to your website, also known as inbound links (IBLs). The number of backlinks is an indication of the popularity or importance of a website. Backlinks are important for SEO because some search engines, especially Google, give more credit to websites that have a good number of quality backlinks, and consider those websites more relevant than others in their results pages for a search query.

When search engines calculate the relevance of a site to a keyword, they consider the number of QUALITY inbound links to that site. So we should not be satisfied with merely getting inbound links, it is the quality of the inbound link that matters.
A search engine considers the content of the sites to determine the QUALITY of a link. When inbound links to your site come from other sites, and those sites have content related to your site, these inbound links are considered more relevant to your site. If inbound links are found on sites with unrelated content, they are considered less relevant. The higher the relevance of inbound links, the greater their quality.

For example, if a webmaster has a website about how to rescue orphaned kittens, and received a backlink from another website about kittens, then that would be more relevant in a search engine's assessment than say a link from a site about car racing. The more relevant the site is that is linking back to your website, the better the quality of the backlink.

Search engines want websites to have a level playing field, and look for natural links built slowly over time. While it is fairly easy to manipulate links on a web page to try to achieve a higher ranking, it is a lot harder to influence a search engine with external backlinks from other websites. This is also a reason why backlinks factor in so highly into a search engine's algorithm. Lately, however, a search engine's criteria for quality inbound links has gotten even tougher, thanks to unscrupulous webmasters trying to achieve these inbound links by deceptive or sneaky techniques, such as with hidden links, or automatically generated pages whose sole purpose is to provide inbound links to websites. These pages are called link farms, and they are not only disregarded by search engines, but linking to a link farm could get your site banned entirely.

Another reason to achieve quality backlinks is to entice visitors to come to your website. You can't build a website, and then expect that people will find your website without pointing the way. You will probably have to get the word out there about your site. One way webmasters got the word out used to be through reciprocal linking. Let's talk about reciprocal linking for a moment.

There is much discussion in these last few months about reciprocal linking. In the last Google update, reciprocal links were one of the targets of the search engine's latest filter. Many webmasters had agreed upon reciprocal link exchanges, in order to boost their site's rankings with the sheer number of inbound links. In a link exchange, one webmaster places a link on his website that points to another webmasters website, and vice versa. Many of these links were simply not relevant, and were just discounted. So while the irrelevant inbound link was ignored, the outbound links still got counted, diluting the relevancy score of many websites. This caused a great many websites to drop off the Google map.

We must be careful with our reciprocal links. There is a Google patent in the works that will deal with not only the popularity of the sites being linked to, but also how trustworthy a site is that you link to from your own website. This will mean that you could get into trouble with the search engine just for linking to a bad apple. We can begin preparing for this future change in the search engine algorithm by being choosier right now about the sites with which we exchange links. By choosing only relevant sites to link with, sites that don't have tons of outbound links on a page and don't practice black-hat SEO techniques, we will have a better chance that our reciprocal links won't be discounted.

Many webmasters have more than one website. Sometimes these websites are related, sometimes they are not. You have to also be careful about interlinking multiple websites on the same IP. If you own seven related websites, then a link to each of those websites on a page could hurt you, as it may look to a search engine as though you are trying to do something fishy. Many webmasters have tried to manipulate backlinks in this way; and too many links to sites with the same IP address is referred to as backlink bombing.

One thing is certain: interlinking sites doesn't help you from a search engine standpoint. The only reason you may want to interlink your sites in the first place might be to provide your visitors with extra resources to visit. In this case, it would probably be okay to provide visitors with a link to another of your websites, but try to keep many instances of linking to the same IP address to a bare minimum. One or two links on a page here and there probably won't hurt you.

There are a few things to consider when beginning your backlink building campaign. It is helpful to keep track of your backlinks, to know which sites are linking back to you, and how the anchor text of the backlink incorporates keywords relating to your site. A tool to help you keep track of your backlinks is the Domain Stats Tool. This tool displays the backlinks of a domain in Google, Yahoo, and MSN. It will also tell you a few other details about your website, like your listings in the Open Directory (DMOZ), whose backlinks Google regards as highly important; your Alexa traffic rank; and how many pages from your site have been indexed, to name just a few.

Another tool to help you with your link building campaign is the Backlink Builder Tool. It is not enough just to have a large number of inbound links pointing to your site. Rather, you need to have a large number of QUALITY inbound links. This tool searches for websites that have a related theme to your website which are likely to add your link to their website. You specify a particular keyword or keyword phrase, and then the tool seeks out related sites for you. This helps to simplify your backlink building efforts by helping you create quality, relevant backlinks to your site, and making the job easier in the process.

There is another way to gain quality backlinks to your site, in addition to related site themes: anchor text. When a link incorporates a keyword into the text of the hyperlink, we call this quality anchor text. A link's anchor text may be one of the under-estimated resources a webmaster has. Instead of using words like "click here" which probably won't relate in any way to your website, using the words "Please visit our tips page for how to nurse an orphaned kitten" is a far better way to utilize a hyperlink. A good tool for helping you find your backlinks and what text is being used to link to your site is the Backlink Anchor Text Analysis Tool. If you find that your site is being linked to from another website, but the anchor text is not being utilized properly, you should request that the website change the anchor text to something incorporating relevant keywords. This will also help boost your quality backlinks score.
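The anchor-text idea above is easy to verify for yourself: fetch a page that links to you and look at the text inside each link pointing at your domain. A rough sketch (the referring page markup and domain are hypothetical) using Python's standard-library parser:

```python
from html.parser import HTMLParser

class AnchorTextFinder(HTMLParser):
    """Records the anchor text of links pointing at a given domain."""
    def __init__(self, domain):
        super().__init__()
        self.domain = domain
        self._in_link = False
        self._buf = []
        self.anchors = []

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(n == "href" and v and self.domain in v
                              for n, v in attrs):
            self._in_link = True
            self._buf = []

    def handle_data(self, data):
        if self._in_link:
            self._buf.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._in_link:
            self.anchors.append("".join(self._buf).strip())
            self._in_link = False

referring_page = ('<p>See <a href="http://example.com/kittens">'
                  'how to nurse an orphaned kitten</a> or just '
                  '<a href="http://example.com/">click here</a>.</p>')
finder = AnchorTextFinder("example.com")
finder.feed(referring_page)
print(finder.anchors)  # ['how to nurse an orphaned kitten', 'click here']
```

The first link carries keyword-rich anchor text; the second ("click here") tells the search engine nothing about the target page, which is exactly the case where you would ask the linking webmaster for a change.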

Building quality backlinks is extremely important to Search Engine Optimization, and because of their importance, it should be very high on your priority list in your SEO efforts. We hope you have a better understanding of why you need good quality inbound links to your site, and have a handle on a few helpful tools to gain those links.

Unknown said...

There's lots of ways to get more links. All the good ones involve work, but stick at it and it can be done:


Request a listing at dmoz.org

Search Yahoo or Google for "keywords add url" or "my industry submit site" to find related directories

Look for related businesses that you aren't in direct competition with, call or email them personally and ask about swapping links. If you're a hotel, do this with local restaurants and museums. If you're a bank, swap links with realtors and auto dealers.

Write articles (with embedded links) and send them out as press releases. Some will show them on their site.

Ask all your friends. All of them.

Sometimes forums (not this one) or blogs let you add a signature with your link. Participate meaningfully and this is tolerated. These may not be the best links, but they can help, especially as you control the link text.

Become an active regular participant in a forum on your subject. Reply to posts, be seen as an expert on the subject. You'll make friends amongst the other participants who will then be interested in helping you and linking to you.

Learn to use the forum's codes so you can put deep links to your interior pages that answer detailed questions. Use good keyword-rich link text; the link should look like the one in the next bullet:

Use Yahoo's site explorer to research who your competition has links from, call them up and ask if you can swap links too.

When you're bored working on your site, browse the web looking over what the competition has done. Look for chances to get links from them, or follow where they're linking to, many of which pages will be good places to ask for links.

Be patient. This takes a while. Be tolerant, many people will be agreeable but not get around to it. Remind politely after a while, but if they still don't get to it, forget them and keep going. I've seen friends get all wound up because someone has too much of a life to get around to updating their website. Just find new sites to ask.

Try to get the attention of local media. Getting picked up by a newspaper, even an "alternative" one can get you some very valuable links. Get interviewed, tell a colorful story.

Offer to write articles for other sites that could use them. Embed good links within them. Associations and other authorities often desperately need good material and have great rank with search engines. You could be their best friend, and vice versa.

Be a pal. Offer to create basic websites for a few friends. Now that they love you, they won't mind a link or two back to such a nice guy (or gal).

Get to know some bloggers. They're becoming a major linking force. Tell them a colorful story so they can write something good about you.

Create useful content on your site that others interested in the subject will want to link to on their own. You'll start to get links you don't know about until the referrals show up in your log.

Best of all, create link bait: a feature or function so useful to your audience that people just have to link to it. This is usually interactive. If you're into kites, a wind forecaster. Do pharmaceuticals? If they enter all their drugs, it finds any possible interactions. Baseball cards? A value projector of what cards for drafted (but not yet playing) players will be worth in 5 years. Who can resist checking, even if future results are not guaranteed?

Try being controversial. If you write about why it's best to retire on $7 a month for life, all the finance blogs are going to make fun of you with links to your opinion.

Keep trying. Keep asking. Stick at it.
Then one day, others will be saying how it's no fair you're at the top just because you've been around a while. They won't understand it was hard work, and they won't think it's fair they don't rank at the top just because they had the clever idea of creating a new site.

Unknown said...

Getting your site listed in Google is easier than ever now, since Google offers sitemaps. By using a sitemap, you can make it very easy to get all of your pages indexed by Google. This can help you raise your rankings in Google as well as help searchers find you. The beauty of sitemaps is that once you've set it, you can basically forget about it until you make updates to your site. Then you will need to add the new entries to your sitemap. Google will take care of the rest. However, if you're a non-techie, or the idea of setting up scripts or learning new software doesn't appeal to you, there are other ways to get the job done without stressing yourself.

There are plenty of online tools you can use that are absolutely free. Although these sites won't help you if you have a really large site (usually over 1,000 pages), for the average site owner, these tools will get the job done and, at most, all you'll need to know how to do is cut and paste, save, and then upload your sitemap to your site.

Below are some tools you can use to help you get started. Try the various tools out first to see which ones you like best. These are the easiest solutions for creating sitemaps and it will only take you a few minutes to get your sitemap created and submitted.

1. Sitemap Doc (http://www.sitemapdoc.com/) — This site will create a sitemap for sites up to 750 pages. All you have to do is enter your URL for the main page for your site, click the create map button, and this site will extract your urls and create your sitemap. It also allows you to customize your map by adding additional urls, as well as changing time, date, and priority settings.

2. Audit My PC (http://www.auditmypc.com/free-sitemap-generator.asp) — This sitemap creator actually gives you two options: you can create a sitemap for Google and you can create a sitemap for your visitors. The sitemap you create for your visitors can help you in getting spidered by the newer spiders offered by MSN and Yahoo.

3. ROR (http://www.rorweb.com) — ROR is very similar to XML, which is the formatting used in sitemaps. You can use it to create your sitemaps. Unlike most sitemap utilities, this tool does more. It will help you more fully describe the content on your website. You can also use it to create feeds for your products and services, as well as feeds for your site.

4. WordPress Sitemap Generator (http://www.arnebrachhold.de/2005/06/05/google-sitemaps-generator-v2-final) — For those of you who use Wordpress for your blog, this is the only tool you need to create your sitemaps. This sitemap generator is actually a plug-in you can install on your Wordpress Blog and use it to automatically generate a new sitemap for your blog each time you post. This keeps your sitemap constantly updated and it allows Google to more fully spider your blog. Once installed and updated, you can forget about it. The plug-in takes care of the rest.

5. Google Sitemap Generators List (http://code.google.com/sm_thirdparty.html) — If none of the solutions above work for you or you need something more comprehensive, Google offers a large list of scripts and software you can use to generate your sitemaps. This will be the solution for any site that has a large number of pages and needs a sitemap that can't be created with the tools listed above.

Once you've created and uploaded your sitemap, then it's time to submit it. You will need a Gmail account so you can sign into Google Sitemaps. You can also find out more about the various types of content Google accepts. Follow the instructions for submitting your site. It should take you about two minutes to add your sitemap. Once you've added your sitemap, Google will take over. Not only will you be able to get your site more fully spidered by Google, you'll also get plenty of helpful statistics on your site. You can use what you learn from your statistics to improve your search engine coverage and raise your rankings.
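If you'd rather not depend on a third-party generator at all, a small sitemap for a handful of pages is easy to produce by hand. A minimal sketch following the sitemaps.org XML format (the example URLs and dates are made up):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Builds a minimal XML sitemap per the sitemaps.org 0.9 protocol.

    urls: list of (location, last-modified date) pairs.
    """
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

pages = [("http://example.com/", "2006-04-05"),
         ("http://example.com/services.html", "2006-04-05")]
xml = build_sitemap(pages)
print(xml)
```

Save the output as sitemap.xml, upload it to your site root, and submit that URL through Google Sitemaps as described above.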
