Search Engine Optimization Simplified

Chances are good that at some point in your life you have run a search on an
online search engine and, instead of one hit, received pages and pages of
possible hits. Have you ever wondered whether the order in which the websites
appear is random, or whether they have been placed in a specific order that
merely looks disorderly to you? The answer is that a very elaborate system is
used to determine where a website appears during an internet search. The
process is called search engine optimization.

Search engine optimization is the science and art of making web pages
attractive to search engines.

Next time you run an internet search, look at the bottom of the page. Chances
are good that there will be a list of page numbers (normally written in blue)
for you to click if you can't find exactly what you are looking for on the
first page. If you actually look further than the second page, you are part of
a minority. Studies and research have shown that the average internet user does
not look further than the second page of potential hits. As you can imagine, it
is very important for websites to be listed on the first two pages.

Webmasters use a variety of techniques to improve their search engine ranking.

The first thing most webmasters (or website designers) do is check their meta
tags. Meta tags are special HTML tags that provide information about a web
page. Search engines can easily read meta tags, but they are written in a
special type of text that is invisible to internet users. Search engines rely
on meta tags to accurately index websites. Although meta tags are a critical
step in search engine optimization, they alone are not enough for a website to
receive a top ranking.
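As a rough illustration of how a program can read meta tags that never appear
on the rendered page, here is a minimal Python sketch using the standard
library's html.parser; the page, its title, and its meta values are all
invented for the example:

```python
from html.parser import HTMLParser

# A hypothetical page head; the description and keywords values are invented.
PAGE = """<html><head>
<title>Handmade Quilts</title>
<meta name="description" content="Handmade quilts sewn in Idaho.">
<meta name="keywords" content="quilts, handmade, bedding">
</head><body><h1>Welcome</h1></body></html>"""

class MetaTagReader(HTMLParser):
    """Collects the name/content pairs of every <meta> tag it sees."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if "name" in attrs and "content" in attrs:
                self.meta[attrs["name"]] = attrs["content"]

reader = MetaTagReader()
reader.feed(PAGE)
print(reader.meta["description"])  # Handmade quilts sewn in Idaho.
```

A visitor's browser shows none of this text, but a program, like a search
engine's indexer, reads it easily.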

Search engines rely on a little device called a web crawler to locate and then
catalog websites. Web crawlers are computer programs that browse the World Wide
Web in a methodical, automated manner. Web crawlers are also sometimes called
automatic indexers, web spiders, bots, web robots, and/or worms. Web crawlers
locate and go to a website and "crawl" all over it, reading the content and
storing the data. Once they have collected all the information from the website
they bring it back to the search engine, where it is indexed. In addition to
collecting information about a website, some search engines use web crawlers to
harvest e-mail addresses and for maintenance tasks. Each search engine has its
own web crawlers, and each search engine has variations on how it gathers
information.
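The locate, crawl, and index cycle described above can be sketched in
miniature. The snippet below crawls a toy in-memory "web" rather than the real
one (the URLs and page text are invented), but the logic, visiting each page
methodically, following its links, and bringing the data back for indexing, is
the same idea:

```python
from collections import deque

# A toy "web": each URL maps to (page text, links found on the page).
# All URLs here are invented for illustration.
WEB = {
    "a.example": ("welcome page", ["b.example", "c.example"]),
    "b.example": ("quilt catalog", ["a.example"]),
    "c.example": ("contact info", []),
}

def crawl(start):
    """Breadth-first crawl: visit each page once, store its text for indexing."""
    index, seen, queue = {}, {start}, deque([start])
    while queue:
        url = queue.popleft()
        text, links = WEB[url]
        index[url] = text                 # bring the data back for indexing
        for link in links:
            if link not in seen:          # never crawl the same page twice
                seen.add(link)
                queue.append(link)
    return index

print(sorted(crawl("a.example")))  # ['a.example', 'b.example', 'c.example']
```

A real crawler fetches pages over the network and obeys politeness rules, but
the queue-and-visited-set structure is the heart of it.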

Most webmasters feel that proper use and placement of keywords helps catch the
attention of web crawlers and improves their website's ranking. Most webmasters
like to design their websites for ultimate search engine optimization
immediately, but there aren't any rules that say you can't go back to your
website at any time and make improvements that will make it more attractive to
search engines.

Basic Information about Search Engine Optimization

Fifteen years ago, if we needed information we had to go to the library. Writing
reports and preparing for tests required hours of scanning shelves filled with
books, blowing large chunks of change at the copy machine, checking out a
mountain of books, and squinting at microfilm. The internet has changed all of
that. Now when we need to learn something, all we have to do is boot up a
computer and connect to the internet.

Most people have an extensive favorites list on their computers; a simple click
of the mouse and they are at their favorite website. This is a handy feature if
you do a lot of online shopping at a particular store or spend a lot of time at
a specific chatroom. But when they need to use the internet to gather
information, most people consult an online search engine.

A search engine is an information retrieval system designed to help locate
information. Most people are familiar with Google, Yahoo, and Ask.com. Search
engines work when a user types a keyword into the search box. Once the user
types in the word, the search engine scans all its files. It then provides the
user with a page full of options, generally twenty. The user scans the
list of options and then opens the one that sounds like it best suits their
needs. Search engines use something called search engine optimization to
determine the ranking of each web address.

Search engine optimization is the art and science of making web pages
attractive to the search engines. The more a website appeals to the search
engine the higher it will be ranked.

Crawler-based search engines determine the relevancy of a website by following
a set of guidelines called algorithms. One of the first things a crawler-based
search engine looks for is keywords. The more frequently a website uses a
certain keyword, the higher the website will rank. Search engines assume that
the more frequently a word appears, the more relevant the website.

The location of the keywords is as important as the frequency.

The first place a search engine looks for keywords is in the title. Web
designers should include a keyword in their HTML title tag. Web designers
should also make sure that keywords are included near the top of the page.
Search engines operate under the assumption that web designers will want to
make any important information obvious right away.
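A toy scoring function can make the frequency-plus-location idea concrete. The
weights below are invented purely for illustration; real search engines keep
their actual formulas secret:

```python
def keyword_score(title, body, keyword):
    """Toy relevance score: frequency counts, and placement counts more.
    The weights (3 for the title, 2 for the top of the page) are invented."""
    kw = keyword.lower()
    words = body.lower().split()
    score = words.count(kw)                 # frequency of the keyword
    if kw in title.lower().split():
        score += 3                          # keyword appears in the title tag
    if kw in words[:10]:
        score += 2                          # keyword appears near the top
    return score

# Two occurrences (+2), in the title (+3), near the top (+2).
print(keyword_score("Idaho Quilts", "quilts made daily we love quilts", "quilts"))  # 7
```

The same mechanism explains why keyword stuffing once worked: repeating a word
inflates the frequency term, which is why modern engines penalize it.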

Spamdexing is a term used to describe a webpage that uses a certain word
hundreds of times in an attempt to propel itself to the top of search
engine rankings. Most search engines use a variety of methods, including
customer complaints, to penalize websites that use spamming methods. Very few
internet search engines rely solely on keywords to determine website ranking.
Many search engines also use something called "off the page" ranking criteria.
Off the page ranking criteria are criteria that webmasters cannot
easily influence. Two methods of off the page search engine optimization are
link analysis and click-through measurement.

Three Basic Steps to Search Engine Optimization

Search engine optimization is the art and science of making web pages appear
attractive to the search engines. The better optimized a website is, the higher
the ranking it will receive from a search engine's web crawlers; the higher its
ranking, the more traffic the website will have; and the more traffic the
website has, the more profit it will generate. The key is good internet search
engine optimization.

Why is receiving a high ranking so important to the future success of
your online business? Studies have shown that consumers seldom look at websites
that don't rank a spot on the first two pages a search engine displays.
Websites ranked on the third page (or any page after that) see significantly
less traffic than ones ranked on the second page. There is even a staggering
difference between the first and second pages. In the world of e-commerce,
ranking and strong search engine optimization are everything.

At first, search engine optimization may feel like trying to rappel down the
Grand Canyon: a huge, scary world full of big words like web crawlers, PageRank,
meta tags, and algorithms. You've never heard of any of these things. A quick
internet search for the word algorithm doesn't help; all you get is a printout
of strange symbols and numbers arranged in complex algebraic equations.

Sit back, take a deep breath, and try to relax. Search engine optimization is a
lot simpler than you might think. First things first.

Algorithms really are every bit as complex as they look. Simply defined they
are a finite set of carefully defined instructions. Most, if not all, computer
programs are designed with strict algorithms.

PageRank is simply the program Google designed to search, index, and rank its
registered web pages. PageRank operates on a link analysis algorithm. PageRank
is credited for Google's incredible success.

Web crawlers are tools search engines use to browse the World Wide Web in a
methodical, automated manner. When web crawlers browse websites, they gather
the data that the search engine's algorithms use to rank the pages.

Meta tags are special HTML tags that provide information about a web page.
Meta tags are written into the head section of the HTML document and are only
visible to the search engine.

The reality of search engine optimization is that you can start to optimize
your website without any knowledge at all of the technical stuff involved.
Simply stated, the very first step in designing a website that will be well
ranked by the search engines is to create a content-rich site. What this means
is that you must pack as much information about your product into your website
as you possibly can.

The second step to search engine optimization is to fill your site with
keywords that will attract the web crawler's attention. The third and final
step in a wonderfully optimized website is to submit it to the search engines
that will complement it.

How Title and Meta Tags are used for Search Engine Optimization

When it comes to title tags and search engine optimization there are a few
questions website owners typically ask. Does each individual web page need a
different title? Is there a maximum length for title tags? Is there a title tag
limit? Are title meta tags a good idea?

The World Wide Web Consortium requires that every single HTML document have a
title element in the head section. It also states that the title element
should be used to identify each individual page's content.

The title tag plays four separate roles on the internet.

The first role the title tag fulfills is as the text librarians, other
webmasters, and directory editors use to link to other websites. A well
written title tag is far more likely to get a fast review than one that is
sloppy or incomprehensible.

The title tag is what is displayed in the visitor's browser. By displaying the
title tag in the visitor's browser, the web user knows exactly where they are if
they have to return to the site later on. Internet Explorer typically tries to
display the first ninety-five characters of the title tag.

Search engines display the title tag as the most important piece of information
available to web searchers.

A good title tag should clearly indicate the webpage's contents to the web
user. A clear title tag is more likely to be placed in the user's favorites
list. A good, clear title tag is normally under sixty-five characters long.
Title tags should be typed in title case. Headers should also be typed in
title case.
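A small sketch can check a title tag against the two guidelines above. It
treats title case simply as every word beginning with a capital letter, which
is an assumption made for illustration (style guides usually leave short words
like "and" lowercase):

```python
def check_title(title):
    """Flags the title-tag problems described above: overly long titles
    and titles not typed in title case. A sketch, not a standard tool."""
    problems = []
    if len(title) > 65:
        problems.append("longer than sixty-five characters")
    if not all(word[:1].isupper() for word in title.split()):
        problems.append("not in title case")
    return problems

print(check_title("Idaho Bed And Breakfast Getaways"))  # []
print(check_title("idaho bed and breakfast"))           # ['not in title case']
```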

When it comes to search engine optimization, the home page title is normally
the first thing the web crawlers look at when they are ranking a webpage. Your
website is introduced by your homepage title. It is important to make sure that
your title tag sounds credible.

Every single page of your website must have its very own unique title. A meta
tag is a special HTML tag that provides information about a web page. Meta tags
do not affect the display of a webpage. Although meta tags are placed directly
into the HTML code, they are invisible to web users. Search engines use meta
tags to help correctly categorize a page. Meta tags are a critical part of
search engine optimization.

It is important to remember that Meta tags are not a magic solution to making
your website a raging success. The most valuable feature Meta tags offer to
website owners is the ability to control (to a certain degree) how their web
pages are described by the search engines. Meta tags can also let website
owners prevent having their website indexed at all.

Meta tag keywords are a way to provide extra text for crawler-based search
engines to index. While this is great in theory, several of the major search
engines have crawlers that ignore the keyword meta tags and focus entirely on
the body of the webpage.

Google and PageRank - Search Engine Optimization's Dream Team

On September 7, 1998, two Stanford University students, Larry Page and Sergey
Brin, co-founded Google, a company they started as part of a research project
in January 1996. On August 19, 2004, Google had its initial public offering;
the one point six-seven billion dollars it raised gave it a net worth of
twenty-three billion dollars. As of December 31, 2006, the Mountain View,
California based internet search and online advertising company Google Inc. had
over ten thousand full time employees. With a 50.8% market share, Google was the
most used internet search engine at the end of 2006.

When Larry Page and Sergey Brin began creating Google, it was based on the
hypothesis that a search engine that could analyze the relationships between
the different websites could get better results than the techniques that
already existed. In the beginning the system used backlinks to estimate a
website's importance, causing its creators to name it BackRub.

Pleased with the results the search engine produced on Stanford University's
website, the two students registered the domain google.com on September 14,
1997. A year after registering the domain name, Google Inc. was incorporated.

Google began to sell advertisements associated with keyword searches in 2000.
By using text based advertisements Google was able to maintain an uncluttered
page design that encouraged maximum page loading speed. Google sold the
keywords based on a combination of clickthroughs and price bids. Bidding on the
keywords started at five cents a click.

Google's simple design quickly attracted a large population of loyal internet
users.

Google's success has allowed it the freedom to create tools and services such
as Web applications, business solutions, and advertising networks for the
general public and its expanding business environment.

In 2000 Google launched its advertising creation, AdWords. For a monthly fee,
Google would both set up and then manage a company's advertising campaign.
Google relies on AdWords for the bulk of its revenue. AdWords offers its
clients pay-per-click advertising. AdWords provides advertising for local,
national, and international distribution. When an ad is first created, AdWords
examines several important factors in its keywords to determine how much a
client will pay per click, whether the ad is eligible for the ad auction, and
how the ad ranks in the auction if it is eligible.

By following a set of guidelines provided by Google, webmasters can ensure that
Google's web crawlers are able to find, index, and rank their websites.

Google offers a variety of webmaster tools that help provide information about
adding sites, updates, and sitemaps. Google's webmaster tools will provide
statistics and error information about a site. Google sitemaps will help
webmasters know what pages are present on the website.

The major factor behind Google's success is its web search services. Google
uses PageRank for its search engine optimization program. PageRank is a link
analysis algorithm that assigns a numerical weight to every single element of a
hyperlinked set of documents, like the World Wide Web. Its purpose is to
measure the relative importance of each element within the set. PageRank is a
registered trademark of Google. Stanford University owns PageRank's patent.
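The core idea of a link analysis algorithm like PageRank can be sketched with a
few lines of power iteration. This is a simplified illustration of the
published concept, not Google's actual implementation, and the three-page link
graph is invented:

```python
# A tiny link graph: each page maps to the pages it links to.
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}

def pagerank(links, damping=0.85, iterations=50):
    """Power-iteration sketch of the PageRank idea: each page's weight is
    spread evenly over its outgoing links, softened by a damping factor."""
    n = len(links)
    rank = {page: 1.0 / n for page in links}
    for _ in range(iterations):
        new = {page: (1 - damping) / n for page in links}
        for page, outgoing in links.items():
            share = rank[page] / len(outgoing)
            for target in outgoing:
                new[target] += damping * share
        rank = new
    return rank

ranks = pagerank(links)
# 'c' gathers links from both 'a' and 'b', so it ends up weighted highest.
```

A link from a heavily linked page is worth more than a link from an obscure
one, which is exactly the "relative importance within the set" described above.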

The Definition of Search Engine Optimization

Too many webpage owners feel that once they submit their page to a search
engine they are guaranteed success. That's generally not the case. Simply
submitting your web page to a search engine is not always enough to get any
hits. Most web pages require search engine optimization to become truly
successful.

Search engine optimization is the art and science of making web pages
attractive to the search engines. The goal of search engine optimization is to
have your website ranked in the top ten internet search hits that appear on the
first page. Why is it important to be on the first page? It's important because
the average internet user doesn't click on any of the sites listed on the
second or third page. Out of sight, out of mind. One website owner reported a
two hundred and ten percent increase in sales on her e-commerce site when she
had her webpage redesigned for optimal search engine optimization.

You would think that the prospect of a two hundred and ten percent increase in
sales would be all the incentive a webmaster would need to redesign their site.
That isn't always the case. There are a variety of reasons people avoid
recreating their websites.

Some people believe that search engine optimization is too difficult to
understand. The reality is that search engine optimization is fairly simple.
All it takes is a little research and most people are ready to rock.

Other people feel that there are simply too many things to learn before they
will be ready to optimize their website. Search engine optimization is just
like anything else. When you first start out you know nothing. With some
homework and a bit of trial and error, you will know exactly what it takes
to make your webpage popular with the web crawlers.

Some people believe that search engine optimization will take up lots of their
precious time. People with this particular fear should remember that old adage
about time and money. If time spent optimizing your website leads to an
increase in sales, isn't it time well spent? Besides, search engine
optimization is easy; once you have the hang of it, it won't add much to the
time you already have to devote to updating your website.

You do not have to submit to gobs of search engines to reap the rewards of
search engine optimization.

If you have a large site you shouldn't worry about spending lots of time
optimizing it and running the risk of never finishing the process. If you have
a large website, just take things one step at a time. Focus on optimizing one
page per day. Start with your most important pages and then move on to the less
important ones. By using this one-page-a-day method you won't run the risk of
sitting at your computer until your eyeballs fall out of your head.

It might take some time and some trial and error to optimize your website, but
you will consider it time well spent when you see an increase in the amount of
traffic; the increase in traffic should lead to more sales.

Search Engine Optimization - Hoaxes

Google believes in having a good time. They especially believe in having a good
time on April Fools' Day. How does a company that runs a search engine celebrate
April Fools' Day? They set up search engine hoaxes. April Fools' Day hoaxes are
fast becoming a Google tradition.

On April 1, 2000, Google announced its brand new form of search technology, a
technology they cheerfully named MentalPlex. How did MentalPlex work?
Brainwaves: all the searcher had to do was think about what they wanted to
search for. This eliminated the need for typing, effectively eliminating the
issue of spelling errors.

In 2002, Google openly discussed the genius behind its PageRank system. The
secret? Pigeons, or rather PigeonRank. Google was very proud of the way they had
created a more efficient and cost effective way to rank pages. They were quick
to explain that no pigeons were cruelly treated.

April 2004 offered Google employees the opportunity to work at the Google
Lunar/Copernicus Center...on the moon. This April Fools' Day prank made several
tongue-in-cheek references to Windows XP's visual style. They named the
operating system Luna/X, paying homage to Linux.

Google broke into the beverage industry in 2005 with their Google Gulp. People
who drank Google Gulp would be able to get the most out of their Google search
engines because they would be increasing their intelligence with every swallow.
Google Gulp worked through a series of algorithms that used a real-time
analysis of the drinker's DNA and made precise adjustments to the brain's
neurotransmitters. Google Gulp came in a variety of flavors including Google
Grape (glutamic acid), Sero-Tonic Water (serotonin), Sugar-Free Radical (free
radicals), and Beta Carroty (beta carotene).

2006 was a time for romance. Google created Google Romance. Google's catch
phrase, which appeared on the main search page was, "Dating is a search
problem. Solve it with Google Romance." Google users were invited to use
Soulmate Search, which would send them on a Contextual Date. Google invited
people to "post multiple profiles with a bulk upload."

Google has also taken advantage of April Fool's Day to announce very real
changes in the company. The reason they make real offers to consumers on April
Fool's Day is so that the consumers will think that it's a hoax, joke about it,
and then be pleasantly surprised when they find out that it's real. Google
announced the launch of Gmail, e-mail that was free to the consumer and
provided one entire gigabyte of storage (that amount of storage for free was
unheard of at the time), on March 31, 2004 (most consumers found out about it
on the morning of the first). Exactly one year later they announced that they
were increasing the one gigabyte of storage to two gigabytes.

Google's map of the moon was added to Google Maps on July 20, 2005. The map of
the moon was semi-real: it did show NASA images of a very small section of the
moon, but zooming in on the tiny section presented viewers with a photograph of
Swiss cheese. The locations of all the moon landings were also on the map. The
map was Google's way of celebrating the thirty-sixth anniversary of the first
man on the moon, but many consumers assumed that it was an extension of the
Google Copernicus hoax. Google claims, through something called Google Moon,
that in 2069, Google Local will support all lunar businesses and addresses.

Finding a Search Engine Optimization Company

When it comes to business some people like to get their hands dirty and iron
out every little detail of every little deal and transaction. Others like to
handle the parts of the business that they know and are comfortable with,
leaving the bits and pieces they are unsure about to people who know what they
are doing.

Before you start looking for a search engine optimization company, sit down and
consider your situation. What goals do you have for your website? What are your
priorities? How much can you afford to spend? Remember that you pay for
quality; the lowest price isn't always the best deal.

When it is time to submit your web-based business to a search engine, there are
search engine optimization companies who, for a fee, will be happy to optimize
the websites of business owners who do not feel comfortable doing it
themselves.

Search engine optimization is the art and science of making a website
attractive to search engines. If you don't know where to find a reputable
search engine optimization company, try looking in search engine optimization
forums and in references or articles on reputable websites. Ask friends for
recommendations, and ask other webmasters whether they used anyone to optimize
their sites; if they did, ask which company they used and whether the
experience was pleasant.

The first thing you have to watch out for when you're selecting a company to
handle your search engine optimization is scams. Avoid any search engine
optimization companies that are listed in the black hat directory. Black hat
search engine optimization is not really optimizing but spamdexing, and most
search engines penalize websites that are caught spamdexing. Also avoid any
company that guarantees a ranking before they even look at your site. Make
sure the company you are considering is actually going to do something besides
add doorway pages and meta tags.

What is spamdexing?

Spamdexing is the use of methods that manipulate the relevancy or prominence of
resources indexed by a search engine, usually in a manner that is inconsistent
with the purpose of the indexing system. A lot of the time spamdexing is done
by stuffing a website full of keywords: when web crawlers (the programs search
engines use to rank websites) read the site, they see lots of the same keyword
and assume that the site is content rich. Based on the web crawler's findings
the website is given a high rank. A lot of the time the keywords are stuck at
the bottom of the document where the internet user can't see them. Keyword
stuffing is considered content spam.

The other common type of spamdexing is link spam. Link spam is spamdexing that
takes advantage of link ranking algorithms, causing search engines to give the
guilty website a higher ranking. Link farms, hidden links, Sybil attacks, wiki
spam, spam blogs (also referred to as splogs), page hijacking, buying expired
domains, and referrer log spamming are all forms of link spam.

Search Engine Optimization - Budgeting

For argument's sake, let's say that you own a successful bed and breakfast in
the middle of Idaho. Currently you rely mainly on word of mouth and repeat
customers. You can't help wondering whether creating a website would help
attract more attention to your little business.

A quick internet search has you rethinking your plans. There are a lot of bed
and breakfasts with web pages. You can't help but wonder what you could
possibly do to get your webpage noticed.

The key to a successful webpage is search engine optimization.

Search engine optimization is the art and science of making your website
attractive to the internet's search engines. The more attractive your website
is to the search engines, the higher they will rank your little bed and
breakfast. The higher your website ranks, the more people, hopefully, will
check it out.

The first step towards a successful website is getting it submitted to a search
engine. Search engine submission is the act of getting your website listed with
the search engines. Search engine submission can also be referred to as search
engine registration.

One of the first things you want to consider is how much you are willing to
spend to submit your website to a search engine. It is possible to have your
site listed for free, but paying for the service will generate more traffic to
your website. The cost of submitting your website to Yahoo's search engine is
about three hundred dollars a year. The three hundred dollars pays for Yahoo's
human-compiled directory, which helps direct web crawlers to your website. If
you can't afford the three hundred dollars for the human-compiled directory,
try listing your website for free and see if any of the search engine crawlers
locate it. You can go back in a few months' time and pay for a human-compiled
listing later.

There are businesses that, for a fee, can help you design a website that will
attract web crawlers. Many of these businesses charge different prices for
different search engine optimization packages. Types of search engine
optimization services some of these companies offer include naming
conventions, keyword density/syntax, blog implementation, vertical affiliates,
and third-party posting. When looking for a business or search engine
consultant, look for reciprocal links, keyword strategies, knowledge of HTML,
language skills, knowledge of search engine optimization boosters, submission
strategies, and submission tracking.

If you decide to use a search engine optimization company, take your time and
shop around. Ask questions. Avoid any companies that guarantee instant success;
if it sounds too good to be true, it probably is. Try to find a search engine
optimization company that will work to build the targeted content of your
website. Look for a company that offers interactive features that create
documents that will lead web crawlers to your website.

When it comes to the cost of search engine submission and search engine
optimization spending less simply means it might take a little longer to
realize your goals. The more you are able to spend the faster your website will
gain attention.

Search Engine Optimization - Web Crawlers

The terms web crawler, automatic indexer, bot, worm, web spider, and web
robot all describe programs or automated scripts that browse the World Wide
Web in a methodical, automated manner. The term web crawler is the most
commonly used.

Web crawlers are a tool used for search engine optimization.

Search engines use web crawlers to provide up-to-date data and information. Web
crawlers provide the requested information by creating copies of web pages that
the search engine later processes. Once the information has been processed, the
search engine indexes the pages and is able to quickly retrieve them during a
search. The process of web crawling is a key factor in search engine
optimization. Search engine optimization is the art and science of making web
pages attractive to search engines. Computer people call the process of using a
web crawler to rank a website spidering.

Some search engines use web crawlers for maintenance tasks. Web crawlers can
also be used for harvesting e-mail addresses. The internet is a gaping ocean of
information. In 2000, Lawrence and Giles published a study indicating that
internet search engines had indexed only approximately sixteen percent of the
Web. Web crawlers are designed to download only a tiny amount of the available
pages, a minuscule sample of what the internet has to offer.

Search engines use web crawlers because they can fetch and sort data faster
than a human could ever hope to. In an effort to maximize download speed while
avoiding downloading the same webpage repeatedly, search engines use parallel
web crawlers. Parallel web crawlers require a policy for assigning new URLs.
There are two ways to assign URLs. With dynamic assignment, a central server
assigns new URLs to individual crawlers as the crawl proceeds. With static
assignment, a fixed rule stated from the beginning of the crawl defines how to
assign new URLs to the crawlers.
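Static assignment can be sketched with a simple hashing rule fixed before the
crawl begins; the URLs and the crawler count below are invented for
illustration:

```python
import hashlib

def assign_crawler(url, num_crawlers):
    """Static assignment sketch: a fixed hashing rule, decided before the
    crawl starts, maps every URL to exactly one of the parallel crawlers."""
    digest = hashlib.md5(url.encode()).hexdigest()
    return int(digest, 16) % num_crawlers

# The same URL always lands on the same crawler, so no two crawlers
# ever fetch the same page.
crawler_id = assign_crawler("example.com/rooms", 4)
```

Because the rule never changes, crawlers need no coordination at runtime; the
trade-off against dynamic assignment is that the workload may be uneven.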

In order to operate at peak efficiency web crawlers have to have a highly
optimized architecture.

URL normalization is the process of modifying and standardizing a URL in a
consistent manner. URL normalization is sometimes called URL canonicalization.
Web crawlers usually use URL normalization to avoid crawling the same resource
more than once.
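A few common normalization rules can be sketched with the standard library's
urllib. This is a minimal illustration, assuming the rules shown (lowercasing,
default-port removal, adding a root path, dropping fragments) are the ones we
care about; real crawlers apply longer rule lists:

```python
from urllib.parse import urlsplit, urlunsplit

def normalize(url):
    """Sketch of URL normalization: different spellings of the same
    address compare equal before crawling."""
    parts = urlsplit(url)
    scheme = parts.scheme.lower()
    host = parts.netloc.lower()
    if scheme == "http" and host.endswith(":80"):
        host = host[:-3]              # drop the default HTTP port
    path = parts.path or "/"          # an empty path means the root page
    # Rebuild the URL, discarding any fragment (the part after '#').
    return urlunsplit((scheme, host, path, parts.query, ""))

print(normalize("HTTP://Example.COM:80"))  # http://example.com/
```

With this in place, a crawler that has already fetched http://example.com/
knows not to fetch HTTP://Example.COM:80 as well.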

In an attempt to attract the attention of web crawlers, and subsequently be
highly ranked, webmasters are constantly redesigning their websites. Many
webmasters rely on keyword searches. Web crawlers look at the location of
keywords, the number of keywords, and links.

If you are in the process of creating a website, try to avoid frames. Some
search engines have web crawlers that cannot follow frames. Another thing some
search engines are unable to read is pages delivered via CGI or a database; if
possible, create static pages and save the database for updates. Symbols in the
URL can also confuse web crawlers. You can have the best website in the world,
but if a web crawler can't read it, it probably won't get the recognition and
ranking it deserves.

Search Engine Marketing - How it Differs from Search Engine Optimization

Search engine marketing is a set of marketing methods used to increase the
visibility of a website in search engine results pages. Types of search engine
marketing include search engine optimization, pay per click, paid inclusion,
and social media optimization. Search engine marketing differs from search
engine optimization, which is the art and science of making web pages
attractive to internet search engines.

Non-profit organizations, universities, political parties, and the government
can all benefit from search engine marketing. Businesses that sell products
and/or services online can use search engine marketing to help improve their
sales figures.

Some of the goals of search engine marketing are to develop a brand, generate
media coverage, enhance a reputation, and drive business to a physical
location.

If you do not feel confident enough to try your own search engine marketing,
there are several companies that will be able to help you out for a price. If
you decide to go with a search engine marketing company, take your time and
shop around; find a company that really suits your own business's search
engine marketing needs.

Stay away from companies that promise top rankings. Most companies that promise
top rankings are more interested in your money than they are in keeping your
business. Quite often this type of company will charge you top dollar, spend a
few days making sure your website meets a few basic requirements, and that is
the last you hear from them. This type of search engine marketing company is
not really interested in repeat customers.

Tread carefully around companies that promise first page rankings on the major
search engines like Google and Yahoo. Make sure these companies are talking
about sponsored listings and not just natural listings. Companies that are only
after natural listings traditionally charge a large monthly fee, spend a small
portion of the money on sponsored listings, and pocket the remainder.

The false promise most commonly used by shady search engine marketing companies
is the money-back guarantee. Generally, if you read the contract very carefully,
you will learn that these companies have a very strange idea of what
constitutes a major search engine. Companies that offer a money-back guarantee
typically don't deal with the search engine movers and shakers like Google and
Yahoo!; instead they use small, obscure search engines that are hardly ever
used.

The Search Engine Marketing Professional Organization (SEMPO) was created in
2003 to offer the public educational resources about search engine marketing
and to also promote search engine marketing. Currently SEMPO represents over
500 global search engine marketing companies. SEMPO is happy to offer its
resources to the public for free. SEMPO offers search engine marketing
training courses for any and all interested parties who would like to expand
their knowledge of search engine marketing. SEMPO's objectives are to teach
search engine marketing strategies, techniques, and successful practices, to
increase the availability and quality of its professionals, and to offer
training courses that will help to establish a benchmark for search engine
marketing. The cost of a SEMPO training course can range anywhere from five
hundred dollars for a fundamentals of search marketing class to over two
thousand dollars for an advanced search advertising course.

Yahoo! Search Engine Optimization

Jerry Yang and David Filo were graduate students at Stanford University in
January of 1994 when they created a website that they called "Jerry's Guide to
the World Wide Web," a directory that organized other web sites into a
hierarchy. Four months later Yang and Filo renamed the search engine Yahoo!
after a word used by Jonathan Swift in Gulliver's Travels. Swift's yahoos
were "rude, unsophisticated, uncouth."

At the end of 1994, approximately twelve months after its creation, Yang and
Filo had over one million hits on their fledgling search engine. Understanding
that they had designed something that could enjoy potential business success,
Filo and Yang incorporated Yahoo! early in March of 1995, fourteen months after
its inception. Because the name Yahoo was already the brand name of other
enterprises (human-propelled watercraft, barbecue sauce, and knives), Yang and
Filo were forced to add the exclamation point in order to trademark the name.
Yahoo! had its first public offering on April 12, 1996. Two point six million
shares of Yahoo! were sold at thirteen dollars apiece, earning a total of
thirty-three point eight million dollars.

By the late 1990s Yahoo! and several other internet communications companies
had diversified into web portals.

In the late 1990's Yahoo! also started buying out other companies such as
eGroups and GeoCities. Because Yahoo! had a reputation for changing terms of
service when purchasing companies, most of the buyouts were fraught with
controversy.

Although its stock fell to an all-time low, Yahoo! was able to survive the
bursting of the dot-com bubble. In order to help rebuild itself, Yahoo! started
forming partnerships with telecommunications companies and internet providers;
these alliances led to the creation of content-rich broadband services that
actively competed with AOL.

With their eye on the future, the powers in charge at Yahoo! are working on
creating Yahoo!Next, a concept similar to Google Labs that contains forums that
provide places for Yahoo! users to leave feedback that will hopefully assist in
the development of future Yahoo! enterprises and technologies.

Like most successful companies Yahoo! is constantly working to improve and
expand. Yahoo! currently provides its customers with a smorgasbord of internet
services that cater to most online activities. These services include Yahoo!
Mail, Yahoo! Groups, Yahoo! Maps and Driving Directions, and Yahoo! Messenger.
While Google holds the top spot in search engines, Yahoo! is standing strong in
second place. Yahoo! competes against Google by offering its customers vertical
search services such as Yahoo! Image, Yahoo! Local, Yahoo! Shopping Search,
Yahoo! Video, and Yahoo! News. Yahoo! is proud to boast the largest, most
successful e-mail service in the world.

User-generated content products such as Yahoo! Personals, Yahoo! Photos, Yahoo!
360, and Flickr offer Yahoo!'s customers social networking services.

Yahoo! Shopping, Yahoo! Merchant Solutions, Yahoo! Store, Yahoo! Web Hosting,
Yahoo! Domains, and Yahoo! Business Email are services Yahoo! provides to small
business owners, allowing them to develop their own online businesses using
Yahoo!'s tools.

In March of 2004 Yahoo! launched a paid inclusion program that guaranteed
commercial websites listings on Yahoo! search engines for a fee. While the paid
inclusions were lucrative for Yahoo!, they were unpopular with the online
marketing world. Business owners didn't want to pay the internet mogul for
search engine optimization. Paid inclusion simply guaranteed that a business's
website would be ranked; it didn't guarantee that it would be ranked in the
first two pages.




Spamdexing - the Bane of Search Engine Optimization

Spamdexing is the use of methods that manipulate the relevancy or prominence
of resources indexed by a search engine, usually in a manner inconsistent with
the purpose of the indexing system.

The sheer amount of information available on the internet is mind-boggling. In
2000 a study indicated that the internet's search engines were only capable of
indexing approximately sixteen percent of available pages. That sixteen percent
adds up to pages and pages of potential hits. There are typically ten hits per
page. The average internet user never goes further than the first set of ten.
Webmasters use a variety of techniques to increase their ranking. The art and
science of making web pages attractive to the search engines is called search
engine optimization.

The importance of a high search engine ranking started driving webmasters to
use a variety of tricks to improve their rankings in the mid-1990s. On
May 22, 1996, The Boston Herald printed an article written by Eric Convey
titled "Porn Sneaks Way Back on Web." It was the first time the term
spamdexing was used. The word spamdexing is a merging of the word spam, the
internet's term for unsolicited information, and indexing.

There are two types of spamdexing: content spam and link spam.

Content spam is the use of techniques that alter the search engine's view of a
page's content. Some methods of content spam include the use of hidden text,
keyword stuffing, meta tag stuffing, doorway pages, and scraper sites.

Link spam takes advantage of link-based ranking algorithms to give a website a
higher ranking. Link spam methods include link farms, hidden links, Sybil
attacks, wiki spam, spam blogs (also referred to as splogs), page hijacking,
buying expired domains, and referrer log spamming.

Some people consider spamdexing a black hat search engine optimization
method.

Keyword stuffing is a favorite type of content spamdexing. Keyword stuffing
is including a keyword hundreds of times on a single webpage. Given the sheer
volume of the word, the search engine automatically gives that particular
webpage a higher ranking than one that might use the word legitimately. Most
websites that employ keyword stuffing place the words at the bottom of the page
or write them in text that the person surfing the web can't see. Some search
engines try to discourage keyword stuffing by ranking websites with an
excessive number of keywords at the bottom of the ranking.
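
One way to picture how an engine might flag keyword stuffing is a simple
keyword-density check. The function and the 15% threshold below are invented
for illustration; real engines use far more sophisticated (and secret)
signals:

```python
def keyword_density(text: str, keyword: str) -> float:
    """Fraction of the words on a page that are the given keyword."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

def looks_stuffed(text: str, keyword: str, threshold: float = 0.15) -> bool:
    # The threshold is an illustrative assumption, not any engine's rule.
    return keyword_density(text, keyword) > threshold

normal = "our store sells handmade crafts and trinkets from local artists"
stuffed = "crafts crafts crafts crafts crafts buy crafts cheap crafts now"
print(looks_stuffed(normal, "crafts"))   # False
print(looks_stuffed(stuffed, "crafts"))  # True
```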

Some webmasters like to include the name of a famous person on their site as a
keyword. The name attracts the attention of search engines and web surfers even
though the web site has nothing to do with the person.

Some websites try to steal web surfers from their competitors by including the
competitor's name as a keyword in the body text and meta tags. By doing this
the webmaster has guaranteed that the search engines will index it
accordingly. Using the name of a competitor in the body of a website is
normally a direct violation of trademark law.

Designing a Web Crawler Friendly Web Site

The most successful online businesses all have one thing in common. They all
knew how to make search engine optimization work for them.

Search engine optimization is the art and science of making websites attractive
to the internet's search engines. The first step in successfully achieving
stellar search engine optimization is to lure the search engines' web crawlers
to your website. Web crawlers are computer programs that the search engines use
to gather data and index information from websites. The information the web
crawlers gather is used to determine the ranking of a webpage.

One of the fastest ways to hamper a web crawler is to construct a website that
uses frames. Most search engines have crawlers that can't penetrate frames; if
they can't get into a webpage to read it, that webpage remains unindexed and
unranked. Two search engines, Google and Inktomi, have web crawlers that are
capable of penetrating frames. Before submitting your website to a search
engine, do some research and find out whether its crawler is capable of
penetrating frames.

If you have written frames into your URLs it will probably be worth your effort
to go back and rewrite them. Once you have rewritten your URLs you might be
surprised to find that the new addresses are easier on humans as well as web
crawlers; the frameless URLs are easier to type in documents as links and
references.

Once you have rewritten your URLs it is time to start submitting your website
to search engines. Some webmasters like to use an automated search engine
submission service. If you decide to go with a submission service you should
be aware that there will be a fee involved; the minimum fee is typically
fifty-nine US dollars. This price should keep a few URLs on the search engines
for a year. Other webmasters like to avoid big fees by submitting their website
to individual search engines on their own.

Once your webpage is submitted to a search engine you need to sit down and
design a crawler page. A crawler page is a webpage that contains nothing except
links to every single page of your website. Use the title of each page as the
link text. This will also give you some extra keywords that will
help improve the ranking the crawlers assign to your website. Think of the
crawler page as a site map to the rest of your website.
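
As an illustration, a crawler page can be generated mechanically from a list
of page titles and URLs. The site structure below is made up for the example:

```python
def build_crawler_page(pages: dict) -> str:
    """Build a bare-bones HTML crawler page: one link per page of the
    site, using each page's title as the link text."""
    links = "\n".join(
        f'  <li><a href="{url}">{title}</a></li>'
        for url, title in pages.items()
    )
    return f"<html><body>\n<ul>\n{links}\n</ul>\n</body></html>"

# Hypothetical pages for the small-town craft store described later on.
site = {
    "/index.html": "Midwest Handmade Crafts",
    "/catalog.html": "Craft Catalog",
    "/contact.html": "Contact the Store",
}
print(build_crawler_page(site))
```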

Typically, the crawler page won't appear in the search results. This happens
because the page doesn't have enough text for the crawlers to give that
individual page a high ranking; after all, it's nothing more than a portal to
the rest of your site, and your human users won't need to use it. Don't panic
if the crawlers don't instantly appear to index your website. There are a lot
of websites on the internet that need to be crawled, indexed, and then
ranked. It can sometimes take up to three months for a web crawler to get to
yours.

How Google's PageRank Determines Search Engine Optimization

Some internet search engines are set up to look for keywords throughout a
webpage; they then use a mathematical equation that takes the number of times
the keywords appear on the webpage and factors it with the location of the
keywords to determine the ranking of the webpage.
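
A toy sketch of this kind of scoring might count keyword occurrences and
weight them by location. The function, pages, and weights below are invented
for illustration only:

```python
def keyword_score(page: dict, keyword: str) -> float:
    """Count keyword occurrences, weighting matches in the title more
    heavily than matches in the body (illustrative weights)."""
    kw = keyword.lower()
    title_hits = page["title"].lower().split().count(kw)
    body_hits = page["body"].lower().split().count(kw)
    return 3.0 * title_hits + 1.0 * body_hits

pages = [
    {"title": "Snake Care Guide", "body": "caring for your snake at home"},
    {"title": "Garden Tools", "body": "a snake can appear in any garden"},
]
# Rank pages for the query "snake", highest score first.
ranked = sorted(pages, key=lambda p: keyword_score(p, "snake"), reverse=True)
print(ranked[0]["title"])  # Snake Care Guide
```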

Other internet search engines use a process that judges the number of times a
webpage is linked to by other web pages to determine how a webpage is ranked. The
process of using links to determine search engine ranking is called link
analysis.

Keyword searches and link analysis are both part of a routine internet search
engine procedure called search engine optimization. Search engine optimization
is the art and science of making a website attractive to search engines; the
more attractive a website appears to the search engines, the higher it will
rank in searches, and in the world of internet searches ranking is everything.

As 2006 faced its last weeks, Google was the internet search engine that most
internet users preferred. Approximately fifty percent of the time that
consumers turned to a search engine for their internet needs, they turned to
Google.
Yahoo! was the second favorite.

Most of Google's popularity is credited to its preferred form of search engine
optimization, a trademarked program Google dubbed PageRank. When PageRank was
patented the patent was assigned to Stanford University.

PageRank was designed by Larry Page (the name is a play on his name) and
Sergey Brin while they were students at Stanford University, as part of a
research project about internet search engines.

PageRank is based on a link analysis algorithm. PageRank is described as a
link analysis algorithm that assigns a numerical weight to each individual
element of a hyperlinked set of documents. The purpose is to measure each
element's relative importance within the set. The numerical weight assigned to
any element E is called the PageRank of E, denoted PR(E).

PageRank operates on a system similar to a voting booth. Each time it finds a
hyperlink to a webpage, PageRank counts that hyperlink as a vote that supports
the webpage. The more pages that link to the page, the more votes of support
the webpage receives. If PageRank comes across a website that has absolutely no
links connecting it to another webpage then it is not awarded any votes at all.
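
The voting idea can be sketched as a small power-iteration program. This is a
simplified illustration of the published PageRank idea, not Google's actual
implementation; the damping factor of 0.85 comes from the original paper, and
the three-page graph is made up:

```python
def pagerank(links: dict, damping: float = 0.85, iterations: int = 50) -> dict:
    """Each page's rank is divided among the pages it links to, so an
    incoming link acts like a vote of support."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outgoing in links.items():
            if outgoing:
                share = rank[p] / len(outgoing)
                for q in outgoing:
                    new[q] += damping * share
            else:
                # A page with no outgoing links spreads its vote evenly.
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# "a" is linked to by both other pages, so it earns the highest rank.
graph = {"a": ["b"], "b": ["a"], "c": ["a"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # a
```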

Tests done with a model like PageRank have shown that the system is not
infallible.

The HITS algorithm is an alternative to the PageRank algorithm.

Google's powers that be take a dim view of spamdexing. In 2005 Google
introduced nofollow, an attribute designed to allow webmasters and bloggers to
create links that PageRank would ignore. The same system was also used to keep
spamdexing to a minimum.

Google has designed PageRank to be an eight-unit measurement. Google displays
the value PageRank places on each website directly beside each website it
displays.

It has been proposed that a version of PageRank should be used to replace the
ISI impact factor as a measure of a journal's citation quality.

Google versus Yahoo

When it comes to internet search engines the top two are without a doubt Google
and Yahoo!.

Although the two are fierce competitors they share more common bonds than
some people might realize. Both were created by students at Stanford
University. Yahoo! was created in January of 1994 by two Stanford graduate
students, Jerry Yang and David Filo. The pair originally called Yahoo!
"Jerry's Guide to the World Wide Web" but later changed the name to Yahoo!,
commemorating the word that Jonathan Swift defined in his classic novel
Gulliver's Travels. In the book Swift stated that the word meant "rude,
unsophisticated, uncouth." Four years after Yang and Filo had created Yahoo!
and introduced it to the world (by this time it was an internet mogul), two
different Stanford University students, Larry Page and Sergey Brin, created
their own search engine, Google, as a research project; the date was
September 7, 1998. Google started out as the search engine used on Stanford
University's website before the company went public on August 19, 2004. When
2006 ended Google was the leading internet search engine, enjoying over
50.8% of the market.

By the time it was a year old Yahoo! had had over a million hits; the sheer
number of people who had found and were using Yahoo! prompted its creators to
incorporate their creation in March of 1995. Yahoo! went public on April 12,
1996, selling 2.6 million shares.

Google's progress was a little slower than Yahoo!'s. Shortly after creating
Google, Page and Brin registered the domain google.com on September 17, 1997.
Approximately one year after registering the domain, the pair decided to
incorporate their research project. Finally, on August 19, 2004, Google had
its very first public offering. Google is currently the favorite internet
search engine.

After its meteoric climb to glory, Yahoo!'s creators and shareholders were
confident that they were holding onto a gold mine. They didn't predict the
burst of the dot-com bubble in the early 2000s. Yahoo! survived the crisis,
but the value of Yahoo! stock dropped to $8.11, an all-time low.

Yahoo! uses a combination of web-crawler-compiled and indexed results to rank
the websites and webpages registered on its search engine. In addition to
rankings compiled by the web crawler, webmasters can, for a fee, purchase a
submission to Yahoo!'s human-compiled directory. The annual fee is about
three hundred dollars. The theory is that the listings humans provide will
influence web crawlers into giving the website a higher ranking.

Google credits its success and popularity to the program it uses to search and
rank webpages, a program it calls PageRank. Because Google is worried about
webmasters using abusive techniques to garner higher rankings, Google
carefully keeps the hows and whys of PageRank a closely guarded secret. Google
does confess that PageRank runs on a link analysis algorithm. PageRank was
different from the rest of the search engine optimization techniques because
it graded each page based on the number and quality of the links that pointed
to it.

Yahoo! quickly grew fond of offering the webmasters that subscribed to its
search engine the opportunity to purchase something called paid inclusion. In
exchange for a fee, Yahoo! guaranteed that the webpages would be ranked. What
Yahoo! didn't guarantee was what type of ranking the webpages would receive;
they refused to promise that the webpages would appear in the first two pages
of a search.

Google uses a pay-per-click method to charge advertisers. Each time an
advertiser's link is clicked, Google charges the account fifty cents.

Newer is not Always Better When it Involves Search Engine Optimization

We live in a world where everybody wants the latest and greatest; somewhere
along the way we have come to the conclusion that the newer something is, the
better. If we are buying a CD it has to be the latest release from the new
one-hit wonder; we don't care if the songwriter couldn't tell melody from
harmony or that the singer is incapable of carrying a tune, all that matters
is that it's new. Each fall hundreds of people scramble to get to car
dealerships, frantic to drive the next year's models, barely capable of
waiting for them to be unloaded off the truck. It doesn't matter if we are six
months behind on car payments on last year's model, which is in perfect
running condition; we're blinded by all the bells and whistles that the new
cars have to offer. People will stand in a long line, overnight, in an
electrical storm, simply to spend an unhealthy amount of money on the latest
electronic gadget just because it is brand new. We don't care that in just a
few months it will cost a fraction of the price; we have to have it now.

Even internet service suffers from the right-now syndrome. For years we were
content with dialup service. Sure it was slow, but it was that or nothing.
Heck, we hardly noticed that it took hours to download a simple file or days
to upload a couple of pictures; downloading a video was practically unheard
of. We didn't know any better. Now that the world has found out about all the
new options for internet service, we have to have that too. It doesn't matter
that it is double the monthly cost or that we have to default on our student
loans in order to purchase the necessary equipment. If it is cordless, faster,
and designed with the latest technology, we have to have it... right now.

We don't care if the old stuff is made with better materials, lasts longer,
and is cheaper. In our minds old equals junk.

Search engine optimization is one spot where we should force ourselves to shed
our weird inhibitions about old stuff. When it comes to search engine
optimization, age rules over youth.

Search engine optimization is the art and science of making web pages
attractive to the search engines. The more attractive a web site appears
(search engines are attracted not to beauty but to relevant keywords and
links), the higher it ranks in the search engine's results. A low ranking
could potentially be the kiss of death to an internet-based business because
studies have shown that internet users seldom look past the second page of
hits.

Search engines use web crawlers to determine a website's ranking.

Older websites and the webmasters who manage them have had more time to
develop and optimize their content. They are already itemized and ranked by
the search engines. In some cases it can take three months for a web crawler
to get around to spidering a brand-new website that has been submitted to the
search engine; old sites, meanwhile, are already appearing in results and
gaining customer recognition. If an older site has been around long enough to
have earned a loyal customer base, loyal customers will still look for it even
if a shuffle in the rankings causes the aged web site to be bumped from its
prime ranking position.

Natural Search Engine Optimization or Pay-Per-Click

The internet is like having the world at one's fingertips. Not only does it
provide families a cheap way to stay in touch (e-mail and instant messaging)
and allow students to cram for finals and write last-minute papers in the
middle of the night, long after the library has closed, but the internet is
suddenly a way for the smallest business to break into a global market.

Let's pretend that you are the owner of a small novelty store in a small rural
town in the Midwest. Most of your merchandise is handmade trinkets and crafts
created by the residents of the small town (on commission so the up front cost
of most of your merchandise is minimal). Although business is slow during the
winter months during the tourist season you turn a tidy profit. One day as a
Chicago tourist purchases a photo of the late afternoon sun glinting off a herd
of sleeping cattle she mentions that she wishes you had a website so she could
purchase quaint Christmas gifts for her family. As she leaves the store, her
wrapped photograph tucked under her arm, you stare at your computer.

The internet could be a cheap way to increase your profit margin. You already
have your physical business, a website would simply be an addition. You look at
all the pretty knickknacks arranged throughout the store. If you expanded your
business to include a website you could sell mid-western trinkets all over the
world. It wouldn't take that much time. You have a friend that would design and
teach you how to manage a website for free. You could answer questions during
the slow times when you're not doing anything anyway. It would be a win-win
situation.

In theory you're correct. A website could be a lucrative addition to your
business.

It is possible to design a website, register a domain name, and submit it to a
search engine. But what happens next? Just like the physical shop, the website
will not do any business if there isn't any traffic. No one will visit your
online store if they don't know about it.

The chances are good that your regular customers will check out your website,
and the ones who made items you have featured will probably tell their friends
and families about it, but the chances are also good that they won't buy
anything; why should they pay for shipping and handling when they can drive a
couple of miles and purchase it directly from you? Your tourist customers
might buy from your online store, but only if they know about it, and since
you probably waited until the slow season to create your website it will be
months before you can tell them.

You could look into search engine optimization.

You might even want to consider something called pay-per-click.

Pay-per-click is a search engine advertising model that bases rankings on
something called bid position. A website owner bids for an elevated position
in the rankings when a certain keyword is typed into the search bar. The
higher the bid, the higher the ranking.
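
In its simplest form, bid-position ranking is just a sort by bid amount. The
advertiser names and bid amounts below are fictitious:

```python
def rank_by_bid(bids: dict) -> list:
    """Order advertisers competing on a keyword by bid, highest first."""
    return sorted(bids, key=bids.get, reverse=True)

# Hypothetical bids (in dollars per click) on the keyword "crafts".
bids_for_crafts = {
    "midwest-trinkets.example": 0.45,
    "big-box-crafts.example": 0.80,
    "craft-depot.example": 0.62,
}
print(rank_by_bid(bids_for_crafts))
# ['big-box-crafts.example', 'craft-depot.example', 'midwest-trinkets.example']
```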

Businesses that use pay-per-click prefer it to natural search engine
optimization because it's an easy, efficient way to improve a site's ranking
and increase its traffic. Pay-per-click also lets webmasters maintain control
over the search engine campaign.

People who forgo pay-per-click in favor of natural search engine optimization
say that the cost of pay-per-click is too high.

Controversy Lends a Helping Hand to Search Engine Optimization

It is always wonderful to hear good news. Hearing good news makes us feel good
about ourselves, the people around us, our dog... heck, the world is a better
place when we have good news.

Good news might make us feel good about ourselves and the world but there is
something deliciously appealing about bad news, especially if it is about
someone other than ourselves.

Bad news makes good news copy. Celebrities know that. I once watched an
interview with a well-known, highly controversial singer/songwriter and
performer. The newspapers are always full of articles and stories about his
exploits (he and I share the same home state so I think the papers I read have
probably double what papers in the rest of the country print). The interviewer
asked this singer about one of his recent escapades. The singer kind of
chuckled and shyly admitted that while the episode had happened it had been
blown out of proportion. When the interviewer asked why the singer did nothing
to correct the allegations the singer bluntly replied... money. Each time
someone accused him of doing something awful, kids rushed to the stores to buy
his CDs, partly because his name was being splashed all over the airwaves and
was fresh in their minds when they perused the music department, but also
partly because their parents were trying to ban his music from the house. When
he was on his best behavior he didn't get any media attention and
his record sales plummeted. So, since the singer is anything but stupid and he
has a deep appreciation for the things money can buy, he goes a little bit out
of his way to perpetuate his bad boy image.

Bloggers are another group of people who understand how swiftly controversy
spreads. They know that if they write about something controversial there will
be a flood of readers reading their blogs and leaving feedback. Before you
know it a dialogue has started; sometimes it isn't a peaceful dialogue, but
it's a dialogue just the same.

The same thing can be true about websites and search engine optimization.
Search engine optimization is the art and science of making a web site
appealing to search engines. Search engines determine the attractiveness of a
website by sending out web crawlers that look for keywords and links placed
throughout the website. The more of these signals a website has, the higher it
gets ranked during a search.

A second thing several search engines look for is something called link
analysis. Web crawlers look for how many links lead back to the website. The
more links leading back to a website the higher that website will rank.

Controversy is a way to get a lot of links to your website fast. For example,
suppose you are a breeder of ball pythons and you go to an exotic pet show to
purchase some more snakes for your store. While you are at the show the police
storm the pet show, using excessive force to remove several of the exhibitors.
You snap several graphic pictures of the event, photos you later post on your
website where you sell the snakes you breed. Others see the controversial
photos posted on your site, and they tell their friends and customers. To
simplify things, the owner of a second pet store posts a link on his site that
attaches directly to yours. As more and more people hear about your photos,
more and more links to your site are created. The next thing you know you are
ranked on the very first page of the search engine's hits.

In addition to the boost in your ranking you have also sold nearly all of your
saleable snakes. Controversy really does sell.

Social Media Optimization

A Popular New Trend that Breaks from Search Engine Optimization

Social media optimization is similar to search engine optimization. The
goal of social media optimization is to drive huge amounts of people to a
specific website. Social media optimization can also be used to determine
whether a startup website will be successful or whether it will fall flat with
consumers. Social media optimization uses new media to encourage traffic to a
website.

Social media optimization was a name coined by Rohit Bhargava, a vice
president of interactive marketing.

Social media optimization refers to online tools and platforms that can be
used to share opinions, insights, and perspectives. It can take many forms,
such as text, images, audio, and video. Popular forms of social media
optimization are currently blogs, podcasts, message boards, vlogs, and wikis.
Social media optimization is anything that builds a community where people can
rendezvous. Social media optimization normally includes websites that can be
used as a platform to send out a marketing message.

Social media optimization is not something that can be forced. It is considered
a type of pull marketing; it only works if people are drawn to it. Search engine
optimization has clear goals. Webmasters who use search engine optimization want
to have a website that ranks well with the search engines.

The goals of webmasters who are trying to use social media optimization are an
increase in linkability, easy bookmarking, mash-ups, inbound links, and
helping content travel.

The rules webmasters who are using social media optimization need to live by
are to reward valuable (helpful) users and to make sure they are a resource
for users.

Webmasters who are using a social media optimization technique should make sure
they participate with their users. You need to be a part of the blogs and the
message boards.

Webmasters that are employing a social media optimization technique must know
their target audience. You need to know what appeals to that particular group
of people. It is important to remember that not everybody will love you.

Make sure you have created content. One of the words typically associated with
social media optimization is mashup. The word mashup gets its start in the pop
music world. A mashup is a website or application that works to combine
content from more than one source into an integrated experience. A mashup is
sometimes created as a way to gather feedback on an existing project or body
of work. Most companies access a third party's content via a public interface.
Google, Amazon, Yahoo!, eBay, AOL, and Windows Live are some of the companies
currently experimenting with mashups.

Despite the fact that social media optimization is a relatively new concept,
some people believe it will be one of the top seven marketing trends of 2007,
along with sharing corporate personalities, widget marketing, auto tagging,
human-filtered searches, content casting, and online identity shifting.

Social media optimization is something that encourages a fun social
environment. Enjoy it.

Algorithms-The Foundation of Search Engine Optimization

In the ninth century Abu Abdullah Muhammad ibn Musa al-Khwarizmi, a Persian
mathematician, introduced algebraic concepts and Arabic numerals while he was
working in Baghdad. At the time Baghdad was the international center for
scientific study. Al-Khwarizmi's process of performing arithmetic with Arabic
numerals was called algorism. In the eighteenth century the name evolved into
algorithm. Algorithms are a finite set of carefully defined instructions:
procedures for accomplishing some task that will end in a defined end-state.
Algorithms are used in linguistics, computing, and mathematics.

Many people like to think of an algorithm as the steps in a well-written
recipe. Provided you follow each step of the recipe to the letter, you will
have an edible dinner; as long as you follow each step of the algorithm, you
will find the proper solution. Simple algorithms can be combined to design
complex algorithms.
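The recipe analogy can be made concrete. A minimal sketch (in Python, purely illustrative) of Euclid's greatest-common-divisor algorithm shows a finite set of carefully defined instructions that always terminates in a defined end-state:

```python
def gcd(a, b):
    """Euclid's algorithm: repeat one carefully defined step until the
    defined end-state (b == 0) is reached; a then holds the answer."""
    while b != 0:
        a, b = b, a % b  # each pass strictly shrinks the problem
    return a
```

Following the steps in order, gcd(48, 18) reduces the pair to 6, just as following a recipe step by step produces the dish.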

Computers use algorithms as a way to process information. All computer programs
are created with algorithms (or series of algorithms) that give the computer a
list of instructions to follow. Computers usually read data from an input
device when using an algorithm to process information. To be successful, an
algorithm needs to be defined precisely enough for a computer to follow it.
Program designers need to consider every possible scenario that could arise and
set up a series of algorithms to resolve each problem. Designers have to be
very careful not to change the order of the instructions; computers cannot cope
with an algorithm whose steps are out of order. Flow of control refers to how
the list of instructions must start at the top and go all the way to the
bottom, following every single step on the way.

Algorithms can be expressed in natural languages, flowcharts, pseudocode, and
programming languages. Natural-language descriptions are generally used only
for simple algorithms. Computers use programming languages that are intended
for expressing algorithms.

There are different ways to classify algorithms. The first is by the specific
type of algorithm. Types of algorithms include recursive and iterative
algorithms, deterministic and non-deterministic algorithms, and approximation
algorithms. The second method used to classify algorithms is by their design
methodology, or paradigm. Typical paradigms are divide and conquer, the greedy
method, linear programming, dynamic programming, search and enumeration,
reduction, and probabilistic and heuristic paradigms. Different fields of
scientific study classify algorithms in their own ways to make their field as
efficient as possible. Some types of algorithms different scientific fields use
include search algorithms, merge algorithms, string algorithms, combinatorial
algorithms, cryptographic algorithms, sorting algorithms, numerical algorithms,
graph algorithms, computational geometric algorithms, data compression
algorithms, and parsing techniques.
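As an illustration of the first classification, the same task can often be written either recursively or iteratively. The sketch below (Python, illustrative only) computes a factorial both ways:

```python
def factorial_recursive(n):
    # recursive: the function is defined in terms of itself
    return 1 if n <= 1 else n * factorial_recursive(n - 1)

def factorial_iterative(n):
    # iterative: a loop replaces the self-reference
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result
```

Both follow a finite, carefully defined list of steps and reach the same defined end-state; only the design differs.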

Internet search engines use algorithms to aid in search engine optimization.
Google's web crawlers use a link analysis algorithm to index and rank web
pages. In an attempt to prevent webmasters from using underhanded schemes to
influence rankings, many internet search engines disclose as little as possible
about the algorithms they use.

The History of Internet Search Engines

Just a little over ten years ago, if people needed information they were forced
to go to the local library and spend hours entombed amongst shelves of books.
Now that the internet is available in almost every home, finding information is
easier than ever before. When someone needs information, all they have to do is
boot up their computer and type their needs into a search engine.

A search engine is an information retrieval system that is designed to help
find information stored on a computer system.

In 1990 the very first search engine was created by students at McGill
University in Montreal. The search engine was called Archie and it was invented
to index FTP archives, allowing people to quickly access specific files. FTP
(short for File Transfer Protocol) is used to transfer data from one computer
to another over the internet, or through any network that supports the TCP/IP
protocol. In its early days Archie contacted a list of FTP archives
approximately once a month with a request for a listing. Once Archie received a
listing it was stored in local files and could be searched using the UNIX grep
command. In its early days Archie was a local tool, but as the kinks got worked
out and it became more efficient it became a network-wide resource. Archie
users could utilize its services through a variety of methods including e-mail
queries, telnetting directly to a server, and eventually through World Wide Web
interfaces. Archie only indexed computer files.

A student at the University of Minnesota created a search engine that indexed
plain text files in 1991. They named the program Gopher after the University of
Minnesota's mascot.

In 1993 a student at MIT created Wandex, the first Web search engine.

Today, search engines match a user's keyword query with a list of potential
websites that might have the information the user is looking for. The search
engine does this by using a piece of software called a crawler to probe web
pages that match the user's keywords. Once the crawler has identified web pages
that may be what the user is looking for, the search engine uses a variety of
statistical techniques to establish each page's importance. Most search engines
establish the importance of hits based on the frequency of word distribution.
Once the search engine has finished searching web pages, it provides a list of
websites to the user.
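The frequency-based scoring described above can be sketched as a toy ranker. This is far cruder than anything a real engine uses, and all names here are hypothetical, but it shows the core idea of ordering pages by how often the query terms appear:

```python
def rank_pages(pages, query):
    """Rank pages by raw query-term frequency.

    `pages` maps a URL to its text. A real search engine weighs many
    more signals, but word frequency is the basic statistic.
    """
    terms = query.lower().split()

    def score(text):
        words = text.lower().split()
        return sum(words.count(t) for t in terms)

    # highest-scoring pages first, mirroring a results list
    return sorted(pages, key=lambda url: score(pages[url]), reverse=True)
```

For example, given two pages where only one mentions the query term, that page would be listed first.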

Today, when internet users type a word into a search engine they are given a
list of websites that might be able to provide them with the information they
seek. The typical search engine provides ten potential hits per page. The
average internet user never looks farther than the second page the search
engine provides. Webmasters are constantly finding themselves forced to use new
methods of search engine optimization to be highly ranked by the search engines.

In 2000, a study done by Lawrence and Giles suggested that internet search
engines were only able to index sixteen percent of all available web pages.

A Brief History of Search Engine Optimization

Search engine optimization is the art and science of making web pages
attractive to internet search engines. Some internet businesses consider
search engine optimization to be a subset of search engine marketing.

In the middle of the 1990s webmasters and search engine content providers
started optimizing websites. At the time all a webmaster had to do was provide
a URL to a search engine, and a web crawler would be sent from the search
engine. The web crawler would extract links from the webpage and use the
information to index the page by downloading the page and storing it on the
search engine's server. Once the page was stored on the search engine's server,
a second program, called an indexer, extracted additional information from the
webpage and determined the weight of specific words. When this was complete the
page was ranked.

It didn't take very long for people to understand the importance of being
highly ranked.

In the beginning, search engines relied on information that webmasters provided
about their own web pages. It didn't take webmasters very long to start abusing
the system, requiring search engines to develop a more sophisticated form of
ranking. The search engines developed a system that considered several factors:
domain name, text within the title, URL directories, term frequency, HTML tags,
on-page keyword proximity, alt attributes for images, on-page keyword
adjacency, text within NOFRAMES tags, web content development, sitemaps, and
on-page keyword sequence.

Google developed a new concept of evaluating internet web pages called
PageRank. PageRank weighs the quantity and quality of a page's incoming links.
This method of search engine optimization was so successful that Google quickly
began to enjoy positive word of mouth and consistent praise.
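The idea behind link-based scoring can be sketched with a toy power-iteration loop. This is an illustrative simplification, not Google's actual implementation, and it ignores complications such as pages with no outgoing links:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank: a page's score is fed by the scores of the pages
    linking to it, so both the quantity and the quality of incoming
    links matter. `links` maps each page to the pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            for target in outgoing:
                # a page shares its score evenly among its outgoing links
                new[target] += damping * rank[page] / len(outgoing)
        rank = new
    return rank
```

A page with many incoming links from already well-ranked pages ends up with a higher score than a page linked to only rarely.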

To help discourage abuse by webmasters, several internet search engines, such
as Google, Microsoft, Yahoo, and Ask.com, will not disclose the algorithms they
use when ranking web pages. The signals used today in search engine
optimization typically include keywords in the title, link popularity, keywords
in links pointing to the page, PageRank (Google), keywords that appear in the
visible text, links from one page to the inner pages, and placing the punch
line at the top of the page.

For the most part registering a webpage/website on a search engine is a simple
task. All Google requires is a link from a site already indexed and the web
crawlers will visit the site and begin to spider its contents. Normally a few
days after registering on the search engine the main search engine spiders will
begin to index the website.

Some search engines will guarantee spidering and indexing for a small fee.
These search engines do not guarantee specific rankings. Webmasters who don't
want web crawlers to index certain files and directories use a standard
robots.txt file. This file is located in the site's root directory. Occasionally
a web crawler will still crawl a page even if the webmaster has indicated he
does not wish the page indexed.
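A robots.txt file works through simple directives like the ones below (a generic illustration; the paths are hypothetical):

```
# robots.txt lives in the site's root, e.g. example.com/robots.txt
User-agent: *
# ask all crawlers to skip these directories
Disallow: /private/
Disallow: /drafts/
```

Compliance is voluntary, which is why a crawler can still fetch a page the webmaster asked it to skip.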

Search Engine Optimization and the Knight

On October 15, 1881 a baby by the name of Pelham Grenville Wodehouse (Plum to
his friends) was born. In 1996, one hundred and fifteen years later, a brand
new internet search engine would be named in honor of him, sort of.

P.G. Wodehouse was an extremely popular English writer who had a flair for
comedy. Magazines like The Saturday Evening Post and The Strand serialized his
novels while he spent time in Hollywood working as a screenwriter. P.G.
Wodehouse was an incredibly prolific writer. His writing career officially
started in 1902 and ended in 1975. During that time he wrote ninety-six books,
several collections of short stories, screenplays, and one musical.

When he was ninety-three years old, P.G. Wodehouse was made a Knight of the
British Empire. Two of Mr. Wodehouse's most famous characters (or perhaps
infamous, depending on your point of view) are the bumbling Bertie Wooster and
his long-suffering valet, Jeeves.

P.G. Wodehouse will always be remembered for his comedic approach to writing.

In 1996, when Garret Gruener and David Warthen needed a name for the internet
search engine they had created, they chose the name of Wodehouse's fictional
valet. The website was called Ask Jeeves. Jeeves remained the search engine's
mascot until the company retired him on February 27, 2006, a decision it
announced on September 23, 2005. Jeeves's retirement prompted the search engine
to create a page titled "Where's Jeeves?" that listed a variety of creative
activities, including growing grapes and space exploration, the valet planned
to do during his retirement. With Jeeves retired, the search engine simply
became Ask.com. During his reign at Ask Jeeves, the valet was always impeccably
dressed in a beautifully tailored black suit, shiny shoes, and a red tie.
Although his posture changed almost yearly on the company logo, he always had
the same amicable smile.

When it was first created, the idea behind Ask.com (back then it was still Ask
Jeeves) was that questions would be posed in regular language and answers would
be hunted down and provided. The creators of Ask Jeeves (now Ask.com) hoped
that internet users would be drawn to the intuitive, user-friendly style.

The growing popularity of keyword search engines like Yahoo! and Google
prompted the powers-that-be at Ask Jeeves to overhaul their search engine to
include keyword searches in addition to answering questions. Because Ask.com
was not as quick to index new websites as some of its competitors, it was not
bogged down with computer-generated link spam. When users were unable to find
usable web pages on the three most popular internet search engines, they turned
to Ask.com, which still had viable pages readily available.

Today, Ask.com uses the ExpertRank algorithm to provide its users with search
results. Ask.com uses link popularity and subject-specific popularity to help
determine rankings.

Ask.com has sold its technology to additional corporations, including Toshiba
and Dell. Ask.com also owns a variety of web destinations, including
country-specific sites for Germany, Italy, Japan, the United Kingdom, the
Netherlands, and Spain, as well as Excite, IWon.com, Bloglines, and Ask For
Kids.

