CHAPTER 1
Search Engine Optimization
The Search Engine Industry
As the Internet grew and became an integral part of day-to-day work, it became almost impossible for a user to fetch exact or relevant information from such a huge web. Hence 'search engines' were developed. Search engines became so popular that more than 80% of website visitors now come from them. But what exactly is a search engine? According to Webopedia, a "search engine is a program that searches documents for specified keywords and returns a list of the documents where the keywords were found".
For example, if you want to know about the automobile market in India, you might type keywords such as 'automotive market', 'automobiles in India' or 'automobile manufacturers in India', and once you click the search button you get the most relevant results for those keywords.
The search engine industry is dominated by three major players, Google, Yahoo! and MSN, with market shares of 35%, 32% and 16% respectively (according to a searchenginewatch.com survey, 2004). A report released in March 2005 indicated that search engines are used between 2 and 3.5 billion times per day to find information online. People using search engines are hunting for specific information, so the audience is highly targeted. There are many other search engines, such as AskJeeves, AOL and Excite, that are also popular among net users. But what is most striking are the statistics on search engine usage.
The use of search engines is a top online activity, and netizens increasingly feel they get the information they want when they execute search queries.
On the probable eve of Google’s initial public offering, new surveys and traffic data confirm that search engines have become an essential and popular way for people to find information online. A nationwide phone survey of 1,399 Internet users between May 14 and June 17 by the Pew Internet & American Life Project shows:
• 84% of internet users have used search engines. On any given day online, more than half those using the Internet use search engines. And more than two-thirds of Internet users say they use search engines at least a couple of times per week.
• The use of search engines usually ranks second only to email as the most popular online activity. During periods when major news stories are breaking, the act of getting news online usually surpasses the use of search engines.
• There is a substantial payoff as search engines improve and people become more adept at using them. Some 87% of search engine users say they find the information they want most of the time when they use search engines.
• The convenience and effectiveness of the search experience solidifies its appeal. Some 44% say that most times they search they are looking for vital information they absolutely need.
comScore Networks tracking of Internet use shows that among the top 25 search engines:
• Americans conducted 3.9 billion total searches in June
• 44% of those searches were done from home computers, 49% were done from work computers, and 7% were done at university-based computers.
• The average Internet user performed 33 searches in June.
• The average visit to a search engine resulted in 4.4 searches.
• The average visitor scrolled through 1.8 result pages during a typical search.
• In June, the average user spent 41 minutes at search engine sites.
• comScore estimates that 40-45 percent of searches include sponsored results.
• Approximately 7 percent of searches in March included a local modifier, such as city and state names, phone numbers or the words “map” or “directions.”
• The percentage of searches that occurred through browser toolbars in June was 7%
Search Engines Market Share:
Voted Most Outstanding Search Engine four times, Google is the undisputed market leader of the search engine industry. Google is a crawler-based search engine known for providing both comprehensive coverage of the web and highly relevant information. It attracts the largest number of searches, up to 250 million searches every day.
Yahoo! is the second largest player in the industry, with a 32% market share. Yahoo! started as a human-compiled directory but turned into a crawler-based search engine in 2002. Until early 2004 it was powered by Google, but after that it started using its own technology.
Overture stands next to Google in terms of number of searches per day. It is owned by Yahoo! and attracts more than 167 million searches per day. Overture was the first search engine to come up with a PPC program. AskJeeves initially gained fame in 1998 and 1999 as the "natural language" search engine that let you search by asking questions and responded with what seemed to be the right answer to everything. When launched, it was run by around 100 editors who monitored search logs. Today, however, AskJeeves depends on crawler-based technology to provide results to its users.
Search Engine History:
Though Google is responsible for where search engines stand today, the first search engine was invented well before Google was incorporated.
Alan Emtage, a student at McGill University, created the first search engine in 1990 and named it 'Archie'. Back then there was no World Wide Web; FTP was the means of sharing data. It was effective in smaller groups, but the data became fragmented as fast as it was collected. Archie helped solve this scatter problem by combining a script-based data gatherer with a regular-expression matcher for retrieving file names matching a user query. Essentially, Archie became a database of file names, which it would match against users' queries.
Archie was so popular that in 1993 the University of Nevada System Computing Services group developed 'Veronica'. Veronica served the same purpose as Archie, but it worked on plain-text files. Soon another user interface named 'Jughead' appeared with the same purpose as Veronica; both were used for files sent via Gopher, which was created as an Archie alternative by Mark McCahill at the University of Minnesota in 1991.
Now the challenge was to automate the process, and the first Internet robot was introduced. Computer robots are simply programs that automate repetitive tasks at speeds impossible for humans to match. Matthew Gray created the first such bot, the World Wide Web Wanderer, in 1993; he initially wanted to measure the growth of the web and used the bot to count active web servers. He soon upgraded it to capture actual URLs.
Its database became known as the Wandex. The Wanderer was as much a problem as it was a solution, because it caused system lag by accessing the same page hundreds of times a day.
By December of 1993, three full-fledged bot-fed search engines had surfaced on the web: JumpStation, the World Wide Web Worm, and the Repository-Based Software Engineering (RBSE) spider. JumpStation gathered information about the title and header of web pages and retrieved these using a simple linear search; as the web grew, JumpStation slowed to a stop. The WWW Worm indexed titles and URLs. The problem with JumpStation and the World Wide Web Worm was that they listed results in the order they found them, with no discrimination. The RBSE spider did implement a ranking system.
Brian Pinkerton of the University of Washington released WebCrawler on April 20, 1994. It was the first crawler that indexed entire pages. Soon it became so popular that during daytime hours it could not be used. AOL eventually purchased WebCrawler and ran it on its network. Then, in 1997, Excite bought out WebCrawler, and AOL began using Excite to power its NetFind. WebCrawler opened the door for many other services to follow suit. Within a year of its debut came Lycos, Infoseek, and OpenText.
In 1998 the last of the current search superpowers, and the most powerful to date, Google, was launched. It decided to rank pages using an important concept: the value implied by inbound links. This makes the web somewhat democratic, as each outgoing link is a vote. Google has become so popular that major portals such as AOL and Yahoo! have used it, allowing its search technology to own the lion's share of web searches. MSN Search was also launched in 1998, as were the Open Directory and Direct Hit.
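The idea that every inbound link is a vote can be sketched with a simplified PageRank-style iteration. The tiny link graph and the damping factor of 0.85 below are illustrative assumptions, not Google's actual data or implementation:

```python
# Simplified PageRank: each page distributes its score to the pages it links to.
# The link graph and damping factor are invented for illustration.
damping = 0.85

links = {
    "A": ["B", "C"],   # page A links out to B and C
    "B": ["C"],
    "C": ["A"],
}

# Start every page with an equal share of rank.
ranks = {page: 1.0 / len(links) for page in links}

for _ in range(50):  # iterate until the scores settle
    new_ranks = {}
    for page in links:
        # Votes received from every page that links to this one,
        # each vote weighted by how many outgoing links the voter has.
        incoming = sum(ranks[p] / len(links[p]) for p in links if page in links[p])
        new_ranks[page] = (1 - damping) / len(links) + damping * incoming
    ranks = new_ranks

print({page: round(score, 3) for page, score in ranks.items()})
```

Page C ends up with the highest score because it receives links from both A and B, which is exactly the "each link is a vote" intuition.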
Search Engines and Directories
A web directory is a web search tool compiled manually by human editors. Once websites are submitted with information such as a title and description, they are assessed by an editor and, if deemed suitable for addition, are listed under one or more subject categories. Users can search across a directory using keywords or phrases, or browse through the subject hierarchy. The best examples of directories are Yahoo! and the Open Directory Project.
The major difference between a search engine and a directory is the human factor. A directory indexes a web site based on an independent description of the site. While directories perform many of the same functions as a search engine, their indexing format is different: directories do not spider your site to gather information about it. Instead they rely on a few text entries, typically a site title, domain name and description, to determine which keywords describe your site. While sites in search engines are scanned and ranked by a program (the crawler), in directories they are edited manually. Directories group websites by theme or industry, i.e. automobile-related sites are placed in one sub-directory, sports sites in another, and so on. Directories thus help organize thousands of web sites. A directory contained inside another directory is called a subdirectory; together, the directories form a hierarchy, or tree structure.
There are directories on the web for almost any category you could name. Some search engines are adding general directories to their pages. While helping researchers by suggesting general topics to search under, these general directories also carry banner ads, which encourage some users to spend more time browsing and help pay the directories' costs of staying on the Internet. A web directory specializes in linking to other web sites and categorizing those links. Web directories often allow site owners to submit their site for inclusion; editors review these submissions and include a website once it meets the set guidelines.
There are five types of directories, namely Human Edited, User Categorized, User Classified, Independently Classified and Pay Per Click (PPC).
1. Human Edited (Categories):
This is the 'traditional' directory. It is the most prestigious, as each listed site is 'hand picked' and reviewed by a human editor. The assumption is that the editor is an 'expert' in his/her field and will select for inclusion only appropriate sites. Such directories usually have very clear and stringent acceptance rules, which ensure the quality of the search results. Invariably, the Directory is comprised of categories to which sites are 'assigned'. This type of Directory is relatively hard to maintain, as it is labor intensive and hence expensive. That also explains why many such directories are using volunteers to do the work. Notable examples of Human Edited Directories are Yahoo, Dmoz, Joeant and Gimpsy, but there are many more. There is no doubt that this is the most important type to submit your site to. Only the scrutiny of an independent human reviewer can ensure the quality and suitability of a web site to a given category.
2. User Categorized:
The directory is structured in a very similar way to the edited directory, but it is the user's decision as to the best category in which to place the site. While this is quite attractive for the directory owner (the users do the 'hard work') as well as the site owner (freedom to place the site in any category), the search results may be far from satisfactory. One such directory is Websquash. You may benefit from registering in such a directory, but make sure you consider all the relevant aspects.
3. User Classified:
Sites are classified by keywords entered by the site owner in the Meta tags of the home page. The attraction here is that the site is (potentially) classified by many keywords and the process is fully automatic (low maintenance). While registration is easy, the sorting algorithm has very little to go by, so the position of the site in the search results doesn't mean much. Moreover, should you choose popular keywords, you have little chance of being found owing to the number of competing sites; on the other hand, selecting a rare combination of keywords suffers from the obvious problem of the minuscule number of searchers using that combination. One of the better known examples is ExactSeek, which enjoys significant popularity. Its attraction may be related to its use of the Alexa ranking, which measures a site's popularity, as the primary sorting criterion for search results.
4. Independently Classified:
Instead of letting the Site Owner decide which keywords to use for finding his site, this type of directory allows every user to determine the relevancy of keywords. This latest addition to the Directory family harnesses the public vote to examine and determine relevancy of keywords to sites. Each user may choose to rate a (random) site and voice his/her opinion of the suitability of specific keywords to that site. The best example for such a site is Netnose. Due to the democratic process, it is highly likely that relevancy will be good. However, for such a site to achieve prominence requires a larger number of users willing to donate their time and effort to that rating activity.
5. Pay Per Click:
While technically PPC directories are of the User Classified type, their business model implies some significant characteristics that site owners should be aware of:
• A link from a PPC directory is never a direct or simple link; hence being listed in a PPC directory will never help increase link popularity with search engines.
• A link from PPC directory remains in place only as long as the user’s account is cash positive.
• PPC Directories try to maximize their revenues by encouraging Site Owners to bid for as many keywords as they can, even those that are only remotely related to their site’s business.
At the beginning of the web era, users would go to directories to find sites relevant to their interests. In fact, Yahoo!, the web's number one destination, started as a directory. Nowadays, most users rely on search engines, not directories, to find what they're looking for.
When search engines started to become popular, they relied on web pages' 'keyword meta tags' to determine the topic and relevance of the page (the keyword meta tag is a section within a web page's HTML code where webmasters can insert words that are relevant to the page's content). Webmasters discovered that by stuffing their meta tags with popular search terms repeated hundreds of times, they could propel their pages to the top of the search results.
Search engines caught on to the abuse and decided to ignore the meta tags and rely instead on web page copy. Webmasters then started to overstuff their page copy with popular search terms, often writing them in the same color as the web page's background, so that they could be detected by search engines while being invisible to users.
Again, search engines discovered the trick and decided that the best way to rank a web page's content and its topical relevance was to rely on inbound links from other pages. The rationale behind this is that it is much more difficult to influence other people to link to you than it is to manipulate your own web page elements.
There are several ways to get inbound links, among them writing articles that include your bylines with a link to your page, exchanging links, and listing your site in directories.
Listing your sites in good directories is probably the best way to get quality links that are highly valued by the search engines. Since directories rely on human editors who enforce strict criteria to list a site, and since directories organize the information in highly focused categories, they are an invaluable resource for search engines to measure the quality and the relevance of a web page.
In summary, directories are important not because they generate significant traffic, but because they are given great importance by the search engines to qualify and rank web pages, and to determine their topical relevance.
Major Search Engines
Among the thousands of search engines, very few are famous, thanks to the algorithms that help users find the most relevant information. As observed earlier, Google, Yahoo! and MSN are the top three search engines in the world, but Teoma, Excite, Ask Jeeves, AOL, HotBot, AltaVista, Lycos and others also attract plenty of searches.
A listing in these search engines can bring huge traffic to a website. Hence it is very important for a search engine optimizer to know which search engines are best and most heavily used. It is very important for searchers as well: for them, well-known, commercially backed search engines mean dependable results. These engines are more likely to be well maintained and upgraded when necessary, to keep pace with the growing web.
There are nine main features on which search engines can be evaluated. They are as below.
• Boolean: Boolean searching refers to how multiple terms are combined in a search.
AND requires that both terms be found.
OR lets either term be found.
NOT means any record containing the second term will be excluded.
( ) means the Boolean operators can be nested using parentheses.
+ is equivalent to AND, requiring the term; the + should be placed directly in front of the search term.
- is equivalent to NOT and means to exclude the term; the - should be placed directly in front of the search term.
Operators can be entered in the case shown by the examples:
(salad and (lime or kiwi)) not nuts
+salad -nuts lime kiwi
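The nested Boolean example above maps directly onto set operations. Here is a minimal sketch, using a toy inverted index that maps each term to the set of document IDs containing it (all the data is hypothetical):

```python
# Toy inverted index: each term maps to the set of document IDs containing it.
index = {
    "salad": {1, 2, 3, 5},
    "lime":  {2, 4},
    "kiwi":  {3, 5},
    "nuts":  {5},
}

# (salad and (lime or kiwi)) not nuts, expressed with Python set operators:
# AND -> intersection (&), OR -> union (|), NOT -> difference (-)
result = (index["salad"] & (index["lime"] | index["kiwi"])) - index["nuts"]
print(sorted(result))  # documents that mention salad plus lime or kiwi, but no nuts
```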
• Default: What happens when multiple terms are entered for a search using no Boolean operators, + or - symbols, phrase marks, or other specialized features. Two terms could be processed as:
two AND terms
two OR terms
or "two terms" as a phrase
• Proximity: Proximity searching refers to the ability to specify how close within a record multiple terms should be to each other. The most commonly used proximity search option in Internet finding aids is a phrase search that requires terms to be in the exact order specified within the phrase markings. The default standard for identifying phrases is to use double quotes (" ") to surround the phrase.
Phrase searching example: “Phrase searching is fun”
Beyond phrase searching, other proximity operators can specify how close terms should be to each other. Some will also specify the order of the search terms. Each search engine can define them differently and use various operator names such as NEAR, ADJ, W, or AFTER.
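Phrase matching and NEAR-style proximity can be sketched over tokenized documents. The functions and sample sentence below are illustrative, not any particular engine's implementation:

```python
def phrase_match(doc_tokens, phrase_tokens):
    """True if the phrase tokens appear adjacent and in order in the document."""
    n = len(phrase_tokens)
    return any(doc_tokens[i:i + n] == phrase_tokens
               for i in range(len(doc_tokens) - n + 1))

def near_match(doc_tokens, term_a, term_b, distance):
    """True if the two terms occur within `distance` words of each other, any order."""
    pos_a = [i for i, t in enumerate(doc_tokens) if t == term_a]
    pos_b = [i for i, t in enumerate(doc_tokens) if t == term_b]
    return any(abs(i - j) <= distance for i in pos_a for j in pos_b)

doc = "phrase searching is fun and proximity searching is useful".split()
print(phrase_match(doc, "phrase searching is fun".split()))  # True: exact phrase
print(near_match(doc, "proximity", "useful", 3))             # True: 3 words apart
```

An operator that also enforces word order, like ADJ or AFTER, would simply drop the `abs()` and require `0 < j - i <= distance`.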
• Truncation: This search technique refers to the ability to search just a portion of a word. Typically, a symbol such as the asterisk is used to represent the rest of the term. End truncation is where several letters at the beginning of a word are specified but the ending can vary. With internal truncation, a symbol can represent one or more characters within a word.
Stemming, which is related to truncation, usually refers to the ability of a search engine to find word variants such as plurals, singular forms, past tense, present tense, etc. Some stemming covers only plural and singular forms.
End Truncation Examples: Colleg* finds college, colleges, collegium, collegial
Internal Truncation Examples: col*r finds color, colour, colander
Stemming: lights finds light, lights, lighting, lit
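Truncation patterns like the ones above can be emulated with a regular expression in which * stands for any run of characters. This is a sketch of the matching behavior, not how any particular engine implements it:

```python
import re

def truncation_match(pattern, word):
    """Match a truncation pattern in which * stands for any run of characters."""
    regex = "^" + re.escape(pattern).replace(r"\*", ".*") + "$"
    return re.match(regex, word) is not None

# End truncation: colleg* matches words that merely start with "colleg".
print([w for w in ["college", "colleges", "collegium", "colour"]
       if truncation_match("colleg*", w)])
# Internal truncation: col*r matches words starting "col" and ending "r".
print([w for w in ["color", "colour", "colander", "kiwi"]
       if truncation_match("col*r", w)])
```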
• Case Sensitive: In general, most search engines match upper case, lower case and mixed case as the same term. Some search engines have the capability to match exact case. Entering a search term in lower case will usually find all cases; in a case-sensitive search engine, entering any upper-case letter in a search term will invoke an exact case match:
Next finds next, Next, NeXt, next
NeXT finds only NeXT
• Fields: Fields searching allows the searcher to designate where a specific search term will appear. Rather than searching for words anywhere on a Web page, fields define specific structural units of a document. The title, the URL, an image tag, or hypertext links are common fields on a Web page.
Example: title:searching will look for the word 'searching' in the title of a web page.
• Limits: The ability to narrow search results by adding a specific restriction to the search. Commonly available limits are the date limit and the language limit. The latter would restrict the search results to only those Web pages identified as being in the specified language.
• Stop Words: Frequently occurring words that are not searchable. Some search engines include common words such as 'the' while others ignore such words in a query. Stop words may include numbers and frequent HTML strings. Some search engines only search stop words if they are part of a phrase search or the only search terms in the query.
Examples: the, a, is, of, be, 1, html, com
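Filtering a query against a stop-word list is straightforward; the toy list below simply reuses the examples above, while every real engine maintains its own:

```python
# A toy stop-word list reusing the examples above; real engines keep their own.
STOP_WORDS = {"the", "a", "is", "of", "be", "1", "html", "com"}

def remove_stop_words(query):
    """Drop stop words from a query before it is matched against the index."""
    return [term for term in query.lower().split() if term not in STOP_WORDS]

print(remove_stop_words("the history of the automobile industry"))
# ['history', 'automobile', 'industry']
```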
• Sorting: The ability to organize the results of a search. Typically, Internet search engines sort the results by "relevance" determined by their proprietary relevance ranking algorithms. Other options are to arrange the results by date, alphabetically by title, or by root URL or host name.
Now let's look at the major search engines and their features.
Google: Right from its establishment in 1998, Google has been the most favored search engine on the net. Since its beta release it has had phrase searching and the - for NOT, but it did not add an OR operator until October 2000. In December 2000 it added title searching. In June 2000 it announced a database of over 560 million pages, which grew to 4 billion by February 2004. Its biggest strength is its size and scope. Google includes PDF, DOC, PS and many other file types, and it has additional databases in the form of Google Groups, News, Directory, etc.
Google search features:
Boolean: -, OR
Default: AND
Proximity: phrase, GAPS
Truncation: no, but some stemming
Case sensitive: no
Fields: intitle, inurl, link, site, more
Limits: language, filetype, date, domain
Stop words: varies; + searches them, as does placing them in a phrase
Sorting: relevance, site
Yahoo!: Yahoo! is one of the best-known and most popular Internet portals. Originally just a subject directory, it is now a search engine, directory and portal. It has a large, new search engine database, includes cached copies of pages, and links to the Yahoo! directory. It supports full Boolean searching, but it lacks some advanced search features such as truncation. It indexes the first 500 KB of a web page, and link searches require inclusion of http://.
Yahoo! search features:
Boolean: AND, OR, NOT, ( ), -
Default: AND
Proximity: phrase
Truncation: no
Case sensitive: no
Fields: intitle, url, site, inurl, link, more
Limits: language, file type, date, domain
Stop words: very few
Sorting: relevance, site
MSN: MSN Search is the search engine for the MSN portal site. For years it used databases from other vendors, including Inktomi, LookSmart and Direct Hit, but as of February 1, 2005, it began using its own, unique database. MSN Search uses its own web database and also has separate News, Images and Local databases, along with links into Microsoft's Encarta encyclopedia content. Text ads currently come from Yahoo! Search Marketing Solutions (formerly known as Overture). Its strengths are its large and unique database, query building with Search Builder, Boolean searching, cached copies of web pages (including the date cached) and automatic local search options.
However, limited advanced features, inconsistent availability of truncation and stemming, and no title search are its weaknesses.
MSN Search features:
Boolean: AND, OR, NOT, ( ), -
Default: AND
Proximity: phrase
Truncation: no
Case sensitive: no
Fields: link, site, loc, url
Limits: language, site
Stop words: varies; + searches them
Sorting: relevance, site, sliders
Ask Jeeves / Teoma: Debuting in spring 2001 and relaunching in April 2002, this newer search engine has built its own database and offers some unique search features. It was bought by Ask Jeeves in September 2001. It lacks full Boolean and other advanced search features, but it has more recently expanded and improved its search features and added an advanced search. While Teoma results can show up in three separate sections, there is only one single database of indexed web pages. It may also include paid ad results (from Google's AdWords database) under the heading 'Sponsored Links'. No additional databases or portal features are directly available. Ask Jeeves switched from Direct Hit to Teoma in January 2002 for the search engine results shown after its question-and-answer matches. Its strengths are identifying metasites and the Refine feature for focusing on web communities; its weaknesses are a smaller database, no free URL submissions and no cached copies of pages.
Teoma search features:
Boolean: -, OR
Default: AND
Proximity: phrase
Truncation: no
Case sensitive: no
Fields: intitle, inurl
Limits: language, site, date
Stop words: yes; + searches them
Sorting: relevance, metasites
Who powers whom?
There are thousands of search engines available on the Internet, but it is not possible for all of them to create, maintain and update their own databases. Therefore many of them display results from the major search engines on their SERPs.
It is not necessary that all the primary and secondary results be provided by one search engine: different search engines can supply different result sets to a single engine, and directories can also come from a third party. So a supplier-and-receiver relationship is established between different search engines. It is very important to understand this relationship if you want a top ranking for your site.
Now let's check out the relationships among the top 10 search engines and the top 2 directories, i.e. which search engine is a supplier and which is a receiver.
1. Google:• Google's main search results are provided solely by Google's own search technology, offering results from no other engine or source.
• The Google Directory is comprised of listings from The Open Directory Project (ODP, DMOZ).
• Google provides results to AOL, NetScape, IWon, Teoma, AskJeeves and Yahoo! Web Results.
2. Yahoo!:• Paid and free submissions (currently including Inktomi's results until April).
• Paid results from Overture.
• Provides main results to HotBot, Excite, Go.com, MSN, Infospace and About, and backup results to LookSmart and Overture.
3. MSN:• MSN provides sponsored results from paid advertising sources.
• MSN provides primary results from LookSmart.
• Secondary results are provided from Inktomi.
4. AOL:• AOL results for "Recommended Sites" are listings that have been hand picked by AOL editors.
• AOL "Sponsored Sites" are supplied by Google AdWords.
• AOL "Matching Sites" are supplied by Google results. The results in AOL may not always match the results in Google as Google often updates their database more frequently.
• AOL directory listings are provided by the ODP.
5. Alta Vista:• Alta Vista receives sponsored listings from Overture and Alta Vista's own advertisers.
• Alta Vista will use results from their own database for the main search results.
• Alta Vista obtains its directory results from LookSmart.
6. HotBot:• HotBot results contain three categories: Top 10 Results, Directory Results & General Web Results.
• Top 10 results include popular sites and searches.
• Directory results are hand-picked by human editors.
• Web Results are provided by Inktomi.
• HotBot offers the capability to search the HotBot database, Lycos, Google and / or AskJeeves all from one location with the click of a button.
7. iWon:• iWon Spotlight results are comprised of web pages found within iWon or web sites that iWon has a direct business partnership with.
• iWon Sponsored Listings are provided by a variety of paid advertisements through third-party pay-for-performance listings, including Google AdWords and Overture.
• iWon Web Site Listings are powered by Google.
• iWon Shopping Listings are provided by Dealtime.com
8. Lycos:• Lycos provides directory results from The Open Directory Project.
• Lycos provides sponsored listings from Overture.
• Lycos provides Web Results from Fast and from the Lycos network.
9. Netscape:• Netscape's sponsored links are provided by Google AdWords.
• Netscape's matching results include sites that are handpicked by ODP editors mixed with results powered by Google.
10. AllTheWeb:• AllTheWeb crawls and indexes ODP results.
• AllTheWeb powers the main search results in Lycos.
• AllTheWeb also powers the Lycos advanced search feature, the FTP search feature and the MP3 specialty engine.
1. Dmoz: Directory listings are provided to AOL, Google, Lycos, Netscape and many other web sites, directories and portals.
2. Yahoo!: Yahoo! Directory listings are supplied by Yahoo! editors and require a fee for commercial sites. Yahoo! directory results are provided to AltaVista.
How Search Engines Rank Pages
Broadly, search engines are divided into two main categories:
a. Crawler based search engines
b. Human powered directories
A web crawler is a program developed to scan web pages. The crawler scans the entire page, indexes it and lists it on the search engine. It evaluates any particular web page based on several different factors, such as keywords, tables, page titles and body copy. Since listings are done automatically, they can change if you change some content of the website.
Manual listing is done in the case of human-powered directories. Human editors are responsible for listing any site in the directory. The webmaster needs to submit a short description of the entire site to the directory, and a search looks for matches only in the descriptions submitted. Listings are not affected if you change some content on your web site. Listing in a directory and listing in a search engine are totally different, and hence the parameters for listing differ in the two cases. But in either case it is very necessary to create an informative, content-rich site to attract more visitors.
Any crawler-based search engine is made up of three basic components:
a. Crawler or spider
b. Index
c. Search engine software
All these components work one after another to list the page on the search engine. A search engine finds a website in two ways: 1. by accepting listings sent in by webmasters; 2. through crawlers that roam the Internet, storing links to and information about each page they visit. Once the site is found, the search engine's crawlers scan the entire site: the crawler visits a web page, reads it, and then follows links to other pages within the site. Major search engines like Google, Yahoo! and MSN run multiple crawlers simultaneously; Google uses 4 spiders, which crawl over 100 pages per second and generate around 600 KB of data each second.
The index program starts after the crawler. Once a web page is crawled, it must be transferred to the database. The index contains a copy of each web page scanned by the crawler; if the web page changes, the index is updated with the new information. It is very important that the page be added to the index: unless and until it is indexed, it is not available to those searching with the search engine.
The search engine software performs the task of relevant listing. It searches the entire database, i.e. the indexed pages, and matches them against the search query. Then it ranks and lists the most relevant matches. These listings depend on how the search engine software is programmed: it lists according to what it believes is most relevant!
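The three components can be sketched end to end on a toy "web" (the URLs and page text below are invented for illustration): a crawler follows links, an indexer builds an inverted index, and the search software intersects the postings for each query term:

```python
from collections import deque

# Toy "web": each URL maps to its page text and outgoing links (all hypothetical).
pages = {
    "/home":   ("welcome to the automobile portal", ["/cars", "/sports"]),
    "/cars":   ("automobile manufacturers in india", ["/home"]),
    "/sports": ("sports news and scores", []),
}

# 1. Crawler: follow links from a seed page, visiting each page once.
seen, queue, crawled = set(), deque(["/home"]), []
while queue:
    url = queue.popleft()
    if url in seen:
        continue
    seen.add(url)
    crawled.append(url)
    queue.extend(pages[url][1])

# 2. Indexer: build an inverted index mapping each word to the pages containing it.
index = {}
for url in crawled:
    for word in pages[url][0].split():
        index.setdefault(word, set()).add(url)

# 3. Search software: intersect the postings lists for each query term.
def search(query):
    results = None
    for word in query.split():
        postings = index.get(word, set())
        results = postings if results is None else results & postings
    return sorted(results or [])

print(search("automobile"))  # every crawled page mentioning the term
```

A real engine would add ranking on top of step 3; here the results are simply sorted by URL.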
There are many more factors on which search engine rank a page. We will look at it in detail later.
Broadly, it depends on On-page factors and Off-page factors. On-page factors include keyword targeting, HTML tags, Content, Anchor Text and URL while Off-page factors include Link Building, Link Popularity and Anchor Text.
Though these terms are explained later, let's first see what strategy a search engine uses to list a page. Crawler-based search engines list sites without any human interference, which means a page is ranked according to what the engine thinks the most relevant page is! There are a few parameters against which the crawler checks whether a site is relevant to the search query; the program that applies them is called the search engine algorithm. No one knows the exact algorithm of any search engine, but study and research have shown that a few factors are common to most search engine algorithms.
Location of keywords: Once keywords are finalized, the main task is their placement. A search engine algorithm revolves largely around the location of keywords, which can be placed in HTML tags, the content, the headline or the first few paragraphs. Their importance varies with location: keywords placed in the headline or the first few paragraphs carry more weight than those elsewhere on the page. If keywords appear right from the beginning, the search engine assumes the page is more relevant to that particular theme.
Frequency: Though it is important to place keywords in the most visible parts of the page, it is equally important to limit their number; this is what frequency measures. Search engines analyse how often keywords appear in relation to the other words on a page while ranking it, and pages with a higher number of keywords are considered more relevant than others, up to a point. (Ideal keyword density: 3-4%)
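The density figure can be computed mechanically. This is a minimal sketch, with a function name and a crude regex tokenizer of my own choosing; a page at the suggested 3-4% would score far lower than the tiny sample here does.

```python
import re

def keyword_density(text, keyword):
    """Percentage of the page's words accounted for by the keyword phrase."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    kw = keyword.lower().split()
    n = len(kw)
    # count sliding-window matches of the whole phrase
    hits = sum(words[i:i + n] == kw for i in range(len(words) - n + 1))
    # each occurrence of the phrase accounts for n of the page's words
    return 100.0 * hits * n / len(words) if words else 0.0

sample = "low price mobile phones and other mobile phones"
print(keyword_density(sample, "mobile phones"))   # -> 50.0
```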
Added features beyond location and frequency: Location and frequency are just the basics of a search engine algorithm. Once search engines found that anyone could game these signals and successfully rank their pages, they increased the complexity of the algorithm. Different search engines now index different numbers of web pages; some index more, some less, and some re-index pages more often than others. No two search engines therefore have exactly the same collection of web pages to search through, which is why results differ from engine to engine.
Once webmasters came to know about frequency, they tried to crack the algorithm by stuffing too many keywords into a page, just to get higher rankings.
Hence search engines started to penalize such sites, terming the practice 'spamming'. It became necessary for SEO companies to keep the frequency higher than their competitors' but below the spamming threshold. Search engines watch for common spamming methods in a variety of ways, including following up on complaints from their users.
Off-page factors: The factors above are on-page factors; now let's look at some common off-page factors. Crawler-based search engines have plenty of experience by now with webmasters who constantly rewrite their web pages in an attempt to gain better rankings. Some sophisticated webmasters even go to great lengths to "reverse engineer" the location/frequency systems used by a particular search engine. Because of this, all major search engines now also use "off the page" ranking criteria.
Off the page factors are those that a webmaster cannot easily influence. Chief among these is link analysis. By analyzing how pages link to each other, a search engine can both determine what a page is about and whether that page is deemed to be "important" and thus deserving of a ranking boost. In addition, sophisticated techniques are used to screen out attempts by webmasters to build "artificial" links designed to boost their rankings.
• Link analysis: Web-based search engines have introduced one dramatically different feature for weighing and ranking pages. Link analysis works somewhat like bibliographic citation practice, such as that used by the Science Citation Index. It is based on how well-connected each page is, as defined by hubs and authorities: hub documents link to large numbers of other pages (out-links), while authority documents are those referred to by many other pages, i.e. those with a high number of in-links.
• Popularity: Google and several other search engines add popularity to link analysis to help determine the relevance or value of pages. Popularity utilizes data on the frequency with which a page is chosen by all users as a means of predicting relevance. While popularity is a good indicator at times, it assumes that the underlying information need remains the same.
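The hub/authority idea above can be sketched in a few lines, in the spirit of Kleinberg's HITS algorithm. The tiny link graph and its page names are invented for illustration; real engines run this at web scale with many refinements.

```python
# Sketch of hub/authority scoring on a hypothetical link graph.
def hits(links, iterations=20):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {p for outs in links.values() for p in outs}
    hub = {p: 1.0 for p in pages}
    auth = {p: 1.0 for p in pages}
    for _ in range(iterations):
        # a page's authority grows with the hub scores of pages linking to it
        auth = {p: sum(hub[q] for q in links if p in links.get(q, []))
                for p in pages}
        # a page's hub score grows with the authority of the pages it links to
        hub = {p: sum(auth[q] for q in links.get(p, [])) for p in pages}
        # normalise so the scores stay bounded
        for d in (auth, hub):
            s = sum(d.values()) or 1.0
            for p in d:
                d[p] /= s
    return hub, auth

graph = {"hub1": ["site_a", "site_b"], "hub2": ["site_a"],
         "site_a": [], "site_b": []}
hub, auth = hits(graph)
# site_a has more in-links, so it ends up with the higher authority score;
# hub1 links to more authorities, so it ends up with the higher hub score
```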
Another off page factor is click-through-measurement. In short, this means that a search engine may watch what results someone selects for a particular search, and then eventually drop high-ranking pages that aren't attracting clicks, while promoting lower-ranking pages that do pull in visitors. As with link analysis, systems are used to compensate for artificial links generated by eager webmasters.
There are few more factors such as:
• Date of Publication: Some search engines assume that the more recent the information is, the more likely that it will be useful or relevant to the user. The engines therefore present results beginning with the most recent to the less current.
• Length: While length per se does not necessarily predict relevance, it is a factor when used to compute the relative merit of similar pages. So, in a choice between two documents both containing the same query terms, the document that contains a proportionately higher occurrence of the term relative to the length of the document is assumed more likely to be relevant.
• Proximity of query terms: When the terms in a query occur near to each other within a document; it is more likely that the document is relevant to the query than if the terms occur at greater distance. While some search engines do not recognize phrases per se in queries, some search engines clearly rank documents in results higher if the query terms occur adjacent to one another or in closer proximity, as compared to documents in which the terms occur at a distance.
• Proper nouns: Proper nouns sometimes carry higher weights, since so many searches are performed on people, places or things. While this can be useful, if the search engine assumes you are searching for a name rather than the same word as an everyday term, the search results may be noticeably slanted.
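The proximity factor above lends itself to a small worked example. A scoring function might measure the smallest distance between two query terms in a document; the documents and the function name below are invented for the sketch.

```python
def min_distance(words, term1, term2):
    """Smallest gap, in words, between any occurrences of the two query terms."""
    pos1 = [i for i, w in enumerate(words) if w == term1]
    pos2 = [i for i, w in enumerate(words) if w == term2]
    if not pos1 or not pos2:
        return None   # a term is missing entirely
    return min(abs(i - j) for i in pos1 for j in pos2)

doc_a = "low price mobile phones in stock".split()
doc_b = "price list for every mobile we stock is low".split()
print(min_distance(doc_a, "low", "price"))   # -> 1 (adjacent terms)
print(min_distance(doc_b, "low", "price"))   # -> 8 (far apart)
```

A proximity-aware engine would rank doc_a above doc_b for the query "low price", even though both documents contain both terms.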
SEO is an abbreviation of 'Search Engine Optimization'. SEO is the process of improving web pages so that they rank higher in search engines for targeted keywords, with the ultimate goal of generating more revenue for the web site. There are many SEO techniques. In general, they can be categorized as On-Page Optimization, On-Site Optimization and Off-Site Optimization.
Without quoting concrete figures, it is safe to say that it is now hard to find the information you need on the Internet at the first attempt. The web holds billions of documents, and their number grows exponentially.
The number of page changes occurring in any short period is enormous, and no fully functional updating system accessible to all Internet surfers worldwide exists. Search engines were created for these reasons: they are designed to structure the information accumulated on the web and to give surfers handy, comprehensive search tools.
Search engine optimization is one of the most effective techniques for increasing site traffic. Now some figures: first, about 90 percent of Internet users find new web sites via search engines; hence it is much more cost-effective than other marketing tactics.
Secondly, recent research shows that visitors who arrive through optimized search listings become real clients and partners five times more readily than those arriving via banner advertising. The psychological aspect is simple: when an average web surfer finds your site in the top positions on the search engines, he considers it one of the best sites on the Internet.
Finally, about 80% of search engine users stop browsing results after the first page. Therefore only the top 15-25 positions bring a sizeable inflow of visitors and potential clients/customers to your site, and reaching them is very hard because of the severe competition.
Search engine optimization depends on 1. On-page factors and 2. Off-page factors
On-page factors include keywords, HTML tags, Content, CSS and URL rewrites, whereas Off-page factors include link popularity, PageRank and Anchor Text. We will look at all these factors in detail as we move on. But there is a basic methodology that all SEOs follow.
The very first step is to identify target keywords. Keywords play a very important role in optimizing any website: these are the words a prospective visitor might type into a search engine to find relevant web sites. For example, if you are an SEO consultant, your target visitors may search for SEO in India, SEO industry, Internet marketing, SEO service providers and so on. Identifying your target market and choosing keywords is therefore very important at the beginning of an SEO campaign.
The next step is keyword placement. You cannot place all the keywords on a single page; they should be distributed so that the crawler finds them on multiple pages and concludes that each page is highly relevant. The most content-focused pages are the best candidates for keyword placement. The frequency and positioning of keywords also play a crucial role while optimizing any website. You can scatter a keyword throughout a page, but the crawler gives little importance to that kind of placement: according to search engine algorithms, some positions on a web page matter more than others, and a page with keywords in those positions is more likely to rank well. These positions include the header tag, the title tag, the meta tags and the first few paragraphs of the page. Keyword density also matters while optimizing a page, i.e. how many times you repeat the keywords: the ratio of keywords to the total number of words on the page is called keyword density.
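A placement check like the one described can be automated with Python's standard HTML parser. This is a sketch under my own assumptions: the class name, the sample page and the set of "high-value" tags are chosen for illustration.

```python
from html.parser import HTMLParser

class PlacementChecker(HTMLParser):
    """Records which high-value positions of a page contain a keyword."""
    def __init__(self, keyword):
        super().__init__()
        self.keyword = keyword.lower()
        self.stack = []      # open tags enclosing the current text
        self.found = set()   # high-value positions where the keyword appears
    def handle_starttag(self, tag, attrs):
        self.stack.append(tag)
        if tag == "meta":
            if self.keyword in dict(attrs).get("content", "").lower():
                self.found.add("meta")
            self.stack.pop()   # meta is a void element, never closed
    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
    def handle_data(self, data):
        if self.keyword in data.lower():
            for tag in ("title", "h1", "h2", "h3"):
                if tag in self.stack:
                    self.found.add(tag)

page = """<html><head><title>Low price mobile phones</title>
<meta name="description" content="Best low price mobile phones in India">
</head><body><h1>Low price mobile phones</h1><p>...</p></body></html>"""
checker = PlacementChecker("low price mobile phones")
checker.feed(page)
print(sorted(checker.found))   # -> ['h1', 'meta', 'title']
```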
Next is, Link Building or Link Popularity. Another way that search engines find your site pages is by following links to your site from other external sites. Having such links to your site not only provides search engines with additional opportunity to find your pages, but also provides increased visibility for your site by putting it in front of visitors on another site. Many top search engines, such as Google, will factor in the number of sites linking to yours in determining its results for a particular search query. This is known as “link popularity”. One way to think about link popularity is that each external link to your site counts as a “vote” for your site. So, the more links you have pointing at you the better, right? Well, not necessarily. Because search engines also know how to count the link popularity of the sites linking to yours, a single link from a popular site will weigh more heavily than many links from obscure unpopular sites. When it comes to getting links, quality over quantity is the way to go.
There are some precautions to take while designing a site and planning an SEO campaign. If these barriers are removed, your site will rank much higher than others and will certainly get more visitors.
Since the crawler reads text only, it is always better to create a content-oriented site. Avoid dynamic pages and dynamic forms as much as possible, and it is advisable not to use frames while designing a site. Dynamic URLs are also a big barrier: search engines generally cannot crawl dynamic URLs and hence cannot index them properly, so it is better to use non-dynamic, search-engine-friendly URLs. Flash animation, video and text rendered as graphics are likewise harmful to search engine optimization.
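The URL advice above can be sketched as a small rewrite helper. The function name and the ".html" suffix are my own choices for illustration; the idea is simply to replace an opaque dynamic query string with a static, keyword-bearing slug.

```python
import re

def seo_friendly_url(page_title):
    """Turn a page title into a static-looking, keyword-bearing slug."""
    slug = re.sub(r"[^a-z0-9]+", "-", page_title.lower()).strip("-")
    return f"/{slug}.html"

# A dynamic URL such as /products.php?cat=7&id=412 tells the crawler nothing;
# a rewritten URL carries the keywords instead:
print(seo_friendly_url("Low Price Mobile Phones"))  # -> /low-price-mobile-phones.html
```

In practice the mapping from slug back to the dynamic page is done by the web server's rewrite rules; this helper only generates the public-facing form.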
1. …………… is the best search engine in terms of number of pages indexed.
2. …………. and ……………… are the two types of search engines.
3. Many of the top search engines like Google, will factor in the number of sites that are linking to yours in determining the results for a particular search query. This is known as ……………………………..
4. SEO is the process of improving ……………………. so that it ranks …………….. in search engine for targeted …………………………
5. DMOZ is also known as ………………….
6. Search engine optimization depends on ……………………. and …………………….
7. Rank the following search engines in terms of market share.
a. Ask Jeeves
8. Name 5 types of directories.
9. State True or False.
a. Yahoo! is powered by Google.
b. DMOZ is a crawler based search engine.
c. AOL displays search results from Google.
d. Yahoo! displays paid results from Overture.
e. MSN is the biggest directory in terms of pages indexed.
f. Crawler helps to copy pages in the search engines.
g. Yahoo and Google display same results for the same keywords.
h. Google was the first ever search engine introduced.
10. Which of the following are parts of a search engine?
a. Crawler, index, search engine software
b. Spider, content, link building
c. Search engine algorithm, spider, directory
d. Keyword, keyword generator, scanner
11. Divide the following factors in On-page and Off-page.
a. Keyword research
b. Link popularity
c. Meta tags
d. Link building
e. Anchor text optimization
f. Content optimization
12. …………………. searching refers to how multiple terms are combined in a search.
13. SEO methodology consists of identifying target keywords, keyword placement, ------- link building
14. …………… search technique refers to the ability to search just a portion of a
15. State all the off-page factors you know.
16. Crawler can scan each and every element present in the page including flash, animation and graphics - State True or False.
17. Search engines find the site in 2 ways namely:
18. Google uses …………….. spiders, which crawl over 100 pages per second and generate around 600KBs of data each second.
Keyword Selection Process
Keyword selection is the basic and most important step in search engine optimization; the success of an SEO campaign depends on it. While selecting keywords, a few points should be kept in mind: they should be relevant to the site, they should describe the business suitably, they must be popular enough to attract considerable traffic, and they should face little competition, i.e. be relatively un-optimized for by others.
Types of Keywords:
Keywords can be classified into 3 categories.
a. Single word keyword
b. Multi word keyword
c. Theme based keywords
a. Single word keywords: Single word keywords are the most generic. They are usually targeted on the home page to attract huge traffic, but although they attract more traffic, the percentage of relevant traffic is very low. SEOs therefore prefer 3-4 word keywords to attract more relevant traffic.
b. Multi word keywords: Keywords of two or more words are called 'multi word keywords'. Since a multi word keyword describes the query a little more precisely, it attracts a more targeted audience to the site. Searchers also tend to use 3-4 word phrases, so a multi word keyword is more likely to match the search query. Multi word keywords mostly contain qualifiers such as place, color, location etc.
c. Theme based keywords: Theme based keywords attract highly targeted as well as substantial traffic. They are considered primary keywords, related to the theme of the site, and are present across the site. Traffic attracted by theme based keywords is the most likely to convert into sales.
There are some broad steps, following which SEO selects keywords for the campaign.
The first step is to understand what the web site is all about: SEOs should know the theme of the site. Keywords should then be generated according to the structure of the site. Sites usually provide general information on the home page and other higher-level pages, so more generic keywords should be targeted there, while specific keywords should be targeted on inner pages that describe a specific product, service or whatever the offering is!
Once generic or seed keywords are generated, the next task is to expand them into key phrases by adding specific keywords or qualifiers. From these phrases we then generate a large number of keywords.
Seed keywords are either given by the client, or SEOs generate them themselves by understanding the client's business. Studying the meta tags of competitor web sites can also help, but mind that this is for reference only: one cannot depend entirely on meta tags. Once the seed keywords are generated, we can use Wordtracker or Overture to derive a fuller set of keywords.
Seed keywords then need qualifiers added to generate more precise and specific keywords. A qualifier can be anything from a location to a number, a color or a product, and the resulting phrases are known as 'sub theme keywords'. Using them we can expand the seed keywords to 20-30. Typically a sub theme key phrase is 2-4 words long; one recent study suggests that the typical searcher often uses longer queries, many containing more than three words.
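The seed-plus-qualifier expansion can be mechanised. This is a minimal sketch; the seed keywords, qualifiers and function name are all invented for illustration.

```python
from itertools import product

def sub_theme_keywords(seeds, qualifiers):
    """Expand seed keywords with qualifiers into longer key phrases."""
    phrases = []
    for qualifier, seed in product(qualifiers, seeds):
        phrases.append(f"{qualifier} {seed}")   # qualifier before the seed
        phrases.append(f"{seed} {qualifier}")   # and after it
    return phrases

seeds = ["mobile phones", "smartphones"]      # hypothetical seed keywords
qualifiers = ["low price", "india"]           # price / place qualifiers
phrases = sub_theme_keywords(seeds, qualifiers)
print(len(phrases))   # -> 8
```

Two seeds and two qualifiers already yield eight candidate phrases, which is how a handful of seeds grows into the 20-30 sub theme keywords mentioned above.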
Now comes the time to check the popularity of these sub theme keywords. Some or all of the tools mentioned in the keyword tools section can be used to measure it, and based on their popularity counts we should come up with a descending-order list of our best relatively unused key phrases. These are our target keywords. As competition on the keywords increases, deeper optimization than just building keyword-specific pages will have to be done.
There should be some variations on the sub theme keywords to get into the mind of the user. Our set of 25 to 30 keywords is now expanded to include singular-plural combinations, the same words in different order, synonyms, and further words built from the list using a thesaurus and a good dictionary.
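These variation rules can be sketched as a small generator. The function below is a rough illustration, not a linguistic tool: the singular/plural toggle just adds or strips a trailing "s", and the synonym table stands in for a thesaurus lookup.

```python
from itertools import permutations

def keyword_variations(phrase, synonyms=None):
    """Word-order permutations plus naive singular/plural and synonym swaps."""
    words = phrase.split()
    variants = {" ".join(p) for p in permutations(words)}
    for w in words:
        # crude singular/plural toggle; a real campaign would use a dictionary
        swapped = w[:-1] if w.endswith("s") else w + "s"
        variants.add(phrase.replace(w, swapped))
    for old, new in (synonyms or {}).items():
        if old in words:
            variants.add(phrase.replace(old, new))
    return sorted(variants)

variations = keyword_variations("cheap phones", synonyms={"cheap": "budget"})
print(variations)
```

Applied to every sub theme phrase, a generator like this is how a 25-30 keyword list grows toward the 70-100 key phrases targeted below.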
The ideal outcome is a set of around 70-100 possible ways of looking for that web site on the search engines (our key phrases).
Research these variations of the sub theme keywords to find the combinations that are still relatively un-optimized for, yet popular.
At the end of this exercise we will have the ranked list of keywords, which we can start optimizing for.
Keyword Competition Analysis
Competition analysis is the major aspect of keyword finalizing strategy. That means determining the competitive ability of the finalized keywords. There is no benchmark as such to determine whether the keyword is competitive enough or not. But it varies according to campaign and objectives. Usually competition analysis is done to search for the terms that have least competition in the respective field.
The basic step before determining keyword competition is to study the overall competition facing your web site. To do that we need to look into many aspects of competing sites: how long they have been in the market, when they were indexed, how many links they have and how popular they are on the Internet. The more of all of the above, the better for a site. But keep in mind that it can take 6 months for a new site to get indexed in Google, and it will not rank for any competitive keyword during those months; determining keyword competition therefore does not help immediately if the site is new.
The first step in determining keyword competition is selecting the keyword and executing the query. Once you ask the search engine to search for a specific keyword, it returns millions of results, and in the corner of the page it reports the number of results for the keyword entered. For example, a search for 'low price mobile phones' may come up with, say, 98,768,999 results: the number of pages indexed in the search engine related to that keyword. But it is almost impossible to draw any competitive conclusion from such a huge number, because it covers not just the whole phrase but all the permutations and combinations within 'low price mobile phones', i.e. it also includes results for keywords such as 'mobile phones', 'low price', 'mobile', 'phone price' and so on. All of those keywords are not competing with our product, so this number gives only a vague competitive picture.
The second step is to determine how many of these results are actually competing for the phrase 'low price mobile phones'. There is an 'advanced search' option on many search engines, but it still fails to deliver the desired result, so SEOs use the 'allintitle' operator to find the exact number of sites competing for the same keyword or phrase. This is tied directly to the title tag in the HTML code: as we studied earlier, the title tag carries the theme of the page and should contain the most important keywords and phrases! With the 'allintitle' command we search only the title tags of sites, and the engine shows only those results that contain 'low price mobile phones' as a keyword in their title tag. The command we use is
allintitle: low price mobile phones
Once you execute the search with the allintitle command, the number of results drops from millions to a few thousand, and these results give much more precise data about the competition for your keyword. This is how the title tag helps decide the competitive ability of a keyword. We can also use one more factor, anchor text, to gauge keyword competition.
Anchor text is the linked text (of an internal or external link) on a webpage. Incoming links influence search engines when ranking a page: the more incoming links, the more relevant the page appears, and hence the higher the ranking. And as we know, putting keywords in the anchor text surely helps increase the ranking of the site being linked to. Because of this, it is common for websites competing for certain keywords to try to build incoming links that contain their desired keywords in the anchor text, which is why checking anchor text is helpful in determining competition. To do this we use the 'allinanchor' operator. The command to execute this query is
allinanchor: low price mobile phones
This too reduces the number of results significantly. It is the number of links whose anchor text contains the words 'low price mobile phones'. From it we know that a certain number of links point to pages carrying information about 'low price mobile phones' in some form or another, so all these pages could be our potential competitors.
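One hedged way to turn such counts into a decision is simply to rank candidate phrases by how few title-tag competitors each has. The counts below are invented figures standing in for numbers you would read off the allintitle: results pages; the function name is mine.

```python
def rank_by_competition(allintitle_counts):
    """Order phrases from least to most title-tag competition."""
    return sorted(allintitle_counts, key=allintitle_counts.get)

allintitle_counts = {              # hypothetical figures from allintitle: queries
    "low price mobile phones": 12_400,
    "cheap mobile phones": 151_000,
    "mobile phones": 2_300_000,
}
print(rank_by_competition(allintitle_counts))
# -> ['low price mobile phones', 'cheap mobile phones', 'mobile phones']
```

The same ranking could be repeated with allinanchor: counts, and a phrase that scores low on both lists is the least-contested target.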
Now with all the data available, you can evaluate the keywords on the basis of its competition and choose it according to your requirement.
Search Engine Optimization consists of some factors that can be changed and modified by the webmaster and some that cannot. The former are called On-page factors and the latter Off-page factors.
In this chapter we are going to study ‘On Page Factors’.
On page factors include
• Keywords
• HTML tags
• Content
• CSS
• URL rewrites
As we know, on page factors relate directly to the content and structure of the web site. This normally means pages written in HTML, but it also applies to other document formats indexed by search engines, such as .pdf or .doc.
Along with all the aspects mentioned above, it may also include reducing redundant HTML codes produced by web page authoring tools and restructuring the site to produce better linked and focused page content.
Let’s start up with the first and foremost aspect of SEO, ‘Keywords’! Keyword list is a list of significant and descriptive words that will be used to render the content to users in searches. The words should be similar to the theme of a web site and should be easily integrated in the web site content. Keywords are mental images linked to what lies in the heart of your customer. Keyword selection is based on consumer persuasive research.
The first step of any SEO campaign is 'keyword research', i.e. determining and short-listing the keywords most relevant to the product/service. To generate maximum ROI from an online marketing campaign, careful selection of, and focused effort on, specific keywords is very important. Keyword selection mainly depends on the theme of the website, your target audience, which search engine they are likely to use and, finally, what keywords they might use to find your product or service.
There are many tools available to generate keywords. These tools are known as keyword suggestion tools. The most commonly used keyword suggestion tools are Wordtracker, Overture and Google suggest. The keyword suggestion tool helps to choose relevant and popular terms related to your selected key terms.
Let’s study them one by one.
1. Wordtracker: Wordtracker was the first, and is considered the best, keyword suggestion tool in the world. It was introduced in 1999 by Andy and Mike Mindel with the aim of helping SEOs find multiple keywords from a single seed keyword. Wordtracker helps web site owners and search engine marketers identify keywords and phrases that are relevant to their (or their client's) business and most likely to be used as queries by search engine visitors.
Wordtracker offers a database of 340 million search queries, which you can mine to build keyword lists for use in SEO and PPC. All search terms are compiled from the major meta crawlers, Dogpile and MetaCrawler. Wordtracker offers simple, precise, compressed, comprehensive, misspelling, lateral and thesaurus searches, giving a list of similar keywords that bear any relation to the search query, along with the search count. Its competition search feature also finds keywords that have few competing web pages in the major search engines.
Wordtracker is useful in many ways. Its most important advantage is that it reveals the most popular terms visitors use while searching for your products or services: invaluable insight for targeting customers in your search engine optimization efforts and pay-per-click advertising. It also offers competition analysis, helping you find keywords that have few competing web pages in the major search engines; the fewer pages you have to compete with, the easier it should be to get top rankings. Wordtracker even offers a misspelling search to help you find misspelled keywords; sprinkle these into a web page to ensure it comes up in those search results.
2.Overture: Overture’s keyword research tool is basically for Pay-per-Click advertisement where bidding takes place for top rankings in SERP of major search engines and websites including Yahoo!, MSN, AltaVista, Alltheweb, Dogpile, CNN and ESPN. Overture can make you reach over 80% of the active net population. It only charges if somebody clicks on the text advert. Also marketer can get guaranteed top rankings in search results of major search engines and websites. At present overture is delivering more than 215 million targeted leads each month.
The biggest benefit it offers is not just guaranteed top ranking but that you pay only if a visitor clicks on the listing. Since the advertisements are placed contextually, a marketer can receive a lot of targeted customers within days of advertising. Its huge network is an added advantage: it is partnered with some big names like Yahoo!, MSN, AltaVista, CNN etc. If the objective is lead generation in quick time, Overture could be the best way to achieve it.
It offers self-service and paid-service options. In self service, the marketer does all the work, like selecting keywords, bidding and optimizing, whereas in the paid service Overture looks after the entire campaign and assures maximum ROI.
3. Google Suggest: Google Suggest is another keyword research tool, introduced by Google Inc. Though promoted by Google, it is not so popular among marketers. As you type in the search box, Google Suggest guesses what you are typing and offers suggestions in real time. It is similar to Google's "Did you mean" feature, which offers alternative spellings for your query after you search; the only difference is that Google Suggest works in real time. It can make your searches more convenient and efficient by keeping you from having to reformulate your queries, and it also shows the approximate traffic on a particular keyword, from which marketers can gauge the popularity of the keyword/phrase.
4. Digital Point Tool: This handy little tool shows you the results of your query from both Wordtracker and Overture, for determining which phrases are searched for most often. Enter a search phrase to see how often it is searched for, as well as to get suggestions for alternate (but similar) keywords.
Then, once you know the keywords you want to target, you can use their keyword position tracking tool to monitor your keyword placement progress on major search engines.
Keyword finalizing strategy includes
• Keyword phrase popularity
• Keyword Competition Analysis
• Marketing relevance
Keyword Phrase popularity:
Visitors searching for information use many different keywords to fetch it, so it is important for an SEO to select the keywords most popular among users. For example, if someone wants to research the automobile industry, he/she will perform multiple searches with multiple keywords like automobile India, automobile industry, four wheelers, two wheelers etc.
But not all these keywords attract heavy traffic; that is, they are not equally popular among the target audience. A few of them are very popular and frequently used when searching for that particular content, while a wrong search phrase may get you a better search engine ranking but receive no search requests. Researching and short-listing these keywords is therefore a primary task of an SEO.
So the first step is to come up with keywords about your web site and check their popularity.
Keyword Competition Analysis:
Competition analysis is necessary at every stage of an SEO campaign, right from keyword research. It gives insight into what strategies your competitors are adopting and why they rank high on those keywords. You can review a competitor's HTML code by selecting the 'view source' option in the browser menu; this code includes the keywords, the title and the description. The key on-site factors to consider in competition analysis are title and meta tags, keyword density and content, and special formats and positioning.
Providing the most beneficial information is the primary purpose behind creating a web site, but the ultimate goal is final sales! While selecting keywords for an SEO campaign, the basic criterion is to choose the keywords most relevant to the site, in the belief that they will drive lots of traffic which will convert into final sales. But it is not as simple as it looks. Consider this example: for a site selling broadband services, 'telecom' is a relevant keyword. It gets quite a lot of searches and can drive a huge number of visitors to your site, but hardly any of them will become customers, because people who type this keyword may be looking for telecom operators, telecom policies and other telecom-related matters, and are least interested in your broadband services. Hence selecting keywords from a marketing perspective is also very important.
As we all know, a web site is a source of information. Many web sites provide lots of it, whether well diversified or industry/product specific. But the amount of data and the number of industry/product features are so high that it becomes very difficult for a search engine to relate the keywords to the data. The 'theme pyramid' was developed to minimize this confusion.
The theme pyramid revolves around organizing the whole site and placing keywords according to the site structure. We studied one-word, multi-word and theme-based keywords previously. To rank pages higher in the search engines and to attract a greater amount of relevant traffic, we need to use all of these keyword types. The theme pyramid explains which kinds of keywords should be placed on which pages to generate maximum ROI: start with a one-word keyword on the index page and work your way down to the specific keyword you want to target.
The theme pyramid consists of 5 levels according to the importance of the page. Let's check them one by one.
Level one is the index page. It has been found that index pages rarely rank well in search engines. The reason is that index pages are meant to be attractive and eye-catching: the index page contains links to all other documents and pages and directs the user to the whole web site. Its main purposes are to look good, be easy to use and be well defined! So there is not much to do on the index page from an SEO viewpoint; the only way to rank index pages is link building. Keep in mind that many users are interested in visiting the specific informative pages rather than the index page.
The index page directs not only human visitors but search engine spiders as well. Its main tasks are to provide links for the spider to follow and to give an overview of the site to directory editors. Hence the SEO's job is to link as much deep content as possible from the index page.
Level two pages are the main SEO pages. They contain subtopics, not all the in-depth information. For example, if the site is about SEO, the level two pages could cover on-page factors, off-page factors, PPC etc. These pages in turn link to HTML tags, content optimization, technical optimization, bidding strategies etc.
Always remember that subtopic pages should contain specific information only, i.e. the on-page factors page should contain links related to on-page factors only, not to PPC and so on. These pages act as gateway pages for customers looking for information, and from an SEO point of view they are high-value pages. Here we should link as much deep content as possible, and we use single-word keywords to attract high traffic.
From this level onwards, the levels are differentiated by the depth of their content. Level three pages link to the level two (subcategory) pages, not to the index page. One- or two-word keyword phrases should be placed on these pages. It is very tough to rank these pages high, as the keywords used here are highly competitive. For example, if a level three page is about meta tags, the targeted keywords could be meta keywords, meta description, comment tag etc.
Level four pages contain even more in-depth content. Target these quality content pages with two- to three-word keyword phrases. If such a page covers keyword research, link to content on your site and off site that targets related keywords such as keyword research tools, keyword finalizing strategy and competition analysis. Do not link to PPC, link building or any other level two category.
Level five pages are known as money pages; some sites may not need to go this deep. This is your base-level site content, the search engine food that your users are really after. Link to all pages above or across within the same category.
Theme Keyword Pyramid
Level 1 (no SEO value): the index page itself, e.g. mosaic-service.com.
Level 2 (low value): primary single keywords on category pages (pseudo hallways), e.g. SEO, Affiliates, Advertising.
Level 3 (medium value): secondary 1-2 word keywords on doorway pages, e.g. On page, Off page, Link building, Functions, PPC, Banner.
Level 4 (high value): 2-3 word keyword phrases on high-content pages, e.g. HTML tags, Keywords, Link building, Google PR, One way links, Pricing, Bidding strategy, Site targeting, Reporting.
Level 5 (money): 2-4 word keyword phrases on ultra-targeted pages, the money pages ($).
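The internal linking the pyramid implies can be sketched for a hypothetical SEO site; the file names and link text below are illustrative only:

```html
<!-- Level 2 subtopic page (on-page factors): it links down to its own
     level 3 pages only, using keyword-rich anchor text -->
<h1>On page optimization</h1>
<ul>
  <li><a href="html-tags.html">HTML tags</a></li>
  <li><a href="content-optimization.html">Content optimization</a></li>
  <li><a href="technical-optimization.html">Technical optimization</a></li>
</ul>
<!-- Note: no links to PPC or other level 2 categories from this page -->
```

Each level 3 page would repeat the pattern, linking only further down its own branch of the pyramid.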
HTML Tags:
As we know, a spider or crawler can read text only. When a spider scans a web site, it actually scans the HTML code of that site. There are several HTML tags, such as the title tag, meta tags and alt attributes, which should be integrated with the keywords for greater optimization of the web site.
Title Tag:
The title tag plays a very crucial role in optimizing any web site. It is the HTML code whose words appear in the title bar at the top of the browser, and it is generally the first element of the page. The meta tags and header tags come after it. The title tag conveys the theme of the web site, so search engines give it substantial importance: it is the crawler's first impression of the site, and all the major search engines evaluate the relevance of a web site on the basis of the keywords present in its title tag. The title tag is also displayed on the SERP; the results shown there contain the text included in the title tag.
The title tag holds significant weight and must be created carefully to ensure that it has maximum SEO effectiveness and also appeals to searchers. Since the title tag plays a vital role in determining your site's ranking in the SERP, you need to pay close attention to the words that appear in it and the order in which they appear. You need to develop a crisply worded title tag that includes your most relevant keyword phrases and performs the function of announcing the theme of the page.
The words encoded in the title tag are not displayed anywhere else on the web page itself. For example, if the web site is about SEO, an appropriate title would be "Search Engine Optimization: HTML tags optimization". The title tag, the first element of the page, is generally followed by the meta description tag and the meta keywords tag. In HTML, the title tag is written as below:
<title>Search Engine Optimization: HTML tags optimization</title>
The title tag is the most important tag from the SEO perspective, and search engine crawlers consider it the most important element. The reason is very simple: since the title tag communicates the theme of the web page to human visitors, search engines assume the information entered in it is trustworthy. Though the title tag is not a meta tag, it is the most important of all HTML tags. Crawlers use its ability to communicate the theme: they gather overall information from it and use its words while ranking the web site. A well-written title tag increases relevance significantly, and all the major search engines use it to evaluate the relevance of the web site.
As stated earlier, the title tag is also displayed on the SERP: it is the same text that describes the web site in brief, hyperlinked to the web page so that when a user clicks it, she is taken to the site. The title tag text is also used when you 'bookmark' a page or add a certain web page to your 'favorites' list in your browser.
Though there are many views and opinions about on-page factors and their importance in SEO, most experts agree that the title tag is a significant tool in any SEO campaign. Since the title tag is the introduction to the web site, its wording is very important from an SEO point of view: it is necessary to develop a crisp, attractive title with well-integrated keywords that actually invites the visitor. The title tag is, in effect, a summary of the web page content.
As we now know, the title tag plays a vital role in ranking the web site. Perfect placement of keywords or key phrases can make a significant change in rankings, and keywords used in the title tag carry the highest weight.
Using the same keywords consistently throughout the web page is very important for an SEO. Keyword placement in the title tag should be done in such a way that the keywords or key phrases also appear in the content of the page: use the same keywords not just in your title tag but also in your page content and the meta description tag. If the keywords in the title tag differ from those in the content, the tag is of little use, because non-matching keywords dilute the weight of the keywords in the title tag.
Not just placement but the sequence of placement is also very important from an SEO point of view. The most important keyword should appear at the beginning of the tag, followed by the secondary keywords and then the least important ones. This kind of placement will certainly create a greater impact on ranking, and the matching words of the title are displayed in bold on the SERP.
Search engines generally read about 80-90 characters of the title, so the title tag should be limited to that many characters. Keyword density should also be kept in mind while working on the title tag.
The title tag should be relevant to the 'web page' rather than the 'web site'. Since each page contains different information, vary the keywords in the title tag accordingly.
Since there is no algorithm for equating plural and singular forms, search engines like Google treat them as two different terms; to Google, 'apple' and 'apples' are different. So it is advisable to use both the plural and singular forms of keywords in the title tag. Keywords are not case sensitive, however, so we can use whichever case is appropriate.
Title Tag should ideally read like a phrase that makes some grammatical sense, not just a collection of keywords. This is all the more important as the Title Tag usually appears as the text when you ‘bookmark' or add a page to your ‘favorites' list. Therefore, it should make sense when a person reads it later. For instance, if you want to include the keywords Home Loans, fast clearance, No credit check in your Title Tag, you can write a Title Tag that reads:
Home Loans : fast clearance with no credit check.
Since the Internet visitor is looking for information, it is advisable to put an informative summary in the title rather than the company/product name. If it is necessary to include the company name, keep it at the end.
To summarize: the title tag is a summary of the web page content; it should read like a crisply worded sales pitch that is enticing enough to make users click on your entry when it is displayed in the SERP; it should include your most important keyword phrases or search terms; the sequencing of the keywords should make logical sense; and each page of your web site should have a different, customized title tag relevant to the context of that page.
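Putting these guidelines together for the home loans example above, a hypothetical title tag could be marked up as follows:

```html
<head>
<!-- most important phrase first; reads like a sentence,
     not a keyword list, and stays within ~80-90 characters -->
<title>Home Loans : fast clearance with no credit check</title>
</head>
```

Every other page of the same site would carry its own, different title along these lines.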
Meta Description Tags:
The Meta Description Tag is a part of HTML code that allows you to give a short and concise summary of your web page content. The words placed in this Meta Tag, are often used in the search engines result pages, just below the Title Tag as a brief description of your page. In the Search Engine Results Pages, after reading the Title, a user usually studies the description of the page and decides whether she wants to visit your site or not.
Some Search Engines prefer to ignore your Meta Description Tag and build the description summary on the basis of the search term for the SERP on the fly. They usually pick up parts of the text on your page wherever the search terms appear.
The only exceptions are the Flash, Frame or All Image sites that have no content, and some high importance websites, where the search term is not found in the text. In such a case, Google picks up your entire Meta Description Tag and displays it.
This is the way Meta Description Tag appears in your site's HTML code:
<meta name="description" content="Meta Tag Optimization: Title Tag Optimization and Meta Description Tag Optimization. Tips about how to Optimize your most important Tags.">
Importance of Meta Description Tag
Not all search engines give high prominence to the meta description tag, and some may generate a description on the fly while listing your web page in the SERP. However, in some search engines a good meta description tag might help a page rank higher for your targeted search terms. This holds true for Flash, frame or all-image sites that have no text content, as well as for some high-importance web sites where the search term is not found in the text. In such cases, some search engines pick up the exact meta description tag and display it in the SERP, just below the title. Hence it is important to write a crisp and enticing meta description tag that includes your important keyword phrases and manages to interest your user, thus making her click on your entry.
Working with Meta Description Tag
Keyword Phrases and Meta Description Tag:
Include your most relevant and important keyword phrases in your page's Meta Description Tag. As in the case of Title Tag, focus on the same keyword phrases as you used in your page's Title Tag and body text. Fewer and highly targeted search phrases can boost your web page's relevance in the search engines results. Hence, stress should be on writing a brief yet informative description for your web page.
The Meta Description Tag of your web page should not read like a collection of keywords, but should be written as an informative and interesting summary of your web page.
Dos and Don'ts of a good Meta Description Tag:
1. Excessive keyword repetition should be avoided. Instead, pay attention to the sequence in which your keywords appear. Your most important terms should be placed in the beginning.
2. Make sure each page on your web site has a different and a unique Meta Description Tag using the keyword phrases that are relevant to that web page.
3. A Meta Description Tag of 25-30 words should do fine.
4. The most important keyword phrases should, ideally be placed at the beginning of your Meta Description Tag, which increases your chances of better Search Engine Rankings.
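Applying these dos and don'ts, a hypothetical head section for the home loans page used earlier might read as follows; the wording is illustrative only:

```html
<head>
<title>Home Loans : fast clearance with no credit check</title>
<!-- brief, unique to this page, most important phrase first,
     written as a summary rather than a keyword list -->
<meta name="description" content="Home loans with fast clearance and
no credit check. Learn how to apply, what documents you need and how
quickly your loan can be approved.">
</head>
```

Every other page of the site would get its own description in the same style.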
The Meta Keywords Tag:
The meta keywords tag is comparatively less important than the other meta tags. SEOs add some extra information in it to make the site more search engine friendly, but most crawlers no longer consider this tag.
The meta keywords tag is useful in the sense of providing support to the keywords and content; in a nutshell, it is a support tag for the other HTML tags and for the content of the page. For example, if the site is about SEO and 'search engine optimization' is your keyword, mentioning 'internet marketing' in the keywords tag might help boost the page a bit higher for those words.
Remember, if you don't use the words ‘internet marketing’ on the page at all, then just adding them to the meta keywords tag is extremely unlikely to help the page do well for the term. The text in the meta keywords tag, for the few crawlers that support it, works in conjunction with the text in your body copy.
The meta keyword tag is also sometimes useful as a way to help your page come up for synonyms or unusual words that don't appear on the page itself. For instance, let's say you had a page all about the ‘Google Features’. You never actually say the word "algorithm" on this page. By having the word in your meta keywords tag, then you may help increase the odds of coming up if someone searched for ‘Google features and algorithm’. Of course you would greater increase the odds if you just used the word ‘algorithm’ in the body copy of the page itself.
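The synonym trick described above would look like this in the head of the hypothetical 'Google Features' page:

```html
<!-- "algorithm" never appears in the body copy; listing it here may
     help the page match searches such as "Google features and algorithm"
     on the few crawlers that still read this tag -->
<meta name="keywords" content="Google features, algorithm">
```

As the text notes, adding the word to the body copy itself would improve the odds far more than the tag alone.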
Here's another example. Let's say you have a page about horseback riding, and you've written your page using "horseback" as a single word. You realize that some people may instead search for "horse back riding," with "horse back" as two separate words. If you listed these words separately in your meta keywords tag, then, for the few crawlers that support it, your page might rank better for "horse back" riding. Sadly, the best way to ensure this would be to write your pages using both "horseback riding" and "horse back riding" in the text, or perhaps to use the single-word version on some of your pages and the two-word version on others.
Far too many people new to search engine optimization obsess with the meta keywords tag. Few crawlers support it. For those that do, it MIGHT help improve the ranking of your page. It also may very well do nothing for your page at all. In fact, repeat a particular word too often in a meta keywords tag and you could actually harm your page's chances of ranking well. Because of this, it is strongly advisable that those new to search engine optimization not even worry about the tag at all.
Even those who are experienced in search engine optimization may decide it is no longer worth using the tags. Search Engine Watch doesn't. Any meta keywords tags you find in the site were written in the past, when the keywords tag was more important. There's no harm in leaving up existing tags you may have written, but going forward, writing new tags probably isn't worth the trouble.
Meta Robots Tag:
One other meta tag worth mentioning is the robots tag. It lets you specify that a particular page should NOT be indexed by a search engine. To keep spiders out, simply add this tag between the head tags of each page you don't want indexed. The format is shown below:
<meta name="robots" content="noindex">
You do NOT need to use variations of the meta robots tag to help your pages get indexed. They are unnecessary. By default, a crawler will try to index all your web pages and will try to follow links from one page to another.
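For reference, the commonly supported values of the robots meta tag can be sketched as follows; since indexing and link-following are the defaults, only the blocking values are ever needed:

```html
<!-- block indexing of this page, but still follow its links -->
<meta name="robots" content="noindex">

<!-- index the page, but do not follow its links -->
<meta name="robots" content="nofollow">

<!-- block both indexing and link following -->
<meta name="robots" content="noindex,nofollow">
```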
Most major search engines support the meta robots tag. However, the robots.txt convention for blocking indexing is more efficient, as you don't need to add tags to each and every page; see the Search Engine Features page for more about the robots.txt file. If you use a robots.txt file to block indexing, there is no need to also use meta robots tags.
The meta robots tag also has some extensions offered by particular search engines to prevent indexing of multimedia content. The article below talks about this in more depth and provides some links to help files. Search Engine Watch members should follow the link from the article to the members-only edition for extended help on the subject.
Other Meta Tags:
There are many other meta tags that exist beyond those explored in this article. For example, if you were to view the source code of this web page, you would find "author," "channel" and "date" meta tags. These mean nothing to web-wide crawlers such as Google. They are specifically for an internal search engine used by Search Engine Watch to index its own content.
There are also "Dublin Core" meta tags. The intent is that these can be used for both "internal" search engines and web-wide ones. However, no major web-wide search engine supports these tags.
How about the meta revisit tag? This tag is not recognized by the major search engines as a method of telling them how often to automatically return. They have never supported it.
Overall, just remember this. Of all the meta tags you may see out there:
Meta Robots: This tag enjoys full support, but you only need it if you DO NOT want your pages indexed.
Meta Description: This tag enjoys much support, and it is well worth using.
Meta Keywords: This tag is only supported by some major crawlers and probably isn't worth the time to implement.
Meta Everything Else: Any other meta tag you see is ignored by the major crawlers, though they may be used by specialized search engines.
Alt Text Optimization:
"Alt text" (often called the "alt tag") is defined as the alternative text the browser displays when the searcher does not want to, or cannot, see the pictures on a web page.
For years, search engine optimizers have included their important keyword phrases in ALT text for images, feeling confident that many of the search engines considered the contents of ALT text when determining relevancy.
But research has found that none of the major search engines now considers alt text when determining relevancy. According to research by expert SEO researcher Jerry West of WebMarketingNow and Search Engine Academy, the value of using alt text for SEO purposes has not only diminished; when used incorrectly, it adversely affects rankings in the SERPs.
According to a Google engineer, what you should do is create alt text that is relevant to the picture, so that it gives users, including the visually impaired, a good experience. The alt text is indexed, but it is downgraded in the algorithm.
That means search engine optimizers no longer need to stuff keyword phrases into the alt text of images on their web pages. Alt text can still be used to describe the image, and who knows, a good description might give you an advantage over others.
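In line with that advice, a descriptive (rather than keyword-stuffed) alt attribute might look like this; the file name and wording are illustrative:

```html
<!-- describes the picture for users and screen readers,
     not written for the ranking algorithm -->
<img src="acne-treatment-cream.jpg"
     alt="Tube of natural acne treatment cream next to its packaging">
```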
Anchor Text Optimization
As discussed earlier, anchor text is the visible hyperlinked text on a web page. Anchor text links to a page where the same topic is discussed or to a related topic; that is, it takes the visitor to another site or page where he/she can get some more insight into the topic being reviewed. As an example, take the lines we just read:
As discussed earlier, anchor text is the visible hyperlinked text on the webpage. Anchor text links to a page where the same topic is discussed or it is related to that topic. That means anchor text takes visitor to another site or page where he/she can get some more insights about the topic he/she is reviewing.
In the paragraph above, 'anchor text' and 'webpage' would be hyperlinked to pages about anchor text and web pages respectively; those words are therefore the anchor text of that paragraph.
Anchor text is very important from an SEO point of view. The page that the anchor text links to is considered highly relevant by search engines, but the page containing the anchor text also gains some importance in SERP ranking, provided the anchor text uses keywords.
A web site with different information on different pages can be optimized very easily using anchor text; we will call it 'anchor text optimization within the site'. We generally use words like 'About Us', 'Click Here' and 'More..' to link to the desired pages within the site. If we use keywords there instead, it will certainly help the site rank higher in the SERP. The game is all about linking appropriate keywords to the right pages.
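The difference can be sketched as follows, reusing the broadband example from the keyword discussion; the URL and keywords are illustrative:

```html
<!-- Weak: the anchor text tells the search engine nothing -->
<a href="/broadband-plans.html">Click here</a>

<!-- Better: the keyword phrase itself is the anchor text -->
<a href="/broadband-plans.html">broadband services in India</a>
```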
Anchor text optimization can also be done during a link building campaign. Since the presence of keywords in the links pointing to your site carries great weight, we need to optimize that text as well: it is advisable to write keywords in the link title and vary the description accordingly.
Anchor text has been observed to be a deciding factor in ranking pages. A site will rank fairly without anchor text optimization, but it will rank much higher with it. Take the famous 'miserable failure' case: the biography of George Bush was linked with the words 'miserable failure' and ranked number one for that keyword. This was achieved without any on-page optimization, purely through anchor text.
Comment Tag optimization
A comment tag is HTML code that describes or documents content. The code is written between <!-- and -->. Comments are invisible to the user and can be seen only in the code itself; the browser does not display anything inside a comment tag. However, some search engines index the text within comments. You can use comments to explain your code, which can help when you edit the source at a later date.
Generally a comment tag is written as:
<!-- your comment text here -->
Since some search engines index the text in comment tags, keywords and key phrases can be placed there. This will increase the keyword density, and ultimately the page can rank higher than others.
But using comment tags to optimize a web site is becoming an outdated practice. Many search engines have changed their algorithms and no longer rank pages based on keywords in comment tags; Inktomi is the only search engine that still considers comment tags when ranking a web site. The main reason for the change is misuse of the tag. Note also that inserting a comment tag inside a font tag will often not make it a comment tag any more; it will be treated like any other text in the document.
The Importance of Content:
Content is what makes a web site good or bad, and content is the major factor that attracts more and more visitors to the site. Search engines also give very high importance to sites that are full of relevant content. Considering the main reason behind a web site, i.e. information, search engines emphasize content more than anything else.
But just writing some simple text is not enough to rank a page in the search engines. There are several factors on which a search engine evaluates content and ranks the site. The main thing any search engine looks for in content is its benefit to the visitor. Original, well-written content is also highly appreciated, because search engines believe users are always looking for new and original content. Submitting articles is becoming popular too, because of its ability to earn links from other sites. The quality of the content likewise plays a vital role in search engine rankings.
Let's look at the advantages well-written, original content provides! Before writing any content, keep in mind that content is the backbone of the site. The main purpose of creating a web site is to provide all the possible information, as deeply as possible. Therefore, before writing the content you should know exactly who your target audience is and what their interests are, and you should answer a few basic questions:
Original & Well-Written Content
1. What does your audience want to find?
2. Will you have to do additional research?
3. Are you an expert writer or do you have one on staff?
What Does Your Audience Want To Find?
Assessing your potential visitors' wants does not require a crystal ball. If you have completed and spent quality hours on step one of this series, fully researching your keywords, you are already well on your way. Delving into those keywords, you will often find hints that push you in the right direction.
If you have a site that provides information on skin problems such as acne, and you have found a number of people searching for "acne treatment" and "natural acne treatment" and have thus chosen these as your targeted keyword phrases, you already understand your visitors' current situation and, more importantly, their desire. Similarly, if you are a real estate agent and have chosen "los angeles real estate" as your phrase, you know more than simply the characters strung together and dropped into a search box: you know that you are dealing with people wishing to purchase or sell a home in Los Angeles. In both scenarios you know what your visitors want and, assuming you are already successful in your industry, you know what you have to do to convert that desire into a client.
Now what has to be done is to create solid, compelling content that will both grab your visitor’s attention and at the same time, make them want what you have to offer. This is not the same as selling to them when you have the opportunity to speak to them face-to-face. You are working without the benefit of watching their expressions, speaking to them about their objections or even understanding whether they are looking for information for a friend or if it is they themselves who require your services.
This leaves you with a lot of room for content. In the online environment you have to deal with every question before they ask it, and make every person feel that you can help them even though you’ve never met.
What does your audience want to find? They want to find a solution to their problem. How do you provide that? By supplying them answers to the questions that they don’t have the opportunity to ask and may not want to give you their email address to find out. FAQ pages are good but often used as sales pages, which is fine so long as you are still providing good content that your visitor isn’t reading as “sales” but rather “solutions”. Perhaps create pages of replies to emails you have received. Perhaps place a related “fact of the day” on your homepage with a link to an archive of facts related to your industry, product and/or business. You might even want to add a blog to your site. Regardless, give your visitor the answers they’re looking for and keep this information updated as you get new information and you will stand a much better chance of keeping that person surfing through your website. The longer you can keep them on your site, the greater the chance that you will build trust and once you’ve got that, you can help them with the solution to their problem.
Will you have to do additional research?
No matter how much you know there is always more out there and your visitors are probably well aware of that. If you fail to address all their questions, your visitors may very well leave your site in search of the answer. Once they’ve left your site it becomes other webmasters who now have the opportunity to present the benefits of their products or services.
Find all the information that you can and make sure that you include as much as possible on your site. The additional benefit of doing this is that constant new information on your web site will keep not only visitors coming back but the search engine spiders too. If your site changes often, the spiders will pick up on this and will visit you more often. While this by itself will not improve your rankings, it does give you an advantage: the more often search engine spiders visit your web site, the faster your changes will be picked up, and the quicker you will be able to react to drops in rankings. If you know the spiders visit your site every second day and you drop from #8 to #12, you know that with proper tweaking of your content you may be able to recover that loss in as little as two days.
Use professional writers:
When you need a doctor, do you read a book entitled "Heart Surgery For Dummies" and buy yourself a very sharp knife? Of course you don't, and while your web site may not be quite as important as your heart, it is how your company is perceived online. This perception can make or break all your online marketing efforts.
If you are committed to attaining high rankings, to making money online and/or to promoting your business through your web site, shouldn't you also be committed to ensuring that your conversions are maximized? High search engine positioning is important, but so is converting those visitors once they reach your site. You may be an expert in your field, but if that field isn't writing, and you don't have a writer on staff, at least consider hiring one to make sure your web site conveys the message you want in language your visitors will understand. Assuming you choose your writer well, you will not only have a well-written site but also gain the advantage of an outsider, who is more likely to write for people who aren't experts, creating your content.
If you feel that you are qualified to write your own content be sure to have it proofread by someone from the outside. Find someone from within your target market and demographic, and have them go through your content giving suggestions and criticism. Every change they recommend is earning you extra money. Whether you implement the changes or not you are learning something new about what people will want and expect to see on your site.
With Articles Come Links
Writing content is not just an exercise for your own website. We all know that inbound links to your site help rankings. Additionally, if those links can be ones that provide genuine targeted traffic you’re doing very well.
There are a number of methods for driving traffic to your site with paid advertising, PPC, etc.; however, one of the most cost-effective methods is to publish articles. Article writing is no simple task, but the rewards can be enormous. Articles serve two great purposes:
1. Increased Link Popularity – When you write an article and submit it to other websites to post, they will generally link to your website from the page the article is on. Here’s a completely legitimate, relevant, and quality link to your site.
2. Exposure & Credibility – The added credibility that article writing lends to your business coupled with the added benefit of the visitors who come to your site directly from your article are invaluable.
When it comes to article writing there is little in the way of more effective advertising. You will have to find sources to publish those articles on, but once you’ve done this time-consuming task you can reuse the same list for future articles.
Get those articles on a number of quality resource sites and enjoy watching your stats and your rankings improve.
With Quality Content Comes Even More Links
Yet another benefit that derives from having a website with great content and writing articles is that, with time, your website itself will become a resource. If you provide great information that other people will find useful people will link to it naturally.
With so much emphasis in recent times on reciprocal linking some might think this is the only way to get links at all. Believe it or not there are still webmasters out there who will link to sites for no other reason than they feel their visitors will be interested in its content.
Build a good site with quality content, keep it easily navigated and create sections for specific areas (articles for example) and you will find that people will link to your site and may even link to specific articles or your articles index. Perhaps then your articles index is a good page to target an additional keyword phrase.
There are aspects of the optimization process that gain and lose importance. Content optimization is no exception to this. Through the many algorithm changes that take place each year, the weight given to the content on your pages rises and falls. Currently incoming links appear to supply greater advantage than well-written and optimized content.
While currently having a bunch of incoming links from high PageRank sites will do well for you on Google you must consider what will happen to your rankings when the weight given to incoming links drops, or how your website fares on search engines other than Google that don't place the same emphasis on incoming links.
While there are many characteristics of your content that are in the algorithmic calculations, there are a few that consistently hold relatively high priority and thus will be the focus of this article. These are:
1. Heading Tags
2. Special Text (bold, colored, etc.)
3. Inline Text Links or Anchor text
4. Keyword Density
The heading tag is code used to tell both the visitor and the search engines what the topic of your page, or of its subsections, is. You have six predefined heading tags to work with, ranging from H1 to H6.
By default these tags appear larger than standard text in a browser and are bold. These aspects can be adjusted using font tags or Cascading Style Sheets (CSS).
Due to their abuse by unethical webmasters and SEOs, the weight given to heading tags is not what it could be; however, the content between these tags is still given increased weight over standard text. There are rules for the use of heading tags that must be adhered to. If you use heading tags irresponsibly you run the risk of having your website penalized for spam, even if the abuse was unintentional.
When using your heading tags try to follow these rules:
• Never use the same tag twice on a single page
• Try to be concise with your wording
• Use heading tags only when appropriate. If bold text will do then go that route
• Don't use CSS to mask heading tags
Never use the same tag twice on a single page. While the H1 tag holds the greatest weight of all the heading tags, its purpose is to act as the primary heading of the page. If you use it twice you are obviously not using it to define the main topic of the page. If you need another heading use the H2 tag, after that the H3 tag, and so on. Generally, try never to use more than two heading tags on a page.
Try to be concise with your wording. If you have a 2 keyword phrase that you are trying to target and you make a heading that is 10 words long then your keyword phrase only makes up about 20% of the total verbiage. If you have a 4-word heading on the other hand you would then have a 50% density and increased priority given to the keyword phrase you are targeting.
Use heading tags only when appropriate; if bold text will do, go that route. If overused, the weight of the tags themselves is reduced, with decreasing content and "priority" being given to different phrases at various points in the content. If you have so much great content that you feel you need many heading tags, consider dividing the content into multiple pages, each with its own tag and keyword target possibilities. For the most part, rather than using additional heading tags, bolding the content will suffice: the sizing stays the same as your usual text, and it stands out to the reader as part of the text but with added importance.
Don't use CSS to mask heading tags. Cascading Style Sheets (CSS) serve many great functions: they can be used to define how a site functions, looks and feels. However, they can also be used to mislead search engines and visitors alike. Each tag has a default look and feel. It is fine to use CSS to adjust this somewhat to fit how you want your site to look. What is not all right is to adjust the look and feel to mislead search engines. It is a simple enough task to define in CSS that your heading should appear as regular text. Some unethical SEOs will then place their style sheet in a folder that is hidden from the search engine spiders. This seems secure enough, until your competitors look at the cached copy of your page (and they undoubtedly will at some point), see that you have hidden heading tags, and report you to the search engines for spamming. It's an unnecessary risk that you don't need to take. Use your headings properly and you'll do just fine.
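The heading-tag rules above can be checked mechanically. Below is a minimal sketch in Python, using the standard library's html.parser, that counts the heading tags on a page; the sample page markup is invented for illustration.

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Counts heading tags (h1-h6) so a page can be checked against the
    rules above: a single h1, and few heading tags overall."""
    def __init__(self):
        super().__init__()
        self.counts = {f"h{i}": 0 for i in range(1, 7)}

    def handle_starttag(self, tag, attrs):
        if tag in self.counts:
            self.counts[tag] += 1

# Invented sample page for demonstration.
page = ("<html><body><h1>Main Topic</h1><p>Some text.</p>"
        "<h2>Subtopic</h2><p>More text.</p><h2>Another subtopic</h2>"
        "</body></html>")

audit = HeadingAudit()
audit.feed(page)
print(audit.counts["h1"])  # 1
print(audit.counts["h2"])  # 2
```

A page that reports more than one h1, or many headings overall, is a candidate for restructuring along the lines suggested above.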
Special Text (Optimization using Bold, Italics, Underlines etc.)
When a search engine is scanning a page, it looks for several factors to determine what is important and what is not. In particular, it looks for text formatting such as bold, underline and italics to help rank the page.
The reason behind this is quite simple: since the text is specially formatted, search engines assume it is important to users, and hence treat it as important as well.
"Special text" is any content on your page that is set to stand out from the rest. This includes bold, underlined, colored, highlighted, sized and italic text. This text is given more weight than standard content, and rightfully so: bold text, for example, is generally used to define sub-headings or to pull content out on a page to ensure the visitor reads it. The same can be said for the other "special text" treatments.
Search engines have thus been programmed to read this as more important than the rest of the content and give it increased weight. For example, on our homepage we begin the content in bold text. This serves two purposes. The first is to draw the eye to these words and further reinforce the "brand". The second is to add weight to the "Search Engine Positioning" portion of the name. It effectively does both.
Reread your content and, if appropriate for both visitors and search engines, use special text when it will help draw the eye to important information and also add weight to your keywords. This does not mean that you should bold every instance of your targeted keywords nor does it mean that you should avoid using special text when it does not involve your keywords. Common sense and a reasonable grasp of sales and marketing techniques should be your guide in establishing what should and should not be drawn out with "special text".
Inline Text Links (Anchor Text)
Inline text links are links added right into text in the verbiage of your content. For example I can link some text of this article to other articles for reference.
Like special text this serves two purposes. The first is to give the reader a quick and easy way to find the information you are referring to. The second purpose of this technique is to give added weight to this phrase for the page on which the link is located and also to give weight to the target page.
While this point is debatable, there is a relatively commonly held belief that inline text links are given more weight than a text link which stands alone. If we think like a search engine, this makes sense: if the link occurs within the content area then chances are it is highly relevant to the content itself, and the link should be counted with more strength than a link placed in a footer simply to get a spider through the site.
Like "special text" this should only be employed if it helps the visitor navigate your site. An additional benefit to inline text links is that you can help direct your visitors to the pages you want them on. Rather than simply relying on visitors to use your navigation bar as you are hoping they will, with inline text links you can link to the internal pages you are hoping they will get to such as your services page, or product details.
"Keyword density" is the percentage of your total content that is made up of your targeted keywords. There is much debate in forums, SEO chat rooms and the like as to what the "optimal" keyword density might be. Estimates seem to range from 3% to 10%.
Knowing that search engines operate on mathematical formulas implies that this aspect of your website must have some certain number associated with it that will give your content the greatest chance of success.
With this in mind there are three points that you should consider:
1. Since the algorithm is very complex and hard to explore, you will never know this number with certainty.
2. The optimal keyword density changes regularly as algorithms are updated. Even if you knew the number today, there is no guarantee it would give the same ranking later. Rather than chasing an optimal keyword density, it is better to keep it between 2% and 10%.
3. The optimal keyword density for one search engine is not the same as it is for another. Chasing the density of one may very well ruin your efforts on another.
So what can you do? Your best bet is to simply place your targeted keyword phrase in your content as often as possible while keeping the content easily readable by a live visitor. Your goal here is not to sell to search engines; it is to sell to people. If you are simply aware of the phrase that you are targeting while you write your content then chances are you will attain a keyword density somewhere between 3 and 5%. Stay in this range and, provided that the other aspects of the optimization process are in place, you will rank well across many of the search engines.
Also remember when you're looking over your page that when you're reading it the targeted phrase may seem to stand out as it's used more than any other phrase on the page and may even seem like it's a bit too much. Unless you've obviously overdone it (approached the 10% rather than 5% end of the spectrum) it's alright for this phrase to stand out. This is the phrase that the searcher was searching for. When they see it on the page it will be a reminder to them what they are looking for and seeing it a few times will reinforce that you can help them find the information they need to make the right decision.
In the search engine marketing literature, keyword density is defined as

KD(i, j) = tf(i, j) / l     (Equation 1)

where tf(i, j) is the number of times term i appears in document j and l is the total number of terms in the document. Equation 1 is a legacy idea found intermingled in the old literature on readability theory, where word frequency ratios are calculated for passages and text windows - phrases, sentences, paragraphs or entire documents - and combined with other readability tests.
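Equation 1 is easy to compute directly. The sketch below is a rough transcription in Python; it assumes a simple whitespace split stands in for real term extraction, and the sample sentence is invented.

```python
# Direct transcription of Equation 1: tf(i, j) / l, where tf is the number
# of occurrences of the term and l is the total number of terms.
def keyword_density(term, text):
    """Return the keyword density of `term` in `text` as a fraction."""
    words = text.lower().split()          # crude stand-in for term extraction
    return words.count(term.lower()) / len(words)

# Invented sample: 10 terms, 3 of which are "seo" -> density 0.3 (30%).
sample = "seo guide with seo tips for every new seo beginner"
print(round(keyword_density("seo", sample) * 100, 1))  # 30.0
```

A density this high would normally read as over-optimized; the 2-10% range discussed above corresponds to values between 0.02 and 0.10 from this function.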
Semantic Approach for content optimization
As internet marketing is growing, search engines are coming up with new algorithms so that users can get most relevant data for their queries. The latest twist in the tale is Latent Semantic Indexing.
Latent semantic indexing (LSI) means that the focus of search engines is more towards the theme of the site rather than just keywords while ranking the site. This also includes the incoming links to your site. So the job of SEOs is to make site theme based rather than optimizing it based on just keywords.
All content should be focused on the main topic and, more importantly, articulated using related phrases and words. If we use semantically related words, search engines find it easier to determine the topic of the site.
Search engines are now checking your knowledge rather than just your information. They study your site and the links leading to it for words related to your targeted keywords - not necessarily your intended keywords, unless they are relevant. Semantic indexing is already evident in Google's search results, since some sites rank well for keyword phrases that don't appear on the site; in some cases they don't even contain related words. LSI analyzes the underlying meaning inherent in links between websites and infers meaning from them. That doesn't mean on-site optimization is irrelevant, because strategic keyword usage on your site will still generate an advantage; LSI is simply another technique. We can say that search engines are now heading towards artificial intelligence.
The principle is simple. The search engines are using advanced technologies to ensure that emphasis is upon true relevancy in a search. As long as you are creating high quality content that is useful to a user, you will get better ranking.
Latent semantic indexing adds an important step to the document indexing process. In addition to recording which keywords a document contains, the method examines the document collection as a whole, to see which other documents contain some of those same words. LSI considers documents that have many words in common to be semantically close, and ones with few words in common to be semantically distant. This simple method correlates surprisingly well with how a human being, looking at content, might classify a document collection. Although the LSI algorithm doesn’t understand anything about what the words mean, the patterns it notices can make it seem astonishingly intelligent.
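The "words in common" intuition can be illustrated with a toy overlap measure. Real LSI relies on singular value decomposition of a term-document matrix; the sketch below shows only the surface-level overlap step, and the sample texts are invented.

```python
# Toy illustration of the "semantically close vs. distant" idea above
# (real LSI uses SVD; this is only the word-overlap intuition).
def similarity(doc_a, doc_b):
    """Jaccard overlap of the two documents' vocabularies:
    1.0 = identical word sets, 0.0 = nothing in common."""
    a, b = set(doc_a.lower().split()), set(doc_b.lower().split())
    return len(a & b) / len(a | b)

cars    = "engine wheels fuel driving road"
trucks  = "engine wheels fuel cargo road"
cooking = "recipe oven flour baking sugar"

# Documents sharing many words score as closer than unrelated ones.
print(similarity(cars, trucks) > similarity(cars, cooking))  # True
```

This is why theme-based writing helps: pages that consistently use vocabulary from one topic cluster together, even without repeating a single exact keyword.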
Using this system makes it a lot more difficult for an SEO spammer to guess which semantically related words or phrases Google might assign most weight to in relation to the theme of the page. It's also very likely that Google will step up the importance of high-quality backlinks and will almost certainly use LSI techniques to judge the impact of these links.
In the previous chapter we studied the front-end factors that can be optimized to make a website search engine friendly. But there are several factors at the back end that are just as important as on-page and off-page factors. Though front-end and back-end factors are equally important, many companies neglect back-end optimization, i.e. the technical issues in SEO.
On web servers, websites are stored as files, and the Hyper Text Transfer Protocol (HTTP) is used to share those files. When a user enters a site name, the request is sent to the server; the server locates the file and sends it back to the user. HTTP is a protocol with the lightness and speed necessary for a distributed collaborative hypermedia information system. It is a generic, stateless, object-oriented protocol which may be used for many similar tasks, such as name servers and distributed object-oriented systems, by extending the commands, or "methods", used. A feature of HTTP is the negotiation of data representation, allowing systems to be built independently of the development of new advanced representations.
But sometimes errors occur at the server end while searching for the requested page. A misspelled website name, a change of name, IP address or ISP, website reconstruction, moved files, deleted pages and so on can prevent the server from finding these pages. When this happens, HTTP returns an error page to the user, and this is the point at which a company loses customers. It creates such a negative impression that many visitors never return. This is not just because of the unavailability of the files but because of the way the user finds out about it: the default HTTP error pages are very dull, so visitors immediately look for other sites or companies.
But we can customize these error pages so that there is a minimal chance of losing the customer. Before doing that, we should know which status codes matter most from an SEO point of view. There are about seven codes SEOs should look for, four of which are the most important: 200, 404, 301 and 302. 200 means ‘page OK’, 404 gives the ‘page not found’ error, while 301 and 302 are permanent and temporary redirects respectively. Let's look at them one by one and see how we can make them more effective.
200 status means the request has succeeded. The information returned with the response depends on the method used in the request.
404 ‘Page not found’:
404 means page not found! This is usually the page you get when you misspell a page name on a site, or when the page has been deleted or moved. The appearance of this page is very dull and unhelpful; there is no way for a visitor to discover the rest of your site unless you customize this page.
Many people have figured out that with a custom 404 page you can present a much more helpful page to your visitors. Many design it to redirect to another page or site, so that anything pointing to the missing page passes on to the website.
But the problem is that if you use a redirect to pass PR from an error page to a normal page, the redirecting page will usually return a "200 OK" or 302 redirect code rather than a proper 404. This confuses search engines and can result in a whole bunch of indexed URLs all looking, to the search engine, like duplicates of your home page.
So instead of doing so many things and creating unnecessary chaos, just try to make the page a little more helpful to your customer, or redirect it to some helpful page. The best result from an SEO point of view is for any link popularity from broken links to be passed on to the page of your choice.
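One practical check that follows from this: request a page that should not exist and confirm that the server answers with a genuine 404 rather than "200 OK" (a "soft 404"). The Python sketch below spins up a throwaway local server just to demonstrate the check; in practice you would point the function at your own host and an invented page name.

```python
# "Soft 404" check: a bogus URL should come back as a real 404 status.
import http.server
import threading
import urllib.error
import urllib.request

def status_of(url):
    """Return the HTTP status code a URL responds with."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status          # 200 here would signal a soft 404
    except urllib.error.HTTPError as err:
        return err.code                 # 4xx/5xx arrive as exceptions

# Throwaway local file server, only so the example is self-contained.
server = http.server.HTTPServer(("127.0.0.1", 0),
                                http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

status = status_of(f"http://127.0.0.1:{port}/no-such-page.html")
server.shutdown()
print(status)  # 404
```

If this check ever returns 200 for a page you know is missing, your custom error handling is masking the 404 and risks the duplicate-content problem described above.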
301 ‘Permanent redirection’:
Sometimes companies change their website name, IP address, ISP etc. It then becomes almost impossible for the company to notify its customers of these changes. This can cause confusion, insecurity and a lack of trust among customers, and ultimately results in their loss. Therefore we use the ‘redirection’ tool, by means of which the customer reaches the desired site without knowing about any back-end change.
301 is an optional function that redirects the visitor from the current page or site to another page or site, while returning a response code that says that the original page or site has been ‘permanently’ moved to the new location. Search engines like this information and will readily transfer link popularity (and PageRank) to the new site quickly and with few issues. They are also not as likely to cause issues with duplication filters. SEOs like 301 redirects, and they are usually the preferred way to deal with multiple domains pointing at one website.
302 ‘Temporary redirection’:
It redirects the visitor from the current page or site to another page or site, while returning a response code that says that the original page or site has been ‘temporarily’ moved to the new location. Search engines will often interpret these as a park, and take their time figuring out how to handle the setup. Try to avoid a 302 redirect on your site if you can (unless it truly is only a temporary redirect), and never use them as some form of click tracking for your outgoing links, as they can result in a "website hijacking" under some circumstances.
The objective is to have one resource that redirects a visitor to a completely different page or site and informs the visitor whether the redirection is temporary or permanent. We need one source page and one destination page for this. When the user tries to reach the source page, the server redirects them to the destination page, and during the transfer it indicates whether the redirect is permanent or temporary.
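The source-page/destination-page arrangement described above can be sketched with Python's standard http.server. The paths /old-page and /new-page are invented for illustration; a production site would normally configure redirects in the web server itself rather than in application code.

```python
# Minimal server demonstrating a 301 from a source page to a destination page.
import http.server
import threading
import urllib.request

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old-page":
            # Permanently moved: search engines transfer link popularity.
            self.send_response(301)
            self.send_header("Location", "/new-page")
            self.end_headers()
        elif self.path == "/new-page":
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"destination page")
        else:
            self.send_error(404)

    def log_message(self, *args):
        pass  # keep the demo output quiet

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# urllib follows the 301 automatically; the final body is the new page.
with urllib.request.urlopen(f"http://127.0.0.1:{port}/old-page") as resp:
    body = resp.read()
server.shutdown()
print(body)  # b'destination page'
```

Changing the 301 to a 302 in this sketch yields the temporary variant; as noted above, prefer the 301 unless the move really is temporary.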
Multi-domain for same website:
Sometimes companies register multiple look-alike domains. There are many reasons for doing this: covering a product category with similar names, keeping competitors away, or catching likely spelling mistakes by visitors. The ultimate goal is not to lose a single visitor who wants to view the site. Search engines, however, consider this spamming: even though the domain names are different, the content of the sites is the same, and you are trying to index different sites with identical content.
Bruce Clay has come up with a system called ‘IP-Funnel’ to counter this problem. The IP-Funnel system provides the same multi-domain benefits but without duplicating data for each domain. It arranges the websites in such a way that search engines do not see your multiple sites and hence cannot detect any spam.
Most domain services web sites (domain.com and most others) provide the ability to "point" or "forward" to another site. The multiple domains will point to a feeder site that is hosted and only contains an index file and a robots.txt file. The feeder index will include a "meta refresh" and a "no index" statement. The feeder index file should have an "optimized" title, description, and keyword tag. Next, add a 301 Permanently Moved action to the Feeder Site that will redirect to the Main Site so that any links and status is passed. The feeder site will correctly redirect to your "real" site.
Moving the site to new Host:
Sometimes a company wants to move its site to another IP or shift to another ISP. During the shifting process, users often can’t find the website; they are left in confusion, and the marketer faces lots of problems. So it is the SEO's job to minimize all the confusing factors.
Set up the DNS on your new host to point to your existing (old host) site first; this is an important first step. Now change the TLD (top level domain) information at your domain registrar to point to this new DNS. Your old site should still be reachable by either IP or domain name. Copy your existing site to the new host and validate that all files have transferred and the links work. After allowing four days for the DNS to fully propagate, point your new DNS to your new site. Make sure that your old site mailboxes have been emptied before you change any DNS information at this stage; once this DNS change occurs you cannot get to your old mail. *Once everything has been validated, you should then point the old DNS to your new site as well. This is a safety measure in case there is a lingering propagation error. Search engine listings and bookmarked pages should transfer to your new site with a 301 redirect.
After everything has been checked you should be able to delete your old site after a sufficient amount of time has passed (not more than 3 months). Note that Google does cache the old DNS address information and until they verify that the site has moved and store the new DNS information they may not visit your new site. The 301 will assist in this area.
* If you are moving from an IIS server to Linux (Apache) you should validate your formmail scripts, and any items that may not be cross platform compatible.
If you are moving from Linux to IIS then your .htaccess file will not be compatible as well as the ability to CHMOD permissions. Validate all functions with your ISP Administrator (some of the following steps may need to be redone on your new server).
Fill in the blanks:
1. The three types of keywords are ………….. , ……………… and ……………….
2. Typically a sub theme key phrase could be of ……. To ……. word length.
3. The first step of competition analysis is ……………………………….
4. Wordtracker, Digitalpoint, Overture and Google suggest are ………………….. tools.
5. Anchor text is visible ………………… in the webpage.
State the following statements are True or False:
1. Allintitle is an HTML Meta tag.
2. We can search the keywords present in the URL by using AllinURL.
3. Allinanchor is a text present in the anchor tag
4. 301 is a permanent error page.
5. 404 is ‘not found’ page
6. Keyword density is defined as tf(i, j) / l, where tf(i, j) is the number of times term i appears in document j and l is the total number of terms in the document.
7. The optimal keyword density is 4.
Choose the correct option:
1. “Alt Tag” is defined as
a. The alternative text that the browser displays when the surfer does
not want to or cannot see the pictures present in a web page.
b. The special text included in the keyword description to make the site search engine friendly.
c. The link to error pages.
d. The text we use while moving the site to other IP.
2. 200 HTTP code means
a. Request accepted successfully.
b. Permanent redirection
c. Temporary redirection
d. Page not found
3. While optimizing content, we should check out
a. Heading Tags
b. Special Text (bold, colored, etc.)
c. Inline Text Links or Anchor text
d. All of the above.
4. This lets you specify that a particular page should NOT be indexed by a search engine.
a. Meta Keyword tag
b. Meta Robot tag
c. Keyword description tag
d. None of the above
Which of the following elements are considered in Technical Optimization.
a. HTTP errors
b. HTML code
c. RSS Feeds
d. All of the above
Keyword competition analysis is done by
a. allinanchor, allinurl, allintitle
b. keyword research, keyword tracker, Keyword generator
c. keyword description, meta tags, URL check
d. None of the above
1. How many keywords should I use for optimizing content?
A. As of now, nobody knows the exact keyword density algorithm. But generally
SEOs go for 2-10% keyword density.
2. Can I use single word keyword in SEO campaign?
A. You can use any number of keywords you want (including one). But single-word keywords are very generic and attract a high amount of traffic, most of which is irrelevant.
3. Then how many words should be there in a keyword?
A. Ideally a keyword phrase should be 2 to 4 words long.
4. Are there any free tools available for keyword research?
A. Overture, Google suggest offers free keyword generation tool.
5. How can I use Allin…. command to know about keyword competition?
A. Type the allin… command into the search engine, put a colon, and then type the keywords you want to know about. Once the search is completed, the data is displayed at the right-hand corner of the results page.
6. What is the importance of content in SEO?
A. Content is the MOST important element for SEO. If you have lots of content in
the web site, your web site ranks much higher than others on SERPs.
7. Can I use JavaScript to build my website?
A. HTML is the most preferred language from an SEO point of view. You cannot optimize JavaScript in any meaningful way.
8. Is it harmful if I don’t optimize/customize error pages?
A. Yes, it is very harmful in terms of losing customers. But in terms of the website's technical functioning, it is not harmful.
Off page optimization is the term used for the actions taken off the actual web page that positively affect the performance of the page and the site. This includes everything from links from other sites, exchanges of links and the actions taken offline that affect performance of the web site.
Due to the ease with which on-page factors can be manipulated search engines now place more weight on so called off-page factors. Google made this form of ranking famous with its patented PageRank algorithm but researchers discussed using ideas such as link anchor text as far back as 1994. Off-page criteria are obtained from sources other than the website.
Probably the most important of these criteria is the quantity of inbound links using anchor text containing your target keywords. These should come from pages covering similar topics, preferably from large, long-established authority sites. Link popularity is a score that adds up the total number of links pointing to your site across various search engines. Search engines heavily weigh how many links point to your site, on the assumption that the more links a site has, the more useful it is to visitors; inbound links are like recommendations to view your site. However, all links aren't equal in quality. The more links you have the better, but evaluating a link's importance is quite complex and is determined by various factors, some of which include content relation, PageRank and page traffic.
Content relation: A link from a content-rich site is the most valuable in terms of quality. Of course, the content should be related to the topic of your page or site. Any website that serves the same target audience is very valuable. For example, if you are an SEO company, then links from SEO-related sites will certainly create more value for your rankings.
Page Rank: It is a number between 0 and 10 that Google assigns to every web page in its index. In the case of Google, the higher the PageRank the better! All other things being equal, a link from a PR6 site is worth around eight to ten times that of a PR5 site. But determining a link's value solely on PageRank is unwise, as there are lots of PR0 pages that may have lots of targeted traffic. Remember that Google uses all inbound links in its ranking process, not just those shown by the 'back links' option of the Google toolbar. If the page has the luxury of many inbound links, then mixing target keywords to cover different queries is a good idea. Since Yahoo! acquired Overture and dumped the Google index, it has published an index called WebRank, based on the number of incoming links to a site. It seems likely that this is also a factor in their ranking process.
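The idea behind PageRank - that a page's score depends on the scores of the pages linking to it - can be sketched with a small power-iteration loop. The three-page link graph below is invented, and this is only a toy version of the published algorithm, not Google's actual implementation (which uses far more signals and scales the result to the 0-10 toolbar value).

```python
# Toy power-iteration sketch of the PageRank idea.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:          # each outbound link passes a share
                    new[q] += share
            else:                       # dangling page: spread rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / len(pages)
        rank = new
    return rank

# Invented three-page link graph: A links to B and C, B to C, C back to A.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
# C collects links from both A and B, so it ends up with the highest score.
print(max(ranks, key=ranks.get))  # C
```

The sketch shows why inbound links matter more than outbound ones: a page's score rises only through the links pointing at it, weighted by the strength of the linking pages.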
Page traffic: Page traffic refers to how many potential buyers may come across a page on the web that you want to target.
Some search engines also rank sites based on usage information. This is called Click Density and was pioneered by the DirectHit engine. The search engine monitors the results pages to see which links users actually follow. This is a kind of quality control although is affected by on-page factors such as the Description META tag and other summary information the search engine uses to describe the page.
How to generate links for the site:
There are four ways to get links to your site: directories, reciprocal links, purchased links and content donations.
Dmoz and other directories: Dmoz is the biggest online directory to date! It is maintained by more than 60,000 volunteer editors, and obtaining a link in this directory is very valuable. But getting listed in Dmoz is very difficult. First of all it is maintained by human editors, and volunteers at that, so they may not be devoted to the job. There are lots of categories within the directory that have no editor, so many submissions go unreviewed until someone looks at them. It can take almost a year to get listed in Dmoz. There are lots of other directories besides Dmoz, but none carries the same value.
Reciprocal Links: Reciprocal links are based on an agreement by two sites to link to each other. Reciprocal linking is often used by small/midsize sites as an inexpensive way to increase Web site traffic and link popularity. Reciprocal links are also known as "link swaps", "link exchanges" and "link partners". This is a kind of referral program, where we refer our customers to other companies when we don’t have what they want.
Purchasing Links: Link purchasing is done when a company is not in a position to exchange links. For example, if you have a website but cannot link out to other sites for some reason, you can ask others to link to your site in return for a fee. The charge varies according to the PageRank of the linking site: the higher its PageRank, the more its owner can ask. In short, it is a monetary transaction for placing a link on a well-ranked web site.
Content Donation: Sometimes experts in the company submit articles to different web sites and can put a link on those sites in return. It is very effective in that you can put your knowledge in front of thousands of people and attract them to your site. Here the web sites exchange links for the content you provide: the host site becomes content-rich, and you get a fair amount of exposure.
Off page factors have evolved to form complex parts of search engines’ algorithms, but some of the most important factors include:
1. Building link popularity
2. Google Page Rank
3. Measuring link popularity
4. Anchor text optimization
Let's study these factors one by one.
1. Building link popularity:
Link popularity is the number of external links that point to your site. Here quantity is the only concern, not quality; we will look into quality later.
Link building requires dedication and consistent effort. It is the most difficult, time-consuming and important task an SEO performs to get a website ranked well. Certain strategies are well known to SEOs, while others are in the realm of the experts or forward-thinkers who puzzle them out. I have attempted to list here many of the link building tactics used by the SEO community.
Content Development: The best way to build long term link popularity is to offer good content and features that provide real value to your audience. As people discover your web site and realize its benefit the likelihood of them linking to your web site naturally increases. Link importance helps filter out the spammers who set up dozens of free sites and then link their bogus sites to the main Web site. Spamming is such a problem that some search engines don't count free Web sites at all in their link popularity scores.
Directories: There are several critical targets if you want to build up your link popularity without appearing to be a spammer. The most important are links from the Yahoo! and Dmoz directories. These are human-compiled and very well reputed online directories. A listing in them is very important, provided your site is listed in the correct category with a good description. Links from these two websites are seen as validation that you are the real thing.
There are several other directories besides Yahoo! and Dmoz. Many are industry- or topic-specific, so a link from such a directory also matters a lot. For example, if you run a news and entertainment site, then a link from a directory focused on news and entertainment is important.
Often other people are simply not aware of your site, which makes it difficult to get linked from niche sites. The best solution is to send a request asking them to link to your site. Explain what your site is about and the benefit their users will get from it, and make sure the message is personalized.
Another part of the web that helps build your link popularity is resource sites. Resource sites are lists of links that people put up on their own. Though it sounds like directories there are many differences between directories and resource pages. Resource pages are for a specific topic while directories contain many sub-topics. These pages are often spread amongst friends and readers who find good information available from the resource. The visitors to these pages are highly targeted and are always in search for better information. In order to reach this audience, a good PR campaign and press releases can ensure that these individuals know you exist and have a link to your website that they can easily include.
One more way to get links is from your partners. They could be suppliers or vendors for your business. Since you have already established a relationship with these people, there is a better chance that you will receive a link from them. These websites help validate your place online and establish you within a community of websites. The major advantage of this kind of linking is increased reliability: if you are linked within your community, search engines believe you are more trustworthy. The more visible you are among your community, the better your chances of ranking well.
Internal link building is as important as external link building. Search engines scan a site by following its links, so your link structure should be solid and easy for them to follow. The easier you make it for your audience to find good information, the more likely they are to link to it.
Forum participation: The final way you can work to increase your link popularity is to participate in newsletters and online forums that relate to your website. Generating links through forums is often considered spam, but forums are valuable tools for link builders as long as the proper methods are employed. You don't want to just jump in and give a plug for your URL. You must participate in the discussion as an expert or authority who gives good advice. When you sign your name at the bottom of your posting, be sure to include a signature that includes a link to your website. If these forums and newsletters are archived and remain online, search engines continue to see them and the links they contain. The forum signature of a member is certainly a fair place to promote your site, and while doing so may not provide a worthy back link in the search engines' eyes, it does offer visitors on the forum a chance to see your site. In SEO forums, people constantly plug their new directories, tools, articles, etc. These are all appropriate as long as they are subtle, relevant and not overdone.
Blogging and Comments: A web log is another very effective way of getting backlinks. Create a personal blog on your site and you can more easily leave notes or comments at other sites that link back to you. But remember that this is not for product promotion or advertisement; if you use it that way, it will be considered spamming. Try to put as much information as possible on the blog. A well-researched and well-written blog will certainly raise your reputation in the market, and only blogs of that kind generate links for a site. A well thought-out response to a relevant blog or article can bring not only search engines, but more importantly other readers of the blog, to your site through the link. If your content proves interesting, you could build much more than just a single link through your comments.
Another important thing to remember is that your link in blog comments may not count at all in the search engines. Many blogs prevent access to these comments or don't provide direct, spiderable links for blog visitors. This, combined with the fact that these comments are buried at least 2-3 links deep from the main page means that you should keep your focus on the visitors you might generate, rather than the search engine optimization you might get.
However, you should still be as SE-friendly as possible. If the subject matter deals directly with your keyword phrases, you're in fine shape; if it is on a related subject, it can't hurt to use your keyword phrases once or twice in a longer comment or entry. You should also try to make your 'name' (which typically becomes the anchor text for your link) contain at least one of your keyword phrases, but remember to keep it varied and never let it overtake the importance of getting visitors.
Unique Tools and Services: By offering specific, relevant free Internet tools/services at your site, you can generate naturally built links. It is recommended to find a tool or service that isn't currently offered or that you believe you can improve upon. For some industries, this will be much harder than others, but creativity is your best tool. Imagine what automated web service, list of resources or submission-type tool would best benefit you and then build it and offer it on your site.
The possibilities are endless, and there is little doubt that the web can automate and make easier hundreds of tasks in any industry. The key, after you have built your tool, is to promote it with industry leaders - send it to bloggers who discuss your subject matter, ask people to use and evaluate it on forums, etc. You can even try to promote it by writing an article about your findings, development or the simple existence of the tool and submitting it across the newswires.
Automated link building programs: This is a very risky way to build link popularity, as search engines are known to penalize automated link building. But it follows the principle of risk and return: if you are willing to take the risk, the returns can be substantial. Well-built and well-hidden link networks often bring higher rankings in a short amount of time. The key is to select a provider who will offer the following:
1. A vastly distributed network that is not interlinked and doesn't share any C-Block IP addresses
2. Linking pages that contain some relevant content that has been well-linked through a sitemap system
3. Pages that are as close to naturally-designed and built as possible
4. A company that can speak at an expert level about SEO and the tactics they use to avoid detection - call them and talk to them first; their knowledge and competence should be instantly apparent.
5. Someone you trust and have researched thoroughly - I DO NOT recommend posting on forums about these companies, as you could damage your chances for success. Just ask for recommendations from trusted members of the SEO community.
6. Pages that link not only to your site(s) but to other relevant, related sites as well - even if they are your composition. That way, if you get penalized, at least your competition is suffering as well.
This is one of the riskiest strategies that can be pursued in link building and I recommend it only with extreme caution. You are breaking the search engine's printed rules by "intentionally manipulating the search results" and you should have a contingency plan in the event of disaster.
Natural link building: Although undoubtedly a difficult and time consuming method, as well as one of the more unreliable ones, natural link building is what powers most of the top sites on the Internet. This method involves developing the most useful, relevant content that provides the best resources, tools, services, prices, etc. in the industry. By offering the web community the best possible site, you can gain natural links through the power of having others on the web link to you.
Sadly, this tactic is somewhat usurped by the search engines, as newer sites often fare exceptionally poorly in the search results, especially for popular terms. In order to build natural links, people must be exposed to your site. At one time natural link building was the very best method to get traffic and links, but in the current state of search it is more of a boost that helps convert some of the webmasters and bloggers who visit your site into link builders.
Building valuable content is no easy task, and building an industry-leading resource is even more challenging. However, the long-term benefits can be spectacular, assuming you have the money and time to maintain your resource without proportional income. Remember to ask yourself what someone who comes to your site might want and try to provide the absolute best service/information in your industry in the cleanest, most usable site possible.
Press release: A press release is a very effective and popular tactic for getting links to a site, provided it is a professionally written, keyword-rich article. The article generally consists of well-placed keywords and a link to the site that is its subject. It grabs a lot of attention and hence results in highly targeted traffic.
The article must be relevant and should provide good value to the visitor. It is generally recommended to hire a content writer or journalist to write it. Distribution sites are the main way to release these articles: popular sites such as 'Prleap.com', 'Prweb.com', 'Free-press-release.com' and 'pressworld.com' accept submissions. These sites offer links, serve a large variety of sites and provide enough flexibility to let an SEO perform well.
The distribution that is achieved will determine the value of the links and the article. It's also important to keep in mind that duplicate content penalties may hurt you - DO NOT re-publish the press release on your own site.
Article writing and Submission: Article writing and content development is considered a major on-page factor for SEO, but it is also one of the various ways to get links to your site. Many websites are designed around hosting the content of others and providing links back to them. These are known as 'third party article hosting' sites, and they accept articles on a wide variety of topics.
Some of the major third-party sites are IdeaMarketers.com, Buzzle.com, ebooksnbytes.com, thewhir.com and amazines.com. Many topic-specific sites are also available on the Internet, and submitting articles to them can give you high visibility and exposure.
The link building campaign can be done in three ways:
a. One way linking
b. Reciprocal Linking
c. Triangular Linking
One way linking:
When we say one way links, we mean links to your site from sites that do not receive a link back from yours. Since no one places such a link lightly, obtaining them is considered the toughest job in a link building campaign. But one way linking is the best way to improve rankings in the search engines: because other web sites are recommending your site without a link back, search engines conclude that your site is genuinely important and provides excellent value to visitors.
The biggest advantage of one way linking is that you are automatically protected from bad back links. If a site you exchange links with is not good enough, you can actually lose PageRank as well as rankings in the search engines. Some sites are very generic and link to a variety of irrelevant sites; faced with lots of irrelevant links, search engines conclude that this is just a link popularity exercise that is useless for visitors. Since there is no value added for the search engine's users, the engines in turn give no value to these links.
Long-term payoff is another advantage that one way linking offers. Since one way links provide tremendous value, they are likely to stay in place for a longer period of time, unlike reciprocal links. Website owners add these links because they add value to the user experience, whereas reciprocal links may be dropped as soon as they no longer suit the other website's linking strategy.
Check to see if the page where the link will be located can be found in the search engine results. You can search via the entire website or by individual page. Different search engines use different syntax in looking for individual pages and links; refer to the advanced search function for each search engine for details.
Google Search Example:
site:mysite.com shows the indexing of all pages listed in the website
site:mysite.com/mypage.html shows the indexing of a specific web page in the website
If the page is listed in the search engine results, this means the page has been indexed by the search engine robots. This means the web page is valid for indexing and that your link will be picked up as well.
One-way link building means hard work and long term determination to achieve good link popularity. By improving the quality of your website, you improve the chance to obtain good quality natural links. Spend a set amount of time each week to seek out quality one-way links to achieve your goal. By using this long-term game plan you will be able to safely build links for optimum link popularity success.
Reciprocal Linking:
This is the most popular practice in link building campaigns today. Reciprocal linking means two web sites agree to link to each other. It is also known as 'link swaps', 'link exchanges' and 'link partners'.
A reciprocal link is an assurance about your site. This link basically explains that the site at the other end of this link feels that my site is important enough to link to, and I feel that their site is important enough that I am willing to let visitors leave my site via this link. It involves an element of trust! Few Webmasters have the time or patience to constantly monitor the sites that link back to them, so you are trusting the other site to maintain the link on their site, and not bury it under other information or delete it during a site upgrade.
A reciprocal link is not a quick fix to bring more traffic. There are many sites with over 100 banner and text links on a single page -- how much traffic do you think the sites featured on such a page get from those links? Many of these linked sites offer the same scenario: a cobbled-together mishmash of links and graphics, many of them broken or out of date, none benefiting the relevant sites.
So getting quality links is very important in reciprocal linking. Since it works like a barter system, reciprocal links are comparatively easy to build. They help drive traffic to the web site as well as boost rankings in the search engines, and in the case of reciprocal links search engines place more emphasis on the quality of the links.
A good quality link comes from a site with a relevant theme, a good PageRank and real value for the customer. Complementary sites, rather than competitive ones, should be considered.
Before entering into reciprocal linking there is basic groundwork to do. As with any other campaign, competitive analysis is very important. Studying your competitors' sites and complementary sites, and examining their link pages, can give you valuable input on potential link partners.
The first step in setting up reciprocal linking is to find good quality, complementary sites and place a link to them on your web site. Only after you have placed the link, e-mail the owner of the site a short, friendly note, mentioning some good points about the site. If you cannot find any plus points, delete the website from your list straight away. Tell the site owner that you have linked to their site, giving them the URL of the page where your link is placed. Then ask for a link back to your site, suggesting a page where the link would be appropriate. If the other party is interested, they will reply and place the link.
To boost their Page Rank, some webmasters concentrate on getting links only from sites that have high Page Rank. If you want to try this approach, PRsearch is a useful free search tool to use. It gives you Google search results PLUS Page Rank.
You type in a key phrase and can quickly see the Page Rank of pages optimized for that phrase. You can also click on number beside the words "inbound links" and you may find more sites with high Page Rank.
Triangular Linking:
Once one way linking became almost impossible for webmasters, triangular linking was introduced. It is a technique that disguises reciprocal linking as one way linking. Triangular link trading is an attempt to fool the search engines into thinking that you haven't really engineered a reciprocal link trade. The ruse is played using two websites: your cohort links to your website A, and you link back to them from your website B. This way the search engine will not see the trade, and will instead think that you "earned" the link.
In triangular linking, the return link is placed on a different web site: I will put my link on your web site and, in exchange, put your link on another site of mine, but not the one you linked to.
With the help of this technique, webmasters began to pass reciprocal trades off as one way links. But search engines soon caught on, and they are now developing algorithms to counter the problem.
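The structural difference the engines look for can be stated simply: a reciprocal trade leaves a visible pair of links (A links to B and B links to A), while a triangular trade routes the return link through a second site, so no such pair forms. A minimal sketch of spotting direct reciprocal pairs in a list of links (all site names here are hypothetical):

```python
# Hedged sketch: detecting direct reciprocal link pairs in a list of
# directed (source, target) links. A triangular trade deliberately
# avoids creating such a pair by returning the link from a second site.

def reciprocal_pairs(links):
    """Return the set of unordered {a, b} pairs that link to each other."""
    link_set = set(links)
    return {
        frozenset((a, b))
        for (a, b) in link_set
        if (b, a) in link_set and a != b
    }

# Reciprocal trade: mysite <-> partner1 forms a detectable pair.
# Triangular trade: partner2 links to our siteA, and our siteB links
# back to partner2 -- no reciprocal pair appears in the graph.
links = [
    ("mysite.com", "partner1.com"), ("partner1.com", "mysite.com"),
    ("partner2.com", "siteA.com"), ("siteB.com", "partner2.com"),
]
print(len(reciprocal_pairs(links)))  # 1 -- only the mysite/partner1 swap
```

This also shows why detecting triangular trades is harder for the engines: it requires knowing that siteA and siteB share an owner, information that is not in the link graph itself.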
Anchor Text Optimization:
Successful search engine optimization employs many factors. However, one of the most important factors is the anchor text of inbound links. Because Google, among other search engines, puts a significant amount of weight on the anchor text of inbound links, anchor text can be a decisive factor when going after top ranking on extremely competitive search terms.
Precisely how powerful is the anchor text of inbound links? It is entirely possible to achieve top ranking for extremely competitive keywords without any hint of on-page SEO.
Monster.com currently sits in the #1 spot for the search term Jobs. The word "jobs" does not even appear on that page in text form; the only occurrence is once in the form of ALT text.
Search Google for Computers. At this time, six of the top ten pages do not contain the word Computers in the page copy. They are there due solely to anchor text of inbound links.
Search Google for Profit. At this time, 7 of the top ten pages do not contain the word Profit in the page copy. They are there due solely to anchor text of inbound links.
Search Google for Promote. At this time, 5 of the top ten pages do not contain the word Promote in the page copy. They are there due solely to anchor text of inbound links.
Search Google for Advertise. At this time, 6 of the top ten pages do not contain the word Advertise in the page copy. They are there due solely to anchor text of inbound links.
Search Google for Click. At this time, 9 of the top ten pages do not contain the word Click in the page copy. They are there due solely to anchor text of inbound links.
Search Google for webhosting. At this time, four of the top ten pages do not contain the word webhosting in the page copy. They are there due solely to anchor text of inbound links.
Search Google for hits. At this time, the top three pages do not contain the word hits in the page copy. They are there due solely to anchor text of inbound links.
Do a search for exit. All of the sites on the first page are there due solely to anchor text of inbound links.
Same thing applies when you do a search for leave, for which Yahoo! is listed in the #1 spot. Yahoo is also #1 for the search term exit, due solely to the anchor text of hundreds of mature content sites.
So how do you, as an SEO, utilize the power of anchor text? The easiest - most cost effective - way is to choose a descriptive name for your site, and by descriptively naming your folders and files.
We are currently analyzing data from market research into consumer behavior in regard to domain names. The completed analysis will be published in an article entitled Descriptive Domains for SEO. As a preview of things to come, let me say that preliminary data shows descriptive domains - keyword domains, that is - to benefit from a much higher CTR than non-descriptive domains.
Looking through the traffic logs of websites we maintain is a daily chore. Early detection of shifts in search engine ranking is a must in this business. One thing I've often noticed is search strings containing words that appear neither on the page, nor in anchor text of inbound links - except where the URL of the page has been used as anchor text. This is where we see the benefit of descriptive file naming.
Measuring link popularity
If you want to measure the link popularity of your Web site against similar ones in the same industry, there is nothing better than the online Link Popularity Check tools offered by various companies. These free online facilities allow you to input the URL of your Web site and return a precise report of the total number of inbound links as measured across all of the major search engines. You can compare your site with a personally selected set of competitors, in addition to a pre-selected list of popular destinations on the Web.
Measuring link popularity is important before attempting to submit to search engines. Track down your referral links by querying the engines directly. This method gives you an idea of how popular a search engine believes your web site is, which matters for those engines that rank sites in part by link popularity: Google, AOL Search, HotBot, iWon, MSN Search, Looksmart, Inktomi, AltaVista and Northern Light.
1. AltaVista
In order to find out the number of sites linking to the domain mysite.com in AltaVista, you would type in link:mysite.com in AltaVista's search box. If you wish to exclude links from pages within the mysite.com domain, you would type in link:mysite.com -url:mysite.com.
If you want to find out how many sites are linking to a particular page (say mypage.html) in the mysite.com domain, you would type in link:mysite.com/mypage.html. Again, in order to exclude links from pages within the mysite.com domain, you would type in link:mysite.com/mypage.html -url:mysite.com.
Note that you should not type in the "www" or the "http://" prefixes.
2. AOL
AOL is a directory based engine. It takes its results from the Open Directory. Hence, the concept of link popularity is not very meaningful in AOL.
3. Direct Hit
There is no way you can find out the link popularity of your site in Direct Hit. This is because Direct Hit does not return the number of sites which match the search criterion.
4. Excite
There is no way you can find out the link popularity of your site in Excite. This is because Excite does not have a special command to measure the link popularity of your site.
5. Fast (http://www.alltheweb.com)
In order to find out the number of web sites linking to the domain mysite.com in Fast, click on the Advanced Search link. In the "Word Filters" section of the Advanced Search page, select "Must Include" from the first combo box. In the text box besides the first combo box, type in mysite.com. In the combo box to the right of the text box, select "in the link name". If you want to exclude links from pages within the mysite.com domain, type in mysite.com in the "Exclude" text box in the "Domain Filters" section. Then, click on the "FAST Search" button.
In order to find the number of links to mypage.html in the mysite.com domain, you would type in mysite.com/mypage.html in the text box besides the first combo box. Again, in order to exclude links from within the mysite.com domain, type in mysite.com in the "Exclude" text box.
Note that you should not type in the "www" or the "http://" prefixes.
6. Google
In order to find out the number of sites linking to mysite.com in Google, you would type in link:mysite.com.
If you want to find out how many sites are linking to the page mypage.html in the mysite.com domain, you would type in link:mysite.com/mypage.html.
However, there is no way you can exclude links from pages within the mysite.com domain from being counted.
Google considers www.mysite.com to be different from mysite.com. This means that typing in link:mysite.com will not include the links to www.mysite.com. If you want to find out the number of links to www.mysite.com, you have to type in link:www.mysite.com. And typing in link:www.mysite.com will not include the links to mysite.com either.
This is in contrast to AltaVista which includes links to the www.mysite.com domain when you try to find the number of links to mysite.com.
7. HotBot
There are two methods of measuring link popularity in HotBot.
In the first case, in order to find out the number of sites linking to mysite.com, you can type in linkdomain:mysite.com. In order to exclude links from pages within the mysite.com domain, you can use linkdomain:mysite.com -domain:mysite.com.
Make sure that you do not use the "www" or "http://" prefixes when you use this method.
However, this method cannot be used to find out the number of links to specific pages in your site, i.e. you cannot use this method to find out the links to the page mypage.html in the domain mysite.com.
In order to find out the number of links to specific pages, choose "links to this URL" from the "Look for:" drop-down combo box and then type in the complete URL (i.e. http://www.mysite.com/mypage.html) in the search box. In order to exclude links from within the mysite.com domain, type http://www.mysite.com/mypage.html -domain:mysite.com in the search box after choosing "links to this URL" from the combo box. Note that for the second method, you need to use the "http://" prefix.
Lastly, you should note that in the second method, typing http://www.mysite.com will only find links to the home page of the www.mysite.com domain. If there are some sites which have linked to some of the internal pages in your site rather than your home page, this will not be included in the link popularity count.
8. IWon
The method of measuring link popularity in IWon is the same as the first method in HotBot. However, unlike HotBot, IWon does not have an alternative method to measure the number of links to specific pages in a domain.
9. Lycos
In order to measure link popularity in Lycos, first click on the Advanced Search link to the right of the search box. To find out the number of sites linking to mysite.com in Lycos, you would type in ml:mysite.com in the search box. If you wish to exclude links from pages within the mysite.com domain, you would type in ml:mysite.com -h:mysite.com.
If you want to find out how many sites are linking to a particular page (say mypage.html) in the mysite.com domain, you would type in ml:mysite.com/mypage.html. Again, in order to exclude links from pages within the mysite.com domain, you would type in ml:mysite.com/mypage.html -h:mysite.com.
Note that you should not type in the "www" or the "http://" prefixes.
10. MSN
The method of measuring link popularity in MSN is almost the same as that in HotBot. The first method is exactly the same. For the second method, click on the More Options tab, type in the complete URL in the "Search the web for:" text box and choose "links to URL" from the "Find:" drop-down combo box. However, unlike HotBot, you cannot eliminate links from pages within the same domain using the second method.
Note that the More Options tab is displayed only after you search for something in MSN. It is not displayed in MSN's home page.
11. Netscape
Netscape is a directory based engine. It takes its results from the Open Directory. If no results are found in the Open Directory, it takes its results from Google. Since it is a directory based engine, the concept of measuring link popularity is not all that meaningful. You can type in link:mysite.com to measure the number of links to the domain mysite.com. In this case, Netscape will simply take its results from Google.
12. Northern Light
There is no special command for measuring link popularity in Northern Light. To get a very approximate idea of the number of sites linking to the domain mysite.com, you can type in mysite.com. In order to eliminate the references to the mysite.com domain from within the domain, you can type in mysite.com -url:mysite.com.
To get an approximate measure of the number of links to the page mypage.html in the domain mysite.com, you can type in mysite.com/mypage.html in the search box. Again, to eliminate the references to the page from within the mysite.com domain, you would type mysite.com/mypage.html -url:mysite.com
Don't type in the "http://" or "www" prefixes.
13. Webcrawler
There is no way you can find out the link popularity of your site in Webcrawler. This is because, like Excite, Webcrawler has no special command for measuring link popularity.
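Because each engine uses its own operator syntax, it helps to keep the variants in one place. The sketch below simply assembles the query strings documented above for a given domain; the operators are as described in this chapter, and the engines may of course change or retire them at any time.

```python
# Sketch: building the link-popularity query strings documented above.
# Operators per engine (as described in the text):
#   AltaVista: link:domain, internal links excluded with -url:domain
#   Google:    link:domain (no way to exclude internal links)
#   HotBot/MSN: linkdomain:domain, excluded with -domain:domain
#   Lycos:     ml:domain, excluded with -h:domain
# None of these queries should include the "www" or "http://" prefixes.

def link_queries(domain, exclude_internal=True):
    """Return a dict mapping engine name to its link-popularity query."""
    queries = {
        "AltaVista": f"link:{domain}",
        "Google": f"link:{domain}",   # internal links cannot be excluded
        "HotBot": f"linkdomain:{domain}",
        "MSN": f"linkdomain:{domain}",
        "Lycos": f"ml:{domain}",
    }
    if exclude_internal:
        queries["AltaVista"] += f" -url:{domain}"
        queries["HotBot"] += f" -domain:{domain}"
        queries["MSN"] += f" -domain:{domain}"
        queries["Lycos"] += f" -h:{domain}"
    return queries

print(link_queries("mysite.com")["AltaVista"])
# link:mysite.com -url:mysite.com
```

Note the Google caveat from above carries over: link:mysite.com and link:www.mysite.com are counted separately there, so you may need to run both variants and compare.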
Other than search engines, there are numerous tools available in the market for measuring link popularity. But link popularity measurement has other flaws as well.
• To appear in link popularity results, the site on which the link sits must have been indexed by search engines. But search engines still index only a fraction of all the web pages available. If your link sits on this ‘invisible web’, it won’t be returned.
• Even if the site on which your link sits has been indexed, there are still problems. First, if the link was added since the site was last indexed then it won’t show up. And second, if your link is buried deep within the site, it may not appear.
• Sites with restricted access or membership only areas - often sources of rich information and links - will not be accessible to search engines.
• Many link popularity tests return internal links - links from within the site itself. This shows that the content has been well-structured and optimised for search engines, but it does distort the scores on link popularity. AltaVista, for one, overcomes this with the command - link:www.yoursite.com -url:www.yoursite.com.
• Search results can be inconsistent. This is because different searches may be carried out on different indexes. As Craig Silverstein of Google writes, "There are many reasons why one might see a difference in the estimated number of pages returned for the same query. It’s most likely the queries ... were sent to different Google datacenters. Depending on which datacenter finishes a query, the estimated number of results may vary."
Link popularity checkers are very useful as a guide and for research purposes they can be tremendous. However, using them as a metric to judge the effectiveness of your linking work should be treated with caution.
By all means use them to give you a rough idea, but build some solid metrics into each linking project you undertake.
Here are some things you should think about:
• Analyse the type and quality of the links that currently exist. Look for any information you can use to measure effectiveness.
• Identify the top 20, the top 50 or the top 100 sites that you would like to have linking to you. How many of these currently link to your site? Use that as a benchmark - so if 20 of the top 100 sites currently link to you, set yourself a target of 40.
• Check your referrer logs frequently and keep a note of referring URLs. Note which URLs drive most traffic to your site.
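Referrer-log analysis of this kind can be automated. The sketch below assumes Apache's standard "combined" log format, where the referrer is the second-to-last quoted field; the function name and sample format are illustrative, not from the original text, so adapt the parsing to your own log format:

```python
# Sketch: count referring URLs in an Apache "combined" access log.
# The referrer is the second-to-last quoted field, followed by the
# quoted user-agent at the end of each line.
import re
from collections import Counter

REFERRER_RE = re.compile(r'"([^"]*)" "[^"]*"$')  # referrer, then user-agent

def top_referrers(log_lines, n=10):
    counts = Counter()
    for line in log_lines:
        match = REFERRER_RE.search(line.strip())
        # Skip empty referrers and the "-" placeholder for direct visits
        if match and match.group(1) not in ('', '-'):
            counts[match.group(1)] += 1
    return counts.most_common(n)
```

Run weekly against your access log and the output doubles as the "note which URLs drive most traffic" benchmark suggested above.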
Measuring return on investment is a key business discipline that we all have to address. Putting some thought into what you should measure for each individual linking project will not only prove your worth, but will give you valuable insights into how you can sharpen and improve your linking strategy.
Google Page Rank
PageRank is one of the methods Google uses to determine a page’s relevance or importance. It works by counting the links, and the text of the links, pointing at a page and/or domain. PageRank is a vote, by all the other pages on the Web, about how important a page is. A link to a page counts as a vote of support; if there’s no link, there’s no support.
Google ranks pages using software called “PageRank”, developed by Google founders Larry Page and Sergey Brin. Though it is much more advanced now, the basic logic remains the same.
Quoting from the original Google paper, PageRank is defined like this:
We assume page A has pages T1...Tn, which point to it (i.e., are citations).
The parameter d is a damping factor, which can be set between 0 and 1.
We usually set d to 0.85.
Also C(A) is defined as the number of links going out of page A.
The PageRank of a page A is given as follows:
PR(A) = (1-d) + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))
Note that the PageRanks form a probability distribution over web pages, so the sum of all web pages' PageRanks will be one.
PageRank or PR(A) can be calculated using a simple iterative algorithm, and corresponds to the principal eigenvector of the normalized link matrix of the web.
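The iterative algorithm can be sketched in a few lines. This is an illustrative implementation of the formula as quoted, using the (1-d) constant term (under which PageRank values average 1 across pages rather than summing to 1); the function and variable names are my own:

```python
# Iterative PageRank, following
# PR(A) = (1-d) + d*(PR(T1)/C(T1) + ... + PR(Tn)/C(Tn)).
# 'links' maps each page to the list of pages it links out to.
def pagerank(links, d=0.85, iterations=50):
    pages = list(links)
    pr = {page: 1.0 for page in pages}            # initial guess
    for _ in range(iterations):
        new_pr = {}
        for page in pages:
            # Sum PR(T)/C(T) over every page T that links to this page;
            # C(T) is the number of links going out of T.
            incoming = sum(pr[t] / len(links[t])
                           for t in pages if page in links[t])
            new_pr[page] = (1 - d) + d * incoming
        pr = new_pr
    return pr
```

As a sanity check, two pages that link only to each other both converge to a PageRank of exactly 1, while a page with no incoming links settles at (1-d) = 0.15.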
PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page's value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But, Google looks at more than the sheer volume of votes, or links a page receives; it also analyzes the page that casts the vote. Votes cast by pages that are themselves "important" weigh more heavily and help to make other pages "important."
Important, high-quality sites receive a higher PageRank, which Google remembers each time it conducts a search. Of course, important pages mean nothing to you if they don't match your query. So, Google combines PageRank with sophisticated text-matching techniques to find pages that are both important and relevant to your search. Google goes far beyond the number of times a term appears on a page and examines all aspects of the page's content (and the content of the pages linking to it) to determine if it's a good match for your query.
Google's complex, automated methods make human tampering with our results extremely difficult. And though we do run relevant ads above and next to our results, Google does not sell placement within the results themselves (i.e., no one can buy a higher PageRank). A Google search is an easy, honest and objective way to find high-quality websites with information relevant to your search.
There are two types of PageRank numbers that Google currently will give for a specific page. One is the toolbar PageRank shown on the Google toolbar. This number varies from 0 - 10. The other type of PageRank is in the Google directory. There is no number for it, but by using the chart below - you will be able to figure it out. In the Google directory, if a site is ranked high enough, it has a green bar listed to the left of it. The Google directory is a copy of the dmoz. If you are in the dmoz, then you will be added to the Google directory automatically.
If your site is not listed in the directory and you cannot run the Google toolbar, you can get a rough estimate simply by looking at Google search results. Find the most competitive keyword your site is listed under and compare your site with other pages that are listed in the Google directory. Google displays a link to the directory if a page has a directory listing. By comparing your page with the pages listed above and below it, you can get a very good idea of its PR value.
Dynamic page optimization:
As the Internet user base started to grow, website owners began to make their sites more and more attractive and user friendly. The most important thing to keep in mind is that in a dynamic site, each webpage is not a separate file but is created when a user performs some activity.
Let’s see what exactly a dynamic site is. In a static HTML website, the content of a page doesn’t change unless you actually code the changes into your HTML file: open the file, edit the content, save the file, and upload it to the server. All search engine spiders can index static web pages. A dynamic web page, by contrast, is a template that displays specific information in response to queries. The website is connected to a database, and the response is generated from that database. Such sites are easy for the webmaster to update: since the pages are driven by the database, a change in the database is reflected on all pages. This is much simpler than with plain HTML pages, where you need to change the desired content on each and every page.
For the marketer, creating new pages or updating existing ones means either adjusting information in the database or, for the site’s visual presentation, adjusting one or a few template pages. Naturally, sites came to be built this way - but the problem started when these beautiful, content-rich sites failed to rank well in search engines.
But the problem lies in this very advantage. As we saw earlier, a dynamic page executes in response to a query. Users send queries through the site’s search function, or the queries are already coded into links on the page. But a search engine spider doesn’t know how to use your search function - or what questions to ask. Dynamic scripts often need certain information before they can return the page content: cookie data, a session id, or a query string are common requirements. Spiders usually stop indexing a dynamic site because they can’t answer the question.
Search engines only believe in content, not in the flashy elements of your website. Search engine crawlers are programmed to read text only. Crawlers strictly ignore flashy elements such as pictures, frames and video, treat them as empty space and move on. Some search engines may not even be able to locate a dynamic page easily. But if we make websites search engine friendly only, and not user friendly, we most likely end up losing visitors. This presents a big problem for marketers who have done very well with their rankings using static pages but who wish to switch to a dynamic site.
This is why SEOs came up with the advanced SEO techniques to optimize dynamic pages. Here are few methods that you can use to optimize dynamic pages.
Methods to make search engine spider Dynamic Pages:
1. Use of software – There are various software tools available in the market which will remove the "?" in the query string and replace it with "/", thereby allowing the search engine spiders to index the dynamic content.
http://www.my-online-store.com/books.asp?id=1190 will change to a URL such as http://www.my-online-store.com/books/id/1190 (the exact form depends on the tool).
The latter, being a static-looking URL, can easily be indexed by the search engine spiders.
2. Use of CGI/Perl scripts - One of the easiest ways to get your dynamic sites indexed by search engines is to use CGI/Perl scripts. Path_Info or Script_Name is a variable in a dynamic application that contains the complete URL address (including the query string information). To fix this problem, you’ll need to write a script that pulls out all the information before the query string and sets the rest of the information equal to a variable. You can then use this variable in your URL address.
Example - http://www.my-online-store.com/books.asp?id=1190
When you are using CGI/Perl scripts, the query part of the dynamic URL is assigned a variable.
So, in the above example "?id=1190" is assigned a variable, say "A". The dynamic URL http://www.my-online-store.com/books.asp?id=1190
will change to http://www.my-online-store.com/books/A through the CGI/Perl script, and the latter can easily be indexed by the search engines.
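The same idea can be sketched outside Perl as well. The snippet below (Python, with hypothetical names; the original technique is described for CGI/Perl) shows the reverse half of the trick: taking the extra path information from the URL and recovering the query variables the script actually needs:

```python
# Sketch: recover query variables from PATH_INFO-style URLs.
# A request for /books/1190 is mapped back to the script name "books"
# and the parameter id=1190 that books.asp originally expected.
def path_to_query(path_info, param_names=('id',)):
    parts = [p for p in path_info.strip('/').split('/') if p]
    script, values = parts[0], parts[1:]
    # Positional path segments are matched to parameter names in order
    return script, dict(zip(param_names, values))
```

The spider only ever sees the clean path, while the application still receives its familiar parameters.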
3. Re-configuring your web servers - (i) Apache Server - Apache has a rewrite module (mod_rewrite) that enables you to turn URLs containing query strings into URLs that search engines can index. This module however, isn't installed with Apache software by default, so you need to check with your web hosting company for installation.
(ii) Cold Fusion - You'll need to reconfigure Cold Fusion on your server so that the "?" in a query string is replaced with a '/' and pass the value to the URL.
4. Creation of a Static Page linked to an array of dynamic Pages - This approach is very effective, especially if you are the owner of a small online store selling a few products online. Just create a static page linking to all your dynamic pages. Optimize this static page for search engine rankings. Include a link title for all the product categories, place appropriate "alt" tag for the product images along with product description containing highly popular keywords relevant to your business (You can conduct keyword research for your site through http://www.wordtracker.com). Submit this static page along with all the dynamic pages in various search engines, conforming to the search engine submission guidelines.
There are a few technical aspects that need to be considered when optimizing dynamic websites.
Let’s start with .htaccess and mod_rewrite. These are the two concepts you will have to master to understand how to cloak search engine unfriendly URLs. Keep in mind that these two components are implemented on the Apache server; for the IIS server there are equivalents available, as can be seen later in this article.
So starting from the basics
An .htaccess file is just a plain text file, with one directive per line, like this:
RewriteEngine on
The "RewriteEngine" portion is the directive and "on" is a parameter that describes what "RewriteEngine" should do.
The .htaccess file usually lives in the root directory of a site and allows each site to uniquely configure how Apache delivers its content. Its directives apply to the entire site, but subdirectories can contain their own .htaccess, which applies to that subdirectory and all of its subdirectories, and so on, all the way down the tree. You could have a different .htaccess in every subdirectory and make each one behave a little differently.
Mod_rewrite is a redirect directive for the requesting object on an Apache server. A typical rule looks like this:
RewriteRule ^url1\.html$ url2.html [R=301,L]
Let’s look at this a little more closely. A rewrite setup usually begins with the directive "Options +FollowSymLinks", which instructs Apache to follow symbolic links within the site. Symbolic links are "abbreviated nicknames" for things within the site and are usually disabled by default. Since mod_rewrite relies on them, we must turn them on.
The "RewriteEngine on" directive does exactly what it says. Mod_rewrite is normally disabled by default and this directive enables the processing of subsequent mod_rewrite directive.
In this example, we have a caret at the beginning of the pattern, and a dollar sign at the end. These are regex(regular expressions in *nix) special characters called anchors. The caret tells regex to begin looking for a match with the character that immediately follows it, in this case a "u". The dollar sign anchor tells regex that this is the end of the string we want to match.
In this simple example, "url1\.html" and "^url1\.html$" are interchangeable expressions and match the same string. However, "url1\.html" matches any string containing "url1.html" anywhere in the URL ("aurl1.html", for example), while "^url1\.html$" matches only a string which is exactly "url1.html". In more complex redirects, anchors (and other special regex characters) are often essential.
Once the requested URL matches the pattern, Apache replaces it with "url2.html".
In our example, we also have "[R=301,L]". These are called flags in mod_rewrite and they’re optional parameters. "R=301" instructs Apache to return a 301 status code with the delivered page; when no code is given, as in [R,L], it defaults to 302. Unlike mod_alias, mod_rewrite can return any status code in the 300-400 range, and it REQUIRES the square brackets surrounding the flags, as in this example.
The "L" flag tells Apache that this is the last rule that it needs to process. It's not required in this simple example but, as the rules grow in complexity, it will become very useful.
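Putting the pieces described above together, a complete minimal .htaccess for this redirect might look like the following (a sketch using the placeholder filenames from the example):

```apache
# Allow symbolic links (mod_rewrite relies on them), enable the
# rewrite engine, then permanently redirect url1.html to url2.html.
Options +FollowSymLinks
RewriteEngine on
RewriteRule ^url1\.html$ url2.html [R=301,L]
```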
The Apache docs for mod_rewrite, which include further examples, are at http://httpd.apache.org/docs/mod/mod_rewrite.html
Now suppose we rename or delete url1.html and then request it again: mod_rewrite can redirect from non-existent URLs (url1.html) to existing ones. This is essentially how we cloak dynamic pages - the first URL can be the dynamic page that we want replaced by the static-looking url2.html. This, then, is how cloaking works on the Apache server. There are other methods available, but this remains the most popular and reliable.
IIS Server Redirects:
As long as one uses one of the mod_rewrite cousins for IIS (IIS Rewrite, ISAPI_Rewrite), the method is mostly the same for IIS as for Apache. However, where the rules are inserted depends on which software is being used (obviously not in httpd.conf or .htaccess). The rule generation remains pretty much the same either way.
The most used tool in this genre is ISAPI_Rewrite. For more information consult http://www.isapirewrite.com/. The site has a free download version of the code and a paid version for USD 69.
For IIS rewrite functionality, Qwerksoft is the most popular alternative (http://www.qwerksoft.com/products/iisrewrite/). Again, a basic free download or a USD 99 purchase option is available.
However, user experience suggests that the ISAPI_Rewrite product outperforms the others thanks to its ease of configuration and a bunch of other little extras. One of the biggest benefits of ISAPI_Rewrite is that you don’t have to restart IIS each time you make a change to the .ini file. In other words, once ISAPI_Rewrite is installed, you can keep the .ini file in the root folder and make changes as you go along, without a restart.
Also these products support shared hosting. So the hosting provider can be convinced to buy them & install them. Some other products in this category are as under:
http://www.pstruh.cz/help/urlrepl/library.htm (free ISAPI filter)
Also if you are using .NET platform, this works for free:
Dynamic URLs rewrites:
Dynamic pages, especially those whose URLs contain "?" or "&", are roadblocks to high search engine positioning. In a dynamic site, variables are passed in the URL and the page is generated dynamically, often from information stored in a database, as is the case with many e-commerce sites. Normal .html pages are static - they are hard-coded, their information does not change, and there are no "?" or "&" characters in the URL.
URL rewrites are programming techniques that allow the returned URL to be more search engine friendly by removing the question mark (?) and ampersand (&) from the returned URL found in the location or address bar. This enables the search engines to index the page without having variables or session id's interlaced into the URL.
Pages with dynamic URLs are present in several engines, notably Google and AltaVista, even though publicly AltaVista claims their spider does not crawl dynamic URLs. To a spider a "?" represents a sea of endless possibilities - some pages can automatically generate a potentially massive number of URLs, trapping the spider in a virtually infinite loop.
As a general rule, search engines will not properly index documents that:
• contain a "?" or "&"
• End in the following document types: .cfm, .asp, .shtml, .php, .stm, .jsp, .cgi, .pl
• Could potentially generate a large number of URLs.
In these cases, where the pages must remain dynamic, it is possible to clean up their query strings. URL rewriting generally replaces the ‘?’, ‘&’ and ‘+’ symbols in URLs with more friendly characters. Check out the following URL: http://www.yourdomain.com/shop.php?cat_id=1&item_id=2
This dynamic URL can be converted into: http://www.yourdomain.com/shoppinglist/apparels/shirts
This makes the page look static when in actual fact it is dynamic. URL rewriting needs some serious strategy and planning. There are a few tools available for URL rewriting. These are rule-based tools, and the best known are mod_rewrite for Apache and ISAPI_Rewrite for IIS. Mod_rewrite can be used to solve all sorts of URL-based problems and provides all the functions you need to manipulate URLs, but because of its complex rule-based matching engine it is hard to learn. Once you understand the basic idea, however, you can master all of its features. ISAPI_Rewrite is a powerful URL manipulation engine based on regular expressions. It acts much like Apache’s mod_rewrite, but it is designed specifically for Microsoft Internet Information Server (IIS). ISAPI_Rewrite is an ISAPI filter written in pure C/C++, so it is extremely fast, and it gives you the freedom to go beyond standard URL schemes and develop your own scheme.
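As a sketch, a mod_rewrite rule for the example above could map clean path segments back onto shop.php’s query string. Turning word slugs like /apparels/shirts into database ids would need an extra lookup table, so this illustration passes the numeric ids directly:

```apache
RewriteEngine on
# Internally rewrite /shop/1/2 to shop.php?cat_id=1&item_id=2;
# the visitor and the spider only ever see the clean URL.
RewriteRule ^shop/([0-9]+)/([0-9]+)/?$ shop.php?cat_id=$1&item_id=$2 [L]
```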
There are two types of URL rewrites, simple and advanced. Both make URLs search engine friendly, but only the advanced rewrite takes full advantage of it.
Non-URL Rewrite URL
http://www.yourdomain.com/shop.php?cat_id=1&item_id=2
The above URL indicates to the database that the returned information should come from the category with id equal to 1 and the item with id equal to 2. This works fine for the system, because it understands the variables. Many search engines, however, do not understand this form of URL.
Simple URL Rewrite
The simple URL rewrite takes the URL and modifies it so that it appears without the question mark (?) and ampersand (&). This enables all search engines to index all of your pages, but it still falls short in some important areas.
Advanced URL Rewrite
The advanced URL rewrite enables your URLs to include your keywords. This is another place search engines look for important information about your pages, and being able to include keywords in your URL helps elevate your page toward the top of the search engine result pages.
URLs can be cleaned server-side using a web server extension that implements content negotiation, such as mod_negotiation for Apache or PageXchanger for IIS. However, getting a filter that can do the content negotiation is only half of the job. The underlying URLs present in HTML or other files must have their file extensions removed in order to realize the abstraction and security benefits of content negotiation. Removing the file extensions in source code is easy enough using search and replace in a web editor like Dreamweaver MX or HomeSite. Some tools like w3compiler also are being developed to improve page preparation for negotiation and transmission. One word of assurance: don't jump to the conclusion that your files won't be named page.html anymore. Remember that, on your server, the precious extensions are safe and sound. Content negotiation only means that the extensions disappear from source code, markup, and typed URLs.
To avoid complications, consider creating static pages whenever possible, perhaps using the database to update the pages, not to generate them on the fly.
Cloaking and Doorway Pages
As search engine optimization evolved and search engines became more and more intelligent, webmasters came up with many techniques to rank their sites. Cloaking is one of those techniques. It is very difficult and time consuming to make a website both user friendly and search engine friendly, so webmasters came up with the idea of cloaking. In cloaking, the webmaster delivers one page to the search engine for indexing while serving an entirely different page to everyone else. Cloaking is the process of serving different versions of a page based on identifiable information about the user - often the agent name and/or IP address (ISP host).
There is no clear consensus on whether cloaking is ethical or unethical. But it amounts to tricking spiders, and any attempt to trick a search engine is considered spam; hence the cloaking technique is not widely practiced. A simple way to see whether a web page is using cloaking is to look at the cache. Google has a link called Cached next to almost every search result, which shows the web page as it was indexed. If the web page you see in the SERPs differs from the cached version, there is a possibility that the website is using cloaking.
As we all know, people want to make websites user centric - beautiful, attractive and interactive enough to engage visitors. Certainly this enhances the user experience, but it does not serve the optimization purpose. So, to optimize such a site, webmasters use cloaking. A few factors that lead a webmaster to consider cloaking are explained below.
Use of Flash/Splash/Videos:
The plain-HTML days are gone and Flash is in! Many sites are built using Flash, which is a total no-no for search engines. So, no plain text and no indexable Flash on the site? The solution is to create a simple HTML text version for search engines and Flash pages for visitors. Google has just recently started to index Flash pages, but the rest of the search engines do not.
Websites containing Images:
Many sites are full of pictures and images, often with image galleries, and the percentage of images is greater than that of text. Obviously there is no way these image-oriented sites will rank high on the SERPs, so cloaking is the first thing that comes to mind for optimizing these pages.
Many times there is more HTML code than actual text. This again does not suit search engine optimization: there has to be a substantial amount of text and less HTML code. In this case, rather than recoding the entire website, webmasters found cloaking to be the best option.
Now you know why; it’s time to find out how. Cloaking is done by modifying a file called .htaccess. The Apache server has a module called "mod_rewrite", and with the help of this module in the .htaccess file you can apply the cloaking technique to your web pages.
Webmasters gather search engines' IP addresses (e.g. 66.249.66.1) or user-agents (e.g. Googlebot). If the mod_rewrite module detects that an IP address or user-agent belongs to a search engine, it delivers a web page that is especially designed for search engines. If the IP doesn’t belong to any spider, it assumes a regular visitor and delivers the normal web page.
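For illustration only, a user-agent based cloak of the kind described can be sketched in .htaccess like this (the agent string and filenames are placeholders; remember that search engines treat this as spam and may ban the site):

```apache
RewriteEngine on
# Serve the spider-only page when the user-agent contains "Googlebot"
RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteRule ^index\.html$ index-spider.html [L]
```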
There are 5 types of cloaking:
• User Agent Cloaking (UA Cloaking)
• IP Agent Cloaking (IP Cloaking)
• IP and User Agent Cloaking (IPUA Cloaking).
• Referral based cloaking.
• Session based cloaking.
All five have unique applications and purposes, yet all five can fit nicely within one program.
• User Agent cloaking is good for taking care of specific agents:
WAP/WML pages for the cell phone crowd.
ActiveX for the IE crowd.
Quality CSS for the Mozilla and Opera crowd.
A nice black screen for the WebTV’ers.
Specialty content for agents (e.g. NoSmartTags, GoogleBot Noarchive).
There’s no sense in sending out stuff with JS, Java, or Flash that a user can’t actually run.
• IP Address Cloaking is good for taking care of demographic groups:
Language file generation for various countries.
Advertising delivery based on geo data.
Pages built for broadband users.
Low-impact pages for overseas users.
Time-of-day determination and custom content based on time and geo data (news, sports, weather, etc).
Specifically targeting demographic groups such as AOL, Mindspring et al.
• IP and Agent cloaking is good for a combination of the above:
Custom content for AOL users on WAP phones.
Ads based upon geo data and user agent support.
Indexability - just getting your site fully indexed can be a challenge in some environments (Flash, Shockwave).
The possibilities for targeting are almost endless. You’ll run out of ways to reroll it before you run out of IPs and agents to serve.
• Referrer based cloaking bases delivery on specific referral strings. It is good for content generation such as overriding frames (about.com, Ask Jeeves, and the Google image cache), and for preventing unwanted hotlinking to your graphics.
• Session based cloaking. Sites that use session tracking (either by IP or by cookies) can do incredible things with content. We’ve all seen session cloaking in action on dynamic sites where custom content was generated for us.
The internet has just scratched the surface here.
Cloaking is the gatekeeper that serves your site in its best light and protects your custom code from prying eyes.
Search engine cloaking is just one aspect of a much bigger picture. This is why search engines can't even consider banning cloaking. It is so widespread and pervasive, they'd have to delete 1/4th of the domains in their indexes - those would be the best sites they have listed.
Any time you hear a search engine talking about banning cloaking, listen to them very closely -- and remember. If they'd bold face lie about something so pervasive, what are they doing with the really important stuff? They can't be trusted - nor can those that are out here carrying their water.
With the assault of rogue spiders most sites are under, the growing trend of framing, and agents that threaten your hrefs (SmartTags), I think cloaking has a very bright future. The majority of the top 2000 sites on the net use some form of the above styles of cloaking (including ALL major search engines).
Just like cloaked pages, doorway pages are specially created for search engines; the difference is that these are ‘gateway’ or ‘bridge’ pages, created to rank well for particular phrases and programmed to be visible only to specific search engine spiders. They are also known as portal pages, jump pages, gateway pages and entry pages. Doorway pages are built specifically to draw search engine visitors to your website; they are standalone pages designed only to act as doorways to your site. Doorway pages are a very bad idea for several reasons, though many SEO firms use them routinely.
Doorway pages have acquired something of a bad reputation due to the frequent use (and abuse) of doorways in spamming the search engines. The most flagrant abuses include mass production of machine-generated pages with only minor variations, sometimes using re-direction or cloaking so the visitor does not see the actual page requested. Doorways used in this manner add to the clutter that search engines and Web searchers must contend with.
The purpose behind building doorway pages is simply to trick search engines into higher rankings, so doorway pages are considered an unethical SEO practice. The fact is that doorway pages don’t do a very good job of generating traffic, even when they are created by "experts". Many users simply hit their back buttons when presented with a doorway page. Still, many SEO firms count those first visits and report them to their clients as successes - but very few of these visitors go on to visit the product pages.
There are various ways to deliver doorway pages. Let’s check them one by one.
Low Tech Delivery:
When webmasters create and submit a page targeted toward a particular phrase, it is called low tech delivery. Sometimes webmasters create pages for specific search engines as well. The problem is that the user doesn’t arrive at the desired page, and a visitor who lands on a non-informative page most likely won’t navigate any further.
In such cases the meta refresh tag plays a very vital role. It is an HTML tag which automatically refreshes or redirects the page after a defined time; the tag used here has a zero-second delay, so the user most likely won’t see the optimized content before being sent elsewhere. These META tags are also a red flag to search engines that something may be wrong with the page. Because jump pages manipulate results and clutter indexes with redundant text, they are banned by search engines.
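A zero-second meta refresh of the kind described looks like this (the destination URL is a placeholder):

```html
<meta http-equiv="refresh" content="0;url=http://www.example.com/real-page.html">
```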
Nowadays search engines don’t accept meta refresh tags. To get around that, some webmasters submit a page, then swap it on the server with the "real" page once a position has been achieved.
This is "code-swapping," which is also sometimes done to keep others from learning exactly how the page ranked well. It's also called "bait-and-switch." The downside is that a search engine may revisit at any time, and if it indexes the "real" page, the position may drop.
But there is another problem with these pages. As they are targeted at key phrases, they can be very generic in nature, so the pages can easily be copied and used on other sites - and since they are copied, the fear of being banned is always there.
The next step up is to deliver a doorway page that only the search engine sees. Each search engine reports an "agent" name, just as each browser reports a name. An agent is a browser, or any other piece of software that can approach web servers and browse their content - for example, Microsoft Internet Explorer, Netscape, or a search engine spider.
The advantage to agent name delivery is that you can send the search engine to a tailored page yet direct users to the actual content you want them to see. This eliminates the entire "bridge" problem altogether. It also has the added benefit of "cloaking" your code from prying eyes.
But still the problem is there. Someone can telnet to your web server and report their agent name as being from a particular search engine. Then they see exactly what you are delivering. Additionally, some search engines may not always report the exact same agent name, specifically to help keep people honest.
IP Delivery / Page Cloaking:
Time for one more step up. Instead of delivering by agent name, you can also deliver pages to the search engines by IP address, assuming you've compiled a list of them and maintain it. IP delivery is a technique to present different contents depending on the IP address of the client.
Everyone and everything that accesses a site reports an IP address, which is often resolved into a host name. For example, I might come into a site while connected to AOL, which in turn reports an IP of 220.127.116.11. The web server may resolve the IP address into an address: ww-tb03.proxy.aol.com, for example.
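A minimal sketch of IP delivery, assuming you have compiled a list of spider networks, could look like this. The network range below is an assumption made for the example, not a verified spider range.

```python
import ipaddress

# Hypothetical list of networks believed to belong to search engine
# spiders; a real list would be large and constantly maintained.
SPIDER_NETWORKS = [ipaddress.ip_network("66.249.64.0/19")]  # assumed range

def page_for_ip(client_ip: str) -> str:
    """Serve different content depending on the client's IP address."""
    addr = ipaddress.ip_address(client_ip)
    if any(addr in net for net in SPIDER_NETWORKS):
        return "optimized.html"
    return "index.html"
```

Unlike agent-name delivery, this cannot be spoofed simply by changing a reported agent string, which is why the text treats it as the next step up.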
Search Engine Spam:
Search engine spamming is the unethical practice of optimizing a site to rank high on SERPs. Spamming is used to trick search engines into higher rankings through tactics such as repetitive keywords, hidden text and links, etc. All the major search engines penalize websites that use spam. Since time immemorial --or at least since the Internet first began-- webmasters have been using these stratagems to dupe search engines into giving irrelevant pages high search engine placement.
Each search engine's objective is to produce the most relevant results to its visitors. Producing the most relevant results for any particular search query is the determining factor of being a popular search engine. Every search engine measures relevancy according to its own algorithm, thereby producing a different set of results. Search engine spam occurs if anybody tries to artificially influence a search engine's basis of calculating relevancy.
Each of the major search engines provide specific guidelines describing what webmasters should and should not do to their web pages in order to achieve a better search engine ranking, though that has not always been the case.
A number of tactics are considered search engine spam. These techniques include:
• Keywords unrelated to site
• Keyword stuffing
• Mirror/duplicate content
• Tiny Text
• Doorway pages
• Link Farms
• Keyword stacking
• Hidden text
• Domain Spam
• Hidden links
• Page Swapping (bait &switch)
• Typo spam and cyber squatting
Not to be confused with the canned, processed meat, spam is the use of redundant or unethical techniques to improve search engine placement. Fortunately or unfortunately --depending on your point of view-- search engines are quickly catching on. Some won't index pages believed to contain spam; others will still index, but will rank the pages lower, while others still will ban a site altogether. Of course, not all search engines take a hard line on spam. Tricks that are perfectly acceptable on one search engine may be considered spam by another.
Invisible Text: Hiding keywords by using the same color font and background is one of the oldest tricks in the spammers' book. These days, it's also one of the most easily detected by search engines.
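As a toy illustration of why this trick is easy to detect, the following sketch flags elements whose inline font color equals the page background color. Real detection would also have to handle CSS, near-matching shades and layered elements; this only handles old-style inline attributes.

```python
import re

def has_invisible_text(html: str) -> bool:
    """Flag font tags whose color matches the body background color."""
    body = re.search(r'<body[^>]*bgcolor=["\'](#?\w+)["\']', html, re.IGNORECASE)
    if not body:
        return False
    bg = body.group(1).lower()
    for color in re.findall(r'<font[^>]*color=["\'](#?\w+)["\']', html,
                            re.IGNORECASE):
        if color.lower() == bg:
            return True
    return False

page = ('<body bgcolor="#ffffff">'
        '<font color="#ffffff">hidden keywords</font></body>')
print(has_invisible_text(page))  # True
```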
Keyword Stuffing: Repeating keywords over and over again, usually at the bottom of the page (tailing) in tiny font or within meta tags or other hidden tags.
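Keyword stuffing is usually reasoned about in terms of keyword density: how large a fraction of the page's words a single term accounts for. The sketch below computes that ratio; any threshold you would compare it against (5% is often quoted) is an arbitrary illustration, not a real engine's cutoff.

```python
def keyword_density(text: str, keyword: str) -> float:
    """Fraction of the words in `text` that are exactly `keyword`."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

stuffed = "cheap flights cheap flights cheap flights book cheap flights now"
print(round(keyword_density(stuffed, "cheap"), 2))  # 0.4
```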
Unrelated Keywords: Never use popular keywords that do not apply to your site's content. You might be able to trick a few people searching for such words into clicking on your link, but they will quickly leave your site when they see you have no information on the topic they were originally searching for. If you have a site about Medical Science and your keywords include "Shahrukh Khan" and "Britney Spears", those would be considered unrelated keywords.
Hidden Tags: The use of keywords in hidden HTML tags like comment tags, style tags, http-equiv tags, hidden value tags, alt tags, font tags, author tags, option tags, noframes tags (on sites not using frames).
Duplicate Sites: Content duplication is also considered search engine spamming. Sometimes people copy the content and simply name the site differently, but search engines can detect this easily and mark it as spam. Don't duplicate a web page or doorway page, give the copies different names, and submit them all. Mirror pages are regarded as spam by all search engines and directories.
Link Farms: Link farm is a network of pages on one or more Web sites, heavily cross-linked with each other, with the sole intention of improving the search engine ranking of those pages and sites.
Many search engines consider the use of link farms or reciprocal link generators as spam. Several search engines are known to kick out sites that participate in any link exchange program that artificially boosts link popularity.
Links can be used to deliver both types of search engine spam, i.e. both content spam and meta spam.
Link content spam
When a link exists on a page A to page B only to affect the hub component of page A or the authority component of page B, that is an example of content spam on page A. Page B is not spamming at all. Page A should receive a spam penalty. Without further evidence, page B should not receive a penalty.
Link meta spam
When the anchor text or title text of a link either mis-describes the link target, or describes the link target using incoherent language, that is an example of link meta spam.
Repetitive Submitting: Each search engine has its own limits on how many pages can be submitted and how often. Do not submit the same page more than once a month to the same search engine, and don't submit too many pages each day. Never submit doorways to directories.
Redirects: Do not list sites using URL redirects. These include welcome.to, i.am, go.to, and others. The complete site should be hosted on the same domain as the entry page. An exception may be made for sites that include a remotely hosted chat or message board as long as the bulk of the site is hosted on its own domain. Redirection was not originally developed for spam, but it has become a popular spamming technique.
Tiny Text and Alt Text Spamming: Tiny text consists of placing keywords and phrases in the tiniest text imaginable all over your site. Most people can't see them, but spiders can. Alt text spamming is stuffing the alt text tags (for images) with unrelated keywords or phrases.
Doorway Pages: Doorways are pages optimized only for search engine spiders in order to attract more spiders, thus more users. Usually optimized for just one word or phrase and only meant for spiders, not users.
Content Spam: This is possible because different URLs can deliver the same content (content duplication) and the same URL can deliver different content. Both HTML and HTTP support this, and hence spamming is possible. For example, IMG support and ALT text within HTML mean that image-enabled visitors to a URL will see different content from those visitors who, for various reasons, cannot view images. Whether the ability to deliver spam results in the delivery of spam is largely a matter of knowledge and ethics.
Agent-based Spam: Agent-based delivery is not spam in itself. It becomes spam when agent-based delivery is used to identify search engine robots by user agent and deliver unique content to those robots. Since that content is created only for search engines and is never visible to users, it is always spam.
IP Spam: Identifying search engine robots by IP name or address and delivering unique content to those robots is also considered spamming. As with agent-based spam, the technique is spam when you deliver unique content only to search engines and not to users or visitors.
No Content: If a site does not contain any unique and relevant content to offer visitors, search engines can consider this spam. On that note, illegal content, duplicate content and sites consisting largely of affiliate links are also considered to be of low value to search engine relevancy.
Meta Spam: Meta data is data that describes a resource. Meta spam is data that mis-describes a resource or describes a resource incoherently in order to manipulate a search engine's relevancy calculations.
Think again about the ALT tag. Not only does it provide content for a HTML resource, it also provides a description of an image resource. In this description capacity, to mis-describe an image or to describe it incoherently is meta-spam. Perhaps the best examples of meta spam at present can be found in the head section of HTML pages. Remember, though, it’s only spam if it is done purely for search engine relevancy gain.
Meta spam is more abstract than content spam. Rather than discuss it in abstract terms, we will take some examples from HTML and XML/RDF in order to illustrate meta spam and where it differs from and crosses with content spam.
Generally, anything within the head section of an HTML document, or anything within the body section that describes another resource, can be subverted to deliver meta spam.
To make sure that you are not spamming, you need to check a few things. First and foremost, you should know whether your content is really valuable to your customers and visitors. A trick to attract more visitors will not help you even in the short term. Try to build websites according to users' tastes and preferences. Always remember that Internet users are information seekers and they want fresh content all the time. So think and build a site as if there were no search engines. Avoid automated pages; Google and many other search engines do not index auto-generated pages.
Inktomi does accept information pages into their free index and into their paid inclusion programs. For example, if a site contains PDF documents, and you create an information page in HTML with an abstract of each PDF document, that HTML page is acceptable to Inktomi.
How to report Search Engine Spam:
Since spamming practices are constantly evolving, it is important to know what the major search engines specifically say about spam and what practices are definitely not allowed if you would like to rank in top-tier search engines. Plus, every ethical SEO should know how to properly report any spam that they see so the search engines can correct their algorithm accordingly.
How Google Defines Spam
As part of their Webmaster Guidelines, Google outlines techniques to use to help Google locate, index and rank your website. They also specifically state that the following techniques may lead them to remove your site from the Google index:
• Hidden text or hidden links.
• Cloaking or sneaky redirects.
• Automated queries to Google.
• Pages loaded with irrelevant keywords.
• Multiple pages, subdomains, or domains with substantially duplicate content.
• "Doorway" pages created just for search engines, or other "cookie cutter" approaches such as affiliate programs with little or no original content.
However you should keep in mind that these aren't the only practices that Google disapproves of. Generally, Google doesn't like their results manipulated by deceptive practices. Their recommendation for webmasters is:
Webmasters who spend their energies upholding the spirit of the basic principles listed above will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit.
To combat common search engine spam practices employed by rogue SEOs, Google has also posted a list of practices that should raise a red flag when you are looking for a search engine optimizer. According to Google, feel free to walk away from an SEO who:
• owns shadow domains
• puts links to their other clients on doorway pages
• offers to sell keywords in the address bar
• doesn't distinguish between actual search results and ads that appear in search results
• guarantees ranking, but only on obscure, long keyword phrases you would get anyway
• operates with multiple aliases or falsified WHOIS info
• gets traffic from "fake" search engines, spyware, or scumware
• has had domains removed from Google's index or is not itself listed in Google
How to Report Spam to Google
Google has a form that allows you to report spam to Google or you can e-mail Google at firstname.lastname@example.org. Note that Google rarely manually removes websites from the engine. Instead, it tweaks the search engine algorithm and spam detection software to try and eliminate the spam technique that is clogging up the engines.
How Yahoo! Defines Spam
NOTE: Altavista, All the Web and Inktomi are all owned by Yahoo!, so the Yahoo! spam policies and webmaster guidelines also apply to these search engines.
According to Yahoo!, search engine spam is webpages “that are considered unwanted and appear in search results with the intent to deceive or attract clicks, with little regard for relevance or overall quality of the user experience.” Officially, Yahoo! does not want to index sites with:
• Text that is hidden from the user
• Misuse of competitor names/products
• Pages that have substantially the same content as other pages
• Multiple sites offering the same content
• Pages in great quantity, which are automatically generated or of little value
• Pages dedicated to redirecting the user to another page
• Pages that give the search engine different content than what the end-user sees
• Pages built primarily for search engines
• Pages that use excessive pop-ups, interfering with user navigation
• Pages that use methods to artificially inflate search engine ranking
• Sites with numerous, unnecessary virtual hostnames
• Excessive cross-linking with sites to inflate a site's apparent popularity
• Pages that harm the accuracy, diversity, or relevance of search results
• Pages that seem deceptive, fraudulent, or provide a poor user experience
How to Report Spam to Yahoo!
If you find a site that is spamming in Yahoo!, you can report the spam through a form on their website.
NOTE: In addition to reporting spam, you can also report copyright violations to Yahoo!. To request that they remove any content published in violation of copyright protection, e-mail them at email@example.com.
How Teoma / Ask Jeeves Defines Spam
One of the most definitive sources of the Teoma / Ask Jeeves spam policy is on their Site Submission Terms page. Among the techniques that will keep you from being ranked are:
• Having deceptive text
• Having duplicate content
• Having metadata that does not accurately describe the content of a web page
• Including off-topic or excessive keywords
• Fabricating pages to lead users to other web pages
• Showing different content than the spidered pages to users
• Using intentionally misleading links
• Using self-referencing link patterns
• Misusing affiliate or referral programs
How to Report Spam to Teoma / Ask Jeeves
To report search engine spam to Ask Jeeves or Teoma, e-mail them at firstname.lastname@example.org
How MSN Defines Spam
MSN Search has recently added content guidelines to their website, explicitly stating that the MSNBot will see the following techniques as search engine spam:
• Stuffing pages with irrelevant keywords in order to increase a page’s keyword density, including ALT tag stuffing.
• Using hidden text or links.
• Using techniques such as creating link farms to artificially increase the number of links to your page.
Also, in an e-mail announcing the second preview release of the new MSN search, Microsoft mentioned cloaking and having duplicate content on multiple domains as things that will lead to your site being penalized or removed from the MSN Search index.
How to Report Spam to MSN
To report search engine spam to MSN, use the form on their website.
Have you seen any search engine spam lately? Instead of submitting spam reports to each engine, you can also simply submit a spam report through SEOToolSetTM.
Even those who are spamming right now and think they are getting away with it, should keep one thing in mind, when competitors check out your site (and they do), they will see it is spam and they may choose to report you. Once you have been reported to a search engine, you are likely to be penalized in search engine results for using your spam technique.
HTML frames allow authors to present documents in multiple views, which may be independent windows or subwindows. Multiple views offer designers a way to keep certain information visible, while other views are scrolled or replaced. For example, within the same window, one frame might display a static banner, a second a navigation menu, and a third the main document that can be scrolled through or replaced by navigating in the second frame.
The layout of a framed page could be as follows: a framed page is typically made up of four separate documents, a frameset page and three content pages. The frameset page tells the browser how big each frame should be, where the frames should be placed and which pages should be loaded into each frame. If the browser or search engine can't display frames or is configured not to, it will render the contents of the NOFRAMES element.
The homepage or index page of a framed site is the document which contains the frameset, and such a page contains very little in the way of content for the search engines to read and index. What is needed is for more information to be added to the NOFRAMES element.
The best way of achieving this is to add a complete web page within the NOFRAMES tag including appropriate keyword rich headings and text. A navigation menu should also be included to provide links to all internal areas of your website. This will allow the search engines to index all areas of your website and improve accessibility for those using a browser or alternate device that does not support frames or has frames support disabled.
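To see what this means in practice, the sketch below extracts what a frames-incapable spider effectively gets to index: only the NOFRAMES content of the frameset page. The toy frameset and its file names are invented for the example.

```python
import re

# A toy frameset page whose NOFRAMES element contains a full keyword-rich
# page with navigation links, as recommended above.
frameset_page = """
<html><head><title>Example Framed Site</title></head>
<frameset cols="20%,80%">
  <frame src="menu.html"><frame src="main.html">
  <noframes><body><h1>Widgets from Example Co.</h1>
  <p>Keyword-rich description with <a href="products.html">links</a>.</p>
  </body></noframes>
</frameset></html>
"""

def noframes_content(html: str) -> str:
    """Return the markup a frames-blind agent sees, or '' if none."""
    match = re.search(r"<noframes>(.*?)</noframes>", html,
                      re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else ""

print(noframes_content(frameset_page))
```

If the NOFRAMES element were missing or held only a "your browser does not support frames" notice, that notice is all the spider would have to index.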
Placing nothing but a long list of keywords will not help your search engine position and may even be harmful.
Every web page has a unique makeup and location, which is easily definable, except with frames. Frames are multiple pages loaded into the same page, and while they can make site navigation simple, they do not show the current page's address. If you have an interesting article deep within your site, using frames makes it hard for me to link to it. If you force me to link to your home page, then I am probably not going to link to you.
You can get around frames by having a site map from the home page that links to all the framed pages, but even if these pages list high they will probably lack good navigation since the framework that contained it is not located with it in the search results.
There is an HTML tag called the NOFRAMES tag, which, when used properly, gives the search engine spiders the information they need to index your page correctly. I believe it was designed to give frames-incapable browsers — early versions of browsers that cannot read or interpret the FRAMESET tags — the ability to "see" the information on a framed site.
Unfortunately, too many sites that utilize this NOFRAMES tag put the following words into it: "You are using a browser that does not support frames. Update your browser now to view this page." It might as well say, "We are putting the kiss of death on our Web site and have no interest in being found in the search engines for relevant keywords regarding our site! Thanks for not visiting our site because you couldn't find it!"
What happens when you do the above is that the engines will read your TITLE and META tags (if you even included them) and the above information that the browser is frames-incapable, and that is what they will index for your site.
Try a search at AltaVista for the following: "does not support frames" and guess what? 260,882 pages are found! Nearly all of them are framed sites that used those words in their NOFRAMES tag. I bet that the circular-saw maker whose site is ranked number 1 for those keywords doesn't have a clue that he has put the kiss of death on his Web site! I also bet his site is nowhere to be found under the keyword "circular saws." (It isn't.)
If you want to have a framed site for whatever reason, then for goodness' sake, use your NOFRAMES tag properly! The proper usage of this tag is to take the complete HTML code from your inner page and copy it into the NOFRAMES tag.
The above information takes care of your front page. However, there are other issues having to do with getting the rest of your pages indexed properly when you use a framed site.
Most Web designers use frames for ease of navigation. That is, they have a left-hand frame with a static navigational bar or buttons that never change. When someone clicks on a button on the left, the frame to the right brings up the new page accordingly. Because of this type of design, there are usually no navigational links on any of the inner, framed pages.
Why is this bad? It's bad because you could (and should) optimize these inner pages to rank high in the search engines. But if you do, and someone searching in the engines finds them, they will be what I call orphaned pages.
I'm sure you've come across these at one time or another in your searches: a page that has a bit of information about what you were searching for but offers no way to get to the rest of the site!
Savvy Internet users might look at the URL and try finding the root directory, but most users don't have a clue about doing that. It's too bad for the site owner, who just lost some potential eyeballs — or worse, a potential customer.
If you use a framed design, it is absolutely imperative to place navigational links on all your inner pages. At the very least, include a button that links back to your home page. However, I would recommend that you have links to all your major category pages, as this will help the search engine spiders visit all the pages, index them all, and rank them high!
Pay Per Click Campaigns
Pay Per Click (PPC) campaigns are advertising campaigns that run on search engine result pages. Also known as search engine advertising, payment is based on the number of times the ad is ‘clicked’ from the results list. Here you bid the amount of money you are willing to pay per click for each visitor the PPC engine sends to your Web site. The bidder with the greater amount receives the top position in the list. It allows you to choose the keywords you would like your ad to appear for when a search is performed.
PPC campaigns are positioned at the right-hand side of the SERPs. They are two to three lines of keyword-rich text advertising linked to the page where the advertiser wants visitors to land. PPC is another mode of online advertising where the media is sold only on clicks, not on impressions or actions taken.
The most famous of these is Paid Placement. It gives you a ranking on the SERP depending upon your bidding amount. You don't pay to list your site; you only pay for clicks or click-throughs. Your site appears based on search results generated from selected keywords that refer to your product or service. For each keyword you determine how much you are willing to spend on a per-click basis.
Many search engines run PPC campaigns, but the major players are Google, Yahoo, MSN and AltaVista. Today, the major pay-per-click advertising programs are offered by Google and Yahoo. Google's program, called AdWords, delivers targeted pay-per-click ads via its own search engine and a host of partner sites. Yahoo owns Overture, which has a selection of specialized PPC advertising programs. These PPC advertising programs offer the best-quality traffic for most website marketing campaigns.
Small search engines like FindWhat and Sprinks offer direct paid placements, where the link position in the search engine is directly proportional to your bidding amount. The logic is simple: the more you bid, the better position you will get, and effectively the more click-throughs you will receive. Google AdWords is different: it factors in user opinion on how relevant and well-targeted the ads for each keyword are by taking the CTR (click-through rate) as a percentage and multiplying it by your CPC (cost-per-click) to determine the rank of the results. The popular portals and search engines usually combine PPC listings with unpaid listings to ensure any given search produces sufficient paid and non-paid results. Typically, non-paid listings are provided by algorithmic search engines such as Inktomi, Google, FAST, AltaVista or Teoma.
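The difference between the two ranking rules can be shown with a toy example: pure bid order versus the CTR × CPC rule described above. All the numbers are invented.

```python
# Three hypothetical advertisers bidding on the same keyword.
ads = [
    {"advertiser": "A", "cpc": 1.00, "ctr": 0.010},
    {"advertiser": "B", "cpc": 0.50, "ctr": 0.030},
    {"advertiser": "C", "cpc": 0.75, "ctr": 0.015},
]

# Direct paid placement: highest bid wins.
by_bid = sorted(ads, key=lambda a: a["cpc"], reverse=True)
# AdWords-style rank: bid weighted by click-through rate.
by_rank = sorted(ads, key=lambda a: a["cpc"] * a["ctr"], reverse=True)

print([a["advertiser"] for a in by_bid])   # ['A', 'C', 'B']
print([a["advertiser"] for a in by_rank])  # ['B', 'C', 'A']
```

Note how B, the lowest bidder, tops the CTR-weighted ranking because users click its ad three times as often.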
PPC is a quick and cost-effective advertising program. With a small investment you can get targeted traffic to your website. Search engine optimization of a site generally takes 5-6 months, but a PPC campaign can be set up in minutes and can put your ad at the top of the SERP. So PPC can be executed during the SEO process.
While setting up any PPC campaign, bidding on the right keywords is the most important consideration. The ad must be attractive and persuasive, and the linked page must be informative enough. The bidding price varies from keyword to keyword: bids on some keywords might touch sky-high levels while others might be very low. So bidding on a particular keyword depends on your perception and experience. Therefore, always ask SEO professionals to plan and manage your PPC campaigns.
Setting up PPC Campaign
It’s not easy to set up an effective PPC campaign. There are certain things that you need to consider while setting one up. The first and major aspect is keyword selection.
Before going any further, a basic task has to be performed: market analysis. Market analysis includes competition analysis. Figure out what your competitors are doing, what keywords they are targeting and what their strategy is. Decide on your campaign objective; it is very difficult to determine whether a campaign is successful if the objective is not clear. Decide whether you want more exposure, traffic, clicks or conversions.
Just like SEO planning, keyword selection is the first step in setting up a PPC campaign. The main thing to keep in mind here is that you should select keywords that directly relate to your product or service, unlike SEO where you tend to use a semantic approach. While selecting keywords, try to use qualifiers for more precise targeting. Though many keyword-generation tools are available, you should be clear in your mind about the nature of the keywords; selecting persuasive words is the key to effective keyword selection. Deliberate upon the kind of keywords customers would use to search for your or your client's product or service. Google, Overture and Wordtracker offer the most famous tools for keyword generation.
Campaign tracking is also done on keywords and is considered the basic tracking system in a PPC campaign. To make the campaign successful you must know your per-click results. You should know the performance of each and every keyword, i.e. on which keywords your ad received the maximum number of clicks and conversions. This is important because a small number of keywords might be generating most of the results, in which case investment in the other keywords is in vain.
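Per-keyword tracking of this kind can be sketched as a simple aggregation of click records into clicks, spend and conversions per keyword. The click data below is invented for illustration.

```python
# Each record: (keyword, cost of the click, whether it converted).
clicks = [
    ("mobile phone retailers india", 0.40, True),
    ("mobile phone retailers india", 0.40, False),
    ("mobile phones", 0.25, False),
    ("mobile phones", 0.25, False),
]

report = {}
for keyword, cost, converted in clicks:
    stats = report.setdefault(
        keyword, {"clicks": 0, "spend": 0.0, "conversions": 0})
    stats["clicks"] += 1
    stats["spend"] += cost
    stats["conversions"] += int(converted)

for keyword, s in report.items():
    print(keyword, s["clicks"], round(s["spend"], 2), s["conversions"])
```

Here the specific phrase converts while the generic one only burns budget, which is exactly the pattern the per-keyword report is meant to expose.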
Keywords selected for the campaign should be specific. Unlike SEO where similar keywords help you in greater rankings, in PPC you can actually face a big loss if keywords are not specific. For example, if you select ‘mobile phones’, you may get a number of unqualified clicks. Therefore use terms like ‘mobile phone manufacturer’ or ‘mobile phone retailers India’ etc. to get targeted and useful traffic.
The best way to select keywords is to understand the product or service you are offering. List all the products or services you offer and prepare the list yourself rather than depending on the client. Also look at your core competencies and the points that differentiate you from your competitors. Keyword stemming involves interchanging singular and plural forms of a keyword or deriving a verb from the gerund form (the "-ing" word). For example, if "educate" was part of a keyword phrase, "educated", "educates", "education" and "educating" should also be considered.
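A naive stemming expansion like the "educate" example above can be sketched as follows. Real stemmers (Porter stemming, for instance) are far more careful; this merely illustrates generating suffix variants of a base term.

```python
def stem_variants(base: str) -> list[str]:
    """Expand a base keyword with common past/plural/gerund variants."""
    # Drop a trailing 'e' before adding '-ing' ("educate" -> "educating").
    root = base[:-1] if base.endswith("e") else base
    return [base,
            base + "d" if base.endswith("e") else base + "ed",
            base + "s",
            root + "ing"]

print(stem_variants("educate"))  # ['educate', 'educated', 'educates', 'educating']
```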
Location qualifiers are highly effective in PPC campaign. Location keywords are less expensive and produce tremendous results as they attract highly targeted traffic.
After keyword selection, bidding on those keywords becomes highly important. The crux of a PPC campaign lies in bidding strategies. Though we will go into details later, this paragraph gives you an overall idea of bidding.
The placement of your ad depends on the amount you offer to search engines for those particular keywords. The higher the bid, the higher the position.
There are many bidding tools available in the market, such as Atlas OnePoint, BidRank or PPC Pro. These tools monitor the campaign throughout and adjust your bids, so they are helpful in huge campaigns but can be less effective for smaller ones. Constant monitoring and campaign testing are also very important: ideally a PPC campaign should be monitored four times a day, and the process of monitoring and optimizing should go on for at least a month. Targeting is also important. It includes site, geo and duration targeting. Check where you want to display your ads, and whether your campaign is region-specific, audience-specific or time-specific. For example, if your target audience is professionals, then display ads only in office hours and don't publish them on weekends, holidays and nights.
Rather than struggling for the top position, figure out keyword performance.
Even if the top-position advert attracts more click-throughs, it is not necessarily the case that its conversion rate is also higher. A lower position might attract less traffic, but its conversion rate can be high. Therefore keyword-performance tracking and study of lead-conversion metrics can increase your ROI.
Writing the advertisement is immensely important, as visitors will click only after reading the ad text. Never rely on one advert. Make at least 3-4 adverts, use them at different times on different search engines, and observe their effectiveness. The advertising copy is divided into two parts: the ‘Title’ and the ‘Ad Text’. The title should be keyword-rich; if it is, visitors are more likely to click your ad.
Ad copy should be created in such a way that your ad stands out among the competitors. Write what the consumer wants from you. If they are looking for holiday packages, then write something that mentions both hotels and transportation. Similarly, if they want to know about jewelry, include buying information to help them. This will certainly differentiate you from your competitors. The general tendency when writing ad copy is to offer discounts, cheap deals, free gifts etc., but remember the core function of the Internet: information. So try to include informative keywords to attract more click-throughs.
As mentioned earlier, trying different permutations and combinations of keywords helps to receive greater click-throughs. Interchanging words in the title and text makes a huge difference. For Google AdWords, try split-testing two different titles and descriptions and add a unique tracking code to each one so you can identify which one produces the best sales-conversion increase. Even a single word change in your ad can create a significant jump in your sales or lead conversion rate.
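The split test described above amounts to comparing conversion rates between two ad variants. The click and conversion counts below are invented; a real test should also check statistical significance before declaring a winner.

```python
def conversion_rate(conversions: int, clicks: int) -> float:
    """Conversions per click; 0.0 when there are no clicks yet."""
    return conversions / clicks if clicks else 0.0

variant_a = conversion_rate(12, 400)   # title/description variant A
variant_b = conversion_rate(21, 420)   # title/description variant B

winner = "A" if variant_a > variant_b else "B"
print(winner)  # B
```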
Mention clearly what you want from visitors after they click on the ad. This will give you immensely targeted traffic. If you want them to play a game on the website, then mention that clearly. By this means you can filter the traffic and avoid those who don't want to play games. If registration is needed, mention that as well.
4. Develop a proper landing page:
The PPC advert is linked to the web page of the product or service it is promoting; it could be any page, according to the campaign. Once visitors click on the ad, they land on the linked page, and that is the point of purchase. Hence the landing page is more important than anything.
Landing pages are simply web pages designed specifically for a keyword or related group of keywords. They are highly relevant to the keyword searched and consistent with the ad’s claim. They immediately focus a visitor’s attention to a primary call-to-action (most wanted response). In essence – landing pages ask your visitors to take an action.
Yet many campaigns are still linked to home pages, which should be strictly avoided. Home pages are meant for a wider audience, not a highly targeted one; they direct all visitors to their desired locations. But when visitors click on a PPC ad, they expect to land on a particular page. They relate the keywords in the ad to their needs and feel that you might be providing the solution. When they land on the home page instead, it is very unlikely that they will navigate further or check out the entire site. Hence the page linked to the ad must be relevant to the ad. Relevancy and consistency are essential for an effective pay-per-click marketing program. If you engage customers and deliver what was promised in the advert, you will convert clicks into customers.
To motivate visitors, the design of the page should be attractive. It must be easy to navigate and should contain the required information. You don't need Flash and animation to coax your visitors into buying your product; instead focus on simple, clean navigation that allows visitors to find what they want in the shortest time possible.
You also need to use trial and error with landing pages. Link the advert to 3-4 different landing pages and evaluate each page's performance. Check which page produces more conversions and why. The main thing to keep in mind is to make things simpler for visitors.
The success of a PPC campaign also depends on your ability to optimize it. To optimize any campaign you need all the data on campaign performance. You can use the free Google and Overture tracking tools that let you track from the click to the conversion, but there are many more tools which give much more detailed data and analysis.
There are a few client-side tracking tools such as WebSideStory's Hitbox Professional or Hitbox Enterprise. With Hitbox, you can accurately track all of your marketing campaigns with one system: PPC, organic SEO, email campaigns, affiliate marketing campaigns, banner ad media buys, direct mail, etc.
The success of a PPC campaign depends on all of these elements. Keyword selection, bidding, the landing page, ad copy and optimization should all go hand in hand; the balance is very important. If you miss any one of these, your conversion rate will fall.
Major PPC search engines
Several search engines offer PPC advertising programs, but Google and Overture are undoubtedly the market leaders. They offer the best PPC programs in the industry because they deliver sufficient results for most campaigns. There are also product- or service-specific search engines that can deliver better value for your campaign. In both leading programs the maximum bid is $100. Basically, you create a small text ad using their tools and then bid on a series of keywords that will trigger your ad to be displayed.
PPC ads are displayed on the respective SERPs. If you choose Google to run a PPC ad, your ad will be shown on the search engine result pages of google.com. If you go to google.com and run a search, you can see the AdWords ads along the right column where it says 'Sponsored Links'.
Google's PPC program, known as 'Google AdWords', is currently the best in the industry. What differentiates it from others is its ad positioning strategy. Unlike other search engines, it is not just the bidding amount that decides ad placement; an element of quality also plays a crucial part. Google effectively offers discounts to those with a higher click-through rate.
As we know, keywords can be generic or targeted. Generic keywords or general phrases are less targeted but more popular, and hence their bid prices are much higher. Google's broad matching system means a wide variety of buyers compete for the same phrases, even though they are selling completely different products or services.
Ultimately, hundreds of thousands of advertisers bid on similar keywords and try to outbid each other. This is why a systematic testing approach is necessary for AdWords to perform well for you. If you don't have a system for discovering the best quality keyword phrases, you'll spend a lot more money. On a large campaign it is essential to discover quickly what makes people click. Another reason is click fraud: some estimates put fraudulent click-throughs at 20%, so expect a corresponding share of your budget to disappear.
Google's CPC program is complicated, even troublesome at times. However, it does have an automated feature called the AdWords Discounter, which lets Google manage your bid price at a competitive level. Some of the more general and well-used keyword phrases have a set minimum bid which makes them very unattractive. That's why you won't see ads on Google for certain keyword phrases: it's just too expensive given the type of sales transaction that results from clicks on that phrase. The word 'free' can sometimes be very expensive on Google AdWords.
The Google AdWords System:
AdWords bid prices and click-through costs are not synonymous. That's because your bid could apply to a variety of phrases, and because Google's discounting system attempts to keep bid spreads from getting too large. Your specified maximum bid of $3.99 per click may actually result in an average of only $2.65 per click.
Ads can contain 25 characters for the title, 70 characters for the ad text, and 35 characters for a display URL. America Online shows the title and ad text all on one line.
The most important step is to decide on your objective and target audience. It is very important to communicate the message clearly to customers. For more detailed Google AdWords guidelines, follow the link below:
To receive the maximum ROI from advertising on Google, you must know and learn the system. Proper keyword selection, well-written text copy and linking to the right page will give you an added advantage once you do. You must also calculate the visitor value of your site; this will certainly help you in drafting the budget and bid amounts. It can be calculated as the sales made by your site divided by the number of visitors to your site.
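The visitor value calculation above can be sketched in a few lines (the sales and visitor figures here are illustrative, not from the text):

```python
def visitor_value(total_sales, total_visitors):
    """Average revenue generated per visitor to the site
    (sales divided by visitors, as described above)."""
    return total_sales / total_visitors

# e.g. Rs.50,000 in sales from 2,000 visitors:
print(visitor_value(50_000, 2_000))  # 25.0 -> Rs.25 per visitor
```

Knowing that each visitor is worth, say, Rs.25 on average gives you a ceiling for what a click can sensibly cost.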
Google AdWords is just about everything you could want in an advertising system: it is quick and responsive, offers free tracking, is cheap to start, offers geo-targeting, and makes syndication of ads optional.
Do not get in bidding wars for generic terms. Opt to bid on more specific terms. Send people to targeted landing pages when possible.
GoogEdit is a free downloadable tool, which helps format your Google AdWords ads. It has many useful features such as number and phrase stripper. GoogEdit also wraps your AdWords ads in the various matching levels so that you can better track your traffic and conversions.
Why Google AdWords:
Within a very short period of time Google has become popular among the searcher community. Its user base is huge and, more importantly, loyal. The average time a user spends on google.com is also much higher than on other search engines. There is no doubt that Google is the best search engine at this point in time. It is so popular that whenever you think of search, Google comes to mind first.
Google is popular because of its highly relevant results. It is the best value for a searcher's time and hence attracts serious visitors. Though Google is a global search engine, it offers many features to attract local advertisers, including geo-targeting, i.e. targeting the ad to a specific geographical location such as a country or state. With all these features, Google can certainly send more than 60-70% relevant traffic to your site.
Google AdWords is also integrated with the AdSense program, so if you need more exposure you can choose that option as well. Google AdWords ads are also displayed on AOL, Ask Jeeves, Earthlink and business.com.
Therefore the level of exposure is very high with Google AdWords. Google has stated that its click-through rate is 5 times higher than its competitors'. Your bids have to be among the top four in order to get listed on AOL. Up to ten ads may be shown on Google, with ad placement at the top of the screen and down the upper right of the page in pink or green text boxes.
AdWords defaults to broad-match ad listings. With broad match, search terms do not need to exactly match the keyword you are bidding on for your ad to show up. However, if you want to bid on the exact keyword, you can do that too. Google also offers a free ROI tracking tool.
Google AdWords has a negative keyword tool which prevents you from paying for click-throughs on irrelevant terms. Good terms to cancel out are usually cheap, free, download, cracks, mp3s, wallpapers etc.: the type of terms where it is obvious the searcher would not likely convert to a sale or perform the desired action.
Google only costs $5 to open an account, and the account is activated immediately. Google approves all ads automatically and will list them as long as they follow the guidelines.
Although Google has partnered with many sites and search engines, the ad is displayed only on the sites selected by the advertiser; Google allows syndication to be turned on or off.
Google's web reach is very high and there is no minimum bid amount, nor a required minimum monthly spend. You can bid as low as you want and assign any amount of budget.
Key points in Google AdWords:
• Always remember that an advertisement is built around a group of keywords. Therefore write custom ads for each keyword phrase you bid on and create landing pages accordingly.
• Target your ad using qualifiers and you can avoid unwanted clicks. Use brand names, locations etc. to get more relevant traffic. Try to fill the gap between the existing solution and the desired need.
• It's advisable for new advertisers to start with a higher bid. Initially you need to learn where the clicks are coming from, and if you start with a lower bid, the CTR will suffer.
• If the performance of your ad is poor, i.e. you are not receiving any clicks, try to improve your position by increasing the bid amount. A lower position means fewer click-throughs and possibly ad cancellation by Google.
• ROI is more important than CTR. That means an actual sale matters more than clicks.
• Tracking user behavior is essential in online advertising. Users can react in many ways, such as navigating the site, subscribing to a newsletter, or downloading material. AdWords provides all this data, which can be used in further research.
Overture PPC Ads
Overture introduced pay-per-click advertising to the world. Overture has since been acquired by Yahoo! and operates as the search marketing arm of Yahoo!. With the first-ever PPC program, Overture showed search engines the way to make money. Currently Overture provides paid search listings on many of the top search engines and internet portals, such as AlltheWeb, AltaVista, Yahoo!, MSN, and many other portals, search engines, and web sites. Overture's top-of-page placement, and the fact that you have lots of room for descriptive copy, gives your ad writer more power to generate click-throughs.
Overture PPC Ads System:
Unlike Google, Overture has defined a minimum for every single aspect of advertising. The minimum cost per click is $0.10, and it charges $50 to open an account. It takes about a week to set up and activate an account on Overture, and you must spend a minimum of $20 per month to advertise there.
Ad positioning on Overture is based purely on bid amount; Overture does not consider the click-through rate when positioning your ad on the SERP. The fact that Overture doesn't consider click volume when determining an ad's position creates an opportunity for some advertisers: in theory, some unqualified clicks can be discouraged without causing the ad to be deactivated.
In a recent development, however, Overture has started to deactivate ads with unusually poor click-through. Even so, its critical click-through threshold is lower than Google's.
Why go for the Overture PPC program:
A PPC campaign is all about words and keywords; they are the only tool with which you can attract your customer. More words are generally better: the more you explain in the ad, the higher the conversion rate you can expect.
And this is where Overture has an edge over Google. Overture allows titles up to 90 characters and descriptions up to 190 characters, while Google allows 25 characters in the title and 70 in the description. The longer copy in an Overture ad allows advertisers to qualify their clicks more carefully, which can eventually improve conversion rates from Overture clicks.
Overture's network consists of Yahoo!, MSN, CNN, AltaVista and Infospace, which means you might reach up to 80% of all Internet users. This helps with brand awareness.
But since Overture uses a monthly budget system while Google offers daily budgets, it can create a negative impression among advertisers: on Overture, if your monthly budget is exhausted before the 10th of the month, your ad won't appear for the remaining days until you refill your account.
You can place exact bids and know right away what your rank will be. If you want to appear in Yahoo's top three results, you must hold one of the top three bid positions. An auto-bidding feature allows Overture to handle your bids automatically.
Overture offers a keyword search volume reporting tool that helps find keywords that are actually used. Jimtools.com offers a combined volume/price comparison service that is very handy.
Overture PPC System and Tips:
In Google AdWords, ads are added automatically, whereas at Overture human editors review your keywords and your site. You may bid on a search term only if the web site has substantial content that clearly reflects the search term. Overture also asks advertisers to rewrite ads that are not receiving adequate click-throughs.
The landing page must also match the visitor's query phrase. Overture makes reports available for all of your approved phrases.
They may or may not allow you to bid on misspelled words, and they tend to reject three- or four-word phrases that don't have sufficient search volume. Although they have no minimum click-through requirement, you could end up spending money repeatedly on losing phrases. On the other hand, some phrases may have a very low click-through rate yet still produce good sales. That's in contrast to the Google AdWords program, where you may make good sales from a 0.04% click-through rate, then watch your ad get deleted because it does not meet the minimum click-through rate (CTR).
Overture will reject superlatives in your titles or descriptions, which tends to make listings read as matter-of-fact. There is no keyword exclusion option, which means your site will come up in irrelevant searches. Titles cannot exceed 40 characters and descriptions cannot exceed 190 characters. Ads display on one line at Yahoo!.
While advertising on Overture you must make sure the site is not framed, as you can't link directly to inner pages. If you bid on several related phrases, your ad might appear multiple times on the same page. Also make your landing page match the promise you made in the advert: if you are offering a free white paper, don't ask visitors to subscribe to your site, or else mention that requirement in the ad. If you don't offer detailed information and easy navigation, your conversion rate will go down.
Other PPC Advertising Resources
With the growth PPC advertising has seen in the last few years, additional services have appeared to help advertisers simplify the management of different PPC ad programs.
There are other PPC advertising programs available today, such as Business.com, Enhance Interactive, ePilot, Kanoodle, eSpotting and Search123. Overture has programs for many different countries.
Shopping Search Engines
Shopping search engines have become popular destinations for consumers. Most, with the exception of Froogle, charge a price per click-through from their search function. Yahoo's Product Submit charges from 20 cents to 50 cents per click. You'll have to create and upload a list of your products, called a product feed. If you sell products, this is one of the most valuable places to sell them.
Shopping.com also has a merchant program; it charges anywhere from fifteen cents to a dollar per click and offers good exposure to product shoppers. BizRate offers a variation on the fixed bidding scheme of the other shopping search engines by allowing merchants to bid on top rankings.
In Europe, the Kelkoo comparison shopping search engine has a strong market share, which is why it was purchased by Yahoo!. You can bid on top placement for your products and appear across Europe, from Sweden to Italy.
PPC Bid Management
If you use all of the available pay per click advertising programs, it can become a headache to manage your bids without spending all of your time. There are services that can add greater functionality for your AdWords and Overture campaigns, allowing you to do things they don't offer.
Two companies lead the way in PPC bid management programs. Atlas OnePoint provides an all-in-one interface for managing PPC bids. Formerly called GoToast, they offer a free 14-day trial of their service which includes bid management and campaign optimization.
BidRank is another useful automated pay-per-click management tool that takes the pain out of PPC bid management. BidRank will check and change your PPC bids according to your preferred ranking. You can set target ranks and bid maximums based on time of day and/or day of week; the primary settings apply inside the times you have specified and the secondary settings take effect outside them.
Other Useful PPC Advertising Resources
Clicktracks offers a PPC tracking and reporting feature for any PPC campaign. It can even help you calculate important e-metrics such as return on cost, return on keyword phrase for each search engine. You can compare your returns on paid versus organic search engine listings.
As we now know, bidding decides the fate of your PPC campaign. Let's see what should be considered when bidding on a keyword or key phrase. There are several PPC bidding strategies to apply; each has its merit and, in some cases, may be more effective with one PPC search engine or with a particular set of terms. One strategy alone is not enough: Google's and Yahoo!'s PPC programs are quite different, and hence the bidding strategies also have to differ.
The bid amount certainly depends on how much you are willing to pay per click. If you don't know this value, you had better stop thinking about PPC advertising until you do. It can be based on an industry rule of thumb or calculated from internal factors such as profit margins.
For example, let's suppose you're bidding on the keyword phrase "search marketing" but do not know your max CPC. One way to estimate a max CPC involves taking the top 5 bids on Overture and computing the average. The current bids are: $0.51, $0.50, $0.33, $0.32, $0.31. The average is 39 cents. Use that as your max CPC to begin with.
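The averaging step above is trivial to reproduce, using the five bid figures quoted in the example:

```python
# Top 5 Overture bids for "search marketing", as quoted above.
top_bids = [0.51, 0.50, 0.33, 0.32, 0.31]

# Estimate a starting max CPC as the average of the top bids.
max_cpc = sum(top_bids) / len(top_bids)
print(round(max_cpc, 2))  # 0.39 -> start with a 39-cent max CPC
```

Treat this only as a starting point; your actual max CPC should be refined against your own conversion data as the campaign runs.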
A reverse calculation is very effective for determining how much you can spend per click: calculate the amount to invest on the basis of the revenue you generate from clicks. Past experience, market understanding and proper research will certainly help you calculate your CPC. Let's suppose you sell an SEO package for Rs.100k and your profit margin is 20%. That leaves Rs.20k of profit for each package. Also assume that your conversion rate will be 1%: for every 100 visitors from a PPC ad, you expect 1 sale. If you have Rs.5k of ad spend to spread over those 100 visitors, you have Rs.50 to spend per click.
You can also decide by working from your overall online marketing budget. If you are willing to spend 10% of revenue on the website, your total ad spend per sale is Rs.10k. With the 1% conversion rate calculated above, i.e. 100 clicks per sale, you can spend Rs.100 per click. As your campaign progresses and you determine your actual conversion rate, adjust the CPC accordingly.
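Both reverse calculations follow the same arithmetic, sketched here with the figures from the example:

```python
price = 100_000          # SEO package price (Rs.)
conversion_rate = 0.01   # 1 sale per 100 visitors
clicks_per_sale = 1 / conversion_rate   # 100 clicks needed per sale

# Rs.5k of ad spend spread over those 100 visitors:
print(5_000 / clicks_per_sale)          # 50.0 -> Rs.50 per click

# Or budget 10% of revenue for advertising (Rs.10k per sale):
ad_budget = price // 10
print(ad_budget / clicks_per_sale)      # 100.0 -> Rs.100 per click
```

The key input is the conversion rate: halve it and your affordable CPC halves too, which is why the text stresses adjusting the CPC as real conversion data comes in.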
You need to use different bidding strategies for Overture and Google as their programs are different. Google considers past performance and click through rate of the campaign whereas Overture only considers your bidding amount. For Google, use the Overture bids as your starting point in the short term and reduce the bids for the long term if your CTR is high enough.
Bidding for a good position, rather than the number one position, gives you more CTR for your money and a higher ROI, since top positions are very expensive. Just think of searchers' behavior: they don't have one specific query in mind but try the different combinations of keywords they can think of. Bidding also has a phenomenon whereby big gaps open up between bid amounts, again because of the race to the top position. Consider a scenario where bidding starts at Rs.10. Someone bids Rs.11, someone else Rs.12, but at some point an aggressive bidder will bid Rs.30 to take the top position without fear of competition. The gap between Rs.12 and Rs.30 is what works in your favor.
If you are concerned only with the first position, then bid higher initially to achieve it, which is very easy to do on Overture, and then maintain the top position by constantly monitoring the bidding. On generic keywords it is very difficult to monitor the campaign constantly, as they are popular, high-traffic terms, but you can do it for specific keywords, which are comparatively less competitive and have a higher conversion rate.
If you are bidding on very specific keywords with little competition and low traffic, one option is to position the advert and rely on the visitors. This strategy works because someone searching such a specific query is keenly looking for that particular information.
Sometimes you bid relative to your direct competitor's offerings and listings. If you find your direct competitor at position 3 and you have a better offering for this particular search query, bid just above your competitor, but not necessarily at the top position, thus engaging the searcher's attention with a compelling ad. How well such terms perform depends on how compelling the offer is once the searcher has landed on your web site. This bidding strategy works well for price- and feature-competitive offerings.
Google AdWords Bid Management Strategy
The most important step before bidding in any PPC program is understanding the market value of your keywords, and the best way to learn it is from Overture. Since there are several bids in play for a keyword, take an average of the top bid amounts to estimate the market value. If you can afford the market value you derive, use it; otherwise, use your own max CPC. That max CPC can be set for an entire ad group or for a specific keyword phrase.
Track the ad carefully for a few days. Assuming the bid is high enough and generates sufficient traffic, you should have a good idea of the CTR within a few days. If the CTR is good (at least 2%), lower the CPC and see where your ad falls in the search results. If the CTR is sufficient, lowering the CPC should not result in your ad dropping many positions.
Then use trial and error with different combinations. Run a query and observe the position. Drop the bid by some amount and check the position again, and repeat until you remain in the top 3-4 or whatever position you desire. If your ad's CTR is very good (better than 7%), you will likely be able to cut your CPC in half without a noticeable drop in ranking.
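The lower-and-check loop can be sketched as follows. This is a hedged illustration only: check_position is a hypothetical stand-in for manually running the query and noting where your ad appears, and the position model below is made up.

```python
def tune_bid(bid_cents, target_position, check_position, step=5, floor=5):
    """Lower the CPC (in cents) while the ad would still hold the
    desired position; stop at the last bid that keeps it there."""
    while bid_cents - step >= floor and check_position(bid_cents - step) <= target_position:
        bid_cents -= step
    return bid_cents

# Made-up position model: higher bids rank higher.
def positions(cents):
    return 1 if cents >= 40 else 3 if cents >= 25 else 6

print(tune_bid(50, 4, positions))  # 25 -> half the starting 50-cent bid
```

In practice the "check" step is manual (or done via the tracking reports), and each drop should be left to run long enough to see whether CTR and position genuinely hold.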
If your ad group has many keyword phrases and there's a divergence in CTR, consider creating multiple ad groups. The more tightly focused your ad group is, the lower your CPC will ultimately become as you weed out poorly performing keyword phrases. Adding negative keywords to each Google ad group will also help increase the CTR and thereby allow you to reduce your CPC.
The main aspect of PPC advertising is not exposure, but clicks and sales conversions. The click-through rate is defined as the percentage of times a paid search ad is clicked on out of the total number of paid search ad views within a given period of time.
Click-through Rate (CTR) = Click-throughs (i.e. Total Visitors) / Impressions
Website conversion is defined as the percentage of users who visit your website and complete your primary objective (i.e. purchased a product) out of the total number of users who visit your website in a given period of time.
Website Conversion (sales conversion) = Sales / Click-throughs (i.e. Total Visitors)
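The two formulas above translate directly into code (the click, impression and sale counts are illustrative):

```python
def click_through_rate(clicks, impressions):
    """CTR = click-throughs (total visitors) / ad impressions."""
    return clicks / impressions

def website_conversion(sales, clicks):
    """Conversion = completed objectives / visitors who clicked through."""
    return sales / clicks

# e.g. 50 clicks from 10,000 impressions, 5 of which led to a sale:
print(f"{click_through_rate(50, 10_000):.1%}")  # 0.5%
print(f"{website_conversion(5, 50):.1%}")       # 10.0%
```

Note that the two metrics share the click count: CTR divides it by impressions, conversion divides sales by it, which is why a high CTR with zero sales is worthless.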
So what role does each play in understanding the effectiveness of a paid search campaign?
Standard practice among advertisers is to concentrate on writing ads that achieve a high click-through rate to send more visitor traffic to their website. Unfortunately this general assumption, “more traffic equals greater positive results”, is flawed.
Consider this. Which click-through rate is better?
• A 20% click-through rate for a paid search ad that achieves zero sales (0% website conversion).
• A 0.2% click-through rate for a paid search ad that achieves 10 sales (10% website conversion).
The answer is obvious. The click-through rate, especially for newly setup PPC campaigns, is relative – it is the website conversion rate resulting from visitors clicking through a particular paid search ad that defines success or failure.
Successful paid search advertisers take a different approach. They start with the end in mind by asking, “what primary objective do I want a visitor to complete on my website?” and then they work backwards. They identify the type of visitor and buying behavior that will most likely result in a completed action (i.e. sale, registration, etc.)
In addition, they perceive their ads as automated salespeople who "qualify" visitors. Regardless of a high or low click-through rate, the focus is on generating a positive return from the advertising dollars spent.
For instance, let’s review two different ads. Ask yourself, which ad best qualifies visitors?
A. Pride Scooters
Low prices and huge selection of
scooters and other mobility equipment.
B. Pride Scooters
From $1850 while stocks last.
Houston, Texas, USA.
If you selected B. you are correct.
Ad B. qualifies visitors based on their buying behaviors and customer type most likely to purchase a Pride Scooter from the business’ website.
First, the ad states a price point (i.e. from $1850) to attract visitors seeking the website’s premium product while disqualifying ones seeking discounted or lower-priced scooters. A user researching scooters does not have to click-through the ad to find out a general price range.
Second, the ad targets a geographic region since the majority of people who buy scooters demand an actual test ride. If the company is located in Houston, Texas then users from other locations will not feel compelled to click-through the ad. (Ideally a geographically-targeted PPC campaign like using Google Adwords Regional-targeting works best in this situation).
In essence, ad B.’s goal is to pay “per click” for only visitors most likely to purchase their product. This ad attempts to “filter” unqualified visitors thereby increasing the return on investment per click-through.
Ad A. instead spends money on attracting and generating click-throughs from all visitors and relies on the website to filter qualified versus unqualified ones. This is not a wise economical approach especially if no “visitor exit strategies” are pursued.
Last, successful paid search advertisers rely on testing different ads to determine which appeal generates the best website conversion for a particular keyword. They rely on actual visitor feedback to help them determine which appeals are most effective. Once a positive return is achieved then focus is shifted to increasing the click-through rate for the best converting keywords so more sales can be realized.
So, are you spending money to bring just anybody to your website, or visitors ready to buy from you? Think about it: is your paid search advertising generating positive financial results for your website?
Targeting Usage Demographics to Increase Paid Search Conversions
Targeting the campaign to the proper target audience is very important in terms of conversion rate.
Website conversion is when a visitor takes action on your website after clicking through your ad. It is important because it leads to financial results for your web business and generates a return on your advertising spend (ROI).
On the internet, and for PPC ads in particular, we can target the audience geographically and demographically. Targeting through the ads themselves is another part. First we should study the demographic profile of each search engine's users.
A user demographic profile study looks at why visitors choose one search engine over another. It could be the functionality, relevance or many other factors that lead a user to prefer one search engine over another. If you research the user demographics of Google and Overture, you can craft your message accordingly and increase your ROI.
To study that, you must know what the Google AdWords and Overture networks consist of. Though there are many sites and search engines in these networks, only a few of them are famous and widely used.
Below are the primary search engine usage demographics to consider when developing your PPC strategy:
1. Gender: Male versus Female
In the search engine world it is often said that 'men are from Google and women are from Yahoo! and MSN', and it seems to be largely true.
A May 2004 study by Hitwise showed that “55% of women prefer MSN Search while a majority of men favor Google Search”. Yahoo! Search was split even on gender with a greater focus on people 18-34 in age.
A 2004 MarketingSherpa study indicated that MSN’s user profile consisted of time-limited, married females who searched less frequently yet performed greater e-commerce searches. While Google Search was favored by professional males who performed greater news, media, entertainment and education searches with a lesser intent to purchase.
Among the others, AOL is favored by women with less buying intent than MSN Search, while Ask Jeeves is preferred by children.
Furthermore, an April 2004 iProspect study uncovered that, “women found paid ads to be more relevant than men did when searching across Google, Yahoo!, MSN and AOL.”
These statistics are startling when considering their influence on your PPC strategy since women represent roughly 75% of major household purchases and as stated in a Women.com study, control 80% of all purchasing decisions.
2. Relevancy: Paid versus Organic Listings
Another usage demographic to consider for your PPC strategy is “perceived relevancy” of paid versus organic listings. Ads perceived as having greater relevancy lead to higher website conversions.
The iProspect study referenced earlier also discovered that “Internet users are more likely to click on an organic search link on Google, and a paid search result on MSN.” Organic listings on Yahoo! were considered 61% more relevant than paid listings while AOL was split 50/50.
3. Age: Young versus Adult versus Seniors
A third usage demographic to review is age. Preferences among the top five search engines are fairly mixed across age groups: Yahoo! is a strong favorite with 18-34 year olds, while MSN and AOL have a stronger preference among the 35-55+ age group. As stated earlier, AskJeeves is favored by teens and adolescents, a group whose buying power within American households is growing, as noted in a recent BusinessWeek research study.
• Google and Overture offer the best PPC programs, so prefer them.
• Google has the greatest reach, but the conversion rate on Overture is higher.
• Run both programs simultaneously.
• Ad copy, keywords and landing page are all equally important.
• Consider customer demographics and psychographics while writing copy.
• Use relevant qualifiers to get more targeted traffic.
• Usage data generated from your website is the best market research.
• Use keyword-level tracking systems to determine which PPC search engine generates the most cost effective and best converting visitors.
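The last point above can be sketched in a few lines: tag each paid listing’s landing URL with a keyword identifier, then tally cost and conversions per (engine, keyword) pair so that cost per conversion can be compared across engines. All engine names, keywords and figures below are hypothetical, for illustration only.

```python
from collections import defaultdict

# (engine, keyword, clicks, cost, conversions) -- hypothetical campaign data
campaign_data = [
    ("google",   "automobile dealers india", 400, 120.0, 12),
    ("overture", "automobile dealers india", 250, 100.0, 10),
    ("google",   "cheap cars",               900, 180.0,  9),
]

# Aggregate cost and conversions for each (engine, keyword) pair
stats = defaultdict(lambda: {"cost": 0.0, "conversions": 0})
for engine, keyword, clicks, cost, conversions in campaign_data:
    stats[(engine, keyword)]["cost"] += cost
    stats[(engine, keyword)]["conversions"] += conversions

# Cost per conversion tells you which engine/keyword pair converts most cheaply
for (engine, keyword), s in stats.items():
    cpc = s["cost"] / s["conversions"] if s["conversions"] else float("inf")
    print(f"{engine:9s} {keyword:28s} cost/conversion = {cpc:.2f}")
```

In this sketch the broad keyword “cheap cars” brings the most clicks but the worst cost per conversion, which is exactly the kind of finding keyword-level tracking is meant to surface.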
SEO Reporting and Conversion
Search engine optimization is the process of ranking your website higher on search engine result pages. It requires researching search engine algorithms and finding ways to satisfy them without fooling the search engines. In other words, it means building a site around the parameters search engines use, not resorting to techniques such as spamming, cloaking or redirecting.
It has been found that about 80% of your traffic comes from search engines. But search engines return millions of results for a single query, and searchers rarely look beyond the first 2-3 pages. Hence it is very important to position your company in the first 2 pages for your important keywords. Search engine optimization is a very cost-effective way to attract maximum traffic to your site.
Search engine optimization is part of online marketing, and any marketing activity needs investment. But since the online world is still growing, there is a lot of curiosity among marketers about how to track or measure the performance of the activity.
SEO campaigns offer an advantage over traditional internet advertising methods, such as email campaigns, banner advertising or pay-per-click services. Search Engines marketing offers a unique promotional advantage when it comes to overall return on investment.
A well optimized page typically can deliver thousands of targeted visitors to your website every month. Unlike purchasing mailing lists or other expensive website marketing methods for higher search engine rankings, SEO allows you to deliver long term success and a better ROI.
Most of the time, confusion occurs because the success of the campaign is not clearly defined. Marketers must know what they want to achieve. Before the campaign ends you should know whether performance will be measured on the number of visitors or on the total number of conversions, i.e. the actual sales made on your website.
The first thing to keep in mind is that ranking is not the only parameter for measuring the success of the campaign. Rankings, the percentage of traffic from different search engines, and the traffic from different keywords should all be considered while tracking the ROI of the campaign.
The major success of SEO campaign is ranking at the top of major search engines for popular keywords.
But since search engine algorithms are rapidly evolving, no one can promise top rankings any more. So the parameter to consider now is the kind of traffic you are receiving from your keywords. It is also important to measure keywords that do not contain your company name: by measuring non-branded search traffic, we learn more about how your audience looks for you, and can further refine your campaign.
There are many search engines through which you can attract traffic, but most visitors come through the big search engines such as Google, Yahoo!, MSN and AOL. Therefore it is very important to know which search engine is generating what kind of results. The more traffic a search engine generates, the greater the ROI.
Actions taken by visitors are considered very important. Though they are not directly purchasing the product, they are definitely potential customers. Their actions include completing a registration form or downloading a file. Of course, conversion is the ultimate aim.
ROI calculation varies with the type of site. Let’s see how to calculate ROI for an e-commerce site.
Determining ROI for an e-commerce site is very easy, as the aim of e-commerce is to make actual transactions or sales. Generally e-commerce sites trade in a variety of products, so first calculate the average price of the offerings. Then calculate the total sales the website was making before optimization by multiplying the average price by the number of items sold in a month. After the search engine optimization, count the number of items sold again and multiply by the average price to get the new sales figure. The difference between pre-optimization and post-optimization sales gives you the increase in sales. Now look at the money you have invested in the SEO campaign: if the extra sales can recover that money quickly, you are generating a very good ROI.
E.g. if the average price is 100 and pre-optimization sales are 100 units per month, total sales are 100 x 100 = 10,000 per month. If, post SEO campaign, sales go up to the 300-unit mark, you are earning 100 x 300 = 30,000 per month. So the SEO campaign has increased sales by 20,000 per month, and if you invested 50,000 in the campaign you can recover it within 2.5 months.
Many companies develop separate websites for lead generation and for corporate information. Not far behind e-commerce websites in the simplicity of ROI calculations are websites developed to generate leads. If the value of a generated lead is known, the ROI can be easily determined; if the lead value is not directly linked to a dollar value, one will need to be assigned. Let’s assume a website generates 25 leads per month pre-optimization, and each lead is valued at 50. The website was then generating 1,250 per month before the search engine optimization campaign launched. Now, the website increases its leads to 100 per month post-optimization. We can then see that the search engine optimization campaign is responsible for an additional 3,750 in leads per month. If the optimization campaign cost 25,000, we can estimate the ROI will be seen within 7 months.
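Both worked examples above reduce to the same payback formula: the SEO cost divided by the extra monthly value the campaign produces. A minimal sketch, using exactly the figures from the text:

```python
def payback_months(unit_value, units_before, units_after, seo_cost):
    """Months needed to recover the SEO investment from the monthly increase."""
    monthly_gain = unit_value * (units_after - units_before)
    return seo_cost / monthly_gain

# E-commerce example: 100 per unit, 100 -> 300 units/month, 50,000 invested
ecommerce = payback_months(100, 100, 300, 50_000)
print(ecommerce)        # 2.5 months

# Lead-generation example: 50 per lead, 25 -> 100 leads/month, 25,000 invested
leads = payback_months(50, 25, 100, 25_000)
print(round(leads, 1))  # 6.7 months, i.e. ROI seen within 7 months
```

The same function covers both site types because only the meaning of “unit” changes: an item sold in one case, a lead in the other.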
Now let’s see how to calculate ROI for corporate or information sites. These sites are essentially created for information, not for generating leads or sales, so the conversion parameters have to be defined first. Here we consider the value of a customer, not the money he is going to spend on our product or service.
Let’s assume the website had 2,500 unique visitor sessions per month before the search engine optimization campaign was launched, and was converting 4% of its users, i.e. 100 users per month. After the campaign launched, visitor sessions increased to 15,000 per month, with the conversion rate increasing to 6% as a result of more targeted visitors (a very conservative increase). We can then determine that the search engine optimization campaign is directly responsible for converting an additional 800 users per month.
If we were able to determine a dollar value for each converted user, we could easily determine the campaign ROI. Let’s assume that we have determined that each converted user is worth approximately $10. Using this as a base, we see that the website was generating $1,000 per month before optimization, and $9,000 post-optimization launch. The search engine optimization campaign is responsible for generating an additional $8,000 per month through converting website users. If the SEO campaign cost $25,000 for the year, we estimate that a ROI will be seen within 4 months.
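The information-site calculation above can be written out the same way: visitors per month times the conversion rate gives converted users, and the extra converted users times the value per user gives the monthly gain. The figures are the ones from the text.

```python
import math

def monthly_conversions(visits, conversion_rate):
    # Visitors per month times the fraction of them who convert
    return visits * conversion_rate

before = monthly_conversions(2_500, 0.04)   # about 100 converted users/month
after = monthly_conversions(15_000, 0.06)   # about 900 converted users/month
extra_revenue = (after - before) * 10       # $10 per converted user -> ~$8,000/month
months_to_roi = math.ceil(25_000 / extra_revenue)
print(months_to_roi)  # 4
```

Note that 25,000 / 8,000 is 3.125 months; rounding up to whole months gives the “within 4 months” figure quoted in the text.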
There are many tracking tools available to track the performance of the campaign. If a site owner were determined to track all traffic and measure ROI, the solution would be quite costly. First of all, you’re looking at a high-end web analytics program such as Hitbox Enterprise or WebTrends Live. These typically cost anywhere from a few thousand to several thousand dollars a month.
SEO Visitor Traffic Analysis
It is often said that the objective of an SEO campaign is ranking the website in the top ten on the major search engines. But that’s just one of the goals. The real objective of an SEO campaign is getting qualified traffic to your site and converting as much of it as possible into leads or sales.
Suppose you are an automobile company and your SEO consultant ranks you at the top for keywords like auto toys, auto magazines and buses; then you are certainly not going to get the desired audience or traffic. Sometimes your site ranks at the top for the desired keywords but still fails to deliver results in terms of sales. There can be many reasons for that.
Traffic Analysis is the evaluation of referrals from search engines based on conversion of traffic for keyword choices. The data is found in the logs using log analysis software or a customized application to cull referrer data for the site.
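Culling referrer data, as described above, amounts to parsing the search phrase out of each referrer URL’s query string and counting occurrences per keyword. A minimal sketch, assuming the common query parameter names (“q” on Google, “p” on Yahoo!); the referrer entries below are illustrative, not real log data.

```python
from urllib.parse import urlparse, parse_qs
from collections import Counter

# Referrer URLs as they would appear in a web server log (illustrative)
referrers = [
    "http://www.google.com/search?q=wildlife+photography",
    "http://search.yahoo.com/search?p=wildlife+photography",
    "http://www.google.com/search?q=digital+cameras",
]

def search_phrase(referrer):
    """Extract the search phrase from a search-engine referrer URL, if any."""
    query = parse_qs(urlparse(referrer).query)
    for param in ("q", "p"):  # engines differ in the parameter name they use
        if param in query:
            return query[param][0].lower()
    return None

# Count how many visits each search phrase brought in
counts = Counter(filter(None, (search_phrase(r) for r in referrers)))
print(counts.most_common())  # [('wildlife photography', 2), ('digital cameras', 1)]
```

A real log analyzer would read these URLs from the referrer field of each log line and would also record the landing page, but the keyword-extraction step is essentially this.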
Visitor traffic analysis helps to optimize the campaign in greater detail to increase ROI. There are a number of software packages available in the market that provide this data.
The visitor traffic analysis can be done on the following parameters.
• Traffic on keywords
• Keywords and landing page
• Whether the call to action was followed for conversion
• Qualified Traffic
• Converted traffic
Traffic on Keywords:
It is important to study which keywords are working well and which are not. Keywords can attract targeted as well as untargeted or generic traffic. Targeted traffic means people who are interested in your product or the information on your site: they are looking for information related to your industry, whether for research or for purchase, and they use keywords to find it. E.g. if they are interested in photography, they will use keywords such as wildlife photography, photographs etc. If you are in the same industry and use these keywords on your website, they will visit your site, and that is targeted traffic. But if you are a camera manufacturer, these same keywords will get you irrelevant traffic.
Therefore it is necessary to select the right keywords for the campaign. You must also know the performance of every keyword: which keywords attracted the most traffic and which failed should be known to the search engine optimization consultant. Only then can the site be optimized further. Try to replace the non-performing keywords.
Keywords and Landing Pages:
Which keywords are directing visitors to which pages? Try to figure out which pages are attracting more traffic and why. Add more content if a page is saturated, and create separate pages for different products and services; this again attracts targeted traffic. Keyword research should then be planned page by page.
Whether the call to action was followed for conversion:
The marketer expects visitors to perform certain activities on the website, such as downloading material, registering, writing reviews or asking for more information about the product. As an SEO consultant you must know whether visitors are performing those activities. If not, try to find out why: the keyword selection might be wrong, or the content might not be up to the mark. If we can figure out what is making visitors leave the site, we can work on it.
Qualified Traffic:
Qualified traffic means targeted traffic. You must know the demographics and psychographics of the traffic you are getting for the client. The conversion rate will be high only if you attract qualified traffic, and there are several ways to get it, such as more precise, well-defined and targeted keywords.
You must be able to find out which keywords are bringing qualified visitors and which are bringing general visitors. The more qualified the visitors, the more beneficial it is for your site.
Converted Traffic:
Conversion in terms of sales is undoubtedly the final aim of a marketer. So you must track how many sales you are generating for the client. Even if you are getting huge traffic, it is of no use to the client without conversions. So track the actual sales and find out how they are being made.
Calculate the percentage of traffic converting into actual sales. As an SEO consultant, your aim must be to increase the conversion rate for your clients and help them hit the optimal ROI mark via their search engine marketing efforts. Create value for the customer in terms of traffic, leads or customers.
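The conversion percentage described above is most useful when computed per keyword, since it shows which phrases bring buyers rather than browsers. A small sketch; the keyword names and figures are hypothetical.

```python
def conversion_rate(visits, sales):
    """Percentage of visits that turn into actual sales."""
    return 100.0 * sales / visits if visits else 0.0

# keyword -> (visits, sales) for one month (hypothetical figures)
keyword_stats = {
    "wildlife photography": (1_200, 36),
    "photographs": (3_000, 15),
}

for keyword, (visits, sales) in keyword_stats.items():
    print(f"{keyword}: {conversion_rate(visits, sales):.1f}% conversion")
```

Here the narrower phrase converts at 3.0% against 0.5% for the generic one, illustrating why qualified traffic matters more than raw volume.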
How to choose SEO consultant
A search engine optimization consultant provides SEO services to clients who own websites and would like to achieve a beneficial ranking in the search engines. These services cover many factors, such as on-page optimization, off-page optimization, PPC campaigns, link building, site restructuring and lots more.
SEO consultants use different approaches and strategies to optimize a site, and different strategies need to be used for different search engines. But the ultimate aim of the consultant is to achieve higher rankings for the client’s website.
You should know what the job responsibilities of an SEO consultant are. The job starts with website research. The consultant then researches keywords and finalizes a list of those which, in their judgment, can generate maximum relevant traffic to the client’s site. Content building is another aspect of the work: either the client provides the content or the consultant writes it, but in either case, placing the right keywords at the right density is the consultant’s job. Making changes in the site’s HTML coding is also very important; these changes are made in meta tags, anchor tags, meta keyword and description tags etc. Site submission to directories and search engines is another part of the job: theme-specific sites and search engines need to be researched, and the site must be submitted to specific as well as popular search engines. Finally, tracking the performance of the site is also the SEO consultant’s job.
Choosing the right Search Engine Optimization consultant can decide the success or failure of your online marketing activity. The results of this activity determine who actually finds your site, and in what context.
Ethical SEO consultants are always preferable, and those with a proven track record have an added advantage because they have results to show. But past success does not guarantee the same kind of results for your site; the industry is so volatile that no one can guarantee success. So you must weed out the sharks and unethical consultants from the reputable ones.
While a lot of companies/individuals offer SEO as a service, very few people understand what's really involved (and the position of search engine engineer requires no set credentials), so it's very hard to choose the right contractor. There are also many businesses or organizations that cannot afford the professional level of service and guidance they require.
The first thing to consider is the importance of hiring an ethical SEO specialist. SEO ethics is not just about being nice little boy scouts. An ethical SEO specialist will make sure your website is not penalized or even banned from the search engines.
Beware of guarantees promising top rankings. A reputable freelance SEO specialist or SEO firm will not provide a guarantee, because too much is out of his control. Nobody knows the search engine algorithms. SEO is simply an educated guess based on what's worked in the past for others. A lawyer cannot guarantee you will win your case. The star witness could die or leave town, the judge might be in a really bad mood, and the other lawyer might be a whiz. Some so-called consultants may get you a top placement in the short term, but ultimately get your site banned forever.
Find out exactly what on-page and off-page SEO strategies they use. Look out for search engine spam, and steer clear of anyone using such tactics.
Ask around on popular SEO forums if the person is known. See if they contribute to the community through posts, newsletters, etc. It shouldn't take much time at all to see who's real and who's a scam artist.
To make sure you are hiring an ethical SEO specialist, always check that he has a physical address posted on his website. Instead of paying for a guaranteed ranking, ask to pay some money up front and the rest when you achieve the rankings. Most reputable SEO specialists will ask for only one-third to one-half of the payment up front; some will bill in arrears. This is a fair arrangement: SEO is a risk, so it is fair to pay some non-refundable money up front just for the labor, and it is a sign that the consultant is less likely to disappear.
It is important to ask an SEO specialist about his methods before hiring him. Combine quality content and a performance-based agreement with a solid, reputable SEO company and you’ll probably get the results you’re looking for. With dirty tricks, called “black hat SEO”, your website will rank high initially, but after some time it is likely to be banned by the search engines.
Another scam is to guarantee placement within a short period of time, and to buy pay-per-click ad space. Pay-per-click ads appear as “sponsored” listings in the search engines. While they will attract some targeted traffic, only 40% of Internet searchers click on the sponsored listings. Worse, they are temporary listings that end when the account is depleted.
A similar scam some SEO specialists do is to place temporary links on their own sites or buy paid advertising links on other sites. Once the money is paid, they remove the links on their own sites, and once the ads expire on other sites, your site loses those links and rankings also fall.