Additional information on SEO
Search engine optimization (SEO, also search optimization) is the process of editing and arranging content on a website, or published via the website, to increase its potential relevance to specific keywords on specific search engines, and, just as importantly, of ensuring that the site attracts appropriate external links in abundance. This is done in order to achieve a higher organic search listing and thus increase the volume of targeted traffic from search engines.
SEO is one of the key web marketing activities and can target different kinds of search, including image search, local search, and industry-specific vertical search engines.
SEO considers how search engines work and what people search for. Optimizing a web page involves editing its content and HTML coding to increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines. Sometimes the site's structure (the relationships between its content) must be changed as well. Because of this, from the client's point of view it is always better to incorporate search engine optimization while a site is being developed than to try to apply it retroactively.
Another class of techniques, known as black hat SEO or spamdexing, uses methods such as link farms and keyword stuffing that degrade both the relevance of search results and the user experience of search engines. Search engines look for sites that employ these techniques in order to remove them from their indices.
The term 'search engine friendly' refers to a web page that has been optimized for search.
According to industry analyst Danny Sullivan, the first known use of the phrase search engine optimization was in a spam message posted to a discussion group on July 26, 1997.
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provide a guide to each page's content, but using metadata to index pages proved to be less than reliable, because the keywords a webmaster placed in the meta tag were not always truly relevant to the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags caused pages to rank for irrelevant searches. Web content providers also manipulated a number of attributes within the HTML source of a page in an attempt to rank well in search engines.
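As a rough illustration of that reliance, the following sketch reads a page's webmaster-supplied keyword meta tag using Python's standard html.parser module; the class name and the sample page are invented for the example.

    from html.parser import HTMLParser

    class MetaKeywordExtractor(HTMLParser):
        """Collect the contents of <meta name="keywords"> tags, the
        webmaster-supplied signal that early engines trusted."""
        def __init__(self):
            super().__init__()
            self.keywords = []

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "meta" and attrs.get("name", "").lower() == "keywords":
                content = attrs.get("content", "")
                self.keywords += [k.strip() for k in content.split(",") if k.strip()]

    page = '<html><head><meta name="keywords" content="seo, search, ranking"></head></html>'
    parser = MetaKeywordExtractor()
    parser.feed(page)
    print(parser.keywords)  # ['seo', 'search', 'ranking']

A page could claim any keywords it liked here, which is precisely why the signal was so easy to abuse.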
By relying so heavily on factors exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. Since the success and popularity of a search engine are determined by its ability to produce the most relevant results for any given search, allowing those results to be manipulated would drive users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were much more difficult for webmasters to manipulate.
Graduate students at Stanford University, Larry Page and Sergey Brin, developed "BackRub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links. PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a page with a higher PageRank is more likely to be reached by the random surfer.
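The random-surfer idea can be sketched in a few lines of Python. This is a simplified power-iteration estimate, not Google's implementation; the damping factor of 0.85 and the toy link graph are illustrative assumptions.

    def pagerank(links, damping=0.85, iterations=50):
        """Power-iteration estimate of PageRank.

        links maps each page to the list of pages it links to;
        returns a dict of page -> rank, summing to roughly 1."""
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / n for p in pages}
            for page, outgoing in links.items():
                if not outgoing:  # dangling page: spread its rank evenly
                    for p in pages:
                        new_rank[p] += damping * rank[page] / n
                else:
                    share = damping * rank[page] / len(outgoing)
                    for target in outgoing:
                        new_rank[target] += share
            rank = new_rank
        return rank

    # Toy graph: both B and C link to A, so A ends up with the highest rank.
    print(pagerank({"A": ["B"], "B": ["A"], "C": ["A"]}))

The damping factor models the chance that the random surfer keeps following links rather than jumping to an arbitrary page.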
Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design. Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links, and site structure), enabling Google to avoid the kind of manipulation seen in search engines that considered only on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link-building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaining PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming. In recent years the major search engines have begun to rely more heavily on off-web factors such as the age, sex, location, and search history of the people conducting searches in order to further refine results.
By 2007, search engines had incorporated a wide range of undisclosed factors into their ranking algorithms to reduce the impact of link manipulation. Google says it ranks sites using more than 200 different signals. The three leading search engines, Google, Yahoo, and Microsoft's Live Search, do not disclose the algorithms they use to rank pages. Notable SEOs, such as Rand Fishkin, Barry Schwartz, Aaron Wall, and Jill Whalen, have studied different approaches to search engine optimization and have published their opinions in online forums and blogs. SEO practitioners may also study patents held by the various search engines to gain insight into the algorithms.
Webmasters and search engines
By 1997, search engines recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.
Due to the high marketing value of targeted search results, there is potential for an adversarial relationship between search engines and SEOs. In 2005 an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to discuss and minimize the damaging effects of aggressive web content providers.
SEO companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients. Wired magazine reported that the same company sued blogger Aaron Wall for writing about the ban. Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.
Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, chats, and seminars. In fact, with the advent of paid inclusion, some search engines now have a vested interest in the health of the optimization community. The major search engines provide information and guidelines to help with site optimization. Google has a Sitemaps program to help webmasters learn whether Google is having any problems indexing their website, and it also provides data on Google traffic to the website. Yahoo! Site Explorer provides a way for webmasters to submit URLs, determine how many pages are in the Yahoo! index, and view link information.
The leading search engines, Google, Yahoo!, and Microsoft, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted, because they are found automatically. Some search engines, notably Yahoo!, operate a paid submission service that guarantees crawling either for a set fee or per click. Such programs usually guarantee inclusion in the database, but do not guarantee a specific ranking within the search results. Yahoo's paid inclusion program has drawn criticism from advertisers and competitors. Two major directories, the Yahoo Directory and the Open Directory Project, both require manual submission and human editorial review. Google offers Google Webmaster Tools, through which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links.
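As a sketch of what such a feed contains, the snippet below builds a minimal XML Sitemap with Python's standard library; the URLs are placeholders, and the element names follow the public sitemaps.org schema.

    import xml.etree.ElementTree as ET

    def build_sitemap(urls):
        """Build a minimal XML Sitemap (sitemaps.org schema) from a list of URLs."""
        urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
        for url in urls:
            entry = ET.SubElement(urlset, "url")
            ET.SubElement(entry, "loc").text = url
        return '<?xml version="1.0" encoding="UTF-8"?>\n' + ET.tostring(urlset, encoding="unicode")

    # Placeholder URLs; a real feed would list every page the crawler should find.
    print(build_sitemap(["https://example.com/", "https://example.com/about"]))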
Search engine crawlers may look at a number of different factors when crawling a site, and not every page is indexed by the search engines. The distance of a page from the root directory of a site may also be a factor in whether or not that page gets crawled.
Main article: Robots Exclusion Standard
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages it should not crawl. Because a search engine crawler may keep a cached copy of this file, it may occasionally crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results, because those pages are considered search spam.
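A minimal sketch of the mechanism, assuming an illustrative set of robots.txt rules: Python's standard urllib.robotparser applies the same exclusion logic that a well-behaved crawler would.

    from urllib.robotparser import RobotFileParser

    # Illustrative rules: keep crawlers out of the cart and internal search results.
    rules = [
        "User-agent: *",
        "Disallow: /cart/",
        "Disallow: /search",
    ]

    rp = RobotFileParser()
    rp.parse(rules)
    print(rp.can_fetch("*", "https://example.com/cart/checkout"))    # False
    print(rp.can_fetch("*", "https://example.com/products/widget"))  # True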
White hat versus black hat
SEO techniques can be classified into two broad categories: techniques that search engines recommend as part of good design, and those techniques of which search engines do not approve. The search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO or black hat SEO. White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned, either temporarily or permanently, once the search engines discover what they are doing.
An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. Because the search engine guidelines are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines, but about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see.
White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the spiders, rather than attempting to trick the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility, although the two are not identical.
Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines, or that involve deception. One black hat technique uses text that is hidden, either as text colored similarly to the background, in an invisible div, or positioned off screen. Another method serves a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking.
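For illustration only, the sketch below shows the basic shape of cloaking; the user-agent signatures and page bodies are invented, and serving pages this way is precisely the kind of deception search engines penalize.

    def serve_page(user_agent):
        """Cloaking, illustrated: return different content to crawlers and humans."""
        crawler_signatures = ("Googlebot", "Slurp", "msnbot")  # illustrative list
        if any(sig in user_agent for sig in crawler_signatures):
            return "<p>Keyword-stuffed copy shown only to crawlers.</p>"
        return "<p>The page human visitors actually see.</p>"

    print(serve_page("Googlebot/2.1 (+http://www.google.com/bot.html)"))
    print(serve_page("Mozilla/5.0 (Windows NT 10.0)"))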
Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or by eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms or through a manual site review.
One infamous example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for the use of deceptive practices. Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's index.
As a marketing strategy
Eye tracking studies have shown that searchers scan a search results page from top to bottom and left to right (or right to left for right-to-left languages), looking for relevant results. Placement at or near the top of the rankings therefore increases the number of searchers who will visit a site. However, more search engine referrals do not guarantee more sales. SEO is not necessarily an appropriate strategy for every website, and other Internet marketing strategies can be much more effective, depending on the site operator's goals. A successful Internet marketing campaign may drive organic traffic to the site, but it may also involve the use of paid advertising on search engines and other pages, building high quality web pages to engage and persuade, addressing technical issues that may keep search engines from crawling and indexing those sites, setting up analytics programs to enable site owners to measure their successes, and improving a site's conversion rate.
SEO may generate a return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors. It is considered wise business practice for website operators to liberate themselves from dependence on search engine traffic. A top-ranked SEO blog, Seomoz.org, has reported that "search marketers, in a twist of irony, receive a very small share of their traffic from search engines." Instead, their main sources of traffic are links from other websites.
[Image: a Baidu search results page]
Search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches. In markets outside the United States, Google's share is often greater, and Google remained the dominant search engine worldwide as of 2007. As of 2006, Google held about a 40% share of the market in the United States, but it had an 85-90% market share in Germany. While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.
In Russia the situation is reversed: the local search engine Yandex controls 50% of paid advertising revenue, while Google holds less than 9%. In China, Baidu continues to lead in market share, although Google has been gaining share as of 2007.
Successful search optimization for international markets may require professional translation of web pages, registration of a domain name with a top-level domain in the target market, and web hosting that provides a local IP address. Otherwise, the fundamental elements of search optimization are essentially the same, regardless of language.
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."
In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website had been removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.
A variety of search engine optimization tools exist to help optimize a website's or URL's search engine placement, including PageRank and backlink checkers, mass blog pingers, content generators, RSS scrapers, typo generators (often used for typosquatting), cloaking software, and keyword rank trackers; a sketch of one such building block follows.
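As one example of the machinery behind such tools, the sketch below extracts link targets from an already-fetched page, the first step of a simple backlink checker; the sample page is invented, and a real tool would crawl many candidate pages looking for links that point at the target site.

    from html.parser import HTMLParser

    class LinkExtractor(HTMLParser):
        """Collect the href targets of <a> tags; a backlink checker scans
        candidate pages for hrefs that point at the site being audited."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href")
                if href:
                    self.links.append(href)

    page = '<p>See <a href="https://example.com/">Example</a> and <a href="/local">local</a>.</p>'
    extractor = LinkExtractor()
    extractor.feed(page)
    print(extractor.links)  # ['https://example.com/', '/local']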