Search Engine Optimization
Search engine optimization (SEO) is the process of improving the visibility of a website in a search engine's "natural" or unpaid (also called "organic" or "algorithmic") results. It includes optimizing the HTML structure and text of the website; building internal links and inbound links from other sites (backlinks) remains another common SEO strategy. In general, the earlier (or higher on the page) and more frequently a site appears in the results list, the more visitors it is likely to receive from the search engine, and the stronger its presence in the Internet space. Optimization may target various kinds of search, such as search in a particular area or region, image search, video search, or news search.
As an Internet marketing strategy, search engine optimization considers how search engines work and what people search for.
System administrators and content providers began optimizing sites for search engines in the mid-1990s, when the first search engines began to catalog the early Web. Initially, all webmasters had to do was submit a page's address (URL) to the various engines, which would then visit the site, extract the hyperlinks it contained, and index the pages.
The process involved programs called "spiders", which search engines use to download pages from the Internet. The spiders store these pages on a separate search server, while a second program called an "indexer" extracts information from each downloaded page: the words it contains, where they appear, and the weight of each. The hyperlinks found in the pages are also retained so that they can be crawled at a later stage.
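The indexer's two jobs described above, collecting the words a page contains and the hyperlinks to crawl later, can be sketched with Python's standard `html.parser` module. This is a toy illustration, not how any real search engine is implemented; the class name and sample page are invented for the example.

```python
from html.parser import HTMLParser

class PageIndexer(HTMLParser):
    """Toy 'indexer': records the words a downloaded page contains
    and the hyperlinks it points to. A real indexer would also
    store word positions and importance weights."""

    def __init__(self):
        super().__init__()
        self.words = []
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Keep every href so the spider can visit it at a later stage.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        # Record the visible words, lowercased for lookup.
        self.words.extend(data.lower().split())

page = '<html><body><h1>SEO basics</h1><p>Read <a href="/guide">the guide</a>.</p></body></html>'
indexer = PageIndexer()
indexer.feed(page)
print(indexer.words)  # words extracted from the page
print(indexer.links)  # hyperlinks kept for later crawling
```

Separating download (spider) from analysis (indexer), as the paragraph describes, lets each part scale independently.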
Leading search engines like Google, Bing and Yahoo! use software known as robots or crawlers, which crawl web pages and index them using special algorithms. Indexing is the process of navigating through the pages the crawlers can reach and adding them to the search engine's database. These databases can be thought of as libraries in which the search engine looks for the most appropriate answers and returns them as a set of results. Note that when you search for something, the engine returns answers from its database rather than searching the live Web. Robots take various factors into account when crawling sites, so not every page is found; among other parameters, they consider a page's distance from the root directory of the site. Google offers tools for webmasters (Webmaster Tools) through which a site map (XML Sitemap) can be submitted to ensure that all pages are found, especially those that are not discoverable by automated crawling.
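An XML Sitemap of the kind mentioned above is just an XML file in the sitemaps.org format listing the URLs a site wants crawled. A minimal sketch of generating one with Python's standard library follows; the `example.com` URLs are placeholders, and a production sitemap would typically also carry optional fields such as `<lastmod>`.

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML Sitemap (sitemaps.org 0.9 schema)
    listing the given page URLs so crawlers can find them all,
    including pages not reachable by following links."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url  # one <loc> per page
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "https://example.com/",
    "https://example.com/about",
])
print(sitemap)
```

The resulting file is placed at the site root and its location submitted through the search engine's webmaster tools.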