Search engines scan the internet to see what is out there. This process is called "crawling" and is performed by a software program known as a spider or a crawler (Google's is called Googlebot). Spiders follow links from one web page to another, indexing everything they find along the way. Keep in mind that the web contains tens of billions of pages; it is impossible for a spider to visit every site daily to catch minor changes or new pages. Sometimes spiders will not visit a site for a month or two. The interval between crawls is the ideal time to work on your SEO. You can see the last time your site was crawled by clicking the cache link in your listing's metadata (at the end of a website's description, or to the right of the URL, when you search in one of the major search engines).
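To make the idea concrete, here is a minimal sketch of link-following, written in Python using only the standard library. The seed URL and page limit are placeholder assumptions for illustration; a real spider adds robots.txt checks, rate limiting, and a proper index store.

```python
# A toy spider: fetch a page, record it, queue its outgoing links.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects href targets from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed, max_pages=10):
    """Breadth-first crawl starting from a single seed URL."""
    queue, seen = [seed], set()
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except (OSError, ValueError):
            continue  # unreachable or unsupported URLs are skipped
        parser = LinkParser()
        parser.feed(html)
        # Resolve relative links against the current page before queueing.
        queue.extend(urljoin(url, link) for link in parser.links)
        print(f"indexed: {url} ({len(parser.links)} outgoing links)")
    return seen

if __name__ == "__main__":
    crawl("https://example.com")  # placeholder seed, not a real crawl target
```

Even this toy version shows why full coverage is slow: every indexed page can add many new URLs to the queue, so the frontier grows far faster than the spider can fetch.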
At times spiders may misinterpret the meaning of a web page, but you can help them by optimizing your site. Optimization makes it easier for spiders to classify your pages correctly, which ultimately helps you rank higher.
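As a rough illustration, the sketch below shows the kind of on-page signals a spider reads when classifying a page: the title tag and the meta description. The sample HTML and its wording are invented for the example; clear, descriptive values in these fields are exactly what on-page optimization targets.

```python
# Extract the two classification signals a spider sees first:
# the <title> text and the meta description.
from html.parser import HTMLParser

class PageSummary(HTMLParser):
    """Pulls the title text and meta description out of an HTML document."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name") == "description":
                self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Invented sample page standing in for a fetched document.
sample = """<html><head>
<title>Handmade Oak Desks | Smith Woodworks</title>
<meta name="description" content="Solid oak desks built to order.">
</head><body>...</body></html>"""

parser = PageSummary()
parser.feed(sample)
print("title:", parser.title)
print("description:", parser.description)
```

A page whose title and description actually describe its content gives the spider an unambiguous classification; a vague or missing title leaves it guessing.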
Several algorithms are used to calculate a web site's relevancy. Each of these algorithms assigns its own weights to common factors such as keyword placement, title tags, keyword density, links, or meta tags. This is why, when you look something up on Google and Yahoo!, the same result does not always appear in the same spot: each search engine uses a different algorithm to decide which web pages are most relevant to the term you searched for.
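The toy example below shows why the rankings diverge: the same page factors, scored under two different weightings, produce two different relevancy numbers. Both the factor scores and the weights here are made up for illustration; real ranking algorithms are proprietary and use far more signals.

```python
# Hypothetical scores (0-1) for one page against one search term.
FACTORS = {
    "keyword_in_title": 1.0,
    "keyword_density": 0.4,
    "inbound_links": 0.7,
    "meta_tags": 0.6,
}

# Two invented engines weighting the same factors differently.
WEIGHTS_A = {"keyword_in_title": 3.0, "keyword_density": 1.0,
             "inbound_links": 4.0, "meta_tags": 0.5}
WEIGHTS_B = {"keyword_in_title": 2.0, "keyword_density": 2.0,
             "inbound_links": 2.5, "meta_tags": 1.5}

def relevance(factors, weights):
    """Weighted sum of factor scores: one engine's relevancy number."""
    return sum(factors[f] * weights[f] for f in factors)

print("engine A:", relevance(FACTORS, WEIGHTS_A))  # 6.50
print("engine B:", relevance(FACTORS, WEIGHTS_B))  # 5.45
```

The same page scores differently under each weighting, so its position shifts from one engine's results to the next, even though nothing about the page changed.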
It is well known that the major search engines, such as Google, Yahoo!, and Bing, periodically change their algorithms. So if you want to stay at the top, you need to keep adapting your web site to the latest algorithm changes. This is one reason SEO requires ongoing effort rather than a one-time fix.