Crawling is the process of fetching all the web pages linked to a website. This task is performed by software called a crawler or a spider, such as Googlebot for Google, Yahoo Slurp for Yahoo, and Bingbot for Bing.
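To make the idea concrete, here is a minimal crawler sketch in Python using only the standard library. The function name `crawl` and the `max_pages` limit are illustrative choices, not anything a real search engine exposes; production crawlers like Googlebot also honour robots.txt, throttle their requests, and render JavaScript, none of which is shown here.

```python
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href of every anchor tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=10):
    """Breadth-first crawl: fetch a page, queue its links, repeat."""
    queue = [start_url]
    visited = set()
    pages = {}  # url -> raw HTML
    while queue and len(visited) < max_pages:
        url = queue.pop(0)
        if url in visited:
            continue
        visited.add(url)
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip unreachable pages
        pages[url] = html
        parser = LinkExtractor()
        parser.feed(html)
        # Resolve relative links against the current page's URL
        queue.extend(urljoin(url, link) for link in parser.links)
    return pages
```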
Indexing is the process of generating an index for all the fetched web pages and placing them in a huge database from which they can later be retrieved. In essence, indexing means identifying the words and expressions that best describe a page and assigning the page to particular keywords.
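A common data structure for this is the inverted index: a map from each word to the pages that contain it. The sketch below builds one from the `pages` dictionary produced by the crawler above; the name `build_index` and the crude tag-stripping are assumptions for illustration, and real indexers add stemming, stop-word removal, and keyword weighting.

```python
import re
from collections import defaultdict

def build_index(pages):
    """Map each word to the set of URLs whose text contains it
    (a simplified inverted index)."""
    index = defaultdict(set)
    for url, html in pages.items():
        # Crude text extraction: drop tags, lowercase, split on non-letters
        text = re.sub(r"<[^>]+>", " ", html).lower()
        for word in re.findall(r"[a-z]+", text):
            index[word].add(url)
    return index
```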
Processing is a step-by-step procedure: when a search request is sent, the search engine checks and processes it, then compares the search string in the request with the indexed pages in the database.
Since it is very likely that more than one page contains the searched string, the search engine then calculates the relevancy of each of those pages in its index to the search string.
Retrieving results is the last step, in which the search engine displays the best-matched results. At this point it is nothing more than rendering them in the browser.
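The sketch below ties the last three steps together, reusing the `pages` and `index` objects from the earlier examples. The matching rule (every query term must appear) and the relevancy score (raw term frequency) are deliberately simplistic assumptions; real engines combine hundreds of signals such as link authority and freshness.

```python
import re
from collections import Counter

def search(query, pages, index, top_k=10):
    """Steps 3-5 in miniature: match the query against the index,
    score each candidate page, and return the best matches."""
    terms = re.findall(r"[a-z]+", query.lower())
    # Step 3 (processing): candidates are pages containing every query term
    candidates = None
    for term in terms:
        urls = index.get(term, set())
        candidates = urls if candidates is None else candidates & urls
    if not candidates:
        return []
    # Step 4 (relevancy): a crude score -- how often the terms occur
    scores = Counter()
    for url in candidates:
        text = re.sub(r"<[^>]+>", " ", pages[url]).lower()
        words = re.findall(r"[a-z]+", text)
        scores[url] = sum(words.count(t) for t in terms)
    # Step 5 (retrieving): return the top results, best match first
    return [url for url, _ in scores.most_common(top_k)]
```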
There are many algorithmic rules involved in this search process, and Google and Yahoo often update their algorithms. That is why results for the same query can vary when you search again after some time. It is therefore important to understand how search engines work before you start doing SEO for your website. You can also check out our blog to learn more about what SEO is and how SEO works for websites.