How a search engine works: there are four stages:
crawling
indexing
calculating relevancy
ranking
Crawling: In this stage, the search engine collects information from websites, whether there are millions of them or billions. It gathers this information through software programs called crawlers, bots, or spiders, which visit websites and read the data on every page they can reach. This process is called crawling.
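Below is a minimal sketch of a crawler in Python, using only the standard library. The starting URL, the page limit, and the one-second politeness delay are illustrative assumptions, not how any real search engine is configured; production crawlers also respect robots.txt and fetch pages massively in parallel.

```python
import time
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_pages=10):
    """Fetch pages breadth-first, starting from seed_url."""
    queue, seen, pages = [seed_url], {seed_url}, {}
    while queue and len(pages) < max_pages:
        url = queue.pop(0)
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip pages that fail to load
        pages[url] = html  # keep the raw page for the indexing stage
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)  # resolve relative links
            if absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
        time.sleep(1)  # be polite: pause between requests
    return pages
```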
Indexing: In this stage, all the data gathered during crawling is stored, typically in a database. The data is divided into categories and subcategories for better relevancy and ease of search:
categories --> subcategories --> nested subcategories --> and so on
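As one concrete (and much simplified) way to picture this storage step, the sketch below builds an inverted index in Python: a mapping from each word to the set of pages that contain it. The sample pages dict is made-up data standing in for crawl output; real engines use far richer structures than a plain in-memory dictionary.

```python
from collections import defaultdict

def build_index(pages):
    """pages: dict mapping URL -> page text (e.g. from the crawl stage)."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word].add(url)  # record that this page contains the word
    return index

# Usage: look up every page containing a query word.
pages = {
    "site-a": "search engines crawl and index pages",
    "site-b": "pages are ranked by relevancy",
}
index = build_index(pages)
print(index["pages"])  # both pages contain the word "pages"
```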
Calculating relevancy: In this stage, the search engine gives every website a score based on its quality and how relevant it is to the search.
Ranking: Based on those scores, the websites are put in order: the website with the highest score gets rank 1, the next gets rank 2, and so on. The results are then displayed to us in that ranking order.
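To make the score-then-sort idea concrete, here is a minimal Python sketch where a page's "score" is simply how often the query words appear on it (term frequency). That scoring rule is an illustrative assumption; real engines combine many more quality and relevancy signals, but the final sort-by-score step works the same way.

```python
def score(page_text, query):
    """Count how many times the query words appear on the page."""
    words = page_text.lower().split()
    return sum(words.count(term) for term in query.lower().split())

def rank(pages, query):
    """Return URLs ordered best-first: rank 1, 2, 3, ..."""
    scored = [(score(text, query), url) for url, text in pages.items()]
    scored.sort(reverse=True)  # highest score gets the top rank
    return [url for _, url in scored]

pages = {
    "site-a": "search engines crawl pages and index pages",
    "site-b": "pages are ranked by relevancy",
}
print(rank(pages, "pages"))  # ['site-a', 'site-b']
```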