

The databases of Internet search engines must be constantly updated and supplemented with new web pages. This task is performed by crawlers such as Microsoft's Bingbot.

What is a Bingbot?

The Bingbot is a web crawler developed by Microsoft that has been active since October 2010. After a two-year transition period, Microsoft's older Live Search crawler was fully replaced by the new bot on 1 October 2012. Since then, the crawler has searched the Internet for web pages, analyzed their content, and indexed them for Microsoft's search engine Bing as well as Yahoo's search engine. Like every crawler or bot (short for "robot"), it is a complex software program that can act autonomously.

How does the Bingbot work?

Crawlers (also called spiders) like the Bingbot automatically search the Internet for HTML web pages and analyze their contents. The bot follows the internal links on a web page and thus gradually visits all sub-pages of a website.

In doing so, the bot analyzes the content and registers, for example, which keywords are used. The crawler finds further web pages by following links that point to other sites. Based on the information collected, the web pages are ranked in the search engine's index according to their relevance to certain topics. Google's Googlebot works in the same way.
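The keyword registration step can be illustrated with a simple inverted index, mapping each term to the pages it appears on. This is a toy sketch under assumed inputs (the `PAGES` texts and stopword list are invented), not how Bing's index actually works.

```python
import re
from collections import Counter, defaultdict

# Hypothetical mini-corpus of crawled pages (markup already stripped).
PAGES = {
    "/bing": "bing is a search engine run by microsoft",
    "/crawler": "a crawler indexes pages for a search engine",
}

# Tiny illustrative stopword list; real engines use much larger ones.
STOPWORDS = {"a", "is", "by", "for", "the"}

def keywords(text):
    """Count the non-stopword terms appearing on a page."""
    words = re.findall(r"[a-z]+", text.lower())
    return Counter(w for w in words if w not in STOPWORDS)

def build_index(pages):
    """Inverted index: term -> {url: term frequency}."""
    index = defaultdict(dict)
    for url, text in pages.items():
        for term, count in keywords(text).items():
            index[term][url] = count
    return index

index = build_index(PAGES)
print(index["search"])  # which pages mention "search", and how often
```

A query for "search" would then return both pages, while "crawler" matches only one, which is the basis for ranking pages by topical relevance.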
