An SEO performance and log-analysis tool.
BeeBOT is a data-aggregation tool focused on analyzing the behavior of bots on websites. It combines data from two sources: the tool's own crawl of the site and the web server's access logs. It helps SEO managers improve their organic search rankings by analyzing crawler behavior.
Web server log files record a historical trace of every crawler visit. These traces are factual and indisputable, unlike attempts to infer how well or poorly the indexing algorithms of Google and other robots are treating a site. For this reason, analyzing these log files is a valuable way to guide an SEO strategy.
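As a minimal illustration of the kind of log analysis described above (not BeeBOT's actual implementation), the sketch below parses access-log lines in the common Apache/Nginx "combined" format and counts how often a given crawler requested each URL. The log format, the `crawler_hits` helper, and the sample lines are all assumptions for the example:

```python
import re
from collections import Counter

# Apache/Nginx "combined" log format (a common default; adjust to your server's config)
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def crawler_hits(lines, bot="Googlebot"):
    """Count how often a given bot requested each URL."""
    hits = Counter()
    for line in lines:
        m = LOG_PATTERN.match(line)
        if m and bot in m.group("agent"):
            hits[m.group("url")] += 1
    return hits

# Hypothetical sample log lines: one Googlebot visit, one regular browser visit
sample = [
    '66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] "GET /blog/article HTTP/1.1" '
    '200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '192.0.2.7 - - [10/Oct/2023:13:56:01 +0000] "GET /blog/article HTTP/1.1" '
    '200 5120 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]
print(crawler_hits(sample))  # only the Googlebot request is counted
```

A real tool would also verify crawler identity (e.g. via reverse DNS), since the user-agent string alone can be spoofed.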
Commissioned by Bee4, we contributed to project management and requirements definition, as well as to the architecture and development of the solution.