Since 2021, I have done more benchmarks than I can count. Almost from the very beginning, I have been browsing job boards, analyzing IT salary reports and wondering how to put this data to use.

And so, right after an interview with a candidate, I would often meticulously type the data into a spreadsheet. Soon everything changed - my work as a recruiter on an external project gave way to data analysis for Bee Talents. I started with recruitment data; sales and finance soon followed, and finally setting and monitoring metrics. I went through it all.

Not a week went by without someone on the team asking: “Are these salary ranges for role X right?”. The initiative usually came from our partners, who were unsure whether the ranges they offered matched market reality - and therefore whether candidates would be interested in the job offer at all. From a single, minor duty, benchmarking soon became one of my main tasks.

The catch is that it was always supposed to be a side job, not my main responsibility. So over time I started looking for ways to automate and speed up benchmarking. I describe each of them below, along with their pros and cons.

1. Wage Wizard (Widełkobot)

At Bee Talents, we have been using Wage Wizard (as we affectionately call this assistant) for a long time. It helps us in sales, but also in recruiters' daily work. It is the fastest way to determine whether salary ranges fit what is currently happening on the market. The assistant evolved together with GPT models - it started out as a search engine and a data scraping tool. I presented its capabilities at one of our webinars, where it decided to rebel and show that both OpenAI and I still have a lot of work ahead of us.

Over time, I realized that the more functions this assistant has to perform at once, the worse it handles each of them individually. Just like a penknife - it seems to do everything, but dedicated tools do each job much better. So I moved from long, complicated prompts to short, simple and precise ones. The instruction itself then has fewer tokens and everything runs faster.
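To give you an idea of what "short and precise" means in practice, here is a minimal sketch of a single-purpose salary prompt using the OpenAI Python SDK. The model name and the prompt wording are my illustration, not the actual Wage Wizard configuration:

```python
# Minimal sketch: one short, single-purpose prompt instead of a do-everything
# assistant. Assumes the official OpenAI Python SDK; the model name and the
# prompt wording are illustrative, not the exact ones Wage Wizard uses.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You are a salary benchmarking assistant. "
    "Given a role and seniority, return the salary range found in the "
    "attached salary reports. Answer with numbers only, no commentary."
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": "Mid Python Developer, B2B, Poland"},
    ],
)
print(response.choices[0].message.content)
```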

Currently, the assistant focuses on data from publicly available salary reports. And no, I do not condone plagiarism or intellectual property theft - that is why I obtained the authors' permission to use the reports in building this assistant and sharing it with you.

Here's an example of what my creation can do:

Screenshot: example of using Wage Wizard with a prompt - Mid Python Developer

Pros:

  • the fastest way to check salary ranges (takes a few seconds),
  • requires no work of your own or knowledge of statistics,
  • is based on professional salary reports.

Cons:

  • mainly applies to IT roles,
  • only works for typical roles.

I promised to share my bot: after filling out the form below, you will receive access to Widełkobot at the email address you provide.

A bot based on generative AI will only be as good as its prompts and inputs. It will usually provide only what we ask for and nothing more. For a more in-depth view of the market situation, I recommend the reports linked at the end of this post.

Additionally, salary reports do not answer every question. They primarily show measures of central tendency, i.e. the mean or the median, and ignore individual results. They are completely ineffective for non-standard roles that are hard to describe with a single technology.
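A quick illustration of the central tendency problem, with made-up numbers: a handful of outlier offers can pull the mean far away from the median, and neither value shows you the individual results:

```python
# Illustration with made-up numbers: the mean and median of the same sample
# can tell very different stories, and neither shows the individual offers.
from statistics import mean, median

# hypothetical monthly rates (PLN) collected for one niche role
rates = [9_000, 10_000, 10_500, 11_000, 12_000, 28_000, 32_000]

print(f"mean:   {mean(rates):,.0f}")    # ~16,071 - pulled up by two outliers
print(f"median: {median(rates):,.0f}")  # 11,000 - ignores the high offers
```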

2. Data scraping using a browser plugin

At this point, we really are going back in time. Generative AI is still in its infancy - or at least out of reach for us consumers. I do these benchmarks by manually typing amounts into Google Sheets. I have nimble fingers and a keyboard with a numpad, and yet it takes a lot of time and breeds frustration. Because that's not what my job is supposed to be! I start looking to GitHub and data scraping apps for salvation.

It turns out that this is too difficult and time-consuming for a greenhorn. This is where the Instant Data Scraper plugin comes in. All you need to do is install it, go to the job offer aggregator of your choice, enter the required criteria and copy the data. After some processing, we have salary information. Before using this tool, read the terms of service of the site you are downloading data from - it is worth making sure you are not breaking its rules by scraping.
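To make that processing step concrete, here is a minimal sketch in Python, assuming the plugin exported the offers to a CSV with a free-text salary column. The column name and the range format are assumptions about one particular job board; adjust them to your own export:

```python
# A sketch of the processing step, assuming Instant Data Scraper exported
# the offers to a CSV with a free-text salary column. The "salary" column
# name and the range format ("15 000 - 22 000 PLN") are assumptions about
# one particular job board; adjust the regex to what you actually scraped.
import re
import pandas as pd

df = pd.read_csv("offers.csv")

def parse_range(text: str) -> tuple[float, float] | None:
    # pull every number out of e.g. "15 000 - 22 000 PLN"
    numbers = [float(n.replace(" ", "")) for n in re.findall(r"\d[\d ]*", str(text))]
    if not numbers:
        return None
    return min(numbers), max(numbers)

parsed = df["salary"].map(parse_range).dropna()
df_parsed = pd.DataFrame(parsed.tolist(), columns=["low", "high"])

# quartiles and median of the lower and upper ends of the ranges
print(df_parsed.describe(percentiles=[0.25, 0.5, 0.75]))
```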

Pros:

  • data matched exactly to our filters,
  • full control over what is included in the analysis.

Cons:

  • the data requires manual processing,
  • gives insight only into a snapshot - i.e. the state of the offers at the moment of download.

This method provides much more detailed data than an AI assistant. The price is a higher entry threshold than typing a simple prompt, plus the need to aggregate and analyze the data yourself.

3. Salary collection application

From the very beginning of my benchmarking, I dreamed of building my own application for aggregating IT salaries. The principle is simple - every so often it downloads current job board ads, extracts the required information and writes it to a database. There, a script checks for duplicates and removes repeated ads. With information about technology, role name and company, the application would allow easy sorting and filtering.

Sounds beautiful. Now - let's draw it:

Diagram: Simplified diagram of the application architecture.

I apologize to all programmers in advance for such a huge simplification. This diagram is meant to demonstrate that creating such an application takes time, an idea, and skill. I have one of those three.
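For the curious, here is a minimal sketch of the core loop from the diagram: fetch ads, fingerprint them to catch duplicates, store the rest. The fetch_ads() function is a hypothetical stand-in for a real scraper or job board API:

```python
# A minimal sketch of the core of such an application: fetch ads, hash them
# to detect duplicates, and store the rest in SQLite. fetch_ads() is a
# hypothetical stand-in for whatever scraper or job board API you plug in.
import hashlib
import sqlite3

def fetch_ads() -> list[dict]:
    # placeholder: in reality this would scrape or call a job board API
    return [
        {"role": "Mid Python Developer", "tech": "Python", "company": "ACME",
         "salary_low": 15000, "salary_high": 22000},
    ]

conn = sqlite3.connect("salaries.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS ads (
        fingerprint TEXT PRIMARY KEY,   -- hash of the ad's key fields
        role TEXT, tech TEXT, company TEXT,
        salary_low INTEGER, salary_high INTEGER
    )
""")

for ad in fetch_ads():
    # duplicate ads (same role, company, range) collapse to one fingerprint,
    # so INSERT OR IGNORE silently skips them
    key = f"{ad['role']}|{ad['company']}|{ad['salary_low']}|{ad['salary_high']}"
    fingerprint = hashlib.sha256(key.encode()).hexdigest()
    conn.execute(
        "INSERT OR IGNORE INTO ads VALUES (?, ?, ?, ?, ?, ?)",
        (fingerprint, ad["role"], ad["tech"], ad["company"],
         ad["salary_low"], ad["salary_high"]),
    )
conn.commit()
```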

And yes, I realize that floating such an idea in a post on a recruitment agency's website may seem off-topic. In my defense, this post shows my experiments with benchmarking and the nature of my work in general - there is a lot of uncertainty and testing, which is ultimately what produces the best solutions. Out of a dozen or so ideas, only one will be a hit - and I have to live with that.

Pros:

  • a huge database after just a few months,
  • the most accurate salary analysis,
  • granular filters that allow for very fine tuning of the analysis.

Cons:

  • high cost of creating and maintaining an application,
  • requires skills and knowledge,
  • results available only after building an MVP.

Building your own app is a long-term investment. If I were starting my data analysis journey today, I would probably build such an app with the help of AI. I have significantly expanded my skill set since then, and my perspective has changed too - I now look at the market much more broadly.

Summary

Each of the above options has its advantages and disadvantages and serves a different purpose. The fastest method, though limited to IT, is our GPT assistant, Widełkobot. A compromise between speed and accuracy is the data scraping plugin. The most accurate, but also the most time-consuming, is building your own application for collecting data from job boards.

I personally recommend method 2, using the Instant Data Scraper plugin. For a small entry threshold and little effort, it gives you a lot of possibilities. Even though my benchmarking era is long behind me, I still use this plugin daily.

Finally, I recommend that you read these sources:

Bulldogjob Report 2025

JustJoinIT Report 2024/2025

NFJ Report 2024/2025

Wage Wizard (Widełkobot)

I also bow to the authors of the reports and thank them for letting me use their work! And if you have any questions after reading this post, feel free to write to me on LinkedIn.