Why does the "hatch rate" matter when performance testing?


I'm using Locust for performance testing. It has two parameters: the number of users and the rate at which those users are spawned. Why aren't the users simply generated all at once? Why does the rate make a difference?

CodePudding user response:

Looking at the Locust Configuration Options, I think the option you are after is --spawn-rate (it was called hatch rate in Locust versions before 1.0).
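
For context, here is a minimal sketch of how the two parameters are passed on the command line; the locustfile contents, host, and wait times are illustrative, not taken from the question:

```python
# locustfile.py -- a minimal sketch; the endpoint and wait times are placeholders.
from locust import HttpUser, task, between

class WebsiteUser(HttpUser):
    wait_time = between(1, 5)  # each simulated user pauses 1-5 s between tasks

    @task
    def index(self):
        self.client.get("/")

# Run headless with a gradual ramp-up, for example:
#   locust -f locustfile.py --headless --users 1000 --spawn-rate 10 \
#          --host https://example.com
# --spawn-rate 10 starts 10 new users per second until --users (1000) is reached.
```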

Coming back to your question: in the performance-testing world, the more common term for this is ramp-up.

The idea is to increase the load gradually, because this lets you correlate other performance metrics, such as response time and throughput, with the increasing load.

If you release 1000 users at once you get a limited view: you can only answer the question of whether your system supports 1000 users or not. You won't be able to tell what the maximum supported number is, where the saturation point lies, and so on.

When you increase the load gradually you can make statements like the following (one way to script such a stepped ramp-up is sketched after the list):

  1. Up to 250 users the system behaves normally: response time stays constant and throughput grows as the load increases
  2. Beyond 250 users, response time starts growing
  3. Beyond 400 users, response time starts exceeding acceptable thresholds
  4. Beyond 600 users, errors start occurring
  5. etc.
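
One way to script such a stepped ramp-up is Locust's LoadTestShape class. The class name, step sizes, and durations below are illustrative:

```python
# A stepped ramp-up sketch using Locust's LoadTestShape; the numbers are
# placeholders, not taken from the answer above.
from locust import LoadTestShape

class StepLoadShape(LoadTestShape):
    step_users = 50    # add 50 users at each step
    step_time = 60     # hold each step for 60 seconds
    max_users = 1000   # stop once this level has been exercised

    def tick(self):
        run_time = self.get_run_time()
        if run_time > (self.max_users / self.step_users) * self.step_time:
            return None  # returning None ends the test
        current_step = int(run_time // self.step_time) + 1
        users = min(current_step * self.step_users, self.max_users)
        return (users, self.step_users)  # (target user count, spawn rate)
```

When a LoadTestShape subclass is present in the locustfile, Locust derives the load from tick() rather than from the --users/--spawn-rate command-line options.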

Similarly, if you decrease the load gradually, you can tell whether the system returns to normal as the load goes down.
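
The same mechanism can drive a ramp-down. A sketch, again with illustrative stage values:

```python
# A ramp-up-then-down sketch: climb to a peak, then step back down to observe
# whether metrics return to their baseline. All numbers are placeholders.
from locust import LoadTestShape

class UpDownShape(LoadTestShape):
    # (end of stage in seconds, target users, spawn rate)
    stages = [
        (120, 400, 10),   # ramp up to 400 users
        (240, 600, 10),   # push to the peak
        (360, 200, 10),   # ramp back down
        (420, 0, 10),     # drain to zero before stopping
    ]

    def tick(self):
        run_time = self.get_run_time()
        for end_time, users, rate in self.stages:
            if run_time < end_time:
                return (users, rate)
        return None  # past the last stage: end the test
```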
