I'm trying to design an autoscaling policy to scale out/in the number of consumers listening to a queue. My first instinct was to base the scaling policy on EnqueueTime: scale out when it is too high and scale in when it is low.
However, the way EnqueueTime appears in CloudWatch does not seem to match my expectations. The documentation defines EnqueueTime as:
The end-to-end latency from when a message arrives at a broker until it is delivered to a consumer.
Note:
EnqueueTime does not measure the end-to-end latency from when a message is sent by a producer until it reaches the broker, nor the latency from when a message is received by a broker until it is acknowledged by the broker. Rather, EnqueueTime is the number of milliseconds from the moment a message is received by the broker until it is successfully delivered to a consumer.
I had expected EnqueueTime to represent how long a message "waits" in the queue until it is consumed, but from the screenshot it is not clear to me how the supposed "wait time" can be 1.9s when there is nothing in the queue and no message production (EnqueueCount = 0). I also don't understand why EnqueueTime does not change well after the spike in traffic (the green spike); I expected the value to drop close to 0ms once the spike passed. This matters for scaling: if the metric never changes, the policy might erroneously scale out even though there is no traffic.
I'm also new to ActiveMQ and not entirely familiar with its operation. I would greatly appreciate it if somebody could explain what's going on here and how to properly interpret EnqueueTime.
CodePudding user response:
EnqueueTime does represent how long messages "wait" in the queue until they are consumed, but it is important to note that it is an average. Because no new samples are folded in when traffic stops, the average simply holds its last value, which is why you still see 1.9s long after the spike. It is also unlikely to fit your use-case, because the relative "weight" of each message's individual EnqueueTime changes over time: it won't give you a reliably clear picture of the queue's current load relative to consumption. A queue-depth metric such as QueueSize is likely a better basis for a scaling policy.
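One way to see why the reading can sit at 1.9s long after traffic stops: with a cumulative average, old samples keep dominating until new messages arrive, and the value never decays on its own. This is an illustrative sketch only (the class and sample values are assumptions, not ActiveMQ's actual implementation):

```python
class CumulativeAverage:
    """Running average of every sample seen so far; it only moves
    when a new sample is recorded, never with the passage of time."""

    def __init__(self):
        self.count = 0
        self.total = 0.0

    def record(self, value):
        self.count += 1
        self.total += value

    @property
    def average(self):
        return self.total / self.count if self.count else 0.0


avg = CumulativeAverage()
# A short spike of slow deliveries (hypothetical wait times in ms).
for wait_ms in [1500, 2100, 2100]:
    avg.record(wait_ms)

print(avg.average)  # 1900.0 -- and it stays 1900.0 while no messages arrive
```

After the loop, the average reads 1900.0 (1.9s) and will keep reading 1900.0 indefinitely until the broker delivers another message, mirroring the flat line in the screenshot.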