RabbitMQ throws an exception when sending to the DLQ (max frame_size)


When I get a message from the queue and an exception is thrown, I want to receive the message again. So I configured my consumer with a DLQ:

spring:
  cloud:
    stream:
      bindings:
        to-send-output:
          destination: to-send-event
          producer:
            required-groups:
              - to-send-event-group
        to-send-input:
          destination: to-send-event
          group: to-send-event-group
          consumer:
            max-attempts: 1
            requeueRejected: true
      rabbit:
        bindings:
          #forever retry
          to-send-input:
            consumer:
              autoBindDlq: true
              dlqTtl: 5000
              dlqDeadLetterExchange:
              maxConcurrency: 300
              frameMaxHeadroom: 25000 #added this as in documentation

I added the frameMaxHeadroom: 25000 property as described in the documentation, but it still does not work.

My Spring Cloud version: springCloudVersion = "Hoxton.RELEASE".

My dependency:

dependencies {
...
    implementation "org.springframework.cloud:spring-cloud-starter-stream-rabbit"
...
}

In the GitHub repository I can see the frameMaxHeadroom property in the properties file.

UPD: I see that the code truncates the stack trace by the value I set (the frameMaxHeadroom variable). I expected that it would not shrink the stack trace but instead increase the space available for the consumer's headers, as written in the documentation. Why doesn't it work the way I expect?

CodePudding user response:

frame_max is negotiated between the AMQP client and the server; all headers must fit in one frame. You can increase it with broker configuration.
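For example, a minimal sketch of raising the limit on the broker side in rabbitmq.conf (the value is in bytes; 131072 is the default, and the number below is only an illustration):

# frame_max is negotiated down to the smaller of the client's and the
# broker's value, so raise it on the broker side (bytes; default 131072)
frame_max = 262144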

Stack traces can be large and can easily exceed the frame_max on their own; to leave room for the other headers, the framework keeps at least 20,000 bytes (by default) free by truncating the stack-trace header if necessary.
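This is what the UPD observation describes. A minimal sketch (not the framework's actual code) of that arithmetic: the stack-trace header is cut so that at least frameMaxHeadroom bytes of the negotiated frame_max stay free for the other headers.

import java.io.PrintWriter;
import java.io.StringWriter;

public class HeadroomSketch {

    public static void main(String[] args) {
        int frameMax = 131_072;   // value negotiated with the broker (RabbitMQ default)
        int headroom = 20_000;    // default frameMaxHeadroom
        int maxStackTraceLength = frameMax - headroom;

        // Render the exception's stack trace as a single string.
        String stackTrace = stackTraceOf(new RuntimeException("boom"));

        // Truncate so that `headroom` bytes of the frame remain for other headers.
        if (stackTrace.length() > maxStackTraceLength) {
            stackTrace = stackTrace.substring(0, maxStackTraceLength);
        }
        System.out.println("stack trace header length: " + stackTrace.length());
    }

    private static String stackTraceOf(Throwable t) {
        StringWriter sw = new StringWriter();
        t.printStackTrace(new PrintWriter(sw, true));
        return sw.toString();
    }
}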

If you are still exceeding your frame_max, you must have other large headers; you need to increase the headroom to allow for those headers, so that the stack trace is truncated further.
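For example, a sketch of raising the headroom further on the consumer binding (the right value depends on the size of your other headers; 50000 here is only an illustration):

spring:
  cloud:
    stream:
      rabbit:
        bindings:
          to-send-input:
            consumer:
              frameMaxHeadroom: 50000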
