C#: Throttle/rate limit outgoing HTTP requests with Polly

I am developing an integration solution that accesses a rate-limited API. I am performing a variety of CRUD operations against the API using multiple HTTP verbs on different endpoints (all on the same server, though). I have been pointed towards Polly multiple times, but I haven't managed to come up with a solution that actually works.

This is what I have in my startup:

builder.Services
    .AddHttpClient("APIClient", client =>
    {
        client.BaseAddress = new Uri(C.Configuration.GetValue<string>("APIBaseAddress"));
    })
    .AddTransientHttpErrorPolicy(builder => 
        builder.WaitAndRetryAsync(new []
        {
           TimeSpan.FromSeconds(1),
           TimeSpan.FromSeconds(5),
           TimeSpan.FromSeconds(15),
        }));

This is just resilience, retrying in case of transient failures. I have a RateLimit policy in a singleton ApiWrapper class:

public sealed class ApiWrapper
{
    private static readonly Lazy<ApiWrapper> lazy = new Lazy<ApiWrapper>(() => new ApiWrapper());
    public static ApiWrapper Instance { get { return lazy.Value; } }
    private IHttpClientFactory _httpClientFactory;
    public readonly AsyncRateLimitPolicy RateLimit = Policy.RateLimitAsync(150, TimeSpan.FromSeconds(10), 50); // 150 actions within 10 sec, 50 burst

    private ApiWrapper()
    {
    }

    public void SetFactory(IHttpClientFactory httpClientFactory)
    {
        _httpClientFactory = httpClientFactory;
    }

    public HttpClient GetApiClient()
    {
        return _httpClientFactory.CreateClient("APIClient");
    }
}

That policy is used in multiple other classes like this:

public class ApiConsumer
{
    private HttpClient _httpClient = ApiWrapper.Instance.GetApiClient();

    public async Task<bool> DoSomethingWithA(List<int> customerIDs)
    {
        foreach (int id in customerIDs)
        {
            HttpResponseMessage httpResponse = await ApiWrapper.Instance.RateLimit.ExecuteAsync(() => _httpClient.GetAsync($"http://some.endpoint"));
        }
        return true;
    }
}

My expectation was that the rate limiter would not fire more requests than configured, but that does not seem to be the case. As far as I understand, the rate limiter simply throws an exception once there are more calls than the configured limit allows. That's where I thought the retry policy would come into play: just try again after 5 or 15 seconds if the call did not make it through the limiter.
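
For illustration, this is roughly the combination I had in mind, as a sketch only rather than my actual wiring. Note that AddTransientHttpErrorPolicy above only handles transient HTTP errors (HttpRequestException, 5xx, 408), so it would never see the limiter's RateLimitRejectedException unless a retry is wrapped around the limiter explicitly:

// requires: using Polly; using Polly.RateLimit;

// The rate limiter rejects over-limit calls by throwing RateLimitRejectedException.
var rateLimit = Policy.RateLimitAsync(150, TimeSpan.FromSeconds(10), 50);

// A retry that specifically handles the limiter's rejection.
var retryOnRateLimit = Policy
    .Handle<RateLimitRejectedException>()
    .WaitAndRetryAsync(new[]
    {
        TimeSpan.FromSeconds(1),
        TimeSpan.FromSeconds(5),
        TimeSpan.FromSeconds(15),
    });

// Retry must be the outer policy so it sees the exception thrown by the inner limiter;
// if the limiter is still saturated after the last retry, the exception propagates.
var throttled = retryOnRateLimit.WrapAsync(rateLimit);

HttpResponseMessage httpResponse = await throttled.ExecuteAsync(
    () => _httpClient.GetAsync("http://some.endpoint"));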

Then I played around a bit with Polly's Bulkhead policy, but as far as I can see that is meant to limit the number of parallel executions rather than the request rate.
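
For reference, that looks something like the sketch below; it caps concurrency and queue length, not requests per unit of time:

// requires: using Polly; using Polly.Bulkhead;

// Sketch: at most 10 parallel executions, with up to 100 actions queued;
// anything beyond that is rejected with BulkheadRejectedException.
var bulkhead = Policy.BulkheadAsync(10, 100);

HttpResponseMessage httpResponse = await bulkhead.ExecuteAsync(
    () => _httpClient.GetAsync("http://some.endpoint"));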

I have multiple threads that may use different HttpClients (all created by the factory, as in the example above) with different methods and endpoints, but they all use the same policies. Some threads run in parallel, some sequentially, as I have to wait for their responses before sending the next requests.

Any suggestions on how this can or should be achieved with Polly? (Or any other extension if there is good reason to)

CodePudding user response:

In this post I would like to clarify things around the rate limiter and the rate gate concepts.

Similarity

  • Both concepts can be used to throttle requests.
  • They sit between the clients and the server and they know about the server's capacity.

Difference

  • The limiter, as its name implies, limits the transient traffic: it short-circuits requests if there are too many.

  • The gate, on the other hand, holds/delays requests until there is enough capacity (see the sketch after this list).
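
A minimal sketch of the difference in calling code. Polly's limiter is used for the first case; `gate`, `WaitForCapacityAsync`, and `CallApiAsync` are placeholders, not library APIs:

// requires: using Polly.RateLimit;

// Limiter: an over-limit call never starts; it is rejected immediately.
try
{
    await rateLimitPolicy.ExecuteAsync(() => CallApiAsync());
}
catch (RateLimitRejectedException rejected)
{
    // The request was short-circuited; RetryAfter hints how long to back off.
    Console.WriteLine($"Rejected, retry after {rejected.RetryAfter}");
}

// Gate: the caller is simply held until there is capacity again,
// so the call always goes out eventually and no retry logic is needed.
await gate.WaitForCapacityAsync();   // hypothetical gate component
await CallApiAsync();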

Algorithms

CodePudding user response:

Thanks again to @Neil and @Panagiotis for pointing me in the right direction. I wrongly assumed that the Polly rate limiter would actually delay API calls. I found a workaround that probably isn't particularly nice, but it does the trick for my purposes.

I installed David Desmaisons' RateLimiter package, which is super simple to use. In my singleton I now have this:

public TimeLimiter RateLimiter = TimeLimiter.GetFromMaxCountByInterval(150, TimeSpan.FromSeconds(10));

I use this RateLimiter everywhere I make calls to an API endpoint like this:

HttpResponseMessage httpResponse = await ApiWrapper.Instance.RateLimiter.Enqueue(() => _httpClient.GetAsync($"http://some.endpoint"), _cancellationToken);

Does exactly what I originally expected from Polly.
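
For completeness, a rough sketch of how my singleton looks now: the same ApiWrapper as in the question, with the TimeLimiter replacing the Polly rate-limit policy (assuming the package's RateLimiter namespace):

using System;
using System.Net.Http;
using RateLimiter; // David Desmaisons' RateLimiter package (TimeLimiter)

public sealed class ApiWrapper
{
    private static readonly Lazy<ApiWrapper> lazy = new Lazy<ApiWrapper>(() => new ApiWrapper());
    public static ApiWrapper Instance { get { return lazy.Value; } }
    private IHttpClientFactory _httpClientFactory;

    // Queues callers and releases them so that no more than 150 calls
    // are started per 10-second window.
    public readonly TimeLimiter RateLimiter =
        TimeLimiter.GetFromMaxCountByInterval(150, TimeSpan.FromSeconds(10));

    private ApiWrapper()
    {
    }

    public void SetFactory(IHttpClientFactory httpClientFactory)
    {
        _httpClientFactory = httpClientFactory;
    }

    public HttpClient GetApiClient()
    {
        return _httpClientFactory.CreateClient("APIClient");
    }
}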
