Calling an API 20 times to generate a list, best practice?


The case is that I call an API once to get a list of tenants, then for each tenant I must call again to get a list of usages for the tenant. Unfortunately there is no way to get usages for all tenants in a single call.

Now I wish to try to save time by making these calls concurrent. Then put them all together after the last one arrives. Here is what my attempt looks like so far:

public async Task<List<Usage>> GetUsagesAsync2(Task<List<Tenant>> tenants)
{
    List<Usage> usages = new List<Usage>();

    foreach (var tenant in await tenants)
    {
        //Generate request
        RestRequest request = new RestRequest("tenants/{tenantID}/usages", Method.Get);
        request.AddParameter("tenantID", tenant.id, ParameterType.UrlSegment);
        request.AddHeader("Authorization", $"Bearer {Token.access_token}");

        //Get response
        RestResponse response = await _client.ExecuteAsync(request)
            .ConfigureAwait(false);

        //Validate response
        if (response.StatusCode != HttpStatusCode.OK)
            throw new Exception("Failed at Getting usages for a tenant: ");

        //Add to list
        var t_usage = JsonConvert.DeserializeObject<Wrapper<Usage>>(response.Content);
        usages.AddRange(t_usage.items);
    }
    return usages;
}

The code runs and does what it is supposed to, but I am not sure the calls are actually running concurrently. It takes about 7-8 seconds, which I find a bit long to wait on a webpage.

CodePudding user response:

Here is a parallelized implementation:

.NET 6

public async Task<ConcurrentBag<Usage>> GetUsagesAsync2(Task<List<Tenant>> tenants)
{
    ConcurrentBag<Usage> usages = new ConcurrentBag<Usage>();

    await Parallel.ForEachAsync(await tenants, async (tenant, cancellationToken) =>
    {
        //Generate request
        RestRequest request = new RestRequest("tenants/{tenantID}/usages", Method.Get);
        request.AddParameter("tenantID", tenant.id, ParameterType.UrlSegment);
        request.AddHeader("Authorization", $"Bearer {Token.access_token}");

        //Get response
        RestResponse response = await _client.ExecuteAsync(request, cancellationToken).ConfigureAwait(false);

        //Validate response
        if (response.StatusCode != HttpStatusCode.OK)
            throw new Exception($"Failed at getting usages for tenant: {tenant.id}");

        //Add to list (ConcurrentBag<T> has no AddRange, so add items one by one)
        var t_usage = JsonConvert.DeserializeObject<Wrapper<Usage>>(response.Content);
        foreach (var item in t_usage.items)
            usages.Add(item);
    });

    return usages;
}

Pre-.NET 6

public async Task<ConcurrentBag<Usage>> GetUsagesAsync2(Task<List<Tenant>> tenants)
{
    ConcurrentBag<Usage> usages = new ConcurrentBag<Usage>();
    var tasks = new List<Task>();

    foreach (var tenant in await tenants)
    {
        tasks.Add(Task.Run(async () =>
        {
            //Generate request
            RestRequest request = new RestRequest("tenants/{tenantID}/usages", Method.Get);
            request.AddParameter("tenantID", tenant.id, ParameterType.UrlSegment);
            request.AddHeader("Authorization", $"Bearer {Token.access_token}");

            //Get response
            RestResponse response = await _client.ExecuteAsync(request)
                .ConfigureAwait(false);

            //Validate response
            if (response.StatusCode != HttpStatusCode.OK)
                throw new Exception($"Failed at getting usages for tenant: {tenant.id}");

            //Add to list (ConcurrentBag<T> has no AddRange, so add items one by one)
            var t_usage = JsonConvert.DeserializeObject<Wrapper<Usage>>(response.Content);
            foreach (var item in t_usage.items)
                usages.Add(item);
        }));
    }

    await Task.WhenAll(tasks);

    return usages;
}

Take note that List<Usage> is changed to ConcurrentBag<Usage>, because List<T> is not thread safe when multiple requests complete and add items at the same time.
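
By default Parallel.ForEachAsync picks its own degree of parallelism, which may still be more concurrent requests than the API is happy with. If you need to cap it, there is an overload that takes a ParallelOptions instance. A minimal sketch on top of the .NET 6 method above; the limit of 5 is an assumed example value, not something from the question:

//Cap concurrent requests; 5 is an assumed example value, tune it for the API
var options = new ParallelOptions { MaxDegreeOfParallelism = 5 };

await Parallel.ForEachAsync(await tenants, options, async (tenant, cancellationToken) =>
{
    //unchanged loop body from the .NET 6 example above
});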

CodePudding user response:

What you could basically do is:

IEnumerable<Task<List<Usage>>> loadTasks = tenants.Select(LoadUsages);
List<Usage>[] usages = await Task.WhenAll(loadTasks);

async Task<List<Usage>> LoadUsages(Tenant t) {
    // your web call goes here
    return t_usage.items;
}
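
Filling in the web call with the RestSharp code from the question, LoadUsages could look roughly like this (a sketch reusing the _client field and the Token and Wrapper<Usage> types from the original snippet):

async Task<List<Usage>> LoadUsages(Tenant t)
{
    RestRequest request = new RestRequest("tenants/{tenantID}/usages", Method.Get);
    request.AddParameter("tenantID", t.id, ParameterType.UrlSegment);
    request.AddHeader("Authorization", $"Bearer {Token.access_token}");

    RestResponse response = await _client.ExecuteAsync(request).ConfigureAwait(false);

    if (response.StatusCode != HttpStatusCode.OK)
        throw new Exception($"Failed at getting usages for tenant: {t.id}");

    //Each call returns its own list, so no thread-safe collection is needed here
    var t_usage = JsonConvert.DeserializeObject<Wrapper<Usage>>(response.Content);
    return t_usage.items;
}

Since every task returns its own List<Usage> and Task.WhenAll only combines the results afterwards, there is no shared mutable state and no need for ConcurrentBag with this approach. If you want a single flat list at the end: var allUsages = usages.SelectMany(x => x).ToList();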

But, as pointed out in the comments, this is not throttled and might issue far too many requests at once. If you are sure the number of tenants will stay around 20, this should be fine. Otherwise you will need to limit concurrency yourself, for example with batching or by throttling the calls (see the sketch below).
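
One straightforward way to add that throttling is a SemaphoreSlim that each call has to enter before it runs. A minimal sketch on top of LoadUsages above; the limit of 5 concurrent requests is an example value, not something from the question:

//Allow at most 5 requests in flight at a time (example value)
var throttle = new SemaphoreSlim(5);

IEnumerable<Task<List<Usage>>> loadTasks = tenants.Select(async t =>
{
    await throttle.WaitAsync();
    try
    {
        return await LoadUsages(t);
    }
    finally
    {
        throttle.Release();
    }
});

List<Usage>[] usages = await Task.WhenAll(loadTasks);

This keeps the Task.WhenAll shape of the code, but never lets more than the chosen number of requests run at once.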
