Handling concurrency in Entity Framework Core to prevent race conditions


I am running a test case that posts two schedule items concurrently, like below:

var itemToAdd = new ScheduleInputItemDto
{
    Start = DateTime.UtcNow,
    End = DateTime.UtcNow.AddHours(1),
    ProdType = "Prod1"
};

var addItemRequest = (string scheduleId) => NewRequest
    .AddRoute("schedule/items")
    .AddQueryParams("scheduleId", scheduleId);

var response = await Task.WhenAll(
    addItemRequest.Post(itemToAdd, false),
    addItemRequest.Post(itemToAdd, false));

This posts two items with the same start and end time. The two requests race, and in this case only one of them should be inserted. The request then reaches the service layer:

public async Task<ScheduleResponseDto> AddItemToSchedule(int scheduleId, ScheduleInputItemDto scheduleItem)
{
    var scheduleWithId = await _scheduleRepository.GetScheduleById(scheduleId);
    scheduleWithId.AddItem(
        start: scheduleItem.Start,
        end: scheduleItem.End,
        cementType: scheduleItem.CementType,
        now: DateTime.UtcNow);
    await _scheduleRepository.Update(scheduleWithId);
    return scheduleWithId.MapToScheduleDto();
}

AddItem is defined on the Schedule entity, shown below:

public Schedule()
{
    ScheduleItems = new List<ScheduleItem>();
}

public Schedule(int plantCode, DateTime now)
{
    PlantCode = plantCode;
    UpdatedOn = now;
    ScheduleItems = new List<ScheduleItem>();
}
public int ScheduleId { get; set; }
public int PlantCode { get; set; }
public DateTime UpdatedOn { get; set; }
public ICollection<ScheduleItem> ScheduleItems { get; set; }

public void AddItem(DateTime start, DateTime end, string cementType, DateTime now)
{
    var item = new ScheduleItem(start, end, cementType, now);
    // Validate against the existing items before adding, so the new item
    // is not compared against itself.
    item.ValidateDoesNotOverlapWithItems(new ConcurrentBag<ScheduleItem>(ScheduleItems));
    ScheduleItems.Add(item);
}

But in my case both items end up inserted, despite the following validation:

public static void ValidateDoesNotOverlapWithItems(this ScheduleItem currentItem, ConcurrentBag<ScheduleItem> scheduleItems)
{
    if (scheduleItems.Any(scheduleItem => currentItem.Start < scheduleItem.End && scheduleItem.Start < currentItem.End))
    {
        throw new ValidationException("A conflict happened.");
    }
}

The ScheduleItem model has a property named UpdatedOn which can be used as a concurrency token.

After debugging the test case, I saw that both items posted from inside .WhenAll() have the exact same DateTime. How can I prevent the later item from being inserted? Should optimistic or pessimistic concurrency control be used in this case?

CodePudding user response:

In summary, when using a database with multiple actors (e.g. threads, requests), it is hard to ensure that the data doesn't change between a read (required for validation) and a write (insert/update).

In my opinion, the only sure way to handle race conditions is to try the insert/update and act on failures.

There is also a path using isolation levels and transactions, where the row (or the whole table) is locked for the whole unit of work (read, do things, write). For this to work, strong change control is required so that the system is not inadvertently broken by changing one piece without knowing that another part depends on it.
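As a rough illustration of that path, a serializable transaction can make the read-validate-write sequence behave as if it ran alone. The sketch below assumes an EF Core `DbContext` (`_context` is hypothetical, as is the query shape); a competing request will block or fail at commit rather than slip in between the read and the write:

```csharp
// Hypothetical sketch: _context is assumed to be the DbContext behind the
// repository. The Serializable isolation level prevents another request from
// inserting an overlapping item between our read and our SaveChanges.
using var tx = await _context.Database.BeginTransactionAsync(
    System.Data.IsolationLevel.Serializable);

var schedule = await _context.Schedules
    .Include(s => s.ScheduleItems)
    .SingleAsync(s => s.ScheduleId == scheduleId);

schedule.AddItem(start, end, cementType, DateTime.UtcNow); // throws on overlap
await _context.SaveChangesAsync();
await tx.CommitAsync();
```

Note that serializable transactions trade throughput for safety: the competing request waits (or deadlocks and retries) instead of failing fast.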

Optimistic concurrency model

One simple way to deal with concurrency for database updates is to use a token that helps detect conflicts. For example:

update x set y = z, timestamp = now where timestamp = timestamp_from_my_last_read;

If the number of updated rows is 0, you know someone else updated the row, and you need to retry, report an error, or do whatever else is required. The token doesn't have to be a timestamp; it can be anything, even all of the column values. The key element is the `where something = something_value_when_I_last_read_this_row` predicate.

This method is called optimistic because it assumes things will be OK and reacts to failures, rather than assuming from the start that things will go wrong.
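The same compare-against-what-I-last-read idea can be sketched in memory with `Interlocked.CompareExchange`, which only writes if the current value still equals the value the caller observed. The `VersionedValue` type below is illustrative, not part of EF or the question's code:

```csharp
using System;
using System.Threading;

// In-memory analogue of "UPDATE ... WHERE token = token_from_my_last_read":
// CompareExchange bumps the version only if it still matches what we read,
// so a losing writer finds out immediately and can retry.
class VersionedValue
{
    private int _version;          // plays the role of the timestamp/token column
    public int Value { get; private set; }

    // Returns true if this writer won; false means another writer changed
    // the "row" since observedVersion was read, and the caller should retry.
    // (The demo is single-threaded; Value itself is not written atomically.)
    public bool TryUpdate(int observedVersion, int newValue)
    {
        if (Interlocked.CompareExchange(ref _version, observedVersion + 1, observedVersion)
            != observedVersion)
        {
            return false; // token no longer matches: conflict detected
        }
        Value = newValue;
        return true;
    }

    public int ReadVersion() => Volatile.Read(ref _version);
}

class Program
{
    static void Main()
    {
        var row = new VersionedValue();
        int v = row.ReadVersion();
        Console.WriteLine(row.TryUpdate(v, 42));  // True: first writer wins
        Console.WriteLine(row.TryUpdate(v, 99));  // False: stale token, must retry
    }
}
```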

Some ORMs, including Entity Framework, natively support this kind of concurrency handling. See EF Core's Handling Concurrency Conflicts documentation.
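Since the question already has an UpdatedOn property, a hedged sketch of wiring it up as the concurrency token could look like this (EF Core assumed; `_context` and the surrounding variables are illustrative):

```csharp
// 1. In OnModelCreating, mark UpdatedOn as the concurrency token:
modelBuilder.Entity<Schedule>()
    .Property(s => s.UpdatedOn)
    .IsConcurrencyToken();

// 2. EF Core then generates "... WHERE UpdatedOn = <value read earlier>" on
//    update, and a lost race surfaces as DbUpdateConcurrencyException:
try
{
    schedule.AddItem(start, end, cementType, DateTime.UtcNow); // throws on overlap
    schedule.UpdatedOn = DateTime.UtcNow; // bump the token on every write
    await _context.SaveChangesAsync();
}
catch (DbUpdateConcurrencyException)
{
    // Someone else updated the schedule first: reload the entity,
    // re-run the overlap validation, then retry or report a conflict.
}
```

With this in place, the second of the two racing requests in the test fails at SaveChangesAsync instead of silently inserting an overlapping item.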

Data store in the memory of a single process

If your data store lives in the memory of a single process, you can protect against race conditions with locking.

Let's consider this method

void Insert(X toInsert)
{
    lock (collectionLock)
    {
        var collection = GetCollection();
        if (ValidateOrThrow(collection, toInsert))
        {
            collection.Insert(toInsert);
        }
    }
}

As long as:

  • all code paths that wish to insert use the Insert method
  • other methods (Delete, etc.) also use collectionLock when interacting with the collection

then it is guaranteed that the collection is not modified between GetCollection and .Insert.
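The pattern above can be exercised end-to-end with two concurrent callers posting the same time slot; exactly one insert succeeds. The `Slot` and `ScheduleStore` names below are illustrative, not from the question's code base:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Illustrative types: a time slot and a store that validates-then-inserts
// as one atomic step under a lock.
record Slot(DateTime Start, DateTime End);

class ScheduleStore
{
    private readonly object _collectionLock = new object();
    private readonly List<Slot> _items = new List<Slot>();

    public bool TryInsert(Slot toInsert)
    {
        lock (_collectionLock)
        {
            foreach (var item in _items)
            {
                // Same overlap test as the question's validator.
                if (toInsert.Start < item.End && item.Start < toInsert.End)
                    return false; // conflict: reject the losing writer
            }
            _items.Add(toInsert);
            return true;
        }
    }
}

class Program
{
    static async Task Main()
    {
        var store = new ScheduleStore();
        var slot = new Slot(DateTime.UtcNow, DateTime.UtcNow.AddHours(1));

        // Both tasks race, but the lock serializes validate+insert,
        // so exactly one can succeed.
        var results = await Task.WhenAll(
            Task.Run(() => store.TryInsert(slot)),
            Task.Run(() => store.TryInsert(slot)));

        Console.WriteLine(results[0] ^ results[1]); // True: one won, one lost
    }
}
```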

CodePudding user response:

List<ScheduleItem> is not thread-safe. Try using a ConcurrentBag<ScheduleItem> instead.
