How to optimize Linq query with large number of records?


Please help me optimize the code below. I have tried different approaches but I am not getting a significant performance improvement. There are around 30k entries in the database and it takes around 1 minute to load locally.

var alarms = from healthIssue in _context.HealthIssues.AsNoTracking()
             join asset in _context.Assets.AsNoTracking() on healthIssue.AssetNumber equals asset.SerialNumber into joinedTable
             from data in joinedTable.DefaultIfEmpty()
             select new
             {
                 ID = healthIssue.ID,
                 AssetNumber = healthIssue.AssetNumber,
                 AlarmName = healthIssue.AlarmName,
                 AlarmSeverityLevel = healthIssue.AlarmSeverityLevel,
                 AlarmLastUpdatedTime = healthIssue.AlarmLastUpdatedTime,
                 Crew = data.Crew,
             };
//alarmsViewModelList count is 30k  
var alarmsViewModelList = await alarms.ToListAsync();
//groupedData count = 12k 
var groupedData = alarmsViewModelList.Select(c => new { c.AssetNumber,c.AlarmName}).Distinct().ToList();
// filteralarms' count = 20k 
var filteralarms = (alarmsViewModelList.Where(c => c.AlarmSeverityLevel != AlarmSeverityLevel.Unknown).ToList());
for (int j = 0; j < groupedData.Count; j++)
{
    var alarm = groupedData[j];
    //The line is actually slowing the code.
    var alarmlist = filteralarms.Where(c => c.AlarmName == alarm.AlarmName && c.AssetNumber == alarm.AssetNumber)
                            .Select(c => new
                            {
                                HealthIssueID = c.ID,
                                AlarmLastUpdateDateTime = DateTimeHelpers.FromEpochSecondsUTC(c.AlarmLastUpdatedTime),
                                AlarmSeverityLevel = c.AlarmSeverityLevel,
                            })
                            .OrderByDescending(c => c.AlarmLastUpdateDateTime)
                            .ToList();
    int alarmCount = alarmlist.Count;
    if (alarmCount > 1)
    {
        businessLogicFunction(alarmlist); 
    }

}
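
The costly part is the inner Where, which rescans the ~20k filtered alarms once for every one of the ~12k groups. A minimal sketch of one way around that (not taken from the answers below; it reuses the same variable names as above and groups the filtered alarms once with ToLookup so each group is fetched directly instead of by scanning) would be:

// Build the lookup once; anonymous-type keys compare by value, so the
// groupedData entries can be used directly as lookup keys.
var alarmsByKey = filteralarms.ToLookup(c => new { c.AssetNumber, c.AlarmName });

foreach (var alarm in groupedData)
{
    var alarmlist = alarmsByKey[alarm]
                        .Select(c => new
                        {
                            HealthIssueID = c.ID,
                            AlarmLastUpdateDateTime = DateTimeHelpers.FromEpochSecondsUTC(c.AlarmLastUpdatedTime),
                            AlarmSeverityLevel = c.AlarmSeverityLevel,
                        })
                        .OrderByDescending(c => c.AlarmLastUpdateDateTime)
                        .ToList();

    if (alarmlist.Count > 1)
    {
        businessLogicFunction(alarmlist);
    }
}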

CodePudding user response:

Try it with AsNoTracking and keep the query as an IQueryable; if there is an OrderBy/OrderByDescending, apply it at the end:

var iq_filteralarms = alarms.Where(c => c.AlarmSeverityLevel != AlarmSeverityLevel.Unknown); /* still an IQueryable; AsNoTracking is already applied on the source query */
foreach (var item in alarmsViewModelList.Select(c => new
{
    c.AssetNumber,
    c.AlarmName
}).Distinct())
{
    var iq_alarmlist = iq_filteralarms.Where(c => c.AlarmName == item.AlarmName && c.AssetNumber == item.AssetNumber).Select(c => new
    {
        c.ID,
        c.AlarmLastUpdatedTime,
        c.AlarmSeverityLevel
    });
    if (iq_alarmlist.Count() > 1)
    {
        businessLogicFunction(iq_alarmlist.AsEnumerable().Select(c => new
        {
            HealthIssueID = c.ID,
            AlarmLastUpdateDateTime = DateTimeHelpers.FromEpochSecondsUTC(c.AlarmLastUpdatedTime),
            AlarmSeverityLevel = c.AlarmSeverityLevel,
        }).OrderByDescending(c => c.AlarmLastUpdateDateTime));
    }
}

CodePudding user response:

A simple logical performance gain would be to remove the already-matched alarms as you go, so each later lookup scans a smaller list:

...
var alarm = groupedData[j];
//The line is actually slowing the code.
var matchingAlarms = filteralarms.Where(c => c.AlarmName == alarm.AlarmName && c.AssetNumber == alarm.AssetNumber).ToList();
// Remove the matched alarms from the pool so later iterations have less to scan.
// Except works here because anonymous types compare by value.
filteralarms = filteralarms.Except(matchingAlarms).ToList();
var alarmlist = matchingAlarms.Select
                         (c => new
                         {
...

CodePudding user response:

This is what I can do with LINQ:

  //alarmsViewModelList count is 30k  
var alarmsViewModelList = await alarms.ToListAsync();
//groupedData is almost 12k 
var groupedData = alarmsViewModelList.Select(c => new { c.AssetNumber,c.AlarmName}).Distinct().ToList();
// filteralarms' count is almost 20k 
var filteralarms = alarmsViewModelList.Where(c => c.AlarmSeverityLevel != AlarmSeverityLevel.Unknown).OrderByDescending(c => DateTimeHelpers.FromEpochSecondsUTC(c.AlarmLastUpdatedTime));
for (int j = 0; j < groupedData.Count; j++)
{
    var alarm = groupedData[j];
    //The line is actually slowing the code.
    var alarmlist = filteralarms.Where(c => c.AlarmName == alarm.AlarmName && c.AssetNumber == alarm.AssetNumber);
    
    if (alarmlist.Count() > 1)
    {
        businessLogicFunction(alarmlist.Select
                            (c => new
                            {
                                HealthIssueID = c.ID,
                                AlarmLastUpdateDateTime = DateTimeHelpers.FromEpochSecondsUTC(c.AlarmLastUpdatedTime),
                                AlarmSeverityLevel = c.AlarmSeverityLevel,
                                
                            }).ToList()); 
    }

}

And if you can, you can make it faster by removing the ToList() when calling businessLogicFunction, like this:

businessLogicFunction(alarmlist.Select
                        (c => new
                        {
                            HealthIssueID = c.ID,
                            AlarmLastUpdateDateTime = DateTimeHelpers.FromEpochSecondsUTC(c.AlarmLastUpdatedTime),
                            AlarmSeverityLevel = c.AlarmSeverityLevel,

                        })); 
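
For that last call to compile, businessLogicFunction has to accept a deferred sequence rather than a List, and in practice it would take a named element type rather than the anonymous projection shown above. The real signature and types are not shown in the question; a hypothetical sketch (assuming ID is an int and DateTimeHelpers.FromEpochSecondsUTC returns a DateTime) of what would support the ToList()-free call is:

// Hypothetical DTO and signature - the real businessLogicFunction is not shown in the
// question. The point is only the parameter type: IEnumerable<AlarmEntry> lets the
// caller pass the query without ToList(), so it is enumerated lazily inside the method.
public sealed class AlarmEntry
{
    public int HealthIssueID { get; set; }
    public DateTime AlarmLastUpdateDateTime { get; set; }
    public AlarmSeverityLevel AlarmSeverityLevel { get; set; }
}

void businessLogicFunction(IEnumerable<AlarmEntry> alarmlist)
{
    foreach (var alarm in alarmlist)
    {
        // ... per-alarm business logic ...
    }
}

The caller would then project with .Select(c => new AlarmEntry { ... }) instead of the anonymous type used in the snippets above.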