var recordsStates =
    from s in db.Record
    join ss in db.Record_State on s.Id equals ss.RecordId
    join apct in db.RecordTypesContentMapping
        on ss.RecordId equals apct.ContentId into ss_apct
    from apct in ss_apct.DefaultIfEmpty()
    where (apct.ContentSourceName == null || apct.ContentSourceName == nameof(Record))
        && ss.isCurrent && ss.StateId == (int)RecordStateType.Publish
        && !ss.isDeleated && !s.isDeleted
        && (searchRecords.CategoryIds.Count == 0 || searchRecords.CategoryIds.Contains((int)s.CategoryId))
        && (string.IsNullOrEmpty(searchRecords.RecordTitle)
            || (string.IsNullOrEmpty(s.RecordNameModified)
                ? s.RecordName.Contains(searchRecords.RecordTitle)
                : s.RecordNameModified.Contains(searchRecords.RecordTitle)))
    select new { s, ss, apct };
Each table has around 1 million records.
The query takes around 7-8 seconds when RecordTitle is empty and 4-5 seconds when it is not.
I tried applying nonclustered (NC) indexes on the title column, which is of type nvarchar(1000).
Every table is related via foreign keys, and I don't know what makes it slow.
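For reference, this is how the nonclustered indexes on the title columns could be declared through EF Core's fluent model configuration (a sketch; the entity and property names follow the query above, the surrounding DbContext is assumed):

```csharp
// Sketch: assumed EF Core DbContext configuration. Declares single-column
// indexes on both title variants the filter touches; SQL Server creates
// nonclustered indexes by default.
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.Entity<Record>(entity =>
    {
        entity.HasIndex(e => e.RecordName);
        entity.HasIndex(e => e.RecordNameModified);
    });
}
```

Note, however, that `Contains(...)` translates to `LIKE '%term%'`, and a leading-wildcard pattern cannot use an index seek, so these indexes alone will not speed up the title filter; full-text search is the usual alternative for substring matching.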
CodePudding user response:
The problem is probably a Cartesian explosion: the joins multiply rows before the filters apply, so the single SQL statement materializes far more data than the final result needs. You can either restructure the LINQ query so it translates to better SQL (as you should), or use a split query.
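In EF Core 5+ a query is opted into split-query execution with `AsSplitQuery()`. A sketch, assuming the same `db` context as above and a hypothetical `States` navigation property:

```csharp
// Sketch: split-query execution (EF Core 5+). EF Core issues one SQL
// statement per included collection instead of one big join, avoiding
// the row multiplication of a Cartesian product.
var records = await db.Record
    .AsSplitQuery()
    .Include(r => r.States)   // hypothetical collection navigation
    .Where(r => !r.isDeleted)
    .ToListAsync();
```

Split queries mainly pay off when a query includes one-to-many collections; for pure inner/left joins that return a flat projection, rewriting the query itself is usually the bigger win.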
- Note: split queries can produce inconsistent (dirty) data. The statements run separately, so if another request changes the data after your first SQL query completes but before the second one starts, the second query runs against the modified data and the combined result may be inconsistent.
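Independent of split queries, the `||` short-circuit patterns in the original `where` clause (`CategoryIds.Count == 0 || ...`) are translated into the SQL and can defeat index use. A common alternative is to compose the filters conditionally in C#, so predicates that don't apply never reach SQL Server. A sketch reusing the names from the question (and assuming `RecordNameModified` is null, not empty, when unset):

```csharp
// Sketch: build the predicate chain conditionally so the translated SQL
// contains only the filters that actually apply to this request.
var query = db.Record.Where(s => !s.isDeleted);

if (searchRecords.CategoryIds.Count > 0)
    query = query.Where(s => searchRecords.CategoryIds.Contains((int)s.CategoryId));

if (!string.IsNullOrEmpty(searchRecords.RecordTitle))
    query = query.Where(s =>
        (s.RecordNameModified ?? s.RecordName).Contains(searchRecords.RecordTitle));
```

Each `Where` call is only appended when its search parameter is actually set, which typically yields a simpler plan than one query that must handle every combination.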