The result data should look like this:

[
  {
    Feed_id:
    Feed_title:
    User_Name:
    project_title:
    event_object: { /* all event data */ }
  },
  ...
]
A feed contains a project, a user, and a list of task ids. After fetching the filtered feeds, I take all the task ids and find their events.

Pagination is applied only to the second query (events), so if the data set is large and many feeds (say 1000) are fetched, the process slows down and uses more memory. Is there any way around this, maybe even architectural changes?
Model and query code follow. Note that the same event can appear in multiple feeds, as the same task can be referenced by multiple feeds.
Feed model:

const feedSchema = new Schema({
  title: {
    type: String,
    required: [true, 'Please enter title.']
  },
  project: {
    type: Schema.Types.ObjectId,
    ref: 'project',
    required: [true, 'Please enter project.']
  },
  user: {
    type: Schema.Types.ObjectId,
    ref: 'user',
  },
  task: [{
    type: Schema.Types.ObjectId,
    ref: 'task',
  }],
  usersSharedFeed: [{
    type: Schema.Types.ObjectId,
    ref: 'user',
    default: []
  }],
}, {
  timestamps: true,
});
Event model:

const eventSchema = new Schema({
  title: {
    type: String,
  },
  taskId: {
    type: Schema.Types.ObjectId,
  },
  description: {
    type: String,
    default: ''
  },
  endDateTime: {
    type: Date,
  }
}, {
  timestamps: true,
});
This is the relation between feed and event, through the task id.

First, fetch the feeds that are shared with the user:
const feedFetched = await feedModel.find({
  // Matches any element of the usersSharedFeed array.
  // (Note: $in requires an array argument, so { $in: req.user._id } would fail;
  // direct equality is the idiomatic way to match against an array field.)
  usersSharedFeed: req.user._id,
  isArchived: false
})
  .populate({
    path: 'user',
    select: { fullName: 1, profileImage: 1 }
  })
  .populate({ path: 'project', select: { title: 1 } })
  .select({ project: 1, task: 1, usersSharedFeed: 1, user: 1, title: 1 });
From the feed data, build taskIdList and fetch the events:
const perPage = parseInt(req.query.perPage, 10);
const pageNo = parseInt(req.query.pageNo, 10);

const events = await eventModel.find({ taskId: { $in: taskIdList }, isArchived: false })
  .select({ taskId: 1, timeSpent: 1, endDateTime: 1, title: 1, imageUrl: 1, description: 1 })
  .sort({ endDateTime: -1 })
  // skip past the previous pages, then return one page;
  // the original limit(perPage * pageNo) returned every page up to pageNo
  .skip(perPage * (pageNo - 1))
  .limit(perPage);
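For reference, the standard page math is skip = perPage * (pageNo - 1) and limit = perPage. A tiny sketch (the helper name and defaults are hypothetical, not from the original code):

```javascript
// Hypothetical helper: converts a 1-based page number and a page size
// into the skip/limit values that Mongoose queries expect.
function pageToSkipLimit(pageNo, perPage) {
  const page = Math.max(1, parseInt(pageNo, 10) || 1);
  const size = Math.max(1, parseInt(perPage, 10) || 10);
  return { skip: size * (page - 1), limit: size };
}

console.log(pageToSkipLimit(3, 20)); // { skip: 40, limit: 20 }
```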
Now map the events data onto the feeds by matching taskId to get the desired result. This runs a nested loop, whose cost grows as the fetched feed data increases:
events.forEach((event) => {
  for (let i = 0; i < feed.length; i++) {
    if (feed[i].task && feed[i].task.includes(event.taskId)) {
      combineFeed.push({
        _id: feed[i]._id,
        title: feed[i].title,
        project: feed[i].project,
        user: feed[i].user,
        event,
      });
    }
  }
});
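As a side note on the nested loop itself: indexing the feeds by task id once turns the O(feeds × events) scan into O(feeds + events). A sketch with plain objects standing in for Mongoose documents (the function name is hypothetical; ids are compared as strings because ObjectId instances are not `===`-equal):

```javascript
// Join events to feeds via a Map keyed by taskId: O(F + E) instead of O(F * E).
function combineFeedEvents(feeds, events) {
  // Index feeds by each task id they contain (one pass over the feeds).
  const feedsByTask = new Map();
  for (const f of feeds) {
    for (const taskId of f.task || []) {
      const key = String(taskId);
      if (!feedsByTask.has(key)) feedsByTask.set(key, []);
      feedsByTask.get(key).push(f);
    }
  }
  // Each event now finds its feeds with an O(1) lookup.
  const out = [];
  for (const event of events) {
    for (const f of feedsByTask.get(String(event.taskId)) || []) {
      out.push({ _id: f._id, title: f.title, project: f.project, user: f.user, event });
    }
  }
  return out;
}

// Example with plain objects:
const combined = combineFeedEvents(
  [{ _id: 'f1', title: 'Feed 1', project: 'p1', user: 'u1', task: ['t1', 't2'] }],
  [{ taskId: 't2', title: 'Event A' }]
);
console.log(combined.length); // 1
```

This only reduces CPU cost, though; it does not fix the pagination problem, since the page boundary still applies to events rather than to the combined result.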
I have not included all the schema fields here, to keep the example small. Any kind of feedback is appreciated.
CodePudding user response:
Use an aggregation pipeline to build the combined result on the database side; then you can apply pagination to it easily.
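A sketch of what that pipeline could look like, using the field names from the models above. The `events` collection name, the `isArchived` flag, and the helper name are assumptions; `$lookup` with an array `localField` matches documents whose `foreignField` equals any element of the array:

```javascript
// Builds the aggregation pipeline; pass the result to feedModel.aggregate(...).
function buildFeedEventsPipeline(userId, pageNo, perPage) {
  return [
    // 1. Only feeds shared with the current user.
    { $match: { usersSharedFeed: userId, isArchived: false } },
    // 2. Join events whose taskId appears in this feed's task array.
    {
      $lookup: {
        from: 'events',          // physical collection behind eventModel (assumption)
        localField: 'task',      // array of task ids on the feed
        foreignField: 'taskId',  // matches any element of the array
        as: 'event',
      },
    },
    // 3. One output document per (feed, event) pair.
    { $unwind: '$event' },
    // 4. Sort by event date, then paginate the combined pairs,
    //    not the events alone.
    { $sort: { 'event.endDateTime': -1 } },
    { $skip: perPage * (pageNo - 1) },
    { $limit: perPage },
    { $project: { title: 1, project: 1, user: 1, event: 1 } },
  ];
}
```

Usage would be something like `await feedModel.aggregate(buildFeedEventsPipeline(req.user._id, pageNo, perPage))`. Note that `populate` does not apply inside `aggregate`, so resolving `user` and `project` would take two more `$lookup` stages (or a `populate` call on the aggregation result).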