Every user has a column that shows how many people viewed their profile. I don't just need to show the total number of profile views; I also want to sort the views by time period. I did it like this, but once the table reaches 50 million rows or more, the query will be slow and I may hit unexpected problems.
Views table
User_Id | Viewed_Date |
---|---|
3 | ... |
2 | ... |
2 | ... |
3 | ... |
2 | ... |
2 | ... |
3 | ... |
1 | ... |
I tried another way: I deleted the Views table and replaced it with a column on the Users table.
Users table
Id | Name | Views |
---|---|---|
1 | User1 | { "Last 1 day": 0, "Last 7 days": 0, "Last 30 days": 0, "Last 365 days": 0, "All time": 0 } |
2 | User2 | { "Last 1 day": 25, "Last 7 days": 0, "Last 30 days": 0, "Last 365 days": 0, "All time": 25 } |
3 | User3 | { "Last 1 day": 31, "Last 7 days": 162, "Last 30 days": 0, "Last 365 days": 0, "All time": 193 } |
The JSON looks like this:
```json
{
    "Last 1 day": 0,
    "Last 7 days": 0,
    "Last 30 days": 0,
    "Last 365 days": 0,
    "All time": 0
}
```
I want to ask: what is the best way to update the view counts inside the JSON? Any ideas, please.
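(For reference, an in-place increment on this layout could be written with MySQL's JSON functions. This is only a sketch of the mechanics; the user id 3 is an example value, and it does not address rolling the time buckets over at midnight.)

```sql
UPDATE Users
SET Views = JSON_SET(Views,
        '$."Last 1 day"', JSON_EXTRACT(Views, '$."Last 1 day"') + 1,
        '$."All time"',   JSON_EXTRACT(Views, '$."All time"') + 1)
WHERE Id = 3;  -- example user id
```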
(From a comment; the table was formerly called VIEWS.)
```sql
CREATE TABLE VISITS (
    USER_ID int(11) NOT NULL,
    VISITED_IN datetime NOT NULL
        DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;
```
CodePudding user response:
(You have been chastised in the Comments for the JSON approach; I won't repeat.) "50M rows" may not be an issue.
```sql
-- If you need to keep the exact time of every view, have this:
CREATE TABLE view_details (
    user_id MEDIUMINT UNSIGNED ...,   -- pick a suitable INT size
    view_date DATETIME ...,
    PRIMARY KEY(user_id, view_date),
    INDEX(view_date, user_id)
) ENGINE=InnoDB;
```
```sql
-- This is a "summary table" of views by day by user:
CREATE TABLE daily_views (
    user_id MEDIUMINT UNSIGNED ...,
    view_day DATE ...,
    ct SMALLINT UNSIGNED ...,         -- pick a suitable INT size
    PRIMARY KEY(user_id, view_day),
    INDEX(view_day, user_id)
) ENGINE=InnoDB;
```
When you insert into `view_details`, also do an IODKU ("upsert") on the summary table:
```sql
INSERT INTO daily_views (user_id, view_day, ct)
    VALUES ($user_id, CURDATE(), 1)
    ON DUPLICATE KEY UPDATE
        ct = ct + 1;
-- (MySQL 8.0.20 deprecates VALUES() in this clause; ct = ct + 1 works in all versions.)
```
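Putting the two writes together, recording a single page view might look like this (a sketch; `$user_id` stands for your application's bound parameter):

```sql
START TRANSACTION;

-- the detail row (exact timestamp of the view)
INSERT INTO view_details (user_id, view_date)
    VALUES ($user_id, NOW());

-- the per-day counter (upsert)
INSERT INTO daily_views (user_id, view_day, ct)
    VALUES ($user_id, CURDATE(), 1)
    ON DUPLICATE KEY UPDATE ct = ct + 1;

COMMIT;
```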
That gives you the daily counts for each user. For the weekly/monthly/etc. counts, sum over the summary table. E.g., for the week ending yesterday:
```sql
SELECT SUM(ct)
    FROM daily_views
    WHERE user_id = $user_id
      AND view_day >= CURDATE() - INTERVAL 7 DAY
      AND view_day < CURDATE();
```
(There are many variations on that depending on which users you want to report on and what date range is desired.)
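For example, the "sort views according to time" requirement from the question could be met by grouping over the same summary table (a sketch; the 30-day window and the LIMIT are arbitrary choices):

```sql
SELECT user_id, SUM(ct) AS views
    FROM daily_views
    WHERE view_day >= CURDATE() - INTERVAL 30 DAY
    GROUP BY user_id
    ORDER BY views DESC
    LIMIT 10;
```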
Resist the temptation to have weekly, monthly, etc, summary tables. Usually having just one summary table is sufficient, and fast "enough", even for "all time". If you find it to be not fast enough, please provide some more statistics (typical view/day, number users, total view count, etc); we can discuss alternatives.
If you expect hundreds of views per second, we may need to tweak things some. Meanwhile, what I showed here should scale well for most applications.
More discussion: Summary Tables
But, what about JSON?
What needs to happen every midnight? Every number in the JSON for every user needs to be recomputed. (Oh, except for "All time"; that won't change just because the clock ticked.) Sure, there is a set of SELECT ... GROUP BYs that will compute all the data. Then you have to go to each user and update it.
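As a sketch, just one of those nightly recomputations (the 7-day bucket, against the VISITS table from the question) would be:

```sql
SELECT USER_ID, COUNT(*) AS views_7d
    FROM VISITS
    WHERE VISITED_IN >= CURDATE() - INTERVAL 7 DAY
    GROUP BY USER_ID;
-- ...followed by an UPDATE of the JSON in every user's row, every night.
```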