Save User Activity in json file


I am trying to save user activity in a JSON file, but whenever the file grows and multiple users are working at the same time, the JSON file loses the old records.

This is my trait:

use Illuminate\Support\Facades\Auth;
use Illuminate\Support\Facades\Storage;

trait CustomLogActivity
{
    // Hook into the model events when the trait is booted.
    protected static function bootCustomLogActivity()
    {
        foreach (static::getModelEvents() as $event) {
            static::$event(function ($model) use ($event) {
                $model->recordActivity($event);
            });
        }
    }

    protected static function getModelEvents()
    {
        return ['created', 'updated', 'deleted'];
    }

    // Build the activity payload for the given model event.
    protected function recordActivity($event)
    {
        $activity = [
            'user_id' => Auth::id(),
            'type' => $event,
            'subject' => (new \ReflectionClass($this))->getShortName(),
            'timestamp' => now()
        ];

        if ($event === 'updated') {
            $activity['old_properties'] = $this->getOriginal();
            $activity['new_properties'] = $this->getAttributes();
        } else {
            $activity['properties'] = $this->getAttributes();
        }
        $this->appendToLog($activity);
    }

    // Append one JSON-encoded entry to the shared log file.
    protected function appendToLog($activity)
    {
        $logFile = 'activity.json';
        $log = json_encode($activity);
        Storage::append($logFile, $log);
    }

    protected function getActivityType($event)
    {
        $type = strtolower((new \ReflectionClass($this))->getShortName());

        return "{$event}_{$type}";
    }
}

CodePudding user response:

As I mentioned in the comments, I will post this as an answer so it is clear for anyone running into this type of issue:

The problem you are having is called: concurrency.

I am assuming two processes use the file at the same time: both read the current content, then one of them writes its version back. The other process still has the old content in memory (so it never sees the entry the first one just wrote), and when it writes its own version it overwrites the file, losing that entry...
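Roughly, the lost update looks like this (a sketch, assuming the local driver implements Storage::append() as "read the whole file, concatenate, write the whole file"; $lineA and $lineB are just placeholders for the two new log entries):

// Request A and request B run at almost the same time.
$snapshotA = Storage::get('activity.json'); // A reads the current log
$snapshotB = Storage::get('activity.json'); // B reads the same content

// A writes its snapshot plus one new line back...
Storage::put('activity.json', $snapshotA . PHP_EOL . $lineA);

// ...then B writes *its* snapshot plus one new line,
// replacing the whole file and losing A's entry.
Storage::put('activity.json', $snapshotB . PHP_EOL . $lineB);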

First of all, use a queue (triggered by events) to send the data, and then write it to Redis, a database, or anything else that is fast enough for this, but not literally a file. A file can lose data in an instant like this; a database will not...

You can still use a file, but I would not recommend it, because it depends a lot on your infrastructure (if you do stay with a file, see the locking sketch after this list):

  • If you have a load balancer with 10 machines, are you going to have 10 different files (one per machine)?
    • How do you combine them?
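If you do keep a single file on a single machine, a minimal sketch of a safer append is to skip the read-then-write entirely and let PHP append atomically with an exclusive lock (the storage_path() location is just an example, and this does not help once several machines write the same log):

protected function appendToLog($activity)
{
    // Append one JSON line; LOCK_EX makes concurrent writers on the
    // same machine wait for each other instead of overwriting data.
    file_put_contents(
        storage_path('app/activity.json'),
        json_encode($activity) . PHP_EOL,
        FILE_APPEND | LOCK_EX
    );
}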

So what I would do is have a queue (triggered by an event) and let that queue, with a single worker, handle this very specific task. But keep speed in mind: if events arrive in the queue faster than the single worker can process them, you will have to find a solution for that.
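A minimal sketch of that setup, assuming a queued job (the RecordActivity class name, the activity-log queue name and the activity_log table are all placeholders, not something from your code):

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Support\Facades\DB;

class RecordActivity implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    public function __construct(public array $activity)
    {
    }

    public function handle()
    {
        // The single worker is the only process writing, so there is
        // no race; swap this insert for Redis or any other fast store.
        DB::table('activity_log')->insert([
            'payload' => json_encode($this->activity),
            'created_at' => now(),
        ]);
    }
}

And in the trait, dispatch the job instead of touching the file:

protected function appendToLog($activity)
{
    RecordActivity::dispatch($activity)->onQueue('activity-log');
}

Then run exactly one worker for that queue, for example php artisan queue:work --queue=activity-log, so all writes go through a single process.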
