I have logs that are uploaded from CSV files and inserted into a MySQL table through the upload_log endpoint in the views.py of a Django app. I also have another views.py in a different sub-app (also Django) that parses the existing data from the logs table and splits it into separate rows according to conditions.
The problem is that the parsing runs whenever I reload the server, which creates duplicates, and in any case that is not when I want it to run. I need it so that whenever data is uploaded (a POST is triggered), the parsing function runs on that uploaded data.
Edit (clarification): It doesn't execute when I run py manage.py runserver. But it does execute after runserver is already running and I press CTRL+S inside the file, which makes the dev server reload.
I would appreciate it if someone guides me through this.
Here's the views.py of the parsing task (ParsedLogs app):
from rest_framework import viewsets
from rest_framework.decorators import action
from api.packets.models import Fields
from .models import LogsParsed
from api.logs.models import Logs
from .serializers import LogsParsedSerializer
class getLogs(viewsets.ModelViewSet):
    serializer_class = LogsParsedSerializer
    queryset = LogsParsed.objects.all()

    parsedLogs = []
    for log in Logs.objects.all():
        fields = Fields.objects.filter(pac_id=log.message_id_decimal)
        binaryTest = log.data_binary
        for field in fields:
            items = LogsParsed(
                pac_id=log.message_id_decimal,
                log_id=log.log_id,
                log_date=log.log_date,
                fld_name=field.fld_name,
                value=binaryTest[field.fld_offset:field.fld_offset + field.fld_len]
            )
            parsedLogs.append(items)
    LogsParsed.objects.bulk_create(parsedLogs)
And here's the views.py of the upload from CSV (Logs app):
from django.contrib import messages
from rest_framework import viewsets
from rest_framework.decorators import action
from datetime import datetime
from rest_framework.response import Response
from .models import Logs
from .serializers import LogsSerializer
import pandas as pd
class LogsViewSet(viewsets.ModelViewSet):
    serializer_class = LogsSerializer
    queryset = Logs.objects.all()

    @action(detail=False, methods=['POST'])
    def upload_log(self, request):
        """Upload data from CSV, with validation."""
        if request.method == 'POST':
            file = request.FILES['file']
            data_file = pd.read_csv(file, sep=';', encoding="utf8")
            file_name = request.FILES['file'].name
            _datetime = datetime.now()
            file_upload_date = _datetime.strftime("%Y_%m_%d %H:%M:%S")

            # Swap every 2-character block with the next 2-character block
            def swap_bits(data):
                blocks = [data[i:i + 4] for i in range(0, len(data), 4)]
                for i in range(len(blocks)):
                    blocks[i] = blocks[i][2:] + blocks[i][:2]
                return "".join(blocks)

            # Iterating through rows
            row_iter = data_file.iterrows()
            logs = [
                Logs(
                    file_upload_date=file_upload_date,
                    file_name=file_name,
                    log_date=row["Date and time[ms]"],
                    stamp=row[" Stamp"],
                    message_id=row[" Message Id"],
                    message_id_decimal=int(row[" Message Id"], 16),
                    length=row[" Length"],
                    data=row[" Data"],
                    # data_binary=swap_bits(row[" Data"])
                    data_binary=bin(int(swap_bits(row[" Data"]), 16))[2:]
                )
                for index, row in row_iter
            ]
            Logs.objects.bulk_create(logs)
            return Response({"success": "Successfully uploaded logs!"})
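As a stand-alone sanity check of the swap_bits helper above (with the `+` operators that the code clearly relies on restored): it splits the hex string into 4-character blocks and swaps the two 2-character halves of each block, i.e. a per-16-bit byte swap.

```python
# Same logic as the swap_bits helper in upload_log, runnable on its own.
def swap_bits(data):
    # Split into 4-character blocks, then swap each block's two halves.
    blocks = [data[i:i + 4] for i in range(0, len(data), 4)]
    return "".join(b[2:] + b[:2] for b in blocks)

print(swap_bits("AABBCCDD"))  # BBAADDCC
```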
CodePudding user response:
As @Durai pointed out, the getLogs class has code at the top level of the class body, not inside any method. That code executes whenever the module is imported, because it runs as part of the class definition itself.
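A minimal illustration (not the poster's code): statements written directly in a class body run once, when the class definition is evaluated, which happens at import time. That is why the parsing fires when Django's autoreloader re-imports views.py after a CTRL+S, not when a request arrives.

```python
# Demonstration: class-body statements execute at import/definition time.
side_effects = []

class Demo:
    side_effects.append("ran at import")  # executes immediately

# The list is already populated before any Demo() is ever instantiated.
print(side_effects)
```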
If you don't want the code to execute automatically, move it into a method (or a module-level function) and call it explicitly when needed, for example at the end of upload_log, right after the uploaded rows are saved.
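To make that concrete, here is a minimal sketch of the refactor, using the attribute names from the question. Passing the querysets and row constructor in as arguments is an assumption made here, so the function has no import-time side effects and can be exercised without a database:

```python
# Sketch: the parsing loop from the getLogs class body, moved into a plain
# function that only runs when called. `logs` is an iterable of log rows,
# `fields_for(pac_id)` returns the field definitions for a packet id, and
# `make_row` constructs one output row (e.g. the LogsParsed model class).
def parse_logs(logs, fields_for, make_row):
    rows = []
    for log in logs:
        for field in fields_for(log.message_id_decimal):
            start = field.fld_offset
            rows.append(make_row(
                pac_id=log.message_id_decimal,
                log_id=log.log_id,
                log_date=log.log_date,
                fld_name=field.fld_name,
                # Slice the field's bits out of the binary string (note the
                # "+" that was missing in the posted code).
                value=log.data_binary[start:start + field.fld_len],
            ))
    return rows
```

In Django terms you would then call something like `LogsParsed.objects.bulk_create(parse_logs(Logs.objects.all(), lambda pid: Fields.objects.filter(pac_id=pid), LogsParsed))` at the end of upload_log, instead of leaving the loop in the class body. Note that parsing `Logs.objects.all()` on every upload would still re-parse old rows; to avoid duplicates, pass only the logs created by that upload.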