Paramiko exec_command, can't make the code wait for the task to complete


I am building a Django app under Docker. In a view I call a task in which Paramiko makes an SSH connection from one container to another to run a third-party app. During the call to the other container I zip the 'results' folder and move it somewhere else, which takes a while. The problem is that the code returns to the view and looks for the zip file before it appears where it should be.

tasks.py

import paramiko
from celery import shared_task


@shared_task()
def my_task():
    # Shell command chain: copy the mesh file, zip the output folder,
    # move the archive, then clean up (paths redacted here).
    command2 = """
    sudo cp /.../ibt1.msh /.../ibt1.msh &&
    until sudo zip -r result.zip ./output/; do sleep 5; done &&
    until sudo mv result.zip /europlexusData/result.zip; do sleep 5; done &&
    sudo rm -rf ./output
    """

    # Connection details redacted; 22 is only the SSH default.
    host2 = " "
    port = 22
    username = " "
    password = " "

    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(host2, port, username, password)
    stdin, stdout, stderr = ssh.exec_command(command2, get_pty=True)

    return 'Done'

views.py

    my_task.delay()
    file = open('zip/file/to/be/created/in/the/task', 'rb')  # runs before the task has produced the zip
    return FileResponse(file)

CodePudding user response:

Using my_task.delay() means my_task runs in parallel with the rest of your view. This means my_task might not even have started when you call open on the next line. If you remove the call to delay(), the task will run to completion and open will happen afterwards.
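A minimal sketch of that difference, assuming Celery is configured with a result backend (needed for ready()/get()); the task name matches the question's code:

    # Asynchronous: delay() returns an AsyncResult immediately and the task
    # runs in a Celery worker, in parallel with the rest of the view.
    result = my_task.delay()
    print(result.ready())      # almost certainly False right after the call

    # Synchronous: calling the task like a normal function runs it in the
    # current process and blocks until it returns.
    my_task()

    # You could also block on the async result, but that keeps the request
    # busy just like the synchronous call does:
    result = my_task.delay()
    result.get(timeout=600)    # waits for the worker (raises on timeout)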

This is not good practice, because your request might time out and your server stays busy for nothing. Instead you should:

  1. Warn your client that the task has been started.
  2. Once you know it is done (via a messaging service, or by waiting long enough), have your client query another route that returns the resulting zip (see the sketch after this list).
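
A minimal sketch of that start/poll pattern, again assuming Celery with a result backend; start_run, fetch_result and RESULT_PATH are hypothetical names, and the path must point to wherever your task actually leaves the zip:

    # views.py
    from celery.result import AsyncResult
    from django.http import FileResponse, JsonResponse

    from .tasks import my_task

    RESULT_PATH = 'zip/file/to/be/created/in/the/task'  # hypothetical placeholder


    def start_run(request):
        # Kick off the task and immediately tell the client it has started.
        result = my_task.delay()
        return JsonResponse({'task_id': result.id, 'status': 'started'})


    def fetch_result(request, task_id):
        # The client polls this route until the task reports completion.
        result = AsyncResult(task_id)
        if not result.ready():
            return JsonResponse({'status': 'pending'}, status=202)
        # The task has finished, so the zip should now exist.
        return FileResponse(open(RESULT_PATH, 'rb'), as_attachment=True, filename='result.zip')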