how to force input() to stop and continue the code flow


I need to exit a loop that contains an input() call even if the user hasn't typed anything. The loop also receives parameters from another process via a queue, and it must evaluate that content immediately. Something like this:

import multiprocessing


def my_function(my_queue):
    var = ""
    #### some code which finally puts something in the queue ###
    my_queue.put(var)


def main():
    my_queue = multiprocessing.Queue()

    p1 = multiprocessing.Process(target=my_function, args=(my_queue,))
    p1.daemon = True
    p1.start()

    my_var = ""

    while my_queue.empty() and my_var == "":
        my_var = input("enter a parameter for my_var: ")

    #### code that evaluates the queue and the input as appropriate

## I want to exit the loop if there's something in the queue even if the user hasn't written anything

if __name__ == '__main__':
    main()

This doesn't work, of course: the main loop is stuck at the input() call. Any ideas? I'm using Windows. Thank you all in advance!

CodePudding user response:

Try this: the child sends a signal to the parent process with os.kill; the signal handler raises an exception, which we catch in order to escape the input() call.

import multiprocessing
import signal
import time
import os


def my_function(my_queue, pid):
    var = ""
    #### some code which finally puts something in the queue ###
    time.sleep(3)
    my_queue.put(var)
    os.kill(pid, signal.SIGUSR1)


def interrupted(*args):
    print('Item added to queue before user input completed.')
    raise InterruptedError


def main():
    my_queue = multiprocessing.Queue()

    p1 = multiprocessing.Process(target=my_function, args=(my_queue, os.getpid()))
#     p1.daemon = True
    p1.start()

    my_var = ""
    try:
        signal.signal(signal.SIGUSR1, interrupted)
        while my_queue.empty() and my_var == "":
            my_var = input("enter a parameter for my_var: ")
    except InterruptedError:
        pass
    print('Processing results...')
    #### code that evaluates the queue and the input as appropriate

if __name__ == '__main__':
    main()
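A quick aside on portability (a sketch of mine, not part of the answer): the signal module defines SIGUSR1 only on POSIX systems, so a script can guard on its presence before registering the handler:

```python
import signal

# SIGUSR1 exists only on POSIX platforms; hasattr() avoids an
# AttributeError on Windows, where the signal module does not define it.
if hasattr(signal, "SIGUSR1"):
    print("SIGUSR1 available; the handler approach can work here")
else:
    print("SIGUSR1 missing (e.g. on Windows); a different mechanism is needed")
```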

The code above only works on Unix (SIGUSR1 is not defined on Windows). I don't know whether the following code will work cross-platform, but it might:

import multiprocessing
import signal
import time
import os


def my_function(my_queue, pid):
    var = ""
    #### some code which finally puts something in the queue ###
    time.sleep(3)
    my_queue.put(var)
    os.kill(pid, signal.SIGINT)

def interrupted(*args):
    raise InterruptedError


def main():
    print('I am %s!' % os.getpid())
    my_queue = multiprocessing.Queue()

    p1 = multiprocessing.Process(target=my_function, args=(my_queue, os.getpid()))
    p1.daemon = True
    p1.start()

    my_var = ""
    try:
        signal.signal(signal.SIGINT, interrupted)
        while my_queue.empty() and my_var == "":
            my_var = input("enter a parameter for my_var: ")
    except InterruptedError:
        time.sleep(0.1)
        if my_queue.empty():
            print('Exiting gracefully...')
            return
    signal.signal(signal.SIGINT, signal.SIG_DFL)
    print('Processing results...')
    time.sleep(10)
    #### code that evaluates the queue and the input as appropriate

if __name__ == '__main__':
    main()
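For what it's worth, a common cross-platform alternative (a sketch of mine, not taken from either answer) avoids signals entirely: run the blocking input() in a daemon thread and have the main loop poll both queues. Note that multiprocessing.Queue.get_nowait() also raises queue.Empty, so one polling helper covers both:

```python
import queue
import threading
import time


def read_input(input_q):
    # input() blocks, but only this daemon thread is stuck on it;
    # the main loop keeps polling and the thread dies with the process.
    input_q.put(input("enter a parameter for my_var: "))


def wait_for_either(work_q, input_q, poll=0.1):
    """Return ('queue', value) or ('input', text), whichever arrives first."""
    while True:
        try:
            return ("queue", work_q.get_nowait())
        except queue.Empty:
            pass
        try:
            return ("input", input_q.get_nowait())
        except queue.Empty:
            pass
        time.sleep(poll)
```

In main() one would start `threading.Thread(target=read_input, args=(input_q,), daemon=True)` and then call `wait_for_either(my_queue, input_q)`. The drawback is that a stranded input() thread still waits for a newline, but as a daemon it doesn't keep the process alive.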

CodePudding user response:

An answer that is presumably cross-platform, by abusing the asyncio module.

I don't fully understand what I did, and I am sure it is bad form, but I've created two concurrent tasks: one that waits for input, and one that polls the queue for a value and raises an InterruptedError. Because of the return_when='FIRST_EXCEPTION' setting, it was necessary for both functions to raise an exception, so I returned the user input via the exception, since the input task never returns normally.

import asyncio
import multiprocessing
import time
from aioconsole import ainput


def my_function(queue):
    time.sleep(3)
    queue.put(5)


async def my_loop(queue):
    while True:
        await asyncio.sleep(0.1)
        if not queue.empty():
            raise InterruptedError

async def my_exceptional_input():
    text = await ainput("Enter input:")
    raise InterruptedError(text)

async def main():
    queue = multiprocessing.Queue()
    p = multiprocessing.Process(target=my_function, args=(queue,))
    p.start()
    task1 = asyncio.create_task(my_exceptional_input())
    task2 = asyncio.create_task(my_loop(queue))
    result = await asyncio.wait([task1, task2], return_when='FIRST_EXCEPTION')
    try:
        task2.result()
    except asyncio.exceptions.InvalidStateError:
        text = str(task1.exception())
    except InterruptedError:
        text = ""
    print('Doing stuff with input %s...' % text)


if __name__ == '__main__':
    asyncio.run(main())

EDIT: It was silly of me to use 'FIRST_EXCEPTION'. I could have used 'FIRST_COMPLETED' like this:

import asyncio
import multiprocessing
import time
from aioconsole import ainput


def my_function(queue):
    time.sleep(3)
    queue.put(5)


async def my_loop(queue):
    while True:
        await asyncio.sleep(0.1)
        if not queue.empty():
            break


async def main():
    queue = multiprocessing.Queue()
    p = multiprocessing.Process(target=my_function, args=(queue,))
    p.start()
    task1 = asyncio.create_task(ainput("Enter text:"))
    task2 = asyncio.create_task(my_loop(queue))
    result = await asyncio.wait([task1, task2], return_when='FIRST_COMPLETED')
    try:
        text = task1.result()
        q = ""
    except asyncio.exceptions.InvalidStateError:
        text = ""
        q = queue.get()
    print('Doing stuff with input %s/%s...' % (text, q))


if __name__ == '__main__':
    asyncio.run(main())
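One caveat with 'FIRST_COMPLETED' (my note, not from the answer): asyncio.wait leaves the losing task pending, which can produce a "Task was destroyed but it is pending!" warning at shutdown. Cancelling the pending set avoids that, as in this sketch:

```python
import asyncio


async def fast():
    await asyncio.sleep(0.01)
    return "fast"


async def slow():
    await asyncio.sleep(60)
    return "slow"


async def race():
    t1 = asyncio.create_task(fast())
    t2 = asyncio.create_task(slow())
    done, pending = await asyncio.wait({t1, t2},
                                       return_when=asyncio.FIRST_COMPLETED)
    for task in pending:
        task.cancel()  # stop the loser so it doesn't linger at shutdown
    return next(iter(done)).result()


print(asyncio.run(race()))  # fast
```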