Not Able To MultiProcess Python Flask Server

Time:01-07

Hi everyone, I was trying to create a multiprocessed Flask server with waitress, but every time I execute it I run into issues I'm not able to identify. The same code executes fine if I use multithreading instead.

I am using multiprocessing because I want complete control over the Flask server (running/terminating it).

APICommunications.py

import ProcessBuilder as pb
import ServiceLogger as sl
import SystemRecovery as sr
from flask import Flask, Response
from waitress import serve
from flask_cors import CORS


class EndpointAction(object):

    def __init__(self, action):
        self.action = action
        self.logger = sl.ServiceLogger()
        self.response = Response(status=200, headers={})

    def __call__(self, *args):
        self.action()
        return self.response


class APIServer:

    def __init__(self):
        self.server = None
        self.debug = sr.Recovery().debug
        self.port = sr.Recovery().api_port
        self.host = '127.0.0.1'
        self.logger = sl.ServiceLogger()
        self.app = Flask(__name__)
        CORS(self.app, resources={r"/api/*": {"origins": "*"}})
        self.register_endpoints()
        self.server = pb.ProcessBuilder().create_process(self.run, [self])

    def add_endpoint(self, endpoint=None, endpoint_name=None, handler=None):
        self.app.add_url_rule(endpoint, endpoint_name, EndpointAction(handler))

    def register_endpoints(self):
        return None

    def start(self):
        self.server.start()

    def run(self):
        serve(self.app, port=self.port, host=self.host)

    def terminate(self):
        self.server.kill()
        return None


a = APIServer()
a.start()
print("heyy")
a.terminate()
print("thread killed")

ProcessBuilder.py

import multiprocessing
import random
import string

import ServiceLogger as sl


class ProcessBuilder:

    def __init__(self):
        self.processes = []
        self.logger = sl.ServiceLogger()

    def create_process(self, method, args=()):
        process = multiprocessing.Process(name=self.__generate_instance_name(), target=method, args=tuple(args))
        self.processes.append(process)
        return process

    def get_process_instance(self, process_name):
        for process in self.processes:
            if str(process.name) == process_name:
                return process
        return None

    def run_process(self, process):
        process.start()

    def kill_process(self, process):
        process.kill()

    def __generate_instance_name(self):
        # three random lowercase letters
        letters = string.ascii_lowercase
        randomString = ''.join(random.choice(letters) for i in range(3))

        # five random digits
        digits = string.digits
        randomDigits = ''.join(random.choice(digits) for i in range(5))

        return randomString + randomDigits

Output:

C:\Python\python.exe D:\Projects\APICommunications.py 
Service Logger Started...........
Service Logger Started...........
Traceback (most recent call last):
  File "D:\Projects\Red\APICommunications.py", line 52, in <module>
    a.start()
  File "D:\Projects\Red\APICommunications.py", line 41, in start
    self.server.start()
  File "C:\Python\Lib\multiprocessing\process.py", line 121, in start
    self._popen = self._Popen(self)
                  ^^^^^^^^^^^^^^^^^
  File "C:\Python\Lib\multiprocessing\context.py", line 224, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Python\Lib\multiprocessing\context.py", line 336, in _Popen
    return Popen(process_obj)
           ^^^^^^^^^^^^^^^^^^
  File "C:\Python\Lib\multiprocessing\popen_spawn_win32.py", line 94, in __init__
    reduction.dump(process_obj, to_child)
  File "C:\Python\Lib\multiprocessing\reduction.py", line 60, in dump
    ForkingPickler(file, protocol).dump(obj)
AttributeError: Can't pickle local object 'Flask.__init__.<locals>.<lambda>'

CodePudding user response:

This is not how you do it. :-)

The immediate error in the traceback is a symptom of this: on Windows, multiprocessing starts children with "spawn", which pickles the `Process` target and its arguments, and a bound method like `self.run` drags the whole `APIServer` along with it, including the Flask app, which contains unpicklable lambdas. But even if we worked out the details so that everything is picklable and you could actually start waitress in different processes, you'd run into the obvious wall of having more than one process trying to listen on the same port on the same computer: all but the first process would be terminated by the OS with an "address already in use" error.
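To illustrate the pattern that does work, here is a minimal sketch using only the standard library (`wsgiref` stands in for waitress; the structure is the same). The key point: the child-process target is a plain module-level function, and the server objects are created *inside* it, so nothing unpicklable has to cross the spawn boundary. The names and port are illustrative, not from the original project.

```python
import multiprocessing


def run_server(host, port):
    # Everything unpicklable is created here, inside the child process.
    # In the real project this is where you would build the Flask app
    # and call waitress.serve(app, host=host, port=port).
    from wsgiref.simple_server import make_server

    def app(environ, start_response):
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"pong"]

    with make_server(host, port, app) as httpd:
        httpd.serve_forever()


if __name__ == "__main__":
    # Target and args are picklable, so this works under Windows "spawn" too.
    server = multiprocessing.Process(target=run_server, args=("127.0.0.1", 8123))
    server.start()
    print("server process started")
    server.terminate()  # the caller keeps full control of the lifetime
    server.join()
    print("server process stopped")
```

This keeps the start/terminate control the question asks for, but only for a single serving process; it does not give you multiple processes on one port.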

Also keep in mind that waitress, while a production-quality pure-Python WSGI server, parallelises requests with a pool of threads inside a single process. It has no multiprocessing mode, and even if it had one, that would have to be managed from within waitress itself, not bolted on externally.

Instead, you should check the documentation on how to serve your project with a WSGI server that has built-in multi-process support; gunicorn and uWSGI are good examples. Both are extensively configurable, letting you set how many worker processes to run and how many threads each worker should use to answer requests. Parallelisation and multi-processing support is not trivial, and is built deep into these projects.
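For example, a minimal gunicorn configuration file is itself a small Python module (a sketch; adjust the address and counts to your setup):

```python
# gunicorn.conf.py -- sketch of a multi-process gunicorn configuration
bind = "127.0.0.1:8080"  # the master process binds the socket once...
workers = 4              # ...then forks 4 worker processes that share it
threads = 2              # threads per worker handling requests concurrently
```

You would start it with something like `gunicorn -c gunicorn.conf.py yourmodule:app`, where `app` is a module-level Flask instance (the module name here is a placeholder). The fork-after-bind design is exactly why the workers don't collide on the port. Note that gunicorn is Unix-only; on Windows, waitress (single process, multi-threaded) remains the usual choice.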
