I'm trying to scan domains for pentesting purposes. The program uses multiprocessing, and the result list has to be passed from the worker processes back to the main function.
I tried using global variables (declared with global) and also a class attribute, but that reminded me that each process lives in its own memory space, so I'm using manager.list() instead to share the list between processes.
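(As a side note, here is a minimal sketch of why the plain-global approach fails: every Process works on its own copy of the interpreter state, so appends made in a child never reach the parent. The worker() name and example.com domain are made up for illustration.)

from multiprocessing import Process

results = []  # plain global list

def worker(domain):
    results.append(domain)  # only modifies the child process's copy

if __name__ == "__main__":
    p = Process(target=worker, args=("example.com",))
    p.start()
    p.join()
    print(results)  # prints [] in the parent: the child's append never arrives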
Here's what I've tried:
import requests
from multiprocessing import Process, cpu_count, Manager

class variably:
    variably = bla..
    ....

def engine(domainlist, R):
    for domain in domainlist:
        try:
            r = requests.get("http://" + domain, headers=headers, timeout=0.7, allow_redirects=False)
            if r.status_code == expected_response:
                print("Success " + domain)
                print(domain, file=open("LazyWritesForDebugPurposes.txt", "a"))
                R.append(str(domain))
            elif r.status_code != expected_response:
                print("Failed " + domain + " " + str(r.status_code))
        except Exception:
            pass

def fromtext():
    ....
    R = []
    with Manager() as manager:
        num_cpus = cpu_count()
        processes = []
        R = manager.list()
        for process_num in range(num_cpus):
            section = domainlist[process_num::num_cpus]
            p = Process(target=engine, args=(section, R,))
            p.start()
            processes.append(p)
        for p in processes:
            p.join()
        print(R)  # inside the Manager() block
    print(R)      # outside the Manager() block
    print("")
    print(" Total of Domains Queried : " + colors.RED_BG + " " + str(len(R)) + " " + colors.ENDC)
    if len(inf.result_success) >= 0:
        print(" Successful Result : " + colors.GREEN_BG + " " + str(len(R)) + " " + colors.ENDC)

fromtext()
Sorry for any invalid syntax or indentation; I tried to trim the code down into a shorter snippet.
The code above returns a BrokenPipe error, sometimes a ConnectionRefused error. From the exception I can see that the list has already been appended to, e.g. ['Domain.com', 'Domain_2.com'], but it still raises an exception.
EDIT:
It looks like the list is only accessible inside the Manager() scope. How can I pass the data outside that scope, for example to use the list in a different function? The code below works:
with Manager() as manager:
    num_cpus = cpu_count()
    processes = []
    R = manager.list()
    for process_num in range(num_cpus):
        section = domainlist[process_num::num_cpus]
        p = Process(target=engine, args=(section, R,))
        p.start()
        processes.append(p)
    for p in processes:
        p.join()
    str(len(R))
CodePudding user response:
You really want to use a queue. Create a multiprocessing.SimpleQueue in your main process and pass it to all of your subprocesses; they can add items to this queue. Creating your own manager is almost always a mistake.
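Here is a minimal sketch of that approach, adapted to the engine()/domainlist setup from the question. The 200 status code, the None sentinel, and the function signatures are assumptions for illustration, not the asker's exact code.

from multiprocessing import Process, SimpleQueue, cpu_count
import requests

def engine(domainlist, queue):
    for domain in domainlist:
        try:
            r = requests.get("http://" + domain, timeout=0.7, allow_redirects=False)
            if r.status_code == 200:      # stand-in for expected_response
                queue.put(domain)         # send each hit back to the parent
        except requests.RequestException:
            pass
    queue.put(None)                       # sentinel: this worker is done

def fromtext(domainlist):
    queue = SimpleQueue()
    num_cpus = cpu_count()
    processes = []
    for process_num in range(num_cpus):
        section = domainlist[process_num::num_cpus]
        p = Process(target=engine, args=(section, queue))
        p.start()
        processes.append(p)

    results, finished = [], 0
    while finished < num_cpus:            # drain while the workers run, so the pipe never fills up
        item = queue.get()
        if item is None:
            finished += 1
        else:
            results.append(item)

    for p in processes:
        p.join()
    return results                        # a plain list, usable in any other function

Reading from the queue in the parent while the workers are still running also avoids the deadlock you can hit by joining processes before draining the queue.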
CodePudding user response:
What solved this for me was to explicitly convert the manager.list() to a normal list, assign it to a new variable, and declare it global so it can be used in another function. I know it's dirty and I haven't tried Queue() yet, but for now at least it's working.
def executor():
    global R
    ....
    with Manager() as manager:
        num_cpus = cpu_count()
        processes = []
        R = manager.list()
        for process_num in range(num_cpus):
            section = domainlist[process_num::num_cpus]
            p = Process(target=engine, args=(section, R,))
            p.start()
            processes.append(p)
        for p in processes:
            p.join()
        R = list(R)  # copy out of the proxy while the manager is still alive
    print(R)
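For completeness, a tiny sketch of how the converted global is then read from another function; report() is a made-up name for illustration.

def report():
    # R is a plain Python list by now, so it can be used anywhere after executor() has run
    print(" Total of Domains Queried : " + str(len(R)))
    for domain in R:
        print(domain)

executor()
report()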
If there's a way to simplify this, I would really appreciate it.