Error: maximum recursion depth exceeded in multiprocessing

Time: 11-11

I am trying to run multiprocessing with a ThreadPool and I get the error "maximum recursion depth exceeded".

This is my code:

import concurrent.futures
import time

import pandas as pd
from tqdm import tqdm

def extract_all_social_link(bio_url):
    data = extract_all_social_link(bio_url)
    return data
def run_extract_all_social_link(df, max_count, displays_steps = 1000):
    tt = time.time()
    user_data = []
    with concurrent.futures.ThreadPoolExecutor(max_workers=NUM_WORKERS) as executor:
        future_to_url = dict()
        cnt = 0
        for _, row in df.iterrows():
            bio_url = row["bio_url"]
            if cnt > max_count:
                break
            if pd.isna(bio_url):
                continue
            future = executor.submit(extract_all_social_link, bio_url)
            future_to_url[future] = (bio_url)
            cnt += 1
        future_iter = concurrent.futures.as_completed(future_to_url)
        total = len(future_to_url)
        for cnt, future in tqdm(enumerate(future_iter), total=total):
            if (cnt + 1) % displays_steps == 0:
                tt1 = time.time()
                print(f"{cnt + 1} requests in {tt1 - tt:.3f} seconds")
                tt = tt1
            bio_url = future_to_url[future]
            try:
                data = future.result()
            except Exception as exc:
                print(f"{bio_url} generated an exception: {exc}")
        return user_data


And this is the error:

generated an exception: maximum recursion depth exceeded

How can I fix it?

CodePudding user response:

You can view and increase this limit using sys.getrecursionlimit() and sys.setrecursionlimit() (https://docs.python.org/3/library/sys.html#sys.getrecursionlimit), but doing so is bad practice in general, since the limit is a safeguard against stack overflows. Find a way to avoid so much recursion by refactoring your code.
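For reference, this is how the limit is inspected and raised; note that raising it only delays the crash in your case, because the recursion here is infinite:

```python
import sys

# The default recursion limit is typically 1000.
print(sys.getrecursionlimit())

# Raising it is rarely the right fix, but it is possible:
sys.setrecursionlimit(5000)
print(sys.getrecursionlimit())
```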

CodePudding user response:

Don't even try to change the recursion limit. You have to fix the code of extract_all_social_link(), which calls itself recursively with the same argument bio_url every time:

def extract_all_social_link(bio_url):
    data = extract_all_social_link(bio_url)  # infinite recursion
    return data  # never reached !!
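The wrapper was presumably meant to call some other function that does the actual scraping. A minimal sketch of the fix, where fetch_social_links is a hypothetical stand-in for whatever your real extraction logic is:

```python
def fetch_social_links(bio_url):
    # Hypothetical placeholder for the real extraction logic
    # (e.g. an HTTP request plus HTML parsing).
    return {"url": bio_url, "links": []}

def extract_all_social_link(bio_url):
    # Delegate to a differently named function instead of calling itself,
    # which is what caused the infinite recursion.
    data = fetch_social_links(bio_url)
    return data
```

As long as extract_all_social_link() never calls itself (directly or through another function), the recursion-depth error disappears.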