Differences between time consumed by Python loops

Time:07-25

I am running a program that consumes a lot of time in a 3-layer nested loop. I reduced the size of the loop and observed an interesting thing.

When the first layer is 50 iterations, it only consumes 3 seconds. But when I changed it to 100 iterations, the time increased to 43 seconds. Why didn't the time spent simply double when the number of iterations doubled? How is the computational complexity calculated here? I don't understand.

By the way, my originally designed loop was 160x192x160. It took so long that I just stopped it. I need to figure out a way to solve this time problem, which is why I tried the experiment above.

import time

start = time.time()
mean_list = []
for k in range(0, 50):
    for l in range(0, 100):
        for h in range(0, 10):
            if img[k][l][h] != 0:
                mean = patch_mean(coordinate_x=k, coordinate_y=l, coordinate_z=h,
                                  image=img, patch_radius=patch_radius)
                mean_list.append([k, l, h, mean])
end = time.time()
print(end - start)

patch_mean is a function that calculates the mean around a point. It is another loop. I think it would not matter, because it is an independent function. To be clear, patch_radius is a constant.

def patch_mean(coordinate_x, coordinate_y, coordinate_z, image, patch_radius):
    total = 0
    count = 0
    for k in range(coordinate_x - patch_radius, coordinate_x + patch_radius):
        for l in range(coordinate_y - patch_radius, coordinate_y + patch_radius):
            for h in range(coordinate_z - patch_radius, coordinate_z + patch_radius):
                if 0 < k < 159 and 0 < l < 191 and 0 < h < 159:
                    if image[k][l][h] != 0:
                        total = total + image[k][l][h]
                        count = count + 1
    if count == 0:
        mean = 0
    else:
        mean = total / count
    return mean

CodePudding user response:

The first iterations of your outer loop give you coordinates near the boundary of your image. That makes patch_mean faster to calculate, as a big chunk of its window is cut off by the bounds check. When you move towards the middle of the image, the computation gets slower, since you're averaging over the whole patch volume, not just a part of it.

If you change the range from range(0, 50) to range(0, 100), you will be covering a lot more of the middle part of the image. Those coordinates are the slow ones, so overall, the loop will be a lot slower. If you changed it to range(0, 160), you'd find that the last few iterations would speed up again, as you'd start running into the other side of the image. But the interval from 50 to 100 is right in the middle of the image, and will be the slowest part.
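A quick way to see this effect is to count how many in-bounds positions the clipped window keeps along one axis. This is a sketch: `clipped_patch_size` is a hypothetical helper, not part of the original code, and it clips at [0, n) rather than reproducing the original `0 < k < 159` check exactly:

```python
def clipped_patch_size(x, r, n):
    # Number of in-bounds indices in range(x - r, x + r) along one
    # axis of length n, i.e. how much of the patch survives clipping.
    return max(0, min(x + r, n) - max(x - r, 0))

# With patch_radius = 4 on a 160-voxel axis:
# an edge voxel keeps only 4 of the 8 window positions,
# while a middle voxel keeps all 8.
```

Since the patch is three-dimensional, the per-voxel work is the product of the three clipped sizes, so a corner voxel can cost up to 8x less than a central one.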

CodePudding user response:

I think I have figured out a bit of what is going on. Because there is a != 0 condition in the loop, the parts of the image with fewer zeros take longer to process. This can be seen by comparing range(0, 50) and range(50, 100).

How can I accelerate my program? By rewriting it with NumPy? Or by trying to use the GPU? Is it normal that a 3-layer loop takes so long to run (considering it involves a function, patch_mean, that calculates the mean over an 8x8x8 block around each point)?
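On the NumPy question: the nonzero-aware patch mean can be vectorized with two box sums built from cumulative sums (an integral-image trick), one over the intensities and one over the nonzero mask. This is a sketch, not the original code: `box_sum` and `patch_means` are illustrative names, and it assumes clipping the window at the array boundary is acceptable in place of the original `0 < k < 159` check:

```python
import numpy as np

def box_sum(a, r):
    # Sum of a over the window range(i - r, i + r) along every axis,
    # clipped at the array boundary. Uses a cumulative sum with a
    # prepended zero so each window sum is one subtraction.
    s = a
    for axis in range(a.ndim):
        c = np.cumsum(s, axis=axis)
        zero = np.zeros_like(np.take(c, [0], axis=axis))
        c = np.concatenate([zero, c], axis=axis)
        n = s.shape[axis]
        hi = np.minimum(np.arange(n) + r, n)
        lo = np.maximum(np.arange(n) - r, 0)
        s = np.take(c, hi, axis=axis) - np.take(c, lo, axis=axis)
    return s

def patch_means(image, patch_radius):
    # Mean of the nonzero voxels in each patch; zero where the patch
    # contains no nonzero voxels, matching the count == 0 branch.
    sums = box_sum(image.astype(float), patch_radius)
    counts = box_sum((image != 0).astype(float), patch_radius)
    return np.where(counts > 0, sums / np.maximum(counts, 1), 0.0)
```

This computes the mean for every voxel at once in a few array passes, instead of roughly (2 * patch_radius)^3 Python-level iterations per voxel, so the 160x192x160 case should become tractable on the CPU before reaching for a GPU.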
