Python: Apply .apply() with a self-defined function to a Data Frame- why doesn't it work?


I am trying to apply a self-defined function to a data frame using .apply(). The goal is to calculate the mean of each row / column with my own function, but it doesn't work; probably I still don't fully understand the logic of .apply(). Can someone help me? Thanks in advance:

import pandas as pd

d = pd.DataFrame({"A": [50, 60, 70], "B": [80, 90, 100]})

def m(x):
    x.sum()/len(x)
    return x

d.apply(m(),axis=0)

CodePudding user response:

If possible, the best approach is a vectorized solution:

df = d.sum() / len(d)
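
As a side note, pandas also provides a built-in mean method that covers both axes directly. A minimal sketch, using the DataFrame d from the question:

import pandas as pd

d = pd.DataFrame({"A": [50, 60, 70], "B": [80, 90, 100]})

# Column means (same result as d.sum() / len(d))
col_means = d.mean()        # A    60.0, B    90.0

# Row means, in case that is what you are after
row_means = d.mean(axis=1)  # 0    65.0, 1    75.0, 2    85.0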

Your solution is possible too, but you need to change the function to return the computed value, and in apply pass the function itself rather than calling it (remove the ()); finally, axis=0 is the default value for that parameter, so it can be removed as well:

def m(x):
    # return the computed mean instead of the original Series
    return x.sum() / len(x)

df = d.apply(m)
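
For completeness, a short sketch of the corrected code run end to end; the commented values are what pandas produces for the DataFrame above:

import pandas as pd

d = pd.DataFrame({"A": [50, 60, 70], "B": [80, 90, 100]})

def m(x):
    return x.sum() / len(x)

print(d.apply(m))           # column means: A 60.0, B 90.0
print(d.apply(m, axis=1))   # row means:    0 65.0, 1 75.0, 2 85.0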