How to subtract a column from another column if a condition is met, otherwise subtract from a different column


I'm working with trading data in Pandas. Given a 4-column OHLC pandas DataFrame that is 100 rows long, I'm trying to calculate whether an "Upper Shadow" exists for each row and store the result in its own column. To calculate the "Upper Shadow", take the row's high (H) value and subtract the open (O) value if the close (C) value is less than the open value; otherwise, subtract the close value.

Right now I'm naively doing this in a for loop where I iterate over each row with an if statement.

for index, row in df.iterrows():
    if row["close"] >= row["open"]:
        df.at[index, "upper_shadow"] = float(row["high"]) - float(row["close"])
    else:
        df.at[index, "upper_shadow"] = float(row["high"]) - float(row["open"])

Is there a better way to do this?

CodePudding user response:

You can use np.maximum to take the element-wise maximum of close and open in a vectorized way:

import numpy as np
df['upper_shadow'] = df['high'] - np.maximum(df['close'], df['open'])
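
For illustration, here is a minimal, self-contained sketch of the same vectorized calculation on a small made-up OHLC DataFrame (the sample values are invented, not from the question):

import numpy as np
import pandas as pd

# Hypothetical OHLC data, chosen only to illustrate both candle directions
df = pd.DataFrame({
    "open":  [10.0, 12.0, 11.0],
    "high":  [12.5, 12.75, 11.25],
    "low":   [9.5, 10.75, 10.0],
    "close": [12.0, 11.0, 10.5],
})

# Upper shadow = high minus the larger of open and close, computed for all rows at once
df["upper_shadow"] = df["high"] - np.maximum(df["close"], df["open"])

print(df["upper_shadow"].tolist())  # [0.5, 0.75, 0.25]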

CodePudding user response:

I think @Psidom's solution is what you are looking for. However, the following piece of code is another way of writing what you already have, using apply with a lambda:

df["upper_shadow"] = df.apply(lambda row: float(row["high"]) - float(row["close"]) if row["close"] >= row["open"] else float(row["high"]) - float(row["open"]),axis=1)