How to calculate the cumulative sum of a column and create a new column?


I have a pyspark dataframe:

Location    Month       Brand   Sector  TrueValue   PickoutValue
USA         1/1/2021    brand1  cars1   7418        30000       
USA         2/1/2021    brand1  cars1   1940        2000        
USA         3/1/2021    brand1  cars1   4692        2900        
USA         4/1/2021    brand1  cars1                           
USA         1/1/2021    brand2  cars2   16383104.2  16666667    
USA         2/1/2021    brand2  cars2   26812874.2  16666667    
USA         3/1/2021    brand2  cars2                           
USA         1/1/2021    brand3  cars3   75.6%       70.0%
USA         3/1/2021    brand3  cars3   73.1%       70.0%
USA         2/1/2021    brand3  cars3   77.1%       70.0%

I have Month values from 1/1/2021 to 12/1/2021 for each Brand. I need to create another column with the cumulative sum of the TrueValue column, partitioned by Brand and Sector and ordered by Month. For the rows holding % values, the new column should be the cumulative sum divided by the number of months so far (i.e., a running average).

My expected dataframe is:

Location    Month       Brand   Sector  TrueValue   PickoutValue    TotalSumValue   
USA         1/1/2021    brand1  cars1   7418        30000           7418
USA         2/1/2021    brand1  cars1   1940        2000            9358
USA         3/1/2021    brand1  cars1   4692        2900            14050
USA         4/1/2021    brand1  cars1                               14050
USA         1/1/2021    brand2  cars2   16383104.2  16666667        16383104.2
USA         2/1/2021    brand2  cars2   26812874.2  16666667        43195978.4
USA         3/1/2021    brand2  cars2                               43195978.4
USA         1/1/2021    brand3  cars3   75.6%       70.0%           75.6%
USA         3/1/2021    brand3  cars3   73.1%       70.0%           75.3%
USA         2/1/2021    brand3  cars3   77.1%       70.0%           76.3%

For the rows having % values, this is how I need to calculate the cumulative sum ordering by month:

(75.6 + 0)/1 = 75.6%

(75.6 + 77.1)/2 = 76.3%

(75.6 + 77.1 + 73.1)/3 = 75.3%

I'm able to generate the cumulative sum, but I'm not getting the correct cumulative values for the rows containing %.

This is my code block:

import pyspark.sql.functions as F
from pyspark.sql.functions import to_timestamp
from pyspark.sql.window import Window

df = df.withColumn("month_in_timestamp", to_timestamp(df.Month, 'dd/MM/yyyy'))

windowval = (Window.partitionBy('Brand','Sector').orderBy('Month')
             .rangeBetween(Window.unboundedPreceding, 0))
df = df.withColumn('TotalSumValue', F.sum('TrueValue').over(windowval))

CodePudding user response:

It seems the calculation for the values with % is a cumulative average. If so, you can apply a cumulative sum to the values that do not contain %, and a cumulative average to the values that do (stripping the percent sign before the calculation). You can use when-otherwise to apply both calculations.

import pyspark.sql.functions as F
from pyspark.sql.window import Window

df = df.withColumn("month_in_timestamp", F.to_timestamp(F.col("Month"), 'dd/MM/yyyy'))

# use 'month_in_timestamp' instead of 'month' 
windowval = (Window.partitionBy('Brand','Sector').orderBy('month_in_timestamp')
             .rangeBetween(Window.unboundedPreceding, 0))

df = df.withColumn("TotalSumValue", 
                   F.when(F.col("TrueValue").contains("%"), 
                          F.concat(F.avg(F.expr("replace(TrueValue, '%', '')")).over(windowval).cast("decimal(4,1)"), F.lit("%")))
                    .otherwise(F.sum('TrueValue').over(windowval).cast("decimal(13,1)")))

df.show()

# +--------+--------+------+------+----------+------------+-------------------+-------------+
# |Location|   Month| Brand|Sector| TrueValue|PickoutValue| month_in_timestamp|TotalSumValue|
# +--------+--------+------+------+----------+------------+-------------------+-------------+
# |     USA|1/1/2021|brand1| cars1|      7418|       30000|2021-01-01 00:00:00|       7418.0|
# |     USA|2/1/2021|brand1| cars1|      1940|        2000|2021-01-02 00:00:00|       9358.0|
# |     USA|3/1/2021|brand1| cars1|      4692|        2900|2021-01-03 00:00:00|      14050.0|
# |     USA|4/1/2021|brand1| cars1|      null|        null|2021-01-04 00:00:00|      14050.0|
# |     USA|1/1/2021|brand2| cars2|16383104.2|    16666667|2021-01-01 00:00:00|   16383104.2|
# |     USA|2/1/2021|brand2| cars2|26812874.2|    16666667|2021-01-02 00:00:00|   43195978.4|
# |     USA|3/1/2021|brand2| cars2|      null|        null|2021-01-03 00:00:00|   43195978.4|
# |     USA|1/1/2021|brand3| cars3|     75.6%|       70.0%|2021-01-01 00:00:00|        75.6%|
# |     USA|2/1/2021|brand3| cars3|     77.1%|       70.0%|2021-01-02 00:00:00|        76.4%|
# |     USA|3/1/2021|brand3| cars3|     73.1%|       70.0%|2021-01-03 00:00:00|        75.3%|
# +--------+--------+------+------+----------+------------+-------------------+-------------+
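
For reference, here is a minimal, self-contained sketch that rebuilds the sample DataFrame from the question and applies the same transformation, so the snippet above can be run end to end. The local SparkSession setup and the all-string schema are assumptions; only the column names and rows come from the question.

import pyspark.sql.functions as F
from pyspark.sql import SparkSession
from pyspark.sql.window import Window

spark = SparkSession.builder.master("local[*]").getOrCreate()

# Sample rows from the question; empty TrueValue/PickoutValue cells are None,
# and every column is kept as a string since TrueValue mixes numbers and percentages
data = [
    ("USA", "1/1/2021", "brand1", "cars1", "7418", "30000"),
    ("USA", "2/1/2021", "brand1", "cars1", "1940", "2000"),
    ("USA", "3/1/2021", "brand1", "cars1", "4692", "2900"),
    ("USA", "4/1/2021", "brand1", "cars1", None, None),
    ("USA", "1/1/2021", "brand2", "cars2", "16383104.2", "16666667"),
    ("USA", "2/1/2021", "brand2", "cars2", "26812874.2", "16666667"),
    ("USA", "3/1/2021", "brand2", "cars2", None, None),
    ("USA", "1/1/2021", "brand3", "cars3", "75.6%", "70.0%"),
    ("USA", "3/1/2021", "brand3", "cars3", "73.1%", "70.0%"),
    ("USA", "2/1/2021", "brand3", "cars3", "77.1%", "70.0%"),
]
cols = ["Location", "Month", "Brand", "Sector", "TrueValue", "PickoutValue"]
df = spark.createDataFrame(data, cols)

# Same logic as the answer: parse the month, window by Brand/Sector ordered by the
# parsed timestamp, then cumulative average for % rows and cumulative sum otherwise
df = df.withColumn("month_in_timestamp", F.to_timestamp(F.col("Month"), "dd/MM/yyyy"))

windowval = (Window.partitionBy("Brand", "Sector").orderBy("month_in_timestamp")
             .rangeBetween(Window.unboundedPreceding, 0))

df = df.withColumn(
    "TotalSumValue",
    F.when(F.col("TrueValue").contains("%"),
           F.concat(F.avg(F.expr("replace(TrueValue, '%', '')")).over(windowval)
                     .cast("decimal(4,1)"), F.lit("%")))
     .otherwise(F.sum("TrueValue").over(windowval).cast("decimal(13,1)")))

df.show()

Note that brand3's 2/1/2021 row comes out as 76.4% rather than the 76.3% shown in the expected output: (75.6 + 77.1)/2 = 76.35, and the decimal(4,1) cast rounds it up, as the output above shows. If truncation is preferred instead, the average could be floored to one decimal place (for example with F.floor on the average multiplied by 10, then divided by 10) before concatenating the % sign.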