I have the following function that converts values to a Decimal object:

from decimal import Decimal

def convert_to_decimal(value: str | int, decimals: int) -> Decimal:
    divisor = 10 ** decimals
    return Decimal(value) / divisor
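Called on a single value it works as expected, for example (sample values only):

convert_to_decimal(value="1000000000000000000", decimals=18)  # Decimal('1')
convert_to_decimal(value=5 * 10**17, decimals=18)             # Decimal('0.5')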
I want to apply it to the values in a column of a DataFrame. df['numbers'] is simply a Series object with int and NaN values in it.
My current code is:

df['numbers'] = df['numbers'].apply(convert_to_decimal(value=df['numbers'], decimals=18))
But I get the following error:

TypeError: conversion from Series to Decimal is not supported
I simply want to convert every number in df['numbers'] into a Decimal object.
CodePudding user response:
You need two changes:
- Call apply() on df[['numbers']], which is a pandas.DataFrame rather than a pandas.Series, so that apply(..., axis=1) receives one row at a time.
- Wrap the call to convert_to_decimal in a lambda instead of calling it directly. In your current code, convert_to_decimal(value=df['numbers'], decimals=18) is evaluated before apply() ever runs, so the whole Series is passed to Decimal(), which is what raises the TypeError.

Putting both together:
df[['numbers']].apply(lambda row: convert_to_decimal(value=row['numbers'], decimals=18), axis=1)
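For reference, here is a minimal end-to-end sketch of the fix using made-up sample data (including a NaN, as in your column); assign the result back to the column if you want to overwrite it:

import numpy as np
import pandas as pd
from decimal import Decimal

def convert_to_decimal(value: str | int, decimals: int) -> Decimal:
    divisor = 10 ** decimals
    return Decimal(value) / divisor

# Example column with ints and a NaN (the NaN forces the column to float dtype).
df = pd.DataFrame({'numbers': [10**18, 5 * 10**17, np.nan]})

# axis=1 passes each row of the single-column DataFrame to the lambda,
# which pulls out the 'numbers' cell and forwards the extra decimals argument.
df['numbers'] = df[['numbers']].apply(
    lambda row: convert_to_decimal(value=row['numbers'], decimals=18), axis=1
)

print(df['numbers'].tolist())
# -> [Decimal('1'), Decimal('0.5'), Decimal('NaN')]

Note that the NaN comes through as Decimal('NaN') rather than raising, since Decimal() accepts float NaN.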