What's the nicest most-pythonic way to select rows in pandas for datetime conditions


Filtering rows on scalar and logical conditions is well known and widely documented, e.g. df[df['col'] >= 0] filters out the negative rows. Now that I'm working with datetime data, I wonder which approaches work; I may answer with methods I find or try out myself.

CodePudding user response:

In pandas, as long as your datetime data is actually stored as a datetime dtype, it behaves very similarly. Here are some examples and an additional link for reference.

Filter rows between two dates.

df[(df['date'] > '2019-12-01') & (df['date'] < '2019-12-31')]
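For instance, a minimal runnable sketch (assuming a column named 'date' that starts out as strings and is converted with pd.to_datetime first) could look like this:

import pandas as pd

# Sample data; the 'date' column starts out as plain strings.
df = pd.DataFrame({
    'date': ['2019-11-30', '2019-12-15', '2020-01-02'],
    'value': [1, 2, 3],
})

# Make sure the column is a real datetime64 dtype before comparing.
df['date'] = pd.to_datetime(df['date'])

# The boolean mask keeps only rows strictly inside December 2019.
december = df[(df['date'] > '2019-12-01') & (df['date'] < '2019-12-31')]
print(december)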

Filter rows by index

df2.loc['2019-12-01':'2019-12-31']
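Note that this kind of .loc slicing assumes the DataFrame has a DatetimeIndex; a minimal sketch (with a hypothetical df2 built from the same sample dates) would be:

import pandas as pd

df2 = pd.DataFrame(
    {'value': [1, 2, 3]},
    index=pd.to_datetime(['2019-11-30', '2019-12-15', '2020-01-02']),
)

# With a sorted DatetimeIndex, label slicing selects the date range directly;
# both endpoints are inclusive here.
december = df2.loc['2019-12-01':'2019-12-31']
print(december)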

https://datascientyst.com/filter-by-date-pandas-dataframe/
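The page linked above also covers filtering by date with DataFrame.query; a minimal sketch of that approach (again assuming a datetime64 column named 'date') would be:

import pandas as pd

df = pd.DataFrame({
    'date': pd.to_datetime(['2019-11-30', '2019-12-15', '2020-01-02']),
    'value': [1, 2, 3],
})

# query() parses the date strings when comparing against a datetime64 column.
december = df.query("date > '2019-12-01' and date < '2019-12-31'")
print(december)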

CodePudding user response:

To complete Omnishroom's answer: if the column in your dataframe df is already a datetime type, what he mentioned works as-is. But if your column is stored as strings like mine, convert it with pd.to_datetime first and then compare the date part with .dt.date as follows:

df[df['date_column'].dt.date == datetime.date(2007, 1, 1)]

Refer to Omnishroom's answer for more; I've only added .dt.date. Also, since my datetime strings were in a different format, I didn't use that kind of approach.
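Putting that together, here is a minimal sketch of the string-column case (the column name 'date_column' is only a placeholder): the strings are converted with pd.to_datetime first, and .dt.date then exposes the date part for an exact match:

import datetime
import pandas as pd

df = pd.DataFrame({
    'date_column': ['2007-01-01 09:30', '2007-01-01 17:45', '2007-02-03 08:00'],
    'value': [1, 2, 3],
})

# Strings must be converted before the .dt accessor can be used.
df['date_column'] = pd.to_datetime(df['date_column'])

# .dt.date drops the time component, so this matches every row on that day.
jan_first = df[df['date_column'].dt.date == datetime.date(2007, 1, 1)]
print(jan_first)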
