Filtering rows on scalar and logical conditions is well known and widely covered here; e.g. df[df['col'] >= 0] can be used to filter the negative rows out. However, I've now come to work with datetime data and wonder which approaches work there; I might answer with approaches I find or try out myself.
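For instance, a minimal sketch of that scalar filtering (the column name 'col' follows the example above; the values are illustrative, not from the original post):

import pandas as pd

# Keep only the non-negative rows via a boolean mask.
df = pd.DataFrame({'col': [-2, -1, 0, 3, 5]})
print(df[df['col'] >= 0])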
CodePudding user response:
In pandas, as long as you have all of your datetime data formatted as datetime, it behaves very similarly. Here are some examples and an additional link to reference.
Filter rows between two dates:
df[(df['date'] > '2019-12-01') & (df['date'] < '2019-12-31')]
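For instance, a self-contained sketch of that mask (the column name 'date' and the sample values are assumptions):

import pandas as pd

# Sample frame with a proper datetime64 column.
df = pd.DataFrame({
    'date': pd.to_datetime(['2019-11-15', '2019-12-10', '2019-12-20', '2020-01-05']),
    'value': [10, 20, 30, 40],
})

# The string literals are parsed and compared against the datetime column;
# only the two December rows strictly between the bounds survive.
print(df[(df['date'] > '2019-12-01') & (df['date'] < '2019-12-31')])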
Filter rows by index:
df2.loc['2019-12-01':'2019-12-31']
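A self-contained sketch of the index-based variant (the name df2 and the sample dates are assumptions); note that .loc slicing on a DatetimeIndex includes both endpoints:

import pandas as pd

# Frame indexed by a DatetimeIndex.
df2 = pd.DataFrame(
    {'value': [1, 2, 3, 4]},
    index=pd.to_datetime(['2019-11-30', '2019-12-05', '2019-12-20', '2020-01-01']),
)

# Label-based slicing on the datetime index (endpoints inclusive).
print(df2.loc['2019-12-01':'2019-12-31'])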
CodePudding user response:
To complete the answer by Omnishroom: if the column in your dataframe df is already a datetime type, you will have no problem using what he mentioned. But if your column is a string like mine, first convert it with pd.to_datetime and then use .dt.date as follows:
df[df['date_column'].dt.date == datetime.date(2007, 1, 1)]
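A minimal sketch of the whole path (the column name and sample values are assumptions): parse the strings with pd.to_datetime first, then compare .dt.date against a datetime.date:

import datetime
import pandas as pd

# Strings are parsed to datetime64 first; .dt.date then yields datetime.date
# objects that can be compared against datetime.date(2007, 1, 1).
df = pd.DataFrame({'date_column': ['2007-01-01 09:30', '2007-01-02 10:00'],
                   'value': [1, 2]})
df['date_column'] = pd.to_datetime(df['date_column'])
print(df[df['date_column'].dt.date == datetime.date(2007, 1, 1)])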
Refer to Omnishroom's answer for more; I've just added .dt.date. Also, since my datetime strings were in another format, I didn't use that type of approach directly.
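If the strings are in a non-default order, one way is to pass an explicit format to pd.to_datetime before applying any of the filters above (a sketch; the day/month/year format is an assumption):

import pandas as pd

# Hypothetical day/month/year strings; the explicit format avoids ambiguous parsing.
s = pd.Series(['01/12/2019', '15/12/2019', '02/01/2020'])
print(pd.to_datetime(s, format='%d/%m/%Y'))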