Group log statements per second


I wrote a logger that logs some CAN bus values to a file, where the timestamp has one-second resolution. An example:

Timestamp, Can ID, data, datalength
07-Nov-22 13:40:41, 418385469, a00f325bffffffff, 8
07-Nov-22 13:40:41, 217056317, ffffff0000ffffff, 8
07-Nov-22 13:40:41, 486535229, 000000000000ffff, 8
07-Nov-22 13:40:41, 217064509, 0000ffffff0000ff, 8
07-Nov-22 13:40:42, 418385469, a00f325bffffffff, 8
07-Nov-22 13:40:42, 217056317, ffffff0000ffffff, 8
07-Nov-22 13:40:42, 486535229, 000000000000ffff, 8
07-Nov-22 13:40:42, 217064509, 0000ffffff0000ff, 8

I want to write code that reads the .csv log file and produces a new one where the data is arranged as follows:

Timestamp, 418385469, 217056317, 486535229, 217064509
07-Nov-22 13:40:41, a00f325bffffffff, ffffff0000ffffff, 000000000000ffff, 0000ffffff0000ff
07-Nov-22 13:40:42, a00f325bffffffff, ffffff0000ffffff, 000000000000ffff, 0000ffffff0000ff

So, group the data of every CAN bus message per second. I'm quite a noob with Python, so I've tried quite a few things but can't figure it out.

I've tried a few different pandas methods, but they don't give the desired result, and I can't find a matching solution online.

CodePudding user response:

If you load the CSV into a DataFrame df, then you can do:

df = df.pivot(index="Timestamp", columns="Can ID", values="data").reset_index()
df.columns.name = None

print(df)

Prints:

            Timestamp         217056317         217064509         418385469         486535229
0  07-Nov-22 13:40:41  ffffff0000ffffff  0000ffffff0000ff  a00f325bffffffff  000000000000ffff
1  07-Nov-22 13:40:42  ffffff0000ffffff  0000ffffff0000ff  a00f325bffffffff  000000000000ffff
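A minimal end-to-end sketch of the whole flow, including the CSV read and write. Note that your file has a space after each comma, so `skipinitialspace=True` is needed when reading; the inline sample data and the output file name `sorted_log.csv` are just placeholders for illustration.

```python
import io
import pandas as pd

# Inline sample of the log format from the question; in practice you would
# use pd.read_csv("your_log.csv", skipinitialspace=True) instead.
log = io.StringIO(
    "Timestamp, Can ID, data, datalength\n"
    "07-Nov-22 13:40:41, 418385469, a00f325bffffffff, 8\n"
    "07-Nov-22 13:40:41, 217056317, ffffff0000ffffff, 8\n"
    "07-Nov-22 13:40:42, 418385469, a00f325bffffffff, 8\n"
    "07-Nov-22 13:40:42, 217056317, ffffff0000ffffff, 8\n"
)
# skipinitialspace=True strips the space after each comma, so the column
# names and hex strings come through clean.
df = pd.read_csv(log, skipinitialspace=True)

# One row per timestamp, one column per CAN ID.
wide = df.pivot(index="Timestamp", columns="Can ID", values="data").reset_index()
wide.columns.name = None

# pivot sorts the ID columns numerically; to keep them in first-appearance
# order (as in the desired output above), reindex the columns:
wide = wide[["Timestamp"] + list(df["Can ID"].drop_duplicates())]

wide.to_csv("sorted_log.csv", index=False)
```

One caveat: `pivot` raises a ValueError if the same CAN ID appears twice within the same second. If that can happen in your log, `df.pivot_table(..., aggfunc="last")` keeps the most recent value instead.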