I have a market price logging table with over 300 columns. The system logs data every 15 seconds, so there are 4 rows per minute.
_id | the_datetime | column1 | column2 | column3 |
---|---|---|---|---|
1 | 2021-12-26 19:05:00 | 0.25 | 36.50 | 15.12 |
2 | 2021-12-26 19:05:15 | 0.25 | 36.60 | 15.12 |
3 | 2021-12-26 19:05:30 | 0.21 | 36.12 | 15.20 |
4 | 2021-12-26 19:05:45 | 0.32 | 36.50 | 15.22 |
5 | 2021-12-26 19:06:00 | 0.35 | 36.25 | 15.23 |
6 | 2021-12-26 19:06:15 | 0.36 | 36.33 | 15.24 |
7 | 2021-12-26 19:06:30 | 0.36 | 36.50 | 15.36 |
For column1 I tried:
SELECT _id,
       column1,
       LAG(column1, 12) over (partition by _id order by _id desc) as column3min
FROM the_table
order by _id desc
For every columnX and for every row inserted, I need to check whether the price change over the last 3 minutes is greater than 0.5%.
I tried to write a query with the LAG window function, but it doesn't work.
Can anyone help me write this SQL for PostgreSQL?
CodePudding user response:
You shouldn't partition by _id, and you should order by "the_datetime".
with u as
(SELECT _id,
        column1,
        -- 12 rows back = 3 minutes at one row every 15 seconds
        LAG(column1, 12) over (order by "the_datetime") as column3min
 FROM the_table)
select * from u
where column1 / column3min >= 1.005 or column3min / column1 >= 1.005
Fiddle: I added some entries to show it's working.
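Since the question asks about checking every columnX, here is a minimal sketch of how the same pattern could be extended to more columns. column2 comes from the sample table; the alias names and the repeated per-column predicate are my own additions, not part of the original answer:

-- Sketch: the same LAG approach applied to two columns at once.
-- For the first 12 rows the lagged values are NULL, so those rows drop out of the WHERE clause.
with u as
(SELECT _id,
        the_datetime,
        column1,
        column2,
        LAG(column1, 12) over (order by "the_datetime") as column1_3min,
        LAG(column2, 12) over (order by "the_datetime") as column2_3min
 FROM the_table)
select *
from u
where column1 / column1_3min >= 1.005 or column1_3min / column1 >= 1.005
   or column2 / column2_3min >= 1.005 or column2_3min / column2 >= 1.005

Note that LAG(..., 12) assumes a row really exists every 15 seconds. If rows can be missing, a time-based frame such as RANGE BETWEEN INTERVAL '3 minutes' PRECEDING AND CURRENT ROW together with first_value() (available in PostgreSQL 11+) would be a possible variation, but that goes beyond what the answer above uses.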