How to optimize a tens-of-millions-row SQL GROUP BY: the query takes 9 seconds and the coupon field is indexed

Time:09-25

SELECT coupon, COUNT(1) AS recCount
FROM xx_coupon_card
GROUP BY coupon

This SQL takes 9 seconds; the table has 35 million rows and the coupon field is indexed.

CodePudding user response:

I suggest trying an indexed view.

-- Create an indexed view
CREATE VIEW dbo.vxx_coupon_card
WITH SCHEMABINDING
AS
SELECT coupon, COUNT_BIG(*) AS recCount   -- indexed views with GROUP BY require COUNT_BIG(*)
FROM dbo.xx_coupon_card                   -- schema binding requires the two-part table name
GROUP BY coupon
GO

CREATE UNIQUE CLUSTERED INDEX ix_vxx_coupon_card_coupon ON dbo.vxx_coupon_card (coupon)
GO

-- Then run the query again
SELECT coupon, COUNT(1) AS recCount
FROM xx_coupon_card
GROUP BY coupon
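One detail worth adding here, which the reply above does not mention and which is my own assumption about the setup: on non-Enterprise editions of SQL Server the optimizer will not automatically match the query to the indexed view, so you may need to query the view directly with the NOEXPAND hint, roughly like this:

-- Query the indexed view directly; NOEXPAND forces use of its clustered index
SELECT coupon, recCount
FROM dbo.vxx_coupon_card WITH (NOEXPAND)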

CodePudding user response:

With tens of millions of records and a single statement like this, optimization won't help much, because the bottleneck is not the aggregation itself but disk reads and writes. Unless a WHERE condition limits the query to reading only a small amount of data, it will stay slow; on a solid-state disk it would be much better. Optimizing at this data scale is not simply a matter of tuning the SQL statement.
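If disk reads really are the bottleneck, one possible way to shrink the number of pages scanned (my own suggestion, not from the reply above, and it needs an edition that supports data compression) is to rebuild the index on coupon with page compression; the index name ix_coupon below is hypothetical:

-- Hypothetical index name; PAGE compression reduces on-disk size, so the scan reads fewer pages
ALTER INDEX ix_coupon ON dbo.xx_coupon_card
REBUILD WITH (DATA_COMPRESSION = PAGE)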

CodePudding user response:

It depends on how many groups the GROUP BY on coupon produces; the more groups (and the more records), the more time it takes.
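To see how many groups the statement actually produces, a quick check (just a sketch, not part of the original reply) would be:

-- Count the number of distinct coupon values, i.e. the number of groups
SELECT COUNT(DISTINCT coupon) AS groupCount
FROM xx_coupon_card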

CodePudding user response:

The suggestion in #1 amounts to an automatically maintained, pre-aggregated result, which is the most efficient approach.
The indexed view works like a result table: querying it returns the answer in well under a second, since nothing has to be aggregated at query time.

The only drawback is that if inserts and deletes on the base table are particularly frequent, the probability of deadlocks goes up.
If you're sure that won't be a problem, feel free to give it a try.

CodePudding user response:

Is the data sorted by coupon when it is stored?
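One way to check whether the data is physically ordered by coupon (i.e. whether coupon is the clustered index key) is to list the table's indexes; this is a standard check I'm adding, not something from the thread:

-- List the indexes on the table and their key columns
EXEC sp_helpindex 'xx_coupon_card'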

CodePudding user response:

As far as the SQL statement itself goes, there is nothing left to optimize.
But what is the actual usage scenario for this statement? That is worth thinking about.

CodePudding user response:

Get a better server.

CodePudding user response:

For tens of millions of rows, 9 seconds is about right. How fast do you want it to be?

CodePudding user response:

Quoting the reply from datafansbj on the 2nd floor:
With tens of millions of records and a single statement like this, optimization won't help much, because the bottleneck is not the aggregation but disk reads and writes. Unless a WHERE condition limits the query to reading only a small amount of data, it will stay slow; on a solid-state disk it would be much better. Optimizing at this data scale is not simply a matter of tuning the SQL statement.

Floor 2 is right. At this data volume the performance problem is disk reads; you can increase the database cache, add appropriate indexes, and do some moderate optimization.
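As a rough illustration of "increase the database cache" (assuming SQL Server and that the machine has spare RAM; the 16 GB figure is only an example, not a recommendation from the thread), the buffer pool ceiling can be raised via max server memory:

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
-- Allow SQL Server to use up to 16 GB for its buffer pool (example value only)
EXEC sp_configure 'max server memory (MB)', 16384
RECONFIGURE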