How to optimise simple MySql query of a 50M row table


I have a MySQL table with a little over 50M records (over 13 GB); there are 24 numerical fields (int), one varchar, and the PK. All the fields are indexed, and yet simple queries like SELECT COUNT(gID) FROM tgame t WHERE t.gPri=1 OR t.gPri=2 take over three minutes to run. The data are static, so there will be no more records added or deleted. What could be done to improve performance? The current state of affairs renders the application unviable.

CodePudding user response:

Take a look at this answer, MySQL optimization of huge table; it has a collection of answers to problems similar to yours.

CodePudding user response:

Depending on how varied the queries against this data are likely to be, there could be an alternative to tuning the server's performance. You mention that the data is static, so you could have an additional table that stores the counts for the different columns and values, and then query that instead of counting up the same rows every time just to get the same answers.
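
A minimal sketch of that idea, assuming the table and column names from the question (the summary-table name tgame_counts is illustrative):

CREATE TABLE tgame_counts (
    gPri INT NOT NULL PRIMARY KEY,
    cnt  BIGINT NOT NULL
);

-- Populate it once, since the data never changes
INSERT INTO tgame_counts (gPri, cnt)
    SELECT gPri, COUNT(*)
    FROM tgame
    GROUP BY gPri;

-- The original three-minute query becomes a lookup over a handful of rows
SELECT SUM(cnt)
    FROM tgame_counts
    WHERE gPri IN (1, 2);

The same pattern works for any other column you filter on often: build one small summary table per column (or per combination of columns) and refresh it only if the base data ever changes.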

CodePudding user response:

SELECT COUNT(gID)         -- This checks whether gID is NULL
    FROM tgame t
    WHERE t.gPri=1 OR t.gPri=2;   -- OR is often deadly for performance

-->

SELECT COUNT(*)        -- This counts rows
    FROM tgame t
    WHERE t.gPri IN (1, 2);

Please provide SHOW CREATE TABLE and EXPLAIN SELECT ...
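
For example, running something like the following (using the rewritten count from above as the slow query) would show the table definition and whether the index on gPri is actually being used:

SHOW CREATE TABLE tgame;

EXPLAIN SELECT COUNT(*)
    FROM tgame t
    WHERE t.gPri IN (1, 2);

The EXPLAIN output indicates which index, if any, MySQL chooses and roughly how many rows it expects to examine.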
