This question is from HackerRank (Top Earners).
I wonder why this query doesn't work:

SELECT MAX(months * salary),
       COUNT(*)
FROM Employee
WHERE (months * salary)
      = MAX(months * salary)

but this query does:

SELECT MAX(months * salary),
       COUNT(*)
FROM Employee
WHERE (months * salary)
      = (SELECT MAX(months * salary) FROM Employee)

Could you explain the reason? I am trying to understand the evaluation logic of SQL. Thanks.
CodePudding user response:
The first query doesn't work because an aggregate function such as MAX() cannot appear in a WHERE clause. WHERE filters individual rows before any aggregation happens, so there is no aggregate value available at that point; aggregates are only legal where groups of rows exist, i.e. in the SELECT list, the HAVING clause, or ORDER BY.
The second query works because it uses a subquery in the WHERE clause: the subquery computes the maximum value of months * salary over the whole Employee table first, and the outer query then compares each row against that single result to determine which rows match.
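You can see both behaviors with a quick experiment. The sketch below uses Python's built-in sqlite3 module and a few made-up sample rows (the names and numbers are illustrative, not HackerRank's actual data); other engines reject the first query too, just with different error messages.

```python
import sqlite3

# Hypothetical in-memory Employee table mirroring the HackerRank schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Employee (name TEXT, months INTEGER, salary INTEGER)")
conn.executemany(
    "INSERT INTO Employee VALUES (?, ?, ?)",
    [("Rose", 15, 1968), ("Angela", 1, 3443), ("Frank", 17, 1608)],
)

# Query 1: the bare aggregate in WHERE is rejected by the engine.
try:
    conn.execute(
        "SELECT MAX(months * salary), COUNT(*) FROM Employee "
        "WHERE (months * salary) = MAX(months * salary)"
    )
except sqlite3.OperationalError as e:
    print("rejected:", e)  # SQLite reports a misuse-of-aggregate error

# Query 2: the subquery form works; the inner query collapses to one scalar.
row = conn.execute(
    "SELECT MAX(months * salary), COUNT(*) FROM Employee "
    "WHERE (months * salary) = (SELECT MAX(months * salary) FROM Employee)"
).fetchone()
print(row)  # → (29520, 1): Rose's 15 * 1968 is the top earnings, earned by 1 employee
```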
CodePudding user response:
The expression in a WHERE clause is evaluated against a single row of the outer query at a time. It can't "see" rows other than the one it is being evaluated for, so it can't compute an aggregate like MAX().
The subquery changes that. The subquery computes the aggregate itself, looking at many rows, and returns the result as a single scalar value to the expression in the outer query. The outer query can then compare that scalar against each row as if it were a constant, still one row at a time.