Oracle ORDER BY on 20,000 rows takes 2 minutes: please help optimize

Time: 09-18

select a1.fd_id,
       a1.fd_motive,
       a1.fd_proposer,
       a1.fd_employee_id,
       (select d.fd_name
          from sys_org_dept d
         where fd_parent_no is not null
           and exists (select 1
                         from sys_org_dept x
                        where x.fd_no = d.fd_parent_no
                          and x.fd_parent_no is null)
         start with d.fd_no = a2.fd_dept_no
        connect by prior d.fd_parent_no = d.fd_no) as fd_sys,
       a2.fd_dept as fd_dept,
       a2.fd_dept_no,
       to_char(a1.fd_create_time, 'yyyy-mm-dd hh24:mi:ss') as fd_create_time,
       to_char(a1.fd_publish_time, 'yyyy-mm-dd hh24:mi:ss') as fd_publish_time,
       a1.fd_status,
       a1.fd_category,
       (select to_char(wm_concat(t.fd_employee_id))
          from web_share_people t
         where t.fd_whole_innovate_id = a1.fd_id) as fd_employee_ids,
       (select to_char(wm_concat(t.fd_name))
          from web_share_people t
         where t.fd_whole_innovate_id = a1.fd_id) as fd_names,
       (select to_char(wm_concat(t.fd_weight || '%'))
          from web_share_people t
         where t.fd_whole_innovate_id = a1.fd_id) as fd_weights,
       a1.fd_detailed_information,
       a1.fd_improve,
       a1.fd_effect,
       a1.fd_rank,
       a1.fd_score,
       (select fd_name
          from sys_org_dept s
         where s.fd_no = a2.fd_dept_no) as fd_model,
       a1.fd_auditor,
       a1.fd_efficiency_elevate1,
       a1.fd_efficiency_elevate2,
       a1.fd_profit_add,
       a1.fd_problem_lessen,
       a1.fd_area_save
  from web_whole_innovate a1
  left join sys_org_person a2
    on a1.fd_employee_id = a2.fd_no
 order by a1.fd_create_time

Without the ORDER BY the query returns in milliseconds; with it, it takes 2 minutes. I have already added an index on the column, but it does not help. Could some expert take a look?

CodePudding user response:

Try building an index like this:
create index idx_test on web_whole_innovate (fd_create_time, 0);
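A note on why the trailing constant can help (this is general Oracle B-tree behavior, not stated in the thread): a single-column index stores no entry for rows where the column is NULL, so if fd_create_time is nullable an index on it alone cannot cover all rows of the ORDER BY. Adding a constant second expression forces every row into the index, so the optimizer may use an INDEX FULL SCAN that returns rows already sorted and drops the SORT ORDER BY step. A sketch (the hint and column list are illustrative):

```sql
create index idx_test on web_whole_innovate (fd_create_time, 0);

-- With the index available, the sort step can disappear from the plan:
select /*+ index(a1 idx_test) */ a1.fd_id, a1.fd_create_time
  from web_whole_innovate a1
 order by a1.fd_create_time;
```

Whether the optimizer actually chooses the index full scan depends on statistics and on how many rows the query ultimately fetches.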

CodePudding user response:

Quoting the reply on the 1st floor from minsic78:
try building an index like this:
create index idx_test on web_whole_innovate (fd_create_time, 0);


I may have misunderstood... Is this SQL paged by an outer wrapper? Does "20,000 records" mean the amount of data the SQL finally returns is 20,000 rows?
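The paging question matters because a top-N query sorted on an indexed column can stop early instead of sorting everything. If the application pages with ROWNUM (a common Oracle pattern; this wrapper is an assumption, it is not shown in the thread), the shape would be:

```sql
-- Hypothetical paging wrapper around the posted statement:
select *
  from (select t.*, rownum rn
          from ( /* the original query, ending in order by a1.fd_create_time */ ) t
         where rownum <= 20)
 where rn > 0;
```

With such a wrapper and a usable index on the sort column, Oracle can fetch only the first page; without any paging, all 20,000 rows (and every correlated scalar subquery) must be evaluated before the first row is returned.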

Also, capture the actual execution plan; it is the best way to analyze where the performance bottleneck is.
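One standard way to capture the actual plan is DBMS_XPLAN (a built-in Oracle package; the simplified query below stands in for the full posted statement):

```sql
-- Run the statement once with runtime statistics enabled:
select /*+ gather_plan_statistics */ a1.fd_id
  from web_whole_innovate a1
 order by a1.fd_create_time;

-- Then show the actual plan of the last cursor, including
-- actual rows (A-Rows) and time (A-Time) per step:
select * from table(dbms_xplan.display_cursor(null, null, 'ALLSTATS LAST'));
```

A large SORT ORDER BY step spilling to TEMP, or the scalar subqueries executing once per row, would show up clearly in that output.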