Question
I have been asked to propose an optimal approach for the following task.
We have a view that extracts data from multiple tables; we then have to apply some business logic to the extracted data and insert the processed data into another table. The problem is that the view is very complicated: executing it returns about 40 million records, which itself takes a lot of time. Out of all those records, we only need to apply the logic to approximately 25 million.
My suggestion was to insert those 25 million records into a global temporary table, perform all the business logic on that temp table, and then insert the processed data into the final table.
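To make that concrete, here is a rough sketch of what I have in mind (the table names, columns, and filter condition below are placeholders, not our real schema):

create global temporary table staging_gtt (
    col1 number,
    col2 varchar2(100)
) on commit preserve rows;

-- Stage only the rows that actually need processing
-- (needs_processing is a hypothetical filter column).
insert into staging_gtt (col1, col2)
select col1, col2
from my_view
where needs_processing = 'Y';

-- ... apply the business logic as UPDATEs against staging_gtt ...

insert into final_table (col1, col2)
select col1, col2
from staging_gtt;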
Please tell me whether this approach is fine for this volume of data, or whether there is a better Oracle technique for the task. I have worked with T-SQL before and PL/SQL is new to me, so any suggestion would be really helpful. Thanks
Answer 1:
In Oracle you don't normally need a global temporary table for this sort of task; instead, you can use bulk processing with arrays:
declare
    cursor c is
        select col1, col2 from my_view;
    type t is table of c%rowtype;
    array t;
begin
    open c;
    loop
        -- Fetch the next batch of rows into the array;
        -- LIMIT caps memory use no matter how big the view is.
        fetch c bulk collect into array limit 1000;
        exit when array.count = 0;

        -- Apply the business logic row by row within the batch.
        for i in 1..array.count loop
            null; -- perform business logic on array(i) here
        end loop;

        -- Bulk-insert the processed batch in one round trip.
        forall i in 1..array.count
            insert into final_table (col1, col2)
            values (array(i).col1, array(i).col2);
    end loop;
    close c;
end;
That's just a minimal example; see this article for more details.
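One further note: if the business logic can be expressed in SQL itself, a single set-based statement is usually faster still than any row-by-row loop, since it skips the PL/SQL layer entirely. A minimal sketch, assuming a hypothetical filter column and a transformation plain SQL can do:

-- needs_processing and the upper() call are placeholders for
-- your real filter and business logic.
insert into final_table (col1, col2)
select col1, upper(col2)
from my_view
where needs_processing = 'Y';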
Source: https://stackoverflow.com/questions/35898642/plsql-alternative-to-stored-procedure-for-optimal-performance