We are having challenges with queries that spend most of their time on this wait:
enq: TS - contention
This wait indicates that a query on one instance of a RAC system is causing its instance to add temp space to its sort segment and remove space from another instance's sort segment. For some reason this process is extremely slow and totally bogs down any query waiting on this type of wait. I worked an SR with Oracle on this issue, and Oracle development claims this is not a bug. I've duplicated the problem on both 10.2 and 11.2 RAC.
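If you want to check whether any sessions are currently stuck on this wait, a simple query against gv$session will show them (the event name below is the standard one; match it against what your wait reports show):

select inst_id, sid, sql_id, event, seconds_in_wait
from gv$session
where event = 'enq: TS - contention';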
Anyway, so now we periodically see queries spending most of their time on this wait, and this occurs when some other query is eating up all the temp space or has done so recently. So how do I go back in time and figure out what query was using temp space? If I wanted to see the session ids of the current users of temp space on a RAC system I would just join gv$tempseg_usage to gv$session like this:
select se.inst_id, se.sid, sum(tu.blocks)*8192
from gv$tempseg_usage tu, gv$session se
where tu.inst_id = se.inst_id
and tu.session_addr = se.saddr
and tu.session_num = se.serial#
group by se.inst_id, se.sid
order by se.inst_id, se.sid;
This assumes an 8K block size. But to go back in time you can use the TEMP_SPACE_ALLOCATED column of V$ACTIVE_SESSION_HISTORY or DBA_HIST_ACTIVE_SESS_HISTORY to identify a query that consumed a lot of temp space.
Here is an example:
select sql_id, max(TEMP_SPACE_ALLOCATED)/(1024*1024*1024) gig
from DBA_HIST_ACTIVE_SESS_HISTORY
where sample_time > sysdate-2
and TEMP_SPACE_ALLOCATED > (50*1024*1024*1024)
group by sql_id
order by sql_id;
This gives the sql_id and maximum allocated temp space of any queries that ran in the past two days and exceeded 50 gigabytes of temp space.
This is a great feature of these ASH views. Now we can go back in time and find the queries that were using all the temp space and tune them.
– Bobby
Hi Bobby,
First of all, thanks for posting this 🙂
We are facing a temp issue frequently; after temp fills up we can see it in the alert log, but we can't catch which SID and SQL text caused the issue.
Note: for a currently active session we are able to see who the culprit is, but if it happened three hours ago, how do we find that culprit?
We tried the above query but couldn't get it to work. If you know the exact query, please share it with me: rjn.msn@gmail.com
Many thanks in advance for your help.
Regards,
Raj
The query only works on 11.2 or later. If you have an older database you get this error: ORA-00904: "TEMP_SPACE_ALLOCATED": invalid identifier. Also, if you don't get any rows returned but don't get an error, try reducing this number: (50*1024*1024*1024). That is 50 gigabytes, and you may want something smaller; 2 gigabytes would be (2*1024*1024*1024). Thanks for your comment.
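For example, the same query with a 2 gigabyte threshold would look like this:

select sql_id, max(TEMP_SPACE_ALLOCATED)/(1024*1024*1024) gig
from DBA_HIST_ACTIVE_SESS_HISTORY
where sample_time > sysdate-2
and TEMP_SPACE_ALLOCATED > (2*1024*1024*1024)
group by sql_id
order by sql_id;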
Hi Bobby
I am using a 10g database. As you stated, I reduced the size to 2 gigabytes and tried it, but it is still giving the error ORA-00904.
How can I change the query to get around this error?
Sorry, you can’t use this query on a 10g database.
Is there any query you have that would help me track down the SQL which has used the most temp space?
Thanks
Asif J K
This may be a good blog post to review: https://alexzeng.wordpress.com/2012/06/13/how-to-find-the-sql-that-using-lots-of-temp-tablespace-in-oracle/
It doesn't show the history of temp space usage, but it does show the current SQL that is using temp space.
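The basic idea on a RAC system is to join gv$tempseg_usage back to gv$session to get the sql_id of each current temp space user. A rough sketch, again assuming an 8K block size:

select se.inst_id, se.sid, se.serial#, se.sql_id,
round(sum(tu.blocks)*8192/1024/1024) temp_mb
from gv$tempseg_usage tu, gv$session se
where tu.inst_id = se.inst_id
and tu.session_addr = se.saddr
and tu.session_num = se.serial#
group by se.inst_id, se.sid, se.serial#, se.sql_id
order by temp_mb desc;

You can then look up the text for a given sql_id in gv$sqlarea or v$sql.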
Thanks for the post. This helped us track down the session and statement that gobbled up all of the temp space during a particular window of time.
Chuck,
I’m glad it was helpful to you.
– Bobby
Thanks for the post, Bobby. I have a question. If we need to find out how much temp space is taken by a query (for example, I write a new query and want to test how much temp space it is going to use), is there any way besides using your query above with a time range? Something like joining your query with V$SQL where sql_text like 'my new query'? Please suggest.
Venkat,
Thank you for your comment, but I don't really have a query like that. I know that with a problematic query I've monitored the available space in the sort segment, but I can't find a script for that. The thing is that, depending on where the query is in the execution of its plan, temp space usage can go up or down. I did a quick Google search and this page seems to have some scripts that you can use to monitor the temp space usage of an active session:
http://www.dbspecialists.com/files/presentations/temp_space.html
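I can't find my original script, but polling gv$sort_segment while the query runs would show the space moving, something like this (assuming an 8K block size):

select inst_id, tablespace_name,
round(used_blocks*8192/1024/1024/1024,1) used_gb,
round(free_blocks*8192/1024/1024/1024,1) free_gb
from gv$sort_segment;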
– Bobby
This comes 3 years late I know, but it may help someone …
After running your query, you can wait till the end of the current snap interval and then run a query similar to the following, modifying the interval begin and end times in the WHERE clause to match your case:
select round(temp_space_gb,1) temp_space_gb,
       c.sql_text,
       a.snap_id,
       to_char(begin_interval_time,'dd/mm/yyyy hh24:mi') begin_interval_time,
       to_char(end_interval_time,'dd/mm/yyyy hh24:mi') end_interval_time,
       b.sql_id
from dba_hist_snapshot a
left join dba_hist_sqlstat b
  on b.snap_id = a.snap_id
left join dba_hist_sqltext c
  on c.sql_id = b.sql_id
inner join (
  select snap_id, sql_id, max(temp_space_allocated) / (1024*1024*1024) temp_space_gb
  from dba_hist_active_sess_history
  group by snap_id, sql_id) d
  on d.snap_id = a.snap_id
  and d.sql_id = b.sql_id
where
  begin_interval_time >= to_date('02/05/2017 12:00','dd/mm/yyyy hh24:mi')
  and end_interval_time <= to_date('02/05/2017 13:01','dd/mm/yyyy hh24:mi')
  and temp_space_gb is not null
order by temp_space_gb desc;
This will show you all the SQL queries that ran during the chosen snap interval/s, ranked by their maximum sampled temp space allocation per snap interval.
You do have to wait until the end of the snap interval in which your business query ran, but that is normally a small price to pay.
Thanks Bobby for the post which pointed the way to incorporating temp space into an existing query.
Regards,
Don
Thanks for your reply. I haven't had a chance to review it yet, but I appreciate your contribution.
Bobby
Bobby,
Thank you very much for your query.
I am glad that you found it helpful.
— Query to check TEMP USAGE
SELECT a.tablespace_name,
       sum(ROUND((c.total_blocks*b.block_size)/1024/1024/1024,2)) "Total Size [GB]",
       sum(ROUND((a.used_blocks*b.block_size)/1024/1024/1024,2)) "Used_size[GB]",
       sum(ROUND(((c.total_blocks-a.used_blocks)*b.block_size)/1024/1024/1024,2)) "Free_size[GB]",
       sum(ROUND((a.max_blocks*b.block_size)/1024/1024/1024,2)) "Max_Size_Ever_Used[GB]",
       sum(ROUND((a.max_used_blocks*b.block_size)/1024/1024/1024,2)) "MaxSize_ever_Used_by_Sorts[GB]",
       sum(ROUND((a.used_blocks/c.total_blocks)*100,2)) "Used Percentage"
FROM gv$sort_segment a,
     dba_tablespaces b,
     (SELECT tablespace_name, SUM(blocks) total_blocks
      FROM dba_temp_files
      GROUP BY tablespace_name) c
WHERE a.tablespace_name = b.tablespace_name
AND a.tablespace_name = c.tablespace_name
GROUP BY a.tablespace_name;
-- Top 10 temp usage
select * from
(SELECT d.tablespace_name, a.inst_id, a.sid, a.serial#,
        a.username "DB Username", a.osuser,
        nvl(a.module,a.program) proginfo,
        floor(last_call_et/3600)||':'||floor(mod(last_call_et,3600)/60)||':'||mod(mod(last_call_et,3600),60) lastcallet,
        ROUND((b.blocks*d.block_size)/1024/1024,2) "Used MB",
        c.sql_id, c.sql_text
 FROM gv$session a, gv$tempseg_usage b, gv$sqlarea c, dba_tablespaces d
 WHERE a.saddr = b.session_addr
   AND c.address = a.sql_address
   AND c.hash_value = a.sql_hash_value
   AND d.tablespace_name = b.tablespace
 ORDER BY "Used MB" DESC)
where rownum < 11;
History:
The following query gives the sql_id and maximum allocated temp space of any queries that ran in the past two hours and exceeded 10 GB of temp space.
select sql_id,session_id,session_serial#,max(TEMP_SPACE_ALLOCATED)/(1024*1024*1024) gig
from gv$active_session_history
where
sample_time > sysdate-2/24 and
TEMP_SPACE_ALLOCATED > (10*1024*1024*1024)
group by sql_id,session_id,session_serial# order by GIG desc;
Thanks for sharing this info. It helped me a lot.
Great! I am glad that it was helpful. Thank you for your comment.
Bobby
Thank you, Bobby. Very good insights to do a better job on SQL tuning.
Thanks
Arjun.M
Thanks. I am glad that it was helpful to you.
Bobby
Thanks for your support.
You are welcome. I am glad that you found this post or blog helpful.
Bobby