We are going to use this view to generate reports that display some useful information about the archived logs.
The BLOCKS column stores the size of an archived log in blocks and BLOCK_SIZE stores the block size in bytes, so multiplying them (BLOCKS * BLOCK_SIZE) gives the log's size in bytes.
ARCHIVED indicates whether the log has been archived and DELETED indicates whether RMAN has deleted it.
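For instance, a quick look at the size of a few individual archived logs (an illustrative query, not one of the reports below):
-- Size of a few individual archived logs in MB (illustrative)
SELECT NAME,
       COMPLETION_TIME,
       ROUND ((BLOCKS * BLOCK_SIZE) / (1024 * 1024), 2) SIZE_MB
  FROM V$ARCHIVED_LOG
 WHERE ROWNUM <= 5;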
The following query displays, per day, the volume in MB of archived logs generated, deleted, and not yet deleted by RMAN.
SELECT SUM_ARCH.DAY,
       SUM_ARCH.GENERATED_MB,
       NVL (SUM_ARCH_DEL.DELETED_MB, 0) "DELETED_MB",
       SUM_ARCH.GENERATED_MB - NVL (SUM_ARCH_DEL.DELETED_MB, 0) "REMAINING_MB"
  FROM (  SELECT TO_CHAR (COMPLETION_TIME, 'DD/MM/YYYY') DAY,
                 SUM (ROUND ((BLOCKS * BLOCK_SIZE) / (1024 * 1024), 2)) GENERATED_MB
            FROM V$ARCHIVED_LOG
           WHERE ARCHIVED = 'YES'
        GROUP BY TO_CHAR (COMPLETION_TIME, 'DD/MM/YYYY')) SUM_ARCH,
       (  SELECT TO_CHAR (COMPLETION_TIME, 'DD/MM/YYYY') DAY,
                 SUM (ROUND ((BLOCKS * BLOCK_SIZE) / (1024 * 1024), 2)) DELETED_MB
            FROM V$ARCHIVED_LOG
           WHERE ARCHIVED = 'YES' AND DELETED = 'YES'
        GROUP BY TO_CHAR (COMPLETION_TIME, 'DD/MM/YYYY')) SUM_ARCH_DEL
 WHERE SUM_ARCH.DAY = SUM_ARCH_DEL.DAY(+)
 ORDER BY TO_DATE (DAY, 'DD/MM/YYYY');
DAY        | GENERATED_MB | DELETED_MB | REMAINING_MB
29/06/2012 |       315.77 |     315.77 |            0
30/06/2012 |       158.57 |     158.57 |            0
01/07/2012 |       153.09 |     153.09 |            0
...        |          ... |        ... |          ...
07/07/2012 |     1,040.22 |   1,040.22 |            0
08/07/2012 |       637.59 |     637.59 |            0
09/07/2012 |       328.72 |     174.28 |       154.44
The following report displays the number of archived logs generated per hour, for each day:
SELECT TO_CHAR (COMPLETION_TIME, 'DD/MM/YYYY') DAY,
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '00', 1, NULL)) "00-01",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '01', 1, NULL)) "01-02",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '02', 1, NULL)) "02-03",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '03', 1, NULL)) "03-04",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '04', 1, NULL)) "04-05",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '05', 1, NULL)) "05-06",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '06', 1, NULL)) "06-07",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '07', 1, NULL)) "07-08",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '08', 1, NULL)) "08-09",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '09', 1, NULL)) "09-10",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '10', 1, NULL)) "10-11",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '11', 1, NULL)) "11-12",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '12', 1, NULL)) "12-13",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '13', 1, NULL)) "13-14",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '14', 1, NULL)) "14-15",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '15', 1, NULL)) "15-16",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '16', 1, NULL)) "16-17",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '17', 1, NULL)) "17-18",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '18', 1, NULL)) "18-19",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '19', 1, NULL)) "19-20",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '20', 1, NULL)) "20-21",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '21', 1, NULL)) "21-22",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '22', 1, NULL)) "22-23",
       SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '23', 1, NULL)) "23-00",
       COUNT (*) TOTAL
  FROM V$ARCHIVED_LOG
 WHERE ARCHIVED = 'YES'
 GROUP BY TO_CHAR (COMPLETION_TIME, 'DD/MM/YYYY')
 ORDER BY TO_DATE (DAY, 'DD/MM/YYYY');
DAY        | 00-01 | 01-02 | 02-03 | 03-04 | ... | 20-21 | 21-22 | 22-23 | 23-00 | TOTAL
29/06/2012 |       |       |       |       | ... |       |       |     1 |     2 |     9
30/06/2012 |     1 |       |       |       | ... |       |       |       |     1 |     6
01/07/2012 |     1 |       |       |       | ... |       |       |       |     1 |     6
...        |   ... |   ... |   ... |   ... | ... |   ... |   ... |   ... |   ... |   ...
07/07/2012 |     1 |     1 |     1 |       | ... |     2 |     1 |     2 |     2 |    29
08/07/2012 |     1 |     1 |     1 |       | ... |       |     1 |     1 |     1 |    19
09/07/2012 |     2 |     1 |       |       | ... |       |       |       |       |     9
So, for instance, on 30/06/2012, between 03:00 and 04:00, no log was archived.
On 07/07/2012, between 20:00 and 21:00, 2 logs were archived, and 29 logs were archived in total that day.
Combining these two scripts gives the following query:
SELECT LOG_HISTORY.*,
       SUM_ARCH.GENERATED_MB,
       NVL (SUM_ARCH_DEL.DELETED_MB, 0) "DELETED_MB",
       SUM_ARCH.GENERATED_MB - NVL (SUM_ARCH_DEL.DELETED_MB, 0) "REMAINING_MB"
  FROM (  SELECT TO_CHAR (COMPLETION_TIME, 'DD/MM/YYYY') DAY,
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '00', 1, NULL)) "00-01",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '01', 1, NULL)) "01-02",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '02', 1, NULL)) "02-03",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '03', 1, NULL)) "03-04",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '04', 1, NULL)) "04-05",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '05', 1, NULL)) "05-06",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '06', 1, NULL)) "06-07",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '07', 1, NULL)) "07-08",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '08', 1, NULL)) "08-09",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '09', 1, NULL)) "09-10",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '10', 1, NULL)) "10-11",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '11', 1, NULL)) "11-12",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '12', 1, NULL)) "12-13",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '13', 1, NULL)) "13-14",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '14', 1, NULL)) "14-15",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '15', 1, NULL)) "15-16",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '16', 1, NULL)) "16-17",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '17', 1, NULL)) "17-18",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '18', 1, NULL)) "18-19",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '19', 1, NULL)) "19-20",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '20', 1, NULL)) "20-21",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '21', 1, NULL)) "21-22",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '22', 1, NULL)) "22-23",
                 SUM (DECODE (TO_CHAR (COMPLETION_TIME, 'HH24'), '23', 1, NULL)) "23-00",
                 COUNT (*) TOTAL
            FROM V$ARCHIVED_LOG
           WHERE ARCHIVED = 'YES'
        GROUP BY TO_CHAR (COMPLETION_TIME, 'DD/MM/YYYY')) LOG_HISTORY,
       (  SELECT TO_CHAR (COMPLETION_TIME, 'DD/MM/YYYY') DAY,
                 SUM (ROUND ((BLOCKS * BLOCK_SIZE) / (1024 * 1024), 2)) GENERATED_MB
            FROM V$ARCHIVED_LOG
           WHERE ARCHIVED = 'YES'
        GROUP BY TO_CHAR (COMPLETION_TIME, 'DD/MM/YYYY')) SUM_ARCH,
       (  SELECT TO_CHAR (COMPLETION_TIME, 'DD/MM/YYYY') DAY,
                 SUM (ROUND ((BLOCKS * BLOCK_SIZE) / (1024 * 1024), 2)) DELETED_MB
            FROM V$ARCHIVED_LOG
           WHERE ARCHIVED = 'YES' AND DELETED = 'YES'
        GROUP BY TO_CHAR (COMPLETION_TIME, 'DD/MM/YYYY')) SUM_ARCH_DEL
 WHERE LOG_HISTORY.DAY = SUM_ARCH.DAY AND SUM_ARCH.DAY = SUM_ARCH_DEL.DAY(+)
 ORDER BY TO_DATE (LOG_HISTORY.DAY, 'DD/MM/YYYY');
Thanks for your hard work!
Great work..!!
Wonderful job!
Can you provide something similar for RAC, please?
V$ARCHIVED_LOG contains information from all of the RAC's instances.
So, if the queries published in this post are run in a RAC database, you will get aggregated results for the whole cluster.
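If you need per-instance figures, the THREAD# column of V$ARCHIVED_LOG identifies the instance that generated each log; a minimal sketch of the daily-volume query broken down per thread (same MB arithmetic as above):
-- Daily archived log volume per RAC instance (THREAD# identifies the redo thread/instance) - sketch
SELECT TO_CHAR (COMPLETION_TIME, 'DD/MM/YYYY') DAY,
       THREAD#,
       SUM (ROUND ((BLOCKS * BLOCK_SIZE) / (1024 * 1024), 2)) GENERATED_MB
  FROM V$ARCHIVED_LOG
 WHERE ARCHIVED = 'YES'
 GROUP BY TO_CHAR (COMPLETION_TIME, 'DD/MM/YYYY'), THREAD#
 ORDER BY MIN (COMPLETION_TIME), THREAD#;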
Job is wonderful.
I would like to suggest some improvements. The first and third SELECTs show only the days where archived logs have been deleted.
Small improvements:
SELECT SUM_ARCH.DAY,
SUM_ARCH.GENERATED_GB,
NVL(SUM_ARCH_DEL.DELETED_GB,0) "DELETED_GB",
SUM_ARCH.GENERATED_GB - NVL(SUM_ARCH_DEL.DELETED_GB,0) "REMAINING_GB"
...
WHERE SUM_ARCH.DAY = SUM_ARCH_DEL.DAY(+)
ORDER BY TO_DATE (DAY, 'DD/MM/YYYY');
and
SELECT LOG_HISTORY.*,
SUM_ARCH.GENERATED_GB,
NVL(SUM_ARCH_DEL.DELETED_GB,0) "DELETED_GB",
SUM_ARCH.GENERATED_GB - NVL(SUM_ARCH_DEL.DELETED_GB,0) "REMAINING_GB"
FROM ( SELECT TO_CHAR (COMPLETION_TIME, 'DD/MM/YYYY') DAY,
...
WHERE LOG_HISTORY.DAY = SUM_ARCH.DAY AND SUM_ARCH.DAY = SUM_ARCH_DEL.DAY(+)
ORDER BY TO_DATE (LOG_HISTORY.DAY, 'DD/MM/YYYY');
These SELECTs show data for all archived log dates.
Yes, you are right, I added the modifications.
Thanks for the contribution.
If the logic is correct, then this seems shorter:
select to_char (completion_time, 'yyyy-mm-dd') day
, round(sum (case when archived='YES' then blocks * block_size /1048576 end), 2) generated_mb
, round(sum (case when archived='YES' and deleted='YES' then blocks * block_size /1048576 end), 2) deleted_mb
, round(sum (case when archived='YES' and deleted='NO' then blocks * block_size /1048576 end), 2) remaining_mb
from v$archived_log
group by to_char (completion_time, 'yyyy-mm-dd')
order by day
/
I also used the NVL function in the selects, especially for the REMAINING column. If you think the use of NVL is not important, please ignore this comment.
Hey, great work!
I had to improve it a bit because I want to use it in a RAC + Data Guard environment,
so I had to extend the WHERE clause of each query with
>>> and standby_dest='NO' <<<
(see the sketch below). I think this now works generically for each type of database.
Regards,
Andreas
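For example, the daily-volume query with that extra filter would look roughly like this (a sketch; STANDBY_DEST is the V$ARCHIVED_LOG column that flags entries belonging to a standby destination):
-- Daily generated volume, excluding standby destination entries (sketch of the suggested change)
SELECT TO_CHAR (COMPLETION_TIME, 'DD/MM/YYYY') DAY,
       SUM (ROUND ((BLOCKS * BLOCK_SIZE) / (1024 * 1024), 2)) GENERATED_MB
  FROM V$ARCHIVED_LOG
 WHERE ARCHIVED = 'YES'
   AND STANDBY_DEST = 'NO'
 GROUP BY TO_CHAR (COMPLETION_TIME, 'DD/MM/YYYY')
 ORDER BY MIN (COMPLETION_TIME);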
Thank you. These queries are really great.
This is awesome ... Many thanks for the hard work!!!
ReplyDeleteHi,
Can you advise how to change the SQL so that the hours and dates are swapped, with the dates as the column headings?
Example as below:
HOUR 01-JAN-2015 02-JAN-2015 03-JAN-2015 04-JAN-2015
00
01
02
03
Hello.
No, I can't think of a way to do this.
In this query we pivot V$ARCHIVED_LOG, which means we aggregate rows and convert them to columns (the HOUR portion of COMPLETION_TIME).
With this method [SUM(DECODE(...))], or even with the 11g PIVOT operator, you must have a finite set of values to map the rows onto. A day always has 24 hours [00-01, ..., 23-00], but the days present in V$ARCHIVED_LOG are not always the same.
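For reference, the existing hours-as-columns layout could also be written with the 11g PIVOT operator, precisely because the hour values are a fixed set; a rough sketch, shortened to the first four hour buckets (extend the IN list through '23' AS "23-00" for all of them):
-- Per-day count of archived logs per hour using the 11g PIVOT operator (sketch, first 4 buckets only)
SELECT *
  FROM (SELECT TO_CHAR (COMPLETION_TIME, 'DD/MM/YYYY') DAY,
               TO_CHAR (COMPLETION_TIME, 'HH24') HH
          FROM V$ARCHIVED_LOG
         WHERE ARCHIVED = 'YES')
       PIVOT (COUNT (*) FOR HH IN ('00' AS "00-01", '01' AS "01-02", '02' AS "02-03", '03' AS "03-04"))
 ORDER BY TO_DATE (DAY, 'DD/MM/YYYY');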
Same output, but better looking.
Add this at the beginning:
set linesize 300
set pagesize 300
col 00-01 format 999
col 01-02 format 999
col 02-03 format 999
col 03-04 format 999
col 04-05 format 999
col 05-06 format 999
col 06-07 format 999
col 07-08 format 999
col 08-09 format 999
col 09-10 format 999
col 10-11 format 999
col 11-12 format 999
col 12-13 format 999
col 13-14 format 999
col 14-15 format 999
col 15-16 format 999
col 16-17 format 999
col 17-18 format 999
col 18-19 format 999
col 19-20 format 999
col 20-21 format 999
col 21-22 format 999
col 22-23 format 999
col 23-00 format 999