the situation: a customer faces issues with their oracle database. you cannot go on-site for some reason and you cannot connect to the database in question. what options do you have to support them ?
- request a current awr/ash report and try to find the issue from there
- request execution plans if the issue is with badly performing statements and see if you can spot anything there without knowing the data ( a sketch of how such a plan could be generated follows this list )
- try to support them by phone
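for the execution plans, a minimal sketch of what you could ask the customer to run for a statement that is still in the cursor cache ( the sql_id here is just a placeholder ):

SQL> select * from table ( dbms_xplan.display_cursor ( 'an4zdhb4n0a7c', null, 'TYPICAL' ) );

if the statement has already aged out of the cursor cache, dbms_xplan.display_awr with the sql_id works on their side as well ( more on display_awr below ).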
there is another option some people might not be aware of: dumping the awr data and importing it into your own database for further analysis. as this includes only awr data, no user data will be submitted ( except for literals in sql statements, if there are any ), so you don’t need to worry about data privacy.
first of all you need to ask for an awr extract. there is a script which does all the work and you just need to execute it:
SQL> @?/rdbms/admin/awrextr.sql
the header displayed already tells you what the script will do and how to use it:
~~~~~~~~~~~~~
AWR EXTRACT
~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~  This script will extract the AWR data for a range of snapshots  ~
~  into a dump file.  The script will prompt users for the         ~
~  following information:                                          ~
~     (1) database id                                               ~
~     (2) snapshot range to extract                                 ~
~     (3) name of directory object                                  ~
~     (4) name of dump file                                         ~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
the parameters the script asks for are the same as when creating awr reports, except that you’ll additionally need to specify a directory object for storing the dump file ( DATA_PUMP_DIR, by default ).
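if you are not sure where DATA_PUMP_DIR ( or any other directory object ) points to on the customer’s system, it can be checked easily:

SQL> select directory_name, directory_path from dba_directories where directory_name = 'DATA_PUMP_DIR';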
that’s it. the script will start the dump process:
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The AWR extract dump file will be located
in the following directory/file:
 /opt/oracle/product/base/11.2.0.3/rdbms/log/
 demo.dmp
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

*** AWR Extract Started ...

This operation will take a few moments. The
progress of the AWR extract operation can be
monitored in the following directory/file:
 /opt/oracle/product/base/11.2.0.3/rdbms/log/
 rman.log

End of AWR Extract
… and once finished the dump file can be copied from the location provided:
SQL> !ls /opt/oracle/product/base/11.2.0.3/rdbms/log/
rman.dmp  rman.log  dp.log
so much for the dump part. once you have received the dump file, how do you import it into your own database ? not a big deal:
SQL> @?/rdbms/admin/awrload.sql
again, the header will tell you pretty much everything you’ll need to know:
~~~~~~~~~~
AWR LOAD
~~~~~~~~~~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
~  This script will load the AWR data from a dump file. The   ~
~  script will prompt users for the following information:    ~
~     (1) name of directory object                             ~
~     (2) name of dump file                                    ~
~     (3) staging schema name to load AWR data into            ~
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
a few things to take care of:
- when you are asked for the dump file, pass the filename without the file extension ( in my case “rman” instead of “rman.dmp” )
- the load will temporarily create a staging schema ( AWR_STAGE ) and will drop it once the load has finished
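one more thing: the directory object you pass must exist in the database you load into and point to the location where you placed the dump file. if there is no suitable one yet ( DATA_PUMP_DIR works fine, too ), you can create it up-front ( just a sketch; the name and path are placeholders ):

SQL> create directory AWR_IMPORT as '/tmp';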
so, now you have the remote database’s awr data available in your own database. what to do next ? maybe, as a first step, you would like to create a set of awr reports to get an overview of the remote system. now you are able to do this.
first you’ll need the dbid and instance number of the remote database:
SQL> select DBID,INSTANCE_NUMBER,DB_NAME,INSTANCE_NAME,HOST_NAME from DBA_HIST_DATABASE_INSTANCE;

      DBID INSTANCE_NUMBER DB_NAME   INSTANCE_NAME    HOST_NAME
---------- --------------- --------- ---------------- --------------------
3551282359               1 DEMO      demo             ol6ora.local
1664148667               1 RMAN      rman             ol6ora2.local
in my case “RMAN” is the remote database whose awr statistics were imported into my local database “DEMO”. let’s see what snapshots are available for this dbid:
SQL> select SNAP_ID,BEGIN_INTERVAL_TIME,END_INTERVAL_TIME from dba_hist_snapshot where DBID=1664148667;

SNAP_ID BEGIN_INTERVAL_TIME       END_INTERVAL_TIME
------- ------------------------- -------------------------
   8096 16-FEB-13 04.00.52.846 AM 16-FEB-13 05.00.54.990 AM
   8097 16-FEB-13 05.00.54.990 AM 16-FEB-13 06.00.57.548 AM
   8098 16-FEB-13 06.00.57.548 AM 16-FEB-13 07.00.59.883 AM
   8099 16-FEB-13 07.00.59.883 AM 16-FEB-13 08.00.01.952 AM
   8100 16-FEB-13 08.00.01.952 AM 16-FEB-13 09.00.04.200 AM
   8101 16-FEB-13 09.00.04.200 AM 16-FEB-13 10.00.06.321 AM
   8102 16-FEB-13 10.00.06.321 AM 16-FEB-13 11.00.09.183 AM
   8103 16-FEB-13 11.00.09.183 AM 16-FEB-13 12.00.11.274 PM
...
that’s enough information to produce some reports:
SQL> define inst_num = 1;
SQL> define inst_name = 'rman';
SQL> define db_name = 'RMAN';
SQL> define dbid = 1664148667;
SQL> define begin_snap = 8098;
SQL> define end_snap = 8100;
SQL> define report_type = 'html';
SQL> define report_name = /tmp/first_awr_report.html
SQL> @?/rdbms/admin/awrrpti.sql
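if you prefer not to answer the prompts of awrrpti.sql, the same report can also be produced directly from the dbms_workload_repository package and spooled to a file ( a sketch; the spool file name and sqlplus settings are just examples ):

SQL> set heading off pagesize 0 linesize 1500 trimspool on feedback off
SQL> spool /tmp/first_awr_report.html
SQL> select output from table ( dbms_workload_repository.awr_report_html ( 1664148667, 1, 8098, 8100 ) );
SQL> spool off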
maybe the reports you generated are already sufficient to give the right hints. but there is even more you can do with all the statistic data available:
as an example, let’s say you want to display the load of the server over a range of snapshots. for this there is a statistic called “LOAD” in dba_hist_osstat which you can query:
SELECT to_char(t.begin_interval_time, 'DD.MM.YYYY-HH24:MI:SS' )
     , o.value
  FROM dba_hist_osstat o
     , dba_hist_snapshot t
 WHERE o.dbid = 1664148667
   AND o.dbid = t.dbid
   AND o.snap_id = t.snap_id
   AND o.snap_id BETWEEN 8098 AND 8110
   AND o.stat_name = 'LOAD'
 ORDER BY o.snap_id;

TO_CHAR(T.BEGIN_INT      VALUE
------------------- ----------
16.02.2013-06:00:57 .829101563
16.02.2013-07:00:59 .629882813
16.02.2013-08:00:01 .309570313
16.02.2013-09:00:04   .1796875
16.02.2013-10:00:06 .329101563
16.02.2013-11:00:09 .279296875
16.02.2013-12:00:11 .119140625
16.02.2013-13:00:13 .649414063
16.02.2013-14:00:15 .919921875
16.02.2013-15:00:17 .459960938
16.02.2013-16:00:20 .889648438
16.02.2013-17:00:22 .629882813
16.02.2013-18:00:24 .549804688
wouldn’t it be nice to display these values in a picture ? put these numbers into a file:
echo "16.02.2013-06:00:57 .829101563
16.02.2013-07:00:59 .629882813
16.02.2013-08:00:01 .309570313
16.02.2013-09:00:04 .1796875
16.02.2013-10:00:06 .329101563
16.02.2013-11:00:09 .279296875
16.02.2013-12:00:11 .119140625
16.02.2013-13:00:13 .649414063
16.02.2013-14:00:15 .919921875
16.02.2013-15:00:17 .459960938
16.02.2013-16:00:20 .889648438
16.02.2013-17:00:22 .629882813
16.02.2013-18:00:24 .549804688" > /tmp/graph1.lst
define a gnuplot definition:
echo "set terminal png
set xlabel \"Time\"
set xdata time
set timefmt \"%d.%m.%Y-%H:%M:%S\"
set title 'OS-Load'
set format y \"%10.1f\"
plot \"/tmp/graph1.lst\" using 1:2 with lines title 'OS Load'" > /tmp/graph1.plot
.. create the png file:
gnuplot /tmp/graph1.plot > graph1.png
… and have a look at it:
display graph1.png
( image: server load graph )
for those who don’t know: dbms_xplan also works with a dbid, so you can generate the execution plans for statements from the remote instance ( although the example below is not a very useful one ):
SELECT *
  FROM TABLE ( dbms_xplan.display_awr ( '3s1hh8cvfan6w'
                                      , '2137789089'
                                      , 1664148667 -- remote_dbid
                                      , 'ADVANCED' ) );

Plan hash value: 2137789089

---------------------------------------------------------------------------------------------
| Id  | Operation                         | Name    | Rows  | Bytes | Cost (%CPU)| Time     |
---------------------------------------------------------------------------------------------
|   0 | SELECT STATEMENT                  |         |       |       |    30 (100)|          |
|   1 |  COLLECTION ITERATOR PICKLER FETCH| DISPLAY |  8168 | 16336 |    30   (4)| 00:00:01 |
---------------------------------------------------------------------------------------------

Query Block Name / Object Alias (identified by operation id):
-------------------------------------------------------------
   1 - SEL$F5BB74E1 / KOKBF$0@SEL$2

Outline Data
-------------

  /*+
      BEGIN_OUTLINE_DATA
      IGNORE_OPTIM_EMBEDDED_HINTS
      OPTIMIZER_FEATURES_ENABLE('11.2.0.3')
      DB_VERSION('11.2.0.3')
      OPT_PARAM('query_rewrite_enabled' 'false')
      ALL_ROWS
      FORCE_XML_QUERY_REWRITE
      XML_DML_RWT_STMT
      XMLINDEX_REWRITE
      XMLINDEX_REWRITE_IN_SELECT
      NO_COST_XML_QUERY_REWRITE
      OUTLINE_LEAF(@"SEL$F5BB74E1")
      MERGE(@"SEL$2")
      OUTLINE(@"SEL$1")
      OUTLINE(@"SEL$2")
      FULL(@"SEL$F5BB74E1" "KOKBF$0"@"SEL$2")
      END_OUTLINE_DATA
  */
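of course you first need to know which sql_ids are worth looking at. one way ( just a sketch, ranking by elapsed time over the imported snapshot range ) is to query dba_hist_sqlstat with the remote dbid:

SELECT sql_id
     , plan_hash_value
     , sum(elapsed_time_delta)/1000000 elapsed_secs
     , sum(executions_delta) execs
  FROM dba_hist_sqlstat
 WHERE dbid = 1664148667
   AND snap_id BETWEEN 8098 AND 8110
 GROUP BY sql_id, plan_hash_value
 ORDER BY 3 DESC;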
combine all this into a set of scripts and you are ready to do some analysis even if you do not have access to the database. together with the power of gnuplot for creating nice graphs you are in a position to do some initial troubleshooting.
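as an example of what such a script could look like ( just a sketch staying with the os load example from above; the dbid, snapshot range and file names are placeholders ):

-- load_graph.sql: spool the LOAD statistic into a gnuplot-friendly file and plot it
set heading off feedback off pagesize 0 trimspool on
spool /tmp/graph1.lst
SELECT to_char(t.begin_interval_time,'DD.MM.YYYY-HH24:MI:SS')||' '||o.value
  FROM dba_hist_osstat o, dba_hist_snapshot t
 WHERE o.dbid = 1664148667
   AND o.dbid = t.dbid
   AND o.snap_id = t.snap_id
   AND o.snap_id BETWEEN 8098 AND 8110
   AND o.stat_name = 'LOAD'
 ORDER BY o.snap_id;
spool off
-- reuse the gnuplot definition created earlier
!gnuplot /tmp/graph1.plot > /tmp/graph1.png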
two more sample images generated out of the awr data: